RTX high end, leaked twin-fan Ti and Micro Center pre-orders...

Intel is saying they will set their Graphics free in 2020 - Wonder where they're at on the idea of Ray Tracing?





I'm of 2 minds about all of this.

1: Very cool! More performance, more realism, all around great.
2: Stop tempting me! I'm TRYING to stick with my 1080 until ~2023 or so. That is, as long as my 1080p TV holds up.
 
Intel is saying they will set their Graphics free in 2020 - Wonder where they're at on the idea of Ray Tracing?




Intel has been toying with Raytracing for years...everyone knows it's the "holy grail" of graphics:


That was 11 years ago...gives you a clue to how long Raytracing has been in the pipeline ;)
 
I don't think anyone really cares where they are on ray tracing. If we can get more competition in the GPU arena it is good for us--the consumer.


Personally I'd like to see them be competitive rather than produce aimless low-end products. To do that they're going to need to address the whole market (both of their bigger competitors have now done press releases on ray tracing, as an example).
 
Intel has been toying with Raytracing for years...everyone knows it's the "holy grail" of graphics:


That was 11 years ago...gives you a clue to how long Raytracing has been in the pipeline ;)



Good find - you've cleared some cobwebs for me; I remember seeing some old Intel videos from way back. My hope is that Intel somehow comes out with both barrels blazing, with a solid product that makes nVidia sit up and take notice. In my opinion, that's what will be good for us (gaming) consumers.
 
Personally I'd like to see them be competitive rather than produce aimless low-end products. To do that they're going to need to address the whole market (both of their bigger competitors have now done press releases on ray tracing, as an example).

That's probably because you're only buying the very high end, which is probably a very small % of the cards they sell. If Intel even released a GPU that competed with the 1060 or 1070, that would be fantastic.
 
That's probably because you're only buying the very high end, which is probably a very small % of the cards they sell. If Intel even released a GPU that competed with the 1060 or 1070, that would be fantastic.

The 1080/Titan market tends to be bigger than most give it credit for - take us away, and suddenly your low- and mid-end products go up in price to cover costs and development. If Intel releases a 1060- or 1070-level product in 2020, then short of taking a huge hit on costs they will never hit the price points needed to sell into the low-margin low/mid-end graphics sector.

Steam numbers: 1070/Ti and up at 7.81% market share vs. 1050/1060 at 16.26%. Factor in the selling price of the upper end vs. the low to mid end and the dollars and cents add up pretty well on the upper end to help offset costs.
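
To put rough numbers on that (the market shares are the Steam figures above; the average selling prices are just my assumption for the sake of illustration):

```python
# Back-of-the-envelope revenue comparison using the Steam share numbers above.
# The average selling prices are illustrative assumptions, not reported figures.
segments = {
    "1070/Ti and up": {"share_pct": 7.81, "avg_price_usd": 700},   # assumed ASP
    "1050/1060":      {"share_pct": 16.26, "avg_price_usd": 250},  # assumed ASP
}

for name, seg in segments.items():
    # Relative revenue index: unit share times assumed price per unit.
    revenue_index = seg["share_pct"] * seg["avg_price_usd"]
    print(f"{name}: {seg['share_pct']}% share x ${seg['avg_price_usd']} "
          f"-> revenue index {revenue_index:,.0f}")

# With these assumed prices the upper end lands around 5,467 vs ~4,065 for the
# 1050/1060 tier - comparable dollars from roughly half the unit share.
```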
 
The 1080/Titan market tends to be bigger than most give it credit for - take us away, and suddenly your low- and mid-end products go up in price to cover costs and development. If Intel releases a 1060- or 1070-level product in 2020, then short of taking a huge hit on costs they will never hit the price points needed to sell into the low-margin low/mid-end graphics sector.

Steam numbers: 1070/Ti and up at 7.81% market share vs. 1050/1060 at 16.26%. Factor in the selling price of the upper end vs. the low to mid end and the dollars and cents add up pretty well on the upper end to help offset costs.

GeForce (consumer) cards are cheap compared to QUADRO / TESLA cards...
 
I think it's time for a rebrand. GeForce was the intro of HW T&L; the ray tracing intro should be something new. Or bring back the TnT name and ditch the big numbers: TnT-R1, TnT-R1 GT, TnT-R1 Ultra, TnT-R2 next gen after that...
 
I'm hoping we'll see remastered current-gen games with ray tracing added - namely ALIEN: Isolation. Can you imagine how good that would look?!
 
I'm hoping we'll see remastered current-gen games with ray tracing added - namely ALIEN: Isolation. Can you imagine how good that would look?!

Honestly it would be good, but I'm not really sure if we will get great frame rates with ray tracing. The card probably does wonders for studios that don't need high frame rates. I think by second- or third-gen ray tracing hardware we might see games run well. I could be wrong - maybe they will only ray trace a few scenes or objects. I'd love to see everything ray traced, but that will probably require a few generations of refinement.
 
Well, he did say this is the greatest advancement since CUDA in '06, and that was pretty big 12 years ago.
 
GeForce (consumer) cards are cheap compared to QUADRO / TESLA cards...

Well of course... however, that's an entirely different market segment. Notice the reporting shows GTX leading revenues. Quadros bring their own costs (increases), not limited to dev areas either: their own marketing into many sectors (vs. one for enthusiast), plus their own budget. Gaming (GTX) revenues are actually so great they surpass all four other markets combined for nVidia. Those Quadros aren't going to give you cheap, or make for cheaper, gaming cards if you did away with the upper-class cards (1070/Ti and up) - that's nothing short of a wet dream.

http://files.shareholder.com/downlo...6EF/Q3FY17_CFO_Commentary_with_FTs_FINAL_.pdf
 
Well, he did say this is the greatest advancement since CUDA in '06, and that was pretty big 12 years ago.

Yeah, but that proves nothing about how capable it is in games with ray tracing. Ray tracing has been around for a while; I highly doubt first-gen hardware will give you a constant 60 fps where everything in the game is ray traced. We will see - if it does, color me surprised.
 
Yeah, but that proves nothing about how capable it is in games with ray tracing. Ray tracing has been around for a while; I highly doubt first-gen hardware will give you a constant 60 fps where everything in the game is ray traced. We will see - if it does, color me surprised.

Proves zilch, sure, but it's a statement that suggests the future will be built on it... this is the first step in next-gen IQ and graphics.
 
So we have a rumor of 2080 being 8% faster than 1080 Ti. And another that just came out that says it will clock to 2.5 GHz and be faster than Titan V. Quite a range of possibilities.
 
It's all good to have ray tracing, but games aren't going to be fully fledged with it on first-gen hardware. It's on the right track, but it will be a while before we see true action. This will help out the pro industry for sure, though. I don't expect games to have fully fledged ray tracing throughout just yet.

Well consider when the Heaven benchmark first debuted -- tessellation was a relatively new thing. Now all modern GPUs are strong at tessellation and crush the Heaven benchmark.

So yes it will take some time for ray tracing to be taken advantage of by gaming software but Nvidia is providing the hardware.

If it's any comfort, you will see ray tracing used in games as soon as the next Metro game, Exodus, which was developed intensively with Nvidia to take advantage of the new ray tracing hardware capability.
 
So we have a rumor of 2080 being 8% faster than 1080 Ti. And another that just came out that says it will clock to 2.5 GHz and be faster than Titan V. Quite a range of possibilities.

I'm quietly hoping it's the former since I just purchased a 1080 Ti. Although, I should want it to crush it so we can keep getting better chips from each company.
 
Well consider when the Heaven benchmark first debuted -- tessellation was a relatively new thing. Now all modern GPUs are strong at tessellation and crush the Heaven benchmark.

So yes it will take some time for ray tracing to be taken advantage of by gaming software but Nvidia is providing the hardware.

If it's any comfort, you will see ray tracing used in games as soon as the next Metro game, Exodus, which was developed intensively with Nvidia to take advantage of the new ray tracing hardware capability.

True, but ray tracing and tessellation are different beasts. As I said, it really depends on how it's implemented. We just gotta wait and see. I saw the Exodus demo, but it was a fly-by. I would have loved to see it with and without ray tracing, or in-game action that has a lot of shit going on.
 
I'm quietly hoping it's the former since I just purchased a 1080 Ti. Although, I should want it to crush it so we can keep getting better chips from each company.

I hope the 2.5 GHz claim is real as well. During 2016, prior to the 1080 launch, there were rumors it would clock to 2.5 GHz too. Those ended up being debunked, so hopefully it's not just a recycled rumor.
 
I hope the 2.5 GHz claim is real as well. During 2016, prior to the 1080 launch, there were rumors it would clock to 2.5 GHz too. Those ended up being debunked, so hopefully it's not just a recycled rumor.

I am not sure. WTF Tech always hypes every card before launch - like RX 480s hitting 1600 MHz and crap. It's going to have more cores and it's a smaller node shrink, so I am thinking clocks will be similar to current cards at best since we are looking at a higher core count overall. The only way I see higher clocks is if nVidia actually has allowed for more voltage and power.
 
Ray Tracing has been the promised holy grail of graphics for over a decade now. Intel made it a point to show off a version of Wolfenstein using ray tracing running off of Larrabee.

One of the evident problems was that the model animation systems were all broken in these ancient proof-of-concept demos. Even now, how many of nVidia's recent demos this week leveraged animations in the models? The robot one did, but it wasn't clear that that was real time.

Another player in the ray tracing acceleration market was Caustic Graphics, which was purchased by Imagination Technologies. Under Imagination, there was the OpenRT API, which provided a closed-source, OpenGL-like syntax for ray tracing. It was supposed to be open enough that nVidia or AMD could jump on board if they wanted to, but obviously they didn't. Imagination was able to produce demo hardware and was looking to sell the IP, but collapsed roughly a year ago when Apple stopped licensing their cores.

While not the first, nVidia has the best shot currently of making ray tracing mainstream.
 
So do you guys think ray tracing is going to be used for cutscenes or regular gameplay in general?
 
I have a 1080ti/Vive Pro and have zero problems with VR reprojection? Most games I upscale to 4k per eye.

I only have one game where upgrading would help, and it already looks great....
That's good to hear, though maxed-out Raw Data and Elite: Dangerous are relatively easy mode as far as VR goes.

The real challenge is DCS World and IL-2 Sturmovik: Battle of Stalingrad. I don't know if there's a CPU in existence that can keep those two from dropping below 90 FPS the moment you start looking toward the ground or get into some kind of AI slugfest, since reading up on forums is quickly giving me the impression that the GPU isn't the limiting factor with those if you aren't running ludicrous amounts of supersampling.

That said, my GTX 980's getting about due for an upgrade anyway. I told myself when I got it that I was skipping Pascal entirely and holding out for Volta/Ampere/Turing/whatever they're calling it now, and I hope the 2080 Ti or whatever they want to call it makes for an appropriate step up.

There's also the matter of yet another USB-C implementation called VirtualLink, which current graphics cards don't have. It's not much of a factor for the current Rift and Vive, but I have a feeling that the next-gen HMDs may wind up using it to simplify cable clutter, and that NVIDIA's next-gen cards will probably implement it alongside the usual smattering of DisplayPort and HDMI.
 
Am I the only one that isn't impressed with ray tracing? None of the demos that I've seen have gotten me excited yet. The opening scenes in old Crytek games like Crysis 3 impressed me more than those ray tracing demos. I'd rather both companies concentrate on creating video cards that can run games at 4K/120 and on doubling VR frame rates. I might be impressed with a video card that can handle 8K/120 video streams. Why not increase the quality of game capture on the GPU so that it exceeds CPU capture? Make sure that new cards have at least two HDMI ports, since nicer TVs are coming with FreeSync support and my Oculus Rift uses HDMI as well.

I can think of 100 improvements for video cards, but ray tracing isn't one.
 
Am I the only one that isn't impressed with ray tracing? None of the demos that I've seen have gotten me excited yet. The opening scenes in old Crytek games like Crysis 3 impressed me more than those ray tracing demos. I'd rather both companies concentrate on creating video cards that can run games at 4K/120 and on doubling VR frame rates. I might be impressed with a video card that can handle 8K/120 video streams. Why not increase the quality of game capture on the GPU so that it exceeds CPU capture? Make sure that new cards have at least two HDMI ports, since nicer TVs are coming with FreeSync support and my Oculus Rift uses HDMI as well.

I can think of 100 improvements for video cards, but ray tracing isn't one.

EVERYTHING you see in a game is a "cheat" in order to mimic Raytracing.
EVERYTHING!
And even if the "cheats" come close...they are not true; the "cheats" introduce errors, unrealistic lighting/effects, etc. - something I would call REDUCED Image Quality.

Why have more "fake" graphics at higher resolution...the errors/fakeness will just be easier to spot.

Again...EVERYTHING you see on-screen in a game is a "cheat" to mimic Raytracing.

Raytracing is what makes CGI look so good compared to ingame graphics.
That gap will diminish now.
I am amused by your "unimpressed".
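
If you want to see what "no cheating" actually means, here's a tiny illustrative sketch (toy scene, made-up numbers) of the core of a ray tracer for one pixel: intersect a camera ray with the geometry, then fire a shadow ray at the light and ask the geometry itself whether the light is visible - no shadow maps, no screen-space approximations:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along the ray, or None (direction normalized)."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a = 1).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None  # small epsilon avoids self-intersection

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

# Made-up scene: one sphere, one point light, one camera ray through a pixel.
sphere_center, sphere_radius = [0.0, 0.0, -3.0], 1.0
light_pos = [2.0, 2.0, 0.0]
ray_origin = [0.0, 0.0, 0.0]
ray_dir = normalize([0.1, 0.0, -1.0])

t = ray_sphere(ray_origin, ray_dir, sphere_center, sphere_radius)
if t is not None:
    hit = [o + t * d for o, d in zip(ray_origin, ray_dir)]
    # Shadow ray: the geometry itself answers whether the light is visible.
    to_light = normalize([l - h for l, h in zip(light_pos, hit)])
    in_shadow = ray_sphere(hit, to_light, sphere_center, sphere_radius) is not None
    print("hit at", [round(x, 3) for x in hit], "shadowed:", in_shadow)
else:
    print("ray missed the sphere")
```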
 
EVERYTHING you see in a game is a "cheat" in order to mimic Raytracing.
EVERYTHING!
And even if the "cheats" come close...they are not true; the "cheats" introduce errors, unrealistic lighting/effects, etc. - something I would call REDUCED Image Quality.

Why have more "fake" graphics at higher resolution...the errors/fakeness will just be easier to spot.

Again...EVERYTHING you see on-screen in a game is a "cheat" to mimic Raytracing.

Raytracing is what makes CGI look so good compared to ingame graphics.
That gap will diminish now.
I am amused by your "unimpressed".
I thought the demos shown looked boring and dull personally.
 
I thought the demos shown looked boring and dull personally.

Then you were looking at the wrong things.
This is not "FAST PACED ADHD COD GAMEPLAY DEMOS".
These are TECHNICAL demos.
BIG difference that just flew right over your head.
Raytracing, just like facts, doesn't care if you are bored.
Look again:


4K/8K fake graphics bore me more...
 
I don't think anyone really cares where they are on ray tracing. If we can get more competition in the GPU arena it is good for us--the consumer.

Never thought I’d say this, but I can’t wait for Intel to enter the GPU market. AMD has had a rough time these past few years simply because of R&D money challenges stemming from their previous CPU architecture drying up their revenue. Now Nvidia is so far ahead they can comfortably keep AMD down with more advanced GPUs or, if AMD catches up, Nvidia can lower prices a lot to keep their advantage. AKA, the exact same strategy Intel used against AMD for a long time. Nvidia is the new Intel.

Intel in the GPU space means tons of R&D money being thrown at a new GPU, which should make Nvidia nervous. Some people dismiss Intel as a GPU player; that’s misguided. Intel has never seriously pursued a GPU (you can count Larrabee as their first push, a failure, and you can bet they’ve learned from it for their second attempt). I can smell change coming, and that’s fantastic. I fear for AMD’s ability to compete in the GPU space because they have less R&D money (partially alleviated by the Ryzen success), but Nvidia needs a serious kick in the butt to stop being so lazy.

I feel bad for AMD: they must divide their little money between CPU and GPU development, leaving that much less for both. Their competitors only have to focus on one thing. Intel is big enough to throw money at both things, AMD isn’t. All the more reason why I root for them.
 
Never thought I’d say this, but I can’t wait for Intel to enter the GPU market. AMD has had a rough time these past few years simply because of R&D money challenges stemming from their previous CPU architecture drying up their revenue. Now Nvidia is so far ahead they can comfortably keep AMD down with more advanced GPUs or, if AMD catches up, Nvidia can lower prices a lot to keep their advantage. AKA, the exact same strategy Intel used against AMD for a long time. Nvidia is the new Intel.

Intel in the GPU space means tons of R&D money being thrown at a new GPU, which should make Nvidia nervous. Some people dismiss Intel as a GPU player; that’s misguided. Intel has never seriously pursued a GPU (you can count Larrabee as their first push, a failure, and you can bet they’ve learned from it for their second attempt). I can smell change coming, and that’s fantastic. I fear for AMD’s ability to compete in the GPU space because they have less R&D money (partially alleviated by the Ryzen success), but Nvidia needs a serious kick in the butt to stop being so lazy.

I feel bad for AMD: they must divide their little money between CPU and GPU development, leaving that much less for both. Their competitors only have to focus on one thing. Intel is big enough to throw money at both things, AMD isn’t. All the more reason why I root for them.

One problem Intel will face in GPUs is the lack of patents.
NVIDIA has loads of patents.
AMD has loads of patents.
This is not like x86 CPUs, where Intel holds the license to x86.
They will come from the bottom up...and they might even hit AMD harder, as NVIDIA has a high-end market that AMD stopped competing in...so Intel in graphics might pose bigger problems for AMD than for NVIDIA.
 
You could actually see the ray tracing better with the old Amiga stuff - not sure if this is Amiga, but whatever.


 
You could actually see the ray tracing better with the old Amiga stuff - not sure if this is Amiga, but whatever.

1. No way that is running realtime on an Amiga.
2. Those shadows are NOT raytraced.

Here is a counter:
 
I think it's time for a rebrand. GeForce was the intro of HW T&L; the ray tracing intro should be something new. Or bring back the TnT name and ditch the big numbers: TnT-R1, TnT-R1 GT, TnT-R1 Ultra, TnT-R2 next gen after that...

Titan TnT RDX edition! :D

And bring back the Detonator Driver branding.
 
Then you were looking at the wrong things.
This is not "FAST PACED ADHD COD GAMEPLAY DEMOS".
These are TECHNICAL demos.
BIG difference that just flew right over your head.
Raytracing, just like facts, doesn't care if you are bored.
Look again:


4K/8K fake graphics bore me more...

That was the video that didn't amaze me. I simply cared nothing for the graphics shown. It wasn't the graphics style that was the issue. I just wasn't amazed. I didn't care about Toy Story and refused to watch Disney movies because I thought they looked... Meh?

I've been seeing ray tracing demos for years and none have made me think, "I can't wait for that tech to come out!"
 
That was the video that didn't amaze me. I simply cared nothing for the graphics shown. It wasn't the graphics style that was the issue. I just wasn't amazed. I didn't care about Toy Story and refused to watch Disney movies because I thought they looked... Meh?

I've been seeing ray tracing demos for years and none have made me think, "I can't wait for that tech to come out!"

Then stop buying new hardware and live in the past *shrugs*
 
I hope the 2.5 GHz claim is real as well. During 2016, prior to the 1080 launch, there were rumors it would clock to 2.5 GHz too. Those ended up being debunked, so hopefully it's not just a recycled rumor.

Some people have calculated the new RTX Quadros clocking 200 MHz higher based on the calculation numbers nVidia showed. There could be some truth to it.
 
Some people have calculated the new RTX Quadros clocking 200 MHz higher based on the calculation numbers nVidia showed. There could be some truth to it.

Both Pascal and Volta have only been able to clock to around 2100 MHz on air or water. So Nvidia playing with TFLOPS numbers by raising stock clocks upward, while still staying below that 2100 MHz barrier, for the RTX GeForce cards is basically just re-arranging deck chairs on the Titanic and isn't achieving any real performance increase.

Titan V's TFLOPS numbers are "rated" at 1450 MHz, but it can clock to 2 GHz+ like all of these other cards. I don't think anyone considers a cut-down die with fewer cores "rated" at 1750 MHz to be an actual performance improvement, since anybody with MSI Afterburner can just push up the core boost numbers. They need an actual clockspeed breakthrough above 2.2 GHz to make the RTX 2000 series interesting.
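
For reference, the "rated" TFLOPS figure is just 2 x CUDA cores x clock (GHz), so bumping the rated clock moves the paper number without telling you anything about actual boost headroom. Quick sketch - the Titan V figures are the real ones, the "hypothetical card" core counts and clocks are placeholders, not confirmed specs:

```python
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS = 2 ops per core per clock (FMA) x cores x clock."""
    return 2 * cuda_cores * clock_ghz / 1000.0

# Titan V: "rated" at its ~1455 MHz boost clock vs. pushed to ~2.0 GHz yourself.
print(fp32_tflops(5120, 1.455))   # ~14.9 TFLOPS on the spec sheet
print(fp32_tflops(5120, 2.0))     # ~20.5 TFLOPS once you raise the boost anyway

# Hypothetical cut-down card (placeholder core count): rating it 200 MHz higher
# lifts the headline number, but says nothing about the actual clock ceiling.
print(fp32_tflops(4352, 1.55))    # ~13.5 TFLOPS "rated"
print(fp32_tflops(4352, 1.75))    # ~15.2 TFLOPS "rated" - same silicon either way
```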
 