Ray Tracing - Game Changer or Overhyped?


  • Yes - Game Changer for sure: 118 votes (46.6%)
  • No - Overhyped, not what was intended: 135 votes (53.4%)
  • Total voters: 253
I've been excited for ray tracing for years now... Intel was pushing it pretty hard with their GPU venture as well, were they not? Is this NV trying to get ahead of Intel? :O
 
The hardest thing for me is grasping the trade-offs made to call this ray tracing. The whole bottom-level/top-level acceleration structure makes me feel this is still in the environmental-light arena, compared to recursive sampling from the camera's point of view.

Tracing a ray from one source to another is technically ray tracing; this may just be a bit different from a camera-centric algorithm (minimal sketch of the camera-centric version below).
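
For anyone wondering what "camera-centric" means here, a minimal Python sketch (all names, including the intersect() stub, are hypothetical placeholders, not any real engine or API): rays start at the camera, and each hit spawns a recursive reflected ray.

```python
# Minimal sketch of "camera-centric" recursive ray tracing.
# intersect() is an empty placeholder -- a real tracer would test scene geometry.

def intersect(origin, direction):
    """Placeholder: return (hit_point, normal, surface_color) or None."""
    return None  # no scene here

def reflect(direction, normal):
    d = sum(di * ni for di, ni in zip(direction, normal))
    return tuple(di - 2.0 * d * ni for di, ni in zip(direction, normal))

def trace(origin, direction, depth=0, max_depth=3):
    """Follow one camera ray; recurse along the reflected ray on each hit."""
    if depth >= max_depth:
        return (0.0, 0.0, 0.0)                      # stop bouncing
    hit = intersect(origin, direction)
    if hit is None:
        return (0.2, 0.3, 0.5)                      # "sky" color
    point, normal, color = hit
    bounced = trace(point, reflect(direction, normal), depth + 1, max_depth)
    return tuple(0.8 * c + 0.2 * b for c, b in zip(color, bounced))

# One ray per pixel, fired from the camera through an image plane.
width, height = 4, 3
image = [[trace((0.0, 0.0, 0.0),
                (x / width - 0.5, y / height - 0.5, 1.0))
          for x in range(width)]
         for y in range(height)]
```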

You essentially need an acceleration structure to do anything meaningful. Brute force has every pixel checking every triangle in the scene for intersection.

An acceleration structure can turn that into checking against a handful of boxes, and only if you hit one of those boxes do you work your way down the hierarchy. Potentially you can skip gigantic amounts of work (rough sketch below).
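
To make that concrete, here's a toy Python sketch (my own simplification; nothing like the actual RT-core or DXR traversal logic): one cheap slab test against an axis-aligned bounding box decides whether a whole group of triangles even needs the exact per-triangle test.

```python
import math

def ray_hits_aabb(origin, direction, box_min, box_max):
    """Slab test: does the ray hit the axis-aligned bounding box at all?"""
    t_near, t_far = -math.inf, math.inf
    for axis in range(3):
        if abs(direction[axis]) < 1e-12:
            # Ray is parallel to this pair of slabs: miss if origin is outside them.
            if not (box_min[axis] <= origin[axis] <= box_max[axis]):
                return False
        else:
            t0 = (box_min[axis] - origin[axis]) / direction[axis]
            t1 = (box_max[axis] - origin[axis]) / direction[axis]
            t0, t1 = min(t0, t1), max(t0, t1)
            t_near, t_far = max(t_near, t0), min(t_far, t1)
    return t_near <= t_far and t_far >= 0.0

def candidate_triangles(origin, direction, nodes):
    """Brute force would test every triangle for every ray. With boxes, whole
    groups of triangles are skipped after a single cheap test."""
    survivors = []
    for box_min, box_max, triangles in nodes:   # one "level" of boxes
        if not ray_hits_aabb(origin, direction, box_min, box_max):
            continue                            # skip everything inside this box
        survivors.extend(triangles)             # only these need the exact test
    return survivors

# Example: two boxes of 1000 triangles each; a ray aimed at one box never
# has to look at the other 1000 triangles.
nodes = [((0, 0, 5), (1, 1, 6), list(range(1000))),
         ((10, 10, 5), (11, 11, 6), list(range(1000, 2000)))]
print(len(candidate_triangles((0.5, 0.5, 0.0), (0.0, 0.0, 1.0), nodes)))  # -> 1000
```

A real BVH nests these boxes several levels deep, so most rays touch only a handful of nodes before they ever reach actual triangles.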
 
Something we should all keep in mind: Nvidia isn't raytracing all those reflections and such. It is doing a fraction of the work and then denoising the heck out of that grainy product. Their raytraced image, before post-processing, looks like an incomplete mess, and an expensive-to-process one at that. I don't mean to cheapen the results, but you're not seeing fully raytraced effects; that would take way too long to render in real time.

The great invention this generation is a world-class denoiser (crude illustration of the idea below). The good part is it can only get better. I'll skip Turing, but I'm looking forward to the 7nm generation.
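
Just to illustrate the principle (this is a crude box filter I made up, nowhere near the AI/temporal denoisers Nvidia is actually using): start from a grainy few-samples-per-pixel buffer and smooth it into something presentable.

```python
import random

def noisy_render(width, height, true_value=0.5, samples_per_pixel=1):
    """Stand-in for a 1-spp ray-traced image: correct on average, grainy per pixel."""
    img = []
    for _ in range(height):
        row = []
        for _ in range(width):
            s = sum(random.uniform(0.0, 2.0 * true_value) for _ in range(samples_per_pixel))
            row.append(s / samples_per_pixel)
        img.append(row)
    return img

def box_denoise(img, radius=1):
    """Average each pixel with its neighbours -- the crudest possible denoiser."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - radius), min(h, y + radius + 1))
                    for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out

grainy = noisy_render(64, 64)   # the "incomplete mess"
clean = box_denoise(grainy, radius=2)
```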
 
The first iteration of T&L on the GeForce 256 is why the RTX 2080 Ti is called a GPU today. This will evolve, perhaps, like you've said, 3-5 years from now into the true game changer. The evolution of this card will undoubtedly change the game IMO, whether it's 3, 5, or 8 years from now, when ray tracing is fully capable on mid-grade GPUs and up. I personally (and don't flame me for this, lol) think that consoles like the Xbox held us back visually on PC. I think we should be at least 3-5 years into ray tracing already... which is where we were heading. At least Nvidia got the ball rolling, though. Intel had the inkling with Larrabee, which was good, but it ended up fizzling. Will Intel's mystery dedicated 2020 GPU do real-time ray tracing? We shall see... but I feel it will. And competition is always a good thing; it makes companies strive harder and push the envelope more, which will be a great thing for us consumers. Hopefully AMD is cooking something up also.


https://www.techpowerup.com/68545/quake-4-run-ray-tracing-enabled-on-intel-larrabee

An old article like that makes you get all fuzzy.



Noticed it said linear scaling with cores, etc. That's what we all strive for and want.
 
The first iteration of T&L on the GeForce 256 is why the RTX 2080 Ti is called a GPU today. This will evolve, perhaps, like you've said, 3-5 years from now into the true game changer. The evolution of this card will undoubtedly change the game IMO, whether it's 3, 5, or 8 years from now, when ray tracing is fully capable on mid-grade GPUs and up. I personally (and don't flame me for this, lol) think that consoles like the Xbox held us back visually on PC. I think we should be at least 3-5 years into ray tracing already... which is where we were heading. At least Nvidia got the ball rolling, though.

Yeah, and we will need a 60-month loan to buy one.
 
Yeah, and we will need a 60-month loan to buy one.

Maybe true. I do think this first gen is very expensive... but I think it goes in line with the economics. The economy and consumer confidence levels are way, way up at the moment, and there are lots of jobs. So companies are charging more, simple as that, since there's more potential cash flowing around.
 
Maybe true. I do think this first gen is very expensive... but I think it goes in line with the economics. The economy and consumer confidence levels are way, way up at the moment, and there are lots of jobs. So companies are charging more, simple as that, since there's more potential cash flowing around.

I'll stick with my 1080 Ti Strix and wait for AMD and 7nm.

I'm not pre-paying for their R&D.

RT is a toy at this point and won't be used in any more games than they've already announced by the time this gets replaced in a year with 7nm.
 
You do make some good points. It just depends on how many enthusiasts want to adopt early tech. I know a lot of computer guys are clamoring for something like this because it gives them an edge over consoles and other GPUs at the moment.
 
You do make some good points. It just depends on how many enthusiasts want to adopt early tech. I know a lot of computer guys are clamoring for something like this because it gives them an edge over consoles and other GPUs at the moment.
Since all we get now are console ports, it will not go mainstream until AMD gives the consoles RT.
 
Game changer, but not yet. Maybe in 1-2 generations it will be standard, with hardware that can push it properly at 1440p and above.
 
Not really looking at the tech but at the implementations we've been shown, I say overhyped. Every demo I've seen looks like they went over every surface with a high-gloss wax finish. It breaks whatever "realism" RT is supposed to bring to the scene and, IMHO, looks incredibly tacky.
 
Not really looking at the tech but at the implementations we've been shown, I say overhyped. Every demo I've seen looks like they went over every surface with a high-gloss wax finish. It breaks whatever "realism" RT is supposed to bring to the scene and, IMHO, looks incredibly tacky.

I think it's more along the lines of calculated reflections and refraction.
 
Not really looking at the tech but at the implementations we've been shown, I say overhyped. Every demo I've seen looks like they went over every surface with a high-gloss wax finish. It breaks whatever "realism" RT is supposed to bring to the scene and, IMHO, looks incredibly tacky.

Could you post pictures/videos? You seem to be seeing something different than I am. :)
 
So should the Titan V really perform well with ray tracing? Also, how does the CPU compare to the GPU in ray tracing?
 
IIRC the Titan V doesn't have the guts to natively ray trace. I could be wrong...

As far as CPU vs. GPU, you're not getting real-time ray tracing from generic hardware; the hardware doing this is specialized.
 
IIRC the Titan V doesn't have the guts to natively ray trace. I could be wrong...

As far as CPU vs. GPU, you're not getting real-time ray tracing from generic hardware; the hardware doing this is specialized.

You are correct. DICE developed RT on Titan Vs, and they said it did not have the hardware and was very slow. They only had 2080 Tis with actual RT hardware for two weeks before Gamescom.
 
I need to see some side-by-side gameplay with and without ray tracing, to see if it really is worthwhile.
 
I need to see some side-by-side gameplay with and without ray tracing, to see if it really is worthwhile.

There is a Battlefield 5 video that shows it side by side. I personally thought the difference was quite impressive.
 
Looking at some of the specific reflections in glass and other surfaces, it does look good and solves an issue I've noticed with glass surfaces. But the number of cycles needed to do that right is, I bet, well... insane.
 
I wonder who will pull ahead CPU-wise. We have monster CPUs from both, with a lot of cores on AMD's side. RT seems to like cores/threads for seamless gameplay.
 
This is a MASSIVE game changer. The difference in quality is absolutely astounding. Unless NVIDIA somehow screws this up big time, the only question is how fast the changeover happens for all upper-end 3D games. As good as this looks? I'd say 1-3 years likely, and 5 at the outside, until all gamers expect and demand this tech in one form or another be used for lighting.

The 20 series cards may not have enough power to give us the hybrid ray tracing at 4k 60 this time around, but it can't possibly be more than 1 or 2 gens away now. Gamers will demand this.

There were some RTX on and off comparisons during the presentation, but I was disappointed, not with the difference, but with the unfortunately lousy scenes they used. A lot of them were very dark to try to show all the bounced lighting being calculated, but for me the most brain-pleasing scenes were the brighter ones in indoor and outdoor areas. I was really impressed with the BF5 outdoor scene at the canals, looking across the water at the buildings. When they turn the ray tracing on, your brain goes, "Whoa, what just happened? The scene only changed a little, but it's RADICALLY more real." If you look at the buildings and walls and windows far across the water, they all suddenly bounce light off the siding and the WINDOWS in a real way. The light on each side of each building is correct, and more importantly, the windows are bouncing the correct type of reflection for where you are standing, which makes all the difference in the world to the realism of the scene.

And all that is just the open-world advantages. In any kind of indoor enclosed environment even brightly lit it's a massive difference.
 
This is like the first video cards with tessellation. The performance will be garbage for several generations until both the software AND the hardware improve. For now, part of the die is nearly useless to gamers, unless you like playing at 1080p 30-60 fps again... on a gaming rig that costs thousands of dollars.
 
The first iteration of T&L on the GeForce 256 is why the RTX 2080 Ti is called a GPU today. This will evolve, perhaps, like you've said, 3-5 years from now into the true game changer.
Except you didn't have to go back to 320x200 to enjoy HW T&L on it. T&L was a game changer then and there, not 3-5 years later. At this point this seems like a distraction. From what, you may ask? We'll find out soon when independent benchmarks on public drivers start to come in.
 
Except you didn't have to go back to 320x200 to enjoy HW T&L on it. T&L was a game changer then and there, not 3-5 years later. At this point this seems like a distraction. From what, you may ask? We'll find out soon when independent benchmarks on public drivers start to come in.

Where does Raytracing seem like a "distraction"? Video/images, please?
 
Where does Raytracing seem like a "distraction"? Video/images, please?
What? You seem to have missed the point entirely.

A distraction from the actual performance of the new cards without it. Why do you think I mentioned independent benchmarks?
 
Over time they'll find ways to make the performance hit come down big time; for this generation, RT won't have that much of an impact.
 
I haven't seen this posted here yet, which is kind of surprising. I got way more excited for ray tracing after watching it.



Basically they are at 40-50 Hz for 1440p and think at least another 30% is easy (52-65 Hz). They are also looking into splitting the RT resolution off and making it selectable, so you can render at 4K and just do RT at 1080p. Then you can have your high Hz and still get way better visuals (rough sketch of the idea below). Also, DICE only had the cards for two weeks before Gamescom.
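
Rough sketch of what "render at 4K, RT at 1080p" could look like (purely my own toy illustration with made-up buffers; not DICE's code): compute the ray-traced layer at a lower resolution, upscale it, and blend it over the full-resolution raster image.

```python
def upscale_nearest(low, out_w, out_h):
    """Nearest-neighbour upscale of a low-res 2D buffer to out_w x out_h."""
    low_h, low_w = len(low), len(low[0])
    return [[low[y * low_h // out_h][x * low_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]

def composite(raster_full, rt_low, blend=0.5):
    """Blend the upscaled RT buffer over the full-resolution raster image."""
    h, w = len(raster_full), len(raster_full[0])
    rt_up = upscale_nearest(rt_low, w, h)
    return [[(1.0 - blend) * raster_full[y][x] + blend * rt_up[y][x]
             for x in range(w)]
            for y in range(h)]

# Example: tiny stand-in buffers (real sizes would be 3840x2160 and 1920x1080).
raster = [[0.8] * 8 for _ in range(4)]   # full-resolution raster layer
rt     = [[0.2] * 4 for _ in range(2)]   # lower-resolution ray-traced layer
final  = composite(raster, rt, blend=0.3)
```

Nearest-neighbour is the dumbest possible upscale; the point is only that the expensive ray-traced work gets done at a quarter of the pixels.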
 
Except you didn't have to go back to 320x200 to enjoy HW T&L on it. T&L was a game changer then and there, not 3-5 years later. At this point this seems like a distraction. From what, you may ask? We'll find out soon when independent benchmarks on public drivers start to come in.

Let's assume that's true and Nvidia invested years of research and millions of dollars and spent a bunch of transistors on adding RT hardware just to distract from poor performance in today's games. Why didn't they save themselves a ton of trouble and just throw some more shader cores in there instead of wasting time on RT? Granted, we don't really know how much transistor budget RT actually uses. It could potentially be very cheap since it's custom logic.

Either way I really hope Turing delivers the rasterization goods and AMD quickly follows with RT hardware of their own. Then we can move on from all the mental gymnastics required to see the development of proper real time GI as anything but awesome and start arguing over which company has "real" RT :)
 
Let's assume that's true and Nvidia invested years of research and millions of dollars and spent a bunch of transistors on adding RT hardware just to distract from poor performance in today's games. Why didn't they save themselves a ton of trouble and just throw some more shader cores in there instead of wasting time on RT? Granted, we don't really know how much transistor budget RT actually uses. It could potentially be very cheap since it's custom logic.

Either way I really hope Turing delivers the rasterization goods and AMD quickly follows with RT hardware of their own. Then we can move on from all the mental gymnastics required to see the development of proper real time GI as anything but awesome and start arguing over which company has "real" RT :)

They could have put in 50%+ more CUDA cores....

The video I posted right before your post sold me a lot more. Reflections off everything (windows, water, walls, vehicles, your gun barrel, etc.) will, I think, greatly increase immersion for me.
 
Obviously a long-term must-have.

Some might think it would have made more sense for NVidia to wait for 7nm and more spare transistors to introduce it, but really it makes sense for them to push the first generation now, so when the 7nm cards arrive there will be lots of actual Ray Tracing software to run.

I think it's a great move.

While I am not a huge VR fan, eventually I see the combo of VR and Raytracing as dynamite.

While the effects of realistic shadows and reflections may be subtle when you are playing on 24" monitor, they should be much more profound when in VR space.
 
RT seems to like cores/threads for seamless gameplay.

The mention was targeting your basic pre-Coffee Lake i7 at 4C/8T; while ray tracing is inherently parallel, it's hard to imagine the CPU needs growing much. All of the heavy work should be accelerated on the GPU.
 
they should be much more profound when in VR space.

I think that, along with HDR support, good ray tracing will be much more necessary to complete the VR 'experience'.

Well, I'll clarify: it always has been necessary; it's just now actually attainable!
 
What? You seem to have missed the point entirely.

A distraction from the actual performance of the new cards without it. Why do you think I mentioned independent benchmarks?

I know why NVIDIA is talking Raytracing... just like they talked CUDA at the launch of the G80. (If you look back at the G80, it kinda seems like this was their plan all along... CUDA -> RT.)
If you think that implies the rasterization performance is "lacking," I think you will be in for a surprise.

Is it so hard to grasp that raytracing is THE holy grail in graphics and NVIDIA is proud that they are bringing it to the market so they focus on that...really?
 
Is it so hard to grasp that raytracing is THE holy grail in graphics and NVIDIA is proud that they are bringing it to the market so they focus on that...really?

I don't think anyone is really questioning NV's focus on Raytracing. It's the occlusion of the other metrics that is suspect; that's what M76 was getting at.

RTX is all exciting until you realize very few games will use it in the next 12 months (AKA the main timeframe when Nvidia will sell 20 series before moving on to what's next), and only for certain effects at that. Therefore, rasterization performance is still a much more important topic until raytracing becomes more prevalent. Of course RTX is the new shiny stuff, but people are right to doubt and criticize NV for not sharing the values that actually matter for now: raster performance. I'm sure in the next 10 years the raster/raytrace balance will shift, and raytracing performance then could and should take precedence.

In the meantime, it's not very relevant to actual game performance = habitual NV shadiness. I'll be glad to see the 20 series kick butt, but even then, you can't ignore that it'll be a short-lived generation. 7nm is in production and around the corner, and that's where the real benefits lie.
 