Ray Tracing - Game Changer or Overhyped?

  • Yes - Game Changer for sure
    Votes: 118 (46.6%)
  • No - Overhyped, not what was intended
    Votes: 135 (53.4%)
  • Total voters: 253
Still need to factor in overclocking, general tweaking, driver updates, game patches, and SLI...as all of these things can and should help RT performance.
 
Gamechanger when it can be achieved at 4K at 60 fps. They have to start out somewhere, so I'm not really surprised this first iteration of it is on the slow side.
 
Gamechanger when it can be achieved at 4K at 60 fps. They have to start out somewhere, so I'm not really surprised this first iteration of it is on the slow side.

How complacent have gamers become when real time raytracing is considered "on the slow side"?

[attached meme: "my generation, the new generation, the entitlement generation needs to..."]
 
So 3 out of 4 developers are struggling to get 60fps at 1080p, including one developer who has been working on RTX since January, was involved in the demo in March, and is aiming to have their game running at 60fps @ 1080p with ray tracing enabled for its release next February.

And you have one developer who, after 3 days, is getting 90-130fps at 4K.

One of these results isn't using real-time ray tracing.

Did you just blanket-compare different engines?
 
To be sure, we just have to wait for benchmark testing, not Nvidia's claims. It looks good on video shot at 1080p, but what about 4K UHD, where it will push the card to its max? Then, oh boy, it looks beautiful... then CTDs because of the bugs?

Benchmarks will be out before anything ever ships. You’d have to ignore all the tech sites to buy it without benchmarks heh.

I think I heard ray tracing won’t even work until a DX update in October. I plan to wait until I see which AIB card is the best for power limit and then maybe buy something.

People are being pessimistic based on prerelease nVidia drivers utilizing prerelease DX drivers on prerelease games for a whole new tech. The problem is these people chant their opinions as fact.
 
How complacent have gamers become when real time raytracing is considered "on the slow side"?

Well, considering they're pushing 4K gaming at over 60fps on the same card, then yes, the current ray tracing implementation would be considered slow. That's just the way it is.
 
Been gaming on 4K for about a year. Even with a 1080 Ti, 4K max settings is not possible in every game (Deus Ex, for example).

Funny, because many people were previously saying that 4K wasn't worth it; now all of a sudden people only care about 4K.
 
Been gaming on 4K for about a year. Even with a 1080 Ti, 4K max settings is not possible in every game (Deus Ex, for example).

Funny, because many people were previously saying that 4K wasn't worth it; now all of a sudden people only care about 4K.


Because that's what the tech industry has been pushing for a few years now. Same as when Eyefinity and Surround were being pushed heavily, plenty of people (me included) had one of those setups.
 
The hardest thing for me is grasping the trade-offs made to call this ray tracing. The whole bottom/top-level acceleration structure makes me feel this is still in the environmental-light arena compared to recursive sampling from the camera's point of view.

Tracing a ray from one source to another is technically ray tracing; this just may be a bit different from a camera-centric algorithm.
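For anyone wondering what that bottom/top structure actually refers to: in DXR you build bottom-level acceleration structures over the raw geometry and a top-level structure over placed instances of them, and rays traverse both levels. A toy sketch of that traversal (Python; every name and the one-box-per-mesh simplification are illustrative, not any real API):

Code:
from dataclasses import dataclass
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class AABB:
    """Axis-aligned bounding box."""
    lo: Vec3
    hi: Vec3

    def hit(self, origin: Vec3, inv_dir: Vec3) -> bool:
        # Standard slab test: does the ray cross this box?
        tmin, tmax = 0.0, float("inf")
        for o, inv, lo, hi in zip(origin, inv_dir, self.lo, self.hi):
            t0, t1 = (lo - o) * inv, (hi - o) * inv
            if t0 > t1:
                t0, t1 = t1, t0
            tmin, tmax = max(tmin, t0), min(tmax, t1)
        return tmin <= tmax

@dataclass
class BLAS:
    """Bottom level: built per mesh, owns the actual geometry. A real
    BLAS is a BVH over triangles; a single box stands in here."""
    name: str
    bounds: AABB

@dataclass
class Instance:
    """Top-level entry: a BLAS placed in the world (transform omitted)."""
    blas: BLAS
    world_bounds: AABB

def trace(tlas: List[Instance], origin: Vec3, direction: Vec3) -> Optional[str]:
    """Walk the top level, then descend into bottom levels the ray
    touches. Real traversal returns the closest triangle hit."""
    inv = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
    for inst in tlas:                                  # top-level walk
        if inst.world_bounds.hit(origin, inv):
            if inst.blas.bounds.hit(origin, inv):      # bottom-level walk
                return inst.blas.name
    return None

# Two-level scene: one "crate" mesh instanced once, one ray aimed at it.
crate = BLAS("crate", AABB((0, 0, 5), (1, 1, 6)))
scene = [Instance(crate, AABB((0, 0, 5), (1, 1, 6)))]
print(trace(scene, (0.5, 0.5, 0.0), (0.0, 0.0, 1.0)))  # -> crate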
 
The hardest thing for me is grasping the trade-offs made to call this ray tracing. The whole bottom/top-level acceleration structure makes me feel this is still in the environmental-light arena compared to recursive sampling from the camera's point of view.

Tracing a ray from one source to another is technically ray tracing; this just may be a bit different from a camera-centric algorithm.

It's a start. Everything has to have a start. That old V180 cell phone progressed into a Galaxy S9 eventually. Evolution of technology.
 
Ray-tracing has many flavors now, and actually did earlier: camera, light, hybrid, etc. So for anyone other than a 1990s purist, it fits.
 
The hardest thing for me is grasping the trade-offs made to call this ray tracing. The whole bottom/top-level acceleration structure makes me feel this is still in the environmental-light arena compared to recursive sampling from the camera's point of view.

Tracing a ray from one source to another is technically ray tracing; this just may be a bit different from a camera-centric algorithm.

This is actually a good thing!

The hybrid rendering technique is really going to be necessary to push the benefits of ray tracing down the product stack quickly.

Also, since we're already very good at raster graphics, I suspect that using ray tracing hardware to inform raster shaders will be a shortcut that produces a performant compromise until full global illumination is possible at speed with future ~1050/560-range cards.
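To make that "ray tracing informing raster shaders" idea concrete, here is a hedged sketch of the data flow; every function is a hypothetical stub standing in for a real engine stage, and the point is the plumbing, not the math: raster builds the G-buffer as usual, a small ray budget covers only reflections and shadows, the noisy results get denoised, and ordinary deferred shading composites them.

Code:
def rasterize_gbuffer(scene, camera):
    # Stub: a real raster pass writes per-pixel position/normal/albedo.
    return {"albedo": 0.8}

def trace_reflection_rays(scene, gbuffer, rays_per_pixel):
    # Stub: RT hardware would return sparse reflected radiance here.
    return {"radiance": 0.2, "noisy": True}

def trace_shadow_rays(scene, gbuffer, rays_per_pixel):
    # Stub: one visibility ray toward the light per pixel.
    return {"visibility": 1.0, "noisy": True}

def denoise(buf, guide):
    # Stub: AI/temporal denoisers clean up the sparse ray results.
    return {**buf, "noisy": False}

def shade(gbuffer, reflections, shadows):
    # Deferred shading consumes the ray outputs exactly where it would
    # otherwise use screen-space hacks or shadow maps.
    return gbuffer["albedo"] * shadows["visibility"] + reflections["radiance"]

def render_frame(scene, camera):
    gbuffer = rasterize_gbuffer(scene, camera)   # classic raster pass
    refl = denoise(trace_reflection_rays(scene, gbuffer, 1), gbuffer)
    shad = denoise(trace_shadow_rays(scene, gbuffer, 1), gbuffer)
    return shade(gbuffer, refl, shad)

print(render_frame(scene=None, camera=None))     # -> 1.0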

I really expect this is the way Sony and MS will want to go with AMD hardware: while I fully believe that AMD could produce a console APU at volume by 2020 that could do full RT at 4k60 or better, I don't think Sony or MS will want to spend for the per-unit cost :D.
 
If Nvidia has been working on this for 10 years, I'm a little worried about if or when AMD can catch up. Doing that for the next consoles seems overly optimistic.
 
If Nvidia has been working on this for 10 years, I'm a little worried about if or when AMD can catch up. Doing that for the next consoles seems overly optimistic.

Everybody has been working on Ray Tracing for years. Remember, Intel's big thing with Larrabee was supposed to be Ray Tracing. I believe the ATI 2900 XT cards were used to draw trailers for the first Transformers movie in real time.

They also have been working on Radeon Rays, and Radeon Rays 2.0 was released last March.

Lastly, AMD has traditionally been strong in Compute performance, which should help with Ray Tracing.
 
Much as I don't like how they are going about this gen, I do think that ray tracing will be a game changer in 2 to 3 gens' time. The only thing that's holding it back is that we are all trying to move up to 4K, and to get OK performance with RTX it appears 1080p and below is a requirement for modern games.
BFV looked great with the reflections, and who doesn't want ray tracing? I just don't know if I can stomach turning the res down from 4K just to get it.
 
Everybody has been working on Ray Tracing for years. Remember, Intel's big thing with Larrabee was supposed to be Ray Tracing. I believe the ATI 2900 XT cards were used to draw scenes from the first Transformers movie in real time.

They also have been working on Radeon Rays, and Radeon Rays 2.0 was released last March.

Lastly, AMD has traditionally been strong in Compute performance, which should help with Ray Tracing.

Makes me wonder about that Intel 2020 Dedicated GPU.
 
Not as much of a fail as not having raytracing hardware... the train is starting to go now... and again, I amuse myself with the influx of new posters at every launch... always very opinionated in an almost predictable manner ;)

Sure, us new posters can't have an opinion if it doesn't mirror the old posters'.. makes sense.

And while the train might be starting now.. I can board it at every step of the way.. I don't magically miss out on anything.. at all.

And not having real-time raytrace hardware is a fail? Damn, it seems like I am a proud failure.

Now: what I don't get is how the greatest RL-atheist has fallen for the nVidia gospel.. that is a riddle worth solving...
 
Ray tracing holds great promise, but we are several generations away from realizing it. Nvidia really needs to work on their upscaling if they want us to drop to 1080p to use ray tracing. Integer scaling on 4K displays has been a requested feature for years for good reason, and checkerboard rendering needs better support on the PC.
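For the unfamiliar: integer scaling just means duplicating each source pixel by a whole factor, nearest-neighbour style, so 1080p maps onto 4K with zero interpolation blur. A quick NumPy sketch of the idea (the function name is mine):

Code:
import numpy as np

# Integer scaling: each source pixel becomes an exact factor-by-factor
# block, so 1920x1080 lands on 3840x2160 with no interpolation blur.
def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # H x W x RGB
print(integer_scale(frame_1080p, 2).shape)  # (2160, 3840, 3)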

Right now, ray tracing will be something you won't even notice once the game is in motion. Huge fps hit for not all that much visual improvement over faked reflections and shadows.

That said, I still want to see what CD Projekt Red can do with the tech; by the time Cyberpunk 2077 is released they might have implemented it, and it might fit that game very well.
 
When you start talking about 'Nvidia gospel', you've already outed your bias.

It ain't because you're new.

He specifically called him out for being new, dude. Nothing wrong with having a bias. Something is definitely wrong with making new posters feel unwelcome.
 
So, the 2080ti has 21% more cores than the 1080ti, and the 2080 has 15% more cores than the 1080.

Logic says, according to nVidia's 2080:1080 charts, the 2080ti should be ~50% faster than the 1080ti (the 2080 was 47% faster than the 1080), and more than 2.1x with DLSS.

That's my story and I am sticking with it, since we have no other data. Bahahhaha
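For what it's worth, one way to read that logic as back-of-envelope arithmetic (using only the claimed figures above, so a guess, not a prediction):

Code:
# Back-of-envelope extrapolation; 1.47x and the core counts are the
# claimed/quoted figures from the post above, not measured results.
speedup_2080_vs_1080   = 1.47   # nVidia's claimed 2080:1080 uplift
cores_2080_vs_1080     = 1.15   # 15% more cores
cores_2080ti_vs_1080ti = 1.21   # 21% more cores

# Per-core uplift implied by the 2080 numbers...
per_core = speedup_2080_vs_1080 / cores_2080_vs_1080      # ~1.28x

# ...applied to the 2080 Ti's larger core advantage.
print(f"{per_core * cores_2080ti_vs_1080ti:.2f}x")        # ~1.55x

Which lands at roughly 1.55x, in the same ballpark as the ~50% guess.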
 
It makes the games look better? But I don't think it will be a game changer. The graphics will be prettier, if you even stop to notice with bullets flying everywhere and explosions going off.
 
So, the 2080ti has 21% more cores than the 1080ti, and the 2080 has 15% more cores than the 1080.

Logic says, according to nVidia's 2080:1080 charts, the 2080ti should be ~50% faster than the 1080ti (the 2080 was 47% faster than the 1080), and more than 2.1x with DLSS.

That's my story and I am sticking with it, since we have no other data. Bahahhaha

The 20-series cards do have double the L2, and their memory interface is basically 1080ti-class.

The segments are so bucketized, it's like a carnival game.
 
Well, considering they're pushing 4K gaming at over 60fps on the same card, then yes, the current ray tracing implementation would be considered slow. That's just the way it is.
You're comparing apples to skyscrapers.
 
Moving to raytracing in video games is a natural evolution, and, in the absence of a third alternative, inevitable. There has been talk of it for a long time now but no hardware for it. Aside from performance (for the time being) raytracing is superior to rasterization in every way - just look at CG in movies compared to video games. And I'm not talking about fully animated movies, I'm talking about the stuff you don't even realize is there because it's so realistic. Eventually the tech will be doing that in real-time.

It doesn't look all that impressive right now, but a lot of first gen hardware technologies don't.
Movies are vastly different from video games. The cgi you don't realize is there is only the background decoration, which is usually made through photogrammetry. If it were a game and you could actually go close to those backgrounds, it would stand out like a sore thumb. The methods that work in movies would be useless for gaming in that form, so there is no point in doing comparisons with movies.

As for foreground objects like monsters and characters, they're still unable to make cgi look anything close to realistic. I wish movies still used practical effects for everything feasible, but cgi is their go-to thing now. And the results usually range from terrible to acceptable with a grain of salt. And sometimes it would be easier to just put a person in a costume and put LED lights on it, but they do everything with cgi nowadays, and I think it is terrible.

As for this gimmick? I'd wait and see how much difference it makes to the looks in actual released games, and whether you'll accept a slideshow for that privilege.
 
Well, considering they're pushing 4K gaming at over 60fps on the same card, then yes, the current ray tracing implementation would be considered slow. That's just the way it is.

We can say this, we just need to put it in context: ray tracing performance will absolutely be slow relative to raster framerates. But we can actually run it in modern games!

That second point was thought to be beyond reach until now. Getting it playable, let's say maximum frametimes corresponding to 40 FPS at 1080p, is a massive breakthrough, and means that those with the means can actually experience this first-hand.
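For reference, the frametime budget implied by a framerate target is just 1000/fps milliseconds, so a 40 FPS floor means no frame over 25 ms:

Code:
# Frametime budget implied by a target framerate: t_ms = 1000 / fps.
for fps in (30, 40, 60):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
# 30 FPS -> 33.3 ms, 40 FPS -> 25.0 ms, 60 FPS -> 16.7 ms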
 
Movies are vastly different from video games. The cgi you don't realize is there is only the background decoration, which is usually made through photogrammetry.
People actually underestimate how much movies (well, action movies) are nearly entirely cg these days. See this for example, including many foreground elements.

 
It makes the games look better? But I don't think it will be a game changer. The graphics will be prettier, if you even stop to notice with bullets flying everywhere and explosions going off.

But when you get hit by the shiny ray traced claymore pellets, you’ll know where that extra money went.

Honestly I think it will be pretty cool, but I get what you're saying. I just think it will have a much bigger impact on lighting, which is one of my favorite features of games when done well. I'd rather use computation for this than something half-assed like bad/slow AO techniques.
 
People actually underestimate how much movies (well, action movies) are nearly entirely cg these days. See this for example, including many foreground elements.
I never said there wasn't cgi, I said it looks awful, and I wish they used practical effects instead for foreground objects and characters. Like, Iron Man's suit looks exactly like it came out of the game Mass Effect: plastic and fake. Or Ultron in the previous Avengers movie: a plastic, fake-looking robot with over-exaggerated human-like animations. It was just painful to watch. CGI in these movies is like taking my immersion out the back door and shooting it in the face, and then in the neck.
 
I never said there wasn't cgi, I said it looks awful, and I wish they used practical effects instead for foreground objects and characters. Like, Iron Man's suit looks exactly like it came out of the game Mass Effect: plastic and fake. Or Ultron in the previous Avengers movie: a plastic, fake-looking robot with over-exaggerated human-like animations. It was just painful to watch. CGI in these movies is like taking my immersion out the back door and shooting it in the face, and then in the neck.
I can see this in some movies. An example for me is the original Clash of the Titans. I loved it, and when the new ones came out, they tried to substitute graphics and effects for storyline.
 
I read a brief article yesterday that mentioned ray tracing needs lots of CPU cores to process efficiently. This could be a bigger boon for high-core-count platforms than for the single-thread, high-IPC philosophy commonly applied to gaming builds today.

Maybe that 2950X TR will smoke the 9700K with ray tracing enabled? If so, then, finally: game changer.

So, will [H] test higher-core-count CPUs along with higher-clocked CPUs on the 2080Ti?
 
I think it will be a game changer 3-5 generations down the road, when we have big 1100mm²+ dies, once the race to sub-5nm stops being viable and we have to go back to massive dies to get huge performance increases. I am a bit concerned about how devs will try to cheat and lock the FoV to nasty, migraine-inducing console options of like 50-60° instead of the nice wide ones we can get with most games now. Unless they have changed it, I had to skip the Metro series since the insanely narrow FoV would make me super sick in under 20 minutes.
 