Ray Tracing - Game Changer or Overhyped?

  • Yes - Game Changer for sure: 118 votes (46.6%)
  • No - Overhyped, not what was intended: 135 votes (53.4%)
  • Total voters: 253
Would be cool if Borderlands 3 adopted ray tracing like they did when PhysX was released.
Then you could actually see the difference :cat::cat::smuggrin: With these realistic games it's give or take, really.
 
Ah, it is exhausting when people don't read posts properly... a) It's not my job to find proof for you; you can google things like anyone else. But since I'm nice, LMGTFY. Among many others, Anandtech said:

"Meanwhile AMD has also announced that they’re collaborating with Microsoft and that they’ll be releasing a driver in the near future that supports DXR acceleration. The tone of AMD’s announcement makes me think that they will have very limited hardware acceleration relative to NVIDIA, but we’ll have to wait and see just what AMD unveils once their drivers are available."

And b) I did not comment on the degree of AMD's participation in the DXR endeavor. I responded to the following comment:
View attachment 102277
The comment mentioned AMD's fault if they didn't know about DXR. To which I replied that, as I had already mentioned, they did know well in advance, since they were involved in its development. I didn't comment at all on how much AMD or NV contributed to DXR; I just emphasized that they knew and obviously weren't blindsided in some way - because otherwise we're implying that only NV is on the ball about new developments, which is not true. Are they the main driver? Yes. Are they the only driver? No.

Giant, zoomed images of your own post do not constitute a source. All you did was repeat the unsubstantiated statement I questioned, larger and in image form.

There is ZERO evidence AMD helped create DXR. Their collaboration is NOT "Just like NV".

All the actual evidence points to an NVidia and Microsoft partnership with AMD off to the side. It looks like AMD contributed just about nothing and has just about nothing related to ray tracing. All they have stated is that they will have driver support eventually.

GDC 2018, where DXR was released, was practically the NVidia ray tracing show. That isn't a coincidence.

In contrast, AMD at GDC 2018 seemed almost blindsided about ray tracing and said little more than some mention of a driver, and practically nothing since.
 
Giant, zoomed images of your own post do not constitute a source. All you did was repeat the unsubstantiated statement I questioned, larger and in image form. There is ZERO evidence AMD helped create DXR. Their collaboration is NOT "Just like NV".

Holy crap, dude. I took an image (oversized??? it looks normal on my screen) of the post I quoted and responded to because I didn't know how to quote both things properly; this was all I could think of. Apologies, your highness. And I did not image my source, I linked to my source(s, plural).

Fine if you want to say AMD didn't "create" DXR, but it still proves that MSFT collaborated with them, and therefore AMD knew about it. If Anandtech, or MSFT's own blog post mentioning AMD (which I'm not going to link to because you're a big boy/gal and can google things yourself), doesn't count, I don't know what you consider official.

It looks like AMD contributed just about nothing and has just about nothing related to ray tracing.

To quote your own logic and style: "There is ZERO evidence AMD" "contributed just about nothing". So there; good luck proving otherwise (you can suggest to your heart's content, though).

In contrast, AMD at GDC 2018 seemed almost blindsided about ray tracing and said little more than some mention of a driver, and practically nothing since.

Again, assumptions without proof of anything. AMD isn't working on any immediate hardware release, so you explain to me the business sense of talking about something that's not ready. This is starting to stink of NV fanboyism, so I'm going to leave this impasse at a disagreement and move on; I've got better things to do. Peace!
 
I strongly disagree. You think new consoles in 2020 won't have RTX 2070 equivalent performance?

Yup, they absolutely will.

Part of it is that the hardware and software will be well understood by that point, and part of it is that ray tracing can be used in various amounts alongside rasterization.

This means that ray tracing will both be a bit limited, as we'd expect, and that it will vary in level of implementation not just from engine to engine or game to game but, hell, scene to scene.

Which is great! It gives developers the flexibility to add visuals using available horsepower and to scale visuals with performance requirements.
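
To make that scene-to-scene scaling concrete, here's a rough C++ sketch of the kind of budget logic an engine could use. The struct, field names, and millisecond thresholds are all invented for illustration; no real engine's API looks exactly like this:

```cpp
#include <cstdio>

// Hypothetical per-scene ray tracing settings. Real engines expose similar
// knobs (per-effect toggles, ray counts), but these names are made up.
struct RayTracingConfig {
    bool  reflections       = false;  // ray traced reflections on/off
    bool  shadows           = false;  // ray traced shadows on/off
    float rtResolutionScale = 1.0f;   // fraction of native res for RT passes
};

// Pick an RT configuration from whatever GPU time is left over after
// rasterization. The thresholds are invented; the point is that RT cost
// can be dialed per scene instead of being all-or-nothing.
RayTracingConfig chooseConfig(float frameBudgetMs, float rasterCostMs) {
    RayTracingConfig cfg;
    const float headroomMs = frameBudgetMs - rasterCostMs;
    if (headroomMs > 6.0f) {            // lots of headroom: everything on
        cfg.reflections = cfg.shadows = true;
    } else if (headroomMs > 3.0f) {     // some headroom: cheaper RT
        cfg.reflections = true;
        cfg.rtResolutionScale = 0.5f;   // trace at half resolution
    }                                   // else: pure rasterization
    return cfg;
}

int main() {
    // A 60 fps frame (16.7 ms) where rasterization already costs 12 ms.
    RayTracingConfig cfg = chooseConfig(16.7f, 12.0f);
    std::printf("reflections=%d shadows=%d scale=%.2f\n",
                cfg.reflections, cfg.shadows, cfg.rtResolutionScale);
}
```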
 
Holy crap, dude. I took an image (oversized??? it looks normal on my screen) of the post I quoted and responded to because I didn't know how to quote both things properly; this was all I could think of. Apologies, your highness. And I did not image my source, I linked to my source(s, plural).

How it appears on Windows 7 without scaling: it looks about three times normal size. Like you are doing ALL CAPS shouting, but even bigger.

(screenshot attached: Capture.PNG)

Fine if you want to say AMD didn't "create" DXR, but it still proves that MSFT collaborated with them, and therefore AMD knew about it. If Anandtech, or MSFT's own blog post mentioning AMD (which I'm not going to link to because you're a big boy/gal and can google things yourself), doesn't count, I don't know what you consider official.

No, it doesn't prove that:

"Meanwhile AMD has also announced that they’re collaborating with Microsoft and that they’ll be releasing a driver in the near future that supports DXR acceleration."

Could mean: "we found out yesterday, and we are collaborating (vague) with MS going forward."

Given that MS DXR and NVidia RTX were announced together on the same day, at the same show, and that all of MS's demos were NVidia demos, it's clear who the real partners are.

AMD may be along for the ride, but they aren't among the drivers.
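
For what it's worth, "driver support for DXR" is something an application can actually query at runtime through D3D12's feature-check API, regardless of vendor. A minimal sketch (device creation and error handling omitted; assumes you already have an ID3D12Device*):

```cpp
#include <windows.h>
#include <d3d12.h>

// Ask the D3D12 runtime whether the installed driver/hardware exposes DXR.
// D3D12_FEATURE_D3D12_OPTIONS5 and D3D12_RAYTRACING_TIER ship with the
// Windows 10 October 2018 (1809) SDK headers.
bool SupportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    // TIER_NOT_SUPPORTED means no DXR path at all; TIER_1_0 means the
    // runtime accepts DXR calls, whether hardware- or driver-accelerated.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

So whenever AMD ships that driver, the same check lights up on their cards too; that's the point of DXR being a vendor-neutral API.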
 
How it appears on Windows 7 without scaling.

I never meant it to look gigantic; apologies for that. I'm on a 4K screen and forgot to take DPI into account - it's really easy not to think about it.

AMD may be along for the ride, but they aren't among the drivers.

Yes, I've been agreeing with you on this point the whole day. We're interpreting "collaboration" in different ways (and that's OK btw, we don't have to agree), but we both share the idea that AMD is a follower, has been so for the past 2 years, and will keep being so at least until 2020.
 
Giant, zoomed images of your own post do not constitute a source. All you did was repeat the unsubstantiated statement I questioned, larger and in image form.

There is ZERO evidence AMD helped create DXR. Their collaboration is NOT "Just like NV".

All the actual evidence points to an NVidia and Microsoft partnership with AMD off to the side. It looks like AMD contributed just about nothing and has just about nothing related to ray tracing. All they have stated is that they will have driver support eventually.

GDC 2018, where DXR was released, was practically the NVidia ray tracing show. That isn't a coincidence.

In contrast, AMD at GDC 2018 seemed almost blindsided about ray tracing and said little more than some mention of a driver, and practically nothing since.


Well, I don't really know how far along they are with working with Microsoft, but AMD has been working on ray tracing for years. They did some real-time rendering for the first Transformers movie using the 2900 XT. The 4xxx cards had hardware ray tracing with DX9. And of course, they have been working on Radeon Rays and upgraded to Radeon Rays 2.0 earlier this year. So I don't think they were blindsided at all. The drivers won't be a problem, as they have a lot of the work done already with Radeon Rays; it's the hardware that's needed. Maybe a Vega refresh repurposing some of Vega's massive compute power for ray tracing might be a solution for them. Or maybe they just won't bother at the moment, as it's going to take a couple of generations for this to take off, and it probably won't become mainstream until the consoles can ray trace too.
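
For context on what a library like Radeon Rays actually accelerates: the core primitive underneath all of this is ray-triangle intersection, batched across millions of rays. This isn't Radeon Rays' API, just a standalone C++ sketch of the standard Möller-Trumbore test that such libraries run in bulk on the GPU:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Moller-Trumbore ray/triangle intersection: returns true and writes the
// hit distance t if the ray orig + t*dir crosses triangle (v0, v1, v2).
bool intersect(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2, float& t) {
    const Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    const Vec3 p  = cross(dir, e2);
    const float det = dot(e1, p);
    if (std::fabs(det) < 1e-8f) return false;   // ray parallel to triangle
    const float inv = 1.0f / det;
    const Vec3 s = sub(orig, v0);
    const float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;     // outside barycentric range
    const Vec3 q = cross(s, e1);
    const float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    t = dot(e2, q) * inv;                       // distance along the ray
    return t > 0.0f;
}

int main() {
    float t;
    // Ray down the +z axis into a triangle in the z=0 plane: hits at t=1.
    bool hit = intersect({0,0,-1}, {0,0,1}, {-1,-1,0}, {1,-1,0}, {0,1,0}, t);
    std::printf("hit=%d t=%.2f\n", hit, t);
}
```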
 
That's an interesting direction the thread has taken with next gen consoles.

Console makers are kind of stuck between a rock and a hard place. Everyone is going to expect and demand 4K support whether anyone actually NEEDS it or not. So, since the graphics hardware is going to have to be beefy enough to handle that and is going to need quite the redesign anyway, why couldn't the hardware be designed to not have rasterized lighting at all? Rebuild the whole damn thing from the ground up for hybrid ray tracing only. The one absolute advantage of consoles IS the closed, proprietary system: if they say that going forward all lighting will be done THIS way, then that's how it's going to be done.

It's an interesting possibility, depending on how much the video hardware can be simplified if it doesn't have to carry the mountain of features the 2000 series does to play old and new games well.
 
Raytraced 1080p is obviously and objectively going to look better than rasterized 4K. Resolution doesn't add new features to an image; it just increases clarity. Ray tracing can change the image itself.

So all else equal (like fps) the raytraced image is gonna win every time. There are some things you just can’t do with rasterization.
 
Raytraced 1080p is obviously and objectively going to look better than rasterized 4K. Resolution doesn't add new features to an image; it just increases clarity. Ray tracing can change the image itself.

So all else equal (like fps) the raytraced image is gonna win every time. There are some things you just can’t do with rasterization.
Sigh. If you have a native 4K screen, then going to 1080p looks like shit compared to native resolution. It seems beyond stupid to say ray tracing looks better yet be too damn blind to be bothered by running below native resolution.
 
I like the idea of it, but in most of the demos thus far they are highlighting small details that will be superfluous fluff in most cases. Seeing a BF V explosion reflected in the bullet flying by my head, or in the car I'm running past at full speed, is awesome and a great progression in tech, but ultimately not something that will improve the gameplay at that scale. Seeing things from a distance, with reflections in lakes and more, is where it will shine, if they can keep the frame rates in check. If not, I see it being turned off so people can play the game more smoothly.
 
Sigh. If you have a native 4K screen, then going to 1080p looks like shit compared to native resolution. It seems beyond stupid to say ray tracing looks better yet be too damn blind to be bothered by running below native resolution.

Who said anything about running 1080p on a 4K screen?
 
Who said anything about running 1080p on a 4K screen?
Does this really have to be explained? The type of person who is willing to spend 1200 bucks on a GPU is almost certainly not running a native 1080p screen. They are much more likely to have a 4K screen, or at least a high-refresh-rate 1440p or ultrawide 1440p one.
 
Does this really have to be explained? The type of person who is willing to spend 1200 bucks on a GPU is almost certainly not running a native 1080p screen. They are much more likely to have a 4K screen, or at least a high-refresh-rate 1440p or ultrawide 1440p one.

That’s purely speculative - you obviously don’t know how other people spend their money and it’s not really relevant to the topic.

My comment and others earlier in the thread were about the obvious benefits of better pixels vs more pixels. It becomes even more obvious as resolution increases. E.g. 4K raytraced vs 8K rasterized. Easy choice.

Lots of existing examples to choose from: Fallout: New Vegas at 4K looks nowhere near as good as Fallout 4 at 1080p, etc.
 
That’s purely speculative - you obviously don’t know how other people spend their money and it’s not really relevant to the topic.

My comment and others earlier in the thread were about the obvious benefits of better pixels vs more pixels. It becomes even more obvious as resolution increases. E.g. 4K raytraced vs 8K rasterized. Easy choice.

Lots of existing examples to choose from: Fallout: New Vegas at 4K looks nowhere near as good as Fallout 4 at 1080p, etc.
Lol, not relevant to the subject? It most certainly is, as again, hardly anyone dropping 1200 bucks on a GPU is going to be using a goddamn native 1080p monitor. :rolleyes:
 
Lol, not relevant to the subject? It most certainly is, as again, hardly anyone dropping 1200 bucks on a GPU is going to be using a goddamn native 1080p monitor. :rolleyes:

You seem to be really interested in what other people buy. Not sure what it has to do with my comment you replied to though.

Do you agree that 1080p raytraced would look objectively better than 4K rasterized? If you agree then what are you arguing about?

/shrug
 
You seem to be really interested in what other people buy. Not sure what it has to do with my comment you replied to though.

Do you agree that 1080p raytraced would look objectively better than 4K rasterized? If you agree then what are you arguing about?

/shrug
Because you are not comprehending that the real-world choice will not be between a rasterized 4K image on a native 4K screen and a ray traced scene on a native 1080p screen. For the VAST majority who buy a 2080 Ti, it will be between running a game with some ray tracing effects at well below native resolution, or running without those effects but with all other settings maxed at their native resolution, which will be well above 1080p.
 
Really, 50% of you think ray tracing is hype? LOL. Yeah, riiiight.

To put that in perspective for you flat-earth ray-tracing naysayers... that's like saying fire or the wheel wasn't that big of an invention...

Ray tracing has been one of the most dreamt-of and sought-after technologies companies have been chasing. According to nVidia, for 10+ years. Lord only knows how many hundreds of millions of dollars have been invested.

You guys need to go and read up on early computer graphics to really appreciate what this card can do. Seriously.

I really can't help but think that some of this mindset has been tempered by the high cost of this product, which a lot of you didn't or couldn't plan or save for. I know all of you would have one if you could. And don't try and play it off like you wouldn't.

Anyways, the RTX 2080 Ti is just the start. Imagine what nVidia will launch in 2021. We are only getting started, boys.
 
Because you are not comprehending that the real-world choice will not be between a rasterized 4K image on a native 4K screen and a ray traced scene on a native 1080p screen. For the VAST majority who buy a 2080 Ti, it will be between running a game with some ray tracing effects at well below native resolution, or running without those effects but with all other settings maxed at their native resolution, which will be well above 1080p.

What you're not comprehending is that you can't put words in someone's mouth and then argue against those words. I said nothing about the 2080 Ti or what people will do. That's a whole different convo.

My comment was about the tech and IQ, which you seem to have no interest in discussing, so why respond to my post? You're barking up the wrong tree, sir (or ma'am).

I really can't help but think that some of this mindset has been tempered by the high cost of this product, which a lot of you didn't or couldn't plan or save for. I know all of you would have one if you could. And don't try and play it off like you wouldn't.

It has to be the price making people stir-crazy, because if anybody doesn't like more accurate reflections or whatever, they can just turn them off and enjoy their 4K monitor. The hyperventilating is not making any sense.
 
Does this really have to be explained? The type of person who is willing to spend 1200 bucks on a GPU is almost certainly not running a native 1080p screen. They are much more likely to have a 4K screen, or at least a high-refresh-rate 1440p or ultrawide 1440p one.

I actually run my 4K TV at 1080p because I can't tell the difference at 8 feet and would rather have maxed settings and above-60Hz minimums... on a 1080 Ti.

Also, DICE said they are working on having two different resolutions: one for the entire scene and one for ray tracing. So you could potentially render everything at 4K and just ray trace at 1080p.
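
If that pans out, the win is easy to quantify, because ray count scales with the ray tracing resolution rather than the output resolution. A quick back-of-the-envelope in C++, assuming one primary ray per pixel (my simplification, not DICE's numbers):

```cpp
#include <cstdio>

int main() {
    // Primary rays per frame at one ray per pixel (simplifying assumption).
    const double rays4k    = 3840.0 * 2160.0;  // RT traced at native 4K
    const double rays1080p = 1920.0 * 1080.0;  // RT traced at 1080p, output at 4K
    std::printf("4K RT: %.1fM rays, 1080p RT: %.1fM rays (%.0fx cheaper)\n",
                rays4k / 1e6, rays1080p / 1e6, rays4k / rays1080p);
    // -> 8.3M vs 2.1M rays per frame: the RT pass gets 4x cheaper while
    //    the rasterized G-buffer can stay at full 4K.
}
```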
 
At $1200 to be able to ray trace... not for me just yet. When a GPU is $750 and can rasterize 4K, I still see that as a bit expensive.

Plus, no games coming out are "natively" ray traced. What I mean is, they have a mode for ray tracing rather than the game being built ray traced from the ground up.
 
Sigh. If you have a native 4K screen, then going to 1080p looks like shit compared to native resolution.

For text and work, absolutely. For gaming, it's not that noticeable.

Upscaling 1080p on a 4K set will look bad no matter how many fancy effects are thrown in.

Disagree. I game at 1080p three feet away from my 40" 4K and it looks fine. Granted, I do so at ultrawide 1080p so that the vertical doesn't stretch all the way up. At 16:9, yeah, it doesn't look great, but if I have to do it for decent performance, it's a trade-off I'll happily make.

better pixels vs more pixels

The choice is so clear to me, it's not even a choice. Better pixels > more mediocre ones.

I really can't help but think that some of this mindset has been tempered by the high cost of this product, which a lot of you didn't or couldn't plan or save for. I know all of you would have one if you could. And don't try and play it off like you wouldn't.

Incorrect. I could buy several dozen 2080 Tis with cold, hard cash right now if I wanted to. Many of us thankfully make a decent enough salary that we can buy one whenever we feel like it. I don't want Turing because I don't think it's good value; 1st-gen products never are. My GTX 1060 will hum along just fine until 7nm GPUs.
 
^ Yes, 1080p upscales nicely on 4K, otherwise 4K would have been a huge fail. Most TV viewing on 4K screens worldwide is 1080p content. It was designed that way (4 pixels to 1), since no one would buy 4K TVs if 1080p looked like crap on them.
I also have a 40" 4K screen and game at 4K from 2.5-3 feet away, but sorry, I can't accept the pixelation at that distance when at 1080p. 4K gaming (on an appropriately large screen) is glorious, and no way would I step down to a small 1080p screen to enjoy RT. Most who game at 1080p do so on miserably small 24" screens, which kills immersion, and no amount of ray tracing is going to save the day for screens that small.
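
The "4 pixels to 1" point is exact: 3840x2160 is precisely double 1920x1080 in each dimension, so every 1080p pixel can map to a clean 2x2 block of identical 4K pixels with no filtering. A toy C++ sketch of that integer mapping (real TV scalers often filter instead, which is where the softness people argue about comes from):

```cpp
#include <cstdint>
#include <vector>

// Integer 2x upscale: each source pixel becomes an exact 2x2 block in the
// destination, so a 1920x1080 image fills a 3840x2160 buffer losslessly.
std::vector<uint32_t> upscale2x(const std::vector<uint32_t>& src, int w, int h) {
    std::vector<uint32_t> dst(size_t(w) * size_t(h) * 4);
    for (int y = 0; y < h * 2; ++y)
        for (int x = 0; x < w * 2; ++x)
            dst[size_t(y) * (w * 2) + x] = src[size_t(y / 2) * w + (x / 2)];
    return dst;
}

int main() {
    std::vector<uint32_t> frame(1920 * 1080, 0xFF00FF00u); // solid-color 1080p
    std::vector<uint32_t> out = upscale2x(frame, 1920, 1080);
    return out.size() == 3840u * 2160u ? 0 : 1;            // sanity check
}
```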
 
Really, 50% of you think ray tracing is hype? LOL. Yeah, riiiight.

To put that in perspective for you flat-earth ray-tracing naysayers... that's like saying fire or the wheel wasn't that big of an invention...

Ray tracing has been one of the most dreamt-of and sought-after technologies companies have been chasing. According to nVidia, for 10+ years. Lord only knows how many hundreds of millions of dollars have been invested.

You guys need to go and read up on early computer graphics to really appreciate what this card can do. Seriously.

I really can't help but think that some of this mindset has been tempered by the high cost of this product, which a lot of you didn't or couldn't plan or save for. I know all of you would have one if you could. And don't try and play it off like you wouldn't.

Anyways, the RTX 2080 Ti is just the start. Imagine what nVidia will launch in 2021. We are only getting started, boys.
It's hype right now. When it can be done at high refresh rates and FPS, then it will be good. Maybe in 2-3 years.
Also, by then most games will have it instead of a few. Like you said, we are just getting started. So yes, it's hyped now.
 
I strongly disagree. You think new consoles in 2020 won't have RTX 2070 equivalent performance? Fat chance. The XB1X and PS4 Pro already do 4K checkerboard; that wouldn't sell new consoles, it's not progress. Consoles these days use regular parts already in production, tweaked for their custom designs. Then that design is solidified for a number of years, where all devs develop to the same static HW configuration. Since we established AMD is already involved with DXR, and we know all their new stuff is coming at 7nm, and it's been reported that their next GPU is related to the XB/PS5, put those three together and you already know there'll be a degree of ray tracing in consoles in 2020. That doesn't mean it'll all be ray traced, but you'll certainly get 2070 performance on the XB/PS5. Add to that a 5-year lifetime that always shows great improvement due to the locked-down spec, and you pretty much have guaranteed decent use of ray traced effects in the next console generation.

The Xbox One X came out last year, and it gets what, maybe 970/980-level performance, with a CPU bottleneck? I think it's wishful thinking that a 500-700€ next-gen console would be able to match a 2018 GPU that alone costs that much. I fully expect the PS5 to be faster than the Xbox One X, but not by much. That's still over double the GPU performance of current-gen consoles.

Performance upgrades are now an expected thing in console lifecycles and sell devices just fine to the enthusiast gamers on a given platform. I'll sell my PS4 Pro the moment I have a PS5, then do it all over again for a PS5 Pro, which might actually be a device capable of ray tracing on a console. At that point I will probably consider whether I even want to also play on PC, when even now the best games I've played this year have all been on consoles.
 
Wow, what unwavering support for ray tracing, with exactly 0 games out there supporting it. With Nvidia backing it, every game with ray tracing will probably be "the best game ever", because however you want to turn it, people love flashy tech demos on first-generation hardware. As for AMD not supporting this hardware feature: well, AMD must compete or else they won't sell, and guess what :) AMD is not competing, however you want to frame it.

That means Nvidia can ask any price (for their hardware), provide games more or less funded by yourselves, and the amount of ray tracing in games will be exactly the amount Nvidia loves you guys (and ray tracing, for that matter).
That doesn't mean it'll all be ray traced, but you'll certainly get 2070 performance on the XB/PS5. Add to that a 5-year lifetime that always shows great improvement due to the locked-down spec, and you pretty much have guaranteed decent use of ray traced effects in the next console generation.

I have to say that consoles next to never promote agendas; from the Atari 2600 to whatever incarnation the Xbox/PlayStation is in, it has always been about optimizing for the hardware. The last piece of hardware that promoted much of anything was the PS3, and that did not carry over into the next generation.

To state that any kind of serious ray tracing is in the next generation of consoles is kind of silly, because that would mean people would dump their PCs altogether and just play on consoles.
 
The Xbox One X came out last year, and it gets what, maybe 970/980-level performance, with a CPU bottleneck? I think it's wishful thinking that a 500-700€ next-gen console would be able to match a 2018 GPU that alone costs that much. I fully expect the PS5 to be faster than the Xbox One X, but not by much. That's still over double the GPU performance of current-gen consoles.

The XB1X is more like an RX 580, according to some. If next-gen consoles arrive in late 2019 or 2020, I'd say close to 2070 performance is absolutely likely, and certainly at least 2060, which is expected to perform at around GTX 1080 level. We're talking about a potential two full years here; by then the 3080 or 4080 will be that much more powerful.

Wow, what unwavering support for ray tracing, with exactly 0 games out there supporting it... To state that any kind of serious ray tracing is in the next generation of consoles is kind of silly, because that would mean people would dump their PCs altogether and just play on consoles.

Frankly, I don't need ray tracing to be in games for me to support it. It's a real advancement that's worth everyone's attention. That said, I'm definitely not buying the 20 series, because it's too early, 1st gen is never good value and, yes, there are barely any games to justify the purchase now. I also highly doubt anyone would dump their PC to play on next-gen consoles, because they won't be released in a vacuum - by 2019/2020, PC GPUs will be two years newer and two generations more powerful, so consoles will certainly be behind the PC.
 
Frankly, I don't need ray tracing to be in games for me to support it. It's a real advancement that's worth everyone's attention. That said, I'm definitely not buying the 20 series, because it's too early, 1st gen is never good value and, yes, there are barely any games to justify the purchase now. I also highly doubt anyone would dump their PC to play on next-gen consoles, because they won't be released in a vacuum - by 2019/2020, PC GPUs will be two years newer and two generations more powerful, so consoles will certainly be behind the PC.

Based on what, exactly? Explain it to me; gaming is not going to change because of this. If there were an argument to be made about it looking better, then the Battlefield series would have decimated COD and no one would ever have touched Counter-Strike.
I played Quake when it came out; it didn't stun me - a very short experience for me, rather than the technological marvel some perceive it to be.

I would even suggest that Nvidia might abandon it completely (or scale it down) if this current generation of hardware does not sell, because the die size is supposedly over 700mm². Unless they make some really impressive progress on the implementation of the ray tracing cores, the room to grow is not really there.
 
Do you think the 2070 and the 2080 will be gimped with ray tracing, while the 2080 Ti will get by better because it's a faster card?

I want a regular 2080 but don't want the ray tracing to be gimped.
 
Do you think the 2070 and the 2080 will be gimped with ray tracing, while the 2080 Ti will get by better because it's a faster card?

I want a regular 2080 but don't want the ray tracing to be gimped.

It'll be 80% as fast as a 2080 Ti, which is pretty good IMO.

It's 10, 8, and 6 Gigarays/sec for the 2080 Ti / 2080 / 2070.

I've kinda been debating the 2080 vs the 2080 Ti as well. The value curve from the 2080 to the 2080 Ti isn't bad, though, so I am leaning 2080 Ti: 47% more CUDA cores for ~50% more cost. I do understand not everyone can swing $1200 (or possibly $999 for a crappy blower).
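
Those Gigarays/sec figures translate directly into a per-pixel ray budget, which is a more intuitive way to compare the three cards. A quick worked calculation in C++ (the 1080p60 target is my choice, purely to show the ratios):

```cpp
#include <cstdio>

int main() {
    const char*  names[]    = {"2080 Ti", "2080", "2070"};
    const double gigarays[] = {10.0, 8.0, 6.0};   // NVIDIA's quoted peak rates
    const double pixels = 1920.0 * 1080.0;        // 1080p frame
    const double fps    = 60.0;
    for (int i = 0; i < 3; ++i) {
        const double raysPerPixel = gigarays[i] * 1e9 / (pixels * fps);
        std::printf("%-8s ~%.0f rays per pixel at 1080p60\n", names[i], raysPerPixel);
    }
    // -> roughly 80 / 64 / 48 rays per pixel: the 2070 keeps ~60% of the
    //    2080 Ti's ray budget, matching the quoted 10/8/6 ratio.
}
```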

 
< More excited about path tracing (aka unbiased rendering), not this silly fake-looking ray tracing shit that's been hyped up since 2000.
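
For anyone who hasn't met the distinction: the current RTX titles spend rays on a few specific effects, while path tracing estimates the full rendering equation with random sampling, and "unbiased" means the estimate converges to the exact answer as samples accumulate. A tiny self-contained C++ illustration of that property for one diffuse surface under a uniform sky (a toy estimator, not a renderer):

```cpp
#include <cstdio>
#include <random>

// Monte Carlo estimate of the radiance leaving a diffuse surface under a
// uniform sky: Lo = (albedo/pi) * Integral[ Li * cos(theta) dw ] over the
// hemisphere. The exact answer is albedo * Li, so we can watch the
// estimator converge to it -- that convergence is what "unbiased" means.
int main() {
    const double PI = 3.14159265358979323846;
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    const double albedo = 0.7, Li = 1.0;
    const int N = 1000000;
    double sum = 0.0;
    for (int i = 0; i < N; ++i) {
        // Uniform hemisphere sampling: cos(theta) is uniform in [0,1] and
        // the pdf with respect to solid angle is 1/(2*pi).
        const double cosTheta  = uni(rng);
        const double integrand = (albedo / PI) * Li * cosTheta; // BRDF*Li*cos
        sum += integrand * (2.0 * PI);                          // divide by pdf
    }
    std::printf("estimate = %.4f, exact = %.4f\n", sum / N, albedo * Li);
}
```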
 
OK, thanks, I didn't know ray tracing was downplayed on the slower cards. If you asked me whether I wanted a 2080 plus $400+ in Steam credit, or a 2080 Ti without the Steam credit, I would take the 2080 and the Steam credit. Most people are not going to notice the difference visually unless a seriously demanding game comes out, say 2-3 years down the road, when new cards are coming out every two years anyway.
 
Nah man, it's too much work getting off my fat ass to try and lay down the law...

But that's cute, to suggest that what we are seeing in games here is physically accurate and the end-all-be-all... I remember folks saying that when Mario first learned to jump, and look where we are now... oh shit, applying physics to jump mechanics turned out to be a turd; what irony...
 