Nvidia CEO says buying a GPU without ray tracing is crazy...

I'm pretty sure I know how this sequence of strawman arguments goes:
  1. "This new GPU is stupid because I can't afford it."
    1. "Ok, well, here is an updated model which you can afford,"
  2. "I can't believe I'm being forced to upgrade!"
    1. "Ok, that's fine. Don't upgrade."
  3. "This new GPU is stupid because it can't do [today's normal resolution +1] at [today's normal FPS +150%]"
    1. "You're right. But it will still beat your current setup by 80% and your display can only handle 30fps anyway."
  4. "This new GPU is stupid because the top model is better than the bottom model."
    1. "Ok, that's fine. Don't upgrade."
  5. "This new GPU is stupid because two years from now there will be a better one!"
    1. "Ok.... Honestly, I don't know how to respond to something that stupid."
  6. "This new GPU is stupid because a bottom end model from a different brand that is 2 generations old can still kind of sort of run games at 640x480."
    1. "Ok...So you're just a generally uphappy person? And you wanted to tell us this?"
  7. "This new GPU is stupid because next year the other brand will have something that almost matches the performance of a card from this brand which came out two years ago!"
    1. "Ok... so keep buying two year old cards and quit complaining?"
Did I miss anything?
 
Again, this is a bit better than I remember when I was looking into it, but still nothing I would say is going to keep up in 2-4 years.

How do you anticipate the venerable 1080 Ti will perform in 2021? How about the 5GB 1060? Bear in mind today's flagship cell phones have already matched the compute capability of the 1060.
 
On post patch (after the performance increase in the game) the 2080 Ti is running at 94/64 on one map and 105/74 @ 1080p (DXR on Low; much worse at higher settings). It also still has visual issues: "again there’s not a large difference between all four DXR modes, and some of the issues with godrays being incorrectly reflected still appear to be present."

Then run on Low, since there really isn't much difference between the modes.

I have looked at all the comparison images. I struggle to find anything significantly different in the lower settings.

Surprisingly (to me anyway) the 2070 isn't horrid @ 1080p low. "The RTX 2070 is a GPU we called useless for ray tracing last time because it couldn’t even achieve 60 FPS at 1080p with the Low DXR mode". After the patch it's hitting 73/58, which I would consider smooth/playable. There is still a huge performance difference, as noted: "However, yes you guessed it, performance is ~50% higher with DXR disabled compared to on the Low mode." I would consider this playable though, and possibly worth it depending on how you feel about the looks. It still has some visual issues, especially with the aggressive culling they introduced (that's how they got the higher speeds: they removed parts of the scene); the reflections sometimes miss objects, which can be distracting, and some lighting is still out of place (sun/god rays showing where they shouldn't be).
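To put that "~50% higher with DXR disabled" figure in frame-time terms, here's a rough back-of-the-envelope sketch in Python. The 73 FPS number is from the article above; everything else is just illustrative arithmetic, not a benchmark:

```python
# Rough arithmetic only: what "~50% higher FPS with DXR off" costs in frame time.
# The 73 FPS figure is from the article above; the 50% uplift is their number too.

def frame_time_ms(fps: float) -> float:
    """Average milliseconds spent on each frame at a given FPS."""
    return 1000.0 / fps

fps_dxr_low = 73.0                  # RTX 2070, 1080p, DXR Low (post-patch average)
fps_dxr_off = fps_dxr_low * 1.5     # "performance is ~50% higher with DXR disabled"

rt_cost_ms = frame_time_ms(fps_dxr_low) - frame_time_ms(fps_dxr_off)
print(f"DXR Low: {frame_time_ms(fps_dxr_low):.1f} ms/frame")
print(f"DXR Off: {frame_time_ms(fps_dxr_off):.1f} ms/frame")
print(f"Ray tracing adds roughly {rt_cost_ms:.1f} ms to every frame")
```

That works out to roughly 4-5 ms of RT work per frame at those settings; the same absolute cost looks like a smaller percentage hit in scenes that are already slow for other reasons.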

The old 2070 not being horrid makes the new 2060S not so horrid either. The original 2060 I would have taken off the table from the beginning because of its low memory, before even getting further.

BFV is something I would have no interest in with or without RT, and it's really the wrong type of game to showcase this tech. But I guess they needed to go with whatever they had ready to demo.

I think there is lots left to learn about optimizing. This link mentions noise a lot, IIRC they did their own noise reduction, and did not use Tensor cores. It would be interesting if Tensor cores do a better (or maybe worse given DLSS) job on denoising. Denoising RT is a fact of life for the foreseeable future.
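For anyone curious what "their own noise reduction" roughly means, here's a minimal sketch of the temporal-accumulation idea behind most RT denoisers. This is not how BFV or Metro actually implement it (real engines add motion-vector reprojection, variance clamping, etc.); it's just the core concept:

```python
# Minimal sketch of temporal denoising for noisy ray-traced output: blend each new
# (noisy) frame into a running history buffer. Real engines add reprojection and
# clamping; this only shows the basic accumulation idea.
import numpy as np

def temporal_accumulate(history: np.ndarray, noisy_frame: np.ndarray,
                        alpha: float = 0.1) -> np.ndarray:
    """Exponential moving average: keep (1 - alpha) of history, take alpha of the new frame."""
    return (1.0 - alpha) * history + alpha * noisy_frame

# Toy example: a constant "true" image plus per-frame noise converges toward the truth.
rng = np.random.default_rng(0)
true_image = np.full((4, 4), 0.5)
history = np.zeros_like(true_image)
for _ in range(60):                          # one second of frames at 60 FPS
    noisy = true_image + rng.normal(0.0, 0.2, true_image.shape)
    history = temporal_accumulate(history, noisy)
print(np.abs(history - true_image).mean())   # small residual error after accumulation
```

Whether that accumulation runs as a hand-written compute shader (as these games apparently do) or on Tensor cores is an implementation choice; the math itself is simple.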

Regardless of how it pans out, I would still rather have this generation's RT than none when choosing between a Super series and a 5700 series.

Also, I'd prefer to see a Metro Exodus analysis if you see one, because I greatly prefer its Global Illumination effects to the Reflections in BFV.
 
IMO RT-wise the 2060 will last you about 1.5 years tops, 2070/2060S 2 years, 2080 and above probably 3 years. If you're willing to turn off RT then you can stack another year on top of those. This is from release date, though, so the later in the product life-cycle you buy the worse it gets.

I don't see a problem with this trend here. The further you go along, the lower the settings and resolution you have to use. Why is this upsetting people now?
 

Yes, you just created 7 strawman arguments to argue against. Nobody is making the arguments you just made up and then supposedly answered. Nobody used the word "stupid" to describe these GPUs, and nobody used 640x480 to describe anything concerning these cards. Makes me wonder if you even know what a strawman argument is.
 

Did you check out the images in the link I sent? It clearly shows the sun rays in a portion of the game where they shouldn't be. I agree, BF V isn't the greatest game; it's just the first one I found. I will see if I can find a Metro one; from what I've seen it is one of the better-done games.

Here's a quick review...
https://www.pcgamer.com/metro-exodus-settings-performance-ray-tracing-dlss/

"Above is one of the better scenes showing the difference between ray tracing and traditional rendering from Metro Exodus. Note however that even the ray traced version has many oddities, like why doesn't the gun or guitar or cup cast a shadow, other than on itself?"
"That's because GI isn't necessarily concerned with casting shadows, but rather the default brightness from a single global light source."


So, it changes some of the lighting but doesn't affect shadows. And that was from "one of the better scenes".

https://www.techspot.com/article/1793-metro-exodus-ray-tracing-benchmark/
"Then there’s the question, could the developers of Metro Exodus, have achieved the same global illumination effect without ray tracing?" Well, seeing as GI has been done before, I don't see why it wouldn't have been possible, but only the developers know this answer.
"The RTX penalty is most likely larger, however it’s also a better implementation, so we're not as concerned with this being a pointless re-engineering of global illumination as we were with reflections."

They also have benchmarks (as you've probably figured). From the first link, the 2080 Ti on High (not Ultra) is getting 89 but with lows of 52 (and pcgamer uses 97%, not 99%, so it's actually more lenient). The 2070 is 57/33; of course the lows are in more demanding parts of the game, but they are still parts of the game you will run into. Just as a reference, the VII was getting 81/43 (obviously without RT). These are from pcgamer; techspot's numbers are a bit lower in general as they used Ultra settings, not just High settings with DXR.
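Since the 97% vs 99% thing keeps coming up, here's a small sketch of how those avg / percentile-low numbers fall out of a frame-time log. The frame times below are made up purely for illustration; only the percentile math is the point:

```python
# Rough sketch of how avg / percentile-low FPS numbers get computed from frame times.
# The frame times here are invented; real reviews log thousands of them per run.
import numpy as np

frame_times_ms = np.array([11.0, 12.5, 10.8, 13.0, 19.2, 11.4, 25.0, 12.1, 11.9, 18.5])

avg_fps = 1000.0 / frame_times_ms.mean()     # total frames / total time

def percentile_low_fps(times_ms: np.ndarray, pct: float) -> float:
    """FPS at the frame time that pct% of frames are faster than (99 -> worst 1%)."""
    return 1000.0 / np.percentile(times_ms, pct)

print(f"average : {avg_fps:.0f} FPS")
print(f"97% low : {percentile_low_fps(frame_times_ms, 97):.0f} FPS")
print(f"99% low : {percentile_low_fps(frame_times_ms, 99):.0f} FPS")  # digs deeper into the tail
```

The deeper you look into the slow tail (99 vs 97), the lower the reported "low", which is why the two sites' numbers aren't directly comparable.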

"The good news is enabled ray traced global illumination doesn’t tank performance to anywhere near the degree of ray traced reflections in Battlefield V." That is good news, and I hope going forward we can find more implementations that do it justice. That being said, I still don't see how a card doing 57/33 @ 1080p today is going to last 2, 3, or 4 years into future games.

"Looking at the RTX High results, we only saw a 17% increase in performance from disabling RTX at 1080p for the RTX 2080 Ti. This grows slightly to 21% for both the RTX 2080 and RTX 2070, and 24% for the RTX 2060".

And back to the 2080ti....
"The RTX 2080 Ti is the only GPU we’d consider using for High ray tracing in this game at 4K, and even then I feel the performance impact is probably too large for most gamers. Going from a 47 FPS average, which isn’t amazing but playable for this sort of game, to a mid 30s frame rate is not great"

So, you bought yourself a nice $1200-1300 shiny card to go with your new 4K monitor... please turn down the resolution so you can use this feature that's going to last you 2-4 years, easy. Does that really not sound ridiculous to you? And this is one of the few games where it looks decent without incurring too huge of a hit in general.
 
Oh, and seeing as Metro is one of the better examples, I find this quote funny as crap:
"So as much as specialised hardware to speed up the calculations of the ray intersections is important, fast compute cores and memory which lets you get at your bounding volume data quickly is also a viable path to doing real-time RT."

Aka, the ray intersection calculation in the tensor core is only one part of the battle. That part could still be done in compute cores as well. Maybe AMD will go back to HBM, lol.
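To be clear about what "ray intersection calculation" means here: it's just arithmetic against the bounding boxes in a BVH, which is why it can also run on ordinary compute cores, only slower. A toy slab test for illustration only (not Nvidia's or AMD's actual hardware algorithm):

```python
# Toy version of the kind of math dedicated intersection hardware accelerates:
# testing a ray against an axis-aligned bounding box (a BVH node). Standard slab test.
# The point is that it's plain arithmetic, so generic compute cores can do it too,
# just slower and with lots of memory traffic chasing BVH nodes.

def ray_hits_aabb(origin, inv_dir, box_min, box_max) -> bool:
    """Slab test: intersect the ray with each pair of box planes and check the overlap."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

# Example: a ray from the origin pointing down +X against a unit box centered at (5, 0, 0).
origin = (0.0, 0.0, 0.0)
direction = (1.0, 1e-9, 1e-9)                      # tiny y/z to avoid dividing by zero
inv_dir = tuple(1.0 / d for d in direction)
print(ray_hits_aabb(origin, inv_dir, (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))  # True
```

A real traversal runs millions of these tests per frame while chasing pointers through the BVH, which is exactly the part that dedicated intersection hardware (and fast memory) speeds up.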
 
Did you check out the images in the link I sent? It clearly shows the sun rays in a portion of the game where they shouldn't be. I agree, BF V isn't the greatest game; it's just the first one I found. I will see if I can find a Metro one; from what I've seen it is one of the better-done games.

Here's a quick review...

So, it changes some of the lighting but doesn't affect shadows. And that was from "one of the better scenes".

As for that split screen they are talking about: I can click it and open the images at full size.

So the RT image isn't perfect. But it's still 10X better than the non-RTX image. This is one of those situations where people are letting the perfect be the enemy of the VERY damn good.

Aside: the way you randomly intersperse source quotes all through your text makes it annoying to read and parse.


So, you bought yourself a nice $1200-1300 shiny card to go with your new 4K monitor... please turn down the resolution so you can use this feature that's going to last you 2-4 years, easy. Does that really not sound ridiculous to you? And this is one of the few games where it looks decent without incurring too huge of a hit in general.

A little perspective. This game doesn't even hit 60 FPS with no RT effects at 4K on a 2080Ti using Ultra settings. It's already ridiculous before RT enters the picture.
 

Quote from where? Intersection calculations happen in RT cores, not Tensor cores. Tensors are used for denoising (if at all).

Here is a breakdown of the calculations in a Metro Exodus frame. Sure, intersections are only a part of the battle, but if you don't accelerate them they become a dominant part of the battle, and that, as everyone is keen on pointing out, is only in hybrid games where not everything is handled by RT. AFAICT, Tensors are tacked on at the end for DLSS.

[Image: GeForce RTX/GTX DXR Metro Exodus frame breakdown (RT core, DLSS), expanded]


Here is the frame breakdown with and without RT cores, and on a 1080 Ti. Intersection calcs grow to dominate frame time without specialized intersection HW:

https://www.nvidia.com/content/dam/...eforce-rtx-gtx-dxr-one-metro-exodus-frame.png
 

That’s a lot of teraflops!

Everything I’ve seen points to RT cores being their own hardware. It wouldn’t surprise me if RT cores are a specific piece of silicon, but maybe it’s just a tiny portion that sets up the data to be processed efficiently by the tensor arrays.

If it were just Tensor cores, the Volta card would be faster than Turing, but it's significantly slower.

Lighting has always been a fetish for me with GPUs, so I love ray tracing. It really does it for me. I still don't get why people expect absolutely no impact and seem to want it to run movie-quality RT on $300 hardware. Lighting/shadows/etc. have always had a huge impact on FPS, even today. If you think about what is actually happening, it's quite amazing the impact isn't more.

Perhaps I am biased since I run 3440x1440 and 60Hz; turning ray tracing on has essentially zero negative impact because in BFV, say, I go from 100 fps to 80 fps. My dips are still above 60.

I think a lot of it is people reading these reviews of canned benchmarks that show artificially low fps and still images, which, like evaluating AA, do it no justice, and forming opinions off of that.
 

RT and Tensor cores are separate and distinct HW implementations; there is no doubt at all.

RT cores (intersection HW) are the necessary piece to get real-time ray tracing accelerated. Even AMD's RT patents have their own RT cores (intersection HW), but no Tensor cores, at least in the patents found so far.

OTOH, Tensor cores appear to be a solution in search of a problem. I think they are mainly here because Nvidia sells the RTX chips in pro cards, where one use is combining many in a server box for neural network training (a super hot area right now), and they have been trying to justify their inclusion on the consumer side.

Consumer uses so far have included DLSS (seriously questionable) and denoising RT scenes with low ray/pixel counts (which will be all real-time RT scenes). Except even the denoising seems very questionable: AFAICT, both BFV and Metro use their own temporal denoising routines and not Tensor cores, which further calls into question the inclusion of Tensor cores in the consumer space.
 

A quote from that article, "It’s also important to note that the game doesn’t feel like something is missing from the lighting system when RTX is disabled." At this point, Metro Exodus is the benchmark for RT. At 1440p (a fairly common enthusiast resolution), the difference is playable vs. unplayable on a 2080Ti with RTX on high or Ultra. RTX off will get average FPS into the 70's vs the 50's with RTX on.

I don't see that situation getting better with newer games on newer game engines (e.g. Cyberpunk). I think that's the counterpoint to Jensen's "crazy" not-to-buy rant, especially in light of the "doesn't feel like something is missing" argument: good-enough lighting, but playable FPS.
 
Yeah, I was already quote happy, but I saw that one as well. And it does arguably look a bit better on, in general, from the screenshots I've seen. Obviously not perfect, but neither is what it's replacing, otherwise we wouldn't need a replacement.
 
Full context of that last quote:
It’s also important to note that the game doesn’t feel like something is missing from the lighting system when RTX is disabled. It’s not a situation where the developers have removed an effect from the non-RTX version of the game, only to bring it back through ray tracing. The game still looks phenomenal with RTX off, it just – at least in our opinion – looks even better and more accurate with RTX on.

They have to have something there when RT is off, or the very same people currently pointing out the "nothing missing" would be screaming bloody murder that they are ruining Raster modes to sell RT cards. So of course they still have lighting effects with RT off.

But for how long? Tracking down the original interview with the Metro devs, it is a good read. They bring up the point about how much longer they have to support the non-RT path, as they are clearly interested in moving on to RT only:


This is also a question of how long you support a parallel pipeline for legacy PC hardware. A GeForce GTX 1080 isn't an out of date card as far as someone who bought one last year is concerned. So, these cards take a few years to phase out and for RT to become fully mainstream to the point where you can just assume it. And obviously on current generation consoles we need to have the voxel GI solution in the engine alongside the new RT solution. RT is the future of gaming, so the main focus is now on RT either way.

So when we are arguing about card capabilities being weak 4 years from now, it will probably be everything.
 
I would counter that better looking but unplayable FPS is worse than worse looking but playable FPS, especially when it can be argued it looks great either way.

Realistically, it's just when RTX on keeps it under 60 fps and RTX off keeps it above. In a game where the difference is 90 versus 120 FPS it's less of an issue.
 
When there is about 20% performance difference, you have to cherry pick your conditions to move from playable to unplayable on 20% difference.

The good news is enabled ray traced global illumination doesn’t tank performance to anywhere near the degree of ray traced reflections in Battlefield V. Looking at the RTX High results, we only saw a 17% increase in performance from disabling RTX at 1080p for the RTX 2080 Ti. This grows slightly to 21% for both the RTX 2080 and RTX 2070, and 24% for the RTX 2060. And the margins are smaller again when looking at 1% lows, so it’s clear that in the very most intensive areas of Metro Exodus, it’s not ray tracing that is causing that extra performance drop.


But sure, those are sure some sour grapes. You best avoid them. ;)
 

Cherry picking? All I had to do was look at the best available example of RT at a common resolution for enthusiasts. We all get that you love your ray tracing and will defend it to the bitter end. Not everyone shares your view at this point in the development cycle. Nor am I against it. I just don't fault anyone for not buying into it at this stage of the game.
 
What is your playable/unplayable dividing line? "...RTX on keeps it under 60fps. And RTX off keeps it above"

Even a 2080Ti at 1080p can't maintain above 60 FPS with RTX off. Average above 60 FPS is NOT the same as maintaining (AKA keeping above) 60 FPS. Assuming you have Vsync on, every dip below 60 FPS will stutter to 30 fps.
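For anyone who doubts the "dip below 60 becomes 30" part, here's the simple double-buffered vsync model spelled out. The render times are made up; the only assumptions are a 60 Hz panel with no triple buffering and no VRR:

```python
# With plain double-buffered vsync on a 60 Hz panel, a frame can only be shown on a
# refresh boundary, so anything slower than 16.7 ms is held for two refreshes
# (33.3 ms), i.e. an instantaneous 30 FPS. Render times below are invented examples.
import math

REFRESH_MS = 1000.0 / 60.0          # 16.67 ms per refresh at 60 Hz

def displayed_interval_ms(render_ms: float) -> float:
    """The frame is held until the next vblank after rendering finishes."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (14.0, 16.0, 18.0, 25.0):
    shown = displayed_interval_ms(render_ms)
    print(f"render {render_ms:4.1f} ms -> shown for {shown:4.1f} ms (~{1000/shown:.0f} FPS)")
# 14 and 16 ms land at 60 FPS; 18 and 25 ms both fall to 30 FPS.
```

(VRR changes this picture, which is the next point below.)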

If you have VRR you really aren't going to notice much difference between a 50 FPS and a 60 FPS average, like on the 2070 (the only actual case where the average FPS moves from just barely 60 FPS to below 60 FPS).

Your playable/unplayable dividing line is empty arbitrary nonsense.
 
I completely agree it will be everything, which is what the point was. Whether you get RT today or not, you're going to have to upgrade to use it in future titles.
 
I think part of the problem too is that acceptable is different for everyone, and a single number is useless, since different games work OK at different speeds. For example, in Rocket League I try to keep my fps at 200... In Minecraft, 100 is fine (and even less doesn't really make much of a difference to me). Other games 30-40 is fine, some need 60 consistently, others work best at 80+. As mentioned, very arbitrary and to each their own.
 

60 FPS is a generally accepted playable/unplayable dividing line (I guess to everyone but the great RTX apologist) because of the 60 Hz refresh rate of most monitors. I didn't just pull that number out of thin air. Not everyone has VRR monitors. Common sense would tell you that if a card can't push an average of over 60 FPS, it is going to be under 60 more often than one that averages over 60.
 

At some point, yes. Where that exact point is, is sheer guesswork and fuzzy at best.

What isn't fuzzy guesswork is that, before that point, you undoubtedly get a better RT experience with RT-capable HW than with no RT HW.

Not sure what is so difficult to grasp that people have to constantly argue that having no RT HW at all is somehow equal or even better than RT HW that will someday be obsolete.
 

If the hardware can't run RT now, it doesn't matter if you have RT or not; you are going to have to turn it off to have a playable experience. That's what people are arguing. You are assuming that future optimizations will make current hardware playable, but that is an assumption. Like I said, I don't have any delusions that my 2060 Super is going to do anything remotely ray tracing oriented, ever, at playable frame rates.
 

It's already been shown that by tweaking settings you can get 80+ FPS with RTX on out of a 2060 Super. So this constant "doesn't work now" refrain is rather empty.
 

On the one hand you say that a 2080Ti can't run with RTX off in Metro on Ultra at 1080p, and on the other you say that if you tweak down the settings on a 2060 Super you can have RTX on... I also assume you're talking 1080p (which is an uncommon enthusiast resolution). I'm sure you could turn RTX on with a 1080Ti and get some level of playability if you run 720p or less with all settings at low, but what is it really going to look like?

I don't know that the average person would notice RTX "low" vs RTX off in a game unless you look at side by side screenshots and even then the difference would be minimal. I know I didn't notice a whole lot in Shadow of the Tomb Raider (RTX medium I think) other than 20 less FPS which was more distracting than the fake lighting.

I'll rephrase though. It's not so much "doesn't work now" as it doesn't work now with the IQ, framerates, and resolutions some people are looking for. I don't agree with Jensen that people are "crazy" if they buy something without it at this point, especially in the $300-400 range where the competition is.
 

On that one hand, I was trying to figure out what settings you were using when it became "unplayable". It must be the settings in the Techspot article being discussed, and with those settings even a 2080Ti struggles with RTX OFF.

If you want to stick to the 1440p/Ultra, 60 FPS settings, well guess what: your 2060 Super already fails at RTX off, with traditional rasterization.

That's it, it's over: your 2060 Super is already unplayable at traditional rasterization-based games (the 2070 is a reasonable stand-in):
https://static.techspot.com/articles-info/1793/images/2019-02-15-image-2.png

1440p Ultra, no ray tracing: a lowly 46.5 FPS (i.e. below your unplayable line of 60 FPS).

Junk it now, it's useless. IIRC, you or one of your cronies said games never use less power going forward, so it's junk today and worse in the future.

That is the criteria you apply to RTX performance, so be even-handed and apply the same to rasterization games...

Or you could be reasonable and recognize that different resolution and performance settings make both RTX performance and traditional rasterization performance viable on these cards.
 

I agree with all of this. We aren't even talking full scene RT at this point. I've said it before: we are a good 2-3 generations away from this, easily. In addition, RT isn't an effect, it's a rendering technique. There's not much that can't be approximated via rasterization, because of its speed, and the "hacks" that reproduce what RT is capable of come close enough, at a small enough performance hit, that there's little reason not to use them. The sacrifice is obviously IQ, but there isn't a single gamer I know of that wouldn't sacrifice IQ for performance. Hell, we have games that until this last generation couldn't be maxed out at 1080p and stay above 60 FPS. The problem is that hardware is not powerful enough to fully render a scene via RT, so right now it's being used as an effect on a limited number of objects.

Am I glad Nvidia started the conversation with their latest cards? Of course I am. But the topic should be discussed very honestly, because the last thing we need is a lack of lighting, reflections, etc. merely because it isn't ray traced, and not because it couldn't be achieved through other methods. I got really worried when I saw the side-by-side video of Metro, because many of the things not shown with RT off could have been done with RT off.

This is very different than moving between DirectX versions. We have to go back to the 5800 FX days to visualize hardware not capable of a rendering path, and even then the fix was only one generation away. Full scene RT is many, many generations away, and no card that you can buy today or tomorrow can do it at frame rates anyone would accept with the rendering technique implemented fully. Because of this, everyone needs to keep in their head that marketing forces are pushing us to a new rendering technique that cannot be fully implemented yet, and therefore many of the "effects" they are seeing are limited to specific objects that really don't do RT the justice it deserves if implemented fully.

Maybe that's enough for some people, I don't know. But something tells me the transition is going to be rocky and we will see things, or not see things, we should or should not see.
 
I never said it was equal, I said all things being equal, buy the RT... But IMO, if you can save money and forgo RT at this time, you probably will never miss it and could put that money toward something more useful. Obviously this amount will differ person to person.
 
I've seen all of Thunderdolt's examples used before.
 
Au contraire... 1080p is still by far and away the most common gaming resolution.

That's not what I said though. I said "enthusiast" resolution. People who spend more than $400 on a video card probably are not running a 1080p monitor.
 
Honestly, the ray tracing is the least exciting part of the RTX lineup. I am far more interested in the other aspects of what those cores are capable of. The AI capabilities alone are fascinating; I have read some papers on game studios experimenting with AI enemies that learn as you beat them. So the enemies don't get harder by becoming bullet sponges that hit harder, but by learning to counter your play style. And think about MMOs where the dungeons and raids learn to counter the meta that is used against them.
 

The cores will never be used for in-game AI, so let's shoot that one down. Enemies learning isn't really a problem; it can be done. The problem is always how to balance it so they don't become frustratingly hard, which is a thing that can happen.
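Just to illustrate that balancing point (nothing to do with tensor cores; this is ordinary game-logic code and every name in it is made up): an enemy can shift its tactic weights toward whatever counters the player's recent behaviour, with a learning rate and a cap so it adapts without becoming unbeatable.

```python
# Purely illustrative sketch of "enemies that learn": nudge the weight of the tactic
# that counters the player's recent move, capped so the enemy never becomes a
# guaranteed counter-pick machine. All tactic names here are hypothetical.
import random

COUNTERS = {"rush": "turtle", "snipe": "flank", "turtle": "siege"}   # made-up tactic table

def update_weights(weights: dict, player_move: str, rate: float = 0.2, cap: float = 0.6) -> None:
    """Raise the counter to the player's move (raw weight capped at `cap`), then renormalise."""
    counter = COUNTERS[player_move]
    weights[counter] = min(cap, weights[counter] + rate * (1.0 - weights[counter]))
    total = sum(weights.values())
    for k in weights:                       # keep the weights a probability distribution
        weights[k] /= total

weights = {"turtle": 1 / 3, "flank": 1 / 3, "siege": 1 / 3}
for move in ["rush", "rush", "rush", "snipe"]:      # player spams rushes, then snipes once
    update_weights(weights, move)
enemy_tactic = random.choices(list(weights), weights=list(weights.values()))[0]
print(weights, "->", enemy_tactic)
```

The hard part is tuning rate and cap so it feels like the enemy is learning rather than cheating, which is exactly the balance problem above.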
 
This thread is obviously taken over by 3 Nvidia cheerleaders, who are unable to accept facts.

Jensen said what he said, because he is delusional and thinks people should buy his piss-poor products because he is hip. Nvidia's RTX cards are a flop and can't do what they, or the cheerleaders keep pronouncing. And don't show us a game with max frames, show us a game with 5% minimums... and how RTX makes your frames jump all over the place and erratic. Nobody is buying a new GPU to go back to 5 year old frames. Just buy it.... doesn't work with smart people.


Nvidia doesn't have a working Ray-Tracing solution and it is just a gimmick brought on by Jensen's greedy marketing. And too many people followed him into that RTX hole. Nvidia is going to have to wait a whole entire year to offer a card with real ray tracing. The problem is.... while you wait Nvidia wants you to think that RTX is legit and worth money. LOL @ all the feeble minds out there. Nobody games with RTX On unless you are a gullible fanboy.

Nvidia's ray tracing solution for games coming out is called Ampere. That doesn't come out until October 2020... like 13 months from now. So stop with the ray tracing bullsh!t it is real old. Tons of people who own RTX cards mocking Jensen's ray tracing... I don't know anybody who plays with RTX On... only one person on the forums has admitted to playing with rtx on.... Yet we have superstar cheerleaders who want a leather jacket... trying to sell here.


Laughable.
Everyone (including the cheerleaders) know that RTX On is a hoax and a big joke in the gaming industry. And yes, there are people who pretend not to live outside of HardForums who pretend not to know what is being said on reddit, or other sites. So they come here to spread their little lies and cheerlead.

Buying a GPU without ray-tracing is crazy, because Jensen has a sickness....
 
This thread is obviously taken over by 3 Nvidia cheerleaders, who are unable to accept facts.

What, all the AMD apologists and cheerleaders don't get a mention?

Jensen said what he said, because he is delusional and thinks people should buy his piss-poor products because he is hip. Nvidia's RTX cards are a flop and can't do what they, or the cheerleaders keep pronouncing. And don't show us a game with max frames, show us a game with 5% minimums... and how RTX makes your frames jump all over the place and erratic. Nobody is buying a new GPU to go back to 5 year old frames. Just buy it.... doesn't work with smart people.


Nvidia doesn't have a working Ray-Tracing solution and it is just a gimmick brought on by Jensen's greedy marketing. And too many people followed him into that RTX hole. Nvidia is going to have to wait a whole entire year to offer a card with real ray tracing. The problem is.... while you wait Nvidia wants you to think that RTX is legit and worth money. LOL @ all the feeble minds out there. Nobody games with RTX On unless you are a gullible fanboy.

Nvidia's ray tracing solution for games coming out is called Ampere. That doesn't come out until October 2020... like 13 months from now.

Already been refuted, take off the blinders. No one is denying it's too expensive for most, for what you currently get.

So stop with the ray tracing bullsh!t it is real old.

Then why are you bothering to post in the thread? Feels like a tantrum because you don't have the same toys as some of the others.

Tons of people who own RTX cards mocking Jensen's ray tracing... I don't know anybody who plays with RTX On... only one person on the forums has admitted to playing with rtx on.... Yet we have superstar cheerleaders who want a leather jacket... trying to sell here.


Laughable.
Everyone (including the cheerleaders) know that RTX On is a hoax and a big joke in the gaming industry. And yes, there are people who pretend not to live outside of HardForums who pretend not to know what is being said on reddit, or other sites. So they come here to spread their little lies and cheerlead.

Buying a GPU without ray-tracing is crazy, because Jensen has a sickness....

Now you are just being nasty. We all know your opinion.
 

When the cards were initially announced, I wasn't that interested in ray tracing; I was most interested in what could be done on the tensor cores, including DLSS, especially DLSS 2X (hoping for something that improves the blurry post-process AA methods).

But in practice everything done with the tensor cores seems to have been a bust; DLSS looks pointless at best. DLSS 2X never showed up, we are almost a year out from these cards now, and it is still missing in action. Denoising ray tracing was the other feature: AFAICT, BFV and Metro use their own temporal denoising.

Don't hold your breath on getting anything useful out of the tensor cores. I think they were mainly put there for use in the pro cards, and they are trying to justify their inclusion on the consumer cards.

Ray tracing, on the other hand, will only grow in importance, with new games getting it all the time.

"Control" releases today, and it's getting very good reviews, makes extensive use of Ray Tracing, and it's one of the free games with current Super cards.
Review: Control is Remedy’s best game yet—and a ray tracing masterpiece
Ray tracing on compatible PCs looks so incredible, it's worth the downgrade in resolution.

Basically I expected DLSS and Tensor cores to be the big bonus, in reality that is a bust and Ray Tracing is actually the main feature.
 

From the review:

The ugly
  • How much you have to pay in 2019 to get this game's ray traced version to run efficiently.
In other words you aren't running it on $400 hardware as we've already discussed ad nauseam.
 

Much like I already mentioned in the Metro Exodus post above, you apply an uneven double standard to RT performance, where you insist on 1440p at 60 FPS or it's a failure/obsolete.

But $400 cards already fail to achieve that in rasterization alone. By your logic, $400 cards are already dead for rasterization performance, let alone RT performance.

Here is the 1440p performance for Control without RT on a variety of cards. Note that $400 cards (2060 Super, 5700 XT) FAIL to achieve even an average of 60 FPS without RT:
[Image: Control 1440p performance chart]

If you want to maintain your zealot's position of requiring 1440p/60 FPS, your $400 card is already obsolete before we even get to ray tracing.

Or, as I said in the Metro post above, you can recognize the necessity of compromise (even on rasterization) and then check what options are available for a $400 card:
[Image: 2060 Super settings/performance chart]
 
I find it funny how on [H], where we bang tens, make $100k+, and pride ourselves on the fastest rigs, all of a sudden everyone is financially frugal in RTX threads. ;)


That's awesome: you get RT basically for free due to DLSS.
 