Nvidia CEO says buying a GPU without raytracing is crazy...

Status
Not open for further replies.
Much like I already mentioned in the Metro Exodus post above, you apply an uneven double standard to RT performance, insisting on 1440p at 60 FPS or it's a failure/obsolete.

But $400 cards already fail to achieve that in rasterization alone. By your logic, $400 cards are already dead for rasterization performance, let alone RT performance.

Here is 1440p performance for Control without RT on a variety of cards. Note that $400 cards (2060 Super, 5700 XT) FAIL to achieve even an average of 60 FPS without RT:
[Attachment 183197: Control 1440p benchmark chart, RT off]

If you want to maintain your zealot's position of requiring 1440p/60 FPS, your $400 card is already obsolete before we even get to raytracing.

Or, as I said in the Metro post above, you can recognize the necessity of compromise (even on rasterization); then you can check what options are available for a $400 card:
[Attachment 183198: Control settings/performance options for a $400 card]

I'm not saying that new cards don't struggle with performance on new titles. I'm pointing out that the RTX performance tax is very real and very large (again). From the same article, looking at the 2060 Super: if you average 55 FPS with RTX off, you average 32 FPS with RT high and 43 FPS with RT medium. RT medium cuts out 3/5 of the RT features, so you end up with nothing but a few RT reflections and a whole lot of rasterized rendering (i.e., what's the point?). Or you end up rendering at something like 900p (see your review above) and upscaling to get any semblance of playability with RT.
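For concreteness, here's the arithmetic behind that tax, using the review figures quoted above (a quick sketch; the FPS numbers are the thread's, not independently verified):

```python
# FPS figures for the 2060 Super quoted above from the Control review.
fps_off, fps_rt_high, fps_rt_medium = 55, 32, 43

# The "RT tax": fraction of frames lost when enabling each RT level.
tax_high = 1 - fps_rt_high / fps_off      # ~0.42 -> ~42% fewer frames
tax_medium = 1 - fps_rt_medium / fps_off  # ~0.22 -> ~22% fewer frames

print(f"RT high tax:   {tax_high:.0%}")
print(f"RT medium tax: {tax_medium:.0%}")
```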

Additionally, I find it amusing that you held up a new gold-standard RT game, and the review you quoted a few posts up clearly said the "Ugly" was how much you had to pay to make it look pretty using RT.

I find it funny how on [H], where we bang tens, make $100k+, and pride ourselves in the fastest rigs; all of a sudden everyone is financially frugal in RTX threads. ;)

A man in a leather jacket tells you that your previous $699 high-end card is now $1199 one generation later, and a year after launch you have barely a handful of titles that use all the new features? And when you enable those features you take a severe performance hit. And then the CEO tells people they're "crazy" if they don't listen to him and pay inflated prices.

But you're right...it's all about being cheap and not wanting to pay for performance :rolleyes:.
 

Technically, it's $999 for a 2080 Ti. 60% more transistors for 43% more cost (or even cheaper if you compare it to the Titan Xp). I realize this doesn't really matter, but it addresses the price-gouging claim, since they are actually doing it less with Turing, which brings me to the next point:

I looked at rasterized performance alone when deciding: 45% faster for 43% more cost, not ideal. I do personally have a fetish for lighting/shadows/reflections, so I do get off to RT, but it was not factored in.

My comment was more that everyone has become focused on the $300-500 segment only recently, and especially in RTX threads. ;)
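As a sanity check on the price/performance math above (a sketch using the thread's own figures: $699 vs. $999 MSRPs, and the claimed 45% uplift at 60-70 Hz):

```python
# Launch prices as cited in this thread (street prices vary).
msrp_1080ti = 699
msrp_2080ti = 999

cost_ratio = msrp_2080ti / msrp_1080ti  # ~1.43 -> "43% more cost"
perf_ratio = 1.45                       # claimed 45% faster at 60-70 Hz

# Performance per dollar barely moves: ~1.5% better than the 1080 Ti.
perf_per_dollar_gain = perf_ratio / cost_ratio - 1

print(f"Cost increase: {cost_ratio - 1:.0%}")
print(f"Perf/$ change: {perf_per_dollar_gain:+.1%}")
```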
 

Because they have Good/Bad/Ugly on every review, and they only tested on a laptop and a 2080 Ti, and NOTHING in between. From there you leaped back to your zealot position that you can't run it on a $400 card.

So I included a performance review that shows exactly how you can indeed run it on a $400 card. You have multiple options: you can run at 1080p, you can run 1440p with DLSS, and you have adjustable RT levels.
 

Man, you keep trying to stretch it; it's not 45% better than a 1080 Ti, most of the time it's 25% to 35%. You were promised DLSS and DLSS 2X, and that was a dud, and ray tracing is only OK if you're willing to lower your resolution from 4K. If I were you, I would be more pissed about Nvidia promising me things and not delivering. As for the $300 to $500 range, that is where most people at the enthusiast level spend, and only a tiny % spend more. A few stretched it last generation to get the 1080 Ti, and now Nvidia is convinced you will always spend just a bit more than last time.
 

It's 45% at 60-70 Hz, which is where I play. The lower numbers (25-35%) are at ~110+ Hz, which is not applicable to me. Sorry, I should have been more specific.

I don’t listen to nVidia or AMD, I look at reviews.
 

I'll rephrase, though. It's not so much "doesn't work now" as it doesn't work now with the IQ, framerates, and resolutions some people are looking for. I don't agree with Jensen that people are "crazy" if they buy something without it at this point, especially in the $300-400 range where the competition is.

If you're going to argue with me, at least argue with what I'm saying, not with what you think I'm saying through your rose-colored RTX glasses... DLSS isn't a magic-bullet fix either, as it reduces image quality to the point where Nvidia had to promise to improve it. I've never said RTX was a bad technology; I just don't think it's as valuable as you do in that $300-400 range.

Once again, circling the wagons back to the actual point. In the $300-400 price range, I don't think you're "crazy" if you opt out of RTX at this point.
 

You aren't just saying you don't think it's as valuable as I do.

You keep implying it's totally unusable on any of the more affordable cards.

You repeatedly say things like: "In other words you aren't running it on $400 hardware as we've already discussed ad nauseam." or "I would counter that better looking but unplayable FPS is worse than worse looking but playable FPS..."

Your definition of "unplayable" seems to be less than 1440p/60fps, and you seem unwilling to make any reasonable allowance to that.

Despite the fact that the very same metric fails to be reached in these games even without RTX (AKA hypocrisy).
 
The cores will never be used for in-game AI, so let's shoot that one down. Enemies learning isn't really a problem; it can be done. The problem is always how to balance it so they don't become frustratingly hard, which is a thing that can happen.
I should clarify that those were for cloud-based games. One of said studies is pretty interesting, as the system is basically playing an MMO with a few thousand instances of itself, where it is both the players and the monsters.
 
If you want to stick to 1440p/Ultra settings at 60 FPS, well, guess what: your 2060 Super already fails with RTX off, in traditional rasterization.

That's it, it's over: your 2060 Super is already unplayable in traditional rasterization-based games (the 2070 is a reasonable stand-in):

Eh, only this past year have reviewers been promoting, or at least suggesting via their tests, that 1440p is doable with those cards, when it clearly hasn't been. Hence many of the complaints about prices. I've also seen a fair number of reviews that bring up the 2060 and 1660 Ti as entry-level 1440p cards. It's clear that if you want Ultra at 1440p you're buying a 2080+.
 
Honestly, ray tracing is the least exciting part of the RTX lineup. I am far more interested in the other things those cores are capable of. The AI capabilities alone are fascinating; I have read some papers on game studios experimenting with AI enemies that learn as you beat them. So the enemies don't get harder by becoming bullet sponges that hit harder, but by learning to counter your play style. And think about MMOs where the dungeons and raids learn to counter the meta used against them.
We are a year in. I haven't read even a rumor, never mind a studio hyping such a feature in an upcoming game.
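To make the "enemies learn to counter your play style" idea concrete, here's a toy sketch (purely hypothetical; the move/counter table and class are invented for illustration, and no engine API or Tensor-core code is implied): the enemy tracks the player's move history and picks the counter to the most-used move, instead of just gaining more HP.

```python
import random
from collections import Counter

# Hypothetical move -> counter table; not from any real game.
COUNTERS = {"melee": "block", "ranged": "dodge", "magic": "resist"}

class AdaptiveEnemy:
    """Adapts by countering the player's dominant move, not by bullet-sponging."""
    def __init__(self):
        self.seen = Counter()

    def observe(self, player_move: str) -> None:
        self.seen[player_move] += 1

    def act(self) -> str:
        if not self.seen:
            return random.choice(list(COUNTERS.values()))
        favorite, _ = self.seen.most_common(1)[0]
        return COUNTERS[favorite]

enemy = AdaptiveEnemy()
for move in ["melee", "ranged", "melee", "melee"]:
    enemy.observe(move)
print(enemy.act())  # "block": the enemy counters the melee-heavy player
```

A real implementation would learn a policy from gameplay rather than use a lookup table; this only illustrates the balancing idea of countering style instead of inflating stats.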
 
Your head must be in the sand then. Most everyone has announced it now.

Perhaps post a link to such an article then, as that would be far more useful than asking if his head is in the sand. Because as far as I know, I have not heard of any studio talking about using Tensor cores for AI in games.
 
Look at OpenAI's Neural MMO. It's some really cool stuff.
So cool that it isn't discussed by the RTX-wanting crowd? (Kidding, I am sure it is cool, but it isn't mainstream and unfortunately doesn't change my points.)
Let me know when it is a reality. So far Nvidia has abandoned everything promised except RT, and even there... yikes.
Lately they have taken up competition with AMD by incorporating most of the latest AMD features.
Why the wait, Nvidia? Is DLSS a sham? It appears so now.
Does any of it require AI or Tensor cores? It doesn't on AMD.
What is the game here?
 
That was one example, for the MMO development, but how about Unreal's use of TensorFlow in Unreal Engine 4, or Unity's ML-Agents? There is huge money being spent by Google, Amazon, Microsoft, and Sony on AI engines. And yes, AMD has their own version of the Tensor cores, which they have stated is going in both the new Xbox and the PS5. Announcements have been going out like mad from just about every major publisher, and pretty much all the off-the-shelf game engines are on board; I wouldn't be surprised if EA and the like are hard at work cooking it into their in-house ones too.
 
Uh this is about Nvidia RT. I am SURE the console will be fine :)
 
He says ray tracing, but what he means is Tensor cores. The actual ray-tracing portion of their capability is the least interesting but easiest-to-showcase part of the technology.
 
It is all "least interesting" as it is doing fuck-all at this time. You have breaking news for us? :)
 
RT medium cuts out 3/5 of the RT features, so you end up with nothing but a few RT reflections and a whole lot of rasterized rendering (i.e., what's the point?). Or you end up rendering it at something like 900p (see your review above) and upscaling it to get any semblance of playability with RT.

That's pretty funny. In a few months we've gone from no raytracing at all to expecting games to be fully raytraced, otherwise there's no point. People will always be able to find something to hate on, but the haters will soon be drowned out. It's just a matter of time. And I expect that time to be the day that AMD has hardware raytracing.

Once again, circling the wagons back to the actual point. In the $300-400 price range, I don't think you're "crazy" if you opt out of RTX at this point.

Of course you're not crazy, as RTX is nowhere close to a must-have feature right now, and people are free to spend their money however they see fit. Looking at those Control numbers, you can get very playable RT performance on affordable hardware without DLSS. Why hate on the guy who wants to run RT on his 2060 @ 1080p?
 
Perhaps post a link to such an article then, as that would be far more useful than asking if his head is in the sand. Because as far as I know, I have not heard of any studio talking about using Tensor cores for AI in games.
I was referring to raytracing.
 

Not sure if you missed it, but even the GTX cards and AMD have simple ray-tracing ability. You don't have to spend $100 more to get ad-hoc ray tracing when you can get equally bad (i.e., gimped) ray tracing without paying for it.

Also, nobody is hating on those who have a 2060 and use RT... we are mocking them (i.e., making fun of them) for being illogical: spending money on a new card for more frames, and instead pretending they are happy with getting fewer frames than their previous card... unable to admit it is a gimmick.


Using RTX On is fine if you have the money and don't mind the performance hit, but you have to admit to yourself and others that it is a dumb choice. It's the same dweebs who insist it is a good value who are getting mocked, as everyone knows Nvidia's RTX cards are a horrible value; just compare them to Nvidia's previous Pascal cards for proof. The same Nvidia cheerleaders touting "RTX On" are a very small % of the gaming populace... because 95% of gamers spend for more frames.

And the 4 or 5 people in these forums who keep backing "RTX On" are not going to convince logically thinking people to buy into Nvidia's hoax.
 
I don't know about you, but there have been many times in the past when a new graphical paradigm came out with a massive performance hit.
Besides, you do realize you can turn off the RTX effects and get more FPS, right? Nobody is forcing you to use them; you get the option to. It's up to each individual to decide whether he or she would rather have more eye candy or better performance. In the end it's no different from figuring out which settings you want to play on.
For me personally, the raytraced effects are pretty amazing, especially in Metro and Quake. Support is still rather limited, but that's not exactly surprising with new tech. Can't wait to give Control a try over the weekend.
 
On top of that, I think this whole argument that the purpose of a new GPU is only to get you more frames is very reductionist.

Spending money on a new card for more frames, and instead they pretend they are happy with getting less frames than their previous card.... and can't admit it is a gimmick.
That isn't what's happening; you're twisting facts to suit your narrative. You get the option to exchange performance for extra effects, effects you can't even get otherwise.
Which is a totally valid choice either way: new rendering technologies are one of the things that allow people to make innovative, never-before-seen games. Sure, that's not going to happen within a year, but it's an important stepping stone.
 
Not sure if you missed, it, but even the GTX cards and AMD have simple ray-tracing ability. You don't have to spend $100 more to get ad-hoc ray-tracing, when you can get equally bad (ie: gimped) ray-tracing without paying for it.

I guess your plan is to just ignore all of the people advising you to give up your campaign of unsubstantiated and silly claims?

RT is just math. Some cards are faster at that math than others. Since you can get equally bad RT without paying for it then I’m sure you’ll share lots of examples of games shipping with “cheap” RT anytime now. Won’t hold my breath though.

Instead of wasting your time here shouldn’t you be petitioning your beloved Lisa to roll out a DXR driver for Navi and prove you right?
 