Nvidia CEO says buying a GPU without raytracing is crazy...

How about making powerful sub-$300 GPUs a thing again? I can't justify 2080 pricing if I'm not making money off the thing, and I don't mean mining.

Unfortunately, old-school silicon economics have been upended.

There are no more easy gains like in the days when die shrinks cut transistor costs in half; now transistor costs are flat, and potentially even increasing, with new die shrinks.
 
His language specifically targets a low-information audience, gamers (the majority of his revenue), and he's using generic tenth-rate corporate word salad and platitudes. Other than twenty-something fanboys, it's going to fall on deaf ears, because anyone with any sort of critical thinking knows he's trying to justify an overpriced product with features that will cripple your system, have little to no library of games, and are still way in their infancy,

He stated his opinion. Big surprise that it is positive about his company's products.

all because of greed and pleasing the shareholders.

It's a CEO's job to portray the company and its products in a positive light. Doing otherwise would get him fired.

But the numbers don't lie: the stock price has dropped significantly since last year, their other business sectors have slowed, and sales are down 31%

Factually correct, but missing the context of the crypto-mining boom. Sales were up 41.5% year over year for a couple of quarters... pretty nuts and obviously not sustainable growth. Taken in context, they've had 8% growth in profits each year over the last 2 years, with a 30.5% added boost due to the crypto craze, which lasted several quarters. Look at AMD's profits and you'll probably see the same pattern. It's not news or unexpected that sales are lower now that the mining boom is over.
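A rough back-of-the-envelope sketch of that point (illustrative numbers only, not Nvidia's actual financials): a one-time boost of that size simply unwinding looks a lot like the drop being quoted.

```python
# Illustrative math only: how a one-time mining boost unwinding
# can read as a large year-over-year sales drop.
baseline = 100.0            # pre-boom sales (arbitrary units)
boom = baseline * 1.415     # ~41.5% YoY jump during the crypto craze
post_boom = baseline        # demand reverts to the old baseline
yoy_change = (post_boom - boom) / boom
print(f"{yoy_change:.1%}")  # ~ -29%, in the same ballpark as the quoted 31% decline
```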

and the investors are leery because of the looming recession that's gaining more momentum thanks to tariffs, tax cuts, and the elephant in the room: Americans are just plain broke. Let me push back further: Jensen is aware of all the elements I listed above, and he has no choice but to gloss over these facts and project an image of confidence and certainty, otherwise he's going to be skewered by so-called market forces during the third-quarter reports.

Consumer spending is up, and the looming recession is thanks to Trump's trade war with China. What do you expect a CEO to be able to do about things 100% out of his control, do tell?

Just marinate on this thought for a moment: the default price for a flagship GPU is now $1,200. Imagine what that is going to be in two years.

No one likes the fact that prices have gone up, not sure what this has to do with Jensen's opinions on nVidia's current products... Doom and Gloom.

And I don't know if some of you have selective amnesia or what, but don't forget Jensen's behavior over the last year with the NDAs, the GeForce program pushed on the vendors, and now the price gouging; this is just the beginning.

The 5 year NDA was a bit ridiculous. Not sure I've ever heard that this was Jensen's decision.

The GeForce program with the vendors was about ensuring that their advertising dollars were not spent advertising an OEM's graphics cards from competing vendors. It was spun as something nefarious, when it was just about marketing dollars.

Price gouging? Possibly, as I don't know what it costs to make those GPUs (die manufacturing costs + card manufacturing costs + marketing costs + R&D (over years and years) + GeForce program costs to help game devs + website costs + who knows what else). Just as easily it's not, and is completely fair. Price will be what the market will bear.

Nvidia will continue to monopolize and cannibalize the market till nobody has any money left to buy bread and milk.

Not one person on this Earth is required to buy an nVidia product... if you are buying RTX cards before food, you need to talk to a counselor.

Let me reiterate this one more time, as I've said in the past: the future of gaming is ray tracing, period. But none of that matters when you have a narcissistic CEO who only knows one thing, GREED; everything else is secondary.

You are describing every successful CEO out there. Maybe not the "narcissistic" comment...:

Dictionary said:
nar·cis·sis·tic
/ˌnärsəˈsistik/
adjective
  1. having an excessive or erotic interest in oneself and one's physical appearance.
Not sure how wearing a leather jacket translates into narcissism... You're just name-calling at this point.

Jensen's cavalier and duplicitous attitude toward his audience is just a reminder that quarterly reports and his stupid-ass jacket are all that's important to him.

This all sounds like your own opinion.
 
For me the 2080 Ti is just NOT worth it at all. I won't degrade my playing to 1080p just to use RT. Now, if the 2080 Ti could run 4K @ 60 fps with RT on, then I wouldn't even hesitate to buy one, but at this time I just see no value in the card. Even if you are buying a new card now, RT isn't a feature that's going to be useful for another 3-4 years, till the feature becomes more mainstream, and you're probably going to be buying a new card by then that can do RT at your native resolution for less money.

You touch on something here which is almost always ignored. I agree that any mainstream implementation of ray tracing is years away at this point, and that's a huge issue with the statement made by the CEO. Ray tracing is very resource-intensive, which is the reason we have not had a real-time implementation of it until now. nVidia's current implementation is used for what amounts to a minor effect with a large performance hit.

Since it's likely to be at least a few years before mainstream adoption of anything beyond token implementations, what kind of performance do you think you'd get in a few years with current cards? nVidia's RTX cards already struggle with the token implementations we're seeing today, and that's at the high end. The lower-end RTX cards can't do anything with ray tracing and still maintain usable performance. How, then, can it be worth getting a current ray tracing card for use with ray tracing 3+ years down the road?

Ray tracing implementations in games will only become more resource-hungry and require stronger hardware to be usable. There's practically no chance that current RTX cards will be able to handle any sort of ray tracing implementation in games 3+ years from now, unless games 3+ years from now have no better ray tracing implementations, meaning that ray tracing has totally stalled out. If that is the case, ray tracing will be essentially dead simply because it was unable to move forward.

The fact of the matter is that buying an RTX card now to use for ray tracing in the future is a dead end.
 
The fact of the matter is that buying an RTX card now to use for ray tracing in the future is a dead end.

Exactly. The potential Navi or Super buyers aren't going to have enough ray tracing power to do anything with in Jensen's "crazy" future if they buy RTX, so they might as well buy whatever makes them happy now.

Most people buying RTX aren't doing it for the ray tracing. I know the only reason I have one is because it draws less power and the drivers are less screwed up than Navi's at this point. I turned on ray tracing once in Shadow of the Tomb Raider on medium just to see what all the fuss was about. I took about a 25% hit performance-wise and didn't really notice a big difference in the few scenes I looked at.
 
Ray tracing implementations in games will only become more resource-hungry and require stronger hardware to be usable. There's practically no chance that current RTX cards will be able to handle any sort of ray tracing implementation in games 3+ years from now, unless games 3+ years from now have no better ray tracing implementations, meaning that ray tracing has totally stalled out. If that is the case, ray tracing will be essentially dead simply because it was unable to move forward.

A facetious opinion. Better ray tracing will also come from developers getting better at using the new technology.

Just like it takes a while before developers come to terms with new console generations.

Except this is more radical than a new console generation, and we haven't even seen one game designed from the ground up with RT in mind. Everything is just tacked on at the end, with devs who are just getting their feet wet with the technology.

The fact of the matter is that buying an RTX card now to use for ray tracing in the future is a dead end.

More facetious muck.

All graphics cards are eventually a dead end in the future.

The truth of the matter is that Moore's law has stalled, at least on the economic side; there are no more free transistors to make big leaps with.

Where are the transistors coming from to suddenly boost ray tracing higher? At the expense of reducing raster performance? Seems unlikely.

GPUs are not leaping forward anymore; they are grinding slowly forward, and it will be the same for RT performance.

It moves forward more slowly, but that means old cards become obsolete more slowly as well, and that applies to both raster and RT performance.
 
A facetious opinion. Better ray tracing will also come from developers getting better at using the new technology.

Just like it takes a while before developers come to terms with new console generations.

Except this is more radical than a new console generation, and we haven't even seen one game designed from the ground up with RT in mind. Everything is just tacked on at the end, with devs who are just getting their feet wet with the technology.



More facetious muck.

All graphics cards are eventually a dead end in the future.

The truth of the matter is that Moore's law has stalled, at least on the economic side; there are no more free transistors to make big leaps with.

Where are the transistors coming from to suddenly boost ray tracing higher? At the expense of reducing raster performance? Seems unlikely.

GPUs are not leaping forward anymore; they are grinding slowly forward, and it will be the same for RT performance.

It moves forward more slowly, but that means old cards become obsolete more slowly as well, and that applies to both raster and RT performance.

But it's not facetious muck. For one, you can look back at the last 4 years of graphics cards to predict a future trend. You're just advocating that the trend ends with current RTX cards because Jensen said so. If I look back 4 years from now and don't see a jump similar to the one between the 2015 flagships (the AMD 390X and the Nvidia GTX 980, the $400 cards of the time) and current cards, I'd be surprised.

I don't have any delusions that my 2060 Super is going to be able to do anything remotely ray tracing related 4 years from now, any more than I expect a GTX 980 to play games at 4K at playable rates.
 
Factually correct, but missing the context of the crypto-mining boom. Sales were up 41.5% year over year for a couple of quarters... pretty nuts and obviously not sustainable growth. Taken in context, they've had 8% growth in profits each year over the last 2 years, with a 30.5% added boost due to the crypto craze, which lasted several quarters. Look at AMD's profits and you'll probably see the same pattern. It's not news or unexpected that sales are lower now that the mining boom is over.



Consumer spending is up, and the looming recession is thanks to Trump's trade war with China. What do you expect a CEO to be able to do about things 100% out of his control, do tell?
You say that like AMD wasn't in the same position with the crypto bubble; they were hit even harder, yet check their numbers. AMD's cards were better at mining in general and not as good for games, so resale value dropped significantly once crypto burst. Nvidia still had a great gaming card, and it retained its value. I think it has more to do with a lack of a reason to upgrade for Nvidia customers. Those who bought a 1080 Ti haven't had much reason to upgrade in 3 years, regardless of mining. And with the new prices, even less reason. Blaming crypto when consumer spending is up is really not a strategy. You're not selling for other reasons at this point. The 1080 Ti was a good card, and gamers bought it. The 2080 cost more and didn't run any faster, and the Ti was priced way higher.
 
But it's not facetious muck. For one, you can look back at the last 4 years of graphics cards to predict a future trend. You're just advocating that the trend ends with current RTX cards because Jensen said so. If I look back 4 years from now and don't see a jump similar to the one between the 2015 flagships (the AMD 390X and the Nvidia GTX 980, the $400 cards of the time) and current cards, I'd be surprised.

I don't have any delusions that my 2060 Super is going to be able to do anything remotely ray tracing related 4 years from now, any more than I expect a GTX 980 to play games at 4K at playable rates.
Someone reasonable, OMG, lol. The 2060 Super isn't a bad card at all, but people are delusional when they hear Jensen say you need RTX. The 2060 (and Super) are not going to run RT games for the next 4 years.
 
For me the 2080 Ti is just NOT worth it at all. I won't degrade my playing to 1080p just to use RT. Now, if the 2080 Ti could run 4K @ 60 fps with RT on, then I wouldn't even hesitate to buy one, but at this time I just see no value in the card. Even if you are buying a new card now, RT isn't a feature that's going to be useful for another 3-4 years, till the feature becomes more mainstream, and you're probably going to be buying a new card by then that can do RT at your native resolution for less money.

Given my experience at 3440x1440 (~5 megapixels), we're probably looking at around 50 fps at 4K (~8 megapixels). That's assuming no VRAM limitation.

So Ampere should be able to do RT at 4K, IMO. We shall see!
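A minimal sketch of the scaling assumption behind that estimate (the ~84 fps ultrawide figure is hypothetical, and linear scaling of frame time with pixel count is a simplification):

```python
# Rough estimate only: assumes frame time scales roughly linearly with
# pixel count and nothing else (VRAM, CPU) becomes the bottleneck.
uw_pixels = 3440 * 1440      # ~4.95 MP ultrawide
uhd_pixels = 3840 * 2160     # ~8.29 MP 4K
uw_fps = 84                  # hypothetical ultrawide frame rate
uhd_fps = uw_fps * uw_pixels / uhd_pixels
print(f"{uhd_fps:.0f} fps")  # ~50 fps at 4K under these assumptions
```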
 
Given my experience at 3440x1440 (~5 megapixels), we're probably looking at around 50 fps at 4K (~8 megapixels). That's assuming no VRAM limitation.

So Ampere should be able to do RT at 4K, IMO. We shall see!
Here's to hoping, time will tell.
 
Here's to hoping, time will tell.

I really wish they'd put a minimal amount of RT on every card and have an add-on RT card. I read once that it's super easy to do multi-chip RT, but who knows if it's true.
 
But it's not facetious muck. At least one the one hand you can look back to the last 4 years of graphics cards to predict a future trend. You're just advocating that the trend ends with current RTX cards because Jensen said so. If I look back 4 years from now and don't see a similar jump between the 2015 flagships: AMD 390X and the Nvidia GTX980 (the $400 cards of the time) and current cards, I'd be surprised.

Except the trend line has changed drastically since then. Nothing to do with Jensen. Everything to do with silicon economics.

This wasn't happening 4 years ago:
https://www.anandtech.com/show/1476...ay-1-dr-lisa-su-ceo-of-amd-live-blog-145pm-pt

Dr Lisa Su said:
04:54PM EDT - Cost per transistor increases with the newest processes





I don't have any delusions that my 2060 Super is going to be able to do anything remotely ray tracing related 4 years from now, any more than I expect a GTX 980 to play games at 4K at playable rates.

You are kind of agreeing with my point where I said: "All graphics cards are eventually a dead end in the future."

Your card won't be unusable for RT exactly 4 years from now, though (and who said they are buying for 4 years in the future?). Some points:

  • As mentioned above, performance gains are stagnating because silicon economics have shifted. Again, that's Dr. Lisa Su stating this, not Jensen. So this means longer cycles for HW, with smaller performance jumps.
  • While the high end will get higher, they will be expanding RT cards into a lower niche, so that performance level will become a larger part of the market that devs won't ignore. Say the RTX 3050 Ti performs like the RTX 2060, and the RTX 4050 like the 2060 Super. In 4 years the 2060 Super may be near the lowest end of new RT cards, but it won't be off the scale.
  • As mentioned before, devs will get better at incorporating RT effects, tuning them, and having ranges of RT settings (low, medium, high, ultra).
  • On top of that, even if it did play out that way, wouldn't you rather have a few years where you could use RT effects than none at all? Even if the dire prognostications came true, RT for a couple of years would still be better than ZERO years.

So I'm really not getting how this "4 years from now RT on the 2060S will be useless" argument makes any sense.
 
Unfortunately, old-school silicon economics have been upended.

There are no more easy gains like in the days when die shrinks cut transistor costs in half; now transistor costs are flat, and potentially even increasing, with new die shrinks.
Eh, well, that's why my 970 will be more than fine for me. Plus, bonus: 1080p monitors are cheaper than 4K monitors :D
 
A facetious opinion. Better ray tracing will also come from developers getting better at using the new technology.

Just like it takes a while before developers come to terms with new console generations.

Except this is more radical than a new console generation, and we haven't even seen one game designed from the ground up with RT in mind. Everything is just tacked on at the end, with devs who are just getting their feet wet with the technology.



More facetious muck.

All graphics cards are eventually a dead end in the future.

The truth of the matter is that Moore's law has stalled, at least on the economic side; there are no more free transistors to make big leaps with.

Where are the transistors coming from to suddenly boost ray tracing higher? At the expense of reducing raster performance? Seems unlikely.

GPUs are not leaping forward anymore; they are grinding slowly forward, and it will be the same for RT performance.

It moves forward more slowly, but that means old cards become obsolete more slowly as well, and that applies to both raster and RT performance.

This is a load of horseshit and if you've been around for more than a few hardware iterations you'd know this. That means you either don't have a clue what you're talking about or you're lying.

I dare you to go back through the history of the GPU and do a comparison of new features and technologies. You're going to find in practically every case that the first generation or two of GPUs with new features and technologies were effectively worthless at using those features once mainstream support for them became reality. As time goes on, the implementations of those features do not require less processing; they require more as the features are more fully implemented. This is fact borne out by history and the complete opposite of your facetious ramblings here.

First generation RT is dead in the water. Only the top end models can even make any use of RT at somewhat acceptable performance and that's with very limited amounts of RT in the games. Tacked on implementation or not is irrelevant. Ray tracing takes massive amounts of processing power and the current lineup doesn't have enough processing to make it worthwhile at this time and that situation is not going to get any better in the future for this generation of cards.
 
Seems pretty expensive for what it is so far. I'd like to believe I'm in for a treat, but the only thing sounding crazy is someone telling me I need to spend $1k+ to replace my perfectly functioning video card just so I can lose FPS and run RTX in very few compatible games (not to mention, I think 1440p at 144 Hz/FPS is probably more noticeable anyway and NOT worth losing). Pass for now. Even if it's cool, at this moment it's for people who burn money like they don't need it.
 
The RTX cards of today are sort of like the 3Dfx Voodoo 1 cards from 1996 in that they are the first generation of a brand new paradigm shift in GPU technology. Maybe not quite to the same degree, however, at least not to me - yet.

RTX technology and hardware ray tracing in general has a similar "this is the future" vibe like the 3Dfx technology had, but it's not quite as obvious or convincing as Voodoo 1 was. One look at the Voodoo 1 games and everyone was like "Wait.. a PC can do this?!" One look at the RTX games and it's like "Yeah, pretty neat, but that fps drop..."
I guess the major difference is that with the Voodoo 1 cards, the speed and smoothness was the selling point whereas with RTX the visuals are the selling point, fps be damned. And that doesn't convince most people that the technology is the future, even though it will be - just perhaps not with this generation of hardware. But you have to start somewhere.

So far I am not helping fund this new RTX technology via the purchase of an RTX card (ignoring my previous Pascal purchases), whereas I got my hands on an Orchid Righteous 3D Voodoo 1 card pretty quickly when I learned about it and never looked back. It's not that I cannot afford an RTX card, but to me buying one this generation seems like kind of a waste because it feels like one step forward and one step back in the same card. It's hard to describe.
 
Buying a 2060 Super to use with only a 1080p monitor is a bit of a waste. And if I have a 1440p or 4K monitor but have to lower the resolution to 1080p to play RTX titles, that's even more wasteful.

So now the 2060S is too powerful. Got it.
 
The fact of the matter is that buying an RTX card now to use for ray tracing in the future is a dead end.

Perfectly valid opinion. A bit crystal-ball-y, but probably accurate for low-end RT cards, though not necessarily for the high-end ones.

Exactly. The potential Navi or Super buyers aren't going to have enough ray tracing power to do anything with in Jensen's "crazy" future if they buy RTX, so they might as well buy whatever makes them happy now.

Another valid opinion.

Jensen's opinion: If you are in the market for a new card, might as well get one that can do raytracing.

Another perfectly valid opinion.

If I was in the market for a card right now, I would definitely be getting one that can do raytracing.

Someone buying a card today can hope to get 2 to 3 years out of it if they are lucky (probably less if it's low-end). It might be about 3 years before a game designed from the top down for raytracing comes out, and we can only speculate how much raytracing power it will need. It might be that hybrid raytracing will dominate RT designs for the next 10 years, for all we know... and if that is the case, today's raytracing cards will have longer legs than most are giving them credit for.

I think the raytracing performance and game implementation will follow the same pattern as when AA first arrived. There was what, 2xAA, 4xAA, 8xAA, 16xAA? So you bought a card with AA capability, a game comes out a year later, your card can only do 2xAA in this game, but new cards can do 8xAA. That doesn't mean you aren't getting the benefit of the new technology. Then a year later, after your card has aged, you upgrade, and now you can play that same game at 16xAA.

It's the normal cycle. Anyone stating "but, these cards will not be able to raytrace for shit in 4 years?!!?!" isn't doing anything but stating the obvious. I can tell you that I will not be using the same GPU I have in my rig today in 4 years...
I don't care if RTX card Y is going to be obsolete in 4 years, because any card made from any brand is going to be obsolete in 4 years.

My opinion: if you are buying a new card right now, get the 2070 Super as a minimum, or the 2080 Super if you can afford it. You will get a good 3+ years out of those cards, especially if you were leaning towards the other options on the cheap end (2060 Super or 5700 XT), because with those cards you are running at 1440p anyway, or are running 4K and are OK with low frame rates (maybe you play RTS games primarily and fps isn't a factor for you), and the cards I recommended will be able to do those things farther into the future. I.e., you will get your money's worth. And maybe, just maybe, you will play a game with raytracing and even find that you enjoy it!
 
So now the 2060S is too powerful. Got it.
That's not even remotely what that meant.

If I have to play at 1080p to have RTX at a reasonable frame rate while having a monitor with a higher native resolution, then it's quite wasteful. I got a monitor with a higher resolution than 1080p because I want to play and use it at those higher resolutions. I'm not gonna lower it for a feature that, IMO, is currently just a gimmick.
 
So I'm really not getting how this "4 years from now RT on the 2060S will be useless" argument makes any sense.
It was in the original statement from Jensen.... He said 2, 3, or 4 years, or some other ramblings. Meaning if you buy a card, you may hold onto it for 4 years. So, the point from us is, there's no way a 2060 or 2070 is going to come close to running games with RT in 4 years, as they can't even do it with current games. In other words, the 5700 and 5700 XT are going to last just as long as a 2060 Super, as neither will be very useful with RT.
 
It was in the original statement from Jensen.... He said 2, 3, or 4 years, or some other ramblings. Meaning if you buy a card, you may hold onto it for 4 years. So, the point from us is, there's no way a 2060 or 2070 is going to come close to running games with RT in 4 years, as they can't even do it with current games. In other words, the 5700 and 5700 XT are going to last just as long as a 2060 Super, as neither will be very useful with RT.

I mean... there are settings selectors. You might just have to run it on very low, and currently low is way better than nothing.

I actually ran my 65” 4K TV at 1080p when I had my Radeon VII hooked up to it. There wasn’t a huge difference between 4K and 1080p at 7’ away. So 1080p RT wouldn’t bother me in that scenario. That might be an edge case though.

Now, 1080p on a monitor in my face I definitely notice.

Overall I agree. I’ve been saying that at ~$700 and up I definitely want RT. And cards at that tier have the power to drive it at reasonable rates.

So there are a few factors that basically come down to opinion with RT. It’s not as clean as it used to be when giving advice to friends/family, that’s for sure.
 
I mean... there are settings selectors. You might just have to run it on very low, and currently low is way better than nothing.

I actually ran my 65” 4K TV at 1080p when I had my Radeon VII hooked up to it. There wasn’t a huge difference between 4K and 1080p at 7’ away. So 1080p RT wouldn’t bother me in that scenario. That might be an edge case though.

Now, 1080p on a monitor in my face I definitely notice.

Overall I agree. I’ve been saying that at ~$700 and up I definitely want RT. And cards at that tier have the power to drive it at reasonable rates.

So there are a few factors that basically come down to opinion with RT. It’s not as clean as it used to be when giving advice to friends/family, that’s for sure.
I don't disagree that if you lower settings considerably you could run things, and lots of people do this, especially in twitch/competitive games. But if the argument is that you're using RT so you have to lower a bunch of settings and the resolution, you kinda lose some of the benefit.
 
Jensen: Buying a GPU without Raytracing is CRAZY!
[image: gtx1660cat.jpg]


Yeah, no, Jensen. Calling your customers crazy is a strange card to play. Considering none of the games I play have RT features and none of the games I'm looking forward to playing have it either, I'll gladly be "crazy" and save money by not buying overpriced gen-1 RTX hardware.
 
It was in the original statement from Jensen.... He said 2, 3, or 4 years, or some other ramblings. Meaning if you buy a card, you may hold onto it for 4 years. So, the point from us is, there's no way a 2060 or 2070 is going to come close to running games with RT in 4 years, as they can't even do it with current games. In other words, the 5700 and 5700 XT are going to last just as long as a 2060 Super, as neither will be very useful with RT.

Yes, but it seems like some people (you included) are failing to realize that if you keep a card for 4 years:

  • Your card doesn't stop working for all new RT games even 4 years from now. There will be some that are more challenging and some less challenging. You might have to turn off RT in the more demanding games, but not in all of them, not to mention the older games you still like from that time frame.
  • You still get all the time in between where RT perf is good enough, until you deem it not good enough. That will likely be a couple of years.
  • Key Element: If you buy the alternative without RT HW, and keep it for the same amount of time, you get absolutely ZERO RT capability for the whole 4 years. In what universe is that better??
 
Key Element: If you buy the alternative without RT HW, and keep it for the same amount of time, you get absolutely ZERO RT capability for the whole 4 years. In what universe is that better??

The cards the person you quoted mentioned can't even handle current games with RT features on. They're never going to handle ones going forward. With or without RT hardware is irrelevant when the GPU itself lacks the power to utilize that hardware.
 
Yes, but it seems like some people (you included) are failing to realize that if you keep a card for 4 years:

  • Your card doesn't stop working for all new RT games even 4 years from now. There will be some that are more challenging and some less challenging. You might have to turn off RT in the more demanding games, but not in all of them, not to mention the older games you still like from that time frame.
  • You still get all the time in between where RT perf is good enough, until you deem it not good enough. That will likely be a couple of years.
  • Key Element: If you buy the alternative without RT HW, and keep it for the same amount of time, you get absolutely ZERO RT capability for the whole 4 years. In what universe is that better??
The part that some people are missing (you included) is that there aren't games you can really turn up with a 2060 or even a 2070 today. Your argument that you can keep playing these older games is moot, because they can't do it now, so why would it get better in 4 years?? So I can either get no RT capability or almost no capability... w00t, I'd be crazy to pick almost nothing. It's funny because I even recommended someone the 2070S over the 5700 XT; if they are the same price, there's no reason to get the 5700 XT. If the 5700 XT is cheaper, I would not hesitate to recommend it, as you're not going to miss out on much besides a few tech demos and the possibility of running games really slowly just to see an effect that has so far only looked OK in one game.
Your card doesn't stop working; it works the same as it does today, and it's great for rasterizing with decent thermals! You're not getting anything from RT, and that's not going to get better as games become more demanding.
Key element: I would easily give up RT that I can't and never will be able to use to save some money. The only cards that may be OK going forward are the 2080 Super (maybe) and the 2080 Ti (still iffy). They have no direct competition from AMD, so this is a pretty moot point anyway. If you want 2080 Ti-level performance, you get a 2080 Ti regardless of RT support, it's the only option, and this one specific bracket of purchasers has a (decent) chance that their card still works well in 2 years. 4 years... possibly still OK, maybe lower a couple of settings but still run well.
 
The part that some people are missing (you included) is that there aren't games you can really turn up with a 2060 or even a 2070 today. Your argument that you can keep playing these older games is moot, because they can't do it now, so why would it get better in 4 years?? So I can either get no RT capability or almost no capability... w00t, I'd be crazy to pick almost nothing. It's funny because I even recommended someone the 2070S over the 5700 XT; if they are the same price, there's no reason to get the 5700 XT. If the 5700 XT is cheaper, I would not hesitate to recommend it, as you're not going to miss out on much besides a few tech demos and the possibility of running games really slowly just to see an effect that has so far only looked OK in one game.
Your card doesn't stop working; it works the same as it does today, and it's great for rasterizing with decent thermals! You're not getting anything from RT, and that's not going to get better as games become more demanding.
Key element: I would easily give up RT that I can't and never will be able to use to save some money. The only cards that may be OK going forward are the 2080 Super (maybe) and the 2080 Ti (still iffy). They have no direct competition from AMD, so this is a pretty moot point anyway. If you want 2080 Ti-level performance, you get a 2080 Ti regardless of RT support, it's the only option, and this one specific bracket of purchasers has a (decent) chance that their card still works well in 2 years. 4 years... possibly still OK, maybe lower a couple of settings but still run well.

Cue the Tomb Raider 1080p at medium ray tracing slide as a response :p.

Honestly, I just fired up Shadow of the Tomb Raider and didn't see enough difference at medium to warrant a 25% performance hit.

I only have the 2060S because I got a good deal on it. I definitely didn't buy it for RT. I wouldn't hesitate to use a 5700 XT though if I got a similar good deal.
 
Cue the Tomb Raider 1080p at medium ray tracing slide as a response :p.

Honestly, I just fired up Shadow of the Tomb Raider and didn't see enough difference at medium to warrant a 25% performance hit.

I only have the 2060S because I got a good deal on it. I definitely didn't buy it for RT. I wouldn't hesitate to use a 5700 XT though if I got a similar good deal.
This... I'm not saying they are bad cards at all. I honestly think the CEO and everyone else should shut up about how useful RT is going to be and just focus on what the cards are actually really good at. Yes, RT is a neat thing and may eventually be cool, but his cards aren't going to magically get better. Your 2060S does great in most games at reasonable settings; as you noticed, RT isn't reasonable, even if it's cool to check out (and you do get that option). The 5700 would work just as well today and in 3-4 years.
 
The part that some people are missing (you included) is that there aren't games you can really turn up with a 2060 or even a 2070 today. Your argument that you can keep playing these older games is moot, because they can't do it now, so why would it get better in 4 years?? So I can either get no RT capability or almost no capability... w00t, I'd be crazy to pick almost nothing. It's funny because I even recommended someone the 2070S over the 5700 XT; if they are the same price, there's no reason to get the 5700 XT. If the 5700 XT is cheaper, I would not hesitate to recommend it, as you're not going to miss out on much besides a few tech demos and the possibility of running games really slowly just to see an effect that has so far only looked OK in one game.
Your card doesn't stop working; it works the same as it does today, and it's great for rasterizing with decent thermals! You're not getting anything from RT, and that's not going to get better as games become more demanding.
Key element: I would easily give up RT that I can't and never will be able to use to save some money. The only cards that may be OK going forward are the 2080 Super (maybe) and the 2080 Ti (still iffy). They have no direct competition from AMD, so this is a pretty moot point anyway. If you want 2080 Ti-level performance, you get a 2080 Ti regardless of RT support, it's the only option, and this one specific bracket of purchasers has a (decent) chance that their card still works well in 2 years. 4 years... possibly still OK, maybe lower a couple of settings but still run well.

Given CDPR's love of advanced (demanding) graphical features, I wouldn't be surprised if we see the 2080S and 2080 Ti struggling with RT in Cyberpunk. So 4 years might even be overestimating how long the high-end cards are viable with RT in new games.
 
"Everyone should have at least 3 cards that can handle 12 monitors each doing 8K with tie in to VR with ray tracing and ability to make at least 3 cups of coffee. Oh... and btw, the gold plated editions of the cards are really the only ones to have. Send us your D&B and we'll work out a lease."
 
Post #95. 80-100 FPS with RT on.
Quake 2: old game, very optimized renderer, uses RT for lighting and actually looks decent.
1080p, 2060S: 38 fps, with minimums dropping to 30. Battlefield V doesn't even really need much comment.
Not sure where you pulled that chart from, but seeing as the 2060 Super only hits 105 fps in Shadow of the Tomb Raider, I don't think that 95 was with settings turned up.
https://www.guru3d.com/articles-pages/geforce-rtx-2060-and-2070-super-review,12.html

To its benefit, this is one of those games that works well even @ 40-50 fps.
https://www.legitreviews.com/nvidia-geforce-rtx-2060-super-and-2070-super-video-card-review_212668/9
Oh look, it averages 63 with mins at 42... so just on the cusp of OK.

Metro? 55/32... Yeah, where's your chart from? Every other benchmark I've found does not match your #'s.
https://www.legitreviews.com/nvidia-geforce-rtx-2060-super-and-2070-super-video-card-review_212668/3

And these are links from 3 different independent sources, and they all agree +/- a few %. None of them look like your chart.

Honestly, this isn't a card that in 2+ years is going to be holding its own with RT turned on. The 2070 (S) are a little better, but... meh. Keep drinking the Kool-Aid though. We're not even talking about 4K... Not even 1440p... This is @ 1080p. Anything else is really a non-starter and not even worth discussing.

Edit: just noticed I only put 2 of the links I was looking at; I was looking at like 4-5 benchmark sites at the time. Tom's and Eurogamer are 2 others off the top of my head.
 
Quake 2: old game, very optimized renderer, uses RT for lighting and actually looks decent.
1080p, 2060S: 38 fps, with minimums dropping to 30. Battlefield V doesn't even really need much comment.
Not sure where you pulled that chart from, but seeing as the 2060 Super only hits 105 fps in Shadow of the Tomb Raider, I don't think that 95 was with settings turned up.
https://www.guru3d.com/articles-pages/geforce-rtx-2060-and-2070-super-review,12.html

To its benefit, this is one of those games that works well even @ 40-50 fps.
https://www.legitreviews.com/nvidia-geforce-rtx-2060-super-and-2070-super-video-card-review_212668/9
Oh look, it averages 63 with mins at 42... so just on the cusp of OK.

Metro? 55/32... Yeah, where's your chart from? Every other benchmark I've found does not match your #'s.
https://www.legitreviews.com/nvidia-geforce-rtx-2060-super-and-2070-super-video-card-review_212668/3

And these are links from 3 different independent sources, and they all agree +/- a few %. None of them look like your chart.

Honestly, this isn't a card that in 2+ years is going to be holding its own with RT turned on. The 2070 (S) are a little better, but... meh. Keep drinking the Kool-Aid though. We're not even talking about 4K... Not even 1440p... This is @ 1080p. Anything else is really a non-starter and not even worth discussing.

Edit: just noticed I only put 2 of the links I was looking at; I was looking at like 4-5 benchmark sites at the time. Tom's and Eurogamer are 2 others off the top of my head.

I am curious where his chart is from.

You have to be careful, since the majority of sites use built-in benchmarks that are way harsher than the actual game. Metro is one of these.
 
Metro? 55/32... Yeah, where's your chart from? Every other benchmark I've found does not match your #'s.
https://www.legitreviews.com/nvidia-geforce-rtx-2060-super-and-2070-super-video-card-review_212668/3

Link is an image from the Linus Tech Tips review of the 2060S. Do note that the settings are indicated at the top of the chart. Settings are DXR Medium, which may be a large part of the reason they are higher. Other sites seem to be using DXR Ultra.

Edit: Found the original with some Googling:

 
I am curious where his chart is from.

You have to be careful, since the majority of sites use built-in benchmarks that are way harsher than the actual game. Metro is one of these.
I agree, but that's a pretty big discrepancy between his chart and every other chart I've found, plus multiple owners' first-hand accounts (some even just a few posts up).
Honestly, I still think RT is going to stick around and become much better; I just don't have confidence that these are the cards that are going to bring it. It looks good in instances of GI like Quake 2, but that murders the performance. Puddles are cool and all, but they're not what makes a scene immersive. I think if they tone down the mirrors and focus on the lighting and such, it will be much better received. Heck, they may even figure out a way to do these things without hammering performance and make RTX something people really do want now. Until that happens, though, I will continue with my pessimism.
 
Post #95. 80-100 FPS with RT on.

@1080p with DXR on Medium. Is the trade-off in resolution worth whatever Medium DXR provides? Personal preference, I suppose, but the other posters are right: as RT is implemented in more games and the quality improves, the power required will go up, and the existing cards will take a larger hit than they do now.
 
Link is an image from the Linus Tech Tips review of the 2060S. Do note that the settings are indicated at the top of the chart. Settings are DXR Medium, which may be a large part of the reason they are higher. Other sites seem to be using DXR Ultra.

Edit: Found the original with some Googling:


Possibly, it might also be due to early benchmarks being stuck in my head...

"The implementation was plagued by technical issues including poor reflection resolution, things being incorrectly reflected, and loads of noise. On top of this, enabling ray tracing on even the lowest settings cut the game's frame rate in half or even into a third depending on the GPU and resolution at hand. This made the RTX 2080 Ti the only card suitable for ray tracing, and that was using the Low setting at 1080p"
https://www.techspot.com/review/1759-ray-tracing-benchmarks-vol-2/

But even after this updated review...
"Time to look at the RTX 2080. In Tirailleur at 1080p the improvements for Ultra and Low were 64% and 30%, respectively, when compared to the game pre-patch. However the game is still more than 50% faster with DXR disabled, compared to DXR Low."

Well, it's still 50% faster with DXR disabled. That's a big performance difference no matter how you slice it.
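For context on what that stat means in practice (the fps numbers below are hypothetical; only the 50% ratio comes from the review): "50% faster with DXR disabled" works out to roughly a one-third frame-rate drop when DXR Low is enabled.

```python
# Converting "X% faster with DXR off" into the fps cost of turning DXR on.
fps_dxr_off = 90.0                # hypothetical frame rate with DXR disabled
fps_dxr_on = fps_dxr_off / 1.5    # "50% faster with DXR disabled" => divide by 1.5
drop = 1 - fps_dxr_on / fps_dxr_off
print(f"{fps_dxr_on:.0f} fps, a {drop:.0%} drop")  # 60 fps, a 33% drop
```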

Post-patch (after the performance increase in the game), the 2080 Ti is running at 94/64 on one map and 105/74 on another @ 1080p (DXR on Low; much worse at higher settings). It also still has visual issues: "again there’s not a large difference between all four DXR modes, and some of the issues with godrays being incorrectly reflected still appear to be present."

So... performance improved on average 50% over the original review for Battlefield V, but the 2080 Ti still dips into the mid 60s/70s (anything lower than the 2080 Ti is obviously dipping much lower), it still has a lot of visual issues, and the performance upgrade came at the expense of some visual quality (big surprise): "However there is a visual quality downgrade that I spotted in this area, this new patch has introduced some artifacting to the way ray traced reflections are handled."

So again, my overall impression of current implementations and cards... anything less than a 2080 Ti or 2080 Super won't be able to handle things unless they really tone it down. The issue is that the fewer rays cast, the less believable it will be and the less benefit the scene gets.

Surprisingly (to me anyway), the 2070 isn't horrid @ 1080p Low. "The RTX 2070 is a GPU we called useless for ray tracing last time because it couldn’t even achieve 60 FPS at 1080p with the Low DXR mode". After the patch it's hitting 73/58, which I would consider smooth/playable. There is still a huge performance difference, as noted: "However, yes you guessed it, performance is ~50% higher with DXR disabled compared to on the Low mode." I would consider this playable, though, and possibly worth it depending on how you feel about the looks. It still has some visual issues, especially with the aggressive culling they introduced (that's how they got higher speeds: they removed parts of the scene); it seems the reflections sometimes miss objects, which can lead to some distraction, and some lighting is still out of place (sun/god rays showing where they shouldn't be).

Again, this is a bit better than I remember from when I was looking into it, but it's still nothing I would say is going to keep up in 2-4 years. It seems they are already dialing back the effects from their initial releases due to performance issues, but obviously the side effect is lower quality.
 