Nvidia CEO says buying a GPU without raytracing is crazy...

Immersion is a funny thing. I think raytracing is neat, but developers seem to overdo the effect.

I remember when World of Warcraft came out it was designed to run on low-end systems. The world was very cartoon-like, but Blizzard spent their graphics power wisely. The thing they really concentrated on was how things moved - how players moved, how players danced, how animals moved, how birds flew, even little things like how chickens, squirrels and rabbits cavorted. It created an immersive effect on a very low budget.

Totally agreed. IMO we're just rolling deeper into the uncanny valley. The more realistic the scene the more your brain subconsciously notices the minute flaws in the scene.

People keep saying give it time... but I suspect GPU power needs to grow exponentially to fool our brains. I doubt that's going to happen in the next few years. The human brain is the most powerful computer in the known universe. Can you actually 'imagine' a photorealistic scene in real time?

That said, I'd like to see a ray-traced Thief game just to see what's possible. Maybe it could be used to augment particular art/gameplay styles in a good way.
 
That said, I'd like to see a ray-traced Thief game just to see what's possible.

I hadn't thought of that, but sneaking games are one of my favorites and you are bang-on, raytracing would certainly add the right tone (and I will happily lug a body 200 yards just for the satisfaction of carrying it - I can spend endless amounts of time in a good sneaker.)


P.S. It'd be good for a few horror titles, too.

P.P.S. "Ladies, if you are looking for that 'young lover' experience, I can promise you that I have the brains of a German Shepherd and the body of a 17-year-old, and if you don't believe me they're both in the trunk of my car." - Bob Saget
 
He’s not really wrong though. If you’re going to buy a product from a company, you should buy their current generation of product.

It is the same with AMD. If you’re going to buy an AMD card, you’re going to be best off if you buy one that just came out rather than one that came out two years ago.

As CEO, he isn’t really allowed to say things like, “users at the lower end of the market would be better served by buying a two year old used card from the FS forum on [H].” That would be a breach of his fiduciary duty and he could get in trouble for that even if it were true.
 
My problem, as someone who owns a 2080 Ti, is REALLY noticing the differences. I mean, I feel like graphics are getting to the point where it's nigh impossible to see differences big enough to need a huge new GPU every generation. It's the same story with consoles. We will never see a generational leap again in the vein of PS1 to PS2 or PS2 to PS3.

I have played Metro Exodus and the Quake RTX thing, but I just have trouble saying that the expense of an RTX card is justified by what I was seeing. Developers have gotten so good at creating tricks over the years to give us really good lighting, and plenty of games have looked incredible without ray tracing, so I would say it's not really a GPU-selling feature to me.

I guess my problem is having to point out to myself, "Oh, look at that ray-traced thing!" It would be cool if there were some ray-tracing feature like in the Halo remake, where on the Xbox you could switch instantly from the new to the old graphics with a quick button press. That would be a cool way to see the difference.
 
He’s not really wrong though. If you’re going to buy a product from a company, you should buy their current generation of product.

It is the same with AMD. If you’re going to buy an AMD card, you’re going to be best off if you buy one that just came out rather than one that came out two years ago.

As CEO, he isn’t really allowed to say things like, “users at the lower end of the market would be better served by buying a two year old used card from the FS forum on [H].” That would be a breach of his fiduciary duty and he could get in trouble for that even if it were true.

Except when they release the same product (1080 Ti) with RTX effects tacked on (2080) and then mark it up considerably higher than the card it's replacing at launch. I get that they had a lack of competition for a while, but that sort of sleazy behavior tends to bite them in the ass when competition finally arrives, hence why AMD has been getting a lot of glowing press lately.
 
Except when they release the same product (1080 Ti) with RTX effects tacked on (2080) and then mark it up considerably higher than the card it's replacing at launch. I get that they had a lack of competition for a while, but that sort of sleazy behavior tends to bite them in the ass when competition finally arrives, hence why AMD has been getting a lot of glowing press lately.

He didn’t say that people with 1080tis should upgrade.
 
He’s not really wrong though. If you’re going to buy a product from a company, you should buy their current generation of product.

It is the same with AMD. If you’re going to buy an AMD card, you’re going to be best off if you buy one that just came out rather than one that came out two years ago.

As CEO, he isn’t really allowed to say things like, “users at the lower end of the market would be better served by buying a two year old used card from the FS forum on [H].” That would be a breach of his fiduciary duty and he could get in trouble for that even if it were true.
The 1660 Ti is their current generation... so, yeah, their current gen doesn't have RT hardware. That was my point.
 
I've had a 2080 ti for months now and never used it, nor really gave a shit.


That almost read like a Trump tweet. Just a long ramble of how great everything is.

For it to be like that, we would have to anxiously listen to half the nVidia employees tell us how nVidia is the worst company that has ever existed. And then the other half, who enjoy working at nVidia, we would just ignore because they are bigots (obviously).
 
I've had a 2080 ti for months now and never used it, nor really gave a shit.


That almost read like a Trump tweet. Just a long ramble of how great everything is.

The joke I was going to make is "Nvidia has offered a press relations job to Donald Trump in case things don't work out in his reelection bid."
 
Full disclosure: I've owned nothing but Nvidia products since 2008, but looking at this shit-show of a comment from Mr. Huang, I can only offer my perspective on the situation. "IF" you are a professional user in the field of deep learning, AI, quantum chemistry, or mathematics, or, like myself, in the 3D modeling and rendering field, the RTX card is a boon for us, because in comparison to a Quadro card it performs just as well.

With that said, however...

His language is targeted specifically at a low-information audience, gamers (the majority of his revenue), and he's using a generic tenth-rate corporate word salad of platitudes that, outside of twenty-something fanboys, is going to fall on deaf ears, because anyone with any sort of critical thinking knows he's trying to justify an overpriced product whose headline feature will cripple your system, has little to no library of games, and is still way in its infancy, all because of greed and pleasing the shareholders.

But the numbers don't lie: the stock price has dropped significantly since last year, their other business sectors have slowed, sales are down 31%, and investors are leery because of the looming recession that's gaining momentum thanks to tariffs, tax cuts, and the elephant in the room, which is that Americans are just plain broke. Let me push back further: Jensen is aware of all the elements I listed above; he has no choice but to gloss over these facts and project this image of confidence and certainty, otherwise he's going to be skewered by so-called market forces in the third-quarter reports. Just marinate on this thought for a moment: the default price is now $1,200 for a flagship GPU; imagine what that is going to be in two years. And I don't know if some of you have selective amnesia or what, but don't forget Jensen's behavior over the last year with the NDAs, the GeForce program pushed on vendors, and now the price gouging; this is just the beginning. Nvidia will continue to monopolize and cannibalize the market till nobody has any money left to buy bread and milk.

Let me reiterate this one more time, as I have in the past: the future of gaming is ray tracing, period. But none of that matters when you have a narcissistic CEO who only knows one thing, GREED; everything else is secondary. Jensen's cavalier and duplicitous attitude toward his audience is just a reminder that quarterly reports and his stupid-ass jacket are all that matter to him.
 
Full disclosure: I've owned nothing but Nvidia products since 2008, but looking at this shit-show of a comment from Mr. Huang, I can only offer my perspective on the situation. "IF" you are a professional user in the field of deep learning, AI, quantum chemistry, or mathematics, or, like myself, in the 3D modeling and rendering field, the RTX card is a boon for us, because in comparison to a Quadro card it performs just as well.

With that said, however...

His language is targeted specifically at a low-information audience, gamers (the majority of his revenue), and he's using a generic tenth-rate corporate word salad of platitudes that, outside of twenty-something fanboys, is going to fall on deaf ears, because anyone with any sort of critical thinking knows he's trying to justify an overpriced product whose headline feature will cripple your system, has little to no library of games, and is still way in its infancy, all because of greed and pleasing the shareholders.

But the numbers don't lie: the stock price has dropped significantly since last year, their other business sectors have slowed, sales are down 31%, and investors are leery because of the looming recession that's gaining momentum thanks to tariffs, tax cuts, and the elephant in the room, which is that Americans are just plain broke. Let me push back further: Jensen is aware of all the elements I listed above; he has no choice but to gloss over these facts and project this image of confidence and certainty, otherwise he's going to be skewered by so-called market forces in the third-quarter reports. Just marinate on this thought for a moment: the default price is now $1,200 for a flagship GPU; imagine what that is going to be in two years. And I don't know if some of you have selective amnesia or what, but don't forget Jensen's behavior over the last year with the NDAs, the GeForce program pushed on vendors, and now the price gouging; this is just the beginning. Nvidia will continue to monopolize and cannibalize the market till nobody has any money left to buy bread and milk.

Let me reiterate this one more time, as I have in the past: the future of gaming is ray tracing, period. But none of that matters when you have a narcissistic CEO who only knows one thing, GREED; everything else is secondary. Jensen's cavalier and duplicitous attitude toward his audience is just a reminder that quarterly reports and his stupid-ass jacket are all that matter to him.

LOL
 
Well, I can say I am liking DLSS... If I am stopped and just looking around I see a slight decrease in visual quality, but running/flying around I see only an increase in quality and FPS, so it's a trade I am willing to make.
 
My problem, as someone who owns a 2080 Ti, is REALLY noticing the differences. I mean, I feel like graphics are getting to the point where it's nigh impossible to see differences big enough to need a huge new GPU every generation. It's the same story with consoles. We will never see a generational leap again in the vein of PS1 to PS2 or PS2 to PS3.

Yeah those days are long gone unless we discover some ancient alien manufacturing tech in a cave somewhere. That’s the case for every technology though. Bigger breakthroughs early on and smaller improvements as it matures.

For the last few years I've appreciated technology advancements in reverse. Thanks to a deep Steam backlog, I'm still playing games that are almost 10 years old. When you go backwards is when you really realize what's missing from those old games. Going forward, it's much harder to appreciate new stuff.
 

I doubt it, but my guess is it's going to be more like PhysX - the proprietary stuff didn't go anywhere, but the generic brand-agnostic implementation is used all over the place.

Ray tracing as a concept isn't a gimmick, it's literally, objectively more accurate for lighting. But I don't think "RTX" is going to be the popular implementation.
 
Ray tracing as a concept isn't a gimmick, it's literally, objectively more accurate for lighting. But I don't think "RTX" is going to be the popular implementation.

For GPU cards, I'm pretty sure it will be.
 
Not if it's exclusive to nVidia. Same thing happened with GPU PhysX.

RTX is mainly just branding for NVidia's HW RT implementation.

AMD will have branding for theirs.

Intel will have branding for theirs.

Given historical GPU sales trends, I expect NVidia will sell more cards with HW RT than either Intel or AMD, thus it will be the most popular.
 
Not if it's exclusive to nVidia. Same thing happened with GPU PhysX.

RTX is an implementation of a Microsoft API. PhysX is an implementation of an Nvidia API.

Why do people keep comparing RTX to PhysX? Nvidia’s marketing is really amazing.

It's true that, because Nvidia is first out of the gate, games will optimize for Nvidia hardware, but those games will run just fine on any other DXR hardware in the future. That's no different from the current situation with DirectX or Vulkan, though.
 
Immersion is a funny thing. I think raytracing is neat, but developers seem to overdo the effect. In the games I've seen with raytracing turned on, the rendered scene is either too dark (to allow you to see the raytrace effect) or the effect is neon bright (to allow you to see the raytrace effect). It's too much, and just for the sake of the effect. It's like women who draw in false eyebrow arches - too much is unnatural.

I remember when World of Warcraft came out it was designed to run on low-end systems. The world was very cartoon-like, but Blizzard spent their graphics power wisely. The thing they really concentrated on was how things moved - how players moved, how players danced, how animals moved, how birds flew, even little things like how chickens, squirrels and rabbits cavorted. It created an immersive effect on a very low budget.

Raytracing is awesome, but I'd rather see better physics.
It's like when 3D came out in movies (again).
The best example I've seen yet was a subtle shadow effect vs. normal hard shadows. That's what I want to see.
 
I think the context of his statement is clearly "buy Super cards and don't buy Navi." Referencing Pascal cards at this point doesn't make sense.
How about referencing Turing cards... like the 1660 Ti and others. I mean, for someone who doesn't spend $1k on a GPU, RT makes no sense. A lot of people buy < $300 cards, some of which his company produces. So he should stop making them if he feels that way; I'm sure AMD would be more than happy to keep selling cards to people who don't want or need, or can't/won't pay, that much for a GPU.
 
Not if it's exclusive to nVidia. Same thing happened with GPU PhysX.
You've got it wrong... Nvidia implemented DirectX Raytracing; their implementation is called RTX. It's not locked to their hardware like PhysX, so there is no comparison. It's just an implementation of a standard.
 
How about referencing Turing cards... like the 1660 Ti and others. I mean, for someone who doesn't spend $1k on a GPU, RT makes no sense. A lot of people buy < $300 cards, some of which his company produces. So he should stop making them if he feels that way; I'm sure AMD would be more than happy to keep selling cards to people who don't want or need, or can't/won't pay, that much for a GPU.

But it wasn't referencing Turing, because, again, context: he was asked how Super was doing, and how it was projected to do.

The answer is rambling, but simplified, it amounts to (with context-based implications spelled out for you):

"If you are buying a new GPU, in the same competitive space as Super, and you are going to keep it for 2 to 4 years, then you are crazy not to get Super since it has Ray Tracing, which is taking off."

That isn't that much of an unreasonable argument in context, especially from the CEO of the company that makes that product.

If you are choosing between the Super series and the 5700 series and plan to keep it 2-4 years, that is 2-4 years you are locked out of ray tracing while it takes off. Certainly not the choice I would make.

Now, if you are prepared to buy another new GPU the moment a game comes out with RT that you like and would want to experience/experiment with, then you are freer to choose the cards without RT.

But hey, I realize it's a slow news/rumor cycle, so taking inane comments out of context and ranting about them is what happens. ;)
 
You've got it wrong... Nvidia implemented DirectX Raytracing; their implementation is called RTX. It's not locked to their hardware like PhysX, so there is no comparison. It's just an implementation of a standard.

Not sure I really understand what you are saying. Basically, what you just said could be interpreted as: AMD can just take what Nvidia developed for their card and use it as they see fit, right down to the exact same hardware. Otherwise, it is exclusive, since games have to be made to support it specifically, from what I am seeing.
 
Not sure I really understand what you are saying. Basically, what you just said could be interpreted as: AMD can just take what Nvidia developed for their card and use it as they see fit, right down to the exact same hardware. Otherwise, it is exclusive, since games have to be made to support it specifically, from what I am seeing.

RTX is an umbrella for a few technologies.

For RT, devs use toolkits that make implementing Microsoft's DXR easier; most software stacks have something like this. The code would have to be modified for AMD, but I imagine the changes would be minimal. AMD should have toolkits of their own very similar to nVidia's, since they use the same base API.

We've already seen hardware-agnostic demos. Devs can program against DXR alone and it'll work on anything.
 
For me the 2080 Ti is just NOT worth it at all. I won't degrade my playing to 1080p just to use RT. Now, if the 2080 Ti could run 4K @ 60 fps with RT on, I wouldn't even hesitate to buy one, but at this time I just see no value in the card. Even if you are buying a new card now, RT isn't a feature that's going to be useful for another 3-4 years, until it becomes more mainstream, and by then you're probably going to be buying a new card that can do RT at your native resolution for less money.
 
Not sure I really understand what you are saying. Basically, what you just said could be interpreted as: AMD can just take what Nvidia developed for their card and use it as they see fit, right down to the exact same hardware. Otherwise, it is exclusive, since games have to be made to support it specifically, from what I am seeing.
The game is written to support DXR (DirectX Raytracing). As long as AMD's drivers support DXR, the games will just work on their hardware, regardless of the actual hardware implementation, which I assume will not be a direct copy of Nvidia's hardware. It could even be done with shaders if they wanted, and it would still work. M$ defined the API; the manufacturer is free to implement it as they see fit. Software developers won't have to write any code to support a specific vendor. That said, since Nvidia was out first, games may not be as well optimized for AMD (due to devs not having the hardware to test), but they will still work. There is no exclusivity here.
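To make that concrete, here's a minimal sketch of the kind of vendor-neutral check a game could do, assuming a Windows/D3D12 environment (it's not from any actual game, just an illustration of the Microsoft side of the API). Nothing in it mentions a GPU vendor; it simply asks D3D12 what raytracing tier the installed card's driver reports:

```cpp
// Minimal sketch: query DXR support through the vendor-neutral D3D12 API.
// Assumes a Windows SDK with d3d12.h; error handling kept to a minimum.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

int main() {
    using Microsoft::WRL::ComPtr;

    // Create a device on the default adapter (whatever GPU is installed).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available.");
        return 1;
    }

    // OPTIONS5 carries the RaytracingTier capability defined by Microsoft,
    // not by any particular GPU vendor.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    const bool queried = SUCCEEDED(device->CheckFeatureSupport(
        D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5)));

    if (queried && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("DXR supported: the game can enable its ray-traced path.");
    } else {
        std::puts("DXR not supported: fall back to rasterized lighting.");
    }
    return 0;
}
```

The same binary would report support on an AMD or Intel card the day their drivers expose a raytracing tier, which is the point: the game targets the API, not the vendor, so the PhysX comparison doesn't hold.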
 
The game is written to support DXR (DirectX Raytracing). As long as AMD's drivers support DXR, the games will just work on their hardware, regardless of the actual hardware implementation, which I assume will not be a direct copy of Nvidia's hardware. It could even be done with shaders if they wanted, and it would still work. M$ defined the API; the manufacturer is free to implement it as they see fit. Software developers won't have to write any code to support a specific vendor. That said, since Nvidia was out first, games may not be as well optimized for AMD (due to devs not having the hardware to test), but they will still work. There is no exclusivity here.

More likely that Microsoft and NVidia defined the DXR API together.
 
More likely that Microsoft and NVidia defined the DXR API together.
That doesn't change any of what I said. It's still an M$ API that AMD is free (and planning) to support. Also, you realize DX12 was mostly AMD and M$ working together, with ideas from Vulkan to reduce overhead, and Vulkan was basically a copy of AMD's Mantle. So it's not as if AMD has/had no say in the features of these APIs.

PS: the point was that this isn't the same as PhysX, which was locked to a vendor; that was my original response.
 
That doesn't change any of what I said. It's still an M$ API that AMD is free (and planning) to support. Also, you realize DX12 was mostly AMD and M$ working together, with ideas from Vulkan to reduce overhead, and Vulkan was basically a copy of AMD's Mantle. So it's not as if AMD has/had no say in the features of these APIs.

PS: the point was that this isn't the same as PhysX, which was locked to a vendor; that was my original response.

Absolutely, AMD will eventually support DXR, though I expect there will initially be some glitches that need to be patched, since none of the game vendors will have had a chance to test on AMD's implementation before their games shipped.
 
Absolutely, AMD will eventually support DXR, though I expect there will initially be some glitches that need to be patched, since none of the game vendors will have had a chance to test on AMD's implementation before their games shipped.
Yes, but that's a far cry from the argument that it's an Nvidia-only thing like PhysX.
 