Is nvidia marketing to blame for the RTX complaints?

lol I love the “unless it’s a diamond unicorn that shits loot at $600, it has no value” argument.

The 2080ti is 50% larger and IIRC has twice the components of the Titan Xp, but costs the same. It's 30-45% faster in the 4K/60 range, plus it brings the first real graphics change in a decade. Metro does ray-traced GI at 1440p/100 fps. That’s awesome.

So yes, someone messed up big time at nVidia, since the rasterization performance increase wasn’t even touched on at the reveal.
 
In a way you hit the problem though: the only 2000 series card worth a damn is the 2080ti.

Why people keep banging on about the size being important I'll never understand. I get it, but I'm a hardware enthusiast and the vast majority of people are not, thus do not, and never will care that the card is bigger and has more parts. ALL they care about is: how fast is it, and how much does it cost? The reality of that larger size is that there is very little performance gain over the last generation, at an increased price.

The 2080 highlights this perfectly: you get 1080ti levels of performance for $100 more than the 1080ti, because nVidia is banking on technology that has not been proven yet. So far RTX is really a 2080ti-only feature if you want decent frame rates, and DLSS looks like my toddler smudged up my glasses.

No one but you is saying it needs to be a diamond unicorn; people are saying that a $100 price increase for the same performance is insulting, and it is.
 
Yep. It's almost hard to say it, but the 2080ti is the only card that is worth it, price tag aside. I got mine, a Zotac AMP, like new for less than $1000, vs. $1400 after tax at retail. I guess I'd call it an expensive good deal lol.
 
I agree about the performance/cost. I personally “black box” things and only care about the performance/cost, not how they get there. The only way it’s relevant is when people get emotional and claim nVidia is increasing margin; by any logical thinking they actually reduced margin to push RTX features. Although it was to get a stranglehold on a new market, not to push innovation for us plebs... it was a ballsy move regardless.

I think the guy a few posts up saying the card adds no value unless it can do 4K/144Hz wants the GPU equivalent of that diamond unicorn that shits loot. ;)
 
I can't argue with that, but I think that guy doesn't understand where performance is currently sitting. The 1080ti/Titans were trying for 4K/60Hz; 4K/144 is something like a 60-ish percent increase on top of that, and that's ignoring the fact that Pascal can't even push some titles at 4K/60.
 
Maybe when it comes to 2080 Ti prices we should also discuss the premiums vendors put on their cards. I bought a Palit 2080 Ti Gaming Pro for 1100 euros shipped. Meanwhile Asus or MSI cards are 100-300 euros more expensive, and for what? Slightly less noise and slightly better cooling, maybe a tiny bit higher stock overclocks. Yet they don't seem to perform much better than a reference card with a BIOS swapped in for a higher power limit.
 
Real-time ray tracing @ >700fps on a mid-range RTX 2070
It is so fast that even a screenshot of it moves !!!!!!!!!!!!!111

Try doing that on your 1080Ti or Vega VII. I dare you =)
 
That's not happening for a few generations, and it's not something everyone wants or needs. Not to mention there are only two displays that can do those refresh rates at 4K in the first place, and they have limitations of their own.

It's mistaken to say that everyone chases the fastest framerate their display can handle. As long as games run above 60 fps most of the time I'm good; G-Sync/FreeSync handles the dips below that pretty nicely. I went from a 980 Ti to a 2080 Ti because I wanted to make it a bit more future-proof for when I get a 4K desktop display later this year. I've got a 1440p 144 Hz display, yet I've decided to downsample from 4K for most games because I appreciate the increased visual fidelity more than running at 100+ fps vs 60+ fps. I don't play multiplayer shooters where framerate is king. I'm definitely the type who would enable all the RTX options because they look better, even if they incur a heavy hit to the framerate.

Actually, it could have happened this generation if they hadn't gone full-on ray tracing boner.
 
Because the 2080ti is the only one that will even hold 4K/60 FPS+ in newer titles. You jumped straight to 4K/144Hz without even recognizing that the 2080ti is the first card to do 4K/60Hz properly.

I have an X27. You're correct: in the latest titles I can't even hit 100 FPS+ with a 2080ti. At the very least I'm holding above 60, though.

Additionally, due to link bandwidth limitations, you can't even run 4K/144Hz unless you downgrade to 4:2:2 chroma; otherwise you can set it to 120Hz with 8-bit RGB, or, if you've got software that can use it, 98Hz with 10-bit RGB.
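For anyone who wants to sanity-check those numbers, here's a rough back-of-the-envelope script. It ignores blanking intervals and protocol overhead, so real requirements are a bit higher, and the 25.92 Gbit/s figure assumes DisplayPort 1.4 HBR3 after 8b/10b encoding:

```python
# Rough DisplayPort 1.4 bandwidth check for a 4K panel like the X27.
# Ignores blanking and protocol overhead, so real-world requirements
# are somewhat higher than these numbers.

DP14_EFFECTIVE_GBPS = 25.92  # HBR3: 32.4 Gbit/s raw, ~25.92 Gbit/s after 8b/10b

def needed_gbps(width, height, hz, bits_per_pixel):
    """Uncompressed video bandwidth in Gbit/s (no blanking)."""
    return width * height * hz * bits_per_pixel / 1e9

modes = [
    ("4K 144 Hz, 10-bit RGB",   3840, 2160, 144, 30),
    ("4K 144 Hz, 8-bit RGB",    3840, 2160, 144, 24),
    ("4K 144 Hz, 10-bit 4:2:2", 3840, 2160, 144, 20),
    ("4K 120 Hz, 8-bit RGB",    3840, 2160, 120, 24),
    ("4K 98 Hz, 10-bit RGB",    3840, 2160,  98, 30),
]

for name, w, h, hz, bpp in modes:
    gbps = needed_gbps(w, h, hz, bpp)
    verdict = "fits" if gbps <= DP14_EFFECTIVE_GBPS else "does NOT fit"
    print(f"{name}: ~{gbps:.1f} Gbit/s -> {verdict} in DP 1.4")
```

Which lines up with the X27's options: full RGB tops out around 120Hz at 8-bit (or ~98Hz at 10-bit), and 144Hz needs chroma subsampling.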

Ultimately the 2080ti was the only card that would at least keep me above 60 FPS in most titles, and that to me adds value. My 1080Ti would dip regularly into the 30-40 FPS range in titles like AC Odyssey, and even with G-Sync, once you start getting into that 30-40 FPS range it becomes noticeable.

Finally, as more titles that properly utilize ray tracing, like Metro, come out, that feature set will become worth it, and the 2080ti is the only card that will give acceptable framerates at 4K with ray tracing without sacrificing much quality by turning down settings elsewhere. The ray-traced global illumination in Metro is absolutely stunning, and it's the first title in quite a while that left me wowed by the graphics. I haven't really been wowed by graphics since Crysis first came out. Seeing HDR for the first time in a game wowed me too, but that was more of a display feature, not a GPU compute feature.

haha Metro looks WORSE with ray tracing on than off, arguably.
 
Actually, it could have happened years ago if they hadn't gone full-on shaders boner.

At least shaders have some value in the here and now. The ray tracing shit is just snake oil.

What has history taught us? General purpose always wins. Even shills admit that you could do ray tracing with generic compute shaders. You don't want hardware specialized to one thing that most people don't even care about right now.
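To be fair, the point that ray tracing is "just math" is true as far as it goes: the per-ray work is generic arithmetic any ALU can run, and what the RT cores specialize in is BVH traversal and ray/triangle intersection. A toy sketch of that core math in plain Python/NumPy, standing in for a compute shader; it has nothing to do with how RTX actually implements it:

```python
import numpy as np

# Toy "compute shader": one ray per pixel against a single sphere.
# Real GPU ray tracing adds BVH traversal, shading, bounces, etc.; this is
# only the core intersection math, which any general-purpose ALU can run.

def ray_sphere_hit(origins, dirs, center, radius):
    """Per-ray hit distance, np.inf where the ray misses (dirs normalized)."""
    oc = origins - center
    b = np.einsum("ij,ij->i", oc, dirs)          # dot(oc, dir) per ray
    c = np.einsum("ij,ij->i", oc, oc) - radius ** 2
    disc = b * b - c                              # discriminant
    t = -b - np.sqrt(np.maximum(disc, 0.0))       # nearest intersection distance
    return np.where((disc >= 0.0) & (t > 0.0), t, np.inf)

# Simple pinhole camera at the origin looking down -z, 640x360 "pixels".
w, h = 640, 360
xs, ys = np.meshgrid(np.linspace(-1, 1, w), np.linspace(-9 / 16, 9 / 16, h))
dirs = np.stack([xs.ravel(), ys.ravel(), -np.ones(w * h)], axis=1)
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
origins = np.zeros_like(dirs)

hits = ray_sphere_hit(origins, dirs, center=np.array([0.0, 0.0, -5.0]), radius=1.0)
print(f"{np.isfinite(hits).sum()} of {w * h} rays hit the sphere")
```

A real renderer wraps that in BVH traversal, shading and many bounces per pixel, which is exactly the part that gets painfully slow without dedicated hardware or a lot of brute force.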
 
Again, it depends on how you black box it. If we black box by price, you can get a 2080 for about the price of a 1080Ti, at least in theory atm. Both products had insane prices when they were new and stock was low. Now, for a $100 increase, or 1/7th (~14%), you get similar performance in today's games plus the RT/DLSS features. Is that worth it? Does it add value to you? That's highly subjective, and obviously some say yes, some say no. If it's not, get a 1080Ti. If it is, get the 2080. Again, it depends on what goals you have when purchasing the card. Games only? Games plus GPGPU programming? Games plus an interest in new tech? Value is subjective. However, that's not really the point of this thread...
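Just to put rough numbers on the "black box by price" view (the MSRPs below are approximate, and the "same performance" line is this thread's assumption, not a benchmark):

```python
# Rough value math for the "black box by price" framing.
# Prices are approximate US MSRPs; "same performance" is the thread's
# own assumption (2080 ~= 1080 Ti in today's rasterized games).

old_price, new_price = 699.0, 799.0   # 1080 Ti vs 2080 (approx.)
old_perf, new_perf = 100.0, 100.0     # relative rasterized performance

price_increase = (new_price - old_price) / old_price
perf_per_dollar_change = (new_perf / new_price) / (old_perf / old_price) - 1

print(f"Price increase: {price_increase:.1%}")                   # ~14.3%
print(f"Perf-per-dollar change: {perf_per_dollar_change:.1%}")   # ~-12.5%
```

So by price you're paying roughly 14% more for the same raster performance plus the RT/DLSS hardware; whether that hardware is worth ~14% is exactly the subjective part.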

Unless you're an AMD or Nvidia fanboy, I don't know how you could have come to any other conclusion than that from the presentation, especially as an enthusiast who claims to know how this market works. Literally the whole thing was last-gen CUDA cores + RT/DLSS, at an approximate given price. The only reason to get upset and emotional about the pricing is if it's a status buy. Otherwise, if you're upgrading, you get the same performance for about the same cost as the previous generation, plus RT/DLSS. Again, crazy early adopter markets excluded.

Now if you black box it by model number, which is the original premise and question of the thread (did Nvidia's marketing mess up?), it does appear that Nvidia messed up. Enough people buy expensive GPUs for status, like fancy cars, and they are upset emotionally that having the "best" is going to cost a lot more. A 2080 is more than a 1080. A 2080Ti is more than a 1080Ti. However, they are ~30% faster than the same "model," plus have RT/DLSS. Is that of value? Again, subjective.

Now, beyond simple black boxing, as enthusiasts we like to understand why. The why is simple: these chips are MASSIVE. The Titan V is closely related to these, not that much bigger, and was the largest IC TSMC could manufacture on the process. The only reason they cost as little as they do, or exist at all, is that the GPGPU market is pulling the cart here; these are broken GPGPU ICs. Again, obvious from the presentation. The GPGPU market benefits massively from the Tensor cores. Obviously not for all problems, but enough. If AMD had a competitive GPU, Nvidia would be forced to do a custom spin of these ICs, likely with all CUDA cores instead of CUDA/Tensor; i.e., they couldn't get away with selling broken GPGPU cards.

As for unproven tech...have you paid any attention to every tech company? Over the last two decades especially? That's literally their business model.
 
I am going to ignore the irrelevant items.

Value is not subjective in the broader market, only to the individual; it's actually elastic in nature. Value is a function of various factors, but in its most basic form it comes down to supply and demand.

It's elastic because we are not dealing with a need, but a want. nVidia can manipulate that elasticity through price, product segments and marketing.
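Since "elastic" gets thrown around loosely: price elasticity of demand is just the percentage change in quantity over the percentage change in price, and |E| > 1 means demand is elastic. A trivial sketch with made-up numbers, purely to illustrate the term:

```python
# Price elasticity of demand: E = (%ΔQ) / (%ΔP).
# |E| > 1 -> elastic (a want; buyers walk away when price rises)
# |E| < 1 -> inelastic (closer to a need)
# The quantities below are made up purely to illustrate the definition.

def elasticity(q0, q1, p0, p1):
    pct_q = (q1 - q0) / q0
    pct_p = (p1 - p0) / p0
    return pct_q / pct_p

e = elasticity(q0=100, q1=70, p0=700, p1=800)  # hypothetical: +14% price, -30% units
print(f"E = {e:.2f} -> {'elastic' if abs(e) > 1 else 'inelastic'}")
```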

What nVidia has done, in a nutshell, is charge more for an untested, generally unusable feature set. They then marketed that feature set as being so easy to implement that it "just works." Now, five months later, it is still largely absent. Those five months are at best a quarter of the product life cycle, at worst a little less than half.

And yes, I've been around at least as long as you have; I'm familiar with the early adopter tax. The problem is there isn't much choice for those who don't want to pay a tax for a feature that likely won't be worthwhile for a generation or two. Back in the '90s and '00s there was.

Given nVidia's GPP and marketing campaign, the backlash is normal. The fact that the 2080 is $100 more than the 1080ti for the same performance is a spit in the face from nVidia.
 
haha Metro looks WORSE with ray tracing on than off, arguably.

If that's your experience then that's fine, but I can say with absolute confidence that is not the way it is on my PC. Metro with ray tracing enabled is absolutely better. Yes, there is a sizable performance hit, but then there generally is when new hardware features come out.
 
Is nvidia marketing to blame for the RTX complaints?

More likely the problem is the quality issues and poor performance given the cost.
 
At least shaders have some value in the here and now. The ray tracing shit is just snake oil.
What has history taught us? General purpose always wins. Even shills admit that you could do ray tracing with generic compute shaders. You don't want hardware specialized to one thing that most people don't even care about right now.
The RTX 20x0 series is very similar to the GF3, and the GTX 10x0 to the GF2.
Back when shaders came out with the GeForce 3, they took up die space; without them, and with the same transistor count, the chip could have had more TMUs, increasing performance in all games by a lot. With more pure rasterization resources, gamers could have run the much higher resolutions their CRTs supported, or had much smoother gameplay at the resolutions typically used at the time. There were plenty of nice non-shader games, with more to come. Shaders usually just added shinier water or some lighting effects that, by your argument, were totally not worth it when weighed against much faster texture fill rates. Did they have the same value at the beginning as they did, say, a year later? No.
Adoption of any new tech is a rather slow process, but in the end, after the transition phase, and especially once better, more mainstream products come out, we cannot imagine the world without the new features. If you removed all the shaders and made a chip this massive with pure rasterization capabilities, it could put out an insane number of texture-filtered polygons at insane resolutions like 16K with ease.

Tensor cores are a killer feature for the AI research market, which has been growing incredibly fast in recent years. RT cores at this point do not take up that much die space, and they are a fairly generic resource that can be used for any number of things; they will prove especially useful for accelerating path tracing in programs like Blender and other offline renderers.
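On the Blender point, RT-core acceleration did ship there as the OptiX backend for Cycles (Blender 2.81+ on RTX cards). A minimal sketch of switching to it from a script; the property names are from the 2.8x Python API and could differ in other versions:

```python
import bpy

# Minimal sketch: point Cycles at the OptiX backend so path tracing runs
# on the RT cores (assumes Blender 2.81+ with an RTX card and recent driver).
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"   # fall back to "CUDA" on non-RTX GPUs
prefs.get_devices()                   # refresh the detected device list

for device in prefs.devices:
    device.use = device.type in {"OPTIX", "CUDA"}  # enable the GPU(s), leave CPU off

bpy.context.scene.cycles.device = "GPU"
```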

Quality issues aside, NV marketing failed to show that the added features are there to accelerate this kind of work and matter for the future of computer graphics; they never showed the same visuals running on shaders, so people could compare and see for themselves that the performance improvement for these kinds of calculations is massive.

And people will be people: spoiled consumers who do not even stop and think about what they are complaining about. <_<
 
Also, Nvidia pushed RTX and DLSS prominently like they were a game changer, but so far has failed to deliver.

There are several features of Pascal that AFAIK have yet to appear in games and have already been forgotten; no one made a big deal about them. Technologies like single pass stereo and simultaneous multi-projection. The latter was supposed to speed up not only VR but also 4K rendering; IIRC it promised something like a 30% performance improvement at 4K. Then there was that adaptive MSAA thing which gave 4xMSAA image quality at virtually no cost; I haven't heard about it since.
 
Single pass stereo is not unique to NVIDIA. Sony uses it for their VR implementation on the PS4, which is why it performs relatively well on weak hardware. Single pass has issues with some post processing effects, though, which is why I think we don't see it utilized that often. I know it is built into the Unity engine as an option.

iRacing uses simultaneous multi-projection. It offers up to a 50% boost in FPS in multi-monitor setups. Sadly, it is still the only game I know of that uses it. I don't think we see it being used since the segment where it provides benefit is extremely niche, and I think ultra widescreen is superseding the need for multiple monitors.

The "adaptive MSAA" is called MFAA, and it works in nearly every game. In my experience with it it really does deliver 4x quality at 2x performance. Unlike SMP, MFAA is a driver-level setting that doesn't need explicit support from game developers to work.
 