DooKey, [H]F Junkie (joined Apr 25, 2001; 12,707 messages)
Value is more than $ cost of a product.
"value is a king"... that's a pretty hard sentiment to take to heart for today's pricing. Especially if you're advocating for nVidia.
... nVidia is currently offering one of the worst price/performance ratios in decades and at the highest prices we have ever had. So that's a bitter pill for what amounts to the only new feature being Frame Generation.
Value is more than $ cost of a product.
As long as you're willing to admit that NVIDIA is only losing to themselves (of the past).
... nVidia is currently offering one of the worst price/performance ratios in decades and at the highest prices we have ever had. So that's a bitter pill for what amounts to the only new feature being Frame Generation.
Sorry, but "Value is more than $ cost of a product."... sounds like an ad for diamonds. Get real.
Value IS more than the $ cost of a product. The RTX 4070 Ti and 4080 are not great values. I have no idea why Nvidia chose to price them the way they did. If it's between an RTX 4080 and a 7900 XTX, the XTX wins for sure.
... nVidia is currently offering one of the worst price/performance ratios in decades and at the highest prices we have ever had. So that's a bitter pill for what amounts to the only new feature being Frame Generation.
Sorry, but "Value is more than $ cost of a product."... sounds like an ad for diamonds. Get real.
and a GPU that is 1.5-2x faster in Rasterization and Ray Tracing <---- You forgot that part.
I was talking about 3090 -> 4090.
But even the 3050 sells more than the 6650xt. Surely that can't be due to RT/DLSS as the 6650 would smoke the 3050!?
https://www.notebookcheck.net/AMD-i...spend-savings-on-other-PC-parts.700365.0.html
"AMD could have made an RTX 4090 competitor but chose not to in favor of a trade-off between price and competitive performance. AMD's rationale is that a 600 W card with a US$1,600 price tag is not a mainstream option for PC gamers and that the company always strived to price its flagship GPUs in the US$999 bracket. The company execs also explained why an MCM approach to the GPU die may not be feasible yet unlike what we've seen with CPUs."
They could have - but they didn't wanna.
But even the 3050 sells more than the 6650xt. Surely that can't be due to RT/DLSS as the 6650 would smoke the 3050!?
At some point it just comes down to consumers deserving the market that they made for themselves.
At no point was the 1050/ti EVER as fast as products costing the same or even less, but because of the Ferrari effect, it sold far far more.
All those “glory days” video cards are still available at even lower prices in the secondary market!
At some point it just comes down to consumers deserving the market that they made for themselves.
Can't wait for the next segment of "Why do GPUs cost so much???"
People unironically buy less for more because of name.
If it's between the RTX 4080 and the RX 7900 XTX, the more valuable card isn't that clear-cut, speaking as someone who owned both and traded the latter for the former.
Value IS more than the $ cost of a product. The RTX 4070 Ti and 4080 are not great values. I have no idea why Nvidia chose to price them the way they did. If it's between an RTX 4080 and a 7900 XTX, the XTX wins for sure.
BUT, as I explained earlier in this thread, the RTX 4090 is the king of this segment at the moment, and as such, it allows Nvidia to get away with pricing their products not as value options but as premium ones. That's how it works. AMD is forced to price their products as the value alternative to Nvidia (not sure what they were thinking on the 7900 XT) because AMD does not claim the halo product and does not own the most mindshare. Value, in this case, is a direct result of mindshare among consumers. Obviously you do not believe the RTX 4000 series offers value because you've done your research. Good. Unfortunately, many consumers see "Nvidia" and automatically think "best GPU maker", so the value is in the mindshare that Nvidia currently holds.
Also, "only new feature being Frame Generation"... and a GPU that is 1.5-2x faster in Rasterization and Ray Tracing <---- You forgot that part.
The sub X space is why pricing is a mess.
If it's between the RTX 4080 and the RX 7900 XTX, the more valuable card isn't that clear-cut, speaking as someone who owned both and traded the latter for the former.
Why? AMD's terrible VR performance for a flagship, four-figure GPU. I don't understand how they can have a GPU that strong for pancake gaming with pure rasterization and just drop the ball so hard on VR frametimes that some people find it performing worse than the 3080 in certain titles, never mind the 4080. Trading was definitely the right move after I saw how much my DCS performance went up - reprojection was almost gone, whereas the motion smoothing smearing all over the cockpit was constant with the 7900 XTX no matter how much I adjusted the graphics settings. (Any throttling from the defective vapor chamber certainly wasn't helping!)
This isn't even getting into professional GPGPU uses and CUDA, though people shopping for that are only going for the 4090 because it's overall much better value than the RTX A6000 Ada if the dataset fits into 24 GB (and anything less can't handle their workloads and thus has no value). The pro card has twice the VRAM, but it's also a bit slower than the "gaming" card on top of being about four times the price - small wonder why they're buying up 4090s instead!
For what it's worth, I agree with everyone that the RTX 4080 is horrendously overpriced, and generally more GPU than anyone needs at sub-4K resolutions. Yet it's also the minimum performance tier that cuts it for VR in the more unoptimized titles, so anything less simply isn't valuable to VR gamers - not anything AMD offers (until they fix their drivers, because I'd love to see the 7900 XTX whoop the 4080 like it should have), and not NVIDIA's own last-gen Ampere stuff.
It's a frustrating spot to be in, but at least that level of performance exists now. It just needs to drop to more affordable price levels, while developers ought to consider optimizing for GPUs that aren't $1,000+ in the meantime. (Not just simulators with VR slapped on after the fact - what's with all these AAA releases the past few months with ludicrous system requirements?)
People always say this, but I want to know exactly what good they have done for the community. All of their open source technology runs most efficiently on AMD hardware and always has. AMD's architecture has unique challenges associated with it when it comes to putting all its general purpose compute cores to work, which is what their tech works around. The fact that they let it run on other hardware is just to sell the illusion of goodwill to the community, but it's just that: an illusion.
Yes, AMD has done a lot of good for the gaming community, but they can't just rest on their past accomplishments--they have nothing outside of them. RSR, AMD's built-into-drivers upscaling tech, is great tech, but it's a one-trick pony and it's inferior to FSR 2, which is usable by Nvidia and Intel, which means you don't need an AMD card to capitalize on it. DLSS on the other hand is driven by AI, which requires Nvidia's Tensor cores, produces better image quality at almost the same performance levels, and comes with additional features such as DLAA, DLDSR, Reflex and Frame Generation (DLSS 3.0/40-Series), RTX VSR, etc.
Lol highest prices…. Hahahahahahaha…..
... nVidia is currently offering one of the worst price/performance ratios in decades and at the highest prices we have ever had. So that's a bitter pill for what amounts to the only new feature being Frame Generation.
Sorry, but "Value is more than $ cost of a product."... sounds like an ad for diamonds. Get real.
Well, I think the “goodwill” comes from giving end-users and developers options that aren’t hardware-locked.
People always say this, but I want to know exactly what good they have done for the community. All of their open source technology runs most efficiently on AMD hardware and always has. AMD's architecture has unique challenges associated with it when it comes to putting all its general purpose compute cores to work, which is what their tech works around. The fact that they let it run on other hardware is just to sell the illusion of goodwill to the community, but it's just that: an illusion.
"But NVIDIA is worse."
When it comes to their business dealings, sure, but when it comes to graphics tech they're only looking out for #1, just like AMD. They're both in the business of selling graphics cards.
The reason AMD came up with Mantle and donated it to the Khronos Group in the first place was because their generic compute cores were sitting idle a lot of the time. There was already movement in the rendering space to develop a bare-metal API, but AMD was just the first to demonstrate it. Coincidentally, AMD cards benefitted most from this approach at the time, especially with asynchronous compute. NVIDIA's architecture never had this issue, and so saw minimal performance gains from this approach. Conversely, NVIDIA was the first to enable multithreading in their DirectX drivers and saw a huge uplift in performance, since their cores would otherwise chew through their work and end up waiting on the CPU to catch up.
Well, I think the “goodwill” comes from giving end-users and developers options that aren’t hardware-locked.
As for these things running better on their hardware, that isn’t necessarily true. Vulkan runs well on all hardware; even though it may have been designed around GCN hardware, it remains an open standard, so it can be changed and modified to work with all manner of hardware.
FSR is just as good on Nvidia as it is on AMD or Intel, although why you’d use it on Nvidia if DLSS is an option is beyond me.
Freesync can work with Nvidia GPU’s; it’s just that if the monitor isn’t certified as G-Sync Compatible then there’s no guarantee, and that certification is on Nvidia to do, not AMD.
Nvidia isn’t worse, they’re just not wanting to give their stuff away for free. They’re a business, as you said, they’re in the business of selling graphics cards.
The short version is that AMD is the company that is producing API's or modifying API's and releasing them as open source. nVidia for the most part only uses and creates API's that are proprietary.
People always say this, but I want to know exactly what good they have done for the community. All of their open source technology runs most efficiently on AMD hardware and always has. AMD's architecture has unique challenges associated with it when it comes to putting all its general purpose compute cores to work, which is what their tech works around. The fact that they let it run on other hardware is just to sell the illusion of goodwill to the community, but it's just that: an illusion.
"But NVIDIA is worse."
When it comes to their business dealings, sure, but when it comes to graphics tech they're only looking out for #1, just like AMD. They're both in the business of selling graphics cards.
They've been passing off xx60 as xx80, with pricing to match, since Kepler and the 600-series. Once upon a time what's now a "Titan", a 4090, or a 2080 Ti was just called a GTX 580: the top die, fully unlocked, none of this cut-down die business (well, ok, GTX 480, but you get my point), and it was $500.
In a world where Nvidia tried to pass off a xx70 as a xx80, but regardless I don't consider a fake RRP of $799 to be a mainstream option either.
The pricing sux.
Lol highest prices…. Hahahahahahaha…..
The $500 GTX 680 was faster, more power efficient, cooler, had better frame times by far in SLI vs CrossFire, and had more features than the $550 Radeon HD 7970. Are you saying the Radeon wasn't high end silicon either? That's not a good look if AMD couldn't beat Nvidia's midrange with their high end, eh?
Oh how much the market got duped on Kepler and every gen since. People still thought 80-class cards were high-end silicon and paid for it with the GTX 980, the GTX 1080, the RTX 2080, and the RTX 4080.
Not compared to SLI setups of yesteryear...
The pricing sux.
Yes it does, but the year isn’t over yet.
The pricing sux.
I don't think the prices can hold either. There just aren't enough people that want to spend that level of money on these cards. The value isn't there.
Yes it does, but the year isn’t over yet.
But both manufacturers have released $1K+ cards in the past. While I generally buy on the high end every few years, it sucks for those who don’t want to go down that route. And not everyone is lucky enough to come across good deals on the previous generation. Hell, my local area has two computer shops, one in Catania with a 600 euro 1800X, and one in the mall with a 700 euro 3060 Ti. Anything else is pushing 1K+.
It isn't unusual for stock to be readily available six months post launch, in non crypto mining times.
It's only been a short while after launch. There is clearly not a stock problem, it's only a "demand problem".
Sales of cards are also generally higher in "non crypto mining times".
It isn't unusual for stock to be readily available six months post launch, in non crypto mining times.
Lowest Unit Sales of Desktop Graphics Cards in History
For the whole year 2022, AMD, Intel, and Nvidia sold around 37.86 million graphics processors for desktop AIBs, down sharply from approximately 49.15 million units in 2021, according to Jon Peddie Research. In fact, 37.86 million units is an all-time low for discrete desktop graphics boards. To add some context, sales of standalone graphics cards for desktop PCs peaked at 116 million units in 1998, based on JPR data.
Sharp Decline of Revenues
Unit sales of standalone graphics boards for desktop PCs clearly nosedived in 2022, so the whole market dropped to $24.14 billion in the last four quarters, according to JPR. The number suggests that the average selling price (ASP) of a graphics card was $637. By contrast, the desktop AIB market was worth $51.8 billion in 2021 and the graphics board ASP was at $1,056. Still, JPR expects the AIB market to grow by 7% over the next three years.
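(For context on where those ASPs come from, assuming they're just revenue divided by units shipped: $24.14 billion / 37.86 million boards works out to roughly $637 per board for 2022, and $51.8 billion / 49.15 million boards to roughly $1,054 for 2021, in line with the figures quoted above.)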
Sure it’s pricing and sure it’s the economy. People aren’t spending as much on hobbies and non-essentials.
Sales of cards are also generally higher in "non crypto mining times".
Though the 4090 initially sold well, the entire market of video card sales is down.
https://www.tomshardware.com/news/n...f-graphics-cards-hit-all-time-low-in-2022-jpr
and
And you now agree that this is definitely not an issue with availability. Coming off of "crypto mining times", when there was poor card availability, one would also assume there would be pent-up demand. Clearly what pent-up demand there was, was very small.
To restate: I would say this is a direct result of absurdly poor pricing that a majority of the market is not willing to accept. Would you care to add other commentary?
This.
Sure it’s pricing and sure it’s the economy. People aren’t spending as much on hobbies and non-essentials.
https://finance.yahoo.com/news/consumer-spending-challenged-high-inflation-131524035.html
It’s affecting everything. Not just GPUs. So it’s barely a decent argument.
Sure it’s pricing and sure it’s the economy. People aren’t spending as much on hobbies and non-essentials.
https://finance.yahoo.com/news/consumer-spending-challenged-high-inflation-131524035.html
It’s affecting everything. Not just GPUs. So it’s barely a decent argument.
The slowdown is less than 2% across the board according to that article. This is the lowest sales of GPU's ever.
This.
Well met. I won’t argue any of those points - but you’re forgetting one thing - the full product stack isn’t released yet. Only the super high end is out. A lot of people are on the sidelines - waiting.
The slowdown is less than 2% across the board according to that article. This is the lowest sales of GPU's ever.
For context, the lowest quarter of GPU sales in 2008 during the economic crash still sold 2x as many cards as today. The lowest quarters in 2009 and 2010, the two recovery years, were about the same (actually higher).
To put that into context, in 2008-2009 people lost their jobs, houses, and savings en masse. Right now it's nothing like that, and they still managed to sell 2x as many cards.
Additionally, that article basically has all the industry leaders expressing optimism. Could be all bluster, but the quotes came from CEOs on their earnings calls.
Demonstrably false.
The short version is that AMD is the company that is producing API's or modifying API's and releasing them as open source. nVidia for the most part only uses and creates API's that are proprietary.
It ran slower on AMD because those GPU's had much less shader power than the nVidia GPU of the same era. Cue AMD butthurt fanboys crying in their cereal yelling 'He cheated!' in between sobs. The very next AMD GPU corrected the shader performance imbalance, and Gameworks games played fine. Gameworks games meaning games programmed to use nVidia-supplied dll's to shorten dev time to implement those graphics features. It was good for the gaming industry and AMD was only hurting for 1 generation. It's not nVidia's job to tell their competitors "beef up your shader performance or your shit will suck...".
Just as one off the cuff example, nVidia created "Gameworks". There was a scandal at the time that nVidia intentionally added increased tessellation to the rendering pipe that would be culled on nVidia hardware (and otherwise run faster with a better tessellation engine) that obviously wouldn't be on AMD hardware. They also made it impossible for any other vendor to examine the code in the libraries, making it impossible for ATi/AMD to optimize their hardware/drivers to run Gameworks properly. In short, Gameworks was intentionally designed to run slower on competitor hardware by obfuscation and it being closed source. It was/is seen as being malicious. Then AMD essentially had to create an open source version of Gameworks, GPUOpen with technologies like TressFX essentially to prevent nVidia's muddying of the playing field.
Sounds like a good business strategy, but it's unproven, and always has been. What HAS been proven is both nVidia and AMD cheating on benchmarks of old, more than once. You conveniently left that out in your diatribe.
Repeatedly it has been shown that nVidia has been more than willing to use underhanded methods with their proprietary technology to stifle innovation in other companies and prevent other companies from being equally performative due to optimizations. Whereas AMD has repeatedly then had to make open source variations of the same things to not only even the playing field, but to give a transparent and open standard across the board.
That's misrepresenting it. nVidia has for many years assisted game dev's with implementing graphical features, and new features that they haven't yet had any experience with. This is good for the game dev, and it's good for gamers too. AMD has their own version of that program, but nVidia does a far better job at it.
This is generally why AMD is seen as the company that is "good" for the industry, whereas nVidia is looked at as "bad". And there are countless examples of this: "The Way it's meant to be played" (basically paying game devs to optimize for nVidia hardware and ignore competition),
Which was blown all out of proportion. All nVidia wanted with that was to separate the branding that their GPU's were sold under, so it didn't also contain competitors' products (for some reason this flipped some people's shit to the extreme). The ROG brand for example. If people have a good experience with a ROG-branded nVidia card, then they are likely to tell their friends, and buy one again the next time. If the box says ROG but the chip is AMD's, then nVidia's past products are helping sell their competitors' products. nVidia didn't want that, it's a pretty obvious business decision.
the GPP program that Kyle revealed,
How? By all accounts that was profitable for everyone. I'll admit nVidia wants a larger piece of the pie... they also spend $5 billion a year on R&D. Board partners spend a 10th of that, if not even less. Who was the legitimately 'greedy' partner in that business proposition?? They all want profit, and as far as anyone can tell, 3xxx GPU's made everyone a lot of money.
and how they've treated their board partners (EVGA).
Being the best in the business for a solid 15 years running, leading innovation over that same period, and supporting game devs with good tools isn't really anti-consumer. You will have to provide some examples. If the examples involve mentions of AMD's not-as-good performance, that's a bs argument.
It basically boils down to anti-competitive and anti-consumer behavior
Predatory?? lol what. Explain it to us.
and being dicks to everyone that isn't them (including board partners and consumers with their terrible and predatory 4000 series pricing and prioritizing miners).
What no...
I think most have the expectation that nVidia should win simply by making a better product in a clean competition. RTX is probably one of the few times in which nVidia hasn't acted in this way as at least everything they were doing in terms of adding RT was adopted into the DX12 spec and it's open on Vulkan.
Ah.
Most of the time that isn't the case though.
Plenty of analysis in many ways such as $ per frame, which AMD usually wins but not always and not always by enough to matter.
Does nVidia make better hardware? For the most part, most of the time: yes. But by how much is actually much less clear due to the software stack that isn't "told" by a simple benchmark.
For at least the last 15 years all graphics innovations of any note were developed by nVidia, then AMD copied.
Now the other half of this is how much of a dick AMD would/could be if they're on top. I have no illusions that corporations won't act like corporations. Up to this point though AMD has definitely relied on less/fewer underhanded tactics in both the processor and GPU space.
EDIT: spelling/grammar
List the nVidia API's that are open source, royalty free for any game dev and opposing graphics manufacturers to use. I'll wait.
Demonstrably false.
Guessing you didn't read the article and weren't there.
It ran slower on AMD because those GPU's had much less shader power than the nVidia GPU of the same era. Cue AMD butthurt fanboys crying in their cereal yelling 'He cheated!' in between sobs. The very next AMD GPU corrected the shader performance imbalance, and Gameworks games played fine. Gameworks games meaning games programmed to use nVidia-supplied dll's to shorten dev time to implement those graphics features. It was good for the gaming industry and AMD was only hurting for 1 generation. It's not nVidia's job to tell their competitors "beef up your shader performance or your shit will suck...".
You act like this is something that needs contesting. It doesn't. But also nVidia could make all their API's open source, and they don't. I can tell you which is helping more people.
AMD's 'open source' technologies are made that way because if they were not, they would die on the vine.
You're more than welcome to talk about whatever you want to talk about. It's not my job to post about everything all companies have done. You call my statement a diatribe, then say it's not long enough because certain things aren't included? Try at least being consistent.
Sounds like a good business strategy, but it's unproven, and always has been. What HAS been proven is both nVidia and AMD cheating on benchmarks of old, more than once. You conveniently left that out in your diatribe.
Which they could release as tools for free, and don't. Meaning other people (AMD) have to.
That's misrepresenting it. nVidia has for many years assisted game dev's with implementing graphical features, and new features that they haven't yet had any experience with. This is good for the game dev, and it's good for gamers too. AMD has their own version of that program, but nVidia does a far better job at it.
They wanted to take ROG branding and make it their own. Which it wasn't. Did nVidia develop ROG? Do they have copyright over ROG? Do they have ownership over any ASUS property? Last time I checked the answer is no. So why should nVidia get any right to dictate what ASUS does with their intellectual and/or branding property?
Which was blown all out of proportion. All nVidia wanted with that was to separate the branding that their GPU's were sold under, to not also contain competitors products. (for some reason this flipped some peoples' shit to the extreme). The ROG brand for example. If people have a good experience with a ROG branded nVidia card, then they are likely to tell their friends, and buy one again the next time. If the box says ROG but the chip is AMD's, then nVidia's past products are helping sell their competitors products. nVidia didn't want that, it's a pretty obvious business decision.
Wanting to separate themselves is good business, and ultimately would have been better for consumers too.
Indeed. Glad we agree.
but OMG nvidia evil!
If you want to have a conversation about what is "demonstrably false" then that is one of them. Especially considering that EVGA released all of their margin information, and a bunch of additional information as to why they left the market, as well as Igor's piece. But I guess you'd rather believe your own opinions about this terrible company than board partners who worked with them for years behind the scenes?
How? By all accounts that was profitable for everyone. I'll admit nVidia wants a larger piece of the pie... they also spend $5 billion a year on R&D. Board partners spend a 10th of that if not even less. Who was the legitimately 'greedy' partner in that business proposition?? They all want profit, and as far as anyone can tell, 3xxx GPU's made everyone a lot of money.
I already have. Creating closed APIs is bad for consumers and literally bad for the rest of the market. Bad enough that your examples below like GSync forced VESA to make open standards. It's those open standards that even allow adaptive sync on TVs. GSync is all but dead and adaptive sync won out.
Being the best in the business for a solid 15 years running, leading innovation over that same period, supporting game devs with good tools isn't really anti-consumer. You will have to provide some examples. If the examples involve mentions of AMD's less-than as good performance, that's a bs argument.
There isn't anything that needs to be explained. The vast majority of users that aren't buying any of these cards already understand. Only staunch nVidia defenders like yourself don't.
Predatory?? lol what. Explain it to us.
We get it, you prefer nVidia being closed source and jerks. As well as using underhanded tactics, abusing their AiB's, and threatening reviewers over samples when they say anything they don't like.
What no...
Yeap.
Perspective.
Plenty of analysis in many ways such as $ per frame, which AMD usually wins but not always and not always by enough to matter.
Again, all of which could've been open source and helped out the entire marketplace.
For at least the last 15 years all graphics innovations of any note were developed by nVidia, then AMD copied.
Adaptive sync
CUDA
Raytracing
DLSS
Frame Generation
The one item where you might go "but what about!" would be Mantle, which was developed by AMD/DICE, but that only worked on AMD GPU's and didn't gain much traction in the PC market. It was quickly superseded by Vulkan, developed by the Khronos Group, which has many contributing members including both AMD and nVidia.
Some of these AMD has yet to copy, but rest assured, they are working on it.
The $500 GTX 680 was also a lousy uplift over the GTX 580 if you consider what were normal gen-on-gen uplifts back then. The real successor to the GTX 580 would be the GTX 780 Ti, as those are big Fermi and big Kepler. The GTX 680 was literally nothing more than mid-range silicon branded and priced high end.
The $500 GTX 680 was faster, more power efficient, cooler, had better frame times by far in SLI vs CrossFire, and had more features than the $550 Radeon HD 7970. Are you saying the Radeon wasn't high end silicon either? That's not a good look if AMD couldn't beat Nvidia's midrange with their high end, eh?
Who cares about codenames this much, seriously...?
Classic revisionist history.
The $500 GTX 680 was also a lousy uplift over the GTX 580 if you consider what were normal gen-on-gen uplifts back then. The real successor to the GTX 580 would be the GTX 780 Ti. The GTX 680 was literally nothing more than mid-range silicon branded and priced high end.
Radeon 7970 was high end silicon that was beaten by mid-range Nvidia silicon.
So what's your point?
Gameworks is not an API. It is middleware that uses standard DirectX calls to generate graphical effects that take advantage of NVIDIA hardware. If you have a citation that NVIDIA deliberately made Gameworks effects perform worse on competitor hardware rather than just be optimized to work on their own hardware, I'd love to see it.
The short version is that AMD is the company that is producing API's or modifying API's and releasing them as open source. nVidia for the most part only uses and creates API's that are proprietary.
Just as one off the cuff example, nVidia created "Gameworks". There was a scandal at the time that nVidia intentionally added increased tessellation to the rendering pipe that would be culled on nVidia hardware (and otherwise run faster with a better tessellation engine) that obviously wouldn't be on AMD hardware. They also made it impossible for any other vendor to examine the code in the libraries, making it impossible for ATi/AMD to optimize their hardware/drivers to run Gameworks properly. In short, Gameworks was intentionally designed to run slower on competitor hardware by obfuscation and it being closed source. It was/is seen as being malicious. Then AMD essentially had to create an open source version of Gameworks, GPUOpen with technologies like TressFX essentially to prevent nVidia's muddying of the playing field.
Repeatedly it has been shown that nVidia has been more than willing to use underhanded methods with their proprietary technology to stifle innovation in other companies and prevent other companies from being equally performative due to optimizations. Whereas AMD has repeatedly then had to make open source variations of the same things to not only even the playing field, but to give a transparent and open standard across the board.
This is generally why AMD is seen as the company that is "good" for the industry, whereas nVidia is looked at as "bad". And there are countless examples of this: "The Way it's meant to be played" (basically paying game devs to optimize for nVidia hardware and ignore competition), the GPP program that Kyle revealed, and how they've treated their board partners (EVGA). It basically boils down to anti-competitive and anti-consumer behavior and being dicks to everyone that isn't them (including board partners and consumers with their terrible and predatory 4000 series pricing and prioritizing miners).
I think most have the expectation that nVidia should win simply by making a better product in a clean competition. RTX is probably one of the few times in which nVidia hasn't acted in this way as at least everything they were doing in terms of adding RT was adopted into the DX12 spec and it's open on Vulkan. Most of the time that isn't the case though.
Does nVidia make better hardware? For the most part, most of the time: yes. But by how much is actually much less clear due to the software stack that isn't "told" by a simple benchmark.
Now the other half of this is how much of a dick AMD would/could be if they're on top. I have no illusions that corporations won't act like corporations. Up to this point though AMD has definitely relied on less/fewer underhanded tactics in both the processor and GPU space.
EDIT: spelling/grammar
I'd say that another factor is that most people aren't really hurting with their current GPUs at the moment. Most of the market is still stuck on 1080p where you can use just about any old ass card. My son's PC is currently pushing 1440p with a Radeon RX 470 4GB (that I bought for like $160 7 years ago) and he says everything he's played is absolutely fine (even RDR2). I'm on a 2080 Ti at 3440x1440 and while some games push it a bit it's still perfectly serviceable and overkill for the vast majority of games. I don't think we have the amount of software that's needed to push most of these new cards unless you're an enthusiast that demands 4K all max settings in every game.
Well met. I won’t argue any of those points - but you’re forgetting one thing - the full product stack isn’t released yet. Only the super high end is out. A lot of people are on the sidelines - waiting.
How is that "revisionist"? Of course reviewers praised it back then coming off of "Thermi", but you do realize that a 30% uplift over last gen with a new arch was terrible back then, right? Yet no one called that out. AMD is not immune to that criticism either. The original 7970 is not what should have launched.
Classic revisionist history.
Sums it up nicely:
https://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/20
“In any case, this has ended up being a launch not quite like any other. With GTX 280, GTX 480, and GTX 580 we discussed how thanks to NVIDIA’s big die strategy they had superior performance, but also higher power consumption and a higher cost. To that extent this is a very different launch – the GTX 680 is faster, less power hungry, and quieter than the Radeon HD 7970. NVIDIA has landed the technical trifecta, and to top it off they’ve priced it comfortably below the competition.”
The GTX 680 did mark the first time people started crying about an x80 that was really a xxx.
So I thank it for that quality drama.