AMD Intentionally Held Back from Competing with RTX 4090

Value is more than $ cost of a product.
... nVidia is currently offering one of the worst price/performance ratios in decades and at the highest prices we have ever had. So that's a bitter pill for what amounts to the only new feature being Frame Generation.

Sorry, but "Value is more than $ cost of a product."... sounds like an ad for diamonds. Get real.
 
... nVidia is currently offering one of the worst price/performance ratios in decades and at the highest prices we have ever had. So that's a bitter pill for what amounts to the only new feature being Frame Generation.

Sorry, but "Value is more than $ cost of a product."... sounds like an ad for diamonds. Get real.
As long as you're willing to admit that NVIDIA is only losing to themselves (of the past).

AMD and Intel are not offering value either. Maybe Intel - but only in the mid-to-lower segment.
 
... nVidia is currently offering one of the worst price/performance ratios in decades and at the highest prices we have ever had. So that's a bitter pill for what amounts to the only new feature being Frame Generation.

Sorry, but "Value is more than $ cost of a product."... sounds like an ad for diamonds. Get real.
Value IS more than the $ cost of a product. The RTX 4070 Ti and 4080 are not great values. I have no idea why Nvidia chose to price them the way they did. If it's between an RTX 4080 and a 7900 XTX, the XTX wins for sure.

BUT, as I explained earlier in this thread, the RTX 4090 is the king of this segment at the moment, and as such, it allows Nvidia to get away with not pricing their products as value, but instead, premium options. That's how it works. AMD is forced to price their products as the value alternative to Nvidia (not sure what they were thinking on the 7900 XT) because AMD does not claim the halo card product and does not own the most mindshare. Value, in this case, is a direct result of mindshare among consumers. Obviously you do not believe the RTX 4000 offers value because you've done your research. Good. Unfortunately, many consumers see "Nvidia" and automatically think "best GPU maker", so the value is in the mindshare that Nvidia currently holds.

Also, "only new feature being Frame Generation"... and a GPU that is 1.5-2x faster in Rasterization and Ray Tracing <---- You forgot that part.
 
and a GPU that is 1.5-2x faster in Rasterization and Ray Tracing <---- You forgot that part.

But even the 3050 sells more than the 6650 XT. Surely that can't be due to RT/DLSS, as the 6650 XT would smoke the 3050!?
 
https://www.notebookcheck.net/AMD-i...spend-savings-on-other-PC-parts.700365.0.html

"AMD could have made an RTX 4090 competitor but chose not to in favor of a trade-off between price and competitive performance. AMD's rationale is that a 600 W card with a US$1,600 price tag is not a mainstream option for PC gamers and that the company always strived to price its flagship GPUs in the US$999 bracket. The company execs also explained why an MCM approach to the GPU die may not be feasible yet unlike what we've seen with CPUs."

They could have - but they didn't wanna.

Good.
 
But even the 3050 sells more than the 6650 XT. Surely that can't be due to RT/DLSS, as the 6650 XT would smoke the 3050!?

That's the Ferrari effect. People associate Ferrari with fast. So Ferrari could make a vastly inferior model and still sell it for a mint because of the name.

3050 is like that. Same with the 1050/1650

At no point was the 1050/Ti EVER as fast as products costing the same or even less, but because of the Ferrari effect, it sold far, far more.

AMD Radeon is associated with budget GPUs where you sacrifice stability, features, and quality in exchange for the same performance.

Nvidia Geforce is associated with GPUs.

Spot the difference.
 
At no point was the 1050/Ti EVER as fast as products costing the same or even less, but because of the Ferrari effect, it sold far, far more.
At some point it just comes down to consumers deserving the market that they made for themselves.

Can't wait for the next segment of "Why do GPUs cost so much???"

People unironically buy less for more because of name.
 
At some point it just comes down to consumers deserving the market that they made for themselves.

Can't wait for the next segment of "Why do GPUs cost so much???"

People unironically buy less for more because of name.
All those “glory days” video cards are still available at even lower prices in the secondary market!

No need to cry about current pricing and how we are owed a 4080 at $699, but the 4080 is a 4070, and the 4090 uses too much power…blah blah.
 
Value IS more than the $ cost of a product. The RTX 4070 Ti and 4080 are not great values. I have no idea why Nvidia chose to price them the way they did. If it's between an RTX 4080 and a 7900 XTX, the XTX wins for sure.

BUT, as I explained earlier in this thread, the RTX 4090 is the king of this segment at the moment, and as such, it allows Nvidia to get away with not pricing their products as value, but instead, premium options. That's how it works. AMD is forced to price their products as the value alternative to Nvidia (not sure what they were thinking on the 7900 XT) because AMD does not claim the halo card product and does not own the most mindshare. Value, in this case, is a direct result of mindshare among consumers. Obviously you do not believe the RTX 4000 offers value because you've done your research. Good. Unfortunately, many consumers see "Nvidia" and automatically think "best GPU maker", so the value is in the mindshare that Nvidia currently holds.

Also, "only new feature being Frame Generation"... and a GPU that is 1.5-2x faster in Rasterization and Ray Tracing <---- You forgot that part.
If it's between the RTX 4080 and the RX 7900 XTX, the more valuable card isn't that clear-cut, speaking as someone who owned both and traded the latter for the former.

Why? AMD's terrible VR performance for a flagship, four-figure GPU. I don't understand how they can have a GPU that strong for pancake gaming with pure rasterization and just drop the ball so hard on VR frametimes that some people find it performing worse than the 3080 in certain titles, never mind the 4080. Trading was definitely the right move after I saw how much my DCS performance went up - reprojection was almost gone, whereas the motion smoothing smearing all over the cockpit was constant with the 7900 XTX no matter how much I adjusted the graphics settings. (Any throttling from the defective vapor chamber certainly wasn't helping!)

This isn't even getting into professional GPGPU uses and CUDA, though people shopping for that are only going for the 4090 because it's overall much better value than the RTX A6000 Ada if the dataset fits into 24 GB (and anything less can't handle their workloads and thus has no value). The pro card has twice the VRAM, but it's also a bit slower than the "gaming" card on top of being about four times the price - small wonder why they're buying up 4090s instead!
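To put rough numbers on that trade-off, here's a minimal sketch in Python. It assumes the ~$1,600 4090 price mentioned earlier in the thread, and treats "about four times the price" and "a bit slower" as placeholder ratios rather than benchmark results:

```python
# Rough value comparison for a GPGPU buyer, using the ratios described above.
# Prices and relative throughput are illustrative placeholders, not benchmarks.
cards = {
    "RTX 4090":      {"vram_gb": 24, "price_usd": 1600, "rel_throughput": 1.00},
    "RTX A6000 Ada": {"vram_gb": 48, "price_usd": 6400, "rel_throughput": 0.90},  # ~4x price, "a bit slower"
}

def best_value(dataset_gb: float) -> str:
    """Pick the cheapest card per unit of throughput that can actually hold the dataset."""
    usable = {name: c for name, c in cards.items() if c["vram_gb"] >= dataset_gb}
    if not usable:
        return "nothing fits - a card that can't hold the dataset has no value for this workload"
    # Lower dollars per unit of throughput = better value.
    return min(usable, key=lambda n: usable[n]["price_usd"] / usable[n]["rel_throughput"])

print(best_value(20))   # fits in 24 GB -> the 4090 wins easily on $/throughput
print(best_value(40))   # only the 48 GB pro card fits -> RTX A6000 Ada
```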

For what it's worth, I agree with everyone that the RTX 4080 is horrendously overpriced, and generally more GPU than anyone needs at sub-4K resolutions. Yet it's also the minimum performance tier that cuts it for VR in the more unoptimized titles, so anything less simply isn't valuable to VR gamers - not anything AMD offers (until they fix their drivers, because I'd love to see the 7900 XTX whoop the 4080 like it should have), and not NVIDIA's own last-gen Ampere stuff.

It's a frustrating spot to be in, but at least that level of performance exists now. It just needs to drop to more affordable price levels, while developers ought to consider optimizing for GPUs that aren't $1,000+ in the meantime. (Not just simulators with VR slapped on after the fact - what's with all these AAA releases the past few months with ludicrous system requirements?)
 
If it's between the RTX 4080 and the RX 7900 XTX, the more valuable card isn't that clear-cut, speaking as someone who owned both and traded the latter for the former.

Why? AMD's terrible VR performance for a flagship, four-figure GPU. I don't understand how they can have a GPU that strong for pancake gaming with pure rasterization and just drop the ball so hard on VR frametimes that some people find it performing worse than the 3080 in certain titles, never mind the 4080. Trading was definitely the right move after I saw how much my DCS performance went up - reprojection was almost gone, whereas the motion smoothing smearing all over the cockpit was constant with the 7900 XTX no matter how much I adjusted the graphics settings. (Any throttling from the defective vapor chamber certainly wasn't helping!)

This isn't even getting into professional GPGPU uses and CUDA, though people shopping for that are only going for the 4090 because it's overall much better value than the RTX A6000 Ada if the dataset fits into 24 GB (and anything less can't handle their workloads and thus has no value). The pro card has twice the VRAM, but it's also a bit slower than the "gaming" card on top of being about four times the price - small wonder why they're buying up 4090s instead!

For what it's worth, I agree with everyone that the RTX 4080 is horrendously overpriced, and generally more GPU than anyone needs at sub-4K resolutions. Yet it's also the minimum performance tier that cuts it for VR in the more unoptimized titles, so anything less simply isn't valuable to VR gamers - not anything AMD offers (until they fix their drivers, because I'd love to see the 7900 XTX whoop the 4080 like it should have), and not NVIDIA's own last-gen Ampere stuff.

It's a frustrating spot to be in, but at least that level of performance exists now. It just needs to drop to more affordable price levels, while developers ought to consider optimizing for GPUs that aren't $1,000+ in the meantime. (Not just simulators with VR slapped on after the fact - what's with all these AAA releases the past few months with ludicrous system requirements?)
The sub X space is why pricing is a mess.
Everything is either too much power for a target resolution or too little. 1080p is easy; you can max that out for peanuts. Everything else, though, is either too little or too much until you get to the 4090, which is just… yeah.
 
Yes, AMD has done a lot of good for the gaming community, but they can't just rest on their past accomplishments--they have nothing outside of them. RSR, AMD's built-into-the-drivers upscaling tech, is great, but it's a one-trick pony and it's inferior to FSR 2, which is usable by Nvidia and Intel, which means you don't need an AMD card to capitalize on it. DLSS, on the other hand, is driven by AI, requires Nvidia's Tensor cores, produces better image quality at almost the same performance levels, and comes with additional features such as DLAA, DLDSR, Reflex, Frame Generation (DLSS 3.0/40-series), RTX VSR, etc.
People always say this, but I want to know exactly what good they have done for the community. All of their open source technology runs most efficiently on AMD hardware and always has. AMD's architecture has unique challenges associated with it when it comes to putting all its general purpose compute cores to work, which is what their tech works around. The fact that they let it run on other hardware is just to sell the illusion of goodwill to the community, but it's just that: an illusion.

"But NVIDIA is worse."
When it comes to their business dealings, sure, but when it comes to graphics tech they're only looking out for #1, just like AMD. They're both in the business of selling graphics cards.
 
... nVidia is currently offering one of the worst price/performance ratios in decades and at the highest prices we have ever had. So that's a bitter pill for what amounts to the only new feature being Frame Generation.

Sorry, but "Value is more than $ cost of a product."... sounds like an ad for diamonds. Get real.
Lol highest prices…. Hahahahahahaha…..
 
People always say this, but I want to know exactly what good they have done for the community. All of their open source technology runs most efficiently on AMD hardware and always has. AMD's architecture has unique challenges associated with it when it comes to putting all its general purpose compute cores to work, which is what their tech works around. The fact that they let it run on other hardware is just to sell the illusion of goodwill to the community, but it's just that: an illusion.

"But NVIDIA is worse."
When it comes to their business dealings, sure, but when it comes to graphics tech they're only looking out for #1, just like AMD. They're both in the business of selling graphics cards.
Well, I think the “goodwill” comes from giving end-users, and developers options that aren’t hardware locked.

As for these things running better on their hardware, that isn't necessarily true. Vulkan runs well on all hardware; even though it may have been designed around GCN hardware, it remains open source, so it can be changed and modified to work with all manner of hardware.

FSR is just as good on Nvidia as it is on AMD or Intel, although why you’d use it on Nvidia if DLSS is an option is beyond me.

Freesync can work with Nvidia GPUs; it's just that if the monitor isn't certified as G-Sync Compatible, there's no guarantee, and that certification is on Nvidia to do, not AMD.

Nvidia isn’t worse, they’re just not wanting to give their stuff away for free. They’re a business, as you said, they’re in the business of selling graphics cards.
 
Well, I think the “goodwill” comes from giving end-users, and developers options that aren’t hardware locked.

As for these things running better on their hardware, that isn't necessarily true. Vulkan runs well on all hardware; even though it may have been designed around GCN hardware, it remains open source, so it can be changed and modified to work with all manner of hardware.

FSR is just as good on Nvidia as it is on AMD or Intel, although why you’d use it on Nvidia if DLSS is an option is beyond me.

Freesync can work with Nvidia GPUs; it's just that if the monitor isn't certified as G-Sync Compatible, there's no guarantee, and that certification is on Nvidia to do, not AMD.

Nvidia isn’t worse, they’re just not wanting to give their stuff away for free. They’re a business, as you said, they’re in the business of selling graphics cards.
The reason AMD came up with Mantle and donated it to the Khronos Group in the first place was that their general-purpose compute cores sat idle a lot of the time. There was already movement in the rendering space to develop a bare-metal API, but AMD was just the first to demonstrate it. Coincidentally, AMD cards benefited most from this approach at the time, especially with asynchronous compute. NVIDIA's architecture never had this issue, and so saw minimal performance gains from this approach. Conversely, NVIDIA's cores were easily saturated with work, which meant they were often caught waiting on the CPU to catch up, so NVIDIA was the first to enable multithreading in their DirectX drivers and saw a huge uplift in performance from it.
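If it helps to see why idle units matter, here's a toy utilization model (purely illustrative Python; the unit counts and queue sizes are made up and not tied to any real GPU or API): a pool of shader units, a graphics queue that can't keep all of them busy, and an optional async compute queue that fills the idle ones.

```python
# Toy model: how an async compute queue can raise utilization of otherwise-idle shader units.
# All numbers are invented for illustration; this does not model any real GPU.

def utilization(total_units: int, graphics_busy: int, async_compute_busy: int = 0) -> float:
    """Fraction of shader units doing work in this simplified 'cycle'."""
    busy = min(total_units, graphics_busy + async_compute_busy)
    return busy / total_units

UNITS = 64  # pretend GPU with 64 shader units

# Graphics work alone only keeps 40 of 64 units busy (the "idle cores" problem).
print(f"graphics only:     {utilization(UNITS, 40):.0%}")

# Overlapping an async compute queue fills most of the idle units.
print(f"graphics + async:  {utilization(UNITS, 40, 20):.0%}")

# An architecture whose graphics work already saturates the units gains almost nothing.
print(f"already saturated: {utilization(UNITS, 62, 20):.0%}")
```

The same picture hints at the flip side: if the GPU is already saturated and the bottleneck is the CPU feeding it, extra queues don't help, but a multithreaded driver does.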
 
People always say this, but I want to know exactly what good they have done for the community. All of their open source technology runs most efficiently on AMD hardware and always has. AMD's architecture has unique challenges associated with it when it comes to putting all its general purpose compute cores to work, which is what their tech works around. The fact that they let it run on other hardware is just to sell the illusion of goodwill to the community, but it's just that: an illusion.

"But NVIDIA is worse."
When it comes to their business dealings, sure, but when it comes to graphics tech they're only looking out for #1, just like AMD. They're both in the business of selling graphics cards.
The short version is that AMD is the company that is producing API's or modifying API's and releasing them as open source. nVidia for the most part only uses and creates API's that are proprietary.

Just as one off the cuff example, nVidia created "Gameworks". There was a scandal at the time that nVidia intentionally added increased tessellation to the rendering pipe that would be culled on nVidia hardware (and otherwise run faster with a better tessellation engine) that obviously wouldn't be on AMD hardware. They also made it impossible for any other vendor to examine the code in the libraries, making it impossible for ATi/AMD to optimize their hardware/drivers to run Gameworks properly. In short, Gameworks was intentionally designed to run slower on competitor hardware by obfuscation and it being closed source. It was/is seen as being malicious. Then AMD essentially had to create an open source version of Gameworks, GPUOpen with technologies like TressFX essentially to prevent nVidia's muddying of the playing field.

Repeatedly it has been shown that nVidia has been more than willing to use underhanded methods with their proprietary technology to stifle innovation in other companies and prevent other companies from being equally performative due to optimizations. Whereas AMD has repeatedly then had to make open source variations of the same things to not only even the playing field, but to give a transparent and open standard across the board.

This is generally why AMD is seen as the company that is "good" for the industry, whereas nVidia is looked at as "bad". And there are countless examples of this: "The Way it's meant to be played" (basically paying game devs to optimize for nVidia hardware and ignore competition), the GPP program that Kyle revealed, and how they've treated their board partners (EVGA). It basically boils down to anti-competitive and anti-consumer behavior and being dicks to everyone that isn't them (including board partners and consumers with their terrible and predatory 4000 series pricing and prioritizing miners).

I think most have the expectation that nVidia should win simply by making a better product in a clean competition. RTX is probably one of the few times in which nVidia hasn't acted in this way as at least everything they were doing in terms of adding RT was adopted into the DX12 spec and it's open on Vulkan. Most of the time that isn't the case though.

Does nVidia make better hardware? For the most part, most of the time: yes. But by how much is actually much less clear due to the software stack that isn't "told" by a simple benchmark.

Now the other half of this is how much of a dick AMD would/could be if they're on top. I have no illusions that corporations won't act like corporations. Up to this point though AMD has definitely relied on less/fewer underhanded tactics in both the processor and GPU space.

EDIT: spelling/grammar
 
We're in a world where Nvidia tried to pass off an xx70 as an xx80, but regardless, I don't consider a fake RRP of $799 to be a mainstream option either.
They've been passing off xx60 silicon as xx80 cards, with pricing to match, since Kepler and the 600-series. Once upon a time, what's now a "Titan," a 4090, or a 2080 Ti was just called a GTX 580: a fully unlocked die, none of this cut-down die business (well, ok, GTX 480, but you get my point), the top die, and $500.

Oh how much the market got duped on Kepler and every gen since. People still thought 80-class cards were high-end silicon and paid for it with the GTX 980, the GTX 1080, the RTX 2080, and the RTX 4080.
 
Oh how much the market got duped on Kepler and every gen since. People still thought 80-class cards were high-end silicon and paid for it with the GTX 980, the GTX 1080, the RTX 2080, and the RTX 4080.
The $500 GTX 680 was faster, more power efficient, cooler, had far better frame times in SLI vs. CrossFire, and had more features than the $550 Radeon HD 7970. Are you saying the Radeon wasn't high-end silicon either? That's not a good look if AMD couldn't beat Nvidia's midrange with their high end, eh? :LOL: Who cares about codenames this much, seriously...?
 
The pricing sux.
Yes it does, but the year isn’t over yet.

But both manufacturers have released $1K+ cards in the past. While I generally buy on the high end every few years, it sucks for those who don't want to go down that route. And not everyone is lucky enough to come across good deals on the previous generation. Hell, my local area has two computer shops: one in Catania with a 600-euro 1800X, and one in the mall with a 700-euro 3060 Ti. Anything else is pushing 1K+.
 
Yes it does, but the year isn’t over yet.

But both manufacturers have released $1K+ cards in the past. While I generally buy on the high end every few years, it sucks for those who don't want to go down that route. And not everyone is lucky enough to come across good deals on the previous generation. Hell, my local area has two computer shops: one in Catania with a 600-euro 1800X, and one in the mall with a 700-euro 3060 Ti. Anything else is pushing 1K+.
I don't think the prices can hold either. There just aren't enough people that want to spend that level of money on these cards. The value isn't there.

A quick cursory glance at NewEgg, when selecting shipped/sold by NewEgg and "In Stock", shows basically every 4090 and 4080 flavor is available to purchase right now. It's only been a short while after launch. There is clearly not a stock problem, it's only a "demand problem".

In fairness to nVidia, the 7900XTX is also fully in stock. AMD did recently lower the 7900XT price to a slightly more sane $799 as it wasn't moving. Though I think a deeper discount on all of these cards will be necessary to get the market to actually want to buy these dumb priced cards.
 
It isn't unusual for stock to be readily available six months post-launch, in non-crypto-mining times.
Sales of cards are also generally higher in "non crypto mining times".

Though the 4090 initially sold well, the entire market of video card sales is down.

https://www.tomshardware.com/news/n...f-graphics-cards-hit-all-time-low-in-2022-jpr

Lowest Unit Sales of Desktop Graphics Cards in History

For the whole year 2022, AMD, Intel, and Nvidia sold around 37.86 million graphics processors for desktop AIBs, down sharply from approximately 49.15 million units in 2021, according to Jon Peddie Research. In fact, 37.86 million units is an all-time low for discrete desktop graphics boards. To add some context, sales of standalone graphics cards for desktop PCs peaked at 116 million units in 1998, based on JPR data.

and

Sharp Decline of Revenues

Unit sales of standalone graphics boards for desktop PCs clearly nosedived in 2022, so the whole market dropped to $24.14 billion in the last four quarters, according to JPR. The number suggests that the average selling price (ASP) of a graphics card was $637. By contrast, the desktop AIB market was worth $51.8 billion in 2021 and the graphics board ASP was at $1,056. Still, JPR expects the AIB market to grow by 7% over the next three years.
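For what it's worth, those ASP figures check out as simple division; here's the back-of-envelope version in Python, using only the unit and revenue numbers quoted above:

```python
# Back-of-envelope check of the JPR figures quoted above.
units_2022 = 37.86e6      # desktop AIB units sold, 2022
units_2021 = 49.15e6      # desktop AIB units sold, 2021
revenue_2022 = 24.14e9    # market value, last four quarters (USD)
revenue_2021 = 51.8e9     # market value, 2021 (USD)

print(f"2022 ASP:     ${revenue_2022 / units_2022:,.0f}")    # ~$638, in line with the ~$637 quoted
print(f"2021 ASP:     ${revenue_2021 / units_2021:,.0f}")    # ~$1,054, close to the $1,056 quoted
print(f"unit decline: {(units_2021 - units_2022) / units_2021:.0%}")  # ~23% fewer cards year over year
```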

And you now agree that this is definitely not an issue with availability. Coming off of "crypto mining times," when there was poor card availability, one would also assume there would be pent-up demand. Clearly, what pent-up demand there was, was very small.

To restate: I would say this is a direct result of absurdly poor pricing that a majority of the market is not willing to accept. Would you care to add other commentary?
 
Sales of cards are also generally higher in "non crypto mining times".

Though the 4090 initially sold well, the entire market of video card sales is down.

https://www.tomshardware.com/news/n...f-graphics-cards-hit-all-time-low-in-2022-jpr



and



And you now agree that this is definitely not an issue with availability. Coming off of "crypto mining times," when there was poor card availability, one would also assume there would be pent-up demand. Clearly, what pent-up demand there was, was very small.

To restate: I would say this is a direct result of absurdly poor pricing that a majority of the market is not willing to accept. Would you care to add other commentary?
Sure it’s pricing and sure it’s the economy. People aren’t spending as much on hobbies and non-essentials.

https://finance.yahoo.com/news/consumer-spending-challenged-high-inflation-131524035.html

It’s affecting everything. Not just GPUs. So it’s barely a decent argument.
 
Sure it’s pricing and sure it’s the economy. People aren’t spending as much on hobbies and non-essentials.

https://finance.yahoo.com/news/consumer-spending-challenged-high-inflation-131524035.html

It’s affecting everything. Not just GPUs. So it’s barely a decent argument.
The slowdown is less than 2% across the board according to that article. This is the lowest sales of GPUs ever.
For context, the lowest quarter of GPU sales in 2007-2008 during the economic crash still sold 2x as many cards as today. The lowest quarters in 2009 and 2010, call them the two recovery years, were about the same (actually higher).

To put that into context, in 2007-2009, people lost their jobs, houses, and their savings en masse. The actual drop in GDP was greater than 8%. Right now it's nothing close to that, and they still managed to sell 2x as many cards in 2008.

Additionally, that article basically has all the industry leaders expressing optimism. Could be all bluster, but those were quotes from CEOs on their earnings calls.
 
The slowdown is less than 2% across the board according to that article. This is the lowest sales of GPUs ever.
For context, the lowest quarter of GPU sales in 2008 during the economic crash still sold 2x as many cards as today. The lowest quarters in 2009 and 2010, call them the two recovery years, were about the same (actually higher).

To put that into context, in 2008-2009, people lost their jobs, houses, and their savings en masse. Right now it's nothing like that, and they still managed to sell 2x as many cards.

Additionally, that article basically has all the industry leaders expressing optimism. Could be all bluster, but those were quotes from CEOs on their earnings calls.
Well met. I won’t argue any of those points - but you’re forgetting one thing - the full product stack isn’t released, yet. Only super high end is out. A lot of people are on the sidelines - waiting.
 
The short version is that AMD is the company that is producing API's or modifying API's and releasing them as open source. nVidia for the most part only uses and creates API's that are proprietary.
Demonstrably false.
Just as one off the cuff example, nVidia created "Gameworks". There was a scandal at the time that nVidia intentionally added increased tessellation to the rendering pipe that would be culled on nVidia hardware (and otherwise run faster with a better tessellation engine) that obviously wouldn't be on AMD hardware. They also made it impossible for any other vendor to examine the code in the libraries, making it impossible for ATi/AMD to optimize their hardware/drivers to run Gameworks properly. In short, Gameworks was intentionally designed to run slower on competitor hardware by obfuscation and it being closed source. It was/is seen as being malicious. Then AMD essentially had to create an open source version of Gameworks, GPUOpen with technologies like TressFX essentially to prevent nVidia's muddying of the playing field.
It ran slower on AMD because those GPU's had much less shader power than the nVidia GPU of the same era. Cue AMD butthurt fanboys crying in their cereal, yelling 'He cheated!' in between sobs. The very next AMD GPU corrected the shader performance imbalance, and Gameworks games played fine. Gameworks games meaning games programmed to use nVidia-supplied DLLs to shorten dev time to implement those graphics features. It was good for the gaming industry and AMD was only hurting for 1 generation. It's not nVidia's job to tell their competitors "beef up your shader performance or your shit will suck...".

AMD's 'open source' technologies are made that way because if they were not, they would die on the vine.
Repeatedly it has been shown that nVidia has been more than willing to use underhanded methods with their proprietary technology to stifle innovation in other companies and prevent other companies from being equally performative due to optimizations. Whereas AMD has repeatedly then had to make open source variations of the same things to not only even the playing field, but to give a transparent and open standard across the board.
Sounds like a good business strategy, but it's unproven, and always has been. What HAS been proven is both nVidia and AMD cheating on benchmarks of old, more than once. You conveniently left that out in your diatribe.
This is generally why AMD is seen as the company that is "good" for the industry, whereas nVidia is looked at as "bad". And there are countless examples of this: "The Way it's meant to be played" (basically paying game devs to optimize for nVidia hardware and ignore competition),
That's misrepresenting it. nVidia has for many years assisted game dev's with implementing graphical features, and new features that they haven't yet had any experience with. This is good for the game dev, and it's good for gamers too. AMD has their own version of that program, but nVidia does a far better job at it.
the GPP program that Kyle revealed,
Which was blown all out of proportion. All nVidia wanted with that was to separate the branding that their GPU's were sold under, to not also contain competitors products. (for some reason this flipped some peoples' shit to the extreme). The ROG brand for example. If people have a good experience with a ROG branded nVidia card, then they are likely to tell their friends, and buy one again the next time. If the box says ROG but the chip is AMD's, then nVidia's past products are helping sell their competitors products. nVidia didn't want that, it's a pretty obvious business decision.
Wanting to separate themselves is good business, and ultimately would have been better for consumers too.

but OMG nvidia evil!
and how they've treated their board partners (EVGA).
How? By all accounts that was profitable for everyone. I'll admit nVidia wants a larger piece of the pie... they also spend $5 billion a year on R&D. Board partners spend a 10th of that, if not even less. Who was the legitimately 'greedy' partner in that business proposition?? They all want profit, and as far as anyone can tell, 3xxx GPU's made everyone a lot of money.
It basically boils down to anti-competitive and anti-consumer behavior
Being the best in the business for a solid 15 years running, leading innovation over that same period, supporting game devs with good tools isn't really anti-consumer. You will have to provide some examples. If the examples involve mentions of AMD's less-than as good performance, that's a bs argument.
and being dicks to everyone that isn't them (including board partners and consumers with their terrible and predatory 4000 series pricing and prioritizing miners).
Predatory?? lol what. Explain it to us.
I think most have the expectation that nVidia should win simply by making a better product in a clean competition. RTX is probably one of the few times in which nVidia hasn't acted in this way as at least everything they were doing in terms of adding RT was adopted into the DX12 spec and it's open on Vulkan.
What no...

Most of the time that isn't the case though.
Ah.
Does nVidia make better hardware? For the most part, most of the time: yes. But by how much is actually much less clear due
Plenty of analysis in many ways such as $ per frame, which AMD usually wins but not always and not always by enough to matter.
to the software stack that isn't "told" by a simple benchmark.

Now the other half of this is how much of a dick AMD would/could be if they're on top. I have no illusions that corporations won't act like corporations. Up to this point though AMD has definitely relied on less/fewer underhanded tactics in both the processor and GPU space.

EDIT: spelling/grammar
For at least the last 15 years all graphics innovations of any note were developed by nVidia, then AMD copied.
Adaptive sync
cuda
Raytracing
DLSS
Frame Generation

The one item you might go but what about! would be Mantle, which was developed by AMD/Dice, but that only worked on AMD gpu's and didn't gain much traction in the PC market. It was quickly superseded by Vulkan, developed by the Khronos group, which has many contributing members including both AMD and nVidia.

Some of these AMD has yet to copy, but rest assured, they are working on it.
 
Demonstrably false.
List the nVidia API's that are open source, royalty free for any game dev and opposing graphics manufacturers to use. I'll wait.

Would you like to compare that to AMD's list? We can start with looking at GPUOpen, a massive SDK and middleware suite.

Even if you're a non-dev, you can go through all of AMD's open SDKs: https://gpuopen.com/
It ran slower on AMD because those GPU's had much less shader power than the nVidia GPU of the same era. Cue AMD butthurt fanboys crying in their cereal, yelling 'He cheated!' in between sobs. The very next AMD GPU corrected the shader performance imbalance, and Gameworks games played fine. Gameworks games meaning games programmed to use nVidia-supplied DLLs to shorten dev time to implement those graphics features. It was good for the gaming industry and AMD was only hurting for 1 generation. It's not nVidia's job to tell their competitors "beef up your shader performance or your shit will suck...".
Guessing you didn't read the article and weren't there.

According to Nvidia, developers can, under certain licensing circumstances, gain access to (and optimize) the GameWorks code, but cannot share that code with AMD for optimization purposes.

Not remotely false at all. nVidia intentionally put a wall around AMD's or any third party's ability to optimize for Gameworks code - and that's from nVidia's own mouth. Guess you disagree with nVidia about nVidia? Feel free to actually show a counter.
AMD's 'open source' technologies are made that way because if they were not, they would die on the vine.
You act like this is something that needs contesting. It doesn't. But also nVidia could make all their API's open source, and don't. I can tell you which is helping more people.
Sounds like a good business strategy, but it's unproven, and always has been. What HAS been proven is both nVidia and AMD cheating on benchmarks of old, more than once. You conveniently left that out in your diatribe.
You're more than welcome to talk about whatever you want to talk about. It's not my job to post about everything all companies have done. You call my statement a diatribe, then say it's not long enough because certain things aren't included? Try at least being consistent.

Not remotely unproven. I just did, and it has been shown repeatedly. Feel free to actually have a counter with examples.
That's misrepresenting it. nVidia has for many years assisted game dev's with implementing graphical features, and new features that they haven't yet had any experience with. This is good for the game dev, and it's good for gamers too. AMD has their own version of that program, but nVidia does a far better job at it.
Which they could release as tools for free, and don't. Meaning other people (AMD) have to.

It's not good for game devs and not good for gamers. All it does is divide the market at best and at worst force people to buy nVidia hardware. Which again is the real point of these API's: to push nVidia hardware. A point that is apparently lost on you.

If nVidia were acting benevolently toward game devs, then they would be happy to have AMD also have access to the technology, so that game devs could sell titles that perform equally well on any vendor's hardware and thus sell more games. Funny that that's not the case, no? Thought they were trying to help game devs sell games?
Which was blown all out of proportion. All nVidia wanted with that was to separate the branding that their GPU's were sold under, to not also contain competitors products. (for some reason this flipped some peoples' shit to the extreme). The ROG brand for example. If people have a good experience with a ROG branded nVidia card, then they are likely to tell their friends, and buy one again the next time. If the box says ROG but the chip is AMD's, then nVidia's past products are helping sell their competitors products. nVidia didn't want that, it's a pretty obvious business decision.
Wanting to separate themselves is good business, and ultimately would have been better for consumers too.
They wanted to take ROG branding and make it their own. Which it wasn't. Did nVidia develop ROG? Do they have copyright over ROG? Do they have ownership over any ASUS property? Last time I checked the answer is no. So why should nVidia get any right to dictate what ASUS does with their intellectual and/or branding property?

You have to have a very twisted version of right and wrong to say that nVidia should be allowed to enter someone's house and say what they can do with their property. And a lot of gall on top of that to then say that's beneficial to the consumer.

I would love to see you have a conversation with Kyle about it.

As Kyle stated in his findings:
While investigating the GPP, Kyle Bennett from HardOCP spoke with seven companies, none of which wanted to go on the record. However they did speak anonymously about the GPP, and, according to Bennett, all the people he spoke to had similar opinions about the program.
Bennett summarizes the opinions as follows:
  • The terms of the GPP agreement are potentially illegal
  • The GPP will hurt consumer choices
  • The GPP will hurt a partner's ability to do business with other companies like AMD and Intel
These opinions stem from a key component of the GPP agreement document, which Bennett read but decided not to publish. This component states that GPP partners must have their "gaming brand aligned exclusively with GeForce". In other words, if a company like Asus wanted to join the GPP, they would not be allowed to sell AMD graphics cards as Republic of Gamers products. No more ROG-branded Radeon video cards, no more ROG laptops with AMD graphics inside.
and
The GPP requires participants to align their gaming brands exclusively with GeForce, and if they don't sign up to the program, their direct competitors that are part of the GPP will get special treatment from Nvidia. So there is a pretty strong incentive for OEMs and AIBs to sign up otherwise they'll be left in the dust by the dominant player in the graphics market.
And it allegedly goes beyond the terms outlined in the GPP document. Some AIBs expressed concerns that if they do not sign up to the GPP, Nvidia would restrict GPU allocations and preference GPP members instead. This isn't in the GPP agreement itself, but is allegedly happening through under-the-table agreements.
The biggest issue that stems from this kind of arrangement is that OEMs and AIBs are essentially forced into signing up to remain competitive among Nvidia partners. If they don't join the GPP, they won't get benefits like marketing development funds or launch partner status. And if a competitor does join, they will receive a genuine advantage, which puts anyone that decides not to join the GPP in a disadvantageous position.
But by all means, I would love for you to explicitly state that Kyle is a liar. The GPP was by definition anti-competitive and likely illegal. The whole point of it was to force AIB's to go exclusively nVidia or be shadow-banned.
but OMG nvidia evil!
Indeed. Glad we agree.
How? By all accounts that was profitable for everyone. I'll admit nVidia wants a larger piece of the pie... they also spend $5 billion a year on R&D. Board partners spend a 10th of that, if not even less. Who was the legitimately 'greedy' partner in that business proposition?? They all want profit, and as far as anyone can tell, 3xxx GPU's made everyone a lot of money.
If you want to have a conversation about what is "demonstrably false," then that claim is one of them, especially considering that EVGA released all of their margin information, along with a bunch of additional information as to why they left the market, as well as Igor's piece. But I guess you'd rather believe your own opinions about this terrible company than the board partners who worked with them for years behind the scenes?

The only way to be profitable in the GPU card vendor game is to sell incredibly high volume.
Being the best in the business for a solid 15 years running, leading innovation over that same period, supporting game devs with good tools isn't really anti-consumer. You will have to provide some examples. If the examples involve mentions of AMD's less-than as good performance, that's a bs argument.
I already have. Creating closed APIs is bad for consumers and literally bad for the rest of the market. Bad enough that your examples below like GSync forced VESA to make open standards. It's those open standards that even allow adaptive sync on TVs. GSync is all but dead and adaptive sync won out.

I'll tell you that nVidia would have been more than happy to force every display manufacturer to have to buy a proprietary chip and muscle AMD out. nVidia would've happily forced every TV to cost $100 more to get a GSync chip forcing every consumer to have to pay more money just to get adaptive sync.

GSync, by its very nature, is designed to be anti-consumer so that nVidia can pocket more money. The great irony is that now nVidia is riding on the back of open adaptive sync standards. They're still jerks that won't let other people use GSync without paying an absurd licensing cost, though.
Predatory?? lol what. Explain it to us.
There isn't anything that needs to be explained. The vast majority of users that aren't buying any of these cards already understands. Only staunch nVidia defenders like yourself don't.
What no...
We get it, you prefer nVidia being closed source and jerks. As well as using underhanded tactics, abusing their AiB's, and threatening reviewers over samples when they say anything they don't like.
Yeap.
Plenty of analysis in many ways such as $ per frame, which AMD usually wins but not always and not always by enough to matter.
Perspective.
For at least the last 15 years all graphics innovations of any note were developed by nVidia, then AMD copied.
Adaptive sync
cuda
Raytracing
DLSS
Frame Generation

The one item you might go but what about! would be Mantle, which was developed by AMD/Dice, but that only worked on AMD gpu's and didn't gain much traction in the PC market. It was quickly superseded by Vulkan, developed by the Khronos group, which has many contributing members including both AMD and nVidia.

Some of these AMD has yet to copy, but rest assured, they are working on it.
Again, all of which could've been open source and helped out the entire market place.

Honestly much of this has just hurt nVidia in terms of R&D dollars. Generally what has happened is nVidia makes something closed source, tries to ram it down everyone's throats. AMD counters by making an open source variation, and then everyone uses the open source version because: 1) it's open, 2) requires no licensing, 3) is vendor agnostic.

Game devs have no dog in the fight. They want everyone to play their games regardless of AMD or nVidia hardware. Therefore it's actually worse for them to use nVidia technologies because then they are giving a worse experience to a significant portion of the market. So they either use the open source option or have to program twice. This also goes double because of consoles. It's worse to use nVidia's proprietary API's when most things are getting ported to AMD hardware.

If you're massive like Epic and you can use every rendering technology in your engine that's one thing. Epic's primary product is Unreal engine so they have the dev resources to do that. But elsewhere even in AAA gaming it's just not worth it.
 
The $500 GTX 680 was faster, more power efficient, cooler, had far better frame times in SLI vs. CrossFire, and had more features than the $550 Radeon HD 7970. Are you saying the Radeon wasn't high-end silicon either? That's not a good look if AMD couldn't beat Nvidia's midrange with their high end, eh? :LOL: Who cares about codenames this much, seriously...?
The $500 GTX 680 was also a lousy uplift over the GTX 580 if you consider what counted as a normal gen-on-gen uplift back then. The real successor to the GTX 580 would be the GTX 780 Ti, as those are big Fermi and big Kepler respectively. The GTX 680 was literally nothing more than mid-range silicon branded and priced as high end.

Radeon 7970 was high end silicon that was beaten by mid-range Nvidia silicon.

So what's your point?
 
The $500 GTX 680 was also a lousy uplift over the GTX 580 if you consider what counted as a normal gen-on-gen uplift back then. The real successor to the GTX 580 would be the GTX 780 Ti. The GTX 680 was literally nothing more than mid-range silicon branded and priced as high end.

Radeon 7970 was high end silicon that was beaten by mid-range Nvidia silicon.

So what's your point?
Classic revisionist history.

Sums it up nicely:

https://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/20

“In any case, this has ended up being a launch not quite like any other. With GTX 280, GTX 480, and GTX 580 we discussed how thanks to NVIDIA’s big die strategy they had superior performance, but also higher power consumption and a higher cost. To that extent this is a very different launch – the GTX 680 is faster, less power hungry, and quieter than the Radeon HD 7970. NVIDIA has landed the technical trifecta, and to top it off they’ve priced it comfortably below the competition.”

The GTX 680 did mark the first time people started crying about an x80 that was really a xxx.
So I thank it for that quality drama.
 
Sour grapes. AMD makes 1 good product and 5 trash ones on average, so I didn't expect anything more from them.
On graphics they are so far behind it isn't even funny anymore.
 
The short version is that AMD is the company that is producing API's or modifying API's and releasing them as open source. nVidia for the most part only uses and creates API's that are proprietary.

Just as one off the cuff example, nVidia created "Gameworks". There was a scandal at the time that nVidia intentionally added increased tessellation to the rendering pipe that would be culled on nVidia hardware (and otherwise run faster with a better tessellation engine) that obviously wouldn't be on AMD hardware. They also made it impossible for any other vendor to examine the code in the libraries, making it impossible for ATi/AMD to optimize their hardware/drivers to run Gameworks properly. In short, Gameworks was intentionally designed to run slower on competitor hardware by obfuscation and it being closed source. It was/is seen as being malicious. Then AMD essentially had to create an open source version of Gameworks, GPUOpen with technologies like TressFX essentially to prevent nVidia's muddying of the playing field.

Repeatedly it has been shown that nVidia has been more than willing to use underhanded methods with their proprietary technology to stifle innovation in other companies and prevent other companies from being equally performative due to optimizations. Whereas AMD has repeatedly then had to make open source variations of the same things to not only even the playing field, but to give a transparent and open standard across the board.

This is generally why AMD is seen as the company that is "good" for the industry, whereas nVidia is looked at as "bad". And there are countless examples of this: "The Way it's meant to be played" (basically paying game devs to optimize for nVidia hardware and ignore competition), the GPP program that Kyle revealed, and how they've treated their board partners (EVGA). It basically boils down to anti-competitive and anti-consumer behavior and being dicks to everyone that isn't them (including board partners and consumers with their terrible and predatory 4000 series pricing and prioritizing miners).

I think most have the expectation that nVidia should win simply by making a better product in a clean competition. RTX is probably one of the few times in which nVidia hasn't acted in this way as at least everything they were doing in terms of adding RT was adopted into the DX12 spec and it's open on Vulkan. Most of the time that isn't the case though.

Does nVidia make better hardware? For the most part, most of the time: yes. But by how much is actually much less clear due to the software stack that isn't "told" by a simple benchmark.

Now the other half of this is how much of a dick AMD would/could be if they're on top. I have no illusions that corporations won't act like corporations. Up to this point though AMD has definitely relied on less/fewer underhanded tactics in both the processor and GPU space.

EDIT: spelling/grammar
Gameworks is not an API. It is middleware that uses standard DirectX calls to generate graphical effects that take advantage of NVIDIA hardware. If you have a citation that NVIDIA deliberately made Gameworks effects perform worse on competitor hardware, rather than just being optimized to work on their own hardware, I'd love to see it.

"The Way It's Meant to be Played" is no different from ATi "Get in the Game"/AMD Gaming Evolved/AMD Gaming. You can't ignore how AMD Gaming titles like The Callisto Protocol and Far Cry 6 run worse on NVIDIA hardware while calling NVIDIA "dicks" at the same time.

NVIDIA is one of the biggest promoter members of the Khronos Group.
 
Well met. I won’t argue any of those points - but you’re forgetting one thing - the full product stack isn’t released, yet. Only super high end is out. A lot of people are on the sidelines - waiting.
I'd say that another factor is that most people aren't really hurting with their current GPUs at the moment. Most of the market is still stuck on 1080p where you can use just about any old ass card. My son's PC is currently pushing 1440p with a Radeon RX470 4GB (that I bought for like $160 7 years ago) and he says everything he's played is absolutely fine (even RDR2). I'm on a 2080ti at 3440x1440 and while some games push it a bit it's still perfectly serviceable and overkill for the vast majority of games. I don't think we have the amount of software that's needed to push most of these new cards unless you're an enthusiast that demands 4k all max settings in every game.
 
Classic revisionist history.

Sums it up nicely:

https://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/20

“In any case, this has ended up being a launch not quite like any other. With GTX 280, GTX 480, and GTX 580 we discussed how thanks to NVIDIA’s big die strategy they had superior performance, but also higher power consumption and a higher cost. To that extent this is a very different launch – the GTX 680 is faster, less power hungry, and quieter than the Radeon HD 7970. NVIDIA has landed the technical trifecta, and to top it off they’ve priced it comfortably below the competition.”

The GTX 680 did mark the first time people started crying about an x80 that was really a xxx.
So I thank it for that quality drama.
How is that "revisionist"? Of course reviewers praised it back then coming off of "Thermi", but you do realize in that time period that 30% uplift over last gen with a new arch back then was terrible right? Yet no one called that out. AMD is not immune to that criticism either. The original 7970 is not what should have launched.

Anyway, the fact remains that The GTX 680 used silicon that was typically used in 60-class cards...again, at the time. There's nothing revisionist about that, that's what happened. It's a fact.

Before the GTX 680, the 80 card was always the flagship gaming card (sometimes they'd refresh it with die-shrink silicon, or more unlocked cores like was the case going from the 480 to the 580), but it was always the top end silicon. Not so the GTX 680.

Nothing revisionist about that. Suddenly 80-class cards which retained their same branding and pricing was using x04 silicon. Exceptions to this over the past decade is the 780 and the 3080, though I'd point out the 780 was extremely cut down, and the 3080 was also cut down silicon).
 