AMD Intentionally Held Back from Competing with RTX 4090

Status
Not open for further replies.

sk3tch

2[H]4U
Joined
Sep 5, 2008
Messages
3,588
https://www.notebookcheck.net/AMD-i...spend-savings-on-other-PC-parts.700365.0.html

"AMD could have made an RTX 4090 competitor but chose not to in favor of a trade-off between price and competitive performance. AMD's rationale is that a 600 W card with a US$1,600 price tag is not a mainstream option for PC gamers and that the company always strived to price its flagship GPUs in the US$999 bracket. The company execs also explained why an MCM approach to the GPU die may not be feasible yet unlike what we've seen with CPUs."

They could have - but they didn't wanna.
 
Fair. Considering Nvidia's pricing structure and the fact that their GPU sales are down, it seems like a high-range, but not insane-range, card was the right move to make. Well, I'm sure some people who spend $1,600 on graphics cards would disagree, but whatever.
 
If they would have needed 600 W to reach the 4090 (which does it at 360 W or less most of the time while gaming, and maxes out a bit above 450 W), and needed the model to be $1,600 to make sense for their profit margin, one could understand....

But those feel like two big ifs. Maybe with regular TSMC 5 instead of Nvidia's fancy custom TSMC 5 node it's true, but I am not sure how much you know in advance.
 


I dunno, the 4090 smokes the AMD cards at 300w too.
 
I dunno, the 4090 smokes the AMD cards at 300w too.
Which would be why they thought they needed 600 W just to match, maybe?

If they are being fair here and not playing with words in a PR move....
 
Which would be why they thought they needed 600 W just to match, maybe?

If they are being fair here and not playing with words in a PR move....

What I should say is “it could happen, it won’t but it could”.

How many more $999 cards are sold over $1600 cards? I’m betting that market is tiny.
 
I'm OK with them not making it. The price Nvidia is charging for the 4090 is absolutely bonkers, and I'd hate for it to be normalized by all parties chasing a price point that is bad for gamers and bad for PC gaming; getting that expensive is not good for any of us.

It's bad enough paying $1,000 for a top-end card.
 
https://www.notebookcheck.net/AMD-i...spend-savings-on-other-PC-parts.700365.0.html

Source: https://old.reddit.com/r/Amd/comments/11pclsw/amd_intentionally_held_back_from_developing_a_600/

They could have - but they didn't wanna.


[are-you-serious-spiderman.gif]
 
So much for them trying to go for that halo product mindshare, I guess...

With that said, I was pissed enough that I spent $1,000 on a GPU and got a defective vapor chamber out of it. I'd be even more pissed if that happened on a $1,600 card. RDNA 3 was clearly not ready for prime time, regardless of which card they were gunning for.

This isn't even getting into the crippled VR performance that isn't doing AMD any favors at the high end with their current drivers. When I traded that 7900 XTX for a 4080 and got significantly better performance in DCS to the point where I wasn't constantly enduring motion smoothing artifacts, that's clearly a warning sign that AMD isn't competitive in an area where a $1,000+ GPU is actually justified.

Beyond that, four-figure GPUs just for gaming are indeed ludicrous. NVIDIA gets away with it because their bet on AI/ML is paying off big time: professional users are buying up RTX 4090s because they have just enough VRAM for that work and also support ECC (while being significantly faster than the RTX 6000 Ada for datasets that fit in 24 GB rather than 48 GB). And they've got CUDA as an effective means of vendor lock-in that keeps AMD out of the lucrative HPC market, where I'm sure they'd love to dominate just as handily as they do on the CPU side with EPYC Genoa.

AMD's got an uphill battle ahead of them, and it just became even more difficult after RDNA 3's launch caused them to lose a lot of mindshare and actually kind of justify NVIDIA's insane pricing on the 4080 and 4090.
 
As did Nvidia, but only in the current clown-world climate is $1K considered mainstream.
Unfortunately, their experiment may pay off: when you make 10x the profit per unit, you can sell 1/10th the inventory and still break even.
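A toy sanity check of that margin arithmetic, with entirely made-up unit economics (none of these numbers come from AMD or Nvidia): 10x the per-unit profit at 1/10th the volume nets out to the same total.

```python
# All numbers hypothetical, purely to illustrate the break-even claim.
mainstream_profit_per_unit = 100    # pretend $ profit on a $999-class card
halo_profit_per_unit = 1_000        # pretend 10x profit on a $1,600-class card

mainstream_units = 100_000
halo_units = mainstream_units // 10  # sell 1/10th the inventory

# Total profit is identical either way: 10x margin cancels 1/10th volume.
assert mainstream_profit_per_unit * mainstream_units == halo_profit_per_unit * halo_units
print(mainstream_profit_per_unit * mainstream_units)  # prints 10000000
```

Of course this ignores fixed costs like die design and masks, which is exactly why low-volume halo parts are riskier than the raw margin math suggests.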
 
I paid $899 for my 3080. The rip-off is so great that when it dies, I won't be buying another discrete GPU; whatever an APU can push is what I'll play.
 
Paying over $1,000 for a card is just insane to me. I remember when top-shelf cards were $500; now it's just nuts. The card I just bought was based on the best FPS for the price. I am fine with playing at 1080p for now.

But more power to you guys who spend $1,500 on a card.
 
No. Not buying it. The 7900 XTX was meant to be a flagship product (384-bit memory bus, 24 GB, high power targets), designed to go up against the best Nvidia had to offer. AMD saw what Nvidia did with the RTX 3090 and said "yeah, we can beat that easily," so they designed a product that hit around a 30-50% performance uplift. This is in line with normal generation-to-generation performance improvements over the past 8-10 years.

The issue, though, is that Nvidia used an inferior process node for the RTX 3000 series and still got very good results out of it; the architecture was not the problem, the node it was fabricated on was. Naturally, as soon as Nvidia released their next-generation GPU on a cutting-edge process node, it looked like a massive leap forward technologically, and AMD is once again firmly in second place. This was a calculated error on AMD's part. They do have the technology to make a product that competes with the 4090, and in fact, they tried to compete; they just did not expect Ada Lovelace to be such a powerful and efficient architecture. I'm glad AMD is offering a value proposition against the crazy price of the RTX 4090, but until they decide to throw down and release a product that obliterates Nvidia's GPUs, they will never command the same respect, and no amount of "hindsight is 20/20" marketing BS from them is going to change that.
 
I'm not sure why this statement surprises anyone; it sounds to me like they're saying they had the technical ability to make a more powerful card to position against the 4090, but that it didn't make sense (read: it wouldn't have compared well). My understanding is that the current MCM design can accommodate a slightly larger graphics module, but probably not enough to catch the 4090 even in raster, and it would decrease efficiency even more. Nvidia took back the efficiency crown this generation, and yet the 4090 is still a bit of a pig; I'd hate to see the numbers on a less efficient RDNA 3 card trying to compete with it.

It also seems to me like the two main gaming-related reasons to get a 4090 are high-res VR and being able to turn on RT effects in some of the more demanding current games without completely destroying framerates. These are two of AMD's biggest weaknesses right now, so even if they had managed to be competitive in raster performance, they wouldn't have been competitive in the main justifications for buying that level of card. If you don't care much about those two things, then the two tiers of cards below the 4090 are plenty for 4K (4080/7900 XTX) and 1440p (7900 XT/4070 Ti) currently and for the foreseeable future.

All cards are overpriced right now IMO but that's a separate discussion.
 
https://www.notebookcheck.net/AMD-i...spend-savings-on-other-PC-parts.700365.0.html

They could have - but they didn't wanna.
  1. The price of their GPUs is arbitrary. They could have made whatever they wanted at the $1K price point, and they could have priced it at whatever they wanted, too.
  2. If AMD is holding back, then I'd be more worried about price fixing with Nvidia, since according to AMD they just handed Nvidia the win and the sales.
  3. As Nvidia has demonstrated, the best GPU sells the most in the current overpriced market. Anything less won't sell as much. So why hold back?
  4. Nobody cares about their $900 GPUs, as evidenced by the 7900 XT's price cut to $800. The XTX still remains at $1K.
 
Lol, no they couldn’t have.

Also:

AMD's rationale is that a 600 W card with a US$1,600 price tag is not a mainstream option for PC gamers and that the company always strived to price its flagship GPUs in the US$999 bracket.

Ah, yes, “always”, that long-standing ultra-pro-mainstream $1000 flagship tradition that AMD has followed since all the way back to...the 6900XT
 
In real speak: "we couldn't compete with Nvidia unless we pushed our silicon to the edge and shipped a small nuclear reactor with every card."
Yeah, I don't know. It's not really a good look to go making excuses. At the same time, it's a shame they feel compelled to, because consumers will base buying $300 GPUs on who has the best $1,500 GPU.
 
Part of me wants to think SURRRE you could have... but considering the price and power consumption difference, it could actually be true.

Could they have? Possibly (I doubt it, though), if they said damn the efficiency and cost. Also, Nvidia pulling a rabbit out of their ass and figuring out how to get power usage below 450 W stock probably wasn't expected by AMD. But in all honesty, they don't need to compete with the 4090; there's very little money to be made with cards like that.
 
I agree with this statement, see ya in 2027 when the 10900 XTX releases...
 
As did Nvidia, but only in the current clown world climate is $1K considered mainstream.
Mainstream is still 1080p on a monitor at 120 Hz or less, and you can do that very well for under $300, with all 3 (weird using that number) vendors having a number of readily available cards.

The reality is the requirements of the "average" gamer haven't changed much in nearly 5 years. You can still play anything available today well at 1080p on a gosh-darned 1080, which is why the 1650 still sells so well.
 
The newly released RDNA 3-based GPU "Radeon RX 7900 XTX" is targeted at $999 (about ¥136,000), which is considered the "upper price" that high-end users among general PC gaming fans will accept. The "Radeon RX 7900 XT" below it is said to be $699 (about ¥95,000).

The pricing strategy is the same as the previous RDNA 2 generation (Radeon RX 6000 series), with the top-end "Radeon RX 6900 XT" and "Radeon RX 6800 XT" targeting $999 and $699, respectively. However, the target price changes with each GPU generation.

That's how you know this is BS: the 7900 XT didn't launch at $699, it's dropping from its $799 launch price because no one wants to buy it, so AMD has apparently decided to retcon reality by claiming this was the strategy all along. Give me a break.

AMD is a publicly traded company. They're watching Nvidia sell a bunch of $1,600 cards to gamers who apparently have way too much money to burn. If AMD could play in that market, they would. They could have even launched a competitor at $1,300 and still made insane margins siphoning those buyers away from Nvidia.
 