Join us on November 3rd as we unveil AMD RDNA™ 3 to the world!

"Fastest GPU under $1000"
... can be interpreted 2 different ways.

"Fastest GPU AND under $1000."
-OR-
"Fastest GPU at a price point of less than $1000."

Methinks it's the latter rather than the former.
It's definitely the latter.
 
Feels ridiculous to me that we are applauding $1000 price tags on GPUs, but I guess that's where we're at now.
Was going to say the same thing. Guess it really just comes down to the badge on the card for some.
 
Fastest gaming card under $1000, that's a fair description. Unless Nvidia releases something else to counter it by Dec 13th.
If leaked benches are to be believed, and AMD's performance claims are close to right, Nvidia will not be competitive unless they plan to drop the 4080 16GB to $899.
But who knows... AMD didn't even try to give us benchmark numbers.
 
May the new 7900 XT's performance and pricing make the 4080's price look silly enough to force better pricing on the rest of the lineup.
The presentation was a bit weak on numbers, but I was a bit pessimistic; maybe it will even be the 7900 XTX that makes the $1200 price look silly.
 
I mean, let's be honest, I don't think it will be as fast as the 4090. But AMD didn't make this a 450W card. I mean, it's 100W less and $500 cheaper...

Makes me wonder if they will let AIBs do whatever they want and make power-hungry cards with faster memory, etc.
Can say one thing: The 7900 XTX will most likely destroy the 4080 16GB at this price point. I wonder how NVIDIA will respond?
 
I'm disappointed that they aren't available immediately, and also disappointed that they didn't give any pricing/availability on the RX 7600, RX 7700, and RX 7800 XT cards. Or did I miss it?
 
But who knows... AMD didn't even try to give us benchmark numbers.

The lack of any benchmarks or comparisons to any RTX GPU is the benchmark.

It'll obviously fall somewhere between the 3090 Ti and the 4090, but from a PR strategy standpoint, I guess if you can't top the chart then don't bother with the chart, especially when your audience is so myopically 'winner takes all' about FPS charts.

Still, the 7900 XTX does look like it'll be the performance-per-watt-per-price leader, and there might've been better ways of conveying that.
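For what it's worth, that composite metric is easy to make concrete. Here's a minimal sketch of one way to score it; every number in it is a hypothetical placeholder, not a benchmark result:

```python
# Minimal sketch of a "performance-per-watt-per-price" score.
# All FPS, power, and price figures here are hypothetical placeholders.

def perf_per_watt_per_dollar(fps: float, board_power_w: float, price_usd: float) -> float:
    """Average FPS divided by board power and price; higher is better."""
    return fps / (board_power_w * price_usd)

cards = {
    "7900 XTX (hypothetical numbers)": (100.0, 355.0, 999.0),
    "4090 (hypothetical numbers)": (120.0, 450.0, 1599.0),
}

for name, (fps, watts, price) in cards.items():
    score = perf_per_watt_per_dollar(fps, watts, price)
    print(f"{name}: {score * 1e6:.0f} (FPS per W per $, x1e6)")
```

Whatever the real numbers turn out to be on Dec 13, a chart of a score like that would have sold the claim better than FPS bars they can't win.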
 
Were people doing that? I thought everyone was calling out the price.

Didn't stop them from buying it anyway, but...
We have been at this price point, around a grand, since the 1080 Ti. I think I paid 700 bucks for the Founders Edition on that one. The aftermarket cards were easily coming close to a grand for that release, and that was 2017. I picked up a 2080 Ti for 999 when the Black Edition from EVGA came out. It's been a grand for a while now for top-tier performance.
 
I have to believe that the 7900 XTX was designed to compete with the best Nvidia had to offer, but when it was being designed, AMD did not anticipate that Nvidia would push its halo card to 450+ watts of power usage. Therefore, AMD had to regroup and re-price their product to compete. That could explain why the 7900 XTX is the same MSRP as the 6900 XT was, but using a much wider bus and bigger core.

Pretty sure Nvidia just broke out the champagne glasses at HQ; they won.
 
Can say one thing: The 7900 XTX will most likely destroy the 4080 16GB at this price point. I wonder how NVIDIA will respond?
A price drop, likely. Nvidia made no attempt to hide that they were inflating the price of the 4000 series cards to steer users toward the 3000 series overstock. Nvidia is going to take a bath on their unsold 3000s; it's gonna get ugly there.
 
I have to believe that the 7900 XTX was designed to compete with the best Nvidia had to offer, but when it was being designed, AMD did not anticipate that Nvidia would push its halo card to 450+ watts of power usage. Therefore, AMD had to regroup and re-price their product to compete. That could explain why the 7900 XTX is the same MSRP as the 6900 XT was, but using a much wider bus and bigger core.

Pretty sure Nvidia just broke out the champagne glasses at HQ; they won.
They were never going to beat the top card, so why go all out instead of going 80-90% of the way while being cheaper and more efficient? It's what has served them best so far.
 
I have to believe that the 7900 XTX was designed to compete with the best Nvidia had to offer, but when it was being designed, AMD did not anticipate that Nvidia would push its halo card to 450+ watts of power usage. Therefore, AMD had to regroup and re-price their product to compete. That could explain why the 7900 XTX is the same MSRP as the 6900 XT was, but using a much wider bus and bigger core.

Pretty sure Nvidia just broke out the champagne glasses at HQ; they won.
I'm really unsure about that part. I imagine everyone knew they would push power if they needed to, at least by the time the 3090 Ti was announced.

What was not expected was how efficient Ada Lovelace would be:

[chart: Ada Lovelace power-efficiency scaling]


Had Nvidia launched a 350-370 watt version, how much would it have moved the 4090?

Had Nvidia known about both Ada's efficiency (and the sharply diminishing returns past 300W) and AMD's plans, I could see them launching a regular-sized, regular-powered 350-370W card instead.

AMD knocking $100 off the x900 card's price tag after ~15% inflation plus an upcoming China tariff is quite aggressive here.
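Rough math on that, assuming the post means the 6900 XT's $999 launch MSRP versus the 7900 XT's $899 announced MSRP, and taking the 15% inflation figure at face value:

```python
# Back-of-the-envelope real-terms math for "removing $100 after 15% inflation".
# Assumes the 6900 XT's $999 launch MSRP and the 7900 XT's $899 announced MSRP;
# the 15% cumulative inflation figure is the poster's, not an official number.

old_msrp = 999.0   # 6900 XT launch MSRP (2020)
new_msrp = 899.0   # 7900 XT announced MSRP (2022)
inflation = 0.15   # cumulative inflation, per the post

old_in_2022_dollars = old_msrp * (1 + inflation)  # ~$1149
real_cut = 1 - new_msrp / old_in_2022_dollars     # ~22% effective cut

print(f"6900 XT MSRP in 2022 dollars: ${old_in_2022_dollars:.0f}")
print(f"Real-terms cut at $899: {real_cut:.0%}")
```

Under those assumptions, the nominal $100 cut works out to roughly a 22% cut in real terms, which is why it reads as aggressive.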
 
The lack of any benchmarks or comparisons to any RTX GPU is the benchmark.

It's obviously going to fall somewhere on the scale between a 3090 Ti and a 4090, but from a PR strategy standpoint, if you can't top the chart then don't bother with the chart.

I would say this is also probably a big part of why Nvidia didn't actually release the 4080s for sale yet. AMD would have had something to show on a chart if they had... and I have a feeling the 7900 XTX at least will probably make the 4080 look bad, considering how cut down it is vs the 4090.
 
They were never going to beat the top card, so why go all out instead of going 80-90% of the way while being cheaper and more efficient? It's what has served them best so far.
Pretty much. I never got the obsession with who has the top card. Going to be an impractical buy for most people anyways. Let's see who gives me the best card at $500 or under.
 
I'm disappointed that they aren't available immediately, and also disappointed that they didn't give any pricing/availability on the RX 7600, RX 7700, and RX 7800 XT cards. Or did I miss it?

They'll probably show up at CES in January. AMD, like Nvidia, still wants people to buy out the last-gen stuff.
 
AMD's die is half the size of Nvidia's.

Even without a node shrink, with some conservative clocks and a bigger die, AMD could launch a card on its current process, in the same power envelope Nvidia is using (on a smaller node), that could destroy their flagship.

I don't see how Nvidia is "winning." Not when there's all this excess stock in the pipeline. Both companies need to introduce new products because they're publicly-traded, but this round is about who loses the least.

The implications of RDNA3 must have Nvidia and Intel trembling.
 
I mean, they could easily make a higher-clocked version with faster GDDR6 memory, then add in more V-Cache and push it to 450W. But it might be a $1599 card that couldn't beat the 4090? Who knows.
 
AMD's die is half the size of Nvidia's.

Even without a node shrink, with some conservative clocks and a bigger die, AMD could launch a card on its current process, in the same power envelope Nvidia is using (on a smaller node), that could destroy their flagship.

I don't see how Nvidia is "winning." Not when there's all this excess stock in the pipeline. Both companies need to introduce new products because they're publicly-traded, but this round is about who loses the least.

The implications of RDNA3 must have Nvidia and Intel trembling.
It's half the size and half the transistor count; transistors still count for something.
 
Pretty much. I never got the obsession with who has the top card. Going to be an impractical buy for most people anyways. Let's see who gives me the best card at $500 or under.
Yeah, I think that card is already out there. This is a game of "big" now. 1440p-and-lower-res gamers, your (very) cheap card is already out there... just pick.

But still, given what was presented, what if there's an RDNA 3 card that's really, really low power and can deliver 1440p reasonably well? What if it's low profile, or even single slot, or even bus-powered? These are things that might (emphasis) be in reach for AMD where they'd be laughable on the Nvidia side. Imagine a gaming laptop that doesn't always have to be near an outlet, or a reasonable NUC-like device with high-end discrete graphics. Anyway, time will tell. Hopefully they didn't take a page from Intel with regard to drivers.
 
Feels ridiculous to me that we are applauding $1000 price tags on GPUs, but I guess that's where we're at now.
Never mind that the $999 is fantasy pricing - meaning a reference model from AMD.com will be somewhere between hard to get and untouchable/botted, while Radeon AIBs will probably run wild again to $1399 and beyond, and if they don't, then scalpers will pick up any remaining slack.
 
der8auer has shown the 4090 at 350W has 95% of its performance compared to 450W. That would make it still faster than the new AMD cards at the same power requirements. NV really knocked it out of the park with the 4090.
NV really knocked it out of the park with the 4090; they also knocked it out of the park on most PC cases and bank accounts.
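Taking the der8auer figures in the quote above at face value, the perf-per-watt side of that is easy to sanity-check; the FPS baseline below is an arbitrary placeholder, since only the ratios matter:

```python
# Sanity-check of the quoted der8auer result: a 4090 power-limited to 350W
# keeping ~95% of its 450W performance. The baseline FPS is an arbitrary
# placeholder; only the ratios matter here.

baseline_fps = 100.0               # 4090 at stock 450W (placeholder)
limited_fps = 0.95 * baseline_fps  # ~95% performance at 350W, per the quote

stock_eff = baseline_fps / 450.0   # FPS per watt at stock
limited_eff = limited_fps / 350.0  # FPS per watt when power-limited

gain = limited_eff / stock_eff - 1
print(f"Perf/W gain from the 350W limit: {gain:.0%}")  # ~22%
```

So if the quoted numbers hold, the 4090 picks up roughly 22% in perf/W just from the lower power limit, which is what makes the stock 450W configuration look like overkill.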
 
Bravo on the announcement, AMD. Now let's see if that bravo will hold come Dec 13 when the test results come out.
 
Never mind that the $999 is fantasy pricing - meaning a reference model from AMD.com will be somewhere between hard to get and untouchable/botted, while Radeon AIBs will probably run wild again to $1399 and beyond, and if they don't, then scalpers will pick up any remaining slack.
I don't expect such a large difference, but it seems that, unlike with Nvidia, AIBs have some room for custom clocks again, so price premiums could actually have performance benefits attached.
 
Jesus, Steve is obsessively anal about the power adapter and size dig. He's right to roast them on the "8K" wide and w/FSR issue.

I thought the power adapter dig + smug grin + pausing for 30 seconds was something the AMD guy could've avoided. Low effort, too obvious, everyone already knows, and the power adapter thing isn't going to actually change anyone's mind about anything, any more than the persistent "Tesla catches FIRE!?" FUD brought down Tesla.
 
How many years have to pass before we can put that non-issue to rest?
At least tree fiddy. People love to harp on outdated info.

As a current AMD (CPU, GPU) and NVIDIA user, I've had one major issue with AMD cards/drivers, which has since been fixed (or may not have been directly AMD-related at all). Admittedly I've had zero issues with NVIDIA, but it's ridiculous how much people worry about this. I'm also meticulous about fresh installing / DDU / some variant of not installing shit over other shit, and I seemingly rebuild systems every 6 months anyway... so I don't get a lot of the "rot" that happens, nor do I just jack around with shit for the hell of it. Other than the aforementioned AMD issue that was fixed, no issues with AMD.
 
I thought the power adapter dig + smug grin + pausing for 30 seconds was something the AMD guy could've avoided. Low effort, too obvious, everyone already knows, and the power adapter thing isn't going to actually change anyone's mind about anything.
Low effort and cringe are a standard feature for these presentations.
 
It's half the size and half the transistor count; transistors still count for something.

Yeah, so they put this out now, at this price point, force Nvidia to lower their prices, and still have the option to go big with a refresh, when there's no surplus stock left eating into their market share.

TSMC's max die size on 4nm is around 850mm², so Nvidia can't go bigger than that. AMD is at around 300mm² (and on 5nm).

If these were car engines, Nvidia's got a six-cylinder engine making 450/450 that gets a combined 19 MPG. AMD just rolled out a three-cylinder making 350/350 that gets 28 (rough math on those figures below). What happens when AMD decides to throw some displacement into it?

And right now, the only real market for new cards is mobile. Focusing on efficiency, whether it was planned around the crypto collapse or just a conservative approach to this gen, puts AMD in a real solid position, more so than a lot of people realize.
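For what the analogy is worth, here it is reduced to its ratios; the 19 and 28 MPG figures are the poster's invention, and reading MPG as perf-per-watt is purely illustrative:

```python
# The car analogy above, reduced to ratios. The MPG figures are the poster's
# invention; reading them as perf-per-watt is purely illustrative.

nvidia_mpg = 19.0  # the "six-cylinder" 450W flagship
amd_mpg = 28.0     # the "three-cylinder" 350W newcomer

efficiency_edge = amd_mpg / nvidia_mpg  # ~1.47x implied efficiency advantage
print(f"Implied efficiency edge: {efficiency_edge:.2f}x")

# "Throwing displacement into it": if a bigger die held that efficiency all
# the way up to the rival's 450W budget (a generous, linear-scaling assumption;
# real silicon hits diminishing returns), the lead at equal power would be
# that same ~1.47x factor.
print(f"Hypothetical lead at matched power: {efficiency_edge:.2f}x")
```

That linear-scaling assumption is doing a lot of work, but it's the headroom argument the post is making.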
 