
AMD Radeon RX 9070 and 9070 XT Official Performance Metrics Leaked, +42% 4K Performance Over Radeon RX 7900 GRE

“AMD's internal benchmarks of its upcoming RDNA 4-based RX 9070 series graphics cards have been leaked, thanks to VideoCardz. The flagship RX 9070 XT delivers up to 42% better performance than the Radeon RX 7900 GRE at 4K resolution across a test suite of over 30 games, with the standard RX 9070 showing a 21% improvement in the same scenario. The performance data, encompassing raster and ray-traced titles at ultra settings, positions the RX 9070 series as a direct competitor to NVIDIA's RTX 4080 and RTX 5070 Ti. Notably, AMD's testing methodology focused on native rendering and ray tracing capabilities rather than upscaling technologies like FSR. The RX 9070 XT demonstrated large gains at 4K resolution, achieving a 51% performance uplift compared to the two-generations older RX 6900 XT. Meanwhile, the base RX 9070 model showed a 38% improvement over the RX 6800 XT at 4K with maximum settings enabled.”

Source: https://www.techpowerup.com/333025/...ked-42-4k-performance-over-radeon-rx-7900-gre
 
Digging into those results vs the 5070 Ti: that puts the 9070 XT a hair above on raster and 15% slower on RT. Put another way, a 7900 XTX with a 25% boost to RT.
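For anyone wanting to sanity-check that positioning: the deltas fall out of chaining ratios through the shared 7900 GRE baseline. A quick sketch in Python; the per-workload multipliers are illustrative guesses (the leak only gives combined averages), not slide figures:

# Chain relative-performance ratios through a common 7900 GRE baseline (= 1.00).
# All multipliers are illustrative assumptions, not leaked figures.
xt_raster, xt_rt = 1.37, 1.53    # 9070 XT over GRE (assumed raster/RT split)
ti_raster, ti_rt = 1.35, 1.80    # 5070 Ti over GRE (assumed from typical reviews)

print(f"9070 XT vs 5070 Ti, raster: {xt_raster / ti_raster - 1:+.1%}")  # ~ +1.5%, 'a hair above'
print(f"9070 XT vs 5070 Ti, RT:     {xt_rt / ti_rt - 1:+.1%}")          # ~ -15%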

AMD's going to price it in line with the market, so around $850-900. Put another way, a $150 discount from the 5070 Ti for 15% slower RT performance.
 
Could certainly be the case. If they are priced well, I assume the scalpers will just bot them all up anyway and jack the price up to $800-900. It feels like a no-win situation for regular consumers and it sucks. I'll restate here: hold the line, delay or don't purchase until things even out, hopefully some time later this year. I also don't play anything that won't run well enough on $200-600 cards, so there is that. Gave up chasing the best of the best and have been enjoying games casually without an fps obsession for years now. That helps a lot!
 
AMD's going to price it in line with the market, so around $850-900. Put another way, a $150 discount from the 5070 Ti for 15% slower RT performance.
They did that last time and it got them fucking nowhere. They need to price it aggressively to win mindshare, otherwise many are simply going to wait for availability to improve to buy the superior product for a slightly higher premium.
 
Daniel Owen analysis:

Performance slides have been leaked for the 9070 and 9070XT.

The comparisons are made to the 7900 GRE.

The 9070 is shown as 20% faster than the GRE, and the 9070 XT as 42% faster than the GRE.

These numbers are probably based on ray-tracing gains, not pure raster, because the 9070 XT won't be faster than the 7900 XTX.

GRE MSRP was $550. The question will be the pricing on the 9070/XT cards. I think $499 for the non-XT and $599 for the 9070 XT would help sell more of these cards.

But who knows how AMD will price these cards.

Video by Daniel Owen discussing the leaks:
https://www.youtube.com/watch?v=h0xAqkzQ53k
 
Could certainly be the case. If they are priced well, I assume the scalpers will just bot them all up anyway and jack the price up to $800-900. It feels like a no-win situation for regular consumers and it sucks. I'll restate here: hold the line, delay or don't purchase until things even out, hopefully some time later this year. I also don't play anything that won't run well enough on $200-600 cards, so there is that. Gave up chasing the best of the best and have been enjoying games casually without an fps obsession for years now. That helps a lot!
Great strategy. I don't intend to buy anything for my main PC this half of the year. Maybe I'll upgrade my 6800 XT to a 7xxx-series card if used ones get cheap enough.
 
Not that I need an "upgrade"... but if the 9070 is priced reasonably, this would put me back on all Team Red, and I can pass my 4070 down to my son
:ROFLMAO:
 
They did that last time and it got them fucking nowhere. They need to price it aggressively to win mindshare, otherwise many are simply going to wait for availability to improve to buy the superior product for a slightly higher premium.
Agreed. But with AMD, we can hope in one hand, shit in the other... you know which one will fill up first :)
 
They did that last time and it got them fucking nowhere. They need to price it aggressively to win mindshare, otherwise many are simply going to wait for availability to improve to buy the superior product for a slightly higher premium.

100%
 
The implication is that if they are comparing it to the 7900 GRE, it will be priced comparably to it. At least we can hope.
A lot of people have floated that idea, and I hope you’re right, but I kinda doubt it. I think it’s more likely that they chose the GRE simply because they could brag about bigger gains over the GRE than the XT, but the GRE is still considered a pretty strong card. If they had chosen the 7800 XT, the % gains would be even bigger, but people might have shrugged because the 7800 XT isn’t really considered a fast card these days.
 
The question is - at which point do they start losing money with this product?

at $449, it should be the best seller in a decade for AMD
$499 - it would be a great seller
$550 - running into Nvidia mindshare (GRE launch price)
$599 - good, but not great
$649 - sinking
$699 - We have a repeat of last gen or worse as it is firmly in 5070TI territory

Any more than that and they should not have even bothered; it would sit on shelves waiting for Nvidia to be sold out and unavailable, as very few will buy an inferior product from the weaker brand for the same price.
 
Aside from fears about a coming 3-6 month price spike due to scarcity of GPUs, the thing that stops me from being fully excited about the 9070 XT is just that so many more games support DLSS, and it just took a big leap forward with the Transformer model. Even if FSR 4 is just as good as DLSS 4, which is unlikely, none of the most demanding games I play support FSR 3.1, so I can’t just DLL swap for a performance boost. I always assumed I would either get a 9070 XT or wait until late 2026 for UDNA / Rubin, but now I’m thinking more along the lines of picking up a 4070 Ti Super later this year if and when pricing returns to sanity. If FSR support gets added to the games I currently play, that would change my mind, but it might change it in the direction of just enjoying the free uplift on my 6800 XT and keeping what I’ve got.
 
The question is - at which point do they start losing money with this product?

at $449, it should be the best seller in a decade for AMD
$499 - it would be a great seller
$550 - running into Nvidia mindshare (GRE launch price)
$599 - good, but not great
$649 - sinking
$699 - We have a repeat of last gen or worse as it is firmly in 5070TI territory

Any more than that and they should not have even bothered; it would sit on shelves waiting for Nvidia to be sold out and unavailable, as very few will buy an inferior product from the weaker brand for the same price.
We all know in our hearts and minds where on the list things are going to fall. And remember, MSRP has little meaning in this market.
 
This "leak", or more like a controlled release of info, interestingly has marketing slides yet no MSRP. To me it feels like AMD is still trying to figure out how much they can get away with.
The upside for us is that they chose the GRE as the comparison. I mean, why not the 7800 XT? Or, if you want to show massive generational gains, a 7700 XT, to put a massive number on your chart? If they can say 42% vs a GRE, why not say 80% vs a 7700?
The only real answer is that it's going to slot in at the same MSRP. Still, the fact that that bit didn't leak with the performance slides is concerning.
I feel like they are going to go for a $550 MSRP... I just hope all the AIBs aren't adding on +$150 because they included a BIOS DIP switch and some LED trim.
 
The question is - at which point do they start losing money with this product?

at $449, it should be the best seller in a decade for AMD
$499 - it would be a great seller
$550 - running into Nvidia mindshare (GRE launch price)
$599 - good, but not great
$649 - sinking
$699 - We have a repeat of last gen or worse as it is firmly in 5070TI territory

Any more than that and they should not have even bothered; it would sit on shelves waiting for Nvidia to be sold out and unavailable, as very few will buy an inferior product from the weaker brand for the same price.

If I can get a nice 9070 XT for $600, I will likely downgrade from my 4090 and spend that net $1,000+ on a vacation. Itching to go back to Iceland.
 
This "leak", or more like a controlled release of info, interestingly has marketing slides yet no MSRP. To me it feels like AMD is still trying to figure out how much they can get away with.
The upside for us is that they chose the GRE as the comparison. I mean, why not the 7800 XT? Or, if you want to show massive generational gains, a 7700 XT, to put a massive number on your chart? If they can say 42% vs a GRE, why not say 80% vs a 7700?
The only real answer is that it's going to slot in at the same MSRP. Still, the fact that that bit didn't leak with the performance slides is concerning.
I feel like they are going to go for a $550 MSRP... I just hope all the AIBs aren't adding on +$150 because they included a BIOS DIP switch and some LED trim.
And since there will be no MBA card, is the MSRP fake, like with the Nvidia cards?
 
Coming from a 1080 Ti, newer games don't look newer, but they play slower, which suggests it's more the code than the hardware. I will probably wait one more generation. RT is not a big deal to me.
 
I have a good feeling.
I don't. AMD has nearly a decade of bad GPU pricing. The last time AMD had a product that was well priced was the RX 480.
 
The question is - at which point do they start losing money with this product?

at $449, it should be the best seller in a decade for AMD
$499 - it would be a great seller
$550 - running into Nvidia mindshare (GRE launch price)
$599 - good, but not great
$649 - sinking
$699 - We have a repeat of last gen or worse as it is firmly in 5070TI territory

Any more than that and they should not have even bothered; it would sit on shelves waiting for Nvidia to be sold out and unavailable, as very few will buy an inferior product from the weaker brand for the same price.

I am hoping it's $599 and down; I feel that is a good price point to start with.
 
I don't understand why people keep comparing them to the 7900GRE, the gimped version to meet export restrictions to the Chinese market...

The comparisons should be to the 7900xtx. If they can't beat their own previous gen, then this is a bad product.

Yes, I know, they are skipping the high end this generation, but even the mid-high should be able to beat the previous gen's high end by 5-15%.

That said, with ~40+% over the 7900GRE, that does suggest there is a small boost over the 7900xtx.

Honestly, it is a shame they decided not to go after the high end this gen. If they live up to the improved RT performance, then with the supply issues, melting-connector issues, missing-ROP issues, black-screen issues, driver-instability issues, etc. that Nvidia has been having with the 5090, this could have been AMD's first truly competitive high-end product in a long time.

Especially since rumors are that AMD already has the channels packed with a good amount of supply for the launch.

If the averages in this story hold true, comparing these to the many-game averages of the 7900 GRE and 7900 XTX, the 9070 XT is beating the 7900 XTX by ~3-4% with a 50 W smaller power envelope of 305 W (power figure from other leaks I've seen).

Imagine what a 355 W (same as the 7900 XTX) or 450 W (same as the 4090) large-die version could do. Heck, what if they went balls-to-the-wall like Nvidia did and launched a near-600 W version like the 5090? That would definitely beat the 4090, and probably even beat the 5090.

Let me run some numbers...

According to my educated guesses and linear interpolations looking at perf per watt:

- A 355w version of RDNA4 (7900xtx equivalent power envelope) could just barely beat the 4090.
- A 450w version of RDNA4 (4090 equivalent power envelope) could just barely beat the 5090.
- A 575w monster version of RDNA4 (like the 5090) could beat the 5090 by 25-30%.

Now, there are several caveats here. Even though RT is reportedly much improved in RDNA4, it may not be quite on par with Nvidia yet, and there are no guarantees the perf/watt scales nicely up to these higher power envelopes without some seriously large monster dies (they could bin smaller chips and crank up voltage and clocks, but this usually hurts perf/watt). Also, these are linear performance interpolations based on a single data point, namely TimeSpy benchmark averages for the respective models.
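For transparency, the napkin math behind those three bullets is nothing fancier than this kind of linear scaling; a minimal sketch with placeholder numbers, assuming performance tracks board power 1:1 (which it won't at the top end):

# Minimal linear perf-per-watt extrapolation, as described above.
# Placeholder numbers; real scaling is sub-linear at higher power,
# so treat the output as an optimistic upper bound.
baseline_watts = 305     # rumored 9070 XT board power
baseline_perf = 100.0    # normalized score at that power

def extrapolate(target_watts):
    # Naive assumption: performance scales 1:1 with board power.
    return baseline_perf * target_watts / baseline_watts

for w in (355, 450, 575):
    print(f"{w} W hypothetical: {extrapolate(w):.0f} (100 = 9070 XT at 305 W)")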

So there is no guarantee it would actually look like my performance estimates above. But it does show that they really could make a performance splash this generation if they wanted to. The fact that Nvidia's overpriced offerings have problem upon problem upon problem, and that it looks more and more like Nvidia tried to defraud their customers (there is no way you get exactly an 8-ROP reduction across multiple product lines due to a "faulty batch", especially since they all just work with drivers that expect the correct number of ROPs), might just indicate that they have an opening to do exactly that.

If the 5090 issues persist, maybe there is still time for them to change their minds and do a rapid follow-up with a few runs of enlarged-die versions? AMD has done stuff like this before, like when they rapidly turned around and came out with the Threadripper line of CPUs once they realized it was possible.

It could be interesting indeed.
 
I don't understand why people keep comparing them to the 7900GRE, the gimped version to meet export restrictions to the Chinese market...

The comparisons should be to the 7900xtx. If they can't beat their own previous gen, then this is a bad product.
So by that logic, the 5070 Ti not actually being faster than a 4090 really is a fail. :)
If a 9070 beats a 7900 of any flavor, that is two steps up.

They could have compared it to a 7700 XT... If it's 42% faster than the GRE, and the GRE is 25% faster than a 7700, the uplifts compound, so they could have said roughly 78% faster than a 7700.
I would assume they chose the GRE as it has an MSRP they are planning to hit, but we'll see. They are probably going to say: look, we are giving you a 42% bump for the same money, while Nvidia is selling you the same perf for the same price they sold it to you last time, 15-25% faster for 15-40% more money.
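As a quick arithmetic check on that hypothetical 7700 XT comparison, the uplifts compound multiplicatively rather than adding, which is where the ~78% above comes from:

# Percentage uplifts compound; they don't add.
uplift = 1.42 * 1.25 - 1   # +42% over the GRE, GRE +25% over the 7700 XT
print(f"implied 9070 XT over 7700 XT: {uplift:.1%}")   # 77.5%, not 42 + 25 = 67%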
 
Yes, I know, they are skipping the high end this generation, but even the mid-high should be able to beat the previous gen's high end by 5-15%.
At some point we gradually stopped talking that way about hard-drive capacity and CPU frequency (that one rather suddenly, around 3 GHz, barely doubling over the last 23-25 years), about the yearly upgrade. GPUs are now getting really mature, the way those technologies did in the 2000s.

1) High-end GPU products have gotten quite expensive to make; the idea that Nvidia's/AMD's mid-range of 2026 (or even 2027) should beat the 5090 by 5-15% at what we want mid-high-range prices to be is not necessarily realistic.

2) If price per transistor stopped getting better a while ago, how much better at matrix and vector operations can you arrange your transistors after 30 years of really intense optimization? If Intel 18A and TSMC N2 work really well, maybe we will see something like that happen again.

The 7900 XTX had 530 mm² of dies pushing 360 W on a 384-bit-bus board; what changed in the silicon world to let something mid-range, at a mid-range price, beat this? For improvements in newer-than-raster rendering techniques like RT and AI inference, we can expect what you describe for a little while yet; a 5070 Ti's FP4 throughput is quite something, and its RT TFLOPS are not bad.

I do not think there is ever a rule of the sort "if they can't beat their own previous gen, then this is a bad product"; price is always part of a product's quality.

That said, with ~40+% over the 7900GRE, that does suggest there is a small boost over the 7900xtx.
Close, but not necessarily: in RT yes, in raster probably a bit below.

Raster:
9070: +18%, ~4070 Ti Super / a bit under a 7900 XT
9070 XT: +37%, ~5070 Ti / 4080 Super / 7900 XTX

RT:
9070: +26%, ~4070 Super
9070 XT: +53%, ~4070 Ti Super
 
The question is - at which point do they start losing money with this product?

at $449, it should be the best seller in a decade for AMD
$499 - it would be a great seller
$550 - running into Nvidia mindshare (GRE launch price)
$599 - good, but not great
$649 - sinking
$699 - We have a repeat of last gen or worse as it is firmly in 5070TI territory

You have to balance this against what it actually costs them to manufacture these, too. At $499 they'd probably be selling them at a real loss. TSMC per-wafer pricing is not cheap these days, and yields on TSMC 4N are nowhere near as good as fab yields in the good old days due to the complexity of these processes, so these chips have some serious price tags attached to them.

The latest yield estimate I can find for TSMC 4N (which I presume, but have no confirmation, these are built on) is about 70%, and people were heralding that as high and amazing. Back in the day, yields were much better.

AMD pays an already-expensive price by the wafer, has to toss quite a few dies off of that wafer due to yields, and the resulting cost per die is no joke.
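To put rough numbers on that, here is a quick cost-per-good-die sketch. Every input is an assumption for illustration only: N4-class wafer prices are often ballparked around $17k, Navi 48's die size is rumored near 390 mm², and the 70% yield is the estimate mentioned above.

import math

# Rough cost-per-good-die estimate. All inputs are assumptions.
WAFER_PRICE = 17_000   # USD per 300 mm wafer (commonly cited ballpark)
WAFER_DIAM = 300       # mm
DIE_AREA = 390         # mm^2 (rumored Navi 48 size)
YIELD = 0.70           # the ~70% estimate above

# Classic gross-dies-per-wafer approximation: usable area minus edge loss.
gross = (math.pi * (WAFER_DIAM / 2) ** 2 / DIE_AREA
         - math.pi * WAFER_DIAM / math.sqrt(2 * DIE_AREA))
good = gross * YIELD
print(f"~{gross:.0f} gross dies, ~{good:.0f} good dies, "
      f"~${WAFER_PRICE / good:,.0f} per good die")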

You also have to look at the price/performance.

At a 42% average over the 7900 GRE, we are talking ~15% faster than a 5070 Ti. But people like RT these days, and while RDNA4's RT performance seems much improved over RDNA3, if we are honest it is likely still not competitive with Nvidia.

So, to err on the side of caution, let's say it ties a 5070 Ti.

Now, Nvidia's MSRP for the 5070 Ti was $749, but that was pre-tariff and unrealistic. There will be a small number of models at launch at that price, so they can say they weren't lying, but after that, 5070 Tis are going to cost in the $900-$1050 range according to Microcenter.

There is no reason AMD - if they have a product that ties the 5070ti - shouldn't price it in line with a 5070ti.

But they do want to build some market share. The question is, how much of a discount relative to an equivalent 5070 Ti would it take to build significant market share? $50 is probably insufficient; $100-$150 is probably going to have a real effect.

So, if 5070 Tis are selling for $900 to $1050, then I'd argue the 9070 XT should be in the $750 to $950 range. It only makes sense.
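Under the cautious tie-on-performance assumption above, the perf-per-dollar gap at those street prices is easy to eyeball:

# Perf per dollar at the assumed street prices; both cards normalized
# to the same performance (the "ties a 5070 Ti" assumption above).
perf = 100.0
prices = {"5070 Ti": (900, 1050), "9070 XT": (750, 950)}
for card, (low, high) in prices.items():
    print(f"{card}: {perf / high:.3f} to {perf / low:.3f} perf/$")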

If these numbers hold up, I think seeing them for much less than $750 is a pipe dream, and at that price they will be a good deal compared to Nvidia's offerings, especially since rumors are they have packed the pipeline and people will actually be able to get their hands on them, unlike Nvidia's vaporware.


Any more than that and they should not have even bothered; it would sit on shelves waiting for Nvidia to be sold out and unavailable, as very few will buy an inferior product from the weaker brand for the same price.

Totally disagree.

1.) Nvidia's MSRPs are not real prices. AIBs were already complaining they couldn't make any money at those prices, so the cards were going to be more expensive than that once the limited Founders Edition inventory sold out anyway. And that was before tariffs.

2.) I'm not convinced Nvidia's reputation will survive this generation after all the issues with the 50-series: melting power connectors, black-screen issues, unstable drivers, not to mention outright trying to cheat their customers by selling them fewer ROPs than advertised (there is no way this was a defect). I'd argue Nvidia's reputation is pretty much in the toilet after this launch. At least it should be.

So yeah, TL;DR: anything under $749 for the 9070 XT is probably a pipe dream, and at that price it will be a WAY better deal than the 5070 Tis selling at $900-$1050. And it will probably perform (slightly) better too.
 
You have to think that they picked the 7900 GRE because that's the price target. It's really a "9800 XT"-class card, but they want more than $449 for it.
 
So by that logic, the 5070 Ti not actually being faster than a 4090 really is a fail. :)
If a 9070 beats a 7900 of any flavor, that is two steps up.

Not the 5070 Ti. I was thinking more along the lines of the 5080. And yes, since it does not beat the 4090, it is a fail. If you had asked people before the Nvidia launch, absolutely no one expected the 4090 to still be the second-fastest GPU once the 50-series launched.
 
Not the 5070 Ti. I was thinking more along the lines of the 5080. And yes, since it does not beat the 4090, it is a fail. If you had asked people before the Nvidia launch, absolutely no one expected the 4090 to still be the second-fastest GPU once the 50-series launched.
That is fair... but AMD isn't calling these 9080s. I think they set the expectation that they are gunning for the 5070, not the 5080 or the XTX.
I'm OK with them not going after the XTX as long as they aren't priced where the XTX was. If they perform approximately like the XTX, with improved RT, for the SAME price, that is a huge fail. If they outperform the XTX by a couple % and it's priced like a GRE, then that makes AMD the conquering hero.

IMO, if they are naming this thing 9070, then it should be priced like an x70. I think the GRE, which sure started as a China-only product (only because they used cast-off silicon from the 7900... they just didn't have enough cast-offs for a worldwide launch), has really, over the last year, been AMD's answer to the 4070 Super. I think it is a good card to compare against. The 7700 XT just isn't a good card, and no one is going to be impressed by marketing slides that showed a 9070 beating it up, especially as they are no doubt going to be asking more than the 7700 XT's $450 MSRP. They would have gotten killed by reviewers and gamers if they did a silly comparison vs the 7700 XT with a 40-50% higher MSRP. Marketing also looks like bunk if you compare it to an XTX and say, 2% faster.
 
Let me run some numbers...

According to my educated guesses and linear interpolations looking at perf per watt:

- A 355w version of RDNA4 (7900xtx equivalent power envelope) could just barely beat the 4090.
- A 450w version of RDNA4 (4090 equivalent power envelope) could just barely beat the 5090.
- A 575w monster version of RDNA4 (like the 5090) could beat the 5090 by 25-30%.

Now, there are several caveats here. Even though RT is reportedly much improved in RDNA4, it may not be quite on par with Nvidia yet, and there are no guarantees the perf/watt scales nicely up to these higher power envelopes without some seriously large monster dies (they could bin smaller chips and crank up voltage and clocks, but this usually hurts perf/watt). Also, these are linear performance interpolations based on a single data point, namely TimeSpy benchmark averages for the respective models.

So there is no guarantee it would actually look like my performance estimates above. But it does show that they really could make a performance splash this generation if they wanted to. The fact that Nvidia's overpriced offerings have problem upon problem upon problem, and that it looks more and more like Nvidia tried to defraud their customers (there is no way you get exactly an 8-ROP reduction across multiple product lines due to a "faulty batch", especially since they all just work with drivers that expect the correct number of ROPs), might just indicate that they have an opening to do exactly that.

If the 5090 issues persist, maybe there is still time for them to change their minds and do a rapid follow-up with a few runs of enlarged-die versions? AMD has done stuff like this before, like when they rapidly turned around and came out with the Threadripper line of CPUs once they realized it was possible.

It could be interesting indeed.
I already posted a video with the leaked performance specs. Here:
https://www.youtube.com/watch?v=h0xAqkzQ53k

When AMD boasts 20% faster than the GRE for the 9070 and 42% faster than the GRE for the 9070 XT, they are talking about the ray-tracing uplift. RDNA4 looks to have significantly better RT performance thanks to a redesign that they also applied to the PS5 Pro GPU.

In terms of pure raster, you can expect the uplift to be roughly half of that. This means the 9070 XT slots in somewhere between the 7900 XT and the 7900 XTX in terms of raster. But you can expect the RT performance of the 9070 XT to win over the 7900 XTX.

In terms of pure raster, in overall performance the 9070 XT won't even beat out the last-gen 7900 XTX. It falls far short of the 4090 and of course the 5090.
 
You have to think that they picked the 7900 GRE because that's the price target. It's really a "9800 XT"-class card, but they want more than $449 for it.
The 9800 XT was $499, the 9700 was $450, the GRE $550.
I'm expecting $550 but I am fully prepared to be disappointed. lol
 
When AMD boasts 20% faster than the GRE for the 9070 and 42% faster than the GRE for the 9070 XT, they are talking about the ray-tracing uplift.
Combined average uplift; RT alone will be higher than those numbers according to that leak. They probably roughly doubled their RT TFLOPS.
Raster:
9070: +18%, ~4070 Ti Super / a bit under a 7900 XT
9070 XT: +37%, ~5070 Ti / 4080 Super / 7900 XTX

RT:
9070: +26%, ~4070 Super
9070 XT: +53%, ~4070 Ti Super
 
I already posted a video with the leaked performance specs. Here:
https://www.youtube.com/watch?v=h0xAqkzQ53k

When AMD boasts 20% faster than the GRE for the 9070 and 42% faster than the GRE for the 9070 XT, they are talking about the ray-tracing uplift. RDNA4 looks to have significantly better RT performance thanks to a redesign that they also applied to the PS5 Pro GPU.

In terms of pure raster, you can expect the uplift to be roughly half of that. This means the 9070 XT slots in somewhere between the 7900 XT and the 7900 XTX in terms of raster. But you can expect the RT performance of the 9070 XT to win over the 7900 XTX.

In terms of pure raster, in overall performance the 9070 XT won't even beat out the last-gen 7900 XTX. It falls far short of the 4090 and of course the 5090.


Like it or not (and personally I usually don't like it), pure raster performance is more or less completely irrelevant in 2025.

Almost every AAA title out there has at least some element of RT in it these days; some even make it mandatory now.

This is why Nvidia has dominated market share among gamers in the last few years. There is FOMO about missing out on RT, so when choosing between an AMD card that has great raster performance but drops a lot of performance when RT is enabled, and an Nvidia card that can do both well, most have decided it is better to be safe than sorry and gone with Nvidia.

We need to evaluate these cards on RT performance at this point. No, not the ridiculous unplayable tech demo that is path tracing in Cyberpunk 2077, but more reasonable RT levels (like RT High in Cyberpunk for a high-end card, and RT Medium for this level of card).

As for the performance leaks, I don't think it is straight-up RT we are seeing in that 42% average number, but rather a bit of a mix. So your point still stands; for my estimates, maybe I should have used a mix of not just Time Spy but also Port Royal. Too late now, though. I'm not redoing it. I have work to do.
 
Honestly, it is a shame they decided not to go after the high end this gen. If they live up to the improved RT performance, then with the supply issues, melting-connector issues, missing-ROP issues, black-screen issues, driver-instability issues, etc. that Nvidia has been having with the 5090, this could have been AMD's first truly competitive high-end product in a long time.
That's the thing: until the 9070 XT is in the wild, we won't know what to really expect. It could face the same things these 50-series cards are facing. Also, what percentage of 50-series owners are affected by these issues, and what's Nvidia's response time? Usually very quick.
If the averages in this story hold true, comparing these to the many-game averages of the 7900 GRE and 7900 XTX, the 9070 XT is beating the 7900 XTX by ~3-4% with a 50 W smaller power envelope of 305 W (power figure from other leaks I've seen).

Imagine what a 355 W (same as the 7900 XTX) or 450 W (same as the 4090) large-die version could do. Heck, what if they went balls-to-the-wall like Nvidia did and launched a near-600 W version like the 5090? That would definitely beat the 4090, and probably even beat the 5090.

Let me run some numbers...

According to my educated guesses and linear interpolations looking at perf per watt:

- A 355w version of RDNA4 (7900xtx equivalent power envelope) could just barely beat the 4090.
- A 450w version of RDNA4 (4090 equivalent power envelope) could just barely beat the 5090.
- A 575w monster version of RDNA4 (like the 5090) could beat the 5090 by 25-30%.
Are there official power-consumption numbers for the 9070/9070 XT? Last I saw, from a rumor, the 9070 XT was already at 320 W, essentially matching a 4080S in max consumption (which it seldom ever hits, at least in my case) while falling a bit behind overall. So, going by that number, a 35 W bump in power consumption wouldn't be able to put it near a 4090.

Considering both AMD and Nvidia are using the same node size, it's safe to assume that the performance per watt would line up roughly the same. But since AMD's opted to stick with 8-pin connectors, if they were to push a 450 W GPU you'd need at minimum 3 leads to the GPU, and that would be pushing it; more likely we'd need 4 leads to be on the safe side. That not only would look god-awful, cable management would definitely become an issue.
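The connector math is simple given the spec limits of 75 W from the PCIe slot and 150 W per 8-pin connector; a quick sketch, using the rumored 320 W figure and the hypothetical 450 W part:

import math

# 8-pin connector count implied by a given board power.
# Spec limits: 75 W from the PCIe slot, 150 W per 8-pin connector.
SLOT_W, PIN8_W = 75, 150

def leads_needed(board_power_w, margin=1.0):
    # margin > 1.0 avoids running connectors at their spec maximum.
    return math.ceil((board_power_w * margin - SLOT_W) / PIN8_W)

for tbp in (320, 450):
    print(f"{tbp} W: {leads_needed(tbp)}x 8-pin at spec max, "
          f"{leads_needed(tbp, margin=1.2)}x 8-pin with 20% headroom")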
Now, there are several caveats here. Even though RT is reportedly much improved in RDNA4, it may not be quite on par with Nvidia yet, and there are no guarantees the perf/watt scales nicely up to these higher power envelopes without some seriously large monster dies (they could bin smaller chips and crank up voltage and clocks, but this usually hurts perf/watt). Also, these are linear performance interpolations based on a single data point, namely TimeSpy benchmark averages for the respective models.

So there is no guarantee it would actually look like my performance estimates above. But it does show that they really could make a performance splash this generation if they wanted to. The fact that Nvidia's overpriced offerings have problem upon problem upon problem, and that it looks more and more like Nvidia tried to defraud their customers (there is no way you get exactly an 8-ROP reduction across multiple product lines due to a "faulty batch", especially since they all just work with drivers that expect the correct number of ROPs), might just indicate that they have an opening to do exactly that.
If AMD's RT performance can match the 40-series' RT performance, then it'd make a splash in the mid-range market. That's a big "IF"; they'd still have to be able to compete with DLSS 4, neural compression, and the slew of other features Nvidia has on tap.

But the 50-series is far from trash: for a 40-series owner it's a wasted upgrade, but for 10- to 30-series owners it's a solid upgrade. As for the price, once stock issues get fixed the prices will drop sharply; everyone's just capitalizing on the short supply. As for the issues, they're limited: no conclusive testing has been done to rule out the 5090 melting-cable issue, and the missing-ROPs issue affects so few customers, and they're already addressing it; since it's not coming from the FE cards, it shows the board partners aren't doing their due diligence.
If the 5090 issues persist, maybe there is still time for them to change their minds and do a rapid follow-up with a few runs of enlarged-die versions? AMD has done stuff like this before, like when they rapidly turned around and came out with the Threadripper line of CPUs once they realized it was possible.

It could be interesting indeed.
Again, a big "IF." They caught Intel shitting the bed; Nvidia doesn't sleep.

Honestly, I think AMD can hit a home run with this lineup, but what'll make or break it is pricing. AMD's doing well financially, and they've shown no inclination to seriously compete with Nvidia, so I can see the 9070 probably launching at $500-$600, with the 9070 XT launching at $700+. I just hope AMD surprises me.
 