AMD's Radeon RX 7900-series Highlights

So basically the 7900 XTX is slower than a 4090 but also costs $600 less. What AMD doesn't get is that anyone spending over $800 on a GPU isn't going to care about pricing. They want the best performance, and Nvidia is that. To these people $600 is nothing. So AMD once again falls into the situation where their GPUs are slightly slower and slightly cheaper than Nvidia's, but Nvidia is also at the point where their graphics cards are starting to cost as much as a used car. So anyone who shops below $800 won't pick these up, and anyone who shops above $800 won't either. It should also be noted that a $1000 GPU is still $1000 at a time when an economic recession is looming. AMD should stop pricing their cards like Nvidia and try to actually go for higher-volume sales.
 

That's a huge difference. There are people who top out around $1k, and there are people in the $1600+ bracket. Not everyone spends $1600 on a video card. Don't state it as a fact.
 
I don't know if $600 is nothing. I mean, sure, some people will go all out (they already have)... still, the difference between $1600 and $1000 is pretty freaking big. AMD is going to sell out. There are plenty of previous $700-800 range customers who will come up to $1k... but there is no way they will ever go to $1500+. I'm not seeing much downside for AMD here... even the RT performance is respectable and bests Nvidia's previous gen.
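Dollars per unit of performance is one way to put a number on that gap. A minimal sketch, where the prices are the launch MSRPs discussed in the thread and the relative 4K index (4090 ≈ 1.26x the XTX) is an assumption roughly in line with the review numbers:

```python
# Rough price-to-performance sketch. All inputs are assumptions for
# illustration: $999 (7900 XTX MSRP), $1,599 (RTX 4090 MSRP), and a
# relative 4K raster index with the XTX normalized to 1.0.
def dollars_per_perf(price, perf_index):
    """Dollars paid per unit of relative performance."""
    return price / perf_index

print(f"7900 XTX: ${dollars_per_perf(999, 1.00):.0f} per perf unit")
print(f"RTX 4090: ${dollars_per_perf(1599, 1.26):.0f} per perf unit")  # ~26% faster, assumed
```

Even granting the 4090 its lead, the XTX ends up cheaper per unit of raster performance; whether buyers at this tier care is exactly the argument above.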
 
It might be now, but in a few years it will be standard, and eventually it will be considered essential.
I won't disagree that it could be the future, but in the only game I play that uses ray tracing, MechWarrior 5, I see people with GeForce 4090s getting 40 fps. I am not interested in ray tracing at all if my FPS is going to drop below 144.
 
It will be for sure... but do any of the cards on sale today drive it at a non-gimmick frame rate? The answer is no... even the 4090 can't push 100+ fps in any heavy RT title.
The 7900 XTX equals or comes close to the RT performance of the 3090, last gen's winner. That is more than enough to use it in the lighter-usage titles if you wish. I mean, anyone who paid $1500 six months ago to get the best RT performance is getting what AMD is now selling for $1k.
RT just isn't worth worrying about until the mid-range and even low-end cards can flip it on... that is 2-3 generations from now minimum, imo anyway.
 
exactly.
 
Review embargo has lifted.

Not too shabby, but I think we were all expecting it to be better at raster than the 4080.

https://www.techpowerup.com/review/amd-radeon-rx-7900-xtx/

In ray tracing, not so much. Still a generation behind NVIDIA.

Brent Justice's review (former [H] GPU editor).
https://www.thefpsreview.com/2022/12/12/amd-radeon-rx-7900-xtx-video-card-review/
 
Tbh even being 16% behind the 4080 in RT really doesn't seem that bad.
 
If you spent $1500 on an NV card six months ago... the 7900 has identical RT performance. I know Nvidia has a newer model... but they are all still too slow for it to become a de facto turn-on setting.
Right. That's how I see it. If I were to go all in on 4k ray tracing then they both get demolished by the 4090. Unless you start turning down settings or enabling the various flavors of compensation.
 
Perhaps drivers improve some of the lows... but for sure it comes down to what games you prefer to play. I mean, in Modern Warfare II the XTX is 28% faster than the 4090 at 1440p and 5% faster than the 4090 at 4K. If you're an FPS lover who is planning to be on that game every day for a year or more, AMD is currently punching $600 up: 300+ fps at 1440p and almost 200 fps at 4K.
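That "28% faster" framing is just a ratio of average frame rates. A quick sketch, using hypothetical averages (320 vs 250 fps are made-up numbers chosen only to illustrate the math):

```python
def percent_faster(fps_a, fps_b):
    """How much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1) * 100

# Hypothetical 1440p averages: if the XTX does 320 fps and the 4090
# does 250 fps, the XTX comes out 28% faster.
print(round(percent_faster(320, 250), 1))
```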
 
Something very interesting going on, that's for sure.
 
The Call of Duty games all seem to love AMD. Not sure if it's just a quirk of their engine (I assume it's heavily optimized for the AMD-based consoles). I am old and don't play a ton of first-person games anymore... but I imagine for people who are into the Call of Duty titles, AMD was already at the top of their list. (OK, I know that isn't true... NV has had the mindshare advantage even when AMD was slapping them around in those titles. The hype on these might make a few FPS gamers notice, though.)
 
Transient load spikes are pretty bad on these. Not sure where the 6950 XT sits in regards to transient spikes, but I think my 850W power supply may trip on the XTX, as it's pretty close already on the 6950 XT, where my 12V rail dips to 11.52V under heavy load. I think I am going to keep what I have for now.
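For context on that 11.52V reading: the ATX specification allows a ±5% tolerance on the +12V rail, i.e., a floor of 11.40V. A quick sketch of the droop math (the 11.52V measurement is from the post above; the tolerance is the standard ATX figure):

```python
# Sketch of 12 V rail droop. The 11.52 V reading comes from the post
# above; the 5% tolerance (11.40 V floor) is the standard ATX limit.
def droop_percent(nominal, measured):
    """Sag below nominal voltage, as a percentage."""
    return (nominal - measured) / nominal * 100

NOMINAL = 12.0
ATX_FLOOR = NOMINAL * 0.95  # 11.40 V

measured = 11.52
print(f"droop: {droop_percent(NOMINAL, measured):.1f}%")  # 4.0% sag
print(f"within ATX spec: {measured >= ATX_FLOOR}")        # True, but not by much
```

So 11.52V is still in spec at sustained load, but only 0.12V above the floor; a transient spike could momentarily push it past the limit and trip the supply's protection.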
 
Lmao, this guy's every tweet is about the same thing, right up to launch again today.
I get they might not have clocks where they wanted (3 GHz+). But I am truly not sure what that means. I mean, performance is right there between the 4080 and 4090. Worst case, 4080. 16% slower in RT, which I think is decent given how far behind they were before. All they can do is improve gen over gen.

Fine for the price. If it was crashing left and right, then I guess I would call that a hardware bug. If it's stable, all good.
 

RT is between 3090 and 3090 Ti. $700 is the max I’m willing to pay to beta test this for a multi-billion dollar company XD.
 
Neck-and-neck with the 4080 in raster, but much slower at RT. Also, the 7900 XTX is power hungry.

Lovelace was supposed to be the power hog... yikes.
 

That is dumb. Why would he do the 7900 XTX over a 4090? He cares a lot about VR performance too, which AMD is not as good with.
I was running VR on a mobile RX580 8gb -- I doubt the 7900xtx will have an issue with current VR headsets.
 
I’ve got a Rift 2 and it does fine on a 2060 mobile. Any of these cards will be fine, I’m more saddened by the lack of game support than anything else.
I mean Darktide + VR… Diapers on standby.
 
I want VR support in MechWarrior really bad. However, I need a headset that is sharp enough to read the text of instrument panels and such. I'd probably buy an HP Reverb G2 right now if I trusted its sensors/controllers.
 
For the performance the 7900 XT/X offers, it uses more power than the 4080. That means Lovelace is the more efficient architecture.
Makes me wonder if the idle power draw is a driver issue. I can’t imagine they thought 100w at idle is acceptable…
 
Buy a 4080. It's faster in raster in more games than not. RT is 16% slower and it's $200 cheaper. Power hog? What are you talking about?
Just pointing out that people always criticize NVIDIA for their power consumption. Turns out the 7900 XTX uses 50W more power for the same level of performance as the 4080 in gaming.
Makes me wonder if the idle power draw is a driver issue. I can’t imagine they thought 100w at idle is acceptable…
TPU shows it's only an issue with multimonitor.
 
I've found that over-rendering and scaling down fixes that issue more often than not. But yeah.
 
Ah thanks. Yeah, a bit disappointed on the power draw in gaming. I’m one who cares about the consumption as it drives PSU sizing, both cpu and gpu sides.
 
They show idle and multi-monitor but no single?

Board partners, though, are listing their recommended PSUs for the 7900 XTX as 850W to 1200W.

All the reviews I've seen so far have it drawing just above the 4080 for gaming loads, nothing significant though, maybe 10W on the outside.
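A back-of-envelope sketch of why partners pad their PSU recommendations well above average gaming draw. The transient multiplier and the CPU/system loads here are assumptions for illustration (spikes of roughly 2x sustained GPU draw have been reported on recent high-end cards); only the 356W GPU figure comes from the review numbers in the thread:

```python
# Back-of-envelope PSU sizing. Assumed loads: 356 W GPU (TPU's gaming
# figure for the XTX), 150 W CPU, 75 W for the rest of the system,
# and an assumed ~2x transient multiplier on GPU draw.
def psu_headroom(psu_watts, gpu_w, cpu_w, rest_w, transient_factor=2.0):
    """Remaining watts at a worst-case transient spike (negative = over budget)."""
    peak = gpu_w * transient_factor + cpu_w + rest_w
    return psu_watts - peak

print(psu_headroom(850, 356, 150, 75))   # an 850 W unit comes up short
print(psu_headroom(1000, 356, 150, 75))  # a 1 kW unit clears it
```

Under these assumptions an 850W supply can be briefly overdrawn during a spike, which lines up with the 850-1200W partner recommendations quoted above.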
 
It feels to me like a mix of leaker hype and AMD's own presentation, plus the halo of a ridiculously powerful 4090 and the timing, reduce how nice this is in reality:
No, it is not 50-70% higher than a 6950 XT; it is more like 35% higher.

https://www.techspot.com/review/2588-amd-radeon-7900-xtx/
https://www.techpowerup.com/review/amd-radeon-rx-7900-xtx/32.html

If the 7900 XTX is 1.00 in performance:

Techspot, 1440p
6950 XT / 3090 Ti: 0.80
4080: 0.99
7900 XTX: 1.00
4090: 1.16

Techspot, 4K
6950 XT: 0.74
3090 Ti: 0.80
4080: 0.96
7900 XTX: 1.00
4090: 1.26

Cyberpunk locked at 90 fps, total system power consumption
4080: 358W
4090: 384W
7900 XTX: 443W

TechPowerUp, 1440p
3090 Ti: 0.83
7900 XT: 0.89
4080: 0.98
7900 XTX: 1.00
4090: 1.17

TechPowerUp, 4K
7900 XT: 0.84
3090 Ti: 0.84
4080: 0.96
7900 XTX: 1.00
4090: 1.22

Power usage while gaming
4080: 304W
7900 XT: 320W
7900 XTX: 356W
4090: 411W

Gaming at 60 Hz
4080: 68W
4090: 79W
7900 XT: 112W
7900 XTX: 127W


- Lots of things we "knew": the XT is almost exactly 5/6 the strength of the XTX.

- Some things I was semi-certain of, but online momentum kept repeating without much thought: Lovelace is more energy efficient than RDNA 3 by a giant amount, close to half the energy when gaming at 60 fps on a 4080 versus an XTX. Why wouldn't it be? All of the chip is on a slightly superior node, without a fabric needed to connect the 6nm dies to the 5nm one. Did people think that Ampere vs RDNA 2 wasn't all a story of Samsung and first-gen GDDR6X vs TSMC and regular RAM, and that AMD had some inherent advantage? I imagine we will see a shift in who cares about the energy talk and who talks a lot about it.

- The XTX is around a 3090 Ti with RT on, the XT around a 3090.

The surprise:
The 7900 XT being a 3090 Ti at 4K. Had it been a 7800 XT sold at a $700 MSRP ($800 in real life), it would have been quite a good upgrade for people who do not care about CUDA and RT: a 3090 Ti that runs on a regular 320 watts with a new warranty is nice. Near $1000... a bit meh.

The 7900 XTX being virtually a 4080 performance-wise at the moment is the big letdown. I expected a nice double-digit lead, which was maybe ridiculous; the people who knew the 7900 XTX best were AMD, and the single actual piece of information they gave us about its performance was the announced MSRP.

Why would they have given their 6900 XT upgrade a massive price cut if it was a 4080 beater?

The 7900 XTX in Resident Evil Village: "up to 138 fps" with RT on at 4K max settings. That "up to" was quite literal, I imagine.
 