AMD's Radeon RX 7900-series Highlights

They show idle and multi-monitor but no single-monitor?

Board partners, though, are listing their recommended PSUs for the 7900 XTX as 850 W to 1200 W.

All the reviews I've seen so far have it drawing just above the 4080 for gaming loads, nothing significant though, maybe 10 W at the outside.
The chart on the left is idle power usage for single monitor; the one on the right is idle power usage for multimonitor.
 
All the reviews I've seen so far have it drawing just above the 4080 for gaming loads, nothing significant though, maybe 10 W at the outside.

[Charts: power-gaming.png and watt-per-frame.png]


It goes into 50-60 watt territory (80 W with RT on) for some titles according to both TechPowerUp and TechSpot; the 4080/4090 will often use 3070-type energy if you play on a 120 fps monitor.
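For what it's worth, the watt-per-frame numbers in those charts come down to board power divided by frame rate; here's a minimal sketch with made-up figures (not numbers from any review):

```python
# Rough watt-per-frame arithmetic (illustrative numbers only, not review data).
# Watts divided by frames-per-second is really joules per frame.
def watts_per_frame(board_power_w: float, avg_fps: float) -> float:
    """Energy cost of each rendered frame."""
    return board_power_w / avg_fps

# e.g. a card drawing 60 W at a 120 fps cap spends 0.5 J per frame,
# while one drawing 300 W for 150 fps spends 2.0 J per frame.
print(watts_per_frame(60, 120))   # 0.5
print(watts_per_frame(300, 150))  # 2.0
```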
 
Calling a fancy graphics upgrade "essential" is a bit much, isn't it? That's like saying "Ultra everything is essential" ... gotta respectfully disagree with that.
It will be used more and more, until the point where games are all RT, and don't even offer rasterization as a fallback. I think you'd have to call it "essential" at that point, as you won't even be able to play such a game without it. I don't know how long it will take to reach that point, but I have no doubt that it will happen.
 
[Charts: https://tpucdn.com/review/amd-radeon-rx-7900-xtx/images/watt-per-frame.png]

It goes into 50-60 watt territory (80 W with RT on) for some titles according to both TechPowerUp and TechSpot; the 4080/4090 will often use 3070-type energy if you play on a 120 fps monitor.
Yeah, I need to get my shit together, because I keep looking at them and thinking "but it's right there, within like 10-20 W," then I realize I'm looking at the 3080 and... F me, I should just call it a day and go home.
 
Looks like some games are suffering some bad frame time spikes. Chiplet design to blame? Perhaps driver maturation could improve that behavior.
This was my first thought when I read the 7-series would be a chiplet arch: would some type of microstutter be present, like an internal SLI at the ring-bus level? But when it turned out the GPU core would still be monolithic, it stopped seeming like a potential issue. So hopefully it can be resolved in drivers, some latency tightened up somewhere.

Particularly for latency-sensitive applications like PC VR, low and stable frame times are critical, because the further they deviate from real-time, the more the brain and CNS are taxed and the sooner VR fatigue can set in.
 
This was my first thought when I first heard about the 7-series being a chiplet arch: would some type of microstutter be present, similar to SLI? But when I learned the GPU core would still be monolithic, it stopped seeming like a potential issue. So hopefully it can be resolved in drivers, some latency tightened up somewhere.
This is one of the main problems Nvidia is fighting with their Blackwell stuff; based on info so far, their consumer releases will only use one GPU core and their enterprise stuff will have the two cores. Balancing cores like that for CUDA or accelerated loads is easy enough, since the software can be tailored for the hardware, but gaming is too general. Like early multi-core CPUs, if the engine isn't tweaked for multi-core then one of the cores mostly sits idle. GPUs are the same way, and developers are stuck: they can tweak for multi-core GPUs, but then single-core GPUs suffer. So AMD and Nvidia are both currently stuck trying to develop a front-end IO interface that can dispatch commands to both cores while presenting them as a single unified architecture to the OS. That interface needs some serious memory throughput, and everybody seems to be struggling to make it happen without that single IO die costing more than the rest of the card as a whole.
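A toy sketch of the balancing problem described above (the per-draw-call costs and the two-die split are invented assumptions, not how any real driver actually schedules work):

```python
# Toy model: dispatching draw calls to two GPU dies.
# If the work isn't split evenly, frame time is set by the busier die
# and the other one mostly idles - the "one core sits idle" problem.

def frame_time(draw_call_costs, assignment):
    """Frame finishes when the slowest die finishes its share."""
    per_die = [0.0, 0.0]
    for cost, die in zip(draw_call_costs, assignment):
        per_die[die] += cost
    return max(per_die), per_die

# Hypothetical per-draw-call costs in milliseconds.
calls = [4.0, 3.0, 2.5, 2.0, 1.5, 1.0]

# Engine unaware of two dies: everything lands on die 0.
naive = [0] * len(calls)

# Greedy balance: always hand the next call to the die with less work.
balanced, load = [], [0.0, 0.0]
for c in calls:
    die = 0 if load[0] <= load[1] else 1
    balanced.append(die)
    load[die] += c

print(frame_time(calls, naive))     # (14.0, [14.0, 0.0]) -> die 1 sits idle
print(frame_time(calls, balanced))  # (7.0, [7.0, 7.0])   -> both dies busy
```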
 
Haven't seen any good overclocking results. Makes me wonder what AIBs will be clocking the new cards at.
Without changing the power target that is going to be tricky; it looks like it rides the line very closely already, so it's going to need at least an additional 6-pin to clock up much from where it is now.
 
Without changing the power target that is going to be tricky; it looks like it rides the line very closely already, so it's going to need at least an additional 6-pin to clock up much from where it is now.
Guess it depends how much leeway AMD gives AIBs to customize the cards. I assume with 3x8-pin cards clocks would hopefully be over 3 GHz?
 
Guess it depends how much leeway AMD gives AIBs to customize the cards. I assume with 3x8-pin cards clocks would hopefully be over 3 GHz?
We'll find out. Stuff like this from Asus on their page for the 7900 XTX OC edition makes me uneasy, but I guess we will find out tomorrow.

OC mode : up to TBD MHz (Boost Clock)/up to TBD MHz (Game Clock)
Default mode : up to TBD MHz (Boost Clock)/up to TBD MHz (Game Clock)

It's going to come down to how the card handles transient spikes; I haven't seen that topic covered much in the reviews, so I don't know.
 
I don't know if $600 is nothing. I mean sure, some people will go all out (they already have)... still, the difference between 1600 and 1000 is pretty freaking big. AMD is going to sell out. There are plenty of previous $700-800 range customers that will come up to 1k... but there is no way they will ever go to 1500+. I'm not seeing much downside for AMD here... even the RT performance is respectable and bests Nvidia's previous gen.
The way it works is that $100-$300 is the mainstream market. These people will never leave this price range no matter what AMD and Nvidia do. Around $300 to $600 is the enthusiast price range. These are the people that buy 3070s and will pay more depending on performance. $600-$800 are the epeens that brag to their friends that spending $100-$200 more gets them 10% more performance and yet somehow less VRAM. The $800+ crowd has nearly infinite money as far as they're concerned and doesn't care about price.
 
The way it works is that $100-$300 is the mainstream market. These people will never leave this price range no matter what AMD and Nvidia do. Around $300 to $600 is the enthusiast price range. These are the people that buy 3070s and will pay more depending on performance. $600-$800 are the epeens that brag to their friends that spending $100-$200 more gets them 10% more performance and yet somehow less VRAM. The $800+ crowd has nearly infinite money as far as they're concerned and doesn't care about price.
I spent $1500 on a 6900 XT and I very much care about price. Of course, back then a 6700 XT was $1050 and a 3080 was $2000. I feel comfortable spending $600-800 for 85-90% of top-tier performance. I am also plenty happy to just game on a 6600 XT. If I remember correctly, I bought a launch 7970 3GB for $549 and a 290X for $599; adjusted for inflation that's easily $750-800.
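The inflation math there is easy to sanity-check; a quick sketch with rough, assumed CPI values (not official figures), which lands in roughly the ballpark mentioned above:

```python
# Rough inflation adjustment: price_then * (CPI_now / CPI_then).
# CPI values here are approximate assumptions for illustration only.
def adjust_for_inflation(price_then, cpi_then, cpi_now):
    return price_then * cpi_now / cpi_then

# ~2012/2013 launch prices brought forward to late 2022 (CPI ~230/233 -> ~297).
print(round(adjust_for_inflation(549, 230, 297)))  # ~709 for the 7970
print(round(adjust_for_inflation(599, 233, 297)))  # ~764 for the 290X
```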
 
The way it works is that $100-$300 is the mainstream market. These people will never leave this price range no matter what AMD and Nvidia do. Around $300 to $600 is the enthusiast price range. These are the people that buy 3070s and will pay more depending on performance. $600-$800 are the epeens that brag to their friends that spending $100-$200 more gets them 10% more performance and yet somehow less VRAM. The $800+ crowd has nearly infinite money as far as they're concerned and doesn't care about price.

And I'm sure you have real, hard data on everything you're saying and aren't just pulling things out of your ass, right?

Going from $800 to $1600 is literally doubling the cost. That's a massive gulf. Even at $1000 it's a 60% increase in price. Given Nvidia's general mind-share of being "the best" and the consumer belief of "more money = more better," you could make the argument that going from the 7900 XTX to a 4080 isn't much, and people at the $1000 price point might be willing to do that, but given how abysmal sales of the 4080 have been, that might even be a stretch. It'll be interesting to see how these cards sell at slightly cheaper prices, or if Nvidia will counter with a discount on the 4080.
 
The pricing is rather wonky for these, just like the 4090/4080. 7900xt should be at least $200 cheaper, if not $300.

That would be ideal, but we're never going to go back to the days of sub-$1000 top-tier cards and, relatively, cheap one-tier-down cards. That said, the XT's value is really going to depend on where Nvidia ends up pricing the 4070 Ti. If it's still the originally planned $900, the XT might end up still being well priced (comparatively), but if Nvidia brings the price down to $800 it's going to put AMD in a rough spot. Of course, that's all assuming that the XT matches or exceeds the 4070 Ti's raster performance.
 
The way it works is that $100-$300 is the mainstream market. These people will never leave this price range no matter what AMD and Nvidia do. Around $300 to $600 is the enthusiast price range. These are the people that buy 3070s and will pay more depending on performance. $600-$800 are the epeens that brag to their friends that spending $100-$200 more gets them 10% more performance and yet somehow less VRAM. The $800+ crowd has nearly infinite money as far as they're concerned and doesn't care about price.

If only AMD and Nvidia had this 'wisdom'. They could have charged $10k for the 3090 and 7900 XTX, as those people have near-infinite money.
 
The way it works is that $100-$300 is the mainstream market. These people will never leave this price range no matter what AMD and Nvidia do. Around $300 to $600 is the enthusiast price range. These are the people that buy 3070s and will pay more depending on performance. $600-$800 are the epeens that brag to their friends that spending $100-$200 more gets them 10% more performance and yet somehow less VRAM. The $800+ crowd has nearly infinite money as far as they're concerned and doesn't care about price.
The problem with GPUs in the $100-$300 range is that any system you build around one, the way things are now, will get trounced by a console for gaming, so if gaming is your only concern then a PS5 or Xbox is probably the better bet. 1080p max settings at 144 Hz or better is hard to accomplish in the $600 range. Once you move up to 1440p and beyond, costs multiply a fair bit.
 
The pricing is rather wonky for these, just like the 4090/4080. 7900xt should be at least $200 cheaper, if not $300.
Maybe it will, at least where the price can move for the cards not sold directly by AMD; in China the XTX seems to go above $1300 USD for some models, and all of them seem to be above $1100.

If some XTs manage to stay near $900, that could end up being the case.
 
The pricing is rather wonky for these, just like the 4090/4080. 7900xt should be at least $200 cheaper, if not $300.
They did it because of the price point for the 4080, which was supposed to be $899 on the high end; since it's not, if AMD had released theirs much lower, scalpers would just swoop in and make up the difference. So AMD priced it according to the 4080. Should the price change there, AMD has more than enough wiggle room to bring the 7900 series in line; they have been bragging about all the cost savings their new chiplet design brings them, so we all know there is an insane markup on those 7900 cards right now.
 
So basically the 7900 XTX is slower than a 4090 but also costs $600 less. What AMD doesn't get is that anyone spending over $800 on a GPU isn't going to care about pricing. They want the best performance and Nvidia is that. To these people $600 is nothing. So AMD once again falls into the situation where their GPUs are slightly slower and slightly cheaper than Nvidia's, but Nvidia is also at the point where their graphics cards are starting to cost as much as a used car. So anyone who shops below $800 won't pick these up, and anyone who shops above $800 will also not pick these up. It should also be noted that a $1000 GPU is still $1000 at a time when an economic recession is looming. AMD should stop pricing their cards like Nvidia and try to actually go for higher volume sales.

The 7900 XTX is $200 less than the 4080 and manages to outperform even the 4090 in certain titles at certain resolutions (COD, BFV @ 1440p raster, for example). You might not capture the guys willing to spend $600 for 10-20 extra FPS and a whole lot of e-peen, but no question this card embarrasses the 4080 most of the time, so you may capture guys who want to get most of the way there while pocketing $200-$600. I presume that's the strategy. I also wouldn't consider that "slightly" lower. I have a G-Sync monitor, so I'm feeling locked to Nvidia, but honestly, I could buy a 7900 XTX and get a decent FreeSync monitor that works with G-Sync, with the money I have left over. That's kind of insane.

I don't blame AMD for this. Nvidia is the market leader and they set the prices. If they want to anchor at $1600, I wouldn't expect AMD to launch a card this good for $800. That would be insane. We as consumers can say it's TOTALLY a good idea because it means cheaper cards for us, but none of us would make this decision as business owners. I would suggest that most gamers calling for AMD to lower prices don't buy their cards regardless, so why would they?
 
They did it because of the price point for the 4080, which was supposed to be $899 on the high end; since it's not, if AMD had released theirs much lower, scalpers would just swoop in and make up the difference. So AMD priced it according to the 4080. Should the price change there, AMD has more than enough wiggle room to bring the 7900 series in line; they have been bragging about all the cost savings their new chiplet design brings them, so we all know there is an insane markup on those 7900 cards right now.

I'd be curious to see the total difference. It was substantial for Ryzen as I recall. I believe AMD said chiplets basically made a 16-core CPU cost the same as an 8-core monolithic CPU (feel free to correct me if I'm wrong, but it was something like that), and the more cores you have, the better it scales.
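The usual back-of-the-envelope argument for that runs through die yield. Here's a rough sketch using a simple Poisson yield model; the die areas, defect density, and wafer cost below are made-up assumptions for illustration, not AMD's or TSMC's numbers, and packaging costs are ignored:

```python
import math

# Simple Poisson yield model: yield = exp(-die_area * defect_density).
# All numbers below are illustrative assumptions, not real foundry data.
WAFER_COST = 10_000          # $ per wafer (assumed)
WAFER_AREA = 70_000          # usable mm^2 on a 300 mm wafer (rough)
DEFECTS_PER_MM2 = 0.0025     # defect density (assumed)

def cost_per_good_die(die_area_mm2):
    dies_per_wafer = WAFER_AREA // die_area_mm2
    yield_frac = math.exp(-die_area_mm2 * DEFECTS_PER_MM2)
    return WAFER_COST / (dies_per_wafer * yield_frac)

# One big monolithic die vs. two half-size chiplets for the same core count.
monolithic = cost_per_good_die(400)
chiplet    = 2 * cost_per_good_die(200)
print(round(monolithic), round(chiplet))  # ~155 vs ~94 with these assumptions
```

The gap widens as dies get bigger or defect density rises, which is why the argument scales with core count.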
 
If you spent $1500 on an NV card 6 months ago... the 7900 has identical RT performance. I know Nvidia has a newer model... but they are all still too slow for it to become a de facto turn-on setting.

It looks, on average, to be about the same as the 3090 from reviews I read, meaning you’re also saving about 200W of power under load IIRC. Smaller PSU, less electricity consumed, less heat, same performance.
 
I'd be curious to see the total difference. It was substantial for Ryzen as I recall. I believe AMD said chiplets basically made a 16-core CPU cost the same as an 8-core monolithic CPU (feel free to correct me if I'm wrong, but it was something like that), and the more cores you have, the better it scales.
I remember seeing some slide leaks saying the 7900xtx was supposed to be a $799 part and AMD was claiming a 55-60% margin on that, as they want to approach Nvidia's margins to keep the shareholders happy.
 
That is dumb. Why would he pick the 7900 XTX over a 4090? He cares a lot about VR performance too, which AMD is not as good with.

Maybe he didn’t think it was worth an extra $600, which he can use to build most of the rest of the PC, or it’s better in some of the titles he likes to play, or both.
 
I remember seeing some slide leaks saying the 7900xtx was supposed to be a $799 part and AMD was claiming a 55-60% margin on that, as they want to approach Nvidia's margins to keep the shareholders happy.

I would have made the same decision. I hate it as a consumer, but I don't blame AMD/Nvidia as much as I blame the kids lining up overnight to buy a $1600 video card on opening day.

This price makes sense considering the price of the previous gen, which used to roughly dictate the pricing tiers, up until Nvidia discovered gamers would spend $1000 for a Titan to gain 4 FPS over its $600 cousin.
 
The way it works is that $100-$300 is the mainstream market. These people will never leave this price range no matter what AMD and Nvidia does. Around $300 to $600 are enthusiast price range. These are the people that buy 3070's and will pay more depending on performance. $600-$800 are the epeens that brag to their friends that spending $100-$200 gets them 10% more performance and yet somehow less VRAM. The $800+ crowd has nearly infinite money as far as they're concerned and don't care for price.
I don't know, neither of those things describes me. I mean I could go and spend 2k on a GPU if I really wanted... but I won't. 1k? I'm probably going to skip one more gen. I have been tempted to buy a 6800 or 6900... but then nothing I'm playing is really annoying me, so might as well wait another gen, or at least for a refresh. Considering everything else I buy right now is 20-30% more than it was a couple years ago... I could see myself spending 1k if I believed I could stretch the card 4-5 years.

I understand the thinking... that at 1k or anything in that range people aren't worried about price and will just drop 1600. I just don't think the market jumps that quickly from well-off to rich. lol
I know you have probably been around PCs as long as, if not longer than, I have... I remember 25 years ago selling PCs, and people had no issue dropping 3-4k (Canadian) on a system. $3000 in 1997 would be over 5k CAD today ($3600 US). The only reason people are less likely to do that these days is we just don't need much of a PC for anything BUT gaming, I guess.

Anyway if reports of AMD being flush with stock are correct... I guess we'll know in a week or two. If AMD sells through we'll know there is a market for them. Something tells me the 7900s are going to be out of stock most places after a few days and 4080s will still be on the shelf. (assuming Nvidia doesn't decide to push pricing down anyway)
 
I don't know, neither of those things describes me. I mean I could go and spend 2k on a GPU if I really wanted... but I won't. 1k? I'm probably going to skip one more gen. I have been tempted to buy a 6800 or 6900... but then nothing I'm playing is really annoying me, so might as well wait another gen, or at least for a refresh. Considering everything else I buy right now is 20-30% more than it was a couple years ago... I could see myself spending 1k if I believed I could stretch the card 4-5 years.

I understand the thinking... that at 1k or anything in that range people aren't worried about price and will just drop 1600. I just don't think the market jumps that quickly from well-off to rich. lol
I know you have probably been around PCs as long as, if not longer than, I have... I remember 25 years ago selling PCs, and people had no issue dropping 3-4k (Canadian) on a system. $3000 in 1997 would be over 5k CAD today ($3600 US). The only reason people are less likely to do that these days is we just don't need much of a PC for anything BUT gaming, I guess.

Anyway if reports of AMD being flush with stock are correct... I guess we'll know in a week or two. If AMD sells through we'll know there is a market for them. Something tells me the 7900s are going to be out of stock most places after a few days and 4080s will still be on the shelf. (assuming Nvidia doesn't decide to push pricing down anyway)
In Canada it's even worse because the border and brokerage on the devices are messed up; almost all our electronics supply is currently coming out of the US, so not only do we have to pay all their tariffs but then ours on top of it.
 
In Canada it's even worse because the border and brokerage on the devices are messed up; almost all our electronics supply is currently coming out of the US, so not only do we have to pay all their tariffs but then ours on top of it.
I built my last system a month or two before all the covid crap... and at any point since (until recently) I could have turned a profit selling my parts. Stock has been screwed to hell for a few years now. Anytime I have picked anything up, or bought parts for others, it has hurt. I have noticed Memory Express only has one 6950 listed at $1790 ($1300 USD) and a 6900 at $1600. We are getting the shaft right now for sure. Another reason to skip a gen I guess. lol
 
I built my last system a month or two before all the covid crap... and at any point since (until recently) I could have turned a profit selling my parts. Stock has been screwed to hell for a few years now. Anytime I have picked anything up, or bought parts for others, it has hurt. I have noticed Memory Express only has one 6950 listed at $1790 ($1300 USD) and a 6900 at $1600. We are getting the shaft right now for sure. Another reason to skip a gen I guess. lol
I'm dealing with some of my suppliers trying to get some new Supermicro servers delivered for a network monitoring solution I need to update and brokerage fees are bending me over the desk.
 
Sweet, finally released with numbers... now let's hope last-gen stuff drops in price to match.
 
Does anybody else get the feeling that AMD's "doubled" cores that require very special programming to even come close to possibly working right are just Bulldozer's mistakes all over again?
 
Does anybody else get the feeling that AMD's "doubled" cores that require very special programming to even come close to possibly working right are just Bulldozer's mistakes all over again?
No, the 7900 parts look like they are working as intended. I am sure there are some hiccups with the process that have limited clock speeds in some way, probably relating to the various processes used and making everything sync nicely, but this seems to be working.
Now if you are talking about the GPUs with the multiple GCDs found in the CDNA2 stuff that AMD makes, then very much yes: you need to be working in a method specifically coded for multi-GPU, which is very common in the enterprise space but at this stage exceedingly uncommon for consumer use.
AMD and Nvidia both have extensive papers written on the subject of IO processes that take multiple GPU cores and present them as a single GPU to the operating system and software, but the problems there are numerous and well-documented.
It is an exceptional amount of data that needs to be moved, synchronized, and coordinated for delivery, and the costs associated with the process go up as memory capacities and speeds increase with each generation, making it either technically or financially infeasible.
I remember Nvidia presenting on the topic a good while back (maybe 10 years ago), and at that time the costs associated with the IO die were more than the combined costs of the GPU dies; since then I am sure it has only grown more expensive.

Here's one of Nvidia's research papers on the subject from back in 2017; they have been working on it for a long time.
https://research.nvidia.com/sites/default/files/publications/ISCA_2017_MCMGPU.pdf
Here's one from 2021:
https://dl.acm.org/doi/10.1145/3484505
Here's a presentation from Nvidia on the subject, also from 2021; apparently they and TSMC have been working together on it since 2008.
https://www.nextplatform.com/2022/0...s-a-course-to-multiple-multichip-gpu-engines/

While AMD may have gotten theirs to market first, Nvidia's no slouch on the subject.
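To make the "needs serious memory throughput" point above a bit more concrete, here's a toy back-of-the-envelope model; the frame time, buffer size, and link bandwidths are all invented assumptions, not measurements of any real card:

```python
# Toy model: is it worth splitting one frame's work across two GPU dies
# if intermediate data has to cross an inter-die link?
# All figures below are invented assumptions for illustration only.

def frame_time_ms(work_ms_single_die, split_fraction,
                  cross_die_bytes, link_gbps):
    """Time of the busier die plus time to move shared data over the link."""
    die0 = work_ms_single_die * split_fraction
    die1 = work_ms_single_die * (1 - split_fraction)
    transfer_ms = cross_die_bytes * 8 / (link_gbps * 1e9) * 1e3
    return max(die0, die1) + transfer_ms

# One die alone renders the frame in 16 ms. Perfect 50/50 split, but
# 200 MB of intermediate buffers must cross the inter-die link each frame.
for link in (100, 400, 1600):      # link bandwidth in Gbit/s (assumed)
    print(link, round(frame_time_ms(16.0, 0.5, 200e6, link), 2))
# 100 Gbit/s  -> 24.0 ms (slower than one die!)
# 400 Gbit/s  -> 12.0 ms
# 1600 Gbit/s ->  9.0 ms (the split only pays off with a fat interconnect)
```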
 
No, the 7900 parts look like they are working as intended. I am sure there are some hiccups with the process that have limited clock speeds in some way, probably relating to the various processes used and making everything sync nicely, but this seems to be working.
Now if you are talking about the GPUs with the multiple GCDs found in the CDNA2 stuff that AMD makes, then very much yes: you need to be working in a method specifically coded for multi-GPU, which is very common in the enterprise space but at this stage exceedingly uncommon for consumer use.
AMD and Nvidia both have extensive papers written on the subject of IO processes that take multiple GPU cores and present them as a single GPU to the operating system and software, but the problems there are numerous and well-documented.
It is an exceptional amount of data that needs to be moved, synchronized, and coordinated for delivery, and the costs associated with the process go up as memory capacities and speeds increase with each generation, making it either technically or financially infeasible.
I remember Nvidia presenting on the topic a good while back (maybe 10 years ago), and at that time the costs associated with the IO die were more than the combined costs of the GPU dies; since then I am sure it has only grown more expensive.

Here's one of Nvidia's research papers on the subject from back in 2017; they have been working on it for a long time.
https://research.nvidia.com/sites/default/files/publications/ISCA_2017_MCMGPU.pdf
Here's one from 2021:
https://dl.acm.org/doi/10.1145/3484505
Here's a presentation from Nvidia on the subject, also from 2021; apparently they and TSMC have been working together on it since 2008.
https://www.nextplatform.com/2022/0...s-a-course-to-multiple-multichip-gpu-engines/

While AMD may have gotten theirs to market first, Nvidia's no slouch on the subject.
The monolithic die portion of the GPU has partially doubled FP32 cores (which is why the core count is listed as 12,288 / 6,144), and the description of how they function is very similar to how Bulldozer was billed as working: if the programming is absolutely custom and everything lines up just right you can get essentially double performance per core couplet, but otherwise it functions as one core plus some parasitic, nearly dead silicon.
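A toy way to picture that dual-issue behavior (the instruction stream and the pairing rule here are made up for illustration; real RDNA3 co-issue rules are considerably more restrictive):

```python
# Toy dual-issue model: two FP32 ops can go out in the same cycle only if
# the second doesn't depend on the result of the first. Otherwise the
# extra lane idles. Purely illustrative, not the real hardware rules.

def cycles(instr_stream):
    """instr_stream: list of (dst_reg, src_regs) tuples, in program order."""
    total, i = 0, 0
    while i < len(instr_stream):
        total += 1
        if i + 1 < len(instr_stream):
            dst0, _ = instr_stream[i]
            _, srcs1 = instr_stream[i + 1]
            if dst0 not in srcs1:       # independent -> co-issue both
                i += 2
                continue
        i += 1                           # dependent -> only one issues
    return total

independent = [("r0", ("a", "b")), ("r1", ("c", "d")),
               ("r2", ("e", "f")), ("r3", ("g", "h"))]
chained     = [("r0", ("a", "b")), ("r1", ("r0", "c")),
               ("r2", ("r1", "d")), ("r3", ("r2", "e"))]
print(cycles(independent))  # 2 cycles: the "doubled" throughput case
print(cycles(chained))      # 4 cycles: the second lane mostly sits dead
```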
 