AMD 7900 XT and 7900 XTX Reviews

Overclocking a 4090 might get you 4-5% more performance while hitting 500W+.

So far the good AIB 7900 XTXs are showing around 21% more performance right now from overclocking, while using around 420W+.
Yeah, I will be curious to see what those cards actually draw. Hell, the reference model is intermittently showing spikes above its supported power draw, up to 455 watts or something like that.

It probably all levels off when you have ample power though.

I will let you guys know if I hate the reference card, but I suspect I am going to love just being over 60 FPS at 4K with max settings in Cyberpunk, and it will probably make my modded-to-the-gills MW5 playthroughs amazing too.
 
Overclocking a 4090 might get you 4-5% more performance while hitting 500W+.

So far the good AIB 7900 XTXs are showing around 21% more performance right now from overclocking, while using around 420W+.
I can't remember a card/chip that overclocked this far. Even the reference cards, with their limited power limit and cooling, are getting 2900 MHz core out of the door. The 9800 GT maybe came close, but not near 20%?

Also, what was the rationale for AMD not capitalizing on the chip's potential? Trying to keep the thermals and power down for comparisons against Lovelace? Leaving headroom for the XTX+ versions surely waiting in the wings? Both?
 
I'm just hoping that the 7000 series launch will finally bring down prices on a 6800 XT, which is the card I can actually almost afford.
 
I can't remember a card/chip that overclocked this far. Even the reference cards, with their limited power limit and cooling, are getting 2900 MHz core out of the door. The 9800 GT maybe came close, but not near 20%?

Also, what was the rationale for AMD not capitalizing on the chip's potential? Trying to keep the thermals and power down for comparisons against Lovelace? Leaving headroom for the XTX+ versions surely waiting in the wings? Both?
Not sure on the AMD side of things. They said they left a lot of headroom in the architecture. I would suspect it comes down to one of two things. 1. They thought they had a winner from their initial engineering samples and didn't really want to push the power envelope. 2. Lisa underestimated her uncle, again. They didn't think they needed to open up the power spigots.

You see the AIB cards have the extra PCI-E power connectors and units like the TUF are getting +9 FPS in stuff. That's a pretty decent gain. I think they were unwilling to push things right up to the maximum power thresholds. AMD is almost always a step behind when it comes to dumping power into things for performance. Which is odd, if you look at their switch from efficiency in their previous Ryzen stuff to their current lineup that draws juice until the chips light on fire...
 
I'm just hoping that the 7000 series launch will finally bring down prices on a 6800 XT, which is the card I can actually almost afford.
If you can even find them, those cards are most likely at the lowest prices you're going to see. They're no longer a focus in the market; they have been almost completely replaced by the 6750 XT.
 
Where are the +20% overclock gains shown on AIB cards? All the early reviews showed smaller gains.
 
I can't remember a card/chip that overclocked this far. Even the reference cards, with their limited power limit and cooling, are getting 2900 MHz core out of the door. The 9800 GT maybe came close, but not near 20%?

Also, what was the rationale for AMD not capitalizing on the chip's potential? Trying to keep the thermals and power down for comparisons against Lovelace? Leaving headroom for the XTX+ versions surely waiting in the wings? Both?
It's a damn good question. It really does look like AMD could have gotten close to 4090 performance. Maybe AMD wanted to let the AIBs deal with all the RMA crap from people overclocking, etc. Who knows.

The last good card I remember having a blast overclocking was my 8800 GT. I remember Kyle doing a review on it, and it sold me on how good it was.
 
Also, what was the rationale for AMD not capitalizing on the chip's potential? Trying to keep the thermals and power down for comparisons against Lovelace? Leaving headroom for the XTX+ versions surely waiting in the wings? Both?

My guess is that the idle/base power is too high for AMD's liking. They probably wanted to stick to the initial power limit budgets.

Maybe an RDNA 3+ refresh will fix the high power issue.
 
This shows a 6.7% OC gain in Heaven and 8.5% in Cyberpunk (12% compared to reference).
[attached screenshot of the overclocking results]


Says 21.5% right there after overclocking.
 
That's not what it says. It says that it has a 21.5% gap over a stock 4080 FE after OC+UV. That's not a 21.5% boost from its own stock settings.
Correct. And most reviews are saying the 7900 XTX is on par with a 4080 overall when it comes to performance.

Now this is a VERY small sample size. We need more data.
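For what it's worth, here's the quick arithmetic behind that distinction. The ~8% stock gap over the 4080 is an assumption for illustration (roughly TPU's number), so treat the result as a ballpark, not a measurement:

```python
# Illustrative only: "21.5% over a stock 4080" is not "21.5% over its own stock settings".
oc_vs_4080    = 1.215  # OC+UV 7900 XTX vs stock 4080 FE (from the chart above)
stock_vs_4080 = 1.08   # stock 7900 XTX vs stock 4080 FE (assumed, ~TPU's figure)

oc_vs_own_stock = oc_vs_4080 / stock_vs_4080 - 1
print(f"Gain over its own stock settings: {oc_vs_own_stock:.1%}")  # ~12.5% with these inputs
```

In other words, dividing out the stock-vs-stock gap leaves a low-teens overclock gain, which is in line with what the reviews below are reporting.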
 
I have a question: why would AMD limit its own performance? That makes no business sense to me.

I feel like AMD is taking shortcuts to make their profit. In the business world, when you take shortcuts it will backfire on you, either now or later. People want a product worth the money, and that goes for all companies, mine included.
 
I have a question: why would AMD limit its own performance? That makes no business sense to me.

I feel like AMD is taking shortcuts to make their profit. In the business world, when you take shortcuts it will backfire on you, either now or later. People want a product worth the money, and that goes for all companies, mine included.
The 7900 XTX Nitro+ in the OC example, with the highest-OC BIOS, peaks at about the same power consumption as a 3090 Ti, around 475 watts, with a typical average draw while gaming of 431 W (6950 XT / 3090 Ti level).

That could be a lot for the reference model (which peaks at 410 W); the Nitro+ has 3 power connectors instead of 2 on a much larger card:
https://www.guru3d.com/articles_pages/sapphire_radeon_rx_7900_xtx_nitro_review,6.html

I am not sure this shows particularly good overclocking potential, or that it was a mistake for AMD not to release a 450 watt, 3-power-connector card instead; performance per watt drops quite a bit:
https://www.guru3d.com/articles_pages/sapphire_radeon_rx_7900_xtx_nitro_review,29.html

Tomb Raider goes from 130 fps to 133 fps, Witcher goes from 147 to 154, and Cyberpunk goes from 55 to 58 fps at 1440p with RT on; at more playable framerates they are the same:
https://www.guru3d.com/articles_pages/sapphire_radeon_rx_7900_xtx_nitro_review,18.html

A 3-8% gain for 25% more energy, needing one more connector, etc. Even if the heat and noise are manageable by those giant monster cards, it seems like a logical decision to let the AIB special editions do it; doing it themselves would have removed all the talking points from AMD's presentation about power, cards fitting in cases, and so on.
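As a rough sketch of the efficiency trade-off being described, here is the quoted 3-8% gain and ~25% extra power plugged into a simple ratio (no new measurements, just the figures above):

```python
# Back-of-the-envelope: perf-per-watt change when the OC BIOS pulls ~25% more power
# for a 3-8% framerate gain (figures quoted from the Guru3D results above).
power_increase = 0.25
for fps_gain in (0.03, 0.08):
    perf_per_watt_change = (1 + fps_gain) / (1 + power_increase) - 1
    print(f"+{fps_gain:.0%} fps at +25% power -> perf/W change: {perf_per_watt_change:+.1%}")
# Roughly -14% to -18% perf/W with these inputs.
```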
 
It's still very early, but the plot that Luke posted seems to show the RTX hardware running UE5 appreciably faster than AMD.
I would not put too much weight on what happens in the under-60 fps world; the 3090 Ti >> 4080 in that regard, and this could be a case where the lower memory bandwidth catches up with it at 4K with no DLSS. But do you want to play Fortnite at 52 fps (average, not minimum) anyway? It could be interesting and maybe it will translate to a relevant scenario, but maybe it will never matter; it could end up only being something that happens in unplayable territory.

In the playable band, comfortably above 60 fps, the 7900 XTX and 4080 are quite close; maybe outside the margin of error, but not by enough to be certain.
 
Correct. And most reviews are saying the 7900 XTX is on par with a 4080 overall when it comes to performance.

Now this is a VERY small sample size. We need more data.
TPU is going by their own testing. They have the reference model 8% ahead, so they say the AIB cards can give you 12-13% more performance when you tweak them.
 
I would like to see the margins to comment on that; how many $400 PS5 APUs do you need to sell to make the money of a single $10,000 Epyc or $1,000 7900 XTX?

A PS5 APU is 260 mm² of 6nm (it was 300 mm² of 7nm at launch); the 7950X is 2x 70 mm² plus a 124 mm² IO die.

Say they manage to charge $150 for it (which would leave $250-300 for 16 GB of expensive RAM, a 1 TB NVMe drive and so on); that is probably a lot of console APUs you need to sell to make up for the sale of a single CPU/GPU.
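As a toy sketch of that question: the prices are the ones mentioned above, but the margin percentages below are pure guesses for illustration, not anything AMD discloses:

```python
# Toy comparison: how many console APU sales roughly equal one Epyc sale, under assumed margins.
apu_price,  apu_margin  = 150,    0.15   # ~$150 per PS5 APU, thin semi-custom margin (guess)
epyc_price, epyc_margin = 10_000, 0.50   # high-end Epyc, much fatter margin (guess)

apu_profit  = apu_price  * apu_margin     # $22.50 per console chip with these guesses
epyc_profit = epyc_price * epyc_margin    # $5,000 per server CPU with these guesses
print(f"APUs needed to match one Epyc sale: {epyc_profit / apu_profit:.0f}")  # ~222
```

So with those made-up margins it's on the order of a couple hundred console chips per server CPU, which is the kind of gap the volume argument below has to overcome.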

Points noted, but the volume of sales of those parts is still vast... tens of millions with no letup year after year in most console cases; there usually isn't a ton of drop-off until a new model is announced. And while they may have 200k cards to sell at launch, with at least a couple thousand buyers ready to jump into the pool at MSRP (snicker), we can't exactly call them the company they used to be, all smoke and mirrors. I'd be surprised if they wound up packing it in, even on the discrete GPU front. If anything, I believe Nvidia is going to do to video cards what companies like Dodge and Ford are doing to trucks: what used to be $40,000 is now $90,000, and there are a lot, a lot of people (myself included) who see that as paying $90,000 for a $40,000 item... adding more padding and plastic and features that will be obsolete and expensive to repair when the warranty runs out. People will get wise and go "Screw that, give me the thing that drives about as well but costs half as much and whose replacement parts don't cost $10 grand a visit."

AMD is caught in an interesting pickle right now though... they are priced too high compared to the Intel offerings if you're gaming at 1080p or 1440p with consumer-grade aspirations, and they are priced too high compared to the Nvidia offerings when you turn on ray tracing, which, arguably, is a technology we can avoid turning on for now but won't be able to in a few years' time. If AMD sold this stuff at $600 and $700, I doubt they'd be able to keep them in stock. But everyone wants the Bentley or Rolls Royce, so they justify spending $1,500 give or take on a GPU... and considering how well a 4090 performs, you can add to the justification of cost by saying "This will save me 2-3 generational leaps." It's a 1080 Ti situation, just at a more brutal price point.
 
The 7900 XTX Nitro+ in the OC example, with the highest-OC BIOS, peaks at about the same power consumption as a 3090 Ti, around 475 watts, with a typical average draw while gaming of 431 W (6950 XT / 3090 Ti level).

That could be a lot for the reference model (which peaks at 410 W); the Nitro+ has 3 power connectors instead of 2 on a much larger card:
https://www.guru3d.com/articles_pages/sapphire_radeon_rx_7900_xtx_nitro_review,6.html

I am not sure this shows particularly good overclocking potential, or that it was a mistake for AMD not to release a 450 watt, 3-power-connector card instead; performance per watt drops quite a bit:
https://www.guru3d.com/articles_pages/sapphire_radeon_rx_7900_xtx_nitro_review,29.html

Tomb Raider goes from 130 fps to 133 fps, Witcher goes from 147 to 154, and Cyberpunk goes from 55 to 58 fps at 1440p with RT on; at more playable framerates they are the same:
https://www.guru3d.com/articles_pages/sapphire_radeon_rx_7900_xtx_nitro_review,18.html

A 3-8% gain for 25% more energy, needing one more connector, etc. Even if the heat and noise are manageable by those giant monster cards, it seems like a logical decision to let the AIB special editions do it; doing it themselves would have removed all the talking points from AMD's presentation about power, cards fitting in cases, and so on.
Thank you for that! I'm rusty at PC stuff; it's been a few years since I messed with it, so I'm getting back up to date.

Me personally, I would have added the 3rd 8-pin connector and made sure the HSF could handle the temps and power. This is where businesses fall short and get cheap, since adding a connector adds cost. As a business owner, I would have told the public "we are adding a 3rd 8-pin connector and it will cost an extra $2," and I would make sure the temps are in check even if it cost an extra $1-$2. I am a customer in the PC world, but I run a business in lawn care, and the business is not hard and is a good earner.

I have learned companies would rather let something go unchecked than say "oops, sorry, we didn't know that would happen." But in reality, if I did that as a business I would be out of business.

Customers give businesses power. If you don't buy their product, you take away their power; it's simple logic, and it amazes me that people don't think about things in detail and depth. Maybe it has to do with the fact that I have ADHD and I see things others don't.
 
Points noted, but the volume of sales of those parts is still vast... tens of millions with no letup year after year in most console cases; there usually isn't a ton of drop-off until a new model is announced. And while they may have 200k cards to sell at launch, with at least a couple thousand buyers ready to jump into the pool at MSRP (snicker), we can't exactly call them the company they used to be, all smoke and mirrors. I'd be surprised if they wound up packing it in, even on the discrete GPU front. If anything, I believe Nvidia is going to do to video cards what companies like Dodge and Ford are doing to trucks: what used to be $40,000 is now $90,000, and there are a lot, a lot of people (myself included) who see that as paying $90,000 for a $40,000 item... adding more padding and plastic and features that will be obsolete and expensive to repair when the warranty runs out. People will get wise and go "Screw that, give me the thing that drives about as well but costs half as much and whose replacement parts don't cost $10 grand a visit."

AMD is caught in an interesting pickle right now though... they are priced too high compared to the Intel offerings if you're gaming at 1080p or 1440p with consumer-grade aspirations, and they are priced too high compared to the Nvidia offerings when you turn on ray tracing, which, arguably, is a technology we can avoid turning on for now but won't be able to in a few years' time. If AMD sold this stuff at $600 and $700, I doubt they'd be able to keep them in stock. But everyone wants the Bentley or Rolls Royce, so they justify spending $1,500 give or take on a GPU... and considering how well a 4090 performs, you can add to the justification of cost by saying "This will save me 2-3 generational leaps." It's a 1080 Ti situation, just at a more brutal price point.
I agree, but the fact is that if people didn't buy the product and told businesses "we are not paying this"... the majority of people would have to do this. Customers can make a business lose millions in ONE DAY. But people keep paying the price. So who is at fault: the customer for paying the price, or the business for saying it costs that?
 
Go ahead, turn ray tracing on on a 2060 and see what happens. Let's see how well it can "do" ray tracing. I doubt you're going to find very many people who enjoy the slideshow.
I've played several RT games with the rig in my sig, including Metro: Exodus Enhanced Edition, a fully raytraced game, with DLSS quality @ 1080p and avg 70-90fps.

Shadow of the Tomb Raider was around 70fps avg, with DLSS quality @ 1080p

Granted some of these games might not be that hard to run, but they played stellar for being on a 2060.
The problem with the 2060 isn't performance in RT, but the amount of VRAM when using RT.

Except in Cyberpunk, the 2060 totally shocked me with how well it does RT @ 1080p. DLSS is a lifesaver.
I was expecting it to be a stuttering mess; man, was I wrong. Either way it's far from a "slideshow" as you claim.
 
Some pretty nice positives here, and a general feeling that improvements will come in the areas which weren't so positive.
 
That's not what it says. It says that it has a 21.5% gap over a stock 4080 FE after OC+UV. That's not a 21.5% boost from its own stock settings.
Correct. It's still 14% over the stock XTX, and going by launches in the last few years that's a lot (the ASUS was 15%). These seem to be great overclockers, provided they are fed by 3x 8-pins. The reference can only muster around 8% and gets power limited rather quickly. So I'm torn between getting the reference card with the EK SE reference waterblock, or getting something like the Nitro with a standard Alphacool waterblock. That SE WB looks so nice though...

On water and 3x8 pins, >15% gains should be achievable.
 
That is really, really, really painful to see, speaking as someone who got the 6800 version of that card right before scalper prices even from stores stopped being a thing.
Yeah, and 6650 XTs have been $300 or less. Gigabyte just had a deal which was like $280.
 
I have not been screwed on a mail in rebate, ever. Just follow the instructions exactly and take pics.
Yup. Did a $50 MSI one on a returned 6800 XT at Micro Center. Didn't even have to mail anything. Just scan and submit online what they ask for, and 6 or so weeks later I got the prepaid card.

Never been easier.
 
I have not been screwed on a mail in rebate, ever. Just follow the instructions exactly and take pics.
I've never not received a rebate, but one took over a year to arrive and I've had to fight the rebate company on a couple of others. It is also a good idea to keep a copy of the rebate with all the fine print.

The funniest denial I got was one that claimed they couldn't process it because I had stapled the upc code to the form but they approved it after I pointed out that the terms specifically stated to staple it. Another time I had a rebate denied claiming I had sent the wrong code in until I told them that I had a picture of the correct code next to the completed form and still had the box with the correct code cut out and the code they claimed I sent in still on it.
 
I had an AMD Radeon RX 6800 and did not have a good experience: random lock-ups with the black screen of death, and non-stop driver crashes. It turned me off from AMD until they fix their drivers for the black screen of death and random lock-ups. Games were lagging and it was just a nightmare.

I was seriously going to buy an RX 7900 XTX, but the drivers turned me off. I had a Radeon VII and never had a problem.

With Nvidia I haven't had a single problem. I picked up an RTX 3080 10GB for $525 and got way better performance for the same price as a brand new RX 6800, lol.
 
I had an AMD Radeon RX 6800 and did not have a good experience: random lock-ups with the black screen of death, and non-stop driver crashes. It turned me off from AMD until they fix their drivers for the black screen of death and random lock-ups. Games were lagging and it was just a nightmare.

I was seriously going to buy an RX 7900 XTX, but the drivers turned me off. I had a Radeon VII and never had a problem.

With Nvidia I haven't had a single problem. I picked up an RTX 3080 10GB for $525 and got way better performance for the same price as a brand new RX 6800, lol.
That's a shame; my 6800 XT is rock solid, and I got it for $519 after MIR. Was the 3080 used? I looked all over for one around that price.
 