Anybody plan on grabbing one of the new RX7xxx Navi cards on launch?

The 3080 Ti is a great card. If you don't mind me asking, what'd you pay? And what resolution do you game at?

I bought it from a friend who mined with it, but even though he mined, he kept temperatures down (by undervolting) and kept dust out; it was babied. He even changed the thermal pads on the memory to take care of it (and I trust him).

Per my sig, it's the Gigabyte Eagle variant.

I paid $800 AUD (~$517 in USD as of today) - was an offer too good to refuse

I game at 4k60.
 
Remember that the 3090 & 4090 are Titan-class cards. Even if Nvidia shipped enough 4090s for all of us who would take one, it's a very small slice of the GPU market.

The 3080/4080 and 6900 XT/7900 XTX are what have traditionally been "flagship" cards. RDNA2 was very competitive with Ampere throughout the stack, even with the 3090 on raster performance. RT was about a generation behind Nvidia.

RDNA3 is probably about a generation behind Nvidia again on RT. And again will most likely challenge or beat the Nvidia counterparts on raster.

The 4090 is really the first GPU that makes native 4K RT achievable. It will be another 1-2 generations for that to reach the mainstream product. Probably 2-3 generations for AMD.

I'd take a 4090 FE for MSRP if there was one to be found. It's the only one that would fit in the system where I'd use it.

Barring that, I still won't prioritize RT for another couple of generations. I'm playing Control (one of the showcases for RT) on a 6800 XT. I can play 1080P with full RT or 4K with no RT. 4K with no RT looks better. RT makes environment reflections more shiny. Wall reflections look grainy and nearly pixelated. It's not impressive. I'll see what CP2077 looks like when they get the expansion DLC out.

The way it looks now, performance ranking will be 4090 -> 7900 XTX -> 7900 XT -> 4080. Since Nvidia is more focused on getting rid of last gen than selling 4090s, 7900 XTX is very appealing.
 


I wouldn’t be surprised if RDNA 4 takes a larger leap in RT and gets closer to Nvidia. I'm not expecting AMD to match them for a while, but given that Nvidia might run into limits on how big they can make next-gen dies, RDNA might be in a better spot with MCM.
 
https://videocardz.com/newz/nvidia-...ed-in-geekbench-30-to-37-faster-than-rtx-3080


LOL, just 37% faster than the RTX 3080, but the cost went up from $699 to $1,200.

F you nVidia.

Whereas the 4090 is a beast, like 75% faster than the 3090, but the cost difference is only a hundred bucks.

What is nVidia smoking with these bizarre prices relative to performance? The fastest card is only $100 more expensive this year, but the slower card is $500 more expensive. WTF?
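A quick back-of-the-envelope on those numbers (the 37% figure is from the leaked Geekbench result linked above, so treat it as an approximation, not a benchmark):

```python
# Rough perf-per-dollar comparison using the figures quoted in this thread
# (leaked/approximate numbers, not benchmarks I've run myself).

def perf_per_dollar(relative_perf, price):
    """Relative performance units per dollar spent."""
    return relative_perf / price

# Normalize the RTX 3080 to 1.0x performance.
rtx_3080 = perf_per_dollar(1.00, 699)   # $699 MSRP
rtx_4080 = perf_per_dollar(1.37, 1199)  # ~37% faster per the leak, $1,199 MSRP

print(f"3080: {rtx_3080:.5f} perf/$")
print(f"4080: {rtx_4080:.5f} perf/$")
print(f"4080 vs 3080 value: {rtx_4080 / rtx_3080:.0%}")
```

If the leak holds, the 4080 actually delivers about 20% *less* performance per dollar than the card it replaces.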
 

Funny that some are already okay with the price. I truly believe you can gaslight people easily: make them passionate about your brand and you can sell them anything, even at double the price. Nvidia is basically selling you a 4080 at 4080 Ti pricing.
 
Given that I won't have to upgrade my case or PSU, or worry about cables melting, I am probably going to pick up a 7900XTX at launch. I am on a 2070 Super right now, and it has been over a decade since my last AMD GPU, which was a TriFire setup. I do have the Alienware 34" OLED with G-Sync Ultimate; I assume it should still work using FreeSync?
 
I was curious too:

(attached screenshot)
 
Yeah I figured even if the 7900 XTX is twice as powerful I'm paying half the price, and I really only game at 1440p anyway. I'll be back once there's a 4k monitor I actually want.
Right on. I have the Gaming X model and it's crazy on 1440p.
 
I’m waiting to see what the 7900 xtx does. I just picked up a 3080 on Facebook for 400 bucks to hold me over for a while. I game at 4K on an LG CX, and I definitely want something that will push 120fps@4K. Nvidia has just gone crazy with prices. I honestly don’t mind spending $1,000-1,200 for a card that’s close to 2x a 3080, like the 4090. If the 7900 xtx is 85-90% of a 4090 I’ll pick one up and put it on a custom loop. I’m guessing that a 7900 xtx with a higher power limit from the custom cards will clock higher. I really want a Liquid Devil 7900 xtx.
 
I'm definitely finding myself more and more receptive to the 7900XTX for sure. I think it will ultimately come down to what's available when I pull the trigger on a new CPU (waiting for Zen4 X3D). If 4090s are readily available I will go that route, but I will not be waiting on availability. If 4090s are not around due to Nvidia playing channel games, I will happily purchase a 7900XTX.
 
What sort of power cable connector grease does AMD cards use? Same as Nvidia? I have been doing it dry all these years, I feel foolish now.
 
I know, but, lube! We gotta lube it. If you want to try it dry go ahead. If 4090s burn then all the cards gonna burn.
 
Oh, lol. I completely didn't see the "grease" part. I thought you were only asking about the power connector itself.
 
If those leaks are true the 7900 XT makes absolutely no sense at all, that's a massive difference for 100 dollars. They're going to get raked over the coals in reviews for it. One good, one bad.
 
Hopefully, people wait for trusted third-party reviews before making up their minds on these. I'm still of the mind that AMD's first foray into a chiplet GPU is going to be janky, just like Zen 1 & 2 were. My gut is telling me they are keeping numbers vague for a reason, which may not be a good sign.
 
I think in terms of relative performance between the xt and xtx it's probably right on the nose: everything on the xt is about 83% of the xtx (5/6), in CU count, RAM, and memory bus size/bandwidth.

It being a terrible choice when both cards exist at MSRP could be by design: maybe there aren't many Navi 31 dies that fail to qualify as an xtx, so there would be a low volume of xt to sell anyway. They might as well avoid creating competition for the giant volume of high-end RDNA2 still in the channel and make the xtx's pricing look really good.
 
I just noticed that the difference between the 7900XTX and the 7900XT is basically the same as the 3080 16GB and 3080 12GB. A full tier down while keeping the moniker of the higher tier card. Yet Nvidia was torn apart for something that AMD seems to be getting away with so far in the very same generation.

(attached comparison chart)
 
Yet Nvidia was torn apart for something that AMD seems to be getting away with so far in the very same generation.
Wrong. The 4080 was criticized because the 12gb was literally a renamed 4070. When called out on that, they then tried to insist that they're the same card except for different vram.

They said it was the exact same scenario as the 3080 10gb/12gb. That was a blatant lie.
 
I agree the 7900XT at $899 is too high. The performance gap between it and the 7900XTX is too large for the $999 XTX to be only $100 more. They are doing exactly what NVIDIA is doing: using the cheaper card to upsell the more expensive one. That said, the 7900XTX at $999 is on point; it should be around 15-18% faster in raster performance than the 4080 while being $200 cheaper. Even in ray tracing there are some games where the 7900XTX pulls ahead. Nvidia is shooting themselves in the foot with the 4080.
 
How is that different from a 7800 xt renamed to a 7900 xt, at least if we use the previous generation's naming as a reference? Is the letter difference (xt vs xtx) a bigger signal than the number (16 vs 12)?

I feel the massive difference is simply the 7900 xtx's price, which makes the absence of a well-priced 7800 xt at launch go over much better, combined with the current 6800 xt/6900 xt performance-per-dollar if you don't want to go for a $1,000 card. Not that the 7900 xt is that much less ridiculous versus the 7900 xtx than the announced 4080 12GB was versus the 16GB and the 4090.
 
Uh, was it? Or is it the same thing as Ti vs non-Ti, which nobody is complaining about?
There are obviously no strict rules or science involved here, but going by RDNA2 (I'm not sure what else to look at):

2020 MSRP
6950 XT: 80 CU, 16 GB of RAM, 256-bit bus
6900 XT: $1,000 ($1,147 today), 80 CU, 16 GB, 256-bit bus
6800 XT: $650 ($745.43 today), 72 CU, 16 GB, 256-bit bus (90% of the CUs, 100% of the RAM, 100% of the bus)

2022 MSRP
7900 XTX: $1,000, 96 CU, 24 GB, 384-bit bus
7900 XT: $900, 84 CU, 20 GB, 320-bit bus (87.5% of the CUs, 83.3% of the RAM, 83.3% of the bus)

The 7900 XT is a significantly bigger cut-down of Navi 31 than the 6800 XT was of the 6900 XT in every way, and the gap is much larger.

Had the 4080 16GB been called the 4080 Ti and the 4080 12GB simply the 4080, I don't see how it would have been any better at all. The big issue was the 4080 12GB getting the xx80-class moniker and pricing, not possible confusion with the 16GB version (no one knowledgeable enough to complain was at risk of making that mistake in the store).
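Putting the spec numbers above into a quick sketch makes the cut-down fractions easy to compare:

```python
# Cut-down fractions for the salvage part vs the full part, using the specs
# listed in this post (CU count, VRAM capacity, memory bus width).

def fraction(cut, full):
    return cut / full

# RDNA2: 6800 XT relative to 6900 XT
rdna2 = {
    "CU":  fraction(72, 80),
    "RAM": fraction(16, 16),
    "bus": fraction(256, 256),
}

# RDNA3: 7900 XT relative to 7900 XTX
rdna3 = {
    "CU":  fraction(84, 96),
    "RAM": fraction(20, 24),
    "bus": fraction(320, 384),
}

for name, gen in (("6800 XT / 6900 XT", rdna2), ("7900 XT / 7900 XTX", rdna3)):
    print(name, {k: f"{v:.1%}" for k, v in gen.items()})
```

The RDNA2 salvage part kept 100% of the memory subsystem and 90% of the CUs; the RDNA3 one keeps only ~83% of the memory subsystem and 87.5% of the CUs.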
 
This is exactly why I picked up a 6800xt for $350 less than the 6900xt. Looks like the 7900xtx is the only one to get, because what's $100 more once you get up near a grand? I'm interested in actual reviews to see the performance spread between the xtx and xt.

With all that being said I’m trying to score the xtx on launch for a buddies AM5 build. Not too optimistic though.
 

2.5% is now "significantly bigger"? That's a stretch for sure.

In addition, the reduction in bus width/RAM quantity is directly related to the use of 5 vs 6 MCDs. They're each 64-bit with 4 GB attached, unlike Navi 21, which was the exact same die with only some CUs disabled.

I agree this should have had all 6 MCDs and full RAM with some CUs disabled at this price, or been cheaper, but let's not try to correlate the 7900xt with Nvidia using AD103 in an xx80-class card.
 
2.5% is now "significantly bigger"? That's a stretch for sure.
True enough, I had 5/6 of everything in mind, but performance-wise it seems to be around 83% of a 7900xtx overall, which is really close to where the 6800 (non-XT) sat relative to the 6900xt in RDNA2.

In addition, the reduction in bus width/ram qty is directly related to the use for 5 vs 6 MCD
yes, which is the point

let's not try and correlate the 7900xt and Nvidia using ad103 in an xx80 class card.
I really don't mind the naming much, which could be the real issue, but in terms of using it to push bad deals, AD104 being used for the 4080 12GB was indeed pushing it more. Still, they are both doing very much the same thing here.

The xt seems to be 83% of an xtx in performance for 90% of the price, according to AMD's released marketing slides; the 4080 12GB seemed to be around 78% of a 4080 16GB for 75% of the price.
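To make that comparison concrete, here's the same math as a tiny sketch (the performance fractions are the approximate figures from the slides and leaks discussed in this thread, so treat them as assumptions):

```python
# Value check: performance fraction vs price fraction for the cut-down card,
# using the approximate figures quoted in this thread (marketing slides and
# leaks, not my own benchmarks).

def value_ratio(perf_fraction, price_fraction):
    """Above 1.0 means the cheaper card is the better perf-per-dollar deal."""
    return perf_fraction / price_fraction

# 7900 XT vs 7900 XTX: ~83% of the performance for 90% of the price
xt_vs_xtx = value_ratio(0.83, 900 / 1000)

# 4080 12GB vs 4080 16GB: ~78% of the performance for ~75% of the price
r4080_12_vs_16 = value_ratio(0.78, 899 / 1199)

print(f"7900 XT vs XTX:   {xt_vs_xtx:.2f}")    # below 1.0: worse value than the XTX
print(f"4080 12 vs 16GB:  {r4080_12_vs_16:.2f}")
```

On these numbers the 7900 XT is actually a *worse* perf-per-dollar deal than the XTX, while the cancelled 4080 12GB would have been a slightly better one than the 16GB.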
 

I've been giving this some thought, and I came up with a scenario where it makes sense.

AMD wants to have the reference design list for $999 -- the "sub-$1,000" card. They won't make many. Instead, they'll turn the parts over to AIBs who will make OC models with expensive cooling for $1,200-$1,500.

When that happens, the XT will look appealing, the AIB XTXes will punch way above their class, and AMD can still say "It's less than a thousand dollars!" People will look at the XT and go "eh, I got it for MSRP (or less)" and everyone's happy. Ish.
 
The latest and greatest is always nice if you can afford it, but as the go-to budget pick, RDNA2 cards are still the price/performance kings for the next year or so until stocks are depleted. I'll be holding onto my 6900 until some time in the next year or two when RDNA3 has become the budget king. Rinse and repeat as always.
 
I've definitely got my eyes on the RX 7900 XTX, but it all hinges on something that hasn't been AMD's strong suit if Babel Tech Reviews is to be believed with the older RDNA 2 lineup - VR performance without synth/dropped frames. (There were a few games where what would be synth/reprojection frames on NVIDIA were straight-up dropped frames on AMD.)

Not interested in flat 4K monitor gaming yet (still on a 1080p120 monitor), not that interested in ray tracing, I just want something that actually makes DCS and other notoriously unoptimized flight sims playable in VR.

The problem there is that AMD apparently falls short with their drivers at launch (on Windows, anyway, Linux users hate NVIDIA drivers), but quickly picks up steam after a few months and the updates roll in. Nobody thinks to actually redo the reviews after said driver updates, though, just in case there's significant improvements in frame-pacing, microstutters, etc. that may have soured the experience at launch.
 

I think the pricing is off on the 7900xt, at least relative to the xtx.

To keep the same bus width and RAM count, your BOM cost would have to be the same as the xtx's: all 6 MCDs with the same core die. That negates one of the primary benefits of chiplets, the ability to combine cheaper cores with flexible I/O costs while still doing the standard binning that's used today.

The xt still uses the same big core chip as the xtx, so the actual cost difference between an xt and an xtx might reasonably be $100 (one 6nm MCD and one 4 GB GDDR6 chip).

Compare that to AD103, which is almost half the size of AD102: the difference between the 12GB and 16GB 4080 is much bigger than xt vs xtx, at least in actual cost to produce.
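As a toy illustration of that BOM argument, with completely made-up per-component costs (the $50 figures below are placeholders I invented to show the structure of the argument, not real AMD numbers):

```python
# Illustrative BOM delta between a 7900 XT and 7900 XTX configuration.
# Both dollar figures are invented placeholders, not real component costs.

MCD_COST = 50        # hypothetical cost of one 6nm memory-cache die (MCD)
GDDR6_4GB_COST = 50  # hypothetical cost of one 4 GB GDDR6 package

# Both cards use the same Navi 31 graphics die; the XT simply drops one MCD
# and one memory package relative to the XTX.
xt_savings = MCD_COST + GDDR6_4GB_COST
print(f"XT BOM savings vs XTX: ~${xt_savings}")
```

Whatever the real component costs are, the point stands that the XT saves only one small die and one memory chip, while the 4080 12GB would have used an entirely different, much smaller GPU die.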
 