Anybody plan on grabbing one of the new RX7xxx Navi cards on launch?

I've been giving this some thought, and I came up with a scenario where it makes sense.

AMD wants to have the reference design list for $999 -- the "sub-$1,000" card. They won't make many. Instead, they'll turn the parts over to AIBs who will make OC models with expensive cooling for $1,200-$1,500.

When that happens, the XT will be appealing, the AIB XTXes will punch way above their class, and AMD can still say "It's less than a thousand dollars!" People will look at the XT and go "eh, I got it for MSRP (or less)" and everyone's happy. Ish.
That scenario doesn't hold up, though. AMD's reference cards have been out in leaks for a month or two now.
 
I've definitely got my eyes on the RX 7900 XTX, but it all hinges on something that, if Babel Tech Reviews is to be believed about the older RDNA 2 lineup, hasn't been AMD's strong suit: VR performance without synthesized/dropped frames. (There were a few games where what would be synthesized/reprojected frames on NVIDIA were straight-up dropped frames on AMD.)

Not interested in flat 4K monitor gaming yet (still on a 1080p120 monitor), not that interested in ray tracing, I just want something that actually makes DCS and other notoriously unoptimized flight sims playable in VR.

The problem there is that AMD apparently falls short with their drivers at launch (on Windows, anyway; Linux users hate NVIDIA drivers), but quickly picks up steam after a few months as the updates roll in. Nobody thinks to actually redo the reviews after said driver updates, though, just in case there are significant improvements in frame pacing, microstutters, etc. that may have soured the experience at launch.
I run a Pimax 5K Super (two 2560x1440 panels) at 60 Hz in MSFS and 90 Hz in IL-2 Sturmovik; I haven't tried DCS yet and may not. My 6900 XT handles VR quite well with no stutter or pacing issues now, and the total pixel count per frame is just under the 4K pixel count. That's why I'm not really considering a new card until the scalper run is over and demand dies down, sometime late next year or early 2024. There were some early adopter pains with driver optimization, but that's always the case with AMD cards and one reason they are cheaper. Their drivers eventually get there, and it's all fun and games from there. Then the next-gen bug bites again.
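
For what it's worth, here's the back-of-envelope math behind that "just under 4K" claim, assuming the 5K Super's two 2560x1440 panels (one per eye) against a single 3840x2160 frame:

```python
# Rough pixel-count comparison (assumed panel specs, not measured):
pimax_per_frame = 2 * 2560 * 1440   # two 1440p panels -> 7,372,800 pixels
uhd_per_frame = 3840 * 2160         # one 4K frame     -> 8,294,400 pixels

print(pimax_per_frame / uhd_per_frame)  # ~0.89, i.e. roughly 11% fewer pixels than 4K
```

So per frame it really is a bit lighter than flat 4K, before accounting for any stereo rendering overhead.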
 
I'm going to try and grab a reference 7900 XTX to replace my 6900 XT, which is currently in my ITX system. If I can't nab a reference one at MSRP, I'll probably wait for the 7950 XT(X) refresh in ~6 months, if it is still coming. Has anyone heard anything about it?
 
I'm going to try and grab a reference 7900 XTX to replace my 6900 XT, which is currently in my ITX system. If I can't nab a reference one at MSRP, I'll probably wait for the 7950 XT(X) refresh in ~6 months, if it is still coming. Has anyone heard anything about it?
I know they are working on it but I haven't heard of any ETA. I would be surprised if we see anything next year.
 
I run a Pimax 5K Super (two 2560x1440 panels) at 60 Hz in MSFS and 90 Hz in IL-2 Sturmovik; I haven't tried DCS yet and may not. My 6900 XT handles VR quite well with no stutter or pacing issues now, and the total pixel count per frame is just under the 4K pixel count. That's why I'm not really considering a new card until the scalper run is over and demand dies down, sometime late next year or early 2024. There were some early adopter pains with driver optimization, but that's always the case with AMD cards and one reason they are cheaper. Their drivers eventually get there, and it's all fun and games from there. Then the next-gen bug bites again.
That's reassuring, though I've seen some YouTube videos hinting at MSFS2020 just not hitting 80-90 FPS in VR in dense cities, even with high-end cards like the RX 6900 XT and RTX 3090. I'm not sure even DCS has it that bad.


I think I'll be holding out for the RX 7900 cards; better performance aside, I think DisplayPort 2.1 support (something NVIDIA left off of Ada Lovelace!) will pay off in the long run, even though I'm currently lacking anything that has that or HDMI 2.1 - but I think it'll open up my options to better displays whenever I tire of the ol' Eizo FG2421 or Valve Index (would not be surprised if later HMDs actually do require DP 2.1 levels of bandwidth).
 
Do we know when the review embargo lifts? Nowadays it seems common for it to be zero minutes before launch, or close to it.
Does anyone have an answer to this?

And once these cards drop, what would be a fair price for a 6900xt?
 
Does anyone have an answer to this?

And once these cards drop, what would be a fair price for a 6900xt?
Not sure. Rumor is a couple of days before release, but that is just people talking. As for a fair price for a 6900 XT, I am thinking around $600.
 
Not sure. Rumor is a couple of days before release, but that is just people talking. As for a fair price for a 6900 XT, I am thinking around $600.
I'm kind of surprised that the 6900XT hasn't dropped a lot more. Lowest new price I see is $654 on Newegg. 6950XT's are at $830.
 
I plan to upgrade. I play some older games and use 4k where brute force raster performance is key.

My current monitors only do 60hz and some games don't even get to that. I'll want a 120hz 4k at some point and need all that number crunching power.
 
Are there any 4k 120hz monitors that can display games at 1440 just as good as native?

Can't be a TV, has to be ~27 in
 
Are there any 4k 120hz monitors that can display games at 1440 just as good as native?

Can't be a TV, has to be ~27 in
It'll always be a worse picture using a non-native resolution - especially text.

I noticed it from 6 to 8 feet away when I tried 1440P on my 4K LG CX, and I found it unacceptable. I'd wager that it would be even more noticeable with a traditional monitor where you sit even closer to it.
 
It'll always be a worse picture using a non-native resolution - especially text.

I noticed it from 6 to 8 feet away when I tried 1440P on my 4K LG CX, and I found it unacceptable. I'd wager that it would be even more noticeable with a traditional monitor where you sit even closer to it.
Desktop is definitely better set to the panel's native res. It's pretty easy to scale in Windows to make everything usable at 4K. Linux is a different story.

Gaming at a lower resolution than native is where anti-aliasing comes in. Traditional AA has usually had some performance hit to improve the image quality. DLSS & FSR are unique in that they gain performance by rendering at a lower resolution, but both still produce a level of artifacts that may or may not be noticeable during gameplay.

Playing Control on a 4K TV with a 6800 XT, you can do 4K without RT or 1080P with RT. 1080P is full of jaggies until you turn on MSAA, which makes edges and fine objects (like hair) look nearly as good as 4K native.

TLDR: It's not the panel that compensates for non-native resolution, it's AA.
 
Are there any 4k 120hz monitors that can display games at 1440 just as good as native?

Can't be a TV, has to be ~27 in
A projector is your best bet for that, because there are no 4K CRTs, and CRTs were really the last type of display where you could change the resolution without the picture quality looking horrible. The next best type after a projector would probably be OLED, so for 1440p just get a QD-OLED. Or stick to native 4K and just use FSR, XeSS, or DLSS, whatever your GPU uses.
 
Desktop is definitely better set to the panel's native res. It's pretty easy to scale in Windows to make everything usable at 4K. Linux is a different story.
Yep. Been using 4K since 2014 and had little issue with scaling in Win10. Any issues have been resolved for years now.
 
Gotcha. So even if I had infinite money, the 4090 gets 120 FPS+ in 13 games at 4K, but it doesn't get 120 FPS in 11 games. Point being, since I value framerate over resolution, I would still need to go down to 1440p in half the games I play. There is no monitor that can do this well now? 4K 120 and 1440p 120, with desktop res still at 4K?

https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/31.html
1440p to 4K is not an even division. 1080p works because it's exactly 1:2 per axis, or 1/4 the pixels of 4K, so each source pixel simply maps to two pixels in each dimension (both the x and y axes). 1440p is a halfway step; you can't do half pixels, so the scaler fiddles through, with marginal results at times. The only way to tell is to try it and see how good the upscale system in the screen is (if one exists). Some are good, some are less good, and some don't really have one.
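
Just to put numbers on it, a small sketch of that scaling math (assuming the standard 1920x1080, 2560x1440, and 3840x2160 pixel grids):

```python
# Per-axis scale factor from each source resolution up to 4K (standard pixel grids assumed).
native_4k = (3840, 2160)

for name, (w, h) in [("1080p", (1920, 1080)), ("1440p", (2560, 1440))]:
    sx, sy = native_4k[0] / w, native_4k[1] / h
    clean = sx.is_integer() and sy.is_integer()
    note = "clean 2x2 pixel mapping" if clean else "non-integer, the scaler has to interpolate"
    print(f"{name}: {sx:.2f}x per axis -> {note}")

# Output:
# 1080p: 2.00x per axis -> clean 2x2 pixel mapping
# 1440p: 1.50x per axis -> non-integer, the scaler has to interpolate
```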
 
Gotcha. So even if I had infinite money, the 4090 gets 120 FPS+ in 13 games at 4K, but it doesn't get 120 FPS in 11 games. Point being, since I value framerate over resolution, I would still need to go down to 1440p in half the games I play. There is no monitor that can do this well now? 4K 120 and 1440p 120, with desktop res still at 4K?

https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/31.html
If you've got the money, then just get two monitors: one 1440p and one 4K. Otherwise, you will always compromise.
 
Since I have money x money: you know those chalkboards in colleges that flip over to a new side when you want to write more? Do they make monitors like that? 4K on one side, 1440p on the other? Again, I have money.
 
Are there any 4k 120hz monitors that can display games at 1440 just as good as native?

Can't be a TV, has to be ~27 in
Why does it have to be the display when you can just use RSR/NIS? This is why upscaling technology on the GPU side has been advancing in the past few years. FSR and DLSS (quality modes) would be 1440p internal resolution with 4K output and look close to 4K native. Even RSR/NIS get fairly close with a little sharpening.
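
If it helps, here's roughly how the quality-mode numbers work out, assuming the commonly cited ~1.5x per-axis (about 67%) render scale for the FSR/DLSS Quality presets:

```python
# Internal render resolution for a 4K output at the "Quality" preset
# (assumes the ~1.5x per-axis upscale factor commonly cited for FSR/DLSS Quality).
output_w, output_h = 3840, 2160
scale = 1 / 1.5  # ~0.667 per axis

internal = (round(output_w * scale), round(output_h * scale))
print(internal)  # (2560, 1440) -> the upscaler reconstructs 4K from a 1440p-class frame
```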
 
Do you think it will be worth holding out for the AIB 7900xtx cards or just grab the reference cards?
 
Do you think it will be worth holding out for the AIB 7900xtx cards or just grab the reference cards?
I am getting the reference card due to its size and it only using two 8-pins. The AIB cards will be faster, but they will use more power, be bigger, and be more expensive; I have heard $200+.
 
My main question is whether or not 3x 8-pins, beefed-up components, and a higher power limit would be worth the extra cost. Sometimes the reference cards overclock just as well as the custom AIB cards.
 
I am getting the reference card due to its size and it only using two 8-pins. The AIB cards will be faster, but they will use more power, be bigger, and be more expensive; I have heard $200+.
My main question is whether or not 3x 8-pins, beefed-up components, and a higher power limit would be worth the extra cost. Sometimes the reference cards overclock just as well as the custom AIB cards.
 
My main question is whether or not 3x 8-pins, beefed-up components, and a higher power limit would be worth the extra cost. Sometimes the reference cards overclock just as well as the custom AIB cards.
Depends on the price, but it wouldn't surprise me if they can hit 3 GHz and be faster than a 4090. In that case it might be worth it, but for my use case I need the reference card.
 
My main question is whether or not 3x 8-pins, beefed-up components, and a higher power limit would be worth the extra cost. Sometimes the reference cards overclock just as well as the custom AIB cards.
Depends on how much the extra cost is; if it's anything like the infamous $1,550 Asus ROG Strix variant of the RTX 4080, then that's an instant GTFO, because you'd only have to spend $50 more for a much, much faster card in the form of the RTX 4090.

$100 is tolerable. $200 is really stretching it; there'd better be a factory full-cover waterblock on there, and on that note, waterblock compatibility becomes a very real concern with non-reference cards.
 
Being that there's a very minor performance difference between all of these AIB cards, I'm only interested in ones that have beefier coolers to keep the temps and noise down.
 
Being that there's a very minor performance difference between all of these AIB cards, I'm only interested in ones that have beefier coolers to keep the temps and noise down.
Well, if AIBs bring those ridiculous 4090 coolers, which also migrated to the 4080 for no good reason, to the RDNA 3 Navi 31 lineup, you will have some very quiet coolers to pick from. I am hoping AIBs put on coolers that are appropriate, well-engineered designs for the AMD GPU, and not a 600 W Lovelace design for a GPU that was eventually cut down to 450 W while the cooler wasn't.
 
Well, if AIBs bring those ridiculous 4090 coolers, which also migrated to the 4080 for no good reason, to the RDNA 3 Navi 31 lineup, you will have some very quiet coolers to pick from. I am hoping AIBs put on coolers that are appropriate, well-engineered designs for the AMD GPU, and not a 600 W Lovelace design for a GPU that was eventually cut down to 450 W while the cooler wasn't.
The benefit of the massive cooler on the 4080 is that the thing runs very cool, and even on the reference cards it means you're hitting very good frequencies without even touching anything.

Will be curious to see what the story is here. Although, once you enter 4080 pricing territory at $1200 it's going to be stiff competition.
 
I plan on water cooling so I'm not sure if the AIB cards will even overclock any higher. I remember a fairly recent generation of gpus where custom AIB cards and even water-cooling didn't make much of a difference for overclocking when compared to the reference cards.
 
It was the same situation with the 6800 and 6800 XT.

XT was a lot faster and not that much more expensive.
They seem to do this all the time. I think it's just repurposing chips that failed to make the cut for the higher-end product. There's always a price premium, it seems, because they're never produced in very high quantities anyway (the 5800X was similar on the CPU side). Last gen they probably made a mistake pricing the 6800 XT so much cheaper than the 6900 XT when the performance was close, so now they've raised the price on its replacement. I'm guessing the 6800 equivalent will come in around $600 again this time.
 
They seem to do this all the time. I think it's just repurposing chips that failed to make the cut for the higher-end product. There's always a price premium, it seems, because they're never produced in very high quantities anyway (the 5800X was similar on the CPU side). Last gen they probably made a mistake pricing the 6800 XT so much cheaper than the 6900 XT when the performance was close, so now they've raised the price on its replacement. I'm guessing the 6800 equivalent will come in around $600 again this time.
I am thinking along similar lines; having the 7900 XT at $900 does indicate the 6800 XT/XTX(?) versions will be much cheaper. To me it looks like AMD is going to have a kickass full lineup once complete. Some guesses, most likely wrong:
  • 6800XTX - $800
  • 6800XT - $700
  • 6700XTX - $600
  • 6700XT - $500
There will be a ceiling on the price while the 6900 series is at its MSRP. The 6800X, depending on what die it uses, could be cheaper, which will drive everything below it lower. I have no clue what Nvidia has planned and what pricing they will have to charge.
 
I am thinking along similar lines; having the 7900 XT at $900 does indicate the 6800 XT/XTX(?) versions will be much cheaper. To me it looks like AMD is going to have a kickass full lineup once complete. Some guesses, most likely wrong:
  • 6800XTX - $800
  • 6800XT - $700
  • 6700XTX - $600
  • 6700XT - $500
There will be a ceiling on the price while the 6900 series is at its MSRP. The 6800X, depending on what die it uses, could be cheaper, which will drive everything below it lower. I have no clue what Nvidia has planned and what pricing they will have to charge.
Did you mean the "78xx/77xx" variants? ;)
 
Holding pattern here. If I can grab a 7900 XTX at launch for MSRP, I will be happy to do so. If not, a 6900 XT for under $700 or... hanging onto my 6700 XT are not bad plans B/C.
 