7900xtx or 4080 super?

Unfortunately, you're here in a thread to suggest a card for someone else. Your personal preference, suggesting a card that's WORSE, is counterproductive to the thread.

It's advice, good or bad, so it's relevant to the thread. Agreeing or disagreeing is a personal opinion on said advice.
 
Looks like guchengxianqinglingshangmaoyouxiangongsi is the new seller of choice for these cards :D

Check out that solid 14% feedback rating

Ah yes, good old scammer storefront. I saw those today too.
 
I am sorta confused here.

If both cards, outside of RT, are the same, why wouldn't you buy the similarly priced card that also offers an extra feature?

It seems you are suggesting AMD just "to buy something different".

It seems counterproductive to buy a GPU, similarly priced, with fewer features just "to be different".

I mean, you can do you and so whatever you want, but it's shitty advice.
Similarly priced? A $200 difference is a significant difference when it comes to budget. I looked at the 4080S before I bought my 7900xtx and realized I didn't want to spend the extra 200 dollars. I got mine for $929 and couldn't justify the extra money for a minor performance increase.
Fanboy BS is getting pissed because someone suggested something other than what you think it should be. Personal choice is way more nuanced than what everyone here is yelling about.
 
Similarly priced? A $200 difference is a significant difference when it comes to budget. I looked at the 4080S before I bought my 7900xtx and realized I didn't want to spend the extra 200 dollars. I got mine for $929 and couldn't justify the extra money for a minor performance increase.
Fanboy BS is getting pissed because someone suggested something other than what you think it should be. Personal choice is way more nuanced than what everyone here is yelling about.

I think the 7900XTX is slightly faster overall in raster, much slower in RT. Kind of a toss-up. I have a 4080 Super FE here that was purchased for a build for someone I know; it was easily snagged at $999 from the Nvidia site.
 
Unfortunately, you're here in a thread to suggest a card for someone else. Your personal preference, suggesting a card that's WORSE, is counterproductive to the thread.
It's not worse. And you are the one being counterproductive, because only YOUR opinion matters. Only your superior opinion is productive. You completely dismissed all the other people in the thread stating the xtx is a good option. It's not worse at all. It's at least 200 bucks less and puts out slightly better raster performance. Who cares anymore. I don't care. Peace out. The thread has more than addressed the OP's query.
 
Similarly priced? A $200 difference is a significant difference when it comes to budget. I looked at the 4080S before I bought my 7900xtx and realized I didn't want to spend the extra 200 dollars. I got mine for $929 and couldn't justify the extra money for a minor performance increase.
Fanboy BS is getting pissed because someone suggested something other than what you think it should be. Personal choice is way more nuanced than what everyone here is yelling about.

That's the thing: if it was a $200 difference, then the 7900 XTX would be a solid value.

Over the last two weeks, I've been keeping tabs on 7900 and 4080S availability, and so far the lowest I've seen a 7900 XTX go for is $909 new, while 4080S cards have been routinely popping back in stock at $999-$1029.

AMD really needs to drop the MSRP of the 7900 XTX to $799-$849 to make it compelling, IMO.
 
That's the thing: if it was a $200 difference, then the 7900 XTX would be a solid value.

Over the last two weeks, I've been keeping tabs on 7900 and 4080S availability, and so far the lowest I've seen a 7900 XTX go for is $909 new, while 4080S cards have been routinely popping back in stock at $999-$1029.

AMD really needs to drop the MSRP of the 7900 XTX to $799-$849 to make it compelling, IMO.
That's exactly it. I just upgraded my GPU to a 4080 Super, and one of the deciding factors was that the 7900 xtx was within $100 CAD of the 4080 Super.

For basically the same price, I'm going for the card with far better RT performance and, from what I can tell, better software for frame generation. I only upgrade every 5+ years (my last card was a GTX 1080, lol), so things like better DLSS support are a selling feature; I want to get every ounce I can from the hardware.

If the 7900xtx had been a solid $200-$300 CAD cheaper ($100-$200 USD), that would have massively swayed my choice.
 
If you had to pick between a Powercolor Red Devil, MSI Gaming, Sapphire Nitro+ or Asus TUF 7900 xtx, which would you pick (and why)? All used. Powercolor still seems to get complaints from ppl - mostly temps? Sometimes coil whine. The others rarely seem to draw complaints, although sometimes the Sapphire does. The other two are not AMD AIBs, but the warranty terms are better in my country: if the receipt or invoice is lost/missing etc., the company goes by the serial #, whereas the AIB ones refuse warranty service if you don't have anything. I think I can get the receipt from the sellers of either card.

I'm posting this because a 4080 Super is not an option because of price, and I'm not seeing used 4080s for decent prices anymore. Plus, I'm wondering if there's some brand preference now... lots of guys on here have all these 7900 xtx (brands). :)
Right?????
 
If you had to pick between a Powercolor Red Devil, MSI Gaming, Sapphire Nitro+ or Asus TUF 7900 xtx, which would you pick (and why)? All used. Powercolor still seems to get complaints from ppl - mostly temps? Sometimes coil whine. The others rarely seem to draw complaints, although sometimes the Sapphire does. The other two are not AMD AIBs, but the warranty terms are better in my country: if the receipt or invoice is lost/missing etc., the company goes by the serial #, whereas the AIB ones refuse warranty service if you don't have anything. I think I can get the receipt from the sellers of either card.

I'm posting this because a 4080 Super is not an option because of price, and I'm not seeing used 4080s for decent prices anymore. Plus, I'm wondering if there's some brand preference now... lots of guys on here have all these 7900 xtx (brands). :)
Right?????
If we're talking about the Sapphire NITRO+ RX 7900 XTX Vapor-X, it has a higher TDP than the other models, so it's supposed to clock higher and be a little faster.
 
Higher than stock? Why would someone want to do that with a 7900 xtx? :unsure:
Undervolting and finding the max boost with the max power limit will get your card a few % more performance at the same TDP/temps - so why not? :)
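By way of illustration: on Linux, the amdgpu driver exposes the board power limit through the standard hwmon files, so the "max power limit" half of that tweak can be scripted. A minimal sketch, assuming Linux with amdgpu and that card0 is the discrete GPU (on Windows you'd do the same thing through the Adrenalin tuning UI; the undervolt half lives in pp_od_clk_voltage and is gated behind amdgpu.ppfeaturemask):

```python
# Minimal sketch, Linux + amdgpu only: read the GPU's current power cap and
# raise it to the board maximum. Assumes card0 is the discrete GPU.
from pathlib import Path

# amdgpu exposes power limits via the standard hwmon interface.
hwmon = next(Path("/sys/class/drm/card0/device/hwmon").glob("hwmon*"))

cap = int((hwmon / "power1_cap").read_text())          # current cap, microwatts
cap_max = int((hwmon / "power1_cap_max").read_text())  # board power limit
print(f"current cap: {cap / 1e6:.0f} W, board max: {cap_max / 1e6:.0f} W")

# Writing the cap requires root.
(hwmon / "power1_cap").write_text(str(cap_max))
```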
 
Looks like the 7900xtx is down to $850 in some cases. If it goes down another $100, then it might make sense to buy it.
 
As someone who's had a 7900XT for a little over a month now, I wish I had returned it and gone with a 4070TiS or splurged for a 4080S. The AMD GPU is fine until you decide to max a game's settings, only to find it has RT and you're reduced to unplayable framerates unless you enable FSR, which generally just looks bad in most games compared to DLSS. If you're not running RT then it's fine, but in that case it hardly matters what card you have, as most of these 'high-end' GPUs are close enough anyway. For instance, my 2080ti was already fine in CP2077 w/o RT; the 7900XT feels slightly better, but it's really not that much different, and when I enable RT they're basically the same and I'm required to use AFMF anyway, which just doesn't feel very good. Do I NEED RT? Not at all, but it sucks to know that your 'expensive' new card was barely an upgrade over your 2-gen-old card that you bought for the same amount of cash 4 or 5 years ago.

The 7900XT is generally 50% faster than my old 2080ti in pure raster titles, which you'd think would be great. The issue is that where the 2080ti was fine, the 7900XT's extra performance isn't noticeable; and where the 2080ti struggled at 60fps, the 7900XT is doing 90fps, which, while technically substantially faster, isn't a big deal since it still can't hit my monitor's max refresh in those games (3440x1440 @ 120hz). At least with a 4070TiS I'd be about the same on raster and much better when I do want to use RT.
 
As someone who's had a 7900XT for a little over a month now, I wish I had returned it and gone with a 4070TiS or splurged for a 4080S. The AMD GPU is fine until you decide to max a game's settings, only to find it has RT and you're reduced to unplayable framerates unless you enable FSR, which generally just looks bad in most games compared to DLSS. If you're not running RT then it's fine, but in that case it hardly matters what card you have, as most of these 'high-end' GPUs are close enough anyway. For instance, my 2080ti was already fine in CP2077 w/o RT; the 7900XT feels slightly better, but it's really not that much different, and when I enable RT they're basically the same and I'm required to use AFMF anyway, which just doesn't feel very good. Do I NEED RT? Not at all, but it sucks to know that your 'expensive' new card was barely an upgrade over your 2-gen-old card that you bought for the same amount of cash.

The 7900XT is generally 50% faster than my old 2080ti in pure raster titles, which you'd think would be great. The issue is that where the 2080ti was fine, the 7900XT's extra performance isn't noticeable; and where the 2080ti struggled at 60fps, the 7900XT is doing 90fps, which, while technically substantially faster, isn't a big deal since it still can't hit my monitor's max refresh in those games (3440x1440 @ 120hz). At least with a 4070TiS I'd be about the same on raster and much better when I do want to use RT.

If you want ray tracing, then you have to pay the Nvidia tax, especially with the amount of pixels you are pushing. Honestly, I can tell if you're using DLSS or FSR; they both have obvious visual artifacts, which is why I never want either on for any reason. Honestly, there are very few titles where turning on ray tracing made an exciting difference for me. Path tracing looks far more interesting, but neither brand is giving you a good experience in that unless the game is super old.
 
If you want ray tracing, then you have to pay the Nvidia tax, especially with the amount of pixels you are pushing. Honestly, I can tell if you're using DLSS or FSR; they both have obvious visual artifacts, which is why I never want either on for any reason. Honestly, there are very few titles where turning on ray tracing made an exciting difference for me. Path tracing looks far more interesting, but neither brand is giving you a good experience in that unless the game is super old.
Not caring about RT is a legitimate stance for anyone to have. BUT I enjoy being able to use it when it's there, and RDNA3 not really letting me use it any more than my 2-gen-old Turing card did is just a bit disappointing. It was more of a sidegrade than a straight upgrade, since the places where I really needed the performance were already good enough with the 2080Ti, and the games that had performance issues were those with RT, where I didn't really gain much. So if I were to do it over again, I'd drop the extra $100 on a 4070TiS and still have an unnoticeable difference in raster but usable RT for when I wanted it.

With FSR I just run into quite a few games where it looks straight-up bad (smearing textures, flashing, etc.), whereas when I was able to run DLSS it looked acceptable the majority of the time, and when it didn't, it was usually a DLL swap away from looking almost perfect.
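For anyone who hasn't done the swap: games ship the DLSS runtime as nvngx_dlss.dll in (or under) the install folder, and replacing it with a newer copy of that file is the whole trick. A minimal Python sketch; both paths here are hypothetical, and some games keep the DLL in a subdirectory:

```python
# Hypothetical sketch: swap a game's bundled DLSS DLL for a newer version.
# Point game_dir at the actual install folder and new_dll at a newer
# nvngx_dlss.dll obtained separately.
from pathlib import Path
import shutil

game_dir = Path(r"C:\Games\SomeGame")           # hypothetical install path
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS DLL

target = game_dir / "nvngx_dlss.dll"
if target.exists():
    # Keep the original so the swap is reversible.
    shutil.copy2(target, target.with_name("nvngx_dlss.dll.bak"))
shutil.copy2(new_dll, target)
print(f"Swapped DLSS DLL in {game_dir}")
```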

Adrenalin has the best performance-metrics overlay I've ever used, but the rest of it feels overdone: it loves to conflict with other programs on occasion, some things just don't work (Video Upscale, for one), and it's a bit slow in use. If I'm popping it up during a game, it's because something doesn't feel right and I'm toggling settings to see which option is causing it; with my Nvidia GPU I didn't have that option, but I also never felt like I had to make any driver-level changes anyway.

Anyway, I am definitely noticing the things I've lost in the switch and feel like I didn't gain much elsewhere. I wouldn't steer anyone away from buying RDNA3, but I would say that the Nvidia tax is worth it.
 
Not caring about RT is a legitimate stance for anyone to have. BUT I enjoy being able to use it when it's there, and RDNA3 not really letting me use it any more than my 2-gen-old Turing card did is just a bit disappointing. It was more of a sidegrade than a straight upgrade, since the places where I really needed the performance were already good enough with the 2080Ti, and the games that had performance issues were those with RT, where I didn't really gain much. So if I were to do it over again, I'd drop the extra $100 on a 4070TiS and still have an unnoticeable difference in raster but usable RT for when I wanted it.

With FSR I just run into quite a few games where it looks straight-up bad (smearing textures, flashing, etc.), whereas when I was able to run DLSS it looked acceptable the majority of the time, and when it didn't, it was usually a DLL swap away from looking almost perfect.

Adrenalin has the best performance-metrics overlay I've ever used, but the rest of it feels overdone: it loves to conflict with other programs on occasion, some things just don't work (Video Upscale, for one), and it's a bit slow in use. If I'm popping it up during a game, it's because something doesn't feel right and I'm toggling settings to see which option is causing it; with my Nvidia GPU I didn't have that option, but I also never felt like I had to make any driver-level changes anyway.

Anyway, I am definitely noticing the things I've lost in the switch and feel like I didn't gain much elsewhere. I wouldn't steer anyone away from buying RDNA3, but I would say that the Nvidia tax is worth it.
Thanks for your perspective. I've never seen RT in action, so I have no idea how impressive it is. My situation is that I'm running a 2070 in a 2018 Alienware laptop that I bought used from a member here back in 2019. The laptop has done me well enough over the years and can play indie games and older games well enough, but now that I want to get back into gaming with more recent games (especially Cyberpunk), it's in no way powerful enough to run games at 4k on the LG 48CX I bought 2 years ago, even without RT.

A 7900XTX would likely deliver good raster performance at 4k (at 120hz with maxed in-game settings but RT off?), far better than the 2070 laptop card. But I don't want to pay close to $1k for that card, since I want my hardware to deliver good performance at high settings for at least 3 years before I upgrade. Anything at or above $1k would have to be NVIDIA, since AMD can't do RT nearly as well.
 
Thanks for your perspective. I've never seen RT in action, so I have no idea how impressive it is. My situation is that I'm running a 2070 in a 2018 Alienware laptop that I bought used from a member here back in 2019. The laptop has done me well enough over the years and can play indie games and older games well enough, but now that I want to get back into gaming with more recent games (especially Cyberpunk), it's in no way powerful enough to run games at 4k on the LG 48CX I bought 2 years ago, even without RT.

A 7900XTX would likely deliver good raster performance at 4k (at 120hz with maxed in-game settings but RT off?), far better than the 2070 laptop card. But I don't want to pay close to $1k for that card, since I want my hardware to deliver good performance at high settings for at least 3 years before I upgrade. Anything at or above $1k would have to be NVIDIA, since AMD can't do RT nearly as well.
Think of RT as the new "ultra". It is the benchmark for high visual quality now :). Not that games can't look great without it, but it's a whole extra tier of visuals for games that implement it well.
 
Think of RT as the new "ultra". It is the benchmark for high visual quality now :). Not that games can't look great without it, but it's a whole extra tier of visuals for games that implement it well.
Yeah, I think of RT the same way, as a new "ultra". The small problem with that is that RT settings can be cranked high enough to kill even a 6090 with no problem.
 
Thanks for your perspective. I've never seen RT in action, so I have no idea how impressive it is. My situation is that I'm running a 2070 in a 2018 Alienware laptop that I bought used from a member here back in 2019. The laptop has done me well enough over the years and can play indie games and older games well enough, but now that I want to get back into gaming with more recent games (especially Cyberpunk), it's in no way powerful enough to run games at 4k on the LG 48CX I bought 2 years ago, even without RT.

A 7900XTX would likely deliver good raster performance at 4k (at 120hz with maxed in-game settings but RT off?), far better than the 2070 laptop card. But I don't want to pay close to $1k for that card, since I want my hardware to deliver good performance at high settings for at least 3 years before I upgrade. Anything at or above $1k would have to be NVIDIA, since AMD can't do RT nearly as well.
You should be able to find a 7900XTX closer to $800 if you look around and wait for a sale (and by "wait" I mean a week or two, not months). Make sure you have a very high-end PSU regardless of your choice, AMD or Nvidia, as the current high-end cards have some wild brief power spikes that play havoc with lesser PSUs.

Also, people keep saying the 7900XTX "can't do RT well" and that's kind of bullshit. As well as a current generation, more expensive Nvidia card? No. As well or better than a 3090, the previous generation Nvidia halo card? Yes.

If you do get an XTX (or honestly, given your use case and current gear, an XT or GRE would still be a huge step up) my strongest recommendation is to join the AMD Vanguard beta program immediately. Getting in the beta channel has been the difference between absolutely hating this thing and being only occasionally annoyed with it. I can't say more than that due to NDA, but just... trust me.
 
Make sure you have a very high-end PSU regardless of your choice, AMD or Nvidia, as the current high-end cards have some wild brief power spikes that play havoc with lesser PSUs.
The 6800 XT has higher spikes than the 7900 XT, and the same goes for the 6900 XT vs the 7900 XTX.
 
You should be able to find a 7900XTX closer to $800 if you look around and wait for a sale (and by "wait" I mean a week or two, not months). Make sure you have a very high-end PSU regardless of your choice, AMD or Nvidia, as the current high-end cards have some wild brief power spikes that play havoc with lesser PSUs.

Also, people keep saying the 7900XTX "can't do RT well" and that's kind of bullshit. As well as a current generation, more expensive Nvidia card? No. As well or better than a 3090, the previous generation Nvidia halo card? Yes.

If you do get an XTX (or honestly, given your use case and current gear, an XT or GRE would still be a huge step up) my strongest recommendation is to join the AMD Vanguard beta program immediately. Getting in the beta channel has been the difference between absolutely hating this thing and being only occasionally annoyed with it. I can't say more than that due to NDA, but just... trust me.
Thanks

I'd go for a 7900xtx if/when it hits the $700 price
 
Will we see changes this year? My 3090 is doing fine but I would like to upgrade at some point.
 
If you care about raytracing, get the 4080, or ideally save up for a 4090. If you don't really care about raytracing and plan to keep the card more than a couple of years, the 7900XTX. I have had a lot more problems with AMD drivers than I have with Nvidia's in the past, but experiences vary.

This is what it comes down to with the entire AMD vs Nvidia debate...if you care about ray tracing, go with Nvidia...if not, go with AMD...DLSS used to be a big win as well for Nvidia, but AMD seems to be getting better with FSR
 
I did start having the issue with the 5800X3D after I upgraded to the 4090 FE (it was never an issue with the 3090 FTW3), but it wasn't at this frequency; with the 7800X3D it happens multiple times a day, and is easily repeatable by alt-tabbing / selecting different windows - they don't even need to be ones with audio playing. EDIT - same Onkyo TX-RZ810 receiver used across both the working and the not-really-working Dolby Atmos setups, same firmware even (it hasn't gotten updates in a long time).

I can say that it happened in Windows 10 and 11. It has happened with at least the last 6 WHQL drivers, including clean installs and updating over the old. It still happens after clean installing the X670E AMD chipset drivers. Uninstalling and re-installing Dolby Access didn't fix it, nor did installing DTS:X. Using Soundkeeper helps to slightly reduce the frequency of the events (I did start using this with my 5800X3D and 4090 and it helped tremendously, but with the 7800X3D it doesn't seem to help nearly as much). Disabling Atmos upmixing in the Dolby Access app definitely reduced the frequency (I went a full day without it happening on my 7800X3D, then it happened again this morning). I've also tried deactivating HPET and the dynamic synthetic timer (some random solution that I found on M$ answers). I even went so far as purchasing a new set of VESA certified HDMI 2.1 cables. This issue does not occur when setting audio to Stereo (I never use this), 5.1 or DTS:X. Only Dolby Atmos.
You might read up on this: https://forums.guru3d.com/threads/w...ge-signaled-based-interrupts-msi-tool.378044/
And check your 4090's interrupt config. Although I suspect it would be set up in the best way, since it is a recent hardware installation.

Mine is in Message Signaled Interrupt (MSI) mode:
[screenshot attachment]

The only devices in my system using the old line-based interrupt mode are the audio controllers:
[screenshot attachment]

So I'm not sure if that means audio is better off in line-based mode or not. I don't have any issues with my audio, but I'm not using Dolby Atmos either.

The forum post links a utility that I have no idea if it's safe to use. But checking what interrupt mode your devices are in might reveal something.

I mean, it could just be a driver bug, but that seems unlikely since you said you had no issues with the 3090 in the same use case. It might be something to do with thread placement on the CPU, if that's an MCM-design CPU; workloads crossing chiplets incur some latency.

There is an old utility that could also point out issues: LatencyMon. It's pretty useful: https://www.resplendence.com/latencymon
Check with this and let us know what it shows.
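For what it's worth, the setting that utility flips is (as I understand it) the MSISupported value under each PCI device's Interrupt Management registry key, so you can audit the current state without running the tool at all. A minimal read-only Python sketch, assuming Windows; run it elevated, and expect the device instance paths to differ on every machine:

```python
# Read-only sketch: walk HKLM\SYSTEM\CurrentControlSet\Enum\PCI and report
# each device's MSISupported value, if present.
import winreg

PCI_ENUM = r"SYSTEM\CurrentControlSet\Enum\PCI"
MSI_SUBKEY = (r"Device Parameters\Interrupt Management"
              r"\MessageSignaledInterruptProperties")

def subkeys(key):
    """Yield the names of a registry key's immediate subkeys."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
            i += 1
        except OSError:  # no more subkeys
            return

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, PCI_ENUM) as pci:
    for hwid in subkeys(pci):
        with winreg.OpenKey(pci, hwid) as dev:
            for inst in subkeys(dev):
                try:
                    with winreg.OpenKey(dev, inst + "\\" + MSI_SUBKEY) as msi:
                        val, _ = winreg.QueryValueEx(msi, "MSISupported")
                    print(f"{hwid}\\{inst}: MSISupported={val}")
                except OSError:
                    # No MSI key: presumably legacy line-based interrupts.
                    print(f"{hwid}\\{inst}: no MSI key (line-based?)")
```

Devices with no MSI key (or MSISupported=0) should be the ones showing the positive, legacy IRQ numbers in Device Manager.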
 
You might read up on this: https://forums.guru3d.com/threads/w...ge-signaled-based-interrupts-msi-tool.378044/
And check your 4090's interrupt config. Although I suspect it would be set up in the best way, since it is a recent hardware installation.

Mine is in Message Signaled Interrupt (MSI) mode:
View attachment 648719
The only devices in my system using the old line-based interrupt mode are the audio controllers:
View attachment 648723
So I'm not sure if that means audio is better off in line-based mode or not. I don't have any issues with my audio, but I'm not using Dolby Atmos either.

The forum post links a utility that I have no idea if it's safe to use. But checking what interrupt mode your devices are in might reveal something.

I mean, it could just be a driver bug, but that seems unlikely since you said you had no issues with the 3090 in the same use case. It might be something to do with thread placement on the CPU, if that's an MCM-design CPU; workloads crossing chiplets incur some latency.

There is an old utility that could also point out issues: LatencyMon. It's pretty useful: https://www.resplendence.com/latencymon
Check with this and let us know what it shows.
Sadly, that's 90% Greek to me - I just don't mess with software at that level. It works, or it doesn't - and it doesn't. I keep reporting the issue on the Nvidia troubleshooting reddit thread, and submitting direct reports to Nvidia.
 
Right-click your start button and you get this menu:

[screenshot attachment]

Select Device Manager.
Then click View - Resources by type.

[screenshot attachment]

Expand the Interrupt Request (IRQ) line.
Then scroll way down to the (PCI) section and find your 4090. Then just look to see whether it has a positive or negative number in the parentheses (negative means MSI mode; positive means the old line-based mode).

But really, running LatencyMon (all you do is double-click latencymon.exe) is probably going to show more helpful information.
 
Right-click your start button and you get this menu:

View attachment 648925

Select Device Manager.
Then click View - Resources by type.

View attachment 648926

Expand the Interrupt Request (IRQ) line.
Then scroll way down to the (PCI) section and find your 4090. Then just look to see whether it has a positive or negative number in the parentheses (negative means MSI mode; positive means the old line-based mode).

But really, running LatencyMon (all you do is double-click latencymon.exe) is probably going to show more helpful information.
Thank you, but LatencyMon has shown high spikes ever since I first looked at it with the 3090 (which didn't have the issue). Same spikes with the 4090. It points to the Nvidia kernel driver nvlddmkm.sys, and sometimes to wdf01000.sys and HDAudBus.sys.

The IRQ is -144.
 
Can you post a screensnip of the LatencyMon results?

In the interrupt list, find the PCIs with positive numbers in the parentheses, screensnip those too, and paste them here. I believe the items that use the old line-based interrupts could be what's hogging system resources just enough to cause you problems.
 