7900xtx or 4080 super?

peppergomez

If you were buying a card for gaming at 4K resolution, which one of these would you choose?

I think their prices are similar, and to my limited knowledge it seems like the advantage the 4080 has is 4K gaming, while the advantage the 7900 has is more VRAM and possibly faster rasterized performance.

As far as the most demanding games I'd be playing with them, Cyberpunk would probably be the most extreme.

If AMD releases the 7950 this year, I wonder if prices on the 7900 XTX will fall.
 
If you care about ray tracing, get the 4080, or ideally save up for a 4090. If you don't really care about ray tracing and plan to keep the card more than a couple of years, get the 7900 XTX. I have had a lot more problems with AMD drivers than with Nvidia's in the past, but experiences vary.
 
Right, I don't know enough about ray tracing, having never played with it directly, to know if I care or not.

From what I understand it's very much a niche feature that really only shows up in a handful of games in a meaningful way, so I wouldn't call it a key feature unless it's something that's going to become the norm. In general I don't obsess over image quality, but I do want to be able to game at my monitor's native res of 4K.
 
Right, I don't know enough about ray tracing, having never played with it directly, to know if I care or not.

From what I understand it's very much a niche feature that really only shows up in a handful of games in a meaningful way, so I wouldn't call it a key feature unless it's something that's going to become the norm. In general I don't obsess over image quality, but I do want to be able to game at my monitor's native res of 4K.
If you don't really care about RT (like me), then the 7900 XTX will certainly be in consideration for you. It has really good raster performance.
 
If they're the same price then 4080S all the way. I've been wanting a 7900XTX but it's not a great deal now that the 4080S is the same MSRP. I expect that AMD will drop the prices of most of their GPUs now that the Supers are here.
 
Not worried about Ray tracing, but AMD driver issues are something I still don't want to have to deal with.
 
Not worried about Ray tracing, but AMD driver issues are something I still don't want to have to deal with.
Both Nvidia and AMD are far from perfect. I had quite a few issues with Nvidia in the last 4-5 months of 2023 in newer titles, sometimes having to wait a few days for fixed drivers before I could play a game. Usually both are fairly decent, but there have been rough patches with both AMD and Nvidia over the years.
 
Right, I don't know enough about ray tracing, having never played with it directly, to know if I care or not.

From what I understand it's very much a niche feature that really only shows up in a handful of games in a meaningful way, so I wouldn't call it a key feature unless it's something that's going to become the norm. In general I don't obsess over image quality, but I do want to be able to game at my monitor's native res of 4K.
RT is really nice when done right, but only a few games do it right; most just throw it in to tick a box. Alan Wake 2, Control, Cyberpunk 2077, Metro Exodus Enhanced Edition and a few more have done it right and look gorgeous with it at a reasonable level. The drawback is that RT turns even the 4090 into a 1440p card unless you want to use DLSS or can live with low framerates. I would wait for reviews and potential price reductions from AMD before deciding, though, if RT is not a killer feature for you.
 
4080 Super unless you can get a 7900 XTX for $850 or lower. I own a 7900 XTX and it's a great card, but I paid $825 for it used. I'd rather just pony up the cash at this point and buy the Super if you can get one at the $999 MSRP. Buying used is always an option though; some people might be selling off their overpriced 4080s.
 
Is that still a serious issue? I've read that this is a stereotype based on driver issues in years past, not so much now.
I went from an RTX 2080 to a 7900 XTX and it's been a complete shit show. Tons of driver crashes that proved to be caused by Windows 11 Insider changes (but Nvidia is fine) and a PSU that should have been more than enough.

I paid $800-ish after getting a partial refund because I never got the promised Starfield code. At that price it's difficult to beat. I wouldn't pay significantly more, though.

You're going to want a new ATX 3.0 PSU regardless, 1000W minimum, with the 4080 or 7900 XTX, either because you want a native 12VHPWR connector or because the AMD card's transient spikes will require it.
 
4080 Super unless you can get a 7900 XTX for $850 or lower. I own a 7900 XTX and it's a great card, but I paid $825 for it used. I'd rather just pony up the cash at this point and buy the Super if you can get one at the $999 MSRP. Buying used is always an option though; some people might be selling off their overpriced 4080s.
I hope so. I am looking at used; I don't think these Nvidia prices will be as low as people say they are. Well, maybe in the USA, but elsewhere they won't be. If you have the money and the cards are priced competitively, meaning the 4080S is not much more or around the same, I'd pick the 4080S. Kind of a no-brainer: more features, and whether you use them or not is beside the point, imho. On top of that, it holds more value if you sell it later on.
I am only looking at the 7900 XT or XTX myself, because I am hoping it will drop in price, though I'm not betting any money on that. Especially if owners of the card want to sell it used to get one of the new 40-series cards, that would work for me too. I want it for Linux use in particular, but I don't know if it will be good for the productivity work I have in mind; I'm willing to check it out. Otherwise I'd pick a 4070 Ti Super or a used 4080, but I think those will be too expensive.
 
I went from an RTX 2080 to a 7900 XTX and it's been a complete shit show. Tons of driver crashes that proved to be caused by Windows 11 Insider changes (but Nvidia is fine) and a PSU that should have been more than enough.

I paid $800-ish after getting a partial refund because I never got the promised Starfield code. At that price it's difficult to beat. I wouldn't pay significantly more, though.

You're going to want a new ATX 3.0 PSU regardless, 1000W minimum, with the 4080 or 7900 XTX, either because you want a native 12VHPWR connector or because the AMD card's transient spikes will require it.
People are asking about the drivers, but not everyone is experiencing the same problems. Honestly (and I don't want to dismiss what you say, believe me, I get it), some 7900 XTX owners say they haven't had driver problems, while others have it so bad they are loudly complaining online; some even decide to sell their card and switch to an Nvidia card.
If it was so bad and widely prominent, there would be several YouTube videos on it from content creators. I don't know what the problem is, but it sounds pretty bad. I'm considering a 7900 XTX and want it at the cheapest price I can find, so I'm willing to go used. The transient spikes are a major concern, though: my PSU is a Corsair RM 850W, a good quality unit, but at 850W still lower than a 1000W one. I don't know what to do; it's a real pain if I have to invest another $200 into a PSU on top of my GPU cost. Nvidia GPUs tend to draw less power, at least the Ada / 40-series do, although, yeah, the 4080 needs 1000W+ too? Maybe the 4070 series doesn't?
 
Not worried about Ray tracing, but AMD driver issues are something I still don't want to have to deal with.
Agreed, I had a lot of issues with AMD drivers in the not so distant past. They did end up fixing most of them but it took about 9 months.
 
The transient spikes are a major concern, though: my PSU is a Corsair RM 850W, a good quality unit, but at 850W still lower than a 1000W one. I don't know what to do; it's a real pain if I have to invest another $200 into a PSU on top of my GPU cost. Nvidia GPUs tend to draw less power, at least the Ada / 40-series do, although, yeah, the 4080 needs 1000W+ too? Maybe the 4070 series doesn't?
That Corsair is plenty of PSU for either card. Could even run a 4090 with no issues.
 
That Corsair is plenty of PSU for either card. Could even run a 4090 with no issues.
I hope so. 1000W PSUs are usually $150 and up (most likely $200 CAD here). :) I'd rather put that towards the GPU.
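A rough back-of-envelope check (Python sketch; the wattages are assumed typical figures, not measurements, and the spike factor is a guess) shows why an 850W unit usually still has headroom:

```python
# Back-of-envelope PSU headroom check. The wattages are assumed typical
# figures, not measurements, and the transient factor is a rough guess.
PSU_WATTS = 850

gpu_board_power = 355   # assumed typical 7900 XTX board power
cpu_gaming_load = 200   # assumed high-end CPU draw while gaming
rest_of_system = 75     # fans, drives, RAM, motherboard (assumed)

sustained = gpu_board_power + cpu_gaming_load + rest_of_system
# Model a transient as the GPU briefly pulling ~1.5x its board power
# (assumption); quality ATX PSUs tolerate millisecond spikes above rating.
spike = sustained + 0.5 * gpu_board_power

print(f"Sustained: ~{sustained} W ({sustained / PSU_WATTS:.0%} of the PSU)")
print(f"Transient estimate: ~{spike:.0f} W")
```

With those placeholder numbers the sustained load sits around three quarters of the PSU's rating, which is why a good 850W unit is generally considered enough for either card.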
 
Neither is good enough for 4K, in my opinion. The only card that is actually 'good' for me at 4K would be the 4090. This is why, if I had to choose at the $1000 mark, I'd choose the 4080: DLSS is going to be mandatory for 4K, and the 7900 XTX doesn't have DLSS or the ray-tracing performance.

If you said 1440p I'd say either is fine, because at that point it just depends on whether you use ray tracing a lot or not.
 
Going to need to wait until everything drops and also wait for reviews.

But I basically see it playing out like this:

MSRP for 4080S is $1000. Vendors will likely not have models priced at $1000 (4070S models are already $50 above MSRP).
AMD will respond with both official and unofficial price cuts on the 7900XTX.

I'm guessing, all told, there will still be at least a $200 differential between the lowest-cost 4080S and the lowest-cost 7900XTX. And it will just come down to whether or not you feel it's worth the extra money to get the extra RT performance as well as DLSS or whatever.
The 5% additional CUDA cores will not shuffle the performance of the 4080S much; the real headliner is in fact the price drop, which changes its price to performance.

I'll also say that either way you'll have a good experience in virtually all games.
Tradeoffs generally being:
7900XTX @ 4K with limited or no RT vs 4080S with maxed-out RT but forced to use upscaling in 4K to stay under the VRAM limit.
In games with lower VRAM requirements at 4K + RT, the 4080S will, generally speaking, just win all the way through. (It also wins by a wide margin in certain RT outlier games such as CP2077 and Ratchet & Clank in terms of RT performance.)
In games without RT and no upscaling, especially at 4K, the 7900XTX will generally win all the way through.

EDIT:

View: https://youtu.be/YbKhxjw8EUE?feature=shared

7900XTX generally winning at UE5, though there are very few UE5 titles currently using hardware RT.
However it's notable that it may stay this way, due to console optimization. Games using UE5 + hardware RT may end up being outliers.

View: https://youtu.be/-jR-pVZEcKM?feature=shared
 
Going to need to wait until everything drops and also wait for reviews.

But I basically see it playing out like this:

MSRP for 4080S is $1000. Vendors will likely not have models priced at $1000 (4070S models are already $50 above MSRP).
AMD will respond with both official and unofficial price cuts on the 7900XTX.

I'm guessing, all told, there will still be at least a $200 differential between the lowest-cost 4080S and the lowest-cost 7900XTX. And it will just come down to whether or not you feel it's worth the extra money to get the extra RT performance as well as DLSS or whatever.
The 5% additional CUDA cores will not shuffle the performance of the 4080S much; the real headliner is in fact the price drop, which changes its price to performance.

I'll also say that either way you'll have a good experience in virtually all games.
Tradeoffs generally being:
7900XTX @ 4K with limited or no RT vs 4080S with maxed-out RT but forced to use upscaling in 4K to stay under the VRAM limit.
In games with lower VRAM requirements at 4K + RT, the 4080S will, generally speaking, just win all the way through. (It also wins by a wide margin in certain RT outlier games such as CP2077 and Ratchet & Clank in terms of RT performance.)
In games without RT and no upscaling, especially at 4K, the 7900XTX will generally win all the way through.

EDIT:

View: https://youtu.be/YbKhxjw8EUE?feature=shared

7900XTX generally winning at UE5, though there are very few UE5 titles currently using hardware RT.
However it's notable that it may stay this way, due to console optimization. Games using UE5 + hardware RT may end up being outliers.

View: https://youtu.be/-jR-pVZEcKM?feature=shared

Thanks, that's pretty helpful stuff.

Unfortunately, I have no way of knowing whether or not ray tracing is important to me, having never used it.
 
Thanks, that's pretty helpful stuff.

Unfortunately, I have no way of knowing whether or not ray tracing is important to me, having never used it.
I have. If used correctly it makes lighting basically photo-realistic or a fuckton closer to it depending on how many rays you cast over conventional raster tricks and techniques. Most games that use it do so for GI (global illumination) or shadows/reflections. Alan Wake II is an example that does both, and is the direction more and more games are going to go. While a 7900 XTX is roughly equal to a 3080 Ti/3090 in RT, the 4080 Super IMO is the more future-proofed option if you're looking to get the longest life out of your card. There aren't a ton of games that use it today, but that number will only grow with current consoles being equipped with an RDNA2 based APU, and RTX/RT GPUs being more and more common as older hardware gets replaced.

My two pennies; do with them what you will.
 
I have. If used correctly it makes lighting basically photo-realistic or a fuckton closer to it depending on how many rays you cast over conventional raster tricks and techniques. Most games that use it do so for GI (global illumination) or shadows/reflections. Alan Wake II is an example that does both, and is the direction more and more games are going to go. While a 7900 XTX is roughly equal to a 3080 Ti/3090 in RT, the 4080 Super IMO is the more future-proofed option if you're looking to get the longest life out of your card. There aren't a ton of games that use it today, but that number will only grow with current consoles being equipped with an RDNA2 based APU, and RTX/RT GPUs being more and more common as older hardware gets replaced.

My two pennies; do with them what you will.
I kind of covered some of this in my post. Not trying to "counter" you, but I think the major question is what is the future of RT going to be like for this next generation?

I'm leaning towards CP2077 (REDengine), AW2 (Northlight), and Avatar (Snowdrop) being outliers. Fewer game devs are putting resources into developing their own engines and building out full blown path tracing. For better or for worse.

UE5 is going to become the de facto game engine for a lot of future titles. Enough that REDengine (CP2077) is no longer going to get used for the next Witcher game and, I assume, the next CP2077 game.

Then there is how consoles also affect game development. Especially in regards to UE5.

It's a long way of saying that a lot of stuff is still getting optimized for AMD's hardware AND that the RT that is being used may end up being software solutions, with only some devs going above and beyond and offering a hardware RT solution for "PC only". I would expect that CDPR would be one of those companies that would bother, but I suspect a lot of other devs will not. And even if they do, I also more than expect that there will be a software toggle that will allow for either software or hardware based RT solution. Much like Fortnite.

The first crop of UE5 games (six or so) all do not use hardware RT. And I suspect that even with the further optimizations of UE5.2, games based on that will also primarily be using software RT solutions.

Anyway, that's a long way of saying I don't think just looking at "RT" is as clear-cut as you're making it, other than that RT for sure will be getting used more. Right now, in terms of the majority of UE5 titles, AMD is ahead. But as with all of the above, and I'm being candid when I say this, I think it's really murky trying to know what the next 3 years will look like in terms of RT implementations. More than ever before, it's hard to know whether the 4080 or the 7900XTX will age better 3-5 years from now, and it will be completely dependent on what all the major studios end up using in terms of software implementation. If it's path tracing, the 4080 will be a decisive winner by a wide margin. But if it's all software RT, then the 7900XTX will continue its dominance.

The other note is how important upscalers will become, since everything in UE5 is per pixel, meaning the biggest changes to performance are all tied to resolution. A short way of saying that upscaling is becoming the de facto way to gain back performance.
DLSS generally speaking wins there, but the curveball is that UE5 has its own specialized upscaler, TSR, which has more or less closed the image quality and performance gap with DLSS (ignoring whether FSR ever gets any image quality improvements).
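To make the per-pixel point concrete, here's a quick sketch (Python; the scale factors follow the usual Quality/Balanced/Performance ratios, and assuming GPU cost scales roughly linearly with pixel count is a simplification):

```python
# Pixel-count comparison: native 4K vs common upscaler input resolutions.
# Assumes GPU cost scales roughly with pixel count (a simplification).
native_w, native_h = 3840, 2160
modes = {
    "Native":            1.0,
    "Quality (~67%)":    2 / 3,
    "Balanced (~58%)":   0.58,
    "Performance (50%)": 0.5,
}

native_pixels = native_w * native_h
for mode, scale in modes.items():
    w, h = int(native_w * scale), int(native_h * scale)
    share = (w * h) / native_pixels
    print(f"{mode}: {w}x{h} -> {share:.0%} of native pixel work")
```

Quality mode at 4K renders roughly 44% of the native pixels, which is why the upscaler choice swings performance so much in these engines.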
 
I kind of covered some of this in my post. Not trying to "counter" you, but I think the major question is what is the future of RT going to be like for this next generation?

I'm leaning towards CP2077 (REDengine), AW2 (Northlight), and Avatar (Snowdrop) being outliers. Fewer game devs are putting resources into developing their own engines and building out full blown path tracing. For better or for worse.

UE5 is going to become the de facto game engine for a lot of future titles. Enough that REDengine (CP2077) is no longer going to get used for the next Witcher game and, I assume, the next CP2077 game.

Then there is how consoles also affect game development. Especially in regards to UE5.

It's a long way of saying that a lot of stuff is still getting optimized for AMD's hardware AND that the RT that is being used may end up being software solutions, with only some devs going above and beyond and offering a hardware RT solution for "PC only". I would expect that CDPR would be one of those companies that would bother, but I suspect a lot of other devs will not. And even if they do, I also more than expect that there will be a software toggle that will allow for either software or hardware based RT solution. Much like Fortnite.

The first crop of UE5 games (six or so) all do not use hardware RT. And I suspect that even with the further optimizations of UE5.2, games based on that will also primarily be using software RT solutions.

Anyway, that's a long way of saying I don't think just looking at "RT" is as clear-cut as you're making it, other than that RT for sure will be getting used more. Right now, in terms of the majority of UE5 titles, AMD is ahead. But as with all of the above, and I'm being candid when I say this, I think it's really murky trying to know what the next 3 years will look like in terms of RT implementations. More than ever before, it's hard to know whether the 4080 or the 7900XTX will age better 3-5 years from now, and it will be completely dependent on what all the major studios end up using in terms of software implementation. If it's path tracing, the 4080 will be a decisive winner by a wide margin. But if it's all software RT, then the 7900XTX will continue its dominance.

The other note is how important upscalers will become, since everything in UE5 is per pixel, meaning the biggest changes to performance are all tied to resolution. A short way of saying that upscaling is becoming the de facto way to gain back performance.
DLSS generally speaking wins there, but the curveball is that UE5 has its own specialized upscaler, TSR, which has more or less closed the image quality and performance gap with DLSS (ignoring whether FSR ever gets any image quality improvements).
I can't really argue any of this, to be perfectly frank. You made very good points all around and you're right, I don't have a functioning crystal ball to be able to say for certain which way the ray-traced wind will blow. If past trends hold true, RT will be ubiquitous at some point. I would have pointed to the fact that it took about 4-5 years for games that used vertex and pixel shaders to become common, but RTX has been around just as long, and this isn't the era of the $500 high-end GPU anymore. With much higher prices playing a factor (leaving out the scalpers during the mining craze), I don't quite see RT being adopted at the same rate programmable shaders were.

With all that said, if I had to choose, I'd be fine losing 10-15% at the top end in raster for having a generational performance advantage in RT. Anyways, at the end of the day it's up to the OP to decide what's more important. I'm just some random dude on the internet with an opinion. And opinions are like assholes. Everybody's got one.
 
If you haven't played with RT, it might be high time to do so and see the world of difference in some of the high-profile games.
 
If you were buying a card for gaming at 4K resolution, which one of these would you choose?

I think their prices are similar, and to my limited knowledge it seems like the advantage the 4080 has is 4K gaming, while the advantage the 7900 has is more VRAM and possibly faster rasterized performance.

As far as the most demanding games I'd be playing with them, Cyberpunk would probably be the most extreme.

If AMD releases the 7950 this year, I wonder if prices on the 7900 XTX will fall.
Someone do an up to date VR comparison and I'll give my answer! Seems that doesn't exist with the 4080/7900XTX or any other cards :mad:
 
I have. If used correctly it makes lighting basically photo-realistic or a fuckton closer to it depending on how many rays you cast over conventional raster tricks and techniques. Most games that use it do so for GI (global illumination) or shadows/reflections. Alan Wake II is an example that does both, and is the direction more and more games are going to go. While a 7900 XTX is roughly equal to a 3080 Ti/3090 in RT, the 4080 Super IMO is the more future-proofed option if you're looking to get the longest life out of your card. There aren't a ton of games that use it today, but that number will only grow with current consoles being equipped with an RDNA2 based APU, and RTX/RT GPUs being more and more common as older hardware gets replaced.

My two pennies; do with them what you will.
None of the current-gen cards are future proof when it comes to ray tracing, though. E.g. even the 4090 is a 1440p card with ray tracing, just like the 3080 was a borderline 1080p/1440p card with RT on when it launched, and titles become more demanding over the years. RT is still not mature tech and requires a lot of GPU to run. Consoles and the lack of powerful hardware in the majority of PCs will also limit the amount of meaningful ray tracing in titles, since games are being developed for more than just those with very powerful machines. A few titles go all in, while most only do a little or nothing in the RT department. RT won't be that mainstream until lower mid-range GPUs and consoles surpass a 4090 in performance.
 
Buying a 7900XTX is like going to a restaurant and asking the chef to cut the quantity of food in half but still pay for the full plate lol.
 
Buying a 7900XTX is like going to a restaurant and asking the chef to cut the quantity of food in half but still pay for the full plate lol.
You could do simple price to performance calculations to know that’s not true. Not being a fan is fine, but at least give a real criticism.

Personally I think buying either a 7900XTX or a 4080S is reasonable. But that is given at what price you’re able to buy either.

If partners can't get the price of the 4080S below $1100 and the 7900XTX sits at $850, then it's not remotely hard at all to justify the 7900XTX, what with it costing $250 less in that scenario. Price is everything.

It’s not hard to justify the 4080S either if you want DLSS and the better RT performance. Like all things “it depends”.
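Back-of-envelope, the simple price-to-performance math looks something like this (Python sketch; the street prices and the relative 4K raster index are placeholder assumptions for illustration, not benchmark results):

```python
# Price-to-performance sketch. Prices and the raster index are placeholder
# assumptions, not benchmark numbers.
cards = {
    "7900 XTX":   {"price": 850,  "raster_index": 100},  # baseline
    "4080 Super": {"price": 1100, "raster_index": 100},  # assumed rough raster parity
}

for name, c in cards.items():
    perf_per_dollar = c["raster_index"] / c["price"]
    print(f"{name}: {perf_per_dollar:.3f} index points per dollar")

# With these placeholder numbers the XTX delivers roughly 29% more raster
# performance per dollar, which is the gap the street prices would create.
```

Swap in whatever prices and benchmark averages you trust; the point is just that the comparison changes a lot depending on what each card actually sells for.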
 
The 7900XTX will have to drop in price. A drop to something like $799 would probably be super attractive.
 
I have. If used correctly it makes lighting basically photo-realistic or a fuckton closer to it depending on how many rays you cast over conventional raster tricks and techniques. Most games that use it do so for GI (global illumination) or shadows/reflections. Alan Wake II is an example that does both, and is the direction more and more games are going to go. While a 7900 XTX is roughly equal to a 3080 Ti/3090 in RT, the 4080 Super IMO is the more future-proofed option if you're looking to get the longest life out of your card. There aren't a ton of games that use it today, but that number will only grow with current consoles being equipped with an RDNA2 based APU, and RTX/RT GPUs being more and more common as older hardware gets replaced.

My two pennies; do with them what you will.
Sorry, but can you explain how that makes sense: "but that number will only grow with current consoles being equipped with an RDNA2 based APU"? Um, that's AMD. So why is ray tracing progressing and advancing, but AMD GPUs, especially RDNA 3 and 4 cards, aren't being recommended to the same extent as the 4080 in this argument? :)

From what I have read, I concluded RT is a bit overrated. Not too many games use it, and some games like Cyberpunk 2077 are so taxing on the GPU that with RT on even the 4090 doesn't have playable frames; I think it's like 30 fps or something? At 4K, I mean.

Anyway, I'm not a 'fanboy' of either Nvidia or AMD; I'm not partial to either. My conclusion is that Nvidia GPUs are usually exorbitantly expensive, meaning overpriced and crippled for what they offer, while AMD GPUs lack features that Nvidia generally has, and of the features they supposedly offer, the support and execution are very poorly done. AMD knows they are 'choice #2' in the duopoly at best, so their cards are also overpriced for what they offer.

I'd probably pick a 4080 Super if I had the money and could stomach paying the crazy price it will be, although I also use Linux, and AMD GPUs are generally 'better' in Linux. Then again, AMD's feature set and support are really atrocious in productivity and other areas/fields, which makes choosing an AMD GPU, even a flagship, a very tough sell for me.

I prefer to buy used if I can trust the seller; then some stranger gets my money and I save on tax and hopefully several more dollars (compared to retail price).
 
Sorry, but can you explain how that makes sense: "but that number will only grow with current consoles being equipped with an RDNA2 based APU"? Um, that's AMD. So why is ray tracing progressing and advancing, but AMD GPUs, especially RDNA 3 and 4 cards, aren't being recommended to the same extent as the 4080 in this argument? :)

From what I have read, I concluded RT is a bit overrated. Not too many games use it, and some games like Cyberpunk 2077 are so taxing on the GPU that with RT on even the 4090 doesn't have playable frames; I think it's like 30 fps or something? At 4K, I mean.
I thought consoles were relevant to add to the discussion since most AAA games that are ported to PC are built with consoles in mind as the lead platform more often than not. Since the RDNA2 used in the Xbox Series S/X and PlayStation 5 has RT capability (albeit very weak), it will help push adoption of the tech. When the PS5 Pro hits, it's rumored to have a 60 CU RDNA3-based APU in it, which is a big leap over the 36 CUs present in the base PS5. It took the Xbox 360 and the PlayStation 3 for Shader Model 3 titles to start appearing in any real quantity. Later DX iterations were barely supported; 95% of games ported to PC during that era only supported DX9.0c. It wasn't until the PlayStation 4 and Xbox One hit (utilizing GCN, a DX11/12 architecture) that DX11 became widely adopted. You seeing the trend here?

You are right in that not a lot of games use it today. In time that will change, because every console shipped today has that capability, and as used last-gen cards hit the market in quantity, people like myself who can't afford a $1,000+ GPU will start to snap them up, adding to the install base. Eventually we're gonna hit critical mass on this. Whether it's 5 or 10 years I don't know. As I said before, I don't have a working crystal ball; all I have to go on is the historical trend of how adoption of new feature sets has played out. And you're not wrong about full path tracing in Cyberpunk (and to a lesser extent Alan Wake II, thanks to ray reconstruction through DLSS 3.5) being incredibly taxing even on a 4090 at 3840x2160. I bet it'd be a bit more playable on an RTX 5090 when Blackwell eventually hits.
 
I generally agree with the sentiment that, with the price being the same, the 7900XTX doesn't seem to show any value, unless you just want to be different and not go Nvidia. With price being equal, the 4080 is just the better card from a hardware feature perspective, and it manages thermals/power better.
 
For even money the Nvidia feature set is compelling. Get the card you like now for what you play/do now and near term. Future proofing is neat but upgrading a gpu is easy.

Personally I have a 7900 XT main, a 3060 Ti side box, and a 3060 and a pair of 6700 XTs in the kids' PCs. We all play games and do the usual PC home use, and none of the machines have had any glaring driver issues. Like ever. YMMV.


The GPU price hikes over the last 5 years have been brutal for a lot of us. Not that we mind splashing out on a hobby, but for the price of one mid to mid-high tier GPU we built entire rigs not too long ago.

I've somehow walked myself up to $700 on GPUs, and honestly my advice in today's clown world is to get a sub-$1k GPU every two years. Let's be honest, any enthusiast is going to get the upgrade bug by then anyway. For me, once you top $1k after tax, just yolo on a 4090 or whatever the top end is and go all in.
 
I generally agree with the sentiment that, with the price being the same, the 7900XTX doesn't seem to show any value, unless you just want to be different and not go Nvidia. With price being equal, the 4080 is just the better card from a hardware feature perspective, and it manages thermals/power better.
Yeah we will see if AMD cuts prices when the 4080 super releases. Might make the xtx more attractive.
 