AMD 7900 XT and 7900 XTX Reviews

While it's not garbage, it's about what AMD tends to deliver on the GPU front... "almost enough". RT performance, however, continues to be a problem, yet I can't help but think this is more of a software/hardware maturity thing right now. The tech is still early, and the wizards of trickery haven't had enough time to digest ways to cheat the system and improve performance without the huge calculation hit. Where's Carmack when we need him most? None of us are going to Mars, man... unless you're telling us that Doom was a training simulation for an eventual mission to Mars to eradicate cyberdemons. Then it makes total sense... but perhaps I've said too much. <COUGH>
 
Personally, I find ray tracing on low or at most medium to be enough. More than that and it feels fake, like it just looks wrong, with the realism exceeding the art style, and that breaks it for me. For those settings the 7900 series more than covers it.
 
I am disappointed with the raster performance. I interpreted AMD's 50% over RDNA2 to mean that the 7900 XTX would be 50% better than the 6950 XT, but in reality it is only 33% better.

Ray tracing, on the other hand, has improved by 50-70% over the previous generation. That's not bad!
 
In the TechPowerUp review they only had the 6900 XT; in raster the 7900 XTX was 47% faster and in ray tracing it was 56% faster, so the RT gain only barely outpaced the raster improvement. Plus, the 6900 XT was already a poor starting point, so AMD did not improve ray tracing anywhere near what they needed to for that level of card. In a ray traced game where you actually need the performance, such as Cyberpunk 2077, the 4080 is 50% faster and the 4090 is TWICE as fast.
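(For anyone checking the arithmetic, here's a minimal sketch of how those "X% faster" figures fall out of average FPS numbers; the FPS values in it are placeholders, not TechPowerUp's actual data.)

```python
# Rough sketch of how review sites' "X% faster" figures are derived.
# The FPS values below are made-up placeholders, not TechPowerUp's data.

def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Percentage improvement of new_fps over old_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

# Illustrative 4K averages only:
print(f"raster uplift: {uplift_pct(100.0, 68.0):.0f}%")  # ~47%
print(f"RT uplift:     {uplift_pct(50.0, 32.0):.0f}%")   # ~56%
```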
 
You are right. It is the 6900 XT that was used as the baseline.

I was looking at The FPS Review, which put the improvements at between 45% and 70%.

https://www.thefpsreview.com/2022/12/12/amd-radeon-rx-7900-xtx-video-card-review/8/
 
Unless the next generation of consoles launches for $1500, this bullshit is going to be great for console gaming.

Mind you, I'm sure "Home Computing as a Service" is on the way. The serfs don't need to own things, that's for lords and ladies.
 
And yet there are still some people who insist that we be chained to PCs for gaming! I know the 7900 series is high-end (as is the current RTX 40 line), but it still means that even some devoted gamers will see much more sense in buying a $399 PS5 or $499 Xbox Series X than a computer where the GPU alone is $899. I don't think the theoretical performance improvements are necessarily worth paying the premium, especially if you plan to mainly play cross-platform games that won't show a night-and-day difference.

The value proposition will change as AMD and NVIDIA unveil more affordable GPUs in the current generation, but for now I wouldn't fault someone in the slightest for deciding that they'd rather get a console and a basic PC than pour far more money into a gaming computer in the hopes they'll see a noticeable advantage.
 
I am disappointed with the raster performance. I interpreted AMD's 50% over RDNA2 to mean that the 7900 XTX would be 50% better than the 6950 XT, but in reality it is only 33% better.

Ray tracing, on the other hand, has improved by 50-70% over the previous generation. That's not bad!

No, AMD was quoting performance per watt.
 
And yet there are still some people who insist that we be chained to PCs for gaming! I know the 7900 series is high-end (as is the current RTX 40 line), but it still means that even some devoted gamers will see much more sense in buying a $399 PS5 or $499 Xbox Series X than a computer where the GPU alone is $899. I don't think the theoretical performance improvements are necessarily worth paying the premium, especially if you plan to mainly play cross-platform games that won't show a night-and-day difference.

The value proposition will change as AMD and NVIDIA unveil more affordable GPUs in the current generation, but for now I wouldn't fault someone in the slightest for deciding that they'd rather get a console and a basic PC than pour far more money into a gaming computer in the hopes they'll see a noticeable advantage.

You don't even need these high-end cards to play on a PC anyway. You can do it just fine with older-series cards, although at lower resolutions. As long as you don't need 4K and maxed-out graphics to be happy, you could be using something older and do the same at 1080p.
 
The old console vs. PC debate rages on. I can see younger gamers eventually not caring so much and being willing to put $500 into a game box so that they can subscribe to Game Pass-type services and shell out $60+ for every game, but not this guy. I'm going to have a PC, or 12, around to tinker with anyway, so besides gaming it's used daily for everything else: work, media streaming, creative pursuits, paying bills and so on. Also, playing 30 years' worth of games if you want is a nice bonus. Consoles have a place and a type of buyer, but I stopped after the PS1 myself.

Unless the next generation of consoles launches for $1500, this bullshit is going to be great for console gaming.

Mind you, I'm sure "Home Computing as a Service" is on the way. The serfs don't need to own things, that's for lords and ladies.
And yes, "we will own nothing and be happy" is sadly where things are headed. Arguably we're already there.
 
No, AMD was quoting performance per watt.
Even so, reviews show only a 1.3x to 1.5x improvement in performance per watt. Take AMD's best-case claim of 1.7x in Cyberpunk 2077 at 4K: going by TPU's numbers, in reality it's only about 1.49x better.
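(A minimal sketch of how that kind of perf-per-watt ratio is computed; the FPS and wattage numbers below are illustrative placeholders, not TPU's actual measurements.)

```python
# Rough perf-per-watt comparison; the FPS and wattage figures are
# illustrative placeholders, not TPU's exact measurements.

def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

rdna2 = perf_per_watt(fps=42.0, watts=335.0)  # e.g. previous-gen card at 4K
rdna3 = perf_per_watt(fps=63.0, watts=356.0)  # e.g. 7900 XTX in the same test

print(f"perf/W improvement: {rdna3 / rdna2:.2f}x")  # ~1.41x with these numbers
```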
 
Unless the next generation of consoles launches for $1500, this bullshit is going to be great for console gaming.

Mind you, I'm sure "Home Computing as a Service" is on the way. The serfs don't need to own things, that's for lords and ladies.
They will continue to be quite the pain to find; this will be the third Christmas like that. I just took a look at Amazon, Best Buy and Costco (there was a small window when you could get one if you were ready to pay a scalping price via a bundle), and you still cannot buy a PS5.

Maybe not a problem for the Nintendo type, but who will take those thin-margin contracts to make $500 consoles when they can sell out at $500 for the GPU alone, and what volume will they agree to guarantee?
 
You don't even need these high-end cards to play on a PC anyway. You can do it just fine with older-series cards, although at lower resolutions. As long as you don't need 4K and maxed-out graphics to be happy, you could be using something older and do the same at 1080p.
That's true; I'm mainly talking about the horsepower needed to provide a clear edge over current-gen consoles. You don't need an RX 7900 or RTX 40 series for that, of course, but even an RTX 3060 costs almost as much as a PS5. Don't get me wrong, a gaming PC or GPU upgrade can still make more sense in the right circumstances. I just think there's only a relatively small group of people who think that spending tons on their computer is a better value proposition than a console.

Look at it this way: the $899 you'd spend for an RX 7900 XT can get you a PS5 and a TV. Ask a budget-conscious gamer which of the two they'd rather have and they'll very likely pick the console setup.
 
This is why neither AMD nor Nvidia is putting much effort into the mid-tier or budget segments for gaming GPUs: the competition from consoles is too strong there, and it is a hard, unprofitable market segment to fight in.
 
Seems like a repeat of the general computer and tablet markets in that light. That is, you either join a "race to the bottom" where you mainly compete on price (the very low-end GPUs) or you emphasize premium design (the flagship $1K cards). A bit of a shame for the PC market, but it also means consoles aren't in danger any time soon. I'd say that's mostly good for variety and competition.
 
Ray tracing, on the other hand, has improved by 50-70% over the previous generation. That's not bad!
AMD is in a position where that actually is pretty bad. Their competition is on their third gen where they’ve been hitting 50-70% on each generation, and they’ve shown a roadmap where that trend is projected to continue if not accelerate. These gains stack, so the gap grows wider and wider even when both AMD and Nvidia are hitting the same generational % gains.

Nvidia is at 1 * 1.7 * 1.7 = 2.9x their first gen.
AMD is at 1 * 1.7 = 1.7x their first gen.

If we project this out another two cycles, Nvidia will be at 8.4x while AMD will be diddling along at 4.9x.

If AMD doesn’t hit 2.5-3x RT performance on the 8000-series, they’re basically dead to the RT market.
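(A minimal sketch of the compounding math above; the flat 1.7x per generation is this post's assumption rather than a measured constant.)

```python
# Compounding the per-generation RT gains described above.
# The flat 1.7x-per-generation figure is an assumption, not measured data.

GAIN_PER_GEN = 1.7

def cumulative_gain(gen_jumps: int) -> float:
    """Total RT speedup relative to a vendor's first RT-capable generation."""
    return GAIN_PER_GEN ** gen_jumps

print(f"Nvidia, 3rd gen today:  {cumulative_gain(2):.1f}x")  # ~2.9x
print(f"AMD, 2nd gen today:     {cumulative_gain(1):.1f}x")  # 1.7x
print(f"Nvidia, two cycles out: {cumulative_gain(4):.1f}x")  # ~8.4x
print(f"AMD, two cycles out:    {cumulative_gain(3):.1f}x")  # ~4.9x
```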
 
I'd say that's an overstatement, and it's only true if Nvidia actually gets a hold of a relevant console in the next generation (if there is another console generation; some have argued that the PS5 and XSX will be the last).

AMD will continue to be relevant because, by default, devs will be programming for AMD cards if for no other reason than the consoles.

As for the rest, we'll see. Nvidia for the first time in history actually delivered on their promises with the 4090, albeit in a mostly unaffordable package for the 99.99%. But the 2080? That was under-performance city, and it wasn't that long ago. Even mighty Intel struggled with 14nm and performance-per-watt goals for the better part of a decade. There is no world where Nvidia just "auto-magically" gets perfect generational increases every time. This isn't to say that AMD doesn't need to step up their game, of course they do, but this isn't nearly as cut and dried as you're making it.
 
AMD is in a position where that actually is pretty bad. Their competition is on their third gen where they’ve been hitting 50-70% on each generation, and they’ve shown a roadmap where that trend is projected to continue if not accelerate. These gains stack, so the gap grows wider and wider even when both AMD and Nvidia are hitting the same generational % gains.

Nvidia is at 1 * 1.7 * 1.7 = 2.9x their first gen.
AMD is at 1 * 1.7 = 1.7x their first gen.

If we project this out another two cycles, Nvidia will be at 8.4x while AMD will be diddling along at 4.9x.

If AMD doesn’t hit 2.5-3x RT performance on the 8000-series, they’re basically dead to the RT market.

The new cards are basically equal to the 3090, which is not very far behind a 4080. Now, the 4090 is a different story, but I don't see Nvidia making another leap like that with a 5090 or whatever they call it. They have pretty much maxed out the latest node TSMC has, so they will be behind the eight ball next round, I think.
 
AMD is in a position where that actually is pretty bad. Their competition is on their third gen where they’ve been hitting 50-70% on each generation, and they’ve shown a roadmap where that trend is projected to continue if not accelerate. These gains stack, so the gap grows wider and wider even when both AMD and Nvidia are hitting the same generational % gains.

Nvidia is at 1 * 1.7 * 1.7 = 2.9x their first gen.
AMD is at 1 * 1.7 = 1.7x their first gen.

If we project this out another two cycles, Nvidia will be at 8.4x while AMD will be diddling along at 4.9x.

If AMD doesn’t hit 2.5-3x RT performance on the 8000-series, they’re basically dead to the RT market.
Not quite, but sort of?
You have to understand that Nvidia is currently an industry leader in the field of ray tracing. Nvidia's engineers have quite literally written the book on the topic, and in the last six years they have been making breakthroughs that other companies had given up on as impossible.
The patents Nvidia holds on a number of algorithms that deal with ray calculations and noise reduction give them an unbeatable edge as things currently stand, and until they expire AMD has little to no chance of actually catching up unless they work out some sort of licensing agreement with Nvidia to get access to those algorithms.
But the degree of acceleration Nvidia sees because of them won't scale like that forever; it will eventually plateau like most other things, and to accelerate further they will simply have to dedicate more silicon to the solution.
This is where Nvidia's MCM research comes in, which would allow them to assemble their GPUs much like Intel does theirs; if you look at the Ponte Vecchio GPUs and their multi-tile packaging you can see where they are going with it.
 
The obvious difference being you can actually buy a 3060.
If someone really wants a PS5 they can get one. Every single coworker that wanted one has had one for over a year now, through various means, whether that was through PlayStation Direct or at Walmart. Even Best Buy will occasionally have some stacked up right where you walk into the store here, but most PS5s at Best Buy are sold through the app.
 
The obvious difference being you can actually buy a 3060.
They aren't hard to come by. I just went to my nearest box store and asked to go on their list. They called me 3 days later to let me know it was there; I left work early, drove over (a 2-hour drive, I live in the ass end of nowhere, Canada) and got it. Really wasn't a big deal, and that was last December.
 
AMD is in a position where that actually is pretty bad. Their competition is on their third gen where they’ve been hitting 50-70% on each generation, and they’ve shown a roadmap where that trend is projected to continue if not accelerate. These gains stack, so the gap grows wider and wider even when both AMD and Nvidia are hitting the same generational % gains.

Nvidia is at 1 * 1.7 * 1.7 = 2.9x their first gen.
AMD is at 1 * 1.7 = 1.7x their first gen.

If we project this out another two cycles, Nvidia will be at 8.4x while AMD will be diddling along at 4.9x.

If AMD doesn’t hit 2.5-3x RT performance on the 8000-series, they’re basically dead to the RT market.
Well, all that aside, RT isn't at a point where anyone should be considering it for anything beyond screenshot comparisons.

AMD made it work in gen one... which was quite a bit behind in performance, sure (but is also the de facto target now). This generation they are matching the previous generation's 3090 Ti. I'm not seeing how that isn't a massive step up.

IF your argument is that ALL AMD can do this generation is 3090 Ti numbers (which was a $1500 flagship and the #1 RT card on sale just two months ago), then you are admitting RT is nowhere near prime time.
If ALL AMD can do is equal the last generation's BEST, and that just isn't good enough, what does that mean for anyone who bought a 3070/3080, never mind a 2080? lol

I'm sure over the next few years we'll see better games aimed at the consoles using RT intelligently... that will make RT worth turning on. Right now all I see is silly tech demos like the Portal RTX/Quake RTX stuff... designed specifically to make your GPU complain, but not really use RT wisely. This has happened with just about every advance we have gotten since discrete video cards were first sold... for the first 3-5 generations the feature is basically pointless and the only games that push it are tech-demo-type games no one remembers for their gameplay.
 
IF your argument is that ALL AMD can do this generation is 3090 Ti numbers (which was a $1500 flagship and the #1 RT card on sale just two months ago), then you are admitting RT is nowhere near prime time.
If ALL AMD can do is equal the last generation's BEST, and that just isn't good enough, what does that mean for anyone who bought a 3070/3080, never mind a 2080? lol
Exactly. There's only one card out right now that I would actually say is "good" at ray tracing. Amazing how many people gushed about ray tracing on the 3070, 3080, 3090, etc. Now that AMD can match that same criteria, suddenly it's "shit". The 4080 isn't amazingly better than the 3090 in that regard, and is nowhere near the 4090, so I guess it must be bad at RT too.

The games that use RT out now are literally the same ones that people are still using their Ampere cards on and apparently loved oh so much they could never consider a non-RT card ever again.

Idk. I know a few people will get defensive about this, but once again the impact of RT on the market is being grossly overestimated for yet ANOTHER generation.
 
Have any UE5.1 benchmarks been posted yet? Seems like that’s the most hardware-agnostic ray traced engine currently available, and it is certain to become the most ubiquitous.
[Attached image: Fortnite benchmark chart]


https://www.hwupgrade.it/articoli/s...sione-geforce-rtx-4080-accerchiata_index.html

Not much of a sample size that I could find (or many games to test with to begin with, I think; it would be nice to see some of the demos tested, but I can imagine that being misleading).

The XTX is more like 40-45% faster than the 6950 XT in this kind of quite extreme workload, rather than 30-35%, at least in that single example.
 
Well, all that aside, RT isn't at a point where anyone should be considering it for anything beyond screenshot comparisons.

AMD made it work in gen one... which was quite a bit behind in performance, sure (but is also the de facto target now). This generation they are matching the previous generation's 3090 Ti. I'm not seeing how that isn't a massive step up.

IF your argument is that ALL AMD can do this generation is 3090 Ti numbers (which was a $1500 flagship and the #1 RT card on sale just two months ago), then you are admitting RT is nowhere near prime time.
If ALL AMD can do is equal the last generation's BEST, and that just isn't good enough, what does that mean for anyone who bought a 3070/3080, never mind a 2080? lol

I'm sure over the next few years we'll see better games aimed at the consoles using RT intelligently... that will make RT worth turning on. Right now all I see is silly tech demos like the Portal RTX/Quake RTX stuff... designed specifically to make your GPU complain, but not really use RT wisely. This has happened with just about every advance we have gotten since discrete video cards were first sold... for the first 3-5 generations the feature is basically pointless and the only games that push it are tech-demo-type games no one remembers for their gameplay.
Until the lower mid-range GPUs can run ray tracing without catastrophic performance degradation, ray tracing is nothing. That means raster is king and will remain king for a minimum of three more GPU generations. RDNA 3 is doing great on raster, and I would expect it to continue that way down the product line as the lower-tier products are eventually released. At this point in time RDNA 3 is a great value, relatively speaking. It has great raster performance and decent ray tracing performance, very similar to that of the previous generation's ray-tracing-king halo card.

Additionally, this is the first chiplet GPU. That means bugs to work out in hardware and software, with likely considerable performance improvements on the software side from drivers, and possibly some optimization of games themselves as well as game engines. Drivers almost always end up improving performance after launch for new cards, so I see no reason to believe this will be any different.
 
Even if all you care about is raster, the jump from the 6950 XT to the 7900 XTX is only around 35%, which is not impressive at all. The ONLY reason the 7900 XTX looks decent is because the 4080 is so overpriced.
 
Well, all that aside, RT isn't at a point where anyone should be considering it for anything beyond screenshot comparisons.
Disagree. I remember years and years of people saying 1080p wasn't widely adopted so it didn't matter. I still have a 3080, but I turn on RT and DLSS in every single game right off. Better for everything, not just screenshots. If you think that, and I'm just making an observation, you clearly don't have a system that pushes it. There isn't any reason I wouldn't run RT at this point on any game that supports it, and there are a TON of those now. It's not just those 7-10 titles like two years ago.
 
It's not 7-10 titles, no... it's a little over 70 last time I counted a few months ago. (I took Nvidia's list and culled out the double entries, plus some of the early alpha games that were 100% broken.) Of those 70, over half are still BS games in early access on Steam... a few are released but have fewer than 100 reviews on Steam, with the common denominator being that the reviews all say "buggy garbage".

If we are talking real, true AAA-quality games with RT... yeah, 7-10 is about right, as I also don't count things like Minecraft, Portal, Quake or a Crysis remaster as being anything other than tech demos. Hell, I remember being excited LAST century when I managed to get OpenGL hardware acceleration working for Quake II on an ATI card (back when Glide and 3dfx were king)... I am not going to get excited about a ray traced version today, at least not any more than I would about any other cool tech demo. I mean, just google any top 10-20 ray traced title list... and when you pull it up, discount anything that is patched into a 20-year-old game, discount the Fortnites, Call of Dutys and any other FPS where no one playing would turn it on and take the frame-time hit... what are you honestly left with right now? Control (11-15 hrs of play time unless you love replays), Hitman 3, Guardians of the Galaxy, Cyberpunk (again, if you love replays, I guess; otherwise it's a benchmark title), Watch Dogs (ditto), Deathloop, Doom Eternal.

Anyway, I don't mean to shit all over RT. I think the tech is the future of game lighting... this feature is going to matter at some point. I suspect that by the time it matters, the difference between Nvidia, AMD and Intel (if they stick around) will be basically zero and will fluctuate by title the same way raster performance does. This many years into RT being a thing, though, we just don't have a ton of great excuses to worry about it, imo. There are a few very good games that have it well implemented today; I agree that in a few of the titles I listed it looks great... and I think games released in 2024 and beyond will probably look even better. With RT, like any other new feature, developers are just not targeting it as the primary solution. Art direction is heavily focused on making 100% raster look good. I suspect RT will matter more 3-4 years out, when more games hit that were developed from the start with RT in mind... with raster lighting being an afterthought. Of course, by then RT might look a ton better simply because the raster lighting will get less care and optimization and just look shittier than it could.
 
... what are you honestly left with right now? Control (11-15 hrs of play time unless you love replays), Hitman 3, Guardians of the Galaxy, Cyberpunk (again, if you love replays, I guess; otherwise it's a benchmark title), Watch Dogs (ditto), Deathloop, Doom Eternal.
The Spider-Man entries, The Callisto Protocol, Dying Light 2, Fortnite, The Witcher 3.

On that list:
https://www.rockpapershotgun.com/confirmed-ray-tracing-and-dlss-games-2022

I think how much RT catches on will be linked to how well Unreal 5 works out and to the next generation of consoles.
 
It's not 7-10 titles, no... it's a little over 70 last time I counted a few months ago. (I took Nvidia's list and culled out the double entries, plus some of the early alpha games that were 100% broken.) Of those 70, over half are still BS games in early access on Steam... a few are released but have fewer than 100 reviews on Steam, with the common denominator being that the reviews all say "buggy garbage".

If we are talking real, true AAA-quality games with RT... yeah, 7-10 is about right, as I also don't count things like Minecraft, Portal, Quake or a Crysis remaster as being anything other than tech demos. Hell, I remember being excited LAST century when I managed to get OpenGL hardware acceleration working for Quake II on an ATI card (back when Glide and 3dfx were king)... I am not going to get excited about a ray traced version today, at least not any more than I would about any other cool tech demo. I mean, just google any top 10-20 ray traced title list... and when you pull it up, discount anything that is patched into a 20-year-old game, discount the Fortnites, Call of Dutys and any other FPS where no one playing would turn it on and take the frame-time hit... what are you honestly left with right now? Control (11-15 hrs of play time unless you love replays), Hitman 3, Guardians of the Galaxy, Cyberpunk (again, if you love replays, I guess; otherwise it's a benchmark title), Watch Dogs (ditto), Deathloop, Doom Eternal.

Anyway, I don't mean to shit all over RT. I think the tech is the future of game lighting... this feature is going to matter at some point. I suspect that by the time it matters, the difference between Nvidia, AMD and Intel (if they stick around) will be basically zero and will fluctuate by title the same way raster performance does. This many years into RT being a thing, though, we just don't have a ton of great excuses to worry about it, imo. There are a few very good games that have it well implemented today; I agree that in a few of the titles I listed it looks great... and I think games released in 2024 and beyond will probably look even better. With RT, like any other new feature, developers are just not targeting it as the primary solution. Art direction is heavily focused on making 100% raster look good. I suspect RT will matter more 3-4 years out, when more games hit that were developed from the start with RT in mind... with raster lighting being an afterthought. Of course, by then RT might look a ton better simply because the raster lighting will get less care and optimization and just look shittier than it could.
Agree to disagree. The fact remains that RT is here. AMD isn't. I would not buy a card that couldn't do RT as it should be done. I am an early adopter, but it's just the writing on the wall at this point. It is a big issue for enthusiasts.

If you are happy buying a brand-new card that is handicapped, that's fine. But I don't buy "top of the line" cards to not have top-of-the-line features. You forgot quite a lot of AAA titles on that list as well, just FYI.
 
So you are arguing the 4090 is the only card worth buying right now? The 4080 is barely faster than the XTX... 2 or 3 FPS is not exactly a huge win. So the 4090. You're saying the 4090 is the only real GPU anyone should consider. OK, agree to disagree.
 