AMD 7900 XT and 7900 XTX Reviews

My AMD Radeon RX 6800 experience was not good: random lock-ups with the black screen of death and non-stop driver crashes. It turned me off from AMD until they fix their drivers. Games were lagging and it was just a nightmare.
I have a thread in another subforum detailing frequent driver timeouts (but BSOD was rare). I bought a new SSD, reinstalled Windows from a fresh ISO, got the latest Radeon driver, and did a couple of extra things, like make sure I actually had the Intel ME driver, chipset drivers, etc., which I hadn't done originally. FWIW since then I've had almost no problems. One or two driver timeouts since then, none in the last week or two, and the worst I see is every once in a while I'll see a Chrome window flash white and then refresh itself.
 
That's a shame; my 6800 XT is rock solid, and I got it for $519 after MIR. Was the 3080 used? I looked all over for one around that price.
Yes, I bought it on the forums here; it came with 2 years of warranty left. The member on the forum had bought a model with a transferable warranty.
 
I redid Windows 11 but still had the black screen of death, lock-ups, and laggy gaming. I tried for 2 weeks to get it working. It was a good card, don't get me wrong; I had a driver problem. I will give AMD a go again when the RX 8000 series comes out. I truly believe AMD will have RT down in the next gen.

I understand AMD is working within a budget across CPU, APU, and GPU; that's a lot of cash going in and out of 3 different divisions.
 
I don't blame you for moving on. I would've too, if my problems hadn't been 98% resolved. I don't like the browser goofiness but it's not a big deal, because it fixes itself.
 
Truthfully, I wish I could have fixed it. As I said, it was a nice card. The RX 6800 reminded me of the Radeon VII lol, an oddball. I had problems with Windows Media Player and other media players too; if I went full screen it would be laggy and not smooth.
 
Generally I think the 7900 XTX (not sure about the regular XT; it wouldn't be the card for me, since if I were going to buy one it would be the top of the line, especially with such a small difference in MSRP between them) makes a pretty strong showing. In general use and rasterized gaming it does very well, with the 7900 XTX on average landing somewhere between the 4080 and 4090 while being cheaper than both. Raytracing is really the sticking point, but as others were discussing, its RT capability is at least equal to the 3090 Ti's, and that's no slouch. Essentially, even the 4080 isn't "that" good at RT (and the 7900 XTX comes relatively close), with the 4090 being the ONLY card really suitable for 4K raytracing at high frame rates without DLSS/FSR/XeSS-type upscaling. That tells me that even if RT were such a massive improvement that all this is desirable (and I'm not sure it is), it's going to be quite a few generations before it can be rolled out for anything but the tip-top halo products.

I do think we need to stop letting Nvidia dictate what is "necessary" for gaming these days. They continue to push proprietary everything and, since the last generation, have leaned heavily on RT as their big feature, because the 6000 series was the first one in a long time where AMD had high-end competitors that even sometimes won in that sphere. The same seems true with the 7900 XTX, even more so given the significant advance in RT performance over last gen. In any case, I don't really know that the big push for raytracing is motivated by anything other than e-peen measurement and Nvidia's marketing. Many of the RT-using games either fall into tech demos (i.e., Minecraft, Portal, etc.) or offer a relatively small improvement for a massive performance penalty. Even in titles like Control or Cyberpunk 2077 that put some effort into it, you get some additional lighting or reflections, but it doesn't seem enough to justify the performance hit. A lot of other games include some RT features and don't do nearly as much with them, to say nothing of console titles; both the Xbox Series X and PS5 offer some of those features on much weaker AMD-powered hardware. It has always been a sore spot for me, even as a 3090 owner (and if I were to buy a new NV card, it would be the 4090; amazingly, it has the smallest price increase versus the previous gen and the greatest performance). Nvidia has gotten everyone onboard their marketing train, and you have reviewers and "influencers" advising people to buy NV or else they won't get RT, even when they're shopping for a mid-level GPU where neither manufacturer offers high-end RT + max settings + high resolution without upscaling, to say nothing of really assessing what RT gives you in the majority of games. Nvidia has always pushed proprietary tech, even to a fault (the whole NV-owners-can't-use-FreeSync-monitors thing lasted for ages until they looked so stupid they finally reversed course), and I don't want to see them rewarded for that compared to AMD's more open source/spec ethos. What a user above mentioned about Nvidia's head start letting them "write the book" on raytracing implementations is true, but I wasn't aware they had patented proprietary algorithms; yet another reason to oppose software patents!

I'm curious about other avenues for the 7900 XTX. It's good to see its overall gaming performance and even its improvement in raytracing, but I'm wondering how AMD will fare in content creation and compute workloads. I'll have to watch the video on the previous page, but besides RT and DLSS (to which AMD can continue to offer an open alternative with FSR 2.x, and 3.x when it arrives, along with Intel's XeSS, though I'm not sure how well that's going), another area Nvidia tends to hold is content creation/streaming, thanks to yet another proprietary streaming codec and of course CUDA. While anyone deep into CUDA will be locked to NV for the time being, there are other compute workloads AMD should focus on in addition to streaming/encoding. For instance, Stable Diffusion and AI art is a growing hobby that benefits from GPU performance, and last I checked you need to set things up differently on Nvidia versus AMD hardware; the perception was that Nvidia was easier to set up and more performant, but apparently this changed a bit in the RX 6000 generation and may improve further this one. It may also take a little time to update some of the software (and drivers), since, as another user mentioned, this generation of AMD GPUs has structural changes that require proper support to get the most out of the card; certainly not impossible, but it may take a little longer for everything to catch up. If AMD can close this gap and be suitable for hybrid work comparable to NV within a given tier, that will be another nice step forward.
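For anyone curious what that setup split looks like in practice, here's a minimal Python sketch; it assumes the Hugging Face diffusers library, a ROCm build of PyTorch on the AMD side, and the common public SD 1.5 checkpoint, all just for illustration:

import torch
from diffusers import StableDiffusionPipeline

# On Nvidia this runs through CUDA; ROCm builds of PyTorch expose the same
# torch.cuda API over HIP, so the AMD code path ends up looking almost identical.
device = "cuda" if torch.cuda.is_available() else "cpu"
backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA"
print(f"device={device}, backend={backend}")

# The usual public SD 1.5 checkpoint, used here purely for illustration.
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe = pipe.to(device)
image = pipe("a red bicycle in the rain").images[0]
image.save("out.png")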

Overall, the 7900 XTX looks like a solid card all around for the (admittedly high, but a value compared to similar card tiers) price, and it edges toward near must-buy territory if you place less emphasis on raytracing. If that holds not just for gaming but for enthusiast compute/content-creation workloads too, even better. The 4090 may carry the crown for those willing to spend $1600+, but landing between the 4080 and 4090 in rasterized gameplay while making a decent showing in raytracing, all for $900-1000, is commendable. Some who want to support Linux or AMD's open source/spec endeavors may be encouraged to buy the card for that reason, but overall it should be a solid card for high-end gaming and mixed usage. I'm just curious whether we'll see more emphasis on this, or whether many of the reviewers people listen to will dull this edge with hand-wringing over raytracing, weighting it too heavily in their assessment of the card as a whole.
 
I think saying the rasterization performance of the 7900 XTX is between the 4080 and 4090 makes it sound better than it actually is. Sure, it performs in between the two cards, but overall it is significantly closer to the 4080 than to the 4090. Based on all the reviews I've read, in pure rasterization the 7900 XTX is overall ≤5% ahead of the 4080 but 15-25% behind the 4090.

The RT performance of the 7900 XTX is definitely acceptable and usable, as it essentially performs close to the top offerings from Nvidia's last generation. But Nvidia did raise the bar again with the 40 series, so expectations continue to increase. The 4090 is the first GPU I've used where enabling RT still keeps performance where I want it in most games without using DLSS. People can argue whatever they want with anecdotes and make sweeping statements like "RT isn't important", but it's only becoming more popular now that the next-generation consoles have solid footing. Its importance and worth are subjective.

It's also great to see AMD finally level the playing field on video encoding when using AV1. Their H.264/H.265 performance was (and apparently still is) abysmal in comparison to the competition.
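If you want to poke at the new encoder, something like this is the sort of invocation it takes; a rough sketch, assuming an ffmpeg build with AMD's AMF support compiled in, and with made-up filenames:

import subprocess

# Hardware AV1 encode on an RDNA3 card through AMD's AMF path in ffmpeg.
# Assumes a recent ffmpeg build compiled with --enable-amf.
subprocess.run([
    "ffmpeg", "-i", "gameplay.mkv",
    "-c:v", "av1_amf",        # the RDNA3 hardware AV1 encoder
    "-quality", "balanced",   # AMF's speed/quality preset knob
    "-c:a", "copy",
    "av1_out.mkv",
], check=True)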

Overall, I do agree that the 7900 XTX is a solid showing and is definitely competitive with Nvidia. But just like the 40 series Nvidia cards, they’re overpriced. It’s understandable from a business perspective that AMD prices their cards to maximize profit, but part of me still has the unrealistic expectation that AMD being competitive would keep prices in check. But that isn’t the case unfortunately.
 

If the 7900 XTX's RT performance isn't more than acceptable, then RT is a dead tech. The 7900 XTX basically hits 3090 Ti RT numbers... last generation's #1 with a bullet.

The 4080 is faster than the 3090 Ti in RT, but not by much. Gamers Nexus' 4K numbers: Control, 4080 51.5 fps vs 3090 Ti 47.6 fps; Cyberpunk, 4080 57.5 fps vs 3090 Ti 44.9 fps; F1, 4080 56.2 fps vs 3090 Ti 47.8 fps.
My point being the 4080 is faster than the 3090 Ti in ray tracing... but let's all be honest about the performance here. It's not night and day, and both are under 60 fps.
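Quick back-of-the-napkin math on those Gamers Nexus numbers (a throwaway Python scratchpad, nothing more):

# Uplift of the 4080 over the 3090 Ti at 4K with RT, per the numbers above
gn_4k_rt = {"Control": (51.5, 47.6), "Cyberpunk": (57.5, 44.9), "F1": (56.2, 47.8)}
for game, (rtx4080, rtx3090ti) in gn_4k_rt.items():
    print(f"{game}: +{(rtx4080 / rtx3090ti - 1) * 100:.0f}%")
# Control: +8%, Cyberpunk: +28%, F1: +18% ... and every one of them under 60 fps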

The 7900 XTX is the second-fastest raster card on the market. The #1 is a halo card with an MSRP of $1600 and an actual street price that is more like $2k almost everywhere. No doubt Nvidia holds the crown... it took an almost perfect datacenter-class die to do it, but they did it. So it's up to consumers now: if you have the bank, or are willing to go into debt, to have the best for at least the next 6 months (until AMD and/or Nvidia release a new product), then your choice is clear. If you still want to spend a lot of money on a GPU and get the second-best card, that is probably AMD. Unless you are really going to argue that 8-10% better ray tracing performance (that is still not hitting 60 fps at 4K anyway) is worth a couple hundred more for the third-fastest raster card.

At least the choices this generation are clear. The 4090 is the halo card, no question... and Nvidia has priced it accordingly. The 7900 XTX is the second-fastest card, still very expensive but much, much cheaper (and it's in second place by single digits). The 4080 is the second-fastest ray tracing card... also by single digits.

7900 XTX vs 4080 at this point is mostly preference. Single-digit wins in either direction, not counting the odd outlier title that for some reason favors one or the other. I mean, some are blatant: if you are a hardcore Call of Duty player, buying anything but the 7900 XTX would be silly, as it easily bests the 4090 in the Call of Duty titles.
 
[...] My point being the 4080 is faster than the 3090 Ti in ray tracing... but let's all be honest about the performance here. It's not night and day, and both are under 60 fps.
Those are not settings the games are actually played at. Under extreme load, the 3090 Ti's extra memory bandwidth will bring it close to, or even ahead of, a 4080, but those workloads are too big to deliver playable framerates anyway. When memory is not a big bottleneck and it is more of a pure RT-vs-RT comparison, the 4080 is significantly faster.

With framerates people actually want to play at:
Control: 91.7 fps vs 78.3 fps
Cyberpunk: 60 fps vs 49 fps (1440p), or 90 fps vs 73 fps (1080p)
F1: 116 vs 96
Metro Exodus: 88 vs 71
RE Village: 120 vs 102
Watch Dogs: 95 vs 78
Portal RTX: 67 vs 49 fps

The 4080 is about 44% more powerful than a 3090 Ti in raw RT calculation and tends to be 20-25% ahead in RT games.
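Running the same quick math over that list lands right around that figure, with Portal RTX the outlier (Python scratchpad again):

# 4080 vs 3090 Ti at the playable-framerate settings listed above
fps = {
    "Control": (91.7, 78.3),
    "Cyberpunk (1440p)": (60, 49),
    "F1": (116, 96),
    "Metro Exodus": (88, 71),
    "RE Village": (120, 102),
    "Watch Dogs": (95, 78),
    "Portal RTX": (67, 49),
}
uplifts = {game: (a / b - 1) * 100 for game, (a, b) in fps.items()}
for game, u in uplifts.items():
    print(f"{game}: +{u:.0f}%")
print(f"average: +{sum(uplifts.values()) / len(uplifts):.0f}%")  # comes out to about +23%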

Looking at what happens under 60 fps can be interesting for theoretical purposes, but mostly as a way to see how well Nvidia's trade of memory bandwidth for a big cache turned out. Yes, there will be a significant performance cost in some scenarios, but will any of them be relevant ones? If the card would have been under 60 fps anyway in that scenario, it does not really matter.

RT is a dead tech
Yes, a lot of the best-selling 2022 titles use it, and a lot of the announced 2023 titles use it as well. With Unreal 5.1, I feel that debate is almost done; the tech is here to stay. In the last 30 days alone we had The Callisto Protocol, The Witcher remake, and Marvel's Midnight Suns using it.
 

Raw what now? What are you talking about? The numbers I posted were for those games at 4K WITHOUT any silly upscaling or frame-generation tech (every reviewer agrees lower resolutions are CPU-bound). That is raw RT performance. If anyone is buying a 4080 to play at 1080p, I'll say it and I mean it: they are stupid. Very, very stupid.
At 1440p I guess I can see your argument (although at 1440p the 4080 is still CPU-bound, but whatever). According to Hardware Unboxed, the 4080 is 24% faster in raster at 1440p (13-game test average) vs the 3090.
Hardware Unboxed also published 1440p RT numbers: F1, 4080 128 fps vs 3090 Ti 105 fps (18%); Watch Dogs, 4080 106 fps vs 3090 Ti 82 fps (23%); Guardians of the Galaxy, 4080 133 fps vs 3090 Ti 111 fps (17%). I won't keep going; they did a few more games. Bottom line, the difference is around 17-18% on average. If you count DLSS performance, they are essentially tied, as both become CPU-bound and the 4080 and 3090 Ti end up with the exact same FPS within a margin of error (in Watch Dogs as an example, with RT and DLSS at 1440p, the 3090 Ti has a 2 fps advantage at 106).

On the last bit, don't misquote me. I did not say RT was a dead tech.
What I said was:
"If the 7900 XTX RT performance isn't more than acceptable, then RT is a dead tech."
Read it again: if the 7900 XTX isn't good enough to push RT for folks, then RT is dead, because after 3 generations of ray tracing cards from Nvidia you are admitting they have only made one or two acceptable cards.
 
Raw what now? What are you talking about?
Raw RT TFLOPS: the 4080 has 112.7 and the 3090 Ti has 78.1. When the workload gets close to being purely about raytracing:
https://www.pugetsystems.com/labs/articles/nvidia-geforce-rtx-4080-16gb-content-creation-review/
the 4080 tends to be about that 40% faster (like it is in V-Ray).

When you play a video game, there is a lot more going on in a frame than the RT and denoising steps, and a lot more affecting the frame rate than how good a card is at raytracing or denoising. Especially when you hit a massive bottleneck in scenarios that are unplayable regardless, you are not getting a clear idea of how much faster the card will really be in practice.


If anyone is buying a 4080 to play at 1080p, I'll say it and I mean it: they are stupid. Very, very stupid.
If they are buying a 4080 to play at 45 fps... they are not? Playing with DLSS 2 on, if you want to play on a 4K monitor, is not stupid (at least not versus playing at 45 fps).

At 1440p I guess I can see your argument.
It is not especially about the resolution; it is about playable-framerate scenarios. It can be 4K, it can be 1080p. I feel what matters is how the hardware does in things people actually do, and when playing PC games in 2022 on high-end hardware, that tends to mean at least 60 fps. There are exceptions, say a flight simulator, but for a Formula 1 racing game or a shooter? Not likely.

" If the 7900 XTX RT performance isn't more then acceptable then RT is a dead tech. "
Read it again... if the 7900 XTX isn't good enough to push RT for folks, then RT is dead because after 3 generations of Ray tracing cards form Nvidia you are admitting they have only made one acceptable card.
Ah yes that make more sense, but again that would be strange if RT is now only "acceptable" I do not see why it would imply that it is dead, if the 4080/4090 would be the only card ever made with an acceptable above 100 tflops RT performance, what does it tell us if some form of raytracing will be used in realtime in games in 2028 or not ?
 
I agree, no one is buying 4080 or 3090 class cards to game at sub-60 fps. That however is the reality right now, unless you choose to degrade your image quality with upscaling. So: one step forward, one step back.

As for 2028, I don't care about 2028 with a card I buy in 2022 or even 2023. As a gamer of over 30 years, I'm fully aware that any card I buy in 2022 is going to be a paperweight in 6 years. When you buy a GPU, worry about the games you are playing today; games that just went into development for release in two or three years are not what you should be worried about (when they come out, your current GPU isn't going to be great anyway).

I do believe improved lighting is the future. I'm not 100% convinced it is actually "ray tracing" as Nvidia has been selling it for four years now. Yes, I believe in 2028 three quarters of games will be running Unreal Engine 6 or 7, whatever is out and being developed for at that time. I suspect lighting will be a hybrid of RT and raster even then, and the results will be much better than the traced stuff we are seeing today.
 
Some of the folks here say ray tracing is dead... lol. How many years did it take from the DirectX 9 launch until almost every game released fully utilized all its features? It took many, many years. It is the same with ray tracing. It is the future, and just like DX9, it will take many years to be fully adopted.
 

Maybe in 5 years or more it might actually be something that matters. I doubt anyone will still be using the cards discussed here at that point, and who knows how ray tracing tech itself will evolve. After all, hardware PhysX died, despite many thinking it was the next big thing, and became a software task driven by the CPU.
 
Exactly.
This is why worrying about minor performance differences with a feature like ray tracing, which is in its very early stages right now, is silly.
Yes, Nvidia holds the RT crown with the 4090, its halo card, and even with that card you're forced to make compromises to get playable frame rates in demanding titles.
Coming down to the 4080/7900 price range... Nvidia claims a win, but it's a Pyrrhic victory. Neither option really delivers turn-it-on-and-forget-about-it performance either. Both are going to require toggling DLSS/FSR at a minimum.

It reminds me of tessellation. ATI released the first tessellation-capable hardware. It made for very pretty demos of smooth-ass dragons with the first gen. With the second gen a couple of games used it, and it was glorious. Still, it took like 8 or 9 years before all games used it and every GPU maker supported it. Despite ATI's lead, and a couple of generations where Nvidia didn't support it at all, by the time it mattered (as in, the majority of games used it) both supported it and performance was identical, or close enough.
 
Only the XT is consistently available; the XTX is gone in a minute, max.
Yeah, I just checked and dug a bit deeper. Their website is being a bit misleading about what is and isn't available. You have to drill down into the store and log in with a code to find out.

I picked up a 7900 XTX yesterday, randomly logging in.
 
There is no login, period, for the AMD store; there never has been. I picked up an XTX too, around 7 AM PST, but only because I pounced on a Discord alert FWIW. The XTX was OOS a few seconds after I completed checkout (the whole thing probably took 50 seconds).
 
I'm talking about the code they make you enter to step into their ordering queue. That's a "login" of sorts. You don't see whether there is actually stock until you verify you're human.
 
Nothing like that going on for me. I just went to amd.com and clicked the "buy on amd.com" link for the RX 7000 series right on the main page; they had a drop on the 14th. Not sure if I want to keep it, because I also have a Merc ready for pickup next week at Best Buy that I essentially got for not much more after a cash-back bonus. Might refuse shipment on the AMD package, but I do want to touch the card for sure lmao. Almost have a thought about putting it in my son's computer.
 
Big heads up everyone. AMD just confirmed with me on Twitter that the RX 7000 series GPUs are not planned to ever receive a Windows 7 hardware driver. Those who have been holding off on a final upgrade for their Windows 7 machines should probably go ahead and pick up an RX 6000 or RTX 3000 card while they're still available at MSRP.
 
Exactly. This is why worrying about minor performance differences with a feature like ray tracing, which is in its very early stages right now, is silly. [...]
QFT. It's bragging rights at this stage, IMHO. I have a 3080 and don't play anything with RT at all; in fact, I turn off DLSS in Bannerlord 2 because it creates weird artifacts. I'm not sold on RT or these new gimmicks. In 10 years, when they're transparent to me, sure, but for now, IDGAF.
 
AMD just confirmed with me on Twitter that the RX 7000 series GPUs are not planned to ever receive a Windows 7 hardware driver. [...]
They need to work on drivers for newer OSes for sure lmao.
 
QFT. It's bragging rights at this stage, IMHO. [...] I'm not sold on RT or these new gimmicks.
Sorry, gotta disagree there. RT was borderline on the 2080 Ti, usable on the 3080/3090, and very much part of the main experience on the 4090. I've had all of the above cards and played multiple games with RT/DLSS on.
 
I respect your respectful disagreement. My only experience with DLSS isn't a perfect one.
Between DLSS and FSR, I prefer DLSS, but both just try to make the best of a bad situation.
If you try running a game at 1080p native on a 1440p screen, it looks bad; things are warped and distorted. Both FSR and DLSS do a better job of scaling 1080p to a 1440p screen than the game will do natively, with only relatively minor overhead.
The same goes for trying to run a game at 720p on a 1080p monitor.
Ultimately you want to run the game at the screen's native resolution, but sometimes you just can't. If you had to choose between 1440p medium or 1080p high settings to maintain a good frame rate, 1080p high upscaled to 1440p is going to look better and be more enjoyable than native 1440p with most of the graphics features turned off.
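That trade-off is straight pixel arithmetic; a quick sanity check of how much shading work the lower internal render resolution saves (ignoring the upscaler's own overhead):

# Internal render cost scales with the shaded pixel count
px = {"720p": 1280 * 720, "1080p": 1920 * 1080, "1440p": 2560 * 1440}
print(px["1080p"] / px["1440p"])  # 0.5625 -> ~44% fewer pixels than native 1440p
print(px["720p"] / px["1080p"])   # 0.4444 -> ~56% fewer pixels than native 1080p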
 
Sorry, gotta disagree there. RT was borderline on the 2080 Ti, usable on the 3080/3090, and very much part of the main experience on the 4090. [...]
No doubt we have RT games on the market... and as for the 4090, I don't have one and haven't sat down with one at this point, but yes, obviously it's a powerful enough card for a very decent 4K experience. IMO we just don't have enough games right now that really justify a $1600 price tag. Of course that is a matter of opinion; if you can afford it and you want it, cool. I think it is sort of bragging rights... or you just have fuck-it money, which is fine. Hey, I was raised by Scots; I wouldn't buy every flagship, and can't even bring myself to do that every X number of years. Nothing against anyone who chooses to, though... but yeah, I mean, it is sort of bragging rights at this point.
In 2 or 3 years, when 80 games (with around 20 of them being actually worthy titles) turns into 400-500 games, I'll make RT performance part of my own spending calculus.
 
Between DLSS and FSR, I prefer DLSS, but both just try to make the best of a bad situation. [...]

Both upscalers have issues... both are interesting tech, and both work better than I would have suspected a few years back when they were proposed.
I do like that the quality modes work as a nice AA method. I am not a fan of the odd animation ghosting artifacts with DLSS; I don't see those with FSR... but I also dislike what FSR does with thin lines and fine transitions (some of the base textures in No Man's Sky have a strange edge with FSR 2).

I'm sure they will both get better. Considering how much better they already are than I expected... I am sure they will get better yet (assuming they both don't waste a ton of time on stupid frame-generation crap).

The main reason I would say FSR is superior, though, is simply that it works with older cards. It seems Nvidia wants to keep adding features that won't be backwards compatible even with their own cards, never mind the competition's. I love that I can install games my wife likes to play that are starting to push her little machine with the RX 570 in it, like Anno 1800, and flip FSR on. That same logic is going to apply at some point in the next year when I upgrade my 5700 and she is using it. lmao
 
Hey, I was raised by Scots; I wouldn't buy every flagship... it is sort of bragging rights at this point. [...]
I am Indian lol; we start our negotiations at 50% off list price. Sadly, the best I could manage with the 4090 was 10% off the $1600 list price, with a 5% PayPal CC bonus on top. Better that to satisfy my itch than a new EV to replace my 5-year-old SUV... even with federal tax incentives, I'd be $30-40k in the hole. I do not have fuck-it money, or at least it will never feel like that to me, because it has never come easy. Same reason I can't spend on business-class flights, even internationally: 10 hours of comfort is never going to be worth $5k to me, even if I made 4x what I do. But a $1500 tech toy I use for a couple of years and get at least half back from? Sure. Apologies for the tangent, and to each their own, of course.
 
Might refuse shipment on the AMD package, but I do want to touch the card for sure lmao. [...]
FYI, I don't think AMD/Digital River will accept a return if it's opened.
 
I am Indian lol; we start our negotiations at 50% off list price. [...] Apologies for the tangent, and to each their own, of course.

Hey, we wouldn't be talking on a forum for an old tech-enthusiast site if this shit wasn't important to us. We all share the hobby, and spending on the hobby isn't crazy. Before my eyes aged out on me I was pretty heavy into astronomy... I know people who have spent over $50k on their setups. Granted, those tend to last them 20+ years. Anyway, the point is there are a lot of expensive hobbies; this was never exactly a cheap one. I do miss the days when we could tweak, overclock, and hack together cheap setups with high-end performance. I guess with those days behind us it's easier to seem like crabby old men, complaining that the only way to get high-end performance these days is to spend. :) Hope the 4090 lasts you many years and you get your money's worth. o7
 
FYI, I don't think AMD/Digital River will accept a return if it's opened.
Yea, I know. That's why I am probably going to refuse it. It's coming in tomorrow, and I do not have the impulse control not to open and touch it lmao. 😂
 
Also showing very good scaling as the frequency is increased. This is what they said:

"While the AMD reference RX 7900 XTX was 8% faster than the RTX 4080, the stock Sapphire Nitro+ is 11.9% faster, if you add OC+UV on top, you're getting 21.5%—this looks much closer now to what AMD has promised us."

Been trying to get an XTX Merc (loved the 6900 XT version), but no luck. The Nitro+ was also able to clock to around 3200 MHz.
 
I scored a Merc off BB at launch. I was just up late and got the notification at 1 AM lmao. They dropped them very early on the 13th. Supposed to be ready for pickup next Wednesday, hopefully earlier, as that has happened in the past.

Also ordered a reference card from AMD. I wanted to put that in my son's computer, which has a 6600 XT, and sell the 6600 XT. Not sure I want to be that nice, on second thought. But I do have the urge to test the reference card in my system until I get the Merc, so my impulse might get the best of me. I was thinking about refusing the shipment, but stuck at home in a snowstorm on the weekend I'm supposed to get it, how would I not want to play with the reference card? 😭
 
I was very close; it had "add to cart" for the Merc on BB, but I didn't know AIBs were going to be available, so I investigated and lost the window to add it to the cart. Yep, you snooze, you lose.
 