7900xtx or 4080 super?

What other unresolved driver issues do they have? Just curious.

Also, on the Dolby Atmos problem: you are having a different issue. The long-running Dolby Atmos audio-dropout problem was specific to the 3xxx cards. But I have seen people complain about this with both AMD and Nvidia GPUs after upgrading to AM5; their issues only started happening after that. So maybe it's a motherboard/CPU problem.
Did you just say I'm not having the robotic / audio dropping out issue while using Dolby Atmos and my 4090 as the sound source? Because I definitely am.

Toodles.
 
Did you just say I'm not having the robotic / audio dropping out issue while using Dolby Atmos and my 4090 as the sound source? Because I definitely am.

Toodles.

No, that's not what I'm saying at all. The issue you are having at the moment isn't the same as the problem that was happening on the 3xxx cards. The audio dropouts on the 3xxx cards were a known issue. It was difficult to solve because it also required effort from the manufacturers of AV equipment like TVs, AV receivers, etc. For example, there is a massive thread on the AVS forums about Sony TVs having audio dropouts when connected to 3xxx cards. The fix wasn't just Nvidia updating their drivers, but also Sony releasing a firmware update for the affected TVs.

The 4xxx series cards don't have that same issue. Either the device you are connecting to needs a firmware upgrade, or it's a motherboard/CPU problem, or you might need a reinstall of Windows. Let me ask you a question: did you have the problem when you first got your 4090? Did you by any chance upgrade to an AM5 motherboard and CPU and then start having the issue? The reason I'm asking is that I know people with both 7900 and 4xxx cards who started having issues with Dolby Atmos after switching to AM5.
 
I just heard in a review that you have to use AMD's built-in FPS counter, as MSI Afterburner and the like will not read the interpolated extra frames. Is that the issue?

I only use AMD overlay to show information, it's great :)
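That lines up with how driver-level frame generation is usually described: a third-party overlay hooked into the game's render path counts only engine-rendered frames, while the driver inserts interpolated ones afterwards. A minimal sketch of the counting gap, assuming the common 2x interpolation factor (the function and numbers here are made up for illustration):

```python
# Toy model of the counting mismatch: an overlay hooked into the game's
# render path sees only engine-rendered frames, while driver-level frame
# generation inserts one interpolated frame between each pair.
# The 2x factor is an assumption for the common frame-gen case.

def display_fps(rendered_fps: float, interpolation_factor: int = 2) -> float:
    """Frames per second actually sent to the display with frame gen on."""
    return rendered_fps * interpolation_factor

overlay_reading = 100.0               # what an Afterburner-style hook sees
print(display_fps(overlay_reading))   # 200.0 -- what the vendor overlay shows
```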
 
No, that's not what I'm saying at all. The issue you are having at the moment isn't the same as the problem that was happening on the 3xxx cards. The audio dropouts on the 3xxx cards were a known issue. It was difficult to solve because it also required effort from the manufacturers of AV equipment like TVs, AV receivers, etc. For example, there is a massive thread on the AVS forums about Sony TVs having audio dropouts when connected to 3xxx cards. The fix wasn't just Nvidia updating their drivers, but also Sony releasing a firmware update for the affected TVs.

The 4xxx series cards don't have that same issue. Either the device you are connecting to needs a firmware upgrade, or it's a motherboard/CPU problem, or you might need a reinstall of Windows. Let me ask you a question: did you have the problem when you first got your 4090? Did you by any chance upgrade to an AM5 motherboard and CPU and then start having the issue? The reason I'm asking is that I know people with both 7900 and 4xxx cards who started having issues with Dolby Atmos after switching to AM5.
I did start having the issue with the 5800X3D after I upgraded to the 4090 FE (it was never an issue with the 3090 FTW3), but it wasn't at this frequency; with the 7800X3D it happens multiple times a day and is easily repeatable by alt-tabbing / selecting different windows - they don't even need to be ones with audio playing. EDIT: same Onkyo TX-RZ810 receiver used across both the working and the not-really-working Dolby Atmos setups, same firmware even (it hasn't gotten updates in a long time).

I can say that it happened in Windows 10 and 11. It has happened with at least the last 6 WHQL drivers, with both clean installs and updating over the old ones. It still happens after clean installing the X670E AMD chipset drivers. Uninstalling and reinstalling Dolby Access didn't fix it, nor did installing DTS:X. Using Soundkeeper helps to slightly reduce the frequency of the events (I started using it with my 5800X3D and 4090 and it helped tremendously, but with the 7800X3D it doesn't seem to help nearly as much). Disabling Atmos upmixing in the Dolby Access app definitely reduced the frequency (I went a full day without it happening on my 7800X3D, then it happened again this morning). I've also tried deactivating HPET and the dynamic synthetic timer (some random solution that I found on M$ Answers). I even went so far as to purchase a new set of VESA-certified HDMI 2.1 cables. This issue does not occur when setting audio to Stereo (which I never use), 5.1, or DTS:X. Only Dolby Atmos.
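For reference, the HPET / dynamic-tick tweaks mentioned above are usually circulated as the two bcdedit commands below; mapping the "dynamic synthetic timer" to `disabledynamictick` is my reading of that MS Answers post, and whether any of this actually helps with Atmos dropouts is exactly what's unproven here. A sketch, to be run from an elevated prompt:

```python
# Commonly circulated timer tweaks; revert with
# "bcdedit /deletevalue disabledynamictick" if they change nothing.
import subprocess

def bcd(*args: str) -> None:
    cmd = ["bcdedit", *args]
    print(">", " ".join(cmd))
    subprocess.run(cmd)        # needs an elevated (admin) shell on Windows

bcd("/deletevalue", "useplatformclock")    # HPET: remove forced platform clock
bcd("/set", "disabledynamictick", "yes")   # disable the dynamic/synthetic tick
```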
 
Driver issues happen with both vendors and sometimes go unresolved for long periods of time (FACT). I frequent the AMD and Nvidia forums to research and monitor reports of issues, and I take no one's opinion on the matter because I do the legwork myself. Many of the issues people experience in both vendors' forums are not at all driver related and can be resolved by basic troubleshooting. Some reports are legitimate problems; those are generally far more widely reported and tend to remain unresolved for much longer. In my opinion both vendors do their best to resolve issues as quickly as they can and should be commended for doing so.

As users it can be quite frustrating to spend large sums of our hard-earned money on such a complicated device only to have an unexpected malfunction that keeps us from using it as we intended. Believe me, I get it, as I had to wait nearly 9 months to use my current GPU in my favorite VR warbird sims. Absolute bummer, but I knew it would be fixed eventually, so I found another title or two to enjoy while I waited... The previous generation of Nvidia 3000 series and AMD 6000 series GPUs had VR issues similar to my current-gen GPU that went unresolved for approximately the same amount of time, so users in both camps had to deal with it for a while. Nvidia does tend to get their launch drivers in better shape than AMD, but AMD seems to be working much harder on this recently than they have historically.

Point is, writing drivers for such complicated devices is a terribly difficult and complex undertaking that most of us can't even begin to understand. Fixing drivers is a monumental task that takes time and a great deal of effort to suss out issues from 4 million or more lines of code. Perhaps we should all exhibit a bit more patience with the people who toil away for us so we can enjoy the fruits of their labors. Complacency has never solved a single problem in this world, and the people responsible for resolving these kinds of problems know just as well as the rest of us what is expected of them at their jobs. I certainly appreciate it when people exhibit patience with me at my job when the solution to a problem I'm presented with eludes me.
 
I can't see why I would choose the 4080S over the 7900 XTX.

I prefer higher FPS (+Hz) over an RT gimmick.
A slower refresh rate always leads to blurriness on your monitor, so heavy RT => blurry gaming, no more, no less. If someone likes it - fine; I do not :)
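A rough sketch of the persistence arithmetic behind that claim, assuming a sample-and-hold panel where each frame stays on screen for the full refresh interval (numbers illustrative):

```python
# Longer hold time per frame = more perceived motion blur while the eye
# tracks movement on a sample-and-hold display.
for hz in (60, 120, 144, 240):
    hold_ms = 1000 / hz
    print(f"{hz:>3} Hz: each frame held ~{hold_ms:4.1f} ms")
# So if heavy RT drops you from 240 fps to 60 fps, the hold time per frame
# roughly quadruples -- that's the "heavy RT => blurry gaming" argument.
```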

Drivers? I have used Radeon for gaming since Vega, and I can't remember a time when I had a problem with drivers. Yes, AMD (like Nvidia) can release a driver that causes problems (very rare, of course), but what, can't I just revert to the previous one that worked, huh?
When someone can't say something reasonable, they start blaming the drivers... but they can't see that most of the time the real problem is their own skill at troubleshooting the OS/software/hardware.
 
I just bought an ASRock Aqua 7900 XTX and flashed the BIOS for 2950 MHz clocks. Damn, this card is fast. I'm getting over 200 fps in Avatar at ultra settings at 2K res. I will try 4K on my TV once I get a longer HDMI cable. Avatar forces ray tracing and you can't turn it off, so it shows me that AMD cards absolutely can trace rays as well as Nvidia if a game is fully optimized for AMD. Is there some kind of bias toward optimizing only for Nvidia, and is that maybe why NV seems to have better RT performance? 🤔 Is Avatar the game that is uncovering a possible industry-wide conspiracy? Tune in for the next episode of Coast to Coast AM with Tangoseal. No, seriously though, I am glad I didn't buy an Nvidia GPU instead. I'm tired of Nvidia, tired of hearing about Nvidia, and definitely tired of the games they play with people. They are greedy and unjustifiably expensive. So if you haven't purchased a GPU yet, I recommend an XTX coupled with a 7800X3D. Your system will be in the top 1% of gaming rigs worldwide.
 
I can't see why I would choose the 4080S over the 7900 XTX.

I prefer higher FPS (+Hz) over an RT gimmick.
A slower refresh rate always leads to blurriness on your monitor, so heavy RT => blurry gaming, no more, no less. If someone likes it - fine; I do not :)

Drivers? I have used Radeon for gaming since Vega, and I can't remember a time when I had a problem with drivers. Yes, AMD (like Nvidia) can release a driver that causes problems (very rare, of course), but what, can't I just revert to the previous one that worked, huh?
When someone can't say something reasonable, they start blaming the drivers... but they can't see that most of the time the real problem is their own skill at troubleshooting the OS/software/hardware.
It's a close race.

7900 XTX has more raw horsepower but the 4080 S has more refinement.

If you value:
- RT
- G-SYNC
- A newer card (released in 2024) for resale value

Get the 4080 S

For other reasons, get the 7900 XTX.

Price is a toss-up... unfortunately the 7900 XTX hasn't had an "official" price cut, but I was able to get a Nitro+ 7900 XTX from Amazon new for $1,029, which I think is a great deal. The 4080 S TUF I picked up was $999, also a great deal. Yeah, the 4080 S has availability issues, but restocks have been fast and furious.

It's just so close:
https://www.techpowerup.com/review/nvidia-geforce-rtx-4080-super-founders-edition/31.html

The 16GB versus 24GB is irrelevant. By the time you "need" 24GB you're going to "need" a faster card to push those frames. Otherwise I'd be around here with an RTX 3060 flexing my VRAM.

The AMD CPU/GPU harmony is irrelevant. They simply haven't done enough there to make it markedly better.

Source: used Nitro+ 7900 XTX and 4080 S TUF in the same 7800X3D box.
 
I just bought an asrock aqua 7900xtx and flashed the bios for 2950mhz clocks. Damn this card is fast. I'm getting over 200fps in Avatar in ultra settings at 2k res. I will try 4k on my TV once I get a longer hdmi cable. Avatar forces Ray Tracing and you can't turn it off so it shows me that AMD cards absolutely can trace rays as good as nVidia If a game is fully optimized for AMD. Is there some kind of bias toward optimizing only for nVidia and that is maybe why nv seems to have better rt performance? 🤔 is avatar the game that is uncovering a possible industry wide conspiracy? Tune in for the next episode of Coast to coast AM radio with Tangoseal. No seriously though, I am glad I didn't buy an nVidia GPU instead. I'm tired of nVidia, hearing about nVidia, and definitely tired of the games they play with people. They are greedy and unjustifiably expensive. So if you haven't purchased a gpu yet, I am recommending an xtx coupled with a 7800x3d. Your system will be in the top 1% of gaming rigs worldwide.

The fact that the consoles run Avatar with RT should tell you all you need to know about how RT is implemented in that game. The 7900 XTX is the same price as the 4080S, but only NV is greedy? As someone running a full AMD system, I'd easily recommend the 4080S over the 7900 XTX. RDNA3 is a great GPU, but it needs to be cheaper than the RTX cards or it doesn't make sense.

Also:
[attached: TPU Avatar benchmark chart]
 
Price is a toss-up... unfortunately the 7900 XTX hasn't had an "official" price cut, but I was able to get a Nitro+ 7900 XTX from Amazon new for $1,029, which I think is a great deal. The 4080 S TUF I picked up was $999, also a great deal. Yeah, the 4080 S has availability issues, but restocks have been fast and furious.

7900 XTXs can be had in the $900 range currently and have dipped below that for at least the last 8 months. There's no reason to spend over $1000 on one unless you want a specific variant.
 
7900 XTXs can be had in the $900 range currently and have dipped below that for at least the last 8 months. There's no reason to spend over $1000 on one unless you want a specific variant.
Same can be said about the RTX 4080 (not S) if we are going by market pricing. So pricing is still mostly moot until there is an official price cut.
 
The fact that the consoles run Avatar with RT should tell you all you need to know about how RT is implemented in that game. The 7900 XTX is the same price as the 4080S, but only NV is greedy? As someone running a full AMD system, I'd easily recommend the 4080S over the 7900 XTX. RDNA3 is a great GPU, but it needs to be cheaper than the RTX cards or it doesn't make sense.

Also:
[attached: TPU Avatar benchmark chart]
That bench is bullshit. Pure bullshit. I'll bench mine tonight and show you over 200 fps at 1440p ultra.
 
Yeah, with FSR and likely Frame-gen.
Well, isn't that how nVidia works??
With DLSS and all that magic jazz.

So why are you assuming nVidia is superior to AMD when nVidia was using fairy magic long before AMD?
 
Yeah, with FSR and likely Frame-gen.

It can really depend on where you are in a game; in Hogwarts Legacy, being in a town or in Hogwarts with lots of NPCs makes a huge difference in FPS. But I find canned benchmarks tend to be lower than actual gameplay.
 
But... but the more you buy the more you save!
How can you guys even compare NVIDIA and AMD? The pricing is similar across almost the entirety of their product stacks. Somehow that makes AMD the Robin Hood of GPUs??

NVIDIA is the market leader and sets pricing, so I guess you can say they're greedy. AMD sets their pricing in line with NVIDIA, though, so we don't have some kind of huge value play on AMD's side here.
 
Well, isn't that how nVidia works??
With DLSS and all that magic jazz.

So why are you assuming nVidia is superior to AMD when nVidia was using fairy magic long before AMD?
I didn't say anything about that. I was just questioning why you would call BS on TPU's results and say you're getting 200 fps when no other Avatar benchmarks are anywhere near that high, even with a 4090.
 
Yeah, both AMD and Nvidia are going to continue to price cards as high as possible. The most recent shakeup that gained some traction for AMD was the RX 480 at $199 at launch, not the moonshot launch price followed by reductions that they do now. Nvidia really doesn't reduce prices much historically, though.
NVIDIA is the market leader and sets pricing, so I guess you can say they're greedy. AMD sets their pricing in line with NVIDIA, though, so we don't have some kind of huge value play on AMD's side here.

On topic: I was interested in the 4080 near launch, but not for a grand or more. I'm just not getting to that price point personally. It doesn't feel right.
$300-700, new or used, is about all I'm willing to shell out.

For the same price I'd take the 4080S over the 7900 XTX. This is coming from a 7900 XT owner who is completely satisfied with it.
 
How can you guys even compare NVIDIA and AMD? The pricing is similar across almost the entirety of their product stacks. Somehow that makes AMD the Robin Hood of GPUs??

NVIDIA is the market leader and sets pricing, so I guess you can say they're greedy. AMD sets their pricing in line with NVIDIA, though, so we don't have some kind of huge value play on AMD's side here.

I own both, have owned many multiples of both, and use them heavily for actual gaming instead of just benchmarking. In my opinion, if you're a pure gamer who wants plug and play, just buy the Nvidia card, even if it's $50 more than the closest AMD. Hardware enthusiasts who know how to tweak and tune: buy what you want.

Why? Take the newest big hit, Helldivers 2. AMD users are being forced to do all sorts of weird shit like manually switching to DX11 mode, capping at 85 fps, disabling in-game graphics options, etc. just to play the game for more than a few minutes without crashing. This is AFTER AMD released a driver specifically for the game, while Nvidia has basically no problems at all.

I can't even tell you how many times I've had issues with smaller titles that I've sunk a lot of hours into on AMD cards and ended up spending more time troubleshooting than playing. This is not an "AMD sucks" rant; it's simply reality. I prefer AMD; I've liked them better since the ATi days and I have a lot of nostalgia for the company. Their driver division is smaller and spends an awful lot of time trying to innovate new features rather than polishing the old ones and fixing bugs that are years old.

At the end of the day, both make good products, but to pretend that AMD is better based on their pricing and business model is disingenuous at best. AMD is great at milking and manipulating that fanboy fanaticism to squeak by on products that should otherwise be unacceptable (and AMD content blows the hell up on every platform, so content creators milk it too).
 
I own both, have owned many multiples of both, and use them heavily for actual gaming instead of just benchmarking. In my opinion, if you're a pure gamer who wants plug and play, just buy the Nvidia card, even if it's $50 more than the closest AMD. Hardware enthusiasts who know how to tweak and tune: buy what you want.

Why? Take the newest big hit, Helldivers 2. AMD users are being forced to do all sorts of weird shit like manually switching to DX11 mode, capping at 85 fps, disabling in-game graphics options, etc. just to play the game for more than a few minutes without crashing. This is AFTER AMD released a driver specifically for the game, while Nvidia has basically no problems at all.

I can't even tell you how many times I've had issues with smaller titles that I've sunk a lot of hours into on AMD cards and ended up spending more time troubleshooting than playing. This is not an "AMD sucks" rant; it's simply reality. I prefer AMD; I've liked them better since the ATi days and I have a lot of nostalgia for the company. Their driver division is smaller and spends an awful lot of time trying to innovate new features rather than polishing the old ones and fixing bugs that are years old.

At the end of the day, both make good products, but to pretend that AMD is better based on their pricing and business model is disingenuous at best. AMD is great at milking and manipulating that fanboy fanaticism to squeak by on products that should otherwise be unacceptable.

It's a brand-new game, and for some reason games can have issues with one brand over the other. I've seen people with Nvidia cards have issues; check out the rants about Hogwarts Legacy and how AMD was sabotaging Nvidia or some other BS. The reality is that sometimes you have to wait a bit for a better driver, which sucks, or roll back to an older driver that works better with that game. It doesn't matter what brand you own; sometimes you've got to wait for them to fix the issue.
 
It's a brand-new game, and for some reason games can have issues with one brand over the other. I've seen people with Nvidia cards have issues; check out the rants about Hogwarts Legacy and how AMD was sabotaging Nvidia or some other BS. The reality is that sometimes you have to wait a bit for a better driver, which sucks, or roll back to an older driver that works better with that game. It doesn't matter what brand you own; sometimes you've got to wait for them to fix the issue.

I'm willing to bet that if there were a site like Consumer Reports that collected data for all GPU brands, 70% of the complaints would come from AMD. Again, I'm not saying this to be anti-AMD; they are smaller than Nvidia and do far less "big data" business, so they try to innovate with features, which takes time away from other things like fixing the bug backlog. It's just the way it is.

That's why I suggest that pure gamers who don't like to tweak things buy Nvidia, and that hardware enthusiasts buy whichever they like.

We're all generally enthusiasts here, but if you ask normal people you get a different answer. I have a friend who is disabled and on a fixed income; I upgraded his rig last year with an AM4 board and 5800X I had, but he had to keep his GTX 970 at the time. I offered to give him my RX 6600 for dirt cheap and he said no, because he had always had problems with Radeon and would rather struggle with the old card than deal with any issues that might pop up; that's more frustrating to him than lowering settings. That's just how people see it, and it's not for no reason.
 
I didn't say anything about that. I was just questioning why you would call BS on TPU's results and say you're getting 200 fps when no other Avatar benchmarks are anywhere near that high, even with a 4090.
I understand. But I mean, if all the hype and tech now is DLSS and FSR 3+ as it evolves, then we might as well bench with it. If we want pure raw hardware raster, then I believe the 4090 is king; the 4080S and 7900 XTX are going to be arm wrestling for sure.

When it comes to ray tracing, I don't for a minute believe that AMD is that bad at it. I think the games that have implemented RT have done so with a strong bias for the method that Nvidia uses. Is this wrong? No; Nvidia owns the market for GPUs share-wise. Is it scientific to suggest that Nvidia is just naturally better at RT? No. I don't believe we can say that unless a game dev puts equal effort into coding and fine-tuning the RT methods to work equally well for Nvidia and AMD alike. Then we could truly see. Clearly Avatar is heavily biased toward AMD, I won't lie, but the same can be said for other games like Cyberpunk, which is heavily biased toward Nvidia. I am not cheerleading for AMD here; I am just trying to say that TPU's numbers are not representative of how these cards are actually used today. I have a 3070 Ti, I've owned a 3090, and RT was great and all, but clearly faster at the time than on AMD. Still, I think scientifically we can't prove AMD is bad at RT from the outside looking in. We need developers to fully embrace AMD's method of rays with the same energy and enthusiasm as they do for NV. Maybe then we would see reality. But reality is defined by the one experiencing it, and so far the king is in fact the 4090, with the 4080S not too far behind, but that price difference is sickening.
[attached: three in-game benchmark screenshots]

Avg 208 fps @ 2k res ultra preset

I changed FSR scaling to Ultra Quality and got this

[attached: benchmark screenshot at FSR Ultra Quality]

Ignore the 7000 fps reading, but the avg is still 180.

And my card is the liquid-cooled ASRock with a 2900 MHz BIOS flash, so it's faster by quite a bit. That was pulling 475 W, I believe.

To be very fair to you and Nvidia, the 4080 Super is way more power efficient. Quite a bit, in fact. And with overclocking the 4080 can be a little beast. However, I am defending AMD this time. The 7900 XTX would be damn near a 4090 if AMD weren't so intent on holding back the power and frequency.
 
How can you guys even compare NVIDIA and AMD? The pricing is similar across almost the entirety of their product stacks. Somehow that makes AMD the Robin Hood of GPUs??
Oh trust me, I haven't lost sight of this. My enthusiasm for RDNA3 hit the floor when they said the cards were going to be $900 and $1,000 for the XT and XTX respectively. So, in essence: fuck no, AMD isn't the Robin Hood of GPUs. They are just as guilty, because if RDNA3 were an RT monster as well as a raster one, they'd be pricing the 7900 XTX right in line with the 4090.

AMD doesn't get a pass either just because they aren't currently in the spotlight in the GPU game. Scalpers and cryptomining showed them what the market would bear. Now we're all paying.
 
Oh trust me, I haven't lost sight of this. My enthusiasm for RDNA3 hit the floor when they said the cards were going to be $900 and $1,000 for the XT and XTX respectively. So, in essence: fuck no, AMD isn't the Robin Hood of GPUs. They are just as guilty, because if RDNA3 were an RT monster as well as a raster one, they'd be pricing the 7900 XTX right in line with the 4090.

AMD doesn't get a pass either just because they aren't currently in the spotlight in the GPU game. Scalpers and cryptomining showed them what the market would bear. Now we're all paying.
I agree. However, I was fortunate and took quite a risk at Microcenter: I got an XTX with a waterblock that had been returned, for $877, so it was a massively great deal. But the XTX is priced too high. It should be $750, the XT $550, and so forth. The 4080S should be $750, and the 4090 should be $1,000, maybe even $1,100, but $2K is asinine.
 
I agree. However, I was fortunate and took quite a risk at Microcenter: I got an XTX with a waterblock that had been returned, for $877, so it was a massively great deal. But the XTX is priced too high. It should be $750, the XT $550, and so forth. The 4080S should be $750, and the 4090 should be $1,000, maybe even $1,100, but $2K is asinine.
See, accounting for inflation, those are prices I could get behind. "Reasonable" (never thought I'd be saying that about Turing pricing, but here we are), as per the first-gen RTX cards. $2,000 at the top is way too much, agreed. We're now paying for GeForce cards what businesses used to pay for Quadros... It's not kosher.
 
Oh trust me, I haven't lost sight of this. My enthusiasm for RDNA3 hit the floor when they said the cards were going to be $900 and $1,000 for the XT and XTX respectively. So, in essence: fuck no, AMD isn't the Robin Hood of GPUs. They are just as guilty, because if RDNA3 were an RT monster as well as a raster one, they'd be pricing the 7900 XTX right in line with the 4090.

AMD doesn't get a pass either just because they aren't currently in the spotlight in the GPU game. Scalpers and cryptomining showed them what the market would bear. Now we're all paying.
Even a dual-GPU RX 7990 isn't outside the realm of possibility. There were supposed to be a 7950 XT, 7950 XTX, and 7970 XTXX to combat the 4090 Ti, 4090 Ti Super, and TITAN Ada respectively, before they abandoned the will to compete at the highest level and capped the lineup at the 7900 XTX. The reason Nvidia held back and followed up with so much bullshit afterwards until now is because of that.
 
Even a dual-GPU RX 7990 isn't outside the realm of possibility. There were supposed to be a 7950 XT, 7950 XTX, and 7970 XTXX to combat the 4090 Ti, 4090 Ti Super, and TITAN Ada respectively, before they abandoned the will to compete at the highest level and capped the lineup at the 7900 XTX. The reason Nvidia held back and followed up with so much bullshit afterwards until now is because of that.
RT killed them. They just need to readjust for next gen. I mean, why bring out a dual-GPU card if it'll still be maligned for having only 95% of the RT perf of a single 4090?
 
No sense in doing dual-GPU in this day and age, when SLI and CrossFire support have been basically abandoned because devs aren't willing to pick up the slack left by the driver teams after the move to DX12/Vulkan.
 
How can you guys even compare NVIDIA and AMD? The pricing is similar across almost the entirety of their product stacks. Somehow that makes AMD the Robin Hood of GPUs??

NVIDIA is the market leader and sets pricing, so I guess you can say they're greedy. AMD sets their pricing in line with NVIDIA, though, so we don't have some kind of huge value play on AMD's side here.
You are right. They are both greedy. When the 4080 was announced at $1,200 and the 7900 XTX at $999, the numbers made sense. When the 4080S was announced at $999, AMD refused to cut the price of its 7900 XTX, which shows that it is just as greedy as Nvidia.

If they had cut the price of the XTX to $799 I would have considered AMD, but now, no thanks.

I believe they are hoping that the 4080S is a low-stock run so that they don't have to lower the profit margins on their XTX card.

So although the 4080S is more attractive to most people now, if it sells out after 10,000 or so cards then AMD feels no need to respond, because, news flash, they don't want to lower prices if they don't have to.

The reason AMD is losing is that they don't seem to get that nobody picks them when they are not the price leader, because they are NOT the performance leader.

Who knows, maybe they are happy just beating up on Intel in the CPU sphere and don't care about coming in 2nd in the GPU sphere, because at least they are capturing most of the console market outside of the Nintendo Switch, which went with Nvidia.
 
Even a dual-GPU RX 7990 isn't outside the realm of possibility. There were supposed to be a 7950 XT, 7950 XTX, and 7970 XTXX to combat the 4090 Ti, 4090 Ti Super, and TITAN Ada respectively, before they abandoned the will to compete at the highest level and capped the lineup at the 7900 XTX. The reason Nvidia held back and followed up with so much bullshit afterwards until now is because of that.
Wait... I thought CrossFire (and SLI by association) was dead and buried on the consumer side due to DX12, lazy devs, and "relatively few" users of multi-GPU? I haven't heard of anything beyond a 7900 XTX in the pipeline, to my knowledge. I agree that nVIDIA held back to the point of mediocrity at all but the top because AMD can't currently compete. We need competition again to stir things up.
 
I understand. But I mean, if all the hype and tech now is DLSS and FSR 3+ as it evolves, then we might as well bench with it. If we want pure raw hardware raster, then I believe the 4090 is king; the 4080S and 7900 XTX are going to be arm wrestling for sure.

When it comes to ray tracing, I don't for a minute believe that AMD is that bad at it. I think the games that have implemented RT have done so with a strong bias for the method that Nvidia uses. Is this wrong? No; Nvidia owns the market for GPUs share-wise. Is it scientific to suggest that Nvidia is just naturally better at RT? No. I don't believe we can say that unless a game dev puts equal effort into coding and fine-tuning the RT methods to work equally well for Nvidia and AMD alike. Then we could truly see. Clearly Avatar is heavily biased toward AMD, I won't lie, but the same can be said for other games like Cyberpunk, which is heavily biased toward Nvidia. I am not cheerleading for AMD here; I am just trying to say that TPU's numbers are not representative of how these cards are actually used today. I have a 3070 Ti, I've owned a 3090, and RT was great and all, but clearly faster at the time than on AMD. Still, I think scientifically we can't prove AMD is bad at RT from the outside looking in. We need developers to fully embrace AMD's method of rays with the same energy and enthusiasm as they do for NV. Maybe then we would see reality. But reality is defined by the one experiencing it, and so far the king is in fact the 4090, with the 4080S not too far behind, but that price difference is sickening.

Avg 208 fps @ 2k res ultra preset

I changed FSR scaling to Ultra Quality and got this

Ignore the 7000 fps reading, but the avg is still 180.

And my card is the liquid-cooled ASRock with a 2900 MHz BIOS flash, so it's faster by quite a bit. That was pulling 475 W, I believe.

To be very fair to you and Nvidia, the 4080 Super is way more power efficient. Quite a bit, in fact. And with overclocking the 4080 can be a little beast. However, I am defending AMD this time. The 7900 XTX would be damn near a 4090 if AMD weren't so intent on holding back the power and frequency.

I wasn't saying that I didn't believe you were getting the performance you said. Just that it's not really impressive to get 200 fps in a game with upscaling AND frame gen on a top-end GPU; in that scenario even a 4070 is going to be at like 170 fps. And the "AMD RT method" is just to have minimal ray tracing; the reason CP2077 favors NV is that it uses near-full-fat ray tracing (as in, it's calculating shit-tons of light paths as opposed to a handful of best-guessed rays layered on top of a shader). The RT games that AMD is competitive in are games where you can't even see a difference between RT on and off. Avatar especially is guilty of being one of these "bad" RT titles, as even the RT reflections are low quality. Just remember: if a console can do it well, then it has been written to run well on a low-end PC.
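A toy cost model of that distinction, with invented rays-per-pixel and bounce counts (not measured from either game), just to show how quickly the ray budget diverges:

```python
# Sparse "effects" RT vs. near-full path tracing, as a rough ray budget.
def rays_per_frame(w: int, h: int, rays_per_pixel: int, bounces: int) -> int:
    return w * h * rays_per_pixel * bounces

W, H = 2560, 1440
sparse = rays_per_frame(W, H, rays_per_pixel=1, bounces=1)   # Avatar-style
full   = rays_per_frame(W, H, rays_per_pixel=2, bounces=4)   # CP2077 PT-style
print(f"sparse RT: {sparse / 1e6:.0f}M rays/frame")
print(f"full PT  : {full / 1e6:.0f}M rays/frame ({full // sparse}x the ray work)")
```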
 
I wasn't saying that I didn't believe you were getting the performance you said. Just that it's not really impressive to get 200 fps in a game with upscaling AND frame gen on a top-end GPU; in that scenario even a 4070 is going to be at like 170 fps. And the "AMD RT method" is just to have minimal ray tracing; the reason CP2077 favors NV is that it uses near-full-fat ray tracing (as in, it's calculating shit-tons of light paths as opposed to a handful of best-guessed rays layered on top of a shader). The RT games that AMD is competitive in are games where you can't even see a difference between RT on and off. Avatar especially is guilty of being one of these "bad" RT titles, as even the RT reflections are low quality. Just remember: if a console can do it well, then it has been written to run well on a low-end PC.

The big argument was never about who could trace rays better; everyone knows Nvidia can do it better. It was a claim that Nvidia was like waaay faster in Avatar, but it's not. When you use the upscaling tech that both cards can throw at the game, they are indistinguishably comparable. And since no one on Earth can humanly detect frame gen and upscaling (FSR, DLSS, etc.), shouldn't we use those numbers? Do I disagree with frame gen and the rest? I don't know. On one hand it feels shady, untruthful, and gimmick-like to me, but that's because I'm stuck in my old ways and all I know is pure hardware-level performance. But times are changing, with or without my old salty yesteryear expectations. I'm getting tired of beating up the GPUs because I don't like upscaling; it's like trying to swim vertically up a waterfall. It's futile at this point. So we might as well embrace the pixel dust and fairy-magic upscaling tech. With that said, the 4080S is not superior to the 7900 XTX in Avatar, and that chart posted has no relevance to the contemporary tech being used.

Is CP2077 better on Nvidia? You betcha, but the game is an abysmal pit of boredom after about 15 hours (personal opinion), so enjoy the win, Nvidia. And I also own an NV GPU, btw. There are many games where AMD is neck and neck with NV, and that is the bigger picture. We MUST, as hardware enthusiasts, support AMD, because we will never get true innovation without competition; we just get bullshit stagnation of tech and ever-increasing prices. We have seen this with Nvidia before AMD created RDNA, and we saw it with Intel before AMD created Zen.

Also, many people are postulating that this is going to DESTROY FSR and DLSS and become the upscaler of choice, because it goes beyond proprietary methods of pixel magic and fairy spells to upscale.

[attached: news screenshot]
 
I didn't see anyone claim that Nvidia was way faster in Avatar, though it does appear to be about 15% faster there regardless. Stating that they're the same performance because upscaling/frame gen can put them both at super high framerates is an odd argument; "hey, just put them both on a 60 Hz display and they're exactly the same!" Depending on the game, it can be anywhere from easy to hard to tell the difference between FSR and DLSS 2, or AFMF and DLSS 3, but the difference is almost always in favor of the Nvidia tech. Avatar has an issue where the UI doesn't update at the same rate as the rest of the image, so you get weird artifacts around it with AFMF enabled. I don't think Avatar has DLSS 3, so I don't think it can be directly compared, although that would be nice. Maybe one of those DLSS modders has managed to inject it?
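A tiny sketch of that UI artifact, assuming naive whole-frame interpolation: a HUD element that only moves at render rate gets blended across both positions on generated frames. Purely illustrative:

```python
# Frame generation interpolates the whole frame, so UI drawn into it gets
# blended too and "ghosts" on the generated in-between frame.
import numpy as np

frame_a = np.zeros((1, 4)); frame_a[0, 0] = 1.0   # HUD pixel, old position
frame_b = np.zeros((1, 4)); frame_b[0, 1] = 1.0   # HUD pixel, new position

generated = 0.5 * (frame_a + frame_b)             # interpolated frame
print(generated)   # [[0.5 0.5 0.  0. ]] -- HUD smeared across both positions
```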

CP2077 being "boring after 15 hours" is another odd argument. I thought we were talking about graphics tech? If we're going to shift the goalposts to whether a game is fun or not, one could argue that Avatar is just another boring copy-paste Ubisoft game...
 
I didn't see anyone claim that Nvidia was way faster in Avatar, though it does appear to be about 15% faster there regardless. Stating that they're the same performance because upscaling/frame gen can put them both at super high framerates is an odd argument; "hey, just put them both on a 60 Hz display and they're exactly the same!" Depending on the game, it can be anywhere from easy to hard to tell the difference between FSR and DLSS 2, or AFMF and DLSS 3, but the difference is almost always in favor of the Nvidia tech. Avatar has an issue where the UI doesn't update at the same rate as the rest of the image, so you get weird artifacts around it with AFMF enabled. I don't think Avatar has DLSS 3, so I don't think it can be directly compared, although that would be nice. Maybe one of those DLSS modders has managed to inject it?

CP2077 being "boring after 15 hours" is another odd argument. I thought we were talking about graphics tech? If we're going to shift the goalposts to whether a game is fun or not, one could argue that Avatar is just another boring copy-paste Ubisoft game...
It just doesn't make sense to compare one game, unless it's a game you are going to play for years and years (Fortnite?). This is why I love TechPowerUp's reviews and their aggregate of FPS per resolution across over 20 games.
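A sketch of why such an aggregate is harder to game with one outlier title, assuming (my assumption, not TPU's documented method) a geometric mean over per-game FPS ratios:

```python
# Per-game ratios weight every title equally, whereas averaging raw FPS
# lets one high-framerate game dominate the summary.
from math import prod

# hypothetical (card_a_fps, card_b_fps) pairs across five games
games = [(310, 295), (98, 100), (66, 75), (144, 131), (58, 80)]

ratios = [a / b for a, b in games]
geomean = prod(ratios) ** (1 / len(ratios))
naive = sum(a for a, _ in games) / sum(b for _, b in games)

print(f"geomean of per-game ratios: {geomean:.3f}")
print(f"ratio of summed raw FPS   : {naive:.3f}  (skewed by the 300-fps title)")
```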
 
Wait... I thought CrossFire (and SLI by association) was dead and buried on the consumer side due to DX12, lazy devs, and "relatively few" users of multi-GPU? I haven't heard of anything beyond a 7900 XTX in the pipeline, to my knowledge. I agree that nVIDIA held back to the point of mediocrity at all but the top because AMD can't currently compete. We need competition again to stir things up.
Oh, it's dead all right. I think he might be talking about the multi-chiplet design for GPUs that is rumored to come with RDNA5. But that's two generations from now; RDNA4, from all reports, will stick with a single monolithic GPU design.

This new chiplet design has already been patented by AMD as "Distributed Geometry", using what is described as a "Geometry Engine" design:

[attached: AMD patent diagram]


Article: https://www.pcgamer.com/amds-new-ch...r-graphics-cards-what-ryzen-did-for-its-cpus/
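Purely as speculation about what "distributed geometry" could mean in practice, here's a toy sketch of splitting a draw's triangles across chiplet geometry engines; nothing here reflects AMD's actual patented design:

```python
# Round-robin a draw call's triangles across chiplet geometry engines,
# each notionally culling/rasterizing its own slice in parallel.
from itertools import cycle

NUM_CHIPLETS = 3
queues: list[list[str]] = [[] for _ in range(NUM_CHIPLETS)]

triangles = [f"tri{i}" for i in range(10)]
for tri, q in zip(triangles, cycle(queues)):   # distribute across chiplets
    q.append(tri)

for i, q in enumerate(queues):
    print(f"geometry engine {i}: {q}")
```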
 
I didn't see anyone claim that Nvidia was way faster in Avatar; though it does appear to be about 15% faster there regardless. Stating that they're the same performance because upscaling/frame gen can put them at super high framerates is an odd argument; 'hey, just put them both on a 60hz display and they're exactly the same'! Depending on the game it can be anywhere from easy to hard to tell the difference between FSR and DLSS2 or AFMF and DLSS3 but that difference is almost always in favor of the Nvidia tech. Avatar has an issue where the UI doesn't update at the same rate as the rest of the image so you have weird artifacts around it with AFMF enabled. I don't think Avatar has DLSS3 so I don't think it can be directly compared; although that would be nice. Maybe one of those DLSS modders has managed to inject it?

CP2077 being 'boring after 15' hours is another odd argument. I thought we were talking about graphics tech? If we're going to shift the goalposts to it being about whether a game is fun or not one could argue that Avatar is just another boring copy-paste Ubisoft game...
You can't fight fanboyism. He admitted that you MUST support AMD. Seems clear-cut to me.
 