AMD's Radeon RX 7900-series Highlights

Main thing I want to know is how well these handle RTX.
RTX is Nvidia's naming. AMD was clear: roughly 1.7x RT, which means that for yesterday's and today's games Nvidia should have a clear advantage. In general I expect the 7900 XTX to land around 3090 level. In more recently optimized games and future titles, AMD should catch up some.
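As a rough back-of-envelope (my own framing, not AMD's numbers): if the ~1.7x uplift is measured against the 6950 XT, then

$$\mathrm{RT}_{7900\,\mathrm{XTX}} \approx 1.7 \times \mathrm{RT}_{6950\,\mathrm{XT}}$$

so whether that actually equals 3090 level depends on how far ahead of the 6950 XT the 3090 is in a given title. In the heaviest RT games that gap is wider, so I'd expect the 7900 XTX to fall further behind there and close up in lighter RT titles.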
 
Main thing I want to know is how well these handle RTX.
It's well known it's going to be around 3090 and 3090 Ti level for RT, going by AMD's own benchmarks. So there is that. Don't expect anything better, but I guess in most RT cases you still have to use DLSS 2 or FSR to truly enjoy it, which can make things playable at 4K.
 
4090 vs 7900 XTX, blah, who cares. We should all be buying a 3080 Ti or 6900 XT at half off. I paid $1500 for my PowerColor Devil and I don't want anyone else making that mistake. Now that I see them for $600, I can max all settings at 1440p 144 Hz on mine. What more do we need?
 
Pointing out the obvious isn't getting riled up. But the fact that you think it does and then revel in it says so much about you. :coffee:

The whole point of AMD changing the Navi naming scheme from three digits (RX 480) to four digits (6900 XT) is to compare them to Nvidia's offerings; if it weren't, they could have happily continued naming them something that doesn't even remotely resemble their competition.

And here is the crux of any argument in defense of AMD. People go around calling Nvidia "Ngreedia" while in the same breath praying that AMD will save them from the big bad green. Even on these very forums.

Except they already have a history of not naming their top tier x900. Like the 5000 series, where their top tier was the 5700 XT, because that's what they could compete with. This generation they just said, "Fuck it".

The last three generations were the only time AMD has tried to play monkey see, monkey do with their naming scheme on the GPU side, and thus the only time AMD's and Nvidia's numbers were even remotely aligned. And I absolutely agree that they made a mistake in trying to copy Nvidia's naming scheme.
They can still make a 76xx, 77xx, 78xx, 7950, 7970, and 7990.

The whole point in AMDs changing the Navi naming scheme from 3 digits (RX480) to 4 digits (6900XT) is to compare them to Nvidia's offerings
Or maybe it's because back in the day there was the Radeon HD 6000 series: 67xx, 68xx, 6950, 6970 and 6990.
 
4090 vs 7900xtx, blah who cares. We should all be buying 3080ti or 6900xt half off. While I paid $1500 for my power color devil, I don't want anyone else making that mistake. Now that I see them for $600, I can max all settings 1440p 144hz on mine. What more do we need?
I’m excited for the 7900s. I hope to buy one…in two years at a steep discount.
 
I think the answer will be "not particularly well", but I don't expect AMD RT to be competitive with NVIDIA for several more generations. They're getting better at it though.
I expect the game engines will find a way around it and level the playing field there before AMD catches up. Things like Lumen/Nanite and the Entity Component System will create a baseline that exists somewhat independently and is a little more platform-agnostic.

I half expect that Nvidia pushes it as much as they do because it is absolutely needed in the enterprise space, and with current manufacturing and development costs, having separate cores for enterprise and consumer parts is too expensive.

In future generations, when Nvidia isn't monolithic and UCIe standards are more nailed down, I expect Nvidia may start changing up the options on their consumer and business-class cards a little more than they do now, and the RT cores may simply land in a spot where they are "good enough". Nvidia owns a lot of industry-leading patents on light-reflection calculations and noise reduction that I doubt AMD, or anybody else for that matter, has the resources to top any time soon. In a generation or two Nvidia won't need 2x AMD's performance there; the 10-15% their algorithms provide over everything else out there will be enough.
 
4090 vs 7900xtx, blah who cares. We should all be buying 3080ti or 6900xt half off. While I paid $1500 for my power color devil, I don't want anyone else making that mistake. Now that I see them for $600, I can max all settings 1440p 144hz on mine. What more do we need?
Max all settings? What about raytracing? You can't do that.
 
Max all settings? What about raytracing? You can't do that.

I think people need to think about Ray Tracing AS 'Max Settings'

Install a browser extension to replace the words "Ray Tracing" with "Max Settings" and listen to how bad AMD apologists sound.

"I don't really care about Max Settings, I'm happy with turning Max Settings off and playing with higher FPS"

"Yeah, Who cares about Max Settings when in most games I can't tell the difference?"

"So what if AMD looses in Max Settings if you turn that off its faster!"

It kind of sounds a bit defeatist when you look at it like that?
 
I think people need to think about Ray Tracing AS 'Max Settings'

Install a browser extension to replace the words "Ray Tracing" with "Max Settings" and listen to how bad AMD apologists sound.

"I don't really care about Max Settings, I'm happy with turning Max Settings off and playing with higher FPS"

"Yeah, Who cares about Max Settings when in most games I can't tell the difference?"

"So what if AMD looses in Max Settings if you turn that off its faster!"

It kind of sounds a bit defeatist when you look at it like that?
Yep... I don't get how they rationalize it but they do... :rolleyes:. Raytracing is part of max settings.

"Well I can max everything but uh, no aa or AF, and I have to disable post process effects, guys! I'm maxing it!" - AMD fan back in the day.
 
I think people need to think about Ray Tracing AS 'Max Settings'

Install a browser extension to replace the words "Ray Tracing" with "Max Settings" and listen to how bad AMD apologists sound.

"I don't really care about Max Settings, I'm happy with turning Max Settings off and playing with higher FPS"

"Yeah, Who cares about Max Settings when in most games I can't tell the difference?"

"So what if AMD looses in Max Settings if you turn that off its faster!"

It kind of sounds a bit defeatist when you look at it like that?
You could also use the same argument for PhysX. And you could further that by saying RT at present is about as useful as PhysX was.
 
You could also use the same argument for Physx. And you could further that by saying RT at present is about as useful as Physx was.
Not at all. Raytracing is not proprietary, it's vendor-neutral, and it's part of all the major graphics APIs. It also makes a huge difference in many cases. It's a standard graphics option now.

You could say a lot of things, but that doesn't make them true. :ROFLMAO:. ⚡
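If anyone wants to see what "vendor-neutral" means in practice, here's a minimal sketch (my own, untested, error handling omitted) that just asks stock Vulkan whether a GPU advertises the cross-vendor KHR ray tracing extensions; nothing in it is Nvidia-specific:

```cpp
// Minimal sketch: list each GPU and whether it exposes the cross-vendor
// Vulkan ray tracing extensions. Assumes a Vulkan SDK/loader is installed.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    VkInstanceCreateInfo ici{};
    ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props{};
        vkGetPhysicalDeviceProperties(gpu, &props);

        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        bool rtPipeline = false, accel = false, rayQuery = false;
        for (const VkExtensionProperties& e : exts) {
            if (!std::strcmp(e.extensionName, "VK_KHR_ray_tracing_pipeline")) rtPipeline = true;
            if (!std::strcmp(e.extensionName, "VK_KHR_acceleration_structure")) accel = true;
            if (!std::strcmp(e.extensionName, "VK_KHR_ray_query")) rayQuery = true;
        }
        std::printf("%s: rt_pipeline=%d accel_struct=%d ray_query=%d\n",
                    props.deviceName, rtPipeline, accel, rayQuery);
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

RDNA 2/3, Turing/Ampere/Ada, and Intel Arc all report these same KHR extensions; the only vendor-specific part is how fast the hardware actually traverses the acceleration structures.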
 
Not at all. Raytracing is not proprietary, is vendor neutral, and part of all major graphics api. It also makes a huge difference in many cases. It's a standard graphics option now.

You could say a lot of things, but that doesn't make them true. :ROFLMAO:. ⚡
For many, using it degrades the IQ and makes games unplayable. So not max after all :smug:
 
It doesn't degrade image quality, :LOL:. It's part of max settings, whether you like it or not. It has no effect on playability if you have a good card for it.
Really? I guess, if 10 fps at 720p on a 2060 is acceptable to you.

I can run Cyberpunk at Ultra, 4K with Performance DLSS, and get, if I remember right, 50 fps; over 60 fps with a more aggressive DLSS mode, but then it looks like crap. IQ is definitely degraded by Performance DLSS. It performs better and looks better (sharper textures, more fluid, way faster) with RT off; a better game experience overall. Now, are you saying a 3090 has crap RT performance? That's about the level the 7900 XTX should be at, if not better.
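That softness tracks with the internal render resolution. Going by the commonly cited DLSS scale factors (roughly 0.50x per axis for Performance and 0.67x for Quality):

$$3840 \times 2160 \;\xrightarrow{\times 0.50}\; 1920 \times 1080 \ (\text{Performance}), \qquad 3840 \times 2160 \;\xrightarrow{\times 0.67}\; \approx 2560 \times 1440 \ (\text{Quality})$$

So 4K Performance mode is basically a 1080p render being reconstructed, which is why textures look softer than Quality mode's roughly 1440p base.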
 
You could also use the same argument for Physx. And you could further that by saying RT at present is about as useful as Physx was.
PhysX is still used in a lot of titles and is still actively developed. It's just not very good for rigid-body physics; for that, Havok or Bullet will do better. But if it's purely for particle, liquid, or visual effects, then PhysX is about the best there is. This may have changed a little with the release of v5 of the PhysX engine; Nvidia was promising lots of improvements to rigid-body effects, but I haven't played with it.

Edit:
It's also available as open source under a BSD license.
https://github.com/NVIDIA-Omniverse/PhysX
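For anyone curious what that repo gives you, the basic setup looks roughly like this; a minimal, untested sketch against the public PhysX 4/5 SDK headers (you'd link PhysX, PhysXCommon, PhysXFoundation and PhysXExtensions), just to show it's an ordinary library anyone can pull in:

```cpp
// Minimal sketch: drop a sphere under gravity for two simulated seconds.
// Based on the public PhysX SDK headers; not production code.
#include <PxPhysicsAPI.h>
#include <cstdio>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    PxMaterial*     material = physics->createMaterial(0.5f, 0.5f, 0.1f);
    PxRigidDynamic* ball     = PxCreateDynamic(*physics, PxTransform(PxVec3(0.0f, 10.0f, 0.0f)),
                                               PxSphereGeometry(0.5f), *material, 1.0f);
    scene->addActor(*ball);

    for (int i = 0; i < 120; ++i) {      // 120 steps at 60 Hz = 2 simulated seconds
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }
    std::printf("ball height after 2 s: %.2f m\n", ball->getGlobalPose().p.y);

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
```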
 
It doesn't degrade image quality, :LOL:. It's part of max settings, whether you like it or not. It has no effect on playability if you have a good card for it.
Well, even on a 4090, if we are talking 4K anyway, it's still a choice between high refresh and RT. There is a reason Nvidia only wants to talk about DLSS and frame generation. The only way to get high refresh plus RT right now is to enable DLSS and frame generation, and those for sure degrade IQ. For the most part, anyway... I guess in a few titles that use it sparingly you can probably flip it on and forget it with 3090/4080/4090-class cards. In the few titles that really use it, like Cyberpunk, even the 4090 drops under 60 fps at max settings, and only gets into 60 fps territory with DLSS.
 
Well even on a 4090 if we are talking 4k anyway. Its still a choice between high refresh and RT. There is a reason Nvidia only wants to talk about DLSS and frame generation. The only way to get high refresh + RT right now is to enable DLSS and Frame generation those for sure degrade IQ. For the most part anyway... I guess a few titles that use it sparingly you can probably flip it on and forget it with 3090/4080/90 class cards. With the few titles that really use like Cyberpunk even the 4090 drops under 60fps at max settings... and only gets to 60fps territory with dlss.
Frame gen has very little downside visually, going by end-user reports, so long as you're at 45-60 fps without it. DLSS Quality mode is good stuff; I can rarely tell the difference in motion (I run an EVGA 3080 FTW3 Hybrid and have since Dec 2020)... I wouldn't bother with Performance mode and don't have to.

Virtually nothing is as demanding as Cyberpunk 2077, so it's kind of silly to use that as the one and only example. It also hasn't had its DLSS 3 frame gen patch yet (soon it will; sacrifice a scrap of input lag for double the fps? Yes please!).

You can run many games without DLSS, or with only DLSS Quality, on a high-end card already, and that will only trickle down with further generations.

noko seems obsessed across the forums with claiming raytracing is bad for some reason. I'm not going to bother responding to his misleading tripe that treats raytracing as if it were part of DLSS :).
 
The only way to get high refresh + RT right now is to enable DLSS and Frame generation those for sure degrade IQ.
At 4K, sure. At 1440p, not even breaking a sweat. Something a bunch of people didn't think Nvidia would be able to do on their third iteration of the technology, but here we are.

And DLSS once implemented correctly has proven to improve IQ instead of degrading it. Something that can't be said about FSR.
 
At 4K, sure. At 1440p, not even breaking a sweat. Something a bunch of people didn't think Nvidia would be able to do on their third iteration of the technology, but here we are.

And DLSS once implemented correctly has proven to improve IQ instead of degrading it. Something that can't be said about FSR.
This. Raytracing is brand new to consumer gaming... Very few run 4K (I have since 2014), and some of the best monitors, such as the 240 Hz LG OLED and the Alienware ultrawide, don't even do 4K yet.

DLSS can enhance quality, or at worst lose barely anything at all, already. Maybe noko, being an AMD fan, hasn't yet experienced DLSS and has only seen the far worse FSR?

Thing is, unlike what noko claims, you can do raytracing in many cases and at most resolutions without DLSS now. At 4K you can with Quality mode. People at Guru3D were showing off ultra-high refresh with 4K maxed Spider-Man raytracing at 180+ fps with DLAA and frame generation enabled by DLSS 3 on their 4090 cards. I can't wait to see that in person! The videos looked amazing.
 
Frame gen has very little downside visually from end-user reports so long as you're at 45-60fps without it. Dlss quality mode is good stuff, I rarely can tell the difference in motion (I run a 3080 EVGA ftw3 hybrid and have since Dec 2020)... I wouldn't bother with the performance mode and don't have to.

Virtually nothing is as demanding as Cyberpunk 2077, so it's kind of silly to use that as the one and only example. It also hasn't had its dlss 3 frame gen patch yet (soon it will, sacrifice a scrap of input lag for double fps? Yes please!).

You can run many games without dlss or only dlss quality on a high end card already, and that will only trickle down with further generations.

noko seems obsessed across the forums with claiming raytracing is bad for some reason. I'm not going to bother responding to his misleading tripe as if raytracing is a part of dlss :).
Nope, not obsessed, nor brainwashed. RT can increase IQ if one has acceptable performance with it. If performance is severely degraded and other compromises are made in order to use it, IQ can be degraded. Not very hard to follow.

Control is another rather hard-hitting title. I played it at 1440p, DLSS Quality, max settings, at roughly 70 fps. DLSS degraded the experience somewhat (motion artifacts) but gave acceptably smooth gameplay, with RT improving the IQ enough to make it a better overall gaming experience. At 4K on a 42" OLED my choice would be much different; smooth gameplay is the most important thing, with as high quality as possible. 4K without RT may be better than 1440p with RT, in other words. Most likely I would still use some of the RT options as long as performance is good enough for me.

I've seen DLSS degrade image quality (Control, Cyberpunk) but also seen it improve image quality (Doom Eternal, Rise of the Tomb Raider). It's not cut and dried.
 
Frame gen has very little downside visually from end-user reports so long as you're at 45-60fps without it. Dlss quality mode is good stuff, I rarely can tell the difference in motion (I run a 3080 EVGA ftw3 hybrid and have since Dec 2020)... I wouldn't bother with the performance mode and don't have to.

Virtually nothing is as demanding as Cyberpunk 2077, so it's kind of silly to use that as the one and only example. It also hasn't had its dlss 3 frame gen patch yet (soon it will, sacrifice a scrap of input lag for double fps? Yes please!).

You can run many games without dlss or only dlss quality on a high end card already, and that will only trickle down with further generations.

noko seems obsessed across the forums with claiming raytracing is bad for some reason. I'm not going to bother responding to his misleading tripe as if raytracing is a part of dlss :).
I have heard frame generation is sickness-inducing unless you're well above 60 fps already. I'll stipulate the tech is new... and I haven't sat down and played a game with it myself at this point. From what I have seen of garbled images, and the reported effect that is similar to the soap opera effect when televisions do motion interpolation, it doesn't seem like it's for me. (And for sure you can't argue it doesn't introduce some IQ degradation... which was my point: if you need frame generation to get over 60 fps on even the $1600 card, it's questionably the max IQ setting imo anyway.)
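To put a rough number on why it can feel off, here's a simplified model (my own, ignoring Reflex and display latency): DLSS 3 interpolates between two fully rendered frames, so it can roughly double what you see while leaving responsiveness tied to the rendered rate, plus about one extra rendered frame of delay for the interpolation:

$$f_{\text{display}} \approx 2\,f_{\text{render}}, \qquad \text{input latency} \gtrsim \frac{1}{f_{\text{render}}}$$

So at 45 rendered fps you see ~90 fps on screen, but input still moves at roughly the 45 fps cadence (about 22 ms per real frame), which is the same kind of disconnect TVs create with motion interpolation.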

As for Cyberpunk... I go there because frankly there just aren't that many RT games at this point. We are still well south of 100 total titles, years into RT. I know the numbers are growing... the Witcher 3 patch in a few days looks interesting.

RT performance will be important at some point, I don't doubt it. I'm just not sure we are there yet. If even the 4090 needs to resort to tricks like frame generation to get decent frame rates, the tech is just not there yet imo. Perhaps I'm biased, as none of the games I care about much have RT (or their implementations aren't revolutionary; I do go back to The Riftbreaker now and then, and RT isn't a massive improvement there imo).
 
This. Raytracing is brand new to consumer gaming... Very few run 4k (I have since 2014), but some of the best monitors such as the 240hz oled LG one and the alienware ultrawide don't even do 4k yet.

DLSS can enhance quality or at worst lose barely anything at all, already. Maybe noko being an AMD fan hasn't yet experienced dlss and only has seen the far worse FSR?

Thing is, unlike what noko claims, you can do raytracing in many cases and most resolutions without dlss now. At 4k you can with quality mode. People at guru 3d were showing off ultra high refresh with 4k maxed Spider-Man raytracing at 180+ fps with DLAA and frame gen enabled by dlss3 on their 4090 cards. I can't wait to see that in person! The videos looked amazing.
That is your argument, that I must be a fanboy? Lol, I own two Nvidia cards, a 3090 and a 3080 Ti. Maybe speak to what you know instead of accusing others of really being Nvidia fans.

You actually think most RTX users care about RT? Maybe, if it can improve the gaming experience; otherwise, off it will be. Which Nvidia RTX cards have good enough RT for most people? The 2060? 3060? 3070? 4090?
 
At 4K, sure. At 1440p, not even breaking a sweat. Something a bunch of people didn't think Nvidia would be able to do on their third iteration of the technology, but here we are.

And DLSS once implemented correctly has proven to improve IQ instead of degrading it. Something that can't be said about FSR.
No doubt there; at 1440p, sure, even last generation was able to get decent frame rates in the lighter RT titles. DLSS, even Quality... it's a taste thing. It "improves" some aspects and degrades others. Ghosting still happens even with the latest version of DLSS 2. I'm not going to argue FSR is better, though the latest FSR is pretty close... and I'll agree the Quality setting on both FSR and DLSS is pretty usable, worth using if you're not already over 100 fps anyway.
 
It's kind of funny watching someone who posts paragraphs of 100% praise for everything Nvidia does call someone else the fanboy for having a handful of reservations about raytracing. A feature that the vast majority still don't use.

Idk. Just an interesting perspective difference I guess.
 
I have three games I play with RT on currently: Gotham Knights, MechWarrior 5, and now Darktide. And it's awesome.
I look forward to adding Baldur's Gate 3 to that list when it launches. I have pre-paid, so I can play it now, but after sinking in too many hours I had to stop out of fear of not wanting to play it when it does launch.
 
It doesn't degrade image quality, :LOL:. It's part of max settings, whether you like it or not. It has no effect on playability if you have a good card for it.
It's max settings, yet to use it properly the RTX owner also has to activate DLSS, which "degrades image quality".
You can only call it gaming at max settings with RT when the user runs at native resolution, not DLSS / FSR / XeSS.
 
You could also use the same argument for Physx. And you could further that by saying RT at present is about as useful as Physx was.
But PhysX was a real selling point for some people. Much more niche and specific to circumstances but nonetheless it was a feature that added to the experience when enabled.

Also, comparing PhysX to RTX is an apt example, because it's something that, while initially exclusive to Nvidia, was deemed useful and beneficial enough to become standardised and open-sourced.
 
It's max settings yet to properly using it the RTX owner shall activate DLSS options also, which "degrade image quality".
You can call it gaming on max settings using RTX when the user uses native resolution, not DLSS / FSR / XeSS.
DLSS has some 'better than native' features when enabled, especially the rendering of high frequency transparency and sub-pixel texture detail. It also creates a much more temporally stable image versus native or TAA. It's not a clean sweep benefit but in many instances it can generate a net positive improvement over native and at higher FPS.


ALSO moot point, because any situation where you'd compare with DLSS on Nvidia is a situation where you'll want to enable FSR on AMD cards as well, and even then: Nvidia does Max settings better.

ALSO ALSO moot point, because even if you DON'T enable image reconstruction on either card, Nvidia is still faster at maximum settings... by a lot.
 
RT degrades IQ?

*citation required
My only complaint about some of the raytracing I've seen is this sort of... radiation-particle look around the edges. You see it in those render programs where the software seems to run passes and more calculations to fill in more and more detail until it smooths out. It just gives the impression that our current performance levels aren't enough to completely get rid of that look in real time yet.
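That "passes filling in detail" look is Monte Carlo sampling converging; the noise falls off with the square root of the sample count, roughly

$$\sigma \propto \frac{1}{\sqrt{N}}$$

so halving the noise takes four times the samples. Offline renderers can throw hundreds or thousands of rays per pixel at that; real-time RT budgets are more like 1-2 per pixel, and the denoiser that cleans up the remainder is what produces the speckled, slightly smeared edges you're describing. That part should improve with raw performance, exactly as you say.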
 
what time does the review embargo lift?
9am EST, normally.
But PhysX was a real selling point for some people. Much more niche and specific to circumstances but nonetheless it was a feature that added to the experience when enabled.

Also comparing PhysX to RTX is an apt example because it's something that while initially was exclusive to Nvidia, was deemed to be useful and beneficial enough to become standardised and open sourced.
That assessment is mine as well. RT, while not proprietary to nV, was a selling point for them starting with Turing, when it wasn't even a glint in AMD's eye. PhysX too was a selling point, and nV capitalized through their ability to run the feature. Moreover, it continued to be a niche feature (however much nV pushed it) and it became more important in the eyes of consumers and AMD, morphing into open-source solutions. In a way, the same point could be made for G-Sync/FreeSync: nV capitalizes on their hardware/license to run it, the feature becomes more useful/prevalent, AMD comes out with support, and finally it becomes largely open source/hardware-agnostic.
 
9am EST, normally.

That assessment is mine as well. RT, while not proprietary to nV, was a selling point for them starting with Turing when it wasn't even a glint in AMD's eye. Physx too was a selling point, and nV capitalized through their ability to run the feature. Moreover, it continued to be a niche feature (however much nV pushed it) and it became more important in they eyes of consumers and AMD, morphing into open source solutions. In a way, the same point could be said for gSync/Freesync: nV capitalizes on their hardware/licence to run it, the feature becomes more useful/prevelant, AMD comes out with support, and finally it becomes largely open source/hardware agnostic.

AMD Allegedly Has 200,000 Radeon RX 7900 Series GPUs for Launch Day


Hmm 🤔 🧐 not too shabby
 