
It is contrast-aware sharpening, and the fact that a mere sharpening filter is beating months of machine-learned DLSS tells me that DLSS amounts to absolutely nothing more than marketing.

A simple sharpening pass in ReShade will drastically increase shimmering where it isn't needed. Of course, if they instead use the same contrast-aware sharpening, which is free for everyone like any other part of FidelityFX, then that won't be such a big problem.

And that's the problem: the only one overreacting is you, Snow. You have a hard time accepting that a simple sharpening filter with a simple "is it really needed here?" check is the better, actually usable solution.
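To be concrete about that check, here is a minimal, CPU-side Python sketch in the spirit of AMD's FidelityFX CAS (grayscale only; the function name and constants are my own simplification, not AMD's actual HLSL):

import numpy as np

def cas_like_sharpen(img, sharpness=0.5):
    """Contrast-adaptive sharpening sketch for a 2-D float image in [0, 1]."""
    p = np.pad(img, 1, mode="edge")
    c = p[1:-1, 1:-1]                      # center pixel
    n, s = p[:-2, 1:-1], p[2:, 1:-1]       # north / south neighbors
    w, e = p[1:-1, :-2], p[1:-1, 2:]       # west / east neighbors

    mn = np.minimum.reduce([c, n, s, w, e])
    mx = np.maximum.reduce([c, n, s, w, e])

    # The "is it really needed here?" check: headroom above and below the
    # local range. Pixels already at high local contrast get little extra
    # sharpening; low-contrast detail gets more.
    amount = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / (mx + 1e-5), 0.0, 1.0))

    # Negative-lobe weight, scaled by the user-facing strength slider
    # (roughly -1/8 at sharpness 0 up to -1/5 at sharpness 1).
    wgt = -amount * (0.125 + 0.075 * sharpness)

    out = (wgt * (n + s + w + e) + c) / (4.0 * wgt + 1.0)
    return np.clip(out, 0.0, 1.0)

The point is that the sharpening weight is computed per pixel from the local min/max, so flat sky stays flat and hard edges don't get over-driven.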

Edit:

We are more than 8 months past the unveiling of DLSS, and that second pass, the "image quality pass," still hasn't appeared for any supporting game. Seeing that it's trained per game, per resolution, per setting, seriously, stop thinking it will be widely adopted. Nvidia would be wiser to refine contrast-adaptive sharpening instead, as a simpler and much more effective way to increase the image quality of an upscaled lower-resolution source.
 
Again, not defending DLSS. It failed; you don't even need sharpening to do better, as was already demonstrated.

You also haven't looked into Nvidia Freestyle; it has multiple tunable controls that affect contrast and sharpness. The "Clarity" control sounds very similar to RIS: it sharpens while avoiding edge artifacts.

Really nothing new to see here.
 
It actually doesn't matter whether it is new or not. I've never even heard of Freestyle; it wasn't marketed at all AFAIK, and I'm here on the forum and on computer websites every day.

AMD has a chance to take something simple and basic but THAT WORKS, and market it correctly. No complex setup: just click a box in the driver and you're good to go.

As we have seen, it compares well against DLSS, and it also beats relying on display scaling or standard GPU scaling (which almost always looks bad).

So, yeah, it is a pretty big deal.
 
A strength slider doesn't make an algorithm contrast-aware. So again, not the same, but they can add the same thing if they want, as the algorithm has been shared freely by AMD.
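An illustrative sketch of the difference (the function names and the local_contrast input are mine, not from either vendor's driver): a strength slider just scales a fixed unsharp mask everywhere, while a contrast-aware filter modulates the boost per pixel:

import numpy as np
from scipy.ndimage import gaussian_filter

def sharpen_global(img, strength):
    # What a plain strength slider does: the same boost for every pixel,
    # edges and flat areas alike.
    return img + strength * (img - gaussian_filter(img, sigma=1.0))

def sharpen_adaptive(img, strength, local_contrast):
    # Contrast-aware variant: the boost is attenuated per pixel where
    # local contrast is already high (local_contrast could be local max
    # minus local min in a small window, as in the sketch above).
    amount = strength * (1.0 - np.clip(local_contrast, 0.0, 1.0))
    return img + amount * (img - gaussian_filter(img, sigma=1.0))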
 
I'm gonna take a wild-ass guess here and predict Nvidia will have that same checkbox in no time flat if it actually makes any difference in card sales.
 
I looked into the Freestyle controls.
They're nearly identical, but there is a big difference to me...

AMD's just works: enable it and done. Freestyle is not like that.
It's not overly aggressive, nor so soft that it shows no difference; it doesn't generate artifacts and mostly ignores what should be ignored.
In the end, it's not exactly magic. I don't know why they haven't done this before; it's something I already did with Freestyle for my most-played games.

Same as I don't know why something like the HDMI AA adapter isn't just built into our GPU dies already; it also just works, with no performance penalty.
If people are on HDMI 60 Hz displays, I say just buy it!

My conclusion: there are tons of great features out there, and Nvidia has them too, but AMD has done a great thing by making this one so simple.
DLSS was something released just to include machine learning, because machine learning sells: people see machine learning and think "future," a product of the future must be great, it's the next thing, and therefore you must buy it!
 
It should really be highlighted that these are two very different solutions to two very different problems. DLSS is primarily designed to work in conjunction with ray tracing, specifically RTX, and using it as a simple upscaling technology, while possible, really isn't the best use of it.

It's also quite apparently not very well dialed in, and that's where Nvidia appears to have failed up to this point. No idea why they haven't leveraged it better, as the need for it absolutely exists for 4K120 and VR, both of which Nvidia provides the best solutions for and for which even Nvidia's best products still fall short.

There are two related concerns with sharpening filters, and this is true whether we're talking about a game, a movie, or even photo editing.

The first is artifacting. While we've seen subjective claims of low artifacting, this is hard to make an objective claim about due to the sheer number of variables. If anything, AMD needs to get per-game customization sorted quickly.

The second, which is related to artifacting, is the potential introduction of false detail. This is a real problem in photography, as the algorithm can wind up adding detail that wasn't there in the object being photographed. In a game, this could lead to distractions or misidentifications that could be problematic for multiplayer and especially more competitive gaming. Some research is going to be needed to set appropriate boundaries to mitigate potential issues.
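A toy 1-D illustration of the false-detail point (the numbers are made up for the example): run a naive unsharp mask over a clean step edge and it manufactures halo values that exist in neither the scene nor the original signal:

import numpy as np

# A clean step edge: a flat dark region meeting a flat bright region.
edge = np.array([0.2] * 5 + [0.8] * 5)

# Naive unsharp mask: subtract a 3-tap box blur, add the difference back.
blur = np.convolve(edge, np.ones(3) / 3.0, mode="same")
sharpened = edge + 1.0 * (edge - blur)

# Interior samples (the ends are skipped to ignore padding effects):
print(np.round(sharpened[1:-1], 2))
# [0.2 0.2 0.2 0.  1.  0.8 0.8 0.8]
# The 0.0 undershoot and 1.0 overshoot flanking the edge are "detail"
# that was never there - exactly the halo/false-detail concern above.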
 
It looks pretty good to me, but it does work better on some games than others.

My monitor on the 5700 XT rig is 1080p, so I only tested at 720p and 900p. 720p looked bad, but better than you would assume. I mean, it was playable quality; it just looked like an overblown Photoshop filter. But I would say it was still better than the blur of plain upscaled 720p.

At 900p it was very close to 1080p quality overall, just with some artifacts. Of course, native 1080p still looked better, but 900p was acceptable to me. SotTR in particular looked good; L4D didn't look as nice, but it was usable, and it could be key to getting a little extra performance.

It would also be more useful at 1440p or 4K, as I'm already getting great performance at 1080p, so RIS isn't needed there.
 