Radeon Image Sharpening ported to ReShade.

Revdarian
https://www.dsogaming.com/news/modd...ade-allowing-all-nvidia-amd-gamers-to-use-it/

Long story short: since the algorithm for the contrast-adaptive sharpening was released to the public, a hobbyist ported it to ReShade, rewriting the parts that used rapid packed math and fp16, which ReShade doesn't support. That costs a small amount of performance, but it can now run on Nvidia hardware, and on DX11/OpenGL as well.

Sounds like it's worth a try.
 
Awesome! Thanks a lot.

Btw, it should work on basically any GPU, so it's really sweet :D
 
Yeah, AMD made it public... hint hint, Nvidia. Kind of like FreeSync is available to anyone while G-Sync is closed. Two different models. When you own the majority of the market, you get a lot more say in the adoption of features.
 
Yeah, AMD made it public... hint hint, Nvidia. Kind of like FreeSync is available to anyone while G-Sync is closed. Two different models. When you own the majority of the market, you get a lot more say in the adoption of features.

You know that you can use FreeSync with Nvidia cards, right? Nvidia enabled that a while ago...

Also, I'm going to take a wild-ass guess and say that there are probably more FreeSync monitors out there than G-Sync ones.
 
I'd like to see an IQ/performance comparison vs. DLSS and RIS.
 
You know that you can use FreeSync with Nvidia cards, right? Nvidia enabled that a while ago...

Also, I'm going to take a wild-ass guess and say that there are probably more FreeSync monitors out there than G-Sync ones.

Yes, because FreeSync is open for anyone to use, including Nvidia... unlike Nvidia's solution, G-Sync, which is closed: AMD cannot implement it (or at least not without some crazy negotiations).
 
That's pretty cool. If I understand it correctly, it really needs the resolution scaling feature that is only found in a small subset of games like Battlefield, though, right?
 
That's pretty cool. If I understand it correctly, it really needs the resolution scaling feature that is only found in a small subset of games like Battlefield, though, right?

I'm pretty sure downsampling should be possible through the drivers, too.
 
I'm pretty sure downsampling should be possible through the drivers, too.

Can anyone confirm if this is possible? My 1080 is long in the tooth at 4k. I need to upgrade but I don't want to spend a ton of money now, since all the new cards are coming next year...
 
Can anyone confirm if this is possible? My 1080 is long in the tooth at 4k. I need to upgrade but I don't want to spend a ton of money now, since all the new cards are coming next year...
Downscaling? That's been supported by AMD and Nvidia for quite a while.
 
Downscaling? That's been supported by AMD and Nvidia for quite a while.

Are you talking about DSR? My projector is already 4k. I was trying to figure out a cheaper way to get to next year than buying a holdover 2080 Super at $700. I was thinking 5700 XT + sharpening, or using ReShade to get the benefits of sharpening on my 1080.

Resolution downscaling is only available in some games like Battlefield, though. So I think in order to do sharpening for all games, I would need to change my resolution at the desktop level to 1440p or 1800p.
 
This is only CAS being ported from FidelityFX to ReShade.

FidelityFX combines Contrast-Adaptive Sharpening (CAS) with Luma Preserving Mapping (LPM). (LPM is very similar to neural upscaling, but it rebuilds the image as if it were using PCM to upscale as a wave/fractal.) I think only the sharpening works with ReShade at the moment, not the upscaling.
 
Played with this on a couple of titles; it is a nice, clean sharpen. I'm not using it as part of a resolution scale, so I dialed its strength back to 50-70%. Really brings out details nicely without looking grainy or harsh.

Neat!
 
Are you talking about DSR? My projector is already 4k. I was trying to figure out a cheaper way to get to next year than buying a holdover 2080 Super at $700. I was thinking 5700 XT + sharpening, or using ReShade to get the benefits of sharpening on my 1080.

Resolution downscaling is only available in some games like Battlefield, though. So I think in order to do sharpening for all games, I would need to change my resolution at the desktop level to 1440p or 1800p.

Yeah, so basically the opposite of DSR...or DSR at sub-100% scaling. Instead of rendering at a higher resolution and scaling it down, render at a lower resolution and scale it up.
 
Can anyone confirm if this is possible? My 1080 is long in the tooth at 4k. I need to upgrade but I don't want to spend a ton of money now, since all the new cards are coming next year...
Yes. Radeon Image Sharpening will work at basically any resolution your monitor supports. You just need a Navi card and to enable both RIS and GPU scaling in the driver.

In my case, I have a 1080p monitor and I got the best results at 900p. If you have a 4k display, you will probably want to set it to 1440p or 1800p if possible. It works in any DX9, DX12, or Vulkan title. If the game has a render scale option, you can use that to fine-tune it more, but it's not needed.
 
Unless that sharpening algorithm is some kind of magic, I'm failing to comprehend why such features are often blown out of proportion. The thing with sharpening is, you want as little of it as possible.
 
The point is to restore some sharpness and detail when upscaling, for example when running 1440p on a 4k monitor.

Most video cards struggle at 4k, but if you can get good performance at 1440p, then the sharpening helps reduce the blur of standard upscaling methods.
 
Yes. Radeon Image Sharpening will work at basically any resolution your monitor supports. You just need a Navi card and to enable both RIS and GPU scaling in the driver.

In my case, I have a 1080p monitor and I got the best results at 900p. If you have a 4k display, you will probably want to set it to 1440p or 1800p if possible. It works in any DX9, DX12, or Vulkan title. If the game has a render scale option, you can use that to fine-tune it more, but it's not needed.

Ok, so sharpening would work fine if I set my desktop resolution to 1800p (a custom resolution) or 1440p and then sharpened? Would it also work if I just set the game resolution lower?

Everything I was seeing said to use resolution scaling from the game options, which isn't in a whole lot of games. There's not much info on using a desktop or different game resolution and then sharpening.
 
Ok, so sharpening would work fine if I set my desktop resolution to 1800p (a custom resolution) or 1440p and then sharpened? Would it also work if I just set the game resolution lower?

Everything I was seeing said to use resolution scaling from the game options, which isn't in a whole lot of games. There's not much info on using a desktop or different game resolution and then sharpening.

It works in both cases. The benefit of in-game reduced resolution scale is that it keeps the UI elements at full res and just does the 3D worldspace rendering at low res.

It's also okay to just lower the game resolution overall, but then you can get slightly fuzzy and stretched-looking UI elements. It's not a huge deal, but crisper buttons / HUD elements are nice.
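To illustrate the difference, here's a little Python/Pillow sketch of what an in-game render-scale path does conceptually. The render_3d and render_ui callables are hypothetical stand-ins for the engine, not any real API:

```python
from PIL import Image

NATIVE = (3840, 2160)   # display resolution
SCALE = 0.75            # hypothetical in-game render scale

def compose_frame(render_3d, render_ui):
    # render_3d(size) and render_ui(size) are stand-ins for the
    # engine: each returns a PIL Image at the requested size
    # (the UI one with an alpha channel).
    low = (int(NATIVE[0] * SCALE), int(NATIVE[1] * SCALE))

    # 3D worldspace rendered small, then upscaled; a sharpening
    # pass like CAS would slot in right after this resize.
    scene = render_3d(low).resize(NATIVE, Image.BILINEAR)

    # HUD/UI drawn directly at native resolution stays crisp.
    ui = render_ui(NATIVE)
    scene.paste(ui, (0, 0), ui)   # alpha-composite the UI on top
    return scene
```

Lowering the whole game resolution instead means the UI goes through that same upscale, which is where the fuzzy buttons come from.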
 
The point is to restore some sharpness and detail when upscaling, for example when running 1440p on a 4k monitor.

Most video cards struggle at 4k, but if you can get good performance at 1440p, then the sharpening helps reduce the blur of standard upscaling methods.

Or to restore detail lost to things like anti-aliasing (e.g. TAA in Fallout 4, basically every implementation of FXAA).
 
Here you go: https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/

Conclusion: DLSS looks better for a given downsampled resolution (sort of), but the performance hit for DLSS is greater, meaning that RIS looks better for a given level of performance. The image quality floor of DLSS is much lower, though (see the BFV tank shot).

But now that it has been ported to ReShade, you can use RIS (AKA CAS) on Nvidia cards as well, so you can use it instead of DLSS if you don't like DLSS.
 
I applaud Nvidia for going outside the box and trying to come up with a new solution, but sometimes simpler is more effective. Also, AMD leaving it open source makes it simple for anyone (including their rival Nvidia) to implement it (but that would be admitting AMD had a good idea/product, so they will wait until it all dies down and is almost forgotten before implementing it). Just look how long Nvidia waited to enable FreeSync, which had been possible for a really long time, but they wanted to keep AMD out of the variable refresh rate market and sell their G-Sync. Supporting FreeSync is admitting that people actually wanted it, and that not everyone wanted to pay the Nvidia tax for a monitor that only worked with Nvidia cards.
 
This is now reaching critical mass, with reviewers pixel-peeping load screens for sharpness, lol.

Soon enough we can expect the same hysteria as in the camera world, which has for years now been staring at brick walls looking for the ultimate detail and IQ.

I don't belittle any sharpening solution here. But at some point it really should just be a personal choice based on what one prefers. This is going to go further than I expected, I guess.
 
I don't belittle any sharpening solution here. But at some point it really should just be a personal choice based on what one prefers. This is going to go further than I expected, I guess.

It really, really needs to be implemented by developers. Give the user a choice, sure, but developers could apply it selectively so as to avoid artifacting while keeping performance up.


Soon enough we can expect the same hysteria as in the camera world, which has for years now been staring at brick walls looking for the ultimate detail and IQ.

Plenty of discussion of 'false detail' there too. Just like in gaming, sharpening can be done well, it can be overdone, and it can add detail that simply wasn't there, if that's important to you. In both cases, the less the better.
 
This is now reaching critical mass, with reviewers pixel-peeping load screens for sharpness, lol.

Soon enough we can expect the same hysteria as in the camera world, which has for years now been staring at brick walls looking for the ultimate detail and IQ.

I don't belittle any sharpening solution here. But at some point it really should just be a personal choice based on what one prefers. This is going to go further than I expected, I guess.

You sound like a veteran of the DPReview forums. I have been on there since I bought my first digital camera, a 2MP Nikon CP-950. I have seen no end of wacky sharpening/resizing schemes on those forums, though it was a bigger issue when cameras were 2MP-3MP than with the 20MP+ of today.
 
though it was a bigger issue when cameras were 2MP-3MP than with the 20MP+ of today.

The most recent sharpening segue has been with respect to the lack of anti-aliasing filters on higher-resolution cameras. Typically, to help prevent moiré, which is an artifact of using various color filter arrays, AA filters sit in front of the sensor and blur the image slightly. Then, knowing how the image was blurred, some sharpening is applied in software when creating JPEGs.

But with no AA filter, you essentially don't want any sharpening, and when you added it anyway, well, stuff could get wacky. And that's more or less what we see with games, as there is no color filter array involved. A little bit of sharpening can look good, even if the detail isn't real, but it can get out of hand pretty quickly.
 
You sound like a veteran of the DPReview forums. I have been on there since I bought my first digital camera, a 2MP Nikon CP-950. I have seen no end of wacky sharpening/resizing schemes on those forums, though it was a bigger issue when cameras were 2MP-3MP than with the 20MP+ of today.

Photography paid for my house and my kids' education. Yeah, I've been to DPReview from time to time, only to quietly exit stage right...
 
Unless that sharpening algorithm is some kind of magic, I'm failing to comprehend why such features are often blown out of proportion. The thing with sharpening is, you want as little of it as possible.

The contrast-aware part means that it checks each pixel against its neighbors, and if it finds that the contrast is already high enough, no sharpening happens in that zone. It is a relatively simple check that greatly diminishes the worst offenders of oversharpening, so the final output is easier on the eyes: it meets the premise of "bringing out" detail otherwise lost to the imperfectly scaled resolution without creating a high-contrast, grainy mess.
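For anyone curious, here's a rough sketch of that check in Python/NumPy. To be clear, this is not AMD's actual CAS shader: it's a simplified grayscale version, and the exact headroom formula and kernel weights are my own approximations of the idea:

```python
import numpy as np

def cas_sharpen(img, strength=0.8):
    # img: 2D grayscale array with values in [0, 1].
    pad = np.pad(img, 1, mode='edge')
    up, down = pad[:-2, 1:-1], pad[2:, 1:-1]
    left, right = pad[1:-1, :-2], pad[1:-1, 2:]
    c = img

    # Local min/max over the cross neighborhood tells us how
    # much contrast is already present around each pixel.
    lo = np.minimum.reduce([up, down, left, right, c])
    hi = np.maximum.reduce([up, down, left, right, c])

    # Adaptive amount: where the neighborhood already spans a
    # wide range (high contrast), amp falls toward 0 and we
    # barely sharpen; in flatter zones it rises toward 1.
    amp = np.sqrt(np.clip(np.minimum(lo, 1.0 - hi) /
                          np.maximum(hi - lo, 1e-5), 0.0, 1.0))

    # Negative-lobe kernel on the four neighbors, renormalized
    # so the weights still sum to 1.
    w = -amp * strength * 0.125
    out = (c + w * (up + down + left + right)) / (1.0 + 4.0 * w)
    return np.clip(out, 0.0, 1.0)
```

The real shader works per RGB channel, optionally in fp16 (hence the rapid packed math the ReShade port had to rewrite), but the adaptive weighting is the important part.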
 
The contrast-aware part means that it checks each pixel against its neighbors, and if it finds that the contrast is already high enough, no sharpening happens in that zone. It is a relatively simple check that greatly diminishes the worst offenders of oversharpening, so the final output is easier on the eyes: it meets the premise of "bringing out" detail otherwise lost to the imperfectly scaled resolution without creating a high-contrast, grainy mess.

There's a lot more to it, though. It really needs to be 'content aware', which would either be done by AI at render time (expensive) or by the developer beforehand (perhaps using AI). Those contrast-based thresholds (there is more than one) have to be adjusted locally in different parts of a scene based on content, to ensure that artifacts and false detail don't creep in.

You see this done automatically more and more in photography software, and it by and large works, but it isn't cheap, and most certainly not cheap enough to use on a frame-by-frame basis. This is why developer input would help significantly. Tagging objects and surfaces with different thresholds for sharpening, and building the sharpening pass into the engine, would allow for even greater scaling flexibility without significant detail loss or artifacting.
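To make the tagging idea concrete, here's a hypothetical Python/NumPy sketch. The tag IDs, the per-material strengths, and the tag buffer are all made up for illustration; no engine exposes exactly this:

```python
import numpy as np

# Hypothetical per-material sharpening strengths a developer
# might author (IDs and values invented for illustration).
SHARPEN_BY_TAG = {
    0: 0.0,   # sky / fog volumes: never sharpen
    1: 0.3,   # skin: gentle, to avoid crunchy pores
    2: 0.9,   # hard-surface props, terrain detail
    3: 0.5,   # foliage: some sharpening, avoid shimmer
}

def tagged_sharpen(img, tag_buffer, sharpen_pass):
    # img: 2D grayscale frame; tag_buffer: same-shape integer
    # array of material IDs written during rendering (the data
    # that has to survive to the post-process stage).
    sharpened = sharpen_pass(img)  # any sharpener, e.g. CAS

    # Turn the tag map into a per-pixel strength map via a LUT.
    lut = np.zeros(max(SHARPEN_BY_TAG) + 1, dtype=np.float32)
    for tag, s in SHARPEN_BY_TAG.items():
        lut[tag] = s
    strength = lut[tag_buffer]

    # Per-pixel blend between the untouched and sharpened frames.
    return img * (1.0 - strength) + sharpened * strength
```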
 
The detractors here don't really seem to understand what the sharpening is for.

This isn't meant to be used directly on an already pixel-perfect output; this is meant to be used when your output is around 70-80% of the actual physical resolution of your display device. Since the scaling stops being pixel-perfect at that level, this is indeed an artificial way to get certain parts of the image closer to how it should have looked at the correct resolution.

Ideally this sharpening algorithm should be the last thing done, after you have already applied your preferred AA method. The main difference between it being applied by ReShade or by the game is that the game could do the internal render at the 70-80%, then upscale to the display device's output resolution, and only then apply CAS; that ordering would generate the fewest errors (see the sketch below).
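In pseudocode, the ordering I mean looks like this; all four callables are stand-ins for engine stages, not a real API:

```python
def present_frame(render, apply_aa, upscale, cas_sharpen,
                  display_res, render_scale=0.75):
    # Hypothetical engine stages, shown only to pin down the
    # ordering: AA at render resolution, CAS after the upscale.
    render_res = (int(display_res[0] * render_scale),
                  int(display_res[1] * render_scale))

    frame = render(render_res)            # 1. render at ~70-80%
    frame = apply_aa(frame)               # 2. AA at render res
    frame = upscale(frame, display_res)   # 3. scale to the display
    return cas_sharpen(frame)             # 4. sharpen last of all
```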


These methods are all cutting corners? Well, of course! These are all scaling methods to save performance; anyone who thinks otherwise is just lying to themselves!
 
With deferred rendering it's close to impossible to just tag objects for specific sharpening, since by that point in the pipeline most of the relevant data for a proper selection is long gone, AFAIK.

This is why edge detection AA is not offered as it was in the past, again AFAIK.
 
This isn't meant to be used directly on an already pixel-perfect output; this is meant to be used when your output is around 70-80% of the actual physical resolution of your display device.

See TXAA examples, or even FXAA and MSAA.

this is indeed an artificial way to get certain parts of the image closer to how it should have looked at the correct resolution.

It's not 'should have looked', because that is impossible: the detail has been removed and/or was never there. What sharpening does is try to make the image look less unpleasant. As evidenced by Radeon Image Sharpening, by and large it is successful. However, it is not adding detail back in, so the best we can hope for is that the sharpening doesn't add detail that is too wrong; it is important to recognize that sharpened details will never be 'right'.

then upscale to the display device's output resolution, and only then apply CAS; that ordering would generate the fewest errors

Obviously you don't want to sharpen before you apply anti-aliasing (essentially a smart blur filter), but artifacts are unavoidable. A content-aware sharpener could reduce artifacting.

With deferred rendering it's close to impossible to just tag objects for specific sharpening, since by that point in the pipeline most of the relevant data for a proper selection is long gone, AFAIK.

This is why edge detection AA is not offered as it was in the past, again AFAIK.

The tag would have to survive the render stage. Doesn't matter how that's done so long as it's available for the sharpening pass.


With respect to final output: if the pass is content-aware, anti-aliasing, scaling (up or down), and output sharpening could all be done at the same time. Being able to do so flexibly would be a boon to content portability.
 
In Shadow of the Tomb Raider, for example, sharpening works well with TXAA from my observation. Also, you must have some detail to begin with in order to sharpen; otherwise it would be adding stuff not meant to be there. In Doom I do not like it; I see too much noise when it's on. Seems more like a game-by-game choice.
 
Yep, Shadow of the Tomb Raider worked well for me at 900p on a 1080p monitor. It had a soft, console look but was certainly good enough to play.
 
I'd like to see this compared to the "high pass sharp" shader in ReShade. That's a really good sharpening shader, which avoids a lot of the noise that other sharpening filters, such as Luma Sharp, add. I think I will do some comparison shots.
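For reference, the general high-pass technique is: blur the image, subtract to isolate the high frequencies, then add a fraction of them back. A minimal NumPy sketch of that idea (not the actual ReShade shader source, and using a box blur where a real shader would use a Gaussian):

```python
import numpy as np

def box_blur(img, radius=1):
    # Separable box blur over a 2D grayscale array; a cheap
    # stand-in for the Gaussian that extracts low frequencies.
    k = 2 * radius + 1
    pad = np.pad(img, radius, mode='edge')
    h = sum(pad[:, i:i + img.shape[1]] for i in range(k)) / k
    return sum(h[i:i + img.shape[0], :] for i in range(k)) / k

def high_pass_sharpen(img, amount=0.6):
    # img minus its blur is the high-frequency detail layer;
    # adding a fraction of it back sharpens edges without
    # shifting the broad tonal content much.
    high = img - box_blur(img)
    return np.clip(img + amount * high, 0.0, 1.0)
```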
 
Heya, thanks for reminding me that some people/games do use naively blurry AA solutions. Yeah, I guess I can see a use for this sharpening filter at the same resolution; I was too hell-bent on my personal use case.

Just FYI, IIC's posts aren't detracting from the idea. I think that sadly we will never get those kinds of solutions working properly, since this goes beyond material information and adds development complexity for a small subset of customers, right when developers are trying to move toward simplicity in effect code through ray tracing. Still, those sound cool, but sadly we are too far from them at the moment.
 