AMD Radeon™ Image Sharpening

I'd imagine it's a byproduct of AMD's console tech, so games look better on 4K TVs.

Hey, if it works, it works. And if it gives a good experience on a 4K TV, maybe it's time we move to consoles soon haha B')
 
It's hilarious that it's "DLSS but better" considering nVidia has specialty hardware... crazy to me. I was so excited for DLSS2X (AI super sampling, rather than DLSS, which is AI upscaling), which never actually materialized. I'll be salty about that for years to come.
 
DLSS has flopped hard, so the comparison here is kind of shooting fish in a barrel.

But DLSS is running at 1440p before its interpolation to 4K, and they are comparing 1700-1800p with sharpening, which is a bit of a mismatch.
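To put rough numbers on that mismatch, here's the back-of-the-envelope pixel math (just a sketch, assuming standard 16:9 resolutions):

```python
# Rough pixel counts for the resolutions being compared (16:9 assumed).
resolutions = {
    "1440p (DLSS internal render)": (2560, 1440),
    "1800p (render + sharpen)":     (3200, 1800),
    "2160p (native 4K)":            (3840, 2160),
}

native_4k = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MPix ({pixels / native_4k:.0%} of native 4K)")

# Roughly: DLSS starts from ~44% of native 4K's pixels, while 1800p plus
# sharpening starts from ~69%, so the two techniques begin with very
# different amounts of real rendered detail.
```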

NVidia has its own post-process feature set that includes sharpening (and many other filters), which I mentioned in another thread:
https://hardforum.com/threads/whos-planning-to-buy-5700-gpu.1982369/page-7#post-1044259763

Surprising they never compared to either Freestyle (NVidia), which is over a year old, or ReShade (open source), which has been doing this stuff for a long time. I suppose it wouldn't be shooting fish in a barrel then.
 
Surprising they never compared to either Freestyle (NVidia), which is over a year old, or ReShade (open source), which has been doing this stuff for a long time. I suppose it wouldn't be shooting fish in a barrel then.

I know Reshade+SweetFX from the Fallout 4 mod community, and it drastically changes that graphically underwhelming game into a decent, respectable-looking one (apart from other mesh/texture mods).

Edit: some of the comments in that YouTube video mentioned that this "Radeon Image Sharpening" might be how AMD fulfills Sony/Microsoft's claims of "4K fidelity" (or close enough) on their upcoming next-gen consoles, and I think that's a pretty good, and logical, assumption to make.
 
From r/Amd:

"I ported FidelityFX CAS to ReShade so anyone can use it, with nearly any game"

 
I know Reshade+SweetFX from the Fallout 4 mod community, and it drastically changes that graphically underwhelming game into a decent, respectable-looking one (apart from other mesh/texture mods).

Edit: some of the comments in that YouTube video mentioned that this "Radeon Image Sharpening" might be how AMD fulfills Sony/Microsoft's claims of "4K fidelity" (or close enough) on their upcoming next-gen consoles, and I think that's a pretty good, and logical, assumption to make.

You already have resolution scaling, and Checkerboard Rendering on the current generation, which does a good job. I read a detailed description of how they do it in Horizon Zero Dawn and it's pretty serious work, not a simple resize, and I think it gives better results than a simple resize and sharpen.
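For anyone curious, the basic trick (massively simplified here; real implementations also reproject the previous frame with motion vectors and resolve artifacts) is to render only half the pixels each frame in an alternating checker pattern and fill the holes from the previous frame. A toy sketch:

```python
import numpy as np

def checkerboard_reconstruct(new_pixels, prev_frame, frame_index):
    """Toy checkerboard reconstruction.

    new_pixels  : this frame's rendered pixels (only half are 'real',
                  on this frame's side of the checker pattern)
    prev_frame  : last frame's reconstructed image
    frame_index : used to alternate which half of the checker is rendered

    This only shows the basic fill pattern, nothing like a production pipeline.
    """
    h, w = prev_frame.shape
    yy, xx = np.mgrid[0:h, 0:w]
    rendered_mask = ((xx + yy + frame_index) % 2) == 0   # alternates each frame

    out = prev_frame.copy()
    out[rendered_mask] = new_pixels[rendered_mask]        # keep old pixels in the gaps
    return out

# Example: two consecutive frames at 1080p, alternating checker halves.
frame0 = np.random.rand(1080, 1920)
frame1 = np.random.rand(1080, 1920)
reconstructed = checkerboard_reconstruct(frame1, frame0, frame_index=1)
```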
 
You already have resolution scaling, and Checkerboard Rendering on the current generation, which does a good job. I read a detailed description of how they do it in Horizon Zero Dawn and it's pretty serious work, not a simple resize, and I think it gives better results than a simple resize and sharpen.
Yeah, honestly the current checkerboard method may not be true 4K but it still results in very impressive image quality. Zero Dawn and Titanfall 2 both looked great.
 
I was just going to post this. The fact that sharpening with 75% scaling looks as good or better than native 4k is what DLSS SHOULD have been. Great results from Metro, BFV, Div2, and RE2.
Attachments: Screenshot_20190711-210424_YouTube.jpg, Screenshot_20190711-210339_YouTube.jpg
 
So 78% scaling has about the same performance boost over 4k as does DLSS 4k:
Attachments: Screenshot_20190711-213310_YouTube.jpg


Even with this 1080p screenshot, the differences are very noticeable. The sharpened 78% and 4k are very close while the 4k DLSS looks rather bad. Just look at the tank rivets!
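Back-of-the-envelope, the pixel counts roughly line up with that (just a sketch, assuming uniform scaling of a 3840x2160 target):

```python
# Fraction of native 4K pixels actually rendered, assuming uniform scaling.
native_4k = 3840 * 2160
scaled_78 = int(3840 * 0.78) * int(2160 * 0.78)   # ~78% resolution scale
dlss_1440 = 2560 * 1440                            # DLSS internal resolution

print(f"78% scale renders {scaled_78 / native_4k:.0%} of native 4K's pixels")
print(f"DLSS at 1440p renders {dlss_1440 / native_4k:.0%}")
# ~61% vs ~44%: DLSS renders fewer pixels but then spends time running its
# network, which is one plausible reason the net fps gain ends up similar.
```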
 
DLSS has flopped hard, so the comparison here is kind of shooting fish in a barrel.

But DLSS is running at 1440p before its interpolation to 4K, and they are comparing 1700-1800p with sharpening, which is a bit of a mismatch.

NVidia has its own post-process feature set that includes sharpening (and many other filters), which I mentioned in another thread:
https://hardforum.com/threads/whos-planning-to-buy-5700-gpu.1982369/page-7#post-1044259763

Surprising they never compared to either Freestyle (NVidia), which is over a year old, or ReShade (open source), which has been doing this stuff for a long time. I suppose it wouldn't be shooting fish in a barrel then.

Nice try, but going from native 4k to DLSS running at 1440p gives about the same performance uplift as going from native 4k to 1800p with CAS, so it is still an incredibly valid comparison.
 
It's hilarious that it's "DLSS but better" considering nVidia has specialty hardware... crazy to me. I was so excited for DLSS2X (AI super sampling, rather than DLSS, which is AI upscaling), which never actually materialized. I'll be salty about that for years to come.

DLSS was the one feature of the new cards that I was really looking forward to. What a let down it turned out to be.
 
Well, to be fair, AMD is saying this feature is tied to the new architecture, which is why it is only available on Navi.

The developer of ReShade has ported the CAS algorithm over and you can now use it on any card and any game. The only downside is a slightly larger performance hit since the method would normally make use of FP16 and rapid packed math on Navi.
 
The developer of ReShade has ported the CAS algorithm over and you can now use it on any card and any game. The only downside is a slightly larger performance hit since the method would normally make use of FP16 and rapid packed math on Navi.

I would love to see the ReShade version on a Pascal chip like the 1080 Ti/Titan Xp compared to the 5700/5700 XT using it natively.

The funny thing about all this is nVidia didn't even need the Super series of cards. If they ported the 1080 Ti to the current node Turing uses and priced it at $350 it would make Navi DOA. I've never been a fan of Turing and felt tensor cores were a way for nVidia to subsidize their datacenter ambitions at the cost of consumer gaming. At least DXR is still intriguing enough, though.
 
That screenshot, if it's an accurate representation, really goes to show sharpening is doing a much better job than DLSS. DLSS looks so blurry and devoid of fine detail it's pathetic. I feel like the textures went from HD to SD, or back at least 10 years in texture detail. It looks like I'm seeing BF Vietnam-level detail in a city.
 
Wasn't the DLSS implementation in BFV particularly poorly done?

I never saw that kind of detail loss in Anthem with DLSS on.
 
I find it funny

nVidia feature sucks - "Developers aren't doing it right"
AMD feature sucks - "AMD sucks"

Anyone who bought into DLSS deserves exactly what they paid for.
 
I find it funny

nVidia feature sucks - "Developers aren't doing it right"
AMD feature sucks - "AMD sucks"

Anyone who bought into DLSS deserves exactly what they paid for.

The response to AMD's sharpening seems mostly very positive, not sure why you are so upset?
 
Anyone who bought into DLSS deserves exactly what they paid for.

I don't think anyone bought these cards for DLSS, unless they pre-ordered before reviews (which NEVER makes sense).

IMO DLSS is the biggest failure on NVidia's part. Initially it looked like some kind of super AA method with low overhead, but eventually it was revealed it had two modes: one where it was used as an up-scaler that did a fairly weak job, and a second mode that just did AA, which still hasn't shown up almost a year later.

NVidia deserves an award for over-promising and under-delivering on DLSS.
 
Wasn't the DLSS implementation in BFV particularly poorly done?

I never saw that kind of detail loss in Anthem with DLSS on.

BFV is an example of how not to showcase new technology in a major AAA game, when it really should have been the showcase. Every 'headline' technology the developers have employed has been sub-par.
 
Nice try, but going from native 4k to DLSS running at 1440p gives about the same performance uplift as going from native 4k to 1800p with CAS, so it is still an incredibly valid comparison.

I am just pointing out that it is scaling from a lower resolution, so it isn't an equal comparison for image quality.

Since the whole point really is to make Ray Tracing more viable, getting to that lower resolution is probably more important, as the Ray Tracing savings trump the DLSS overhead.

IOW for 4K Ray Tracing: 1440p + Ray Tracing + DLSS (likely) outperforms 1800p + Ray Tracing, because Ray Tracing expense is significantly worse at 1800p than it is at 1440p. Though I'd love to see that test; DLSS may be so bad that it fails even that.
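The back-of-the-envelope version of that argument, assuming ray-tracing cost scales roughly with pixel count (a simplification, since rays per pixel and denoising cost vary):

```python
# Rough ray-budget comparison, assuming the RT workload scales with pixel count.
pix_1440p = 2560 * 1440   # DLSS internal render resolution
pix_1800p = 3200 * 1800   # scale-and-sharpen render resolution

print(f"1800p traces roughly {pix_1800p / pix_1440p:.2f}x the rays of 1440p")
# ~1.56x, so the ray tracing saved by rendering at 1440p can plausibly
# outweigh the DLSS overhead, which is the use case DLSS was built around.
```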

I would never suggest anyone use DLSS for general scaling without Ray Tracing.

And ultimately I think DLSS is fundamentally flawed for PC games because of its fragility. It has become clear that ANY change in visuals requires a change in network training at NVidia, which means if you get a 3rd-party texture or model upgrade, you will likely partially break DLSS, and that applies to anything that changes visuals. It's a fundamentally brittle technology. Since the reveal it was even stated that it needs training at each resolution to perform adequately.

As I have said in a different thread, I think Tensor cores for the consumer were a solution in search of a problem, and DLSS was the justification for including Tensor cores, which are really aimed at professional workloads, in the same chips that are also sold in professional cards.

When AMD implements their RT solution they can just save die area and safely skip the Tensor cores, which contribute next to nothing here.
 
But I was assured it was only possible to do such things with tensor cores...

lol

I find it funny

nVidia feature sucks - "Developers aren't doing it right"
AMD feature sucks - "AMD sucks"

Anyone who bought into DLSS deserves exactly what they paid for.

the haters will always hate

The response to AMD's sharpening seems mostly very positive, not sure why you are so upset?

I'm sure he's talking about the Nvidia fanbois.
 
I would love to see the reshade version on a pascal chip like 1080 ti/ Titan XP compared to 5700/5700xt using it natively.

The funny thing about all this is nVidia didn't even need the Super series of cards. If they ported 1080 ti to the current node Turing uses and priced it at $350 it would make Navi DOA. I've never been a fan of turing and felt tensor cores were a way for nVidia to subsidize their datacenter ambitions at the cost of consumer gaming. At least dxr is still intriguing enough though.

Pretty much nailed it on the head there. Kinda glad AMD decided to split their datacenter and consumer card architecture development, since we know the Vega architecture works really well for datacenters but not so much for gaming.
 
Well, to be fair, AMD is saying this feature is tied to the new architecture, which is why it is only available on Navi.

Yeah, Nvidia marketed it as some next-level, supercomputer-backed magic. Dedicated hardware up-scaling has been around forever. DLSS always felt like it was tacked on at the end because of the huge performance hit related to RTX.
 
It looks like DLSS and RIS are approaching the problem from significantly different directions, with neither really hitting a 'happy medium', though RIS does look closer at the moment.

RIS appears to be a broad-brush approach with little to no per-title / per-scene tuning. We can expect it to work well in less challenging examples and to start to fall apart in more challenging examples- though it's certainly possible for RIS to work 'enough' that we never get to the more challenging examples.

DLSS appears to be an attempt at an all-encompassing solution, where per-scene training may be used to smartly sharpen without adding artifacts across a broad range of scenes. It currently appears to be tuned for more 'worst case' scenarios than RIS due to its first implementation supporting ray tracing.

In both cases, applying more local smarts (inference) would likely allow for more utility, where RIS could be pushed to the extreme, say 1080p -> 4k, and where DLSS could use a 'lighter' touch that is less dependent on pre-profiling.
 
Personally I think DLSS looks better. Not as sharp as RIS, but it has a higher-quality look. RIS can look sort of like an overdone Photoshop filter in some cases.

However, RIS works in many games (not DX11 oddly, but I imagine they could add support). Nearly all the games I tried worked with it with no problem.

We see that DLSS has not become popular; a year in, only a couple of games support it. AMD was smart to find a generic solution, especially since it seems to work better some of the time.
 
I mean, everyone is entitled to their opinions, buuuuut...

I guess, at their best, they can look pretty similar but the floor for image quality with DLSS seems to be much lower, and the performance impact is still larger.
 
As a small experiment I ran Conan Exiles at 1440p on my 4K monitor and used the Nvidia GeForce Experience in-game tools from the overlay to sharpen and otherwise tweak things a bit, and felt it came out alright.
As expected, the RTX 2070 has no problem running the game at 60 fps at 1440p.
 

Attachments: Conan Exiles Screenshot 2019.07.12 - 14.22.53.92.png (3.7 MB), Conan Exiles Screenshot 2019.07.12 - 14.25.16.20.png (4 MB)
As a small experiment I ran Conan Exiles at 1440p on my 4K monitor and used the Nvidia GeForce Experience in-game tools from the overlay to sharpen and otherwise tweak things a bit, and felt it came out alright.
As expected, the RTX 2070 has no problem running the game at 60 fps at 1440p.

I use nVidia Freestyle filters in games ReShade doesn't work with, with almost identical results, which is nice. I'm surprised nVidia didn't advertise it more, as it works like RIS but in more games, especially DX11.
 
I use nVidia Freestyle filters in games ReShade doesn't work with, with almost identical results, which is nice. I'm surprised nVidia didn't advertise it more, as it works like RIS but in more games, especially DX11.

They really should, not just to reattain parity with RIS, but because they have the infrastructure to do what RIS does, but better, with their current hardware and research resources.
 
I use nVidia Freestyle filters in games ReShade doesn't work with, with almost identical results, which is nice. I'm surprised nVidia didn't advertise it more, as it works like RIS but in more games, especially DX11.

Ah, Freestyle. I knew they had a name for that.

Poorly marketed, somewhat unusual for Nv.
 
Nvidia should be ashamed of how bad the detail is with DLSS; I'll have to see more of this new sharpening from AMD to make a judgement on it.
 