AMD Radeon™ Image Sharpening

Discussion in 'Video Cards' started by cybereality, Jul 11, 2019.

  1. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,381
    Joined:
    Mar 22, 2008
  2. Auer

    Auer Gawd

    Messages:
    627
    Joined:
    Nov 2, 2018
    I'd imagine it's a byproduct of AMD's console tech, so games look better on 4K TVs.

    Hey, if it works, it works. And if it gives a good experience on a 4K TV, maybe it's time we move to consoles soon haha B')
     
  3. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    6,807
    Joined:
    Feb 22, 2012
    It's hilarious that it's "DLSS but better" considering nVidia has specialty hardware... crazy to me. I was so excited for DLSS2X (AI super-sampling, as opposed to regular DLSS, which is AI upscaling), which never actually materialized. I'll be salty about that for years to come.
     
    Auer, KazeoHin and Gideon like this.
  4. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    8,811
    Joined:
    Apr 22, 2006
    DLSS has flopped hard, so the comparison here is kind of shooting fish in a barrel.

    But DLSS is running at 1440p before its upscaling to 4K, and they are comparing against 1700-1800p with sharpening, which is a bit of a mismatch.

    NVidia has its own post-process feature set that includes sharpening (and many other filters), which I mentioned in another thread:
    https://hardforum.com/threads/whos-planning-to-buy-5700-gpu.1982369/page-7#post-1044259763

    It's surprising they never compared against either Freestyle (NVidia), which is over a year old, or ReShade (open source), which has been doing this stuff for a long time. I suppose it wouldn't be shooting fish in a barrel then.
     
  5. tungt88

    tungt88 [H]ard|Gawd

    Messages:
    1,990
    Joined:
    Jan 14, 2008
    I know ReShade+SweetFX from the Fallout 4 mod community, and it drastically transforms that graphically underwhelming game into a decent, respectable-looking one (alongside other mesh/texture mods).

    Edit: some of the comments in that YouTube video mentioned that "Radeon Image Sharpening" might be how AMD fulfills Sony's and Microsoft's claims of "4K fidelity" (or close enough) on their upcoming next-gen consoles, and I think that's a pretty good and logical assumption.
     
    Auer likes this.
  6. Auer

    Auer Gawd

    Messages:
    627
    Joined:
    Nov 2, 2018
    From r/Amd:

    "I ported FidelityFX CAS to ReShade so anyone can use it, with nearly any game"

     
  7. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    8,811
    Joined:
    Apr 22, 2006
    You already have resolution scaling and Checkerboard Rendering on the current console generation, which does a good job. I read a detailed description of how they do it in Horizon Zero Dawn, and it's pretty serious work, not a simple resize, but I think it gives better results than a simple resize and sharpen.
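    The core reconstruction idea can be sketched as a toy (this is my own simplified illustration, not Guerrilla's actual technique, which also leans on motion vectors and previous-frame samples; real implementations don't do a plain spatial average):

    ```python
    import numpy as np

    def checkerboard_mask(h, w, phase=0):
        """True where pixels were rendered this frame (every other pixel)."""
        yy, xx = np.mgrid[0:h, 0:w]
        return (yy + xx) % 2 == phase

    def reconstruct(frame, mask):
        """Fill each unrendered pixel from its rendered 4-neighbors.

        In a checkerboard pattern, the up/down/left/right neighbors of an
        unrendered pixel always have the opposite parity, i.e. they were
        rendered this frame, so a plain average is well-defined.
        """
        out = frame.astype(float).copy()
        h, w = frame.shape
        for y in range(h):
            for x in range(w):
                if mask[y, x]:
                    continue  # this pixel was rendered; keep it
                vals = [frame[ny, nx]
                        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                        if 0 <= ny < h and 0 <= nx < w]
                out[y, x] = sum(vals) / len(vals)
        return out
    ```

    Each frame only half the pixels are shaded, and the rest are inferred, which is why it roughly halves shading cost without looking like a simple upscale.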
     
    tungt88 likes this.
  8. Nolan7689

    Nolan7689 [H]ard|Gawd

    Messages:
    1,302
    Joined:
    Jun 5, 2015
    Yeah, honestly the current checkerboard method may not be true 4K but it still results in a very impressive image quality. Zero Dawn and Titanfall 2 both looked great.
     
  9. Auer

    Auer Gawd

    Messages:
    627
    Joined:
    Nov 2, 2018
    Well, speak of the devil....https://www.nvidia.com/en-us/geforce/news/monster-hunter-world-nvidia-dlss/

    "Furthermore, Monster Hunter: World integrates a new, community-requested DLSS feature - a slider to adjust sharpness, enabling users to make the image sharper or softer based on their own personal preferences. To tweak, simply move the Sharpness slider in the Display options menu"
     
  10. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,526
    Joined:
    Sep 7, 2017
    I was just going to post this. The fact that sharpening with 75% scaling looks as good as or better than native 4K is what DLSS SHOULD have been. Great results in Metro, BFV, Div2, and RE2.
    Screenshot_20190711-210424_YouTube.jpg
    Screenshot_20190711-210339_YouTube.jpg
     
    Maddness, Algrim, Auer and 2 others like this.
  11. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,526
    Joined:
    Sep 7, 2017
    So 78% scaling gives about the same performance boost over native 4K as DLSS does:
    Screenshot_20190711-213310_YouTube.jpg

    Even in this 1080p screenshot, the differences are very noticeable. The sharpened 78% and native 4K are very close, while the 4K DLSS looks rather bad. Just look at the tank rivets!
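    The raw pixel math backs this up (my own back-of-the-envelope numbers, assuming DLSS's internal render is 1440p):

    ```python
    # Pixel counts behind the "78% scaling vs DLSS" comparison.
    native_4k = 3840 * 2160                              # 8,294,400 px
    scaled_78 = round(3840 * 0.78) * round(2160 * 0.78)  # 78% per axis
    dlss_in   = 2560 * 1440                              # DLSS internal render

    print(f"78% scale shades {scaled_78 / native_4k:.0%} of native 4K")    # ~61%
    print(f"DLSS's 1440p input is {dlss_in / native_4k:.0%} of native 4K") # ~44%
    ```

    If both settings land at roughly the same frame rate even though 78% scaling shades noticeably more pixels, the gap is presumably being eaten by the DLSS network pass itself.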
     
    N4CR, Maddness and Auer like this.
  12. russnuck

    russnuck Gawd

    Messages:
    684
    Joined:
    Mar 25, 2005
    But I was assured it was only possible to do such things with tensor cores...
     
  13. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,381
    Joined:
    Mar 22, 2008
    Well, to be fair, AMD is saying this feature is tied to the new architecture, which is why it is only available on Navi.
     
    Auer likes this.
  14. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,804
    Joined:
    Sep 7, 2011
    Nice try, but going from native 4K to DLSS rendering at 1440p gives about the same performance uplift as going from native 4K to 1800p with CAS, so it is still a perfectly valid comparison.
     
    cybereality and Nightfire like this.
  15. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,526
    Joined:
    Sep 7, 2017
    Yep, in HUB's last DLSS test, DLSS was so bad that the similarly performing 1800p actually looked better in some cases.
     
    {NG}Fidel, N4CR and Auer like this.
  16. reaper12

    reaper12 2[H]4U

    Messages:
    2,261
    Joined:
    Oct 21, 2006
    DLSS was the one feature of the new cards that I was really looking forward to. What a let down it turned out to be.
     
    {NG}Fidel and Auer like this.
  17. BrotherMichigan

    BrotherMichigan [H]Lite

    Messages:
    103
    Joined:
    Apr 8, 2016
    The developer of ReShade has ported the CAS algorithm over, and you can now use it on any card and in any game. The only downside is a slightly larger performance hit, since the method would normally make use of FP16 and rapid packed math on Navi.
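    For anyone wondering what CAS actually computes, here's a rough single-channel Python sketch of the idea (my own simplification of AMD's shader, not the actual ReShade port): estimate local contrast from the cross neighborhood and dial the sharpening back where the pixel already sits at a hard edge.

    ```python
    import numpy as np

    def cas_sharpen(img, sharpness=0.5):
        """Toy contrast-adaptive sharpening on a 2D float array in [0, 1]."""
        h, w = img.shape
        out = img.copy()
        # Map the sharpness knob in [0, 1] to a negative cross-neighbor weight.
        peak = -1.0 / (8.0 - 3.0 * sharpness)
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                c = img[y, x]
                cross = [img[y - 1, x], img[y + 1, x], img[y, x - 1], img[y, x + 1]]
                mn = min(min(cross), c)
                mx = max(max(cross), c)
                # Adaptive amount: near 1 where local contrast is low, near 0
                # where the neighborhood already spans most of the range.
                amount = np.sqrt(max(0.0, min(mn, 1.0 - mx)) / max(mx, 1e-6))
                wgt = amount * peak
                out[y, x] = np.clip((wgt * sum(cross) + c) / (4.0 * wgt + 1.0),
                                    0.0, 1.0)
        return out
    ```

    The per-pixel min/max and square root are exactly the kind of math that Navi's FP16 rapid packed math accelerates, which is why the generic ReShade version pays a small extra cost on other hardware.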
     
    cybereality, Auer and 5150Joker like this.
  18. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,033
    Joined:
    Aug 1, 2005
    I would love to see the ReShade version on a Pascal chip like the 1080 Ti/Titan Xp compared to a 5700/5700 XT using it natively.

    The funny thing about all this is nVidia didn't even need the Super series of cards. If they had ported the 1080 Ti to the current node Turing uses and priced it at $350, it would have made Navi DOA. I've never been a fan of Turing, and felt tensor cores were a way for nVidia to subsidize their datacenter ambitions at the cost of consumer gaming. At least DXR is still intriguing, though.
     
    Last edited: Jul 12, 2019 at 6:52 AM
    Brian_B likes this.
  19. WBurchnall

    WBurchnall 2[H]4U

    Messages:
    2,616
    Joined:
    Oct 10, 2009
    That screenshot, if an accurate representation, really goes to show sharpening is doing a much better job than DLSS. DLSS looks so blurry and devoid of fine detail it's pathetic. It feels like the textures went from HD to SD, or back at least 10 years in texture detail. It looks like I'm seeing BF Vietnam-level detail in a city.
     
    {NG}Fidel likes this.
  20. Auer

    Auer Gawd

    Messages:
    627
    Joined:
    Nov 2, 2018
    Wasn't the DLSS implementation in BFV particularly poor?

    I never saw that kind of detail loss in Anthem with DLSS on.
     
  21. Brian_B

    Brian_B 2[H]4U

    Messages:
    2,729
    Joined:
    Mar 23, 2012
    I find it funny

    nVidia feature sucks - "Developers aren't doing it right"
    AMD feature sucks - "AMD sucks"

    Anyone who bought into DLSS deserves exactly what they paid for.
     
    {NG}Fidel and N4CR like this.
  22. Auer

    Auer Gawd

    Messages:
    627
    Joined:
    Nov 2, 2018
    The response to AMD's sharpening seems mostly very positive, not sure why you are so upset?
     
  23. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    8,811
    Joined:
    Apr 22, 2006
    I don't think anyone bought these cards for DLSS, unless they pre-ordered before reviews (which NEVER makes sense).

    IMO DLSS is the biggest failure on NVidia's part. Initially it looked like some kind of super AA method with low overhead, but eventually it was revealed to have two modes: one where it's used as an up-scaler that does a fairly weak job, and a second that just does AA, which still hadn't shown up almost a year later.

    NVidia deserves an award for over-promising and under-delivering on DLSS.
     
  24. Brian_B

    Brian_B 2[H]4U

    Messages:
    2,729
    Joined:
    Mar 23, 2012
    I think you misinterpreted my tone. I got nothing against AMD for Image Sharpening.
     
  25. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    10,299
    Joined:
    Jun 13, 2003
    BFV is an example of how not to use a major AAA game as a showcase for new technology, when it really should be one. Every 'headline' technology the developers have employed has been sub-par.
     
  26. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    8,811
    Joined:
    Apr 22, 2006
    I am just pointing out that it is scaling from a lower resolution, so we aren't talking about an equal comparison for image quality.

    Since the whole point really is to make Ray Tracing more viable, getting to that lower resolution is probably more important, since the Ray Tracing savings trump the DLSS overhead.

    IOW for 4K Ray Tracing: 1440p + Ray Tracing + DLSS (likely) outperforms 1800p + Ray Tracing, because the Ray Tracing expense is significantly worse at 1800p than at 1440p. Though I'd love to see that test. DLSS may be so bad that it fails even that.

    I would never suggest anyone use DLSS for general scaling without Ray Tracing.

    And ultimately I think DLSS is fundamentally flawed for PC games because of its fragility. It has become clear that ANY change in visuals requires new network training at NVidia. That means a 3rd-party texture or model upgrade will likely partially break DLSS, and the same applies to anything that changes visuals. It's a fundamentally brittle technology. Since the reveal it has even been stated that it needs training at each resolution to perform adequately.

    As I have said in a different thread, I think Tensor cores for the consumer were a solution in search of a problem, and DLSS was the justification for including Tensor cores, which are really aimed at the professional workloads running on the professional cards built from the same chips.

    When AMD implements their RT solution they can just save die area and safely skip the Tensor cores, which contribute next to nothing here.
     
    {NG}Fidel likes this.
  27. OFaceSIG

    OFaceSIG [H]ard|Gawd

    Messages:
    1,947
    Joined:
    Aug 31, 2009
    lol

    the haters will always hate

    I'm sure he's talking about the Nvidia fanbois.
     
    {NG}Fidel likes this.
  28. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    21,246
    Joined:
    Sep 13, 2008
    Pretty much hit the nail on the head there. Kinda glad AMD decided to split their datacenter and consumer architecture development, since we know the Vega architecture works really well for datacenters but not so much for gaming.
     
    Last edited: Jul 12, 2019 at 9:59 AM
    5150Joker and Auer like this.
  29. russnuck

    russnuck Gawd

    Messages:
    684
    Joined:
    Mar 25, 2005
    Yeah, Nvidia marketed it as some next-level supercomputer-backed magic, when dedicated hardware up-scaling has been around forever. DLSS always felt like it was tacked on at the end to offset the huge performance hit of RTX.
     
  30. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    10,299
    Joined:
    Jun 13, 2003
    It looks like DLSS and RIS are approaching the problem from significantly different directions, with neither really hitting a 'happy medium', though RIS does look closer at the moment.

    RIS appears to be a broad-brush approach with little to no per-title / per-scene tuning. We can expect it to work well in less challenging examples and to start to fall apart in more challenging examples- though it's certainly possible for RIS to work 'enough' that we never get to the more challenging examples.

    DLSS appears to be an attempt at an all-encompassing solution, where per-scene training may be used to smartly sharpen without adding artifacts across a broad range of scenes. It currently appears to be tuned for more 'worst case' scenarios than RIS, due to its first implementations supporting ray tracing.

    In both cases, applying more local smarts (inference) would likely allow for more utility, where RIS could be pushed to the extreme, say 1080p -> 4k, and where DLSS could use a 'lighter' touch that is less dependent on pre-profiling.
     
  31. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    8,811
    Joined:
    Apr 22, 2006
    IdiotInCharge and cybereality like this.
  32. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,381
    Joined:
    Mar 22, 2008
    Personally I think DLSS looks better. Not as sharp as RIS, but it has a higher quality look. RIS can look sort of like an overdone Photoshop filter in some cases.

    However, RIS works in many games (oddly not DX11, though I imagine they could add support). Nearly all the games I tried worked with it without a problem.

    We can see that DLSS has not become popular; a year in, only a couple of games support it. AMD was smart to find a generic solution, especially since it seems to work better some of the time.
     
    Trimlock and IdiotInCharge like this.
  33. BrotherMichigan

    BrotherMichigan [H]Lite

    Messages:
    103
    Joined:
    Apr 8, 2016
    I mean, everyone is entitled to their opinions, buuuuut...

    I guess, at their best, they can look pretty similar but the floor for image quality with DLSS seems to be much lower, and the performance impact is still larger.
     
    kirbyrj and N4CR like this.
  34. Auer

    Auer Gawd

    Messages:
    627
    Joined:
    Nov 2, 2018
    As a small experiment, I ran Conan Exiles at 1440p on my 4K monitor and used the NV Experience in-game tools from the overlay to sharpen and otherwise tweak things a bit, and I felt it came out alright.
    As expected, the RTX 2070 has no problem running the game at 1440p/60fps.
     

    Attached Files:

    Last edited: Jul 12, 2019 at 4:34 PM
    5150Joker and IdiotInCharge like this.
  35. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,033
    Joined:
    Aug 1, 2005
    I use the nVidia Freeplay filters in games ReShade doesn't work with, and get almost identical results, which is nice. I'm surprised nVidia didn't advertise it more, as it works like RIS but in more games, especially DX11.
     
    Auer likes this.
  36. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    10,299
    Joined:
    Jun 13, 2003
    They really should, not just to regain parity with RIS, but because they have the infrastructure to do what RIS does, but better, with their current hardware and research resources.
     
  37. Auer

    Auer Gawd

    Messages:
    627
    Joined:
    Nov 2, 2018
    Ah Freeplay. I knew they had a name for that.

    Poorly marketed, somewhat unusual for Nv.
     
  38. Gideon

    Gideon 2[H]4U

    Messages:
    2,157
    Joined:
    Apr 13, 2006
    Nvidia should be ashamed of how bad the detail is in DLSS. I'll have to see more of this new sharpening from AMD before making a judgement on it.