Microsoft Announces Variable Rate Shading Support for DX12

Discussion in 'HardForum Tech News' started by cageymaru, Mar 18, 2019.

  1. cageymaru

    cageymaru [H]ard as it Gets

    Messages:
    19,709
    Joined:
    Apr 10, 2003
    Variable Rate Shading (VRS) is a powerful new API that gives developers the ability to use GPUs more intelligently. Shaders are used to calculate the color of each pixel on the screen. Shading rate refers to the resolution at which these shaders are called (which is different from the overall screen resolution). A higher shading rate means better visual fidelity at the cost of more GPU power. All pixels in a frame are affected by the game's shading rate. VRS allows developers to choose which areas of the frame are more important and increase their visual fidelity, or set parts of the frame to a lower fidelity and gain extra performance. Lowering the fidelity of parts of the scene can help low-spec machines run faster.
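    To make the cost relationship concrete, here is a back-of-the-envelope sketch (illustrative only, not the D3D12 API): coarse shading rates like 2x2 or 4x4 mean one pixel-shader invocation covers that many pixels, so the shading work drops roughly by the block area.

    ```python
    # Rough sketch: how coarse shading rates cut pixel-shader invocations.
    # Assumes a width x height render target and a uniform shading rate of
    # bw x bh pixels per shader invocation (1x1 = full rate). The function
    # name and setup are hypothetical, for illustration only.

    def shader_invocations(width, height, rate=(1, 1)):
        """Number of pixel-shader invocations for a frame at a given rate."""
        bw, bh = rate
        # Each invocation shades one bw x bh block; round partial blocks up.
        return ((width + bw - 1) // bw) * ((height + bh - 1) // bh)

    full = shader_invocations(1920, 1080)          # 1x1: one call per pixel
    coarse = shader_invocations(1920, 1080, (2, 2))  # 2x2: ~1/4 the calls
    print(full, coarse, full / coarse)  # 2073600 518400 4.0
    ```

    In other words, dropping a region from 1x1 to 2x2 shading quarters the shading cost there, which is the performance headroom VRS trades fidelity for.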

    There are two tiers of support for VRS. The VRS API lets developers set the shading rate in three different ways: per draw; within a draw, by using a screenspace image; or within a draw, per primitive. Hardware that can support per-draw VRS is Tier 1; hardware that can support both per-draw and within-draw variable rate shading is Tier 2. VRS support exists today on in-market NVIDIA hardware and on upcoming Intel hardware, and AMD is rumored to be working on support for the feature.
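    The tier split can be summarized in a small sketch (the mode names here are descriptive labels, not the actual D3D12 enum values):

    ```python
    # Illustrative mapping of VRS hardware tiers to supported modes.
    # "per_draw" sets one rate for an entire draw call; "screenspace_image"
    # and "per_primitive" vary the rate within a draw, which per the
    # announcement requires Tier 2 hardware.

    SUPPORTED_MODES = {
        1: {"per_draw"},
        2: {"per_draw", "screenspace_image", "per_primitive"},
    }

    def can_use(tier, mode):
        """True if hardware at the given VRS tier supports the mode."""
        return mode in SUPPORTED_MODES.get(tier, set())

    print(can_use(1, "screenspace_image"))  # False: Tier 1 is per-draw only
    print(can_use(2, "per_primitive"))      # True
    ```

    A real engine would query the driver for the tier at startup and pick its shading-rate strategy accordingly.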

    One example is foveated rendering: rendering the most detail in the area where the user is paying attention, and gradually decreasing the shading rate outside this area to save performance. In a first-person shooter, the user is likely paying most attention to their crosshairs, and not much attention to the far edges of the screen, making FPS games an ideal candidate for this technique. Another use case for a screenspace image is using an edge-detection filter to determine the areas that need a higher shading rate, since edges are where aliasing happens. Once the locations of the edges are known, a developer can set the screenspace image based on that, shading the areas with edges at high detail and reducing the shading rate elsewhere on the screen.
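    The edge-detection use case can be sketched in miniature. This is a toy model, not real engine code: assume the frame is divided into tiles, each tile has a precomputed edge strength from an edge-detection pass, and the screenspace image assigns full-rate shading only to tiles with strong edges. (Real VRS screenspace images use fixed hardware tile sizes and encoded rate values.)

    ```python
    # Toy sketch: derive a shading-rate "screenspace image" from an edge map.
    # Each entry in edge_strength is a per-tile value in 0..1 from a
    # hypothetical edge-detection pass. Tiles with strong edges keep full
    # 1x1 shading (edges are where aliasing shows); smooth tiles drop to
    # 2x2, quarter-rate shading.

    FINE = (1, 1)    # full shading rate for edge-heavy tiles
    COARSE = (2, 2)  # quarter-rate shading for smooth regions

    def shading_rate_image(edge_strength, threshold=0.5):
        """Map per-tile edge strengths (0..1) to per-tile shading rates."""
        return [[FINE if e >= threshold else COARSE for e in row]
                for row in edge_strength]

    # Example: one strong-edge tile among three smooth tiles.
    edges = [[0.1, 0.9],
             [0.2, 0.3]]
    print(shading_rate_image(edges))
    # [[(2, 2), (1, 1)], [(2, 2), (2, 2)]]
    ```

    A foveated-rendering variant would be the same shape of code, but with the per-tile value being distance from the gaze point instead of edge strength.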
     
  2. Lakados

    Lakados [H]ard|Gawd

    Messages:
    1,479
    Joined:
    Feb 3, 2014
    Probably going to be a while before we start seeing any titles that use it; my money would be on Unreal getting it there first. Could be a pretty big deal if implemented for newer titles designed for Xbox, and could really help out lower-end PCs in general. I mean, yeah, parts of the screen won't look as good, but it would be one of those trade-off settings. Pair that with the eye-tracking hardware that is coming to market, and in a few years I could see this being a pretty useful thing.
     
    ordray likes this.
  3. tetris42

    tetris42 [H]ardness Supreme

    Messages:
    4,518
    Joined:
    Apr 29, 2014
  4. Kor

    Kor 2[H]4U

    Messages:
    2,176
    Joined:
    Mar 31, 2010
    Works alright in Vulkan with Wolf 2, be good to see it expanded.
     
    DrezKill likes this.
  5. gxp500

    gxp500 Gawd

    Messages:
    867
    Joined:
    Mar 4, 2015
    Because enemies always jump into your crosshair...
     
    SvenBent and tetris42 like this.
  6. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,803
    Joined:
    Sep 7, 2011
    jfreund and chili dog like this.
  7. cageymaru

    cageymaru [H]ard as it Gets

    Messages:
    19,709
    Joined:
    Apr 10, 2003
    In VR this makes sense if the headset has eye tracking. The headset can track your eyes and adjust the view accordingly. This technology seems perfect for that scenario!

    In a regular video game... Unless you're playing on a potato... Why would you want this? The blurring of the world except for the area around the crosshair seems terrible. What if someone pops up out of a trench to the right? I want to see him in full HDR; not low resolution.

    Maybe I'm getting old. This new 2018 - 2019 trend of lowering video fidelity to turn on extra eye candy seems counterintuitive. Maybe by 2020 it will make more sense to me. :) I want my video cards fast and powerful! Throw more transistors at the problem!
     
  8. STEvil

    STEvil 2[H]4U

    Messages:
    2,815
    Joined:
    Oct 17, 2000
    What really matters is how much they lower the resolution of the out-of-view image.
     
  9. knowom

    knowom Limp Gawd

    Messages:
    424
    Joined:
    Aug 15, 2008
    I see potential with Microsoft's own HoloLens 2 if they can use it with zoom/auto-focus and post-process the regions your eyes are focused on. Imagine the HoloLens doing a bit of post-processing on a 4K screen you're looking at in real time, along with other cool stuff like custom configurable UI HUDs. Another cog in the wheel.
    As for counterintuitive, not so much. I think certain parts of scenes can be made more adaptive to improve performance overall. I don't care if a lot of stuff in mipmap LOD regions and the outer regions of peripheral view gets reduced a bit for the sake of enhancing the more important regions on screen, the ones that are more vital the majority of the time. I think it could be especially great combined with eye tracking.
     
    Last edited: Mar 19, 2019
  10. Zulgrib

    Zulgrib n00b

    Messages:
    31
    Joined:
    Dec 11, 2018
    So, compute power growth is too slow, to the point that such cheats are required?
     
  11. ChadD

    ChadD 2[H]4U

    Messages:
    3,932
    Joined:
    Feb 8, 2016
    Even if software had all the compute power it could handle, why burn resources and power for nothing? Our own brains use this cheat at all times... using nature's "cheats" is only logical.
     
  12. naib

    naib [H]ard|Gawd

    Messages:
    1,262
    Joined:
    Jul 26, 2013
    Yes, I am surprised it took the slowdown of Moore's law for ingenuity to return to software. What coders used to do in the '80s was amazing.
    While Moore's law was the observation of a doubling in transistor density every 12-18 months, coders have been getting lazier and lazier, with the result that resources are wasted rather than used optimally. There was a paper I read some time ago which had a figure of around a 1.1x improvement in software performance over the same period as those hardware improvements. That means in real terms the code is getting worse.
     
    Zulgrib and Submarinesailor like this.
  13. DedEmbryonicCe11

    DedEmbryonicCe11 [H]ard|Gawd

    Messages:
    1,571
    Joined:
    Jun 6, 2006
    Why do you want less fidelity on the periphery of your vision in a video game to mimic reality? Don't you want to go beyond reality? I thought that's what the industry has been aiming for all along...
     
    lostin3d likes this.
  14. Kor

    Kor 2[H]4U

    Messages:
    2,176
    Joined:
    Mar 31, 2010
    Why would you want to waste resources on something a human is typically incapable of perceiving? The light spectrum is infinite, but we more or less stopped at 1.07bn colours from a 10-bit-per-channel palette, and no one is asking for more.
     
  15. tetris42

    tetris42 [H]ardness Supreme

    Messages:
    4,518
    Joined:
    Apr 29, 2014
    Can't speak for everyone, but I think it's a lack of trust that developers will use this wisely. The assumption is that you're always looking dead center at your game and they'll only downscale unnoticed content. Here are some scenarios I can see developers overlooking:

    -Not having detail on any content past a 16:9 ratio because it didn't occur to them people would run it at 21:9
    -As gxp500 pointed out, maybe enemies are coming at you from your periphery and you actually did need some of that detail
    -Maybe the default FOV is horrendously low, so someone is using a mod to make it suitable and now there's no detail on the edges
    -Maybe you want to look at background details you like during a cutscene with locked camera angles

    Remember, this is the same industry that brought us tinting the entire screen brown, overblown bloom, chromatic aberration on everything, depth of field for nice screenshots that makes you feel nearsighted, zoomed-in FOV, 30fps caps, physics tied to framerate, etc.
     
    deton8, jfreund and SvenBent like this.
  16. Uvaman2

    Uvaman2 2[H]4U

    Messages:
    3,031
    Joined:
    Jan 4, 2016
    I think this is true, and I wonder if x86 has suffered more years of this than ARM so far.
    So many of the programs I downloaded for Android are mere MBs; it reminds me of DOS and Windows 3.1.
    I'm glad such efficient methods are being implemented.
     
  17. SvenBent

    SvenBent 2[H]4U

    Messages:
    2,890
    Joined:
    Sep 13, 2008
    Why do you use compression for audio and video when it removes fidelity?
    It improves overall efficiency.
    The concept should really be easy to understand.

    But let's see the actual execution.
     
  18. jfreund

    jfreund Gawd

    Messages:
    951
    Joined:
    Sep 3, 2006
    In reverse, though.
     
    katanaD likes this.
  19. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,995
    Joined:
    Oct 13, 2016
    It only took me 2-3 seconds to identify that the image on the right was the one without it being used, on a 1440p display. Granted, that was a still and not moving, so no guarantees I'd perceive it in motion. I am, however, one of those obsessed with clarity and sharpness in games, so at 4K I'm pretty sure I'd notice, especially after doing a lot of testing with DLSS in the last few months. If your bottom line is frames, then VRS and DLSS are great compromises, but if not, it's another half step backwards. The positive I see in this is that if it's an option, it becomes another tool gamers can use to optimize per their needs or resources.

    I've commented numerous times recently on how all these new features and their various combinations (1080p/1440p/4K, HDR, DLSS, RT, DX11, DX12, Vulkan) are adding an incredible amount of testing overhead to PC game reviewers' metrics now. It occurred to me that instead of itemizing RT+DLSS, and now VRS, some simplification could be used. How about two tiers: 'everything on' and as sharp as can be, or 'all compromises used' and as fast and blurry as it can be rendered.
     
  20. knowom

    knowom Limp Gawd

    Messages:
    424
    Joined:
    Aug 15, 2008
    Trading IQ in areas you don't care much about for higher IQ in areas you'd rather see improved seems like a great compromise. Done well, it can be a very good thing; done poorly, it can be a rather ugly thing.
     
  21. PeaKr

    PeaKr Gawd

    Messages:
    716
    Joined:
    Sep 6, 2004
    Technologies like VRS, adaptive resolution, etc. seem like good tools to benefit game streaming. Makes me wonder if that's why they're being developed; nothing I'm interested in.
     
    Shagittarius likes this.
  22. Shagittarius

    Shagittarius n00b

    Messages:
    63
    Joined:
    May 3, 2016
    This is about streaming. Streaming should die.
     
    jfreund likes this.