NVIDIA on DLSS: "It Will Improve over Time"

Discussion in '[H]ard|OCP Front Page News' started by Megalith, Feb 17, 2019.

  1. Megalith

    Megalith 24-bit/48kHz Staff Member

    Messages:
    13,003
    Joined:
    Aug 20, 2006
    Be patient, says NVIDIA: gamers who are unimpressed with the company’s Deep Learning Super Sampling (DLSS) technique should understand that the technology is still in its infancy and that there is plenty of potential left to realize. As Andrew Edelsten (Technical Director of Deep Learning) explains, DLSS relies on training data, which only continues to grow; that is part of the reason the technique is less impressive at lower resolutions, as development focused on 4K. Edelsten also suggests gamers may want to avoid TAA due to “high-motion ghosting and flickering.”

    We built DLSS to leverage the Turing architecture’s Tensor Cores and to provide the largest benefit when GPU load is high. To this end, we concentrated on high resolutions during development (where GPU load is highest) with 4K (3840x2160) being the most common training target. Running at 4K is beneficial when it comes to image quality as the number of input pixels is high. Typically for 4K DLSS, we have around 3.5-5.5 million pixels from which to generate the final frame, while at 1920x1080 we only have around 1.0-1.5 million pixels. The less source data, the greater the challenge for DLSS to detect features in the input frame and predict the final frame.
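
    For context, the per-axis render scale implied by those pixel budgets can be worked out directly. A quick sketch (the function name is mine; the pixel counts are from the quote above):

    ```python
    # Per-axis render scale implied by the input-pixel budgets NVIDIA quotes.
    def render_scale(out_w, out_h, input_pixels):
        """Scale factor on each axis for a given input pixel budget."""
        return (input_pixels / (out_w * out_h)) ** 0.5

    # 4K target: roughly 3.5-5.5 million input pixels
    print(render_scale(3840, 2160, 3.5e6))  # ~0.65
    print(render_scale(3840, 2160, 5.5e6))  # ~0.81
    # 1080p target: roughly 1.0-1.5 million input pixels
    print(render_scale(1920, 1080, 1.0e6))  # ~0.69
    print(render_scale(1920, 1080, 1.5e6))  # ~0.85
    ```

    The implied scale factors are similar at both targets; what differs is the absolute number of source pixels the network gets to inspect, which is the shortfall Edelsten describes at 1080p.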
     
  2. MMitch

    MMitch Limp Gawd

    Messages:
    500
    Joined:
    Nov 29, 2016
    Yeah, I would turn RT and DLSS off based on what I saw there... check the radio at 6:30... DLSS is awful.
     
  3. Hagrid

    Hagrid [H]ardness Supreme

    Messages:
    7,993
    Joined:
    Nov 23, 2006
    Yeah, but it's new. It will be nice when we can have everything turned on with good framerates at 4K. It will definitely be a few years.
     
    ZeroBarrier likes this.
  4. PhaseNoise

    PhaseNoise [H]ard|Gawd

    Messages:
    1,254
    Joined:
    May 11, 2005
    The whole idea of using non-critical path resources (offline + tensor cores) to determine where best to spend critical-path resources is a Good Thing.

    DLSS is not the final word on the topic.
     
  5. cageymaru

    cageymaru [H]ard|News

    Messages:
    19,086
    Joined:
    Apr 10, 2003
    I think the best thing I've seen out of NVIDIA's new tech is older games getting updated visuals with A.I. If this DLSS fad eventually dies out and we're left with that, I'd be a happy gamer. And the randomly generated faces should make game development easier. Heck, if we could have a game where the characters were randomly generated so that no playthrough was the same... I see all kinds of possibilities coming from NVIDIA and game developers in the future.

    I hope they continue to take chances with new technology as it gives others ideas on how to improve it. Maybe we don't "want" DLSS, but we might want what someone thinking outside of the box creates in the future.

    NVIDIA should concentrate on lowering the price of their cards so more creatives and gamers can afford them.
     
    trandoanhung1991, Bcc335 and msshammy like this.
  6. Chris_B

    Chris_B [H]ardness Supreme

    Messages:
    5,052
    Joined:
    May 29, 2001
    Be patient? It's been, what, almost 6 months, and it's in 2 games and a couple of benchmarks?
     
  7. captaindiptoad

    captaindiptoad Limp Gawd

    Messages:
    367
    Joined:
    Dec 22, 2014
    DLSS is shit, and will always be shit. It's some weird downscale-then-upscale bullshit that ruins quality.
     
    Nightfire and darckhart like this.
  8. russnuck

    russnuck Gawd

    Messages:
    647
    Joined:
    Mar 25, 2005
    But I was promised "it just works" (TM)
     
  9. N4CR

    N4CR 2[H]4U

    Messages:
    3,426
    Joined:
    Oct 17, 2011
    That texture upscaling tech for old games, e.g. FF7, has nothing to do with ngreedia. It's a group of enthusiasts delivering what e.g. Squaresoft won't (and they seem to think people want to play Tomb Raider FF7 and FF8...).
     
    Last edited: Feb 17, 2019
  10. MMitch

    MMitch Limp Gawd

    Messages:
    500
    Joined:
    Nov 29, 2016
    The thing is, DLSS isn't 4K; it's upscaled lower resolution. Yes, the display renders a 4K number of pixels, but the fact is it's upscaled.
    You can't magically create pixels. They were guessed, and the result speaks for itself... it's blurry. I'm not OK with settling for lower quality for the sake of saying it runs at 4K (upscaled).
     
    Shaten and captaindiptoad like this.
  11. gxp500

    gxp500 Gawd

    Messages:
    872
    Joined:
    Mar 4, 2015
    Thank you, Nvidia, for bringing console-quality graphics to the PC with DLSS...
     
    Shaten, Bigshrimp, Flexion and 5 others like this.
  12. kirbyrj

    kirbyrj [H]ard as it Gets

    Messages:
    23,704
    Joined:
    Feb 1, 2005
    So, in other words, a third of the way through the release cycle for the current generation of cards, which cost 40% more than previous generations because of the new features.
     
  13. BloodyIron

    BloodyIron 2[H]4U

    Messages:
    3,431
    Joined:
    Jul 11, 2005
    Why sell them on it being great now when we can sell them on it being great later?
     
    crazycuz20 likes this.
  14. mkk

    mkk [H]Lite

    Messages:
    91
    Joined:
    Jun 20, 2018
    Interesting how they now mention that if the framerate gets too high, DLSS stops working because it can't keep up.
    Yeah, this kind of tech belongs on consoles.
     
    N4CR, thenapalm, Marees and 1 other person like this.
  15. 5150Joker

    5150Joker 2[H]4U

    Messages:
    2,914
    Joined:
    Aug 1, 2005
    Turing as a whole has been a failure because of NVIDIA's pricing. Just look at the new 1660 Ti: $279 with 6 GB of VRAM and performance that only matches a $300 GTX 1070. Without the extra benefit of Turing's RTX/DLSS features, why would anyone want a 1660 Ti over a GTX 1070, which is selling for $299 on Newegg right now? No wonder NVIDIA isn't meeting its financial targets for gaming; its price/performance is lousy, and gamers aren't stupid. If the 2080 Ti were $700, I'd have bought one, or maybe even two, but they simply priced themselves out of the market this generation.
     
    Bigshrimp, Ranulfo, N4CR and 3 others like this.
  16. Galvin

    Galvin 2[H]4U

    Messages:
    2,666
    Joined:
    Jan 22, 2002
    Why can't AA just be done off the 3D models in the game? The game knows all.

    Find the outline of a 3D object from the game data, make a 2D line, and AA that line. Then do each one until everything on the screen is done, instead of having to search the whole screen for this stuff.
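
    For what it's worth, the "AA that line" step described above is a classic technique. Here is a minimal sketch in the spirit of Xiaolin Wu's line algorithm, where plot() is a placeholder for blending into a framebuffer:

    ```python
    # Distribute each column's coverage between the two pixel rows the
    # line straddles; shallow slopes only (|dy| <= |dx|) to keep it short.
    def plot(x, y, coverage):
        print(f"pixel ({x},{y}) alpha={coverage:.2f}")  # stand-in for blending

    def aa_line_shallow(x0, y0, x1, y1):
        if x1 < x0:                      # always step left to right
            x0, y0, x1, y1 = x1, y1, x0, y0
        gradient = (y1 - y0) / (x1 - x0)
        y = float(y0)
        for x in range(int(x0), int(x1) + 1):
            frac = y - int(y)            # position between two pixel rows
            plot(x, int(y), 1.0 - frac)  # nearer row gets most of the coverage
            plot(x, int(y) + 1, frac)    # next row gets the remainder
            y += gradient

    aa_line_shallow(0, 0, 10, 3)
    ```

    The likely catch is scale: a modern frame has millions of triangle edges, and much of today's aliasing comes from shading (specular sparkle, subpixel geometry) rather than edges, which is partly why screen-space and temporal methods won out.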
     
  17. Krenum

    Krenum [H]ardForum Junkie

    Messages:
    15,100
    Joined:
    Apr 29, 2005
    I use DLSS in BFV; I don't see what all the complaining is about.
     
  18. 5150Joker

    5150Joker 2[H]4U

    Messages:
    2,914
    Joined:
    Aug 1, 2005
    Well, some pictures have shown that it doesn't look any better than resolution scaling at 80-85%, and it has the same performance hit.
     
    Nightfire likes this.
  19. Flogger23m

    Flogger23m [H]ardForum Junkie

    Messages:
    9,491
    Joined:
    Jun 19, 2009
    And if we can get randomly generated faces that look realistic enough, perhaps we can cut actors out of the picture, aside from motion capture. A lot of games are starting to hire high-profile actors, which does nothing but balloon development costs. Kicking those clowns to the curb would be a huge benefit. Hell, if we could kick them out of Hollywood and replace humans with realistic CGI characters, that would be great.
     
  20. Grimlaking

    Grimlaking 2[H]4U

    Messages:
    2,558
    Joined:
    May 9, 2006
    So am I the only one who read that announcement as: DLSS is great and will be amazing... when we get a couple more generations of cards under our belts. Thanks for buying our test cards! We had to start somewhere, and the early adopters get to pay for it.

    Same as all the other tech, really. I'm not terribly surprised, nor am I worried about it. This is normal for new tech.
     
    crazycuz20 likes this.
  21. Furious_Styles

    Furious_Styles [H]ard|Gawd

    Messages:
    1,052
    Joined:
    Jan 16, 2013
    I agree somewhat. I'm tired of having celebrities in everything. I don't need Tom Cruise voicing my character; almost anyone would do fine.
     
    GoldenTiger likes this.
  22. MMitch

    MMitch Limp Gawd

    Messages:
    500
    Joined:
    Nov 29, 2016
    DLSS as an upscaler will never be better than native resolution. Now, if you add proprietary features like RT and then upscale, that's something else. Right now, RT + DLSS would still be disabled on my rig; I still find plain gameplay footage better looking than upscaled low resolution with bells and whistles. If they can reduce the image-quality hit, maybe we can compromise, and I'd take more special effects like RT over better image quality.

    However, I do think AI has a role to play in the future, though maybe not as an upscaler. They might be able to use it for other purposes that make the overall experience better.
     
  23. Krenum

    Krenum [H]ardForum Junkie

    Messages:
    15,100
    Joined:
    Apr 29, 2005
    (image)
     
  24. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,826
    Joined:
    Oct 13, 2016
    I still have some faith in RT thanks to what I've seen in Metro. DLSS, on the other hand, still needs to prove itself to me beyond the concept. I was 'upscaling' things back in the DVD days with the downloads I got via dial-up. I know what it's like to squeeze water from a stone, but currently this feels more like several steps backwards.
     
    darckhart and MMitch like this.
  25. Minutemaid

    Minutemaid n00b

    Messages:
    13
    Joined:
    Nov 5, 2011
    They had to price their 20-series cards high or the excess inventory of 10-series cards would never sell. Nvidia is doing what they need to do right now. Once inventory clears, it'll be a different story... and the competition (or lack thereof) certainly isn't forcing their hand.
     
  26. 5150Joker

    5150Joker 2[H]4U

    Messages:
    2,914
    Joined:
    Aug 1, 2005
    Well, considering their net revenue fell over 50%, I'd say the strategy isn't working out that great. AMD is not a good competitor, and nobody should expect them to be one anymore (I wrote them off years ago; they're a joke). What NVIDIA should be losing sleep over is the fact that Intel is coming after them in 2020 with a top-down solution for the consumer and professional markets. Intel has the ability to offer full platforms that include a custom CPU/GPU/chipset to companies and consumers alike, but NVIDIA has nothing to compete against that. Once NVIDIA's bread and butter in the discrete space gets eaten by Intel's midrange offerings, they will be up shit creek without a paddle.

    This article is very telling about NVIDIA's future: https://finance.yahoo.com/news/why-nvidia-apos-growth-days-183000981.html

    Then factor in that discrete GPU attach rates keep shrinking each year and have been since around 2006-2007: https://www.jonpeddie.com/press-rel...ch-reports-gpu-shipments-up-in-q318-from-las/

    Without a viable CPU architecture and growth opportunities (data center will be gone, as will self-driving cars eventually), NVIDIA is fucked. I'm usually an NVIDIA supporter because I believe they make great technology, but their Turing release was a slap in the face of consumers. Yes, DXR/ray tracing may be something we can use down the line, but the way they tried to subsidize Turing (it's really for professional use) by shoving it down our throats at inflated prices was extremely greedy and a disservice to their customers.
     
    Last edited: Feb 18, 2019
    crazycuz20 likes this.
  27. Minutemaid

    Minutemaid n00b

    Messages:
    13
    Joined:
    Nov 5, 2011
    Interesting times ahead, indeed. Intel and AMD have the CPU architecture advantage, which makes this an interesting race.
    I totally agree that ray tracing is more applicable to professional use right now, but you've got to start somewhere on the gaming side. We're at the chicken-or-the-egg stage, and it's frustrating... but nobody forced anyone to buy a Turing card.
     
  28. polonyc2

    polonyc2 [H]ardForum Junkie

    Messages:
    15,926
    Joined:
    Oct 25, 2004
    DLSS is pretty much the same as FXAA (no real performance hit, but it introduces blur)... I honestly was expecting something much better after all the hype.
     
  29. 5150Joker

    5150Joker 2[H]4U

    Messages:
    2,914
    Joined:
    Aug 1, 2005
    True, and judging by NVIDIA's financial results, hardly anyone did. :ROFLMAO:
     
    Submarinesailor likes this.
  30. DejaWiz

    DejaWiz Oracle of Unfortunate Truths

    Messages:
    18,475
    Joined:
    Apr 15, 2005
    Be patient, eh?

    So, DLSS will improve when a future GPU is powerful enough to take a 4K base and apply it to an 8K render at playable framerates?

    Is that what they mean?
     
  31. Submarinesailor

    Submarinesailor [H]Lite

    Messages:
    73
    Joined:
    Mar 23, 2016
    Let me get this straight: NVIDIA is using a supercomputer to create maps/models to improve DLSS on a per-game basis?
    That means the "AI" portion of the deep learning crap is all offline and not part of the RTX technology (in case you thought RTX was the AI cat's meow).

    Are game publishers required to pay NVIDIA for creating DLSS models? Hello, higher development costs.
    Is GeForce Experience phoning home with data from your playthroughs to feed the supercomputer?
     
  32. Shadowed

    Shadowed Limp Gawd

    Messages:
    460
    Joined:
    Mar 21, 2018
    They did try really hard promoting Tegra 2 and 3 for mainstream devices at least. Gotta give them some credit! :android:
     
  33. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,351
    Joined:
    Jul 29, 2009
    So it's finally official:
    #Waitfor_DLSS

    Maybe until this generation of cards is obsolete?
     
  34. adder1971

    adder1971 n00b

    Messages:
    30
    Joined:
    Nov 19, 2010
    Buy it now! It just works!

    Or wait to buy it until it works......
     
    Last edited: Feb 18, 2019
    Nightfire likes this.
  35. fightingfi

    fightingfi Look at Me! I need the attention.

    Messages:
    2,405
    Joined:
    Oct 9, 2008
    So typical of NVIDIA to push for games to be darker so as to increase fps...
     
  36. {NG}Fidel

    {NG}Fidel [H]ardness Supreme

    Messages:
    5,817
    Joined:
    Jan 17, 2005
    It just seems like a speed hack (I am admittedly being harsh) to bring RTX into playability. I could have seen this gaining traction during the 290/780 days, when 4K was almost impossible without dropping a massive amount of cash and dealing with several other drawbacks.
    Now I think people are close enough to 4K60 that we don't want to sacrifice resolution to get a single effect, no matter how great that effect may be.
    So I will pass on DLSS; color me unimpressed.
     
    Marees and Shadowed like this.
  37. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    6,585
    Joined:
    Feb 22, 2012
    The “learning” happens at nVidia to create the AI algorithm, and it's free to the dev.

    Sending screenshots from your computer to nVidia would be worthless to them. That's not how it works.
     
  38. Submarinesailor

    Submarinesailor [H]Lite

    Messages:
    73
    Joined:
    Mar 23, 2016
    How do you know it's free to the devs?
    I didn't say screenshots. I said data.
     
  39. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    6,585
    Joined:
    Feb 22, 2012
    Data would be useless and they’ve said it’s a free service to the devs.
     
  40. STEM

    STEM Gawd

    Messages:
    523
    Joined:
    Jun 7, 2007
    According to Jensen's presentation at launch, DLSS works like this: they feed low-resolution images, and then very high-resolution versions of the same images from any given game, into their supercomputer. It's supervised machine learning, which we've had for a long time now. The neural network then creates a model that NVIDIA embeds into their driver updates and that runs on the Tensor cores of your RTX graphics card.
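
    That supervised setup is straightforward to caricature in code. A minimal sketch using PyTorch; every name here is illustrative, and nothing below reflects NVIDIA's actual model, data, or pipeline:

    ```python
    # Toy supervised super-resolution loop: learn to map low-res frames to
    # high-res "ground truth" frames. Purely illustrative; NOT NVIDIA's DLSS.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyUpscaler(nn.Module):
        """Toy 2x upscaler: conv features, then pixel-shuffle upsampling."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 3 * 4, 3, padding=1),  # 4 = 2x2 upscale factor
                nn.PixelShuffle(2),                  # channels -> 2x resolution
            )

        def forward(self, x):
            return self.net(x)

    model = TinyUpscaler()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    for step in range(100):
        hi = torch.rand(8, 3, 128, 128)  # stand-in for pristine reference frames
        lo = F.avg_pool2d(hi, 2)         # stand-in for the lower-res in-game render
        loss = F.l1_loss(model(lo), hi)  # penalize deviation from the reference
        opt.zero_grad()
        loss.backward()
        opt.step()
    ```

    Only the trained weights ship (via the driver), and at runtime only the forward pass runs on the Tensor cores, consistent with the training itself being entirely offline.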

    In theory this all sounds nice, except that 3D games are very hard to predict, so the ML model cannot account for every situation. Hence you get a blurry mess, where an upscaled game with TAA enabled looks better than DLSS.

    NVIDIA knew this, of course, so DLSS was never really meant to improve anything except to help fake real-time ray tracing. Let's get one thing straight: RTX GPUs aren't capable of full-frame real-time ray tracing. So what NVIDIA does is, for example, calculate a handful of rays (inside a frame that will be rasterized in the end, mind you), and then DLSS is supposed to fill in what the rest of that "real-time ray-tracing" shadow or reflection looks like. That's why you get massive performance drops without DLSS in Battlefield V, for example.

    Turing GPUs were explicitly designed for enterprise applications. They are fantastic for visual effects studios, natural gas exploration, medical imaging, and so on. NVIDIA didn't want to create a separate line of GPUs for gaming, and the die area devoted to Tensor cores and whatever the RT cores are made of (did you guys notice that the RT core count matches the SM count on each RTX model?) would have been too much to disable in consumer GPUs. So they decided to bring us real-time ray tracing and deep learning super sampling. If you think about it, this last one is ridiculous given that 4K displays are readily available for gaming these days, so... Anyway, not to go in circles, but DLSS at this point can do only one thing: help sell RTX ON in games. You will get slightly better reflections and shittier all-around image quality when these two are turned on.

    So, can we agree that Turing has been a flop so far for the consumer market, and that we would have been better off getting more SMs for better rasterized graphics this generation?

    It's sad that two years after the GTX 1080 Ti was released, it actually looks like a hot deal if you can find one brand new at the $699 MSRP, or $799-$899 for a heavily factory-overclocked card with upgraded cooling. Yay for progress!

    I can already hear the sharks... err... lawyers sharpening their teeth. This will turn into the worst class-action lawsuit NVIDIA has ever had to deal with on the consumer side.
     
    Last edited: Feb 18, 2019
    Marees and 5150Joker like this.