NVIDIA on DLSS: "It Will Improve over Time"

Megalith

24-bit/48kHz
Staff member
Joined
Aug 20, 2006
Messages
13,000
Be patient, says NVIDIA: gamers who are unimpressed with the company’s Deep Learning Super Sampling (DLSS) technique should understand that the technology is still in its infancy and that there is plenty of potential left to be realized. As Andrew Edelsten (Technical Director of Deep Learning) explains, DLSS relies on training data, which only continues to grow. That is part of the reason the technique is less impressive at lower resolutions, as the focus was on 4K. Edelsten also suggests gamers may want to avoid TAA due to its “high-motion ghosting and flickering.”

We built DLSS to leverage the Turing architecture’s Tensor Cores and to provide the largest benefit when GPU load is high. To this end, we concentrated on high resolutions during development (where GPU load is highest) with 4K (3840x2160) being the most common training target. Running at 4K is beneficial when it comes to image quality as the number of input pixels is high. Typically for 4K DLSS, we have around 3.5-5.5 million pixels from which to generate the final frame, while at 1920x1080 we only have around 1.0-1.5 million pixels. The less source data, the greater the challenge for DLSS to detect features in the input frame and predict the final frame.
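To put those numbers in perspective, here is a quick back-of-the-envelope check. The internal render resolutions below are illustrative guesses chosen to land inside the ranges quoted above; NVIDIA does not publish the exact per-game input resolutions.

```python
# Rough pixel-budget comparison for DLSS input frames.
# The internal render resolutions are assumptions for illustration only,
# picked so the totals fall inside the ranges NVIDIA quotes above.

def megapixels(width, height):
    """Pixel count of a frame, in millions of pixels."""
    return width * height / 1e6

cases = [
    ("4K output (3840x2160)",    (2560, 1440)),  # ~3.7 MP of input data
    ("1080p output (1920x1080)", (1440, 810)),   # ~1.2 MP of input data
]

for label, (w, h) in cases:
    print(f"{label}: ~{megapixels(w, h):.1f} million input pixels from {w}x{h}")
```

The point of the comparison is simply that a 4K target hands the network roughly three times as many source pixels to work from as a 1080p target, which matches the explanation above.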
 
Yeah, I would turn RT and DLSS off with what I saw there... check the radio at 6:30... DLSS is awful.
Yeah, but it's new. It will be nice when we can have everything turned on with good frames on 4K. It will definitely be a few years.
 
I think the best thing that I've seen out of NVIDIA's new tech is the older games getting updated visuals with A.I. If this DLSS fad eventually dies out and we're left with that, I'd be a happy gamer. And the randomly generated faces should make game development easier. Heck, if we could have a game where the characters were randomly generated so no playthrough was the same. I see all types of possibilities coming from NVIDIA and game developers in the future.

I hope they continue to take chances with new technology as it gives others ideas on how to improve it. Maybe we don't "want" DLSS, but we might want what someone thinking outside of the box creates in the future.

NVIDIA should concentrate on lowering the price of their cards so more creatives and gamers can afford them.
 
That texture upscaling tech for old games, e.g. FF7, has nothing to do with ngreedia. It's a group of enthusiasts delivering what e.g. Squaresoft won't (and who seem to think people want to play Tomb Raider, FF7 and FF8...).
 
Yeah, but it's new. It will be nice when we can have everything turned on with good frames on 4K. It will definitely be a few years.

The thing is, DLSS isn't 4K, it's upscaled lower resolution. Yeah, the display gets a 4K number of pixels, but the fact is it's upscaled.
You can't magically create pixels. They were guessed, and the results speak for themselves... it's blurry. I'm not OK with settling for lower quality for the sake of saying it runs at 4K (upscaled).
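For what it's worth, here's a toy illustration of that point. This is plain bilinear interpolation, not what DLSS does (DLSS uses a trained network), but it shows the basic constraint: every pixel that wasn't rendered has to be estimated from the ones that were.

```python
# Toy 2x upscale by bilinear interpolation: the new pixels are weighted
# averages of existing ones, not new information. DLSS predicts rather than
# interpolates, but the source data is still the lower-resolution render.
import numpy as np

def bilinear_upscale(image, scale):
    """Naively upscale a 2D grayscale image by interpolating between neighbors."""
    h, w = image.shape
    ys = np.linspace(0, h - 1, h * scale)
    xs = np.linspace(0, w - 1, w * scale)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]
    top = image[np.ix_(y0, x0)] * (1 - wx) + image[np.ix_(y0, x1)] * wx
    bot = image[np.ix_(y1, x0)] * (1 - wx) + image[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

low_res = np.random.rand(270, 480)        # stand-in for a low-res frame
upscaled = bilinear_upscale(low_res, 2)   # every added pixel is an estimate
print(low_res.shape, "->", upscaled.shape)  # (270, 480) -> (540, 960)
```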
 
Turing as a whole has been a failure because of NVIDIA's pricing. Just look at the new 1660 Ti at $279 with 6 GB of VRAM and performance only matching a $300 GTX 1070. Without any extra benefit from Turing's RTX/DLSS features, why would anyone want a 1660 Ti over a GTX 1070, which is selling for $299 on Newegg right now? No wonder NVIDIA isn't meeting their financial targets for gaming; their price/performance is lousy and gamers aren't stupid. If the 2080 Ti was $700 I'd have bought one, or maybe even two of them, but they simply priced themselves out of the market this generation.
 
Why can't AA just be done off the 3D models in the game? The game already knows all of that data.

Find the outline of a 3D object from the game data, make it a 2D line, and AA that line. Then do each one until everything on screen is done, instead of having to search the whole screen for this stuff.
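Something in this spirit does exist; MSAA, for example, only takes extra samples at triangle edges, and there's a long history of geometry-based edge AA. A rough sketch of the "find the outlines from the model data" part is the classic silhouette-edge trick: an edge is on the outline if one of its two adjacent triangles faces the camera and the other faces away. The snippet below is only a toy illustration of that idea, not how any shipping engine does its AA; intersections, shader/texture edges and transparency are where it gets hard in practice.

```python
# Toy silhouette-edge finder: collect edges whose two adjacent triangles face
# opposite ways relative to the camera. Those edges could then be projected to
# 2D and smoothed, which is roughly the idea described above. Illustration only.
import numpy as np

def faces_camera(normal, tri_center, eye):
    """True if a triangle's front side points toward the camera position."""
    return np.dot(normal, eye - tri_center) > 0.0

def silhouette_edges(vertices, triangles, normals, eye):
    """Return vertex-index pairs for edges on the object's visible outline."""
    vertices = np.asarray(vertices, dtype=float)   # (N, 3) positions
    triangles = np.asarray(triangles, dtype=int)   # (M, 3) vertex indices
    normals = np.asarray(normals, dtype=float)     # (M, 3) face normals
    eye = np.asarray(eye, dtype=float)             # camera position

    owners = {}  # edge (pair of vertex indices) -> adjacent triangle indices
    for t, tri in enumerate(triangles):
        for a, b in ((tri[0], tri[1]), (tri[1], tri[2]), (tri[2], tri[0])):
            owners.setdefault(frozenset((int(a), int(b))), []).append(t)

    result = []
    for edge, tris in owners.items():
        if len(tris) != 2:
            continue  # open boundary edge; could also be treated as an outline
        facing = [
            faces_camera(normals[t], vertices[triangles[t]].mean(axis=0), eye)
            for t in tris
        ]
        if facing[0] != facing[1]:
            result.append(tuple(edge))
    return result
```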
 
I use DLSS in BFV, I don't see what all the complaining is about.
 

And if we can get randomly generated faces that look realistic enough, perhaps we can cut actors out of the picture, aside from motion capture. A lot of games are starting to hire high-profile actors, which is nothing but another cost that balloons development budgets. Kicking those clowns to the curb would be a huge benefit. Hell, if we can kick them out of Hollywood and replace humans with realistic CGI characters, that would be great.
 
So am I the only one that read that announcement as: DLSS is great and will be amazing... when we get a couple more generations of cards under our belts. Thanks for buying our test cards! We had to start somewhere, and the early adopters get to pay for it.

Same as all the other tech, really. I'm not terribly surprised, nor am I worried about it. This is normal for new tech.
 
And if we can get randomly generated faces that look realistic enough, perhaps we can cut actors out of the picture, aside from motion capture.

I agree somewhat. I'm tired of having celebrities in everything. I don't need Tom Cruise voicing my character; almost anyone would do fine.
 
So am I the only one that read that announcement as. DLSS is great and will be amazing... when we get a couple more generations of cards under our belts. Thanks for buying our test cards! We had to start somewhere and the early adopters get to pay for it.

Same as all the other tech really. I'm not terribly surprised really, nor am I worried about it. This is normal for new tech.

DLSS as an upscaler will never be better than native resolution. Now, if you add proprietary features like RT and then upscale, that's something else. Right now, RT + DLSS would still be disabled on my rig; I still find actual gameplay footage better looking than upscaled low resolution with bells and whistles. If they can lower the image quality hit, maybe there's a compromise where I'd take more special effects like RT over better image quality.

However, I do think AI has a role to play in the future, maybe just not as an upscaler. They might be able to use it for other purposes that make the overall experience better.
 
I still have some faith in RT thanks to what I've seen in Metro. DLSS, on the other hand, still needs to prove itself to me beyond the concept. I was 'upscaling' things back in the DVD days from the downloads I got via dial-up. I know what it's like to squeeze water from a stone, but currently this feels more like several steps backwards.
 
Turing as a whole has been a failure because of NVIDIA's pricing.

They had to price their 20* series cards high or the excess inventory of 10* series would never sell. Nvidia is doing what they need to do right now. Once inventory clears, it'll be a different story... and the competition (or lack thereof) certainly isn't forcing their hand.
 

Well, considering their net revenue fell over 50%, I'd say the strategy isn't working out that great. AMD is not a good competitor and nobody should expect them to be one anymore (I wrote them off years ago; they're a joke). What NVIDIA should be losing sleep over is the fact that Intel is coming after them in 2020 with a top-down solution for the consumer and professional markets. Intel has the ability to leverage full platforms that include a custom CPU/GPU/chipset to companies and consumers alike, but NVIDIA has nothing to compete against them. Once NVIDIA's bread and butter in the discrete space gets eaten by Intel's midrange offerings, they will be up shit creek without a paddle.

This article is very telling about NVIDIA's future: https://finance.yahoo.com/news/why-nvidia-apos-growth-days-183000981.html

Then factor in that discrete GPU attach rates keep shrinking each year and have been since around 2006-2007: https://www.jonpeddie.com/press-rel...ch-reports-gpu-shipments-up-in-q318-from-las/

Without a viable CPU architecture and growth opportunities (the data center will be gone, as will self-driving cars eventually), NVIDIA is fucked. I'm usually an NVIDIA supporter because I believe they make great technology, but their Turing release was a slap in the face of consumers. Yes, DXR/ray tracing may be something we can use down the line, but the way they tried to subsidize Turing (it's really for professional use) by shoving it down our throats at inflated prices was extremely greedy and a disservice to their customers.
 

Interesting times ahead, indeed. Intel and AMD have the CPU architecture advantage that makes this an interesting race.
Totally agree that ray tracing is more applicable to professional use right now, but you've got to start somewhere on the gaming side. We are at the chicken or the egg stage, and it's frustrating... but nobody forced anyone to buy a Turing card.
 
DLSS is pretty much the same as FXAA (no real performance hit, but it introduces blur)... I honestly was expecting something much better after all the hype.
 
Interesting times ahead, indeed. Intel and AMD have the CPU architecture advantage that makes this an interesting race.
Totally agree that ray tracing is more applicable to professional use right now, but you've got to start somewhere on the gaming side. We are at the chicken or the egg stage, and it's frustrating... but nobody forced anyone to buy a Turing card.

True, and judging by NVIDIA's financial results, hardly anyone did. :ROFLMAO:
 
Be patient, eh?

So, DLSS will improve when a future GPU is powerful enough to take a 4K base and apply it to an 8K render at playable framerates?

Is that what they mean?
 
Let me get this straight: Nvidia is using a supercomputer to create maps/models to improve DLSS on a per game basis?
That means the "AI" portion of the deep learning crap is all offline and not a part of RTX technology (in case you thought RTX was the AI cat's meow).

Are game publishers required to pay Nvidia for creating DLSS models? Hello higher development costs.
Is Geforce experience phoning home with data from your playthrough to feed the supercomputer?
 
Intel has the ability to leverage full platforms that include a custom CPU/GPU/chipset to companies and consumers alike, but NVIDIA has nothing to compete against them.

They did try really hard promoting Tegra 2 and 3 for mainstream devices at least. Gotta give them some credit! :android:
 
So it's finally official:
#Waitfor_DLSS

Maybe until this generation of cards is obsolete?
 
So typical of NVIDIA to push for games to be darker so as to increase FPS...
 
Just seems like a speed hack (I am being harsh, admittedly) to allow RTX to actually reach playability. I could have seen this gaining traction during the 290/780 days, when 4K was almost impossible without dropping a massive amount of cash and dealing with several other drawbacks.
Now I think people are close enough to 4K60 that we do not want to sacrifice resolution in order to get a single effect, no matter how great that effect may be.
So I will pass on DLSS; color me unimpressed.
 
Let me get this straight: Nvidia is using a supercomputer to create maps/models to improve DLSS on a per game basis?
That means the "AI" portion of the deep learning crap is all offline and not a part of RTX technology (in case you thought RTX was the AI cat's meow).

Are game publishers required to pay Nvidia for creating DLSS models? Hello higher development costs.
Is Geforce experience phoning home with data from your playthrough to feed the supercomputer?

The “learning” happens at nVidia to create the AI algorithm and is free to the dev.

Sending screenshots from your computer to nVidia would be worthless to nVidia. That’s not how it works.
 

How do you know it's free to the devs?
I didn't say screenshots. I said data.
 
According to Jensen's presentation at launch, DLSS is this: they feed low-resolution images, and then very high-resolution versions of the same images, of any given game into their supercomputer. It's supervised machine learning, and we've had this for a long time now. The training then produces a model that NVIDIA embeds into their driver updates and that runs on the Tensor cores of your RTX graphics card.
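For anyone curious what "feed low-res and high-res image pairs into a network" looks like in code, here is a generic toy example in PyTorch. To be clear, this is just a minimal supervised super-resolution sketch under my own assumptions; NVIDIA's actual DLSS network, loss functions and training pipeline aren't public, and the per-game models they ship are certainly far more sophisticated than this.

```python
# Minimal supervised super-resolution sketch (NOT NVIDIA's DLSS model):
# train a tiny network to map low-resolution frames to high-resolution
# ground truth, which is the general setup described in the post above.
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    """Toy 2x upscaler: a few convolutions followed by a pixel shuffle."""
    def __init__(self, channels=3, features=32, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(),
            nn.Conv2d(features, features, 3, padding=1), nn.ReLU(),
            nn.Conv2d(features, channels * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # rearrange channels into a 2x larger image
        )

    def forward(self, x):
        return self.body(x)

def train_step(model, optimizer, low_res, high_res, loss_fn=nn.L1Loss()):
    """One supervised step: predict a high-res frame, compare to ground truth."""
    optimizer.zero_grad()
    loss = loss_fn(model(low_res), high_res)
    loss.backward()
    optimizer.step()
    return loss.item()

# Random stand-in "frames" just to show the shapes involved.
model = TinyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
low = torch.rand(4, 3, 270, 480)    # batch of low-resolution inputs
high = torch.rand(4, 3, 540, 960)   # matching high-resolution targets
print(train_step(model, optimizer, low, high))
```

The trained weights would then be shipped and only inference would run on the user's GPU, which lines up with the "model embedded in driver updates, runs on the Tensor cores" part above.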

In theory, this all sounds nice, except that 3D games are very hard to predict, so the ML model cannot account for every situation. Hence, you get a blurry mess, where an upscaled game with TAA enabled looks better than DLSS.

NVIDIA knew this of course, so DLSS wasn't ever meant to improve anything, except help fake real-time ray tracing. Let's get one thing straight: RTX GPUs aren't capable of full-frame real-time ray tracing. So what NVIDIA does is, for example, calculate a handful of rays, inside a frame that will be rasterized in the end mind you, and then DLSS is supposed to fill in what the rest of that "real-time ray-tracing" shadow or reflection looks like. That's why, without DLSS, you get massive performance drops in Battlefield 5, for example.

Turing GPUs were explicitly designed for enterprise applications. They are fantastic for visual effects studios, natural gas exploration, medical imaging and so on. NVIDIA didn't want to create a separate line of GPUs for gaming, and the die portion with Tensor cores and whatever the RT cores are made of (did you guys notice that the RT core count matches the SM count on each RTX model?) would have been too much to disable in consumer GPUs. So they decided to bring us real-time ray tracing and deep learning super sampling. If you think about it, this last one is ridiculous, given that 4K displays are readily available for gaming these days, so... Anyway, not to go in circles, but DLSS at this point can do only one thing: help sell RTX ON in games. You will get slightly better reflections and shittier all-around image quality from now on when these two are turned on.

So, can we agree that Turing has been a flop so far for the consumer market, and we would have been better off getting more SMs for better-rasterized graphics for this generation?

It's sad that two years after the GTX 1080 Ti was released, it actually looks like a hot deal if you can find one brand new at the MSRP of $699, or $799-$899 for a heavily factory-overclocked card with upgraded cooling. Yay for progress!

I can already hear the sharks... err... lawyers sharpening their teeth. This will turn into the worst class-action lawsuit that NVIDIA has ever had to deal with on the consumer side.
 