NVIDIA on DLSS: "It Will Improve over Time"

When I first read about DLSS I was excited. It was actually the most interesting feature of RTX cards for me.

But it has been one disappointment after another.

HWUB are fairly rigorous and unbiased testers; their DLSS results in BF5 are an unbelievable fail.


All this AI technology and ML training on supercomputers, and the end result is clearly worse than simple upscaling.

AMD should just encourage devs to decouple the HUD resolution from 3D resolution, and have a couple of tweakable generic scalers for gamers, and they would beat DLSS with a solution that just worked everywhere.

Heads should be rolling at NVidia, after this dismal demonstration of their technology.
 
Glad I did not buy a new RTX laptop; it's not worth it. I can live with my Asus ROG laptop with an i7 and a 1060 for a while longer...
 
And if we can get randomly generated faces that look realistic enough, perhaps we can cut actors out of the picture, aside from motion capture. A lot of games are starting to hire high-profile actors, who are nothing but another cost ballooning development budgets. Kicking those clowns to the curb would be a huge benefit. Hell, if we could kick them out of Hollywood and replace humans with realistic CGI characters, that would be great.

That will never happen. Generally speaking, actors are getting paid less and less; they're a dime a dozen, and the competition is stiff.

They had to price their 20* series cards high or the excess inventory of 10* series would never sell. Nvidia is doing what they need to do right now. Once inventory clears, it'll be a different story... and the competition (or lack thereof) certainly isn't forcing their hand.

After they screwed over their board partners and the consumers, they will lower their prices because...? Right, because if they did, everyone, including me, would become even more angry at them for getting fleeced ($800 for an 8GB GPU in 2019, wtf NVIDIA?!) and would demand the difference in price back. NVIDIA will keep eating their shit-pie and do whatever it takes, including, but not limited to, giving money to developers to implement RTX features into their games, even if half-assed. Mark my words, NVIDIA will do whatever it takes before they concede that RTX was a failure and that they priced this GPU generation way too high.

All this AI technology and ML training on supercomputers, and the end result is clearly worse than simple upscaling.

Well, NVIDIA has a storied history of cutting image quality to improve FPS. This time around they just did it very openly, so no one can call them out on cheating.

AMD should just encourage devs to decouple the HUD resolution from 3D resolution, and have a couple of tweakable generic scalers for gamers, and they would beat DLSS with a solution that just worked everywhere.

Simple solutions are always the best. But then NVIDIA and other companies would have to cut down drastically on their marketing budget, and people would stop investing in nonsense. Just like ASUS will happily sell you a Z390 Maximus XI Hero motherboard as an 8+2-phase board, when in fact it only has 4+2. Screw efficiency if we can save $20 in VRM components. All sarcasm aside, SLI is dead, and NVLINK on consumer cards is a gamble at best, so NVIDIA has to make their margins while selling fewer GPUs. Admittedly, SLI has always been a little bit like Voodoo magic (pun intended). So they are selling us a garbage upscaler with a fancy name - DLSS - and a feature called Real Time Ray-Tracing that will take several generations to reach acceptable performance at today's resolutions. So no, no one likes simple common-sense solutions, because they don't generate sales. Sorry...
 
Here come the $20 rebates in 3 years.

Yeah, every game having an in-game rendering scaler, so the HUD and junk stay native, is great. All games should do that now.
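
For what it's worth, the idea is simple enough to sketch: render the 3D scene into a smaller offscreen buffer, upscale only that buffer to the display resolution with a generic scaler, and composite the HUD on top at native resolution. A rough, hypothetical Python/NumPy sketch (the scene and HUD "renderers" are stand-ins, and the scaler is just textbook bilinear, not any vendor's tech):

```python
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    """Textbook bilinear upscale of an (H, W, 3) float image."""
    in_h, in_w, _ = img.shape
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def render_frame(native_h, native_w, render_scale=0.7):
    # 1. "Render" the 3D scene at a reduced internal resolution
    #    (random noise stands in for a real renderer here).
    scene_h = int(native_h * render_scale)
    scene_w = int(native_w * render_scale)
    scene = np.random.rand(scene_h, scene_w, 3)

    # 2. Upscale only the 3D image to the display resolution.
    frame = bilinear_upscale(scene, native_h, native_w)

    # 3. Draw the HUD directly at native resolution, on top of the
    #    upscaled scene, so text and markers stay pixel-sharp.
    hud = np.zeros((native_h, native_w, 3))
    hud[10:40, 10:200] = 1.0  # stand-in for a health bar / minimap
    hud_mask = hud.sum(axis=2, keepdims=True) > 0
    return np.where(hud_mask, hud, frame)

frame = render_frame(1440, 2560, render_scale=0.7)
print(frame.shape)  # (1440, 2560, 3): native output, 3D rendered at ~1008p
```

The whole point is that the HUD never goes through the scaler, so no matter how far the 3D render scale drops, the text and markers stay crisp.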
 
After coming here the other day and having played about 90 minutes of Metro, I decided to go in and turn off DLSS to see if I noticed a difference - NIGHT & DAY. With DLSS on it's like looking through Vaseline on glass; with it off the game is so much more clear and crisp. I flipped back and forth a few times, and all I can say is I'm really disappointed in DLSS. Battlefield V's implementation is far superior to Metro's. I'm running a 2080 Ti @ 1440p, so not sure if that resolution is what's driving Metro's DLSS to make it look like 720p smudged to 1440p.

I think the idea and concept of DLSS and where they are going is the right direction... but it should NOT make the game look visually horrible. I'm leaving it off for now, as it's only a 10-20 FPS hit from what I can tell so far; I still run over 60 FPS with DXR and no DLSS @ 1440p.
 
After coming here the other day and having played about 90 minutes of Metro, I decided to go in and turn off DLSS to see if I noticed a difference - NIGHT & DAY. With DLSS on it's like looking through Vaseline on glass; with it off the game is so much more clear and crisp. I flipped back and forth a few times, and all I can say is I'm really disappointed in DLSS. Battlefield V's implementation is far superior to Metro's. I'm running a 2080 Ti @ 1440p, so not sure if that resolution is what's driving Metro's DLSS to make it look like 720p smudged to 1440p.

I think the idea and concept of DLSS and where they are going is the right direction... but it should NOT make the game look visually horrible. I'm leaving it off for now, as it's only a 10-20 FPS hit from what I can tell so far; I still run over 60 FPS with DXR and no DLSS @ 1440p.

Maybe these are rushed implementations; I can only imagine NVIDIA leveraging all their relationships in the industry and throwing money at the developers: "Let's make it happen" so we can show them that "it just works." Maybe NVIDIA could fix this so that DLSS wouldn't upscale an image from a lower resolution, but rather, when you enable DLSS at 4K for example, it fills in the gaps for RT and improves image quality instead of having to rely on TAA. If NVIDIA's goal for consumer Turing with their RTX cards was to improve image quality, then so far they have failed miserably by half-assing it. Fix it - fix it - fix it - please, NVIDIA.
 
Maybe these are rushed implementations; I can only imagine NVIDIA leveraging all their relationships in the industry and throwing money at the developers: "Let's make it happen" so we can show them that "it just works." Maybe NVIDIA could fix this so that DLSS wouldn't upscale an image from a lower resolution, but rather, when you enable DLSS at 4K for example, it fills in the gaps for RT and improves image quality instead of having to rely on TAA. If NVIDIA's goal for consumer Turing with their RTX cards was to improve image quality, then so far they have failed miserably by half-assing it. Fix it - fix it - fix it - please, NVIDIA.

I think they will, but I would of course caution people to never buy anything based upon what might be.

As I said before, it's a neat concept overall - using offline and otherwise-unused resources to speed up the critical path. After all, what is optimization? Fundamentally, not doing work you don't really need to do, or work which has poor ROI. Applying some resources that are not on the critical path to figure out where to do critical-path work is, as a generalization, a really great approach.

I do think these processes and techniques will be used increasingly in the future, and honestly, will probably end up in vendor-agnostic forms as this sort of thing is really best done by the developers themselves. NV, as you say, is throwing money and resources at it to get their version going now (to sell their cards) - but this approach overall is going to be a general trend.

So when I say I'm excited about DLSS, it's not DLSS itself, but rather the overall thought process and the design iterations which will inevitably come from this.
 
Well, NVIDIA has a storied history of cutting image quality to improve FPS. This time around they just did it very openly, so no one can call them out on cheating.

Yes, but DLSS doesn't get better FPS than simple scaling. It's both more expensive and worse quality than simple scaling. That takes a special kind of gross incompetence.
 
The thing is, DLSS isn't 4K, it's an upscaled lower resolution. Yeah, the display gets a 4K number of pixels, but the fact is it's upscaled.
You can't magically create pixels. They were guessed, and the result speaks for itself... it's blurry. I'm not OK with settling for lower quality for the sake of saying it runs at 4K (upscaled).

It's the later result they are banking on. Currently DLSS is rendered at a lower resolution, but several interviews have revealed that they plan to release DLSS 2X, which essentially renders at the same resolution. When we'll get it, and what it will need, is unclear, but you can clearly see NVIDIA isn't betting on these tech advances paying off now; it's all groundwork and foundation for the future.
 
Buy it now! It just works!

Or wait to buy it until it works......


I just watched this. Holy crap, it is worse than expected. Performance is on par with 1685p, yet 1685p blows it away in visual quality.

"DLSS is so bad it should be removed from BF5 as it robs gamers of potential visual quality." He goes on to say that it may only work on games that already have a layer of blurriness like RE2.
 


Steve and his team do a great job of walking you through the different performance options and comparing it to TSAA. The way I see it, right now, with the newness of the tech, the R&D cost, and the cost of the actual tensor cores themselves, you're trading correct lighting for image quality. You have to sacrifice something. DLSS implementation in its current state should be used to get playable frame rates with RT enabled.

In Gen 2/Gen 3 of RT/DLSS we'll be able to get both realistic lighting AND playable frame rates as (1) the technology matures and (2) software developers learn how to implement the technology more effectively. But hopefully, by then, the market will have moved to DX12 DXR, developers can code for both NAVI and TURING/TENSOR, and, like all NVIDIA proprietary technology, it goes the way of the dinosaur. But hey, someone needs to lead the way and someone needs to beta test the hardware and software.

Having been a consumer in this industry for 30 years, and now working at a startup SaaS company, this all makes sense to me, and I hold no ill will towards any organization or person unless it directly affects me. Companies are out to make money and consumers are out to spend their own cash for their own utility. The only fools in the equation are the ones that get worked up over other people's lives that have no effect on their own.
 
The apparent VRAM cost doesn't seem high at 250 MB:
https://www.techpowerup.com/reviews/Performance_Analysis/Metro_Exodus/6.html

However, that is measured against 4K native, whereas the visual quality is closer to 1440p. Against 1440p native, the cost is closer to 1 GB. Add in the 1 GB hit for RTX, and you are using a full 2 GB more running 4K DLSS w/RTX than 1440p native.

This does not even factor in the DX11-to-DX12 penalty.
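
To make that bookkeeping explicit, here is the rough arithmetic in the post's own figures. The ~0.75 GB gap between 4K-native and 1440p-native usage is an assumed placeholder that makes the stated 1 GB total work out; the TechPowerUp link above has the measured numbers:

```python
# Rough VRAM bookkeeping following the post above. All figures in GB; the
# 4K-vs-1440p gap is an assumed placeholder, not a measured value.
dlss_overhead   = 0.25  # apparent DLSS cost measured against 4K native
gap_4k_vs_1440p = 0.75  # assumed extra usage of 4K native over 1440p native
rtx_overhead    = 1.00  # stated hit for enabling RTX

# DLSS "4K" looks closer to 1440p, so the fairer baseline is 1440p native:
dlss_vs_1440p = dlss_overhead + gap_4k_vs_1440p   # ~1 GB
with_rtx      = dlss_vs_1440p + rtx_overhead      # ~2 GB

print(f"4K DLSS vs 1440p native:       ~{dlss_vs_1440p:.2f} GB extra")
print(f"4K DLSS + RTX vs 1440p native: ~{with_rtx:.2f} GB extra")
```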

That's why you buy two 2080's, link them together with NVLINK, and one day in the future (maybe), you'll have 16GB of VRAM... Until then, you know what happens when your GPU runs out of VRAM...
 
FINE WINE ! :D

Maybe, inside 6 years, this tech makes sense...to your wallet! :eek:

 
and the competition (or lack thereof) certainly isn't forcing their hand.
AMD competes with 99% of the market stack currently.

When I first read about DLSS I was excited. It was actually the most interesting feature of RTX cards for me.

But it has been one disappointment after another.

HWUB are fairly rigorous and unbiased testers; their DLSS results in BF5 are an unbelievable fail.


All this AI technology and ML training on supercomputers, and the end result is clearly worse than simple upscaling.

AMD should just encourage devs to decouple the HUD resolution from 3D resolution, and have a couple of tweakable generic scalers for gamers, and they would beat DLSS with a solution that just worked everywhere.

Heads should be rolling at NVidia, after this dismal demonstration of their technology.


Holy heck, that looks like crap...
This is worse than the 'wait for HBCC/Vega' bullshit, and it's from team green - I'm quite shocked, to be honest.
All they needed to do was not try to repurpose ML/tensor compute cores for some shitty feature that looks awful, and instead make a dedicated die that they could cut to make an x80 Ti, an x80, and an x70 for now, with a smaller die for the next three lower models.
It was that damn simple, but they got greedy and lazy to please shareholders and it blew up in their face. That's not even counting the space-invaders RMA debacle on the 2070-2080 Ti.
Is this the shittiest GPU launch in the last decade? Even the Fury didn't suck this hard.
 
DLSS (Looks Sucky Sucky) can go to hell for the time being.
 
I wonder when we will see "Turing advanced shaders" used, and whether they will be more of the same shitstorm as the rest of the new features from this awful generation of GPUs.

Now nGreedia has DLSS (Don't Look, Sucky Sucky) problems too, so that's 2/3 so far!
 
Maybe they should have made the pricing of RTX cards increase over time, as the features they claim it has actually become viable/valuable. ;)
 
Is this the shittiest GPU launch in the last decade? Even the Fury didn't suck this hard.

Fermi comes to mind in the last decade; before that, the 2900 XT sucked pretty badly.

FINE WINE ! :D

Maybe, inside 6 years, this tech makes sense...to your wallet! :eek:

Now nGreedia has DLSS (Don't Look, Sucky Sucky) problems too, so that's 2/3 so far!

Maybe they should have made the pricing of RTX cards increase over time, as the features they claim it has actually become viable/valuable. ;)


We would not poke subtle (and sometimes not-so-subtle) fun at NVIDIA every chance we get if the RTX launch had gone something like this:

  • Jensen (Jen Hsun) Huang gets up on stage, makes a few geeky jokes
  • Announces three GPUs:
    • RTX 2080 Ti - $899 FE / $799 MSRP
    • RTX 2080 - $599 FE / $499 MSRP
    • RTX 2070 - $449 FE / $399 MSRP
  • Moves on to give his hour-long RT / DLSS presentation
  • Shows performance data vs. Pascal
Even those prices are hard to swallow; however, we would have swallowed them much more easily and would have happily waited for those RTX features to materialize, if ever.

Lastly, it would have helped if NVIDIA had allowed NVLINK on the RTX 2070 as well. Come on, NVIDIA, how greedy can you be? It's only useful to a handful of people anyway, mostly hobbyists like me who like to dabble in ML and 3D animation and who also like to play with technology. We know that SLI will be dead and buried once we're past the current generation of games.

For a lot of us nerds, geeks, and hardware junkies, getting the latest tech is one of life's greatest pleasures. We were once NVIDIA's core customers, and NVIDIA still makes money off us. But when NVIDIA says that I will have to pay almost double for every new GPU release, I just want to go out and buy AMD, even if only out of spite. Or not buy anything at all. It just triggers the anger, the bitterness, and... the sadness in me, and I guess it's the same for many others here.

On launch day I was watching the presentation on my TV on YouTube, waiting with my credit card in hand to order two Ti cards. When I saw the price, and the fact that they were taking pre-orders with no actual performance data (not even something from the marketing department), I put my credit card back in my wallet. Right now I have one 2080 FE in my 1950X machine, and I still have my two 1080 Tis in my other system. And I got that 2080 FE just a few weeks ago. And somehow I just don't want to give NVIDIA any money ever again, and it would be so cool to have other options. Their behavior doesn't need to be encouraged by our dollars.

Sorry for going on a tangent there, but that's the main reason why everyone takes a stab at NVIDIA. People are angry with you, NVIDIA; I hope someone from the company is reading this.
 