AMD Dismisses NVIDIA's DLSS for Its "Image Artifacts" and "Harsh Scaling"

Megalith

AMD doesn’t seem all that impressed with NVIDIA’s deep-learning supersampling (DLSS) technique: despite earlier reports suggesting the company was developing a DLSS alternative built on Microsoft’s DirectML API, representatives now say AMD is “doubling down on SMAA and TAA,” which they claim don’t come with “the image artifacts caused by the upscaling and harsh sharpening of DLSS.”

While AMD isn’t actively pursuing DLSS-like technologies right now, that doesn’t rule out something similar being rolled out later using WindowsML or DirectML to create an open-source alternative to DLSS. Using machine learning to enhance the visuals and performance of games on Radeon GPUs in the future therefore remains a possibility. Just hopefully without that “harsh sharpening,” eh?
 
So do they pre-render stuff and pack it into your driver, or is it streaming of sorts? ...If it's a bad guess on the rendering, or not applicable, or not fast enough, does it switch to lower-quality rendering?.. Something like that?
Meh, sounds like it will work consistently well as time passes, I guess... But it reads like cloud computing more than anything else... or like streaming pieces of the rendering process... Centralize some more, baby!
 
I think it's pretty interesting. It's not the answer to all problems, but it's an option that gives pretty solid visuals at high frame rates.
 
Nvidia just marketed DLSS incorrectly, IMO.

It's an upscaler that puts idle tensor cores to work, not an MSAA or SMAA replacement. If you have to run a game at a non-native resolution for performance reasons, DLSS is still way better than upscaling the image with the bilinear (or is it bicubic these days?) filter the driver would otherwise use.

I've been saying for years that AMD/Nvidia should bake sharper resolution scaling into the GPU driver, as something like a sharpened spline or Lanczos filter is trivial to run on GPUs these days.
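To make the "trivial to run" claim concrete, here is a minimal sketch of 1-D Lanczos resampling in Python/NumPy. It is only an illustration of the math (the function names, the a=2 window, and the edge clamping are my own choices, not anything pulled from a driver); a GPU version would do the same weighted sum as a separable filter in a shader.

```python
import numpy as np

def lanczos_kernel(x, a=2):
    """Lanczos window: sinc(x) * sinc(x/a) for |x| < a, else 0."""
    x = np.asarray(x, dtype=np.float64)
    w = np.sinc(x) * np.sinc(x / a)   # np.sinc is the normalized sinc
    w[np.abs(x) >= a] = 0.0
    return w

def lanczos_resample_1d(samples, scale, a=2):
    """Resample a 1-D signal by `scale` (e.g. 2.0 doubles the sample count)."""
    samples = np.asarray(samples, dtype=np.float64)
    n_out = int(len(samples) * scale)
    out = np.empty(n_out)
    for i in range(n_out):
        src = i / scale                              # position in source space
        lo = int(np.floor(src)) - a + 1              # leftmost of the 2a taps
        taps = np.arange(lo, lo + 2 * a)
        weights = lanczos_kernel(src - taps, a)
        idx = np.clip(taps, 0, len(samples) - 1)     # clamp at the edges
        out[i] = np.dot(weights, samples[idx]) / weights.sum()
    return out

# A 2-D image would be filtered the same way, once per axis (separable filter).
```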
 
Sometimes it looks sharper, other times it doesn't, and sometimes it's shiny, but does it really matter? If a game is good enough I won't care. CPUs & GPUs tend to have a lot of power that is misused in games. Instead of spending that power making things shiny, I wish they'd make games more immersive. Make thriving cities that feel like there are more than 12 people living there. Make space battles that don't feel like 4 ships fighting in 2 dimensions. I do realise the people that make the shiny part aren't the same people as those who make the core game, but there is a lot the shiny-makers could do that would have more impact and improve the game. If you're making a cake, it doesn't matter how shiny the cherry on top is if the cake is made from dog food and yams.

It's not totally the GPU makers' fault, as they're just making what game devs want: more and more shiny to distract from the fact their actual game is flat, lifeless and boring.
 
If a game is good enough I won't care.
Gameplay is certainly a huge factor in determining whether a game is good, but DLSS is strictly about improving GPU performance and makes no claims about improving gameplay.

You've got a good point and all; it's just way OT, is what I'm saying.
 
Nvidia just marketed DLSS incorrectly, IMO.
That's too generous, IMO - their marketing around it has been downright malicious. When they put their own 1440p results up against non-Turing cards' 4K results, they're trying to claim a massive performance advantage that doesn't actually exist, and those results made it into some fairly high-profile review graphs. They also boosted brightness and saturation on the DLSS side of their Final Fantasy DLSS comparison video.

It's good at what it does, but their marketing is completely ignoring its actual purpose in favor of whatever makes Nvidia look like a wizard.
 
They sucked people into paying stupid money for a promise that ended in a lie months later.
People use 4K for the extra detail it exposes, not to have it look much worse.
Future trust won't be so easy.
This isn't bad marketing, it's a bad product for a poor price.
 
Wow, the differences shown by 3DMark, on an entirely fixed benchmark, made me realize that checkerboard rendering is superior. I mean, there were differences in the levels of effects applied; it's like reflections and shadows were tampered with under DLSS when in theory it was only about screen resolution.
Ambient occlusion would probably be the thing changed, more than shadows per se. In any case it's not the holy grail that was promised.
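For reference, since checkerboard rendering keeps coming up as the comparison point: a deliberately simplified sketch of the idea, assuming a per-pixel checkerboard and plain reuse of the previous frame for the unshaded half (real console implementations work on 2x2 quads and reproject with motion vectors, so treat this as a toy model only).

```python
import numpy as np

def checkerboard_merge(prev_frame, new_half, parity):
    """Combine the pixels shaded this frame (one checkerboard half) with the
    previous reconstructed frame filling in the other half.

    prev_frame: last reconstructed frame, shape (H, W, 3)
    new_half:   this frame's render, valid only on its checkerboard half
    parity:     0 or 1, which half was shaded this frame
    """
    h, w, _ = prev_frame.shape
    yy, xx = np.mgrid[0:h, 0:w]
    shaded = (yy + xx) % 2 == parity        # mask of freshly shaded pixels
    out = prev_frame.copy()                 # start from last frame's result
    out[shaded] = new_half[shaded]          # overwrite with the new half
    return out
```

Only half the pixels are shaded per frame, which is where the performance win comes from; the image quality then depends entirely on how well the missing half is reconstructed.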
 
I do 1080p and 1440p with two 1080 Ti cards, and TAA usually looks as good as or better than my brother's rig with his new 2080... Now, on a 2080 Ti with higher detail there might be a difference? Don't know.

But for us, mine looks nicer about 60% of the time.

Maybe after some tweaking I'll start needing to switch, but for now I'm good.
 
Nvidia just marketed DLSS incorrectly, IMO.

It's an upscaler that puts idle tensor cores to work, not an MSAA or SMAA replacement. If you have to run a game at a non-native resolution for performance reasons, DLSS is still way better than upscaling the image with the bilinear (or is it bicubic these days?) filter the driver would otherwise use.

I've been saying for years that AMD/Nvidia should bake sharper resolution scaling into the GPU driver, as something like a sharpened spline or Lanczos filter is trivial to run on GPUs these days.


DLSS, IMO, seems like something they threw together at the last minute to try to make the ridiculous price more palatable. Nearly six months later, this is what they have to show for it: one game that's a blur-athon with it turned on, and weird resolution requirements based on the GPU.
 
I was watching the video fullscreen; the DLSS looked sharper and even a bit brighter.

AMD's comment, and half of those in this thread, are not surprising. I myself haven't been impressed with how the DLSS + RTX Battlefield V comparisons looked. But I think BfV is just a crappy overall example of RTX, and probably of DLSS as well. The DX12 hit that game takes is a dead giveaway that the programmers haven't done their job with DX12, which by extension affects RTX and likely DLSS too. So I am not going to let one impression set my opinion in stone.

Of course for those who want nVidia to fail for whatever reason, expect an excess of negative remarks.

The Metro Exodus game has very comparable DX11 and DX12 performance. It of course still takes a performance hit for RTX, but the FPS is in a playable range. Can't recall if there are also DLSS performance numbers or image-quality comparisons out there yet for Metro. If worse comes to worst and it does reduce image quality, then I think it still has a place for 2060 and 2070 users to help boost FPS.

Still too early to tell how that tech will end up; no point in coming to a judgement after only two examples of the implementation. Can't test it myself as I'm still running the 1080 Ti :)
 
Nvidia just marketed DLSS incorrectly, IMO.

It's an upscaler that puts idle tensor cores to work, not an MSAA or SMAA replacement. If you have to run a game at a non-native resolution for performance reasons, DLSS is still way better than upscaling the image with the bilinear (or is it bicubic these days?) filter the driver would otherwise use.

I've been saying for years that AMD/Nvidia should bake sharper resolution scaling into the GPU driver, as something like a sharpened spline or Lanczos filter is trivial to run on GPUs these days.
I think it's pure gimmickry, the reason being it's not a universal solution. The game has to be specifically submitted to Nvidia by the developer so they can add support for it in their drivers. As I understand it, you can't run DLSS on any game that hasn't been officially supported. That makes it DOA for me, and it will probably be forgotten in the proprietary dustbin along with TXAA in a few years.
 
I think DLSS works great for its intended function. It gives 60+ FPS in Metro Exodus @ 4K Ultra settings on a 2080 Ti. Without it enabled you have to put the shader scaling down to 60-70%, which IMO looks worse. It adds some marginal blur, but you don't lose the detail in distant objects like you do with normal scaling.

I think the biggest issue is how they marketed the feature. They should have simply marketed it as a higher-quality scaling solution that allows one to play at higher resolutions at higher FPS with raytracing enabled. That's really what it's there for.

The issue is that between the poor marketing of the feature and how bungled the BF5 release was, I think they missed their chance to roll it out right. With Metro, however, I'm glad the technology is there. It's done right, and really allows for some amazing visuals and great FPS @ 4K.
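Quick arithmetic behind that "shader scaling down to 60-70%" remark, just to show why it hits detail so hard (the percentages are taken from the post above; how a given game applies its scale slider may differ): a linear scale applies to both axes, so the number of shaded pixels falls with its square.

```python
# Resolution scale vs. shaded pixel count (the scale applies to width and height).
for scale in (1.0, 0.7, 0.6):
    print(f"{scale:.0%} scale -> {scale ** 2:.0%} of native pixels shaded")
# 100% -> 100%, 70% -> 49%, 60% -> 36%
```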
 
I think it's pure gimmickry, the reason being it's not a universal solution. The game has to be specifically submitted to Nvidia by the developer so they can add support for it in their drivers. As I understand it, you can't run DLSS on any game that hasn't been officially supported. That makes it DOA for me, and it will probably be forgotten in the proprietary dustbin along with TXAA in a few years.

It's nothing more than a performance boost. That's why it's very specific by card, by game, and by resolution used. You'd have no reason to run it in most games where you are already hitting 60+ FPS. They should have marketed it as a performance boost when using raytracing at higher resolutions. That's really all it is. It's not intended to be a new AA solution; the AA is a side effect of how it works.

Nvidia would have been better off being honest with customers about the performance impact DXR/RTX has on games, and how DLSS helps offset that impact, rather than acting like it's some fancy AA solution.
 
It's nothing more than a performance boost. That's why it's very specific by card, by game, and by resolution used. You'd have no reason to run it in most games where you are already hitting 60+ FPS. They should have marketed it as a performance boost when using raytracing at higher resolutions. That's really all it is. It's not intended to be a new AA solution; the AA is a side effect of how it works.

Nvidia would have been better off being honest with customers about the performance impact DXR/RTX has on games, and how DLSS helps offset that impact, rather than acting like it's some fancy AA solution.
Yeah, that's the problem: it's NOT intended as a new AA solution. You get the best visual quality with SSAA but the worst performance, so this could potentially be a decent compromise. Problem is, if the game I want to run it on isn't on the list, then too bad, it does nothing for me. It's been my experience that proprietary tech like this doesn't survive unless they give developers money to add it in. Sure, Nvidia will for several games, but after their allocation for that dries up, I see the technology becoming dead in the water. In five years I expect almost no new games to use this. Meanwhile, universal options like FXAA, DSR, etc. are still being used long after they were introduced.
 
IMO this tech won't survive for very long. It's just a stop-gap to allow reasonable FPS while using ray tracing. So as soon as the technology catches up it will be irrelevant.

Of course, display manufacturers have gone completely overboard with resolution. Soon we'll have 8K 27'' panels, which makes no sense. I wish they'd focus on improving the things that matter. More resolution beyond what we already have is simply a waste, IMO. But this technology will make sense, since the resolution wars don't seem to be stopping any time soon.
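A back-of-the-envelope check on the "8K at 27 inches makes no sense" claim, assuming a 16:9 panel, a roughly 60 cm desktop viewing distance, and the common rule of thumb that the eye resolves around 60 pixels per degree (all three are assumptions on my part, not numbers from the post):

```python
import math

def pixels_per_degree(h_res, diag_in, aspect=(16, 9), dist_cm=60.0):
    """Pixels subtended by one degree of visual angle at dist_cm."""
    width_in = diag_in * aspect[0] / math.hypot(*aspect)   # panel width in inches
    px_per_cm = (h_res / width_in) / 2.54                  # horizontal pixel density
    return px_per_cm * dist_cm * math.tan(math.radians(1.0))

for h_res, label in ((3840, "4K"), (7680, "8K")):
    print(f"{label} at 27in: {pixels_per_degree(h_res, 27):.0f} px/deg")
# roughly 67 px/deg for 4K and 135 px/deg for 8K
```

By that rule of thumb, 4K at 27 inches from desktop distance is already at or past what most eyes resolve, which is essentially the point being made above.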
 
IMO this tech won't survive for very long. It's just a stop-gap to allow reasonable FPS while using ray tracing. So as soon as the technology catches up it will be irrelevant.

If by the technology catching up you mean newer GPUs with a 100% performance gain over the current generation, you might be kidding yourself. The move to 7nm and 5nm is not going to bring the same huge performance boosts we used to get in years past. Moore's law is dead; the gains are much smaller and slower now.
To me this is an out-of-the-box solution to performance issues. It's going to take some time to get out to other games and other resolutions, as the nVidia (mainframe?) has to crunch numbers on a game-by-game basis to be able to add it to those games.

Time will tell, but a breakthrough in the chip manufacturing process is probably what it will take for us to see some new, really impressive GPUs.
 
I love this back and forth stuff. :) Going to stick with AMD as long as they are around, which could be for a long time to come. (For personal computing only, not so much everyone else's computers.)
 
The solution, IMHO, is to split the GPU much in the same way as the CPU/GPU split: specialized processors.
 
I was watching the video fullscreen; the DLSS looked sharper and even a bit brighter.
If an AA or scaling algo makes anything brighter on a large scale, it's overstepping.

The FF DLSS demo was blatant. I don't know whether it was tweaked for the video or if FF's DLSS implementation is just set up to do that, but either way is bad. If it's the former, the dishonesty is obvious. If it's the latter, either Square Enix and Nvidia cooperated to dull down the non-DLSS tonemapping to a level below what Square Enix thought was ideal (like tweaking the video but worse), or Nvidia thought that FF's tonemapping wasn't hot enough and took matters into their own hands (in which case Square Enix ought to be seriously pissed off).

The Port Royal DLSS demo is a lot more subtle, but there's still some stuff in the tonemapping that should match up and doesn't.
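If anyone wants to sanity-check the "brighter" observation rather than eyeball compressed video, a global brightness comparison is easy to script. A minimal sketch (the filenames are placeholders for two captures of the same frame, one native and one DLSS; this only catches global tonemapping shifts, not local sharpening):

```python
import numpy as np
from PIL import Image

def mean_luma(path):
    """Average Rec. 709 luma of an image on a 0-255 scale."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    return (0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]).mean()

# A scaler or AA pass that only resamples should leave global brightness
# essentially unchanged; a large delta points at the tonemapping being touched.
delta = mean_luma("dlss_frame.png") - mean_luma("native_frame.png")
print(f"mean luma delta (DLSS - native): {delta:+.2f}")
```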
 
They're moving away from DLSS after everyone did the research for them and figured out DLSS is a waste of time.
 
I dislike NV. I want any reason to support my bias. This video doesn't do it. In fact DLSS looks better in the published video.
 
I dislike NV. I want any reason to support my bias. This video doesn't do it. In fact DLSS looks better in the published video.

A canned benchmark you consider a video... yeah, it looks somewhat better.
Go see, in previously posted news, how it fares in real-life gameplay in BFV... it didn't look better and was actually more blurry for things like grass, trees, etc.

I think there's a reason nothing was available on day 1, and it's not because they couldn't make it happen. If on day 1 you had known what it would look like when actually playing a game, I think everyone would have said it was crappy.

Now everyone says "wait, you'll see..." but the truth, from my point of view, is that the resulting image in game is worse right now with DLSS.
 
I don't know what everyone is bitching about, I have DLSS on in my games. What's the issue?

 
A canned benchmark you consider a video... yeah, it looks somewhat better.
Go see, in previously posted news, how it fares in real-life gameplay in BFV... it didn't look better and was actually more blurry for things like grass, trees, etc.

I think there's a reason nothing was available on day 1, and it's not because they couldn't make it happen. If on day 1 you had known what it would look like when actually playing a game, I think everyone would have said it was crappy.

Now everyone says "wait, you'll see..." but the truth, from my point of view, is that the resulting image in game is worse right now with DLSS.

Nvidia would have rather had it on day 1. They were banking on DLSS making up for the performance impact of raytracing. Not having it on day 1 for BFV really hurt them in that aspect.

In that regard, DLSS does the job. It allows someone to use raytracing. It's just a sophisticated scaler. It does a better job, and you get some AA as a side effect.
 
Nvidia would have rather had it on day 1. They were banking on DLSS making up for the performance impact of raytracing. Not having it on day 1 for BFV really hurt them in that aspect.

In that regard, DLSS does the job. It allows someone to use raytracing. It's just a sophisticated scaler. It does a better job, and you get some AA as a side effect.

Yes, I agree. Where I take issue is comparing, say, FPS at 4K vs. 4K DLSS. 4K DLSS, as you pointed out, isn't real 4K, and I think the image quality is worse.
So yeah, you get a better framerate but a loss of quality. DLSS should be branded as 1080p upscaled, not 4K.
 
I'll maintain that it isn't even just upscaling.

An upscaling algorithm wouldn't give you the change in reflections and shadows, and it would be easier to just apply, as in the comparison with what AMD offered to console manufacturers with hardware checkerboard rendering. They also seem to tamper with effect levels, guessing over the months at the best combination of effect levels they think works, and IMHO it doesn't really work, because at that point you aren't comparing apples-to-apples IQ.

Again, this comes from that 3DMark video when it goes back and forth on the very same static image.

Honestly, I'm feeling that the dynamic resolution scaling in Wolfenstein is a better deal by far too.
 