cageymaru · Fully [H] · Joined: Apr 10, 2003 · Messages: 22,054
Hardware Unboxed has released its newest video, in which they dissect the image quality of Battlefield V with NVIDIA's new Deep Learning Super Sampling (DLSS) technology enabled. They not only compare the DLSS image quality to the native 4K image; Tim takes it a step further and compares the DLSS image to a 1685p image rendered at a 78% resolution scale and upscaled. They chose 1685p because it delivers a similar frame rate to the game with DLSS enabled.
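Where does 1685p come from? It falls straight out of the scale math. Here is a minimal Python sketch of how a resolution-scale percentage maps to an internal render resolution and to the share of native pixels actually shaded (plain arithmetic; no assumptions about the game's internals):

[CODE]
# Plain arithmetic: how a resolution-scale percentage maps to the
# internal render resolution and to the share of pixels actually shaded.

def scaled_resolution(width, height, scale):
    """Internal render resolution for a given resolution-scale factor."""
    return round(width * scale), round(height * scale)

native_w, native_h = 3840, 2160                # native 4K
w, h = scaled_resolution(native_w, native_h, 0.78)

print(f"78% scale renders at {w}x{h}")         # 2995x1685 -> the "1685p" figure
pixel_ratio = (w * h) / (native_w * native_h)
print(f"...which shades only {pixel_ratio:.0%} of the native 4K pixels")  # ~61%
[/CODE]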

In all instances the DLSS image looks smeared, while the upscaled 1685p (78% resolution scale) image is much more pleasing to look at. Tim Schiesser says, "The 1685p image destroys the DLSS image in terms of sharpness, texture quality, clarity; basically everything." He goes on to say, "The 78% scaled image preserves the fine detail on the rocks, the sign, the sandbags, the cloth, the gun; pretty much everywhere. With DLSS everything is blurred to the point where this detail is lost." Resolution scaling has been available to gamers for decades. They pull no punches and say, "DLSS sucks."

But the real kicker is looking at the visual quality comparisons. We'll start with native 4K versus 4K DLSS. Across all of the scenes that I've tested, there is a severe loss of detail when switching on DLSS. Just look at the trees in this scene. The 4K presentation is just as you'd expect: sharp, clean, with high detail on both the foliage and trunk textures. But with DLSS it's like a strong blur filter has been applied. Texture detail is completely wiped out; in some cases it's like you've loaded a low texture mode, while some of the fine branch detail has been blurred away, or even thickened in places, which makes the game look kind of weird in some situations. Of course this is to be expected. DLSS was never going to supply the same image quality as native 4K while also providing a 37% performance uplift; that would be pretty much black magic. But the quality difference comparing the two is almost laughable, given how far away DLSS is from the native presentation in these stressful areas.
 
It's really bad in BFV.

It's the better choice in Metro, however.

I think the reason it looks so bad in BFV is just that there are more variables that can change. While moving in particular it just looks horrible.

In Metro, it actually does a great job and basically looks like TAA is enabled, but you get a decent performance boost, and you don't get the loss of detail in distant objects that normal scaling brings.
 
Just a heads up, they are a known biased anti-nVidia website. Not a place you want to get your information or news from.

Based on who? You and a few reddit users? The test was very objective and straightforward.

Besides, this was not anti-Nvidia, but anti DLSS. Clearly, it is your bias showing.

They have reached similar conclusions on the RTX cards as everyone else, and were arguably more critical of the Radeon VII than most.
 
I don't know if they are or aren't, but they're right. DLSS sucks. It is shaping up to be a big fail.

Not really, as nVidia has said, the A.I. improves DLSS over time. So if something is in a constant state of change and improvement, still in the race, it cannot possibly fail. So no, DLSS is not a fail. Now, if you want to talk about AMD Radeon, then the word "fail" would be much more appropriate to use.

I am sure that after today, nVidia and EA will make sure those results are gone from the game.
 
DLSS looks awful in Battlefield and Metro: Exodus...total FXAA smear...there are tons of screenshots online comparing it...it's no coincidence that DLSS was introduced at the same time as ray-tracing...the only benefit of DLSS is it lessens the huge performance hit of RTX...that's it
 
Not really, as nVidia has said, the A.I. improves DLSS over time. So if something is in a constant state of change and improvement, still in the race, it cannot possibly fail. So no, DLSS is not a fail. Now, if you want to talk about AMD Radeon, then the word "fail" would be much more appropriate to use.

I am sure that after today, nVidia and EA will make sure those results are gone from the game.

You do realize that nVidia marketed this and RTX as easy to implement and that it "just works". You chose to gobble up their current marketing speak while ignoring the past.

I bought a 2080 Ti, but I know RTX and DLSS are going to be hot garbage for this generation.
 
Just a heads up, they are a known biased anti-nVidia website. Not a place you want to get your information or news from.

HAHAHA wow.

Just to give you guys some background, what Mr. SFD is talking about is this: they showed an RX 570 is faster than a 1050 Ti (which was never a revolutionary idea), and now all the Nvidiots think they were biased and only "show AMD in the best light", even though the 1050 Ti is, on average, 20% more expensive than the RX 570. But because they didn't show the RX 570 against the 1060 3GB (which is much, MUCH more expensive, yet the RX 570 still performs on par with it), they are "anti-Nvidia AMD shills".

Hardware Unboxed has always been pro-facts. And the FACT is that DLSS is a flop.
 
You do realize that nVidia marketed this and RTX as easy to implement and that it "just works". You chose to gobble up their current marketing speak while ignoring the past.

I bought a 2080 Ti, but I know RTX and DLSS are going to be hot garbage for this generation.
Yeah, but at least you have the fastest consumer GPU to console yourself with. (y)
 
Are you something kinda special? LOL... can someone tell this kid that DLSS is nVidia?

Based on "the word on the street" ... lot of people are talking about that website. It's a fairly open secret. And no, it's really hard to be biased when I can afford anything I want. I've told everyone here time and time again I am a fan of performance, I could get shit about what brand name is on the outside of a box. You guys are fanboys, not me. You guys can be fans and suffer the fate of slower performance, not me.

Just go watch some videos, check them out; the hate comes through fairly easily unless you're "something kinda special." lol

You mention "special" twice. Very big of you.

Being against a certain nVidia technology does not make you anti-Nvidia. I suppose [H] would be anti-Nvidia for its critical assessment of RTX in BFV.

You seem to be lashing out big time here for no apparent reason other than your immaturity.
 
It's really bad in BFV.

It's the better choice in Metro, however.

I think the reason it looks so bad in BFV is just that there are more variables that can change. While moving in particular it just looks horrible.

In Metro, it actually does a great job and basically looks like TAA is enabled, but you get a decent performance boost, and you don't get the loss of detail in distant objects that normal scaling brings.
Looks bad in Metro, based on videos I watched.
 
Wow, look at the 4K vs. 4K DLSS and 1685p upscaled vs. 4K DLSS comparisons at the highlighted time (5:26); the quality difference is laughable. Come on Nvidia, you're better than this. Hopefully the DLSS implementations get better; otherwise Nvidia has better uses for their manpower.
 
Based on who? You and a few reddit users? The test was very objective and straightforward.

Besides, this was not anti-Nvidia, but anti DLSS. Clearly, it is your bias showing.

They have reached similar conclusions on the RTX cards as everyone else, and were arguably more critical of the Radeon VII than most.


Everyone knows that every site that doesn't shower praise on my favorite brand, even at their worst, is anti-that brand and a shill for the competition :p
 
As a computer scientist with a couple of decades of experience in the field, one simple thing I can say is that when you see something being marketed as "AI powered", it is 100% a marketing gimmick. It is the same algorithms we've had for decades, with their known limitations, running on much faster consumer hardware. They were not usually called "AI" until relatively recently, as computer scientists tended to reserve that term for other things like "hard AI", but they are the same algorithms; there hasn't been any drastic advancement. Consumer hardware is much faster, so you can try algorithms, ML, etc. that you couldn't run on such devices before, but they are definitely not "magic" and not "smart" in the sense that marketing tries to imply.

If you want to talk more specifically about DLSS, for example, I have read in some places that it is "Deep Learning", which means that it will become better and better as it is "learning". Ehhh, yeah, that's not how it works at all. A DL system can improve if you give it more training data closely related to your scenarios, but with ever-diminishing returns (imagine a classic asymptotic curve). We have to assume DLSS is already trained quite a bit on a title like BFV, so it is near the max you are going to get anyway; training with different data sets (e.g. other games) will not help. To get any significant improvements, humans have to actually change/tweak the DL algorithm they designed for DLSS and make it better (as they'd do with any other kind of algorithm). And I guess it is possible the current implementation is nowhere near the best they can come up with, and DLSS could potentially work some day. But it definitely will not get much better by itself, by somehow "learning".
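To make the diminishing-returns point concrete, here is a toy Python sketch of an asymptotic learning curve. The power-law form is a common rule of thumb, and every constant in it is invented for illustration; nothing here is measured from DLSS:

[CODE]
# Toy model only: test error on a fixed task is often roughly
#   error(n) = irreducible + c / n**alpha
# for n training samples. Every constant below is invented to
# illustrate the asymptote; nothing here is measured from DLSS.

def toy_test_error(n_samples, irreducible=0.05, c=2.0, alpha=0.5):
    """Hypothetical test error as a function of training-set size."""
    return irreducible + c / (n_samples ** alpha)

for n in (1_000, 10_000, 100_000, 1_000_000, 10_000_000):
    print(f"{n:>10,} samples -> error {toy_test_error(n):.4f}")

# The numbers flatten toward the 0.05 floor: past a point, more data of
# the same kind barely helps; further gains require changing the model.
[/CODE]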
 
So does DLSS take memory? How much, if that's the case?

Neural net to what? Itself? Or is that just marketing speak/hogwash that Nvidia is pulling out of their . . .?

So far DLSS is the worst option for AA; I would call it a failure... Hopefully NVidia can add some magic to it to improve it some. After 6+ months, and this is the best they can do? Does not look too promising.

The restrictions are telling as well, as to the limitations of the tensor cores' ability: the 2080 Ti won't be able to do DLSS at lower resolutions, because DLSS would actually slow it down, and for the same reason it's restricted to RTX cards. So we have an AA method that not only blurs, but would significantly hurt performance if used by itself or at lower resolutions. lol. I would call that a fail. What developer will go for that in the future? Nvidia better be willing to pay some big shill money out.
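One way to picture that restriction, as a rough sketch rather than anything NVIDIA has documented: assume shading cost scales with pixel count while the DLSS pass costs a roughly fixed amount per frame. Then the upscale only pays off when native frame times are long enough. All numbers below (the 0.667 scale, the 3 ms DLSS cost, the frame times) are invented:

[CODE]
# Invented numbers throughout: a break-even sketch of why a fixed-cost
# upscaling pass stops paying off once native frame times get short.

SCALE = 0.667          # assumed DLSS internal render scale (4K -> ~1440p)
DLSS_COST_MS = 3.0     # assumed fixed per-frame cost of the DLSS pass

def frame_time_dlss(native_ms, scale=SCALE, dlss_cost_ms=DLSS_COST_MS):
    # Assume shading time scales with pixel count (scale**2), then add
    # the fixed upscale cost. Both assumptions are simplifications.
    return native_ms * scale**2 + dlss_cost_ms

for native_ms in (30.0, 12.0, 5.0):            # slow, medium, fast scenes
    dlss_ms = frame_time_dlss(native_ms)
    verdict = "DLSS wins" if dlss_ms < native_ms else "DLSS loses"
    print(f"native {native_ms:5.1f} ms vs DLSS {dlss_ms:5.2f} ms -> {verdict}")
[/CODE]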
 
Since this is a DLSS thread, I bet it's a blurry line they crossed.
Doing Lots of Screen Smearing
 
They created a game-specific upscaler and added some buzzwords. While there is nothing inherently wrong with that, they should stop blowing smoke up people's backsides.

Definitely Like a Shit Sandwich
 
Looks bad in Metro, based on videos I watched.

It could, depending on what video you watched, and whether that video fairly compared it to setting shader scaling to around the 60-70% mark.

With shader scaling set to a value that gives similar performance gains at 4K, I personally think DLSS looks better.

But in BFV it absolutely is trash. It's not worth using at all. Not to mention ray tracing sucks in BFV as well, since it's only used for reflections, not global illumination like in Metro.
 
So does DLSS take memory? How much, if that's the case?

Not sure about this game, but another site recorded 250 MB more VRAM used with 4K DLSS than 4K native (edit: in Metro). When compared to 1440p native, a closer visual comparison, it was about 1 GB more.

Not sure if system memory was affected.
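Purely as speculative back-of-envelope context for that 250 MB figure, here is what the obvious extra buffers alone might cost. The buffer list, the assumed 1440p internal resolution, and the formats are guesses, not NVIDIA's documented implementation:

[CODE]
# Back-of-envelope guesswork: VRAM for the extra buffers an upscaler
# plausibly needs. Buffer list, resolutions, and formats are assumptions,
# not NVIDIA's documented implementation.

def buffer_mb(width, height, bytes_per_pixel):
    """Size of one full-frame buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

render_w, render_h = 2560, 1440    # assumed internal render resolution
out_w, out_h = 3840, 2160          # 4K output

extra_mb = (
    buffer_mb(render_w, render_h, 8)    # input color, assumed RGBA16F
    + buffer_mb(render_w, render_h, 4)  # motion vectors, assumed RG16F
    + buffer_mb(out_w, out_h, 8)        # upscaled output, assumed RGBA16F
)
print(f"~{extra_mb:.0f} MB for the buffers alone")   # ~105 MB

# Network weights, any history buffers, and allocator overhead would
# come on top, which is at least in the ballpark of the reported 250 MB.
[/CODE]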
 
It works better but is still relatively mediocre in a tunnel shooter and a canned benchmark, wow! It's almost as if machine learning takes lots of effort to "just work" even in easy scenarios...
 
Compared to up-res, yes it does. For similar fps it looks worse.
That's a fail.

Again, I've switched between it in Metro on my own system, and DLSS absolutely looks better compared to 60% scaling.

In BFV though it looks -terrible-. To the point that they should have just scrapped the feature.
 
Not sure about this game, but another site recorded 250 MB more VRAM used with 4K DLSS than 4K native (edit: in Metro). When compared to 1440p native, a closer visual comparison, it was about 1 GB more.

Not sure if system memory was affected.
Thanks, great info. Makes me wonder how the 2060 will fare with the additional VRAM needs; still, it will be rendering at a lower resolution and upscaling, so it's probably OK. Also makes you wonder how 2x DLSS at native 4K resolution will fare, since I would think it uses the same data; 1 GB is my best guess, plus the additional RAM requirements of RTX.

Now, I was impressed with what I saw in 3DMark with DLSS; that looks like a best-case scenario, and I assume more learned data could be downloaded for BFV for specific levels and conditions as well. BFV's TAA is about the best I've seen at maintaining sharpness, and BFV is probably the worst showing for DLSS so far.

Anyone can wave their hand across their eyes and the fingers will blur a lot; in motion, our eyes do blur unless we track a moving object. That means some blur with TAA in motion camouflages well with how our own eyes behave, unless we follow an object on the screen with our eyes, in which case any blurriness becomes more apparent. Anyway, TAA is a relatively very good AA method when done right, which BFV did. With BFV's DLSS, the image is just downright soft/blurry even standing still, and the limitations on usage strike it out.
 