At least you can use RTX in locked 30 or 60 fps mode in games like Metro and take advantage of it, but DLSS is completely useless, unfortunately. Whether it gets better over time, who knows, but I have a feeling NVIDIA is going to quietly drop DLSS in its next architecture and it will be a forgotten feature. Hopefully the piss poor Turing sales spur NVIDIA to build another GPU like Pascal that is 100% for gaming and not some cut-down pro rendering card to pad their margins using their customers as suckers. Anyone who is even remotely unbiased will recognize that Turing is a turd and DLSS is hot garbage.

I sat this generation out because Turing didn't offer anything that interested me and these reviews just continue reinforcing my decision. NVIDIA better hurry up and step it up before Intel gets their act together and comes after them in a big way.
 
At least you can use RTX in locked 30 or 60 fps mode in games like Metro and take advantage of it, but DLSS is completely useless, unfortunately. Whether it gets better over time, who knows, but I have a feeling NVIDIA is going to quietly drop DLSS in its next architecture and it will be a forgotten feature. Hopefully the piss poor Turing sales spur NVIDIA to build another GPU like Pascal that is 100% for gaming and not some cut-down pro rendering card to pad their margins using their customers as suckers. Anyone who is even remotely unbiased will recognize that Turing is a turd and DLSS is hot garbage.

I sat this generation out because Turing didn't offer anything that interested me and these reviews just continue reinforcing my decision. NVIDIA better hurry up and step it up before Intel gets their act together and comes after them in a big way.

You can't hit anywhere near 60 FPS with ray tracing at 4K unless you drop the render resolution or let DLSS do the upscaling. It's entirely useful in that regard.
 
I think DLSS would be fine for foveated AA, similar to what you'd get if you modified this split-screen post-process so the display had a diamond-shaped central area that received MSAA while the resolution outside the diamond used DLSS: horizontal splits with some angled splits to make a rough diamond, eye-like shape. Basically sharper AA in the focal area, with the periphery getting the lower, blurrier DLSS-style AA. I suspect DLSS was designed to help with RTRT noise, but also for VR foveated rendering, which it's fairly ideal for. In theory you could still do some type of post-process foveated rendering on a desktop display as a peripheral AA effect, if those SweetFX examples were a bit more adaptable and could inject two separate post-process AA types into the different regions (a rough sketch of the idea is below the link).
https://delightlylinux.wordpress.com/2014/01/20/sweetfx-splitscreen/
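
For what it's worth, here's a minimal numpy sketch of that diamond-mask compositing idea, assuming you already had two differently anti-aliased full-resolution buffers to blend (which nothing actually exposes today); the function names and the 0.4 diamond extent are made up purely for illustration.

```python
import numpy as np

def diamond_mask(height, width, half_extent=0.4):
    # 1.0 inside a screen-centred diamond (L1-norm distance), 0.0 outside it
    ys = np.linspace(-1.0, 1.0, height)[:, None]
    xs = np.linspace(-1.0, 1.0, width)[None, :]
    return (np.abs(xs) + np.abs(ys) <= 2.0 * half_extent).astype(np.float32)

def composite_foveated(sharp_aa, cheap_aa, half_extent=0.4):
    # keep the sharp-AA buffer inside the focal diamond, the cheap one outside
    mask = diamond_mask(sharp_aa.shape[0], sharp_aa.shape[1], half_extent)[..., None]
    return mask * sharp_aa + (1.0 - mask) * cheap_aa

# dummy 1080p RGB buffers standing in for an MSAA pass and a DLSS-style pass
frame_msaa = np.random.rand(1080, 1920, 3).astype(np.float32)
frame_dlss = np.random.rand(1080, 1920, 3).astype(np.float32)
blended = composite_foveated(frame_msaa, frame_dlss)
```

A real version would obviously blend with a soft falloff instead of a hard edge, but the shape of the idea is the same.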

Or you can just use resolution scaling which does a cleaner job for the same fps hit.
That's true, but that doesn't minimize ray tracing noise.
 
Thanks, great info. I wonder how the 2060 will fare with the additional VRAM needs; still, it will be rendering at a lower resolution and upscaling, so probably OK. Makes you wonder how 2x DLSS at native 4K resolution will fare, since I would think it is the same data; 1 GB is my best guess, plus RTX's additional RAM requirements (rough numbers below).
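
Purely as a guess-check on that 1 GB figure, here's some back-of-envelope buffer arithmetic; the 1440p internal resolution and the RGBA16F/RG16F formats are my own assumptions, not anything measured from the driver.

```python
# Back-of-envelope only: plausible per-frame buffers a 4K DLSS pass might need
# (colour + motion vectors at the assumed internal resolution, plus a 4K output).
def buf_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / (1024 ** 2)

colour_in = buf_mb(2560, 1440, 8)   # RGBA16F colour at the assumed internal resolution
motion    = buf_mb(2560, 1440, 4)   # RG16F motion vectors
output    = buf_mb(3840, 2160, 8)   # RGBA16F 4K output
total = colour_in + motion + output
print(f"{colour_in:.0f} + {motion:.0f} + {output:.0f} = {total:.0f} MB, "
      "before the network's own weights and activations")
```

That lands around 100 MB of frame data, so most of any extra footprint would presumably come from the network itself and whatever the driver keeps resident.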

Now, I was impressed with what I saw in 3DMark with DLSS; that looks like the best-case scenario, and I assume more learned data could be downloaded for BFV for more specific levels and conditions as well. BFV's TAA is about the best I've seen at maintaining sharpness, and BFV is probably the worst showing for DLSS so far.

Anyone can wave their hand across their eyes and the fingers will blur a lot. In motion our eyes do blur unless we track a moving object, so some blur with TAA while in motion is camouflaged well by our own eyes' behaviour; if we do follow an object on the screen with our eyes, though, any blurriness becomes more apparent. Anyway, TAA is a relatively very good AA method when done right, which BFV did. With BFV's DLSS, the image is just downright soft/blurry even standing still, and the limitations on usage strike it out on top of that.
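
For anyone curious why TAA softens moving detail at all, here's a toy sketch of the history blend at the heart of it; real implementations reproject the history with motion vectors, which this deliberately skips to show the worst case, so treat it as an illustration rather than how BFV's TAA actually works.

```python
# Toy illustration: TAA blends each frame with accumulated history
# (out = alpha * new_sample + (1 - alpha) * history). With no reprojection,
# a 1-pixel feature that moves every frame smears across its path.
import numpy as np

def taa_accumulate(frames, alpha=0.1):
    history = frames[0].astype(np.float32)
    for f in frames[1:]:
        history = alpha * f + (1.0 - alpha) * history
    return history

# a single bright pixel that moves one position per frame
frames = [np.roll(np.array([0, 0, 1, 0, 0], dtype=np.float32), i) for i in range(4)]
print(taa_accumulate(frames))  # the energy is now spread over several pixels
```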


Should be OK. The best I could find shows around 5 GB of VRAM and 11 GB of system RAM. Skip to 3:45 to avoid the CPU bottleneck:


DLSS off does show a little more usage, but I have to wonder if 1080p DLSS is closer to 720p upscaled visuals, since 4K DLSS looks closer to 1440p upscaled.
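
The pixel math at least comes out the same either way, for what that's worth; the internal resolutions here are just the commonly reported ones, not something I can verify.

```python
# If 4K DLSS renders internally at 1440p and 1080p DLSS at 720p, both use the
# same ~44% of the output pixel count.
pairs = {
    "1080p out / 720p internal": ((1920, 1080), (1280, 720)),
    "4K out / 1440p internal":   ((3840, 2160), (2560, 1440)),
}
for name, ((ow, oh), (iw, ih)) in pairs.items():
    print(f"{name}: {iw * ih / (ow * oh):.1%} of the output pixels")
```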
 
Or you can just use resolution scaling which does a cleaner job for the same fps hit.

This will be the third time I've said it - I've switched between the two, and DLSS in Metro is superior from what I can see compared to 60% shader scaling, which nets the same performance.

In BFV it's absolutely piss poor though and I agree.
 
I was hoping the use of RTX in Exodus would be better than in BFV, as global illumination is bound to add more to a shooter than reflections do. For the most part, this seems to be the case. There are some issues with dark areas, but that seems to be more level/game design than the feature just not working.

DLSS seems to be better in Metro, but Metro also isn't as dynamic as BFV. DLSS, like all upscaling, suffers from the fact that you only have so much detail with fewer pixels. You can only do so much with a given dataset; a native-res frame will always look better than an upscaled frame. Whether it's noticeable in a given game setting is up to the user.

The RTX series is a mess.

My hands-on time with an RTX card is limited, as I only had a couple of days with an AIB OC'd 2080. A friend was good enough to loan me the card so I could decide if it was worth the upgrade from my decently clocking 1070. My 1070 will best a stock 1080 FE in the games I play, which includes modern titles and goes back a ways. I have a 4K/60Hz : 1080p/120Hz screen, but no VRR. Not the best, but it has great PQ and won't be upgraded until HDMI 2.1 is sorted. This 2080 is what I would consider the minimum performance bump to justify an upgrade; ymmv. The 1070 was just under $420 to my door over 2 years ago; add the $80 waterblock and we're at $500.

So what are my options?

2080 vanilla - @$700, which is the same as I would have paid for a 1080 Ti 2 years ago had I been so inclined (it was more RL distracting me from PC upgrades during the launch than disinterest in the card). This isn't as fast as the 2080 I tested. Thanks again, Kyle/Brent, for including the different flavors in your RVII review.

2080 AIB OC - >$800, or essentially twice what I paid for my current card, OVER 2 YEARS AGO. It also just equals the performance of the 1080Ti I could have got 2 years ago.

Used 1080 Ti - I'm not paying $500-600 for a used card. I witnessed a 1080 Ti FE changing hands for $600 last week.

Radeon VII - @ $700, one of these under a full-coverage block is intriguing. Whatever I get is getting added to the loop, so that's worth considering. It all depends on whether they can keep up the 2K clocks without killing the HBM clocks. I actually could use the VRAM for some of my video editing projects, so that's also a consideration. The Vulkan performance is disappointing, but not unexpected given the lower CU count. The value here for creatives who can get their hands on one cannot be overstated.

None of these options sit well. I just want a video card upgrade, dammit. I used to upgrade every year, then moved to every 2 years when product cycles slowed. I'd love to go top of the line now, but there is no way I'm spending as much on a video card as I would on a brand-new, upgraded turbocharger for one of my cars. Anything over $700 is just a non-starter, with most gamers I know setting that bar much lower.

I guess we got spoiled by the 9- and 10-series cards. If you were previously using low-end cards and want to go high-end now, I'm sorry. If you got a 1070-1080 Ti (or even a 290/390X) at launch, your options suck for upgrades unless you just have to have the latest. RT/DLSS just isn't worth the premium nVidia decided to levy with this current generation. I know the die sizes are huge, but that's nVidia's problem, not mine.
 
From the recent Jayztwocents video showing the hit on ultra wide. Here is a high res pic.
Screenshot_20190218-215447_YouTube.jpg


 
Again, I've switched between them in Metro on my own system, and DLSS absolutely looks better compared to 60% scaling.

In BFV though it looks -terrible-. To the point that they should have just scrapped the feature.

Results may vary. The two screens were taken with DLSS on and off. You could probably match fps of DLSS with 95% scaling in this instance.

Screenshot_20190218-231745_YouTube.jpg

Screenshot_20190218-231752_YouTube.jpg


Both pics are running 1080p. Feel free to come to your own conclusions on what looks better.

 
Results may vary. The two screens were taken with DLSS on and off. You could probably match fps of DLSS with 95% scaling in this instance.

View attachment 142899
View attachment 142900

Both pics are running 1080p. Feel free to come to your own conclusions on what looks better.



I should note that I'm speaking of the 2080 Ti at 4K. You need to drop shader scaling to the 60% mark at 4K with high ray tracing to match DLSS, which on the 2080 Ti is rendering at 1440p.
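
If it helps to see why those two land in the same performance ballpark, here's the raw pixel math; I'm assuming Metro's shader scaling slider is a per-axis percentage and using the commonly reported 2560x1440 DLSS input, so treat it as a sketch rather than gospel.

```python
out_w, out_h = 3840, 2160
native = out_w * out_h
scaled = (out_w * 0.60) * (out_h * 0.60)   # 60% per-axis shader scaling
dlss_in = 2560 * 1440                      # reported DLSS input resolution on the 2080 Ti
print(f"60% scaling: {scaled / native:.0%} of native ({scaled / 1e6:.1f} MP)")
print(f"DLSS input:  {dlss_in / native:.0%} of native ({dlss_in / 1e6:.1f} MP) "
      "plus the tensor-core reconstruction pass")
```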

I'm sure that at lower resolutions DLSS isn't going to have as much impact or generally be worth it. It feels as though Nvidia is offering it on these lower-end parts and resolutions just as a selling point, and it isn't even a feature actually worth using.
 
At 4K, DLSS does indeed give about a 50% boost in fps. I don't see an upscaled resolution that matches that fps being able to match the visuals.
 
Not really; as nVidia has said, the A.I. improves DLSS over time. So if something is in a constant state of change and improvement, it's still in the race and cannot possibly fail. So no, DLSS is not a fail. Now, if you want to talk about AMD Radeon, then the word "fail" would be much more appropriate to use.

I am sure that after today, nVidia and EA will make sure those results are gone from the game.
Wow, someone actually posted company-style propaganda with a straight face.
DLSS is a bunch of over-promising and under-delivering.
 
Just a heads up, they are a known biased anti-nVidia website. Not a place you want to get your information or news from.
.........

Redacted after reading Kyle's warning. I don't need yet another ban for calling a hammer a hammer...

But to be more constructive and OT, why did nVidia think that they would get away with this, if in-game scaling looks better and performs the same? Could DLSS be broken somehow, or is it simply TWIMTBP?

To be fair, I do not own these two games and have only seen it in 3DMark, where it looked fine (not amazing, but fine) and did perform much better, at least to the point of acceptable frame rates with DXR enabled on my lowly 2070.
 
Results may vary. The two screens were taken with DLSS on and off. You could probably match fps of DLSS with 95% scaling in this instance.

View attachment 142899
View attachment 142900

Both pics are running 1080p. Feel free to come to your own conclusions on what looks better.
It's not just the pixel smear; there is a brown smear all over the gun, and darker shades are replaced with lighter ones, so the image looks washed out.
Colours are less realistic, more cartoon-like.
I called out the loss of colour detail with DLSS from the early pixelated shots before the cards were released; I feared it was a ruse then.

I don't see the point or value of NVidia-side AI optimisation; it is a backward step for IQ.
Not something I intend to buy into.
 
An RTX feature that is worthless is one less reason to buy RTX. What the hell was Nvidia smoking?
 
I am personally skeptical of judging AA methods from still images. I'd have to see it in person... not running out to buy Metro, though.

So far ray tracing seems great and DLSS a letdown. I am just hesitant to condemn something without seeing it in person.
 
I remember 3dfx Glide and RRedline for my Rendition card... then PhysX... GameWorks, TressFX... FreeSync and G-Sync... now it's RTX/DLSS or whatever. All of these are buzzwords that mean $$$ out of your pocket for new hardware, and it's still rare to find a game with decent AI, playability, story, blah blah... lol... it's all about eye candy.
 
So no, DLSS is not a fail. Now, if you want to talk about AMD Radeon, then the word "fail" would be much more appropriate to use.

Why did you work in a slam against AMD? Just curious, as it's so totally off-topic, and glaringly inconsistent with your claims of being both unbiased and anti-bias.
 
This will be the third time I've said it - I've switched between the two, and DLSS in Metro is superior from what I can see compared to 60% shader scaling, which nets the same performance.

In BFV it's absolutely piss poor though and I agree.

Have you tried screenies and zooming? What resolution? But fine, it works okay in an easily predictable and optimisable tunnel shooter. Maybe racing games too or similar?
Other than that it's not looking very good for DLSS at all for a majority of games which aren't tunnel/rail/course based or mostly fixed-POV.
 
The article is clickbait for AMD fanbois. Ignore and drive on. This is why I don't pay attention to any of the YouTube "journalists". They are paid based on the number of views they get, so they have an incentive to be controversial or inflammatory. None of this negativity or AMD vs Nvidia BS is healthy for the community or for technology in general.
 
Well, when you trust a guy in A LEATHER COAT THAT LOOKS LIKE IT CAME FROM 1990, THIS IS WHAT YOU GET LOL. Thanks, Jasper, for the $1200 paperweight! lol.
 
YouTube journalism is worse than the tabloids. In the age of the internet, anyone with a webcam can become a "Hardware Expert", and when cagey posts links on this site it does nothing other than support the general level of ignorance.

It's really sad. I'd rather go back to the days when it was just Kyle, Tom, and Anand, when Kyle called people out for posting bullshit or inaccurate data.

What you end up with these days is more videos like the one the Verge posted on building a gaming PC, except the mistakes and inaccuracies are less evident, so people accept them as gospel.

Who cares what a third party that only unboxes hardware says; even Bit Witt can do that for hits.
 
What I was hoping for, at least at the beginning, was DLSS @ 4K performing like 1440p while having better IQ, and for that IQ to improve over time, approaching native 4K quality.

But it's the exact opposite: performance isn't much better, and IQ is far worse.

BTW if the neural network does all the AI learning, why do we need the tensor cores in the first place?
 
I have a feeling NVIDIA is going to quietly drop DLSS in its next architecture and it will be a forgotten feature. Hopefully the piss poor Turing sales spur NVIDIA to build another GPU like Pascal that is 100% for gaming and not some cut-down pro rendering card to pad their margins using their customers as suckers

There are people who need Tensor cores.
For example, deep learning work along the lines of Google's TPU logic,
and software such as Alpha Zero & Leela Chess Zero.

I think NVIDIA didn't go about this release in the right way,
but there is a demand for machine learning using Tensor cores.
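
For context on what those cores actually accelerate (a rough sketch, not NVIDIA's real hardware path): a tensor core performs small mixed-precision matrix multiply-accumulates, which is the inner loop of evaluating networks like the chess engines mentioned above. A plain numpy stand-in for one 4x4 tile:

```python
# D = A @ B + C with FP16 inputs and FP32 accumulation, the operation a
# Turing tensor core performs on small tiles; illustration only.
import numpy as np

A = np.random.rand(4, 4).astype(np.float16)   # FP16 input tile
B = np.random.rand(4, 4).astype(np.float16)   # FP16 input tile
C = np.zeros((4, 4), dtype=np.float32)        # FP32 accumulator

D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D)
```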
 
Have you tried screenies and zooming? What resolution? But fine, it works okay in an easily predictable and optimisable tunnel shooter. Maybe racing games too or similar?
Other than that it's not looking very good for DLSS at all for a majority of games which aren't tunnel/rail/course based or mostly fixed-POV.

Metro has some larger environments, and that's where you see the largest FPS boost.
 
I am taking a wag that DLSS probably works a lot better for games like TF2, games without realistic textures. AI upscaling works literal miracles on comic art, but I've not seen great results on photos.
 
There are people who need Tensor cores.
For example, deep learning work along the lines of Google's TPU logic,
and software such as Alpha Zero & Leela Chess Zero.

I think NVIDIA didn't go about this release in the right way,
but there is a demand for machine learning using Tensor cores.

This. I honestly think nVidia saw that their generational performance increases weren't enough to justify the price hike they wanted, so they shoehorned the tensor cores in and half-baked some features that take advantage of them to give more perceived value. Conceptually, real-time ray tracing is cool for a number of genres (imagine a Thief game that takes advantage of it). Where they missed the boat is with gamer priorities: the number 1 and 1a most important features to PC gamers are 4K@60+fps and 1440p@120+fps. If ray tracing worked at those performance targets, we would be having an entirely different discussion.
 
most important features to PC gamers are 4K@60+fps and 1440p@120+fps. If ray tracing worked at those performance targets, we would be having an entirely different discussion.

I don't know about that; I think most people would be happy if RTX did just work, even if it came with a hefty performance hit. We are hitting the reality that it doesn't just work: it takes a lot of work to implement, in a world where triple-A titles already take enormous numbers of man-hours to create. This means adoption is going to be slow and limited to titles where nVidia pays for said adoption, and with such limited market penetration there is little temptation for developers to spend the time on these features.

DLSS is just silly. Maybe it will get better, but you can't charge a premium for a product marketed on DLSS, deliver this crap, and then not expect the deserved backlash.
 
There are people who need Tensor cores.
For example, deep learning work along the lines of Google's TPU logic,
and software such as Alpha Zero & Leela Chess Zero.

I think NVIDIA didn't go about this release in the right way,
but there is a demand for machine learning using Tensor cores.

Tensor cores belong in their pro cards, not gaming cards.
 
The article is clickbait for AMD fanbois. Ignore and drive on. This is why I don't pay attention to any of the YouTube "journalists". They are paid based on the number of views they get, so they have an incentive to be controversial or inflammatory. None of this negativity or AMD vs Nvidia BS is healthy for the community or for technology in general.

How exactly is talking about one of the primary pillars of Nvidia's RTX marketing clickbait? FFXV's use of DLSS is bad, BFV's use of it is horrible, and Metro's use seems okay but not great. As it stands right now, DLSS is a bad feature, and Nvidia put a ton of hype and marketing behind something that does not work as well as they claimed. Right now DLSS is a worse solution than FXAA, and the performance benefits are nowhere near worth the downgrade in image quality. That's not even getting into the utterly stupid way Nvidia and the game devs are segregating who can use it at specific resolutions. Nvidia excuses it by saying "it will get better". If it's a bad feature now, they should have waited to roll it out until it got to that "better" point. For as dubious as the benefits of RTX are right now, at least those features work properly.
 