More DLSS...

I find the reflections too clear in games with RTX on, though maybe it's more accurate overall. It seems a bit off to me. Metro Exodus, I feel, did a great job with its lighting since RTX there did more than reflections. But I'll redownload this game and give it a go anyway, if only to see the performance drop.
 
I find WolfYB to be the best implementation of DLSS and RT in a game yet. My opinion, of course.

Very immersive.
 
Which means the path forward for RTX has become much clearer, and others can follow suit using it as a good standard for what to expect and what can be achieved. I hope this will help Ampere succeed in pushing gaming quality and rendering forward. I'm also wondering how AMD is tackling DXR and whether they have anything unique to contribute on upscaling quality. Hard to argue with the quality and capability of DLSS in this game; it's really amazing. If more developers incorporated good DLSS options, it would be a big plus for gamers in the end. Next month will be interesting, from both AMD and Nvidia.
 
Wolfenstein YB is probably one of the better implementations of Ray Tracing and DLSS so far.
 
It is indeed the best showing of DLSS.

For ray tracing, it's kind of a weird situation. For performance, it isn't really any better than anything else; it's still a 40-50% performance hit. What saves it is that the engine's general performance is so high to begin with that even a 50% cut usually still means playable framerates at decent resolutions with ray tracing enabled.

Visually, the ray tracing in this game is probably a bit more refined than in some other games in terms of general accuracy. So that's good. For future games that don't use such highly performant engines, we definitely want accuracy as high as possible. I mean, if we are going to take the performance hit, it had better look right. Using Control as an example: RT in that game generally looks pretty great. But there are situations where little objects are strewn about on the floor, and the shadowing on those can definitely be wrong.
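To put the arithmetic in one place (the baseline framerates here are made-up examples, not benchmarks), the same percentage hit leaves very different absolute framerates depending on how fast the engine is to begin with:

```python
# Why a fast engine "survives" ray tracing: a 40-50% hit on a high
# baseline still lands at playable framerates, while the same hit on a
# slower engine does not. Baselines below are hypothetical.

def fps_with_rt(base_fps: float, hit_pct: float) -> float:
    """Framerate after an RT performance hit given as a percentage."""
    return base_fps * (1 - hit_pct / 100)

for base in (200, 120, 70):      # hypothetical baseline framerates
    for hit in (40, 50):         # the 40-50% hit discussed above
        print(f"{base} fps base, {hit}% hit -> {fps_with_rt(base, hit):.0f} fps")
```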
 
I'd have a lot less to argue about if it actually were in 90% of games... BUT... it's closer to 0.2% instead... Granted, not everything is a AAA title, but neither are half of the "supported" games.

13,049 games available on Steam and Nvidia cites 22 games with RT and 29 that support DLSS. Bravo, Nvidia, well played...
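For what those counts work out to against the catalog size cited above:

```python
# Share of the Steam catalog covered by RT and DLSS support,
# using the figures from the post (13,049 games; 22 RT; 29 DLSS).

steam_games = 13_049
rt_games = 22
dlss_games = 29

rt_share = rt_games / steam_games * 100
dlss_share = dlss_games / steam_games * 100
print(f"RT:   {rt_share:.2f}% of Steam's catalog")   # ~0.17%
print(f"DLSS: {dlss_share:.2f}% of Steam's catalog") # ~0.22%
```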

The problem is getting developers to jump on board and build RTX features into their games. Nvidia can't make anyone use a feature... all they can do is make the technology and hope it gets adopted.... like SLI. :p
 


8:42 - 29% to 49% performance hit depending on the game. Quite frankly, unacceptable in a $999-1300 card. You're dropping framerates into the realm of a $400-500 card. I can suspend a lot of disbelief with "marginal" rasterized lighting performance to save $500-800.

34% to 49% with the 2060 which is the difference between playable and unplayable in many cases.
 

That’s such a weird response since plenty of settings can drop it that much or more... I prefer running medium/high + RT rather than no RT at all in the vast majority of games. There’s always a tradeoff. Given what it’s doing I am surprised it’s not more.
 

And there's scaling, resolution, etc. Many things one can do. I scale on a 4K 60 Hz monitor and get very nice results (usually).
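For anyone curious what a render-scale slider actually does on a 4K panel (the scale values below are arbitrary examples), it just shrinks the internal resolution the GPU draws before the image is upscaled to the 3840x2160 output:

```python
# Internal render resolution for a given per-axis render scale,
# assuming a 3840x2160 (4K) output. Scale percentages are examples.

def internal_res(out_w: int, out_h: int, scale_pct: int) -> tuple[int, int]:
    """Internal resolution the GPU renders before upscaling to output."""
    return out_w * scale_pct // 100, out_h * scale_pct // 100

for scale in (100, 75, 67, 50):
    w, h = internal_res(3840, 2160, scale)
    print(f"{scale:>3}% render scale -> {w}x{h}")
```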


[Screenshot: Wolfenstein Youngblood, 2020-02-08]
 
8:42 - 29% to 49% performance hit depending on the game. Quite frankly, unacceptable in a $999-1300 card. You're dropping framerates into the realm of a $400-500 card. I can suspend a lot of disbelief with "marginal" rasterized lighting performance to save $500-800.

34% to 49% with the 2060 which is the difference between playable and unplayable in many cases.

You must be too young to remember how new features always pushed hardware.
AA was a killer at first.
Shaders tanked performance far worse than RTX does today.
Going from 16-bit to 32-bit color was also a pickle at the start.


Perhaps you should go the console route; it would save you a lot of agony, since you treat the RT performance hit as if nothing like it has ever happened before...
 
That’s such a weird response since plenty of settings can drop it that much or more... I prefer running medium/high + RT rather than no RT at all in the vast majority of games. There’s always a tradeoff. Given what it’s doing I am surprised it’s not more.

Yup, the complaints about the performance hit are valid, but the same folks should be complaining about ultra settings and 4K too.

4K is the biggest waste of processing resources and has the worst ROI I've seen in my many years of gaming. Yet people are spending a ton of money to get there. The holy grail of running everything at "ultra" is also a stupid waste of money for no gain in most titles.

RTX is a bargain in comparison. Ray tracing at 1440p has far more visceral impact IMO than simply bumping the resolution over 2x for little benefit on most monitors.
 
I came to a similar conclusion: 1440p HDR (for games that support it well, e.g. FC5 and Shadow of the Tomb Raider) was much better than 4K SDR.

The quality of fewer pixels can outweigh the quantity of pixels. Add in better fps and larger FreeSync ranges and it's even better. Until 4K gets fast HDR, VRR monitors, and GPUs that can push them at a decent cost, 1440p is the sweet spot, and 3440x1440 is even better.
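The raw pixel math behind the 1440p-vs-4K tradeoff, for reference (4K pushes 2.25x the pixels of 1440p):

```python
# Pixel counts for the resolutions discussed in this thread,
# relative to 2560x1440.

resolutions = {
    "1440p":     (2560, 1440),
    "3440x1440": (3440, 1440),
    "4K":        (3840, 2160),
}

base = resolutions["1440p"][0] * resolutions["1440p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>9}: {pixels:>9,} pixels ({pixels / base:.2f}x of 1440p)")
```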
 
Yup the complaints about the performance hit are valid but the same folks should be complaining about ultra settings and 4K too.

4K is the biggest waste of processing resources and has the worst ROI I’ve seen in my many years of gaming. Yet people are spending a ton of money to get there. The holy grail of running everything at “ultra” is also a stupid waste of money for no gain in most titles.

RTX is a bargain in comparison. Raytracing at 1440 has far more visceral impact IMO than simply bumping the resolution over 2x for little benefit on most monitors.
And how much does it cost right now to do 60fps at 1440p internal res with ray tracing? You're still looking at the same cards and money as 4K. Ray tracing is currently also a very bad ROI.

My RTX 2060 is considered a 1440p card. But with ray tracing, most of the time you are under 1080p in order to stay around 60fps. For a mostly very small change in visuals. At this point, it's like I'm still using the 7870 this 2060 replaced.
 

Very small?

https://hardforum.com/threads/more-dlss.1990492/post-1044481997
 
I really need to give DLSS another try. I tried it in some Battlefield game... I think BF5... and it was a terrible implementation. It just looked blurry, like somebody upscaling a 1080p image to 1440p. I've been hooking up my laptop to my monitor, thinking I could just sell my 2080 Ti desktop and keep my 2080 Max-Q laptop as my main gaming computer, but certain games just run terribly at 1440p with RTX on... mainly Metro Exodus at the moment. Even with RTX off, it's hard to play Exodus on my laptop at 1440p, maxed out in every other respect, and get 60fps. It'd be nice to have just my laptop as my main rig for everything, but it's not up to my standards for performance, especially for really demanding games, even though it's the most powerful gaming GPU you can get in a laptop at the moment short of a full 2080, which I couldn't do because there'd be virtually no battery life and they're thick and heavy... might as well not even have a laptop at that point.
 
I think BF5 and it was a terrible implementation.
BF:V is a terrible implementation of most things.

DLSS was deployed here primarily to augment RT, and for that it has been improved significantly since release -- but both were shoehorned into an engine designed for DX11, not the DX12 that RT requires, and I've yet to see EA / DICE do DX12 well.

In BF:V you're probably better off using some form of upscaling in place of DLSS.
 


Well done. It drags a bit at the end with the back-patting and griping about NV's marketing (wow, corporations lie?!).

Also, saying to use DLSS without RTX in WolfYB... because it's a shooter?
Seriously, RTX in that game is very good and can most certainly be 100% enjoyed regardless of how "fast" you play the game.
Individual preferences and all that...

Still no guarantees about the future, obviously, or about who will implement what.
But it's here, the potential is very high, and the pressure is now on AMD if it becomes more widely adopted.
 


If this is accurate, it is game-changing news for DLSS.

DLSS 2.0, as he is calling it, works the way I thought DLSS was going to work initially: it no longer requires per-game training. They are also using the Tensor cores again.

He also considers Balanced/Quality mode essentially the same as native.

It remains to be seen whether all DLSS implementations going forward work as well as Wolf:YB's.

But if they do, and it really requires no individual game training, then DLSS usage should explode, even in games without ray tracing, giving them a free performance leg up in upcoming games.
 
DLSS 2.0 as he is calling it, works as I thought DLSS was going to work initially. It no longer requires per game training. They are also using Tensor cores again.

Same, and honestly Nvidia could have gained some credibility by waiting to release it until they had DLSS working as a more general solution.
 

Yeah, like bilking a whole generation of card owners only to have it working right before they release the next gen?
 
DLSS 2.0 looks like a serious improvement and will be easier to implement, but it still doesn't escape one of the main issues from when it was first released: it's reliant on TAA quality, which to me lowers effective resolution with its smear effect.
As pointed out in the video, MSAA at native res trumps it.
Obviously the impact on framerate is huge, but MSAA brings the best out of UHD res in a way DLSS doesn't, even with sharpening.
The difference is no longer night and day, so I will be sure to use it, but that brings up the next point: when can you use it?

DLSS 2.0 isn't a universal solution; support for it has to be built into the game, despite it using a generic AI model.
Games supporting previous versions of DLSS cannot use it either.
Support is sure to improve, but right now there is very little upcoming.
There will be a huge number of games, old and new, that will never see support for it.

Some part of me wishes the die space used for the Tensor cores had been dedicated to a better form of AA instead, one that could be used with any game.
But the max-fps race wouldn't benefit from that, and seeing as it's the major driving force behind sales, I suppose it's better to be thankful for what we have got.

I would like to see what DLSS 2.0 at native UHD res looks like; that 'might' be worth something.
Can this be done using DSR with DLSS 2.0?
 
This is mostly a consideration where framerates at UHD aren't up to par, for whatever reason, and is an alternative to lowering in-game settings, potentially including resolution.

Which is what you'd have to do otherwise to get performance 'up to par'.


So, for where to go next, I'd like to see two improvements:
  • First, it'd be nice to see DLSS working to intelligently upscale between any pair of lower and higher resolutions. The further apart they are the worse it'll look, but it should be possible to make it always look better than some form of upscaling and shader-based sharpening.
  • Second, I'd like to see Nvidia figure out how to provide DLSS without developer involvement. I assume that such an external solution would be less effective, but I'd also expect them to find a way to provide a benefit over standard upscaling.
I expect both of those are on Nvidia's radar. As it stands, DLSS and the Tensor cores, while perhaps not entirely irreplicable by others, are currently an advantage that, when applied, should provide benefits up and down their product stack.
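As a point of reference, the "standard upscaling and shader-based sharpening" baseline that any DLSS-like solution has to beat can be sketched in a few lines. This is a toy NumPy illustration (nearest-neighbour upscale plus an unsharp mask), not anything Nvidia actually ships:

```python
# Baseline "dumb" upscale-and-sharpen: nearest-neighbour upscale
# followed by an unsharp mask (add back the difference from a blur).
# Toy sketch on a small grayscale image in [0, 1].

import numpy as np

def upscale_nearest(img: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour upscale by an integer factor."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

def unsharp_mask(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Sharpen by adding back the difference from a 3x3 box blur."""
    padded = np.pad(img, 1, mode="edge")
    blur = sum(
        padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

low = np.random.default_rng(0).random((8, 8))   # stand-in for a low-res frame
high = unsharp_mask(upscale_nearest(low, 2))    # e.g. "1080p -> 2160p" + sharpen
print(high.shape)                                # (16, 16)
```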

And for the 'pipe dream', I'd like to see Nvidia doing local profiling using machine learning. Not sure that it could be easily done transparently to the user, but it'd be interesting to have a provision for it, where a user's interactions are 'profiled' by comparing stock DLSS algorithms with normal output (i.e., differencing 1440p + DLSS with UHD) and testing refinements, while allowing users to opt in to sending their results to Nvidia.

The results of which could then be collated and culled for distribution as 'DLSS profiles'. Further, instead of pushing out all of these optimizations in full driver packages, Nvidia could automate it through Geforce Experience, where the user could control whether profiles are downloaded and then toggle whether they're used.
 
DLSS 2.0 looks like a serious improvement and will be easier to implement, but it still doesn't escape one of the main issues from when it was first released: it's reliant on TAA quality, which to me lowers effective resolution with its smear effect.
As pointed out in the video, MSAA at native res trumps it.
Obviously the impact on framerate is huge, but MSAA brings the best out of UHD res in a way DLSS doesn't, even with sharpening.
The difference is no longer night and day, so I will be sure to use it, but that brings up the next point: when can you use it?

Selective hearing amazes me.

This was NOT pointed out in the video, because the game doesn't even support MSAA (not many do these days).

What the video actually says is that, against any AA mode the game actually has, DLSS quality ranges from better than native down to equivalent to native.

Watch again starting around 14 minutes for the quality discussion.
 
You are right, I made a few errors; my apologies.
The game supports SMAA, not MSAA.

When he talked about the quality of SMAA T1X and TSSAA T8X, I thought he said "are great" when he actually said "aren't great". It's not very clear to my ear.
The following statement, where he said they "produce a bit of blur", I assumed must still be about DLSS because of my previous error; otherwise it made no sense.
On watching it again, after correcting the first error ("aren't great"), it became clear I had misinterpreted that as well and am wrong.
In my defence, the rate of speech was quite quick, with a lot of information to digest.
I should have watched again before commenting, though. My bad.
 

All cool. Hope I didn't come across as harsh. I thought you might be another of those people who just assume the negative for everything NVidia. So many here seem to see everything one way because of bias.

Glad that's not you.

To say I was disappointed by DLSS 1.0 would be an understatement of epic proportions. I had high expectations after the presentations/marketing, but considered it 100% pointless garbage when it landed. A massive over-promise and under-delivery from NVidia.

But DLSS 2.0 is showing a lot of promise.

Still, a part of me remains skeptical. I need to see quite a few more games performing like this, after being burned on DLSS 1.0.
 
But it's here, the potential is very high, and the pressure is now on AMD if it becomes more widely adopted.

🤣🤣🤣 AMD can't even figure out how to fix the black screens on their Navi GPUs, and you expect them to create an actual DLSS competitor? I'll predict it right now: they will make some half-assed attempt, then say it's open source and wash their hands of it, like they've often done in the past.
 
I'm a happy 1080 Ti owner, with a 980 Ti and a GTX 580 before that, so no negative bias. My only annoyance has been the pricing of the 20xx series.
My last AMD card was three cards ago; I was forced to sell it after a bug drove me nuts.
It's been plain sailing since.

DLSS 2.0 looks promising; a new card later this year will let me try it.
 
Great video, and now a very potent potential for RTX cards. An Nvidia fine wine that blows away any fine wine (real or not) AMD had. Basically, games that support DLSS (2.0, as Unboxed named it) get a 30-40% performance boost for basically the same if not better rendering quality. Hell, that is a generational jump right there. He talked about it being driver-triggered; I wonder if some games will have a driver-level DLSS switch/override available?

Now for benchmarks: if the rendering quality is equivalent, then performance with it both on and off should be shown against AMD, along with on/off numbers for any equivalent AMD method.

I would expect many more games to incorporate DLSS. Nvidia does own the discrete market, and RTX card numbers are increasing, so it's free performance for smoother gameplay that developers can deliver. This makes Ampere way more interesting, at least to me, whenever Nvidia can deliver it.
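Just to spell out what a 30-40% boost means at some hypothetical baseline framerates (illustrative numbers, not measurements):

```python
# Effective framerate with a 30-40% DLSS performance boost,
# at hypothetical native baselines.

def boosted(base_fps: float, boost_pct: float) -> float:
    """Framerate after a percentage performance boost."""
    return base_fps * (1 + boost_pct / 100)

for base in (45, 60, 90):
    lo, hi = boosted(base, 30), boosted(base, 40)
    print(f"{base} fps native -> {lo:.0f}-{hi:.0f} fps with DLSS")
```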
 
I tried BF5 with DLSS and RTX @ 3440x1440 about a year ago, and the blurriness made me feel nauseous. I quit playing because of the severe stuttering and hitching in DX12 mode. I reinstalled BFV recently, and I have to say, even if BF5's DLSS isn't the best implementation, it has improved enough to reduce the blurring and make it playable. The 3440x1440 Ultra preset with DXR medium and DLSS gives me about 80fps average on my 2080. Best of all, no more DX12 lagging. That said, I only really notice the real-time ray traced reflections on certain maps.
 

BF5 is almost exactly the type of game you don't want to showcase ray tracing with. The last place most people are going to care about shadow and reflection quality is a competitive multiplayer shooter.

But NVidia had to go with what was "ready" at the time.
 

I'd actually like to see more classic games rebuilt with it (like Q2). Even REALLY old games. There's something cool and surreal about games that obviously don't look realistic but have very realistic lighting applied. At least I find it very interesting to look at. I'd love to see System Shock Enhanced (not the new remake, but the enhanced 1994 game) with added RT. Things like that. Maybe Q3A. Sure, modern games with it are great too, but I like the strange appearance of old+RT.
 

I'd draw the line at fully 3D games. SS is one of those old games with sprite-based characters, not 3D-modeled ones.

I want Jedi Outcast and Academy. I believe the source is available for those games and the Q3 engine they're based on.
 