NVIDIA DLSS 3 “Frame Generation” Lock Reportedly Bypassed

Am I the only person on earth that thinks DLSS, ALL DLSS, is a bit shit?
The technology is interesting (to me) as a “this is just cool on a maths, programming and proper hw/sw vertical-stack implementation” thing, but the application to gaming is total horseshit.

I am impressed how nvidia is able to pull off combining bleeding edge research, tight hardware + software integration and optimization, and roll all this out in a way that gamedevs can actually play with it and we can see the outcome (good or bad) in games launching soon (tm). Neither intel nor AMD has managed this even once in recent memory (maybe in a very very minor way AMD did this with HDR+AA on the HD2900 series way back when iirc?).

Still hate how this comes at the expense of proper research and development into gfx cores and raster perf though haha.

But to be completely fair, the 4090 isn’t a slouch at 4K raster perf either.
 
The technology is interesting (to me) as a “this is just cool on a maths, programming and proper hw/sw vertical-stack implementation” thing, but the application to gaming is total horseshit.

I am impressed how nvidia is able to pull off combining bleeding edge research, tight hardware + software integration and optimization, and roll all this out in a way that gamedevs can actually play with it and we can see the outcome (good or bad) in games launching soon (tm). Neither intel nor AMD has managed this even once in recent memory (maybe in a very very minor way AMD did this with HDR+AA on the HD2900 series way back when iirc?).

Still hate how this comes at the expense of proper research and development into gfx cores and raster perf though haha.
I think AMD and Nvidia realized early on that Raster performance wouldn't get them there and they both had to work on alternatives.
I am hoping MCM designs eventually get us to a point where raster and raytracing effects alone can be done without the need for DLSS and the like, but I fear it is a good number of years off.
Ray tracing isn't going anywhere, it streamlines too much of the development process and cuts down on man hours too much for studios to give it up any time soon. If anything they are going to push for more adoption of it.
 
Yeah I think what people may not realize is that FSR is basically using and improving on decades of university research into general upscaling algorithms. It’s basically taking ye olde “turn 1 pixel into 4 pixels” and accelerating this on a GPU in “real-time”. Obviously I’m over simplifying the actual algorithms and complexity here, but this is why FSR “just works” everywhere.
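To make the “1 pixel into 4” idea concrete, here’s a minimal sketch of a classic spatial 2x upscale in plain numpy. Purely illustrative: this is garden-variety bilinear filtering, not FSR’s actual edge-adaptive kernel.

```python
import numpy as np

def bilinear_upscale_2x(img: np.ndarray) -> np.ndarray:
    """Classic spatial 2x upscale: every source pixel becomes 2x2 output pixels,
    with the new values interpolated from the nearest source neighbours."""
    h, w = img.shape[:2]
    out_h, out_w = h * 2, w * 2
    # Map each output pixel centre back to a fractional source coordinate.
    ys = (np.arange(out_h) + 0.5) / 2 - 0.5
    xs = (np.arange(out_w) + 0.5) / 2 - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 1)
    y1 = np.clip(y0 + 1, 0, h - 1)
    x1 = np.clip(x0 + 1, 0, w - 1)
    wy = np.clip(ys - y0, 0, 1)[:, None, None]
    wx = np.clip(xs - x0, 0, 1)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

lowres = np.random.rand(540, 960, 3).astype(np.float32)  # stand-in for a 960x540 frame
highres = bilinear_upscale_2x(lowres)                     # -> 1080x1920x3
```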

Nvidia’s DLSS requires that an ML model is pre-trained by the game developer: basically (this is really over simplified) you’re pre-generating a set of patterns and situations which get processed at runtime based on your current frame, so you can “predict” possible outcomes and pull them from your model. Obviously much more complicated and smarter, but fundamentally the tensor cores are just “generating” potential frames and then selecting the best fit. This is why you get some strange glitching around reflections in water and non-deterministic scenes (or the devs just didn’t create and train a proper model).

I personally think nvidia went this route because they have tensor cores to sell (these are much more lucrative in the compute AI enterprise markets) and proper gfx cores are becoming less and less important. I also think it’s overall stupid to try to merge AI compute and gaming product lines like this but whatever - I’m a lowly engineer and not a CEO so what do I know.
AMD doesn't even hide it; it's in their own slides. The FidelityFX system uses what they call EASU, which is their image upscaling pass: it uses a combination of two or three different well-known upscaling methods to generate the image, then it performs its RCAS pass, which again uses two or three well-known and researched sharpening algorithms to sharpen the image.
Honestly, I prefer the AMD approach here; they aren't reinventing the wheel, and it is very clear and basic in what it does. Overall, for as complex as it is, the AMD FSR method is very much a KISS approach: a series of very simple methods with a decently fast (if complex) decision tree for choosing which methods to apply to which parts of a scene for the best overall result.
Nvidia throws all that out the window and leverages its AI knowledge base and toolset to generate custom solutions for its registered developer partners. Of course, custom solutions (when done correctly) are generally going to get better results with more features and customizations. But more isn't always better.

EASU (Edge-Adaptive Spatial Upsampling) is an upscaling pass that additionally includes edge reconstruction.
RCAS (Robust Contrast-Adaptive Sharpening) is a sharpening technique that extracts pixel detail in an upscaled image. FSR also includes utility methods for color space conversions, dithering, and tone mapping to make it easier to integrate it into today’s game rendering pipelines.
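For anyone curious what “contrast-adaptive sharpening” means in practice, here’s a rough numpy sketch of the idea (not AMD’s actual RCAS shader, just an illustration): a cross-shaped sharpening kernel whose strength is scaled down wherever the local min/max range is already large, so strong edges don’t get over-sharpened into ringing.

```python
import numpy as np

def cas_like_sharpen(img: np.ndarray, strength: float = 0.2) -> np.ndarray:
    """Toy contrast-adaptive sharpen on a float RGB image in [0, 1]."""
    p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    c = p[1:-1, 1:-1]                       # centre pixel
    n, s = p[:-2, 1:-1], p[2:, 1:-1]        # north / south neighbours
    w, e = p[1:-1, :-2], p[1:-1, 2:]        # west / east neighbours
    lo = np.minimum.reduce([n, s, w, e, c])
    hi = np.maximum.reduce([n, s, w, e, c])
    # Back off the sharpening where local contrast (hi - lo) is already high.
    amount = strength * np.clip(1.0 - (hi - lo), 0.0, 1.0)
    sharpened = c * (1.0 + 4.0 * amount) - (n + s + w + e) * amount
    return np.clip(sharpened, 0.0, 1.0)
```

The real shader is more careful than this, but “sharpen less where contrast is already high” is the gist.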
 
I'd rather play at 1440p than play with some bizarre simulated "Vaseline smear" filter in the form of DLSS, but I get you.
I dunno what DLSS you all have been using, but at 4K, DLSS Quality looks better than native with TAA applied and I get a frame boost on top of it. Now Balanced and Performance, ok, you CAN notice for sure, but Quality is damn near perfect in every game I have played and my preferred method of AA now.
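For context on where the frame boost comes from: DLSS renders internally at a lower resolution and reconstructs to your output resolution. These per-axis scale factors are the commonly quoted ones for the DLSS 2 modes (treat them as approximate); quick Python back-of-envelope:

```python
# Commonly quoted per-axis render scales for the DLSS 2 modes (approximate).
MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    scale = MODES[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in MODES:
    print(f"{mode:>17}: {internal_resolution(3840, 2160, mode)}")
# Quality          : (2560, 1440)   <- "4K DLSS Quality" is roughly 1440p internally
# Balanced         : (2227, 1253)
# Performance      : (1920, 1080)
# Ultra Performance: (1280, 720)
```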
 
Nvidia’s DLSS requires that an ML model is pre-trained by the game developer: basically (this is really over simplified) you’re pre-generating a set of patterns and situations which get processed at runtime based on your current frame, so you can “predict” possible outcomes and pull them from your model. Obviously much more complicated and smarter, but fundamentally the tensor cores are just “generating” potential frames and then selecting the best fit. This is why you get some strange glitching around reflections in water and non-deterministic scenes (or the devs just didn’t create and train a proper model).

I think that was DLSS 1; with 2.0+ it is a more generic training rather than a game-specific model that requires the game studio (or Nvidia) to train on a specific title for that title to support DLSS, a bit like how voice recognition used to have to be trained to your specific voice in the past but doesn't anymore.

Now Nvidia generates a giant number of 16K images at extremely detailed quality and trains on what regular real-time gaming looks like versus that high-quality reference, and the model has been trained for so long by now that it has become generic enough to work on a new title without title-specific training.
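In rough code terms, the setup being described is ordinary supervised training of one generic upscaler across many games: low-res aliased frames (plus motion vectors and history) in, very high-resolution reference renders as the target. The sketch below is a toy PyTorch stand-in; the network, inputs and data loader are hypothetical, nothing here is Nvidia's actual model.

```python
import torch
from torch import nn

class ToyUpscaler(nn.Module):
    """Hypothetical stand-in network: low-res colour + motion vectors + warped
    history in, 2x higher-resolution colour out via pixel shuffle."""
    def __init__(self, scale: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3 + 2 + 3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, lowres, motion, history):
        return self.body(torch.cat([lowres, motion, history], dim=1))

def training_batches(n: int = 4):
    """Dummy data loader: (low-res frame, motion vectors, history, hi-res reference)."""
    for _ in range(n):
        yield (torch.rand(1, 3, 135, 240), torch.rand(1, 2, 135, 240),
               torch.rand(1, 3, 135, 240), torch.rand(1, 3, 270, 480))

model = ToyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

# One generic model trained across many titles against very high quality
# reference frames; that is why a new game no longer needs its own model.
for lowres, motion, history, reference in training_batches():
    pred = model(lowres, motion, history)
    loss = loss_fn(pred, reference)
    opt.zero_grad()
    loss.backward()
    opt.step()
```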
 
The big takeaway for me regarding DLSS 3 is the increased latency; a lot of people told me that Reflex would resolve this issue, but that doesn't seem to be the case. I would rather use actual native frame rates at a lower frame rate than use this as it stands. Seems like an interesting start, but it reminds me more of DLSS 1.0 in terms of where it's at currently: it's a beta that's not ready for prime time. DLSS Quality I think is cool and works well in most situations, and sometimes looks better than native with TAA or other AA techniques, but rarely.
It also begs the question: are the 4080 cards fast enough for people to pay a premium? Or are people going to balk at the price and not see DLSS 3.0 as a reason to buy one? I'm kind of in the latter group at the moment. Buying a 4090 makes sense for its brute strength if you're playing at 4K; I don't see any point to the 4080 series if they can't beat the 3080 Ti, 3090 or 3090 Ti at native resolutions.
 
The big takeaway for me regarding DLSS 3 is the increased latency; a lot of people told me that Reflex would resolve this issue, but that doesn't seem to be the case. I would rather use actual native frame rates at a lower frame rate than use this as it stands. Seems like an interesting start, but it reminds me more of DLSS 1.0 in terms of where it's at currently: it's a beta that's not ready for prime time. DLSS Quality I think is cool and works well in most situations, and sometimes looks better than native with TAA or other AA techniques, but rarely.
It also begs the question: are the 4080 cards fast enough for people to pay a premium? Or are people going to balk at the price and not see DLSS 3.0 as a reason to buy one? I'm kind of in the latter group at the moment.
Everything I have seen has shown latency to be equal to or better than native with DLSS 3. But yes it has higher latency than DLSS 2. But if it's still better than native does it matter? Or have I missed one of the reviews where it is showing DLSS 3 to have higher latency than native?
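Rough mental model of why the numbers land where they do (purely illustrative arithmetic, ignoring render queues, CPU time and the display itself): frame generation doubles the frames you see, but input latency still tracks the rendered frame rate, plus the extra real frame the interpolator has to wait for. Reflex then claws some of that back, which is why "FG + Reflex" can still end up near or even below plain native without Reflex.

```python
# Illustrative arithmetic only, not measured data.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

rendered_fps = 60.0                  # hypothetical "real" frame rate
displayed_fps = rendered_fps * 2     # frame generation inserts one frame per real frame

base_latency = frame_time_ms(rendered_fps)   # ~16.7 ms tied to real frames
held_frame = frame_time_ms(rendered_fps)     # interpolation must wait for the *next* real frame

print(f"displayed FPS with FG: {displayed_fps:.0f}")
print(f"latency floor, native: ~{base_latency:.1f} ms")
print(f"latency floor, FG:     ~{base_latency + held_frame:.1f} ms (before Reflex savings)")
```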
 
Everything I have seen has shown latency to be equal to or better than native with DLSS 3. But yes it has higher latency than DLSS 2. But if it's still better than native does it matter? Or have I missed one of the reviews where it is showing DLSS 3 to have higher latency than native?
Check out the HUB video above; seems there are some latency issues. Edit: the video is on the previous page, check out the latency section.
 
I dunno what DLSS you all have been using, but at 4K, DLSS Quality looks better than native with TAA applied and I get a frame boost on top of it. Now Balanced and Performance, ok, you CAN notice for sure, but Quality is damn near perfect in every game I have played and my preferred method of AA now.
I wouldn't go that far. Native 4K looks better than using DLSS Quality. No, it isn't that much better, but it is better. Now I am not saying it isn't worth using either, as in Cyberpunk I do use DLSS Quality at 4K to get a better framerate. But I just know it is a slight decrease in IQ.
 
Imagine getting mad over an optional feature you don't have to enable.
I don't think anyone is mad about it, they're just not happy with what it is currently. Nvidia is presumably charging a premium while acting like DLSS 3 makes their new cards exponentially faster than the previous generation. It's just clearly not the case; the 4090 is still an impressive generational uplift, I just think the disappointment is really going to settle in when the 4080 series doesn't have much of a generational uplift without DLSS 3.
 
I wouldn't go that far. Native 4K looks better than using DLSS Quality. No, it isn't that much better, but it is better. Now I am not saying it isn't worth using either, as in Cyberpunk I do use DLSS Quality at 4K to get a better framerate. But I just know it is a slight decrease in IQ.
Seeing as most games use either TAA or FXAA, DLSS Quality is always a better option for getting rid of aliasing as it disables the other forms of AA. The only time I would say "native" is better is if MSAA is an option, but I rarely see that in games anymore. The last game I played that had it was Forza Horizon 5, and it does look damn good and play smooth.
 
Seeing as most games use either TAA or FXAA, DLSS Quality is always a better option for getting rid of aliasing as it disables the other forms of AA. The only time I would say "native" is better is if MSAA is an option, but I rarely see that in games anymore. The last game I played that had it was Forza Horizon 5, and it does look damn good and play smooth.
I would agree with this, I also think it's rarely an option because it has such a massive performance hit. Still though, I would like to see games include it.
 
Imagine getting mad over an optional feature you don't have to enable.
What a copout. Nvidia blew up DLSS 3 into something it frankly isn't, to the point it almost felt like they were trying to sell it, not a GPU.
 
The technology is interesting (to me) as a “this is just cool on a maths, programming and proper hw/sw vertical-stack implementation” thing, but the application to gaming is total horseshit.

I am impressed how nvidia is able to pull off combining bleeding edge research, tight hardware + software integration and optimization, and roll all this out in a way that gamedevs can actually play with it and we can see the outcome (good or bad) in games launching soon (tm). Neither intel nor AMD has managed this even once in recent memory (maybe in a very very minor way AMD did this with HDR+AA on the HD2900 series way back when iirc?).

Still hate how this comes at the expense of proper research and development into gfx cores and raster perf though haha.

But to be completely fair, the 4090 isn’t a slouch at 4K raster perf either.

That is a fair assessment. As far as cutting edge GPU features... I mean IMO you probably have to go back even further. Truform was 100% ATI in 2001... one of the few real game-changing bits of research turned practical that AMD/ATI was directly responsible for. Sort of like RT in a lot of ways, it didn't really get into a proper hardware form or any real software traction till the HD2900.... and AMD including the tech in the Xbox 360. Everyone takes tessellation for granted now.

To be fair to AMD though... on the software side they have focused on some pretty practical things aimed at raster and quality of life: Image Sharpening, FSR, Chill, Anti-Lag. They have done a pretty good job of implementing them all driver side and making it easy for users to choose when it makes sense to use which settings. I actually like Image Sharpening in a few old titles, FSR Quality mode is actually not bad for a handful of games that push hardware more, and Chill is a feature Nvidia should ape but never has; I guess slowing the card down on purpose is anathema to their philosophy of winning all the FPS benchmarks, always. Anti-Lag is awesome for esports type stuff if you play any of it.
None of it is flashy new research turned into a selling bullet point. I would say though that all those things are far more practical than even high end RT is, for the time being anyway. A lot of Nvidia users think I'm a crazy shill when I say AMD's drivers are superior... and I don't know, perhaps I am, but they are superior. AMD has made QOL for gamers a priority... in a way Nvidia really has not.
 
I would agree with this, I also think it's rarely an option because it has such a massive performance hit. Still though, I would like to see games include it.
This is taken from the Techspot website, where you can see the quality difference. It's slight, but it easily looks better in native 4K; even in their conclusion they say as much. Now, DLSS Quality is a damn good thing, but we can't be spreading FUD around saying it's better than native when it's not in any way.

https://static.techspot.com/articles-info/2165/images/F-14.jpg
https://static.techspot.com/articles-info/2165/images/F-15.jpg
 
This is taken from the Techspot website, where you can see the quality difference. It's slight, but it easily looks better in native 4K; even in their conclusion they say as much. Now, DLSS Quality is a damn good thing, but we can't be spreading FUD around saying it's better than native when it's not in any way.

https://static.techspot.com/articles-info/2165/images/F-14.jpg
https://static.techspot.com/articles-info/2165/images/F-15.jpg
When did that even start? I remember DLSS used to be talked about as a pretty worthwhile compromise. Then suddenly it shifted to gift from God, "better than native resolution" status.
 
When did that even start? I remember DLSS used to be talked about as a pretty worthwhile compromise. Then suddenly it shifted to gift from God, "better than native resolution" status.

Only in a select few games did it look better than native; Death Stranding I think was one.
 
That is a fair assessment. As far as cutting edge GPU features... I mean IMO you probably have to go back even further. Truform was 100% ATI in 2001... one of the few real game-changing bits of research turned practical that AMD/ATI was directly responsible for. Sort of like RT in a lot of ways, it didn't really get into a proper hardware form or any real software traction till the HD2900.... and AMD including the tech in the Xbox 360. Everyone takes tessellation for granted now.

To be fair to AMD though... on the software side they have focused on some pretty practical things aimed at raster and quality of life: Image Sharpening, FSR, Chill, Anti-Lag. They have done a pretty good job of implementing them all driver side and making it easy for users to choose when it makes sense to use which settings. I actually like Image Sharpening in a few old titles, FSR Quality mode is actually not bad for a handful of games that push hardware more, and Chill is a feature Nvidia should ape but never has; I guess slowing the card down on purpose is anathema to their philosophy of winning all the FPS benchmarks, always. Anti-Lag is awesome for esports type stuff if you play any of it.
None of it is flashy new research turned into a selling bullet point. I would say though that all those things are far more practical than even high end RT is, for the time being anyway. A lot of Nvidia users think I'm a crazy shill when I say AMD's drivers are superior... and I don't know, perhaps I am, but they are superior. AMD has made QOL for gamers a priority... in a way Nvidia really has not.
Unless I’m totally wrong - all of those were reactionary based on nvidia pushing the envelope first though right? This bothers me about AMD a bit, but I understand why things are the way they are - AMD is really treading water with a fraction of the personnel that nvidia has just to keep up. There’s a positive to that in that they are much more mobile and there’s less inertia to pivot quickly.
 
Unless I’m totally wrong - all of those were reactionary based on nvidia pushing the envelope first though right? This bothers me about AMD a bit, but I understand why things are the way they are - AMD is really treading water with a fraction of the personnel that nvidia has just to keep up. There’s a positive to that in that they are much more mobile and there’s less inertia to pivot quickly.
I'm pretty sure Nvidia doesn't have a feature similar to Radeon Chill. (Hands down my favorite driver feature ever... I play a few games now that my GPU doesn't even have to turn the fan on for, and it's not noticeable.) FSR I guess you could call reactionary, sort of... as others have said it's iterative tech, and I'm not so sure it wouldn't have found its way into their software anyway. Regardless, it's the superior feature... as cool as DLSS is in theory, FSR actually solves a real issue. DLSS was invented to sell new hardware (which continues with version 3)... FSR actually extends the life of old hardware. That is a pretty big difference in mission.
AMD has also been putting some serious manpower behind their drivers the last year or two. They have rewritten huge parts of driver code for things like DX11... and on the prosumer side they even spent time rewriting their OpenGL code.
You can look at AMD's workmanlike take on software as reactionary if you like... I prefer to think of it as Nvidia being constantly the dreamy, flighty one of the two. Nvidia has plenty of interesting ideas that don't really pan out for generations of cards. Some of them are also dead ends. Perhaps Nvidia finds a way to make DLSS 3.5 or 4 work with optical flow... as they did with DLSS 1. I think it might also be likely that they never really fully solve the lag and garbling issues. I know I'll never turn on a feature that garbles my game's UI... or makes it feel like I'm playing a game running at 40FPS even though my eyes are seeing 120FPS. (Which is how Hardware Unboxed describes DLSS 3.) Cool idea on paper... but frame generation? I think the issues there are just too many; from the HU testing it sounds like on mid range cards it will be even more annoying, the exact market where features like DLSS make sense.
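Side note on Chill, since it keeps coming up: the concept is simple enough to sketch. This is a hypothetical toy version of the idea (not AMD's implementation): a frame limiter that drops the FPS cap while input is idle and raises it the moment you touch the controls.

```python
import time

MIN_FPS, MAX_FPS = 40, 144
IDLE_AFTER_S = 1.0          # how long with no input before easing the cap down

def target_frame_time(last_input: float, now: float) -> float:
    fps = MAX_FPS if (now - last_input) < IDLE_AFTER_S else MIN_FPS
    return 1.0 / fps

def frame_loop(render_frame, poll_input):
    """render_frame() draws one frame; poll_input() returns True on any activity."""
    last_input = time.monotonic()
    while True:
        start = time.monotonic()
        if poll_input():
            last_input = start
        render_frame()
        budget = target_frame_time(last_input, start)
        spare = budget - (time.monotonic() - start)
        if spare > 0:
            time.sleep(spare)   # let the GPU idle instead of rendering more frames
```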
 
When did that even start? I remember DLSS used to be talked about as a pretty worthwhile compromise. Then suddenly it shifted to gift from God, "better than native resolution" status.
It can handle the motion of small, sub-pixel details more smoothly than native TAA when implemented properly. Things like grass, hair and wire lines CAN look better than native in some games.
 
When did that even start? I remember DLSS used to be talked about as a pretty worthwhile compromise. Then suddenly it shifted to gift from God, "better than native resolution" status.
DLSS Quality is really good and should be used. I actually think it’s a fantastic feature. And at 4K+ resolutions it was needed to get decent frames above 60fps. Now with the 4090 that shouldn’t be an issue at 4K native. But using DLSS 2.0 on a 4090 at 4K would ensure you’d be getting over 100fps easily, which would be a godsend in my eyes.
 
DLSS Quality is really good and should be used. I actually think it’s a fantastic feature. And at 4K+ resolutions it was needed to get decent frames above 60fps. Now with the 4090 that shouldn’t be an issue at 4K native. But using DLSS 2.0 on a 4090 at 4K would ensure you’d be getting over 100fps easily, which would be a godsend in my eyes.
Right. I'm only wondering about the "better than native res" talk.
 
What a copout. Nvidia blew up DLSS 3 into something it frankly isn't, to the point it almost felt like they were trying to sell it, not a GPU.

I guess? I don't really care how any optional feature is ever marketed since it still comes down to doing one's own testing in the end. Any modifier/filter on a game's graphics is always going to be subjective, and DLSS of any level still just boils down to an option to A/B compare and see if it happens to work better for your particular mix of GPU/Monitor/game/eyes.
 
Hyperbole aside, a modifier/filter on a game's graphics is always going to be subjective, and DLSS of any level still just boils down to an option to A/B compare and see if it happens to work better for your particular mix of GPU/Monitor/game. I don't really care how any optional feature is ever marketed since it still comes down to doing one's own testing in the end.
That sums it up. It's subjective. I like sharpening in some games even though it isn't the intended look. I like it... others may not. The same has been true with AA methods going back long before DLSS and FSR tech.
Some people like to tweak things with ReShade as well... some people think the results are great in a lot of older games, but sometimes it leads to odd bits as well.
 
It is shit, but I don't see anybody selling hardware that offers good framerates for native 4K.
So we're kind of where we are at with that.
Have you seen the reviews of the RTX 4090 cards? 4k looks pretty crazy playable to me.

So much so that the review I watched said to not bother buying one if you play at 1440p or below because you are going to be CPU limited. The results showed that.
 
It is shit, but I don't see anybody selling hardware that offers good framerates for native 4K.
So we're kind of where we are at with that.
Lots of people were playing lots of games with good FPS before last week (maybe just not at very ultra settings), but that doesn't sound up to date:

[Chart: NVIDIA RTX 4090 rasterized benchmarks, 4K]

A collection of not-the-easiest titles to run, at ultra settings in 4K, often above 100 fps average.
 
Lots of people were playing lots of games with good FPS before last week (maybe just not at very ultra settings), but that doesn't sound up to date:

[Chart: NVIDIA RTX 4090 rasterized benchmarks, 4K]
A collection of not-the-easiest titles to run, at ultra settings in 4K, often above 100 fps average.
I saw those sorts of numbers and thought that was with DLSS on. I am surprised to see Anthem on that list; I'm sad every time I think of that game because I really liked it, and it was so good, they just fucked it up so hard...
 
Have you heard of Frame Generation?
I want to see side by side output quality comparisons with it disabled and enabled.

If it turns the output quality to crap to boost the frame rate then it is still crap.
 