More DLSS...

Yesterday, Nvidia added DLDSR in the new drivers. It's an AI-assisted version of DSR. It supposedly lets you downscale with a trivial performance loss. The example they show is downscaling 1620p to 1080p.
https://www.nvidia.com/en-us/geforc...&ranSiteID=kXQk6.ivFEQ-8vx6zqxUyqtoey5TKKE6mA

I am trying exactly that and it doesn't seem to be working. It looks just like regular DSR and gives me a big performance hit, just like DSR. Actually, using regular DSR is smoother for me. With DLDSR, the frametimes are jacked up and it gives me a slight rubber-banding effect.
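For reference, the resolution arithmetic behind that example (my own back-of-the-envelope sketch; the 2.25x factor and the idea that it applies to total pixel count are assumptions about how the DSR/DLDSR multipliers work):

```python
# Rough DLDSR resolution math (assumed: the factor applies to total pixel count).
# A 2.25x factor on a 1920x1080 display means rendering at 2880x1620 ("1620p"),
# which then gets downscaled back to 1080p.
import math

native = (1920, 1080)
factor = 2.25
render = tuple(round(d * math.sqrt(factor)) for d in native)
print(render)  # (2880, 1620)
```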

https://www.tomshardware.com/news/nvidia-subwarp-ray-tracing-performance

Nvidia GPU Subwarp Interleaving Boosts Ray Tracing by up to 20%




 
Very pleased with the IQ and performance bump in FSR 2.0. Want to see more games / data points. DLSS is on notice.
 
If the technology is in its infancy and makes a big jump in its first decade before slowing down logarithmically, this could be quite the future game changer, and a lot of it could be "lossless compression/decompression" as well.
Looking at the video and the benchmarks, DLSS 3.0 now needs to get better to stay relevant, it seems.
 
Nvidia DLSS 3 Promises Up to 4X Performance Acceleration; 35 Games/Apps Already Supported

https://videocardz.com/press-releas...lss-3-upscaling-technology-35-games-on-a-list

Exclusive First Look! Cyberpunk 2077 with NVIDIA DLSS 3 & Ray Tracing



Nvidia is even promising benefits for CPU-bound games, which generally didn't run much faster with DLSS 2.0. For example, the notoriously CPU-heavy Microsoft Flight Simulator gets up to 2x improved performance with the new DLSS...

 
So much for DLSS being a way to extend the legs of a GPU; just you watch, nVidia will paywall the latest version behind the latest GPU. Wouldn't put it past them to prevent devs from using prior versions as well.
 
Exclusive First Look! Cyberpunk 2077 with NVIDIA DLSS 3 & Ray Tracing
I am a bit curious what settings were used in Cyberpunk 2077 that ran at 20 fps on a 4080 or 4090; maybe it is simply running the game at 8K or something of the sort, but a new RT level setting in the game would be nice.

After looking at:


Maybe it was not hard to find settings with the current 1.5 version at 4K that ran in the low 20s even on a 4080.
 
I wonder how far away they are from skipping the raster and just feeding the game data to the AI, to produce an image.
 
I am a bit curious what settings were used in Cyberpunk 2077 that ran at 20 fps on a 4080 or 4090; maybe it is simply running the game at 8K or something of the sort, but a new RT level setting in the game would be nice.
It seems to be a yet-to-be-released new version (dev build) of the game, finally with a different level of RT.
 
So much for DLSS being a way to extend the legs of a GPU; just you watch, nVidia will paywall the latest version behind the latest GPU. Wouldn't put it past them to prevent devs from using prior versions as well.
They said DLSS 3 is a superset of DLSS 2, so games with DLSS 3 will run in DLSS 2 mode for those of us still on 3000-series or 2000-series cards. So, no, it isn't locked.
 
They said DLSS 3 is a superset of DLSS 2, so games with DLSS 3 will run in DLSS 2 mode for those of us still on 3000-series or 2000-series cards. So, no, it isn't locked.
DLSS 3 just adds frame interpolation (DLSS Frame Generation) and Shader Execution Reordering. This shouldn't have been called DLSS 3, imho.
 
DLSS 3 just adds frame interpolation (DLSS Frame Generation) and Shader Execution Reordering. This shouldn't have been called DLSS 3, imho.

so you think the quality jump from DLSS 2.1 to DLSS 3 will be minor?...not the same as the jump from DLSS 1.0 to 2.0?
 
DLSS 3 just adds frame interpolation (DLSS Frame Generation) and Shader Execution Reordering. This shouldn't have been called DLSS 3, imho.
This also begs the question... are the 4080 12GB and 16GB truly faster than the previous gen? We know the 4090 will be faster, as it will just brute-force its way to higher framerates, but the 4080 16GB and 12GB both seem somewhat inferior to the previous-gen 3080 10GB and 12GB versions.
 
The 4080 12GB uses a 192-bit memory bus. How could it possibly keep up with the 3080 10GB (320-bit) or 3080 12GB (384-bit) at high resolutions like 4K? (excluding DLSS 3 compliant titles, of course)
 
The 4080 12GB uses a 192-bit memory bus. How could it possibly keep up with the 3080 10GB (320-bit) or 3080 12GB (384-bit) at high resolutions like 4K? (excluding DLSS 3 compliant titles, of course)
Come on now, you know architectural differences matter much more than simply looking at some specs on paper.
 
The 4080 12GB uses a 192-bit memory bus. How could it possibly keep up with the 3080 10GB (320-bit) or 3080 12GB (384-bit) at high resolutions like 4K? (excluding DLSS 3 compliant titles, of course)
In part for a similar reason that the 256-bit RDNA 2 cards like the 6950 XT often beat a 3080 at high resolution: the on-die cache exploded this generation, with 16x more L2 cache (I imagine to save power and money on the memory bus, like AMD did).

Another possible reason could be the GDDR6X improvement: despite the 192-bit bus, the 21 Gbps VRAM speed will reduce the gap a bit; it will have memory bandwidth similar to a 6950 XT, I think.

You can very safely assume that those companies' design choices are sound enough to clearly beat the previous generation; worrying about RDNA 2 memory bandwidth or Ampere VRAM amounts was completely overblown. They would not have put a 192-bit memory bus on it if that were crippling for what the card will often be doing, like playing at 4K, which will be the standard in that tier of card (well, maybe, but it seems almost impossible).
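A quick back-of-the-envelope comparison of raw peak bandwidth (bus width / 8 x data rate); the data rates below are my assumptions from public spec listings, not something from this thread:

```python
# Rough peak memory bandwidth in GB/s: (bus width in bits / 8) * data rate in Gbps.
# Data rates are assumptions taken from public spec listings at the time.
cards = {
    "RTX 4080 12GB": (192, 21.0),  # 192-bit bus, 21 Gbps GDDR6X (assumed)
    "RTX 3080 10GB": (320, 19.0),  # 320-bit bus, 19 Gbps GDDR6X
    "RTX 3080 12GB": (384, 19.0),  # 384-bit bus, 19 Gbps GDDR6X
    "RX 6950 XT":    (256, 18.0),  # 256-bit bus, 18 Gbps GDDR6 + Infinity Cache
}

for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
# -> roughly 504, 760, 912 and 576 GB/s respectively
```

So on raw numbers the 4080 12GB would land near the 6950 XT and well under both 3080s; the much bigger L2 cache is what is supposed to close that gap.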
 
In part for a similar reason that the 256-bit RDNA 2 cards like the 6950 XT often beat a 3080 at high resolution: the on-die cache exploded this generation, with 16x more L2 cache (I imagine to save power and money on the memory bus, like AMD did).

You can very safely assume that those companies' design choices are sound enough to clearly beat the previous generation; worrying about RDNA 2 memory bandwidth or Ampere VRAM amounts was completely overblown. They would not have put a 192-bit memory bus on it if that were crippling for what the card will often be doing, like playing at 4K, which will be the standard in that tier of card (well, maybe, but it seems almost impossible).
Yep, they're also using much faster GDDR6X this round. Good points.
 
so you think the quality jump from DLSS 2.1 to DLSS 3 will be minor?...not the same as the jump from DLSS 1.0 to 2.0?

We will have to wait and see. But given that neither DLSS Frame Generation nor SER should have an impact on image quality... no? Again though, we'll have to wait and see.

That is a MAJOR feature; I don't see why it wouldn't warrant a major version bump, along with all the other updates it's presumably seen.

... Honestly, I'll have to wait and see, because right now it doesn't sound that different from enabling the motion-smoothing feature of a television set.
 
We will have to wait and see. But given that neither DLSS Frame Generation nor SER should have an impact on image quality... no? Again though, we'll have to wait and see.
Can you imagine the confusion if they had kept the same first number (say it had been DLSS 2.2 or 2.3) but not made it available to the previous generation of cards?

It seems like a massive no-brainer for something as massive as a new inter-frame generator and new tracking of motion-vector-less elements in the scene to get a new number; the jump seems no smaller, if at all, than the jump from 1 to 2 that came from generalised instead of specialised machine learning.

I would even go so far as to say that if you decide not to make it available on the previous generation of cards, you have almost no choice but to change the name, or at least the number, so games carrying the little sticker do not create confusion for the buyer.
 
Can you imagine the confusion if they had kept the same first number (say it had been DLSS 2.2 or 2.3) but not made it available to the previous generation of cards?

It seems like a massive no-brainer for something as massive as a new inter-frame generator and new tracking of motion-vector-less elements in the scene to get a new number; the jump seems no smaller, if at all, than the jump from 1 to 2 that came from generalised instead of specialised machine learning.

I would even go so far as to say that if you decide not to make it available on the previous generation of cards, you have almost no choice but to change the name, or at least the number, so games carrying the little sticker do not create confusion for the buyer.
Fair enough!
 
According to an Nvidia person, as expected, DLSS 3 could have run on all Turing cards (they all have an "optical flow accelerator", as shown by the fact that you have been able to use it via their SDK for years), but the Turing and Ampere versions were not fast or good enough to actually boost the frame rate while keeping good image quality in those made-up frames.

https://twitter.com/ctnzr/status/1572330879372136449

A bit crazy to think (if it delivers quality-wise) that you could be playing with only 1/8 of the pixels coming from actual game rendering.
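Rough arithmetic behind that 1/8 figure, assuming DLSS Performance mode (which renders at half the output resolution on each axis) combined with Frame Generation producing one AI frame for every rendered frame:

```python
# Fraction of displayed pixels that come from traditional rendering
# (assumed: DLSS Performance upscaling + one generated frame per rendered frame).
upscaled_fraction = 0.5 * 0.5     # half resolution per axis -> 1/4 of a frame's pixels
framegen_fraction = 1 / 2         # only every other displayed frame is rendered at all
print(upscaled_fraction * framegen_fraction)  # 0.125 -> 1/8 of all displayed pixels
```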
 
We will have to wait and see. But given that neither DLSS Frame Generation nor SER should have an impact on image quality... no? Again though, we'll have to wait and see.

... Honestly, I'll have to wait and see, because right now it doesn't sound that different from enabling the motion-smoothing feature of a television set.


I guess the win essentially comes from:
1. It's fast enough that there's no input lag.
2. It looks like the typical interpolation artifacts are SEVERELY diminished.

I don't see how it'll ever improve quality, but it'll look smoother.

It's kind of weird because it is and isn't your framerate. Like it's just a hallucinated frame, none of the world was actually rendered to produce it - it's just a fabrication off of the last and current.

Does the engine still run the game loop and basically just skip rendering for that frame? I guess it must; otherwise you'd have the weird effect of stuff like not processing input for a frame DLSS shit out.

I guess there has to be a frame of input lag if it's interpolation. If it's pulling a new frame out of nowhere only from one, it's really extrapolation.
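A minimal sketch of the interpolation timing (my own illustrative model, not anything Nvidia has documented): the in-between frame for the pair (n, n+1) can only be built once frame n+1 has finished rendering, so real frames end up presented roughly half a rendered-frame interval later than they otherwise would, plus whatever the generation itself costs.

```python
# Illustrative presentation schedule for frame interpolation (assumed model,
# not Nvidia's documented pipeline). Real frames finish rendering every T ms;
# the interpolated frame between n and n+1 cannot exist until n+1 is done,
# so each real frame after the first is shown ~T/2 late to fit it in.
T = 16.7  # ms between rendered frames (~60 fps of real frames, assumed)

def schedule(num_rendered: int):
    events = [("real   0", 0.0)]
    for n in range(num_rendered - 1):
        t_ready = (n + 1) * T                                      # frame n+1 finishes here
        events.append((f"interp {n}->{n + 1}", t_ready))           # shown as soon as it exists
        events.append((f"real   {n + 1}", t_ready + T / 2))        # ~T/2 later than without interpolation
    return events

for label, t in schedule(4):
    print(f"{label}: shown at {t:5.1f} ms")
```

This is only a toy schedule; actual pacing, queuing, and the generation cost itself will shift the numbers.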
 
It's going to be really interesting to see what it actually looks like, because this isn't a new idea; VR has been playing with this for a while, with the likes of Asynchronous SpaceWarp and SteamVR's Motion Smoothing.
 
It's going to be really interesting to see what it actually looks like, because this isn't a new idea; VR has been playing with this for a while, with the likes of Asynchronous SpaceWarp and SteamVR's Motion Smoothing.
It's great as long as it doesn't add input latency. My fear, though, is that it does, and Nvidia will tell competitive gamers to disable DLSS 3.
 
So much for DLSS being a way to extend the legs of a GPU; just you watch, nVidia will paywall the latest version behind the latest GPU. Wouldn't put it past them to prevent devs from using prior versions as well.
ok I am sure Nvidia will put continued effort into DLSS 2.... :ROFLMAO: Nvidia has a new slogan "Pay us more or get lost."
 
ok I am sure Nvidia will put continued effort into DLSS 2.... :ROFLMAO: Nvidia has a new slogan "Pay us more or get lost."
Yeah, it's a bit funny. They take a huge dump on every single potential customer who isn't gunning for a 4090, but at the same time will totally look after those who don't want to buy into new, expensive cards.

Sure.
 
ok I am sure Nvidia will put continued effort into DLSS 2.... :ROFLMAO: Nvidia has a new slogan "Pay us more or get lost."
It seems official that any game that supports DLSS 3 will also support DLSS 2 (I do not see why not; the same motion vectors for the physical scene objects are used), and a lot of the upgrades to the super-sampling part made for DLSS 3.0 could be directly usable in DLSS 2.0.
 
It's great as long as it doesn't add input latency. My fear, though, is that it does, and Nvidia will tell competitive gamers to disable DLSS 3.

Increased latency is possible, but I'm expecting it's going to have motion artifacts around objects in real use. Notice how everything moving in the DLSS 3 trailers is super predictable and travels in more or less straight lines... yeah, well, people don't move like that when they play games. So I'm curious to see how it looks in actual gameplay.
 
Confirmed by Nvidia's AndyBNV on the ResetEra forums:

"When a DLSS 3 game comes out, and you own a GeForce RTX 30 Series or 20 Series graphics card or laptop, you can use DLSS 2 Super Resolution in that title...So if you own a GeForce RTX 30 or 20 Series, you would get DLSS 2 + Reflex. Even if a game is labeled as a "DLSS 3 Game"

https://www.resetera.com/threads/nv...-will-get-support.634256/page-3#post-93607910
 