More DLSS...

Patiently waiting to use my step up on something stronger. Hopefully nvidia drops something by early february
While I get the sentiment, here's to Nvidia not dropping anything exciting in the next few months and focusing instead on actually manufacturing the cards that are already out (or about to be out, like the 3060 Ti). You already have your 3070, but so many of us are still left waiting to buy something from this generation. They'd better be able to make a truckload of 3060s, since those must be easier to manufacture than the higher models.
 
While I get the sentiment, here's to Nvidia not dropping anything exciting in the next few months and focusing instead on actually manufacturing the cards that are already out (or about to be out, like the 3060 Ti). You already have your 3070, but so many of us are still left waiting to buy something from this generation. They'd better be able to make a truckload of 3060s, since those must be easier to manufacture than the higher models.
Did you do the notify-me for any of the EVGA cards at launch? I've been able to get four cards now: a 3070 FTW3, a 3070 XC3, a 3080 XC3, and a 3080 FTW. The 3080 cards I sold to friends, the other 3070 I sold at cost online, and I kept the 3070 FTW3 hoping there would be a 3070 Ti or something similar released that I can step up to.
 
While I get the sentiment, here's to Nvidia not dropping anything exciting in the next few months and focusing instead on actually manufacturing the cards that are already out (or about to be out, like the 3060 Ti). You already have your 3070, but so many of us are still left waiting to buy something from this generation. They'd better be able to make a truckload of 3060s, since those must be easier to manufacture than the higher models.
We'll see more of them but demand will be greater for these mainstream cards. Double-edged sword.
 
We'll see more of them but demand will be greater for these mainstream cards. Double-edged sword.
That's the issue. The 60-series cards are the most popular due to price. Remember what happened with the 3070: they delayed it two weeks so they would have more stock, and it still sold out damn near instantly online. I think the 3060 cards will be even worse availability-wise due to the lower price.
 
Did you do the notify-me for any of the EVGA cards at launch? I've been able to get four cards now: a 3070 FTW3, a 3070 XC3, a 3080 XC3, and a 3080 FTW. The 3080 cards I sold to friends, the other 3070 I sold at cost online, and I kept the 3070 FTW3 hoping there would be a 3070 Ti or something similar released that I can step up to.
This. Or hit up the sale forum here. They hit regularly enough. Especially 3090s
 
Agree with those sentiments. DLSS, despite whatever nit-pick flaws it may have, is tremendous technology.
Agreed. Now it just needs to be in more than a few popular games. I still do not see it being a buying factor at this time considering the years of promising and not delivering.
 
Agreed. Now it just needs to be in more than a few popular games. I still do not see it being a buying factor at this time considering the years of promising and not delivering.
That’s the kicker. If they can get it into more games we’d be in good shape.
 
How much longer til DLSS 2.2 is out?

--
"What kind of advancements can we expect from DLSS? Most people were expecting a DLSS 3.0, or, at the very least, something like DLSS 2.1. Are you going to keep improving DLSS and offer support for more games while maintaining the same version?

DLSS SDK 2.1 is out and it includes three updates:
- New ultra performance mode for 8K gaming. Delivers 8K gaming on GeForce RTX 3090 with a new 9x scaling option.
- VR support. DLSS is now supported for VR titles.
- Dynamic resolution support. The input buffer can change dimensions from frame to frame while the output size remains fixed. If the rendering engine supports dynamic resolution, DLSS can be used to perform the required upscale to the display resolution.
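To put rough numbers on the ultra performance and dynamic resolution points above: the sketch below uses the commonly cited DLSS 2.x per-axis render scales (Quality about 2/3, Balanced about 0.58, Performance 1/2, Ultra Performance 1/3, i.e. one ninth of the output pixels). Those ratios and the helper functions are illustrative assumptions, not the interview's words and not the actual NGX SDK API.

Code:
# Rough sketch of DLSS internal render resolutions.
# The per-axis scale factors below are assumed (commonly cited DLSS 2.x ratios),
# and the function names are illustrative, not the NGX SDK API.
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 1 / 2,
    "ultra_performance": 1 / 3,   # 1/3 per axis = 1/9 of the pixels ("9x scaling")
}

def internal_resolution(out_w, out_h, mode):
    """Approximate resolution DLSS renders internally for a fixed output size."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 8K output with the new Ultra Performance mode: 7680x4320 -> 2560x1440.
print(internal_resolution(7680, 4320, "ultra_performance"))   # (2560, 1440)

def dynamic_input(out_w, out_h, frame_scale, mode="performance"):
    """Dynamic resolution: the input buffer can change size every frame,
    clamped to the mode's minimum, while the output size stays fixed."""
    s = max(DLSS_SCALE[mode], min(1.0, frame_scale))
    return round(out_w * s), round(out_h * s)

print(dynamic_input(3840, 2160, 0.7))   # e.g. (2688, 1512) this frame, 4K output

The dynamic resolution case is the interesting one for engines that already scale render resolution per frame: the input buffer can shrink or grow every frame while DLSS keeps outputting at the fixed display resolution.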
 
While I get the sentiment, here's to Nvidia not dropping anything exciting in the next few months and focusing instead on actually manufacturing the cards that are already out (or about to be out, like the 3060 Ti). You already have your 3070, but so many of us are still left waiting to buy something from this generation. They'd better be able to make a truckload of 3060s, since those must be easier to manufacture than the higher models.
Nvidia doesn't do the manufacturing; they outsource all that. Samsung is doing it right now, and they are struggling to get good yields on their new 8nm process.
 
Nvidia doesn't do the manufacturing; they outsource all that. Samsung is doing it right now, and they are struggling to get good yields on their new 8nm process.
I know that, but if there are more products to produce, the same production capacity gets split among more of them, so there's less of each. Didn't think that needed pointing out...
 
I know that, but if there are more products to produce, the same production capacity gets split among more of them, so there's less of each. Didn't think that needed pointing out...
Yes, but if there are more of them overall because of better yields, isn't that a plus for people without a card?

It's a minus for people already well placed on a waitlist for an existing product, but if it means more cards overall, that's not necessarily a minus overall.
 
Is DLSS a fee-based technology? Can any indie developer implement it for free, or do they need deep pockets for it?
Has anyone heard from other devs whether DLSS is easy to implement or complicated? Wondering why not every game released has DLSS by now.
 
On the Nvidia developer website it says it's invite-only.

[Screenshot of the NVIDIA DLSS Early Access Program page]


https://developer.nvidia.com/nvidia-dlss-early-access-program
 
DLSS was a lifesaver for me on a 3440x1440 144 Hz monitor playing newer games. I wish every game had it; sure, the image quality isn't quite the same, but damn, the FPS increase I saw was huge and well worth it.
 
Early access program? They’re already on DLSS 2.1 and it’s been over 2 years since DLSS was first introduced. How can they still call it early access?
Interesting that it is only available to select developers; Nvidia's whole enterprise seems to run on a do-it-our-way-or-hit-the-highway mentality. What makes a developer not get selected? Now I can see why developers are slow to take up DLSS. Previously, with Nvidia Gameworks, a developer initially had to virtually give their code away to Nvidia in order to use Nvidia's code, and anything they developed could be used by Nvidia and was owned by Nvidia (going from memory on that). Nvidia always seems to spin entangled webs of control, with you giving something away to Nvidia for free in the end.
  • Gameworks -> your code becomes their code which you won't even be able to reuse if you leave their plantation (failed attempt)
  • GPP -> Your brand becomes their brand (Failed)
  • Review Cards -> We tell you what is relevant (Failed)
  • DLSS -> ???
 
Is DLSS a fee-based technology? Can any indie developer implement it for free, or do they need deep pockets for it?
Has anyone heard from other devs whether DLSS is easy to implement or complicated? Wondering why not every game released has DLSS by now.
Right now Nvidia PR teams contact studios and offer them money to build ray tracing or DLSS into their games. Remember how shitty titles are at launch, with all the bugs? Large studios can BARELY get their game released, let alone spend months of man-hours learning how to implement these technologies. Nvidia is paying a ton of money to the studios to slap it on, and the weak implementations we're seeing in a lot of games, especially on the ray tracing front, are there because they're just doing it to take the check from Nvidia. It's the last thing they want to think about, and the paltry number of ray tracing GPUs barely touches the surface of game sales. Consoles will start to implement the AMD ray tracing format in the next few years, but they have a target audience that will all have the right hardware to benefit from it. Until a few more GPU generations from now, ray tracing will continue to be something Nvidia is forcing down our throats with back-alley payouts, similar to how Epic Games lands AAA titles on their store instead of Steam.
 
Early access program? They’re already on DLSS 2.1 and it’s been over 2 years since DLSS was first introduced. How can they still call it early access?

Control over the product, and likely over the developer(s) as well; this is nVidia, after all.
 
Right now Nvidia PR teams contact studios and offer them money to build ray tracing or DLSS into their games. Remember how shitty titles are at launch, with all the bugs? Large studios can BARELY get their game released, let alone spend months of man-hours learning how to implement these technologies. Nvidia is paying a ton of money to the studios to slap it on, and the weak implementations we're seeing in a lot of games, especially on the ray tracing front, are there because they're just doing it to take the check from Nvidia. It's the last thing they want to think about, and the paltry number of ray tracing GPUs barely touches the surface of game sales. Consoles will start to implement the AMD ray tracing format in the next few years, but they have a target audience that will all have the right hardware to benefit from it. Until a few more GPU generations from now, ray tracing will continue to be something Nvidia is forcing down our throats with back-alley payouts, similar to how Epic Games lands AAA titles on their store instead of Steam.

Hmm... As a fan of ray tracing, I guess I'll be waiting in that alley for them to have their way with me. :p
 
Hmm... As a fan of ray tracing, I guess I'll be waiting in that alley for them to have their way with me. :p
It's actually pretty standard corporate marketing stuff; we've just learned over the years to assume everything is evil and that the man is out to cut our throats and bleed us dry. I believe ray tracing is the future of real-time lighting. It may be 10 years before we get a decent implementation in a game, to the standards I'm looking for, but if Nvidia weren't pushing it so hard, no one would be using it.
 
I believe ray tracing is the future of real-time lighting. It may be 10 years before we get a decent implementation in a game, to the standards I'm looking for, but if Nvidia weren't pushing it so hard, no one would be using it.
I certainly agree with you on this. Now, I still think Nvidia are a bunch of a**holes who play dirty tricks and engage in abusive business practices on the regular, but that doesn't diminish their position on the technology. RT is the holy grail of virtual lighting, period. The fact that they were able to make a denoiser so good that we can shoot well below the required number of rays and still make a super noisy ray traced image look good is beyond commendable. Of course, many things Nvidia touches turn to gold and then to sh*t: the former because of good engineering, the latter because of their marketing team. Nvidia's public appearances and statements have shot it in the foot, but the technological progress is there.

So many people are complaining about the "slowness" of progress on RT performance. If you'll recall, when we got pixel shader model 1 - that's Geforce 3 territory, 2001 - they were wonderful but shader-anemic. I remember reading constantly about the potential of pixel shaders but how they were too weak in any implementation at the time and there was no point in using shaders in games anyway. Here's a nice deja-vu history lesson, from Anandtech's 2002 review of the Geforce 4:

[Screenshot of Anandtech's 2002 GeForce4 review]


Geforce 3 showed what the future could look like, but didn't have the performance. The GF4 was better, but still far from where we needed to be. Devs weren't implementing pixel shaders much, because not enough people had capable cards. It wasn't until six years later, with the Geforce 8 and unified shaders, by the time we got to shader model 4, that performance and quality got legitimately good, and by then there had been four pixel-shader-capable card generations. That's a good while for industry support to grow and feature sets to evolve and improve. Ray tracing has a similar path ahead of it; we're in year 3, basically. In 2021 we'll get the Super version of the 30 series, and in 2022 we'll get the 3rd RT architecture. If history has been any indication, that's when things will get actually good. Not perfect, but a noticeable jump in performance, quality and affordability (think Half-Life 2 levels of improvement with pixel shaders, and how the technology was then on track to progressively improve over the following decade).

Bottom line is, patience. Nvidia might actually be evil, but their technology is good, and their reasons for pushing raytracing are entirely justified and beneficial to everyone (devs and gamers). Their practices may be sh*t, and you don't have to like the company (I've certainly gone from really liking them in the early 2000s to being disgusted by them and begrudgingly giving them money in 2020) but raytracing is the future.
 
I certainly agree with you on this. Now, I still think Nvidia are a bunch of a**holes who play dirty tricks and engage in abusive business practices on the regular, but that doesn't diminish their position on the technology. RT is the holy grail of virtual lighting, period. The fact that they were able to make a denoiser so good that we can shoot well below the required number of rays and still make a super noisy ray traced image look good is beyond commendable. Of course, many things Nvidia touches turn to gold and then to sh*t: the former because of good engineering, the latter because of their marketing team. Nvidia's public appearances and statements have shot it in the foot, but the technological progress is there.

So many people are complaining about the "slowness" of progress on RT performance. If you'll recall, when we got pixel shader model 1 - that's Geforce 3 territory, 2001 - they were wonderful but shader-anemic. I remember reading constantly about the potential of pixel shaders but how they were too weak in any implementation at the time. It wasn't until six years later, with the Geforce 8 and unified shaders, by the time we got to shader model 4, that performance and quality got legitimately good. That's a good while for industry support to grow and feature sets to evolve and improve. Ray tracing has a similar path ahead of it; we're in year 3, basically. In 2021 we'll get the Super version of the 30 series, and in 2022 we'll get the 3rd RT architecture. If history has been any indication, that's when things will get actually good. Not perfect, but a noticeable jump in performance, quality and affordability (think Half-Life 2 levels of improvement with pixel shaders, and how the technology was then on track to progressively improve over the following decade).

Bottom line is, patience. Nvidia might actually be evil, but their technology is good, and their reasons for pushing raytracing are entirely justified and beneficial to everyone (devs and gamers). Their practices may be sh*t, and you don't have to like the company (I've certainly gone from really liking them in the early 2000s to being disgusted by them and begrudgingly giving them money in 2020) but raytracing is the future.
It's just really painful to see them pulling shareholder tricks and sacrificing cutting-edge technology for things like cheap wafer space on Samsung's older 10nm-class node. If Ampere were on TSMC 7nm, we could have seen something like 40% perf-per-watt gains instead of eerily similar scaling to the 20 series. ::cries::

I'm honestly starting to get tempted by DLSS; there's just so much more performance available, and if they can continue to make image quality close enough to native at Quality mode, then that 25% boost is essentially free performance you can't pick up with AMD. Performance mode, however, is straight garbage and shouldn't even be an option. I saw Digital Foundry using it as the baseline when comparing against the 6800 XT in their pro-Nvidia 6800 series review and it stank of pandering. They're clearly toeing the line with Nvidia PR and paid videos. Shameful from some tech-enthusiast experts.
 
Last edited:
It's just really painful to see them pulling shareholder tricks
Yeah, they really have zero reason to act in the crappy ways they do. Even so, I give them my money because they tend to offer what I want. Right now, I want to play Control with RT enabled, but I'm stuck on my 1060 3GB because nothing is available to purchase at MSRP. So I figure I'll play Control with RT off, and in 2021 I'll do a second round with RT on. Same for the newest Wolfenstein, which I haven't touched yet, and Metro, which I played a few months ago.

I don't think TSMC's 7nm would give us any amazing improvement, because that headroom would be used somewhere else in the silicon. We'll get gains over several generations, but not with just one jump in process technology. DLSS will go the way of many other things, such as PhysX or G-Sync: first exclusive to Nvidia, eventually democratized either under the same brand or a new one, but everyone will be able to use it. Right now DLSS is pretty good, but a DirectML equivalent is coming in 2021; AMD calls it Super Resolution, Nvidia will keep calling it DLSS, but it'll be essentially the same.

What does it for me this next round is RT performance. Nvidia has an advantage as they're on their second attempt at it, and I'd rather play the RT games I own with better performance for the same money, which right now means buying Nvidia, however much I dislike them. I'm really interested in Intel doing well when they come into the market; a three-way race will be awesome news for consumers.
 
It's just really painful to see them pulling shareholder tricks and sacrificing cutting-edge technology for things like cheap wafer space on Samsung's older 10nm-class node. If Ampere were on TSMC 7nm, we could have seen something like 40% perf-per-watt gains instead of eerily similar scaling to the 20 series. ::cries::

I'm honestly starting to get tempted by DLSS; there's just so much more performance available, and if they can continue to make image quality close enough to native at Quality mode, then that 25% boost is essentially free performance you can't pick up with AMD. Performance mode, however, is straight garbage and shouldn't even be an option. I saw Digital Foundry using it as the baseline when comparing against the 6800 XT in their pro-Nvidia 6800 series review and it stank of pandering. They're clearly toeing the line with Nvidia PR and paid videos. Shameful from some tech-enthusiast experts.

I think people fail to separate GPU power from memory power. If you take the GPU alone, Ampere is actually still plenty efficient; GDDR6X is adding a lot to the power draw. I get that overall power usage is higher. As an example, 16GB of GDDR6 is only about 20W according to Igor's Lab charts for RDNA2, while the 3090's GDDR6X is around 60W. So if Nvidia had used GDDR6, the 3090 could be sitting around 320W.

Also, a new report recently said Nvidia inked a multibillion-dollar contract with Samsung again, so I don't think Nvidia cares too much about power on the GPU side; it looks plenty efficient. What they likely expect to improve over time is power usage on the GDDR6X. They're working closely with Samsung, so they will obviously keep using the newer process for next-gen stuff.
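For what it's worth, the "around 320W" figure follows directly from the numbers cited above (the memory wattages are the Igor's Lab figures quoted in the post, and roughly 350W is the 3090's rated board power); rough arithmetic only, not a measurement:

Code:
# Back-of-the-envelope check of the memory-power argument above.
# Inputs are the figures cited in the post plus the 3090's ~350W rated board
# power; treat them as rough assumptions, not measurements.
board_power_3090 = 350          # W, total board power with 24GB of GDDR6X
gddr6x_power = 60               # W, claimed GDDR6X draw on the 3090
gddr6_power_16gb = 20           # W, claimed GDDR6 draw on a 16GB RDNA2 card

# Scale the GDDR6 figure to the 3090's 24GB of memory.
gddr6_power_24gb = gddr6_power_16gb * 24 / 16          # ~30 W

hypothetical_3090_with_gddr6 = board_power_3090 - gddr6x_power + gddr6_power_24gb
print(round(hypothetical_3090_with_gddr6))             # ~320 W, as estimated above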
 
I think people fail to separate GPU power from memory power. If you take the GPU alone, Ampere is actually still plenty efficient; GDDR6X is adding a lot to the power draw. I get that overall power usage is higher. As an example, 16GB of GDDR6 is only about 20W according to Igor's Lab charts for RDNA2, while the 3090's GDDR6X is around 60W. So if Nvidia had used GDDR6, the 3090 could be sitting around 320W.

Also, a new report recently said Nvidia inked a multibillion-dollar contract with Samsung again, so I don't think Nvidia cares too much about power on the GPU side; it looks plenty efficient. What they likely expect to improve over time is power usage on the GDDR6X. They're working closely with Samsung, so they will obviously keep using the newer process for next-gen stuff.
I appreciate your feedback; let's try to get this thread back in line with some DLSS discussion. Has anyone taken any screenshots showing the DLSS implementation in Cyberpunk?
 
I appreciate your feedback; let's try to get this thread back in line with some DLSS discussion. Has anyone taken any screenshots showing the DLSS implementation in Cyberpunk?

No, but over the past two weeks I've seen countless comparison videos going over it.
 
I appreciate your feedback; let's try to get this thread back in line with some DLSS discussion. Has anyone taken any screenshots showing the DLSS implementation in Cyberpunk?
I have not, but I can say that performance is rough and there are lots of issues with lighting; Ultra is worse. Balanced is acceptable, and Quality improves things because the game forces TAA, which smudges everything anyway.

There is a hex edit to disable TAA, which results in a massive frame boost but renders DLSS unusable, so RT is really impossible. You also have to turn screen space reflections off, as without TAA there is a lot of light noise.
 
I have not, but I can say that performance is rough and there are lots of issues with lighting; Ultra is worse. Balanced is acceptable, and Quality improves things because the game forces TAA, which smudges everything anyway.

There is a hex edit to disable TAA, which results in a massive frame boost but renders DLSS unusable, so RT is really impossible. You also have to turn screen space reflections off, as without TAA there is a lot of light noise.
Interesting, thanks for sharing. Just to be sure: if you hack TAA off, skip DLSS, and run at 4K, the RT features won't work okay?
 
Interesting, thanks for sharing. Just to be sure: if you hack TAA off, skip DLSS, and run at 4K, the RT features won't work okay?

If you turn off TAA you can still use RT. There are a couple of issues with disabling TAA: one, the screen space reflections create a ton of noise outside of ultra ray tracing, so you either run ultra RT or turn off SSR. Two, without DLSS your frames tank; I get 23-30 fps at 3440x1440 with a 2080 Ti OC'd to 2100 MHz and my processor at 5.1 GHz, without DLSS, with ultra ray tracing, and no TAA.

DLSS relies on TAA to function (from what I have gathered), hence the loss of DLSS when TAA is disabled.

You also cannot enable any other kind of anti-aliasing yet; it's just hex editing to turn TAA off.

Reddit: TAA Hex Edit to disable
 
I have not, but I can say that performance is rough and there are lots of issues with lighting; Ultra is worse. Balanced is acceptable, and Quality improves things because the game forces TAA, which smudges everything anyway.

There is a hex edit to disable TAA, which results in a massive frame boost but renders DLSS unusable, so RT is really impossible. You also have to turn screen space reflections off, as without TAA there is a lot of light noise.
I've always had a love-hate relationship with TAA. It does its job best in motion, which screenshots can't capture or do justice to, yet subjectively it's much blurrier than other, also slightly blurry, AA options like MSAA and SMAA. The one tried-and-true showstopper has always been downsampling like SSAA, which eliminates jaggies and maintains the same texture quality as the non-AA native picture. Unfortunately, modern titles are so heavy on the GPU that it's really hard to render at anything above the resolution you're already targeting. On older titles it's a godsend, and can give them life far beyond their debut presentations. I wonder if DLSS will become more flexible in the future, or if it really is just an all-or-nothing scenario due to the way the machine learning works from a single type of estimation. They would have to train the algorithm on every single setting in a game to offer blanket coverage of a title. Right now it sounds like you get one true reference scenario to work with.

I also took a screenshot from the GN video where they show a blind test of one native 4K image and four DLSS images, asking you which one isn't DLSS. I put the screenshot in Paint and highlighted some of the areas that DLSS still has to work on with its estimation. I marked in RED which ones were DLSS, with the green one being the native 4K image. I'd like to hear more from DLSS owners specifically on how it handles tessellation and bump mapping; it seems like some of that information is being lost in the process. You have to remember, though, there are a lot of gamers who genuinely don't care as long as it's good enough. Most of us long-time forum members probably aren't in that boat, so to each their own:


[Attachment: DLSScyberpunk2077.png, annotated comparison screenshot]


Take a look towards the end of the video and you can see exactly which letter corresponds to the actual amount of DLSS being applied. Unfortunately none of them stand up to the native. In my opinion, I'm seeing DLSS as something similar to TAA, where it's probably much better viewed in motion, yet not quite ideal.
 
GN just did a video focusing on CP2077 DLSS specifically, with some added hilarious commentary about nVidia shenanigans:


I wonder if the advantages 4K DLSS Quality had over 4K native would be nullified with some AA implementation.

4K DLSS Performance vs 1080p native is impressive, though. How do they compare performance-wise?
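One way to frame that last comparison, assuming the usual Performance-mode ratio of 50% per axis (an assumption, not something stated in the video or this thread): 4K DLSS Performance renders internally at roughly 1920x1080, i.e. the same shaded pixel count as native 1080p, plus the fixed cost of the DLSS pass itself.

Code:
# Pixel-count comparison behind "4K DLSS Performance vs 1080p native".
# Assumes the usual Performance-mode ratio of 50% per axis.
native_1080p = 1920 * 1080
dlss_perf_internal_4k = (3840 // 2) * (2160 // 2)   # internal render at 4K Performance

print(native_1080p, dlss_perf_internal_4k)          # 2073600 2073600 -> identical
# Same number of shaded pixels, so the performance difference is mostly the
# DLSS upscale pass (a small fixed per-frame cost) versus a plain upscale.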
 
I've always had a love-hate relationship with TAA. It does its job best in motion, which screenshots can't capture or do justice to, yet subjectively it's much blurrier than other, also slightly blurry, AA options like MSAA and SMAA. The one tried-and-true showstopper has always been downsampling like SSAA, which eliminates jaggies and maintains the same texture quality as the non-AA native picture. Unfortunately, modern titles are so heavy on the GPU that it's really hard to render at anything above the resolution you're already targeting. On older titles it's a godsend, and can give them life far beyond their debut presentations. I wonder if DLSS will become more flexible in the future, or if it really is just an all-or-nothing scenario due to the way the machine learning works from a single type of estimation. They would have to train the algorithm on every single setting in a game to offer blanket coverage of a title. Right now it sounds like you get one true reference scenario to work with.

I also took a screenshot from the GN video where they show a blind test of one native 4K image and four DLSS images, asking you which one isn't DLSS. I put the screenshot in Paint and highlighted some of the areas that DLSS still has to work on with its estimation. I marked in RED which ones were DLSS, with the green one being the native 4K image. I'd like to hear more from DLSS owners specifically on how it handles tessellation and bump mapping; it seems like some of that information is being lost in the process. You have to remember, though, there are a lot of gamers who genuinely don't care as long as it's good enough. Most of us long-time forum members probably aren't in that boat, so to each their own:



Take a look towards the end of the video and you can see exactly which letter corresponds to the actual amount of DLSS being applied. Unfortunately none of them stand up to the native. In my opinion, I'm seeing DLSS as something similar to TAA, where it's probably much better viewed in motion, yet not quite ideal.
TAA causes a lot of ghosting in motion; I personally find both TAA and DLSS worse in motion.
 
Take a look towards the end of the video and you can see exactly which letter corresponds to the actual amount of DLSS being applied. Unfortunately none of them stand up to the native. In my opinion, I'm seeing DLSS as something similar to TAA, where it's probably much better viewed in motion, yet not quite ideal.
None of them are as good as native... but some of them are pretty close. And really, the thing to look at is how they compare against their render resolution just upscaled normally. If you can run native, do it; that is always the best answer. However, if that gives you too low an FPS, that's where DLSS can be useful. Its newer incarnations seem to do a much better job than a regular upscale. They had some good comparisons of how things looked at 1080p upscaled vs 4K DLSS with 1080p internal rendering.

It's an option if you don't want to lower settings but also find the FPS too low for your taste, and it is a good one in that regard, particularly for 4K displays. Even the latest cards aren't blazing fast at 4K, but they can usually crush it at 1440p. So you might be in a situation where a game runs 4K at a bit over 60 fps, or you can drop to 1440p and push 100+. If you have a nice, fast 4K monitor, that kinda sucks: you'd like the frame rate, but that means a lower resolution... Well, DLSS gives a third option: run at native output resolution with just a touch less quality, and get FPS closer to (though not as high as) the 1440p results. So you might get something that looks very close to native, possibly imperceptible during actual gameplay, but with FPS more in the 90s rather than the 60s.
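A toy model of that tradeoff, purely illustrative: it assumes the game is pixel-bound, that DLSS Quality at 4K renders internally at about 1440p, and a small fixed per-frame cost for the DLSS pass (the 1.5 ms figure is an assumption, not a measurement):

Code:
# Toy frame-time model for the "4K ~60 fps vs 1440p ~100 fps vs 4K DLSS" tradeoff.
# Assumes a pixel-bound workload and a fixed per-frame DLSS cost (both assumptions).
fps_4k_native = 60.0
fps_1440p_native = 100.0
dlss_overhead_ms = 1.5                                  # assumed DLSS pass cost

# DLSS Quality at 4K output renders internally at ~2560x1440, so the shading
# cost is roughly the 1440p frame time, plus the DLSS pass on top.
frame_ms_1440p = 1000.0 / fps_1440p_native              # 10 ms
frame_ms_4k_dlss_quality = frame_ms_1440p + dlss_overhead_ms

print(round(1000.0 / frame_ms_4k_dlss_quality))         # ~87 fps: well above native
# 4K (60) and approaching native 1440p (100), i.e. closer to, though not as
# high as, the 1440p results.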
 
DLSS is a blockbuster in this game. Without it, you can forget about high refresh rates; unless you're okay with 60 fps, you'll need it.

Also, after watching the GN video and testing it myself, I would say DLSS actually improves the quality. It gives a softer look, more like console games, which I find tends to be more realistic (i.e., more like a movie).
 
DLSS is a blockbuster in this game. Without it, you can forget about high refresh rates; unless you're okay with 60 fps, you'll need it.

Also, after watching the GN video and testing it myself, I would say DLSS actually improves the quality. It gives a softer look, more like console games, which I find tends to be more realistic (i.e., more like a movie).
Isn't DLSS becoming a crutch for poor development, like dithering? 🙁
 
None of them are as good as native... but some of them are pretty close. And really, the thing to look at is how they compare against their render resolution just upscaled normally. If you can run native, do it; that is always the best answer. However, if that gives you too low an FPS, that's where DLSS can be useful. Its newer incarnations seem to do a much better job than a regular upscale. They had some good comparisons of how things looked at 1080p upscaled vs 4K DLSS with 1080p internal rendering.

It's an option if you don't want to lower settings but also find the FPS too low for your taste, and it is a good one in that regard, particularly for 4K displays. Even the latest cards aren't blazing fast at 4K, but they can usually crush it at 1440p. So you might be in a situation where a game runs 4K at a bit over 60 fps, or you can drop to 1440p and push 100+. If you have a nice, fast 4K monitor, that kinda sucks: you'd like the frame rate, but that means a lower resolution... Well, DLSS gives a third option: run at native output resolution with just a touch less quality, and get FPS closer to (though not as high as) the 1440p results. So you might get something that looks very close to native, possibly imperceptible during actual gameplay, but with FPS more in the 90s rather than the 60s.
And remember: static images are not motion. I notice DLSS when standing still, but in most games you don't stand there the entire time; you are doing things.
 