More DLSS...

It's Nvidia's biggest downside: they come out with some neat tech, but then they make it proprietary and it never sees widespread adoption because of it.

True to an extent, but they can't exactly open source tensor cores.
 
It's Nvidia's biggest downside: they come out with some neat tech, but then they make it proprietary and it never sees widespread adoption because of it.
Yeah, it's sad. I think Nvidia does great research and development, but most of what they've done over the years has come and gone (or at least never become widespread) due to its proprietary nature.

But it does make sense that they want to sell more GPUs; that is their business.

And on the other side you have AMD, who open-sourced their sharpening filter; it was quickly ported to ReShade, and then Nvidia made their own version.

At least the AMD approach results in progress across the board, but I'm not sure it helps their sales (it definitely helps their image/branding, though).
 
Yeah, it's sad. I think Nvidia does great research and development, but most of what they've done over the years has come and gone (or at least never become widespread) due to its proprietary nature.

But it does make sense that they want to sell more GPUs; that is their business.

And on the other side you have AMD, who open-sourced their sharpening filter; it was quickly ported to ReShade, and then Nvidia made their own version.

At least the AMD approach results in progress across the board, but I'm not sure it helps their sales (it definitely helps their image/branding, though).

I think it helps AMD in different ways; in this case I think it helped them land the console deal. Nvidia's closed-door nature has hurt them more than it's helped with other companies. I understand wanting to keep things in house like Nvidia does, since then you know it's implemented properly, an area where AMD suffers. On the flip side, Nvidia has gained in the professional market, though, as their reputation there is quite good, so there are pros and cons to both approaches.
 
I think tensor cores/Tensor Processing Units/neural processing units are essentially open source already. Nothing mysterious about them. They aren't an Nvidia invention. Everyone is adding these to chips now.
https://github.com/UCSBarchlab/OpenTPU

Interesting, though I doubt NVIDIA's implementation is like theirs. Just because they execute the same function doesn't necessarily mean they'd be identical. But if TPUs are common enough, AMD has no excuse not to have a DLSS equivalent for Big Navi.
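
For what it's worth, the math these units perform is simple and well documented: a fused matrix multiply-accumulate over small tiles (on Turing, FP16 inputs with FP32 accumulation). Here's a minimal numpy sketch of one tile's worth of work, purely to illustrate the operation rather than the hardware; the function name is just for illustration.

Code:
import numpy as np

def tensor_core_tile_mma(a, b, c):
    # D = A @ B + C for one 4x4 tile: FP16 inputs, FP32 accumulation.
    return a.astype(np.float32) @ b.astype(np.float32) + c

a = np.random.rand(4, 4).astype(np.float16)
b = np.random.rand(4, 4).astype(np.float16)
c = np.zeros((4, 4), dtype=np.float32)
d = tensor_core_tile_mma(a, b, c)  # real accelerators batch huge numbers of these tiles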
 
Revisiting DUTM in 4K, DLSS in "Performance" mode due to the limitations of an RTX 2070 non-Super. 45-60 fps depending on the environment.

Love the mood and atmosphere this game has and what RT does to enhance that.

[Screenshots: Deliver Us The Moon: Fortuna, 2020.03.30]
 
DLSS can be the real killer feature if most PC games, or at least a good number of them, use it. Ampere's potential is becoming way more interesting than RDNA2's. I wonder how well DLSS does in VR? Do any VR titles use it yet?
 
DLSS can be the real killer feature if most PC games, or at least a good number of them, use it. Ampere's potential is becoming way more interesting than RDNA2's. I wonder how well DLSS does in VR? Do any VR titles use it yet?
That's the thing, of course, and I have no idea how much work/$$$ it takes to implement DLSS 2.0 at this stage.
As for VR, I haven't seen anything about DLSS there yet. I also don't look for it since I don't do VR myself (situational awareness issues).
 
Adding this little clip, DLSS 2.0 "Performance" mode, 4K.

 
I think tensor cores/Tensor Processing Units/neural processing units are essentially open source already. Nothing mysterious about them. They aren't an Nvidia invention. Everyone is adding these to chips now.
https://github.com/UCSBarchlab/OpenTPU
Nvidia themselves are in the open source game for tensor cores.
https://github.com/nvdla/ <-- the same design as the cores on the Xavier (latest Tegra), afaik.

Though Toshiba semicon (the analog IC side of Toshiba, not related to NAND) has noted some difficulty in using it, and apparently offers services to help.
https://toshiba.semicon-storage.com...for_NVDLA-Based_DNN_SoC_Design_Whitepaper.pdf
 
Nvidia themselves are in the open source game for tensor cores.
https://github.com/nvdla/ <-- the same design as the cores on the Xavier (latest Tegra), afaik.

Though Toshiba semicon (the analog IC side of Toshiba, not related to NAND) has noted some difficulty in using it, and apparently offers services to help.
https://toshiba.semicon-storage.com...for_NVDLA-Based_DNN_SoC_Design_Whitepaper.pdf

Point is, Tensor/neural cores are ubiquitous. If AMD wants to implement them, they will have no real issue other than the silicon they occupy. It looks like for RDNA2, they will be doing any neural net processing on traditional cores.
 
Another DF DLSS video. Points out the artifacts I have noticed in Control:


Artifacts have been mentioned by some in DUTM with DLSS 2.0 as well.
I have not seen them in my playthroughs, for whatever reason.
 
Another DF DLSS video. Points out the artifacts I have noticed in Control:


Also, this is a really well-done side-by-side comparison. Besides some issues with artifacts, 2.0 is working very well. Very cool tech.
 
I saw some artifacts from time to time, but I actually thought they looked interesting (sort of like glitches in the Matrix, given the nature of the game).

Also, we are talking about around a 50% performance boost. I'll accept some loss in fidelity to get that level of performance (especially considering how hard it is to catch the artifacts).
 
I just hope Nvidia hurries up and releases new drivers with a fix for HDR. I can't run 445.75 or .78 and use HDR in games. It looks super washed out and insanely bright. Changing settings does nothing for it. I rolled back to the 442.xx set and everything is perfectly fine again with HDR. :(
 
I just tried it. 540p actually looks good. I didn't even bother testing it before, but it really works.

Nvidia's got some black magic or something; I'm not sure how 540p on a 1080p monitor would be viable, but it is.
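
The pixel math at least explains how much headroom the reconstruction has to make up: 540p is exactly half of 1080p on each axis (which lines up with Performance mode's 50% scaling), so the GPU only shades about a quarter of the output pixels. A quick sanity check, just arithmetic:

Code:
render = 960 * 540      # 518,400 pixels actually shaded
output = 1920 * 1080    # 2,073,600 pixels displayed
print(render / output)  # 0.25 -- the remaining ~75% is reconstructed temporally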
 
I just tried it. 540p actually looks good. I didn't even bother testing it before, but it really works.

Nvidia's got some black magic or something; I'm not sure how 540p on a 1080p monitor would be viable, but it is.

Careful...nothing ticks off certain elements like the mention of an "NVIDIA black box" ;)
 
I don't have any of the DLSS 2.0 games, but how is the resolution handled? I'm using a 3440x1440 UW monitor. Do you change the in-game resolution and then let DLSS 2.0 do its thing, or is there a specific resolution setting for DLSS in game?

For example, if I wanted to instead render the game at 2560x1080 and let DLSS upscale (or whatever it's doing), how would I do that?
 
I just purchased Control for $30 on the Epic store. I definitely want to give this a go. I thought Wolfenstein and Bright Memory looked great.
 
damstr I only tried it in Control. You just enable the DLSS checkbox, and there is a "render resolution" drop-down right above it with three options.

For me on a 2560x1080 monitor, it gave three choices: 720p, 540p, and something in the middle (can't remember exactly).

Your display resolution stays at native; render resolution is a separate setting that you can change.
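
For a rough idea of what those drop-down options correspond to, the per-axis scale factors usually quoted for DLSS 2.0's modes are about 2/3 (Quality), ~0.58 (Balanced) and 1/2 (Performance); treat these as ballpark figures rather than official numbers. A small sketch applying them to the 3440x1440 ultrawide asked about above:

Code:
# Ballpark per-axis scale factors commonly cited for DLSS 2.0 modes (assumed, not official).
SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def dlss_render_resolution(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(dlss_render_resolution(3440, 1440, "performance"))  # (1720, 720)
print(dlss_render_resolution(3440, 1440, "quality"))      # (2293, 960)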
 
Edward Liu's Nvidia GTC DLSS 2.0 talk:
https://developer.nvidia.com/gtc/2020/video/s22698

Finished watching. Near the end it gives more detail on what is required to integrate DLSS 2.0.

I just hope developers take advantage of this. Because it's Nvidia-exclusive, it will require financial kickbacks to the devs, so adoption will unfortunately be minimal. I wish Apex Legends had this technology; rendering at 240 fps constantly would be a game changer.
 
damstr I only tried it in Control. You just enable the DLSS checkbox, and there is a "render resolution" drop-down right above it with three options.

For me on a 2560x1080 monitor, it gave three choices: 720p, 540p, and something in the middle (can't remember exactly).

Your display resolution stays at native; render resolution is a separate setting that you can change.
Ah, OK, that makes sense! I didn't know "render resolution" was an in-game setting/option.
 
I just hope developers take advantage of this. Because it's Nvidia-exclusive, it will require financial kickbacks to the devs, so adoption will unfortunately be minimal. I wish Apex Legends had this technology; rendering at 240 fps constantly would be a game changer.
I think they will. Nvidia does own the PC GPU gaming market, it's like a free performance bump for the game, and in addition Nvidia will also promote your game. Prior to DLSS 2.0 it just did not work well at all; now it's a totally different case, a feature that makes 4K gaming more practical -> I'd think the monitor manufacturers would also like this.
 
I think they will. Nvidia does own the PC GPU gaming market, it's like a free performance bump for the game, and in addition Nvidia will also promote your game. Prior to DLSS 2.0 it just did not work well at all; now it's a totally different case, a feature that makes 4K gaming more practical -> I'd think the monitor manufacturers would also like this.

Well, PhysX promised to be a revolutionary game changer way back when, until NVIDIA closed it off and it died out. I fear the same thing may happen to DLSS unless they open-source it so AMD, as well as Intel, can take advantage of it. That way everyone benefits from the technology: people with low-end/midrange cards can enjoy high-IQ games, and those with high-end cards can push the super-high fps they want on high-refresh monitors. Unfortunately, that's not how NVIDIA operates, so I think this technology will remain limited, or worse, AMD will create some generic version of DLSS, open-source it, and it will die out too. I've always maintained that NVIDIA needs to own an engine + studio. Unreal Engine kinda comes close to this criteria given NVIDIA's very close relationship with Epic Games, but they need someone like EA on board to build their next Frostbite engine with NVIDIA technology in mind so all future EA games (especially shooters) use DLSS 2.0 -- that would be a game changer.

If NVIDIA had partnered with Riot Games and built DLSS 2.0 into Valorant, that would have enabled people with low-end Turing cards to run 240+ fps, and they would have gotten tons of PR for it, especially since Valorant is a PC-only game. It seems like NVIDIA just isn't proactive enough to shop its technology around, and they don't like to spend money, maybe because their board is shortsighted and wants maximum profits with minimum expenditures.
 
If Nvidia can figure out a way to incorporate DLSS into the Nvidia Control Panel 3D settings, like they've done with Dynamic Super Resolution (DSR), so it requires virtually no cost or effort on the part of developers since it would just be a switch enabled in the driver's game profile, that would be quite disruptive to the competitive landscape. That happening would probably force AMD and Intel to come up with their own competing solutions.
 
If Nvidia can figure out a way to incorporate DLSS into the Nvidia Control Panel 3D settings, like they've done with Dynamic Super Resolution (DSR), so it requires virtually no cost or effort on the part of developers since it would just be a switch enabled in the driver's game profile, that would be quite disruptive to the competitive landscape. That happening would probably force AMD and Intel to come up with their own competing solutions.

NVIDIA is executing very well right now.
AMD is playing catch-up on DXR.
AMD has no feature like DLSS.
AMD has no cards that threaten NVIDIA's top-of-the-line SKUs.

I don't think AMD has the resources to "attack" all 3 avenues at the same time.
And I suspect it will only get worse over time. People need to understand that NVIDIA is not just a hardware company; their software ecosystem is massive, and I doubt DXR and DLSS are the "end" of the APIs/features that NVIDIA will bring to the table.

Pure rasterization performance will slide further and further back as the primary "metric".
I think that this is NVIDIA's way of dealing with the death of Moore's law and that they have been planning this for a long time.

So AMD has an uphill battle...and Intel is going to have to spend a LOT of resources to get involved and become a competitive option vs AMD/NVIDIA.

Fun times ahead.
 
[Stealth edit - leaving this in, but this grayed out text is incorrect - sorry]
Regarding a standard for DLSS, from my understanding, the tech requires both learning (build time / pre-baked) and inference (run time). This is partly why it requires per-game support - you actually need to build the learning set for each scene/game, I believe.
Tensor cores are an open concept and help in this case with inference at run time on the client, based on the model learned per game and data generated from the learning set.

A standard at this time would require both halves of the equation, which is probably going to be less open. DLSS 1 used shader cores, so inference can be done on non-Turing GPUs, I'm sure, but the question is how efficient or real-time it would be, and there's also the sense that one company probably believes the other is benefiting from their build pipeline and investment without necessarily investing themselves in it.

A Vulkan or Microsoft standard would definitely help in this department. I'm sure Sony and Microsoft (and Google and Amazon) are looking at this for next-gen platforms and streaming. This could reduce streaming bandwidth and potentially latency for cloud and mobile games too, so a standard would be nice, but I expect a lot of proprietary, secret-sauce methods in the short term.
 
Regarding a standard for DLSS, from my understanding, the tech requires both learning (build time / pre-baked) and inference (run time). This is partly why it requires per-game support - you actually need to build the learning set for each scene/game, I believe.
Tensor cores are an open concept and help in this case with inference at run time on the client, based on the model learned per game and data generated from the learning set.

Someone hasn't been reading this thread. DLSS 2.0 is universal. It doesn't require per-game training anymore.

The GTC talk I linked above describes the developer intervention needed, which is similar to TAA but slightly different and a bit more involved.

It won't be much more work than a TAA implementation, but you won't be able to force it on in the drivers; it will require developer intervention.
See 40:50 of this video:
https://developer.nvidia.com/gtc/2020/video/s22698
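
One concrete piece of that TAA-like integration work is feeding the upscaler a sub-pixel camera jitter each frame alongside the motion vectors and depth; a Halton (2,3) sequence is a common choice for that jitter in TAA-style pipelines. Below is a generic sketch of such a sequence, with offsets in the [-0.5, 0.5) pixel range; this is the standard technique in the abstract, not code taken from Nvidia's SDK.

Code:
def halton(index, base):
    # Radical inverse of `index` in the given base, giving a value in [0, 1).
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def jitter_offset(frame_index, sequence_length=16):
    i = (frame_index % sequence_length) + 1
    return halton(i, 2) - 0.5, halton(i, 3) - 0.5  # (x, y) sub-pixel offset

for frame in range(4):
    print(jitter_offset(frame))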
 
Honestly, what I'd love NVIDIA to do is bring back the ability to leverage a second card to process the DLSS/AI stuff. Granted, my next upgrade (from a 1080 Ti) will probably be a 3xxx RTX, but it would be nice to help push more people to adopt the technology.
 
Someone hasn't been reading this thread. DLSS 2.0 is universal. It doesn't require per-game training anymore.

The GTC talk I linked above describes the developer intervention needed, which is similar to TAA but slightly different and a bit more involved.

It won't be much more work than a TAA implementation, but you won't be able to force it on in the drivers; it will require developer intervention.
See 40:50 of this video:
https://developer.nvidia.com/gtc/2020/video/s22698
Breakthrough tech, and it is now on version 2.0, which definitely proves the viability and usability of this technique. I would wager versions 2.1, 2.2 . . . 3.0 . . . will improve upon current metrics even further. I could see data captured from internal higher-resolution rendering being used as additional inputs when the game is played at lower resolution; maybe not now, but maybe useful later. I also think this technique may spread into other aspects of rendering, like lighting. I had to take a break from the video because of the over-the-top time spent on TAA and other rendering techniques instead of talking more specifically about DLSS (the "DLSS good, all others bad" discussion -> extremely boring). This is the best tech I've seen Nvidia come up with, they got it working tremendously well, and it is still at an early stage.
 
It had a rough launch, but it now works as advertised. Even better, actually, than I expected.

Sadly, I don't think it will ever catch on as a standard unless it can work in a cross-vendor manner.

It made sense for all DXR games to support DLSS as well (since DXR was somewhat of an Nvidia-exclusive thing in practice, even if the API was theoretically vendor-agnostic).

But now with DX12 Ultimate and AMD entering the ring, it will be interesting to see what happens. If AMD doesn't have a response to DLSS, or some other performance trick, they will be far behind.

Hopefully AMD (and Intel) will find a method comparable to DLSS so it can become a standard in the future.
 
It had a rough launch, but it now works as advertised. Even better, actually, than I expected.

Sadly, I don't think it will ever catch on as a standard unless it can work in a cross-vendor manner.

It made sense for all DXR games to support DLSS as well (since DXR was somewhat of an Nvidia-exclusive thing in practice, even if the API was theoretically vendor-agnostic).

But now with DX12 Ultimate and AMD entering the ring, it will be interesting to see what happens. If AMD doesn't have a response to DLSS, or some other performance trick, they will be far behind.

Hopefully AMD (and Intel) will find a method comparable to DLSS so it can become a standard in the future.
Nvidia is good with developers and will probably get many more games to use it. The results speak for themselves, and I think it's a no-brainer; the industry should adopt it or come up with something as good or better (extremely hard, that part). It would also help if Nvidia can get older or current titles that would benefit from this to update, most if not all RTX titles -> Metro could use this, for example. When there are 20+ top-notch games, the pull to get hardware with DLSS will start to get intense, and games not using it will start to look second-rate as well. Games that use DLSS can push RT effects while keeping good performance; other GPUs, AMD and older generations, will start showing their age or lack of design. I hope AMD has an effective, usable, and actually used technology that is as good (I just doubt it at this time). DLSS does exactly what I would want: sharper textures, anti-aliasing, increased performance, whereas TAA gives soupy, blurry textures and worse performance.
 
I think it's too early to poo-poo AMD or even give Nvidia a whole lot of credit.

DLSS was junk for 1.5 years, was OK for Control, and then, about 2 years later, it's finally really solid. That's a hell of a lot of time to get a big feature working properly. And I still can't shower them with praise until I see that several new releases have DLSS. It's all great and nice in 3 games right now; like a week ago, it was 2 games. That's not enough to get me to forget about the past two years. Or to spout doom for AMD.
 