More DLSS...

I think it's too early to poo-poo AMD or to even give Nvidia a whole lot of credit.

DLSS has been junk for 1.5 years. It was OK for Control, and then, about two years later, it's finally really solid. That's a hell of a lot of time to get a big feature working properly. And I still can't shower them with praise until I see that several new releases have DLSS. It's all great and nice in 3 games right now; a week ago, it was 2 games. That's not enough to get me to forget about the past two years, or to spout doom for AMD.

This is simply an ongoing DLSS conversation thread. Most posters here don't care about "spouting doom for AMD".
Just interested in the tech and having some fun with it.

Nothing to get worked up over.
 
Nvidia designed and built the hardware and implemented DLSS; they already have virtually all the credit. I agree with the criticism of Nvidia's promotion and of DLSS not working well for a good length of time. It is working extremely well now and will most likely continue to improve. It is up to AMD to provide a product that competes well with anything Nvidia has to offer. Not sure why you think AMD is doomed, or where that was hinted at. DLSS is tech that has matured; today it is very viable, effective, and the best reconstruction technique around at this time. If AMD/Microsoft/Sony have other unique tech that can increase performance around 50% while maintaining quality (I would argue DLSS increases quality compared to, say, TAA), then that would be great.
 

AMD dividing their SKUs into "consumer" and "enterprise" might hamper their effort to reach parity in the consumer space.
NVIDIA played it smart by leaving the tensor cores in the RTX 20x0 SKUs, thus being able to leverage their research even more into games and not just enterprise workloads.

I would not be surprised if they add "Game A.I. acceleration" soon...perhaps with Ampere.
NVIDIA is a lot more than a hardware manufacturer today; their software ecosystem is growing at a fast pace.

EDIT:
Example of A.I. being used for new tech:
 

You got that right; their software division is now in a league of its own.
 
 
Honestly, what I'd love NVIDIA to do is bring back the option to use another card to help process the DLSS/AI work. Granted, my next upgrade (from a 1080 Ti) will probably be an RTX 3xxx, but it would be nice to help push more people to adopt the technology.

If different loads could be balanced out efficiently, then I'd be all for that. My favorite example was my GTX 295, though: two GPUs on one card, and it would balance SLI with PhysX nicely (at the time). I haven't seen anything work quite as well since then, so I stopped doing multi-GPU shortly after that. I'm not sure how the communication would work between cards doing disparate operations, though. It seems like a lot of scheduling overhead would be involved if you were mixing workloads. It might be worth it, but maybe not? I do like the idea of spreading tasks out to various ICs on a single card, each dedicated to its task (to avoid system bus limits).

I think it would be cool to have a card with a GPU, a DSP, and an FPGA on it. You could process all kinds of things on that one card.
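As a rough back-of-the-envelope on the inter-card communication concern above, here is a sketch of how long a 1440p frame plus motion vectors would take just to cross PCIe 3.0 x16 to a second card and come back. The buffer formats and the ~16 GB/s bandwidth figure are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope: cost of bouncing a frame to a second card and back over PCIe.
# All figures below are illustrative assumptions, not measurements.

PCIE3_X16_BYTES_PER_SEC = 16e9   # ~16 GB/s usable on PCIe 3.0 x16 (rough)
WIDTH, HEIGHT = 2560, 1440       # 1440p frame

color_bytes = WIDTH * HEIGHT * 8     # assume an RGBA16F color buffer, 8 bytes/pixel
motion_bytes = WIDTH * HEIGHT * 4    # assume RG16F motion vectors, 4 bytes/pixel
payload = color_bytes + motion_bytes

one_way_ms = payload / PCIE3_X16_BYTES_PER_SEC * 1e3
print(f"payload:    {payload / 1e6:.1f} MB per frame")
print(f"one way:    {one_way_ms:.2f} ms")
print(f"round trip: {2 * one_way_ms:.2f} ms out of a 16.7 ms (60 fps) budget")
```

Under those assumptions, roughly a third of a 60 fps frame budget is gone just moving data, before any scheduling overhead.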
 

You would still need to synchronize the data...essentially spreading (fragmenting) the core out...adding latency.
Physics is a bitch.
 

Indeed she is. ;) I do think having the required buses, and maybe a dedicated scheduler, etc. to manage all of it would be cool to have on a card. It does still have to hit the system at some point, but that could be minimized. It would obviously be expensive to do, though. I'm just thinking "out loud" with this; I don't really see a practical way to do it, nor a cost-effective way. Even just wide memory interfaces (like on top-end cards) add a lot of expense. I just think a piece of hardware like this could open up some interesting possibilities. Kind of a modern take on some old SGI-type ideas, in a rough sense.
 

Well, unless you can break the barrier of signals propagating through a circuit at roughly 66% of c (the speed of light), the distance (and hence latency) will always be there.
There is a reason why going smaller (nm) has yielded more performance over the years: you get more compute in the same physical area.

In some things latency really doesn't matter (research, BOINC, supercomputers), as they are all about compute power, but if you want real-world input/output...latency matters.
8.3 ms is fine...33.33 ms is borderline...above that, not really useful.
We are slaves to the laws of physics, simple as that ;)
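For reference, the millisecond figures above are just per-frame time budgets at common refresh rates; a trivial sketch of that arithmetic:

```python
# Per-frame time budgets at common refresh rates.
for hz in (144, 120, 60, 30):
    print(f"{hz:>3} Hz -> {1000 / hz:6.2f} ms per frame")
# 120 Hz -> 8.33 ms ("fine"), 30 Hz -> 33.33 ms ("borderline"), per the post above.
```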
 
Stacked silicon and silicon interposers are an available answer to the GPU + DSP + FPGA on one card idea. Interposers have already seen high-bandwidth use in GPUs for HBM, which does increase performance in that scenario and is even easier to cool, but it's also more expensive for a number of reasons.
 
Just trying out the Minecraft RTX beta. At 1440p with the render chunks maxed out at 24, DLSS 2.0 almost doubles the frame rate with nearly the same visuals as native. That said, the game is a beast maxed out in beta. Some maps perform great, while others, like this one, are very demanding.

(two images attached)
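The frame-rate gain largely comes from shading far fewer pixels internally and reconstructing to the output resolution. A small sketch for a 1440p target, using the commonly cited DLSS 2.0 per-axis scale factors (treated here as approximations rather than official figures):

```python
# Approximate internal render resolutions for DLSS 2.0 modes at a 2560x1440 target.
# Per-axis scale factors are the commonly cited ones, treated as approximations.
TARGET_W, TARGET_H = 2560, 1440
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

for mode, scale in MODES.items():
    w, h = int(TARGET_W * scale), int(TARGET_H * scale)
    saved = 1 - (w * h) / (TARGET_W * TARGET_H)
    print(f"{mode:<12} renders ~{w}x{h} ({saved:.0%} fewer pixels shaded)")
```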
 
I really have a pet peeve about taking photos of a screen and posting them in 2020...so many options to do it properly :p
You can literally just press Print Screen or Alt+Print Screen and paste that into the editor here. Yep, works in all the games I have tried as well.
 

lol. Just being lazy, and I shared the pics with multiple people using my phone.
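If Print Screen feels too manual, a scripted grab is another option; a minimal sketch assuming Pillow is installed (ImageGrab works on Windows and macOS):

```python
# Minimal scripted screenshot as an alternative to Print Screen (assumes Pillow).
from datetime import datetime
from PIL import ImageGrab

shot = ImageGrab.grab()  # grabs the full screen as a PIL Image
shot.save(f"screenshot_{datetime.now():%Y%m%d_%H%M%S}.png")
print("saved", shot.size)
```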
 
More info about RTX Voice:
https://www.nvidia.com/en-us/geforce/forums/broadcasting/18/361740/rtx-voice-beta/

Perhaps we will soon need an A.I./Tensor core sub-forum, as I suspect this is just the "tip of the iceberg"?

(And now I understand why NVIDIA kept the Tensor cores in the consumer products, and why AMD's split of SKUs into consumer/enterprise might not be such a smart idea after all...NVIDIA will now use not only hardware features but also software features as a "weapon" against AMD...it was not just about ray tracing/DLSS)
 
No surprise here. Based on some of Nvidia's video presentations I watched recently, it looks to me like Nvidia is very much invested in their researchers developing consumer applications of their deep-learning tech for gamers and content creators.

Apart from what goes to shareholders, I'm guessing much of the revenue from their "overpriced" cards goes to getting and keeping the best software and hardware engineering talent, and to the billions invested in R&D, which is why they're hard to beat in the GPU game.
 
Yup...NVIDIA's software ecosystem is gearing up; unlike Intel, they are not "resting on their laurels", it seems.
Well, I think we are seeing Nvidia with a clear-cut edge or advantage. If they can make good on it longer term, it will be more like Apple starting with the iPod and moving to the iPhone. This also leads into robotics and a whole next generation of technologies that would be used broadly. Nvidia, though, always seems to piss people off in the end with their proprietary shoving; hopefully they will be smarter overall. As for Intel, cough cough, IBM-like now . . .
 

They piss off a small, far too vocal minority, which is not reflected in their financial numbers...that will never change, but it is really unimportant.
But NVIDIA is not just a company that makes gaming cards anymore.
They have come a long way since this:

(image attached)
 
They get criticism or praise when due. I had an NV1 card back in the day; it worked fairly decently for that time period. Apple has just done proprietary innovations and new tech much better in the past than Nvidia has. Nvidia will have tougher competition coming up in AI, data servers, and hopefully gaming devices. The world is also changing a lot right now; those who can adapt to it faster will do better.
 
DLSS as well?

Crysis Can’t Run Crysis: Remastered
Crytek, which has presumably resumed paying its employees, woke its Crysis twitter account this week to tease everyone about its Crysis Remastered update. There’s no release date yet (aside from the ominous “soon”), but the game will launch for PC, PlayStation 4, Xbox One, and even the Nintendo Switch, which is a first for Crysis.

The game’s temporarily-online webpage, which looks an awful lot like an accidental leak but was probably just architected marketing, claims that the game will move to modern Cryengine with API-agnostic raytracing. API-agnostic ray tracing was already shown by Crytek previously. It still helps to have RT hardware physically on the GPU, but we’re uncertain at this time of the extent to which Crysis Remastered will use NVIDIA’s RT hardware.

Source: https://www.crysis.com/

Additional Source: https://www.pcgamer.com/crysis-rema...-with-ray-tracing-higher-resolution-textures/
 
They do not mention DLSS, unfortunately, but then their engine supports multiple platforms. From Crytek:

https://www.crytek.com/news/crytek-announces-crysis-remastered said:
Crytek to release a remaster of Crysis 1 for PC, Xbox One, PlayStation 4, and Nintendo Switch this summer

The classic first person shooter is back with the action-packed gameplay, sandbox world, and thrilling epic battles players loved the first time around – with remastered graphics and optimizations for a new generation of hardware co-developed on CRYENGINE with Saber Interactive. Starting this summer, Crysis Remastered will be available for PC, Xbox One, PlayStation 4, and for Nintendo Switch.

Crysis Remastered will focus on the original game’s single-player campaigns and is slated to contain high-quality textures and improved art assets, an HD texture pack, temporal anti-aliasing, SSDO, SVOGI, state-of-the-art depth fields, new light settings, motion blur, and parallax occlusion mapping, particle effects will also be added where applicable. Further additions such as volumetric fog and shafts of light, software-based ray tracing, and screen space reflections provide the game with a major visual upgrade.

“We are excited to be working on the Crysis franchise again, and to bring all the Crysis fans a remaster worthy of their passion for the game,” said Crytek CEO Avni Yerli. “It’s an exciting opportunity to be able to bring Crysis back to PCs and current consoles – even Nintendo Switch! – so that a whole new generation of players can experience the thrill of a battle in the Nanosuit.”

In Crysis 1, what begins as a simple rescue mission becomes the battleground of a new war as alien invaders swarm over a North Korean island chain. Armed with a powerful Nanosuit, players can become invisible to stalk enemy patrols, or boost strength to lay waste to vehicles. The Nanosuit’s speed, strength, armor, and cloaking allow creative solutions for every kind of fight, while a huge arsenal of modular weaponry provides unprecedented control over play style. In the ever-changing environment, adapt tactics and gear to dominate your enemies, in an enormous sandbox world.

You can watch our official trailer here:

 

As for tougher competition in data servers: I have seen thousands of servers in datacenters and never seen an AMD card...plenty of Tesla cards, though.

Crytek showed off some in-house ray tracing with less fidelity (and hexagonal reflections) some time ago, running via shaders...I will wait and see the I.Q./performance before getting excited, because their demo did not impress me...just saying.
 
Google Stadia is a good example of AMD GPUs in the data center. In AI, Nvidia has competition from Google's custom AI chips, Microsoft starting to use Graphcore AI chips, etc. Nvidia Teslas are everywhere indeed, and really well supported by software and the CUDA API. Nvidia is not standing alone by a long shot; there is just too much money involved for others to ignore. Whether Nvidia can maintain their lead and install base is another thing. Like Tesla first using Nvidia AI GPUs and now making their own custom AI chip for self-driving cars.
 
On the Crytek demo: how exactly do you think a 2080 Ti would perform doing real-time ray tracing in a jungle with thousands of trees, plants, and water, with light scattering around for each leaf, stem, branch, trunk, rock . . .? Consider that a 2080 Ti with DLSS does 60 fps at 4K in RTX Minecraft (a very simple environment) and bogs down under water to less than 50 fps. You could probably use it like BF5: reflections only, and then the normal lightmaps, global illumination, etc., the usual gaming cheats. Crytek uses voxel ray tracing, which is very efficient but also limited, though it does very well with complex environments. Not sure about their implementation for reflections.
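To put the jungle scenario in rough numbers, here is a ray-budget sketch. The ~10 gigarays/s figure is the 2080 Ti marketing peak and the rays-per-pixel count is an assumption, so treat this as illustration only:

```python
# Rough ray-budget arithmetic for 4K at 60 fps (illustrative assumptions only).
PEAK_RAYS_PER_SEC = 10e9     # 2080 Ti marketing figure; real scenes achieve far less
WIDTH, HEIGHT = 3840, 2160   # 4K output
TARGET_FPS = 60
RAYS_PER_PIXEL = 8           # assumed: a few bounces plus shadow/reflection rays

needed = WIDTH * HEIGHT * RAYS_PER_PIXEL * TARGET_FPS
print(f"needed: {needed / 1e9:.1f} gigarays/s "
      f"({needed / PEAK_RAYS_PER_SEC:.0%} of the quoted peak), "
      "before dense foliage makes every ray's BVH traversal far more expensive")
```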
 