NVIDIA Adaptive Shading Tested in Wolfenstein II: The New Colossus

Megalith

Bethesda released a new patch for Wolfenstein II: The New Colossus earlier this week that adds support for an RTX feature called Adaptive Shading, which “adjusts the rate at which portions of the screen are shaded, meaning the GPU has less work to do, boosting performance.” Guru3D found it provided “a 5% performance differential” without sacrifices to image quality, while The Tech Report, in more extensive tests, determined there was no downside to having the option enabled.

Overall, content-adaptive shading is another intriguing Turing technology that seems to be in its infancy. All three of the Turing cards we have on the bench so far aren't lacking for shader power, and they're plenty capable of running Wolfenstein II at impressive frame rates even at 4K with maximum settings to start with. We're curious what CAS might do for potential lower-end Turing cards as a result of this testing, but for now, the tech is simply making great performance a little bit better.
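For anyone curious what "adjusting the shading rate per screen region" actually means, here's a rough, hypothetical Python sketch of the general idea behind content-adaptive shading. The tile size, contrast thresholds, and rates below are made up for illustration and are not NVIDIA's actual algorithm: low-contrast tiles (flat sky, smooth walls) get shaded at a coarser rate, cutting total shader work while high-detail tiles keep full quality.

```python
# Hypothetical sketch of content-adaptive shading (CAS): pick a shading
# rate per screen tile based on local contrast. All thresholds, tile
# sizes, and rates here are illustrative, not NVIDIA's real heuristics.

def tile_contrast(tile):
    """Max minus min luminance within a tile (values in 0.0-1.0)."""
    flat = [px for row in tile for px in row]
    return max(flat) - min(flat)

def pick_shading_rate(contrast, low=0.05, high=0.25):
    """Return shader invocations per 2x2 pixel quad."""
    if contrast < low:
        return 1   # coarse: one shade shared by four pixels
    if contrast < high:
        return 2   # medium: one shade per two pixels
    return 4       # full rate: one shade per pixel

def shading_work(tiles):
    """Fraction of full-rate shading work actually performed."""
    rates = [pick_shading_rate(tile_contrast(t)) for t in tiles]
    return sum(rates) / (4 * len(rates))

# A flat sky tile, a mid-detail wall tile, and a high-contrast edge tile:
sky  = [[0.70, 0.70], [0.70, 0.71]]
wall = [[0.40, 0.50], [0.45, 0.55]]
edge = [[0.05, 0.90], [0.10, 0.95]]
print(shading_work([sky, wall, edge]))  # < 1.0: shading work was saved
```

The real feature runs on the GPU via variable-rate shading hardware, with the rate decision fed by the previous frame's content, but the core trade is the same: spend fewer shader invocations where the eye won't notice.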
 
But it's only for RTX cards, where only about half the GPU is used for games, while a quarter is for AI and another quarter is for ray tracing. This just looks like they're using the AI cores to give the game a 5% boost, so that's using a quarter of the GPU to do this.
 
But it's only for RTX cards, where only about half the GPU is used for games, while a quarter is for AI and another quarter is for ray tracing. This just looks like they're using the AI cores to give the game a 5% boost, so that's using a quarter of the GPU to do this.

I'd like to see this as a feature across all games. I'm not sure why Nvidia can't just let you enable a setting in the control panel to use the AI cores as a shader offload for a marginal performance boost across the board. Most games aren't ever going to use those cores for anything else as it is.
 
At this point I think the people who bought into the RTX camp deserve a break......5% ain't much, but it's a start. Enjoy your performance premium, bleeding-edgers...the rest of us look on with envy.
 
As a 2080 Ti owner, I never have and never will expect anything more than the raw power out of my video card, which I am very happy with.

RT and Adaptive Shading? I could give zero fukks. If a game comes out that I happen to play and it has one feature or another that boosts performance, then I may try it out, but as first-generation tech, it's nothing you should count or plan on other than the raw horsepower.

Ray tracing has at least another four years to become viable, and even then I'm not sure we'll get the 60 FPS, full-ultra ray tracing experience you really need to make it all worthwhile.

RT is turning out to be a no-show.
 
I love that id, and now by extension Bethesda, add these sorts of features to their game engines. Bleeding edge, squeeze every frame possible......but at the same time it’s beyond fucking ironic that their games (almost) always have unprecedented performance.

I mean, a 290X can run Doom maxed at 1080p.
 
At this point I think the people who bought into the RTX camp deserve a break......5% ain't much, but it's a start. Enjoy your performance premium, bleeding-edgers...the rest of us look on with envy.
Like DrBorg said, as someone who values stability in a system above all else, envy isn't what I'm feeling. It's more the feeling you get if you find out a flight you decided not to go on crashed.
 
I love that id, and now by extension Bethesda, add these sorts of features to their game engines. Bleeding edge, squeeze every frame possible......but at the same time it’s beyond fucking ironic that their games (almost) always have unprecedented performance.

I mean, a 290X can run Doom maxed at 1080p.
Bethesda still uses their garbage engine even though they have access to id Tech.
 
I wonder if someone can "hack" new Nvidia drivers to enable this "feature" for us lowly 1070/1080 users...
 
I wonder why they always show scaled-down versions of the pictures when they need to hide... I mean, show there is no difference.
 
So, no news is good news? I mean...no difference...is a zero sum what they were aiming for?
 
It's funny how AFTER the patch I have to jump through hoops to get my Vega 64 to not crash dump every time I try to run Wolfenstein II, when it worked JUST fine before the patch... hmmm. Yeah, I reinstalled drivers, uninstalled drivers, ran driver cleaner, etc. Makes me wonder, to be honest, what the fuck NVIDIA did with this "patch".
 
As a 2080 Ti owner, I never have and never will expect anything more than the raw power out of my video card, which I am very happy with.

RT and Adaptive Shading? I could give zero fukks. If a game comes out that I happen to play and it has one feature or another that boosts performance, then I may try it out, but as first-generation tech, it's nothing you should count or plan on other than the raw horsepower.

Ray tracing has at least another four years to become viable, and even then I'm not sure we'll get the 60 FPS, full-ultra ray tracing experience you really need to make it all worthwhile.

RT is turning out to be a no-show.

I just ordered an RTX 2070, and after reading all the reviews I can find now... wow, a lot of them are failing. Makes me happy I avoided the cheap Gigabyte boards with their shit-tier warranty support.

Still, I'm quite looking forward to it. Just bought a few new games and a new PSU to go with it. Finally going to bother to OC my 8600K. Already got proper cooling, and now I have the PSU and graphics card to take that 5 GHz OC.

As long as the video card doesn't take a huge dump in my case.
 
At this point I think the people who bought into the RTX camp deserve a break......5% ain't much, but it's a start. Enjoy your performance premium, bleeding-edgers...the rest of us look on with envy.
You maybe, but not I. So far I'm hearing nothing but problems for what is essentially an overpriced graphics card. This is DirectX 10 all over again for Nvidia, when DX10 killed the frame rate but DX10.1 did a much better job on AMD hardware.

 
Bethesda still uses their garbage engine even tho they have access to ID tech.

I doubt id Tech would hold up any better under ES or Fallout than Gamebryo/Creation does. It might be more stable, but I'd be surprised if it could handle the demands Bethesda and its trash-tier coding would throw at it.
 
Good. I am still skipping the 2080 Ti I was considering and will move on to upgrade my monitors instead, drinking my watercooled 1080 Ti's blood in the process... I hope it won't take Nvidia two years to release a new card, though.
 
As a 2080 Ti owner, I never have and never will expect anything more than the raw power out of my video card, which I am very happy with.

RT and Adaptive Shading? I could give zero fukks. If a game comes out that I happen to play and it has one feature or another that boosts performance, then I may try it out, but as first-generation tech, it's nothing you should count or plan on other than the raw horsepower.

Ray tracing has at least another four years to become viable, and even then I'm not sure we'll get the 60 FPS, full-ultra ray tracing experience you really need to make it all worthwhile.

RT is turning out to be a no-show.
You have a good perspective on the new Nvidia cards. I'm going to be putting a new system together, probably in the spring. The GPU will be my most expensive component. I have a 970 right now and it works great for what I use it for. However, I realize it's old and limited to 1080p gaming.
Nvidia has not sold me on their RTX cards. I think the price is ridiculous and the RT features are a joke. At this point, I might go with a 2060 or possibly a 2070. If AMD steps up to the plate, I'd consider them as well.
 
Yeah guys I was trying to be a supportive bro here :D Those RTX guys are actually fucked. You know it, I know it. But we don't need to rub it in :) So in this case, 5% framerate improvement................is something.
 
But it's only for RTX cards, where only about half the GPU is used for games, while a quarter is for AI and another quarter is for ray tracing. This just looks like they're using the AI cores to give the game a 5% boost, so that's using a quarter of the GPU to do this.

I'll be curious as to how AMD does it when they include their Radeon Rays technology (their open-source version of RTX for DXR). From what I've read (might just be speculation/rumor based on their workstation cards), they're taking a slightly different approach and not having dedicated pieces of the die for it like Nvidia. Instead, they'll allow for offloading this to a secondary GPU.
 
I wonder why they always show scaled-down versions of the pictures when they need to hide... I mean, show there is no difference.

Who says anyone is hiding something? More tinfoil hat stuff at the HardOCP...
 
I'll be curious as to how AMD does it when they include their Radeon Rays technology (their open-source version of RTX for DXR). From what I've read (might just be speculation/rumor based on their workstation cards), they're taking a slightly different approach and not having dedicated pieces of the die for it like Nvidia. Instead, they'll allow for offloading this to a secondary GPU.

Would that mean that with the simple push of a button you can use your second card for RT or CrossFire? Seems like a nice approach if it can match the horsepower in this specific workload. Dedicated hardware might be more effective, though.
 
Who says anyone is hiding something? More tinfoil hat stuff at the HardOCP...

When it comes to marketing material, I'm not just tinfoil hat; I'm tinfoil t-shirt and boxers too.
Any article that talks about quality but doesn't give you the ability to analyze said quality yourself is a bad article, imho.
 
When it comes to marketing material, I'm not just tinfoil hat; I'm tinfoil t-shirt and boxers too.
Any article that talks about quality but doesn't give you the ability to analyze said quality yourself is a bad article, imho.

You were clearly insinuating that someone (Nvidia, Guru3D, Tech Report, all of them?) was trying to mislead you on this satanically inspired feature that Nvidia added to their RTX cards... You should invest in a large Faraday cage or something.
 
You have a good perspective on the new Nvidia cards. I'm going to be putting a new system together, probably in the spring. The GPU will be my most expensive component. I have a 970 right now and it works great for what I use it for. However, I realize it's old and limited to 1080p gaming.
Nvidia has not sold me on their RTX cards. I think the price is ridiculous and the RT features are a joke. At this point, I might go with a 2060 or possibly a 2070. If AMD steps up to the plate, I'd consider them as well.


I had a factory-OC 970 on my old system; it worked great. On my new system I am running the MSI 2070 that was reviewed here on [H]. It is working great on my 1440p monitor. I was playing Path of Exile over the weekend with everything at MAX, and Task Manager was saying the GPU was running at around 49% utilization, very cool and quiet.
 
You were clearly insinuating that someone (Nvidia, Guru3D, Tech Report, all of them?) was trying to mislead you on this satanically inspired feature that Nvidia added to their RTX cards... You should invest in a large Faraday cage or something.

I did?

I'm pretty sure my post was not bringing up any brands specifically and was mentioning it as a generic issue.
What you make up in your own mind, I'm not the cause of.

But if you're fine with articles not showing proof of what they claim, so be it, your way. I'm just from a place where evidence weighs higher than some random person's subjective metric.
Especially when the evidence is so easy to provide correctly.

Also, you might look up humor just a tap... tad
 