Deus Ex: Mankind Divided DX12 Performance Review @ [H]

Once a developer has code that works, or a game engine that supports it, it is no longer an ongoing, time-consuming effort. We have also been looking at averages rather than the specific scenes or viewpoints where the difference in percentage and in experience is much bigger. A 1% increase in the average could mean a 20% increase in one area of a game, where it is now smooth instead of jerky, and zero increase everywhere else. Averages can be misleading if you don't consider what makes them up.
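To put rough numbers on that (these are made-up, illustrative figures, not measurements from this game), here is a small C++ sketch: one heavy scene gains 20%, yet the overall average barely moves.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical per-scene frame rates (fps), not measured data:
    // nine scenes that do not change at all, plus one heavy scene that
    // goes from jerky to smooth with a 20% gain.
    std::vector<double> before(9, 60.0), after(9, 60.0);
    before.push_back(30.0);  // heavy scene, before
    after.push_back(36.0);   // heavy scene, after (+20%)

    double sumBefore = 0.0, sumAfter = 0.0;
    for (std::size_t i = 0; i < before.size(); ++i) {
        sumBefore += before[i];
        sumAfter  += after[i];
    }
    const double avgBefore = sumBefore / before.size();
    const double avgAfter  = sumAfter / after.size();

    std::printf("Heavy scene: %.0f -> %.0f fps (%+.1f%%)\n",
                before.back(), after.back(),
                (after.back() - before.back()) / before.back() * 100.0);
    std::printf("Average:     %.1f -> %.1f fps (%+.1f%%)\n",
                avgBefore, avgAfter,
                (avgAfter - avgBefore) / avgBefore * 100.0);
    return 0;
}
```

With these numbers the heavy scene reports +20.0% while the overall average reports only about +1.1%, which is exactly how a small average gain can hide a large local one.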

Other hardware features that Nvidia has can also be used if need be, and DX12 will let Nvidia expose those new capabilities much more easily than DX11 did.

DX12 won't let you expose new features more easily. DX is a defined standard of features; anything new would have to come in a new DX12 version.

And yes, I have seen the DX12 stutter festival, and how DX12 makes a game slower when it is heavy on game logic. Huge success there!
 
DX12 is a low-level API (LLAPI) where you have access to hardware features if you want it, even new ones, which allows Nvidia, for example, to really push new hardware features and the supporting software (a.k.a. GameWorks) to get at that access.
And no, DX12 is not one fixed, defined set of features - feature levels 11_0 and 11_1 are the minimum for DX12 hardware, with 12_0 and 12_1 as optional hardware feature levels, and many features within a given feature level can themselves be optional :cool:. In other words, it lets Nvidia and AMD experiment with hardware features in the future without tying them down as hard as previous DX versions did.
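As a rough illustration of how that works in practice (a minimal sketch with error handling trimmed, not production code): a DX12 device only needs feature level 11_0, and the higher levels and individual optional capabilities are queried separately through CheckFeatureSupport.

```cpp
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
// link with d3d12.lib

using Microsoft::WRL::ComPtr;

int main() {
    // Feature level 11_0 is all you need to get a DX12 device at all.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No DX12-capable adapter found.\n");
        return 1;
    }

    // Ask which of the defined feature levels the hardware actually reaches.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels = sizeof(requested) / sizeof(requested[0]);
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                &levels, sizeof(levels));
    std::printf("Max feature level: 0x%x\n",
                static_cast<unsigned>(levels.MaxSupportedFeatureLevel));

    // Individual capabilities are still optional within a feature level
    // and have to be checked one by one.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                &opts, sizeof(opts));
    std::printf("Conservative rasterization tier: %d\n",
                static_cast<int>(opts.ConservativeRasterizationTier));
    std::printf("Rasterizer ordered views supported: %s\n",
                opts.ROVsSupported ? "yes" : "no");
    return 0;
}
```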
 
You can't add a feature to DX12 if it's not supported by MS. Plain and simple. DX12 changes nothing in that respect compared to previous DX versions.
 
Looks like you're right: Vulkan allows vendor extensions, but it is not clear that DX12 does. Still, DX12 has a lot of flexibility, and no current hardware supports every optional feature.
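For comparison, this is roughly what it looks like on the Vulkan side - a minimal sketch that just lists the vendor-specific (VK_AMD_* / VK_NV_*) device extensions a driver advertises; nothing here is specific to this game.

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    // Minimal instance, no validation layers.
    VkApplicationInfo app = { VK_STRUCTURE_TYPE_APPLICATION_INFO };
    app.apiVersion = VK_API_VERSION_1_0;
    VkInstanceCreateInfo ici = { VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
    ici.pApplicationInfo = &app;
    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        // Vendor-specific extensions are advertised right alongside the
        // cross-vendor ones; the VK_AMD_* / VK_NV_* prefixes mark them.
        for (const VkExtensionProperties& e : exts) {
            if (std::strncmp(e.extensionName, "VK_AMD_", 7) == 0 ||
                std::strncmp(e.extensionName, "VK_NV_", 6) == 0) {
                std::printf("vendor extension: %s\n", e.extensionName);
            }
        }
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```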
 
It is a bit more nuanced these days with GPUOpen from AMD; ironically, all those defending AMD as the open-standards company and holding up GPUOpen as the example gave me facepalm moments.
Link below explaining the AMD GPU Services (AGS) library, which is part of GPUOpen just like the Vulkan shader extensions AMD uses.
Version 4.0 of the library includes support for querying graphics driver version info, GPU performance, Crossfire™ (AMD’s multi-GPU rendering technology) configuration info, and Eyefinity (AMD’s multi-display rendering technology) configuration info. AGS also exposes the explicit Crossfire API extension, GCN shader extensions, and additional extensions supported in the AMD drivers for DirectX 11 and DirectX 12.
The extensions are currently disabled again for DX12, but that has more to do with driver compatibility.
http://gpuopen.com/gaming-product/amd-gpu-services-ags-library/
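For context, using AGS from an application looks roughly like this. Treat it strictly as a sketch: the entry points and struct fields below follow my reading of the newer AGS headers and may not match the 4.0 release described above, so check amd_ags.h for whichever version you actually ship.

```cpp
#include <amd_ags.h>   // AGS SDK header from GPUOpen
#include <cstdio>

int main() {
    // Initialise AGS and query basic driver/GPU information.
    // NOTE: field names here are as I recall them from the newer AGS
    // headers and may differ between AGS versions - verify against the
    // amd_ags.h you build with.
    AGSContext* context = nullptr;
    AGSGPUInfo gpuInfo = {};
    if (agsInit(&context, nullptr, &gpuInfo) != AGS_SUCCESS) {
        std::printf("AGS unavailable (non-AMD GPU or old driver).\n");
        return 0;   // fall back to the plain DX11/DX12 path
    }

    std::printf("Driver: %s / Radeon Software: %s, %d device(s)\n",
                gpuInfo.driverVersion, gpuInfo.radeonSoftwareVersion,
                gpuInfo.numDevices);

    // The DX11/DX12 driver extensions (shader intrinsics, explicit
    // Crossfire, etc.) are opted into through the separate
    // agsDriverExtensions* entry points once a device exists; skipped here.

    agsDeInit(context);
    return 0;
}
```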
But that is much more limited compared to the Vulkan extensions, and just like the Vulkan extensions it takes us back to the old days of both manufacturers upping the ante with proprietary, coded-to-the-hardware development and performance.
Cheers
 
You are intentionally being misleading here. What you posted isn't closed source the way GameWorks is. No one will contest that GPUOpen benefits AMD's GCN far more than any other architecture, but that still doesn't make it closed. And it being open means any other manufacturer can in fact make changes and understand what the code does on their hardware. SO NOT THE SAME. And for the love of all that is holy, stop being obtuse with your statements.
 

He didn't say anything about closed source; he is talking about proprietary coding, or extensions that are IHV-specific. AMD thinks just the same way nV does when it comes to these things; they are not holier-than-thou. Please stop trying to say they are better because they are open source, that is BS. The only reason AMD is going open source is that they got their ass handed to them so many times and have no market pull; they had to do it, otherwise they would be nowhere.

If they were truly open, they would not use any IHV-specific extensions. That would make their libraries equal across platforms, but that isn't the case. With shader intrinsics we see the same thing: it's a way to take advantage of their own hardware.

While I feel there is nothing wrong with that, saying they don't do things at the programming level for the benefit of their own hardware and their own business is just wrong; they are doing it and will continue to do it any way they can.
 
Yep, spot on Razor.
Unfortunately, context gets lost when it comes to defending AMD vs Nvidia and how some think AMD's technology and solutions are helping the industry, when in reality they are in the same position as Nvidia and going for any advantage they can get.
An open standard, in my context, as you rightly read it, means something usable and implementable by other companies within a workable, defined framework, or as part of a separate standard/committee. The GPUOpen functions, services and workflow can never be implemented on anything but GCN, due to the proprietary nature of AMD's hardware and the proprietary low-level hooks GPUOpen provides for that architecture.
And as you say, in the context you describe it is not even open source in the true sense, because developers cannot collaborate on it or modify it from its original design (both part of the definition of true open source) due to the low-level architecture hooks GPUOpen provides in such a rigid structure.
AMD spread a lot of FUD when they decided to attack Nvidia with their 'closed source is hurting us' campaign.

Just to add: AMD with GPUOpen will up the ante again on what Nvidia does, IMO; it means Nvidia will feel justified in making some of their tools truly low level as well.
The benefit going forward for AMD is that they can 'sync' the tech between consoles and PCs to a certain extent, but I do think we may now see a more aggressive tech approach from Nvidia in response.
I am not sure how I feel about this going forward; we have seen how such escalation of low-level proprietary tech and solutions impacted gamers in the past. I cannot fault AMD for making great use of their console footprint, and I agree it makes sense, but they have pushed this to a point where Nvidia is now off the leash.
GameWorks was more of a high-level 'plug-in' effects/post-processing tool and suite, but I can see that heavily changing going forward.

Cheers
 
This hypocrisy always amuses me, too. It's an open standard so long as you make hardware exactly how AMD demands.
 
I've finally gotten around to testing this game. I will post detailed performance data when I can, but for now let me just say I was shocked by the MASSIVE impact the following settings have on performance: Motion Blur, Chromatic Aberration and DoF. AMD CHS is also ridiculously demanding and it looks pretty bad; turning it down a notch from Ultra helped massively.

In the in-game benchmark, comparing max settings against max settings with DoF, CA and MB disabled, I gained almost 20% performance.

It's slightly tempting to play devil's advocate and spin this as evidence of an AMD-sponsored game gimping NV hardware, because seriously, that's what people would be saying if the roles were reversed. Overall I'm slightly pleasantly surprised by the graphical fidelity. I still find the skin shaders hideous and the facial animations unnerving, but hey! To each his own.

These are at max settings with forced 16x AF and MSAA off:
[screenshots]
 

It's open source and an open standard, but only AMD supports async because NVIDIA doesn't have Asynchronous Compute Engines™ ©.
 
To tell the truth, I never understood the purpose of motion blur if you have a high frame rate. Motion blur was developed for 24 fps-type frame rates, as in film (which has natural motion blur anyway), to smooth out the motion; if the motion is fluid, your eye adds its own natural blur. DOF is a camera simulation, not an eye simulation, unless you are near-sighted. I can see DOF for cutscene-type shots, but for actual gameplay in an FPS it looks more like uncorrected eyesight, or someone needing glasses. Now, if DOF used eye tracking and knew what you were actually looking at, that would be more realistic. To me the game has better IQ and playability with motion blur and DOF off. The shadows, I think, sort of suck in this game; some objects cast no shadows at all while sitting right next to others that do. AMD CHS sucks, at least in this game, unless they have fixed it.
 
I am also tired of developers trying to simulate cameras instead of eyes. Chromatic aberration is the worst of these effects.
 
At 1920x1080 judging by the screens?

I got this back in early September, with everything maxed but Motion Blur (since I hate it) and MSAA off.
[screenshot]
 

That was 1440p.

Here is 1080p at your settings (Ultra preset, motion blur off)
[screenshot]


And here is 1440p at max settings (16x AF) with CHS, Motion Blur, Chromatic Aberration and DoF all disabled:
[screenshot]
 

I don't even understand why chromatic aberration exists as a game setting. It's essentially a camera flaw that degrades picture quality; if a high-fidelity picture is what you are after, it has very much the opposite effect.
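To show what the effect actually does to an image, here is a toy C++ sketch of the basic idea (real implementations scale the offset radially from the screen centre and run on the GPU, so this is an illustration only): the red and blue channels are sampled at shifted positions, deliberately mis-registering the colour planes.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Tiny CPU-side image with clamped sampling.
struct Image {
    int width, height;
    std::vector<uint8_t> rgb;   // width * height * 3, row-major

    uint8_t at(int x, int y, int c) const {
        x = std::clamp(x, 0, width - 1);
        y = std::clamp(y, 0, height - 1);
        return rgb[(y * width + x) * 3 + c];
    }
};

// "Chromatic aberration": push red one way and blue the other, so the
// colour planes no longer line up. It is a controlled mis-registration
// of the image, which is why it reads as a loss of sharpness.
Image chromaticAberration(const Image& src, int shift) {
    Image dst = src;
    for (int y = 0; y < src.height; ++y) {
        for (int x = 0; x < src.width; ++x) {
            uint8_t* out = &dst.rgb[(y * src.width + x) * 3];
            out[0] = src.at(x + shift, y, 0);  // red pushed right
            out[1] = src.at(x,         y, 1);  // green stays put
            out[2] = src.at(x - shift, y, 2);  // blue pushed left
        }
    }
    return dst;
}

int main() {
    // Build a small horizontal gradient and apply a 3-pixel shift.
    Image img{64, 64, std::vector<uint8_t>(64 * 64 * 3)};
    for (int y = 0; y < img.height; ++y)
        for (int x = 0; x < img.width; ++x)
            for (int c = 0; c < 3; ++c)
                img.rgb[(y * img.width + x) * 3 + c] = static_cast<uint8_t>(x * 4);
    Image out = chromaticAberration(img, 3);
    (void)out;  // a real post-process would write this back to the screen
    return 0;
}
```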
 
I spent $600, mostly on digital lenses for my glasses, to make everything sharp regardless of distance or which way I am looking, without skewing the world - and then I pop into a game and add aberrations, blur, etc., with a rather large performance loss on top? Not! I also wonder why this is even considered an IQ improvement.

Blur can disguise aliasing, poor textures, objects popping into existence and maybe other things, but to me the cure is worse than the problem. Still, having those options there for the people who find them useful is good.
 
What happened to TressFX in this game? Sorry if it was covered previously. Haven't seen it mentioned since launch.

Eidos stated:

We also present Pure Hair, an evolution of the well-known TressFX hair simulation and rendering tech, developed internally by Labs. Compared to the previous version, we have significantly improved rendering, employing PPLL (per-pixel linked list) as a translucency solution. We have also significantly enhanced simulation and utilized async compute for better workload distribution.
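For anyone wondering what the PPLL part of that quote means, below is a CPU-side C++ sketch of a per-pixel linked list as used for order-independent translucency. The real thing is built from UAVs and atomic shader intrinsics, so this only illustrates the data structure, not Eidos's actual implementation.

```cpp
#include <algorithm>
#include <atomic>
#include <cstdint>
#include <vector>

// One translucent fragment stored in the shared node pool.
struct Node {
    float    depth;
    uint32_t color;    // packed RGBA
    int32_t  next;     // index of next node, -1 terminates the list
};

struct PPLL {
    std::vector<std::atomic<int32_t>> heads;   // one list head per pixel
    std::vector<Node>                 pool;    // pre-allocated node pool
    std::atomic<int32_t>              counter{0};

    PPLL(int pixels, int maxNodes) : heads(pixels), pool(maxNodes) {
        for (auto& h : heads) h.store(-1);
    }

    // "Shading" pass: append a translucent fragment to a pixel's list.
    void insert(int pixel, float depth, uint32_t color) {
        int32_t idx = counter.fetch_add(1);
        if (idx >= static_cast<int32_t>(pool.size())) return;  // pool full, drop
        pool[idx].depth = depth;
        pool[idx].color = color;
        // Atomically swap the new node in as the head of this pixel's list.
        pool[idx].next = heads[pixel].exchange(idx);
    }

    // "Resolve" pass: walk the list and sort back-to-front before blending.
    std::vector<Node> gather(int pixel) const {
        std::vector<Node> frags;
        for (int32_t i = heads[pixel].load(); i != -1; i = pool[i].next)
            frags.push_back(pool[i]);
        std::sort(frags.begin(), frags.end(),
                  [](const Node& a, const Node& b) { return a.depth > b.depth; });
        return frags;
    }
};

int main() {
    PPLL lists(4, 16);             // 4 pixels, room for 16 fragments
    lists.insert(2, 0.5f, 0xff0000ffu);
    lists.insert(2, 0.8f, 0x00ff00ffu);
    auto frags = lists.gather(2);  // returns the two fragments, far to near
    (void)frags;
    return 0;
}
```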
 
This game is really not improving much in performance. With SLI under DX12 on the FX 9590 and 2x 1070s, overall scaling sucks, and under DX11 SLI scaling is virtually zero for me. I was hoping to see some improvement in this game before I continue. I will probably complete the game on the Nano rig with the FreeSync monitor - it gives the best gameplay in the end.
 
Has anyone played this at 4K with SLI enabled? Wondering how much it would help.
 
Hi,

I just wanted to chime in with my findings playing DX:MD in SLI with two 980 Tis. As an SLI gamer I like to maximize the utilization of both GPUs and get whatever fps I can at maxed-out Ultra settings. In DX11 I see both GPUs at 95-99% utilization at 2560x1440, and the game runs smoothly at around 60-90 fps. In DX12, however, I see only about 55-75% GPU utilization, the fps actually drops by 40%, and the game plays better on a single GPU with SLI off! I made sure I wasn't going over 6 GB of VRAM usage as well.

So I thought the GPUs must be waiting for the CPU to prepare the scene before drawing, as I have read can happen with DX12, and used Afterburner to monitor CPU usage on all cores. You would assume CPU utilization would be maxed out if that were the case, but what I saw was a mixed bag of numbers with nothing really maxed out at all. So it makes me wonder whether Deus Ex's engine is properly multi-threaded. My feeling is that either at the driver level or in the game code, SLI still isn't optimized for DX12. Got to love DX11, though, where I'm getting my money's worth.

Anyone else notice the same?

Thanks.
 
It's an AMD-sponsored port - you can only lose performance, for zero benefits or added effects, if you use Nvidia GPUs with DX12 in one of those.
 
Well, I don't think that's true anymore as far as an AMD advantage goes, considering the game has been out for a while and drivers should have caught up by now. FWIW, the release notes for the new 378.92 drivers say they have added another SLI profile for DEMD. Got to test them tonight. Thanks.
 