Deus Ex: Mankind Divided DX12 Performance Review @ [H]

Discussion in 'Video Cards' started by Brent_Justice, Nov 18, 2016.

  1. noko

    noko 2[H]4U

    Messages:
    2,515
    Joined:
    Apr 14, 2010
DICE's results in BF1 are disappointing, but they're not done yet. Once I get 1070 SLI I will do some testing; it looks like BF1 is on sale for a rather great price now.

DX 12 will come of age when GPU power increases dramatically and DX 11 becomes the restriction. Start tripling, or even just doubling, the draw calls and it will kill DX 11. The more complex your scenes get from objects, special shaders, etc., the more restrictive DX 11 will become. At this time I would kinda agree with you, since there is no clear example yet showing DX 12 doing something gaming-wise above DX 11. Does that mean it can't? Not at all. Plus you have to consider that driver development incurs a cost, and the more complex the GPU gets, the more a given API can end up blocking access to hardware ability without a LLAPI. A LLAPI allows access to new hardware capability much more easily, for example async compute (the AMD method) with DX 12, which is not available with DX 11 - a 4%-7% gain in DX 12 for AMD in Gears of War.
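    Just to illustrate the draw-call side of that (not from the article - a rough sketch that assumes the usual device/allocator/command-list setup already exists and uses a made-up RecordScene() helper standing in for the actual draws): in D3D12 each worker thread can record its own command list and everything gets handed to the queue in one go, which is exactly what D3D11's single immediate context can't do.

    ```cpp
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <thread>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    // Hypothetical helper: records the draw calls for one slice of the scene.
    static void RecordScene(ID3D12GraphicsCommandList* cl, int slice)
    {
        // SetPipelineState / IASetVertexBuffers / DrawIndexedInstanced ... would go here.
        (void)cl; (void)slice;
    }

    // Record N command lists on N threads, then submit them all at once.
    void SubmitFrame(ID3D12CommandQueue* queue,
                     std::vector<ComPtr<ID3D12CommandAllocator>>& allocators,
                     std::vector<ComPtr<ID3D12GraphicsCommandList>>& lists)
    {
        std::vector<std::thread> workers;
        for (size_t i = 0; i < lists.size(); ++i) {
            workers.emplace_back([&, i] {
                allocators[i]->Reset();
                lists[i]->Reset(allocators[i].Get(), nullptr); // nullptr = no initial PSO
                RecordScene(lists[i].Get(), static_cast<int>(i));
                lists[i]->Close();
            });
        }
        for (auto& t : workers) t.join();

        // One submission for all the work that was recorded in parallel.
        std::vector<ID3D12CommandList*> raw;
        for (auto& l : lists) raw.push_back(l.Get());
        queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    }
    ```

    The same queue model is also where async compute comes in: a second ID3D12CommandQueue created with D3D12_COMMAND_LIST_TYPE_COMPUTE can be fed independently of the graphics queue, which is something D3D11 never exposed.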
     
  2. Shintai

    Shintai 2[H]4U

    Messages:
    4,043
    Joined:
    Jul 1, 2016
    Even Microsoft says DX12 will never replace DX11. That's the entire case behind DX11.3. DX12 is for the sub 1% developers to begin with.

    And every time you talk about the async gains, remember the power increase. Not to mention the work needed behind it is most likely not worth the 5% or whatever gain there is, and it requires fat sponsorships.

    With DX12 you ask developers to do the job Nvidia, AMD or Intel does one time. They need to do it every time, including going back to older games when new GPUs come out. At least if you exclude the constant reuse of old uarchs.

    DX10 was also the perfect API, praised by developers when asked in public. Hated by developers when asked in private. And we all know the rest of the history.

    You are pretty much saying that the entire success of the API depends on sponsorship money that often screws the result in favour of the sponsor. I remember when some people blindly thought it was as easy as a checkbox in the game engine. Oh, how the times have changed.

    What happens when Volta arrives, in terms of older games and DX12? What happens if AMD ever moves on from the same GCN it has reused since 2012? Even between GCN versions with tiny changes it can go bad fast, as we have seen with GCN 1.2 and Mantle.

    Keep the old GPU? Buy the released special version?
     
    Last edited: Nov 26, 2016
    Armenius likes this.
  3. noko

    noko 2[H]4U

    Messages:
    2,515
    Joined:
    Apr 14, 2010
    Once a developer has code that works, or a game engine that does, it no longer remains an ongoing, time-consuming effort. We have also been looking at averages and not at the scenes or viewpoints where it makes a bigger difference in percentage and experience. A 1% average increase could also mean a 20% increase in performance in one area of a game, where it is now smooth instead of jerky, and zero increase in other areas. Averages can be somewhat misleading if you don't consider what makes up that average.
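    To put some made-up numbers on that (just a sketch, not measurements from this game), here's how a 20% gain in one problem spot can all but vanish in the run-wide average:

    ```cpp
    #include <cstdio>

    int main()
    {
        // Hypothetical 10-minute run: 9.5 minutes of GPU-bound scenes at a steady
        // 60 fps, and a 30-second CPU-bound hub area that goes from 25 to 30 fps.
        const double smoothSeconds = 570.0, hubSeconds = 30.0, smoothFps = 60.0;

        auto avgFps = [&](double hubFps) {
            double frames = smoothSeconds * smoothFps + hubSeconds * hubFps;
            return frames / (smoothSeconds + hubSeconds);
        };

        double before = avgFps(25.0);   // jerky hub area
        double after  = avgFps(30.0);   // same spot, 20% faster
        std::printf("average: %.1f -> %.1f fps (%.1f%% overall)\n",
                    before, after, (after - before) / before * 100.0);
        // About a 0.4% change in the average, even though the worst spot
        // improved by 20% and went from jerky to smooth.
        return 0;
    }
    ```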

    Now other hardware features that Nvidia has can also be used if need be. DX 12 will allow Nvidia to expose those new capabilities much more easily than with DX 11.
     
  4. Shintai

    Shintai 2[H]4U

    Messages:
    4,043
    Joined:
    Jul 1, 2016
    DX12 won't allow you to expose new features more easily. DX is a defined standard of features. The features would have to be in a new DX12 version.

    And yes, I have seen the DX12 stutter festival. Or how DX12 makes the game slower if it's heavy on the game logic. Huge success there!
     
  5. noko

    noko 2[H]4U

    Messages:
    2,515
    Joined:
    Apr 14, 2010
    DX 12 is a LLAPI where you have access, if you want it, to hardware features. Even new ones, meaning it allows Nvidia, for example, to really push new hardware features and supporting software a.k.a. GameWorks to get that access.
    No it is not - feature levels 11_0 and 11_1 are the baseline for DX 12 hardware, with 12_0 and 12_1 as optional feature levels. Many of the features in a given feature set can also be optional :cool:. In other words it will allow Nvidia and AMD to experiment with hardware features in the future without tying them down as hard as previous DX versions.
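    For what it's worth, that baseline-plus-optional split is visible right in the API. A rough sketch (assumptions: default adapter, error handling trimmed) of creating a device at the 11_0 baseline and then asking which optional bits the hardware actually has:

    ```cpp
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<ID3D12Device> device;
        // Feature level 11_0 is enough to get a D3D12 device; 12_0/12_1 hardware
        // is not required just to bring the API up.
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device)))) {
            std::printf("No D3D12-capable adapter found.\n");
            return 1;
        }

        // Optional features are reported per device and vary by vendor/generation.
        D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));
        std::printf("Resource binding tier: %d\n", opts.ResourceBindingTier);
        std::printf("Conservative rasterization tier: %d\n",
                    opts.ConservativeRasterizationTier);
        std::printf("ROVs supported: %s\n", opts.ROVsSupported ? "yes" : "no");
        return 0;
    }
    ```

    Whether the engine then actually uses conservative rasterization, ROVs, etc. is up to the developer; the device exists either way, which is the "optional" part.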
     
  6. Shintai

    Shintai 2[H]4U

    Messages:
    4,043
    Joined:
    Jul 1, 2016
    You can't add a feature in DX12 if it's not supported by MS. Plain and simple. DX12 changes nothing in that respect compared to previous DX versions.
     
    Armenius likes this.
  7. noko

    noko 2[H]4U

    Messages:
    2,515
    Joined:
    Apr 14, 2010
    Looks like you're right; Vulkan allows extensions, but that is not clear with DX 12. Still, DX 12 has a lot of flexibility, and no hardware currently on the market uses all the optional features.
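    For comparison (a sketch with assumed setup, not from any shipping game): in Vulkan the vendor extensions show up through the normal extension query, so an app can probe for something like AMD's VK_AMD_shader_ballot and only enable it when present, whereas DX 12 has no equivalent core mechanism and vendors go through side libraries instead.

    ```cpp
    #include <vulkan/vulkan.h>
    #include <cstring>
    #include <vector>

    // Returns true if the given physical device advertises the named device extension.
    bool HasDeviceExtension(VkPhysicalDevice gpu, const char* name)
    {
        uint32_t count = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
        std::vector<VkExtensionProperties> exts(count);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());
        for (const auto& e : exts)
            if (std::strcmp(e.extensionName, name) == 0)
                return true;
        return false;
    }

    // Usage, given an already-chosen VkPhysicalDevice:
    //   if (HasDeviceExtension(gpu, "VK_AMD_shader_ballot")) {
    //       // add it to VkDeviceCreateInfo::ppEnabledExtensionNames
    //   }
    ```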
     
  8. CSI_PC

    CSI_PC 2[H]4U

    Messages:
    2,080
    Joined:
    Apr 3, 2016
    It is a bit more nuanced these days with GPUOpen from AMD; ironically, all those defending AMD, saying they are open standard and using GPUOpen as an example, gave me facepalm moments.
    Link here explaining the AMD GPU Services (AGS) library, which is part of GPUOpen, just like the Vulkan shader extensions used by AMD.
    It is currently disabled for DX12 again, but that has more to do with driver compatibility.
    http://gpuopen.com/gaming-product/amd-gpu-services-ags-library/
    But that is much more limited compared to the Vulkan extensions, and just like the Vulkan extensions this is now getting back to the old days of both manufacturers upping the ante on proprietary coding-to-hardware development and performance.
    Cheers
     
    Armenius likes this.
  9. JustReason

    JustReason [H]ard|Gawd

    Messages:
    1,780
    Joined:
    Oct 31, 2015
    You are intentionally being misleading here. What you posted isn't closed source as GameWorks is. No one will contest that GPUOpen will benefit AMD's GCN far more than any other architecture, but that still doesn't make it closed. And it being open means that any other manufacturer can in fact make changes and understand what the code does on their hardware. SO NOT THE SAME. And for the love of all that is holy, stop being obtuse with your statements.
     
  10. razor1

    razor1 [H]ardness Supreme

    Messages:
    8,080
    Joined:
    Jul 14, 2005
    He didn't say anything about closed source; he is talking about proprietary coding, or extensions that are IHV-specific. AMD thinks just the same way nV does when it comes to these things; they are not holier than thou. Please stop trying to say that because they are open source they are better, that is BS. The only reason AMD is going open source is because they got their ass handed to them so many times and have no market pull; they had to do it, otherwise they would be nowhere.

    If they were truly open source, they would not use any IHV-specific extensions. That would make their libraries equal across platforms, but that isn't the case; with shader intrinsics we see the same thing, it's a way to take advantage of their hardware.

    While I feel there is nothing wrong with that, I do feel that saying they don't do things for the benefit of their hardware and their business at a programming level is just wrong; they are doing it and will continue to do it any way possible.
     
  11. CSI_PC

    CSI_PC 2[H]4U

    Messages:
    2,080
    Joined:
    Apr 3, 2016
    Yep, spot on Razor.
    Unfortunately, though, context gets lost when it comes to defending AMD vs Nvidia and how some think AMD's technology and solutions are helping the industry, when in reality they are in the same position as Nvidia and going for any advantage they can.
    Open standard, in my context as you rightly read it, means usable and implementable by other companies within a workable, defined framework or as part of a separate standard/committee. The GPUOpen functions-services-workflow can never be implemented on anything but GCN due to the proprietary nature of their hardware and the proprietary low-level hooks GPUOpen provides for that architecture.
    And as you say, in the context you describe it is not even open source, as developers cannot collaborate (part of the definition of true open source, along with the ability to modify it from its original design) due to the low-level architecture hooks GPUOpen provides in a rigid structure.
    AMD spread a lot of FUD when they decided to attack Nvidia with their 'closed source is hurting us' line.

    Just to add, AMD with GPUOpen will up the ante again over what Nvidia does, IMO; it means Nvidia will feel justified in making some of their tools truly low level as well.
    The benefit going forward for AMD is that they can 'sync' the tech between consoles and PCs to a certain extent, but I do think we may now see a more aggressive tech approach from Nvidia in response.
    Not sure how I feel about this going forward; we have seen how such escalation of low-level proprietary tech solutions impacted gamers in the past. I cannot fault AMD for making great use of their console footprint, and I agree it makes sense, but they have now pushed this to a point where Nvidia is off the leash.
    GameWorks was more of a high-level 'plug-in' effects/post-processing tool and suite, but I can see that changing heavily going forward.

    Cheers
     
    Last edited: Nov 28, 2016
  12. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    10,098
    Joined:
    Jan 28, 2014
    This hypocrisy always amuses me, too. It's an open standard so long as you make hardware exactly how AMD demands.
     
    Damos, Ieldra, razor1 and 2 others like this.
  13. Ieldra

    Ieldra I Promise to RTFM

    Messages:
    3,364
    Joined:
    Mar 28, 2016
    I've finally gotten around to testing this game. I will post detailed performance data when I can, but for now let me just say I was shocked to see the MASSIVE impact the following settings have on performance: Motion Blur, Chromatic Aberration, and DoF. AMD CHS is also ridiculously demanding and it looks pretty bad; turning it down a notch from Ultra helped massively.

    In the in-game benchmark, comparing max settings against max settings with DoF, CA and MB disabled, I gained almost 20% performance.

    It's slightly tempting to play devil's advocate and spin this as evidence of an AMD-sponsored game gimping NV hardware, because seriously, that's what would be happening if the roles were reversed. Overall I'm slightly pleasantly surprised by the graphical fidelity. I still believe the skin shaders are hideous and the facial animations are unnerving, but hey! To each his own.

    This is at max settings with forced 16xAF and MSAA off.
    [​IMG] [​IMG]
     
    CSI_PC likes this.
  14. Ieldra

    Ieldra I Promise to RTFM

    Messages:
    3,364
    Joined:
    Mar 28, 2016
    It's an open source, open standard but only AMD supports async because NVIDIA doesn't have Asynchronous Compute Engines™ ©
     
  15. noko

    noko 2[H]4U

    Messages:
    2,515
    Joined:
    Apr 14, 2010
    To tell the truth, I never understood the purpose of motion blur if you have high fps. Motion blur was developed for 24 fps-type frame rates, as in film (which has natural motion blur anyway), to smooth out the motion. If your motion is fluid, your eye will have natural motion blur. DOF is a camera simulation and not an eye simulation, unless you are near-sighted. I can see DOF for video-type scenes, but for actual gameplay in an FPS it is more like uncorrected eyesight or someone needing glasses. Now, if DOF had eye tracking so it knew what you were looking at, that would be more realistic. To me the game has better IQ and playability with motion blur and DOF off. The shadows, I think, sort of suck in this game; some objects don't cast shadows at all while sitting right next to others in the scene that do. AMD CHS sucks, at least in this game, unless they have fixed it.
     
    Armenius and Ieldra like this.
  16. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    10,098
    Joined:
    Jan 28, 2014
    I am also tired of developers trying to simulate cameras instead of eyes. Chromatic aberration is the worst of these effects.
     
    PhaseNoise, noko, MaZa and 3 others like this.
  17. Ieldra

    Ieldra I Promise to RTFM

    Messages:
    3,364
    Joined:
    Mar 28, 2016
    20% extra performance by disabling the three main visual abominations? Yes please.
     
    noko, CSI_PC and Armenius like this.
  18. stashix

    stashix [H]Lite

    Messages:
    126
    Joined:
    May 25, 2016
    At 1920x1080 judging by the screens?

    I got this back in early September, with everything maxed but Motion Blur (since I hate it) and MSAA off.
    [​IMG]
     
  19. Ieldra

    Ieldra I Promise to RTFM

    Messages:
    3,364
    Joined:
    Mar 28, 2016
    That was 1440p.

    Here is 1080p at your settings (Ultra preset, motion blur off)
    [​IMG]

    1440p Max settings (16xAF) no CHS, Motion Blur, Chromatic Aberration, DoF.
    [​IMG]
     
    CSI_PC likes this.
  20. MaZa

    MaZa 2[H]4U

    Messages:
    2,215
    Joined:
    Sep 21, 2008
    I don't even understand why chromatic aberration exists in game settings. It's essentially a camera flaw that degrades picture quality. If a high-fidelity picture is what you are after, it has very much the opposite effect.
     
  21. noko

    noko 2[H]4U

    Messages:
    2,515
    Joined:
    Apr 14, 2010
    I spent $600, mostly on digital lenses for my glasses, to make everything sharp regardless of the distance or which way I am looking, without skewing the world - and then I pop into a game and add aberrations, blur, etc., with a rather large performance loss? Not! I also wonder why this is even considered an IQ improvement.

    Blur can disguise aliasing, poor textures, objects popping into existence, and maybe other stuff, but to me the cure is worse than the problem. Still, having those options, which some may find useful, is a good thing.
     
  22. TaintedSquirrel

    TaintedSquirrel [H]ardness Supreme

    Messages:
    7,826
    Joined:
    Aug 5, 2013
    What happened to TressFX in this game? Sorry if it was covered previously. Haven't seen it mentioned since launch.

     
  23. Ieldra

    Ieldra I Promise to RTFM

    Messages:
    3,364
    Joined:
    Mar 28, 2016
    Wasn't TressFX the proprietary solution? PureHair is the GPUOpen evolution of it, IIRC.
     
    CSI_PC likes this.
  24. TaintedSquirrel

    TaintedSquirrel [H]ardness Supreme

    Messages:
    7,826
    Joined:
    Aug 5, 2013
  25. noko

    noko 2[H]4U

    Messages:
    2,515
    Joined:
    Apr 14, 2010
    This game is really not improving that much in performance. With DX 12 SLI on the FX 9590 and 2x 1070s, overall scaling sucks, and in DX 11 SLI scaling is virtually zero for me. I was hoping to see some improvement in this game before I continue. I will probably complete the game on the Nano rig with the FreeSync monitor - it gives the best gameplay in the end.
     
  26. Damos

    Damos Limp Gawd

    Messages:
    365
    Joined:
    Oct 9, 2008
    Has anyone played this at 4K with SLI enabled? Wondering how much it would help.
     
  27. davidm71

    davidm71 [H]ard|Gawd

    Messages:
    1,216
    Joined:
    Feb 11, 2004
    Hi,

    I just wanted to chime in with my findings playing DE:MD in SLI with two 980 Tis. As an SLI gamer I like to maximize the utilization of both GPUs when playing a game and max out whatever fps I can get at maxed-out Ultra settings. In DX11 I noticed both GPUs max out at 95-99% utilization at 2560x1440, and the game runs smooth at around 60-90 fps. However, in DX12 I noticed about 55-75% GPU utilization, the fps actually went down 40%, and the game played better on just one GPU with SLI off! I made sure I wasn't going over 6 GB of VRAM usage as well. So I thought the GPUs must be waiting for the CPU to compute the scene before drawing, like I have read regarding DX12, and I used Afterburner to monitor CPU usage on all cores. You would assume CPU utilization must be maxed out if that's the case, but what I saw was a mixed bag of numbers with nothing really maxed out at all. So it makes me wonder whether Deus Ex's engine is multi-threaded or not. My feeling is that either at the driver level or in the game code, it still isn't optimized for SLI on DX12. Got to love DX11 though, where I'm getting my money's worth.

    Anyone else notice the same?

    Thanks.
     
  28. Michaelius

    Michaelius [H]ardness Supreme

    Messages:
    4,670
    Joined:
    Sep 8, 2003
    It's an AMD-sponsored port - you can only lose performance, for zero benefits or added effects, if you use Nvidia GPUs with DX12 in one of those.
     
  29. davidm71

    davidm71 [H]ard|Gawd

    Messages:
    1,216
    Joined:
    Feb 11, 2004
    Well, I don't think that's true anymore with regard to the AMD advantage, considering the game's been out for a while and drivers by now should have caught up. FWIW, the release notes for the new 378.92 drivers say they have added another SLI profile for DEMD. Got to test them tonight. Thanks.
     
  30. bl4C3y3

    bl4C3y3 n00bie

    Messages:
    22
    Joined:
    Nov 18, 2016