Deus Ex: Mankind Divided Performance and IQ Preview @ [H]

The "hypothetical" part was in reference to the game balanced around you buying microtransactions. Which it's not. So again, who cares? Don't buy them. They don't intrude on the experience at all. I don't like microtransactions, but bitching about them isn't going to make them go away. Not buying them will.

And I'm not sure what you mean by "riddled" with Denuvo. Again, it doesn't impact the experience, so I'm not sure why people care.

Actually it will, and it is good for PC gaming. Next time they won't have that type of crap. PC gaming is more important than consoles, especially nowadays when more money is generated selling PC games than on all consoles combined.
 
OK, but if the game requires a $600+ GPU to stay above 60 FPS, do you consider that to be a good port? I certainly don't.
I don't think it's particularly good looking at low settings either, but maybe that's clouded by the fact that I have to run it at 720p to get an acceptable framerate. There are certainly better looking games which run much better on mid-range or lower-end cards.

I don't think it has anything to do with the quality of the port; it's more that the developer decided not to offer settings so low that all the work they put into the graphics is wasted.

For someone who really wants a smooth framerate, I'd suggest a Freesync/GSync display, which alleviates the need to maintain 60 FPS.
I was running a 290X until recently (I switched to a Fury X, simply for less noise) and, despite what [H] thinks about 4K gaming, I found it adequate for a good number of games at 4K (with reasonable settings), since I only need to maintain a 33 FPS minimum with my monitor to remain in the FreeSync window.
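As a rough sketch of the idea (Python, purely illustrative; the 33 Hz lower bound is the figure quoted above, while the 60 Hz upper bound is an assumed refresh ceiling for that 4K monitor, not something stated here):

# With adaptive sync, any framerate inside the monitor's variable-refresh
# window is displayed without tearing or added judder, so the practical
# target is "stay inside the window", not "stay above 60 FPS".
# The 33-60 Hz range below is an assumption for the 4K FreeSync monitor above.
FREESYNC_MIN_HZ = 33
FREESYNC_MAX_HZ = 60

def in_freesync_window(fps: float) -> bool:
    return FREESYNC_MIN_HZ <= fps <= FREESYNC_MAX_HZ

for fps in (28, 40, 55, 75):
    print(fps, "inside window" if in_freesync_window(fps) else "outside window")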
 
Actually it will, and it is good for PC gaming. Next time they won't have that type of crap. PC gaming is more important than consoles, especially nowadays when more money is generated selling PC games than on all consoles combined.

Hold up, do you have the stats to back that up?
 
For someone who really wants a smooth framerate, I'd suggest a Freesync/GSync display, which alleviates the need to maintain 60 FPS.
It does help a little, but I still find that dropping below 50 FPS or so feels bad on a G-Sync monitor. 33 FPS never feels good.
The larger issue for me is that G-Sync monitors are all 24/27" IPS panels.
I use a 46" FALD VA television as my main display for gaming. I just can't go back from >100,000:1 to 1000:1 contrast or a display 1/4 the size.
Hopefully one day there will be G-Sync OLED TVs.

I hope the DLC is a success and makes them tons of money to finance the next one, as well as the one after that. Eventually I'd like to see a remake of the 2000 release - in 5K.

The DLC is completely optional - I have no plans to use it, but sure, why not for others.
When that sort of thing starts to influence gameplay design - and you're fooling yourself if you think it isn't - then it becomes a problem.
If there was no need for it in the game, they wouldn't sell it.
 
Hold up, do you have the stats to back that up?

The last quote I saw said income was 50/50 console versus PC. I would have thought the percentage would lean higher toward consoles. That's why you want the stats rather than "what I think".
 
Good review though; for my sins I never played a previous Deus Ex game... maybe time to change that!
 
Typo on page 8 - "This is an extreme close-up o fa wall texture." on the wall image comparison (apologies if this has already been picked up)

Good review though; for my sins I never played a previous Deus Ex game... maybe time to change that!
Play it first! lol. Deus Ex: Human Revolution, as old as it is, is still a very nice-looking game. By the time you finish, the DX12 patch should be out and you can start Mankind Divided!
 
Anyone know why windowed mode gives an FPS boost? I get about 20 more FPS than fullscreen, and windowed seems to be the same resolution as fullscreen.
 
Anyone know why windowed mode gives an FPS boost? I get about 20 more FPS than fullscreen, and windowed seems to be the same resolution as fullscreen.

I believe they discussed this exact issue at the beginning of the article! Read it again, lol!
 
What I'm describing is the opposite of what that says. It says that when you use exclusive fullscreen mode you get more FPS; I'm saying that when I use windowed mode I get more FPS.
 
What I'm describing is the opposite of what that says. It says that when you use exclusive fullscreen mode you get more FPS; I'm saying that when I use windowed mode I get more FPS.
OK, well, I thought the two were related in some way! Maybe Brent_Justice would have a theory!
 
Rendering resolution is based on window size in windowed mode.
Framerate is higher because you're running at a lower resolution.
Good call, lol... you'd think I would have spotted that, lol.
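A back-of-the-envelope sketch of why that happens (Python; it assumes frame time scales roughly with pixels rendered, which only holds when the GPU is shading/fill bound, and the window size below is made up for illustration):

# If a "fullscreen-looking" borderless window is actually rendering at a
# smaller client-area size, fewer pixels are shaded per frame, so the
# framerate rises even though the window looks the same size on screen.
def estimated_fps(measured_fps, measured_res, new_res):
    """Scale FPS by pixel count under a purely pixel-bound model (rough!)."""
    return measured_fps * (measured_res[0] * measured_res[1]) / (new_res[0] * new_res[1])

fullscreen = (1920, 1080)   # exclusive fullscreen render resolution
window = (1600, 900)        # hypothetical actual window client area

print(round(estimated_fps(60, fullscreen, window)))   # ~86 FPS, i.e. a 20+ FPS jump

That ballpark jump is in the same range as the "about 20 more FPS" reported above, which is consistent with the window simply rendering fewer pixels.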
 
The game got horrible reviews on Steam and people are returning it in droves, not only because it is a horribly ported game but also because they added microtransactions so you can buy items with real money... bottom line, unacceptable.

My advice: don't buy the game.

So much nope.

The port is excellent because it's scalable; the many options allow PC gamers to tweak it as they like to suit their hardware. Got a weak GPU? Turn down to High and you still get great image quality and density in the city. It still manages to look good on Medium while running very fast; it's a testament to an optimized game.

A bad port is when you turn down the settings and the game still runs like ass while looking like ass.

Ultra in a game with good graphics should not be playable at 60 FPS on current hardware (single-GPU). If it is, then it hasn't pushed graphics to the next level.

Most of the negative reviews are related to the pointless in-game microtransactions, which you simply do not need to enjoy the game. It's not going to matter anyway; the game is selling great on all platforms, and it's the new Crysis: a benchmark for other AAA games to match or exceed in graphics fidelity.

Oh, it's a great fun game too. Fans of the first DE: HR will love this one.
 
This game is demanding because it's using advanced rendering techniques. If you can't run it well, lower the settings and still enjoy a great-looking and atmospheric game.



Oh, PureHair is on every NPC. Very noticeable on female NPCs with longer hair. Doesn't tank performance.

Try HairWorks in Witcher 3 and go fight 4 wolves. Enjoy your slideshow.

Right there is the difference between open-source, optimizable code and GimpWorks DLLs.
 
Still going to be sweet watching the Nvidia cards get whooped in this test every time [H] does GPU reviews!
 
Requiring a 980Ti or faster to stay above a 60 FPS minimum at 1080p on the lowest settings is not "scalable".

You must be joking.
Either that, or you have no idea what you're talking about!
 
Requiring a 980Ti or faster to stay above a 60 FPS minimum at 1080p on the lowest settings is not "scalable".
Why? Is there a rule book that says that, to be scalable, a previous-generation high-end card must be able to play the game smoothly at 60 FPS at 1080p?
 
I for one am excited to see the 480 doing so well in this game, even without the DX12 patch yet.
Looking forward to an update on DX12.
Great review.
 
LOL...either way, this is going to stress my gaming rig. I'm hoping an R9 390 can push it at 1440p in ultra, but I won't hold my breath. This may "force" me into an upgrade. :)
 
Why? Is there a rule book that says that, to be scalable, a previous-generation high-end card must be able to play the game smoothly at 60 FPS at 1080p?

Exactly. And when did 60 fps become the bog standard? Time was, anything over 30 fps was acceptable and 45 fps was very good. I'm not saying 30 fps provides a superb gaming experience; I'm just saying some people got it in their heads that 60 fps is magically going to happen in a brand-spanking-new game like this on a year-plus-old GPU. Y'all have been spoilt by play-it-by-numbers outmoded console ports over the past 6-7 years.

Anyone recall when Oblivion came out and the 7800s/1900s could hardly hit 30 fps? And those cards were less than a year old at the time.

This game uses advanced rendering techniques to a degree we haven't seen before; not surprisingly, it's a GPU stretcher.
 
Why? Is there a rule book that says that, to be scalable, a previous-generation high-end card must be able to play the game smoothly at 60 FPS at 1080p?
Standard displays are 1080p 60Hz these days.
If you buy a cheap basic monitor, that's what you'll typically get.
When people talk about games being "scalable" they mean that the game looks great on high-end hardware, but also scales to run well on lower-end hardware too.

60Hz displays require that the framerate stays above 60 FPS for the game to be smooth.
Deus Ex requires a 980Ti/1070/1080/TitanXP for that even on the lowest settings.

So unless you bought a new GPU in the last three months, or spent $650+ on a GPU previously, the game won't run smoothly at 1080p with everything turned down/off.
That's insane. How could anyone possibly consider that to be a game which "scales well" !?

LOL...either way, this is going to stress my gaming rig. I'm hoping an R9 390 can push it at 1440p in ultra, but I won't hold my breath. This may "force" me into an upgrade.
36 FPS on Medium, 24 FPS on Ultra
 
This game is demanding because it's using advanced rendering techniques. If you can't run it well, lower the settings and still enjoy a great-looking and atmospheric game.



Oh, PureHair is on every NPC. Very noticeable on female NPCs with longer hair. Doesn't tank performance.

Try HairWorks in Witcher 3 and go fight 4 wolves. Enjoy your slideshow.

Right there is the difference between open-source, optimizable code and GimpWorks DLLs.


So with PureHair turned off, does NPC hair behave differently? AFAIK, with The Witcher 3 there is no GPU PhysX; if I understand it right, HairWorks in The Witcher 3 is not accelerated by the GPU at all.
 
Exactly. And when did 60 fps become the bog standard? Time was, anything over 30 fps was acceptable and 45 fps was very good. I'm not saying 30 fps provides a superb gaming experience; I'm just saying some people got it in their heads that 60 fps is magically going to happen in a brand-spanking-new game like this on a year-plus-old GPU. Y'all have been spoilt by play-it-by-numbers outmoded console ports over the past 6-7 years.

Anyone recall when Oblivion came out and the 7800s/1900s could hardly hit 30 fps? And those cards were less than a year old at the time.

This game uses advanced rendering techniques to a degree we haven't seen before; not surprisingly, it's a GPU stretcher.

I wouldn't mind if the game actually LOOKED that good. It does look decent, don't get me wrong. But these "advanced techniques" seem to require considerably higher GPU horsepower to produce average-looking results. So what's the benefit to the user?
 
Exactly. And when did 60 fps become the bog standard? Time was, anything over 30 fps was acceptable and 45 fps was very good. I'm not saying 30 fps provides a superb gaming experience; I'm just saying some people got it in their heads that 60 fps is magically going to happen in a brand-spanking-new game like this on a year-plus-old GPU. Y'all have been spoilt by play-it-by-numbers outmoded console ports over the past 6-7 years.

Anyone recall when Oblivion came out and the 7800s/1900s could hardly hit 30 fps? And those cards were less than a year old at the time.

This game uses advanced rendering techniques to a degree we haven't seen before; not surprisingly, it's a GPU stretcher.

The fact that you're comparing this to Oblivion says it all.
 
Ugh, why are you on this site?

Also, the game is gorgeous in motion. The lighting is insane and the amount of activity per scene is insane. The hate is undeserved and honestly very suspect.
Yep, the nvidiot trolls hate the game! That's just too bad! lol
 
Standard displays are 1080p 60Hz these days.
If you buy a cheap basic monitor, that's what you'll typically get.
When people talk about games being "scalable" they mean that the game looks great on high-end hardware, but also scales to run well on lower-end hardware too.

60Hz displays require that the framerate stays above 60 FPS for the game to be smooth.
Deus Ex requires a 980Ti/1070/1080/TitanXP for that even on the lowest settings.

So unless you bought a new GPU in the last three months, or spent $650+ on a GPU previously, the game won't run smoothly at 1080p with everything turned down/off.
That's insane. How could anyone possibly consider that to be a game which "scales well" !?

36 FPS on Medium, 24 FPS on Ultra

Bit-tech's benchmarks are shit. They used the first 5 minutes of the benchmark inside the game, and the benchmark is not even representative of the game.

Edit: It's 30 seconds used out of the 90-second benchmark.
 
Bit-tech's benchmarks are shit. They used the first 5 minutes of the benchmark inside the game, and the benchmark is not even representative of the game.
On that note, using HardOCP's numbers, a 480 is running the game above 30 fps at Ultra. That means at High it'll probably run at around 60 fps. A $200-300 GPU is hardly $600+. A 290 can hit 50 fps at around high-medium settings.

Could it be better? Yes, but it's not as bad as you make it sound.
 
On that note, using HardOCP's numbers, a 480 is running the game above 30 fps at Ultra. That means at High it'll probably run at around 60 fps. A $200-300 GPU is hardly $600+. A 290 can hit 50 fps at around high-medium settings.

Could it be better? Yes, but it's not as bad as you make it sound.

Could it be worse? Yes, but it's not as good as you make it sound.
 
60Hz displays require that the framerate stays above 60 FPS for the game to be smooth.
Deus Ex requires a 980Ti/1070/1080/TitanXP for that even on the lowest settings.
What complete and utter bullshit. Do you even have this game? Reports of people playing on a 970:
http://www.neogaf.com/forum/showpost...postcount=1070
http://www.neogaf.com/forum/showpost...postcount=1076
http://www.neogaf.com/forum/showpost...postcount=1060
http://www.neogaf.com/forum/showpost...postcount=1810

Digital Foundry:
http://www.eurogamer.net/articles/digitalfoundry-2016-deus-ex-mankind-divided-performance-analysis said:
There is a massive gulf in performance between the lowest and highest settings with ultra averaging around 33 frames per second on a GTX 970 when using the in-game benchmark at 1080p. Thankfully, the game itself does run quicker than this during normal gameplay, and a mix of high and medium settings were enough to reach 60 frames per second. The lower settings also have little issue reaching higher frame-rates so the PC version of Deus Ex should be scalable across a wide range of machines.

Performance in the ACTUAL game, not the non-representative benchmark:

md_computerbasetcu5r.png

Source: Deus Ex: Mankind Divided im Benchmark

EDIT: I am following the NeoGAF performance thread for DE:MD and I have a couple of pointers.

It is recommended to disable Contact Hardening Shadows completely, as it's bugged in its current state (plus it's very demanding; it's AMD's response to Nvidia's PCSS, but it's inferior). The bug is that distant shadows are not rendered at all when CHS is enabled. Compare:
https://abload.de/img/s3_chs_ultracmub2.png
https://abload.de/img/s3_vhovu9y.png
http://i.pi.gy/dPRD.png
http://i.pi.gy/ORqJ.png

Deus Ex Mankind Divided : Screenshot Comparison


Very high shadows are bugged in some places, and High shadows look better:
Deus Ex Mankind Divided : Screenshot Comparison

Very High AO is bugged and introduces some shimmering artifacts even with TAA enabled, and it also does not apply to the character and its immediate surroundings; it's better to use the "On" option (although the distance at which it is applied is shorter).

The built-in sharpening option is too aggressive, leading to oversharpening, but the blur caused by TAA can be mitigated by using ReShade/SweetFX to apply LumaSharpen to the image. Supposedly the results are very good (Durante uses TAA + LumaSharpen in Deus Ex and swears by it).
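For anyone curious what a luma sharpen actually does, here's a minimal sketch of the general technique (a plain Python/NumPy unsharp mask applied to the luma channel only; this is not ReShade's actual LumaSharpen shader, and the strength/clamp values are made up):

import numpy as np

def luma_sharpen(rgb, strength=0.65, clamp=0.035):
    """Sharpen only the luma of an RGB image in [0, 1], shape (H, W, 3)."""
    # Rec.709 luma weights
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    # Cheap 3x3 box blur of the luma (edge-padded)
    padded = np.pad(luma, 1, mode="edge")
    h, w = luma.shape
    blurred = sum(padded[dy:dy + h, dx:dx + w]
                  for dy in range(3) for dx in range(3)) / 9.0
    # High-frequency detail, scaled and clamped so edges don't ring too hard
    detail = np.clip((luma - blurred) * strength, -clamp, clamp)
    # Add the luma detail back to all three channels
    return np.clip(rgb + detail[..., None], 0.0, 1.0)

# Example on a random 64x64 frame stand-in
frame = np.random.rand(64, 64, 3)
sharpened = luma_sharpen(frame)

Sharpening only the luma (rather than each color channel) avoids shifting hues at edges, which is roughly why this kind of pass counters TAA blur without adding obvious color fringing.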

That's all I can remember :)
 
Bit-tech's benchmarks are shit. They used the first 5 minutes of the benchmark inside the game, and the benchmark is not even representative of the game.
Get what you're saying, but the actual benchmark is 90 seconds long and Bit-tech only used the first 30 seconds. They were so desperate to be first to press that they couldn't even wait through the entire benchmark.
 
Get what you're saying, but the actual benchmark is 90 seconds long and Bit-tech only used the first 30 seconds. They were so desperate to be first to press that they couldn't even wait through the entire benchmark.

Whoops, thanks for the correction.
 
I agree completely, this is true and it's a discussion that has been broached here on these forums several times.

If max settings weren't taxing then we wouldn't move forward, I get that.

At the same time, though, look at how this game performs across a range of hardware. Typically, integrated benchmarks are less demanding than the real game; in this case it seems to be the opposite, so some of the criticism is undeserved, I guess.

I downloaded high-quality video of actual gameplay to try to judge IQ and the aesthetic at Ultra settings.

I wasn't impressed. Now, someone posted reports of bugs at some of the higher settings, so there's that.

"Badly optimized" would mean it could do those same things more efficiently; what most people are complaining about is that the shadowing and some of the post effects are not worth the performance hit.

The benchmark is savage, and at 1440p you need to use minimum settings to average above 60, with dips below that.
 
Yeah it is; if you're interested in the game, gameplay benchmarks are the most important.
They often perform worse than the built-in benchmarks, though.

For comparing hardware, though, the benchmark loops are nice because they're always the same, so results are reproducible.

Are you going to review Obduction in VR? It's the first time I'm genuinely interested in VR for a title; it's so damn pretty.
 