FrgMstr

Watch Dogs 2 Video Card Performance Review Part 2 - In Part 2 of our Watch Dogs 2 Video Card Performance Review we go in-depth on WD2 graphics settings and the performance impact of those. We will test the High Resolution Texture Pack, shadow techniques, Ambient Occlusion, the Extra Details slider, and all Anti-Aliasing modes. We test real-world gameplay on the newest AMD and NVIDIA GPUs.
 
Can't wait for part 3 to see what this all means!

Would it be possible to do an animation showing TXAA in motion? Still shots of TXAA don't do it justice due to its temporal algorithm. Very interested in seeing the improved perceived texture quality while using it, though, compared to the old version.
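For anyone curious why still shots undersell it: temporal AA resolves each pixel by blending the current (jittered) frame into a history buffer that was reprojected along motion vectors, so the smoothing only shows up across consecutive frames. Below is a minimal sketch of that accumulation idea; the simple exponential blend and the 0.1 weight are illustrative assumptions, not TXAA's actual filter.

// Toy temporal accumulation: each "frame" blends new samples into a history
// buffer, so the resolved value converges over several frames - something a
// single screenshot cannot show.
#include <cstdio>
#include <vector>

struct Frame { std::vector<float> pixels; };

// Blend the current jittered frame into the accumulated history buffer.
void temporal_accumulate(const Frame& current, Frame& history, float alpha = 0.1f) {
    for (size_t i = 0; i < history.pixels.size(); ++i)
        history.pixels[i] = alpha * current.pixels[i] + (1.0f - alpha) * history.pixels[i];
}

int main() {
    Frame history{std::vector<float>(4, 0.0f)};  // what was resolved last frame
    Frame current{std::vector<float>(4, 1.0f)};  // this frame's new samples
    for (int frame = 0; frame < 6; ++frame) {
        temporal_accumulate(current, history);
        std::printf("frame %d: resolved pixel = %.3f\n", frame, history.pixels[0]);
    }
    return 0;
}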
 
Looks to be about on par with my findings personally. Nice review!
 
The Extra Details setting really knocks some teeth out. Very nice review!
 
So even with a 1080 card, you have options that'll turn the game into a slideshow. Not sure whether to be sad or impressed.
 
So is Watch Dogs 2 SLI/CF compatible? Will you be doing testing for this? Thanks for the testing as always!
 
This game eats quad cores for lunch and will even peg all 8 threads of my overclocked 4770K. The CPU usage and demanding graphics just don't make sense for how the game looks, since it's not really all that impressive. It also has about the worst aliasing I've ever seen; even 4K with maxed-out AA can't fully alleviate it, and of course it's a slideshow at settings much lower than that anyway.
 
Yeah, I think the graphics are damn good, especially the shadows and screen-space reflections. The aliasing is bad though, and the game is unplayable with most forms of AA other than temporal filtering.
 
Nice seeing SLI work out of the gate with this title. Scaling looks to be around 40%, but at least it is something, and hopefully it will improve over time.

I like how HardOCP does the breakdowns in segments; it makes for a great read and gives me ideas to try. The way the experience is described and clarified stands out and is unique to this site. Please keep that up!

The review guide above is fantastic as well; holy smokes, it has a lot of sample comparisons! Unfortunately it misses out on the experience somewhat, such as the artifacts with Temporal Filtering. For in-game motion and actual gameplay, still images fall short without some description of the experience to give them bearing. That can feel disjointed from the real experience, but it is useful in any case.

HFTS shadows from the review guide do look good, but unfortunately they appear to be too performance-robbing to be usable. Ultra shadows look way too hard to me; High and Very High look better. Same with reflections: sometimes the higher settings become unrealistic and overdo the effect, so in other words the lower settings may actually look better. To each their own on that one, though.
 
Brent Justice said:
The features that are NVIDIA features in this game are equally demanding on AMD and NVIDIA GPUs. There is the same level of performance drain enabling these on both AMD and NVIDIA. Where it is not playable on AMD GPUs it is equally not playable on NVIDIA GPUs. There is no vendor bias present according to our testing at all with this game.

Besides the very nice review itself, I think Brent's quoted statement is very important, as it gives us a general idea of how NVIDIA's proprietary features perform across GPUs regardless of brand. (I was rather surprised, to be honest!)
 
The question that popped up in my head is:
At what resolution will a Titan provide playable frame rates with all the IQ maxed out (including 8x TXAA)?
 

1280x720 with everything maxed and temporal filtering off should get about 45-50 fps overall when outside with an overclocked Titan X. I say that because I got about 35-40 fps with a 1080 running at 2050/11000.
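For what it's worth, the arithmetic roughly checks out: a Titan X (Pascal) has about 40% more shaders than a 1080 (3584 vs 2560), so even allowing for somewhat lower overclocked clocks, a ~1.25x bump over 35-40 fps lands right around that 45-50 fps estimate.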
 
Even though I love my 1070 and gaming at 1440p... I have to admit, giving this game a spin early next year on what's rumored to be the 1080 Ti sounds like it could be a good time :)
 
Downscale to upscale? Sounds painful.

Just run at your native resolution and turn some settings down. It doesn't make you any less of a man.

If I understand correctly it would be upscale to downscale.
 
Overall, we can report that NVIDIA features used in Watch Dogs 2 are equally demanding on AMD hardware. We know many gamers are concerned with "GameWorks" technologies, thinking it harms the competition, or thinking that NVIDIA GPUs have a leg up on performance with these features. That couldn’t be farther from the actual truth in Watch Dogs 2.



The features that are NVIDIA features in this game are equally demanding on AMD and NVIDIA GPUs. There is the same level of performance drain enabling these on both AMD and NVIDIA. Where it is not playable on AMD GPUs it is equally not playable on NVIDIA GPUs. There is no vendor bias present according to our testing at all with this game.
Hopefully this helps put this argument to rest.
 

It won't, but I do my best.

There will always be the next game that uses "AMD Something or other" or "NVIDIA Something or other" and someone will cry foul, treachery, unfairness, and sabotage.

Instead, I believe we should embrace new graphics technologies that help improve image quality and the gameplay experience, no matter what form they take, or whom they come from.
 
HFTS? Not sure why an AMD GPU couldn't use the same technique, but that and Ansel look to be the only GameWorks code that is NVIDIA-only. I remember the uproar from some NVIDIA users when Tomb Raider came out with TressFX :ROFLMAO:. AMD freely opened up that code and NVIDIA was able to use it. Will NVIDIA open up the HFTS code for AMD?
 

IIRC, both HFTS and VXAO are implementations of DX11.3/DX12_1 techs (Conservative Rasterization/Raster Ordered Views), so until AMD starts supporting that, it ain't happening.
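If anyone wants to see what their own card reports for those two features, here is a quick sketch using the standard D3D12 CheckFeatureSupport query; nothing in it is specific to Watch Dogs 2 or GameWorks, it just reads back the conservative rasterization tier and ROV support bits.

// Query the DX12_1-class features (conservative rasterization, rasterizer
// ordered views) that HFTS/VXAO are reported to rely on. MSVC/Windows only.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12-capable device found\n");
        return 1;
    }
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));
    std::printf("Conservative rasterization tier: %d\n", (int)opts.ConservativeRasterizationTier);
    std::printf("Rasterizer ordered views:        %s\n", opts.ROVsSupported ? "yes" : "no");
    return 0;
}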
 
BTW, GPUOpen has some low-level functions that cannot be used on NVIDIA.

TBH, I would prefer it if both manufacturers manned up and made their own tech low-level and applicable only to their own hardware. Doing so would mean these post-processing effects are no longer high-level bolt-ons but low-level and integral to the rendering engine.
The benefit: GameWorks would no longer carry a massive performance cost, and AMD could enhance their own comparable features even further without performance cost being the primary limiting factor, making PureHair more detailed and more physics-driven (it is also used in some scenes, such as snow). AMD has many more post-processing functions than that, though.
It would also help volumetric lighting/AO detail and performance.

At some point both companies will have to do this, because we are reaching the point where you need an insane GPU on the latest games, even without HairWorks/GameWorks, when pushing lighting, shadows, 'god rays', and AA beyond what was designed into the core engine and game at a multi-platform (primarily console) level.

Cheers
 
This game runs better on my 14-core/28-thread Xeon than on my i7 with 1080 SLI (maxed details, 1440p). Newer titles seem to prefer a many-core Xeon setup over a CPU with higher frequency but fewer cores.
Call of Duty Infinite Warfare runs equally on both CPUs, Deus Ex Mankind Divided prefers the Xeon, and the same goes for Crysis 3. Far Cry Primal runs equally on both CPUs.

The $349 Xeon 14/28 is for sure a keeper. I can see DX12 taking advantage of more cores in the future; in other words, future games will prefer core count over frequency.
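A rough back-of-the-envelope way to see when core count beats clock speed is Amdahl's law; the sketch below uses a made-up 90% parallel fraction and made-up clock speeds purely for illustration, not measurements from this game.

// Relative frame throughput under Amdahl's law: the serial part of a frame
// only benefits from clock speed, the parallel part from clock * cores.
#include <cstdio>

double relative_throughput(double ghz, int cores, double parallel_fraction) {
    double serial   = 1.0 - parallel_fraction;
    double parallel = parallel_fraction / cores;
    return ghz / (serial + parallel);
}

int main() {
    const double p = 0.90;  // assumed fraction of frame work that threads well
    std::printf(" 4 cores @ 4.5 GHz: %.1f\n", relative_throughput(4.5, 4, p));
    std::printf("14 cores @ 2.6 GHz: %.1f\n", relative_throughput(2.6, 14, p));
    // Once ~90% of the frame is parallel, the slower many-core chip pulls ahead.
    return 0;
}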
 
Well, it would be cool if HardOCP checked that out; it seems like Xeon processors are way cheaper too.
 
This is the nicest-looking game this year, imo. Maxed out, it brings my setup to around 24-34 fps at 4K.
 
You don't need a really good card to play Watch Dogs 2. The graphics are a big downgrade compared to the trailers.
 