Watch Dogs 2 Video Card Performance Review Part 1 @ [H]

No Titan X for comparison? We know at least one of the [H] staff has one :whistle:.
 
Can you guys do a follow-up article where you test the CPU scaling of this game?

I know you love to harp on Hyper-Threading and games not utilizing 8 threads, but this test result makes me curious: is there a noticeable performance improvement going from 4c/4t to 4c/8t?

[attached chart: GameGPU.com CPU benchmark for Watch Dogs 2]


I don't even care if you just simulate the i5 by turning HT off. It would still be useful for people deciding which CPU to buy. You know, put your money where your mouth is, since you have that capability. Prove this chart from GameGPU.com wrong!
 
Thanks for the review. So, with all the advanced graphics settings going past current hardware, I'm thinking of waiting a year or two on this title; I'll most likely pick up some Game of the Year edition for 5 or 10 bucks and have a better card by then to see it in all its detail. Haven't played Skyrim yet ...
 
Can you guys do a follow-up article where you test the CPU scaling of this game?

I know you love to harp on Hyper-Threading and games not utilizing 8 threads, but this test result makes me curious: is there a noticeable performance improvement going from 4c/4t to 4c/8t?

[attached chart: GameGPU.com CPU benchmark for Watch Dogs 2]


I don't even care if you just simulate the i5 by turning HT off. It would still be useful for people deciding which CPU to buy. You know, put your money where your mouth is, since you have that capability. Prove this chart from GameGPU.com wrong!

+1 on this. I'd be very interested in single- and multi-GPU performance using different processors. Still rocking an i7-2700K myself, as I haven't seen much benefit in upgrading anything besides my video card periodically.
 
Excellent work. I hope you'll be including SLI and XF performance in part 2.

Something else you might consider: I didn't spot which 4K monitor you use. Your visual quality settings are going to be very different between a 24" 4K monitor and a 42" 4K monitor. The larger the monitor, the more post-processing is required. For instance, on my 28" 4K monitor, one of the first things I turn off to gain performance is any AA. Indeed, I'll drop all post-processing before dropping texture quality.
 
Why no GTX 1070 @ 1080p testing? The whole reason I went 1070 @ 1080p was to be able to turn all the eye candy on. I know the 'general consensus' is that the 1070 is for 1440p, but as Watch Dogs 2 demonstrates here, not ALL games can be played @ Ultra Quality @ 1440p on the 1070.

I'd love to know if Ultra on the 1070 maintained ~45+ fps (my personal minimum playable) @ 1080p.
 
Very nice review! Temporal filtering is supposed to dramatically improve fps. Nice to see this game supports SLI and CFX configurations. I wonder how the scaling is.
 
These latest games make me wonder if there is any point to 144Hz monitors, or even 100Hz, sigh.
I really would not want to buy a Pascal GTX 1080 just to play at 1080p with higher framerates and better latency.

It seems that for some AAA studios performance is becoming much worse for little gain over the console design (the trend is that additional options beyond the core settings that work fine on console seriously impact performance, but not necessarily for much visual gain), which is strange when quite a few of these studios say their games are designed for PCs.
It is getting a bit silly to need a Titan XP to make good use of a 100Hz monitor at 1080p with near-max settings.

Cheers
 
These latest games make me wonder if there is any point to 144Hz monitors, or even 100Hz, sigh.
I really would not want to buy a Pascal GTX 1080 just to play at 1080p with higher framerates and better latency.

It seems that for some AAA studios performance is becoming much worse for little gain over the console design (the trend is that additional options beyond the core settings that work fine on console seriously impact performance, but not necessarily for much visual gain), which is strange when quite a few of these studios say their games are designed for PCs.
It is getting a bit silly to need a Titan XP to make good use of a 100Hz monitor at 1080p with near-max settings.

Cheers
The game supports multiple graphics cards, which should allow fast monitors to do their thing at higher settings. Plus, temporal filtering should also do the trick.
 
The game supports multiple graphics cards, which should allow fast monitors to do their thing at higher settings. Plus, temporal filtering should also do the trick.
Gamers should not need to buy a Pascal GTX 1080 to achieve 100fps with below-max settings at 1080p, and even then it does not look like it will hold 100fps at 1080p with Ultra settings.
I think most would see it as a problem to have to buy multiple NVIDIA 1070 cards just to hit a framerate consistent enough to make good use of a 100Hz-120Hz monitor at 1080p.

There is something wrong with the industry if one starts to need something that powerful to get even close to 100fps at such a resolution.
The game cannot even average 60fps at 1440p with a GTX 1080, and that is not even at the highest settings...

And I notice the trend also applies to other modern games once one starts to crank visual settings beyond the core ones defined for the console market.
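Just to put rough numbers on the refresh-rate point, here is a quick frame-time budget calculation (a minimal sketch; the refresh rates are the ones mentioned in this thread, and nothing here comes from the [H] review data):

```python
# Frame-time budget per refresh rate: the time the whole frame (CPU + GPU work)
# must fit into if you want to feed the panel a new frame every refresh.
for hz in (60, 100, 120, 144):
    budget_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> {budget_ms:5.2f} ms per frame")
```

At 100Hz the budget is 10 ms per frame versus ~16.7 ms at 60Hz, which is why settings that look like small extras over the console baseline can push a single card out of high-refresh territory.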
 
Can you guys do a follow-up article where you test the CPU scaling of this game?

I know you love to harp on Hyper-Threading and games not utilizing 8 threads, but this test result makes me curious: is there a noticeable performance improvement going from 4c/4t to 4c/8t?

[attached chart: GameGPU.com CPU benchmark for Watch Dogs 2]


I don't even care if you just simulate the i5 by turning HT off. It would still be useful for people deciding which CPU to buy. You know, put your money where your mouth is, since you have that capability. Prove this chart from GameGPU.com wrong!
I did the maths for you, comparing the 2500K vs. 2600K and the 6600 vs. 6700K.
Subtracting ~3% for the clock speed difference, HT adds close to 16% on these 4-core CPUs.
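For anyone who wants to redo that arithmetic, here is the calculation spelled out (a minimal sketch; the fps values are illustrative placeholders, plug in the actual GameGPU numbers yourself):

```python
# Isolate the Hyper-Threading gain from a 4c/4t vs 4c/8t pair on the same architecture.
# The fps values below are illustrative placeholders, NOT the GameGPU results.
fps_4c4t = 100.0        # e.g. i5 (4c/4t) result
fps_4c8t = 119.0        # e.g. i7 (4c/8t) result
clock_advantage = 1.03  # i7 clocked roughly 3% higher than the i5

raw_gain = fps_4c8t / fps_4c4t - 1.0                     # total i7 advantage
ht_gain = fps_4c8t / (fps_4c4t * clock_advantage) - 1.0  # advantage left after removing the clock edge
print(f"raw i7 gain: {raw_gain:.1%}, HT-only gain: {ht_gain:.1%}")
```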
 
I did the maths for you, comparing the 2500K vs. 2600K and the 6600 vs. 6700K.
Subtracting ~3% for the clock speed difference, HT adds close to 16% on these 4-core CPUs.

I can do the math just fine; it's the people on these forums calling this fake that I can't disprove without more benchmarks.

Just pop your head in here to see the 8 thread hate in full force!

https://hardforum.com/threads/kaby-lake-in-january.1918530/
 
The one thing this game does not yet support, however, is DX12. It is disappointing that there has been no official word on whether it ever will. This is the kind of game that is right up DX12's alley: DX12 can help alleviate the CPU bottlenecks that these types of games suffer from.

This is a very large open-world environment with a large amount of AI, physics, level-of-detail management and object loading. If optimized well, DX12 could really make a difference. We have not heard one peep from the developers about whether DX12 is on the drawing board for this game, but we can only hope that it is and that one day we will all be surprised with a DX12 patch.


I really hope not. So far DX12 is a dud, especially on NVIDIA hardware. AFAIK only GoW4 runs well with it. All DX12-patched games look exactly the same as with DX11 but run slower, so why would you want a DX12 patch for WD2?
 
I can do the math just fine; it's the people on these forums calling this fake that I can't disprove without more benchmarks.

Just pop your head in here to see the 8 thread hate in full force!

https://hardforum.com/threads/kaby-lake-in-january.1918530/
I see.
I agree that HT is very useful at times.

There are a few considerations.
It used to be that HT caused fluidity or framerate issues in some games, but this is greatly improved on Skylake at least, to the point of not noticing it.
Multi-threaded GPU drivers and games aware of more than 4 cores have also made a difference.
If there are still issues, setting processor affinity will cure them. There are utilities to do this automatically once configured (I don't need or use them); a rough sketch of the idea is below this post.
I moved from a 6600K to a 6700K at a slightly lower clock speed and have only noticed improvements or the same experience.
No loss of quality here, otherwise I would be disabling HT.
I'm only seeing the benefits.

P.S. On Win7-64 with a 980 Ti.
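For the curious, here is a minimal sketch of the kind of affinity utility I mean, assuming Python with the psutil package; the process name and core list are just examples, not a recommendation:

```python
# Pin a running game to a chosen set of logical processors (assumes psutil is installed).
import psutil

GAME_EXE = "game.exe"   # hypothetical process name of the game you want to pin
CORES = [0, 2, 4, 6]    # e.g. one logical processor per physical core on a 4c/8t CPU

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(CORES)  # restrict the process to these cores
        print(f"Pinned PID {proc.pid} to cores {CORES}")
```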
 
I really hope not. So far DX12 is a dud, especially on NVIDIA hardware. AFAIK only GoW4 runs well with it. All DX12-patched games look exactly the same as with DX11 but run slower, so why would you want a DX12 patch for WD2?

Having the option doesn't mean you are forced to use it. I prefer to have more options at my fingertips. If they spend the time to properly optimize DX12, it has the potential to help in a game like this. DX12 is built to take on games such as this and alleviate CPU bottlenecks. The work, however, has to come from the developer. As I said, I prefer to at least have the option and find out for myself if it helps. If it doesn't, no biggie, just keep running in DX11. It hurts no one to have more options to try.
 
That would cost too much money and too many resources for the testing procedures at [H]. That's how it has always worked here: everything with the latest generation.

So a half-useless article, then? Showing how a demanding new game performs on last-gen hardware seems quite important.
 
So a half-useless article, then? Showing how a demanding new game performs on last-gen hardware seems quite important.

Things are like that here at [H]; there is nothing we can do to change that unless you're willing to pay the [H] bills for Kyle and pay Brent's salary (that's basically what he is going to say anyway). We can always resort to other sites to see what you are asking for.
 
Things are like that here at [H]; there is nothing we can do to change that unless you're willing to pay the [H] bills for Kyle and pay Brent's salary (that's basically what he is going to say anyway). We can always resort to other sites to see what you are asking for.

Lol. Seriously, lol
 
It would be really interesting to see 980 Ti vs. Fury X or 970 vs. 390 comparisons in your performance reviews. Nowadays it's almost impossible to talk about NVIDIA without somebody mentioning how badly NVIDIA's GPUs age, or how NVIDIA gimps their older GPUs by not optimizing for them in order to force people to upgrade. Testing older cards is really helpful for people who are interested in buying cards with better longevity.
 
Can you guys do a follow-up article where you test the CPU scaling of this game?

I know you love to harp on Hyper-Threading and games not utilizing 8 threads, but this test result makes me curious: is there a noticeable performance improvement going from 4c/4t to 4c/8t?

[attached chart: GameGPU.com CPU benchmark for Watch Dogs 2]


I don't even care if you just simulate the i5 by turning HT off. It would still be useful for people deciding which CPU to buy. You know, put your money where your mouth is, since you have that capability. Prove this chart from GameGPU.com wrong!

PCGH has one with an FX CPU. None with Intel so far, but I think PCPer or PCLab tends to make one. Looks like 6 threads is effectively the max, which fits a direct console port.

[attached charts: PCGH Watch Dogs 2 CPU core scaling with an AMD FX CPU and R9 Nano; Bay Bridge test scene]
 
Eh, +/- 3-5% for a 980 Ti compared to a 1070.
I guess it depends upon the settings/scene.
PCGamesHardware also shows a 7% difference between a custom 980 Ti and a custom 1070 (at 1080p, and also in minimum framerate at 1440p). Yeah, agreed, it is not game-changing, and no one should think of buying the 1070 as an upgrade to the 980 Ti.
Not directed at you, but most forget that even the 1080 is not technically the upgrade from the 980 Ti, due to having fewer GPCs/SMs (although that structure has changed again with Pascal, and for the better per GPC); the sales price is what skews the perception, though, along with the fact that the 1070 and 1080 have great performance.

Cheers
 
No 980Ti love?
The problem is that if you include the 980 Ti, you then need to include the Fury X.
If you include the Fiji card, you should also then include the 390X, as it at times matches or beats the Fury X in modern games at launch, the context being the previous-gen point of view that you touch upon.
BTW, I agree it does make sense to include the previous gen, as it is important information to a lot of gamers and generally of interest to show how the technical factors are changing, but I guess it is a balancing act with the resources available.
Cheers
 
PCGH has one with an FX CPU. None with Intel so far, but I think PCPer or PCLab tends to make one. Looks like 6 threads is effectively the max, which fits a direct console port.

[attached charts: PCGH Watch Dogs 2 CPU core scaling with an AMD FX CPU and R9 Nano; Bay Bridge test scene]
Your graphs suck. The first one posted in this thread did it right: SLI. Yours perpetuate GPU limits by being single-card no matter the resolution.
 
Your graphs suck. The first one posted in this thread did it right: SLI. Yours perpetuate GPU limits by being single-card no matter the resolution.

I can see you don't agree with the results. If it were GPU-limited, a faster CPU wouldn't change the result; I thought that was common knowledge.
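To spell out that logic, a minimal sketch (the fps values and threshold are made up for illustration, not taken from any of the charts above):

```python
# Crude CPU-vs-GPU limit check: same GPU and settings, two different CPUs.
def likely_bottleneck(fps_slow_cpu: float, fps_fast_cpu: float,
                      threshold: float = 0.05) -> str:
    """If a faster CPU barely moves the framerate, the GPU (or something else)
    is the limit; a big jump points at the CPU."""
    gain = fps_fast_cpu / fps_slow_cpu - 1.0
    return "CPU-limited" if gain > threshold else "GPU-limited (or other)"

print(likely_bottleneck(52.0, 68.0))  # large gain from the faster CPU -> CPU-limited
print(likely_bottleneck(61.0, 62.0))  # no meaningful gain -> GPU-limited
```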
 
Can you guys do a follow-up article where you test the CPU scaling of this game?

I know you love to harp on Hyper-Threading and games not utilizing 8 threads, but this test result makes me curious: is there a noticeable performance improvement going from 4c/4t to 4c/8t?

I don't even care if you just simulate the i5 by turning HT off. It would still be useful for people deciding which CPU to buy. You know, put your money where your mouth is, since you have that capability. Prove this chart from GameGPU.com wrong!

I do appreciate the feedback, but my focus right now on this game is on the graphics end.
 
For games as intense as this, it would be good to see how the higher-end cards compare at 1080p.
4K screens should be able to display 1080p without too much "interference" (it is an exact 2x integer scale), so it might be a better option.

Also, many of us are still on 1080p with fast cards.
 
No 980Ti love?

All of our game graphics evaluations are performed on current-generation GPUs. While I understand it would be interesting to test older GPUs, we do not have the time and resources to test all of them. Look at it this way: by using current-generation GPUs you can evaluate it from an upgrade perspective; you can see whether upgrading to one of today's current-generation GPUs would benefit you in said game, and by how much. It also tells us what current-generation GPUs are capable of, and what the value of each one is at a given price point for said game. We can then recommend the best GPU for the game and find out what is needed to enjoy it.
 
Having the option doesn't mean you are forced to use it. I prefer to have more options at my fingertips. If they spend the time to properly optimize DX12, it has the potential to help in a game like this. DX12 is built to take on games such as this and alleviate CPU bottlenecks. The work, however, has to come from the developer. As I said, I prefer to at least have the option and find out for myself if it helps. If it doesn't, no biggie, just keep running in DX11. It hurts no one to have more options to try.

Having an option for option's sake? I'd rather have one good option than a worse one.

I think it's become clear that a DX11 game patched to DX12 has ZERO benefit on the NVIDIA side, but games designed for DX12 fare better.

So instead of having developers spend time and money on patching in DX12, they should just develop newer games with DX12 in mind.
 
Having an option for option's sake? I'd rather have one good option than a worse one.

I think it's become clear that a DX11 game patched to DX12 has ZERO benefit on the NVIDIA side, but games designed for DX12 fare better.

So instead of having developers spend time and money on patching in DX12, they should just develop newer games with DX12 in mind.

You are automatically assuming a DX12 option would result in a negative effect. I wouldn't make assumptions like that. If done right, DX12 can provide advantages.
 
Having an option for option's sake? I'd rather have one good option than a worse one.

I think it's become clear that a DX11 game patched to DX12 has ZERO benefit on the NVIDIA side, but games designed for DX12 fare better.

So instead of having developers spend time and money on patching in DX12, they should just develop newer games with DX12 in mind.

Yet do you understand what the cost/benefit is for developers in jumping to DX12 and dropping DX11?

DX12 = Win10 only. DX11 = Win7/8/10. Which beancounter do you know who will do what you wish?
 
Yet do you understand what the cost/benefit is for developers in jumping to DX12 and dropping DX11?

DX12 = Win10 only. DX11 = Win7/8/10. Which beancounter do you know who will do what you wish?

Well, Microsoft released Forza and GoW, and AFAIK both are selling well.
 
You are automatically assuming a DX12 option would result in a negative effect. I wouldn't make assumptions like that. If done right, DX12 can provide advantages.

Well, recent history has only taught us so much. I have ZERO indication of a DX12 patch having a positive effect; again, on NVIDIA hardware.
 
Well, Microsoft released Forza and GoW, and AFAIK both are selling well.

That's correct, but a game dev can't go straight to DX12-only knowing it has an OS restriction; very few people are willing to change OS just to play a game because it is DX12-only. I think in that case the same dev would prefer to use Vulkan (as id Software did with Doom), since it is available on a wide range of OSes and not tied to a single OS like DX12.
 
That's correct, but a game dev can't go straight to DX12-only knowing it has an OS restriction; very few people are willing to change OS just to play a game because it is DX12-only. I think in that case the same dev would prefer to use Vulkan (as id Software did with Doom), since it is available on a wide range of OSes and not tied to a single OS like DX12.

Steam users beg to differ.

Windows 10 64-bit: 50.70%
 
Steam users beg to differ.

Windows 10 64-bit: 50.70%

Would you, as a game dev, ignore 40% of the market (split between Win 7 and Win 8.1)? That's hardly possible for a game dev; it would be like making games only for NVIDIA cards because AMD holds less than 30% of the market.
 