Watch Dogs AMD & NVIDIA GPU Performance Preview @ [H]

Please guys can you do an i5 vs i7 bench, pleeeeeeeeeeease!!

Believe it or not, this game's recommended system spec is a 3770. That's a lot for a game, right? It seems like a 3570K would be fine. Maybe that's why my 660 seems to run it so well: I am using a 4820K at 4.8GHz and an i7 3820 @ 4.3GHz.

Does the game really benefit from HT?
 
I guess we all need an AMD W9100 16GB to be future proof. This is one HELL of an "R9 290X" workstation video card. Check it out. Not a bad price either, considering what you get!

http://www.newegg.com/Product/Product.aspx?Item=N82E16814195129&cm_re=W9100-_-14-195-129-_-Product

You can run these cards for gaming, and you lose only about 5% performance vs. a regular R9 290X because of the extra built-in software. But you can game with it.

That card is the dream baby!

The funniest part is that NVIDIA's stratospheric pricing lately is making this card seem accessible at the consumer level! :eek: :p
 
The game runs perfectly fine for me on an FX 6300 @ 4.66GHz. I'm using Win 7 HP x64.

I always apply the first BD patch to my OS, KB2645594, and I disable core parking from the registry.
I disable hibernate from a command prompt. APM is off in my BIOS and HPC is on. NB is @ 2600+
and my 8GB of RAM is @ 2133+ 10-11-10-29 1T.
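
If you'd rather script those tweaks than hand-edit the registry, here's a rough powercfg-based sketch (my own approximation; the CPMINCORES alias should correspond to the core parking minimum, so double-check it on your system). Run it from an elevated prompt:

Code:
# Rough equivalent of my hibernate/core-parking tweaks, done via powercfg.
import subprocess

def run(args):
    print(">", " ".join(args))
    subprocess.run(args, check=True)

run(["powercfg", "-h", "off"])  # disable hibernation
# Keep 100% of cores unparked on the active power scheme.
run(["powercfg", "-setacvalueindex", "scheme_current", "sub_processor", "CPMINCORES", "100"])
run(["powercfg", "-setactive", "scheme_current"])  # re-apply the scheme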

My power profile is set to high performance in Windows. I have my mouse pointer set to 80% speed with
"enhance pointer precision" disabled; that way it feels fine in games.

I'm running it off of a Samsung SSD 830.

My graphics card is a manually overclocked HD 7970. I'm playing it @ 1920x1200.

More details on my 7970. My old reference 7970 doesn't have a stock BIOS on it. I did an edit on it with VBE7.0.0.7b
and flashed that in with ATiflash.

I raised power from something like 217 up to 300. I opened up the CCC's OC limits to 1300|1700 and increased the possible power
limit from +20% to +50%. My card will do 1280|1680 @ 1.287V, but I get glitches at those clocks; 1260|1662
with the same voltage is fine though. I may run 1150|1600 straight from the CCC without touching voltage; I just raise the power limit.
If I want to run 1260|1662 I use Trixx 4.4.0b-MOD, because my card requires extra voltage to do it and I did not want to flash my default
voltage to 1.287. I'm on air, and to do 1260|1662 I need to use hair-dryer-level fan settings.
 
TechSpot's test was done in the hotel area at the start of the game, and walking down the street.
Not the most reliable results. According to them I should be sitting happily at 62fps... lol.
 
As many have mentioned on the internet, this game runs slow as hell for the perceived graphical realism it provides...

I don't think Ubisoft knows how to code very well :(
 
Those texture settings are brutal: even with everything else on low, my 7970 can't run without choppiness when textures are on ultra. I can easily run 2x SMAA and everything else on ultra with textures on high, though as stated in the article it's quite a noticeable visual downgrade. This is at 1080p, btw.

Ordering a new gfx card for a game I might not even like in the end... well, I've done dumber stuff, that's for sure :D
 
TechSpot's test was done in the hotel area at the start of the game, and walking down the street.
Not the most reliable results. According to them I should be sitting happily at 62fps... lol.

There is a big difference in performance between interior parts of the game, and exterior city parts. A good run-through would include both, as ours did today.
 
Wish I had 4GB 680's, oh well the 880 can't be too far off at this point.
 
I wouldn't go VRAM crazy just yet...I think this game still has a long way to go to be optimized. There is really no reason we should be seeing 3+ GB VRAM usage with the textures this game has.
 
Thanks for putting the effort into creating such a well-done, timely review; it's nice to see that the situation isn't as dire as it was made out to be over the last few days. That 3GB VRAM requirement seems a little extreme, but at least with this new trend I might finally have a reason to upgrade my GPU without moving to a higher-res monitor.

It will be interesting to see if the requirements are justified, so I'm looking forward to the IQ review. Even the gameplay sounds better than it did in early reports. I still doubt it will live up to the original hype, but it's looking like it might be a solid open-world action game.
 
Wish I had 4GB 680's, oh well the 880 can't be too far off at this point.

I'm in the same boat as you; this is the first time my 680s have choked on a game at 1600p, and the 880s don't come out until at least December :(
 
I am running 3 R9 290Xs @ stock and an i7 4930K at 4.4GHz with 32GB of DDR3 @ 2400MHz on Windows 8.1 and Catalyst 14.6...


Yeah, my results @ 1440p are not even remotely as good or rosy as this article makes it seem, and I am not the only one. We are talking random single-digit framerate dips making it impossible to drive, and a framerate all over the place. Settings on ultra, no AA, just like the article, except I am at a lower res. I've even tried it with just two cards and as a single card: still complete and utter crud.


I've uninstalled and re-installed the new Cats 3 times, even using DDU, and still get the same results.

An example of the awesomeness of Watch Dogs and AMD (7 FPS; yes, you read that correctly, it does indeed say 7 FPS):
[screenshot showing the 7 FPS dip]
 
My performance also sucks with 2 290s and the new beta driver. Hell, even with all the settings turned to low my framerate ends up below 60 most of the time, which is ridiculous.
 
Wondering if those having performance issues are running into system memory issues. 16GB needed now?
 
I wouldn't go VRAM crazy just yet...I think this game still has a long way to go to be optimized. There is really no reason we should be seeing 3+ GB VRAM usage with the textures this game has.
Agreed. This is a game-specific issue, not an issue that's necessarily applicable to any other game. Ubi seems to be a little flippant about resource management only because, on the consoles, they can be.
 
Wondering if those having performance issues are running into system memory issues. 16GB needed now?
Watch Dogs is running around 2 GB consistently for me.
I wish I could make it use more ram. And yes I'm using the page file commandline.

No stuttering whatsoever, though. Fullscreen or borderless.
dat vram.
 
"GPU limited"

Must be the "Fuck you AMD" trigger in the game's code.
I swear we went through this EXACT SAME shit in 2008 with GTA 4. And there still doesn't exist a CPU today that can max it out.
 
Running an R9 290X (4GB RAM) with Ultra settings and GPU-Z shows 3.7 GB of VRAM used while I play the game! Yikes!

I am using HBAO+ and Temporal SMAA at 1920x1080 as well, btw.

P.S. An 8-core processor is recommended... WHATEVER! I have an i5-4670K (no Hyper-Threading) quad core @ 4.5GHz and the game is silky smooth. I have only played for about 2 hours so far, so who knows... maybe later on in the game I may wish that I had an 8-core proc?
 
Yeah, I think the real issue for this game is, for that 3+ GB VRAM requirement, what are you actually getting in terms of texture quality? Everything I have seen so far looks nothing like anything that should require that kind of VRAM usage.

Looking forward to the IQ follow-up.

Requiring more than 3GB of VRAM for Ultra textures is either laziness or a lack of programming skill on the developers' part.
I tried ultra and didn't see any awesome, super-high-resolution textures that would require such MASSIVE amounts of memory. Textures on ultra are simply good, what I would expect from a 2014 AAA game. And my 780 Ti is too weak for them? Really?

On high the textures look UGLY, like it's 2004:
http://i.imgur.com/T40A2fs.jpg

I am surprised that [H] does not criticize the devs for these absurd requirements. There is a difference between "demanding" and "unoptimized": Crysis 3 is a demanding game, but it rewards your powerful hardware with great graphics. Watch Dogs is not demanding, it is simply horribly unoptimized. I just bought a 780 Ti assuming I would be able to play all games on the highest settings at 1080p with 60 FPS, and I could, until Watch Dogs (ironically, Watch Dogs was one of the main reasons I upgraded my card from a 770 to a 780 Ti).

And we all know why the game requires over 3GB for ultra textures: the consoles. They have 8GB of unified memory (around 5GB of it is available to developers), so Ubi just optimized the game for the consoles, where they don't have to worry about moving data between RAM and the video card because it is unified. And what about PC gamers? Well, they can just go buy a 4GB card, so why bother optimizing? Great work, Ubi!
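
For what it's worth, here's the back-of-envelope math I use to sanity-check texture memory claims (my own rough assumptions, nothing from Ubi's actual assets):

Code:
# Rough texture budget estimate. Assumes BC3/DXT5 compression (1 byte per pixel)
# and a full mip chain (roughly +1/3 on top of the base level).
def texture_mib(width, height, bytes_per_pixel=1.0, mip_factor=4.0 / 3.0):
    return width * height * bytes_per_pixel * mip_factor / 2**20

per_2k = texture_mib(2048, 2048)
print(f"one 2K BC3 texture: {per_2k:.1f} MiB")                      # ~5.3 MiB
print(f"500 resident 2K textures: {500 * per_2k / 1024:.2f} GiB")   # ~2.6 GiB

Even with generous assumptions, you only blow past 3GB if a huge pile of unique textures is kept resident at once, which is exactly what texture streaming is supposed to manage.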
 
Actually, this comparison is not that helpful. The biggest difference shows when you try to see the details of a texture, i.e. when you move close to it; from a distance even very low-res textures look OK.
But even in this screenshot you can see that the high textures are blurry.

Running an R9 290X (4GB RAM) with Ultra settings and GPU-Z shows 3.7 GB of VRAM used while I play the game! Yikes!
According to NVIDIA, a 780 Ti with 3GB of VRAM is enough to play on ultra at 1080p. I guess they do not want to admit how badly Ubi dropped the ball with the memory requirements; so badly that their flagship GPU does not have enough memory for the highest settings.

Guru3D is aware that something is wrong with this game:
From http://www.guru3d.com/articles_pages/watch_dogs_vga_graphics_performance_benchmark_review :
"You can mark my words, Ubisoft is going to release a bunch of zero-day patches as Watch Dogs does not seems to be behaving the way it should with better than HIGH image quality settings.
Concluding then, as much as we like games to stress graphics cards we are a little puzzled. Watch Dogs is not a game that looks mouth-watering good, I mean it is nice and all, but it is just that. To see graphics card struggle this much over graphics memory is weird to see. However if you use our guidelines then your frame-rate should remain very acceptable and the game playable whilst looking good. You will still run into the occasional stutters here and there as it is the nature of the game engine I'm afraid. Let's hope Ubisoft is able to release a patch that at the very least eliminated the heavy stutters."

Also, why are you using MHBAO on ultra?
For the first two graphs we will be using the built-in overall quality option of "Ultra" to keep everything comparable. Look on the previous page to see what those "Ultra" settings are. This means HBAO mode in use is "MHBAO."

According to NVIDIA, MHBAO is the weakest AO option, the same one used on the consoles:
In Watch Dogs, gamers can enable Ubisoft’s half-resolution, console-quality Ambient Occlusion technique, MHBAO
 
I can't play the game without getting a headache. The stuttering is nauseating.

i7 3770K @ 4.5GHz
GTX 680 SLI 2GB @ 1250MHz/6400MHz
32GB of RAM

Playing it at 2560x1440


With texture settings on high (for 2GB cards), HBAO low, FXAA, and shadows set to high, the stuttering is still far too distracting.

Going from 70+ FPS to two FPS literally starts to give me a headache after 30 minutes of playing. I should be able to play the game at the settings I am using without getting a migraine.
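
Just to put numbers on why those dips feel so awful (simple frame-time math, not measured data from the game):

Code:
# Frame time in milliseconds for a few framerates.
for fps in (70, 30, 2):
    print(f"{fps:>2} FPS -> {1000 / fps:6.1f} ms per frame")
# 70 FPS ->   14.3 ms per frame
# 30 FPS ->   33.3 ms per frame
#  2 FPS ->  500.0 ms per frame

A dip to 2 FPS means the picture freezes for half a second, which is why it reads as a hard hitch rather than just "lower FPS".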

I will just wait for a patch to play it, or until I get a pair of GTX 8xx cards when they release.
 
Well, the game runs amazingly for me! I am playing right now on two rigs:

RIG ONE: Intel i7 3820 @ 4.3GHz, 4x4GB DDR3 1866MHz, Samsung 840 EVO SSD, GTX 780 Classified @ 1293/7000
RIG TWO: Intel i7 4820K @ 4.85GHz, 4x4GB DDR3 1866MHz, Samsung 840 EVO SSD, GTX 780 Classified (not tinkered with yet)

It is very SIMPLE! Just turn the shadows to "High" instead of Ultra, and run the textures on high instead of Ultra. I don't see a big quality difference, or any at all! The game runs lag free; it uses about 2GB of VRAM and does not skip or stutter one bit.

Ultra textures are another story. I just don't see a difference in picture quality or graphics detail, I just see a HUGE performance hit! And unless you have a 6GB GTX 780 or a 6GB Titan, you cannot play it smoothly.

Also, another issue to consider: when your video card runs out of memory, the game will use your system page file instead of your SYSTEM MEMORY! So if you disable the page file, it will use your VRAM and DDR system RAM instead. That helped some for me.

But regardless! I have every single option set to "Ultra" except for textures = High and shadows = High, everything else is on "Ultra" with 2x MSAA,
and it runs amazing! I dunno why, but it just does! Even my GTX 660 OEM ran it just as well on the same settings, and that was only a 1.5GB card.
The 660 OEM was clocked at like 1306MHz core / 7160MHz memory lol, but it ran the game like a beast! Everything on ultra, 2x AA, high shadows, high textures.

Also, I have 16x AF forced in the driver, along with FXAA, high quality AF, gamma correction, and ambient occlusion forced too.
 
Well, the game runs amazingly for me! [...] It is very SIMPLE! Just turn the shadows to "High" instead of Ultra, and run the textures on high instead of Ultra. [...]

thanks for the tips
 
run the textures on high instead of Ultra. I don't see a big quality difference, or any at all!

Well, I do. [H] also does:
"...is it worth it to run this game at "Ultra" textures? The answer is yes. We have discovered many situations where there are big differences between "High" texture quality and "Ultra" texture quality. We've noticed this on textures on buildings, indoors and outdoors, on posters, billboards in the game, and basically any type of image put over a texture. It is a very texture-rich gameplay world."

So much for "there's no difference".
 
I have shadows reduced to High. Motion blur and DOF are off. AO is set to HBAO+ High, Water to Ultra, Shader to High,
Level of Detail to Ultra, Reflections to High, Textures to Ultra, and Temporal SMAA is enabled.

@1920x1200 the lowest FPS dip I've seen is 35; typical FPS is between 40 and 50, and in shallow viewing situations FPS
will go over 50. This was measured outdoors during the day, running all around the area near the hideout. I ran around
as far as I could without exiting the mission area. I was also watching my FPS while escaping from the police
in a car near the beginning.

It felt fine to me once I increased my look and turn sensitivity. I've not seen any hitching or "stutter". I love riding motorcycles :D

Again, I'm using an FX 6300 @ 4.66GHz and a 7970 @ 1150|1600. It will do higher clocks, but the fan noise gets on my nerves
at those settings. I'm using Cat 14.6 and I'm running the game off of a Samsung SSD 830. More system details are here.
 
I just don't comprehend why, even with CrossFire disabled, resolution set to 1080p, no AA, and MHBAO, I am still getting game-breaking stuttering. 4GB 290X card, check; 12-logical-core i7, check; 32GB of RAM, check; page file off, check; Catalyst 14.6, check... and turning on CrossFire makes it worse. I'm hoping someone has suggestions other than turning down textures.
 
The overwhelming majority of games are optimized for the consoles; if MGPU performs well in a title, it's due
to efforts from the NVIDIA or AMD driver team. I gave up on MGPU back when I had my 5870s.

I get odd occasional stutter in all my games if I don't install AMD's USB filter driver that is present in the SB
driver package. I use a Logitech illuminated keyboard and a G9x mouse; both are USB. With that driver installed,
everything is fine. I'm using an old Crosshair V mobo.

My reference 7970 doesn't have a stock BIOS on it. I did an edit on it with VBE7.0.0.7b and flashed that in
with ATiflash. Details are here, along with other notes about my system.
 
I'm liking the game so far, but not so much the performance side of it, which I think is my 2.4GHz quad-core Xeon holding me back, along with my 6GB of shit-tastic Corsair XMS RAM. I get massive slow-downs when driving. My two 680s can do better with a better CPU and more RAM (97%).
 
Yesterday while playing, I realized that over 8GB of system RAM was in use (reaching that point took about an hour of gameplay; at first it was not that high). Maybe that's the issue a lot of you are having? Performance with my overclocked 290s is good. I was expecting 100 FPS but am getting around 70 FPS outdoors with all settings at their highest and MSAA on. Considering I have two cards it is pretty low, but very playable. This is at 1080p.
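
If anyone wants to check whether their RAM use creeps up the same way, a tiny logging script like this will show it (my own quick sketch; it needs the third-party psutil package, pip install psutil):

Code:
# Log total system RAM in use every 30 seconds while the game runs.
import time
import psutil  # third-party: pip install psutil

while True:
    used_gib = psutil.virtual_memory().used / 2**30
    print(f"{time.strftime('%H:%M:%S')}  RAM in use: {used_gib:.1f} GiB")
    time.sleep(30)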
 
So WD is not a title badly coded for AMD GPUs after all, just a badly coded title.
 
Why did they release this game with this level of performance? Wow. I sure as hell won't be buying it. Not that I'd buy a Ubisoft game, anyway. I'll go out of my way to try and prevent others from buying this one, though.
 