Watch Dogs AMD & NVIDIA GPU Performance Preview @ [H]

FrgMstr

Watch Dogs AMD & NVIDIA GPU Performance Preview - Watch Dogs is out on the PC, sporting a new game engine named "Disrupt." This next-generation gaming title is poised to put back the kick that has been missing from PC gaming technology so far this year. We evaluate performance in this preview and see how it stands up on high-end GPU hardware.
 
Great preview, guys. Looking forward to the image quality updates later. Glad to see the two camps aren't so far off in performance; some had assumed that AMD was screwed on this one.
 
Great preview, guys. Looking forward to the image quality updates later. Glad to see the two camps aren't so far off in performance; some had assumed that AMD was screwed on this one.

Given what we had seen with the 14.6 beta drivers, we thought we needed to get a quick preview out to put some of the FUD to rest. IQ this afternoon, and a full performance review soon.
 
Huh. That tidbit about requiring 3GB+ for ultra textures makes me glad I sprung for a 4GB GTX 770. I'll have to get this installed soon to check out the differences.
 
Huh. That tidbit about requiring 3GB+ for ultra textures makes me glad I sprung for a 4GB GTX 770. I'll have to get this installed soon to check out the differences.

I'm in the same boat. I grabbed a 4GB 770 a while back for $320. I'm only running at 1920x1200, so hopefully I'll be able to do Ultra just fine!
 
Awesome. Love to see this. Funny NV test recommendations too.
 
How AMD managed to optimize their drivers for a GameWorks closed-library game at release to at least equal the performance (and even beat it in some cases) is quite impressive!

BIG kudos to the AMD driver team on this one!

Thanks for the preview [H]!
 
I'm in the same boat. I grabbed a 4GB 770 a while back for $320. I'm only running at 1920x1200, so hopefully I'll be able to do Ultra just fine!

I'm a 1080p gamer, so hopefully it works out for me.
 
I gave it a quick try on my 290X @ 1080p with the 13.12 WHQL driver and it ran extremely well.
All settings maxed except default HBAO and default AA (didn't try anything else).
It was easily smooth enough and looked very good; max VRAM use was 3.25GB.
Looking forward to a serious play later with the new driver, and I will max HBAO then.
 
Fascinating preview. After all the NVIDIA marketing and hoopla, to have a flagship-ish card fall flat is rather amusing. Since I'm running a 660 Ti and a 660 in my main rig/HTPC respectively (both @ 1080p), I'm very curious to see how this game runs.
That said, I'm going to hold off on buying until I see some real reviews. This is the kind of game I've gotten hyped up for and been burned on in the past.
 
Great job getting this out day zero, thanks! Looks good for AMD, a lot better than I expected. This is clearly one of those few titles where VRAM is very much a real issue.

One question: you stated that your reference GTX 780 Ti doesn't throttle during testing. Can you 100% confirm this? I know the 780 Ti has a temp limit of 83°C, like the 780 that I personally owned, but I found the 780 to throttle so much that I find it hard to believe the Ti can keep under 83°C under sustained load. It would be cool to see the clock rate graphs. At the very least, please always report the actual max boost with Boost 2.0 cards.
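If anyone wants to check this on their own card, clocks and temps are easy to log. A minimal sketch using NVIDIA's NVML Python bindings (this assumes the third-party pynvml module; nvidia-smi can log the same fields from the command line):

[CODE]
# Log GPU core clock and temperature once per second to spot
# Boost 2.0 throttling under sustained load (assumes pynvml).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"core clock: {clock} MHz, temp: {temp} C")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
[/CODE]

If the core clock drops while the temperature sits pinned at 83°C, the card is throttling.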
 
How AMD managed to optimize their drivers for a GameWorks closed-library game at release to at least equal the performance (and even beat it in some cases) is quite impressive!

BIG kudos to the AMD driver team on this one!

Thanks for the preview [H]!

May also have something to do with the Xbox One and PS4 using AMD GPUs too.
 
May also have something to do with the Xbox One and PS4 using AMD GPUs too.

At first I thought that also, but if you look at the 14.6 notes, performance was increased over 20%, so NVIDIA possibly did some hidden, shady programming to reduce AMD performance relative to the console version, which was originally AMD-optimized to begin with. DX is inherently inefficient, but this is a lot...
 
Thanks for the review. Guess I'm starting to take hits from my SLI 680s :( 2GB of VRAM just isn't cutting it anymore.
 
At first I thought that also, but if you look at the 14.6 notes, performance was increased over 20%, so NVIDIA possibly did some hidden, shady programming to reduce AMD performance relative to the console version, which was originally AMD-optimized to begin with. DX is inherently inefficient, but this is a lot...

I wouldn't infer anything shady from patch notes. Those kinds of numbers are not unusual for a driver release targeting a specific game. After all, NVIDIA didn't actually DO any programming; they just provided the SDK for those options the WD devs decided to include, an SDK that has been available for some time and was used in several games in the past.
 
Good review!
@Old_Way: even though you have 4GB on your 770, it won't be able to take advantage of it with its relatively narrow 256-bit memory bus. In order to use that extra VRAM, the resolution would have to be so high that the game would be unplayable.
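The back-of-the-envelope math on that bus, for anyone curious (the 7GHz effective memory clock is the GTX 770 reference spec):

[CODE]
# Rough peak memory bandwidth estimate for a GTX 770:
# bandwidth = (bus width in bytes) * (effective memory clock)
bus_width_bits = 256
effective_clock_hz = 7.0e9  # 7GHz effective GDDR5, reference spec

bandwidth_gbs = (bus_width_bits / 8) * effective_clock_hz / 1e9
print(f"{bandwidth_gbs:.0f} GB/s")  # ~224 GB/s
[/CODE]

The same ~224 GB/s feeds the card whether it has 2GB or 4GB; the extra capacity matters when textures would otherwise spill out of VRAM, as the Ultra setting here apparently does.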
 
Yeah, I think the real issue for this game is: for that 3+ GB VRAM requirement, what are you actually getting in terms of texture quality? Nothing I have seen so far looks like it should require that kind of VRAM usage.

Looking forward to the IQ follow-up.
 
If any of you guys have issues running this game, I have been tinkering with the settings for quite some time now. A good boost is to set everything to "Ultra" and then run your shadows on "High" and textures on "High".

The game runs very well on a heavily modified OEM GTX 660. Also, for some reason this game engine doesn't use VRAM very well; it leans on your hard drive's page file.

If you disable your page file in Windows, it will run smoothly even at "Ultra"! I am using 1.5GB of VRAM and it runs almost perfectly on "Ultra" textures, so a 3GB card should be just fine.

I tried disabling the page file last night, and the memory-related lagging and skipping improved a great deal with Ultra textures on.

And then I noticed my system memory went from 20% utilized with a page file to 41% utilized in this game without one! I have 16GB of memory, so I shouldn't even need a page file anyway, and running without one can give you a great performance boost.
I also have two GTX 780 Classified cards that will be here sometime today, so I will test this with SLI 3GB cards and report back how it runs. But based on how great it runs with this little OEM 660, I'm sure it will not be a problem.
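If you want to try this, check how much physical RAM the game actually uses first; running out of RAM with no page file means crashes instead of paging. A quick sketch (assumes the third-party psutil module):

[CODE]
# Check physical RAM headroom while the game is running, to judge
# whether disabling the Windows page file is safe on your system.
import psutil

vm = psutil.virtual_memory()
print(f"total RAM:  {vm.total / 2**30:.1f} GiB")
print(f"in use:     {vm.used / 2**30:.1f} GiB ({vm.percent}%)")
print(f"available:  {vm.available / 2**30:.1f} GiB")
[/CODE]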
 
How AMD managed to optimize their drivers for a GameWorks closed-library game at release to at least equal the performance (and even beat it in some cases) is quite impressive!
Instrumentation and shader blobs. No big deal.

At first I thought that also, but if you look at the 14.6 notes, performance was increased over 20%, so NVIDIA possibly did some hidden, shady programming to reduce AMD performance relative to the console version, which was originally AMD-optimized to begin with. DX is inherently inefficient, but this is a lot...
Direct3D is not inherently inefficient (compared to what high-level graphics API?), and no console uses the same Direct3D available in Windows; the Xbox One uses a derivative of it. Many of the optimization efforts done for the X1 simply do not translate to desktop D3D11.x.

These are technological and architectural reasons, not conspiratorial.
 
Tried it out since I just bought a second 780 over the weekend for SLI. The game runs pretty well with SLI GTX 780s with in-game settings at their highest at 1080p, although there is some hitching when loading new areas. I haven't noticed any blurring with TXAA 4x enabled, but I will have to try SMAA to see if there is any difference. Despite some of the earlier screenshots, there are plenty of shadows being cast from lots of objects everywhere. Everything we have seen up to this point must have been running on consoles, because I think the game looks pretty damn good. VRAM usage topped out at 3046 MB, with GPU1 seeing 97% utilization and GPU2 seeing 96% utilization.

I had to stop, though, because I needed to restart the mission, and the opening cutscene is unskippable and I wasn't in the mood to sit through it again...
 
We don't know much about the use of temporal SMAA in this game. We think it might add a supersample element to SMAA, but we have to test further and look deeply at image quality to find out.
It uses temporal reprojection.
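Roughly: each pixel uses a motion vector to fetch where it was in the previous frame's image, then blends that history with the current frame to accumulate samples over time. A conceptual numpy sketch of the idea, not the game's actual shader (which we haven't seen):

[CODE]
# Conceptual temporal-reprojection blend (not Watch Dogs' shader).
import numpy as np

def temporal_aa(current, history, motion, alpha=0.1):
    """current, history: HxWx3 float images; motion: HxWx2 pixel offsets."""
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: where was this pixel last frame?
    src_y = np.clip(np.round(ys - motion[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs - motion[..., 0]).astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]
    # Exponential blend: mostly history, a little new frame.
    return alpha * current + (1 - alpha) * reprojected
[/CODE]

Real implementations also clamp the history sample to the current pixel's neighborhood to avoid ghosting on disocclusions; the blend above is the bare minimum.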

In the past we've seen horrible use of TXAA where it creates massive blurring [here] and [here].
By design. TXAA is specifically designed to yield a softer, film-like image, not a razor sharp image.
 
Great preview, guys. Looking forward to the image quality updates later. Glad to see the two camps aren't so far off in performance; some had assumed that AMD was screwed on this one.

What I picked up from the article was that AMD's larger memory helped keep performance stable. I'm curious to see what happens when memory is not holding the NVIDIA cards back; I could theorize that AMD worries about future benchmarking once they don't have that edge.

Who knows.
 
I have two PCs with 2GB cards (GTX 670 and HD 7850) @ 1920x1200. I know the review says there's a difference in image quality between High and Ultra textures, but I've seen enough screenshots and video comparisons to know it's not the "OMG THIS IS SOOO MUCH BETTER!!!" type of difference. If the graphics were really over-the-top great on the game, it might drive me to upgrade, but from everything I've seen, they're not. The open-world, no-zones setup of the game is what's driving this VRAM usage, not great graphics. I'm sure in the relatively near future there'll be enough critical mass of new games demanding more than 2GB of VRAM to look their best at my resolution, but I'll sit tight for now with my current hardware.
 
Way to go with this preview. It is depressing learning what I will be missing out on, as I will probably be playing this on the console.
 
I have two PCs with 2GB cards (GTX 670 and HD 7850) @ 1920x1200. I know the review says there's a difference in image quality between High and Ultra textures, but I've seen enough screenshots and video comparisons to know it's not the "OMG THIS IS SOOO MUCH BETTER!!!" type of difference. If the graphics were really over-the-top great on the game, it might drive me to upgrade, but from everything I've seen, they're not. The open-world, no-zones setup of the game is what's driving this VRAM usage, not great graphics. I'm sure in the relatively near future there'll be enough critical mass of new games demanding more than 2GB of VRAM to look their best at my resolution, but I'll sit tight for now with my current hardware.

3GB for High; I doubt you will be able to use High with 2GB.
 
I'm curious: what would happen with my 660s in Watch Dogs, with their weird "2GB on a 192-bit bus" configuration?

With MSI Afterburner, in Skyrim, I never ever see VRAM usage go over 1500 MB. Is this because Afterburner doesn't understand the card's memory configuration, or is the card simply incapable of using more than 1500 MB?

If it's the latter, I guess I can forget about playing Watch Dogs with textures on High, right?
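One way to cross-check what Afterburner reports is to ask the driver directly through NVML (same third-party pynvml assumption as the clock-logging snippet earlier in the thread):

[CODE]
# Query VRAM usage straight from the NVIDIA driver via NVML,
# as a cross-check against what MSI Afterburner shows.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {mem.used / 2**20:.0f} MB of {mem.total / 2**20:.0f} MB")
pynvml.nvmlShutdown()
[/CODE]

For what it's worth, the 660's 2GB is split into a fast 1.5GB segment and a slower 512MB segment because of the 192-bit bus, so the card can address all 2GB; the last 512MB is just slower to reach.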
 
What I picked up from the article was that AMD's larger memory helped keep performance stable. I'm curious to see what happens when memory is not holding the NVIDIA cards back; I could theorize that AMD worries about future benchmarking once they don't have that edge.

Who knows.

That would be the graph right under the first one, where we are running High textures so as not to bottleneck VRAM capacity. You can see they are very close in performance when unbottlenecked.
 
The Titan should not be marketed to gamers, and the 780 Ti should be a 6GB card at the same price point. When a company resorts to BS tactics and starts ripping people off, it is never a good thing.
 
The game has 360/PS3 versions (and an upcoming Wii U one delayed by weeks, I hear); how badly could that have impacted the development and the engine?

I would have asked this about Wolfenstein, but those guys just modified and improved the already aging Rage engine, which did fine on the 360 and PS3. Just wondering what it does to an engine to ensure it can be ported down.
 
That would be the graph right under the first one, where we are running High textures so as not to bottleneck VRAM capacity. You can see they are very close in performance when unbottlenecked.

I meant in Ultra mode; wouldn't the workload be different there? I'm just curious, but I wouldn't be surprised if it's nothing to talk about at the moment.

If NVIDIA is the one holding the black box, one wonders how things change down the road when they are able to optimize for the game freely and the competition cannot.

All in all, a concern for consumers, IMHO.
 
Yeah, I think the real issue for this game is: for that 3+ GB VRAM requirement, what are you actually getting in terms of texture quality? Nothing I have seen so far looks like it should require that kind of VRAM usage.

Looking forward to the IQ follow-up.

This. Going by screenshots only, it does not look as good as Crysis 3 to me texture-wise, and Crysis 3 runs very well on my GTX 670 2GB. I think there is an optimization issue here, and hopefully a patch or NVIDIA drivers can increase performance.
 
Tried it out since I just bought a second 780 over the weekend for SLI. The game runs pretty well with SLI GTX 780s with in-game settings at their highest at 1080p, although there is some hitching when loading new areas. I haven't noticed any blurring with TXAA 4x enabled, but I will have to try SMAA to see if there is any difference. Despite some of the earlier screenshots, there are plenty of shadows being cast from lots of objects everywhere. Everything we have seen up to this point must have been running on consoles, because I think the game looks pretty damn good. VRAM usage topped out at 3046 MB, with GPU1 seeing 97% utilization and GPU2 seeing 96% utilization.

I had to stop, though, because I needed to restart the mission, and the opening cutscene is unskippable and I wasn't in the mood to sit through it again...

Sweet! I knew this game used a lot of memory! And the shadows do look great! I am still waiting on my two 780 3GB Classifieds to arrive "this afternoon!". They are refurbished, and I got them both for $815 right before they went out of stock again! lol. If something is wrong with one of them, then I will try to opt for the 6GB 780 SC cards instead. Now I am worried about having only 3GB of VRAM. This game does look amazing at times, though! And beautiful scenery. Any tips for a 780 SLI setup? I have never run SLI before!! lol
 
This. Going by screenshots only, it does not look as good as Crysis 3 to me texture-wise, and Crysis 3 runs very well on my GTX 670 2GB. I think there is an optimization issue here, and hopefully a patch or NVIDIA drivers can increase performance.

Yeah, nothing looks like Crysis 3. The textures are so clean and deep everywhere; you cannot see them get blurry. But I think that is because Crysis is a tubular game now. It wasn't like this back in 2008 when it first released; you could walk just about anywhere you wanted to!! But now we have consoles to thank for all the tunnel vision, "tube games".

I guess that is why Watch Dogs uses so much VRAM.
 