Watch Dogs Performance and Image Quality Review @ [H]

FrgMstr

Watch Dogs Performance and Image Quality Review - We published a preview of Watch Dogs performance when it was released back in May this year. We have given this game time to mature. Now that a couple of game patches have been released, along with newer drivers from NVIDIA and AMD, it is time for our full performance and image quality comparison review.
 
Thanks, fixed. - Kyle

Very thorough review. Thank you. I've put it on my backlog until some patch work was done, so maybe it's time to go back to it.
 
So much information in this article! This is ridiculous! It's going to take me a long time to analyze all of it even though it's presented so well. I mean, it's really easy to understand; there's just so much of it. I'm too damn happy right now. F it. I ordered a GenMay sub for $5 after reading 1/3 of the article to say thanks.

Thank you Kyle, Brent, and the crew! Best damn website, forum, and community around!
 
Hell of an article! Too bad it was dedicated to a.... wait for it... dog of a game. HAH!
Kidding aside, it's an interesting look at a game that really pushes modern GPUs. I look forward to more articles on 4K, but on games that are better coded and use better assets and tech. It just seems like Disrupt is holding this thing back.
 
Nice review, I will try to force AF for this game.

Thanks, fixed. - Kyle
 
Page 6: the high-end GPU section should say 1600p instead of 1080p.


Never was a fan of HBAO; diminishing returns and whatnot.
 
Everyone should look at the Ultra textures without and with forced AF. The improvement is like night and day. I haven't even installed the game, but I'll make sure to set up a profile forcing AF when I do. Great stuff as always, [H]
 
Should've included the E3 2012 mod since that's what most people use. When I get the game, I'm not playing one level until I load the E3 2012 mod.
 
Should've included the E3 2012 mod since that's what most people use. When I get the game, I'm not playing one level until I load the E3 2012 mod.

Based on the reviews I've seen, it's a give and take thing.
Besides, [H] never reviews mods. They review what the publisher/devs put out there, not what some third party enables or adds.
 
This is another great article, and it shows why I don't like that this sold so well purely on hype. Ubi have once again, for like the third or fourth time this year, promised to shape up with their PC versions, but can we believe them?


This is the opposite of what we expected from Watch Dogs considering NVIDIA had such a heavy hand in its development and GameWorks was a big part of this game.
And that's because it's nothing but a TWIMTBP sham; these partner development programs that demand vendor-exclusive functionality need to DIE.
 
And that's because it's nothing but a TWIMTBP sham; these partner development programs that demand vendor-exclusive functionality need to DIE.
Why?
Because of these programs we have cool technologies we wouldn't otherwise have. None are earth-shattering, to be sure, but PhysX, TressFX, Mantle, and some of the newer AA techniques are all valuable.
I, for one, really enjoy PhysX and what it does in some games (the electric gloves from Batman: Arkham Origins are a good example), and Mantle has proven competent at reducing CPU load in BF4.
 
Based on the reviews I've seen, it's a give and take thing.
Besides, [H] never reviews mods. They review what the publisher/devs put out there, not what some third party enables or adds.

Yeah, I know, and that's great, but this is different. A quick look by someone like [H] at the hidden settings that make it look the way it's supposed to would be nice. :cool:
 
A quote from this review.

The point is this, running this game at anything lower than "Ultra" textures is just not that fun,

Man, I just don't know if HardOCP has become this out of touch or really thinks a majority of its readers are running triple monitors/triple SLI. I'm running this game on a GTX 680 at 1080p with High textures. It looks great, I had a blast playing it, and I don't feel I'm missing a thing. I sure don't fit into HardOCP's catch-all category of lowly gamers who don't have triple monitors/SLI and are missing all this great shit (when in reality they are not). Just my single-card cent and a half.
 
Just a question, but how does TXAA look in motion? I heard some claims that it's better in actual gameplay and gives a cinematic look to games with it enabled.
 
Just a question, but how does TXAA look in motion? I heard some claims that it's better in actual gameplay and gives a cinematic look to games with it enabled.

Whether in motion or not, it is a blur fest. When I enable AA, I expect it to improve image quality, not worsen it.
 
A quote from this review.



Man, I just don't know if HardOCP has become this out of touch or really thinks a majority of its readers are running triple monitors/triple SLI. I'm running this game on a GTX 680 at 1080p with High textures. It looks great, I had a blast playing it, and I don't feel I'm missing a thing. I sure don't fit into HardOCP's catch-all category of lowly gamers who don't have triple monitors/SLI and are missing all this great shit (when in reality they are not). Just my single-card cent and a half.

All I can say is I've played the game extensively at High textures and Ultra, and we provided screenshot comparisons in this review. Ultra textures make a big impact on image quality in this game. As a gamer, I do not enjoy this game at High textures. I enjoy it at Ultra textures, forcing 16X AF and Temporal SMAA.
 
Thank you very, very much for this. The timing was right and your conclusion on Ubisoft's merits or lack thereof in this game is spot on.

But I can't find it in me to agree with your conclusion that the stuttering is a direct consequence of a VRAM bottleneck. I have played and tested the game thoroughly myself with a heavily OC'd GTX 780 3GB and a factory OC'd GTX 670 4GB, and it stutters on Ultra textures regardless of other settings, even when VRAM usage doesn't hit 3GB. I'm open to any theory on how the stuttering could even theoretically be caused by a VRAM limit if the VRAM allocation doesn't even top out.

So what I would very much have liked to see is the Titan vs. 780 Ti results all the way down to 1080p with both Ultra and High textures. That would have drawn a much clearer picture of the actual impact of VRAM on Nvidia cards.

The article mentions the stuttering on all the Nvidia cards, and that seems to match user experiences on various forums as well. Based on my own testing and what I've read from others, the common ground for the most severe stuttering is Nvidia, not VRAM. No matter if it's a 760 (2GB or 4GB), a 780 (3GB or 6GB) or a Titan, all of those have been reported by numerous people to show the pause-like stuttering while driving. If it's there for people with Titan Blacks at 1080p, surely it can't simply be a VRAM limitation, right?

My working theory is that a) Ubisoft messed up badly in porting the game from unified-memory consoles to the PC, and b) Nvidia messed up badly by missing something crucial to their performance in the way the game streams textures (Ultra textures in particular). Watch Dogs will go down in history as perhaps the most messed-up partnership title ever.
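For anyone who wants to check the VRAM claim on their own card, here is a minimal logging sketch using the pynvml bindings to NVIDIA's NVML library. This is NVIDIA-only, and the one-second interval and console output are arbitrary choices for illustration, not anything the reviewers used.

Code:
# Minimal VRAM logger sketch using pynvml (pip install pynvml); NVIDIA only.
# Run it in the background while playing and watch whether "used" ever
# approaches the card's total before the stuttering starts.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print("{}  VRAM used {:.0f} / {:.0f} MB".format(
            time.strftime("%H:%M:%S"),
            mem.used / 1024.0**2, mem.total / 1024.0**2))
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()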
 
What you guys have stated is why it is hard to stomach paying full price at or near launch when games increasingly ship with major flaws. Unless a game has Forza or Halo in the title, I find myself waiting a few months for it to get fixed, at which point the price has dropped $20.
 
Whether in motion or not, it is a blur fest. When I enable AA, I expect it to improve image quality, not worsen it.
Yeah, TXAA is blurry as hell in all the games I've tried it in; I'm not sure what the hell Nvidia was thinking, that this was suitable for mainstream usage. The good news is you can use SweetFX on top of TXAA to sharpen it, and that fixes it for the most part.
 
Brent,

Maybe I overlooked it, but can you tell me what the performance differences were for the different forced AF settings? I am curious whether 16x AF costs little over 8x while greatly improving the visuals. Thanks!
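Not from the article, but if you want to measure this yourself, a rough sketch like the one below works with any frametime log. It assumes a plain text file with one frametime in milliseconds per line; the file names are made up for illustration.

Code:
# Rough sketch: compare average FPS between two runs, e.g. 8x AF vs. 16x AF.
# Assumes plain text logs with one frametime in milliseconds per line.
def avg_fps(path):
    frametimes_ms = [float(line) for line in open(path) if line.strip()]
    return len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

fps_8x = avg_fps("watchdogs_8xAF.txt")    # hypothetical log names
fps_16x = avg_fps("watchdogs_16xAF.txt")
print("8x AF : {:.1f} FPS".format(fps_8x))
print("16x AF: {:.1f} FPS".format(fps_16x))
print("16x AF cost vs. 8x: {:.1f}%".format((fps_8x - fps_16x) / fps_8x * 100))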

Lee
 
All I can say is I've played the game extensively at High textures and Ultra, and we provided screenshot comparisons in this review. Ultra textures make a big impact on image quality in this game. As a gamer, I do not enjoy this game at High textures. I enjoy it at Ultra textures, forcing 16X AF and Temporal SMAA.

I want to thank you for your expertise and experience in reviewing and critiquing image quality and performance. We are readers because of the high-quality, expert hardware reviews.

But I think basing an opinion of gameplay quality on texture quality is dubious. Knowing that you judge quality this way reduces the effect your opinion has, on me at least.

If taken semi-seriously, I could actually think you believe all games made more than five years ago weren't fun. How can I relate to a reviewer like that?

I know it was only one sentence in your final comments, but things like that can put a tint on how you form your opinions.

What the person you quoted was saying is: since you are a graphics snob, I now can't trust your opinions and reviews, because they are affected by your graphics snobbery.

I understand you are the video card expert; you should be a graphics snob. You should be making graphics comparisons and forming opinions on them.

I don't think you should be making gameplay opinions, because graphics are clearly a larger part of that equation for you, due to your field of expertise.
 
Awesome review, thank you. You've highlighted the reasons I've stayed away from this game and will continue to do so. Games that are released essentially broken don't deserve my cash.
 
I want to thank you for your expertise and experience in reviewing and critiquing image quality and performance. We are readers because of the high-quality, expert hardware reviews.

But I think basing an opinion of gameplay quality on texture quality is dubious. Knowing that you judge quality this way reduces the effect your opinion has, on me at least.

If taken semi-seriously, I could actually think you believe all games made more than five years ago weren't fun. How can I relate to a reviewer like that?

I know it was only one sentence in your final comments, but things like that can put a tint on how you form your opinions.

What the person you quoted was saying is: since you are a graphics snob, I now can't trust your opinions and reviews, because they are affected by your graphics snobbery.

I understand you are the video card expert; you should be a graphics snob. You should be making graphics comparisons and forming opinions on them.

I don't think you should be making gameplay opinions, because graphics are clearly a larger part of that equation for you, due to your field of expertise.

I think you are reading too much into it. We don't review games. We review how video cards accelerate games and make them look, and what kind of experience the video cards deliver.

I posted my opinion on gameplay. My opinion is that when I play a game, I want to run it at the highest image quality settings it supports. To do otherwise doesn't deliver the experience the game developer intended. I am all about getting the most from the game and the full experience the developer intended, and that means playing the game at the highest image quality settings the developer put in there.

I believe that reasoning holds true for a lot of people here. It is, after all, why we spend thousands of dollars on high-end gaming machines with video cards that cost hundreds and hundreds of dollars. If you are spending all that money on a gaming system, and on large displays or multiple displays for gaming, and you aren't maximizing the game's potential, you are missing the whole point, IMO.
 
I think you are reading too much into it. We don't review games. We review how video cards accelerate games and make them look, and what kind of experience the video cards deliver.

I posted my opinion on gameplay. My opinion is that when I play a game, I want to run it at the highest image quality settings it supports. To do otherwise doesn't deliver the experience the game developer intended. I am all about getting the most from the game and the full experience the developer intended, and that means playing the game at the highest image quality settings the developer put in there.

I believe that reasoning holds true for a lot of people here. It is, after all, why we spend thousands of dollars on high-end gaming machines with video cards that cost hundreds and hundreds of dollars. If you are spending all that money on a gaming system, and on large displays or multiple displays for gaming, and you aren't maximizing the game's potential, you are missing the whole point, IMO.

Yep, exactly. I've spent what I have on my system because I want the best gaming experience. This is one of the reasons I continue to read this site when others have fallen by the wayside, and why I'll continue to read it.

But you know what? Even if I didn't have a moderately high-end system, I'd still be here. I want to know how something at the highest of the high end functions, looks, and works. It gives me something to aspire to when my budget finally allows it (*grumble*), and in the meantime it gives me some awesome stuff to look at and be envious of... lol

Edit: And I want to know when it's worth upgrading what I have. A 2% increase? Probably not... A 40% increase? Hell yes...
 
Okay, I've taken a better look at the article regarding the whole VRAM vs. stuttering issue that I pointed out earlier. Since that post didn't even get a reply, I'm going to try once more.

Your conclusion is that the 780 Ti has problems with Ultra textures because of a lack of VRAM. I suspect most readers here know what effect running out of VRAM has on the gaming experience: stalling, crawling, missing textures and/or crashes. Now let's take your texture performance graphs for the 780 Ti and the 290X:

[texture quality performance graphs: GTX 780 Ti vs. R9 290X]


As you said, the performance is there, since the 780 Ti actually suffers less of a performance penalty from the Ultra textures than the 290X does. Looking at the graphs, both cards show similar behaviour with Ultra textures: the graphs are spikier, but the 780 Ti has a few more of those sharp drops. So this is what the Ultra textures cause for the 780 Ti: not a dramatic performance hit, but spiky FPS and the feeling of stuttering.

Now take the AA performance graphs:
[anti-aliasing performance graphs: GTX 780 Ti vs. R9 290X]


Going from 4xMSAA to 8xMSAA is a sure way to put the hurt on VRAM. So what do we see? We see the 780 Ti taking a bigger hit in performance than the 290X, and then there is that massive valley in the middle of the benchmark. That is the kind of dramatic change in performance I consider a classic telltale of running out of VRAM. Here there is a clear symptom that the 780 Ti is showing and the 290X is not. There is also a probable cause: 3GB of VRAM against 4GB.
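As a back-of-the-envelope illustration of why 8xMSAA hits VRAM so hard: the numbers below assume 2560x1600 (the resolution discussed in the thread) and a single 32-bit color plus 32-bit depth target; the engine's real buffer layout is not public, so treat this as a rough estimate only.

Code:
# Back-of-the-envelope MSAA render-target cost. Assumes 2560x1600 and one
# 32-bit color + 32-bit depth target; a deferred renderer with several
# G-buffer targets multiplies the difference further.
def msaa_target_mb(width, height, samples, bytes_per_sample=4 + 4):
    return width * height * samples * bytes_per_sample / 1024.0**2

for samples in (4, 8):
    print("{}x MSAA: ~{:.0f} MB for a single color+depth target".format(
        samples, msaa_target_mb(2560, 1600, samples)))
# -> roughly 125 MB at 4x vs. 250 MB at 8x for just that one target.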

So yes, more VRAM is very much a good thing for Watch Dogs. Still, even after reading and analyzing your otherwise very thorough article, I come to a different conclusion than you about the Ultra textures. The stuttering they cause does not fall into the category where I'm used to placing VRAM bottlenecks, whereas 8xMSAA absolutely does. There is something else in play with the Ultra textures and Nvidia cards.

As I said, I've played the game a lot with both a 3GB Nvidia card and a 4GB Nvidia card. What I've found is that they both show very similar behavior running Ultra textures. So going from 3GB to 4GB does not alleviate the stuttering if both are Nvidia. Going from, say, a 780 to a 290 seems to do so. Now how is that a VRAM thing and not an Nvidia thing?
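A crude way to tell the two failure signatures apart in a frametime log (isolated streaming hitches vs. a sustained VRAM-bound valley) is sketched below. The thresholds and the file name are arbitrary; the idea is just that hitching shows up as scattered spikes, while a true VRAM-bound valley is a long run of consecutive slow frames.

Code:
# Sketch: separate isolated frametime spikes from sustained slow stretches.
# Assumes a plain log with one frametime in milliseconds per line.
def analyze(path, spike_factor=2.5, slow_factor=1.5):
    ft = [float(line) for line in open(path) if line.strip()]
    median = sorted(ft)[len(ft) // 2]

    spikes = sum(1 for t in ft if t > spike_factor * median)

    longest = current = 0          # longest run of consecutive slow frames
    for t in ft:
        current = current + 1 if t > slow_factor * median else 0
        longest = max(longest, current)

    print("median frametime : {:.1f} ms".format(median))
    print("isolated spikes  : {}".format(spikes))
    print("longest slow run : {} frames".format(longest))

analyze("watchdogs_ultra_frametimes.txt")  # hypothetical log name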
 
I think it is just poorly optimized, at least for Nvidia. Metro: Last Light and Crysis 3 have far better textures. Even at Ultra, Watch Dogs looks average, and some textures look well below average for 2014 (people, certain props, cars, etc.). There's no reason for it to be using over 2GB of VRAM at 1080p with FXAA and 2x AF.

Stuttering on my end also seems random. Sometimes it runs very smoothly (at Ultra), and sometimes it stutters heavily (at High).
 
I find the game almost unplayable when driving. The stuttering happens no matter what settings I use. Developers should understand by now that without a steady frame rate a game is garbage. This is on my new PC with a GTX 780. I would be really pissed if I had paid any money for Watch Dogs. It came free with the video card, so it is just a waste of time.
 
I was really interested to see how much the 290X dominated the 780 Ti and Titan at 4K. And the graphs indicated that the problem is not VRAM.

Speaking of VRAM, Sapphire do a 290X with 8 GB VRAM.

I would have liked to have seen single-card performance at 4K. Tell us what settings we need to use to get smooth gameplay on a single 780 Ti or 290X at 4K.
 
Well, I suppose this means I will buy this game when it's really cheap, or when I have a 4GB card. Although, if what specops is saying is true, that won't really matter much. Hopefully Nvidia can optimize their drivers, but I doubt they will... not three months after release.

I must say, those 'Ultra' textures look very... 'average' for a 2014 game. The barely readable sign is just awful. Awful. Calling that 'high texture quality' is ridiculous. Somebody either miscalculated their texture-memory budget or they f*cked up the texture streaming.

I don't understand how this could have happened after the six-month (!!!) delay they got. The only explanation I can think of is that the engine is just too optimized for the consoles' large unified memory, and they couldn't figure out how to make it work for the PC's split memory architecture and just shipped the game.
 
I just now looked at the anti-aliasing screenshots, and I have to say this is kind of frustrating. Notice how NONE of the AA techniques listed here do jack shit on the fence and other alpha textures; at most they blur it. I would have liked to see some more tests that actually take care of it. So many damn games just call it a day with half-assed AA solutions like this.

On AMD cards you have Adaptive AA, and on Nvidia you have TrSSAA, to smooth out alpha textures, although thanks to current practices those don't always work. Both vendors also have the option of SGSSAA or RGSSAA, which again sometimes works and sometimes doesn't. It would be useful to know what Watch Dogs is capable of.
 
I would have liked to have seen single-card performance at 4K. Tell us what settings we need to use to get smooth gameplay on a single 780 Ti or 290X at 4K.

What's the point of buying a 4K monitor and having a system that can't power it? Anyone sinking that kind of cash into that kind of display has a multi-GPU setup, guaranteed.
 
Very thorough article, though I might suggest updating it with the 4 GB editions of the 760 and 770 and the 6 GB version of the 780, all of which [H] has reviewed in the past, for a true comparison.

My OC GTX 770 4GB runs just fine with the latest patch using everything ultra at 1440p.
 
My experience with this particular game, on my system running 1680x1050: I had massive stutters making it unplayable while driving.
Turning OFF the page file almost completely eliminated that, but then the system would run out of RAM after playing for a while and the game would crash.
 
My experience with this particular game, on my system running 1680x1050: I had massive stutters making it unplayable while driving.
Turning OFF the page file almost completely eliminated that, but then the system would run out of RAM after playing for a while and the game would crash.

Is your swap file on a hard drive or an SSD?
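One way to see whether the driving stutter actually lines up with page file pressure is to log system RAM and swap use while playing. Here is a minimal sketch with the psutil package; the interval and output format are arbitrary choices, not anything from the article.

Code:
# Minimal RAM / page file logger sketch using psutil (pip install psutil).
# Run it alongside the game; if swap use climbs right when the stutter hits,
# the page file is a likely suspect.
import time
import psutil

while True:
    ram = psutil.virtual_memory()
    swap = psutil.swap_memory()
    print("{}  RAM {:.0f}% used  swap {:.0f} / {:.0f} MB".format(
        time.strftime("%H:%M:%S"), ram.percent,
        swap.used / 1024.0**2, swap.total / 1024.0**2))
    time.sleep(1.0)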
 
I was really interested to see how much the 290X dominated the 780 Ti and Titan at 4K. And the graphs indicated that the problem is not VRAM.

Speaking of VRAM, Sapphire do a 290X with 8 GB of VRAM.

I would have liked to have seen single-card performance at 4K. Tell us what settings we need to use to get smooth gameplay on a single 780 Ti or 290X at 4K.

This is as close as you'll get:
http://hardocp.com/article/2014/07/23/gtx_780_oc_6gb_vs_r9_290x_4gb_overclocked_at_4k/4#.U_OnNxPn9eQ

Neither the overclocked 290X nor the 780 OC 6GB could run Ultra textures at 4K. Still, they were impressive results for single cards at that resolution.
 
Very thorough article, though I might suggest updating it with the 4 GB editions of the 760 and 770 and the 6 GB version of the 780, all of which [H] has reviewed in the past, for a true comparison.

My OC GTX 770 4GB runs just fine with the latest patch using everything ultra at 1440p.

What am I missing here? Looking at the review this doesn't seem possible.
 
What's the point of buying a 4K monitor and having a system that can't power it?

Because you can run many games (e.g. Starcraft 2, Borderlands 2) at full whack at 4K on a single 780 Ti and get good graphics with others (e.g. Tomb Raider).

Here are my settings on Tomb Raider:

[screenshot: Tomb Raider basic graphics settings]

[screenshot: Tomb Raider advanced graphics settings]


And I'm getting ~50 fps and smooth gameplay.

Anyone sinking that kind of cash into that kind of display has a multi-GPU setup, guaranteed.

You've obviously missed that 4K monitors have tumbled in price.
 