Watch Dogs Image Quality Preview [H]

FrgMstr (Staff member)
Watch Dogs Image Quality Preview - We have previewed Watch Dogs' performance; now it is time to preview image quality and look at some specific image quality differences. We will look at texture quality and the great differences between modes, anti-aliasing, and Horizon-Based Ambient Occlusion (HBAO) effects.
 
Excellent info on the 3GB limitation on Ultra textures at 1440p and above.


GeForce.com has an excellent Watch Dogs Graphics, Performance & Tweaking Guide with interactive comparisons: http://www.geforce.com/whats-new/guides/watch-dogs-graphics-performance-and-tweaking-guide#textures

I was amazed at how crucial the Shader option is for nighttime illumination.
[Attached image: SDShaderHiLow1000x.jpg - shader quality High vs. Low comparison]
 
"Ultra" textures require 3GB of VRAM" - nope, they require more than 3GB (so effectively a card with 4GB), even on 1080p, which is simply ABSURD and Ubi should be criticised for not optimizing memory usage on PC. Consoles have unified memory, so Ubi felt no need to cater to PC crowd. Just go and buy 4GB card, or lower your textures to High, which actually looks ugly http://i.imgur.com/T40A2fs.jpg
My friend has 2GB 770 card and he can't even play on high - not enough VRAM.

At that resolution, 3GB of VRAM wasn't enough for smooth performance with "Ultra" textures. We didn't talk about lower resolutions like 1080p. From further testing, it seems performance and consistency are just fine running "Ultra" textures at 1080p on 3GB video cards.
Did you actually test Ultra textures at 1080p? I did, and my memory usage was constantly at 3GB and the game was hitching and stuttering every few seconds - textures were swapping from RAM (or the page file) to VRAM. I am not able to play at 1080p at a stable 60fps on an overclocked 780 Ti - it's ridiculous just how badly this game is (not) optimised.
 
Thanks for the detailed screenshots, especially for the different forms of AA. It is amazing how poorly MSAA does in this engine, even at the 8X setting. I am going to sit this game out in the hopes that Ubisoft releases some major patches to address some of the gripes. I am enjoying Wolfenstein at the moment anyway.
 
"Ultra" textures require 3GB of VRAM" - nope, they require more than 3GB (so effectively a card with 4GB), even on 1080p, which is simply ABSURD and Ubi should be criticised for not optimizing memory usage on PC. Consoles have unified memory, so Ubi felt no need to cater to PC crowd. Just go and buy 4GB card, or lower your textures to High, which actually looks ugly http://i.imgur.com/T40A2fs.jpg
My friend has 2GB 770 card and he can't even play on high - not enough VRAM.


Did you actually test Ultra textures at 1080p? I did, and my memory usage was constantly at 3GB and the game was hitching and stuttering every few seconds - textures were swapping from RAM (or the page file) to VRAM. I am not able to play at 1080p at a stable 60fps on an overclocked 780 Ti - it's ridiculous just how badly this game is (not) optimised.

Someone in the other thread (about [H]'s first Watch Dogs preview today) said that disabling the page file on their PC helped eliminate the stuttering. Might give that a try. :)
 
Someone in the other thread (about [H]'s first Watch Dogs preview today) said that disabling the page file on their PC helped eliminate the stuttering. Might give that a try. :)

Well, as far as I remember, disabling the page file is not recommended, as it may lead to unexpected problems when RAM fills up. I may try it, but it is not a solution, just a workaround for the developers' laziness.
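
As I understand it, the risk is the commit limit: with no page file, Windows can only commit roughly as much memory as you have physical RAM, and any allocation past that simply fails. A quick Python sketch (standard-library ctypes, Windows only) that reads the limit, if you want to check your own headroom:

```python
# Read the Windows commit limit via GlobalMemoryStatusEx (ctypes, stdlib).
# With the page file disabled, the commit limit is roughly physical RAM.
import ctypes

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", ctypes.c_ulong),
        ("dwMemoryLoad", ctypes.c_ulong),
        ("ullTotalPhys", ctypes.c_ulonglong),
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),  # system commit limit
        ("ullAvailPageFile", ctypes.c_ulonglong),  # commit still available
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

stat = MEMORYSTATUSEX()
stat.dwLength = ctypes.sizeof(stat)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))

gib = 1024 ** 3
print(f"Physical RAM: {stat.ullTotalPhys / gib:.1f} GB")
print(f"Commit limit: {stat.ullTotalPageFile / gib:.1f} GB")
print(f"Commit available: {stat.ullAvailPageFile / gib:.1f} GB")
```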
 
My memory usage was constantly at 3GB and the game was hitching and stuttering every few seconds - textures were swapping from RAM (or the page file) to VRAM. I am not able to play at 1080p at a stable 60fps.

I am getting the same thing as well, with a GTX 780.
 
This is a disgrace. What is the point of having a high-end Nvidia video card if 3GB is not enough for normal quality textures?
 
I am getting the same thing as well, with a GTX 780.

I play at 1200p, ultra, FXAA, and have almost no stuttering problems. The latest beta driver fixed a lot of those issues. VRAM is not even full - 2.6-2.9GB most of the time. The game eats tons of RAM, though; I've seen more than 10GB being used with nothing special in the background.
 
This is a disgrace. What is the point of having a high-end Nvidia video card if 3GB is not enough for normal quality textures?

'Normal quality' is just an arbitrary label. If the game is actually using that much VRAM on the texture pool and framebuffer then that's simply how much the game needs for a given level of detail.

For example, if the PS4 is running the game on a 'High quality' equivalent and has access to 5GB of unified memory, then three of those gigabytes could be in use by the texture pool and framebuffer. You're not going to get that onto a 2GB card.
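
To put rough numbers on the framebuffer side (the buffer counts and formats below are my own illustrative guesses, not the game's actual render pipeline), here's a quick sketch:

```python
# Back-of-the-envelope render-target math at 1080p.
# Buffer counts/formats are illustrative assumptions only.
def buffer_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / 1024 ** 2

w, h = 1920, 1080
color   = buffer_mb(w, h, 4)       # RGBA8 back buffer, ~8 MB
depth   = buffer_mb(w, h, 4)       # 32-bit depth/stencil, ~8 MB
gbuffer = 4 * buffer_mb(w, h, 4)   # four RGBA8 deferred targets, ~32 MB

print(f"~{color + depth + gbuffer:.0f} MB before a single texture loads")
# MSAA multiplies the multisampled targets by the sample count, and 4K
# has 4x the pixels of 1080p - resolution and AA inflate this fast.
```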
 
...
Did you actually test Ultra textures at 1080p? I did, and my memory usage was constantly at 3GB and the game was hitching and stuttering every few seconds - textures were swapping from RAM (or the page file) to VRAM. I am not able to play at 1080p at a stable 60fps on an overclocked 780 Ti - it's ridiculous just how badly this game is (not) optimised.

So I'm running this with three Titans, and at 3840x2160 I'm sitting at about 4500MB of VRAM used. That is with maximum settings on everything except: Motion Blur (don't like it), Depth of Field (don't like it some of the time), and no AA (while I can do some AA, I just left it off because I can't see a "big" difference either way for my playing). This is with Ultra textures.

Turn on that AA, though... and to the top the VRAM usage goes!

I was using 2x TXAA at 1080p with the same system and it worked very well, but I can't remember my VRAM usage now...
 
I like the fact that this game is ultimately going to push AMD and Nvidia to add more VRAM and speed to their next generation of graphics cards. As quiet and cool as the GTX 780 and 780 Ti are, I never understood why Nvidia would make them with only 3GB of VRAM when Hitman: Absolution exceeded that years ago. Heck, Skyrim with mods can do it also. Then take into consideration that 4K gaming isn't something new, and everyone can see why 3GB of VRAM is too small.

In the end I think that Nvidia will enable better texture swapping in a future driver. At least I hope so.
 
I would not mind this crazy hunger for memory if these textures that require 4GB of VRAM looked superbly detailed and ultra-crisp (like the posters in Bioshock Infinite, or the walls in BF4). But they don't. They look simply good for a 2014 AAA game.

'Normal quality' is just an arbitrary label. If the game is actually using that much VRAM on the texture pool and framebuffer then that's simply how much the game needs for a given level of detail.
The game needs more VRAM than any other game in known human history and has nothing to show for it (i.e. it does not look spectacularly better in any way that would explain the need for so much memory). This obviously means very poor optimization on Ubi's part, and it should be brought up, not excused by saying "well, if it needs that much, then it needs that much, end of story." Otherwise next time they will launch a game that requires 6GB of VRAM for no reason (other than laziness), and then what?

For example, if the PS4 is running the game on a 'High quality' equivalent and has access to 5gb of unified memory then three of those gigabytes could be in use by the texture pool and framebuffer. You're not going to get that onto a 2gb card.
If you do a lazy port, then you are not. But we already have games with way better textures than Watch Dogs (Bioshock Infinite, Crysis 3, Metro: Last Light, Battlefield 4, Max Payne 3, Batman: Arkham City) which do fine with 2GB of VRAM, while Watch Dogs has worse textures and needs 4GB. Additionally, the textures that do not fit into video memory are not kept in fast RAM, but in the slow page file (so you need to manually disable the page file to force the textures into RAM - that's good coding right there). Such a high requirement is indefensible considering how the game looks.

Here is a texture from the 2004 game Far Cry. That game is 10 years old, and some of its textures were so detailed that you could read the headlines on the cover:
http://i.imgur.com/5kqpp.jpg

A more recent example - Bioshock Infinite: http://cloud-3.steampowered.com/ugc/723121226387225528/225F0BCCD7DCFB858E9055F4797C4E433CAE1EA2/
The texture is so detailed you can read the lettering on the stamp and learn that there is a Faculty of Music at Finkton University of Columbia. I really like such small touches - tiny background details that contribute to painting a picture of the in-game world (and that is why I like nice textures).
 


You have to play on the lowest texture setting at 1080p with a 2GB card to not have stuttering. That's with the game on an SSD and the paging file disabled. Meanwhile, the textures look like complete shit on "medium" (why is Medium the lowest texture setting?) while using more VRAM than any game ever made. High textures look like what you would expect from a game released in the last 5 years, not the supposed technological marvel that Watch Dogs was supposed to be. Ultra textures are marginally better than High, and yet you need 4GB of VRAM to be able to play at 1080p without stuttering (that's right, the $700 780 Ti doesn't have enough VRAM to max this game at 1080p). You can forget trying to play this game with anything less than 2GB of VRAM at 1080p, unless you don't care that the game drops to sub-10 fps frequently while hitching every few seconds in less intensive areas. What a fucking joke.
 
Have any of you adjusted the shadow quality to see how that affects your performance? I only ask because I am playing at 3840x2160, Ultra textures, Ultra level of detail, medium shadows, no AA with a 2GB 680, and I don't experience any stuttering at all. FPS isn't the best, sits around 30, but I deal with it because IQ is so much better at 4K. VRAM usage is typically around 1500-1700MB. System RAM usage is around 8-9GB for Watch Dogs alone (12GB total system RAM). I also have a 6-core processor (Xeon X5650 @ 3.4GHz), which might be making the difference; I see Watch Dogs putting a 60-70% load on 6 threads continuously.
 
I would not mind this crazy hunger for memory if these textures that require 4GB of VRAM looked superbly detailed and ultra-crisp (like the posters in Bioshock Infinite, or the walls in BF4). But they don't. They look simply good for a 2014 AAA game.


The game needs more VRAM than any other game in known human history and has nothing to show for it (i.e. it does not look spectacularly better in any way that would explain the need for so much memory). This obviously means very poor optimization on Ubi's part, and it should be brought up, not excused by saying "well, if it needs that much, then it needs that much, end of story." Otherwise next time they will launch a game that requires 6GB of VRAM for no reason (other than laziness), and then what?


If you do a lazy port, then you are not. But we already have games with way better textures than Watch Dogs (Bioshock Infinite, Crysis 3, Metro: Last Light, Battlefield 4, Max Payne 3, Batman: Arkham City) which do fine with 2GB of VRAM, while Watch Dogs has worse textures and needs 4GB. Additionally, the textures that do not fit into video memory are not kept in fast RAM, but in the slow page file (so you need to manually disable the page file to force the textures into RAM - that's good coding right there). Such a high requirement is indefensible considering how the game looks.

Here is a texture from the 2004 game Far Cry. That game is 10 years old, and some of its textures were so detailed that you could read the headlines on the cover:
http://i.imgur.com/5kqpp.jpg

A more recent example - Bioshock Infinite: http://cloud-3.steampowered.com/ugc/723121226387225528/225F0BCCD7DCFB858E9055F4797C4E433CAE1EA2/
The texture is so detailed you can read the lettering on the stamp and learn that there is a Faculty of Music at Finkton University of Columbia. I really like such small touches - tiny background details that contribute to painting a picture of the in-game world (and that is why I like nice textures).

I can't speak to texture swapping to a page file or whether that's even correct (which should be easy to test), but you aren't quite comparing like-for-like games. FPSes have very different memory needs from open-world games. GTA4, for example, scales VRAM usage linearly with rendering distance: its settings page tells you exactly how much memory is consumed at a given rendering distance, and doubling the rendering distance doubles memory consumption, because the game tries to have as few repeating textures as possible. Conversely, a game like Bioshock or Skyrim may only be rendering a few dozen square meters at a time and is full of repeating textures, which allows those textures to be more intricate.

Don't get me wrong, because I absolutely don't think the game is anything to write home about visually. But you can't say that graphics card X is supposed to be able to handle Y-quality diffuse textures without actually knowing how the rest of the memory budget is being spent. GTA4 was made for 512MB consoles but can consume up to 1.5GB of VRAM if you increase rendering distance to its maximum. It's still using the same low-resolution textures, but now there are 3x more of them. And that's not counting higher-resolution shadow maps, specular maps, normal maps, NPCs, cars, and whatever rendering buffers are in use in Watch Dogs. Just looking at a flat billboard surface and saying that the memory budget is being wasted ignores every other aspect where it may actually be better used.

It's not dissimilar to the recent Wolfenstein fallout, where the textures are made of enormous database files. Some people think that those files should be smaller and that their 2GB cards are supposed to handle uncompressed textures, and that the game is therefore unoptimized. There are huge parts of the equation that those people are missing.
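
To put toy numbers on the repeating-vs-unique texture point (these are illustrative assumptions, not measured data from any of these games):

```python
# Why unique-texture open worlds eat VRAM: same per-texture cost,
# far higher texture count. All numbers are illustrative only.
def texture_mb(width, height, bytes_per_texel=4):
    # x4/3 approximates a full mip chain on top of the base level
    return width * height * bytes_per_texel * (4 / 3) / 1024 ** 2

one = texture_mb(1024, 1024)  # ~5.3 MB uncompressed RGBA8

print(f"corridor shooter, ~50 unique textures tiled: ~{50 * one:.0f} MB")
print(f"open world, ~500 unique textures in view:    ~{500 * one:.0f} MB")
# 10x the unique textures means roughly 10x the texture memory, even
# though no single texture looks any crisper up close.
```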
 
Have any of you adjusted the shadow quality to see how that affects your performance? I only ask because I am playing at 3840x2160, Ultra textures, Ultra level of detail, medium shadows, no AA with a 2GB 680, and I don't experience any stuttering at all. FPS isn't the best, sits around 30, but I deal with it because IQ is so much better at 4K. VRAM usage is typically around 1500-1700MB. System RAM usage is around 8-9GB for Watch Dogs alone (12GB total system RAM). I also have a 6-core processor (Xeon X5650 @ 3.4GHz), which might be making the difference; I see Watch Dogs putting a 60-70% load on 6 threads continuously.

I don't understand how everyone is getting such ridiculous RAM usage. The game never goes over ~2080MB for me. I did what you said and the stuttering is slightly less, but it's still there, and I'm only playing at 1080p with my VRAM maxed out around 2010MB. I don't understand this game.
 
So, I think it's time for me to step up to 6GB Titan Black Edition cards. Lolololol.
 
I don't understand how everyone is getting such ridiculous RAM usage. The game never goes over ~2080MB for me. I did what you said and the stuttering is slightly less, but it's still there, and I'm only playing at 1080p with my VRAM maxed out around 2010MB. I don't understand this game.
I have a 780 Ti, and at 1080p with Ultra settings and Temporal SMAA my VRAM usage is maxed out and the game stutters all the time.
 
I don't understand how everyone is getting such ridiculous RAM usage. The game never goes over ~2080MB for me. I did what you said and the stuttering is slightly less, but it's still there, and I'm only playing at 1080p with my VRAM maxed out around 2010MB. I don't understand this game.

Your VRAM usage maxes out at 2010MB because you only have a 2GB card. The game could be trying to allocate 3GB of VRAM, but because your card only has 2GB you won't see it - the rest spills over into system memory and causes stutter.

It also depends on what tools you're using, because Task Manager only tells half the story. Wolfenstein is 'only' using 2.5GB of RAM according to Task Manager, where Process Explorer shows it allocating up to 5.5GB of RAM on my own system. Turning off compressed textures makes that number grow by 1.6GB despite only showing a few hundred megabytes of increased VRAM usage.

There are many, many different metrics for showing memory in use, and they often seem to contradict each other. I find that Process Explorer is your best bet for showing actual overall memory usage: run the program, press Ctrl+I, and look at System Commit in the Memory tab before running the game and during play to see its real use. Physical Memory corresponds to the number you see in Task Manager, while System Commit includes VRAM and various other things not represented there.
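
If you'd rather script it than eyeball Process Explorer, here's a minimal sketch with the third-party psutil module (pip install psutil). On Windows, psutil reports the working set as rss (the Task Manager number) and the process's pagefile/commit usage as vms; the executable-name match below is just a guess at the game's process name:

```python
# Compare working set vs. committed memory for a running game process.
import psutil

TARGET = "watch"  # hypothetical substring of the game's exe name

for p in psutil.process_iter(["name", "memory_info"]):
    name = p.info["name"] or ""
    if TARGET in name.lower():
        mi = p.info["memory_info"]
        print(f"{name}: working set {mi.rss / 1024**2:.0f} MB, "
              f"committed {mi.vms / 1024**2:.0f} MB")
# A big gap between the two numbers is memory the game has allocated
# but which isn't resident in RAM - exactly what Task Manager hides.
```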
 
Holy crap, never have I seen such a big discrepancy between High quality textures and the next setting up. Based only on the screenshots, it looks like "Ultra" uses 2-3 times the texture resolution.

Even so, that "Ultra" quality setting looks equivalent to Medium settings in most modern FPS games. This will be very disappointing for people with last-generation 2GB cards.
 
So, I think it's time for me to step up to 6GB Titan Black Edition cards. Lolololol.
Let's hold off on emptying your wallet just a little longer. Guys, the game has been out to the public for less than a day, and Ubi has already released a patch. The game is not as well optimized as it should be for the last-gen graphics that it has. I would definitely work on those settings, though - the Medium texture setting is just horrendous to my eyes.
 
I don't understand how everyone is getting such ridiculous RAM usage. The game never goes over ~2080MB for me. I did what you said and the stuttering is slightly less, but it's still there, and I'm only playing at 1080p with my VRAM maxed out around 2010MB. I don't understand this game.
Same.
I'd love to see this game chew through some of my 8GB of RAM, but as of right now it uses a little less than 2GB.
 
"Ultra" textures require 3GB of VRAM" - nope, they require more than 3GB (so effectively a card with 4GB), even on 1080p, which is simply ABSURD and Ubi should be criticised for not optimizing memory usage on PC. Consoles have unified memory, so Ubi felt no need to cater to PC crowd. Just go and buy 4GB card, or lower your textures to High, which actually looks ugly http://i.imgur.com/T40A2fs.jpg
My friend has 2GB 770 card and he can't even play on high - not enough VRAM.

Yeah...the optimization (or lack thereof) in this game is seriously a joke.
 
Your VRAM usage maxes out at 2010MB because you only have a 2GB card. The game could be trying to allocate 3GB of VRAM, but because your card only has 2GB you won't see it - the rest spills over into system memory and causes stutter.

Yes, that's why I'm confused - "gee, why isn't my VRAM usage more than 2GB on my 2GB graphics card?" - not because the guy running the game at 4K has less VRAM usage than I do at 1080p. lol.

But yes, I always forget about committed memory. Looks like the game was using ~5GB of RAM.
 
I have a 780 Ti, and at 1080p with Ultra settings and Temporal SMAA my VRAM usage is maxed out and the game stutters all the time.

I don't have the game to test for myself, but if you only have 8GB of RAM then that could be a reason for stutters unto itself. VRAM spilling into system memory, and system memory spilling into the page file, would be pretty much unheard of on a modern system, assuming that the earlier number of 8-9GB of memory use is correct.

Try running Process Explorer as detailed above to see how much memory is actually in use by the game. I'd very much appreciate seeing that number.
 
Have any of you guys playing the game on medium to low settings tried the newest driver update? That might help as well. I'm on a GTX 670 2GB and I can play on High with 2x MSAA and get well over 60 FPS at 1080p.

It does chew through my RAM, I will say that.
 
I don't have the game to test for myself, but if you only have 8GB of RAM then that could be a reason for stutters unto itself. VRAM spilling into system memory, and system memory spilling into the page file, would be pretty much unheard of on a modern system, assuming that the earlier number of 8-9GB of memory use is correct.

Try running Process Explorer as detailed above to see how much memory is actually in use by the game. I'd very much appreciate seeing that number.


I will try, but that would be the first time in history that 8GB of RAM and 3GB of VRAM is not enough, which would be funny if it wasn't tragic.

Are there people with 16 gigs that do not experience stutter?
 
10 minutes of driving around the starter zone.
1920x1200, High textures, Temporal AA, everything else max.

http://i.imgur.com/EylqVud.png

I'm using the disable page file shortcut.
I DON'T have any stuttering problems, though. The only problem I have with this game is low fps on average.
 

Yes. I just double-checked to make sure I wasn't speaking with my foot in my mouth. Task Manager and Resource Monitor both report 'Memory in use' at 2-2.5GB for Wolfenstein. 'Modified' memory changes from ~500MB to ~1600MB depending on whether I'm using compressed or uncompressed textures, and signifies textures spilling from VRAM into system RAM and causing texture pop-in.

These numbers are otherwise invisible to both Task Manager and Resource Monitor.
 
I don't have the game to test for myself, but if you only have 8GB of RAM then that could be a reason for stutters unto itself. VRAM spilling into system memory, and system memory spilling into the page file, would be pretty much unheard of on a modern system, assuming that the earlier number of 8-9GB of memory use is correct.

Try running Process Explorer as detailed above to see how much memory is actually in use by the game. I'd very much appreciate seeing that number.

You don't need to use Process Explorer to see the amount of committed memory. I just did it - it was 2.2GB before and 7.2GB after.

[Screenshots: committed memory before and after launching the game]


10 minutes of driving around the starter zone.
1920x1200, High textures, Temporal AA, everything else max.

http://i.imgur.com/EylqVud.png

I'm using the disable page file shortcut.
I DON'T have any stuttering problems, though. The only problem I have with this game is low fps on average.

Of course you don't have any stuttering problems - your card has 3GB of VRAM and you're playing with High textures.
 
Of course you don't have any stuttering problems - your card has 3GB of VRAM and you're playing with High textures.
I don't want people to falsely assume I'm associating ~2GB of RAM usage with the stuttering issue, since I don't suffer from it (just for clarity).
I would like to see this game utilize more RAM, since Ubisoft claimed it would, although I don't think I would see any benefit from it.

It doesn't explain why other people are seeing 6+ GB.
Memory leak?

What is that? You can disable the page file without restarting?
Create a shortcut to Watch Dogs.exe and add this launch option to the target:
-disablepagefilecheck
 
I don't want people to falsely assume I'm associating ~2GB of RAM usage with the stuttering issue, since I don't suffer from it (just for clarity).
I would like to see this game utilize more RAM, since Ubisoft claimed it would, although I don't think I would see any benefit from it.

It doesn't explain why other people are seeing 6+ GB.
Memory leak?


Create a shortcut to Watch Dogs.exe and add this launch option to the target:
-disablepagefilecheck

The more RAM you have, the more Windows will allocate. It's normal.
 
I will try, but that would be the first time in history that 8GB of RAM and 3GB of VRAM is not enough, which would be funny if it wasn't tragic.

Are there people with 16 gigs that do not experience stutter?

Heh, I wouldn't have laughed if you hadn't pointed out how tragic it was. But, for the sake of it, I want to show how much memory Wolfenstein actually uses.

Here is my system before running the game:

[Screenshot: memory usage before running the game]


Here it is while using compressed textures:

[Screenshot: memory usage with compressed textures]


Here it is while using uncompressed textures:

[Screenshot: memory usage with uncompressed textures]



Considering that most of this memory is consumed by textures, it kind of puts things into perspective. My GPU, conversely, only shows a difference of about 1.5GB from 'no game running' to 'running the game uncompressed'.
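
For perspective on why the compression toggle swings things so much: block-compressed formats like BC1/DXT1 and BC3/DXT5 store 4 or 8 bits per texel versus 32 bits for raw RGBA8. A toy calculation (my own illustrative numbers, assuming full mip chains):

```python
# Raw vs. block-compressed texture sizes for a single 2048x2048 texture.
def size_mb(width, height, bits_per_texel):
    bits = width * height * bits_per_texel * (4 / 3)  # +1/3 for mips
    return bits / 8 / 1024 ** 2

for fmt, bpt in [("RGBA8 (raw)", 32), ("BC3/DXT5", 8), ("BC1/DXT1", 4)]:
    print(f"2048x2048 {fmt}: ~{size_mb(2048, 2048, bpt):.1f} MB")
# RGBA8 ~21.3 MB, BC3 ~5.3 MB, BC1 ~2.7 MB: a 4-8x swing per texture,
# which is the same order of growth as the uncompressed toggle above.
```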
 
Wow, just wow. And the best part is that these textures are very poor. We are entering a new era of lazy ports from consoles with 8GB of unified memory.
 