Watchdogs 2 to be AMD Optimized

Yakk

Well... This one came out of left field IMO. :eek:

Watchdogs 2 to be AMD Optimized

Ubi$oft are very, very good at getting government funding to sponsor their studios and getting Nvidia to sponsor their individual game titles (if not directly sponsored, then through the GameWorks program), so I was surprised to see them associated with AMD (again). Maybe their string of release disasters with Watch Dogs 1, AC, etc. had something to do with it, maybe not; who knows.

Hopefully, with DX12 and by working with AMD, they might finally be able to deliver on the graphics fidelity for Watch Dogs on both consoles and PC?
 
It seems that the damage caused by using Sabotageworks in the last game cost Ubisoft more in the long term, through the bad press, than what they got up front from the kickback/co-marketing deal with Nvidia.
The Division is an example of how a well-optimized game for ALL should play out. The dev worked closely with AMD to optimize for console, and those optimizations carried over to PC. They then bolted on Sabotageworks, BUT it didn't actually hinder performance on AMD because they had already optimized the game for AMD's architecture. Everyone gets a good gaming experience, and the game sold extremely well.

Kudos to Red Storm for showing the other Ubisoft studios how things should be done.
 
Nvidia probably passed on this after the graphics controversy of the first game...AMD usually only gets the titles Nvidia doesn't want :D
 
So now Watch_Dogs 2 will run great on all platforms. Good News!

Problems with Watch Dogs were always related more to CPU, RAM, and VRAM than to the GPU or bad coding itself. Watch Dogs always ran well on my old R9 280X at the time; it performed equal to the Nvidia GTX 770, which was its direct competitor, and it actually performed better than the 2GB GTX 770 because of VRAM. That was the real issue: heavy VRAM usage that needed a 3GB+ GPU, an 8-thread Intel CPU, and 16GB of RAM to be fully smooth. The problem? All of those things had been on the market for ages before Watch Dogs. Nvidia owners grumbled because their 2GB GPUs couldn't run Ultra textures in that game without heavy stuttering. Again, the problem was never GameWorks; it was Ubisoft, who never cared to launch a polished game. It was unfinished and visually downgraded.
 
AMD just has more experience with DX12, despite nVidia claiming they were working with msft 6 years ago. :D

That's more than likely the reason they'd go with AMD support. They might even have been one of the many devs who were involved in the Mantle program once AMD opened it up.
 
Problems with Watch Dogs were always related more to CPU, RAM, and VRAM than to the GPU or bad coding itself. Watch Dogs always ran well on my old R9 280X at the time; it performed equal to the Nvidia GTX 770, which was its direct competitor, and it actually performed better than the 2GB GTX 770 because of VRAM. That was the real issue: heavy VRAM usage that needed a 3GB+ GPU, an 8-thread Intel CPU, and 16GB of RAM to be fully smooth. The problem? All of those things had been on the market for ages before Watch Dogs. Nvidia owners grumbled because their 2GB GPUs couldn't run Ultra textures in that game without heavy stuttering. Again, the problem was never GameWorks; it was Ubisoft, who never cared to launch a polished game. It was unfinished and visually downgraded.

It stuttered badly on my HD 7950 Crossfire setup. I bought this R9 290 because of this and even had to OC the memory for the last scenarios when all hell breaks loose and you have to drive for extended periods. The driving was a stuttering mess most of the time. Watch Dogs was one of the main reasons I loathe Crossfire to this day. SLi hate came from experiences I encountered a few years prior.
 
It seems that the damage caused by using Sabotageworks in the last game cost Ubisoft more in the long term, through the bad press, than what they got up front from the kickback/co-marketing deal with Nvidia.
The Division is an example of how a well-optimized game for ALL should play out. The dev worked closely with AMD to optimize for console, and those optimizations carried over to PC. They then bolted on Sabotageworks, BUT it didn't actually hinder performance on AMD because they had already optimized the game for AMD's architecture. Everyone gets a good gaming experience, and the game sold extremely well.

Kudos to Red Storm for showing the other Ubisoft studios how things should be done.

For sure it is a step in the right direction, but don't count Gameworks interference out yet.

For example, The Division and War Thunder. In War Thunder's case the game was functioning pretty darn well up to their infamous GameWorks update, which broke Crossfire (it had been working extremely well before), and the game just generally doesn't run very well on AMD hardware anymore. The Division is also suffering from broken Crossfire support, and so far AMD seems to be having a very hard time working through whatever Nvidia did in those recent GW updates, which broke things in an oddly similar way in both games.
 
Explains a lot of the GW hate still ongoing, even if with single cards AMD has a solid grasp on driver fixes.

How does GW break CF? Even if you disable all GW features? (A well laid trap for anyone with a good answer).
 
It stuttered badly on my HD 7950 Crossfire setup. I bought this R9 290 because of this and even had to OC the memory for the last scenarios when all hell breaks loose and you have to drive for extended periods. The driving was a stuttering mess most of the time. Watch Dogs was one of the main reasons I loathe Crossfire to this day. SLi hate came from experiences I encountered a few years prior.

It ran smoothly on my 290... With these settings it was fine.
 
Explains a lot of the GW hate still ongoing, even if with single cards AMD has a solid grasp on driver fixes.

How does GW break CF? Even if you disable all GW features? (A well laid trap for anyone with a good answer).

All I can say is that my 290X Crossfire ran perfectly in The Division beta; on release, horrible graphical flickering. This is hardly the first time something has gone from working to broken. Years ago (I've run dual GPU for ages now, AMD and Nvidia), before all this started happening, Crossfire worked 95% of the time without needing a driver update or anything. Now it mostly doesn't work, and when it doesn't work it usually runs even worse than normal until I completely disable it in the control panel. SLI, on the other hand, barely worked with anything back then; it's better now for sure, but I'd say Nvidia is still shy of what AMD once was. I've had 15 dual-GPU rigs over the last 10 years, so I have a bit of experience with them.
 
Explains a lot of the GW hate still ongoing, even if with single cards AMD has a solid grasp on driver fixes.

How does GW break CF? Even if you disable all GW features? (A well laid trap for anyone with a good answer).
It doesn't, or rather, it shouldn't.

Probably something else was updated in the game; there's no reason this should happen. As you say, turn GW off: is CF still broken?
 
All I can say is that my 290X Crossfire ran perfectly in The Division beta; on release, horrible graphical flickering. This is hardly the first time something has gone from working to broken. Years ago (I've run dual GPU for ages now, AMD and Nvidia), before all this started happening, Crossfire worked 95% of the time without needing a driver update or anything. Now it mostly doesn't work, and when it doesn't work it usually runs even worse than normal until I completely disable it in the control panel. SLI, on the other hand, barely worked with anything back then; it's better now for sure, but I'd say Nvidia is still shy of what AMD once was. I've had 15 dual-GPU rigs over the last 10 years, so I have a bit of experience with them.

SLI was so broken back in the day that I contemplated dropping my HD 5850 back into my PC to replace my GTX 460 SLI setup. I had spent so much money on an inflated day-one purchase of those 460s that I mentally couldn't swap setups. AMD's runt-frame fiasco was pretty bad too, as the stutter would give me immense headaches. I was so glad when Nvidia and reviewers called them out on it, as the fix made my HD 7950 Crossfire setup magnificent. Watch Dogs sent me back to a single-card setup eventually, though.
 
It doesn't, or rather, it shouldn't.

Probably something else was updated in the game; there's no reason this should happen. As you say, turn GW off: is CF still broken?

As stated above, we are at two games that had Crossfire CONFIRMED working before their GameWorks updates and became a mess after them, in a fashion so similar that it's very hard to call it a coincidence.

It's not a case of turning off GameWorks features either... In War Thunder's case the GameWorks libraries unfortunately replaced the DX11 codepath in the game's engine, so GW can't be turned off on PC. The only alternative is the OpenGL codepath, which doesn't support multi-card setups anyway. I need to look into The Division more to see exactly how GameWorks broke it in such a similar way to War Thunder; I suspect a very similar situation where the game code is concerned.
 
As stated above, we are at two games that had Crossfire CONFIRMED working before their GameWorks updates and became a mess after them, in a fashion so similar that it's very hard to call it a coincidence.

It's not a case of turning off GameWorks features either... In War Thunder's case the GameWorks libraries unfortunately replaced the DX11 codepath in the game's engine, so GW can't be turned off on PC. The only alternative is the OpenGL codepath, which doesn't support multi-card setups anyway. I need to look into The Division more to see exactly how GameWorks broke it in such a similar way to War Thunder; I suspect a very similar situation where the game code is concerned.

I have no idea about Crossfire or SLI support in those games, but there's no specific reason why GameWorks features would break multi-GPU. I'm sure those weren't the only changes when those updates were patched in, and many other titles with GameWorks work just fine. Are you saying this is yet another Nvidia conspiracy?
 
I have no idea about Crossfire or SLI support in those games, but there's no specific reason why GameWorks features would break multi-GPU. I'm sure those weren't the only changes when those updates were patched in, and many other titles with GameWorks work just fine. Are you saying this is yet another Nvidia conspiracy?
Based on the examples there is a viable reason for that outcome. No one is saying it's the definitive explanation, just a reasonable one until more evidence is found. And to be totally honest, GW was a tool to enhance Nvidia's own card performance, not the competitor's. If parity were their goal, and the aim were only to show off higher-quality effects, then their libraries would have been open.
 
Based on the examples there is a viable reason for that outcome. No one is saying it's the definitive explanation, just a reasonable one until more evidence is found. And to be totally honest, GW was a tool to enhance Nvidia's own card performance, not the competitor's. If parity were their goal, and the aim were only to show off higher-quality effects, then their libraries would have been open.

Big difference between 'GW is a library to enhance the experience on nV hardware' and claiming it breaks Crossfire (and SLI?) intentionally, twice.

Now that the source code is available, will you stop complaining about it?
 
Of course SLI works in WT. I'm just looking at the parallels between the two games; the similarities before and after GameWorks was added are uncanny.
 
Of course SLI works in WT. I'm just looking at the parallels between the two games; the similarities before and after GameWorks was added are uncanny.

Yeah, I'm just saying though: since there are many examples of games with GW where Crossfire works, and games with GW in which AMD performs well, maybe, just maybe, it's a question of implementation and developer effort in maintaining support for multi-GPU.

Seeing this in your sig:
"gameworks is a terrible destructive negative force in PC gaming. Quote me. Its evil." Roy Talor - #gameswithoutgameworks

Makes it hard for me to consider you impartial in this discussion
 
Yeah, I'm just saying though: since there are many examples of games with GW where Crossfire works, and games with GW in which AMD performs well, maybe, just maybe, it's a question of implementation and developer effort in maintaining support for multi-GPU.

Seeing this in your sig:
"gameworks is a terrible destructive negative force in PC gaming. Quote me. Its evil." Roy Talor - #gameswithoutgameworks

Makes it hard for me to consider you impartial in this discussion


Given your posting history, I'd say it's hard for anyone to consider you impartial either, so I'd argue that it all evens out. :p
 
Well... This one came out of left field IMO. :eek:

Watchdogs 2 to be AMD Optimized

Ubi$oft are very, very good at getting government funding to sponsor their studios and getting Nvidia to sponsor their individual game titles (if not directly sponsored, then through the GameWorks program), so I was surprised to see them associated with AMD (again). Maybe their string of release disasters with Watch Dogs 1, AC, etc. had something to do with it, maybe not; who knows.

Hopefully, with DX12 and by working with AMD, they might finally be able to deliver on the graphics fidelity for Watch Dogs on both consoles and PC?

Developers seem to switch sides very quickly these days.
From a business-development standpoint this makes sense IMO, as Ubi$oft would see the best profit from working with AMD, making the "best game" they can on console, and porting it as directly as possible to PC to reduce costs, people involved, and timeframe milestones.
The key is getting the sales on PC before the gripes and lamentations gain speed :)
Although this would suggest one card manufacturer may have a headache if they cannot engage successfully.
Cheers
 
Given your posting history, I'd say it's hard for anyone to consider you impartial either, so I'd argue that it all evens out. :p
I thought I replied to this. My post history consists of well-founded claims backed by evidence, claims whose veracity you denied on the basis that I'm some kind of Nvidia marketing guy.

When I don't know something I just admit it; learn from my example.

Your posts in a nutshell:

[image: Binyamin-Netanyahu-009.jpg]

I've never seen someone less contrite than you.
 
Yeah, I'm just saying though: since there are many examples of games with GW where Crossfire works, and games with GW in which AMD performs well, maybe, just maybe, it's a question of implementation and developer effort in maintaining support for multi-GPU.

Seeing this in your sig:
"gameworks is a terrible destructive negative force in PC gaming. Quote me. Its evil." Roy Talor - #gameswithoutgameworks

Makes it hard for me to consider you impartial in this discussion

I stand by my sig line, or else I wouldn't have written it. Some of the stuff... Anyway, I'm not looking to argue; it is what it is. I just noticed something interesting from two separate developers as the thread progressed. Could it be something else? Possible.
 
It seems that the damage caused by using Sabotageworks in the last game cost Ubisoft more in the long term, through the bad press, than what they got up front from the kickback/co-marketing deal with Nvidia.
The Division is an example of how a well-optimized game for ALL should play out. The dev worked closely with AMD to optimize for console, and those optimizations carried over to PC. They then bolted on Sabotageworks, BUT it didn't actually hinder performance on AMD because they had already optimized the game for AMD's architecture. Everyone gets a good gaming experience, and the game sold extremely well.

Kudos to Red Storm for showing the other Ubisoft studios how things should be done.

It really had nothing to do with Nvidia; Ubisoft had been on the decline for years. They sold out to the mainstream on their Tom Clancy titles in the early 2000s and dropped the ball on support for all their niche titles too. Remember IL-2 Cliffs of Dover? Rushed and horribly broken at release. Ubisoft has been on a quality decline for years, and it peaked with Assassin's Creed Unity. I suppose it was only then that the mainstream caught on to how far their quality had fallen.

Then we have titles like Far Cry 3/4/Primal, which used GameWorks features but ran wonderfully and were nicely optimized too. And I recall Blacklist using a number of those features with no performance problems, although it did have the typical broken Ubisoft controls (not Nvidia's fault). I imagine that after the mainstream finally brought Ubisoft's screw-ups to the news sites they had to take a step back and up the quality. Hence why their recent titles might be better, and why there is no Assassin's Creed coming in 2016.

Ubisoft seems to have realized they were releasing crap. They'll continue to release a lot of overly mainstream games and milk the Tom Clancy name (the joke that is Rainbow Six Siege, The Division, "Ghost Recon" Wildlands), but maybe they'll make their games work better. And Nvidia or AMD will have little influence on that.

As for AMD, they have their faults too. Their support sucks for flight sims, so they really aren't an option for me. I suppose this extends to other niche PC-only games. I like to play all kinds of games, from mainstream AAA releases to niche PC-only titles, and I choose whichever brand seems to work best. Although the GTX 260 Vista-era Nvidia drivers almost made me go back to ATI/AMD, their drivers for the past ~4-5 years have been excellent.
 
It really had nothing to do with Nvidia; Ubisoft had been on the decline for years. They sold out to the mainstream on their Tom Clancy titles in the early 2000s and dropped the ball on support for all their niche titles too. Remember IL-2 Cliffs of Dover? Rushed and horribly broken at release. Ubisoft has been on a quality decline for years, and it peaked with Assassin's Creed Unity. I suppose it was only then that the mainstream caught on to how far their quality had fallen.

Then we have titles like Far Cry 3/4/Primal, which used GameWorks features but ran wonderfully and were nicely optimized too. And I recall Blacklist using a number of those features with no performance problems, although it did have the typical broken Ubisoft controls (not Nvidia's fault). I imagine that after the mainstream finally brought Ubisoft's screw-ups to the news sites they had to take a step back and up the quality. Hence why their recent titles might be better, and why there is no Assassin's Creed coming in 2016.

Ubisoft seems to have realized they were releasing crap. They'll continue to release a lot of overly mainstream games and milk the Tom Clancy name (the joke that is Rainbow Six Siege, The Division, "Ghost Recon" Wildlands), but maybe they'll make their games work better. And Nvidia or AMD will have little influence on that.

As for AMD, they have their faults too. Their support sucks for flight sims, so they really aren't an option for me. I suppose this extends to other niche PC-only games. I like to play all kinds of games, from mainstream AAA releases to niche PC-only titles, and I choose whichever brand seems to work best. Although the GTX 260 Vista-era Nvidia drivers almost made me go back to ATI/AMD, their drivers for the past ~4-5 years have been excellent.

Look how much you had to write to sum up your view; look how much easier it is to say this:

'nvidia sabotaging amd with gameworks'
 
Look how much you had to write to sum up your view; look how much easier it is to say this:

'nvidia sabotaging amd with gameworks'

I'll never understand why/how people can become fans of one particular hardware brand. It's a GPU; how it plays games is what matters. That said, I am not a fan of Nvidia-only or AMD-only features in games. Every hardware brand should be able to use any settings, IMO.
 
As stated above, we are at two games that had Crossfire CONFIRMED working before their GameWorks updates and became a mess after them, in a fashion so similar that it's very hard to call it a coincidence.

It's not a case of turning off GameWorks features either... In War Thunder's case the GameWorks libraries unfortunately replaced the DX11 codepath in the game's engine, so GW can't be turned off on PC. The only alternative is the OpenGL codepath, which doesn't support multi-card setups anyway. I need to look into The Division more to see exactly how GameWorks broke it in such a similar way to War Thunder; I suspect a very similar situation where the game code is concerned.
Yeah. They don't just drop the DLLs in the game's folder and everything magically works; the game is rewritten to support them.
 
I stand by my sig line, or else I wouldn't have written it. Some of the stuff... Anyway, I'm not looking to argue; it is what it is. I just noticed something interesting from two separate developers as the thread progressed. Could it be something else? Possible.

I guess it depends on what is actually implemented from GameWorks and how good the developers are.
Has anyone complained about Rise of the Tomb Raider since the end of March, when they added the DX11.3 feature pertaining to GameWorks VXAO, or about GTA V?
I thought The Division was OK these days, but I have not been following it for a while. SLI and Crossfire have issues even without GameWorks in some games, so I am not sure it can be blamed for all the woes in games.
That would be like blaming AMD for every recent AAA game crossed over from console...
I'm not suggesting GameWorks is all good for the industry, as it is bloated and performance-deficient most of the time (only a very few developers seem able to implement it well), but it can be very useful.
Who should we blame for Hitman 2016?
Cheers
 
Yeah. They don't just drop the DLLs in the game's folder and everything magically works; the game is rewritten to support them.

Don't be silly, dude; they don't just drop the DLLs in the folder, they also have to set UseGameWorks 1 and BreakCrossfire 1 in the .ini file.
 
I'll never understand why/how people can become fans of one particular hardware brand. It's a GPU; how it plays games is what matters. That said, I am not a fan of Nvidia-only or AMD-only features in games. Every hardware brand should be able to use any settings, IMO.

Agreed, and then they get so damn defensive when called on it, like you just insulted their mother.
 
Agreed, and then they get so damn defensive when called on it, like you just insulted their mother.

Yeah, it's insane how they react; people are willing to deny even the results of elementary mathematics rather than concede that their argument is flawed.
 