AMD is not allowed to see the DX11 code for Watch Dogs

Meh... par for the course in the Nvidia vs AMD pissing contest. Whoever makes the backroom deals and writes the checks first gets the advantage. Nothing new, sadly.
 
Nvidia using their proprietary GameWorks closed libraries is nothing new, ESPECIALLY on bundled games (Batman, etc.).

This is what nvidia does.

They still need to be baseline compatible with DX, but they hide "their" library code, which they supplied to the developer, so they can dictate EXACTLY (and I mean EXACTLY) what the competition's performance will be.

Glad to see an article about it, but it has been this way for years...
 
AMD’s Mantle, a low-level API, doesn’t require the company’s GCN architecture to function properly. AMD says it will work equally well on Nvidia cards. The company clearly waves a banner of open-source development and ideals.

AMD isn't going as far as nvidia.
 
Zero performance gain between a Radeon 280X and a 290X is just unreal, though.
 
AMD’s Mantle, a low-level API, doesn’t require the company’s GCN architecture to function properly.

If you could show me Mantle running on anything other than GCN that would be cool.

Not gonna happen.

 
If you could show me Mantle running on anything other than GCN that would be cool.

Not gonna happen.



Not AMD's fault. AMD could offer the source code, specs, library and even PAY for Nvidia to integrate it, Nvidia would still say 'no'

See the 'FreeSync' fiasco.
 
So just like what AMD did with Mantle and BF4. Payback is a beyotch! :D

I didn't know Mantle somehow hindered DX11 performance for Nvidia. I didn't know Nvidia couldn't code for DX11 anymore. I didn't know that an alternative method hinders the original method for the competition. I really don't know how you are still not banned for blatant flame baiting.


Not AMD's fault. AMD could offer the source code, specs, library and even PAY for Nvidia to integrate it, Nvidia would still say 'no'

See the 'FreeSync' fiasco.

I think you would have more success telling a wall that it's a door than getting anywhere with Prime1.
 
What does Mantle have to do with any of this?
Unless DICE specifically denied optimization access to Nvidia during BF4's development, Mantle is a totally separate issue.

GameWorks isn't an API.
 
So just like what AMD did with Mantle and BF4. Payback is a beyotch! :D

I think you might be slightly... I won't say it, because I'll get an infraction.

Mantle is an API. DirectX is an API. OpenGL is an API. GameWorks is not an API. What Nvidia is doing is taking an API (DirectX) and trying to turn it into their own "GameWorks" API.

I know why they're doing it, but I really do see a problem with it. All we need is for AMD to do the same thing, and we'll have to change out GPUs to play different games.

You want to play game X? You need an Nvidia card.

You want to play game Y? You need an AMD card.

GameWorks is bad for gamers. As a gamer, if you don't see that, I hope you like changing out GPUs all the time.
 
I am fairly sure he is not testing on a WD-optimized driver, so a lot of this is very likely moot. Not the first time we have seen coding held back from team Green or team Red. We might just be seeing some of this very, very soon.

Watch Dogs performance improvements

AMD Radeon R9 290X - 1920x1080 4x MSAA – improves up to 25%

AMD Radeon R9 290X - 2560x1600 4x MSAA – improves up to 28%

AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra settings, MSAA = 4X) - 92% scaling
 
Do people really make their GPU purchase decisions based on one game? Sounds ridiculous.
 
edit: In light of new information my rant seems a bit puerile. End of post. :/
 
Mantle is an API. DirectX is an API.

Exactly my point. This is why Mantle is worse. Watch Dogs is DX11, so it will work on different systems. If it was Mantle, it would be limited to only AMD.

Either way it looks like this was much ado about nothing. At least I got to use my Breaking Bad gif.
 
According to the article he is using a Watch Dogs optimized driver that AMD hasn't released to the public yet, however I'm not in the loop like you are so maybe you're referring to another set of drivers.
Show me a GTX 770 getting 60fps in Watch Dogs and I'll show you A LIAR!
 
Exactly my point. This is why Mantle is worse. Watch Dogs is DX11, so it will work on different systems. If it was Mantle, it would be limited to only AMD.

Either way it looks like this was much ado about nothing. At least I got to use my Breaking Bad gif.

You do realize that every game that runs Mantle also runs DirectX, right? So if you can't run Mantle, you can still run DirectX, right?
 
How is this going to affect the console versions? It's odd that Watch Dogs is heavily advertised as a console game, but aren't the latest-gen consoles AMD powered? It just seems odd to handicap consoles any more than they already are - it's not like they can upgrade / sidegrade their GPU like PC gamers can.
 
Console performance is horrible: the high preset at 792p on Xbox One and 900p on PS4, at 30fps with drops during more intense scenes. Personally, I call that a technical fail.
 
Console performance is horrible: the high preset at 792p on Xbox One and 900p on PS4, at 30fps with drops during more intense scenes. Personally, I call that a technical fail.
The fact that either console can handle "High" preset is impressive.
I say this after seeing some high-end gaming rigs struggle with it.
 
You do realize that every game that runs Mantle also runs DirectX, right? So if you can't run Mantle, you can still run DirectX, right?

Then what's the point of Mantle? Maybe it's so that it runs better on AMD hardware? Wait! But that's evil!

Are you trying to prove my point for me?
 
How is this going to affect the console versions? It's odd that Watch Dogs is heavily advertised as a console game, but aren't the latest-gen consoles AMD powered? It just seems odd to handicap consoles any more than they already are - it's not like they can upgrade / sidegrade their GPU like PC gamers can.

Hmm... Well, that might be a good reason why it looks like AMD is able to release their 14.6 driver with Watch Dogs GameWorks workarounds so quickly this time! :)
 
Then what's the point of Mantle? Maybe it's so that it runs better on AMD hardware? Wait! But that's evil!

Are you trying to prove my point for me?

Blatant flame baiting.... You should be banned.....
 
Then what's the point of Mantle? Maybe it's so that it runs better on AMD hardware? Wait! But that's evil!

Are you trying to prove my point for me?

You are 100% correct! The point of Mantle is to make AMD hardware run faster. What you are missing is that Mantle is AMD's own API. They made their own API to showcase their hardware. In no way does Mantle affect DirectX performance. DirectX is Microsoft's API.

So please explain to me: what is GameWorks?
 
How is this going to affect the console versions? It's odd that Watch Dogs is heavily advertised as a console game, but aren't the latest-gen consoles AMD powered? It just seems odd to handicap consoles any more than they already are - it's not like they can upgrade / sidegrade their GPU like PC gamers can.

I wonder if the console versions even use GameWorks? I bet they don't.
 
Nvidia wasn't sent the Tomb Raider files until after the game came out... AMD was developing TressFX right from the get-go. Now, TR happened to run better on Nvidia hardware anyway, but the point is that AMD plays this game as well... Bioshock Infinite too, I believe.
 
Nvidia wasn't sent the Tomb Raider files until after the game came out...now TR happened to run better on Nvidia hardware anyway but the point being that AMD plays this game as well...Bioshock Infinite as well I believe

I'm not sure how that's the same thing, considering that AMD will NEVER see the code used in GameWorks games unless Nvidia reverses their policy.
 
Is this that GTA + goofy hacking game? It doesn't look particularly good or innovative anyway; probably console bait.
 
Nvidia wasn't sent the Tomb Raider files until after the game came out...AMD was developing TressFX right from the get-go...now TR happened to run better on Nvidia hardware anyway but the point being that AMD plays this game as well...Bioshock Infinite as well I believe

That isn't true at all. Why do you feel the need to make up this BS?

For TR, Nvidia didn't receive the launch-version code because the developer was already working with Nvidia on a patch.
For Bioshock Infinite, Nvidia had a driver update ready a few days before release, i.e. they received the finalized version of the game code.
 
Nvidia's always been shady. Batman ran terribly on AMD cards on purpose. This is just like Intel crippling AMD processors in benchmarks back in the day.
 