NVIDIA GameWorks + code optimization

So Prime1, how do you explain most of Batman's AA code coming from Microsoft documentation, with some AMD tweaks in it, yet for some reason it can't run on AMD cards?
 
You simply don't have the facts.

Implementing MSAA support in a deferred renderer is not trivial to do. If NVIDIA hadn't done the work, the game would have *no AA*.

It was reasonable for them not to want their free work to benefit a competitor who invested nothing in improving the game.

The same is true of these GameWorks features. They would not exist if NVIDIA didn't invest money to research and develop them. So it's not a question of whether you'd prefer open or closed... it's a question of whether you'd prefer closed or nothing.

Devs can still spend the time and resources to develop these features themselves from scratch... no one is stopping them, if it's really so easy to do and so evil of NVIDIA.
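
For anyone curious why it's hard: in a deferred renderer the G-buffer itself has to be multisampled, and the lighting pass has to shade individual samples; you can't just resolve the G-buffer before lighting, because averaging normals and depth at geometry edges gives wrong results. A rough sketch of the D3D11-side setup, purely illustrative (the function name, format choice, and trimmed error handling are mine, not anything from the game's actual code):

```cpp
// Minimal sketch: creating one multisampled G-buffer plane (e.g. normals)
// that can be rendered to in the geometry pass and read per-sample in the
// lighting pass. Not the Batman: AA code, just the general D3D11 pattern.
#include <d3d11.h>

HRESULT CreateMsaaGBufferTarget(ID3D11Device* device,
                                UINT width, UINT height, UINT sampleCount,
                                ID3D11Texture2D** outTex,
                                ID3D11RenderTargetView** outRtv,
                                ID3D11ShaderResourceView** outSrv)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;                 // MSAA textures have a single mip
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R16G16B16A16_FLOAT;
    desc.SampleDesc.Count = sampleCount; // e.g. 4 for 4x MSAA
    desc.SampleDesc.Quality = 0;
    desc.Usage = D3D11_USAGE_DEFAULT;
    // Bound as a render target for the geometry pass AND as a shader
    // resource so the lighting pass can read it as a Texture2DMS.
    desc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

    HRESULT hr = device->CreateTexture2D(&desc, nullptr, outTex);
    if (FAILED(hr)) return hr;

    hr = device->CreateRenderTargetView(*outTex, nullptr, outRtv);
    if (FAILED(hr)) return hr;

    // The SRV must use the multisampled view dimension. The lighting shader
    // then declares Texture2DMS<float4> and calls Load(coord, sampleIndex)
    // per sample instead of a normal resolve/sample -- that per-sample
    // lighting (or an edge-mask variant of it) is the expensive part.
    D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
    srvDesc.Format = desc.Format;
    srvDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2DMS;
    return device->CreateShaderResourceView(*outTex, &srvDesc, outSrv);
}
```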
 

Really? Well, the fact of the matter is that UE3 is not a deferred renderer. No matter how many times people say it, it simply isn't true.

If they had to do so much work, why was the majority of the code a standard DX9 method? Oh, I forgot, Nvidia had to work on that vendor-ID check. And gloss over the fact that UE3 on a certain console with an AMD GPU in it could magically use 2x AA.

So I guess AMD should be locking out Nvidia from all features that they have helped implement in games? That isn't how you move the industry forward, nor is that how you satisfy your customer base.
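
For reference, a vendor-ID gate like the one being described is normally just a check of the adapter's PCI vendor ID and a branch. Here's a rough sketch of the usual pattern using DXGI (under plain D3D9 the same field comes from IDirect3D9::GetAdapterIdentifier); the helper name and the gated option are made up for illustration, this is not the game's actual code:

```cpp
// Sketch of a typical vendor-ID gate -- not the actual Batman: Arkham
// Asylum code, just the common pattern: read the primary adapter's PCI
// vendor ID and branch on it.
#include <dxgi.h>
#pragma comment(lib, "dxgi.lib")

constexpr UINT kVendorNvidia = 0x10DE;  // NVIDIA's PCI vendor ID
constexpr UINT kVendorAmd    = 0x1002;  // AMD/ATI's PCI vendor ID

// Hypothetical helper: returns true if the primary adapter is NVIDIA.
bool PrimaryAdapterIsNvidia()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory),
                                 reinterpret_cast<void**>(&factory))))
        return false;

    IDXGIAdapter* adapter = nullptr;
    bool isNvidia = false;
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter)))
    {
        DXGI_ADAPTER_DESC desc = {};
        if (SUCCEEDED(adapter->GetDesc(&desc)))
            isNvidia = (desc.VendorId == kVendorNvidia);
        adapter->Release();
    }
    factory->Release();
    return isNvidia;
}

// A game could then gate an otherwise vendor-neutral code path, e.g.:
//   if (PrimaryAdapterIsNvidia()) { EnableInGameAaOption(); }  // hypothetical
```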
 
The way I see it, everyone with complaints about low performance in a GameWorks game should make a thread on the Nvidia forums and demand that they write better CUDA translators. I couldn't care less if they want to hide their trade secrets. Just fix your damn software! Since AMD GPUs have no trouble keeping up with or surpassing Nvidia GPUs in other games, the weak link is obviously the Nvidia software. Ray Charles can see that!
 
If games don't run well on AMD cards, stop buying AMD cards. It's just that simple.
 
If Nvidia can't play nice, stop buying Nvidia cards. It's just that simple.
 
I like NVIDIA's products, but that doesn't stop me from recognizing when they do something that is bad for the industry and the market and rightfully criticizing them for it.
 
It's only a matter of time till they get in trouble with the EU and fined :p

Anti-competitive practices? Do they still enforce that (with fines)?
 
I'm not sure they are anti-competitive, just not consumer-friendly.

Although the EU has put its foot down for a lot less. Mostly a money grab to bail out all the failed member states.
 
Who am I going to buy from? They all do this.

NVIDIA just has better support for games, so they get my money.

When was the last time you had an AMD card? Their drivers/cards have gotten much better and they're doing just fine.
 
My opinion on this kind of stuff is that if you're going to use vendor-specific libraries, it's going to be optimized for that vendor's hardware. It may even be designed specifically to degrade performance or functionality on hardware from competing vendors (but I've not seen any evidence of GameWorks being designed this way). If all you have access to are compiled libraries to which you link and shader blobs, you have to assume that there may be some weird shit going on that's going to negatively impact some of your customers. Your abilities as a developer to get insight into what may be going on there are limited.

Any developer considering using any vendor-specific libraries, compilers or other tooling — regardless of vendor — should understand that there are potential consequences to that. Anyone using GameWorks is accepting a certain amount of risk because they're able to make a business case for it.
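
To illustrate what I mean by limited insight, here's roughly what integrating a closed middleware library looks like from the licensee's side. This is a made-up interface, not the GameWorks API or any real vendor's headers; the point is that everything behind the exported calls is a black box:

```cpp
// Hypothetical example of integrating closed vendor middleware. The names
// below are invented for illustration -- NOT the GameWorks API. Everything
// the library does behind these calls (shader selection, tessellation
// factors, state changes, draw submission) is invisible to the developer.
#include <cstdint>

// All the developer sees in the shipped header: an opaque type and a few
// entry points exported by a prebuilt .lib/.dll.
struct VendorFxContext;   // opaque: layout known only to the vendor

extern "C" {
    VendorFxContext* VendorFx_Create(void* d3dDevice);
    void VendorFx_SetQuality(VendorFxContext* ctx, uint32_t level);
    void VendorFx_Render(VendorFxContext* ctx, void* d3dContext);
    void VendorFx_Destroy(VendorFxContext* ctx);
}

// Engine-side integration is just calls into the black box. If the render
// call is slow on some hardware, you can measure the GPU time it consumes,
// but you cannot inspect or modify the shaders and state it sets internally.
void RenderVendorEffect(VendorFxContext* ctx, void* immediateContext)
{
    VendorFx_SetQuality(ctx, /*level=*/2);
    VendorFx_Render(ctx, immediateContext);
}
```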

"According to Nvidia, developers can, under certain licensing circumstances, gain access to (and optimize) the GameWorks code, but cannot share that code with AMD for optimization purposes."
That seems like a fairly bog-standard proprietary license restriction. In many cases, working with proprietary libraries and middleware is a truly "black box" setup, so NVIDIA at least allowing a licensee to peer into the code makes it a somewhat better situation. I'd be curious to know what those "certain licensing circumstances" are, though.

Mantle and DX are not the same and cannot be compared. A closer comparison would be Mantle and PhysX.
No, I'd argue that Mantle and Direct3D are quite comparable. They're both proprietary graphics APIs. Mantle and PhysX are comparable for different sets of reasons, but Mantle and D3D are actually pretty similar things, conceptually speaking.

Nvidia is trying to take a standard, dx11, and turn it into a closed standard.
They have no ability to do so. This is just shader code and thin middleware built atop DX.
 