Nvidia publicly bashing Stardock developer over an ALPHA level game

I'm pretty sure PC gaming has been growing in both gross numbers and market-share metrics since the end of 2011/beginning of 2012. (Interestingly enough, that coincides with the release of Skyrim, which was noted for introducing a ton of people to the idea of PC modding and flexibility.) So I don't think anyone is 'leaving' PC gaming at the moment. If you had said that in 2010, I'd agree.

I remember one of the old, old criticisms was, "I just bought a $2,000 top-of-the-line computer and it can't even play a newly released game. Forget this I'm just going to get a [PS, Xbox, whatever]."

That's just what I thought of when I saw those charts.

Not everyone is going to upgrade to Windows 10, etc., especially some of your less computer literate folks. Some of these game devs just make me queasy.
 
I remember one of the old, old criticisms was, "I just bought a $2,000 top-of-the-line computer and it can't even play a newly released game. Forget this I'm just going to get a [PS, Xbox, whatever]."

That's just what I thought of when I saw those charts.

Not everyone is going to upgrade to Windows 10, etc., especially some of your less computer literate folks. Some of these game devs just make me queasy.

Yeah, that is a danger: and you have a point. But developers need to communicate that the "Highest settings" aren't the "Only Settings", and (in the case of Skyrim back in 2011) most games look better on low than any of the consoles, and run on mom's Compaq at those settings. Yeah, many people want to run the top end visual quality, but a lot of developers look at the "very high" and "Ultra" settings as future-proofing, not meant to be run on Video Cards available at launch-day. That was surely the case with Crysis and The Witcher.

that's a discussion for a different thread, though.
 
I remember one of the old, old criticisms was, "I just bought a $2,000 top-of-the-line computer and it can't even play a newly released game. Forget this I'm just going to get a [PS, Xbox, whatever]."

That's just what I thought of when I saw those charts.

Not everyone is going to upgrade to Windows 10, etc., especially some of your less computer literate folks. Some of these game devs just make me queasy.

The flip side is people like me, who would rather purchase games that are designed to work only on the newest DX... especially if it really pushes my PC. Makes me feel like I'm getting something console kiddies can't.
 
Oh you want me to spoonfeed you? Sure ok:
So yes, Richard Huddy = full of shit as usual. GW adds a layer of effects on top of a game, it isn't required and certainly isn't detrimental as Witcher 3 has clearly shown. AMD fans and the company itself are butt hurt because they want a free lunch courtesy of NVIDIA. Of course this has been covered a million times in several threads but AMD shills and fanboys seem to ignore it and keep beating that anti-GW drum. But who can blame them? Richard Huddy is spouting nonsense so of course the fanboys will take his lead. AMD is a company with 20% market share and slipping, it will keep crying foul until it inevitably sinks into the abyss.

Did you bother to even READ what you quoted?

With the GameWorks program developers can gain access to source code through a licensing deal with Nvidia. However this means that developers are not allowed to share this code with anyone else without a license, this obviously includes Nvidia’s competitors like AMD and Intel.

So they have to PAY to see the source code and can't share that source with either Intel or AMD to help them optimize the code or drivers... pretty much exactly what I said:

GameWorks has been criticized for its proprietary and closed nature. Competitors such as AMD, which do not offer a similar middleware, are unable to properly optimize Nvidia's libraries for their hardware. Users of AMD cards report drastically lower performance than their Nvidia equivalents for games that use the GameWorks API. AMD Chief Gaming Scientist, Richard Huddy, has claimed that developers who use GameWorks are contractually forbidden to work with AMD

Let me write it again for you since you can't understand:

Developers have to pay Nvidia to see source code, and those license terms forbid them from sharing it with AMD.

However we’re told that game developers are still allowed to optimize GameWorks features for competitors’ hardware without showing it to them and as long as it does not negatively impact the performance of Nvidia hardware.

So the developers have to do all the hard work, can't contact AMD/Intel for help, and have to pay for the privilege. What a win for the developers!

Does partaking in the GameWorks program preclude a game developer from working with AMD to implement an alternative to a specific GameWorks visual effect like HairWorks for example?
No. We don’t prohibit them from adding technologies from other IHVs to their games.

Ohhh, so as long as the developers do twice the work it's OK. Nothing like having to implement 2 different hair solutions to make a developer's life easy. Doing it twice without being allowed support from anyone except Nvidia, that's the GAMEWORKS way!

I'm not sure if you are an nvidia employee, or just drank so much of their koolaid you are completely blind.
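Since we're on "twice the work": here's a rough sketch, with entirely made-up names (none of this wraps a real SDK), of what shipping two interchangeable hair solutions actually means for a developer:

```python
# Hypothetical sketch of a game shipping two hair middleware paths.
# All class/function names are invented for illustration only.

class HairWorksHair:
    """Vendor path: closed middleware, tuned for one IHV's hardware."""
    def render(self, frame):
        return f"frame {frame}: hair via HairWorks"

class TressFXHair:
    """Vendor-neutral path: open source, so any IHV can optimize it."""
    def render(self, frame):
        return f"frame {frame}: hair via TressFX"

def pick_hair_backend(gpu_vendor):
    # The developer must write, tune, and QA BOTH paths --
    # that's the "twice the work" complaint.
    if gpu_vendor == "nvidia":
        return HairWorksHair()
    return TressFXHair()

if __name__ == "__main__":
    for vendor in ("nvidia", "amd", "intel"):
        print(vendor, "->", pick_hair_backend(vendor).render(0))
```

Every effect handled this way is two implementations to maintain instead of one, which is exactly the objection above.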
 
Did you bother to even READ what you quoted?

With the GameWorks program developers can gain access to source code through a licensing deal with Nvidia. However this means that developers are not allowed to share this code with anyone else without a license, this obviously includes Nvidia’s competitors like AMD and Intel.

So they have to PAY to see the source code and can't share that source with either Intel or AMD to help them optimize the code or drivers... pretty much exactly what I said:

GameWorks has been criticized for its proprietary and closed nature. Competitors such as AMD, which do not offer a similar middleware, are unable to properly optimize Nvidia's libraries for their hardware. Users of AMD cards report drastically lower performance than their Nvidia equivalents for games that use the GameWorks API. AMD Chief Gaming Scientist, Richard Huddy, has claimed that developers who use GameWorks are contractually forbidden to work with AMD

Let me write it again for you since you can't understand:

Developers have to pay Nvidia to see source code, and those license terms forbid them from sharing it with AMD.

Are you now trying to change what you originally quoted? Because this is what you had in your post:

AMD Chief Gaming Scientist, Richard Huddy, has claimed that developers who use GameWorks are contractually forbidden to work with AMD

Seems you suffer from a reading and comprehension deficiency. Either that or just doing the usual AMD fanboy tactic by shifting goalposts.

So the developers have to do all the hard work, can't contact AMD/Intel for help, and have to pay for the privilege. What a win for the developers!

Does partaking in the GameWorks program preclude a game developer from working with AMD to implement an alternative to a specific GameWorks visual effect like HairWorks for example?
No. We don’t prohibit them from adding technologies from other IHVs to their games.

Ohhh, so as long as the developers do twice the work it's OK. Nothing like having to implement 2 different hair solutions to make a developer's life easy. Doing it twice without being allowed support from anyone except Nvidia, that's the GAMEWORKS way!

I'm not sure if you are an nvidia employee, or just drank so much of their koolaid you are completely blind.

I don't know where you got any of the above from, you need to lay off the drugs.
 
To everyone here: read this http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/

NV/AMD/MS/Intel have had access to the source code for over a year; there is seriously no excuse for NV to be caught with its pants down, dick flapping in the wind like this.

It just shows how much effort they are tossing into DX11 optimizations that DX12 essentially renders useless; thus, in this first instance of a DX12 benchmark, the playing field has been leveled.

Now we see nv in total damage control mode.

On to GameWorks: my hope is DX12 murders GameWorks and graphical middleware like it. Why? Remember the little dust-up about Tomb Raider last year, how it was an AMD Gaming Evolved title and NV's cards ran like dogshit for the first week until NV fixed its driver? NV was able to do that because AMD doesn't lock its optimizations behind a layer of proprietary middleware. If AMD wants to fix a driver for a GameWorks title, however, tough cookies, go pound sand. Now I ask you, is that fair to us consumers?
 
This thread has really gone off topic...

Nvidia got caught with their pants down, THEY EVEN ADMITTED IT.

Lessons to take away: don't trust anybody, Nvidia will blame developers before looking at themselves.
 
Are you now trying to change what you originally quoted? Because this is what you had in your post:

So you really are blind, that quote is in both of my posts and let me re-write it here since you can't seem to understand it:

AMD Chief Gaming Scientist, Richard Huddy, has claimed that developers who use GameWorks are contractually forbidden to work with AMD

And the supporting proof from your quote that I don't think you understood:

However this means that developers are not allowed to share this code with anyone else without a license, this obviously includes Nvidia’s competitors like AMD and Intel.

Do we want to repeat that a few more times so it can sink in?

Developers have to pay extra, and cannot work with Intel/AMD to optimize any of the code due to the contract. Intel/AMD cannot create optimized drivers because they are restricted from the source code.

  • Developers must pay extra for source
  • Developers can't contact Intel or AMD for optimization help due to contract
  • AMD/Intel can not create optimized drivers due to lack of source

Do you understand now or do you still need help?
 
It's clear to everyone here that you can't read and can only throw insults and spout lies.

Let me relink all of my posts since you can't seem to read them:

http://hardforum.com/showpost.php?p=1041801900&postcount=142

http://hardforum.com/showpost.php?p=1041802298&postcount=164

And here is the quote you can't seem to find in them:

AMD Chief Gaming Scientist, Richard Huddy, has claimed that developers who use GameWorks are contractually forbidden to work with AMD

And the proof that you quoted but didn't understand:

However this means that developers are not allowed to share this code with anyone else without a license, this obviously includes Nvidia’s competitors like AMD and Intel.

Licensing deal = contract. Developers who sign a contract (licensing deal) are not allowed to work with AMD to optimize the code because they are not allowed to share the code with AMD.

Do you understand yet or do we need to try again?


Wait, that means they can't share Gameworks source code. That does not mean they can't work with AMD and share development builds of the game. Hell, They could even give the Gameworks DLLs to AMD to play with, because by those quotes, they are only obligated to not share the source, not the compiled DLLs.

am I reading this right?
 
18% market share and falling... You'd have to be a fool to buy AMD now anyway.

Unless AMD pulls a rabbit out of its hat, or sells its GPU business to a big company, we are looking at a one-horse race.

Essentially, Nvidia will own the market.
 
Wait, that means they can't share Gameworks source code. That does not mean they can't work with AMD and share development builds of the game. Hell, They could even give the Gameworks DLLs to AMD to play with, because by those quotes, they are only obligated to not share the source, not the compiled DLLs.

am I reading this right?

This aspect seems to be commonly ignored since this issue came up. There seems to be a lack of separation between the GameWorks libraries/features and the rest of the game.

For example, if we go back to Assassin's Creed Unity: issues in the game not specifically related to the GameWorks libraries (basically the GameWorks features) would not be directly attributable to them. But in discussions everything tends to be lumped together.

Or with The Witcher 3. The statement from the developer was specifically related to a GameWorks feature for which there could be limited optimization done with AMD. It was not a blanket statement in regards to the entire game.

GTA V even used specific technology from each vendor.
 
Wait, that means they can't share Gameworks source code. That does not mean they can't work with AMD and share development builds of the game. Hell, They could even give the Gameworks DLLs to AMD to play with, because by those quotes, they are only obligated to not share the source, not the compiled DLLs.

am I reading this right?

Correct, they can't share any of the GameWorks code; that covers GameWorks optimization, not the rest of the game. What is still unknown is whether there are any other restrictions that Nvidia places on developers when it provides them with GameWorks and support personnel.
 
Correct, they can't share any of the GameWorks code; that covers GameWorks optimization, not the rest of the game. What is still unknown is whether there are any other restrictions that Nvidia places on developers when it provides them with GameWorks and support personnel.

Exactly, GameWorks locks up code that AMD cannot see. Yes, they can optimize their drivers for the game the best they can, but they can't see what NV's implementations will do to their code, and from the looks of it NV is seriously limiting performance on non-NV hardware. Willing to bet NV's middleware is going to land NV in an antitrust suit soon enough vs. AMD/Intel/other players in the industry.
 
Exactly, GameWorks locks up code that AMD cannot see. Yes, they can optimize their drivers for the game the best they can, but they can't see what NV's implementations will do to their code, and from the looks of it NV is seriously limiting performance on non-NV hardware. Willing to bet NV's middleware is going to land NV in an antitrust suit soon enough vs. AMD/Intel/other players in the industry.

The vast majority of current gameworks implementations are basically used as an "add on." These are effects which can be completely disabled. It is therefore not a valid technical argument that it would affect the performance of the rest of the game directly in those cases since Gameworks code would not even be used.
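To put the "add-on" argument in toy form (all names invented, this is not real engine code): if an optional effect is off, its code path is never entered, so its cost with the toggle off is zero by construction:

```python
# Toy render loop illustrating the "add-on effect" argument: when the
# optional effect is disabled, its code path is simply never executed.
# All names are invented for illustration.

calls = {"effect": 0}

def fancy_fur_effect():
    # Stand-in for an optional middleware effect (e.g. a hair/fur library).
    calls["effect"] += 1

def render_frame(effects_enabled):
    base_cost = 1           # work the game always does
    if effects_enabled:
        fancy_fur_effect()  # only touched when the toggle is on
        base_cost += 3      # the add-on's extra cost
    return base_cost

cost_off = sum(render_frame(False) for _ in range(100))
cost_on = sum(render_frame(True) for _ in range(100))
print(f"off: {cost_off}, on: {cost_on}, effect calls: {calls['effect']}")
# prints: off: 100, on: 400, effect calls: 100
```

With the effect off, the effect function is called zero times, which is the point being made: performance with the feature disabled can't be blamed on that feature's code.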
 
Unless AMD pulls a rabbit out of its hat, or sells its GPU business to a big company, we are looking at a one-horse race.

Essentially, Nvidia will own the market.

And when that happens I guess I can finally pick up console gaming again. :D
 
I won't post a name, but that person was a nvidia Senior PR Manager. Not an intern or somebody inexperienced! They knew perfectly well what they were doing.

Oh, he knew. They got away with claiming the misrepresented 970 specs was an oversight. They just say it, the sites go along, and when it comes out, they ride out the shit storm. In the meantime you have reviews and charts all over the internet that show their product in the best possible light. Mission accomplished.
 
i3-4330 > FX-8370 ... Tricky Nvidia must have done something with the shader code that secretly crippled AMD's hardware.

Is there anyone here who actually cares about the performance of AMD's current FX line? I don't think so. If you do maybe post it over in the CPU forum where it's relevant. Posting it here amounts to nothing more than throwing dirt.

This game is literally the only currently accepted benchmark for DX12...Oxide has literally been working on the Nitrous Engine with all the key players in the space for years...

The publisher Stardock has games out that have sold millions, of which, many are extensive in scale. Brad Wardell (of Stardock) has been designing and producing games for over a decade including getting his hands dirty with programming.

Yea, it's not a first person shooter, but they are an order of magnitude more complex than Starcraft in terms of units present on screen, real-time AI simulation, etc. Don't disregard Oxide or Stardock's work.

In case you haven't noticed, anytime there's a game AMD does better than nVidia in then the game is crap and irrelevant.
 
Is there anyone here who actually cares about the performance of AMD's current FX line? I don't think so. If you do maybe post it over in the CPU forum where it's relevant. Posting it here amounts to nothing more than throwing dirt.

Yeah, I don't think ANYONE cares about AMD's CPU division... not even AMD. The 8350 is a 3-year-old chip that lacked IPC against the i3s of its time. Given the IPC advancements in Intel's chips, compounded with new use of instruction sets, memory bandwidth, and chipset improvements, it's quite easy to see why the 8350 would be left behind by a newer 2c/4t part. Core 2 Quads get left behind, and they have an IPC advantage over the 8350 too!
 
Wow I forgot about those games running in DX 12

Thanks for the completely off topic comment and once again failing to understand what we are discussing while bad mouthing AMD. It's no wonder half the forum has you on ignore, I think I'll be joining them :)

I would appreciate it if you would because people quoting him screws it up for those of us who do have him on ignore. ;)



Yeah, I don't think ANYONE cares about AMD's CPU division... not even AMD. The 8350 is a 3-year-old chip that lacked IPC against the i3s of its time. Given the IPC advancements in Intel's chips, compounded with new use of instruction sets, memory bandwidth, and chipset improvements, it's quite easy to see why the 8350 would be left behind by a newer 2c/4t part. Core 2 Quads get left behind, and they have an IPC advantage over the 8350 too!

Again, the CPU forum. This will just help derail this thread, which is the main reason people post about AMD CPU's in the GPU forum.

Cheers. :)
 
Oh, he knew. They got away with claiming the misrepresented 970 specs was an oversight. They just say it, the sites go along, and when it comes out, they ride out the shit storm. In the meantime you have reviews and charts all over the internet that show their product in the best possible light. Mission accomplished.

Yes, of course. It's a tried and true delay PR tactic to "wait out the storm". Sometimes throw in a completely off-topic bit of news to help shift interest to something else if the storm isn't going out quick enough, or is at an inopportune time.

In this case, though, using a relatively smaller dev studio as a scapegoat and publicly trying to discredit them is bloody scary. Imagine you are the owner of an SMB and a multinational billion-dollar company you need for your business to survive decides it has a bone to pick with you and goes public about it instead of just calling a meeting? Especially when you know you're right?!

Holy crap! Imagine the message other studios are getting from this. Yes, they can be intimidated.
 
Unless AMD pulls a rabbit out of its hat, or sells its GPU business to a big company, we are looking at a one-horse race.

Essentially, Nvidia will own the market.

Which is bad for everybody, including NV fanboys, but they are too dense to see this.
 
Wait, that means they can't share Gameworks source code. That does not mean they can't work with AMD and share development builds of the game. Hell, They could even give the Gameworks DLLs to AMD to play with, because by those quotes, they are only obligated to not share the source, not the compiled DLLs.

am I reading this right?
It's entirely right; AMD just uses it as a scapegoat for their drivers not performing. You can bench the cards with GameWorks turned off and it doesn't change the relative performance much in many games. AMD should also be smart enough to be able to optimize without having source, unless no one on AMD's driver team has any reverse-engineering experience. It makes AMD's job harder, not impossible...
The vast majority of current gameworks implementations are basically used as an "add on." These are effects which can be completely disabled. It is therefore not a valid technical argument that it would affect the performance of the rest of the game directly in those cases since Gameworks code would not even be used.
Which is the other aspect: people complain about performance but still want to turn on and leave on the features. So the features are worthwhile?
Which is bad for everybody, including NV fanboys, but they are too dense to see this.
GameWorks isn't killing AMD; AMD is killing AMD. AMD has been incompetent for quite some time. Its GPU division used to prop up its CPU division, but that weight is just bringing them both down.
 
I need to install one of those autoreplace extensions.

PRIME1 would be Jen-Hsun Huang, that way the posts would be more accurate.

That is just an example of how i would use the extension.
 
It's entirely right; AMD just uses it as a scapegoat for their drivers not performing. You can bench the cards with GameWorks turned off and it doesn't change the relative performance much in many games. AMD should also be smart enough to be able to optimize without having source, unless no one on AMD's driver team has any reverse-engineering experience. It makes AMD's job harder, not impossible...

Which is the other aspect: people complain about performance but still want to turn on and leave on the features. So the features are worthwhile?

Even Nvidia doesn't agree with you. When they had TressFX issues on Tomb Raider, it required work on the part of Crystal Dynamics to fix their performance issue, and Nvidia had access to the TressFX source code with no restrictions.

So, please tell me about how AMD should fix performance issues with no access to source and with developers limited in their ability to work with them on those issues without violating NDA.

Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not test Tomb Raider until all of the above issues have been resolved.

http://techreport.com/news/24463/nvidia-acknowledges-tomb-raider-performance-issues
 
The vast majority of current gameworks implementations are basically used as an "add on." These are effects which can be completely disabled. It is therefore not a valid technical argument that it would affect the performance of the rest of the game directly in those cases since Gameworks code would not even be used.

It's a black box that the dev/AMD/Intel can't see into unless the dev pays NV money, and, not to mention, before any of this cash changes hands the dev is prohibited by contract from helping AMD/Intel optimize performance...
 
It's a black box that the dev/AMD/Intel can't see into unless the dev pays NV money, and, not to mention, before any of this cash changes hands the dev is prohibited by contract from helping AMD/Intel optimize performance...

And how does this affect the rest of the game?

Again, that only applies to the specific GameWorks features, the vast majority of which are "add-on" effects (meaning they are not integral to the game and can easily be disabled).

The Witcher 3's performance with Hairworks turned off, as an example, cannot be directly linked to Hairworks' implementation. So whether or not optimization is done in that area would be irrelevant. CDPR's comment regarding optimization was only directed at that specific feature. Performance optimization with it off would be solely dependent on AMD and CDPR. Whether or not Hairworks is optimized for AMD hardware is unrelated to how the game performs with that feature disabled.

Or take Assassin's Creed Unity. Why would the game's performance with GameWorks features disabled (which can be done) be attributable to its usage of GameWorks (all of which can be disabled)?
 
It's a black box that the dev/AMD/Intel can't see into unless the dev pays NV money, and, not to mention, before any of this cash changes hands the dev is prohibited by contract from helping AMD/Intel optimize performance...

Actually its even worse.

AMD/Intel (or any non-Nvidia GPU maker) can never see the GameWorks source code.

If the game developers want to optimize for non-nvidia hardware, they have to pay licensing fees and are still prohibited from working with the GPU vendors.

Let's let that sink in: game developers have to pay Nvidia to optimize their games for non-Nvidia GPUs and aren't allowed to work with AMD/Intel (or others) to do so.

And you want to tell me that isn't a black box for Intel/AMD as well as harmful to game developers?
 
Even Nvidia doesn't agree with you. When they had tress fx issues on tomb raider, it required work on the part of crystal dynamic to fix their performance issue, and Nvidia had access to the tressfx source code with no restrictions.

So, please tell me about how AMD should fix performance issues with no access to source and with developers limited in their ability to work with them on those issues without violating NDA.

Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not test Tomb Raider until all of the above issues have been resolved.

http://techreport.com/news/24463/nvidia-acknowledges-tomb-raider-performance-issues

Notice the commentary was not specifically directed at TressFX. If TressFX is the sole cause of performance issues, then it can be isolated by disabling that feature. Lack of TressFX-specific optimizations cannot cause performance issues with it disabled.

Performance issues with GameWorks features disabled (which, for the vast majority of implementations, can be done, as they are not integral to the game) cannot be directly attributed to the inability to optimize for GameWorks, as it would not even be used in that case.
 
And how does this affect the rest of the game?

Again, that only applies to the specific GameWorks features, the vast majority of which are "add-on" effects (meaning they are not integral to the game and can easily be disabled).

The Witcher 3's performance with Hairworks turned off, as an example, cannot be directly linked to Hairworks' implementation. So whether or not optimization is done in that area would be irrelevant. CDPR's comment regarding optimization was only directed at that specific feature. Performance optimization with it off would be solely dependent on AMD and CDPR. Whether or not Hairworks is optimized for AMD hardware is unrelated to how the game performs with that feature disabled.

Or take Assassin's Creed Unity. Why would the game's performance with GameWorks features disabled (which can be done) be attributable to its usage of GameWorks (all of which can be disabled)?

The rest of the game usually works fine on both sets of hardware; it's the GameWorks stuff that is horribly optimized on AMD hardware, and thus A) looks worse in reviews and B) means that AMD (or Intel/whichever non-Nvidia) customers can't use those features, or have a much lower-performing game because of those unoptimized features.
 
The rest of the game usually works fine on both sets of hardware; it's the GameWorks stuff that is horribly optimized on AMD hardware, and thus A) looks worse in reviews and B) means that AMD (or Intel/whichever non-Nvidia) customers can't use those features, or have a much lower-performing game because of those unoptimized features.

The original point was to address the notion that GameWorks affects the entirety of a game's performance irrespective of whether or not those features are enabled.

However to look at your two points -

1) Reviews
GPUs will have high performance variance depending on workload; this has always been the case, and will be the case irrespective of GameWorks. There is certainly not a lack of test results for games which do implement GameWorks, as they are relatively high profile. If users don't care for GameWorks features then they should look for reviews that don't use them (there are plenty). Users that do can look for them. I don't see a need to make a blanket judgment on whether or not GameWorks test results have relevance; individuals can make that decision for themselves.

2) Features
This seems to be an entitlement-based argument. To some extent the issue is also related to perception and the fixation on "max settings" that perpetuates in the hardware-focused PC gaming crowd. In the end, though, these are optional add-on features for the most part; they are not critical to the games using them and can be disabled.

If we look at major cross-platform games that do not receive specific IHV support, you will actually notice they tend to have a bare minimum (if any) of differentiating graphics improvements over their other platform variants.
 
The original point was to address the notion that GameWorks affects the entirety of a game's performance irrespective of whether or not those features are enabled.

I don't think that was ever in question, though? It was always an issue with GameWorks-specific features causing games to run much slower on non-Nvidia (or even Nvidia last-gen) hardware.

However to look at your two points -

1) Reviews
GPUs will have high performance variance depending on workload; this has always been the case, and will be the case irrespective of GameWorks. There is certainly not a lack of test results for games which do implement GameWorks, as they are relatively high profile. If users don't care for GameWorks features then they should look for reviews that don't use them (there are plenty). Users that do can look for them. I don't see a need to make a blanket judgment on whether or not GameWorks test results have relevance; individuals can make that decision for themselves.

I'm not sure what you are trying to say here. Yes reviewers can turn off gameworks features and do so in reviews, especially since they are the major causes of slowdowns.

2) Features
This seems to be an entitlement-based argument. To some extent the issue is also related to perception and the fixation on "max settings" that perpetuates in the hardware-focused PC gaming crowd. In the end, though, these are optional add-on features for the most part; they are not critical to the games using them and can be disabled.

Yes the features are optional, and they are often a huge selling point for the game. Hell look at Batman, they plaster all of the special effects on every trailer or demo shown. So why wouldn't gamers expect to run those settings if they have high end systems?


-----


The main issue here isn't the ability to disable gameworks, we already know that is an option. The issue is Non-Nvidia hardware can not be optimized for gameworks features because they can not access the source.

Here is how Nvidia felt gamers should act when they weren't able to get TressFx working at launch:

The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not test Tomb Raider until all of the above issues have been resolved.

Yet now they charge developers to be able to optimize Gameworks for other vendors, and prevent those developers from working with the vendors to optimize.

If that isn't hypocrisy, and if you can't see how that is bad for all gamers, I don't know how else to explain it.

Also to try and get off the issues with gameworks and get back on topic: Look at how this developer is willing to work with all companies (Intel, Microsoft, Nvidia and AMD) and openly share their source code to make sure it runs as best as possible on all hardware and platforms. This is how gaming will succeed
 
Even Nvidia doesn't agree with you. When they had TressFX issues on Tomb Raider, it required work on the part of Crystal Dynamics to fix their performance issue, and Nvidia had access to the TressFX source code with no restrictions.

So, please tell me about how AMD should fix performance issues with no access to source and with developers limited in their ability to work with them on those issues without violating NDA.
That's not really a counter-argument. Of course Nvidia used their access to the source code to do their optimization; that's a ton easier, so why would they do things the harder way? It's not like the TressFX source was ever restricted from them to begin with. I said it makes their job harder, not impossible. They didn't need the source, but it helps. The point is you can optimize the drivers, but it will just take more work than usual. AMD as a company isn't about to spend that money when it already spends less on R&D than Nvidia does.
 
That's not really a counter-argument. Of course Nvidia used their access to the source code to do their optimization; that's a ton easier, so why would they do things the harder way? It's not like the TressFX source was ever restricted from them to begin with. I said it makes their job harder, not impossible. They didn't need the source, but it helps. The point is you can optimize the drivers, but it will just take more work than usual. AMD as a company isn't about to spend that money when it already spends less on R&D than Nvidia does.

It's too funny that you left out that part that completely counters what you're saying.

Let me put it here again in case you "missed" it.

The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well.
 
Even Nvidia doesn't agree with you. When they had TressFX issues on Tomb Raider, it required work on the part of Crystal Dynamics to fix their performance issue, and Nvidia had access to the TressFX source code with no restrictions.

So, please tell me about how AMD should fix performance issues with no access to source and with developers limited in their ability to work with them on those issues without violating NDA.

Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not test Tomb Raider until all of the above issues have been resolved.

http://techreport.com/news/24463/nvidia-acknowledges-tomb-raider-performance-issues

Simply not true. TressFX code was not present in any advance copies of the game that NVIDIA received. The developer was contractually obligated not to share it with NVIDIA. Only the final advance copy had it present, right when the game shipped, and it caused crashing issues on NVIDIA hardware
 