Update about AMD Catalyst Driver Support for Project CARS and The Witcher 3

Warsam71 (AMD Community Manager) · Joined Aug 6, 2013 · Messages: 140
Hello Everyone,

I wanted to let you know that we have just posted an important update about our driver’s support for Project CARS and The Witcher 3: Wild Hunt.

AMD is committed to improving performance for the recently-released Project CARS and The Witcher 3: Wild Hunt. To that end, we are creating AMD Catalyst™ 15.5 Beta to optimize performance for these titles, and we will continue to work closely with their developers to improve quality and performance. We will release AMD Catalyst™ 15.5 Beta on our website as soon as it is available.

You’ll also find a workaround if you are experiencing issues with the “HairWorks” feature in The Witcher 3: Wild Hunt.

Please click here to read the full article
 
Thanks for the update, make sure it has good triple xfire performance for Witcher 3, pls :)
 
Translation: once you're done with those games, we might finally have acceptable performance in them.

Just like when I had my AMD 6950 (unlocked) and 7970M in CrossFire in my Alienware M18x R2.
 
Translation: once you're done with those games, we might finally have acceptable performance in them.

Just like when I had my AMD 6950 (unlocked) and 7970M in CrossFire in my Alienware M18x R2.

Nvidia paid the developer not to allow AMD to optimize their drivers beforehand, so it's hardly AMD's fault.
 
Nvidia paid the developer not to allow AMD to optimize their drivers beforehand, so it's hardly AMD's fault.
And, of course, you have irrefutable evidence to back up such a bold statement. Otherwise you wouldn't be saying it in the first place, right?
 
That is my biggest gripe with AMD. By the time the drivers are working and CrossFire is working, I am often done with the game, or the game has come down in price and I could have spent less on it.
 
That is my biggest gripe with AMD. By the time the drivers are working and CrossFire is working, I am often done with the game, or the game has come down in price and I could have spent less on it.

Nvidia's own 7xx series is suffering massively with W3 as well; the Titan is just terrible, the 960 beats the 780, and so on.
 
Wish they would have worked with CDPR all along to have a driver ready at the time of launch. It's not like this game is some indie-level nothing game.

So, no estimate on the release of this "beta"? The last beta was released in early April 2015. That's a long time between updates.
 
That is my biggest gripe with AMD. By the time the drivers are working and CrossFire is working, I am often done with the game, or the game has come down in price and I could have spent less on it.

The Witcher came out a few days ago and Project Cars was earlier this month, and we have an updated driver today to address performance. Sure, it would be great if we didn't need to wait at all, but hell, things happen. I think the time to get updated drivers was pretty short.
 
Will this new driver improve performance in GTA V as well? Or is that a one-time shot too?
 
Wish they would have worked with CDPR all along to have a driver ready at the time of launch. It's not like this game is some indie-level nothing game.

So, no estimate on the release of this "beta"? The last beta was released in early April 2015. That's a long time between updates.

There was an actual post by a dev saying they had worked with CDPR since the beginning and, prior to the W3 release, had everything running great. Then, just before release, a new GameWorks build came out that included HairWorks, and performance was massively killed. It came out close enough to release that TressFX wouldn't have had enough time to make it into the game.
 
This is why the Fiji top-of-the-line chip can't be priced over $500. When I upgrade my GPU, if I have to pick between a GPU like the 390x for $850 or a Titan-X for $1,000, I'll pick the Titan-X every time, because nVidia seems to have no moral issue with paying off developers to hinder Radeon performance (I'm against this, but AMD doesn't seem to be doing anything about it, and most people think it's not happening, so it gets ignored). BUT priced at $500, we have a Radeon 4870 situation again, where AMD's top-of-the-line card cost less than half as much as nVidia's and could match it in performance in most games (4870 @ $300, GTX 280 @ $650).

Because AMD is apparently reading this thread: if you have the audacity to release the 390x at more than $500, then use that extra money to start helping developers.

Also, I'm smelling some price-fixing between AMD and nVidia again. In 2009, AMD's top-of-the-line single-GPU card, the Radeon 5870, retailed for $330. Six years later, AMD's top-of-the-line single-chip card, the R9-390x, is rumored to be $850. What else on Earth went up almost THREE times in cost? Even gas didn't jump this high. A $500 R9-390x would punch nVidia in the gut the same way the 4870 did to the GTX 280.

http://www.tomshardware.com/news/nvidia-amd-ati-graphics,6311.html


Of course, my thoughts get thrown out the window if the top-of-the-line Fiji is called "The AMD Olympian" and slays the Titan-X by offering the performance of two Titan-Xs in SLI from a single Fiji chip for $850. Then they could have an updated Hawaii-chipped (think Tahiti -> Tonga) R9-390x that matches Titan-X performance for $500.
 
Of course they did, because this has NEVER happened before :rolleyes:

Your eye roll does not change the fact that Nvidia has been caught with its hand in the cookie jar. :D Whether it happened before or not does not change the fact that Nvidia is doing it now. ;)
 
Your eye roll does not change the fact that Nvidia has been caught with its hand in the cookie jar. :D Whether it happened before or not does not change the fact that Nvidia is doing it now. ;)
Proof? Evidence? Details?

All of these are lacking.
 
Proof? Evidence? Details?

All of these are lacking.

I believe a developer for CDPR stated before the game was released that they weren't able to optimize it for AMD cards. That's not definitive by any means, but it is suspicious.
 
I believe a developer for CDPR stated before the game was released that they weren't able to optimize it for AMD cards. That's not definitive by any means, but it is suspicious.
"It" refers to virtual hair, not the game itself.

CDPR said they could not optimize HairWorks for AMD cards. The professional angry-gamer community on Reddit willfully misinterpreted this as CDPR saying Nvidia prevented them from optimizing the game for AMD cards. A shitstorm of idiotic proportions ensued.
 
Your eye roll does not change the fact that Nvidia has been caught with its hand in the cookie jar. :D Whether it happened before or not does not change the fact that Nvidia is doing it now. ;)

Proof? Evidence? Details?

All of these are lacking.

Here is a solid fact: HairWorks relies on DX11 tessellation, yes, THE STANDARD in DirectX 11. Well, AMD cards are really bad at that, so Nvidia used a standard part of the API. Who should be at fault for that? In tessellation benchmarks a 290X gets BEAT by a GTX 760. AMD did boost it by 50% in the R9 285, but it's still well behind.

To put the tessellation gap in perspective, it's bigger than the gap between AMD and Nvidia was in OpenCL, which is WHY AMD used OpenCL for TressFX: they had a huge performance advantage there. Well, Nvidia used what they had the advantage in. Both sides did it, so everyone can stop whining about it.
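For what it's worth, the arithmetic behind that argument is easy to sketch. The snippet below is a purely illustrative back-of-the-envelope calculation in Python; the guide-hair count and the assumption that tessellated geometry scales roughly linearly with the tessellation factor are my own assumptions, not actual HairWorks internals. It just shows why the commonly cited 64x default factor hammers a GPU with weaker tessellation throughput, and why capping the factor at 16x or 8x through the driver override (the workaround people keep pointing to) claws back so much of the cost:

GUIDE_HAIRS = 20_000  # assumed number of guide strands, purely illustrative

def segments_per_frame(tess_factor, guide_hairs=GUIDE_HAIRS):
    # Rough count of tessellated line segments the GPU has to generate
    # and rasterize each frame, assuming linear scaling with the factor.
    return guide_hairs * tess_factor

for factor in (64, 16, 8):  # 64x: commonly cited default; 16x/8x: typical driver caps
    print("x%d -> ~%s segments per frame" % (factor, format(segments_per_frame(factor), ",")))

# x64 -> ~1,280,000 segments per frame
# x16 -> ~320,000 segments per frame
# x8 -> ~160,000 segments per frame

Cutting the factor from 64x to 16x removes roughly three quarters of that amplification work in this toy model, which lines up with why the tessellation cap helps so much on cards with weaker tessellation hardware.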
 
Your eye roll does not change the fact that Nvidia has been caught with its hand in the cookie jar. :D Whether it happened before or not does not change the fact that Nvidia is doing it now. ;)

How is this the hand in the cookie jar for any game, other than that HairWorks works off tessellation, which apparently AMD cards can't handle?

AMD has basically admitted defeat here by releasing, sorry, announcing, working drivers too little, too late. I guess it would be silly to prep drivers before release, because the faithful will blame someone else, and it is much cheaper to release working drivers a month or two after release... which seems to be a common trend. (See my original sarcastic post for more information.)

PS - I've owned many ATi/AMD cards over the last decade and a half, and this is the norm.
 
Here is a solid fact: HairWorks relies on DX11 tessellation, yes, THE STANDARD in DirectX 11. Well, AMD cards are really bad at that, so Nvidia used a standard part of the API. Who should be at fault for that? In tessellation benchmarks a 290X gets BEAT by a GTX 760. AMD did boost it by 50% in the R9 285, but it's still well behind.

To put the tessellation gap in perspective, it's bigger than the gap between AMD and Nvidia was in OpenCL, which is WHY AMD used OpenCL for TressFX: they had a huge performance advantage there. Well, Nvidia used what they had the advantage in. Both sides did it, so everyone can stop whining about it.

Hey,
I own a 780.
AMD ain't fucking me.
Nvidia is fucking me.
Where is my big advantage in tessellation?
I'm waiting.
Who do I call?

Is TressFX going to help me?
Maybe AMD will write a driver for 780 owners, since they're the ones fucking everybody up.
 
Here is a solid fact: HairWorks relies on DX11 tessellation, yes, THE STANDARD in DirectX 11. Well, AMD cards are really bad at that, so Nvidia used a standard part of the API. Who should be at fault for that? In tessellation benchmarks a 290X gets BEAT by a GTX 760. AMD did boost it by 50% in the R9 285, but it's still well behind.

To put the tessellation gap in perspective, it's bigger than the gap between AMD and Nvidia was in OpenCL, which is WHY AMD used OpenCL for TressFX: they had a huge performance advantage there. Well, Nvidia used what they had the advantage in. Both sides did it, so everyone can stop whining about it.

Thanks for clearing that up. So basically AMD fanboys are crying because a game used standard DX11 tessellation, as opposed to TressFX, which is not a DX standard at all. Obviously nVidia cheated by making HairWorks use a universal standard! (rolls eyes)

To the AMD fanboys who say they should have used TressFX: why the fuck would nVidia use an AMD standard in their API?

I wondered why AMD was making Mantle when everything else uses DX9/10/11. Now it's becoming obvious that their cards simply don't cut it. Another reason added to my "I won't go back to AMD because..." list.
 
Thanks for clearing that up. So basically AMD fanboys are crying because a game used standard DX11 tessellation, as opposed to TressFX, which is not a DX standard at all. Obviously nVidia cheated by making HairWorks use a universal standard! (rolls eyes)

To the AMD fanboys who say they should have used TressFX: why the fuck would nVidia use an AMD standard in their API?

I wondered why AMD was making Mantle when everything else uses DX9/10/11. Now it's becoming obvious that their cards simply don't cut it. Another reason added to my "I won't go back to AMD because..." list.

TressFX uses DirectCompute, which is the standard for all video cards in DX11. There is nothing "AMD only" about it; AMD just happened to help Square Enix develop TressFX faster. Ever seen that game called Alien: Isolation, where there are all these cool effects that blow PhysX out of the water? It runs so well on AMD and Nvidia hardware because it uses Microsoft DirectCompute physics effects. We're talking 100 fps performance across Nvidia and AMD cards, according to the [H]ardOCP review. An AMD Evolved game.

So you're arguing that tessellation is a better use of resources because it kills performance for Nvidia cards across the board, as long as it hits AMD cards a little harder? Is it wrong for Nvidia-sponsored titles to use libraries developed with AMD's assistance because they give close to 120 fps with all of the effects turned on, across all video cards?

Come on man. :)
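To make the DirectCompute point concrete, here is a tiny NumPy sketch; it is my own illustration, not actual TressFX or Alien: Isolation code, and the vertex count is made up. The per-vertex update a compute-shader-based hair or physics system performs is just a data-parallel integration step, which is why it maps onto the compute units of any DX11-class GPU rather than onto anything vendor-specific:

import numpy as np

NUM_VERTS = 100_000                    # assumed hair-vertex count, illustrative only
DT = 1.0 / 60.0                        # one frame at 60 fps
GRAVITY = np.array([0.0, -9.81, 0.0])  # constant force applied to every vertex

pos = np.zeros((NUM_VERTS, 3), dtype=np.float32)  # current vertex positions
prev = pos.copy()                                 # positions from the previous frame

def verlet_step(pos, prev):
    # One Verlet integration step applied to every vertex at once,
    # mirroring what a single compute-shader thread would do per vertex.
    new = pos + (pos - prev) + GRAVITY * DT * DT
    return new.astype(np.float32), pos

pos, prev = verlet_step(pos, prev)
print(pos.shape)  # (100000, 3): every strand vertex advanced in one parallel pass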
 
TressFX uses DirectCompute, which is the standard for all video cards in DX11. There is nothing "AMD only" about it; AMD just happened to help Square Enix develop TressFX faster. Ever seen that game called Alien: Isolation, where there are all these cool effects that blow PhysX out of the water? It runs so well on AMD and Nvidia hardware because it uses Microsoft DirectCompute physics effects. We're talking 100 fps performance across Nvidia and AMD cards, according to the [H]ardOCP review. An AMD Evolved game.

So you're arguing that tessellation is a better use of resources because it kills performance for Nvidia cards across the board, as long as it hits AMD cards a little harder? Is it wrong for Nvidia-sponsored titles to use libraries developed with AMD's assistance because they give close to 120 fps with all of the effects turned on, across all video cards?

Come on man. :)
Fair points. I have to admit that I thought DirectCompute was developed by AMD. My mistake.

I'm not saying that DX11 tessellation is better; I'm saying it's a standard that exists across all cards. That being the case, I fail to see how nVidia is somehow cheating by using it. If they are somehow cheating, then AMD were no better for using TressFX in Tomb Raider.

There's an interesting article on Forbes here which refutes a lot of what AMD have claimed regarding The Witcher 3.
 
Fair points. I have to admit that I thought DirectCompute was developed by AMD. My mistake.

I'm not saying that DX11 tessellation is better; I'm saying it's a standard that exists across all cards. That being the case, I fail to see how nVidia is somehow cheating by using it. If they are somehow cheating, then AMD were no better for using TressFX in Tomb Raider.

There's an interesting article on Forbes here which refutes a lot of what AMD have claimed regarding The Witcher 3.

Forbes > [H]. Clearly
 
Nvidia paid the developer not to allow AMD to optimize their drivers beforehand, so it's hardly AMD's fault.

*facepalm*

Read this:

A PC enthusiast on Reddit did more to solve the HairWorks performance problem than AMD has apparently done. AMD's last Catalyst WHQL driver was released 161 days ago, and the company hasn't announced one on the horizon. Next to Nvidia's monthly update cycle and game-ready driver program, this looks lazy.

If you want a huge AAA game release to look great on your hardware, you take the initiative to ensure that it does. What you don’t do is expect your competitor to make it easier for you by opening up the technology they’ve invested millions of dollars into. You innovate using your own technologies. Or you increase your resources. Or you bolster your relationships and face time with developers.

In short, you just find a way to get it done.

Maybe Nvidia is secretly paying off AMD programmers not to bother updating and optimizing AMD drivers too? :rolleyes: You guys seriously need to get a grip.

All this crying about GameWorks is a red herring, because remove that from the equation and AMD still wouldn't have an optimized driver update. There seem to be some deep fundamental problems over there which the company seems to be staying very quiet about.
 
*facepalm*

Read this:



Maybe Nvidia is secretly paying off AMD programmers not to bother updating and optimizing AMD drivers too? :rolleyes: You guys seriously need to get a grip.

All this crying about GameWorks is a red herring, because remove that from the equation and AMD still wouldn't have an optimized driver update. There seem to be some deep fundamental problems over there which the company seems to be staying very quiet about.

They've been releasing betas since then. I really don't get why people get so bitchy about there not being WHQL drivers. MS certification doesn't magically make a driver better, nor does it mean the driver will even be good; WHQL drivers have been just as ass as beta drivers before, from both companies. As long as there isn't another 2-3 month gap between releases of any kind, I'm not going to complain too loudly about drivers not being out ASAP, especially for games they clearly didn't get time to fully mess with prior to release. If AMD can get a driver out that's worth a damn within a couple of weeks, I'm alright with that. I'm still willing to give AMD some leniency in that department, though I've completely abandoned any thought of going CrossFire anytime in anything remotely resembling the near future.
 