Witcher 3 Announcement From CD Projekt Red

Warsam71

AMD Community Manager
Hello everyone,

I just wanted to let you know about the recent announcement made by the CD Projekt Red team regarding AMD Radeon performance in the upcoming title, The Witcher 3.


“Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.”


Source

Hope this helps
 
I feel bad for AMD, you guys put all this effort into making an awesome hair physics simulation only to have Nvidia come out and pull the rug from under your feet and then make everyone switch to them.
 
Kind of like TressFX in Tomb Raider for Nvidia? Although patches did improve performance significantly. Is the performance impact for TressFX the same on Nvidia/AMD now? If so, that would seem like a more ideal solution, unless HairWorks offers some considerably better technology. I am not a fan of proprietary graphical features.
 
I look forward to the day we'll have this level of physics simulation applied to the entire world, and not just one character's hair.
 
I feel bad for AMD, you guys put all this effort into making an awesome hair physics simulation only to have Nvidia come out and pull the rug from under your feet and then make everyone switch to them.

I don't follow your logic here. The only noteworthy HairWorks game that comes to mind is TW3, and the only TressFX game that comes to mind is Tomb Raider. Tomb Raider had terrible performance on Nvidia with it enabled... seems like the exact same situation here.
 
I look forward to the day we'll have this level of physics simulation applied to the entire world, and not just one character's hair.

Can't wait for The Poison game. Breaking cards left and right trying to keep up with those members' hair.
 
Friggin' hair physics... not that I can run it nicely, but can't they make the damn thing work for both? FFS.
 
I don't follow your logic here. The only noteworthy HairWorks game that comes to mind is TW3, and the only TressFX game that comes to mind is Tomb Raider. Tomb Raider had terrible performance on Nvidia with it enabled... seems like the exact same situation here.

Exactly. The difference is AMD tends to be much more open with their technology, while Nvidia keeps their shit closed source. Nvidia has a lot more money to throw at developers to get their games running their GameWorks and TWIMTBP crap. AMD is essentially the underdog in the GPU business, and I really don't want to see them fail, as that means less competition. Everything AMD does, Nvidia will find a way to do better and make money doing it by locking everyone else out.
 
^there we go

I don't follow your logic here. The only noteworthy HairWorks game that comes to mind is TW3, and the only TressFX game that comes to mind is Tomb Raider. Tomb Raider had terrible performance on Nvidia with it enabled... seems like the exact same situation here.

But but but AMD is good Nvidia evil don't you know? :( /sarc

My problem with TressFX wasn't the performance impact, as I knew it would work faster on AMD cards (doh), but the fact that most of the time Lara's hair was like Medusa's - all over the place and rarely pretty. Not to mention the through-the-neck/face glitches. We'll see how Nvidia's implementation works.
 
Everyone knows that male hair physics is much more demanding than female hair physics... the beautiful flowing locks of Geralt can't be contained by TressFX... it would cause all AMD cards to explode from the sheer exquisite beauty of each strand as he battles the evils of the land...

AMD is only good enough for the pitiful strands of Lara Croft.. :)



lol, but in all seriousness - did CD Projekt and Nvidia really choose Geralt to pull the hair physics card over AMD?
 
but the fact that most of the time Lara's hair was like Medusa's - all over the place and rarely pretty...

It's Tomb Raider, of course her hair looks terrible. I dare you to find a woman with perfect hair that is stranded on an island, running around for her life, and sweating like crazy. Not gonna happen.
 
Well glad I didn't preorder this game. Thank you for letting us AMD users know in advance so that we can skip this title.
 
Fancy hair or otherwise, it'll be the same game.

Not if there are other GameWorks features that disrupt the game on AMD hardware. PhysX on the CPU runs like shit in Project Cars. That's another Nvidia game that is broken on the PC. AMD can't see the code to GameWorks features so you have to hope that the developer can turn them off for AMD users. In the case of Project Cars you can't turn off the PhysX because it is the entirety of the car physics modeling for the driving.

Nvidia has unleashed the Apple ecosystem onto the PC.
 
Well glad I didn't preorder this game. Thank you for letting us AMD users know in advance so that we can skip this title.

+1

Wasn't planning on getting it for a while, but I'll just skip it altogether now.
 
Not if there are other GameWorks features that disrupt the game on AMD hardware. PhysX on the CPU runs like shit in Project Cars. That's another Nvidia game that is broken on the PC. AMD can't see the code to GameWorks features so you have to hope that the developer can turn them off for AMD users. In the case of Project Cars you can't turn off the PhysX because it is the entirety of the car physics modeling for the driving.

Nvidia has unleashed the Apple ecosystem onto the PC.

Still don't think a boycott is in order?! :p
 
Someone posted there showing HairWorks didn't cripple AMD cards on Far Cry 4.

http://www.hardocp.com/article/2015...cs_features_performance_review/5#.VVY_aUZRF9o

It could just be CD Projekt saying this since it's a GameWorks game, and they probably can't say anything positive about AMD that might affect Nvidia's sales. Just speculation of course.
Yeah that's what I was thinking also.
Far Cry 4 ran fine with Fur enabled across both Nvidia and AMD. Why they're singling out Witcher 3 is worrisome...
 
Yeah that's what I was thinking also.
Far Cry 4 ran fine with Fur enabled across both Nvidia and AMD. Why they're singling out Witcher 3 is worrisome...

It could be because the fur is more complex in Witcher 3

[image: 1wvOSR9.jpg]
 
It could be because the fur is more complex in Witcher 3


[image: 1wvOSR9.jpg]

Also, it could be that the fur is using the new PhysX that runs on the CPU for consumers who don't have an Nvidia GPU. So if you have an Nvidia GPU it can be offloaded to the GPU, and if not, you're SOL. Project Cars uses that form of PhysX, and an i7-5960X + R9 290X can't cope with it when you enable the rain and storm weather effects. The game becomes CPU bound. Luckily there is supposed to be a DX12 release for this game. Maybe that can rectify the situation.
 
This is a prime reason to petition developers to stop using proprietary extensions/tech that are GPU specific. Nvidia is by far the worst of the lot, from PhysX on up, as they push this sort of lock-in all the bloody time. Not that AMD hasn't done it before, but they've been much more amenable to open technologies and actually follow standards when they exist. The vast majority of the time, if I see a game plastered with "The Way It's Meant To Be Played" and other Nvidia garbage loading screens, I'm going to find a handful of options that are essentially useless without an Nvidia card (i.e. PhysX being the most notorious). However, on the relatively rare occasion that a game prefers AMD, even users of NV cards can usually get something close to the full experience. That is to say, where I can play Civilization: Beyond Earth with Mantle (which, coincidentally, is evolving into the cross-platform, close-to-the-metal Vulkan soon), someone with a reasonably powerful NV card isn't going to miss anything. Likewise, if someone chooses OpenCL or Havok physics, Nvidia users aren't left out, compared to how things go with PhysX. FreeSync monitors won't preclude Nvidia from supporting them, etc.

It's just getting old. I try to stay brand agnostic, but when one side seems to be pressing towards inclusive, advancing tech (see: Vulkan, the upcoming AMD drivers that are mostly open source, with the Catalyst binary blob as something of a plug-in, etc.) and the other rests on its laurels and looks to implement more proprietary "optimizations", it's hard to support the latter.
 
Well glad I didn't preorder this game. Thank you for letting us AMD users know in advance so that we can skip this title.

+1

Wasn't planning on getting it for a while, but I'll just skip it altogether now.

I've heard of plenty of reasons to pass on a game before, but because of not being able to check the box for pretty hair? Really? :confused:
 
I've heard of plenty of reasons to pass on a game before, but because of not being able to check the box for pretty hair? Really? :confused:

They think the game isn't even going to run on their AMD cards or something
 
I've heard of plenty of reasons to pass on a game before, but because of not being able to check the box for pretty hair? Really? :confused:

They think the game isn't even going to run on their AMD cards or something

The game will run for sure, I'm not worried. If I were all hyped about it I could go dig an old 670 out of my closet, but that's not the point. I'd still be encouraging this type of behavior by giving money to developers for this closed, proprietary crap.

I'd rather not encourage it.
 
The game will run for sure, I'm not worried. If I were all hyped about it I could go dig an old 670 out of my closet, but that's not the point. I'd still be encouraging this type of behavior by giving money to developers for this closed, proprietary crap.

I'd rather not encourage it.

Quick question. Did you feel the same way about Tomb Raider, or any game that implemented Mantle? ;)
 
Quick question. Did you feel the same way about Tomb Raider, or any game that implemented Mantle? ;)
Well, Mantle has zero performance impact for Nvidia in any game that uses it, unlike the GameWorks implementation, for one. From what I heard, TressFX was optimized to work with Nvidia cards to a certain extent, unlike PhysX, which would bring your system to its knees (10-15 fps max) with AMD.
 
Quick question. Did you feel the same way about Tomb Raider, or any game that implemented Mantle? ;)

Well, Mantle has zero performance impact for Nvidia in any game that uses it, unlike the GameWorks implementation, for one. From what I heard, TressFX was optimized to work with Nvidia cards to a certain extent, unlike PhysX, which would bring your system to its knees (10-15 fps max) with AMD.

Mantle has no detrimental effects if you don't use it, and it has now evolved into the open Vulkan API.

As for TressFX, it uses DirectCompute, which again is open to everybody. AMD even shared sample TressFX code! Go ahead and download it yourself; everyone can optimize their code and drivers. It's not about hampering performance, but rather encouraging development participation.
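
For anyone wondering what "hair simulation in DirectCompute" actually boils down to, here's a rough CPU-side sketch in C++ of the kind of per-strand update these hair systems run every frame. To be clear, this is just an illustrative toy of the general technique (the real TressFX code is an HLSL compute shader with wind, collision and many more constraints), and every name in it is my own invention:

#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float length(Vec3 v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

// One guide strand: current and previous positions of each vertex (Verlet state).
struct Strand {
    std::vector<Vec3> pos, prevPos;
    float restLength; // rest distance between neighbouring vertices
};

// Simplified per-frame update: gravity plus Verlet integration, then a couple of
// relaxation passes that pull neighbouring vertices back toward their rest distance.
// A GPU compute shader does this for thousands of strands in parallel.
void simulateStrand(Strand& s, float dt) {
    const Vec3 gravity{0.0f, -9.8f, 0.0f};
    for (std::size_t i = 1; i < s.pos.size(); ++i) {   // vertex 0 stays pinned to the scalp
        Vec3 velocity = s.pos[i] - s.prevPos[i];
        s.prevPos[i] = s.pos[i];
        s.pos[i] = s.pos[i] + velocity + gravity * (dt * dt);
    }
    for (int pass = 0; pass < 2; ++pass) {             // distance-constraint relaxation
        for (std::size_t i = 1; i < s.pos.size(); ++i) {
            Vec3 d = s.pos[i] - s.pos[i - 1];
            float len = length(d);
            if (len > 1e-4f) {
                float correction = (len - s.restLength) / len;
                s.pos[i] = s.pos[i] - d * correction;  // only move the child vertex
            }
        }
    }
}

Nothing vendor-specific in there, which is the whole point: any GPU that can run compute shaders can run this kind of simulation.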
 
I don't care if it's an AMD feature or an Nvidia feature, proprietary features do not move PC gaming forward.
 
Mantle has no detrimental effects if you don't use it, and it has now evolved into the open Vulkan API.

As for TressFX, it uses DirectCompute, which again is open to everybody. AMD even shared sample TressFX code! Go ahead and download it yourself; everyone can optimize their code and drivers. It's not about hampering performance, but rather encouraging development participation.

HairWorks uses DirectCompute, not CUDA, for physics simulation, just like TressFX (lol... I just realized how slightly ridiculous all these names are).

The difference is that for rendering, HairWorks heavily leverages tessellation. As we know (or some might know), tessellation performance is not quite the same across all architectures (even within GCN).

Because of the latter, don't be surprised if the R9 285 (Tonga) shows a smaller performance delta than other AMD video cards (although in this case it might be a bit tricky due to the compute component).
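
To put rough numbers on why the tessellation side is the part that bites: HairWorks expands a relatively small set of simulated guide strands into the hair you actually see, so the rendered triangle count scales with the tessellation factor while the compute cost stays roughly flat. Here's a purely illustrative back-of-the-envelope in C++; every figure in it is a made-up assumption, not anything from CDPR or Nvidia:

#include <cstdint>
#include <cstdio>

int main() {
    // Hypothetical figures for illustration only.
    const std::uint64_t guideStrands  = 20000; // strands actually simulated in compute
    const std::uint64_t densityFactor = 4;     // interpolated strands rendered around each guide
    const std::uint64_t baseSegments  = 4;     // control segments per strand before tessellation

    for (std::uint64_t tessFactor : {8u, 16u, 32u, 64u}) {
        std::uint64_t renderedStrands = guideStrands * densityFactor;
        std::uint64_t segments  = renderedStrands * baseSegments * tessFactor;
        std::uint64_t triangles = segments * 2; // each segment expanded to a thin two-triangle quad
        std::printf("tess factor %2llu -> ~%.1f million hair triangles per frame\n",
                    (unsigned long long)tessFactor, triangles / 1e6);
    }
    // The simulation workload is identical in every row; only the tessellated geometry
    // grows, which is why cards with weaker tessellation throughput fall off a cliff.
    return 0;
}

Plug in whatever numbers you like; the point is the roughly linear scaling with the tessellation factor, not the absolute counts.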
 
I have 3x 290s, so let's hope they have good scaling when a CrossFire profile comes out, so that third GPU can pick up the tessellation slack lol
 
1 day, 14 hours. I will try to sleep as much as possible in this time, so I can pull an all-nighter once it comes out...!
 
I think it would be a loss to boycott a good game just because of some silly hair physics, but that's just me.
 
In Tomb Raider, TressFX ran like crap on my previous Nvidia cards, so I just played the game without it. I'm more bothered that they downgraded the graphics to meet the needs of consoles.
 