Nvidia Responds To Witcher 3 GameWorks Controversy, PC Gamers On The Offensive

Well, you still have to make sure it works for that 20%, don't you? Why would a dev want to possibly lose 20% of their sales?

Time costs money. Putting in a lot of time to optimize code for an extra 20% may not be worth it. Don't believe me? Read what a dev said:

"Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations."
 
the code of this feature cannot be optimized for AMD products
AMD TressFX runs like shit on Nvidia: Crystal Dynamics & AMD give Nvidia the source to fix it.
Nvidia HairWorks runs like shit on AMD: Sorry guys, turn it off, nothing we can do here, pack it up.

:rolleyes: <-- I wish I could make this bigger. And more eye-rollier.

HairWorks heavily relies on tessellation.
Works fine in Far Cry 4, according to HardOCP anyway.
Haven't seen anyone else dig deep on it.
 
Time costs money. Putting in a lot of time to optimize code for an extra 20% may not be worth it. Don't believe me? Read what a dev said:

"Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations."


Well, I think it's a pretty simple fix. I don't play the Witcher games; I love their graphics but just felt the games were a bit too slow for me. In any case, the simple fix is to drop the amount of tessellation, which I think AMD can do via drivers... correct me if I'm wrong.

Dropping the tessellation levels on hair, though: if you start looking close up, like in cinematics, it's easy to see the differences, so they might not have wanted to do it. To get natural hair to flow and react to the world you need very dense meshes and lots of bones. Even in offline rendering, rendering hair with correct lighting and animation would increase render times 4- to 6-fold.
 
AMD TressFX runs like shit on Nvidia: Crystal Dynamics & AMD give Nvidia the source to fix it.
Nvidia HairWorks runs like shit on AMD: Sorry guys, turn it off, nothing we can do here, pack it up.

:rolleyes: <-- I wish I could make this bigger. And more eye-rollier.


Works fine in Far Cry 4, according to HardOCP anyway.
Haven't seen anyone else dig deep on it.

The question people have not been answering here in any realistic fashion is why devs themselves would choose Nvidia over AMD. AMD is "free" and "open-source." Why isn't everyone going with TressFX? Why would they pay Nvidia to license their code when they can get AMD's code for free? Many think it's a conspiracy, but the obvious answer is that AMD isn't doing their job on a number of levels.
 
AMD also worked with a number of developers, as well as Nvidia, to fix the TressFX performance issues. It runs with some level of parity nowadays.

Not saying NVidia needs to go balls to the wall here, but some professional courtesy goes a long way...
 
Well, I think it's a pretty simple fix. I don't play the Witcher games; I love their graphics but just felt the games were a bit too slow for me. In any case, the simple fix is to drop the amount of tessellation, which I think AMD can do via drivers... correct me if I'm wrong.

You are wrong.

"Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations."
 
Well, you still have to make sure it works for that 20%, don't you? Why would a dev want to possibly lose 20% of their sales?

Devs have to do performance testing anyway after integration into their project, even for Nvidia's products. It isn't as simple as dropping in the library or source code, compiling, and being good to go. It would be prudent to test on AMD cards at the same time; it saves money and potentially increases sales.
That's the thing, though: it does work, it just doesn't run at a reasonably good speed. Then the question is how much you need to spend to make it work at optimal speed for that 20% of buyers. The truth is losing 20% of sales is a big deal, but spending as much as or more than you'd make from those sales on optimization wouldn't be worth it.
 
You are wrong.

"Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations."
A long time ago, AMD recommended capping tessellation in the driver controls in order to alleviate some HairWorks performance problems. I don't know if that actually worked, or if it's still true today. But that's what they recommended. When he says they can't optimize the code, he means Nvidia uses a lot of tessellation in HairWorks and they can't do anything about it from their end.

I think the CDPR rep who wrote that quote is going to regret his wording. Claiming that certain Nvidia features physically can't be optimized for AMD hardware is going to cause a bit of a shitstorm... Well, it already has. That's the smoking gun, imho. They will easily flip it and blame it on AMD's tessellation performance, which is a valid excuse from their perspective.
 
A long time ago, AMD recommended capping tessellation in the driver controls in order to alleviate some HairWorks performance problems. I don't know if that actually worked, or if it's still true today. But that's what they recommended.
Well, even if true, you are still not getting the full experience on AMD. That would be like adding a new graphics option to your game:
HairWorks - ON / OFF / HALFASS
 
You are wrong.

"Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations."


The problem is Nvidia is losing 20% also; AMD is losing 37% on the 290X, which goes in line with my tessellation theory. That's about the same % drop as the Unigine benchmark with tessellation on and off.

As I said before, the devs might not want to do it for visual reasons; the Witcher's hair is pretty awesome.
 
Well, even if true, you are still not getting the full experience on AMD. That would be like adding a new graphics option to your game:
HairWorks - ON / OFF / HALFASS

You would need to see a comparison of AMD's low tessellation setting vs. full tessellation, including an Nvidia GPU.
There's a video around here somewhere; one of the AMD graphics guys claims the tessellation in HairWorks makes no noticeable difference in visual quality. They compared it to the water tessellation in Crysis 2. To defend AMD on their tessellation problems, he compared it to Nvidia's crippled compute performance... Doesn't do them any favors in games, though.
 
I'm sorry, what's the problem? Nvidia works with the devs and adds proprietary tech that runs better on their hardware but is also compatible with competitors' hardware, just not optimized for it.


Somehow this is bad?
 
Are people mostly complaining that The Witcher 3 *might* run poorly on AMD hardware? Or are there actual benchmarks showing the game runs like shit on AMD cards?

If getting the game to run correctly on AMD just means turn off hair animations, then people are being whiny about nothing.
 
just not optimized
Well, can't be optimized.
If you put on your tinfoil hat, Nvidia is actively preventing AMD from optimizing it. Therein lies the problem.

Are people mostly complaining that The Witcher 3 *might* run poorly on AMD hardware? Or are there actual benchmarks showing the game runs like shit on AMD cards?
PCGH has a benchmark up.
HairWorks hits the GTX 970 for about 10fps, the 290X for 15fps. They both run about 44fps average with it disabled.
 
A long time ago, AMD recommended capping Tessellation in the driver controls in order to alleviate some HairWorks performance problems. I don't know if that actually worked, of if it's still true today. But that's what they recommended.

That wasn't for HairWorks. GameWorks didn't even exist back then.

That was for the time Nvidia-sponsored titles jacked the tessellation levels so high on random geometry it would cause an insane performance loss on AMD GPUs.

HairWorks won't really be a problem with GCN 1.2 though. The R9 285 (Tonga) is about equal to Maxwell when it comes to Tessellation.
 
Well, can't be optimized.
If you put on your tinfoil hat, Nvidia is actively preventing AMD from optimizing it. Therein lies the problem.


PCGH has a benchmark up.
HairWorks hits the GTX 970 for about 10fps, the 290X for 15fps. They both run about 44fps average with it disabled.

So it's a non-issue then, got it.
 
Well, can't be optimized.
If you put on your tinfoil hat, Nvidia is actively preventing AMD from optimizing it. Therein lies the problem.


PCGH has a benchmark up.
HairWorks hits the GTX 970 for about 10fps, the 290X for 15fps. They both run about 44fps average with it disabled.

All I am getting out of that is that the game runs like shit when using HairWorks. Dropping from 44 to 34 fps just for enabling fancy hair on an Nvidia card? No thanks.
 
The problem is nv is loosing 20% also, AMD is loosing 37% on the 290x which goes in line with my tessellation theory, that's about the same % drop with uniengine tess on and off.

As I said before the Dev's might not want to do it because of visual reasons, Witcher's hair is pretty awesome.
Well, they are not doing it for some reason.

As for me, I do not really care if a wolf looks better the 30 secs or so I see it before I kill it.
 
Here's the relevant benchmark info.

Hairworks: OFF // ON

970: 43, down to 36.
290X: 42, down to 27.
Difference: 7 fps vs 15 fps
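For what it's worth, here's the relative hit worked out from those same numbers (a quick sketch; the figures are just the ones quoted above):

```python
# Absolute and relative FPS loss from enabling HairWorks,
# using the benchmark numbers quoted in this thread.
benchmarks = {
    "GTX 970": (43, 36),   # (HairWorks off, HairWorks on)
    "R9 290X": (42, 27),
}

for card, (off, on) in benchmarks.items():
    drop = off - on
    pct = 100 * drop / off
    print(f"{card}: -{drop} fps ({pct:.1f}% slower)")
# GTX 970: -7 fps (16.3% slower)
# R9 290X: -15 fps (35.7% slower)
```

So the 290X loses roughly twice the frame rate share the 970 does, which lines up with the ~20% vs ~37% figures mentioned earlier.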
 
Are people mostly complaining that The Witcher 3 *might* run poorly on AMD hardware? Or are there actual benchmarks showing the game runs like shit on AMD cards?

If getting the game to run correctly on AMD just means turn off hair animations, then people are being whiny about nothing.


As for me, I do not really care if a wolf looks better the 30 secs or so I see it before I kill it.
But if this becomes a trend for SERIOUS SHIT like BoobieJiggling(tm) I will be pissed.
 
That wasn't for HairWorks. GameWorks didn't even exist back then.

That was for the time Nvidia-sponsored titles jacked the tessellation levels so high on random geometry it would cause an insane performance loss on AMD GPUs.

HairWorks won't really be a problem with GCN 1.2 though. The R9 285 (Tonga) is about equal to Maxwell when it comes to Tessellation.


Tonga still takes a greater performance hit with tessellation: around 30%, while Nvidia's hardware takes about 20%.
 
Here's the relevant benchmark info.

Hairworks: OFF // ON

970: 43, down to 36.
290X: 42, down to 27.
Difference: 7 fps vs 15 fps
I think it said in the review that was just with Geralt and his horse, nothing demanding. The difference may be greater with more on screen.
 
Nvidia tested something similar:

[attached image: Nvidia's HairWorks benchmark chart]
 
So the consensus is that Kepler-based card owners got screwed, right? Man, if that is the case I will never buy an Nvidia product again. Just went SLI 780 Ti (bought one a year after the first... the thing SLI is supposed to incentivize) and now my rig can easily be beaten by a shitty Maxwell-based GPU?

wow. this is just something.
 
Most likely it will be fixed; that benchmark is pre-release, so wait till the full game comes out. Kepler-based cards are getting a performance hit that shouldn't be there.
 
NVIDIA goes out of their way to make sure games run better for their customers.

Meanwhile AMD has not released a driver since last year.
 
NVIDIA goes out of their way to make sure games run better for their customers.

Meanwhile AMD has not released a driver since last year.

Apparently betas don't count as releasing a driver, then?...
 
NVIDIA goes out of their way to make sure games run better for their customers.

Meanwhile AMD has not released a driver since last year.

Maybe in your alternate reality, but in the real world you are just flat-out wrong.
 
Dear Erek,

We are pleased to inform you that your application to the NVIDIA GameWorks™ Registered Developer Program is approved.

As a member, you also have access to file bugs, submit feature requests, engage with NVIDIA engineering, and access exclusive events. From time to time, you may receive emails asking you to validate your email and Registered Developer Program status. It is important that you promptly respond to those emails in order to maintain your status.

Thank you for joining the NVIDIA GameWorks Registered Developer Program. We look forward to your participation!

NVIDIA Developer Relations Team

https://developer.nvidia.com/gameworks-visualfx-overview


Let's see how this works with Unity... I am curious if GameWorks is really all that big of a deal involving AMD.

Here is the current status of my own work without using any GameWorks features :

https://youtu.be/aNGzm4895UQ
 
NVIDIA goes out of their way to make sure games run better for their NEW customers.

Meanwhile AMD has not released a driver since last year.

FTFY ;)



(yes it's a "semi-final" review build with 2 of 3 day 1 patches, so hopefully the Kepler cards regain their rightfully deserved performance in the retail build)
 
It seems that Nvidia is in a perpetual state of "shitstorm" these days. (970 Memorygate, Batmangate, Gameworks, Bumpgate, etc).

Marketshare seems to be holding in though.

Maybe these issues aren't as big as some people are making them out to be.

I said at the time that the 970 thing was making a mountain out of a molehill. Just checked newegg and I see they're still selling for over $300. Yup. Nobody cares.
 
FTFY ;)



(yes it's a "semi-final" review build with 2 of 3 day 1 patches, so hopefully the Kepler cards regain their rightfully deserved performance in the retail build)

So, new tech is faster than old tech...GTFO. Never expected that.
 
So, new tech is faster than old tech...GTFO. Never expected that.

For you it doesn't raise an eyebrow that a premium card that has been practically twice as fast in other games suddenly takes a nosedive in a new game, for no clear-cut reason (unlike AMD's tessellation deficiency), and suddenly trades blows with a budget card?
 