Nvidia Responds To Witcher 3 GameWorks Controversy, PC Gamers On The Offensive

Bullshit claim; proper reviews show Kepler doing fine. This is just AMD fanboy nonsense that dupes some gullible Kepler owners into believing it because they don't do their own research.

A real claim. I am a Kepler owner and I am pissed I did not buy an R9 290 after seeing benches for Witcher 3, and trust me, I have been doing research ever since the first benchmarks came out. Initially Kepler (both the 780 and Titan) was slower than a 960. Now via Guru3D, on the latest drivers with the latest patch, the 780 and Titan are 3-5 FPS above the 960, what a relief. The R9 290 is beating the 780 Ti, what a joke. :mad:

Oh and in before someone tells me to turn down the settings... I bought a Nvidia card not an AMD card!!!!!!!!!!!!!!!!!!!!!!
 
1440p - Medium settings and all post-processing enabled except for AA and Vignette. Using a GTX 780 overclocked to 1215/1675 - running about 40-60 FPS at the start of the game (White Orchard). Water is set to High and Textures to Ultra. Game still looks great IMO and runs decently.
 
A real claim. I am a Kepler owner and I am pissed I did not buy an R9 290 after seeing benches for Witcher 3, and trust me, I have been doing research ever since the first benchmarks came out. Initially Kepler (both the 780 and Titan) was slower than a 960. Now via Guru3D, on the latest drivers with the latest patch, the 780 and Titan are 3-5 FPS above the 960, what a relief. The R9 290 is beating the 780 Ti, what a joke. :mad:

Oh and in before someone tells me to turn down the settings... I bought a Nvidia card not an AMD card!!!!!!!!!!!!!!!!!!!!!!

I don't see the 960 anywhere near the 780 Ti or Titan, do you? What I do see is the inherent weakness of Kepler being exposed in some situations vs Maxwell, and here's a shocker for you: technology evolves! Your 2-year-old card may perform worse in SOME situations vs Maxwell or GCN, get over it. This is one game of many that are out there, and the difference between a 290X and 780 Ti in this game isn't much at all.

[Image: 1920_u_off.jpg -- gamegpu.ru Witcher 3 benchmark chart, 1920x1080]

[Image: 2560_h_off.jpg -- gamegpu.ru Witcher 3 benchmark chart, 2560x1440]

[Image: witcher3_1920.jpg -- gamegpu.ru Witcher 3 benchmark chart, 1920x1080]

[Image: w3m_ultra_1920.png -- Witcher 3 ultra benchmark chart, 1920x1080]

[Image: w3m_ultra_1920h.png -- Witcher 3 ultra benchmark chart, 1920x1080]
 
I don't see the 960 anywhere near the 780 Ti or Titan, do you? What I do see is the inherent weakness of Kepler being exposed in some situations vs Maxwell, and here's a shocker for you: technology evolves! Your 2-year-old card may perform worse in SOME situations vs Maxwell or GCN, get over it. This is one game of many that are out there, and the difference between a 290X and 780 Ti in this game isn't much at all.

Here are some sites you can check out...
http://techbuyersguru.com/hotdealsblog/?p=4119

http://www.guru3d.com/articles_pages/the_witcher_3_graphics_performance_review,5.html

http://www.pcgameshardware.de/The-Witcher-3-PC-237266/Specials/Grafikkarten-Benchmarks-1159196/

Updated with the newest patch and latest drivers.

I really hope H does a review, then I can add one more site to my list.

Edit: oh, and in the first graphs you posted, the difference between a 960 and the Titan and 780 is about 4-6 FPS, and the R9 290 is better than a 780 Ti! I mean, that is unbelievable.
"Here's a shocker for you: technology evolves" -- and Kepler technology must have devolved. I remember when Kepler was beating the R9 290 (a very old card, close to the same age as Kepler), so now I am to believe that either AMD's hardware is better than Nvidia's, or AMD has better drivers than Nvidia, or Nvidia threw Kepler under a bus. I wish I knew which one it was, but in the end all of them are still pretty bad.
 
Wait wot... the 295x2 has a lower framerate than the 290x? LMAO! Ya, so not buying this game.
 
Joker is just being a mean-spirited know-it-all. No matter how many links you put up, he will insist "there's nothing to see here -- you're all idiots"...

There is mounting evidence that he is wrong, so let it play out.
 
One thing that seems interesting is the big performance boost people are reporting when reverting to old drivers. Obviously that's not a great fix overall, and it's even worse if you have an SLI rig, but I'd definitely say it's a good sign. Gotta love a "Game Ready" driver totally crippling performance though, if it is true.
 
Wait wot... the 295x2 has a lower framerate than the 290x? LMAO! Ya, so not buying this game.
CrossFire isn't working. There should be a new driver sometime next week, per AMD_Roy, to fix this.

It's not the game's fault, really. Not sure why you'd blame CDPR for that.
 
Witcher 3 and Project Cars aside for the moment, there are 2 other games in which the 960 comes within 10% of a 780 -- COD:AW and Far Cry 4.

[Image: cod_aw_1920_1080.gif -- COD: Advanced Warfare benchmark chart, 1920x1080]
[Image: farcry4_1920_1080.gif -- Far Cry 4 benchmark chart, 1920x1080]


Perhaps it's worth finding out what (if anything) these 4 games have in common that results in Kepler not performing up to par.
 
http://www.reddit.com/r/witcher/comments/36jpe9/how_to_run_hairworks_on_amd_cards_without/

Interesting: basically you can use Catalyst to limit tessellation to 8x or lower, negating the immense HairWorks drop for AMD cards.

It's such a moving target...

You can also edit the .INI file to scale it down manually, AND it can be overridden in the drivers. Between 64x and 8x there appears to be very little visual difference, but a huge FPS difference. Then add 8x AA on top of that, on something that gets mostly hidden anyway. It's getting ridiculous.

So.... who knows where GameWorks defaults AMD. Over-tessellation is getting old.

... and even Nvidia's last-gen cards can be victims, for that matter, given the FPS difference with older drivers.

With just HairWorks as an example, benchmarking this game is more or less a guessing game between what you think your settings are and what you are actually seeing onscreen, IMO.
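For reference, the ini edit mentioned above is a one-line change; a hedged sketch of the commonly cited key (the name is recalled from community posts -- verify against your own `bin/config/base/rendering.ini`, since it can change between patches):

```ini
; rendering.ini (Witcher 3) -- illustrative, not a definitive reference.
; HairWorksAALevel sets the MSAA applied to HairWorks strands; the shipped
; value is reportedly 8, and dropping it to 4 (or 0) recovers FPS with
; little visible difference.
HairWorksAALevel=4
```

The tessellation factor itself, by contrast, is capped driver-side via the Catalyst override on AMD; no equivalent ini key for it has been reported.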
 
There is an ini setting somewhere in the game folder...

That ini setting is for antialiasing of hair based on what is being reported. Currently, there is no way to adjust the tessellation amount regarding hair for Nvidia cards, like you can through the CCC for AMD.

...unless that development came at some point today and I missed it. :)
 
Has there been a console port in the last ten years that didn't initially run like sheet on the PC? This sounds like Dying Light and Ass Creed all over again.
 
Far from it. AC:U was botched on both sides from day 1: the textures used on low were 'runescapey', the high details ran like a dog, and the game was barely playable at any level. As far as I could tell, TW3 only has real issues when HairWorks is turned on, and HairWorks does not do a whole lot either. On medium or high details the game runs fine for the most part (at least much better than AC:U, which is ironic given how much bigger a franchise and company AC and Ubisoft are compared to CDPR).
 
Has there been a console port in the last ten years that didn't initially run like sheet on the PC? This sounds like Dying Light and Ass Creed all over again.

Ugh... GTA V runs pretty well on my lowly GTX 680. Maybe I'm too docile and don't know if I should be complaining.
 
That reddit post, yes, though I can't comment as I completely skipped Kepler (went from a 570 straight to 970s).
 

The contortions are amazing. Wouldn't it be far easier to just compare the 780 (before) to the 780 (after) rather than comparing the 780 to the 290 and the 780 to the 970? Like let's introduce as many variables as possible to fog up the logic.

The implication is that the drivers for Kepler got worse -- easily provable or disprovable -- rather than the possibility that AMD and Maxwell's drivers just got better, which would certainly be quite ordinary and hardly in need of a 3,000 word reddit post to believe.
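The single-variable comparison suggested here is easy to make concrete; a minimal sketch (the FPS numbers are invented for illustration, not measured):

```python
# Compare the same card (e.g. a 780) against itself, changing only the driver.
# One variable, one conclusion; no cross-vendor or cross-architecture noise.
def pct_change(before_fps: float, after_fps: float) -> float:
    """Relative change in average FPS, as a percentage."""
    return (after_fps - before_fps) / before_fps * 100.0

# Hypothetical averages for the same scene on old vs. new drivers:
old_driver_fps, new_driver_fps = 45.0, 36.0
delta = pct_change(old_driver_fps, new_driver_fps)
print(f"{delta:+.1f}%")  # a negative number means the new driver regressed
```

Run the same scene, same settings, same card, swapping only the driver, and the "Kepler got worse" claim is settled either way.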
 
Man, this sucks. I remember when the consoles came out as AMD products; I was pretty happy that games would be optimized for AMD hardware, but they are still optimized for NVIDIA. That means the consoles will have a hard time with the game, and we can't have a PC game that looks better than consoles, so it's time to downgrade the graphics until it plays on AMD hardware... everyone is losing.

The vast majority of games are developed with consoles in mind first, so console optimization isn't going to be an issue.
 
The contortions are amazing. Wouldn't it be far easier to just compare the 780 (before) to the 780 (after) rather than comparing the 780 to the 290 and the 780 to the 970? Like let's introduce as many variables as possible to fog up the logic.

The implication is that the drivers for Kepler got worse -- easily provable or disprovable -- rather than the possibility that AMD and Maxwell's drivers just got better, which would certainly be quite ordinary and hardly in need of a 3,000 word reddit post to believe.

Yeah, I have to agree that there are many variables, but from the looks of it, what that guy posted is pretty crazy.
It looks to me like AMD has been producing better hardware or drivers or something. Nvidia is not optimizing drivers for Kepler and at the same time, in those games, not optimizing drivers for Maxwell? Up until now, that is, as Maxwell is hitting AMD in The Witcher 3.

In any case, I am pretty mad I bought a Kepler 780; I would have been much better off with a 290X or 290 to hold me over until 14nm.
 
The vast majority of games are developed with consoles in mind first, so console optimization isn't going to be an issue.

See, for me though, it is hard to imagine that games are made by devs for the PC with GameWorks features, and at the same time those games are ported to AMD consoles.

Edit: Tinfoil hat and all, but maybe the consoles are why AMD cards have been getting better and better performance, as devs get more efficient at optimizing for the consoles.
 
See, for me though, it is hard to imagine that games are made by devs for the PC with GameWorks features, and at the same time those games are ported to AMD consoles.

Edit: Tinfoil hat and all, but maybe the consoles are why AMD cards have been getting better and better performance, as devs get more efficient at optimizing for the consoles.

They aren't ported to AMD consoles. They are ported from console to PC, with GW features tacked on.
 
They aren't ported to AMD consoles. They are ported from console to PC, with GW features tacked on.

Come on, man... they make the game on PC (in engine) and then port/optimize it to consoles. That is why all those games looked so good in the demos before the new consoles were announced. Devs must have expected super consoles but got the PS4 and XBone (time to downgrade those games and get them ready for consoles).
 
The implication is that the drivers for Kepler got worse -- easily provable or disprovable -- rather than the possibility that AMD and Maxwell's drivers just got better, which would certainly be quite ordinary and hardly in need of a 3,000 word reddit post to believe.

Indicating that Nvidia doesn't support its older cards (and by older I mean 1 year older) as well as its current ones.
 
Come on, man... they make the game on PC (in engine) and then port/optimize it to consoles. That is why all those games looked so good in the demos before the new consoles were announced. Devs must have expected super consoles but got the PS4 and XBone (time to downgrade those games and get them ready for consoles).

I don't think you understand how this works. Just because games are made ON a computer doesn't mean they are ported to consoles. Every smart phone app is also made on a computer, doesn't mean it was made FOR a computer then ported to a phone. Developers knew console hardware well before we did.
 
Has there been a console port in the last ten years that didn't initially run like sheet on the PC? This sounds like Dying Light and Ass Creed all over again.
Ugh.

1) TW3 isn't a console port.

2) The game doesn't run badly at all. The internet is just butthurt because some visual effects don't look as good as a trailer from 2 years ago when the game was nowhere near done and they had to change rendering engines since then. The only setting that tanks framerates is HairWorks and the game still looks gorgeous with it off.

Way too much stupid drama over such an amazing game.
 
It's the grass.
Edit the INI and disable grass entirely and then watch how your GPU crushes this game.

I mean, obviously, don't play the game like that. Just proving a point.
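The grass trick above is also just a config edit; a hedged sketch (key names recalled from community tweak guides -- check your own config files, since names and sections vary by patch):

```ini
; user.settings, [Foliage] section (Witcher 3) -- illustrative values only.
; Zeroing grass density/draw distance removes grass entirely, which is
; useful only for demonstrating how much GPU load it accounts for.
GrassDensity=0
GrassDistanceScale=0
```

Restore your defaults after testing.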
 
Now, this is bullshit from AMD.

http://siliconangle.com/blog/2015/0...s-nvidia-of-sabotaging-witcher-3-performance/

It took 36 hours for random internet guys to figure out how to make HairWorks not perform terribly in Witcher 3 -- and HairWorks has NOTHING to do with CrossFire not working. In those two months, did they not even bother to try two cards and figure out it was not scaling at all?

Unless the 390x is tremendous and cheap - I am done with AMD. I cannot stand the excuses on the driver end anymore.
 
Indicating that Nvidia doesn't support its older cards (and by older I mean 1 year older) as well as its current ones.


It's a driver-related issue; using older drivers, Kepler cards run just fine.
 
Now, this is bullshit from AMD.

http://siliconangle.com/blog/2015/0...s-nvidia-of-sabotaging-witcher-3-performance/

It took 36 hours for random internet guys to figure out how to make HairWorks not perform terribly in Witcher 3 -- and HairWorks has NOTHING to do with CrossFire not working. In those two months, did they not even bother to try two cards and figure out it was not scaling at all?

Unless the 390x is tremendous and cheap - I am done with AMD. I cannot stand the excuses on the driver end anymore.
I try not to read too much into GameWorks controversies (vs AMD) given what we know about the program. AMD had their GTA 5 driver ready the day before launch, including CrossFire profiles. GTA 5 has a tessellation option in the video settings. It would be nice if TW3 had the same setting (for HairWorks and otherwise).

If you really care about getting the most out of GameWorks games, then you have to buy Nvidia... Congratulations, that was their goal all along. I still fully believe GameWorks was created for the sole purpose of disenfranchising AMD. The same might be said for Gaming Evolved, I guess? The fact that you're sitting there blowing shit at AMD over their problems in a fully Nvidia-sponsored title is exactly what Nvidia wants you to do.

I don't agree with Huddy's response this time, though. HairWorks' tessellation problems are AMD's fault... Yes, maybe Nvidia is a scumbag for exploiting it. Perhaps the next Battlefield game should use FP64 so it runs like shit on Nvidia cards, then we can blame Nvidia for not optimizing their drivers. I don't even know if that makes sense conceptually, just making a point.

It's funny because you can see the negative effects tessellation is having on Kepler compared to Maxwell. That's what started the outrage over Nvidia crippling Kepler cards in their new drivers.
People are dumb... They would rather be angry for no reason than be told what the real issue is.
 
I try not to read too much into GameWorks controversies (vs AMD) given what we know about the program. AMD had their GTA 5 driver ready the day before launch, including CrossFire profiles. GTA 5 has a tessellation option in the video settings. It would be nice if TW3 had the same setting (for HairWorks and otherwise).

If you really care about getting the most out of GameWorks games, then you have to buy Nvidia... Congratulations, that was their goal all along. I still fully believe GameWorks was created for the sole purpose of disenfranchising AMD. The same might be said for Gaming Evolved, I guess?

The fact that you're sitting there blowing shit at AMD over their problems in a fully Nvidia-sponsored title is exactly what Nvidia wants you to do.
I don't agree with Huddy's response this time, though. HairWorks' tessellation problems are AMD's fault... Yes, maybe Nvidia is a scumbag for exploiting it. Perhaps the next Battlefield game should use FP64 so it runs like shit on Nvidia cards, then we can blame Nvidia for not optimizing their drivers. I don't even know if that makes sense conceptually, just making a point.

The issue with HairWorks is tessellation, and it's funny because you can see the negative effects it's having on Kepler compared to Maxwell. That's what started the outrage over Nvidia crippling Kepler cards in their new drivers. People are dumb... They would rather be angry for no reason than be told what the real issue is.

The HairWorks tessellation was crippling all cards. The issue is that each strand of hair had some absurdly high level of MSAA applied to it. By using the AMD control center to force tessellation to 8x, and changing the .ini to 8x, my performance with HairWorks is actually good.

The issue is not really about HairWorks for anyone, though. The issue is basic CrossFire compatibility being a problem time and time again -- Witcher 3 included. There is no reason at all that CrossFire didn't work with a specific driver release on day 1, when AMD had the game for a couple of months to try and figure it out.
 
When has CrossFire EVER been properly supported?
We're talking about nearly 10 years of bad history here.

I don't see AMD blaming GameWorks or CDPR for that. The article doesn't even mention CrossFire.
 
When has CrossFire EVER been properly supported?
We're talking about nearly 10 years of bad history here.

I don't see AMD blaming GameWorks or CDPR for that. The article doesn't even mention CrossFire.

And that's my problem. AMD's mantra, instead of offering single, extremely expensive GPU solutions, is to try and compete with more budget-oriented dual-GPU solutions. The fact that CrossFire doesn't work, that AMD knew it didn't work, and that they are not even acknowledging that as the main issue versus an optional Nvidia HAIR feature is my main problem right now.

My single-GPU 290 performance with Witcher is not an issue; especially with the HairWorks workaround, AMD is actually outperforming some of the Nvidia counterparts. With ultra settings, 1440p, most post-processing off and HairWorks on, I'm getting 40-50 FPS on a single card.

Crossfire was properly supported day one by GTA5, and has had some successes prior. But it simply comes down to the effort put forth by the AMD driver team, which has historically disappointed.
 
The HairWorks tessellation was crippling all cards. The issue is that each strand of hair had some absurdly high level of MSAA applied to it. By using the AMD control center to force tessellation to 8x, and changing the .ini to 8x, my performance with HairWorks is actually good.

The issue is not really about HairWorks for anyone, though. The issue is basic CrossFire compatibility being a problem time and time again -- Witcher 3 included. There is no reason at all that CrossFire didn't work with a specific driver release on day 1, when AMD had the game for a couple of months to try and figure it out.

But when you look at the images of HairWorks at 16x, 8x, 4x, and especially 2x, doesn't it change the artistic quality of the hair? One of the images I looked at with 2x made it look like Geralt had dreadlocks. If so, that would be against the designer's intentions and NOT a fix.
 