Nvidia Responds To Witcher 3 GameWorks Controversy, PC Gamers On The Offensive

I am saying it would be a small niche if the major reason for PC gaming being popular was graphics.

Still sounds like two separate arguments, but we can go ahead and talk about this now if you want... Define "small"? It's big enough for AMD and Nvidia to release expensive graphics cards every year and even hold media events to talk about them. It's big enough for MS to get off its ass and develop DX12, and it's big enough that games that cost millions to make are being released for PC with PC-specific features added in.

I just don't see the point you're making.
 

The AAA games are still built specifically for consoles. Very few people play those games on PC, with the exception of a few really good titles like the BFs and GTAs.
They most likely make most of their profit from consoles too.


Look at any statistic like this one - http://www.statista.com/statistics/251222/most-played-pc-games/

So the major reason PC gaming is a thing is because of these games, which aren't on any other platform and are awesome. When someone buys a PC to play these games, they would probably invest $200 - $400 in a GFX card too, because why not? It adds the benefit of AAA games too, not the other way around.

If you take out these games, I don't see people spending triple the amount over a console for a very similar experience with less graphical goodness (except enthusiasts, of course).
 

The fact they are ported over to PC isn't relevant to your point, though. Which I'm not really sure what it is, tbh.

First you said PC gamers complain too much.
Then it was that PC gaming is a small niche.
Now it's that PC games are ports and console is more profitable.

What is your point exactly?

You're wrong about the whole complaining part; you just need to visit console forums to see that for yourself.

We all know the console market for AAA games is bigger than PC, you aren't enlightening anyone there.

We know that many games are ported over to PC.

And obviously, if there were no demanding games, people wouldn't buy high-end hardware.

I just don't get the purpose of your arguments here. They range from completely untrue to completely obvious, and you seem to be all over the place with what it is you're actually trying to say, which I'm still unsure of.

Candy Crush is more popular than just about any game on all the platforms combined; that doesn't make the other platforms irrelevant. PC games don't exist out of the kindness of developers' hearts. They exist because they're still profitable. With or without the complaining, which happens across all platforms.
 

Well, this all started with me just saying: is it even such a big deal if HairWorks is doing that shit? Graphics aren't the most important thing in a game, by far.
 
I think he's taking shots at PC by basically saying the only thing it has over a console is higher resolution. And you would have to be a pretty big moron to believe that.

He should read this

https://www.reddit.com/r/pcmasterrace/wiki/guide

Nah, I'm saying that PC gamers bitching over small shit makes the whole community look bad. And I follow that subreddit. Maybe I said it too strongly / wrongly, so that'd be my bad.
 

Well how would you suggest that PC Gamers express to developers that 11 fps in a racing game isn't fun, and running tessellation at 64x is making the game unplayable? Send them care packages with cake, milk, and cookies?

I think that most developers have come to the realization that the more passionate PC gamers are about their creation, the more they love their work. Criticism can be a good thing if it gives you the proper feedback to create a better game and experience for your audience. As long as the developer engages with the fanbase in a positive manner, the money will roll in from increased sales. Watch some DICE videos on critical feedback and BF4 changes. They print money over there.
 
Bioware as well.

Engaging your fan base is THE way to get more fans...
 

We are talking about The Witcher 3 here; AMD is working on a fix, according to them, and in the meanwhile you can play with HairWorks off. Maybe hardware other than Maxwell isn't capable of delivering the same visuals / performance of HairWorks in this game.

Do you agree that if there was no setting for HairWorks at all and it was off by default, people would say it is an amazing launch?


It would be crappy if they did shit like Ubisoft.
 
I expect CDPR to release a patch to add a quality slider eventually.

In the meantime, you don't have to turn it off.

Change Tessellation to 8x in CCC, and you are good to go.
 

Is this tessellation CPU-bound in some way? I changed it to 4x and my frame rate still dropped massively into occasional stutters, and a 290 shouldn't be the problem.

It drops from the 50s to 25 - 35 with the CCC setting. Am I doing something wrong?
 
You are on an i3; you are going to be CPU-bound in general.

Matching that with a 290 is a complete waste of GPU power.

Get an i5 in there already.
 

My 4770K is on its way, and it's not as bad as you're making it sound; i3s are okay until you go into heavy CPU-bound games territory.

But is this particular setting CPU-bound in general, though?

Like there was PhysX in the Batman game, and anyone with an AMD GPU with PhysX turned on had massive lag during one of the missions where you meet Harleen Quinzel (if I remember).
 
Anything with advanced physics (PhysX, Bullet, or Havok) is going to choke on that i3.

It does have PhysX, but there is no GPU offload for Nvidia, so it's a pretty level playing field, unlike Batman.
 
Is HairWorks PhysX?
It's shaders (GPU), heavily dependent on tessellation performance. It's more of a problem on AMD GPUs, but the performance hit will probably also force older Nvidia card owners to turn it off too.

They fixed that a while ago...
http://images.anandtech.com/graphs/graph5261/43145.png
http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/19

Or did they!
 
Anyone who's followed The Witcher 3's development from the first teasers and trailers to the finished product has likely noticed not-quite-subtle changes to the graphics—and definitely not for the better. A NeoGaf poster whipped up a quick comparison, and the differences are rather stark, I'd say. And this guy agrees.


http://techreport.com/news/28314/the-witcher-3-developer-explains-controversial-graphics-downgrade

probably gotta thank Obama for that.

Btw, has anyone put SweetFX on this game yet? Might have some good improvements with that.
 
Hmmm... patch 1.03 must have fixed something in HairWorks.

I get mid-50s with all Ultra / all post-processing / HBAO+ / 8x tess at 1080p on a single 290pro.
 
Ah there we go, tessellation was over the top at 64x...
Not over the top, but it looks much better at that setting. You can't see it in a still photo unless it's been reduced down to stick segments like the 8x picture.

It's less of a problem on nvidia cards since those have about 2-4x the 64x tessellation performance vs GCN 1.1 cards: http://images.anandtech.com/graphs/graph8526/67750.png

The problem could have been avoided if there was an easy way to adjust tessellation in HairWorks, as the AMD control panel allows, like the game offering higher and lower quality settings for HairWorks. But it looks like that's an easy fix for AMD card owners to use if they want HairWorks enabled.
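To put the 8x override in perspective, here is a back-of-envelope sketch of the geometry cost. The strand count and the assumption that each strand is tessellated into roughly `tess_factor` segments are mine for illustration, not actual HairWorks internals:

```python
# Back-of-envelope: how much extra geometry does HairWorks at 64x
# tessellation push vs the 8x driver override? The strand count and
# the one-segment-per-factor-unit model are illustrative assumptions,
# not CDPR's or Nvidia's actual numbers.

def strand_vertices(num_strands: int, tess_factor: int) -> int:
    # A strand tessellated into tess_factor segments needs
    # tess_factor + 1 vertices.
    return num_strands * (tess_factor + 1)

strands = 10_000                    # hypothetical strands on one model
v64 = strand_vertices(strands, 64)  # vertices at the 64x default
v8 = strand_vertices(strands, 8)    # vertices at the 8x override
ratio = v64 / v8                    # roughly 7x the vertex work
```

Under these assumptions, the 64x default pushes about seven times the per-strand vertex work of the 8x override, for extra detail that is hard to see in motion. That would explain why capping tessellation in CCC recovers so many frames at so little visual cost.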
 

AMD have been working on their tessellation performance in recent years, with their new Tonga, GCN 1.2 architecture doubling its performance under tessellated loads, though right now AMD have only one GPU using this architecture. AMD will need to refresh their high-end lineup soon with new silicon to decrease the effect of these kinds of issues in the future.

Wow, and they wrote this down and didn't think twice. Nvidia introduces a useless feature which adds nothing to gameplay, and then AMD has to update their hardware because of what Nvidia does to one game.

What a weird conclusion for a tech website...
 
It's not Nvidia's fault that AMD sucks at tessellation.

You mean as long as you can put _useless_ code into a game which does not benefit gameplay (nowhere in any review will you read that tessellation at 64x is mandatory, otherwise the game looks like crap).

And how exactly is this justified? You can just turn it down with the AMD drivers anyway. A rather simple solution to Nvidia stupidity.
 

People keep saying "NVidia cards" but that's not really the case.
780 is a dog in this game.
AMD didn't make this card.
um...
 
Where did I say any particular model will make every gamer happy? The game is graphically intensive, especially in Ultra quality settings with AA. Driver updates will probably fix performance issues with AMD and nvidia cards if something can be tweaked to improve it. Give it a couple of weeks.

TIP: A way to play games optimally with one weird trick: scale the game settings to the GPU it's running on until you're happy with the performance. That varies based on individual tastes. Some like higher resolution, some like fiddling with quality settings and some go ballistic when they see a jaggy so higher AA is important.
 

Be obtuse if you must.
Kepler owners are getting fucked just as much as AMD users.
 
I'm rubber and you're glue... Give it a couple of weeks and chill out:

https://forums.geforce.com/default/...he-witcher-3-wild-hunt-/post/4533205/#4533205

ManuelG Customer Care said:
That is untrue. Our software team is investigating the reported issue. I will update everyone as soon as I have more information.

Notice that was posted within a day of the "Nvidia won't do anything to improve 700 series cards performance/sabotage!!!!111" :ALLTHECRYING: forum babble starting. :rolleyes:
 

Come on, it's Nvidia we are talking about here. I remember when the 970 was first found to have issues. Nvidia support was talking about refunds. Then fast forward a day or two, and they are calling the 3.5GB a feature, not a chance of a refund, unless you live in Europe where they have laws for the customer.
Oh, I also remember that time they disabled overclocking for mobile GPUs; a day or two after, they said they would fix it and allow overclocking again. Fast forward a bit and the next drivers blocked overclocking again.

And just an FYI, Kepler has been getting worse and worse; hell, I remember when an overclocked 780 could beat a 290X, and now a 780 Ti is slower than a 290.

You can't tell me that they are going to fix my problems, because recent history shows they won't.
 
Nvidia: "See guys? They're going to fix the drivers, just give them time."
AMD: "They need time to fix their drivers?! It should work fine on DAY ONE, they are the worst company ever. Their drivers are so awful."
:rolleyes:

Let's rewind to last week when AMD said they needed time to fix their Project Cars driver.
At least this confirms our suspicions: GameWorks sucks for everyone, including Nvidia owners (Kepler, anyway).
 
What the hell is the baseline here? Everyone assumes their performance is being gimped when they have no knowledge of what's actually going on beyond what's in their heads. Then they say that their concern is legitimate because they're doing it for "the gaming community"! Everything the PC gaming community seems to do just reinforces the notion that we're a bunch of overgrown children who refuse to pay for content while spending magnitudes more on hardware, and then whine about not getting our money's worth out of it.
 

So you're saying:
a) Nvidia didn't issue a mass refund on a product line,
b) Changed their stance on mobile GPU overclocking,
c) AMD has closed some of the driver performance gap that used to exist,

Therefore the correct assumption is that Nvidia won't be releasing driver updates that improve Kepler performance in Witcher 3?
 
Why would they? They already have your money from Kepler, they want you to buy something new. If they improve their drivers so their older cards have a smaller performance gap to their newer ones, what's the incentive to upgrade? Imagine Apple bringing out a new iOS that used less memory and ran faster on older iPhones. It's never happened for a reason.
 
Nvidia: "See guys? They're going to fix the drivers, just give them time."
AMD: "It's all Nvidia's fault bawwwwwwwww"

ftfy :D

The Kepler problem is unusual GPU utilization, not GameWorks. Little things like accuracy probably don't matter.
 
Why would they? They already have your money from Kepler, they want you to buy something new. If they improve their drivers so their older cards have a smaller performance gap to their newer ones, what's the incentive to upgrade? Imagine Apple bringing out a new iOS that used less memory and ran faster on older iPhones. It's never happened for a reason.

QFT
 

No, what I am saying is that anything you hear from an Nvidia rep on their forums should be taken with a sea of salt. The two things I mentioned are examples of why.

As far as AMD closing a gap in drivers, I find it hard to believe unless the consoles are finally helping AMD hardware shine. I do, however, strongly believe that a company will drop support for older products; in this case I have a really bad taste in my mouth because I bought a 780 instead of a 290 or 290X.
So to sum that up, I could have paid less and had more performance for a longer time. Seeing as the 290X/290 will be seen in the 300 series, it would have been really smart to get one of those cards, as the support will be with them until 14nm (what I am waiting for). Instead I got to pay more and have support dropped for Kepler by the leading company in drivers etc., etc., all that good stuff. I will not make the same mistake again.

Edit: Heck, I could still be running a 7970 or have bought a 280X/285, seeing as they will most likely be better than a 780 in the near future.
 

Yep.
People that bought a launch 290X got their money's worth, and will continue to get their money's worth.
Somewhere in the mining craze it got hazy, but after that there were some wicked deals on these cards. Those people got their money's worth too.
780 owners... fucked. Totally fucked. Hell... can you imagine being an original Titan owner? Talk about ass fucked with no Vaseline.
 

Just want to say you are right to feel something is off. How long has it been now that the Kepler concern has been discussed? Witcher 3 with the tessellation issue helped give it a bit more light.

Fact is, the 780 Ti was definitely ahead of the 290X, granted not by a mile, but still ahead. Now we see the 780 Ti barely keeping pace with the 290 non-X. Something is off. Maybe Kepler has hit a wall and drivers can no longer give it the performance increases they do with Maxwell or GCN. But GCN is not new either, and it's enjoying increases equal to Maxwell's. Something is definitely off.
 