Nvidia Responds To Witcher 3 GameWorks Controversy, PC Gamers On The Offensive

I need to see some actual evidence that nvidia is making performance worse over time for Kepler.

Like driver #1 is faster than driver #2 is faster than driver #3, where #3 is the latest in the series, etc.

And I think we need a bigger sample size than four games, two of which just came out.

If you revert to 347.88 drivers you get better performance in the Witcher 3 than with the release day Witcher 3 drivers.

Why do you need evidence, by the way? Are you planning on buying a 7xx series card, or do you own one? I own a 7xx series card and this is a problem for me; the evidence is in how the new games I play run like shit on my card, not only in overall FPS but in the hard drops I get while playing too.
 
If you revert to 347.88 drivers you get better performance in the Witcher 3 than with the release day Witcher 3 drivers.

Why do you need evidence, by the way? Are you planning on buying a 7xx series card, or do you own one? I own a 7xx series card and this is a problem for me; the evidence is in how the new games I play run like shit on my card, not only in overall FPS but in the hard drops I get while playing too.

Ok. So there was a regression between two different branches. Big deal. You're making it sound like they disabled a quarter of the SMs or something. Just use 347.* until they fix it.
 
Ok. So there was a regression between two different branches. Big deal. You're making it sound like they disabled a quarter of the SMs or something. Just use 347.* until they fix it.

The fps bump from the 347 drivers is minimal, nothing close to the loss of performance that has taken over the 7xx series in general. If people choose not to see that, that is their right. All the evidence is everywhere.
Like I said before, as you can probably tell, this is a big deal to me because I own a 7xx series card. If it is not a big deal for you, that is fine with me, but please don't tell me it is not a big deal to me.

Edit: here is that reddit article about how the 7xx series has gotten worse: link. Either Nvidia stopped supporting Kepler or they made it slower on purpose; from the looks of it, they did both.
 
The fps bump from the 347 drivers is minimal, nothing close to the loss of performance that has taken over the 7xx series in general. If people choose not to see that, that is their right. All the evidence is everywhere.
Like I said before, as you can probably tell, this is a big deal to me because I own a 7xx series card. If it is not a big deal for you, that is fine with me, but please don't tell me it is not a big deal to me.

I can see it's a big deal to you. You seem to think your $450 video card is "literal shit". I'm just trying to get some kind of understanding what the hell you're talking about. What is the "loss of performance that has taken over the 7xx series in general"?

Are you saying that when the cards were benched in 2013 they were performing better than when benched in 2014 and 2015 (in the exact same games)? By how much?
 
Edit: here is that reddit article about how the 7xx series has gotten worse: link. Either Nvidia stopped supporting Kepler or they made it slower on purpose; from the looks of it, they did both.

That link has already been discussed. It's an idiotic post designed to fog the issue and any time someone deals with percentages to make relative comparisons you know you're dealing with a person trying to obscure some obvious fact.

AMD drivers got better. No kidding. They were starting from the bottom of the barrel.
 
I can see it's a big deal to you. You seem to think your $450 video card is "literal shit". I'm just trying to get some kind of understanding what the hell you're talking about. What is the "loss of performance that has taken over the 7xx series in general"?

Are you saying that when the cards were benched in 2013 they were performing better than when benched in 2014 and 2015 (in the exact same games)? By how much?

Dude, a 280X is going head to head with and even beating the 780 in games. That means a 7970 is beating a 780. Do you remember when the 780 was the best card available? Just to get the point through: now a 7970 is going head to head with it. A 7970.
How much more proof could someone need?
 
Dude, a 280X is going head to head with and even beating the 780 in games. That means a 7970 is beating a 780. Do you remember when the 780 was the best card available? Just to get the point through: now a 7970 is going head to head with it. A 7970.
How much more proof could someone need?

Show me the game where:

- One year the 780 was beating the 7970.
- Another year the 7970 was beating the 780.

(FOR THE SAME GAME.)

All I'm asking for here is a little bit of DETAIL rather than a bunch of hand-waving.
 
Show me the game where:

- One year the 780 was beating the 7970.
- Another year the 7970 was beating the 780.

(FOR THE SAME GAME.)

All I'm asking for here is a little bit of DETAIL rather than a bunch of hand-waving.

That is not how it works; if you got lower performance in all past games, we would need a 980 to run Crysis. How you tell that the drivers are getting worse is by looking at new games that come out and comparing them on hardware of the same age, and also by comparing the performance gaps in older games on the same hardware.

Detail is everywhere. If you don't want to see it, that's fine.

Edit: And just to be clear, the 7970 is older hardware than the 780. Please don't tell me that the 7970 somehow got better and that AMD is the new driver king of the world.
What happened is dropped support, expedited by no optimization for Kepler, and that is downgrading a card's potential and performance through drivers.
 
I can't believe you made me do this, but...

I checked the performance of the 780 vs. 7970 GHz in a new game (GTA V) and the performance difference between the two is about the same as it was for at least half of the games benched in AT's 780 launch review. There were some games in that review where the 7970 was better than the 780.
 
That is not how it works; if you got lower performance in all past games, we would need a 980 to run Crysis. How you tell that the drivers are getting worse is by looking at new games that come out and comparing them on hardware of the same age, and also by comparing the performance gaps in older games on the same hardware.

Detail is everywhere. If you don't want to see it, that's fine.

Edit: And just to be clear, the 7970 is older hardware than the 780. Please don't tell me that the 7970 somehow got better and that AMD is the new driver king of the world.
What happened is dropped support, expedited by no optimization for Kepler, and that is downgrading a card's potential and performance through drivers.

Filip, you are acting like a child who doesn't have the toy but complains about it anyway (sorry, had to say it), trying to make fun of the kid that has the toy to justify that your own toy is better; my 9 year old niece doesn't even do that. nV supports their cards for around 3 generations, similar to AMD. So what happens when AMD stops supporting their cards? Don't worry about it; by then you should have a new card.

The 780 was EOL'd more than 9 months ago, so no one will be buying them any more other than second hand. Driver bugs happen; it will be fixed.
 
Scali responds to Richard Huddy's accusations regarding The Witcher 3, citing a flaw he identified back in 2010.

In other words, it has been known since 2010 that AMD's rendering pipeline chokes on its own design, and AMD has done nothing to rectify the situation except add a hack which for all intents and purposes should have invalidated the driver ever being WHQL'd again.

Scali is as biased as they come. Forced driver Vsync is also a hack; you don't see him jumping on NV for that.
Scali started his blog after being banned from several hardware enthusiast forums.
 
Scali is as biased as they come. Forced driver Vsync is also a hack; you don't see him jumping on NV for that.
Scali started his blog after being banned from several hardware enthusiast forums.


I don't particularly care where he was banned from, the point is AMD has a flaw in its rendering pipeline and has not fixed it in the 5 years since it was first identified.

They were quick enough to fix the AF flaw in the 5xxx series cards though after it was identified.

As for driver vsync, both ATI and Nvidia have had it for years, going back to the early 2000s, and I cannot even recall who added it first. The factor hack was added to work around a weakness in AMD's designs which they should have addressed in more recent GPUs.
 
Scali is as biased as they come. Forced driver Vsync is also a hack; you don't see him jumping on NV for that.
Scali started his blog after being banned from several hardware enthusiast forums.

Still, the fact remains he is a very good programmer and found the flaw without source code. You think AMD engineers aren't as good as Scali, then? It shows the flaws on AMD's side too.

AMD's Tonga did address this flaw for the most part, btw.
 
Filip, you are acting like a child who doesn't have the toy but complains about it anyway (sorry, had to say it), trying to make fun of the kid that has the toy to justify that your own toy is better; my 9 year old niece doesn't even do that. nV supports their cards for around 3 generations, similar to AMD. So what happens when AMD stops supporting their cards? Don't worry about it; by then you should have a new card.

The 780 was EOL'd more than 9 months ago, so no one will be buying them any more other than second hand. Driver bugs happen; it will be fixed.

I have the toy (GTX 780); I can send you pictures of me with it so you can put them as your desktop background. I don't understand what your point is. Do I not have a right to have my card perform as it should?
Later today I am going to make a chart or something to show all the naysayers the performance difference of the 780 over time. What I can't comprehend is people telling me it is OK for a company to do this to a product I purchased, two years after it was released and only 9 months after it was taken off the shelves.
 
I don't particularly care where he was banned from, the point is AMD has a flaw in its rendering pipeline and has not fixed it in the 5 years since it was first identified.

They were quick enough to fix the AF flaw in the 5xxx series cards though after it was identified.

As for driver vsync, both ATI and Nvidia have had it for years, going back to the early 2000s, and I cannot even recall who added it first. The factor hack was added to work around a weakness in AMD's designs which they should have addressed in more recent GPUs.

Since Vista came out MS changed the rules: forced driver Vsync was disabled for both AMD and NV. NV then later hacked in a workaround.

Anything more than 32x tessellation is a waste of resources, as is more than 4x AA on the hair; dedicate the hardware and resources saved to other areas.
 
I can't believe you made me do this, but...

I checked the performance of the 780 vs. 7970 GHz in a new game (GTA V) and the performance difference between the two is about the same as it was for at least half of the games benched in AT's 780 launch review. There were some games in that review where the 7970 was better than the 780.

Thanks for checking one game where the 7970 does not come up next to a 780. I will make a more conclusive chart to share with everyone. Then I wonder what people will tell me. :rolleyes:
 
https://www.youtube.com/watch?v=IYL07c74Jr4

As you can see, some objects are heavily tessellated, yet the amount of detail you NOTICE IN GAME is pretty much ZERO. Especially those alien shields and the big chunks of stone and rubble. You barely see them up close at ALL... So what is the reason for the heavy tessellation? The alien tentacles are tessellated but they are still EDGY in some spots, like that big hose/tube in the middle of them - it has EDGES, so why the thousands of polygons for gods sake?

Crytek should have chosen a better distance and a MUCH MUCH LOWER tessellation factor. Looking at the HEAVEN benchmark, a proper brick road can be achieved easily with 8x tessellation. On many OTHER objects some POM would have done MUCH better than ******* tessellation, like the white wall inside the one room with the light. POM would have done better performance-wise. But nooo, we gotta kill the performance with DX11... ***** sake crytek....

MUCH MORE IMPORTANT STUFF like characters and plants is not tessellated at all. Tessellation in Crysis 2 is a good idea, but it is being used not to show how pretty a game can look with more detail but to **** up the performance for ATI users. I can't find another explanation for some objects being tessellated as heavily as that.


Screw ******* NVIDIA, screw ******* Crytek's lies about developing for the people. The SDKs are a good way to recruit people and maybe find a better way to do something. If people end up selling their stuff Crytek gets paid for the license.

God damn ******* stone-cold-hearted moneymakers...
 
Maxwell handles tessellation better than Kepler, so in games with heavier tessellation Maxwell will perform better. Nvidia is not an 'evil' company; they're doing it because it looks better. Technology improves over time, and it's not their fault new cards are more efficient at certain things... Would you prefer we stop advancing graphics altogether just so Kepler can stay fast?

It's also not Nvidia's responsibility to worry about AMD's hardware. If you really want to complain about performance you should be whining to CDPR and Slightly Mad Studios, to take the most recent examples.
 
I have the toy (GTX 780); I can send you pictures of me with it so you can put them as your desktop background. I don't understand what your point is. Do I not have a right to have my card perform as it should?
Later today I am going to make a chart or something to show all the naysayers the performance difference of the 780 over time. What I can't comprehend is people telling me it is OK for a company to do this to a product I purchased, two years after it was released and only 9 months after it was taken off the shelves.


It's one game and one driver that is screwed up; if you think that's the norm, I think you should sell your card, be done with it, and not buy nV products. Pretty short sighted, when you have all other games performing the same as before.

Please make the chart; I want to see the last 4 drivers vs. a suite of at least 4 games. The best way for people to learn is to do their own work. And please take your time, it will take more than a day if you do it right.
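For what it's worth, a chart like that boils down to averaging FPS per game on each driver and comparing against the oldest driver as a baseline. Here is a minimal Python sketch of that tabulation; the driver labels and game list are just examples, and the FPS values are placeholders to be filled in from actual benchmark runs, not real results:

```python
# Minimal sketch: average FPS per game, per driver, compared against the oldest
# driver in the set. All FPS values are PLACEHOLDERS -- fill in your own runs.
fps = {
    "347.88 (older)": {"Witcher 3": 0.0, "GTA V": 0.0, "Crysis 3": 0.0, "BF4": 0.0},
    "release-day":    {"Witcher 3": 0.0, "GTA V": 0.0, "Crysis 3": 0.0, "BF4": 0.0},
    "latest":         {"Witcher 3": 0.0, "GTA V": 0.0, "Crysis 3": 0.0, "BF4": 0.0},
}
baseline = "347.88 (older)"

for driver, games in fps.items():
    if driver == baseline:
        continue
    for game, new in games.items():
        old = fps[baseline][game]
        if old:  # skip unfilled placeholders (avoids division by zero)
            print(f"{game:10s} {baseline} -> {driver}: {(new - old) / old * 100:+.1f}%")
```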
 
This is not how it's meant to be done.


[screenshots: z5aTQk.jpg, vrr3IR.jpg, w5Ax3z.jpg]
 
Maxwell handles tessellation better than Kepler, so in games with heavier tessellation Maxwell will perform better. Nvidia is not an 'evil' company; they're doing it because it looks better. Technology improves over time, and it's not their fault new cards are more efficient at certain things... Would you prefer we stop advancing graphics altogether just so Kepler can stay fast?

It's also not Nvidia's responsibility to worry about AMD's hardware. If you really want to complain about performance you should be whining to CDPR and Slightly Mad Studios, to take the most recent examples.

But it isn't just AMD; it's also Kepler. Then we also have to consider the level of Tess, and whether there is an actual visual rationale for 64x over 8x or even 16x. This isn't about whining so much as it is about EXCESSIVE TO WHAT END? Sorry, but there seems to be good reason for some questions and concerns.
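To put rough numbers on why the factor matters: with uniform tessellation of a triangle patch, the output triangle count grows roughly with the square of the factor. Exact counts depend on the partitioning mode, so treat this as a back-of-the-envelope sketch rather than engine math:

```python
def approx_triangles(factor: int) -> int:
    # Uniform subdivision of a triangle patch with edge factor N yields roughly N^2 triangles.
    return factor * factor

for f in (8, 16, 64):
    print(f"tess factor {f:2d}x -> ~{approx_triangles(f):4d} triangles per patch")

print(f"64x vs 16x: ~{approx_triangles(64) // approx_triangles(16)}x the geometry")
print(f"64x vs  8x: ~{approx_triangles(64) // approx_triangles(8)}x the geometry")
```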
 
This is not how it's meant to be done.


[screenshots: z5aTQk.jpg, vrr3IR.jpg, w5Ax3z.jpg]


Unfortunately, tessellation levels are harder to control on the same object; it's automatic and camera-distance dependent, like CryEngine's default water, which is all one object, as with Batman's cape. The reason you want small triangles is so that you get better flow of animated polys, but the cost is more geometry aliasing.
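For a sense of what "automatic and camera-distance dependent" means in practice, here is a rough Python sketch of the kind of distance falloff an engine typically applies to the per-patch tessellation factor. The linear falloff and the constants are illustrative only, not what any particular engine or game actually does:

```python
def distance_based_factor(distance: float,
                          near: float = 5.0,     # illustrative: full detail inside this range
                          far: float = 60.0,     # illustrative: minimum detail beyond this range
                          max_factor: float = 64.0,
                          min_factor: float = 1.0) -> float:
    """Linearly fade the tessellation factor between near and far camera distance."""
    t = (distance - near) / (far - near)
    t = min(max(t, 0.0), 1.0)                    # clamp to [0, 1]
    return max_factor + t * (min_factor - max_factor)

# A driver-side cap (roughly what CCC's tessellation override does) only changes the
# result for patches close to the camera; distant patches are already cheap.
for d in (2, 10, 30, 60, 120):
    f = distance_based_factor(d)
    print(f"distance {d:3d}: factor {f:5.1f} (capped at 16x: {min(f, 16.0):5.1f})")
```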
 
I don't care anymore for your BS; there are plenty of examples of tessellation levels done right on the same object with camera-distance dependence. You don't tessellate flat surfaces, period; there is no excuse, and the fact that I can use CCC to control tessellation levels means it can't be that hard.

Consider yourself ignored because you are just not worth the time.
 
I don't care anymore for your BS; there are plenty of examples of tessellation levels done right on the same object with camera-distance dependence. You don't tessellate flat surfaces, period; there is no excuse, and the fact that I can use CCC to control tessellation levels means it can't be that hard.

Consider yourself ignored because you are just not worth the time.


There is no way around the tessellation on some of those because the object covers the entire map. Sure, ignore me; being ignorant of how things work works well for many people who make mistakes in life.

Developers set a default tessellation level they feel looks good, but it's all one level; it's not controlled on a per-object basis (you can either have tessellation on an object or off, with no control of individual tessellation levels; it's a global setting). So if you turn down the tessellation level on one object, it will turn down on other objects that the developer might not want it turned down on. We are talking about game engines and full games, not demos that show a few objects; a game level is much more complex.

Newer games that are about to come out will have better control of this, because their engines were made with tessellation in mind.
 
There is no way around the tessellation on some of those because the object covers the entire map. Sure, ignore me; being ignorant of how things work works well for many people who make mistakes in life.

Developers set a default tessellation level they feel looks good, but it's all one level; it's not controlled on a per-object basis (you can either have tessellation on an object or off, with no control of individual tessellation levels; it's a global setting). So if you turn down the tessellation level on one object, it will turn down on other objects that the developer might not want it turned down on. We are talking about game engines and full games, not demos that show a few objects; a game level is much more complex.

Newer games that are about to come out will have better control of this, because their engines were made with tessellation in mind.

Yet ignorance doesn't seem to stop the posting. You keep sidestepping any proof or fact of what level of Tess is necessary for a given look. Granted, there are a lot of variables, like the resolution of the displayed image. But there are enough users posting how 8x or even 16x doesn't affect performance a grew deal while showing little to no visual loss compared to 64x. Think of it like this: it would be like having a choice between 8x MSAA or off and nothing in between. It would make no sense across the range of hardware to have only those choices. We see here a case where we have only 2 choices, and one, by the designers' choice, is severely limited to a precious few cards. Had there been a slider or selections between the two, more cards could make use of the setting. Really, if you are pushing a tech as a feature, why limit it so? Wouldn't it make far more monetary sense to appeal to the masses? Now don't start with the whole 'holding back to allow more users' line; rather, this allows for a far wider purchasing audience.
 
Yet ignorance doesn't seem to stop the posting. You keep sidestepping any proof or fact of what level of Tess is necessary for a given look. Granted, there are a lot of variables, like the resolution of the displayed image. But there are enough users posting how 8x or even 16x doesn't affect performance a grew deal while showing little to no visual loss compared to 64x. Think of it like this: it would be like having a choice between 8x MSAA or off and nothing in between. It would make no sense across the range of hardware to have only those choices. We see here a case where we have only 2 choices, and one, by the designers' choice, is severely limited to a precious few cards. Had there been a slider or selections between the two, more cards could make use of the setting. Really, if you are pushing a tech as a feature, why limit it so? Wouldn't it make far more monetary sense to appeal to the masses? Now don't start with the whole 'holding back to allow more users' line; rather, this allows for a far wider purchasing audience.


I didn't think we were talking about Witcher; I was responding to the Batman screenshots, so please read the posts that I was quoting. What is a grew deal btw? Great deal? Yes, it doesn't make much of a difference performance-wise going from 8x to 16x, but there is a visual fidelity gain going to 64x; it was shown in Witcher's hair quite a few times. Again, go through this thread and the other one that is also on this forum; it was shown many times.

Also, I did say that it was under the developers' control to include something like a tessellation level selector, which they didn't do, so again read through the thread; it was probably an art direction decision, and they didn't do it because of the visual differences.

If you would like I can post everything I have stated thus far in one post, which actually would make AMD's MARKETING RESPONSE look like they don't know what they are talking about; but in fact they fully knew what they were talking about and were trying to spin it as if they can't solve the issue on a driver level. I did mention, even before Witcher 3 came out and we had the pre-release benchmarks, that it was tessellation they were having problems with, and I got the same response: people don't know what they are talking about and argue moot points.

A good point to learn from this: when you come into the middle of a conversation, find out what the conversation is about by reading back or asking a question.......
 
Maxwell handles tessellation better than Kepler, so in games with heavier tessellation Maxwell will perform better. Nvidia is not an 'evil' company; they're doing it because it looks better. Technology improves over time, and it's not their fault new cards are more efficient at certain things... Would you prefer we stop advancing graphics altogether just so Kepler can stay fast?

It's also not Nvidia's responsibility to worry about AMD's hardware. If you really want to complain about performance you should be whining to CDPR and Slightly Mad Studios, to take the most recent examples.
This is sarcasm, right?
 
Maxwell handles tessellation better than Kepler, so in games with heavier tessellation Maxwell will perform better. Nvidia is not an 'evil' company; they're doing it because it looks better. Technology improves over time, and it's not their fault new cards are more efficient at certain things... Would you prefer we stop advancing graphics altogether just so Kepler can stay fast?

It's also not Nvidia's responsibility to worry about AMD's hardware. If you really want to complain about performance you should be whining to CDPR and Slightly Mad Studios, to take the most recent examples.

This is an awesome post on many levels. This is why GameWorks sucks and should be boycotted. The best part is some people don't even realize what is happening.

So Nvidia knows that HairWorks kills performance on EVERY GPU (including their own) except Maxwell. So what are they trying to get people to do? Get people to ditch their AMD and Kepler cards and buy a Maxwell card. Is it smart on their part? Yes, it's making them a ton of cash. Is this what enthusiasts want? I know it isn't what I want.
 
This is an awesome post on many levels. This is why GameWorks sucks and should be boycotted. The best part is some people don't even realize what is happening.

So Nvidia knows that HairWorks kills performance on EVERY GPU (including their own) except Maxwell. So what are they trying to get people to do? Get people to ditch their AMD and Kepler cards and buy a Maxwell card. Is it smart on their part? Yes, it's making them a ton of cash. Is this what enthusiasts want? I know it isn't what I want.

I agree, as there seems to be no real good explanation of why they chose the levels, or lack of levels, of tess in this game. And sorry, a dev's desire means little in terms of market sales.
 
So Nvidia knows that HairWorks kills performance on EVERY GPU (including their own) except Maxwell.

The Kepler issues are with Hairworks both on and off. They don't seem to be strictly Hairworks-related problems.

My understanding is that the devs provided the ability to just turn HW off altogether so I don't get what everyone is all spun up about. I know there is this concept of #FirstWorldProblems but things are really starting to deviate from reality. I think the hair stuff is just ridiculous anyway. It's almost asinine that the PC master race is having an Internet meltdown over HAIR.
 
Maxwell handles tessellation better than Kepler, so in games with heavier tessellation Maxwell will perform better. Nvidia is not an 'evil' company; they're doing it because it looks better. Technology improves over time, and it's not their fault new cards are more efficient at certain things... Would you prefer we stop advancing graphics altogether just so Kepler can stay fast?

It's also not Nvidia's responsibility to worry about AMD's hardware. If you really want to complain about performance you should be whining to CDPR and Slightly Mad Studios, to take the most recent examples.

Your argument might be OK if nVidia hadn't already admitted there are driver issues; it's not simple hardware superiority. It also might hold some water if Kepler wasn't slipping compared to GCN, an arch which is in no way superior in tessellation.
 