Nvidia GameWorks - Game Over for You

You watched a 20+ minute video in less than 7 minutes? Damn you're good...
No need to waste 20 minutes of my life listening to some Scottish dude rehash the same thoroughly debunked conspiracy theories about Nvidia sabotaging The Witcher 3 and Crysis 2.

It's literally 20 minutes of some guy mousing over graphs and complaining.
 
How many times do we really need to rehash this shit?
 
No need to waste 20 minutes of my life listening to some Scottish dude rehash the same thoroughly debunked conspiracy theories about Nvidia sabotaging The Witcher 3 and Crysis 2.

It's literally 20 minutes of some guy mousing over graphs and complaining.

I watched the whole thing, at the end he complains about Fallout 4 as well :p
 
Seemed to be OK to watch... shouldn't be anything people here don't already know and haven't argued about. I did not know the details regarding that beta Fallout patch, though; it would be a huge gain for AMD cards :)
 
No need to waste 20 minutes of my life listening to some Scottish dude rehash the same thoroughly debunked conspiracy theories about Nvidia sabotaging The Witcher 3 and Crysis 2.

It's literally 20 minutes of some guy mousing over graphs and complaining.

Dude, I am using a 980 Ti now, but even I know that Crysis 2 was sabotaged. The evidence was right in our faces for all to see. Also, as I recall, there is a racing game that had been sabotaged as well; I believe it was called Cars? Now, I have no issue with GameWorks as long as it can be turned off without affecting the ability of AMD cards to work well and look good in a game. However, I do have an issue with it when the game is built on top of it and it cannot be disabled.
 
How many times do we really need to rehash this shit?
Until people stop sticking their heads in the sand....

crusty_juggler said:
No need to waste 20 minutes of my life listening to some Scottish dude rehash the same thoroughly debunked conspiracy theories about Nvidia sabotaging The Witcher 3 and Crysis 2.

or Nvidia stops pulling this crap.
 
Until people stop sticking their heads in the sand....



or Nvidia stops pulling this crap.


If they are pulling this crap, it's because AMD has no answer for it. GameWorks has very little benefit for market share if the graphics technology isn't good enough. Just two gens ago, AMD had 40% of the discrete market, and GameWorks and the TWIMTBP program were there. But AMD had good hardware, and this is why they were able to keep their market share.

But since then they have had too many late releases and, at times recently, aren't that competitive. This is what hurts them, not GameWorks. GameWorks is only viable if AMD's tech can't compete for whatever reason, architectural design or being late to the market.
 
If they are pulling this crap, it's because AMD has no answer for it. GameWorks has very little benefit for market share if the graphics technology isn't good enough. Just two gens ago, AMD had 40% of the discrete market, and GameWorks and the TWIMTBP program were there. But AMD had good hardware, and this is why they were able to keep their market share.

But since then they have had too many late releases and, at times recently, aren't that competitive. This is what hurts them, not GameWorks. GameWorks is only viable if AMD's tech can't compete for whatever reason, architectural design or being late to the market.

Usually I respect what you have to say, but in this case you are simply deflecting. No one here said AMD is not suffering from their own missteps, but that does not change the crap that was pulled by Nvidia. One does not justify the other.
 
That was one of the most well-thought-out and substantiated arguments I've seen on the topic. This guy put a bit more time and effort into this than "mousing over graphs and complaining". He also clearly points out from the beginning that there could be other explanations, e.g. sloppy coding, but when there are so many examples out there it points to something more formalized and directed. Additionally, he rails on the "rabid fanboys on both sides". He, or his argument at least, seems pretty damn objective.

I'm not one for tin-foil stories, but when so much evidence points to shenanigans here - come on... Follow the money, guys. Of course nV is amping up GameWorks features with insignificant to zero real improvement when they know it hits the competition.
 
Usually I respect what you have to say, but in this case you are simply deflecting. No one here said AMD is not suffering from their own missteps, but that does not change the crap that was pulled by Nvidia. One does not justify the other.


GameWorks can be turned off if a user doesn't like it; pretty simple. That is to say, it's a user's choice to use GameWorks or not. I'm not deflecting: if someone doesn't like GameWorks, just don't use it......

A developer wants to use a feature in GameWorks because it's easy for them to integrate and saves them money, but it's only optimized for nV cards, which is true; nV is under no obligation to optimize for others' hardware, and users of others' hardware can turn off those features. Well, that's the market for ya. Market presence dictates GameWorks, market presence dictates what developers do, and market presence is dictated by what people want. Market presence isn't the end result of GameWorks; GameWorks is the end result of market presence, which GameWorks then prolongs.
 
GameWorks can be turned off if a user doesn't like it; pretty simple. That is to say, it's a user's choice to use GameWorks or not. I'm not deflecting: if someone doesn't like GameWorks, just don't use it......

Not in all games where GameWorks is included can it be turned off. No, I am not going to get into which ones, since they have been proven and rehashed to death in these forums.
 
Not in all games where GameWorks is included can it be turned off. No, I am not going to get into which ones, since they have been proven and rehashed to death in these forums.


All the tessellation GameWorks libraries can be turned off, and 90% of that video was based on tessellation.

And Crysis 2 wasn't even a GameWorks title; GameWorks wasn't even around. Tessellation came out as a patch for Crysis 2.

The Witcher 3 had HairWorks in it three years before it was released. HairWorks was first shown off on The Witcher's wolves at GDC 2011, I think it was. If I wanted to, I could probably listen to the rest of that video and rip it apart too.
 
All the tessellation GameWorks libraries can be turned off, and 90% of that video was based on tessellation.

And Crysis 2 wasn't even a GameWorks title; GameWorks wasn't even around. Tessellation came out as a patch for Crysis 2.

The Witcher 3 had HairWorks in it three years before it was released. HairWorks was first shown off on The Witcher's wolves at GDC 2011, I think it was. If I wanted to, I could probably listen to the rest of that video and rip it apart too.

In your dreams. You have no clue about any of this. That you even dare to say that you would rip apart this video is beyond laughable; it is sad that you don't even see how stupid tessellation is on some square or rectangular objects.

The gist of the video is that only one hardware feature is exploited to abuse benchmarks. Once you know how to bypass this, benchmarks are more reflective.
 
In your dreams. You have no clue about any of this. That you even dare to say that you would rip apart this video is beyond laughable; it is sad that you don't even see how stupid tessellation is on some square or rectangular objects.

The gist of the video is that only one hardware feature is exploited to abuse benchmarks. Once you know how to bypass this, benchmarks are more reflective.


Which can be turned off in all those examples, so you are bitching about something that doesn't even need to be on. If AMD had enough market presence this would be a non-issue; developers would have had to cater to them as they have to nV's hardware.

And now, if you want to talk about the technical merits of high tessellation amounts, we can do that, but that won't get very far, since ya know what....... (tessellation isn't just about the face of the object; it's also about shadowing, lighting, etc.)

Are you complaining that "we can't use those features"? Well yeah, nV's better at tessellation because of its geometry throughput; well yeah, AMD needs to fix that. Just like nV had to fix its shader throughput with the 7800 line vs the competition, which they did with the G80.
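
To put rough numbers on the shadowing/lighting point above, here is a hypothetical back-of-the-envelope sketch in C++. It assumes triangle amplification grows roughly with the square of the tessellation factor (the exact count depends on the partitioning mode) and that tessellated meshes get rasterized once for the main view plus once per shadow-casting light; the patch and light counts are made up for illustration.

Code:
#include <cstdio>

// Very rough illustration only: quadratic amplification is an approximation,
// and real engines vary in how many passes re-rasterize tessellated geometry.
long long approxTriangles(long long basePatches, int tessFactor) {
    return basePatches * static_cast<long long>(tessFactor) * tessFactor;
}

int main() {
    const long long basePatches = 50000;  // hypothetical tessellated patches in view
    const int shadowPasses = 3;           // hypothetical shadow-casting lights
    const int factors[] = {8, 16, 64};
    for (int factor : factors) {
        long long perPass  = approxTriangles(basePatches, factor);
        long long perFrame = perPass * (1 + shadowPasses); // main view + shadow maps
        std::printf("factor x%-2d -> ~%lld tris per pass, ~%lld per frame\n",
                    factor, perPass, perFrame);
    }
    return 0;
}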
 
Which can be turned off in all those examples, so you are bitching about something that doesn't even need to be on. If AMD had enough market presence this would be a non-issue; developers would have had to cater to them as they have to nV's hardware.

And now, if you want to talk about the technical merits of high tessellation amounts, we can do that, but that won't get very far, since ya know what....... (tessellation isn't just about the face of the object; it's also about shadowing, lighting, etc.)

Are you complaining that "we can't use those features"? Well yeah, nV's better at tessellation because of its geometry throughput; well yeah, AMD needs to fix that. Just like nV had to fix its shader throughput with the 7800 line vs the competition, which they did with the G80.

It is abuse; it is nothing more than that. AMD drivers allow you to tone down the tessellation, so there is nothing to worry about...

Good luck discussing square or rectangular objects for tessellation...
 
It is abuse; it is nothing more than that. AMD drivers allow you to tone down the tessellation, so there is nothing to worry about...

Good luck discussing square or rectangular objects for tessellation...


It's not about just one object; that's the problem. The tessellation pipeline doesn't have the flexibility to have variable tessellation factors for different objects in a scene. The water in Crysis 2 wasn't over-tessellated; it needs that factor amount to get the best fidelity, and that same factor amount is used across all objects. If you knew anything about how the GPU works, you would see that. In The Witcher 3, from x8 to x64 there is a huge difference in the hair: x8 looks like hay and at x64 the hair moves much more realistically. Now, a way to get around this is if the hair were modeled with more polys to begin with, which can drop the tessellation factors for the hair, but then you have to bone the hair at a much tighter length per bone, which has its effect on the CPU, and this will also affect geometry throughput just like tessellation, so now at an x8 factor it might be getting the frame rates of an x16 factor, or maybe even lower.

To take things out and point at them like this doesn't get anywhere, because the tessellation factors don't affect singular objects but everything that is tessellated in the entire game.
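
A minimal sketch of the "one factor for everything" point, assuming a scene-wide cap in the spirit of AMD's driver override mentioned earlier: a single cap value is applied to every tessellated object's requested factor, so water, hair and terrain are all clamped together rather than tuned individually. Object names and factor values below are invented for illustration.

Code:
#include <algorithm>
#include <cstdio>

struct TessObject {
    const char* name;
    float requestedFactor;   // what the game/effect asks for
};

// One scene-wide cap, applied uniformly to everything that tessellates.
float clampFactor(float requested, float sceneWideCap) {
    return std::min(requested, sceneWideCap);
}

int main() {
    TessObject scene[] = {
        {"ocean",   64.0f},   // made-up factors for illustration
        {"hair",    64.0f},
        {"terrain", 16.0f},
    };
    const float cap = 16.0f; // e.g. a user-chosen override
    for (const TessObject& obj : scene) {
        std::printf("%-8s requested x%.0f -> used x%.0f\n",
                    obj.name, obj.requestedFactor,
                    clampFactor(obj.requestedFactor, cap));
    }
    return 0;
}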
 
Does Nvidia have sliders for tessellation now as well?
Nvidia does not really care about performance, as long as they have a stick they can beat their competition with. Since GameWorks is all black box and no AMD developer can access it, the idea that you can abuse the details might also mean that the black box controls which algorithm it uses and when it uses a lesser version of it (rather than that just being a feature of the driver).
 
Does Nvidia have sliders for tessellation now as well?
Nvidia does not really care about performance, as long as they have a stick they can beat their competition with. Since GameWorks is all black box and no AMD developer can access it, the idea that you can abuse the details might also mean that the black box controls which algorithm it uses and when it uses a lesser version of it (rather than that just being a feature of the driver).

Sliders would be nice, but again, it's not up to nV to make such a feature; the devs should do it if they want to. They have to do it without the support of nV, so they have to purchase the license for said library and do it. It's not a black box when they can get the source if they want to. Maybe AMD should buy it for the devs, if the devs don't want to purchase it, in the interim while they get their own game SDKs ready?

Something else I forgot to mention: when the guy talked about TruForm, TruForm was horrible at tessellation; it had some serious problems when creating artwork for it. It was tedious and painful for artists to make assets.
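
On the slider idea above: a minimal sketch of what a developer-exposed setting might look like, assuming the game simply maps a menu position to a maximum tessellation factor and hands that to whatever effect library it integrated. The mapping, function name and values are hypothetical.

Code:
#include <algorithm>
#include <cstdio>

// Hypothetical "hair detail" slider: the developer owns this mapping,
// independent of whichever middleware ends up consuming the factor.
int sliderToMaxFactor(int sliderPosition /* 0..3 */) {
    static const int factors[] = {8, 16, 32, 64};
    sliderPosition = std::max(0, std::min(sliderPosition, 3));
    return factors[sliderPosition];
}

int main() {
    for (int pos = 0; pos <= 3; ++pos) {
        std::printf("slider %d -> max tessellation factor x%d\n",
                    pos, sliderToMaxFactor(pos));
    }
    return 0;
}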
 
In The Witcher 3, from x8 to x64 there is a huge difference in the hair: x8 looks like hay and at x64 the hair moves much more realistically.

I can barely tell any difference between x8 and x16. Having the default setting at x64 is simply ridiculous.

[attached screenshot: w3-amd-gpu2.jpg]
 
You have to see it in motion, and an x8 tessellation factor is enough to start seeing performance advantages for nV hardware. The impact of factor amounts will only get worse as games start using higher base polygon counts to begin with.

Edit: this is easily seen when you start changing options for the vegetation in The Witcher 3's menu. The grass has a similar effect on performance as tessellation factor changes made through the driver.
 
I can barely tell any difference between x8 and x16. Having the default setting at x64 is simply ridiculous. *snip*

Have you tried comparing it with the animal fur? Geralt's hair looks like shit with GameWorks on, but animal fur looks great with it.
 
The only issue that really needs attention is Kepler's performance degradation. You could argue all day about the merits of GameWorks/tessellation and it's mostly just opinions about what Nvidia might be or might not be doing to hurt AMD or even their own older cards.

But the suffering performance on the last-gen GPUs is real. It could be Nvidia intentionally crippling old hardware to make Maxwell look good. It could be Nvidia neglecting their Kepler drivers. It could be that Kepler has reached its full potential already. Doesn't really matter why; the point is, Nvidia is losing ground while AMD GPUs are still continuing to improve above and beyond their Nvidia counterparts. That's the most important takeaway, and it should concern everyone buying Nvidia GPUs in the future.
 
The only issue that really needs attention is Kepler's performance degradation. You could argue all day about the merits of GameWorks/tessellation and it's mostly just opinions about what Nvidia might be or might not be doing to hurt AMD or even their own older cards.

But the suffering performance on the last-gen GPUs is real. It could be Nvidia intentionally crippling old hardware to make Maxwell look good. It could be Nvidia neglecting their Kepler drivers. It could be that Kepler has reached its full potential already. Doesn't really matter why; the point is, Nvidia is losing ground while AMD GPUs are still continuing to improve above and beyond their Nvidia counterparts. That's the most important takeaway, and it should concern everyone buying Nvidia GPUs in the future.

https://www.youtube.com/watch?v=iZUshOSWQRo

It's a GTX 580, but you get the gist. It's not gonna be an increase every time.
 
Not in all games where GameWorks is included can it be turned off. No, I am not going to get into which ones, since they have been proven and rehashed to death in these forums.

Also, these DLLs aren't just drop-in and play. The render path has to be modified. When you turn them off, the game doesn't go back to the unmodified render path.
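
A hypothetical sketch of that integration point (not actual GameWorks code): once the modified render path is in, the in-game toggle only skips the effect's own work; it does not restore the path an unmodified build would have used. The pass counts and function names here are invented for illustration.

Code:
#include <cstdio>

struct FrameStats { int passes = 0; int effectDraws = 0; };

// The pre-integration path a vanilla build might have shipped with.
void renderStockHair(FrameStats& s) { s.passes += 1; }

// The integrated path: extra setup/passes exist whether or not the effect runs.
void renderIntegratedHair(FrameStats& s, bool effectEnabled) {
    s.passes += 2;                          // e.g. extra geometry/shadow interop pass
    if (effectEnabled) s.effectDraws += 1;  // the part the in-game toggle controls
}

int main() {
    FrameStats vanilla, integratedOff, integratedOn;
    renderStockHair(vanilla);
    renderIntegratedHair(integratedOff, false);
    renderIntegratedHair(integratedOn, true);
    std::printf("vanilla build:   %d passes, %d effect draws\n", vanilla.passes, vanilla.effectDraws);
    std::printf("integrated, OFF: %d passes, %d effect draws\n", integratedOff.passes, integratedOff.effectDraws);
    std::printf("integrated, ON:  %d passes, %d effect draws\n", integratedOn.passes, integratedOn.effectDraws);
    return 0;
}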
 
http://www.guru3d.com/articles_page..._graphics_performance_benchmark_review,7.html

Where do you see Kepler having issues here? Brand-new game, and the 780 Ti is where it's supposed to be.

I wouldn't bet on it. It is being discussed in most places that the benches are all over the place... I knew someone here would post the very graphs you did, when these are out there as well.
http://www.techpowerup.com/reviews/Performance_Analysis/Rise_of_the_Tomb_Raider/4.html

Now, Kepler is missing from those graphs, but looking at the performance of the rest of the cards, I wouldn't put much stock in any particular graph just yet as undeniable proof.
 