More NVIDIA Gameworks Controversy

Yes, competition breeds innovation. Closing access to competition breeds monopolies.

Your open source fantasy is just that and cannot/will not work in the real world.

According to your theory, NV should just hand the code they invested R&D time and money in over to AMD to allow further innovation? Does that seem logical to you? Why would any company invest in R&D if it just had to end up giving the results to its competitors?

Gameworks is a marketing tool to sell NV cards...not AMD cards.

A single open-source tech does not create innovation; it creates stagnation. Only two open-source techs vying for the same position can create competition. So AMD will need its own library.

Competition can only be created by two technologies battling against one another. Responsibility for AMD's poor performance with Gameworks, or for Gameworks being the sole set of visual/physics libraries, lies solely with AMD.

AMD is 1000% at fault for Gameworks, not NV. AMD needs to stop crying, sack up and get their own libraries. You know, the libraries "Gaming Scientist" Huddy promised a while ago.
 
This is not good for the consumer. You wanna buy one video card to play a set of games and another to play a different set? That's the dumbest thing I ever heard; this isn't the console market. It's gonna hurt the PC gaming market if that happens. I kinda doubt this is gonna take off in the long run, but for nVidia's purposes (PR and exclusives) it's gonna do a lot of damage with the uneducated consumer crowd, sadly.
 
It's always been that way. Companies are not out for the good of the consumer. Companies are out to make a profit. NVidia or Intel would love AMD to die. That way, they could make their technology last vastly longer than if there was competition.
 
Going around in circles... OK, you want closed gaming systems. PC gaming will just become a console-like gaming experience where you put the parts together yourself, and depending on those parts, certain games will be off-limits to you for nothing more than arbitrarily imposed reasons.

It's a rather bleak evolution to PC gaming though.

Unfortunately it has, and is currently doing, a good deal of damage. Education is the key IMO.
 
I just think it's very interesting that AMD is promoting an open ecosystem and is actually trying to spur the industry in a positive direction (with open-source libraries, Vulkan/Mantle, etc.) while Nvidia is just trying to create a walled garden, using their marketshare and financial advantage to do so. And yet people are defending it? Sure, AMD doesn't have the money to compete, but why defend Nvidia at all? It's damaging in the long run, whether or not AMD is even still around by then.

Performance differences and other biases aside, I would immediately side with the company that is trying to promote an open ecosystem over a walled garden.
Weird how AMD claimed Mantle would be open source, yet none of that came out before AMD declared Mantle dead to them, because it sat in "beta" for a considerable amount of time even though it was featured in games for well over a year.
 
Really? That isn't a solution; it creates more of a problem. Depending on whether a game uses AMD or nVidia libraries, you would need an AMD or nVidia GPU, and without the matching one you'll suffer poor performance, whether it's an AMD game on an nVidia GPU or vice versa.
 
Don't bother, arguing with an Nvidia fan is like trying to yell at a dog to stop shitting in the house.

To be fair, AMD nutters are just as bad when they're 'on top' so to speak. Have we forgotten the Fermi days gloating so quickly?
 
They were certainly LOL'ing it up with a lot of "sucks to be you" directed at NV bros when Battlefield 4 had those performance gains with Mantle, and they believed it was the first of many games in a future of Mantle dominance. It didn't work out, of course, but you saw the behavior and the attitude.
 
I figure when Intel and AMD so much as consider licensing out their x86 and x86-64 ISAs, then nvidia can think about open-sourcing their IP.
 
It's very simple, child-like simple, for anyone who knows technology: Mantle was a DX/OpenGL alternative; it does not break the DirectX path, and its only benefit is reduced driver overhead. Gameworks is middleware: it sits between DirectX and the drivers, it acts as DRM against optimizations, and its PhysX feature uses the GPU when an nVidia card is present but reverts to unoptimized x86 code otherwise.

No gaming company in its right mind would drop DirectX, the Windows platform standard, for such a narrow market. All Mantle did was force MS's hand, to the benefit of the market. Gameworks, on the other hand, is nothing but bad for both nVidia and AMD consumers in the long run. As an nVidia card user, I cannot stand by such shady practices.
 
What could happen is that Intel decides to get serious and blocks nVidia from selling cards for its motherboards, the same way it kicked them out of the chipset market. They are fully capable of doing so; it's their platform. Sure, it would be a bad thing, but if we accept that companies can do as they please, then this shouldn't be much different from current developments.
 
Does Intel even make motherboards anymore? I thought they got out of that business years ago.
 
PhysX used x87 until SDK 3.0, which uses SSE.
 
Now people are noticing that The Witcher 3 has somewhat better performance on a 960 vs a 780Ti. Is that a problem? Yes. Is it something that can be fixed? Yes. Are people still going to bitch because it wasn't fixed on day one? You better believe your ass.


The game was under development for three years, there's no excuse for a 960 to be blowing the doors off of a 780ti in this game unless it was intended as such.
 
So you agree DirectX needed competition to spur advancement.

Amazing how quickly after Mantle was announced MS suddenly had DX12 in the pipeline, isn't it?

Competition breeds innovation.
Yes, simply amazing. It's almost as if MS already had DX12 in development :rolleyes:
 
Witcher 3 blows on any 7-series and under card. Nvidia didn't optimize their drivers for older cards, so a 960 ends up equivalent in performance to a 780 Ti in Witcher 3. This is Nvidia's way of getting people to upgrade.

The irony is that for all the talk about crappy AMD drivers and weaker support, the 7xxx and 290 cards will probably stay at the peak of their performance range the longest, even if initial performance over the first couple of months is lower due to nvidia blocking AMD from optimizing the software for certain features. Part of that is due to the 7xxx series lasting so long, but at least when the newer 290 came along they did not abandon the old cards' performance.
 
To be clear, it's not about being a "fanboy"; it's about supporting the product that gives me the best all-round performance AND experience. I couldn't give less of a shit about the brand of the card as long as it gives me the best reliable performance. When AMD manages this, I'll buy AMD.

you bought a 960/970

I'll enjoy hearing you go silent when nvidia releases pascal next year and stops bothering to tune the drivers for that generation of cards and earlier to the same extent. But never mind, nvidia is a deity to be worshiped among some people.
 
Wouldn't it be awesome if companies could adhere to 3D graphics standards instead of paying off individual companies to break them?
 
Man, this got ugly quick.

AMD vs NVIDIA
 
Because they will keep rebadging the 7xxx series for another 10 years. :D
 
As long as Gameworks features can be turned off completely, I see no problem. However, when a game is built entirely on Gameworks, do not lie about it or blame others when the "features" cannot even be turned off. Crippling a competitor's product is downright illegal.
 
That's great news; it's progress. It won't hold a candle to a GPU, but it's much better than the original legacy x87 path at least.

The point remains: it's a major performance-crippling issue, and I can't see how AMD can do anything about it. It's closed source, and the developers chose to chain themselves to an unworkable situation. I assume they figured it was a worthwhile tradeoff in some fashion, because that's anywhere from 20 to 40 percent of the market segment affected, but using it as a PR stunt is shady imho.
 
The ironic thing is that PhysX supports hardware acceleration on both the XBone and the Playstation 4.
 
The ironic thing is that PhysX supports hardware acceleration on both the XBone and the Playstation 4.
The sad thing is that most people don't realize how limited hardware acceleration is for PhysX: only a few functions actually get accelerated, and many titles ship PhysX without any hardware acceleration at all, without fanfare. Of course, depending on how it's implemented, that can amount to quite a few fps.

PhysX source is actually quite available to developers, just not to AMD. Developers are free to tweak it as they see fit, but that's money they won't bother spending just so it runs slightly better on AMD cards.
 
I love how NVIDIA apologists keep pretending it's mere coincidence that in games that aren't NVIDIA-sponsored shillworks titles, AMD performance typically considerably exceeds its NVIDIA counterparts dollar for dollar, yet mysteriously, when NVIDIA effectively pays part of the development cost, non-NVIDIA graphics solutions are suddenly crippled. Funny how that keeps happening, and they dismiss all the mounting evidence of intentional sabotage by NVIDIA to promote its own cards.

Yup.
 
To my knowledge, The Witcher 3 doesn't use GPU PhysX at all, but people keep acting like it does. It's the tessellation that's killing AMD and nVidia 7xx cards.
 
Must be aliens.
The only people dismissing anything appear to be a few here who dismiss how business works.

When AMD sponsors a video game's development, it outperforms nVidia. When nVidia sponsors a video game, it outperforms AMD. There are people in this thread pointing out that's business as usual, in response to the incessant whining from AMD customers that they get locked out of features they say they don't want, or want for free. We hear it often because nVidia sponsors more games than AMD and has more money to throw around. I don't often hear nVidia customers complaining about AMD as much, but that's either because AMD doesn't sponsor many games that nVidia customers care about, or maybe they are complaining and I haven't seen it. Either way, it's *not* due to AMD being the Defenders of Freedom (tm) you incessantly claim them to be. And it's also not due to aliens. In fact, there's no mystery to most people from what I've seen.
 
This is true.

AMD has had a tessellation slider in its drivers for a while now, while Nvidia only has an on/off setting. Essentially Nvidia is screwing over their own customers; at this point they probably consider their last-gen Kepler cards a bigger competitive threat than AMD's current lineup. It's the only explanation I can come up with.

People know Pascal is right around the corner, possibly around this time next year, and the performance gains are expected to be huge considering it will be a double die shrink. If they can convince 780/780 Ti owners to upgrade to Maxwell this year, the majority will almost surely jump on Pascal next year as well, making this a potentially colossal cash grab for them.
 