AMD Accuses NVIDIA's Gameworks Of Being 'Tragic' And 'Damaging'

Just remember Nvidia spends more on R&D than AMD does as an entire company; that's how bad AMD is. Intel spends around 10 times as much on R&D as AMD. AMD is woefully underprepared to compete with either company in CPUs or GPUs.
 
I recently switched to Nvidia cards for the first time since 2006, thanks to the BB deal on the GTX 970s; it's a little faster than my 7990 was. All Nvidia has to do is drop their prices and they would finally tighten AMD's self-made noose.
 
You won't know until it happens. I remember people thinking Microsoft could never be dethroned, yet here we are today, with them as the underdog. Companies come in and disrupt markets all the time. Like Google. Shit, if you want me to make a guess, even Intel. So please, don't act like it never happens and never could in the GPU world.

Markets change, no doubt there. The fact that the masses use their phones for basic internet stuff hardly means they are not running Windows on their laptops. All the people using those new devices are not running out and buying Macs. Sure, Apple's computer sales are up compared to where they were, but that is hardly a feat. They haven't really challenged MS; they simply moved to another market.

My point was that GPUs are far too complicated for a startup company to jump into and compete in anymore. There was a time, when the tech was new, that a small company in Quebec like Matrox could design and build a card. Even then, Matrox, PowerVR, ATI, and Nvidia only got started thanks to tons of investment capital. At this point, the amount of money needed to design a working 9,000-million-transistor chip from scratch, plus the amount needed to license or buy the patents that would get stepped on, would be insane. The chances of a startup going from zero to the top of the food chain are pretty much zero. The existing smaller GPU players like PowerVR are very unlikely to even bother trying to get back into high-end GPUs when the low-power mobile markets they are in have 10x more profit.
 
:rolleyes: Crysis 2 isn't a GameWorks game (it predates it), and CryEngine always tries to push hardware as much as possible at the maximum quality settings. And yeah, it looks totally like Nvidia sabotaged AMD performance, so much so, in fact, that AMD beat equivalent-level Nvidia cards by respectable margins when the game launched: http://www.guru3d.com/articles_pages/crysis_2_dx11_vga_and_cpu_performance_benchmarks,8.html

Worthless, crazy, tinfoil-hat conspiracy/speculation was removed from the quote, and there wasn't much left to respond to. :rolleyes:

Bawwwwww.....
Alright, let's go down the list:

1. Yes, it wasn't a GameWorks title, but it's the same concept in play: paying a publisher to get your code into the game.

2. Are we looking at the same benchmarks? In your link, at 1600x1200, I see Nvidia's dual-GPU 590 beating every dual-GPU AMD card. I also see the 580, 570, and 480 beating the 6970, which itself is only barely above Nvidia's midrange 560 Ti. The story looks the same at higher resolutions too. What the hell are you talking about? Your evidence is undermining your own argument. Please enlighten a crazy tinfoil-hat person such as myself; I'm seeing Nvidia's cards beat out AMD's at all the higher resolutions. If you want to drop names, fine, but back it up with evidence, huh?

3. Speaking of which, if you don't like the Crysis example, what's the explanation for 1 million polygons on the cape in Arkham Origins? That's probably close to the polygon budget of the entire rest of the scene. (Or the dog in COD Ghosts.) Yes, you want a lot of polygons for items like that, but not when it goes past what you can even see and cripples performance.
 
hardly means they are not running Windows on their laptops.

For those that even have a laptop. The PC industry is shrinking. It will never go away, but it is withering. And in places with huge populations, such as India and Africa, the smartphone is the only computer many people own, and ever will own. Like it or not, mobile computing on smartphones and tablets is the future, even with the PC sticking around forever because you need it for the heavy-lifting/data-input markets. We all love the PC here, but its relevance has already peaked. We just have yet to see how far it will decline. It won't go away totally, though; I'm not claiming that.
 
My point was that GPUs are far too complicated for a startup company to jump into and compete in anymore. There was a time, when the tech was new, that a small company in Quebec like Matrox could design and build a card. Even then, Matrox, PowerVR, ATI, and Nvidia only got started thanks to tons of investment capital. At this point, the amount of money needed to design a working 9,000-million-transistor chip from scratch, plus the amount needed to license or buy the patents that would get stepped on, would be insane. The chances of a startup going from zero to the top of the food chain are pretty much zero. The existing smaller GPU players like PowerVR are very unlikely to even bother trying to get back into high-end GPUs when the low-power mobile markets they are in have 10x more profit.

And so, couldn't Intel make it work if they wanted to? There you go: someone to take AMD's place and be an actual competitor.
 
Jesus.

I hope that when AMD eventually goes under because they are incapable of producing competitive products anymore, all their fanbois move on to consoles and leave PC gaming.

*Shrug* AMD is probably not going anywhere for a long time to come. :) Do you feel superior now that you spent quite a bit more money than I have on your computer for next to no real-life difference?
 
*Shrug* AMD is probably not going anywhere for a long time to come. :) Do you feel superior now that you spent quite a bit more money than I have on your computer for next to no real-life difference?

No, I feel superior because I didn't go with the company that, along with its fans, suffers from small-dick syndrome and spends its time slinging shit instead of actually correcting its faults to get better.
 
No, I feel superior because I didn't go with the company that, along with its fans, suffers from small-dick syndrome and spends its time slinging shit instead of actually correcting its faults to get better.

Hmmmm..... If I go anywhere with that, I will be accused of trolling and probably get an infraction. Therefore, I will leave it with this: *Shrug*
 
Have you paid attention to the performance of the GTX 780 Ti in GameWorks games? It can't hang with the 290X.

Wait, I thought the argument was that GameWorks cripples games running on AMD cards?

Now the idea is that GW makes AMD cards run better than Nvidia's?

Fascinating.
 
Wait, I thought the argument was that GameWorks cripples games running on AMD cards?

Now the idea is that GW makes AMD cards run better than Nvidia's?

Fascinating.

Yea, this head is full of... well, laughs.

Like it being OK to lower performance in games while using GeForce Experience... and that is OK to some.

I mean, coming from [H], it makes you really wonder who you can even trust anymore when it comes to reviews.

Everyone has an agenda...
 
Yea, this head is full of... well, laughs.

Like it being OK to lower performance in games while using GeForce Experience... and that is OK to some.

I mean, coming from [H], it makes you really wonder who you can even trust anymore when it comes to reviews.

Everyone has an agenda...

I meant thread, not head... Why can't I edit my post? That's fucking odd.
 
30 FPS isn't a slideshow, although I understand the psychological need to justify spending a thousand dollars on video cards for only a marginal improvement over consoles.

See, to me it feels like a slideshow. That's why games have options. 30 fps to me is jarring, stuttery, and non-fluid. Maybe it is because I grew up playing games on 90 Hz CRTs. I don't know.

I like my PS4, and there are a few games that I really enjoy on it, but if I have a choice between 900p at 30 fps or 1440p at 60 Hz, I will always pick what is the better experience for me.

To say that my SLI 980s are a marginal improvement over consoles is just... I don't even know where to start. :confused: The fuckin' console can't even do 1080p at 60 fps in its games. How the fuck is it going to hold up with 4K becoming mainstream in the next 3-4 years?
 
The problem with anything under 60 FPS (maybe 45 FPS at the least) is the perceived disconnect between input and movement/motion, especially when looking around in FPS games or moving the camera in TPS games. You hit a button and everything just feels so sluggish, like something is wrong.
 
^ Of course, this is only after experiencing 60 FPS. If you only ever knew 30FPS, I suppose ignorance is bliss and you wouldn't perceive the 'sluggishness'.
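For what it's worth, the "sluggishness" has a simple frame-time core: each frame at 30 FPS persists on screen twice as long as at 60 FPS, so an input is reflected that much later. A minimal sketch of the arithmetic (illustrative numbers only; real input latency also involves the engine, driver, and display, none of which are modeled here):

```python
# Frame-time arithmetic behind perceived input lag at different frame rates.
# Illustrative only: real latency also includes engine, driver, and display.

def frame_time_ms(fps: float) -> float:
    """Duration each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 45, 60, 90):
    ft = frame_time_ms(fps)
    # Worst case, an input arrives just after a frame begins and waits
    # roughly one full frame before its effect can be displayed.
    print(f"{fps:>2} FPS -> {ft:4.1f} ms per frame")
```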
 
They never brought it to market.

As they realized it was woefully uncompetitive. Really, there is only one option for a third high-end GPU company: Imagination Technologies. They would hardly be new, though... VideoLogic was one of the first to produce 3D hardware.

They would be the only company with the tech to really go toe to toe. They already obliterate the standard two in the mobile space, they have all the licenses to make it happen, and they have the bank, with all the Apple funds, to do it if they choose to.

I could completely see it happening in a few years, when the first Mac ships with an ARM-based Apple chip, which is very likely. At that point Apple may well want a secondary GPU for their high end (AMD or Nvidia may not be too willing to play along at that point, knowing their tech would be the next to be swapped out for in-house designs)... Apple may throw the R&D money at IT to make their own in-house GPU happen at the same time. It wouldn't be new from scratch, though... at any time, I am sure IT could take one six-month cycle and scale up their current go-to mobile chip. Heck, that might be the biggest Apple launch since Jobs was involved: the day Apple starts shipping laptops and full-size workstations with highly optimized ARM CPUs and PowerVR GPUs. I'm sure it's Intel's biggest fear.
 
And why do you think that is? They spent years designing, developing and testing it.

OK. And? How does that preclude them from trying to make a dedicated GPU again? Did it hurt them in any way financially? No. Does it mean they would try it again? Probably not.

So what's your point?
 
OK. And? How does that preclude them from trying to make a dedicated GPU again? Did it hurt them in any way financially? No. Does it mean they would try it again? Probably not.

So what's your point?

My point is that Intel obviously didn't intend for their attempt to fail, and thus it stands to reason that even when you have a giant like Intel trying to get into the game, it's exceptionally difficult. Your posts come across like it's easy, as if one day Intel could just decide to be Nvidia's toughest competitor.
 
My point is that Intel obviously didn't intend for their attempt to fail, and thus it stands to reason that even when you have a giant like Intel trying to get into the game, it's exceptionally difficult. Your posts come across like it's easy, as if one day Intel could just decide to be Nvidia's toughest competitor.

No, my point is, if Intel wants to do it, they have the chops and the bankroll to, even if they almost came out with an unconventional one before but then decided not to. You make it sound like Intel isn't one of the biggest players in technology. If they wanted to, they could crush Nvidia. They dwarf them.
 
It's sad that so many, including the owner of the site, celebrate crippleware. It adds little or nothing to the game and makes ridiculous demands of both brands of cards, yet the game barely looks any different. Nvidia is pushing it so you keep getting your wallet out to pay for more power; then they will release GameWorks 2.0 and cripple your stuff again so you upgrade again. Nvidia is using it to make you upgrade, and it just happens to be painful to AMD, so to them that's just a bonus.

If you own an AMD card, simply lower the tessellation setting and you can run HairWorks in Witcher 3 just fine; at least that worked for me on crossfired 290X cards. This kind of stuff will cause a game of dirty pool between both AMD and Nvidia, and the only ones who will lose will be us in the PC game market. Game makers will pick Red or Green for a side and you won't be able to play the game for a damn unless you have the correct card.

If someone from Nvidia reads this, then know this: I buy video cards from both Nvidia and AMD, depending on who I think has the better card when I feel a need to upgrade. I will no longer support Nvidia until they stop producing crippleware and fracturing the PC market even further.

Wasn't going to bother posting in this thread, but after reading this post and a few others, it's good to see consumers are starting to see the bigger picture and what the GameWorks initiative is all about.
 
Yea, the Raptr shit needs to die in a fire. I remove it, but each Catalyst update reinstalls it.

Out of curiosity, what makes Raptr so shitty? I have not used it, but it seems to do the same thing GeForce Experience does... at least that seems to be the case from what I've read.

I have no loyalty to either brand. I just go with what I feel is the best bang for the buck. I chose Nvidia this round because I like the extra eye candy Gameworks brings and SLI seems to be supported more frequently than Crossfire.

LMAO for some reason I feel I need to explain my video card choice in the thread.
 
My fear is that in a few years we will be limited in our game choices unless we buy two GPUs and care to swap them out every time we want to play another game. Want max graphics in this group of titles? Install this hardware brand. Want to play these titles with max graphics? Get the other hardware ready.

Obviously not practical for most, so it may just result in having to turn down the graphics settings, which partially kills one of the positives of PC gaming (high-end graphics).
 
My fear is that in a few years we will be limited in our game choices unless we buy two GPUs and care to swap them out every time we want to play another game. Want max graphics in this group of titles? Install this hardware brand. Want to play these titles with max graphics? Get the other hardware ready.

Obviously not practical for most, so it may just result in having to turn down the graphics settings, which partially kills one of the positives of PC gaming (high-end graphics).

Couldn't agree more. This is pretty much a reality right now.

I love the eye candy GameWorks brings, but even with my GTX 980 SLI, turning all GameWorks features to max in particular titles has my system struggling to keep FPS at the 50+ mark I would expect from such an expensive setup. This is a problem, IMHO.
 
At the end of the day I believe the comments from AMD's Richard Huddy are rooted in envy, but I agree with some aspects. GameWorks features generally demand more from the hardware than they should for what they deliver, IMHO. Ultimately this diminishes PC gaming.

With the steadily decreasing gap between console and PC graphics, I'm starting to wonder if the cost of high-end PC gaming is worth it.
 
it's good to see consumers are starting to see the bigger picture and what the GameWorks initiative is all about.
It's about virtual hair and volumetric smoke effects. That's it.

It's not a conspiracy to cripple AMD cards. It's not a scheme to force you to upgrade. It's not the death of PC gaming.

Honestly, the stupid shit I read on the Internet from tinfoil-hat-adorned gamers is beyond ridiculous. Everything, every damn thing, is a conspiracy to these people.
 
Couldn't agree more. This is pretty much a reality right now.

I love the eye candy GameWorks brings, but even with my GTX 980 SLI, turning all GameWorks features to max in particular titles has my system struggling to keep FPS at the 50+ mark I would expect from such an expensive setup. This is a problem, IMHO.

Always been a problem with PhysX. It looked overdone, unrealistic, and silly in many instances, but it was often that or not having the effects at all, so I tend to leave it on. TressFX looked better than the default hair in TR, but it did look odd; I hated it until the patches came out, and then it ran great on my Nvidia card. I went from a 7600GS, 260, 560 Ti, and 670 to a 970 as my last five GPUs, so I am not an ATI/AMD fan by any means. I just don't like it when graphics options are locked out or impractical for one brand.

Likewise, it will hinder proper implementation of these features. Physics won't be used as the basis of a game because the performance hit will be too great for a large part of the market. So we're left with optional, inessential graphical effects like overdone smoke, etc., as opposed to boat/wave mechanics or real-time destruction of large structures.
 
See, to me it feels like a slideshow. That's why games have options. 30 fps to me is jarring, stuttery, and non-fluid. Maybe it is because I grew up playing games on 90 Hz CRTs. I don't know.

I like my PS4, and there are a few games that I really enjoy on it, but if I have a choice between 900p at 30 fps or 1440p at 60 Hz, I will always pick what is the better experience for me.

To say that my SLI 980s are a marginal improvement over consoles is just... I don't even know where to start. :confused: The fuckin' console can't even do 1080p at 60 fps in its games. How the fuck is it going to hold up with 4K becoming mainstream in the next 3-4 years?

Unless you have robot eyes or your face is pressed up against the television, you do not need 4K, because you can't tell the difference from the distance you sit from the TV.

4K TVs are nothing but magic beans being sold by snake oil salesmen.
 
I've felt GeForce Experience/GameWorks has been a ploy to segment game performance into tiers, and this is as an Nvidia card owner.
 
GameWorks "benefits" are simply glorified proprietary in-game settings, sponsored by Nvidia so people would buy its cards thinking their experience would be night and day.

It isn't night and day, and it adds little to the actual experience.

Where are all the trolls who tell everyone that it's all about the story and not about the graphics? What happened to you? Is that no longer the case since this is an Nvidia-related thread?

Things like HairWorks are a joke. Yes, it makes the hair look better, but a metric shit ton of resources is spent to render such a minute change. Seriously, hair? I remember watching a playthrough of Witcher 3 on max settings with HairWorks; it ran like shit. As soon as he turned it off, it was butter smooth. Yea... so much for the so-called immersion...

It's all proprietary bullshit, which everyone should be against. Next thing you know, there will be games out that will ONLY run on Nvidia. I am sure plenty of people on this forum will support that too, because they are so blinded by fanboyism.
 
Unless you have robot eyes or your face is pressed up against the television, you do not need 4K, because you can't tell the difference from the distance you sit from the TV.

4K TVs are nothing but magic beans being sold by snake oil salesmen.

Oh, I do not know; I think 4K TVs have their place, just like 4K monitors do. What I am getting a kick out of is seeing folks make "60fps the stuff!, you blind if you not notice diff" type posts. :D I have played games at 60fps, 45fps, 25fps (Flight Simulator only in that case) and I have not noticed any input lag or hesitation.

30 fps is fine at 4K resolutions, such as in Crysis 3. Now, if someone disagrees and does not like it, fine. However (not you, but those who would do so), do not bother coming in here and dictating that 60 fps is the only way to go and that I do not know any better. :rolleyes:
 
I have played games at 60fps, 45fps, 25fps (Flight Simulator only in that case) and I have not noticed any input lag or hesitation.

Maybe it's more a matter of frame times or latency than average fps, but I notice that when I dip down to 30-40 fps in L4D2, I'm toast. I'm ready to throw the monitor off the desk. I can't aim, I can't move, it looks ridiculous, and I get pissed. Maybe it's somewhat game-dependent.
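That distinction is easy to show with numbers: two runs with nearly identical average FPS can feel completely different if one of them has occasional long frames. A minimal sketch with made-up frame-time samples:

```python
# Why frame times can matter more than average FPS: two runs with nearly
# the same average feel very different if one of them stutters.
# The frame-time samples below are made up for illustration.

def summarize(frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    worst_ms = max(frame_times_ms)
    return avg_fps, worst_ms

smooth = [16.7] * 60                # steady ~60 FPS
spiky  = [12.0] * 55 + [70.0] * 5   # similar average, but with stutter

for name, times in (("smooth", smooth), ("spiky", spiky)):
    avg, worst = summarize(times)
    print(f"{name:6s}: avg {avg:4.1f} FPS, worst frame {worst:4.1f} ms")
```

Both runs average roughly 60 FPS, but the spiky one spends individual frames at 70 ms, which is exactly the kind of hitch that reads as "I can't aim, I can't move."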
 
See, to me it feels like a slideshow. That's why games have options. 30 fps to me is jarring, stuttery, and non-fluid. Maybe it is because I grew up playing games on 90 Hz CRTs. I don't know.

I like my PS4, and there are a few games that I really enjoy on it, but if I have a choice between 900p at 30 fps or 1440p at 60 Hz, I will always pick what is the better experience for me.

To say that my SLI 980s are a marginal improvement over consoles is just... I don't even know where to start. :confused: The fuckin' console can't even do 1080p at 60 fps in its games. How the fuck is it going to hold up with 4K becoming mainstream in the next 3-4 years?

I truly don't think you will see 4K become mainstream even in 10 years. If you look at Steam's hardware stats, you will see that 720p still holds 26.89% of the resolutions in use. The resolution that is considered mainstream nowadays, 1080p, has only 34.33% of the market.
2560x1600 monitor setups have been out there for a while now, and they make up less than one percent of the market. If you look at multi-monitor setups, you see that the largest share goes to two 1080p monitors, which make up 28.58% of the market.
 