AMD Accuses NVIDIA's Gameworks Of Being 'Tragic' And 'Damaging'

Has Nvidia been caught crippling game performance with Gameworks for literally NO visual benefits? YES.
How so? Forcing a lower maximum tessellation level does have an IQ impact, but it's an acceptable trade-off vs performance. What the issue primarily is, IMO, is that GW does not have a built in way to do the same thing. Considering the wide range of tessellation performance on even its own cards, there should have been an easier way to adjust it.
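For what it's worth, the driver-side limiter AMD exposes is conceptually just a clamp on whatever tessellation factor the application or middleware asks for. A minimal sketch of that idea (the cap value and names here are made up for illustration, not taken from GameWorks or any driver):

Code:
# Minimal sketch of a driver-style tessellation cap: whatever factor the
# application (or middleware) requests gets clamped before the tessellator
# sees it. The cap value and names are hypothetical, purely illustrative.
def clamp_tess_factor(requested: float, driver_cap: float = 16.0) -> float:
    """Return the tessellation factor that would actually be used."""
    return min(requested, driver_cap)

for requested in (8.0, 16.0, 32.0, 64.0):
    print(f"requested x{requested:g} -> used x{clamp_tess_factor(requested):g}")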

The HairWorks drama is funny in a way because it looks like payback aimed at AMD for a similar problem with TressFX 1.0 (in Tomb Raider): Nvidia accused AMD of crippling performance on its hardware by doing more work than necessary. (The solution was to tighten bounds so the hair simulation ran faster... fixed thanks to the game maker, not AMD, and further optimized on Nvidia's driver side.)

Is Nvidia exercising practices that are bad for the entire ecosystem of game development standards and hurt gamers? YES.
How is it bad for the entire ecosystem, and what standards are you talking about? GW is an optional middleware library, and GW uses DX (on Windows) to render. There are no "game development standards", but there are graphics API standards.

Boo hoo for (butt) "hurt[ing] gamers". Those are the same people who want to take their ball and go home when they're not on the beneficial end of GPU-specific optimizations, but who can't stop crowing when they're the ones getting the vendor-specific optimizations.
 
Ah, the "goldeneyes", "I can't play at less than 120fps because I'm an armchair pro gamer and you might not be good enough to notice the difference but I am" attitude.

Maybe those commentators remember when we would play things like FEAR or the original Far Cry and were grateful for a fraction of that, but we appreciated games which pushed the envelope. Since all the Crysis "waahhh, poorly optimized!" nonsense spewed by idiots with shit PCs, gamers have become such a bunch of elitist pussies when it comes to frame rates. They're almost as bad as audiophiles these days, and that's saying something.

Tell me about it. PC Gaming these days has far too many ricers. People who brag about spending $2000 on video cards are like the people who drive Honda Civics and think that putting a spoiler on it makes it a damned race car. I have render farms here at work that would put the most powerful build in these forums to absolute shame in terms of graphics performance.
 
https://en.wikipedia.org/wiki/Mantle_(API)#Video_games

Just quite a bit more.

The "custom" API does something no other API did for the PC market. It works well and got MS and Khronos of their ass.

The Windows driver has nothing to do with it. It's a closed system you cannot optimize for, regardless of who does it.

Bravo, Mantle: out of all the PC games available, you have 10 games supported.

10.

Not a pathetic 4, but a mind-blowing 10. Steam has 12,880 games for sale, so Mantle supports about 0.08% of them. That's a success if I've ever seen it.

And my point about drivers was about company resources. When your market share is crap compared to Nvidia's, and a common complaint is driver quality and release frequency... why the hell would you make an API before fixing the driver problem?
 
Ah, the "goldeneyes", "I can't play at less than 120fps because I'm an armchair pro gamer and you might not be good enough to notice the difference but I am" attitude.

Maybe those commentators remember when we would play things like FEAR or the original Far Cry and were grateful for a fraction of that, but we appreciated games which pushed the envelope. Since all the Crysis "waahhh, poorly optimized!" nonsense spewed by idiots with shit PCs, gamers have become such a bunch of elitist pussies when it comes to frame rates. They're almost as bad as audiophiles these days, and that's saying something.

Tell me about it. PC Gaming these days has far too many ricers. People who brag about spending $2000 on video cards are like the people who drive Honda Civics and think that putting a spoiler on it makes it a damned race car. I have render farms here at work that would put the most powerful build in these forums to absolute shame in terms of graphics performance.

What the actual fuck are you guys talking about? It's wrong to complain about poorly optimized bullshit running like crap on machines people spend thousands on with the promise of high-end graphics and awesome performance?

Who gives a fuck about your render farm? People pay $2000 so they can play games at a high resolution, possibly on multiple monitors, and get a decent framerate. When that is not possible due to shittily-optimized console ports (and frankly, more than half of them do NOT look good enough to justify the poor performance in the first place), then you bet your ass I'd be pissed. You don't pay $2000 to sit around and jerk off to YouPorn, you pay it to play games at 120 FPS.
 
My take is, a game shouldn't be locked at 30 FPS if yours or anyone else's hardware is capable of producing much higher FPS. Locking at 30 FPS is a trend that really started during the last console generation (for good reasons) and seems to be migrating to PC gaming, where we aren't used to seeing frame locks.
I agree that games should never be limited to 30 FPS. Ideally there would be no limits on how high the framerate can go, as long as your hardware is fast enough.

However, the vast majority of displays have a fixed refresh rate of 60Hz.
At 60Hz, games need to be locked at either 60 FPS or 30 FPS.
If you don't synchronize the framerate to the refresh rate, the game is going to judder badly. (e.g. uncapped and running at 40 FPS instead of 30 FPS)
But 30 FPS is, in my opinion, unacceptably low. Low enough that I get motion sick or end up with headaches after playing for more than a short while.
And you cannot eliminate judder with 30 FPS on a 60Hz display. That's not an opinion, but the nature of how our displays work, and it gets considerably worse on low-persistence displays.
The only option for smooth motion on a 60Hz display is 60 FPS. Anything more or less than that is not smooth.
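To make the judder point concrete: with vsync on a 60Hz panel, an uncapped ~40 FPS stream can't line up with the refresh, so frames alternate between one and two refreshes on screen. A tiny sketch of that arithmetic (plain Python; the numbers are just for illustration):

Code:
# Sketch: presenting a constant 40 FPS render stream on a 60 Hz vsynced display.
import math

REFRESH_MS = 1000.0 / 60.0   # 16.67 ms per refresh
FRAME_MS = 1000.0 / 40.0     # 25.0 ms to render each frame

# Each finished frame becomes visible at the next vsync boundary after it is done.
shown_at = [math.ceil(i * FRAME_MS / REFRESH_MS) * REFRESH_MS for i in range(9)]
on_screen = [round(b - a, 1) for a, b in zip(shown_at, shown_at[1:])]
print(on_screen)   # [33.3, 16.7, 33.3, 16.7, ...] -> visibly uneven motion (judder)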

Tomb Raider is a Gaming Evolved title (think the opposite of Gameworks) and the 960 is going to struggle enabling all the eye candy in any game. I know, I have one in my HTPC/ITX build.
Yes, I know that it's not a GameWorks title. My post was in response to someone claiming that this was just AMD complaining because their tessellation performance is not on-par with NVIDIA's.
I was demonstrating that my own experience with a 960 and a 970 says that those cards don't have sufficient tessellation performance to enable it in most games either - whether it's a GameWorks title or otherwise.

So this excessive tessellation that GameWorks or "The way it's meant to be played." titles use is negatively affecting the performance on GPUs from both manufacturers.
NVIDIA just see it as an acceptable trade-off because it hurt AMD cards more than it hurt NVIDIA cards.
But AMD now have options in their driver to limit tessellation, while NVIDIA do not, so they're really just hurting performance on their own cards for no reason now.

And I don't expect to max out the settings in most games with a 960 (they were definitely not maxed out there) but in most games I can't enable tessellation at all without the framerate dropping below 60 - even with most other settings turned down.
 
It's about AMD whining because NVidia makes something they don't, and now they're suffering for it.

NVidia makes a gaming toolkit that speeds up game development and adds features that push the limits of graphics cards.
AMD needs to stop complaining and do the same.
They did and it's called Mantle. Then Mantle became Vulkan. But everyone is circle jerking over DX12 like it's something never seen before. :rolleyes:

Unless I am mistaken, no one is forcing developers to use GameWorks.
You think developers make this decision?

[meme image]
I don't see how this is "locking out the competition". I do agree about open standards, but neither side is truly into that...if AMD were more competitive/popular I guarantee they'd be doing the same thing. It makes business sense to do so and AMD/nVidia are businesses. Not saying I like it, but that's kind of just how it is.

You do realize that AMD isn't the only competitor to Nvidia, right? Intel Iris Pro graphics? As a consumer this isn't a good direction. They pulled Arkham Knight, for crying out loud, because the game was so badly broken.
 
However, the vast majority of displays have a fixed refresh rate of 60Hz.
At 60Hz, games need to be locked at either 60 FPS or 30 FPS.
If you don't synchronize the framerate to the refresh rate, the game is going to judder badly. (e.g. uncapped and running at 40 FPS instead of 30 FPS)
But 30 FPS is, in my opinion, unacceptably low. Low enough that I get motion sick or end up with headaches after playing for more than a short while.
And you cannot eliminate judder with 30 FPS on a 60Hz display. That's not an opinion, but the nature of how our displays work, and it gets considerably worse on low-persistence displays.
The only option for smooth motion on a 60Hz display is 60 FPS. Anything more or less than that is not smooth.

Thank you, zone74. I never knew there was a technical term called Judder, I have experienced it just never knew what the condition is called, learned something new today.
 
Imagine that, the newest, most sparkly special effects are taxing on GPUs...inconceivable!

A couple of other questions:
How much exactly would an enterprise license for GameWorks cost AMD? If it's that important, buy a damn license; then you can jerk off to all the source code you like.

Game assets can be made to have or not have tessellation applied. It is not Nvidia's fault if the game artists throw displacement maps all over walls and flat slabs of concrete where a simple bump or normal map would have sufficed.

There has to be a reason why AMD's "solutions" aren't being used. Anyone have any idea why?
 
They did and it's called Mantle. Then Mantle became Vulkan. But everyone is circle jerking over DX12 like it's something never seen before. :rolleyes:


You think developers make this decision?

[meme image]


You do realize that AMD isn't the only competitor to Nvidia, right? Intel Iris Pro graphics? As a consumer this isn't a good direction. They pulled Arkham Knight, for crying out loud, because the game was so badly broken.
Good old EA: "Wait, our game is super broken and glitchy? Shit... well, put it on the shelves anyway, who'd notice? It's the PC master race, everyone will just blame everything else." You know, the same publisher that embraced Mantle because "it's just great" after taking a shit ton of money from AMD to do so. Developers don't get money from Nvidia so much as they get a shortcut that saves them from spending money; it's essentially free work, which is what they're after. You see this with Crysis 2 and tessellation: a lazy water implementation that sits under the whole map even when it doesn't have to. Developers want to do things quick, dirty and cheap, not right.
 
So AMD is doing more bitching and whining... All they seem to do lately is put out sub-par cards and moan and complain.

Give me a good reason to switch from nVidia and I'll look at it seriously. Sit there and whine like a 2-year-old and I'm not even interested in reading what you have to say. Keep on moaning and my interest in AMD declines more and more - much like AMD themselves, coincidentally. Surely by now you're starting to realise, AMD, that having a giant moan party about your competition does nothing but harm yourself?
 
Do you think AMD gets the source code for every game released in order to optimize their drivers?

Either way, the issue was HairWorks using a lot of tessellation. AMD has poor tessellation performance, while Maxwell handles it much better. It was mitigated on older NVIDIA cards and AMD cards with later patches/drivers, whatever.

AMD is trying to distract from the fact that their cards are underperforming across the board. They called Fury "an overclocker's dream" and the card for 4K. You can read HardOCP's review to see how much pure BS that was.

AMD is in full panic mode right now. If their stock drops another dollar they could be delisted from the stock exchange and that would be the last nail in a nearly sealed coffin.

I am going to laugh my butt off and most likely game a lot less if AMD goes out of business. I am not going to spend $650 on a mid-range Nvidia card that barely runs faster than previous cards. :D

Also, my R9 290 4GB ran great at 4K resolutions. The only game I have that I cannot run maxed out is Crysis 3, and even then it plays smooth and looks great at 4K on my 10-bit 4K panel. :)

So, if AMD goes out of business, where are Microsoft and Sony going to get the chips to keep producing their consoles? They license the chips, after all, and do not own the tech.
 
I see; it's certainly true that having a dedicated GPU will decrease battery life and increase noise, especially if you value power saving and a silent computer. While I do not know what you play, I do realize there are plenty of modern games out there that aren't FPS dependent.

My take is, a game shouldn't be locked at 30 FPS if yours or anyone else's hardware is capable of producing much higher FPS. Locking at 30 FPS is a trend that really started during the last console generation (for good reasons) and seems to be migrating to PC gaming, where we aren't used to seeing frame locks.

Yes, I think you're right about the locking starting with the last console generation. In fact, it's reasonable to blame consoles for current FPS locking as well, since most of these games are coded for consoles first (let's face it, console games are typically a lot more profitable than PC games, so porting has been an afterthought for 15 years or more), and some of their physics and other systems end up tied to the framerate. Unlocking it for a PC would not only introduce the judder you're already talking about, it would also require more developer effort. That effort translates into additional cost, and as much as gamer types want to deny it, that's unacceptable for a business's profit margins, so locking is the cheaper and easier option in a PC market where it's harder to make money.
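The "physics tied to framerate" part is usually a game stepping its simulation once per rendered frame with a hard-coded delta. The usual fix, and the reason an unlock takes real developer effort, is a fixed-timestep simulation decoupled from the render rate. A rough sketch of that pattern (illustrative numbers, not from any particular engine):

Code:
# Sketch of decoupling simulation from render rate: physics always advances in
# fixed 1/30 s steps regardless of how fast frames are drawn, so unlocking the
# framerate doesn't change gameplay behaviour. Numbers are illustrative.
SIM_DT = 1.0 / 30.0   # fixed step the game logic was tuned for

def simulate(frame_times):
    """frame_times: wall-clock duration of each rendered frame, in seconds."""
    accumulator, steps = 0.0, 0
    for dt in frame_times:
        accumulator += dt
        while accumulator >= SIM_DT:    # run as many fixed steps as time allows
            steps += 1                  # (a real engine calls update(SIM_DT) here)
            accumulator -= SIM_DT
    return steps

# Roughly the same wall-clock time gives roughly the same number of physics
# steps whether it was rendered as many fast frames or fewer slow ones.
print(simulate([1.0 / 144.0] * 90))   # ~0.63 s at 144 FPS -> 18 steps
print(simulate([1.0 / 30.0] * 19))    # ~0.63 s at 30 FPS  -> 19 steps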

Fake Edit: I know, people are going to claim that the PC game market is alive, healthy, and arguably bigger than the console market in terms of total sales. That's correct, but the sales are distributed across a much larger number of current and older games, so an individual new title doesn't make as much moolah, which is where corners have to be cut, since PC gamers are basically second-class citizens in terms of sales dollars.

The bottom line...if you wanna play a game and get treated like you're worth the effort of making it, offload your gaming box at Goodwill or an electronics waste recycler and go buy a current generation console. Then the whole FPS lock question goes away anyhow.
 
Apple does the same thing with iOS updates. The latest one made my iPad 2 useless because of how slow it runs. So long as fanboys do not hold their messiahs accountable, companies will continue to do this.

Claiming iOS made your iPad slow is like saying Crysis made your PC slow. It's not the game making your shit slow, it's your slow-ass PC unable to keep up. Gameworks adds all sorts of graphical enhancements that are simply very demanding on hardware. Of course rendering every single strand of hair on the human head is going to have an impact on performance. The only reason they're doing this is because it is at least somewhat possible on today's platforms. They could have done this 10 years ago if they wanted, but you'd only get 5 fps. Now you can reasonably play a game in real time at around 30 fps.

AMD's argument is that "games look too nice now, we need to dumb it back down for performance because users are stupid and they're going to enable all the bells and whistles and play in a substandard manner".
 
A couple of other questions:
How much exactly would an enterprise license for GameWorks cost AMD? If it's that important, buy a damn license; then you can jerk off to all the source code you like.
That's unlikely. A source code license to parts of GameWorks would almost certainly exclude such a thing since it's intended for game developers integrating the code into a particular title.

AMD's problem is that while it no doubt has all the analysis tools it needs to grab shaders, tweak code in real time and analyze those effects on GPU utilization, it doesn't have the resources necessary to do that anymore. When a new game comes out and AMD updates drivers to improve performance for it, it's very unlikely AMD has ever seen the source code for the game. AMD finds it easier to whine about GW than admit it's falling apart and can't function as a top tier GPU maker anymore.
 
Bravo, Mantle: out of all the PC games available, you have 10 games supported.

10.

Not a pathetic 4, but a mind-blowing 10. Steam has 12,880 games for sale, so Mantle supports about 0.08% of them. That's a success if I've ever seen it.

And my point about drivers was about company resources. When your market share is crap compared to Nvidia's, and a common complaint is driver quality and release frequency... why the hell would you make an API before fixing the driver problem?

I'm sure all of those games are really good because you can buy them on Steam. But how long has Mantle been out in public form? It still has games supporting it this year.

Because the API makes it possible for people with slower CPUs to enjoy gaming at the highest graphics settings. That is not a "problem" you can fix in DX11 drivers.
If AMD shipped the same kind of DLL with games, then Nvidia couldn't fix it either, so there is no driver problem. It resides outside of the normal game code and is called from the game rather than being included as source code.
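For the CPU point: with a high-overhead API every draw call costs the driver a fixed slice of CPU time, so a slow CPU caps the framerate no matter how fast the GPU is, and a thinner API shrinks that per-call cost. A back-of-the-envelope sketch (the per-call costs below are assumptions for illustration, not measurements):

Code:
# Back-of-the-envelope: how per-draw-call CPU cost caps framerate on a slow CPU.
# The microsecond costs are assumed for illustration, not measured values.
def cpu_capped_fps(draw_calls, us_per_call, other_cpu_ms=2.0):
    frame_cpu_ms = other_cpu_ms + draw_calls * us_per_call / 1000.0
    return 1000.0 / frame_cpu_ms

calls = 8000
print(round(cpu_capped_fps(calls, us_per_call=3.0)))   # "thick" API: ~38 FPS CPU cap
print(round(cpu_capped_fps(calls, us_per_call=0.5)))   # thin API:   ~167 FPS CPU cap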
 
That's unlikely. A source code license to parts of GameWorks would almost certainly exclude such a thing since it's intended for game developers integrating the code into a particular title.

AMD's problem is that while it no doubt has all the analysis tools it needs to grab shaders, tweak code in real time and analyze those effects on GPU utilization, it doesn't have the resources necessary to do that anymore. When a new game comes out and AMD updates drivers to improve performance for it, it's very unlikely AMD has ever seen the source code for the game. AMD finds it easier to whine about GW than admit it's falling apart and can't function as a top tier GPU maker anymore.

LOL. It is the Nvidia-provided DLL that decides which DX11 routines get used for which effect(s). So "if" AMD were to fix this, nothing is stopping Nvidia from introducing other code into the DLL, and the whole thing starts all over again.
 
d0x360 said:
It's simple. nvidia doesn't pay devs to use GameWorks BUT they do pay for ads.
Do you have a source on that? Huddy has flat-out stated that Nvidia pays development companies 6-7 figures not only for advertising but also stipulating that GameWorks gets used in their title. Now, people can doubt Huddy's credibility, fine, but a statement that bold would be slander if it's not true, and to the best of my knowledge Nvidia hasn't taken any action against him or AMD.

MavericK96 said:
Unless I am mistaken, no one is forcing developers to use GameWorks.
You're either mistaken or Huddy is lying and Nvidia's not doing anything to disprove him aside from saying "nu-uh". I wish I could remember the exact source, but it was the interview between Huddy and PC Perspective or Huddy and Maximum PC (I forgot which one, sorry) where he said they're paying management of the companies to accept Gameworks, then the company is contractually obligated to use it, whether the actual developers want to or not. So the COMPANY isn't being forced, but the developers are.

pxc said:
How so? Forcing a lower maximum tessellation level does have an IQ impact, but it's an acceptable trade-off vs performance. What the issue primarily is, IMO, is that GW does not have a built in way to do the same thing. Considering the wide range of tessellation performance on even its own cards, there should have been an easier way to adjust it.
Well, one example AMD uses is that tessellation is being performed on the water in Crysis 2 even on levels where the water is literally never seen, like the inner-city ones. Just turning on tessellation will cause a framerate drop (even if no visible objects are being tessellated). For COD: Ghosts and Arkham Origins, they claim the tessellation degree is way beyond what anyone can even SEE, with the cape in Arkham Origins supposedly using over 1 million tessellated polygons. At that point, they're just driving down performance for everyone, but hurting AMD more.
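For a sense of scale: triangle counts grow roughly quadratically with the tessellation factor (a uniformly tessellated quad patch at factor n comes out to about 2*n^2 triangles), which is how a single asset balloons into the millions. A quick worked example (the patch count is made up):

Code:
# Rough arithmetic: a quad patch uniformly tessellated at factor n becomes an
# n x n grid of cells, i.e. roughly 2*n^2 triangles. Patch count is made up.
def triangles(quad_patches, tess_factor):
    return quad_patches * 2 * tess_factor ** 2

for n in (4, 16, 64):
    print(f"factor {n:2d}: {triangles(500, n):>9,} triangles from 500 patches")
# factor  4:    16,000
# factor 16:   256,000
# factor 64: 4,096,000  <- far beyond what is visible at normal resolutions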


pxc said:
How is it bad for the entire ecosystem, and what standards are you talking about?
Two things:
1. The practice of Nvidia paying companies with contract clauses that make GameWorks mandatory. This gives Nvidia an opening to derail AMD at the source rather than just making the better product. This isn't a good thing for gamers. Another person on here referred to this as crippleware, and that's at least partially how it's being executed; it just happens to cripple AMD a lot more.

2. Let's get away from the bullshit where you MUST have Nvidia or MUST have AMD in order to run features, huh? I mean, we've already been down this road with 3DFX and Glide gaming: do we really want more games where you need both Nvidia and AMD cards in order to have all the features of a game? I remember a time when, if you wanted to play some games 3D-accelerated, you needed a Voodoo card, but if you wanted better textures, you needed a Riva TNT. It was bullshit then and it's not doing us any favors drifting back towards it. That's what I mean by harming the ecosystem: having games tied to specific chipsets. It's much better for gamers to have the games be hardware agnostic, and some cards will simply run them better than others.
 
They did and it's called Mantle. Then Mantle became Vulkan. But everyone is circle jerking over DX12 like it's something never seen before. :rolleyes:


Rolleyes when you don't know what you're talking about. :)
How is Mantle equivalent to GameWorks?
 
Well, one example AMD uses is that tessellation is being performed on the water in Crysis 2...
:rolleyes: Crysis 2 isn't a GameWorks game (it predates it), and CryEngine always tries to push hardware as much as possible at the maximum quality settings. And yeah, it looks totally like Nvidia sabotaged AMD performance, so much so, in fact, that AMD beat equivalent-level Nvidia cards by respectable margins when the game launched: http://www.guru3d.com/articles_pages/crysis_2_dx11_vga_and_cpu_performance_benchmarks,8.html

Worthless, crazy, tin-foil-hat conspiracy/speculation was removed from the quote and there wasn't much left to respond to. :rolleyes:

Bawwwwww.....
 
So AMD is doing more bitching and whining... All they seem to do lately is put out sub-par cards and moan and complain.

Give me a good reason to switch from nVidia and I'll look at it seriously. Sit there and whine like a 2-year-old and I'm not even interested in reading what you have to say. Keep on moaning and my interest in AMD declines more and more - much like AMD themselves, coincidentally. Surely by now you're starting to realise, AMD, that having a giant moan party about your competition does nothing but harm yourself?

AMD is losing more and more relevance every quarter. The past three years have shown completely pathetic efforts from the company. It's obvious that they're just phoning it in now, riding it down, and spending money on carnival-barking marketeers like Huddy to play on people's pity emotions to keep the whole thing propped up for as long as possible. They keep whining about GW because it diverts attention from their own poor efforts.

The AMD Radeon R9 Fury Is Currently A Disaster On Linux
Shadow of Mordor Released For Linux, But Only For NVIDIA Gamers

Phoning it in. But remember, it's all nvidia's fault.
 
"It certainly feels like it's about reducing the performance, even on high-end graphics cards, so that people have to buy something new."
Who exactly is upgrading their GPU solely for virtual hair and pointless smoke?

"Man, this GTX 680 is a pretty sweet card, but I needs my sinewy Geralt hair and billowing Batmobile smoke!" "I'd better buy a 980ti!"

What a hilariously idiotic thought process. We don't upgrade for Gameworks features, we just turn that useless crap off. AMD has got nothing at this point except trying to fan the flames of tinfoil hat gamer rage. We're witnessing the last days of AMD - what a sad and pathetic sight to behold.
 
Yup, +1 to this stuff. High FPS is about as beneficial as a 1,200-watt Platinum PSU, a water cooler, or a 4K screen. It's just the pointless pursuit of some comparatively high number, with at best weak justification behind it, to support spending money on an immediate want instead of saving for a long-term need that can't yet be foreseen. Besides that, locking a game at 30 FPS is a great idea to reduce the power consumption and heat output that would otherwise get thrown away rendering frames with little to no additional benefit. The other nice thing about 30 FPS is that it's exactly half of most monitor refresh rates, so the experience is super smooth too, rather than being jittery at something that constantly varies between 45 and 80 (though adaptive sync is kinda starting to address that, just not in a widespread or cost-effective manner yet).

You know, I was about to jump all over this post, and was wondering who could have made such a heinous post... and then I realized who posted it. :D
10/10 - Would read again!
 
You know, I was about to jump all over this post, and was wondering who could have made such a heinous post... and then I realized who posted it. :D
10/10 - Would read again!

Aw gee thanks! ;) It's nice to see you again Mister Red Birdie. It's been really quiet since heatlesssun is no longer in super-deluxe-happy-mode over a tablet-based UI.
 
Whenever AMD stops pushing that Raptr malware they can talk. Until then, can it.
 
Just as bad as GeForce Experience. I mean, hell, GeForce Experience has been proven to lower performance.

Both of them are complete trash.

I use GeForce Experience on the kids computers and it works OK. Good place to start some tweaking with if you don't want to waste a lot of time. Overall good for them though. Both have SLI rigs on 1440p screens, not like a bit of perf loss is going to be noticed. :p
 
The last card I bought was Nvidia, after a long line of AMD/ATI. I used to buy into the "ATI junk software" line. I found the Catalyst suite to be sluggish and overcomplicated.

After switching to Nvidia, though, I really, really miss ATI. My next card won't be another Nvidia even if it's 20% slower than the same-priced ATI card. Nvidia's junk software is far worse than anything ATI ever pushed on me.

In any event... let's all be honest: the differences between most of the cards from both are in the low single digits, percentage-wise. The quality of their renders is almost identical these days. The feature sets, where they differ at all, differ almost always only on paper.

It is likely time to support AMD more with purchases. If we lose them completely, the next 10-20 years in gaming won't advance anywhere... the lack of competition at the "high end" will drive Intel/Nvidia to do nothing but focus on mobile... down the road we might be able to play today's games on a phone, but there would be little progress in the way of game tech. The next machine I build will be AMD/ATI, just 'cause.
 
How do you explain a tessellation performance difference of 13% turning into over 100% with HairWorks?

This is not even at x64 and it's much more than a 13% difference.

[benchmark chart: tessellation test, below x64]


ditto with a different test at x64:

[benchmark chart: tessellation test at x64]
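As a neutral illustration of why those two percentages don't have to match: how far a tessellation-throughput gap shows up in overall FPS depends on how much of the frame the tessellation-heavy pass occupies. A toy calculation with assumed frame-time splits, where one card is taken to be 2x slower at that pass:

Code:
# Toy arithmetic: the FPS gap depends on how much of the frame time the
# tessellation-heavy pass occupies. All millisecond numbers are assumed.
def fps(base_ms, tess_ms, tess_slowdown=1.0):
    return 1000.0 / (base_ms + tess_ms * tess_slowdown)

# Card B is assumed 2x slower than card A at the tessellation-heavy pass.
for base_ms, tess_ms in ((13.0, 2.0), (8.0, 12.0)):
    a, b = fps(base_ms, tess_ms), fps(base_ms, tess_ms, tess_slowdown=2.0)
    share = tess_ms / (base_ms + tess_ms)
    print(f"tess is {share:.0%} of the frame: {a:.1f} vs {b:.1f} FPS ({a / b - 1:+.0%} gap)")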
 
Jesus.

I hope that when AMD eventually goes under because they're incapable of producing competitive products anymore, all their fanbois move on to consoles and leave PC gaming.
 
Jesus.

I hope that when AMD eventually goes under because they're incapable of producing competitive products anymore, all their fanbois move on to consoles and leave PC gaming.

How can you even think no competition is good? And you're not a fanboi right?
 
It's obvious they are hindering performance on all GPUs. Have you paid attention to the performance of the GTX 780 Ti in GameWorks games? It can't hang with the 290X. I hate GameWorks. Nvidia's marketing is king; that is what they are good at. They can sell you a shit product and make it look good, and they have been caught cheating and lying... people still buy their shit. GTX 970, IQ "optimizations" in the past.
 
How can you even think no competition is good? And you're not a fanboi right?

There already is no competition; AMD doesn't compete with Intel or Nvidia anymore. It's time they exit the market and let so a competent company can step in and take their place. Instead, they think slinging shit around will somehow make their products better, all while they lie to consumers about their own lackluster product's abilities.

I have no brand loyalty to Nvidia. If freaking ISIS came in and created an actual better GPU than Nvidia's, I'd buy it in a heartbeat with no regrets. I simply care about getting the best product I can, not the name on the sticker of that product.

And as for the AMD fanbois, they swarm in and play armchair lawyer and armchair FCC, and make laughable claims about illegalities and ethical business practices whenever someone just mentions the color green in any context.
 
<snip>It is likely time to support AMD more with purchases. If we lose them completely, the next 10-20 years in gaming won't advance anywhere... the lack of competition at the "high end" will drive Intel/Nvidia to do nothing but focus on mobile... down the road we might be able to play today's games on a phone, but there would be little progress in the way of game tech.</snip>
I think you're wrong there. It's most often games themselves that push the tech, and vice versa. I think the biggest thing that will happen if AMD disappears is a large increase in prices from nVidia due to the lack of financial competition.
 
There already is no competition; AMD doesn't compete with Intel or Nvidia anymore. It's time they exit the market and let so a competent company can step in and take their place.
Gotta be the dumbest thing I've heard in my life.

Spidermanthatpostgavemecancer.jpg

Instead, they think slinging shit around will somehow make their products better, all while they lie to consumers about their own lackluster product's abilities.
I heard Nvidia graphics cards do great at 4K compared to AMD. :rolleyes: Hey, let's do another test, but with games that use more than 3.5GB of memory?
 
Fixed: It's time they exit the market and let a competent company step in and take their place.

What company would that be?

Seriously, who has the money to develop something like a GPU from scratch? The only other companies that could even hope to do it spend their time creating mobile tech, unless you think there is a second coming of PowerVR on the horizon. lol. If AMD stopped trying to innovate and refocused on mobile or something, progress would slow a lot... As it is right now, if Nvidia stopped trying to improve performance and just focused on shrinking size and power, AMD would take the performance lead in one cycle. (Pushing progress doesn't need to mean a photo finish; just being in the race can let you win if the other guy stumbles.)

AMD still has a lot of infrastructure from ATI they haven't destroyed yet. They are quite capable of creating great hardware. The Fury stuff might not have been the home run people hoped for, but really it's far from a bad product tech-wise. Again, let's all be honest with ourselves... none of us can really tell the difference between 59 frames and 53 frames per second. It's not quite as bad as the audiophile who swears they can hear the difference between a $20 and a $2000 digital cable... but it's not that far off. :)
 
I agree that games should never be limited to 30 FPS. Ideally there would be no limits on how high the framerate can go, as long as your hardware is fast enough.

However, the vast majority of displays have a fixed refresh rate of 60Hz.
At 60Hz, games need to be locked at either 60 FPS or 30 FPS.
If you don't synchronize the framerate to the refresh rate, the game is going to judder badly. (e.g. uncapped and running at 40 FPS instead of 30 FPS)
But 30 FPS is, in my opinion, unacceptably low. Low enough that I get motion sick or end up with headaches after playing for more than a short while.
And you cannot eliminate judder with 30 FPS on a 60Hz display. That's not an opinion, but the nature of how our displays work, and it gets considerably worse on low-persistence displays.
The only option for smooth motion on a 60Hz display is 60 FPS. Anything more or less than that is not smooth.

Yes, I know that it's not a GameWorks title. My post was in response to someone claiming that this was just AMD complaining because their tessellation performance is not on-par with NVIDIA's.
I was demonstrating that my own experience with a 960 and a 970 says that those cards don't have sufficient tessellation performance to enable it in most games either - whether it's a GameWorks title or otherwise.

So this excessive tessellation that GameWorks or "The way it's meant to be played." titles use is negatively affecting the performance on GPUs from both manufacturers.
NVIDIA just see it as an acceptable trade-off because it hurt AMD cards more than it hurt NVIDIA cards.
But AMD now have options in their driver to limit tessellation, while NVIDIA do not, so they're really just hurting performance on their own cards for no reason now.

And I don't expect to max out the settings in most games with a 960 (they were definitely not maxed out there) but in most games I can't enable tessellation at all without the framerate dropping below 60 - even with most other settings turned down.

All I was trying to point out is that AMD cards inherently perform much better in Tomb Raider than Nvidia cards, and as a result it may not be the best piece of software to use for your comparison.

With your 960 you should be able to enable tessellation in many games and hold 60 fps. What is your rig (sorry if it's listed in your sig)? You must have a huge bottleneck. I just ran the Tomb Raider bench on a 2500K @ 4.2GHz with a Strix 960 and got 59.6 fps average with a 48 fps minimum on Ultimate, and a 72 fps minimum on Ultra (just lacking TressFX, which is shitty in the first place). You should be able to max Tomb Raider even though it was designed for AMD...
 
I heard Nvidia graphics cards do great at 4K compared to AMD. :rolleyes: Hey, let's do another test, but with games that use more than 3.5GB of memory?

> Falsified Fury benchmarks before release

> Coil whine is only in review units

> An overclocker's dream

> The best 4k card there is

:)


What company would that be?

You won't know until it happens. I remember people thinking Microsoft could never be dethroned. Yet here we are today, with them the underdog. Companies come in and disrupt markets all the time. Like Google. Shit, even if you want me to make a guess, Intel. So, please, don't act like it never happens and never could in the GPU world.
 