The fps race has ruined driver quality

I've been gaming and interested in having nice hardware since the Quake 2 days (1997-ish), and I've owned tons of PCs in various configurations and many, many different video cards. I've been troubled by the lack of stability for a while now, but never to this degree.

I don't think any current video card by ATI or Nvidia is anywhere close to an acceptable state. I could list all the various problems these cards have, but instead I'll just point you to both of their official forums and this site.

If you're someone who plays a variety of PC games (seriously, stfu if you want to argue but only play two PC games regularly) or uses your PC for complex home theater setups, you know what I'm talking about.

I believe it boils down to one issue: the fps (frames per second, not first-person shooter) race. The problem is that many review sites, while doing a review, will simply dismiss game issues as something that will surely be fixed, and move on to fps as the main focus.

What I propose: call these companies out if ANYTHING breaks in any major game that's out, don't recommend anyone buy the card until the issues are fixed, and give bugs just as much weight as fps.

Thoughts?
 
My take? I think today's drivers kick quite a bit of ass compared to how they were back in the Quake days. They produce infinitely more complex 3D scenes than they did back then, and have relatively few problems to boot.

I think we just notice more minute and specialized problems as we gain knowledge or use our cards in exotic configs.
 
People want the fastest card. It's been that way for a long time.

Bug fixes happen too, quite frequently on both sides. Generally though, people just want the fastest. A large proportion of those also don't care if the card is power-hungry or whatever, as long as it gives them that extra 5FPS.

butterfliespretty said:
What I propose: call these companies out if ANYTHING breaks in any major game that's out, don't recommend anyone buy the card until the issues are fixed, and give bugs just as much weight as fps.

And what if it's a game bug rather than a driver bug? If we did as you propose, no one would be buying any cards from either AMD or Nvidia, since they both have issues one way or another with some games.

The only thing I'd like to see changed is AMD's one-driver-per-month policy: change it to 'release a driver when it's a significant improvement over the last one and doesn't break current features or reintroduce old bugs'.
 
Not sure where all these driver complaints come from. I usually play 25-30 different games a year and I never run into any noteworthy issue unless the game is 12+ years old (which usually requires some type of hack). Of course, I don't run anything exotic; right now I am running a 5850 in one computer and a GTS 450 in an HTPC, and neither one has had issues (I play on both computers about equally). The only real issue I have had was when the 5850 had just come out and The Witcher didn't work; other than that I can't complain. Just a quick note: I don't play blockbuster games, it's mostly games like King's Bounty, Men of War, and other games that would not get the attention of ATI/Nvidia...
 
AMD/nvidia are relatively bad at coding drivers, but find a good one, and they work most of the time. I play at least 15 games on a relatively regular basis, and all of them work fine with the driver I use now, the driver I used before that, and before that etc. etc.

Game developers are relatively bad at coding games, plenty of bugs in current games out there. Just take all the crashes in Bad Company 2 as an example, or all the VSync problems with the original Dead Space.
 
AMD/nvidia are relatively bad at coding drivers, but find a good one, and they work most of the time. I play at least 15 games on a relatively regular basis, and all of them work fine with the driver I use now, the driver I used before that, and before that etc. etc.

Game developers are relatively bad at coding games, plenty of bugs in current games out there. Just take all the crashes in Bad Company 2 as an example, or all the VSync problems with the original Dead Space.

Let me see you do a better job at making a piece of software run on millions of different hardware configurations
 
Gotta say, I've been doing 3D gaming since the Voodoo3 generation, and drivers today are worlds better from a compatibility standpoint than they were back then.
 
While I personally have no issue with ATI drivers (or nVidia, as long as they support the card I'm using), ATI has had issues since the Rage days. That's always been their knock, though it's not as bad as some claim.
 
While I admit stability of drivers and core functionality is pretty darn important, I'm more concerned with image quality these days.

Nvidia hasn't done anything recently to raise image quality, while AMD has even taken a few steps back.

And game developers must think we are all blind as bats. I have yet to see a flicker-free game.
 
Don't forget the terribleness of TDR (Timeout Detection and Recovery) in recent versions of Windows as well.
http://www.microsoft.com/whdc/device/display/wddm_timeout.mspx

Worst fucking idea ever. The detection routine is FAR too sensitive (that or drivers for Win7 are just crap regardless of version).

I can install XP on this box and NEVER get a video hang/timeout. But running Win7, I can probably get one at least once a day. Screen goes black, then resumes a couple seconds later after the display reinitializes.

Can't disable it via the registry, or the machine will just hang when one of these events occurs.
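
For reference, the knobs that Microsoft page talks about live under HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers (the TdrDelay and TdrLevel values). Here's a rough sketch of reading and stretching the timeout instead of killing detection outright; it assumes the documented value names, needs an elevated prompt, and only takes effect after a reboot, so treat it as an illustration rather than a fix:

Code:
# Sketch only: inspect / lengthen the WDDM TDR timeout on Windows.
# Value names come from Microsoft's WDDM timeout docs; treat the numbers
# as assumptions to verify. Run elevated; reboot for changes to apply.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_READ | winreg.KEY_WRITE) as key:
    try:
        delay, _ = winreg.QueryValueEx(key, "TdrDelay")
        print(f"Current TdrDelay: {delay} seconds")
    except FileNotFoundError:
        print("TdrDelay not set; the driver uses the default (2 seconds)")

    # Stretch the timeout to 8 seconds instead of disabling detection
    # entirely (TdrLevel=0 turns recovery off, which is exactly the
    # hang-on-error behavior described above).
    winreg.SetValueEx(key, "TdrDelay", 0, winreg.REG_DWORD, 8)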
 
Let me see you do a better job at making a piece of software run on millions of different hardware configurations

This is precisely why I believe consoles are winning the war. They're easier to code for. I also think that people are generally fed up with chasing drivers/firmware/whatever to alleviate their gaming performance issues when, at the end of the day, you could buy a console for less than the price of a high-end video card and play to your heart's content.

But that's just me. I haven't played anything seriously since UT, and before that SubSpace (which I am still fond of).
 
I have around 350 games now, and even though I can't play them all at the same time, I've beaten a huge majority of them and still continue to beat more today. Just yesterday I beat Dead Space 2 & Singularity.

I believe both AMD and Nvidia need to work on their drivers, but in reality I've never really had that many problems with either company. I did have problems with ATI's 10.12 drivers, but that's about it.

I don't think any current video card by ATI or Nvidia is anywhere close to an acceptable state

I don't agree with this statement at all, but that's my opinion. I run a 6950 2GB, unlocked and OCed, and this thing is a beast. It may not be a 5970 or a 580, but I can max out any game I've tried with pure ease. I only paid $258.00 for my 6950 2GB after MIR, so I can't complain.

What I propose: call these companies out if ANYTHING breaks in any major game that's out, don't recommend anyone buy the card until the issues are fixed, and give bugs just as much weight as fps.

This would almost be impossible to do, and to be honest, how do you plan on getting a huge mass of users to do this? Now, I'm not saying it would be a bad thing at all, but it would take a hell of a lot of work and dedication to pull it off.
 
How many "driver" issues are actually caused by the user? Not that most would ever admit they did something wrong.
 
Before I got the Radeon 5770, I had no problems up to Catalyst 8.xx and 9.12.

Once I got the 5770, I had some issues with ATI/AMD's PowerPlay up to 10.12: atikmdag errors kept popping up in Event Viewer whenever the screen went black or the cursor randomly enlarged. I ended up modding the BIOS to keep all power states at the same GPU and memory clock speeds.
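
If anyone wants to check whether their own black screens line up with the same thing, here's a quick sketch for pulling the most recent display-driver entries out of the System log. The provider names in the filter are assumptions on my part (they vary by vendor and driver version), so swap in whatever source your Event Viewer actually shows:

Code:
# Sketch only: dump the last few display-driver events from the System log.
# The provider names below are assumptions; adjust them to match the
# sources you actually see in Event Viewer.
import subprocess

query = "*[System[Provider[@Name='Display'] or Provider[@Name='atikmdag']]]"
result = subprocess.run(
    ["wevtutil", "qe", "System", f"/q:{query}", "/f:text", "/c:5", "/rd:true"],
    capture_output=True, text=True,
)
print(result.stdout or "No matching events found.")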

I moved on to the Radeon 6950 (unlocked shaders) and did the same thing with the power states, and currently I have no issues with 11.2.

So, my stable versions have been 9.12 (3870), 10.11 (5770), and 11.2 (6950).

But AMD is not the only one at fault here; Nvidia has problematic drivers too. I've seen the same power-state switching issue on Nvidia's 200-series GPUs, both on my own systems and on friends' systems.

And let's not forget the performance issues that plagued FFXI and FFXIV (if anyone plays them) during the 8800-series and 400-series days, especially the graphical issues in FFXI on 8800-series cards from many years back. Also, there was the time the drivers somehow magically burned up GPUs due to a fault in the software fan controller or something (I'd have to dig up that [H] thread again).

So, in the end, both AMD and Nvidia have had driver issues. Like someone mentioned above, and I will reiterate: it is wholly impossible for any company, whether it's Dell, Intel, AMD, Nvidia, Realtek, Creative, or someone else, to make a driver that works for the practically limitless number of combinations that PC or [H] users configure their computers into.

During the FFXI and 8800-series GPU heyday, I had the thought that the one and only way to fix driver incompatibility is to revamp the whole driver architecture and how the OS, devices, and software communicate with each other.

In my honest opinion, Plug-n-Play is a misnomer. TRUE Plug-n-Play would have drivers that auto-adapt their software to the hardware configuration that the new device is plugged into. It should work about as seamlessly as plugging a USB memory stick into the computer: no drivers, no mess. That's how it should work in the future. You buy a new video card or sound card, put it into your PCI Express 4.0 slot (or whatever), load up the driver software, and the software automatically reprograms and reconfigures itself for the hardware it's in. And when the hardware changes, it reconfigures itself automatically.

That, I believe, would fix all these driver issues, but I don't think we're anywhere near that yet from a programming standpoint.
 
How many "driver" issues are actually caused by the user? Not that most would ever admit they did something wrong.

Whenever someone complains about crap drivers (looking at you and my old CrossFire 5870s, AMD), there is always someone who claims USER ERROR! MINE WORK FINE SO YOU MUST BE DOING IT WRONG!

Two entirely different computers and countless configurations and they still gave me issues... Once the 580 came out I had to switch to Nvidia, at least their crap isn't completely broken :rolleyes:
 
Graphics cards and graphics in general are orders of magnitude more complex now than they were back then. The fact that we have as few issues as we do now speaks volumes about the quality of the work these companies put into their drivers. To be honest, most of the posts you see on here and other forums are from the <1% of the population that actually has a problem due to the drivers, or, more often, screwed something up themselves.
How many "driver" issues are actually caused by the user? Not that most would ever admit they did something wrong.
Exactly. You'll have people come on here and cry and moan that they tried and it can't be their fault when they just kept making the same mistake(s) over and over. Honestly, I've built dozens of computers and have not once had a driver issue that I couldn't fix or that wasn't fixed by AMD/NVIDIA because it was a known issue. Some people simply can't man up and fix the problem or admit their failure.
 
I disagree. Compared to 15 years ago, driver quality is amazing...for single card setups. But I agree, driver quality is down from where it was 5 years ago.

I think that the thing that has hurt driver stability the most in the past decade has been feature creep. I'm not talking about new image quality improvements, I'm talking about new and exciting ways to either (1) waste your video card cycles or (2) waste your money on even more expensive setups. Every new feature means the number of developers increases and testing complexity grows.

Some examples:

SLI/CFX - I can't name any other feature that gets so many driver man-hours wasted on it yet is used by so few people. It's also the single greatest source of bugs in the modern video card world! Only a tiny fraction of real users (this forum doesn't count) take advantage of this, and to this day I find it hard to believe that all the driver development costs have been justified by the additional card sales attributed to it.

Triple-monitor gaming - in reality, this is an excuse to sell more SLI/CFX setups, because yes you can try Eyefinity on your single AMD card, but as soon as you see the marginal performance/sacrificed settings, you'll throw another card into the fire. So, suddenly your developers need to make sure that SLI/CFX and triple-monitor work well together (just ask AMD how hard this is - they'll tell you how many months it took before Eyefinity worked with CFX). The percentage who can afford this is even smaller than the pure SLI/CFX group, and yet the GPU makers think they can justify the extra costs/complexity this brings.

Accelerated physics - this is a stupid, gimmicky way to do physics acceleration because (1) you can't do anything interactive and (2) we have the spare CPU cycles now to handle it. Still, that doesn't stop companies from developing sometimes-incompatible standards and wasting developer time. Both AMD (Havok FX in its various forms) and Nvidia (PhysX) are guilty of this, and it pains me to think of how many hours were wasted on this crap that hardly anyone uses.
 
Don't forget the terribleness of TDR (Timeout Detection and Recovery) in recent versions of Windows as well.
http://www.microsoft.com/whdc/device/display/wddm_timeout.mspx

Worst fucking idea ever. The detection routine is FAR too sensitive (that or drivers for Win7 are just crap regardless of version).

I can install XP on this box and NEVER get a video hang/timeout. But running Win7, I can probably get one at least once a day. Screen goes black, then resumes a couple seconds later after the display reinitializes.

Can't disable it via the registry, or the machine will just hang when one of these events occurs.

Honestly, this is a thing of beauty. I'd prefer my driver to eat shit and restart over hanging the system or fucking off to a BSOD.

Especially since it prevents stupidity like your browser taking your system down because someone decided to make a page with some kind of 3D content, be it WebGL or whatever, that's designed to throw your driver into an infinite loop.
 
Lol at some of you saying drivers are better than they were in the Voodoo3 days / 15 years ago.

Back then I never had problems and I would play just about every major title that came out. The issue about there being more configurations nowadays is true and I'm sure is part of the issue.

However, I'd like to point out a few things. If you look at the latest driver releases, you see that every time one is released there's a huge focus on performance, but they always seem to break something to make up for it. Hell, custom resolutions haven't worked on any Nvidia card of this generation for around six months now. ATI/AMD isn't any better, really.

I think some of you responding really only play a few games or aren't very sensitive to various issues (I mean, doesn't everyone have at least one minor issue with BC2?). Go look at the official forums for both and tell me again that this is user error and simple configuration conflicts. Also, when deciding whether an issue is game-related or driver-related, it's easy to diagnose if things used to work fine in the game previously; that should be common sense. I think some of you just try to find loopholes in any statement that's posted on here.

Also, some of you fanboys are so invested in anything you've spent money on that you refuse to acknowledge the issues you're having.
 
In the last five years, I've very rarely run into any major driver issues. I've had to roll back to previous revs with both Nvidia and AMD before, but never for anything serious. I did have some CrossFire problems when the 5800 series launched, but I've had the same type of problems with SLI.

I definitely think drivers have come a long way since the mid '90s, but maybe I was more exposed to driver problems having worked at CompUSA at the time when I was 16.
 
Honestly, this is a thing of beauty. I'd prefer my driver to eat shit and restart over hanging the system or fucking off to a BSOD.

Especially since it prevents stupidity like your browser taking your system down because someone decided to make a page with some kind of 3D content, be it WebGL or whatever, that's designed to throw your driver into an infinite loop.

Lol, using language like that to describe something like this makes me laugh, don't really know why, I just find it hilarious :D
 
If any of you ever had a 3dfx Banshee, you wouldn't complain. I bought one in 11th grade when I couldn't afford a Voodoo3, or an upgrade. 3dfx updated the Voodoo3 drivers constantly, and the Banshee drivers about once every two years. I had to learn how to hack my own drivers together from Banshee/Voodoo3 stuff.

ATI drivers were garbage until AMD bought them. They have come A LONG way since then.

In my opinion, we all just have higher expectations than we used to. We're spoiled by iPods and consoles that 'just work'.
 
I appreciate having more options and the stability that new drivers offer. Old games and old drivers crashed too; in fact, I found they crashed more often and were pickier about drivers.
 