ATI Drivers really that bad?

Drivers and different hardware platforms can be tricky to get working stably in some cases. Mr. Wolf, you can always check for your hardware problems on different computers if you want to be certain your hardware is not flaky.

I can only say that, having owned both ATI and Nvidia cards over the years, seeing Nvidia magically gain 100+ benchmark points over the previous driver with each revision is not what impresses me. As for stability in games, I've got to say that Nvidia has been poorer for me than ATI.
 
OK, here it goes, from my experience.

ATI 1 or 2 monitors 1 Video card = Ok

Nvidia 1 or 2 monitors 1 Video card = OK

ATI 1 or 2 monitors 2 Video cards (xfire) = OK

Nvidia 1 or 2 monitors 2 Video card (SLI)= OK

ATI 3 or more monitors 1 Video card (Eyefinity) = random issues (DisplayPort issues with the adapter, and results may vary with hardware components; I have tried active DVI and VGA and it works, but only after lots of testing and troubleshooting. Different driver versions will also change results in certain games like BFBC2, even changing the BIOS on the video card). Not a smooth ride.

ATI 3 monitors 2 or more Video cards (Eyefinity + Crossfire) = sucks, doesn't work well. Too many issues, random problems depending on the hardware and games.

Nvidia 3 monitors 2 or more Video cards (SLI + Surround) = works; some tweaks required for certain games, but it scales and has fewer issues than Eyefinity.

That's only from my experience.

I have tried 5870s in Crossfire Eyefinity with 10.5 and below and had to take out one card to make it work properly.
If you're going for an Eyefinity setup, just use one card, unless the new drivers have already fixed all the issues with EF + XF.
As for Nvidia, I'd rather have a little more power draw and heat with good results than less heat and less power with a crappy gaming experience overall.

Currently running 2 EVGA GTX 470s in SLI Surround, and it's much better.
 
@yuplis3d
Can you set custom resolutions? (e.g. 1680x1024?)
 
In my personal experience as an IT pro of a few years, I can say driver issues on both platforms are about equal --keep in mind I've seldom done anything like 3 monitors on 2 video cards. Ages ago a friend of mine used to have a problem with color accuracy on ATI cards, but that's long since been fixed. ATI has typically had better aniso (anisotropic filtering) quality from what I've seen, but as of the G80 and R700, they both have such good quality it's hard to argue. Again, I've got little experience with Crossfire or SLI, but the experiences I have had usually involved only minor annoyances --no showstoppers; the game was almost always playable.

...

The current equivalent of this situation is GPU computing, where of course we have ATI and Nvidia going with different approaches. In this area I suppose it might be logical that the ATI side is more "buggy," in the sense that developer support will likely lean towards the more established Nvidia CUDA solutions.

...

At the moment if I had to pick, I'd say Nvidia's drivers are still better, simply because they have been the market leader for so long that developers are more likely to lean towards and test on Nvidia. So it is likely that Nvidia has fewer odd bugs here and there than ATI. But again, this is hardly significant, and likely inconsequential for most mainstream popular titles, since those develop for both. (For instance, I know more casual gamers who play free MMOs, and they tend to have more issues with ATI cards.)

Really though, nowadays if you look at brand comparisons, the differences only exist in their extra feature sets: CUDA/OpenCL, Eyefinity/Surround, PhysX, etc.

I'm a computing science student at a local university and my focus has been supercomputing on graphics chips in consumer-grade PCs. While I wouldn't say I'm fluent in either OpenCL or CUDA (who can be, with the languages being so new?), I will say that, as a mass-market developer, your only choice is OpenCL (--or DirectCompute, part of the DirectX 11 API suite, which I've heard some good things about but haven't looked into). CUDA is a very mature platform with a very good implementation top to bottom, which makes it a fantastic proposition for purpose-built machines and supercomputing applications. But if you're Valve and you want to do something insane for your next Half-Life game, you look to OpenCL, because everyone with an HD 5000 or GeForce GTX series card will support it --of course, if you're Valve and you want to take advantage of those APIs, you're looking at market readiness in 2013 at the earliest.
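The CUDA-versus-OpenCL choice discussed above is mostly one of ecosystem; both expose the same data-parallel model, where you write a kernel and the runtime executes one instance per element of the problem. A toy pure-Python sketch of that model (the `launch` helper is an illustration, not a real GPU API):

```python
# Toy illustration of the data-parallel kernel model shared by CUDA and
# OpenCL: a "kernel" runs once per global index over a flat problem size.
# This is a plain-Python sketch, not a real GPU API.

def launch(kernel, global_size, *buffers):
    """Run `kernel` once per global id, conceptually like an OpenCL
    clEnqueueNDRangeKernel or a CUDA <<<grid, block>>> launch."""
    for gid in range(global_size):
        kernel(gid, *buffers)

def vec_add(gid, a, b, out):
    # The equivalent OpenCL C kernel would be:
    #   __kernel void vec_add(__global const float *a,
    #                         __global const float *b,
    #                         __global float *out) {
    #       int gid = get_global_id(0);
    #       out[gid] = a[gid] + b[gid];
    #   }
    out[gid] = a[gid] + b[gid]

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * 4
launch(vec_add, len(a), a, b, out)
print(out)  # [11.0, 22.0, 33.0, 44.0]
```

The point is that porting between the two APIs is mostly renaming boilerplate; the kernel body and the launch-per-element structure carry over almost unchanged.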

But as for developers aiming more for Nvidia than ATI, that's false. An unsupported dev will aim for what the documentation tells him he will get. The guys in the programming department will then compile what they can and throw it onto whatever box is most convenient, be it ATI or Nvidia, and see if it gives them the effect they're looking for. Chances are, with the DirectX and OpenGL specs now being so mature, it's going to look nearly identical on both platforms, and if it doesn't, the difference will be patched out quickly.
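The compile-it-and-throw-it-on-whatever-box workflow described above amounts to a cross-vendor image comparison: render the same scene on each platform and check that the outputs match within rounding noise. A minimal sketch of such a check, with hypothetical hand-made "framebuffers" standing in for real GPU readbacks:

```python
# Sketch of a cross-vendor QA check: compare two framebuffer dumps within
# a small per-channel tolerance. `frames_match` is a hypothetical helper;
# a real pipeline would read back actual GPU framebuffers.

def frames_match(frame_a, frame_b, tolerance=2):
    """True if every RGB channel differs by at most `tolerance` (0-255)."""
    if len(frame_a) != len(frame_b):
        return False
    return all(abs(ca - cb) <= tolerance
               for px_a, px_b in zip(frame_a, frame_b)
               for ca, cb in zip(px_a, px_b))

# Fake 2-pixel "framebuffers" from an ATI box and an Nvidia box:
ati_frame    = [(200, 100, 50), (10, 10, 10)]
nvidia_frame = [(201, 100, 49), (10, 11, 10)]
print(frames_match(ati_frame, nvidia_frame))  # True: within rounding noise
```

A tolerance of a couple of counts per channel absorbs legitimate differences in rounding and filtering between vendors, while anything larger flags a rendering bug worth patching.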

But that assumes the devs go unsupported, and Nvidia's "The Way It's Meant To Be Played" (TWIMTBP) program is much more aggressive than the similar program from ATI. Nvidia lends programming knowledge to developers to add tricky effects that make the games look good --of course, if you get one of the chip makers into the game, they're going to make sure any lent code works best on their platform, at the very least (or, infamously, not at all on the other platform, as was the case with AA in Batman).

A lot of this comes from the fact that a lot of people are running the same hardware without any issue. It's difficult to understand why two people running very similar hardware in a very similar environment can have such extremely different experiences. People base everything on their own experience, and someone who hasn't had trouble will look to something other than the driver to blame.

This is a very good point, and the truth is that a liking for one company over the other often boils down to good or bad hardware, and hardware is a crapshoot --which is why I think Nvidia's decision to push out a card that runs at an expected 95C under load is a terrible idea (I mean really, no normal consumer cleans out their computer, and if it's 95C under load when new... what does that number turn into after 2 years of dust?)
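The dust worry can be made concrete with a back-of-envelope thermal model: die temperature is roughly ambient plus power times the cooler's thermal resistance, and dust raises that resistance. All numbers below are illustrative assumptions, not measured GPU figures:

```python
# Back-of-envelope version of the dust worry: die temperature is roughly
# ambient + power * thermal resistance, and dust buildup raises the
# cooler's effective thermal resistance. Illustrative numbers only.

def die_temp_c(ambient_c, power_w, r_theta_c_per_w):
    return ambient_c + power_w * r_theta_c_per_w

ambient = 25.0   # case air, degrees C
power = 280.0    # assumed load draw, watts

r_clean = 0.25            # assumed cooler resistance when new, C/W
r_dusty = r_clean * 1.15  # assume dust adds ~15% to that resistance

print(die_temp_c(ambient, power, r_clean))  # 95.0 C when new
print(die_temp_c(ambient, power, r_dusty))  # roughly 105.5 C after dust buildup
```

Even a modest 15% rise in thermal resistance pushes a card designed around 95C into territory where throttling or instability becomes plausible, which is the poster's point about headroom.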
 
Haven't had issues with ATI drivers when it comes to my single-monitor, one-card setup. But Eyefinity with Crossfire is nothing but headaches.
 
I have this problem right now while watching a movie on my HDTV, and I have a post about it in another thread:

1 DVI-D (monitor) + 1 HDMI (HDTV) = In extended mode, if I turn off my monitor, my movie still plays on the HDMI output.

1 DisplayPort (monitor) + 1 HDMI (HDTV) = In extended mode, once I turn off my monitor, my movie FREEZES and CRASHES on my HDTV until I turn my monitor back on, at which point the movie starts moving again.

How's this for a bug?
 
I've used both Nvidia and ATI cards over the last 14 years and they both have approximately equal drivers. Bugs come and go on both sides, and features increase on both sides.

The OP is either misled or trolling.
 
I've used both. When Vista first came out I used to get a lot of blue screens and crashes caused by the Nvidia GPU driver, but that was about it. The past few years have been great.

ATI drivers make it easier to install just the driver and not the control panel. You can remove the Nvidia one too, though. I just prefer to leave it but set the help services to manual so they are not running when not needed.
 
From my experience, I'm a bit iffy on ATI. I was at my brother-in-law's house, and he had a 5970, but I noticed he was playing Borderlands in single-card mode. I enabled Crossfire in his settings, and after 25 minutes of gaming, it was crashing to the desktop. Finally, after lots of reading, we found that the 10.4 drivers were the best to use. We installed them and played Borderlands for 4 hours straight without a single crash. Then new drivers came out recently, and the notes said: increased performance in Borderlands! I thought, awesome, I'll let my brother-in-law know to go ahead and update the drivers. But I kept reading the release notes, and they state it can cause random crashes in Borderlands in Crossfire. I was like, WTF is the point? Honestly, I think if you're running one card, I wouldn't sweat it, but if you're going to do Crossfire/SLI, I would stick with Nvidia. That's my 2 cents.
 
Thanks for all the info. I've been out of the gaming realm for a few years, and getting back into it I just wanted to know where things stand.
 
ATI has been rockin for a while now! I'm perfectly happy with my purchase of a 5 series card.

Go with Nvidia if you're down with overheated GPUs that go into thermal meltdown and explode.
 
When I said "I wanted to know where things stand," I meant with drivers. Thank you, everyone, for sticking to that and not turning this into a crap thread.
 
Drivers are fine; I've had zero problems with the stock drivers from www.ati.com on my system. Not sure what everybody else is seeing, but my experience with my 5870 has been stellar.
 
They both suck, but ATI sucks more. I haven't been cursed with Nvidia drivers blowing up any of my hardware yet, but I did have ATI drivers blow up a very nice 20" Sony Trinitron back in the 9800 Pro days.

Just fixed the YouTube/Flash BSOD problem by downloading the latest Flash player... again... the same videos didn't BSOD my Nvidia rigs.
 
Having used both Nvidia and ATI cards for years, one brand continually ran into more problems than the other due to drivers.

It's not just Nvidia fanboys throwing it out there.

If you play very few games, or don't mind risking minor annoyances that will eventually be fixed, it's not a big problem.

edit: I have not run into any problems with ATI drivers lately. But I have also played fewer games.
 
I have never experienced any issues with my 5850 and its drivers.

Of course, it may be a different story when I get another and do Crossfire.
 
ATI 2 monitors 1 Video card (5850) = artifact garbage = BS

I needed to raise the clocks to fix it, which negates the power savings.

EDIT: The early Radeon 9500 Pro drivers were the shittiest drivers ever. I had trouble with early Nvidia Vista drivers, but they still didn't top those Radeon drivers.
 
ATI has been rockin for a while now! I'm perfectly happy with my purchase of a 5 series card.

Go with Nvidia if you're down with overheated GPUs that go into thermal meltdown and explode.

I'm not sure how much you know about Nvidia cards, or whether you have just been reading the forums and seeing people trying to be clever in every post about GPU temperatures. But judging from your post, you seem to have the wrong idea.

The fact that they run hotter than ATI's cards has very little effect on the owners. They do not actually overheat, and will not come close to overheating in a properly ventilated case.

If my cards, which run considerably cooler than a 470/480, had been designed to run 10 degrees Celsius hotter and still ran properly, I would be fine with them running 10 degrees hotter. People who bought Nvidia cards were not put off by seeing 90C instead of 80C when monitoring. That, along with the fan noise, is the extent of the temperature difference for most users. If the cards were equal dB for dB and people never checked the temperature, they would never notice the difference. So, it is hardly a deal breaker for most.
 
Another stupid "ATi drivers suck" thread, lol. I've been with ATi since the original Radeon days and have had very few issues overall. I prefer ATi over Nvidia due to cost. I am not paying out the rear for performance that is just marginally better...
 
I have problems with ATI and dual monitors. It took them forever to get the clocks correct, and even now the card underclocks to 400/900 (causing flicker) every time you watch a video. The issue where the cursor bugs out and looks like a strange glob of half-transparent pixels still happens a lot. That, and the fact that it can't monitor 2 cards in Crossfire on its own?

I really, really tried to like ATI, and generally I do, but I like a system that can perform the most basic of tasks without bugging out and causing problems. I have 2 GTX 480s on the way. I'm going to run with them for a week, as opposed to my 5870s, and see which pair I end up selling.
 
I'm a computing science student at a local university and my focus has been supercomputing on graphics chips in consumer-grade PCs. While I wouldn't say I'm fluent in either OpenCL or CUDA (who can be, with the languages being so new?), I will say that, as a mass-market developer, your only choice is OpenCL (--or DirectCompute, part of the DirectX 11 API suite, which I've heard some good things about but haven't looked into). CUDA is a very mature platform with a very good implementation top to bottom, which makes it a fantastic proposition for purpose-built machines and supercomputing applications. But if you're Valve and you want to do something insane for your next Half-Life game, you look to OpenCL, because everyone with an HD 5000 or GeForce GTX series card will support it --of course, if you're Valve and you want to take advantage of those APIs, you're looking at market readiness in 2013 at the earliest.

I'm not very familiar with GPU computing; I mainly based my reasoning on things like how Adobe CS5, for instance, I believe only supports CUDA, and how Nvidia PureVideo still seems to be more "solid" than ATI's AVIVO.

But that assumes the devs go unsupported, and Nvidia's "The Way It's Meant To Be Played" (TWIMTBP) program is much more aggressive than the similar program from ATI. Nvidia lends programming knowledge to developers to add tricky effects that make the games look good --of course, if you get one of the chip makers into the game, they're going to make sure any lent code works best on their platform, at the very least (or, infamously, not at all on the other platform, as was the case with AA in Batman).

This was more what I was referring to. Nvidia's position and history in the market seem to show that they can leverage more cooperation with developers than ATI. And from my experience, in less popular, non-mainstream titles, ATI's cards tend to have more "quirks" than Nvidia's.
 
CS5 supports ATI Radeon cards (see http://blogs.amd.com/work/2010/04/12/bringing-adobe-creative-suite-5-to-life-with-amd-technology/ ).

 
Got my 5870 in November and have used both the 9.10 and 10.4 drivers. Both were rock solid, and actually quite a relief after years of blue screens and nvlddmkm driver errors with Nvidia cards.
 
I've said it before and I'll say it again... over a 10-year span with cards from both sides, I have had maybe one or two issues with drivers...

I'm currently running a 5870 with the 10.5 drivers and they work fine... I'm not updating them because they do what I need, and I don't go looking for trouble if I don't have to ;)
 
I must say that I'm pretty pissed that you need 10.7 to play StarCraft II in Crossfire, and yet anything over 10.4 is a mess with Crossfire in Bad Company 2. I'm also disappointed that Crossfire Eyefinity is still so stuttery.

It feels like one step back for every two forward lately.
 
The issues I've had with NVIDIA drivers have been minor. No matter what I did or what drivers I used, the card always worked, though there were occasional issues with certain games. The issues I've had with ATi drivers have been severe. So, that's where I'm coming from.

I still can't believe ATi has yet to implement game profile support.
 
I thought this topic died after proof was published directly from Microsoft showing that Nvidia had worse drivers?

Apparently not. Nvidia fanboys should focus their positive energy on Nvidia (it needs it badly; check the stock) instead of trying to dredge up 8-year-old FUD.
 
I thought this topic died after proof was published directly from Microsoft showing that Nvidia had worse drivers?

Apparently not. Nvidia fanboys should focus their positive energy on Nvidia (it needs it badly; check the stock) instead of trying to dredge up 8-year-old FUD.

http://arstechnica.com/hardware/new...it-paints-picture-of-buggy-nvidia-drivers.ars

I had an 8800GT around that time too. Really turned me off from Nvidia after all the bluescreens from nvdllnmmmansd.
 
http://www.zdnet.com/blog/hardware/warning-nvidia-19675-drivers-can-kill-your-graphics-card/7551

I'm not sure how much you know about Nvidia cards, or whether you have just been reading the forums and seeing people trying to be clever in every post about GPU temperatures. But judging from your post, you seem to have the wrong idea.

The fact that they run hotter than ATI's cards has very little effect on the owners. They do not actually overheat, and will not come close to overheating in a properly ventilated case.

If my cards, which run considerably cooler than a 470/480, had been designed to run 10 degrees Celsius hotter and still ran properly, I would be fine with them running 10 degrees hotter. People who bought Nvidia cards were not put off by seeing 90C instead of 80C when monitoring. That, along with the fan noise, is the extent of the temperature difference for most users. If the cards were equal dB for dB and people never checked the temperature, they would never notice the difference. So, it is hardly a deal breaker for most.
 
The only problem I have ever had with drivers is Nvidia's giving me BSODs on a GeForce MX2 (something like that).
 
I am running MSI Afterburner; do I need to run MSI Kombustor as well with my setup? :confused:
 
I have a rig with a 4670, which isn't the latest and greatest, I know. It gets the job done though, and I've never had an issue with ATI's drivers. I've also never had an issue with Nvidia drivers, so maybe I'm just lucky?
 
I have a rig with a 4670, which isn't the latest and greatest, I know. It gets the job done though, and I've never had an issue with ATI's drivers. I've also never had an issue with Nvidia drivers, so maybe I'm just lucky?

Most people actually rarely have issues with either vendor's drivers, but then they have nothing to post about :) I've never had issues with either, except for some problems with both when AGP first came out.

I went ATI the last 2 generations, and my only complaint is that this generation I occasionally lose portrait mode with 3 monitors hooked up. I'll probably try nVidia again once they can do 3 monitors on 1 card, just 'cuz I like seeing both sides, but for now I'll probably stay on ATI for S.I. and N.I.
 
I'm using both drivers in the same install (HD 5870 + GT 240), but guess what happened when I replaced my previous GTS 250 (I wasn't able to find a quiet dual-slot cooling solution for it) with a GT 240? A nice big BSOD at boot time, so I had to reinstall my Windows 7 because of it.

On the other side, since Catalyst 10.4a the ATI drivers ignore the memory setting in my overclocking profile (the Feature name="MemoryClockTarget_0" setting), so my memory runs at max frequency all the time, because at standard speeds (157/300) I get the green-dots issue.
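For anyone wanting to check what Catalyst actually stored for that setting, the overclocking profile is plain XML and can be inspected with a few lines of Python. The `SAMPLE_PROFILE` layout and the `Want_0` property name here are assumptions sketched around the Feature name quoted above; the real file's schema may differ between Catalyst versions:

```python
# Sanity-check what the Catalyst overclocking profile contains for the
# MemoryClockTarget_0 feature mentioned above. The XML layout below is a
# hypothetical sketch; only the Feature name comes from the post.
import xml.etree.ElementTree as ET

SAMPLE_PROFILE = """
<Profile>
  <Feature name="MemoryClockTarget_0">
    <Property name="Want_0" value="120000" />
  </Feature>
</Profile>
"""

def memory_clock_target(profile_xml):
    """Return the Want_0 value of MemoryClockTarget_0, or None if absent."""
    root = ET.fromstring(profile_xml)
    feature = root.find(".//Feature[@name='MemoryClockTarget_0']")
    if feature is None:
        return None
    prop = feature.find("Property[@name='Want_0']")
    return int(prop.get("value")) if prop is not None else None

print(memory_clock_target(SAMPLE_PROFILE))  # 120000
```

If the value parses out fine but the card still runs at max clocks, that points at the driver ignoring the profile rather than the profile being malformed, which matches the poster's complaint.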

So neither is perfect, but neither is extremely bad. Overall, I would give NVIDIA drivers an 8/10 and ATI drivers a 9/10 :).
 
I thought this topic died after proof was published directly from Microsoft showing that Nvidia had worse drivers?

Apparently not. Nvidia fanboys should focus their positive energy on Nvidia (it needs it badly; check the stock) instead of trying to dredge up 8-year-old FUD.

Is 3-year-old FUD better?
 
I'm using both drivers in the same install (HD 5870 + GT 240), but guess what happened when I replaced my previous GTS 250 (I wasn't able to find a quiet dual-slot cooling solution for it) with a GT 240? A nice big BSOD at boot time, so I had to reinstall my Windows 7 because of it.

On the other side, since Catalyst 10.4a the ATI drivers ignore the memory setting in my overclocking profile (the Feature name="MemoryClockTarget_0" setting), so my memory runs at max frequency all the time, because at standard speeds (157/300) I get the green-dots issue.

So neither is perfect, but neither is extremely bad. Overall, I would give NVIDIA drivers an 8/10 and ATI drivers a 9/10 :).

Err, you don't have to reinstall Windows because of a bluescreen... you boot into safe mode and uninstall the video driver.

And clearly your card was bad... there are no cases of GPUs just BSODing at startup; those basics are all tested. It sounds like you had a bad board (if all other things were equal and it wasn't, say, your RAM).
 
1) It BSODed even in safe mode. Don't ask me why, it just did. There was no way to boot it.
2) The card is fine; Windows had a problem with the fact that I swapped the GTS 250 for a GT 240. Every piece of hardware is OK. It was a pure software issue of the Nvidia driver dying on a graphics card swap. The cards are running no problem; the GT 240 runs F@H every time the computer is turned on.
 
I have had a couple of nV driver file crashes, but none on ATi so far.

Some might experience the opposite and always have problems; maybe it just depends on user configuration and luck? LMAO

They are both bad and good, get over it... neither is better than the other. Both are shit, TBH...
 
Great, I just remembered the reason why I jumped from an HD 5770 Crossfire setup to a single HD 5870 :|

I keep hearing that Crossfire is the issue with the current drivers...
 