NV GTX 460 1GB SLI vs. ATI HD 5850 CFX Redux @ [H]

It's nice to see ATI responded and fixed a glaring driver problem with their top-end cards. However, the 5800s have been out for 10+ months, and they're just now getting around to addressing the issue? That speaks very poorly of them.

Man I wish XFX had 460's on the market.
 
First off, thank you very much for doing a second review with a more "corrected" driver released from ATI.

However, the reviews, even starting from the earlier ones where the 460 just trounced the 5850s, have felt biased towards Nvidia (I run Nvidia currently). I think it should have been made very clear in the original reviews how big a performance hit the drivers were imposing on the CFX setup, and that the cards themselves were not the limiting factor. That is VERY relevant to a review, since the same piece(s) of hardware will perform at a much higher rate than shown once the driver issues are fixed. With about 5 minutes of searching the web, I found how hard the newer drivers had hurt CFX setups.

During those original reviews, there should have been a pair of 5850s running on the older 10.5/10.6 drivers, which were much faster, to give a fairer picture of what the cards were "capable" of, and then leave it to the consumer to decide whether to wait for ATI to fix their drivers or jump on the immediate fix in the 460.

While I am glad that reviews like those help force ATI to really get on their driver team, I think a little more fairness is owed, not to ATI, but to the readers.
 
It's nice to see ATI responded and fixed a glaring driver problem with their top-end cards. However, the 5800s have been out for 10+ months, and they're just now getting around to addressing the issue? That speaks very poorly of them.

Man I wish XFX had 460's on the market.

They only fixed what they broke back in the 10.5/10.6 drivers, which is kinda stupid IMO.
 
First off, thank you very much for doing a second review with a more "corrected" driver released from ATI.

However, the reviews, even starting from the earlier ones where the 460 just trounced the 5850s, have felt biased towards Nvidia (I run Nvidia currently). I think it should have been made very clear in the original reviews how big a performance hit the drivers were imposing on the CFX setup, and that the cards themselves were not the limiting factor. That is VERY relevant to a review, since the same piece(s) of hardware will perform at a much higher rate than shown once the driver issues are fixed. With about 5 minutes of searching the web, I found how hard the newer drivers had hurt CFX setups.

During those original reviews, there should have been a pair of 5850s running on the older 10.5/10.6 drivers, which were much faster, to give a fairer picture of what the cards were "capable" of, and then leave it to the consumer to decide whether to wait for ATI to fix their drivers or jump on the immediate fix in the 460.

While I am glad that reviews like those help force ATI to really get on their driver team, I think a little more fairness is owed, not to ATI, but to the readers.

The previous article was Nvidia-favorable because the GTX 460s stomped all over the 5850s. Every reviewer should always test the latest drivers regardless of bugs or performance issues; there should be no going back in time and comparing older drivers unless it is specifically a review of drivers. I expect the latest drivers on AMD's site to work, and if they don't and that makes them look bad in reviews, well, tough. I'm not going to play musical chairs with drivers for either company. Release one driver and make it work. Not that I don't expect drivers, like any piece of software, to have bugs, but when they get worse and worse as time goes on, you're doing something seriously wrong. These Crossfire problems have been stated many times in many forums; it shouldn't be Brent and Kyle's job to make the Catalyst team get off their asses and do something.
 
The previous article was Nvidia-favorable because the GTX 460s stomped all over the 5850s. Every reviewer should always test the latest drivers regardless of bugs or performance issues; there should be no going back in time and comparing older drivers unless it is specifically a review of drivers. I expect the latest drivers on AMD's site to work, and if they don't and that makes them look bad in reviews, well, tough. I'm not going to play musical chairs with drivers for either company. Release one driver and make it work. Not that I don't expect drivers, like any piece of software, to have bugs, but when they get worse and worse as time goes on, you're doing something seriously wrong. These Crossfire problems have been stated many times in many forums; it shouldn't be Brent and Kyle's job to make the Catalyst team get off their asses and do something.

Well said!
 
it shouldn't be Brent and Kyle's job to make the Catalyst team get off their asses and do something.

It seems to me, as sad as that is, that is exactly what happened.
Well said, indeed. :eek:
 
Nice to see the rewrite of the article, but I must agree with others when it comes to "The Bottom Line". I find it terrible that the "original conclusion" still stands. Either 1) the original was way off, or 2) this one is way off.

The bottom line from the original article was that you'd almost be a fool (or have extreme faith in ATI) to buy two 5850s and get half the performance in several new cutting-edge games.

The bottom line in this article is a price and performance difference of less than 10%. You can purchase either card based on your preference for power, noise, heat, 3D, etc.

The conclusions of these two articles are VASTLY different. Now a person can purchase ATI without feeling like they've been taken to the cleaners. That's extremely significant.
 
First off, thank you very much for doing a second review with a more "corrected" driver released from ATI.

However, the reviews, even starting from the earlier ones where the 460 just trounced the 5850s, have felt biased towards Nvidia (I run Nvidia currently). I think it should have been made very clear in the original reviews how big a performance hit the drivers were imposing on the CFX setup, and that the cards themselves were not the limiting factor. That is VERY relevant to a review, since the same piece(s) of hardware will perform at a much higher rate than shown once the driver issues are fixed. With about 5 minutes of searching the web, I found how hard the newer drivers had hurt CFX setups.

During those original reviews, there should have been a pair of 5850s running on the older 10.5/10.6 drivers, which were much faster, to give a fairer picture of what the cards were "capable" of, and then leave it to the consumer to decide whether to wait for ATI to fix their drivers or jump on the immediate fix in the 460.

While I am glad that reviews like those help force ATI to really get on their driver team, I think a little more fairness is owed, not to ATI, but to the readers.

The review sounded biased, I think, because nVidia really hit a home run with 460 SLI. The 460 is a solid board and it scales really well in SLI. The 5850s, on the other hand, are better boards and yet performed worse when paired together. Furthermore, CFX worked fine before 10.5, and AMD somehow managed to break it and not fix it until 10.8.

I really don't think [H] has a green-team bias, because if you read their review of the 450, they pretty much end by saying there's no reason to buy one and that AMD totally kicked nVidia's ass this last product cycle.

nVidia SLI > AMD Crossfire
AMD Cypress/Pro > nVidia Fermi

That's more or less how it is right now and [H]'s tone in reviews reflects that.
 
Pretty sure [H] got called ATI biased when the 5870 got released too. :p It goes back and forth.
 
I would love to see a comparison of the SLI/Crossfire scaling of the GTS 450 vs. the 5750/5770. It seems like the 57xx series has had fewer Crossfire issues than the 58xx series. I never benchmarked my 5750s with any of the newer drivers and never noticed a performance problem, and when I DID bench them, with Cat 10.3/10.4 I think, I had 180+% of the single-card performance.
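As an aside, that 180% figure is just dual-card throughput expressed as a percentage of single-card throughput. A trivial sketch of the math (the FPS numbers here are made up purely for illustration):

```python
# Back-of-the-envelope CrossFire/SLI scaling math.
# FPS values are hypothetical; only the calculation matters.
def scaling_percent(single_fps: float, dual_fps: float) -> float:
    """Dual-card performance as a percentage of a single card."""
    return dual_fps / single_fps * 100.0

print(scaling_percent(single_fps=40.0, dual_fps=72.0))  # 180.0 -> ~80% gain from the second card
```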
 
I would love to see a comparison of the SLI/Crossfire scaling of the GTS 450 vs. the 5750/5770. It seems like the 57xx series has had fewer Crossfire issues than the 58xx series. I never benchmarked my 5750s with any of the newer drivers and never noticed a performance problem, and when I DID bench them, with Cat 10.3/10.4 I think, I had 180+% of the single-card performance.

10.3 and 10.4 were fine. After 10.5 something was broken in the drivers that made CF much slower. 10.8 appears to have fixed this bug/issue/whatever. CF was fine until AMD pretty much broke it.
 
Props to HardOCP for keeping the manufacturers on their toes, because without the original comparison, the driver fix may not have happened.

The conclusion MAY be too hard on the 5850, though; it is close, and he has a point. If you can't tell the difference, 40 bucks is 40 bucks wasted...
 
The previous article was Nvidia-favorable because the GTX 460s stomped all over the 5850s. Every reviewer should always test the latest drivers regardless of bugs or performance issues; there should be no going back in time and comparing older drivers unless it is specifically a review of drivers. I expect the latest drivers on AMD's site to work, and if they don't and that makes them look bad in reviews, well, tough. I'm not going to play musical chairs with drivers for either company. Release one driver and make it work. Not that I don't expect drivers, like any piece of software, to have bugs, but when they get worse and worse as time goes on, you're doing something seriously wrong. These Crossfire problems have been stated many times in many forums; it shouldn't be Brent and Kyle's job to make the Catalyst team get off their asses and do something.

The review statistics were evidence enough of the 460s trouncing the 5850s, but that doesn't mean that's the end of the story... Getting drastically different results from different drivers is extremely significant to a COMPREHENSIVE review. I read the [H] reviews because they are typically very detailed and informative. Even if they didn't test the older driver versions, it should have been stated that the CFX setup was heavily limited by driver issues and not by the cards themselves. Many people don't buy two cards at one time, instead opting to get one at a time. This is extremely relevant to those people, because between the last review and this one I'm sure a multitude of people jumped over to the 460s (they are a wonderful card, no doubt); however, a lot has changed since then, and the 5850 is now closer to where its performance should be. That alone is a huge reason to include that information in a review.

And I'm not asking Brent or Kyle to be the ones responsible for making the Catalyst team do their job; it is inexcusable how poorly the drivers are working. However, if you are doing a HARDWARE review and not a SOFTWARE review, this information is pertinent.
 
And I'm not asking Brent or Kyle to be the ones responsible for making the Catalyst team do their job; it is inexcusable how poorly the drivers are working. However, if you are doing a HARDWARE review and not a SOFTWARE review, this information is pertinent.

Hardware is kinda useless without software. While I understand your point, HardOCP has already provided single HD 5850 and single GTX 460 reviews. Admittedly, while those aren't using anywhere near the latest drivers, none of the drivers have really provided an across-the-board performance increase. Some bugfixes, some general reordering of things (ForceWare 25x / CCC CAP), but that's it. A handful of games get "up to 5-10% improvement" every release, for specific setups. Crossfire, however, seems to be a wildly swinging wildcard that just doesn't always stay tame.
 
I used to be an ATI fan up to the 9800 PRO, but the 7000 series & beyond won me over to the nVidia camp... I'm no fanboy. If you make a better product, I'll jump ship to your brand...
 
Oh yeah... and nVidia seems to run cooler and last longer too... every one of my ATI cards died from too much heat & stress...
 
Come on guys, I think calling HardOCP biased is crazy.
They put the smack down on inferior hardware and promote hardware that is superior.
I've been reading HardOCP since before I was a forum member and they are not biased.
Certain hardware shines above the rest, no matter the company behind it.
In this case, 5770 > GTS 450; accept it. That's all.
I agree with the conclusion of the article, and I feel ATI/AMD has the upper hand in GFX right now. That's hard to say for myself, coming from a strong nVidia past all the way back to the GeForce 2 MX; not one ATi card has ever been in my main rig. But that does not stop me from recommending the ATi 5850 to my friends atm.
 
I used to be an ATI fan up to the 9800 PRO, but the 7000 series & beyond won me over to the nVidia camp... I'm no fanboy. If you make a better product, I'll jump ship to your brand...

Both ATI and NVidia have had their fair share of "making a better product". I guess you weren't looking for new cards when the 4xxx series was making a killing and when Fermi was 10 months late? It's cool, I'm sure you're not a "fanboy" though. :rolleyes:

Oh yeah... and nVidia seems to run cooler and last longer too... every one of my ATI cards died from too much heat & stress...

Right, because ATI cards dying from heat is typical.

Except it's not, grats on killing your cards due to user error.

Also, I'm wondering which ATI cards you had that "died from heat and stress", since by your own admission you stopped buying ATI after the 9800 Pro.

If ATI's cards run too hot for you, you can always go with an extremely cool and efficient design such as Fermi :rolleyes:
 
The review statistics were evidence enough of the 460s trouncing the 5850s, but that doesn't mean that's the end of the story... Getting drastically different results from different drivers is extremely significant to a COMPREHENSIVE review. I read the [H] reviews because they are typically very detailed and informative. Even if they didn't test the older driver versions, it should have been stated that the CFX setup was heavily limited by driver issues and not by the cards themselves. Many people don't buy two cards at one time, instead opting to get one at a time. This is extremely relevant to those people, because between the last review and this one I'm sure a multitude of people jumped over to the 460s (they are a wonderful card, no doubt); however, a lot has changed since then, and the 5850 is now closer to where its performance should be. That alone is a huge reason to include that information in a review.

And I'm not asking Brent or Kyle to be the ones responsible for making the Catalyst team do their job; it is inexcusable how poorly the drivers are working. However, if you are doing a HARDWARE review and not a SOFTWARE review, this information is pertinent.

All they did was review both sets of cards with their latest drivers. They can't tell people, "Oh, it's just a driver problem, no big deal," when the card has been out for almost a year, the drivers have been broken for three months, and there is no sign of a fix in sight. ATI doesn't need to continue with its monthly driver releases if it can't put out working drivers.
 
I don't believe AMD has optimized Crossfire enough; if it's normal to expect AMD's $300 card to get beaten by Nvidia's $230 card, or for them to trade blows, then mediocrity is being settled for. These customers deserve the best, because they are paying a premium for their cards. I know fewer people buy the high-end cards vs. mainstream, but come on. If I had bought a pair of 5850s, I'd be pissed right now. Apparently it is now acceptable for Crossfire to trade blows with GTX 460 SLI, cards which, by the way, sit in a price bracket below the 5850. Price bracket or not, we KNOW the 5850s are more powerful, so scaling should favor them. Yet everyone is happy that they have merely matched lower-end cards in multi-card setups.

Also, am I supposed to second-guess every driver update coming out from AMD from now on? Are questions like "hey, these are the new Cats, how do they perform on xyz game?" going to be the norm? No thank you! Terry needs to clone himself 10x, and all his clones need to work day and night to fix these issues, or the AMD 68xx series will be pwned in Crossfire vs. SLI against GTX 480s?!! :eek:

Not trying to bash AMD, but somebody needs to be pissed about this; the current sentiment and fanboyism isn't going to get the job done.
 
Why not compare 470 SLI vs. 5850 Crossfire? Those two cards are in the same price range. I will agree with the rest of the people: I would be pissed if a pair of 460s still kept up with a pair of 5850s and I owned 5850s. For everyone complaining about the overclock being higher on the 460, think about it this way: that's how the card comes from the factory. If you buy a factory-overclocked 5850, the price/performance ratio gets worse. Now imagine what the price/performance ratio would have been if they had used a vanilla 460 1GB that costs $230 and overclocked it to what the Galaxy card was running. I want to see 470 SLI vs. 5850 CFX.
 
Why not compare 470 SLI vs. 5850 Crossfire? Those two cards are in the same price range. I will agree with the rest of the people: I would be pissed if a pair of 460s still kept up with a pair of 5850s and I owned 5850s. For everyone complaining about the overclock being higher on the 460, think about it this way: that's how the card comes from the factory. If you buy a factory-overclocked 5850, the price/performance ratio gets worse. Now imagine what the price/performance ratio would have been if they had used a vanilla 460 1GB that costs $230 and overclocked it to what the Galaxy card was running. I want to see 470 SLI vs. 5850 CFX.


Yup, that's my point. The 5850 should be at or just under the 470 in multi-card performance. However, that is not nearly the case. Overclocked 470s would destroy overclocked 5850 Crossfire and then riverdance on its dead body. Yet this is acceptable. The 5850s have come down in price a lot lately, but that doesn't tell the whole story: just two months ago they were much more expensive, and the majority here who bought a pair for much more are getting much less.

Now everybody is excited and speculating about the new AMD 67xx/68xx series and its new architecture, telling people to wait for it because it should perform this way and that way, when in truth the architecture sounds cool but they know nothing about how AMD will optimize the drivers. Nobody is perfect when it comes to drivers, but AMD leaves a lot to be desired in that department. I'd be excited if I were a single-card user, but going from dual 5870s to dual 6870s wouldn't be as exciting to me if they don't fix their driver and game performance issues ASAP.

BTW, to me "fixed" is when a pair of 5850s outclasses a GTX 460 SLI pair and begins to trade blows with GTX 470 SLI. It's acceptable to lose a few more benches against 470 SLI, the 470 having come out six months later, but seriously.

BTW, good point about how the 460 used in the review came factory overclocked, while HardOCP overclocked the 5850 themselves to be fair, as opposed to buying an overclocked card. Overclocking the 5850 tilted the price/performance ratio further in AMD's favor, so I'd say add an additional $10-15 to the price of the 5850 to buy an overclocked model at or around those frequencies.
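Rough math on that point, using the ballpark prices thrown around in this thread ($230 for a GTX 460 1GB, ~$300 for a 5850, plus an assumed $10-15 premium for a factory-overclocked 5850); the FPS figures are placeholders, not review numbers:

```python
# Hypothetical price/performance comparison; prices are the rough figures
# quoted in this thread, FPS values are placeholders for illustration only.
def fps_per_dollar(avg_fps: float, pair_price: float) -> float:
    """Frames per second delivered per dollar spent on the pair of cards."""
    return avg_fps / pair_price

gtx460_sli = fps_per_dollar(avg_fps=60.0, pair_price=2 * 230)         # factory-OC 460s
hd5850_cfx = fps_per_dollar(avg_fps=60.0, pair_price=2 * (300 + 15))  # OC-model premium included
print(f"460 SLI:  {gtx460_sli:.3f} fps/$")
print(f"5850 CFX: {hd5850_cfx:.3f} fps/$")
```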

 
GPU usage doesn't mean anything, it's a useless number.

You will never actually get 100% GPU usage unless the game is perfectly tuned for a specific architecture.

and I assume it's a useless number that may as well be pulled out of someone's ass.

You know what they say about people who assume, don't you?

There's a difference between memory usage, memory controller load and GPU usage. You can track each one individually. Other games such as Metro 2033 have no problem maxing out the GPU usage. GPU usage actually does mean something. It's how much of the GPU clock is being used. Even if you're shader limited, it will still be able to use more GPU cycles.
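For anyone wondering what kind of number monitoring tools actually report, here's a minimal sketch of reading per-GPU utilization counters; I'm assuming NVIDIA's NVML library via the pynvml Python bindings here, which expose the same sort of counters tools like GPU-Z display (each tool has its own access method):

```python
# Minimal sketch: reading NVIDIA's utilization counters via NVML (pynvml bindings).
# Assumes an NVIDIA GPU and `pip install nvidia-ml-py`.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

util = pynvml.nvmlDeviceGetUtilizationRates(handle)
# util.gpu    -> % of the recent sample period during which at least one kernel was executing
# util.memory -> % of the recent sample period during which device memory was being read or written
print(f"GPU busy: {util.gpu}%   Memory controller busy: {util.memory}%")

pynvml.nvmlShutdown()
```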

When Nvidia reps confirm the issue and are telling users to test out new beta builds to see if the problem has been fixed, it's pretty obvious the problem exists.


And 470s in SLI will walk all over 5850s. Max OC to max OC, 5870s @ 1GHz will lose to 470s @ 860MHz. If you're concerned about $230 460s competing with $300 AMD cards, wait for the 6xxx series.
 
There's a difference between memory usage, memory controller load and GPU usage. You can track each one individually. Other games such as Metro 2033 have no problem maxing out the GPU usage. GPU usage actually does mean something. It's how much of the GPU clock is being used. Even if you're shader limited, it will still be able to use more GPU cycles.

What? I didn't talk about memory usage or memory controller load at all. There is dedicated hardware in a GPU. What part is GPU usage measuring? For example, in DX9 and 10 games, 100% GPU load is impossible because the tessellation engines will always be idle. The simple fact is that there will be bottlenecks for some games, making GPU usage a pointless number.

"how much of the GPU clock is being used" also doesn't make any sense. How do you "use" a clock? Not to mention, WHICH clock? Core clock? Shader clock?

And no, if you are shader limited you won't be able to use more "GPU cycles". If you could, you wouldn't be shader limited, now would you?

But even if it wasn't a pointless number (which it is), who gives a shit? If the drivers can't properly utilize the card, the card might as well just be slow.
 
What? I didn't talk about memory usage or memory controller load at all. There is dedicated hardware in a GPU. What part is GPU usage measuring? For example, in DX9 and 10 games, 100% GPU load is impossible because the tessellation engines will always be idle. The simple fact is that there will be bottlenecks for some games, making GPU usage a pointless number.

"how much of the GPU clock is being used" also doesn't make any sense. How do you "use" a clock? Not to mention, WHICH clock? Core clock? Shader clock?

And no, if you are shader limited you won't be able to use more "GPU cycles". If you could, you wouldn't be shader limited, now would you?

But even if it wasn't a pointless number (which it is), who gives a shit? If the drivers can't properly utilize the card, the card might as well just be slow.

If GPU usage couldn't go up when tessellation isn't in use, then games that don't even have tessellation wouldn't be able to hit 99% usage, according to you, right? I guess that's why you can hit 99% in BC2 on an i7, or by cranking up the AA...

You really have no idea what you're talking about. Unless you think you know more than Nvidia devs do about their own product.
 
If GPU usage couldn't go up when tessellation isn't in use, then games that don't even have tessellation wouldn't be able to hit 99% usage, according to you, right? I guess that's why you can hit 99% in BC2 on an i7, or by cranking up the AA...

You really have no idea what you're talking about. Unless you think you know more than Nvidia devs do about their own product.

BC2 is DX11, for one. But that just further emphasizes how useless that number is. If it is game/context dependent, then it isn't actually measuring "GPU usage", but rather what the driver guesses the GPU is capable of for the given context.

Still didn't answer my point: why does it even matter?

Also, I can promise you this, Nvidia's engineers aren't using that number to test tweaks to the driver. That number is there for you to get all excited about, not because it actually means anything. Companies do that sort of shit all the time. Case in point: Windows Performance Index. Or a 3D Mark score. Useless numbers.
 
BC2 is DX11, for one. But that just further emphasizes how useless that number is. If it is game/context dependent, then it isn't actually measuring "GPU usage", but rather what the driver guesses the GPU is capable of for the given context.

Still didn't answer my point: why does it even matter?

Also, I can promise you this, Nvidia's engineers aren't using that number to test tweaks to the driver. That number is there for you to get all excited about, not because it actually means anything. Companies do that sort of shit all the time. Case in point: Windows Performance Index. Or a 3D Mark score. Useless numbers.

You clearly have no idea what you're talking about. GPU usage is not even remotely comparable to the Windows Performance Index. It's not put there by the marketing department for anyone to get excited about. Educate yourself: http://en.wikipedia.org/wiki/CPU_time

You can't even keep a consistent story either. If tessellation isn't being used, how would the GPU be showing 99% usage, according to one of your many made-up stories? BC2 runs in DX9, 10 and 11. The results are the same. GPU-Z doesn't measure GPU usage for a given game; it's the actual processor cycles being physically used.

Anyway, might as well just ignore the troll from here on out. Nvidia has commented on this issue on their official forums and is currently working on a fix. But so far, it seems that C2Q and Phenom II users face a huge disadvantage compared to i7 users, even when overclocked to 4GHz, while ATI doesn't seem to have this problem even with their 5970 cards on similar systems.
 
Now we'll have to see if AMD says: "OK, got some good publicity, now let's sit on our asses (again) and not give a shit (again) about the users (again) for the next year of drivers. Then we'll gather back, high five, see if any "real" users (*coughreviewsitescough*) complain and bitch and then maybe bother addressing it if it doesn't interfere with our wanking off", or if they're genuinely going to move forward with this stuff. I have a feeling it's gonna be the former as we've witnessed for too long.

AMD fixes have consistently and unacceptably popped out way too late, like when I was stuck on a BIA:HH level for a good few months to half a year because the textures were borked until many patches later.

(disclaimer: I ran/am still running 4850s, 4870, 4890, 5870, 5970)
 
You clearly have no idea what you're talking about. GPU usage is not even remotely comparable to the Windows Performance Index. It's not put there by the marketing department for anyone to get excited about. Educate yourself: http://en.wikipedia.org/wiki/CPU_time

Please, please go read that link yourself. You linked a wiki article about measuring how many clock cycles something took, which is NOT a measure of usage.

You can't even keep a consistent story either. If tessellation isn't being used, how would the GPU be showing 99% usage, according to one of your many made-up stories? BC2 runs in DX9, 10 and 11. The results are the same. GPU-Z doesn't measure GPU usage for a given game; it's the actual processor cycles being physically used.

My story is consistent, you just seem to lack the knowledge to understand it.

If BC2 is running in DX9/DX10 (or even DX11, since it doesn't use tessellation afaik) on a DX11 card, GPU usage *ISN'T* and will *NEVER* be 99%/100%. The fact that the program is displaying such usage is just proof that the number is wrong and meaningless. THAT is my point. The displayed number *doesn't mean anything*. Even if it perfectly measured every part of the GPU and its usage (which it clearly isn't doing), that STILL wouldn't mean anything. It would just tell you how the game is using the GPU, not how much untapped power there is for drivers to exploit.

In other words, the "gpu usage" number is a rough *ESTIMATE*, not a hard measurement. And you don't seem to understand the meaning of that estimate, either.

And again "actual processor cycles being physically used." doesn't make any goddamn sense. Clearly you have no idea what a clock cycle actually is much less a high level understanding of a GPU.
 
I hear a lot of people complaining about ATI driver issues and how ATI's drivers are inferior to Nvidia's. But why do people still pay top dollar for ATI? That sounds very contradictory. We all know that software is as important as hardware; Nokia admitted this today, after years of losing to the iPhone.
 
Both ATI and Nvidia have driver issues. ATI has had some trouble lately, but they have been working toward fixing it. The other thing is, like you said, hardware is just as important, and when the hardware is superior, you take that into account too. Whatever the case, right now ATI really isn't "top dollar"; most of the cards available are actually fairly competitively priced.
 
I hear a lot of people complaining about ATI driver issues and how ATI's drivers are inferior to Nvidia's. But why do people still pay top dollar for ATI? That sounds very contradictory. We all know that software is as important as hardware; Nokia admitted this today, after years of losing to the iPhone.

The answer should be obvious - ATI's driver isn't inferior to Nvidia's ;)
 
The answer should be obvious - ATI's driver isn't inferior to Nvidia's ;)

This is a bullshit comment I can't let slide. You're not serious, are you? Nvidia has always had better drivers. There were times when they had mishaps, some of them epic even, but in terms of game scaling, support, and general polish, there has ALWAYS been a consensus that Nvidia's drivers are superior. Only somebody truly blinded would state and believe what you said. So again I have to ask: are you serious?
 
This is a bullshit comment I can't let slide. You're not serious, are you? Nvidia has always had better drivers. There were times when they had mishaps, some of them epic even, but in terms of game scaling, support, and general polish, there has ALWAYS been a consensus that Nvidia's drivers are superior. Only somebody truly blinded would state and believe what you said. So again I have to ask: are you serious?

Then back to my previous question: why do people pay top dollar for ATI cards if their drivers are inferior?
 
Then back to my previous question: why do people pay top dollar for ATI cards if their drivers are inferior?

Because for single-card solutions (and the vast majority of computers still use one video card), favor has swung between the two graphics giants multiple times over the past few years. Their drivers have been solid for a long time now for that market; it's the CFX area that is getting hit hard by their lack of optimization and by bugs.

And yes I know that hardware is useless without the right software, but software can be fixed, hardware cannot. So when I am buying something that will be used for a couple years I do take that into account. If the hardware is superior, I just need to judge if it is worthwhile to wait for the drivers to catch up or not.
 
Then back to my previous question: why do people pay top dollar for ATI cards if their drivers are inferior?

Because although they are inferior, they are not utter shit, and their hardware kicks ass too. Their inferior drivers won't keep me from buying a single 6870, but they WILL keep me from buying a Crossfire pair of 6870s or a 6970. However, I would consider a single- or dual-card Fermi refresh either way. ;)
 
I hear a lot of people complaining about ATI driver issues and how ATI's drivers are inferior to Nvidia's. But why do people still pay top dollar for ATI? That sounds very contradictory. We all know that software is as important as hardware; Nokia admitted this today, after years of losing to the iPhone.

Look a bit more closely at a lot of those complaining and you'll notice that they have an agenda. Many of those parroting the "ATI drivers are inferior to Nvidia drivers" line are more interested in "social media marketing", as viral marketing is nicely called these days. Some operate on several forums under different names. For example: Prime1 on [H] is Wreakage on Anandtech, and Atech (banned from [H]) continues his Nvidia marketing as Lonbjerg on Anandtech...

This year alone, Nvidia has had 2-3 driver sets that have effectively killed GFX cards due to overheating. If you look at the post history of many of those who complain about ATI drivers, they haven't mentioned a word about this, but they have complained about minor issues with ATI drivers.

Take from this thread alone:
Also, am I supposed to second-guess every driver update coming out from AMD from now on? Are questions like "hey, these are the new Cats, how do they perform on xyz game?" going to be the norm? No thank you!

He's worried about whether or not to second-guess performance on AMD drivers (for his GTX 280?), but his post history shows no worries about second-guessing whether the next Nvidia driver is going to kill his card or not?

Nvidia issues? You can find a lot on the internet. Here's one of the issues discussed in this thread, which doesn't only affect BFBC2, but other major titles as well:
http://forums.nvidia.com/index.php?showtopic=170238
http://forums.anandtech.com/showthread.php?t=2102509

Or issues with the newest driver set (from scaling being broken again to FSX still being corrupted):
http://forums.nvidia.com/index.php?showtopic=180561
+ Nvidia's own unresolved issues in their PDF:
http://us.download.nvidia.com/Windows/260.63/260.63_Win7_WinVista_Desktop_Release_Notes.pdf

etc.

ATI and Nvidia both have their fair share of issues. Most of them are pretty minor, and I wouldn't have a problem buying cards from either company based on drivers.

"Nvidia drivers are superior to ATI drivers (or visa versa)" is FUD and you shouldn't pay attention to it. Those that come with it usually have a marketing agenda.

That said, SixtyWattMan's words still stand strong:

These Crossfire problems have been stated many times in many forums; it shouldn't be Brent and Kyle's job to make the Catalyst team get off their asses and do something.

ATI needs to get their fingers out and do something here with CF. Quadfire has been sorely lacking since release and they should be ashamed that a GTX460 beats their top cards even on a single driver release!
 
I can tell you this: I know people who still use Catalyst 10.5a because it works better. Those are drivers from May/June. I don't recall having issues using the latest stable drivers from Nvidia, or being forced to roll back to four-month-old drivers.
 
I can tell you this: I know people who still use Catalyst 10.5a because it works better. Those are drivers from May/June. I don't recall having issues using the latest stable drivers from Nvidia, or being forced to roll back to four-month-old drivers.

I'm using 10.9 and am not forced to roll back to four-month-old drivers... You don't recall having issues using the "latest stable drivers", and neither do I. But some do. 10.5a has proven very good for some users, so they prefer that one (and use the CAP from newer sets instead).

Through most of the 190 series, many people were still holding on to the 186 drivers, since those were considered the most stable by many. There are also always some users with certain configurations or requirements who prefer one driver over another. Here's an example from the link above:

My GTX 470 runs at 405/837/810MHz clocks if I use 120Hz mode on desktop with my Samsung 2233RZ. Setting any other refreshrate, 110Hz / 100Hz / 60Hz immediately makes the card idle at 50/67/101MHz.

Previous drivers did not do this, but I had random black screen bug on desktop while idling with 257.21 / 258.96 while 197.75 worked ok (never tried 197.45). Hopefully this beta fixes at least this much, but I'd love to have 120Hz desktop using idle clocks and not those low power 3d clocks.

Do you have any info about this ManuelG
http://forums.nvidia.com/index.php?s=&showtopic=180561&view=findpost&p=1117006

Another guy a few posts down in that link sticks to Quadro drivers instead of the newest ones.
Both sides have these users and issues.
 