GeForce GTX 580 vs. Radeon HD 5970 2GB Performance @ [H]

FrgMstr
GeForce GTX 580 vs. Radeon HD 5970 2GB Performance - We will see if NVIDIA’s new GeForce GTX 580 has what it takes to compare to AMD’s Radeon HD 5970 2GB dual-GPU video card. These five apples-to-apples tests will reveal performance differences and memory limitations on the Radeon HD 5970 2GB. The Radeon HD 5970 is less expensive, so let's see where the chips fall.
 
Good idea for a test. I didn't realize the 5970 had fallen in price that far. The effectively 1GB of VRAM on the 5970 really hurts it, but putting 4GB on a card like this just to compete would price it out of range. It bears mentioning, however, that you also get surround gaming on the 5970, which you don't with a single GTX 580, but everybody knows that already :)
 
I realize you guys game at the 30-inch monitor resolution, but I would like to see 1920x1200-type resolutions for max gameplay. Does the 5970 play okay with 8x AA in-game at that res, or is the memory a factor even at that resolution?
 
Thanks again for a great review.

Any Surround Vs. Eyefinity reviews in the plans?

Well, I'm sure the 580 would be the decisive winner in Surround vs. Eyefinity, because you would have to SLI two 580s, while CrossFiring two 5970s (which is basically quad-fire 5850s) doesn't bring anywhere near the gains of SLI. It just doesn't scale well enough, and scaling what amounts to four GPUs introduces even more overhead to the Eyefinity setup.

In GTX 580 vs. 5970 as single cards, of course the 5970 wins, as the GTX 580 produces zero frames in surround ;)
 
If AMD could somehow equip its GPUs with intermediate amounts of VRAM, instead of large increments like 2GB and 4GB, 1.5GB per GPU, like the GTX 580 has, would be enough to spank lots of newer GPUs.
 
That was a good read. I'll be disappointed if the 6900 series versions with more than 1GB are marked up by $100. The 2GB 5870 simply cost too much.
 
Kyle or Brent, correct me if I am wrong, but is this not the first time you have given a card the "best of" based on the amount of RAM? I know it has been mentioned, but I don't recall it ever being the deciding factor in a review before. This is something of a small milestone.
 
Interesting comparisons here. I'll have to go back and look at some of the old reviews to get some Power/Heat/Noise #'s to see how things stack up there as well.

I can't wait to see AMD's next round in a month.
 
Great comparison. The only thing I would suggest is more proofreading before the final submit. The information was great, but "more smoother" and the grammar in that last paragraph almost made me cringe.

I would definitely enjoy a 580, but still don't have a reason to upgrade my 285. Guess it pays off to stick with "low" resolutions like 1920x1080!
 
What was the difference in power consumption between the two?

It was a good, timely review... but I would have liked to see that also, along with some heat and noise comparisons. The initial 580 review here gave a nice "perspective" comparison... but no actual numbers on heat and noise.
 
If you somehow managed to factor in the stutter in a quantifiable way, I would easily rank [H] reviews far above all others. IMO this review is the first to approach this, although it was in a very informal way. "It stuttered, so the one with less avg frames is better" is a good-enough statement, but if I had some "Experience index" number in addition to the FPS numbers, I would probably ignore all other reviews of any video card from other sites. [H] rocks, but could rock even more with this.

Here's my idea:

Take the average deviation in framerates in 1/4 second intervals (or some other appropriately small period of time to allow for the numeric detection of "stutter") and subtract it from the average FPS.

Or I suppose you could just post the numbers for the average deviation, but the less work for the readers the better.
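
Just to make the idea concrete, here's a rough sketch in Python of what that calculation could look like; the function name, the 1/4-second bucketing, and the sample data are all just my own illustration, not anything [H] actually uses:

```python
# Rough sketch of the proposed "experience index": average FPS minus the
# average deviation of FPS measured over small fixed intervals, so a card
# that stutters scores lower than one with the same average but steady pacing.
# frame_times: per-frame render times in seconds (e.g. from a FRAPS-style log).

def experience_index(frame_times, interval=0.25):
    # Bucket frames into ~1/4-second windows and compute FPS for each window.
    fps_per_interval = []
    elapsed, frames = 0.0, 0
    for ft in frame_times:
        elapsed += ft
        frames += 1
        if elapsed >= interval:
            fps_per_interval.append(frames / elapsed)
            elapsed, frames = 0.0, 0

    avg_fps = sum(fps_per_interval) / len(fps_per_interval)
    # Mean absolute deviation as a crude, readable "stutter" number.
    avg_dev = sum(abs(f - avg_fps) for f in fps_per_interval) / len(fps_per_interval)
    return avg_fps - avg_dev

# A perfectly steady 40 FPS run keeps its full score...
steady = [1 / 40.0] * 400
# ...while a run that swings between ~70 FPS and ~10 FPS gets penalized.
spiky = ([1 / 70.0] * 35 + [1 / 10.0] * 5) * 10
print(experience_index(steady))  # ~40
print(experience_index(spiky))   # well below its per-interval average
```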

Good review by the way. Thanks!

P.S. sorry if this has been proposed before or if it can't be done. Just thinking off the top of my head.
 
Kyle or Brent, correct me if I am wrong, but is this not the first time you have given a card the "best of" based on the amount of RAM? I know it has been mentioned, but I don't recall it ever being the deciding factor in a review before. This is something of a small milestone.
That was interesting. While in the past I have felt dumb for jumping on the highest available vram bandwagon, that may end up being my default setting.
 
Good comparison. It's nice to see that "overall framerate" isn't the deciding factor and that the perception of the gameplay is what matters overall. So many say "the 5970 whips the GTX 580", especially in HardwareZone's review thread... Glad to see the real perspective.
 
I was dreading this review for several reasons. It's my opinion that we shouldn't compare single-GPU graphics cards to dual-GPU cards, except to put the information out there for people to make buying decisions. I doubt NVIDIA planned to dethrone Hemlock, nor do I believe AMD intends to with its single-GPU 6970. As noted in this article, there are sooooo many caveats to dual-GPU solutions; however, to my and most people's surprise, it was the VRAM that did the 5970 in. I'd be lying if I said I expected that.

I'd imagine at 19x12 it would look better for Hemlock, but then much of the GTX 580's RAM would go unused and it would be put at a disadvantage. I agree that the 5970 has Eyefinity on a single card, but the 1GB would be a limitation if you wanted to add AA, correct? Although it's possible to play in surround, I'm not sure I would want it at the cost of anti-aliasing. However, what about the new 10.10e hotfix that adds morphological AA!! Could that be a savior for Hemlock at this time?

Antilles will more than likely be a dual 2GB card and, if this review is any indication, perhaps a force to be reckoned with. IF NVIDIA does have a dual-card solution coming, it'll be an interesting battle between the two. Oh, and what happened to overclocking, apples-to-apples? Heat/power/noise? We may have to change the website's name to HardP if this lack-of-overclocking trend in reviews continues :)

Thanks for an overall informative review guys.
 
This review was a pretty interesting read. If NVIDIA had come out with the card a year ago, then maybe the numbers would be a little more noteworthy, but it didn't. If I were looking for a card that's around $500-600, I would wait the three weeks or so for the release of the 6900 series, because it could just blow the 580 out of the water, assuming they are priced in that range. It's nice to know that there is a card that beats the current top dog, but it's a shame that it's this late in the game. I really would like to see NVIDIA come up with more competition to bring prices down. As long as there are roughly equal-performance cards, consumers win when it comes to price wars.

Six months or so from now, when it comes time to build a new system, I hope I have more than one high-end option to choose from.
 
I don't quite follow:

From page 2:
Similar to F1 2010, we are experiencing erratic and inconsistent behavior with the Radeon HD 5970 running at 8X MSAA at 2560x1600 in this game. The average framerate is just slightly higher than the GeForce GTX 580, but the variance in performance is much greater. This can be felt in game. As you zoom around the game, pan left and right and up and down, there are delays and pauses that occur. Radeon HD 5970 is plenty fast enough, but the VRAM isn’t there to support the performance that is capable. This means the GeForce GTX 580 and the ATI Radeon HD 5870 2GB deliver a smoother experience. I would not play Civ 5 at 8X MSAA on the Radeon HD 5970. I would drop down to 4X AA so that the gameplay is smoothed out; whereas, the GTX 580 is more than playable at 8X MSAA in this game.

I clearly see spikes from all three tested cards going to ~0 FPS constantly throughout the test. Why then would one be so much worse than the others? I'd say they all suck in this test... But that's just my opinion.

Also, as a side note, what about other resolutions? Why is it only tested at the unaffordable 2560x1600 resolution? Personally, as an example, I hate the 16:10 format. I'd like to also see 2560x1440 or 1920x1080, or at LEAST the inclusion of 1920x1200 to get it in the ballpark. Many cards act differently at different resolutions, and some of us play on the lower-lag versions of our larger HDTVs. At these resolutions, the slower card at the extreme 2560 may be the easy king with all the toys on at 1920...

I completely understand not going lower, but I think both 2560 and 1920 should be done in all graphics card reviews to accommodate more people, not just show the max the card itself might be capable of.

Another interesting test might be the >60FPS test... Change settings for each card (lower AA/AF/etc.) to find what it takes to play game Z at resolution Y with a minimum FPS above 60 (the common refresh rate of most LCDs; once 120Hz becomes more popular/available, this would be less important). I think that could be very useful information as well, but it would obviously be considerably more time consuming.
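
Since this is basically a search over settings, here's a sketch of how it could be scripted. Everything in it is hypothetical: the preset ladder and the run_benchmark hook are stand-ins for whatever a site's actual benchmark harness exposes.

```python
# Hypothetical sketch: walk down a ladder of quality presets, highest first,
# and return the first one whose benchmark run never dips below the target
# (60 FPS for a typical 60Hz LCD). run_benchmark is an assumed hook that
# plays a timed demo at the given settings and returns per-interval FPS.

SETTINGS_LADDER = [
    {"aa": 8, "af": 16, "detail": "ultra"},
    {"aa": 4, "af": 16, "detail": "ultra"},
    {"aa": 2, "af": 8,  "detail": "high"},
    {"aa": 0, "af": 4,  "detail": "medium"},
]

def max_playable_settings(run_benchmark, target_min_fps=60):
    for settings in SETTINGS_LADDER:
        fps_samples = run_benchmark(settings)
        if min(fps_samples) >= target_min_fps:
            return settings  # highest-quality preset that never dips
    return None  # even the lowest preset dips below the target

# Usage would look something like:
#   best = max_playable_settings(lambda s: bench("game_z", "1920x1080", s))
```

Drop target_min_fps to 30 and the same script answers the "playable at 30FPS" version of the question.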
 
That was interesting. While in the past I have felt dumb for jumping on the highest available vram bandwagon, that may end up being my default setting.

It is. I am hoping Kyle or Brent confirms that, as I can't think of anything else except the 512MB vs. 1GB 4870. Granted, it's kind of a moot point for most people and not very relevant (4x AA vs. 8x AA at 4MP), but I am wondering if it's a sign of things to come now.
 
I clearly see spikes from all three tested cards going to ~0 FPS constantly throughout the test. Why then would one be so much worse than the others?

Another interesting test might be the >60FPS test...


I believe that they are stating that it feels worse since the performance is as high as 70+ FPS and dips to 0. That is something you'd feel far more than something at 38 FPS that dips to 0. The erratic performance is due to memory limitations on the 5970; it's obvious the GPUs have the horsepower to do it, but just not enough track to gallop.

As far as the over-60 FPS test you suggested, 60 FPS is not the end-all of FPS in games; some games feel smooth at 30 FPS, some at 25, and others don't feel smooth until over 70. Where they draw the median line on their graphs is where they define that sweet spot (in this case 30), so you're asking for more-than-playable settings, which is something they are not going to focus on.
 
Great review. I love how you not only tell us what the raw frame rates are but, more importantly, go beyond that to tell us what our expected experience would be (qualitative over quantitative).
 
It seems like it really depends on whether the game itself is better optimized for dual cards or just wants to eat more memory. I know GTA4 is a VRAM hog! Anyway, because of delays and the lack of good systems on Newegg, I guess I will have to wait until January, after CES, before I buy a new gaming PC. I'm not big on the holidays anyway.
 
They use the max res of 2560x1600 because that is what cards like these are for. Anything lower does not stress the cards like this resolution and high AA do. That is when large amounts of RAM show their worth, and less RAM starts to hurt a card's performance.
 
I was dreading this review for several reasons. It's my opinion that we shouldn't compare single-GPU graphics cards to dual-GPU cards, except to put the information out there for people to make buying decisions. I doubt NVIDIA planned to dethrone Hemlock, nor do I believe AMD intends to with its single-GPU 6970.
You keep mentioning this, but that's not the purpose of the comparison. Nobody is expecting a refresh to dethrone dual GPUs from the previous gen. The comparison is based on bang-for-buck value. Yes, there are caveats to dual GPUs, and I'm sure [H]ard readers understand that well enough to take it under consideration. But nobody is going to give NVIDIA a free pass just 'cause they finally got their act together. Mind you, my current setup is NVIDIA 460s in SLI, and even I'd like to see a [H]ard comparison against the 580. Even so, I don't see myself switching to a single 580 whatever the outcome, due to the lack of Surround support on a single card (which should honestly knock the 580 down a few pegs compared to AMD cards).
 
You keep mentioning this, but that's not the purpose of the comparison. Nobody is expecting a refresh to dethrone dual GPUs from the previous gen.

This is the part I don't agree with; there ARE people who are saying that the GTX 580 failed because it can't beat a 5970. AMD itself excuses the delay of Cayman by pointing out that its 5970 is still the world's fastest single graphics card.

Obviously we now know that, depending on your resolution and game choice, that is not the case, and the GTX 580 is an overall safer bet at 2560x1600; but during its launch there were quite a few people discrediting it because of that comparison. I defended it and the upcoming 6970, saying that just because they may lose to Hemlock a bit doesn't mean they are bad cards. Being refreshes, they shouldn't be expected to outperform the uber-high-end, recently $600+, graphics card from just last gen. Many people disagreed. That's why I keep saying it. If no one expected it, I'd omit it and spare myself the wasted time and energy of explaining it.
 
Minus the grammar, quite a nice review. As many others before me have already mentioned, +1000 for going beyond the framerates (and +1 more for mentioning that the GTX 580 can be SLIed xD). Reviews that pit dual-GPU cards against single-GPU cards usually annoy the hell out of me, as they only look at performance and not user experience (and obviously, the latter is far more important). Performance is a component of user experience, not the other way around, and kudos to [H] for getting that right today.
 
I believe that they are stating that it feels worse since the performance is as high as 70+ FPS and dips to 0. That is something you'd feel far more than something at 38 FPS that dips to 0. The erratic performance is due to memory limitations on the 5970; it's obvious the GPUs have the horsepower to do it, but just not enough track to gallop.

I guess perhaps I see what they may be getting at. I do understand the 'feel' but wasn't understanding how they could blame the 5970 for dipping when they all did...

As far as the over-60 FPS test you suggested, 60 FPS is not the end-all of FPS in games; some games feel smooth at 30 FPS, some at 25, and others don't feel smooth until over 70. Where they draw the median line on their graphs is where they define that sweet spot (in this case 30), so you're asking for more-than-playable settings, which is something they are not going to focus on.

On this I do not agree. On any 60Hz monitor, anything under 60 FPS almost always involves stuttering and/or tearing, which to me completely ruins the gaming experience. I don't know if I just have better eyes than everyone else, but a friend of mine has a 2560 setup and a 5970 to boot. He refuses to run it fast and insists everything be maxed out. Some games look OK, but while watching others I just get the biggest headaches from the slide shows or flashes of slow screens... I don't see how anyone can stand < 60 FPS, or < 40 if they had a higher-refresh monitor.

EDIT: Now don't get me wrong; I do understand people see things differently... But dropping to ~0FPS would never be 'acceptable' in any situation during gameplay, in my book... They would all fail that test. Plus, simply adding other resolutions to the test, like the 1920s, gives the more regular, non-rich users a 'what I might expect with everything turned on'... The cards still can't even do that reliably all the time. I see it more as real-world testing vs. the stress-to-the-limit test. So change it to a 30FPS limit then, if said game is so difficult to run. Still, it's a valid question: what can they do to get the res you're looking for with a 30FPS minimum... Or what settings have to be nerfed to achieve said 30FPS minimum...
 
They use the max res of 2560x1600 because that is what cards like these are for. Anything lower does not stress the cards like this resolution and high AA do. That is when large amounts of RAM show their worth, and less RAM starts to hurt a card's performance.

Who says? I want it for playing at 1920x1080 with everything turned on without dipping below 60 FPS... They still can't do that 100%. Not even close. Heck, in some games even the best can't do that at 1680x1050. These lower-res tests are still completely valid evaluations.
 
I guess perhaps I see what they may be getting at.



On this I do not agree. On any 60Hz monitor, anything under 60 FPS almost always involves stuttering and/or tearing, which to me completely ruins the gaming experience. I don't know if I just have better eyes than everyone else, but a friend of mine has a 2560 setup and a 5970 to boot. He refuses to run it fast and insists everything be maxed out. Some games look OK, but while watching others I just get the biggest headaches from the slide shows or flashes of slow screens... I don't see how anyone can stand < 60 FPS, or < 40 if they had a higher-refresh monitor.

I underlined the words in your quote that made me disagree. Having slower FPS than your monitor can cause tearing and stuttering, but that is not always the case, and it's actually rare these days. If there were stuttering and tearing, Brent or Kyle would mention it. I guess it depends on what games you play. Granted, I want higher FPS and don't want to settle for less, but I'm just playing devil's advocate here to explain the [H] methodology.
 
At 1920 these cards probably get 60+; look at the normal reviews for that. People paying $500 for a card probably have something a little bigger than a 24-27 inch monitor ($150-300 for monitors of those sizes). I've seen my GTX 580 blast any game maxed out at 1920. Even at 2560x1600 the frames are acceptable with maxed settings. Go look at the normal review for these cards and you can see your 1920 res. The point of this article was to show where the ATI card chokes with less RAM, even though it sometimes gets higher FPS.
 
I underlined the words in your quote that made me disagree. Having slower FPS than your monitor can cause tearing and stuttering, but that is not always the case, and it's actually rare these days. If there were stuttering and tearing, Brent or Kyle would mention it. I guess it depends on what games you play. Granted, I want higher FPS and don't want to settle for less, but I'm just playing devil's advocate here to explain the [H] methodology.

Well, they might, might not. Depends what it 'feels' like to them, how picky their eyes are... (I added edits to the above, by the way; you might have missed them. Seems we did it at the same time.) Anyhow, simply put, my friend drools over his 2560 pixels in these games, asking me "See how beautiful it is?", all the while it's tearing like crap. So obviously, tearing doesn't bother him. He does tend to notice the stuttering a bit more, but... again, it just goes to show everyone is unique to some extent (and if you think it's bad with eyes, try audio... I also design custom competition audio systems and OMG, talk about extreme differences in the ear... even between two judges sometimes... Whew... Anyhow, off-topic... back to topic... sorry about that...)

So while I do very much see your point and what you're trying to say, I still see a very urgent need for the rest of the information (other resolutions and possibly AA/AF settings, etc.). Just relying on their 'opinions' is not enough, to be honest; it's just too personal. Obviously I want the most eye candy possible, but at the same time I will NOT sacrifice my 60FPS minimum personal limit to get it. I will turn stuff off and even lower the res if I have to, to get it smooth as glass, with no exceptions. My competitive FPS history might be part of the reason for my need for smooth over pretty, along with my sensitive eyes and tendency toward headaches with non-smooth images. Still... this data helps those like myself tremendously. At least just lower the res to any 1920 res and retest... Is that too much to ask for a minimum? Yes, it extends testing time, but we're talking about being thorough here, not just another 'what we can make it do' test. While I still trust [H] over many others... they could be a touch more thorough, because I'd love to see the min/max/avg numbers in those other situations.

Personally, I'm waiting for the 6970 before my next purchase to see what its game is, but... until then I'll suffer with the rig I've got now, even though my video cards now lag my system by over two generations (4890s, heavily OCed)...

And maybe I'm just completely out in left field, but that doesn't mean I'd like to see any less. Being a consumer myself, I'd think [H] would always be up for a simple suggestion now and then. I'm not saying go crazy and do all resolutions, but the big one and the more conventional one would at least be nice. Personally, I'd love to see 16:9 resolutions only, as I'll likely never own a 16:10 format (can't stand it); movies are the 'other' thing I do, and since nowadays they conform HD to the 16:9 standard, I don't understand why computers stick to being so stubborn and off. But that, obviously, is very much just my opinion... 2560x1440... I mean really? Again, obviously this particular argument is just me... I know many do not see it as an issue, but still, most people don't have 2560 monitors, so a 1920x1200 would be a valid 'other' to show...

I mean, if we REALLY want to stress them out, we should be doing something like 5760x1080 triple-screen or something... But then how many people have those either?!? (Though it would not be a fair test, obviously, since NVIDIA can't do it single-card to my knowledge.) Naw, they try to test what most people have, and unless there's been some stupid-sick sale on them lately, I doubt most people have much over 1920 res... I mean, you can get three large (24") good 1920s cheaper than a single 2560, so... if I wanted extreme... Anyhow, just thinking out loud. I'd like to see the 1920 res included in the testing. That's my point. But I do see your point too... I'm not trying to argue (I hope, obviously)... just explaining my point of view.

/thumbsup
 
I've seen overclocked GTX 580 vs. stock 5970 benches and it's pretty much a tie, so that goes to show you how much overclocking matters in this comparison.

As a 5970 owner, I can say the problem is that it is nearly impossible to overclock it and maintain a cool and quiet card, whereas the GTX 580 will be able to do so with the stock cooler. Once you add a custom cooler or water cooling, the 5970 has the capability to easily beat everything else, but that's a pricey thing to do.
 
Would it have been possible to see video of the stuttering you guys describe? Something side by side... F1 on a 580 and a 5970, with one system spectating the other. Yes, I believe your words and your charts are enough, but let every other review site settle for just "good enough".
 
On this I do not agree. On any 60Hz monitor, anything under 60 FPS almost always involves stuttering and/or tearing, which to me completely ruins the gaming experience...

Triple buffering... all the graphical goodness, none of the tearing.

...dropping to ~0FPS would never be 'acceptable' in any situation...

Some games have hitching caused by saves. Due to the length of the tests run by the [H] guys, these saves often show up in the graphs.
 