AMD Radeon HD 6970 and 6950 Video Card Review @ [H]

I'm curious to see your Eyefinity and SLI/CFX reviews.

I already jumped on the SLI GTX 580 setup, and I'm glad to see it's still the fastest around. But I am glad to see AMD offering a very competitive card at a great price, hopefully that helps drive down prices for everyone. I'm wondering if the NVIDIA offerings are (ironically) noticeably faster in the multi-GPU/multi-monitor gaming arena still, though, so I hope that article comes soon :)

Side note, but I really like the connections AMD offers on their boards; I wish DP were offered on NVIDIA's cards as well, since all three of my monitors support DP.
 
Call me silly, but I game @ 1080p and I'm one of the 60fps freaks whose standards are nothing less than 4x AA and 16x AF. I'm currently thinking of upgrading to 2560x1600 in due time. However, I want my titles to shine with good min. fps, so I'm thinking of getting two 6970s in CrossFire or just waiting for the 6990. My graphics setup is long in the tooth and needs some replenishing.

Xfire does almost nothing for min FPS in my experience. A 6970 is overkill at 1080p. Crossfired 6970s or a 6990 is throwing your money out the window.

The 6970 is a very, very fast card. Unless you're running Eyefinity or 2560x1600, there's almost nothing it can't do well.
 
What I meant to say was that it seemed to me the card should perform BETTER at lower resolutions, since it would be working just as hard as it would at higher resolutions. But the comment implied the 6950 and 6970 had poorer performance at lower resolutions.

Seems backwards to me.

I can't help but think that we are somehow getting absolute performance and relative performance confused here..

There appears to be a larger gap between the GTX580 and HD6970 at lower resolutions. This gap seems to shrink as the resolution goes up.

I actually find this behavior a little odd, as that's how you usually judge CPU performance, not video card performance. Maybe we are starting to reach the limit of bus speeds? I didn't think we had, but you never know...
 
Do you guys think when the drivers mature it will gain some performance?
I'm just at 1920x1200 but plan on upgrading very soon.
 
Would it be possible to add a chart/graph like you do for CPU coolers, like a cost per FPS or something of that ilk? Great review.
 
What I gather is that the drivers are currently still a bit of a mess, performance-wise. They've only had about 6 months to tweak drivers, compared to how much longer other cards have had since tape-out.

http://forum.beyond3d.com/showpost.php?p=1503864&postcount=6644

Not to mention, this is the first major redesign of the ATI chips since the HD 2900 era, so performance should get better as things go along.

The GTX 580/570 had the advantage of being just a bug-fixed, process-refined chip of an already known design, the GTX 480. The same goes for the 5870: it has been out for a year, so there is less that can be squeezed out of the drivers.

A driver comparison article should be interesting in about ~6 months though.
 
Would it be possible to add a chart/graph like you do for CPU coolers, like a cost per FPS or something of that ilk? Great review.

We will not be reducing video cards' values to frame rates.
 
Do you guys think when the drivers mature it will gain some performance?
I'm just at 1920x1200 but plan on upgrading very soon.

As always, drivers improve the performance of cards. Cayman is a new architecture, so I'd feel pretty confident in saying that we'll see a nice boost in performance after the drivers mature and their engineers have a chance to optimize for said architecture.

Some part of me feels that the 6970 is especially gimped. Its performance is very sporadic and inconsistent, at least so it seems. We shall see.
 
Anandtech makes a very good point about 6950/6970 performance. "The major architectural overhaul in Cayman quite likely means that AMD driver developers are not even close to maximizing the performance of these products. Both nVidia and AMD/ATi made significant performance improvements over time via driver improvements following major architectural changes. It's not unreasonable to expect the same from AMD this time around."
 
Zarathustra[H];1036569512 said:
I can't help but think that we are somehow getting absolute performance and relative performance confused here..

There appears to be a larger gap between the GTX580 and HD6970 at lower resolutions. This gap seems to shrink as the resolution goes up.

I actually find this behavior a little odd, as that's how you usually judge CPU performance, not video card performance. Maybe we are starting to reach the limit of bus speeds? I didn't think we had, but you never know...

Several more questions occur to me.

Am I gonna be bummed out I have only this one display if I get into gaming? Using this single 52" HDTV @ 1920X1080P might not be as cool as using several smaller higher resolution displays. (Extra $2000 at least-OH BOY!)

Or if I plan on using the 52" would a different card give me better gaming performance at the lower resolutions?

Why do they call it "High Definition" @ 1920X1080P if what I REALLY want is 2560X1600?
 
Do we have to rehash this same bullshit with EVERY goddamn video card review?! Can someone just pull all the usernames that post these arguments and forward them to their own forum? Or 4chan?

Great review guys. As a proud owner of a 30" Dell, I still value my money. I used a large severance package (thanks, financial crisis!) to purchase it; I don't usually have $1,200 lying around to just buy whatever I would like.

That being said, if you have two cards that perform at nearly the same level, even if money is not the issue, why pay 40% more? That's valuable information. What's more, given the sporadic performance of the 6970, I think it is safe to say that it is driver-related; once that gets worked out we should see better performance. And talk about scalability: for 150% of the price of a GTX 580, you can have CFX 6970s, with a full 1 GB more memory.
 
Alaska Wolf -

As Heebs said, TVs suck as monitors. You'd have a much better gaming experience spending a few hundred bucks on any mid-range monitor than using your TV.
 
Thanks for the great review guys. The HD 6950 is a real surprise at $300 - it delivers GTX 570-level performance for a great price and a low power envelope! The HD 6970, while disappointing compared to the 6950 (10% faster for $70 more, and it gobbles power), is very competitively priced versus the GTX 580.

I did make the call on AMD using the new line of Hynix 2Gbit modules, and I'm happy to see them pushing the bandwidth limits with the new 6Gbps speed grade chip. But I feel this is not enough for the 6970, and it shows in reviews:

6950 to 6970 - 20% more raw compute power, but only 10% more memory bandwidth.

The scaling between these two in the [H] review is only about 11.5%, which says to me that the 6970 is heavily bandwidth-limited. This may explain the disappointing performance of the 6970, and means AMD will either have to increase to a 384-bit bus for the next generation, or concoct some entirely new memory tech. GDDR5 has hit the performance wall.
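
Quick back-of-the-envelope check on where those 20%/10% figures come from, using the launch specs as I recall them (1408 SPs at 800 MHz and 5.0 Gbps GDDR5 on the 6950 vs. 1536 SPs at 880 MHz and 5.5 Gbps on the 6970, both on a 256-bit bus), so treat these numbers as my assumption rather than anything from the review:

# Rough scaling check; specs are my assumption, not from the review
sp_6950, clk_6950, gbps_6950 = 1408, 800, 5.0
sp_6970, clk_6970, gbps_6970 = 1536, 880, 5.5

compute_gain = (sp_6970 * clk_6970) / (sp_6950 * clk_6950) - 1   # ~0.20
bandwidth_gain = gbps_6970 / gbps_6950 - 1                        # 0.10 (the 256-bit bus cancels out)

print(f"compute: +{compute_gain:.0%}, bandwidth: +{bandwidth_gain:.0%}")

Compute grows twice as fast as bandwidth between the two cards, which is why the ~11.5% scaling smells like a bandwidth ceiling to me.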
 
It seems like AMD is the winner of the "performance without gouging" award. I'd hoped for NVIDIA to learn this sooner...but it seems like they're just not going to see the light of "charge less and make it up in volume" anytime soon.

Of course, one has to consider the problematic driver situation that AMD seems to have chronically suffered...but it seems like they're learning in that arena as well. Still, for a variety of reasons, NVIDIA's attractions of driver quality and ease of installation could still make their overpriced products more attractive. Undoubtedly, much of their hardware "profits" must be reinvested in driver support...but it shouldn't cost an extra 50% to get it.

NVIDIA seems to be losing its grip on everything that made them great, with the exception of technical innovation and driver excellence...but at too great a cost to the consumer, judging by AMD's ability to offer considerable "bang for buck" value.
 
God damnit; that $369 price is just too reasonable. All I wanted to do with my year end bonus was sock it away...but my Q6600 / GTX280 is getting just a bit long in the tooth, although really it's been my best build so far in terms of longevity.

But a Sandy Bridge / CFX 6970...damnit, begone foul temptress!
 
God damnit; that $369 price is just too reasonable. All I wanted to do with my year end bonus was sock it away...but my Q6600 / GTX280 is getting just a bit long in the tooth, although really it's been my best build so far in terms of longevity.

But a Sandy Bridge / CFX 6970...damnit, begone foul temptress!

Even after being so sure of myself that I would be picking up a green card, looking at the high res performance comparisons over and over... I might have to go with red... The price is right if those charts are right!
 
Very exciting, but would I be able to realize the full potential of either of these on an E8400 Core 2 Duo machine?


And just a small typo: all the A2A comparisons are labeled 6970.
 
Several more questions occur to me.

Am I gonna be bummed out I have only this one display if I get into gaming? Using this single 52" HDTV @ 1920X1080P might not be as cool as using several smaller higher resolution displays. (Extra $2000 at least-OH BOY!)

Or if I plan on using the 52" would a different card give me better gaming performance at the lower resolutions?

Why do they call it "High Definition" @ 1920X1080P if what I REALLY want is 2560X1600?

Depends on the TV and personal preference. I game on my 55" plasma and love it, but I also have a secondary computer hooked up to a 27" monitor that I enjoy. With that said, I prefer gaming on the 55" any day of the week over the 27".

I've gamed on an Eyefinity setup at a buddy's house and the experience was pretty damn cool. The bezels were annoying at first, but after a few minutes they blended in and weren't super noticeable.

As for what I would choose? Well, I'm still using my 55" and debating the switch to Eyefinity, but in all honesty I don't game anywhere near as much as I did in my younger days, so Eyefinity may just end up being more or less a "whoa, that is cool" setup when friends come over.
 
It is. The card performs better in an absolute sense at lower resolutions. But in a relative sense, it trails behind the nVidia offerings by more (or beats the nVidia offerings by less) at lower resolutions.
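
Here's a made-up numeric example (these fps figures are purely hypothetical, not from the review) of how both things can be true at once:

# Hypothetical fps figures, purely for illustration
fps = {"1920x1080": {"GTX 580": 120, "HD 6970": 100},
       "2560x1600": {"GTX 580": 60, "HD 6970": 55}}

for res, r in fps.items():
    gap = r["GTX 580"] / r["HD 6970"] - 1
    print(f"{res}: 6970 absolute fps = {r['HD 6970']}, relative gap = {gap:.0%}")

The 6970's absolute numbers are higher at the lower resolution, but so is the relative gap to the 580.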

So if I'm stuck with a 52" monitor chugging along at 1920X1080 the nVidia GTX 570 makes more sense than a 6970?

My GF just asked what games I'm interested in anyway? I told her the new 3D P0rn ones. It resulted in stomping upstairs and doors slamming.

ARE there any cool games where you fly F-22s? Is Call of Duty the best? Or is the good stuff these role-playing things?

The last games I had were played on an Atari 2600...
 
So if I'm stuck with a 52" monitor chugging along at 1920X1080 the nVidia GTX 570 makes more sense than a 6970?

My GF just asked what games I'm interested in anyway? I told her the new 3D P0rn ones. It resulted in stomping upstairs and doors slamming.

ARE there any cool games where you fly F-22s? Is Call of Duty the best? Or is the good stuff these role-playing things?

The last games I had were played on an Atari 2600...

F-22s? Try ArmA II: Combined Operations if your machine can handle it. For something more actiony, and much better/cheaper than any COD, try Battlefield: Bad Company 2. You get infantry, tanks, vehicles, and helicopters online with great graphics and fully destructible buildings and environments.
 
Several more questions occur to me.

Am I gonna be bummed out I have only this one display if I get into gaming? Using this single 52" HDTV @ 1920X1080P might not be as cool as using several smaller higher resolution displays. (Extra $2000 at least-OH BOY!)

Only you can answer that. If you're happy with the pixel real estate and the picture you get at 1920x1080, you can save quite a bit of money by not having to buy new displays or a particularly high-end video card.


Or if I plan on using the 52" would a different card give me better gaming performance at the lower resolutions?

It's not that a different card would give you better performance; essentially, it's that a cheaper card would give you equal performance. If you have a 60Hz display, anything beyond 60 frames per second is wasted - you won't see it.


Why do they call it "High Definition" @ 1920X1080P if what I REALLY want is 2560X1600?

Hoo boy...it's because of the transition from the analog NTSC standard for broadcast television to the ATSC standard for broadcast television. The top 2 ATSC standards, 1920x1080 interlaced and 1280x720 progressive, got the label 'high definition.' It has nothing to do with computer monitors.
 
That's the point I tried to make a few weeks ago. It skews results.
The 6870 would come out ahead of the GTX 580 if things were done that way, which is very deceptive.

I don't know man...

I bought a GTX 580 so that I could run at 500fps at 1024x768 with no eye candy turned on, and upscaled on my 30" monitor :p

All joking aside, the [H]'s test method, showing how well each card performs at the settings it's most likely to be used at, is the best out there.

Sometimes it can be useful to compare frame rates for decisions like "Is this GTX 460 actually going to be an improvement, and worth the money over my 216-core GTX 260, in title X that I play?" There are sites that do this kind of review, and I look at their reviews for this kind of information. Most of the time, however, the [H]'s approach is more effective for me.
 
So if I'm stuck with a 52" monitor chugging along at 1920X1080 the nVidia GTX 570 makes more sense than a 6970?

To be honest, you're going to be happy with either of the cards at that resolution. It's like asking if you want a GT500 or a Z01. Both are awesome and will draw double takes wherever you drive them.


ARE there any cool games where you fly F-22s? Is Call of Duty the best? Or is the good stuff these role-playing things?

The last games I had were played on an Atari 2600...

Check out DCS: A-10 Warthog. While not an F-22, it is a Warthog!

Fallout 3 and Mass Effect are two great RPGs you really shouldn't miss.

Assassin's Creed 2 is a very beautiful and detailed game.

Batman: Arkham Asylum is another you shouldn't miss.

Quite a few games there that you will enjoy, I think.

Also check out Minecraft! ;)
 
Alaska Wolf -

As Heebs said, TVs suck as monitors. You'd have a much better gaming experience spending a few hundred bucks on any mid-range monitor than using your TV.

Bummer, I don't have the extra dough. The Sammy 52 works SWELL for web surfing and Blu-rays. Not so good for games though?

I wasn't planning on a video card upgrade but the PC Gods shat upon my parade.
 
To be honest, you're going to be happy with either of the cards at that resolution. It's like asking if you want a GT500 or a Z01. Both are awesome and will draw double takes wherever you drive them.




Check out DCS: A-10 Warthog. While not an F-22, it is a Warthog!

Fallout 3 and Mass Effect are two great RPGs you really shouldn't miss.

Assassin's Creed 2 is a very beautiful and detailed game.

Batman: Arkham Asylum is another you shouldn't miss.

Quite a few games there that you will enjoy, I think.

Also check out Minecraft! ;)

Thank you! Will do!
 
Zarathustra[H];1036569898 said:
I don't know man...

I bought a GTX 580 so that I could run at 500fps at 1024x768 with no eye candy turned on, and upscaled on my 30" monitor :p

All joking aside, the [H]'s test method, showing how well each card performs at the settings it's most likely to be used at, is the best out there.

Sometimes it can be useful to compare frame rates for decisions like "Is this GTX 460 actually going to be an improvement, and worth the money over my 216-core GTX 260, in title X that I play?" There are sites that do this kind of review, and I look at their reviews for this kind of information. Most of the time, however, the [H]'s approach is more effective for me.

A direct card comparison is one thing, but the sites that develop those types of charts list every card on the market.
The big-boy cards will always be at the bottom of the chart.
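
For what it's worth, the arithmetic behind those charts is trivial; here's a tiny sketch (prices and frame rates are made up purely for illustration) of why the flagship cards always sink to the bottom:

# Hypothetical cards: (price in $, average fps) - illustrative numbers only
cards = {"budget": (150, 40), "midrange": (250, 60), "flagship": (500, 80)}

for name, (price, fps) in sorted(cards.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name}: ${price / fps:.2f} per fps")

Price climbs faster than fps at the top of the stack, so the big cards always look like the worst "value" even when they're the only option at high resolutions.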
 
Thanks for the great review [H]. The 6900 series is showing good performance for the price. 6970 may be my next upgrade depending upon how well the 6990 performs.
 
Hoo boy...it's because of the transition from the analog NTSC standard for broadcast television to the ATSC standard for broadcast television. The top 2 ATSC standards, 1920x1080 interlaced and 1280x720 progressive, got the label 'high definition.' It has nothing to do with computer monitors.

I never thought I'd wish I could afford a $3000 monitor for games. Of course I never thought at 60 years of age I'd have to go back to school to learn how to use my telephone either. I can text, surf, and take videos OK but I screw up answering and placing calls. I'm informed I'm a champion Butt Dialer though. WTF?

Those 2560 monitors are TOO spendy for me. What would be a GOOD 2 or three monitor substitute for my HDTV? I gather the 6950 or 6970 OR the GTX 570 can run both my HDTV AND a two or three monitor setup?
 
Bummer, I don't have the extra dough. The Sammy 52 works SWELL for web surfing and Blu-rays. Not so good for games though?

I wasn't planning on a video card upgrade but the PC Gods shat upon my parade.

Hop on down to the monitor section and post your Sammy model number, or check out the AVS Forums, as they generally have good reviews on what each TV and monitor is good for.

Nothing wrong with gaming on a TV with a few exceptions.

Those 2560 monitors are TOO spendy for me. What would be a GOOD 2 or three monitor substitute for my HDTV? I gather the 6950 or 6970 OR the GTX 570 can run both my HDTV AND a two or three monitor setup?

Lots of great monitors out there for an Eyefinity setup. Dell, Asus, Acer, Samsung, LG, and even some generic models at your local Staples can be found dirt cheap. Look for some 24" 1920x1200 or 1080p monitors in the $150-200 price range and you should be good. Also double-check the monitor forums and keep an eye on the Hot Deals section for monitor deals.

Oh yeah, the 570 or any NVIDIA offering requires SLI for NVIDIA Surround. ATI can run Eyefinity on a single card. If you are going that route, then you're going to want the 6970.
 
So I guess it's just surreal to me to see praise about $140 in savings while also assuming all of us have blown $1000+ on monitors.

I think the conclusions are spot on. Why spend that extra $140 when you don't have to? I could do so many other, better things with that saved cash, like a sexy SSD, for instance. Or heck, putting that money into a bigger monitor! Just because I have a pretty nice system doesn't mean I'm filthy rich.


What I meant to say was that it seemed to me the card should perform BETTER at lower resolutions, since it would be working just as hard as it would at higher resolutions. But the comment implied the 6950 and 6970 had poorer performance at lower resolutions.

Seems backwards to me.

What does it matter? These cards are already overkill for anything less than 25x16, and they already give you more than playable framerates. What's the extra 5-10 frames going to do for you when you're already getting 70fps @ 19x12 with 8xAA, 16xAF, and max settings?
 