AMD Radeon HD 7970 Video Card Review @ [H]

Zarathustra[H];1038183552 said:
Yeah, but 3D is just a fad anyway. Completely uninteresting.

I tried it the first time it came around bundled with my GeForce 2 card. I didn't care for it then, and think it's a silly waste of time now.

Hehe. I still have the ~2000 Asus branded 3d shutter glasses :p




Hottt
 
Zarathustra[H];1038183552 said:
Yeah, but 3D is just a fad anyway. Completely uninteresting.

I tried it the first time it came around bundled with my GeForce 2 card. I didn't care for it then, and think it's a silly waste of time now.

LOL, you tried it in like the year 2000.

Well-done 3D games are awesome. The best I have seen is Killzone 3 on PS3, with passive glasses.

It was downright awesome. VERY immersive.

I've played with active glasses and GRID, and it was a huge improvement over driving games in 2D for me, since I have actually been on road courses and you could 'feel' braking points instead of memorizing a marker.

It's really starting to get good now.
 
I'll be looking to upgrade my system when Ivy Bridge comes out; hopefully Kepler will be out too, along with custom 7970s with nice quiet heatsinks. I'll be like a kid in a candy store.
 
LOL, you tried it in like the year 2000.

Well-done 3D games are awesome. The best I have seen is Killzone 3 on PS3, with passive glasses.

It was downright awesome. VERY immersive.

I've played with active glasses and GRID, and it was a huge improvement over driving games in 2D for me, since I have actually been on road courses and you could 'feel' braking points instead of memorizing a marker.

It's really starting to get good now.

You know what happens when you really want/think something is awesome? You trick yourself into thinking it is awesome. You are making crap up to justify spending the money for 3D.
 
You know what happens when you really want/think something is awesome? You trick yourself into thinking it is awesome. You are making crap up to justify spending the money for 3D.

Getting stabbed is so cool, man! You have to try it! I just stabbed myself in the knee with a pen knife, it was great! Excuse me, I have some medium-rare dirt cooking up for dinner.
 
You know what happens when you really want/think something is awesome? You trick yourself into thinking it is awesome. You are making crap up to justify spending the money for 3D.

A lot of ATI fans were anti-SLI until ATI finally got crossfire working. I'm sure they will come around if AMD ever gets 3D right. Same with physics, CUDA, etc. I remember when Starcraft 2 launched and AMD did not support AA in it. Some people were actually saying AA was not a big deal.
 
A lot of ATI fans were anti-SLI until ATI finally got crossfire working. I'm sure they will come around if AMD ever gets 3D right. Same with physics, CUDA, etc. I remember when Starcraft 2 launched and AMD did not support AA in it. Some people were actually saying AA was not a big deal.

You'd figure AMD would have a better 3D setup with Eyefinity anyway. It doesn't make it any less pointless, though. People aren't buying into the 3D craze for a reason, doesn't matter who rehashes it.

But you're right about the fanboys on both sides of the aisle. Sometimes it seems that people forget these companies don't give two shits about you.
 
3D is fantastic so long as you're not one of the 20% of the population whose eyes can't handle it. Most everyone else who bashes it is like the guy a few posts above: "I tried it in 1865 and it was no good then, so clearly nothing evolves and it can't be any good now either!"

Also laughable is the argument that people who claim to like 3D are just trying to justify their purchase - as if returns departments around the world evaporated. One can just as easily say that you don't like it because you can't afford it (sour grapes) or because AMD's 3D solution is plainly inferior (fanboyism).

Bottom line, if your eyes work like 80% of everyone else's, and you have the basic intelligence to manually adjust your convergence and depth settings, 3DVision, at least, looks spectacular in a good number of games. Trine 2 and Skyrim and The Witcher 2 become mind-blowing in 3DVision. Now that the dimness has been alleviated with 3DVision 2, it's even better.
 
You'd figure AMD would have a better 3D setup with Eyefinity anyway. It doesn't make it any less pointless, though. People aren't buying into the 3D craze for a reason, doesn't matter who rehashes it.

But you're right about the fanboys on both sides of the aisle. Sometimes it seems that people forget these companies don't give two shits about you.

Sort of like the fanboys that make a huge shitstorm over a card that outperforms and still costs less than its competitor?
 
3D is fantastic so long as you're not one of the 20% of the population whose eyes can't handle it. Most everyone else who bashes it is like the guy a few posts above: "I tried it in 1865 and it was no good then, so clearly nothing evolves and it can't be any good now either!"

Also laughable is the argument that people who claim to like 3D are just trying to justify their purchase - as if returns departments around the world evaporated. One can just as easily say that you don't like it because you can't afford it (sour grapes) or because AMD's 3D solution is plainly inferior (fanboyism).

Bottom line, if your eyes work like 80% of everyone else's, and you have the basic intelligence to manually adjust your convergence and depth settings, 3DVision, at least, looks spectacular in a good number of games. Trine 2 and Skyrim and The Witcher 2 become mind-blowing in 3DVision. Now that the dimness has been alleviated with 3DVision 2, it's even better.

I guess my eyes can't handle it then because it looks like shit to me. Everything has that pop up book look to it. Things don't really appear 3D at all. More like 2D objects on different planes.
 

I knew I should have listed the sites that I read yesterday..;)..ah, well...here they are:

AnandTech http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review
Guru3d http://www.guru3d.com/article/amd-radeon-hd-7970-review/
[H] http://www.hardocp.com/article/2011/12/22/amd_radeon_hd_7970_video_card_review/
Hardware Canucks [for some reason could not post this link here--paste operation kept mangling the url]?
Hardware Heaven http://www.hardwareheaven.com/revie...d-7970-graphics-card-review-introduction.html
Hardware Secrets http://www.hardwaresecrets.com/article/AMD-Radeon-HD-7970-Video-Card-Review/1458
Hot Hardware http://hothardware.com/Reviews/AMD-Radeon-HD-7970-28nm-Tahiti-GPU-Review/
Legit Reviews (bad) http://www.legitreviews.com/article/1805/1/
NeoSeeker http://www.neoseeker.com/Articles/Hardware/Reviews/AMD_HD_7970/
TechPowerUp http://www.techpowerup.com/reviews/AMD/HD_7970/
TechRadar (terrible!) http://www.techradar.com/reviews/pc...phics-cards/amd-radeon-hd-7970-1049734/review
Vortez (strange) http://www.vortez.net/articles_pages/amd_hd7970_3gb_tahiti_review,1.html

Why those 12 sites, exactly? They just happened to be listed in the hardware section of yesterday's (12/22) Bluesnews http://www.bluesnews.com/ edition for 7970 reviews.

I wasn't implying that nobody else tested Eyefinity, just that out of the 12 sites listed above that I read, only 2 sites covered it, and only one site covered it decently--[H].

I stopped reading Tom's years ago--has Tom's improved that much? I'll check it out, though. Thanks for the links!

Honestly, though, three of the links you gave me are (for me) foreign sites, one French, one German, and one Martian...;) I agree that bar charts are universally understood, but I don't generally frequent other-than-English sites for in-depth reviews simply because I am fluent only in English.

But let's count those sites, too. That would mean that only 7 out of 17 web sites bothered with Eyefinity coverage of any type or any degree at all in their 7970 reviews--not even a 50% coverage rate. IMO, pretty poor record for such an intrinsic feature of the card, most especially when you consider that the GTX 580, the card most of these sites directly compared with the 7970, doesn't even *offer* a similar feature.

Again--good job Brent!
 
Let's wait and see if the retail cards are indeed loud and I'll add it then. Thanks.
 
Good point on only needing a single card for Eyefinity. If there's excitement about 3D I'll add it.
 
You've covered most of the bases. However, the other thing to add on the pro-AMD side is that the Radeon HD 7970 can be used for Eyefinity / multi-monitor gaming by itself. The GTX 580 can't, unless you get the more expensive Galaxy MDT GeForce GTX 580 which was used in the [H] review of the Radeon HD 7970; however, given how it works, there are some caveats to using it, such as games not supporting its method of reaching these resolutions.

On the pro-NVIDIA side, isn't NVIDIA better at 3D gaming than AMD is? Does AMD even support that yet? I've had so little interest in 3D gaming, due to my affinity for S-IPS panels, that I haven't paid any attention to that aspect of the cards' respective feature sets.

Previous post was in response to this. Sorry!
 
...Fermi was designed to scale well into the future according to NVIDIA, and hopefully that's true, but sometimes those types of plans just don't pan out...

nVidia said the same thing, only with far more flowery and verbose speech, about nV30...;) (How could anyone ever forget nV30?)

I need more performance now damn it.

7970's your cup of tea, then.
 
What year was this? In today's money, this thing must cost a fortune!

I can remember buying 2D cards prior to the advent of the first *playable* 3d card to hit the market--the 3dfx Voodoo 1.

Who remembers the Matrox Millennium? Heh...IIRC, 2D, it retailed for about $600, and I remember "getting a bargain" on mine @$475...;)
 
If you are going to use a heavily modified and overclocked 580 you should be using a heavily modified and overclocked 6970 as well.

The Galaxy MDT card was required to test Eyefinity / Surround resolutions without having to get CFX/SLI involved.

I think Brent made the right decision not altering the "stock overclocked" speeds of the Galaxy 580 MDT; instead he made it clear that the MDT card was more expensive, and it was still getting outperformed.

The review makes it very clear that, short of GTX 580 SLI you won't find a faster performer at Eyefinity resolutions. And in no cases can you get a better-performing Surround solution for a competitive price.
 
I'm also one of those 20% who hate 3D, so no party for me :p
Still, the truth is that 3D is too artificial.
 
I'm glad most of you seem to think $1,100 to $1,200 dollars is an O-K price now for a pair of 7970 cards to run in Crossfire.

That's only because this card is the high end, of course. It's not "mid-range" and it's not "value."

What really bothers me the most, and for some odd reason no one is addressing this, is that this move can't be good for AMD. The more expensive something is, the less of it that company sells. This is true for any product in almost any scenario. Quantity is where the profits are.

If that was true then no manufacturer would bother to make "high-end" anything--because he couldn't make a profit. But, we've got high-end cars, high-end TVs, high-end phones--you name the product niche, and there's a "high-end" in there somewhere...;) Obviously, high-end makes profit for companies--in fact, the most profit per unit sold comes out of the high-end product category.

As far as "demand" goes it's like this: if demand is strong @~$550, but begins to wane @$600, then $550 is a desirable, sustainable price simply because lots of people will think that the product is *worth* $550.

Here's another example: the reason that up to now the ASP for nVidia's GTX 580 hardware has sustained itself at darn near the ~$600 mark (between $500 & $600) is because, even after the product has been on the market for a year, enough people still think it is worth paying that kind of money for. IE, nVidia is satisfied with the volume of GTX 580 hardware it's been selling, and therefore is satisfied with the demand for it, and so nVidia sees little reason to lower the price to stimulate demand. nVidia is pleased with present demand at present prices.

Flip side of the coin is that now the 7970 is here, and it's all-around better than the GTX 580 in pretty much every category, so nVidia's current OEM pricing for GTX 580 hardware is going to have to come down--because few will think a GTX 580 is worth $550 when they can buy an AMD 7970 for $550.

But mainly, remember that AMD is going to release mid-range and value-segment gpus within the 7000 series, too--and so I think it is entirely possible that ~$300 and ~$150 prices will emerge for 7xxx products and that AMD's target audience for those products will find them compelling at those prices. Your comments sound as though you think the only 7000-series product there will be is the 7970.

But, as many people have pointed out, yields are low, so it is claimed,

If all the 7xxx products were the same price, and there was just one product to buy, the 7970, then your sentiment might have merit. But "low yields" have nothing to do with the 7970's MSRP. The price of nVidia's single gpu flagship, the GTX 580, otoh, has everything to do with the $550 7970 price. The 7970 is coming in at the OEM/AIB pricing of the GTX 580, yet in all respects the 7970 is already a demonstrably better product in every category. Therefore, it doesn't stand to reason that the 7970 has any chance of failing at its introductory price point. If for some reason AMD should feel the need to stimulate 7970 demand, then and only then might we expect lower pricing. Of course, that will eventually happen anyway. But certainly not within the next several months, I should think.

Being an AMD fan, sure, this does bother me to see them lose ground in this area.

Your only problem here is that you keep forgetting that in a very short time much cheaper 7000-series products will debut. As the 7970 is AMD's high-end flagship, and it's a superior product to nVidia's GTX 580, *and* it is selling for the same ~ price as the GTX 580, how could it fail? It may well be that nVidia has priced the GTX 580 hardware so high that it isn't selling any of it at all--but somehow I doubt that very much...:D It really doesn't appear that AMD has "lost ground" with the introduction of the 7970.

I am hoping that the 7950 with 1.5gigs or 3gig of ram can be unlocked and that they have that card priced around $325 dollars or so.

A ~$300 7000-series card is no doubt in the works even as we speak...;)
 
Ugh that quoted post was painful to read. I'm not even sure I understood it.

Okay, I read it. It was really bad--probably just someone getting used to an English-language forum.

Either way I don't get people bitching about the price of the Radeon HD 7970 or the performance as compared to the GTX 580.

The GTX 580 is now obsolete. This was inevitable.

Yeah. :) That's the point I was trying to make; I wouldn't put my $500 on the first 7970s coming out. Although there was some article about an MSI 7970.
 
Zarathustra[H];1038183552 said:
Yeah, but 3D is just a fad anyway. Completely uninteresting.

I tried it the first time it came around bundled with my GeForce 2 card. I didn't care for it then, and think it's a silly waste of time now.

Couldn't agree more. Its main Achilles' heel, aside from representing what is essentially 2D parallax scrolling/effects as "3D", is the fact that both your eyes have to be good to perceive the illusion. My wife's left eye has a blind spot right in the center, and although in daily affairs she doesn't notice it very much, she cannot do anything with this shoddy, phony fad called "3D"--unless it is something that can be done with one eye.
 
The Galaxy MDT card was required to test Eyefinity / Surround resolutions without having to get CFX/SLI involved.

I think Brent made the right decision not altering the "stock overclocked" speeds of the Galaxy 580 MDT; instead he made it clear that the MDT card was more expensive, and it was still getting outperformed.

The review makes it very clear that, short of GTX 580 SLI you won't find a faster performer at Eyefinity resolutions. And in no cases can you get a better-performing Surround solution for a competitive price.

I do understand the rationale for using that particular 580; I said as much in my response. If the review were against the 580 only, I'd have no issues. That said, there was a 6970 included in the benchmarks, and I still feel a more complete picture of the previous generation's capabilities would have been provided by an equally off-the-shelf overclocked 6970.

If you have to use a specialty 580 for surround benchmarks that also happens to be overclocked, sitting on a specially made PCB for power efficiency, and fitted with an aftermarket cooler, you should use as similar a 6970 as you are reasonably able to lay hands on for purposes of comparison, with price differences and clocks noted as well. That simply makes more sense to me than using a stock 6970.

Brent does a good job of including the reference 580 numbers in the mix with power and heat measurements. I'm not baggin' on the review at all or implying it's biased in any way. I'm just pointing out where I feel it could have been better. It's a fair critique IMHO.
 
I guess my eyes can't handle it then because it looks like shit to me. Everything has that pop up book look to it. Things don't really appear 3D at all. More like 2D objects on different planes.

Yeah, that's unfortunate, because that's not how it looks at all to me unless you set depth to 0 and convergence to max. Only then does it look like toys popping off the screen to my eyes.
 
I don't get some of the people posting here.

The 7970 is FASTER AND CHEAPER than the 580 MDT, and is a LOT FASTER at right about the same price as a regular 580... and you are saying it costs too much and/or is a failure.

Where do you get whatever you are smoking?

Eyefinity on a single-GPU card with good quality settings... freaking awesome!

And it uses less power than the 6970 and loads less than the 580.

This new generation is full of win. Those of you who think otherwise need to open your eyes.
 
I'm sorry if what I wrote was a bit harsh. The lack of Eyefinity testing is real and until recently [H]ard|OCP was the only "Tier-1" site doing regular reviews at triple-monitor resolutions.
I stopped reading Tom's years ago--has Tom's improved that much? I'll check it out, though. Thanks for the links!

Honestly, though, three of the links you gave me are (for me) foreign sites, one French, one German, and one Martian...;) I agree that bar charts are universally understood, but I don't generally frequent other-than-English sites for in-depth reviews simply because I am fluent only in English.
I am not too fond of Tom's either, but that they started testing at 5760x1080 resolution is something they deserve to be commended for.
Among non-English sites (which I labeled as "for international readers"), hardware.fr is probably the one worth reading most.
 
Yeah, the 590 is really the competition for the 7970, not the 580. And the 7970 is a couple hundred cheaper to boot.
After reading some more reviews, I too get the impression that not the GTX 580 but instead the GTX 590 is the card the 7970 should be compared against. Especially at 2560x1600 and 5760x1080 the 7970 performs closer to the 590 than to the 580.
If you look at the $200 price and 150W power consumption difference the 7970 appears even more attractive.

 
Yeah, 3D has been a fad for 50 years. Seriously, why do people who don't like something always have to call it a fad? Hell, there are still people calling tablets a fad; that's pretty silly. If you don't like S3D, fine, you won't use it. But I wish people would stop telling those who use the technology that it's fake, or a fad, or this or that. Since people like me use the shit every day, I think I have a much better idea of how well it works and what it means to me than someone who tried it once at a store years ago.

The bottom line for me is that in some games and movies the visual experience is tremendously enhanced. Batman: Arkham City and Arkham Asylum are simply different games to me with S3D--especially Arkham City, which feels like a Batman holodeck. The sense of distance and space that I get from S3D just makes it more immersive, as though I really am looking out over a city. Sure, not all games work well with it, and few work at the level of the Batman games, but there are ever more titles coming, and indeed all of the big titles that came out this fall support 3D.

No adult who works everyday has to justify how they spend money to anyone else, let alone a bunch of strangers. I've bought many things that were crap in my life and regretted buying them, 3D Vision, isn't one of them. And whether or not anyone believes or cares is irrelevant to my enjoyment of the technology, a technology that I will continue to support and spend money on because I like it for me, what others think is of no consequence. That's how individuality works.

For some reason many of those who dislike 3D seem determined to want to make EVERYONE dislike it which makes no sense. I guess some get mad that effort is being devoted to something they do like, so what? Does everything a company works on or supports have to meet everyone's approval? Isn't the world big enough for you to enjoy your flat 2D only stuff and let others who can enjoy the depth and immersion of 3D enjoy that as well? Can't we just all get along?:D

nVidia just upgraded 3D Vision to 3D Vision 2, with new glasses and support for something called LightBoost technology that improves the light levels and color and reduces ghosting. Unfortunately, LightBoost requires new monitors, and I really wasn't ready to spend $2k on three new monitors until I read up more on how well LightBoost performed, but I did pick up a pair of the new glasses, and even without the new monitors they are a decent improvement over the old ones. No matter what one thinks of the technology, though, AMD's lack of support for S3D can't really be helping it; anyone into S3D on the PC is pretty much going to buy an nVidia card.

So the technology is far from perfect and it has a lot of issues but for those that work past them there's a lot to offer. It's something that I've really enjoyed and I will continue to buy products that support 3D technology on the PC.
 
I get the impression that not the GTX 580 but instead the GTX 590 is the card the 7970 should be compared against. Especially at 2560x1600 and 5760x1080 the 7970 performs closer to the 590 than to the 580.

I guess there are at least two ways to look at it: the single fastest GPU versus the single fastest GPU, or whatever old card more closely matches the performance of the new one. Both are valid, but for most it makes more sense to compare single card to single card to get a sense of the delta in performance. Indeed, the dual GPU would mask the improvements of the single GPU, I believe.
 
Minimum FPS is not that much better than the GTX 580 in the 1600p tests--even worse in some cases. That is one of the most important things for me. I don't have good experience with Radeons and frame drops, and I would like to know how the 7970 handles BF3 during explosions with lots of smoke, etc.
The 7970 could perform better than the GTX 590 at 5760x1080 because of its 3GB vs 1.5GB of VRAM. A much better comparison would be a GTX 580 3GB vs the 7970, but I don't know if there is any 580 3GB model with support for three monitors.
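For anyone curious why the VRAM gap gets brought up at triple-monitor resolutions, here's a rough back-of-the-envelope comparison of the resolutions discussed in this thread. Note this is only a floor: real VRAM use is dominated by textures, AA samples, and extra render targets, so treat the numbers as a sketch, not a measurement.

```python
# Back-of-the-envelope pixel counts for the resolutions in this thread.
# Actual VRAM usage is dominated by textures, AA samples, and render
# targets, so these figures are a floor, not an estimate of real use.

def raw_framebuffer_mb(width, height, bytes_per_pixel=4):
    """Size of one uncompressed 32-bit color buffer, in MiB."""
    return width * height * bytes_per_pixel / 1024**2

resolutions = {
    "1920x1080 (single monitor)": (1920, 1080),
    "2560x1600 (30-inch panel)":  (2560, 1600),
    "5760x1080 (3x1 Eyefinity)":  (5760, 1080),
}

for name, (w, h) in resolutions.items():
    megapixels = w * h / 1e6
    print(f"{name}: {megapixels:.1f} MP, "
          f"~{raw_framebuffer_mb(w, h):.1f} MiB per color buffer")
```

The takeaway is that 5760x1080 pushes three times the pixels of a single 1080p screen (and about 50% more than 2560x1600), which is why a 1.5GB card that is comfortable at 1080p can start swapping at Eyefinity/Surround resolutions.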
 
After reading some more reviews, I too get the impression that not the GTX 580 but instead the GTX 590 is the card the 7970 should be compared against. Especially at 2560x1600 and 5760x1080 the 7970 performs closer to the 590 than to the 580.
If you look at the $200 price and 150W power consumption difference the 7970 appears even more attractive.

I guess there are at least two ways to look at it: the single fastest GPU versus the single fastest GPU, or whatever old card more closely matches the performance of the new one. Both are valid, but for most it makes more sense to compare single card to single card to get a sense of the delta in performance. Indeed, the dual GPU would mask the improvements of the single GPU, I believe.


The 7970 is a VERY attractive card:

Compare the 7970 to a 590, with a baseline of 5760x1080 minimum for this class of card. Sure, you can run a 7970 on one monitor; there's lots of excess on [H]... ;)

But I usually aim for the sweet spot between price and performance. +/-$200 less for similar or better performance than a 590 (less than a 6990, however) at 5760x1080 HITS that sweet spot. Just one 7970 is over a 90% improvement on my 5870+5830 CF setup, PLUS it can MAX out all the games I now run at the lowest settings on Eyefinity. :eek: Heck, that's probably closer to a 200%-300%+ improvement when you factor in the eye candy with the 3GB. :eek:

Nvidia just has nothing in single GPUs to compare to the 7970 at Eyefinity resolutions--only the dual-GPU 590 (assuming you can find one). The 580, if anything, should probably be compared to the 7950 when it comes out. Nvidia obviously knows this: should they drop the price of the 580 to compete, AMD can just release the 7950 at the same price point and keep their margins on the 7970. So Nvidia is not moving their price until AMD releases the 7950 and they HAVE to. I'm sure AMD has PLENTY of candidates for the 7950 from binning 7970 parts; Nvidia did the same with the 580 & 570 vs the 6970. It's a no-brainer.

Then throw in near 100% CF scaling...
 
Then throw in near 100% CF scaling...

You simply can't say this about CF or SLI. While multi-GPU does tend to work, there are a lot of new titles that come out with broken multi-GPU support, and even when the performance numbers are great, the actual gameplay experience may not be.

Indeed, this chart shows pretty much no CF OR SLI scaling--but Batman is an nVidia TWIMTBP title.


So the point is, touting a number like near-100% CF scaling is fairly meaningless. And this is also true of SLI; there are far too many variables from game to game.
 
It scales from 51 to 91 on the 580.

The DX11 codepath is heavily optimized for nVidia, and probably uses tessellation more than anything.
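To put an actual number on claims like "near 100% CF scaling": one common way to express it is how much of the second card's theoretical doubling you really get. By that measure, the 51→91 FPS jump quoted above works out to roughly 78% scaling, not 100%. A minimal sketch of the arithmetic:

```python
# Sketch of what "scaling" means when people quote CF/SLI numbers.
# Efficiency here is the fraction of a second GPU's theoretical
# doubling that actually shows up in the frame rate.

def scaling_efficiency(fps_single, fps_dual):
    """Percent of the second GPU's potential realized: 100% means
    fps_dual == 2 * fps_single; 0% means no gain at all."""
    return (fps_dual / fps_single - 1) * 100

# The GTX 580 SLI figures quoted above: 51 FPS single, 91 FPS in SLI.
print(f"{scaling_efficiency(51, 91):.0f}% scaling")  # roughly 78%
```

Which is the broader point: per-game numbers like this vary so much that a blanket "near 100%" figure for either CF or SLI doesn't mean much.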
 