AMD Radeon HD 7970 Video Card Review @ [H]

I have no issue paying $550 for a video card that will give me a 20-30% performance increase over my current card. I will NOT pay $550 for a card that gives me a 5-10% performance increase...

Where are you seeing just 5-10%? If you're comparing it to the 580 SLI setup in your sig then that is just plain stupid...
 
I have no issue paying $550 for a video card that will give me a 20-30% performance increase over my current card. I will NOT pay $550 for a card that gives me a 5-10% performance increase...

Agreed. Also people don't understand that ATI / AMD has charged this much for single GPU video cards many times before. This is not new as we've shown in this thread. The 9800XT and X1950XTX both come to mind. I don't remember offhand if the X1800XT cost that much, but I don't think so. NVIDIA usually charges more on the high end, but most of the time they have a better-performing high-end part than ATI / AMD does. Also consider that the Radeon HD 7970 has a brand new GPU built on a 28nm process. While process upgrades do make things cheaper in the long run, initially the costs for the tooling have to be recouped, as do engineering and R&D costs. While AMD isn't the fabricator of these cards, the actual fab has to charge enough per chip to make a profit on their end. That trickles down to you and me. So the card is more expensive and will be until NVIDIA comes out with something that knocks them down a peg. If that doesn't happen until AMD's refresh, then the price simply won't drop much.
 
LOL...you nvidia fanboys think Kepler will be cheaper when it comes out! New processes are expensive until they mature...I know it's hard, but try and use some common sense.

I have no idea why anyone who's bought high end GPUs over the years would think this; the top single GPU nVidia part has never been below $500 that I can recall, and definitely not in the last 5 years.
 
I have no idea why anyone who's bought high end GPUs over the years would think this; the top single GPU nVidia part has never been below $500 that I can recall, and definitely not in the last 5 years.

There's been a couple of people posting here saying Kepler will be faster and cheaper (and also "around the corner"; I'd like to know where they get this info).

What I can't believe is the same posters still complaining about the price. This GPU is not for you, just get over it already.
 
Agreed. Also people don't understand that ATI / AMD has charged this much for single GPU video cards many times before. This is not new as we've shown in this thread.

Agreed, but my point was that in RECENT years this hasn't been the case, which is why I say that AMD customers are not used to paying this price. Indeed, a LOT of AMD customers have touted AMD's superior performance/price ratios for years now, citing nVidia's price gouging over and over, and now that AMD is bucking that trend there's some backlash.
 
Agreed. Also people don't understand that ATI / AMD has charged this much for single GPU video cards many times before. This is not new as we've shown in this thread. The 9800XT and X1950XTX both come to mind. I don't remember offhand if the X1800XT cost that much, but I don't think so. NVIDIA usually charges more on the high end, but most of the time they have a better-performing high-end part than ATI / AMD does. Also consider that the Radeon HD 7970 has a brand new GPU built on a 28nm process. While process upgrades do make things cheaper in the long run, initially the costs for the tooling have to be recouped, as do engineering and R&D costs. While AMD isn't the fabricator of these cards, the actual fab has to charge enough per chip to make a profit on their end. That trickles down to you and me. So the card is more expensive and will be until NVIDIA comes out with something that knocks them down a peg. If that doesn't happen until AMD's refresh, then the price simply won't drop much.

When I bought my 8800GTX it cost me $550.
 
Agreed, but my point was that in RECENT years this hasn't been the case, which is why I say that AMD customers are not used to paying this price. Indeed, a LOT of AMD customers have touted AMD's superior performance/price ratios for years now, citing nVidia's price gouging over and over, and now that AMD is bucking that trend there's some backlash.

It's only wrong if NVIDIA does it. Like paper launching or rebadging. It's perfectly fine for AMD to do these things.
 
I can't believe how everyone is complaining about the price of gaming graphics cards in the first place.

Nvidia's and AMD's workstation cards go for at least $1,000 at the high end... Nvidia's new top-of-the-line Quadro card goes for $3,999!

http://www.newegg.com/Product/Product.aspx?Item=N82E16814133347

Be glad you don't have to do any CAD engineering LOL. I've built $20,000 workstations for clients... they don't complain about an extra 50 bucks. Bottom line is they need the best available parts to do their job effectively and efficiently. And right now the best available video card for consumer gaming is the AMD 7970... so deal with it!!

Why everyone seems to think there will be a 200% performance increase with each new generation of CPUs, mobos, video cards, etc. is beyond me. AMD has made a product that is better in every way than the competition... it's faster, uses less power, and has new features that haven't even been taken advantage of yet... but it's crap and underwhelming??? LOL, ok.

We have reached the point where there simply is not that much room to grow anymore... you can only fit so many billions of transistors on a die, you can only do so much. Until we move away from silicon transistors and go to carbon nanotubes, holographic storage, and quantum computing, we are only going to see marginal improvements with the current technology.
 
If they threw in a 24" LCD with it, it would be a good deal, but I am done wasting my money on expensive video cards that give me marginal gains over what I already have, which is 2x5870. Is this card even any faster than 2x5870? I wanted to switch back to a single GPU too, but this looks like a sideways swap for $550.00. Not happening unless it is sub-$400.00.
 
One of the most invalid arguments is the price fight, and the idea that people with no gtx580 in their rig have to buy this one.

I'll "quote" someones post a bit.

I think if you don't have a card with good performance and you want to spend $500 or more on one, just fucking wait. I can't understand why it's so hard to accept the fact that the next one is going to be much better. I just bought a gtx580 - I don't give a rat's ass. And also, this is strictly for the HIGH resolution users. What is so hard about that? I would like my incoming gtx580 to have tech like this one does, but it doesn't. It's a good card, and so is the 7970. Wait for the 7950, and I'm hopeful someone unlocks it just like the 6950.

Silly as it sounds, the gtx580 in my case was a bandage which will be ripped off when we know more about the upcoming Kepler. When it comes out, whichever card I get (AMD or nvidia) will most likely be my last graphics card upgrade for, let's say, the next 2 to 3 years. Easily.

Bottom line is this: IF you don't have a card as good as those two, don't buy one the very first moment it comes out on the market. I think that would be just stupid.
 
One of the most invalid arguments is the price fight, and the idea that people with no gtx580 in their rig have to buy this one.

I'll "quote" someones post a bit.

I think if you don't have a card with good performance and you want to spend $500 or more on one, just fucking wait. I can't understand why it's so hard to accept the fact that the next one is going to be much better. I just bought a gtx580 - I don't give a rat's ass. And also, this is strictly for the HIGH resolution users. What is so hard about that? I would like my incoming gtx580 to have tech like this one does, but it doesn't. It's a good card, and so is the 7970. Wait for the 7950, and I'm hopeful someone unlocks it just like the 6950.

Silly as it sounds, the gtx580 in my case was a bandage which will be ripped off when we know more about the upcoming Kepler. When it comes out, whichever card I get (AMD or nvidia) will most likely be my last graphics card upgrade for, let's say, the next 2 to 3 years. Easily.

Bottom line is this: IF you don't have a card as good as those two, don't buy one the very first moment it comes out on the market. I think that would be just stupid.

Ugh that quoted post was painful to read. I'm not even sure I understood it.

In any case, I think if you need to buy a high end card today, you should wait for some availability of Radeon HD 7970s. If you can wait even longer, wait and see the Radeon HD 7990 and NVIDIA offerings at that time. The GTX 580 isn't as fast as the Radeon HD 7970. If you are like me and you need all the power you can get, the Radeon HD 7970 would be a worthy, albeit expensive, upgrade. If you have a GTX 580 or some other card which is overkill for your 22" LCD at 1920x1080 or lower resolutions, then you shouldn't worry about an upgrade, considering you bought way more card than you needed to begin with. Either way, I don't get people bitching about the price of the Radeon HD 7970 or its performance compared to the GTX 580.

The GTX 580 is now obsolete. This was inevitable.
 
I paid $550.00 CAD for a Voodoo5, but that was in my silly days and I am not so silly now.
 
LOL...you nvidia fanboys think Kepler will be cheaper when it comes out! New processes are expensive until they mature...I know it's hard, but try and use some common sense.

Do you think that card costs more than $30 to make? I think that when a "rival" card comes out, AMD will likely drop this price too, like how the 6000/500 series will become cheaper and cheaper once these new parts become available; this card currently has no competition (though it's not really a huge improvement, it's the latest iteration). With the potential mass of 580s/6970s hitting the market for cheap, and a smallish improvement over the last generation, I don't think many people will see $550 as making much sense when you could get an "old" almost-as-good card for $300. When nvidia comes out with the next iteration, the "market price" of a high end card will be lower, so you can expect the pricing to reflect this.
 
Very interested to see how the 7970 stacks up against nVidia's next offering. I'd like nothing more than to see some serious competition between the two - that is the best case scenario for the consumer.

Right now though the 7970 clearly holds the performance advantage, and people are upset about it. Why else would this thread grow to 30+ pages as quickly as it did :p
 
I don't think a 7970 is overkill for 120Hz gaming @ 1920x1080.

(can't wait until we get high resolutions at high refresh rates...)
 
Do you think that card costs more than $30 to make? I think that when a "rival" card comes out, AMD will likely drop this price too, like how the 6000/500 series will become cheaper and cheaper once these new parts become available; this card currently has no competition (though it's not really a huge improvement, it's the latest iteration). With the potential mass of 580s/6970s hitting the market for cheap, and a smallish improvement over the last generation, I don't think many people will see $550 as making much sense when you could get an "old" almost-as-good card for $300. When nvidia comes out with the next iteration, the "market price" of a high end card will be lower, so you can expect the pricing to reflect this.

Yes, the cards cost more than $30 to make.
 
Very interested to see how the 7970 stacks up against nVidia's next offering. I'd like nothing more than to see some serious competition between the two - that is the best case scenario for the consumer.

Yes! In a world of close competition, consumers and tech both flourish. I think people were expecting more, so that nvidia would have to seriously push the boat out (I was hoping for that too).

I don't think a 7970 is overkill for 120Hz gaming @ 1920x1080.

(can't wait until we get high resolutions at high refresh rates...)

There are no mainstream monitors with 120Hz above 1080p, so it will probably be a while before a GPU maker accounts for that! :(
 
Do you think that card costs more than $30 to make? I think that when a "rival" card comes out, AMD will likely drop this price too, like how the 6000/500 series will become cheaper and cheaper once these new parts become available; this card currently has no competition (though it's not really a huge improvement, it's the latest iteration). With the potential mass of 580s/6970s hitting the market for cheap, and a smallish improvement over the last generation, I don't think many people will see $550 as making much sense when you could get an "old" almost-as-good card for $300. When nvidia comes out with the next iteration, the "market price" of a high end card will be lower, so you can expect the pricing to reflect this.

Yep,

Prices are never set by what something costs to make. That is an old-fashioned concept called "cost plus" pricing. It just doesn't happen anymore.

Prices are set by what the market will bear.

Companies exist to make as much money as possible, so they will charge as much as they can for a product regardless of what it costs to make. This is also true in the opposite direction: if they can't charge much for a product, they are forced to lower prices (up until the point where they make less from something than they are spending to bring it to market, at which point a decision is usually made to stop making it).

Cost and sale price - for most well-managed businesses - are unrelated: make it as cheaply as possible, and sell it for as much money as possible.

In order to determine what the market will bear, companies try to predict and look at the demand curve for their product. They could charge $1,000 for the 7970, but they would sell fewer of them. They could charge $400 for the 7970 and likely sell lots.

Then they move up and down this curve and predict at what point the combination of volume and price will make them the most money, and that is where the price is set.
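To put toy numbers on it (everything below is invented for illustration; AMD's actual unit costs and demand data aren't public), the exercise looks something like this:

```python
# Toy sketch of pricing against a demand curve. The unit cost and
# the demand function are made-up numbers, NOT AMD's real figures.

unit_cost = 150  # hypothetical all-in cost per card, in dollars

def units_sold(price):
    """Invented linear demand curve: fewer buyers at higher prices."""
    return max(0, 95_000 - 100 * price)

def profit(price):
    return (price - unit_cost) * units_sold(price)

# Walk up and down the curve and keep the price that earns the most.
best = max(range(200, 801, 25), key=profit)
print(best, profit(best))  # on this toy curve, $550 is the sweet spot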

So, once Kepler is launched, unless it trounces the 7970 (which I don't think is likely; they will probably be within 5% of each other), they will not be able to price it at top dollar like the 7970 is priced. At that point the prices of both products will drop, regardless of what they cost to make or develop.

Let's say a video card does not perform as well as hoped when the company developed it. They have a few choices: drop prices and make less money than they had hoped, but at least make some money; keep prices where originally intended, sell fewer units, and also make less money; or just stop selling it altogether.

In most cases the first alternative is going to be the best for the business.
 
They cost closer to $30 than $500+. It's not exactly going to be a razor-thin margin.

True. There is likely a good amount of margin on these units if you just factor in production costs.

Factor in two years of R&D, developing and validating manufacturing processes, investing in manufacturing equipment and plants, etc., and the overall profitability is a little bit lower.

This is why they like to come out with their halo product before the competition, so they can justify the high prices and make a decent margin up front, so that it averages out well over the lifetime of the product.
 
Zarathustra[H] said:
True. There is likely a good amount of margin on these units if you just factor in production costs.

Factor in two years of R&D, developing and validating manufacturing processes, investing in manufacturing equipment and plants, etc., and the overall profitability is a little bit lower.

This is why they like to come out with their halo product before the competition, so they can justify the high prices and make a decent margin up front, so that it averages out well over the lifetime of the product.

There isn't jack shit for margin in this industry. The only time this changes is when the products are on the extremely high end of the spectrum. On a card like the Radeon HD 7970 I'd expect there to be $150 to $200 margin MAX on them. Now that's from distribution to retail, so I don't know what kind of margin there is from the manufacturer to distribution, but I'd wager it's not as much as you guys might think it is. Retailers and e-tailers do not make nearly as much money on stuff like this as you might think. Often they sell hardware below their cost in the hope you'll buy other items with it that do carry a high margin and make up for their losses: cables, software, mice, keyboards, etc.
 
Here are the Pros and Cons that I'm seeing for the 7970.

Pros:
10-40% speed increase over the GTX 580 1.5GB at similar cost.
Much better power consumption than the 580.
Slightly better tessellation than the GTX 580.
Comparable or slightly better multi-GPU scaling than the GTX 580.
Tremendous overclocking potential.
Big improvement for high-resolution multi-monitor users.

Cons:
Not as big a jump in performance as we expected for a next-gen/non-refresh part.
Expensive compared to previous high-end GPUs from AMD/ATI.

Am I missing anything?
 
Here are the Pros and Cons that I'm seeing for the 7970.

Pros:
10-40% speed increase over the GTX 580 1.5GB at similar cost.
Much better power consumption than the 580.
Slightly better tessellation than the GTX 580.
Comparable or slightly better multi-GPU scaling than the GTX 580.
Tremendous overclocking potential.
Big improvement for high-resolution multi-monitor users.

Cons:
Not as big a jump in performance as we expected for a next-gen/non-refresh part.
Expensive compared to previous high-end GPUs from AMD/ATI.

Am I missing anything?

You've covered most of the bases. However, the other thing to add on the pro-AMD side is that the Radeon HD 7970 can drive Eyefinity / multi-monitor gaming by itself. The GTX 580 can't, unless you get the more expensive Galaxy MDT GeForce GTX 580 which was used in the [H] review of the Radeon HD 7970. However, given how it works, there are some caveats to using it, such as games not supporting its method of reaching these resolutions.

On the pro-NVIDIA side, isn't NVIDIA better at 3D gaming than AMD is? Does AMD even support that yet? I've had so little interest in 3D gaming, due to my affinity for S-IPS panels, that I haven't paid any attention to that aspect of the cards' respective feature sets.
 
Pixel Fillrate - the number of pixels the raster operation units (ROPs) can render to a display in one second. Measured in MPixels/s or GPixels/s.

http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units

When designing graphics intensive applications, one can determine whether the application is fillrate-limited by seeing if the frame rate increases dramatically when the application runs at a lower resolution or in a smaller window.

http://en.wikipedia.org/wiki/Fillrate
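That check is easy to script if your engine exposes a frame-rate counter; here's a rough sketch (measure_fps is a hypothetical hook standing in for whatever benchmark call your engine actually provides):

```python
def is_fillrate_limited(measure_fps, threshold=1.5):
    """Apply the heuristic quoted above: re-render the same scene at a
    lower resolution and see whether the frame rate jumps.

    measure_fps(width, height) is a hypothetical hook; substitute
    whatever FPS counter your engine or benchmark exposes."""
    full_fps = measure_fps(1920, 1080)
    quarter_fps = measure_fps(960, 540)  # one quarter of the pixels
    # A big jump means pixel throughput was the bottleneck; little
    # change points at geometry, CPU, or driver limits instead.
    return quarter_fps / full_fps >= threshold
```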

AMD Radeon HD 6970 40nm process: Pixel Fill Rate = 28.16 GP/s

AMD Radeon HD 7970 28nm process: Pixel Fill Rate = 29.6 GP/s

http://www.youtube.com/watch?v=OoGzSbaFVuI

Conclusion: in terms of pixel fillrate, either the TSMC die shrink from the 40nm to the 28nm process is a bust, or the AMD design of this latest graphics card is a bust.
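For what it's worth, both numbers above fall straight out of the spec sheet: peak pixel fillrate is just ROP count times core clock, and both chips have 32 ROPs, so only the modest clock bump shows up (reference clocks assumed):

```python
# Peak pixel fillrate = ROP count x core clock.
def pixel_fillrate_gpixels(rops, core_clock_mhz):
    return rops * core_clock_mhz / 1000  # GPixels/s

print(pixel_fillrate_gpixels(32, 880))  # HD 6970, 880 MHz -> 28.16
print(pixel_fillrate_gpixels(32, 925))  # HD 7970, 925 MHz -> 29.6
```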
 
I would like to see a second set of tests, this time against a 3gb GTX580. Then I might actually consider buying the 7970. I've been burned by ATI in the past, so I'm quite leery.
 
There isn't jack shit for margin in this industry. The only time this changes is when the products are on the extremely high end of the spectrum. On a card like the Radeon HD 7970 I'd expect there to be $150 to $200 margin MAX on them. Now that's from distribution to retail, so I don't know what kind of margin there is from the manufacturer to distribution, but I'd wager it's not as much as you guys might think it is. Retailers and e-tailers do not make nearly as much money on stuff like this as you might think. Often they sell hardware below their cost in the hope you'll buy other items with it that do carry a high margin and make up for their losses: cables, software, mice, keyboards, etc.

They've made a staggering 3% profit since their ATi acquisition, so you're absolutely right. Factor in that the market is being crushed from below by Sandys and Llanos and it's even worse.

I don't expect either nvidia or AMD to drastically drop prices this time around. We were getting some real sweet deals because AMD felt they had to establish themselves as not only a competitor, but someone who could produce even better quality products. Now that that's over, they won't be so quick to drop their prices below nvidia's. Historically, Nvidia has been a lot like Intel in that they don't like dropping prices at all :p or at least they're far more hesitant.

AMD does need something to offer their customers other than Eyefinity. It is great, but not enough. Nvidia has 3D (which I hate), and until recently CUDA wasn't open-source. If OpenCL takes off and they can finally get developers to use that big-ass video card for FP-related tasks, that'd be awesome. I guess PCIe 3.0 isn't that bad then...

Technically AMD has 3D too, but it isn't supported nearly as well
 
On the pro-NVIDIA side, isn't NVIDIA better at 3D gaming than AMD is? Does AMD even support that yet? I've had so little interest in 3D gaming, due to my affinity for S-IPS panels, that I haven't paid any attention to that aspect of the cards' respective feature sets.

This might be the main feature difference between nVidia's and AMD's parts these days, and one reason why AMD cards aren't an option in my main gaming rig. AMD does support 3D in their drivers, but all the hardware is done by 3rd parties, which does in theory make it more open than nVidia's proprietary hardware; still, it looks like nVidia does a lot more work in this space than anyone else.

But even on the driver side AMD is well behind: they just added Eyefinity and CF support for 3D, something that nVidia has had for 18 months. It's that lack of effort that pretty much proves that AMD is only doing 3D support as a checklist item and not much else.
 
http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units



http://en.wikipedia.org/wiki/Fillrate

AMD Radeon HD 6970 40nm process: Pixel Fill Rate = 28.16 GP/s

AMD Radeon HD 7970 28nm process: Pixel Fill Rate = 29.6 GP/s

http://www.youtube.com/watch?v=OoGzSbaFVuI

Conclusion: in terms of pixel fillrate, either the TSMC die shrink from the 40nm to the 28nm process is a bust, or the AMD design of this latest graphics card is a bust.

Nope.

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/26

[benchmark chart from the linked AnandTech review]
 
...It's that lack of effort that pretty much proves that AMD is only doing 3D support as a checklist item and not much else.

It doesn't hurt them just in GPU sales but on the CPU side as well, where they're having the exact same issue. They favor open source, and I love that about them. Unfortunately developers and manufacturers, on both the software and hardware sides, need to have their hands held and be shown that doing a bit more work can actually pay off for their bottom line. Forget 3D; it's a gimmick that barely anyone uses. OpenCL can be an absolute goldmine if they play their cards right.
 
Great review, thanks for taking the time to lay it out!

I doubt you'll be reading 17 pages into this thread, but as a long-time reader I've gotta mention this. I know you explained your rationale behind using a heavily overclocked and modified 580 up front, and a lot of folks seem to miss that you point it out repeatedly, in advance and in the summary. I'm not one of them, nor am I some rabid fanboi for either camp. I'd simply like to add my 2c on the issue, so here goes:

If you are going to use a heavily modified and overclocked 580, you should be using a heavily modified and overclocked 6970 as well. You state your intent is to show what the highest potential of the previous generation cards is vs. the newcomer, so folks can make a much more informed decision on upgrading. I understand that logic. What I do not understand is why you'd leave a hole in that data by not showing the full potential of the 6970 line as well. There might be restrictions placed on you by AMD... I simply don't know. IMHO the highest potential of the previous generation from both camps should be represented for a complete data set. While not as extreme as the 580, had you used a 6970 such as:

PowerColor PCS+ AX6970

and mentioned the price and factory overclock in advance, I think the review would have much greater value along the lines of your stated intent.

Thanks again for the review.
-scoot
 
Conclusion: in terms of pixel fillrate, either the TSMC die shrink from the 40nm to the 28nm process is a bust, or the AMD design of this latest graphics card is a bust.

Fill rates and bandwidths, just like core clock speeds and core counts, are a pointless way of comparing GPUs/video cards of different designs.

Within the same family of similarly designed cards (like the 400 and 500 series GeForce cards) they can mean something, but when comparing GPUs of different generations/designs, the theoretical numbers are completely meaningless.
 
But even on the driver side AMD is well behind: they just added Eyefinity and CF support for 3D, something that nVidia has had for 18 months. It's that lack of effort that pretty much proves that AMD is only doing 3D support as a checklist item and not much else.


Yeah, but 3D is just a fad anyway. Completely uninteresting.

I tried it the first time it came around, bundled with my GeForce 2 card. I didn't care for it then, and I think it's a silly waste of time now.
 
3D has never really looked good to me in 95% of the movies and games I've seen it in. It reminds me of pop-up books: some items are kind of 3D, but the background is still flat. None of it ever looked truly right.
 