Sapphire HD 7970 6GB Toxic Edition

HardOCP News

With 6GB of GDDR5 and a "Lethal Boost" button, the fellas at Overclockers Club say that Sapphire has "hit one out of the park" with its Radeon HD 7970 6GB Toxic Edition video card.

When it came to overclocking this card I was pleased to see clock speeds of 1235MHz on the core and 1735MHz on the 6GB of memory. The only other HD 7970 I have tested to reach this level was a factory overclocked and water cooled HD 7970, which ran significantly cooler but needed more current than the Toxic Edition did. Sapphire added to the feature set on the Toxic Edition with several means of overclocking the card.
 
Nice review. The only con they posted is price, even though they only have an "estimated" price of $680.
 
God bless them for doing proper stability testing.

Maximum Clock Speeds:

Testing for the maximum clock speed consists of looping Unigine 3.0 for 30 minutes each to see where the clock speeds fail when pushed. If the clock speed adjustment fails, then the clock speeds and tests are rerun until they pass a full hour of testing.
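That back-off-and-retest procedure boils down to a simple loop. A minimal sketch (hypothetical names; `passes_stress_test` stands in for a full 30-minute Unigine run):

```python
def find_max_stable_clock(start_mhz, step_mhz, passes_stress_test):
    """Back off the clock until a run passes, then report the stable speed.

    passes_stress_test stands in for a full benchmark loop at that clock."""
    clock = start_mhz
    while not passes_stress_test(clock):
        clock -= step_mhz  # adjustment failed: dial back and rerun the test
    return clock

# Toy stand-in: pretend the card is stable at a 1235MHz core or below.
print(find_max_stable_clock(1300, 5, lambda mhz: mhz <= 1235))  # 1235
```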
 
A few things wrong with this one.
1) Multiple cards do not increase your total usable memory.
2) 3GB cards can handle Eyefinity resolutions just fine.

Back when I was running triple 1920x1200, I was getting mighty close to 3GB of memory used at only 2X MSAA. Go higher resolution than that or a higher level of AA and you'll see some significant benefit from having 6GB.

Now why they gave it the standard connections and didn't make it Eyefinity 6 is beyond me. Giving it 6GB of memory is going to be just about useless on a single monitor. Really high AA on a single 30", maybe.
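For a rough sense of the numbers: the render targets themselves are only a small slice of VRAM; it's textures and other assets at high resolution and AA that eat the rest. A back-of-envelope sketch (illustrative function, assuming 4 bytes per pixel for the color and depth buffers, with MSAA multiplying the sample count):

```python
def framebuffer_mb(width, height, monitors=1, msaa=1, bytes_per_pixel=4):
    # Color + depth buffers, each holding `msaa` samples per pixel.
    # This counts only the render targets; textures and geometry, which
    # dominate real VRAM usage, are not included.
    pixels = width * height * monitors
    color = pixels * bytes_per_pixel * msaa
    depth = pixels * bytes_per_pixel * msaa
    return (color + depth) / 1024**2

print(round(framebuffer_mb(1920, 1200, monitors=3, msaa=2)))  # 105 (MB)
```

The ~105 MB figure for triple 1920x1200 at 2X MSAA shows that the quoted near-3GB usage comes almost entirely from game assets, which is exactly what scales up at higher resolutions and AA levels.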
 
Almost nice card.

As already said... Had it come with Eyefinity 6 mini-dp connections for 5 screen eyefinity (and thus use the 6GB very well) it would have been an insta-buy for me.

As it is... No.

Sigh...
 
Very powerful card. 1200 MHz out of the box and stable at 1250+ MHz. The only problem is price. You could pick up a Gigabyte HD 7970 Windforce or Sapphire HD 7970 Dual-X and get very close to those speeds for much cheaper, though to be honest you are not going to get any OC guarantees, unlike the HD 7970 Toxic, which comes at 1200 MHz out of the box.
 
God bless them for doing proper stability testing.

Maximum Clock Speeds:

Testing for the maximum clock speed consists of looping Unigine 3.0 for 30 minutes each to see where the clock speeds fail when pushed. If the clock speed adjustment fails, then the clock speeds and tests are rerun until they pass a full hour of testing.

TBH Unigine 3.0 is a good starting point but hardly definitive. I've had OCs that would loop Unigine 3.0 for 30 minutes without issues but fail within 10 minutes of BF3 on a 64-man server. BF3 is my personal "acid test" of late. My current GPU OC will pass Unigine looping at well over 1250 on the GPU and 1800 on the memory, but will not last more than a few minutes in BF3 without starting to artifact.

That said I'd love to get my hands on one of these binned Toxic GPU's... but not at that price point.

:p
 
Got lucky on a sale and picked up a 7970 for 330 euro here.
1200MHz out of the box for me.
 
few things wrong with this one.
1) multiple cards do not increase your total usable memory.
2) 3GB cards can handle eyefinity resolutions just fine.


A few things wrong with this one.
1. He said single card.
2. Yeah, sure it can handle it if you are only running 5760x1080, but there are people running 3x2560x1600 (too lazy to do the math) that could definitely use 2x 7970 6GB cards.
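The skipped math, for the curious (a quick sketch):

```python
triple_1600p = 3 * 2560 * 1600  # 12,288,000 pixels
triple_1080p = 3 * 1920 * 1080  #  6,220,800 pixels
print(triple_1600p, triple_1080p, round(triple_1600p / triple_1080p, 2))
```

Triple 2560x1600 pushes nearly twice the pixels of triple 1080p, which is where the extra VRAM starts to matter.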



Back when I was running triple 1920x1200, I was getting mighty close to 3GB of memory used at only 2X MSAA. Go higher resolution than that or a higher level of AA and you'll see some significant benefit from having 6GB.

Now why they gave it the standard connections and didn't make it Eyefinity 6 is beyond me. Giving it 6GB of memory is going to be just about useless on a single monitor. Really high AA on a single 30", maybe.

Because it uses Eyefinity 2.0, which means you can use a splitter on the mini-DP ports instead. No reason to throw six DP ports on there when you don't have to.
 
eyefinity (5760x1080) on a single card

I usually go Nvidia, this card has my attention.

The vram is nice. I always seem to cap my card by hitting the vram limit and this would give some nice room.

Any nvidia products have even close to that?
 
TBH Unigine 3.0 is a good starting point but hardly definitive. I've had OC's that would loop Unigine 3.0 for 30 minutes w/o issues that fail within 10 minutes of BF3 on a 64 man server. BF3 is my personal "acid test" of late. My current GPU OC will pass Unigine well over 1250 GPU and 1800 memory looping but will not last more than a few minutes in BF3 without starting to artifact.

That said I'd love to get my hands on one of these binned Toxic GPU's... but not at that price point.

:p


It may not be definitive in your case, but I guarantee it comes a lot closer to testing maximum limits than any other consistent method. Heaven is the standard right now for stability testing. If you weren't stable in a particular map in BF3 but were in Heaven, I guarantee your maximum clocks were somewhat close. Artifacting, however, is one of those things that varies depending on the game and the actual card.
 
because it uses eyefinity 2.0 which means you can use a splitter on the mini DP ports instead. no reason to throw 6 DP ports on there when you don't have to.

I assume when you say "splitter" you mean the DisplayPort 1.2 MST hubs that AMD has been talking about since October 2010. They never released those. You can't buy them. Even if you could, one DisplayPort connection doesn't have the bandwidth to handle three 30" or 27" (2560x1600 or 2560x1440) monitors at 60Hz.

The only monitors without DisplayPort you could really justify a card like this for are first-generation 30" screens, like the ACD, Dell 3007, and a couple of others. I suspect there are very few people who have three or more of those screens and haven't already bought DP to DL-DVI adapters. Future 4K or Retina displays will require DP to run at 60Hz, and won't work with the MST hubs due to bandwidth limitations. There's no good reason at all to load up a 6GB 7970 with anything other than six DP plugs.
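A rough bandwidth check supports the point. DP 1.2 carries 17.28 Gb/s of effective data over four HBR2 lanes after 8b/10b coding; the 20% blanking overhead below is an assumption:

```python
def stream_gbps(w, h, hz, bpp=24, blanking=1.2):
    # Per-stream data rate; the 20% blanking overhead is a rough assumption.
    return w * h * hz * bpp * blanking / 1e9

DP12_EFFECTIVE_GBPS = 17.28  # four HBR2 lanes after 8b/10b coding
needed = 3 * stream_gbps(2560, 1600, 60)
print(round(needed, 1), needed > DP12_EFFECTIVE_GBPS)  # ~21.2, True
```

Even this rough estimate puts three 2560x1600@60Hz streams past what a single DP 1.2 link can carry.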
 
Hopefully this Sapphire 7970 special edition does not have the coil whine issues present in the 7970 Dual X OC
 
I'd buy a 7750 six-port card in a heartbeat over the 7850 it has. My work PC (although it only has 3 monitors) would be sick with 6!

Completely unnecessary too.
 
It's not outperforming the 690 (lol) or even the 680 on average, with the 670 right behind it. And why does a 6GB card seem to fall short of the 3GB cards at multi-GPU resolutions? This taught me something very interesting about GPUs and on-board memory's role in gaming. Thanks for the enlightenment! :) Seems like memory frequency, CUDA cores or stream processors (pick your favorite), and bus bandwidth are what to shop for? At $700 it may be better to run CFX/SLI 6970s/670s. o_O
 
It's not outperforming the 690(lol) or even the 680 on average with the 670 right behind it. And why does a 6GB card seem to fall short of the 3GB cards when applying multi-gpu resolutions? This taught me something very interesting about GPU's and on-board Memory's role in gaming. Thx for the enlightenment! :) Seems like mem frequency/cuda or streams(pick your favorite)/and bus bandwidth are what to shop for? At 700 bux it may b better to run cfx/sli 6970/670's o_O

Off the top of my head, I'll bet that the increased latency of having more memory hanging off each memory controller probably made it less competitive.

One of the issues with increasing VRAM is that you usually need more than one card to take advantage of the frame-buffer headroom...
 
I think they missed the big concept of testing when they did not do high-res Eyefinity with FSAA. I guess they need two cards and 5 monitors to really see the benefit of 6GB, but that whole topic was not even discussed or hinted at.


Maybe [H] will do a proper review with a setup worthy of 6GB cards.
 
It may not be definitive in your case, but I guarantee it comes a lot closer to testing maximum limits than any other consistent method. Heaven is the standard right now for stability testing. If you weren't stable in a particular map in BF3 but were in Heaven, I guarantee your maximum clocks were somewhat close. Artifacting, however, is one of those things that varies depending on the game and the actual card.

I agree it's a really good start, which is why I use it as well. But my second question nowadays is nearly always "how's that OC hold up in BF3?" ;)
 
And yet it still uses the same sh!tty ATI/AMD drivers as all their other cards.
 
It's not outperforming the 680 on average ...

So.... when an NVIDIA card is a few FPS ahead of AMD then NVIDIA is "owning" AMD.

But when AMD is a few FPS ahead of an NVIDIA card, then AMD is not outperforming "on average," so NVIDIA still wins.


Got it!
 
few things wrong with this one.
1. he said single card.
2. yeah sure it can handle it if you are only running 5760x1080. but there are people running 3x2560x1600 (too lazy to do the math) that could definitely use 2x 7970 6GB cards

A few things wrong with this one.
1) By stating "on a single card", he is probably referring to single vs. multiple cards. Unless you think he means half of a card vs. a single card? :)
2) Pretty sure at that resolution you're going to hit a GPU wall before a memory one, so you'll need 3-4 cards. Additionally, if the 680 is fine with 2GB running 5760x1200/1080, there is no need for 6GB for 99% of multi-display users at the moment. :)
Which is exactly what my point was. One card with 3GB is enough memory for 3 displays, so the "eyefinity 5760x1080 on a single card" comment makes no sense JUST because it has 6GB.
 
So.... when an NVIDIA card is a few FPS ahead of AMD then NVIDIA is "owning" AMD.

But when AMD is a few FPS ahead of an NVIDIA card, then AMD is not outperforming "on average," so NVIDIA still wins.


Got it!

I hadn't really said anything about winning or losing as you've interpreted. I was simply stating that a card of ANY brand with 6GB of VRAM seemed to symbolize superiority until the real-world testing, which proves that more (in the wrong place, in this case VRAM) isn't always better. And FYI, my GPUs are AMD Radeon HD cards. And yes, I'm an AMD/ATi aficionado who appreciates good tech from any vendor, green, red, or whatever.
 
I would say this card, or a pair of them, would be a great investment, but with only 2 DP connectors and no damn hubs available anywhere, what's the point? If they were gonna make such a badass card, why didn't they just equip it with 4-6 DisplayPorts?

It makes no fucking sense.
 
Despite the advantages of DP over the others, many people still use displays that do not have DP inputs. In my opinion, they shouldn't have catered to people cradling old tech. Move forward or get left behind.
 
TBH Unigine 3.0 is a good starting point but hardly definitive. I've had OC's that would loop Unigine 3.0 for 30 minutes w/o issues that fail within 10 minutes of BF3 on a 64 man server. BF3 is my personal "acid test" of late. My current GPU OC will pass Unigine well over 1250 GPU and 1800 memory looping but will not last more than a few minutes in BF3 without starting to artifact.

That said I'd love to get my hands on one of these binned Toxic GPU's... but not at that price point.

:p

Well, not arguing with your post per se, but Unigine was always the single application that would crash my system when stability testing my GPU clocks. I was running 1290MHz on the GPU (7970, water) in 3DMark 11, but I had to dial it down to 1225 in Unigine.
 
well, not arguing with your post per-se, but unigine was always the single application that would crash my system when stability testing my GPU clocks. I was running 1290Mhz on GPU (7970, water) in 3DM11 but I had to dial it down to 1225 in Unigine

Go with what works for you, YMMV always applies. ;) As I stated earlier, Unigine is a good start and I do use it as well. I stopped using 3DMark some time back as it appeared to be stressing my system more than necessary and stopped being a good OC indicator. I know that sounds odd, as it's supposed to be stressful, but that's the impression 3DMark's latest iteration left me with.
 
Can you CFX this card with a reference 3GB card and still get the 6GB memory usage, or would they cap you at 3GB like the other card?
 
I'm sure Vega will be along shortly with benchmarks of 4 of these in 5-monitor Eyefinity on a Quadfire setup. :)
 
Sadly, this card cannot run a 5-monitor setup properly or I'd have four of them. To run five monitors with this card you would need no less than THREE different monitor connections, and you would be limited to 1920x1200 @ 60Hz max. Plus, no water blocks for them.
 
Lol, Newegg blacklisted my negative one-egg review of this GPU. Basically, I warned people not to buy this card if they want to do Eyefinity properly.

So much for Newegg being unbiased and offering the consumer educated purchasing choices based on other user feedback. Looks like Amazon.com will be getting more of my business.
 
Lol, newegg blacklisted my negative one egg review of this GPU. Basically, I warned people not to buy this card if they want to do eyefinity properly.

So much for Newegg being unbiased and offering the consumer educated purchasing choices based on other user feed back. Looks like Amazon.com will be getting more of my business.


You understand that negative reviews from someone that never purchased the product are looked upon as trolling, right? I am sure their moderators are not extremely technical or willing to research your beef, just to decide whether or not to allow you to assail a product they sell.
 
Every generation there are people saying "that's overkill." I remember hearing the same thing about the Ti4200 and its 64MB vs. 128MB versions. You know what's always better? More memory.
 
You understand that negative reviews from someone that never purchased the product are looked upon as trolling, right? I am sure their moderators are not extremely technical or willing to research your beef, just to decide whether or not to allow you to assail a product they sell.

You understand most people who want a 6GB card plan on using Eyefinity, right? Helping consumers make educated choices is what user feedback is all about. My comment was beneficial, as it would help prevent returns from unaware consumers. Newegg permits comments from non-owners of products, and their ENTIRE CURRENT ADVERTISING campaign is based upon honest user feedback. Bottom line: Newegg is not to be trusted anymore, as they now censor informative/constructive feedback, so please take your shilling somewhere else.

Every generation there's people saying "that's overkill". I remember hearing the same thing when it came to the Ti4200 and 64MB vs 128MB versions. You know what's always better? More memory.
It actually is though... my quadfire 7970s' GPU power gives out before VRAM even becomes an issue in 99% of games @ 5400x1920... this card should have had a 512-bit memory bus; then we would have had something special.
 
You understand, most people that want a 6gb card plan on using eyefinity right? Helping consumers make educated choices is what user feedback is all about. My comment was beneficial as it would help prevent returns from unaware consumers. Newegg permits comments from non-owners of products, and their ENTIRE CURRENT ADVERTISING campaign is based upon honest user feedback. Bottom line is, Newegg is not to be trusted anymore as they now censor informative / constructive feedback, so please take your shilling somewhere else.


It actually is though...my quadfire 7970s gpu power gives out before Vram even becomes an issue on 99% of games @5400x1920p....this card should have had a 512 memory bus...then we would have had something special.

You do realise the connections on that card work just fine for Eyefinity. Maybe not your Eyefinity setup, but they work fine for Eyefinity.

If someone has 3 monitors or 5 monitors, they obviously need to know what their monitors support and what the card has available.

So maybe this card doesn't work for your Eyefinity setup, but it sure as hell would work for mine. Stop crying foul and go burn some time in a game or something.
 