AMD ATI Radeon HD 5970 Video Card Review @ [H]

I'm starting to wonder about the need for cards like this myself. Especially since PC gaming as of late is console port after console port, a 4870 or equivalent will run just about anything out there. So cards like this just seem to be for people who feel the need to laugh at how badly Crysis scales, or people who need another 200fps in Left 4 Dead. And that's coming from someone who's owned a 3870X2, a 4870X2 and a 295.

There's some on sale on Overclockers UK, £520+ for a card with a basic 2-year warranty... one of ATI's partners really needs to get their thumb out and offer a warranty comparable to EVGA or BFG.
XFX does, but I am not sure about their UK support.
 
Visiontek also has a lifetime warranty, and I hear that everyone loves them.
 
Well, what card are you using now, Chris B? I'm guessing you weren't gaming at 2560x1600 resolution, which seems to be the standard @ [H]. I think most people are still gaming at 1920x1200 and below, so you might have a point.

At the minute, an 8500GT :p my 295 is in for RMA. I do game at 1920x1200. Even when I bought the 4870X2, I ended up selling it on and going back to the 3870X2, as the games I played ran fine on it. Just seemed to be a pointless purchase, and I had it maybe 2 months tops. Ended up getting the 295 for Crysis Warhead and Crysis Wars, probably about the only games on the market that actually need a card like that.
 
I'm starting to wonder about the need for cards like this myself. Especially since PC gaming as of late is console port after console port, a 4870 or equivalent will run just about anything out there. So cards like this just seem to be for people who feel the need to laugh at how badly Crysis scales, or people who need another 200fps in Left 4 Dead. And that's coming from someone who's owned a 3870X2, a 4870X2 and a 295.

There's some on sale on Overclockers UK, £520+ for a card with a basic 2-year warranty... one of ATI's partners really needs to get their thumb out and offer a warranty comparable to EVGA or BFG.

+1 on AMD board partners and warranty/service needing improvement. But really, the rivalry is a good thing. There IS a need for all of this, because it all filters down to the low end eventually. That includes consoles, iPhones, BlackBerries, everything.

More likely to see the card from 3rd parties rather than a new model from ATI, à la the 4890 clocked at 1GHz.

Doesn't matter. The design obviously has room for more power consumption, whether that comes from AMD itself or AIBs overclocking the existing design. The point is that there's some headroom designed into the base product. That's obvious from looking at the unoccupied VRM/other pads on the 5970 PCB.
 
At the minute, an 8500GT :p my 295 is in for RMA. I do game at 1920x1200. Even when I bought the 4870X2, I ended up selling it on and going back to the 3870X2, as the games I played ran fine on it. Just seemed to be a pointless purchase, and I had it maybe 2 months tops. Ended up getting the 295 for Crysis Warhead and Crysis Wars, probably about the only games on the market that actually need a card like that.

Just went from a 5850 to a 295 (shipped). Do those 295s get way too hot or what?
 
Great article.

I have to be honest, I just bought (2) 5870s, and when I heard the 5970s were going to be released I was thinking "ah crap". I can now rest easy again. The next step is to see what "real-world" 5970s get with overclocking.
 
Kinda disappointing review, especially when you have only 4 games to test and choose two games with the same engine. OK, we understood there is a problem with Unreal Engine AA lol
 
Badass card. Where is the 6-display Eyefinity version? ;) If anyone wants to sell me their XFX 5850 and get this... ;)
 
Just seen a review @ PC Perspective

5970 OC @ 875MHz beats 5870 CF

ATI cards are scaling extremely well.
 
A bit confused with the Batman AA results; it's clearly frame-rate capped at the start, which will massively skew comparisons between the speeds of cards.

I know it's the stupid Unreal Engine frame smoothing rubbish (which is no doubt some lame feature left in from the consoles to try and keep it from looking rubbish). You can force that off through the ini, and I'm sure many enthusiasts do, because it's not desirable: it tends to actually cause less smooth gameplay and, for me, odd mouse movements (lag or something).
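For anyone who wants to try it themselves, here's a minimal sketch of the ini edit. The file path and the TRUE/True casing are my assumptions for Batman: AA (other UE3 games keep the setting in their own <Game>Engine.ini), so back the file up first:

```python
# Sketch only: disable Unreal Engine 3 frame smoothing by editing the engine
# ini. The path is an assumption for Batman: Arkham Asylum -- other UE3 games
# keep the setting in their own <Game>Engine.ini. Back the file up first.
from pathlib import Path

ini = (Path.home() / "Documents" / "Eidos" / "Batman Arkham Asylum"
       / "BmGame" / "Config" / "BmEngine.ini")
text = ini.read_text()
# bSmoothFrameRate sits under [Engine.GameEngine] in stock UE3 configs;
# TRUE/True casing varies by game, so handle both spellings.
for old in ("bSmoothFrameRate=TRUE", "bSmoothFrameRate=True"):
    text = text.replace(old, "bSmoothFrameRate=FALSE")
ini.write_text(text)
print("done" if "bSmoothFrameRate=FALSE" in text else "setting not found")
```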
 
Just went from a 5850 to a 295 (shipped). Do those 295s get way too hot or what?


No idea yet, mine was faulty and only had one GPU working. Have to wait on it coming back from RMA. :p
 
Nice review.. just wanted to point out a slight mistake that would probably cause some noobs to cry..

"Taking the memory from 1.10v to 1.15v resulted in a smaller increase up to 5.4GHz GDDR4 compared to 5.2GHz GDDR5 before. This is a small bump, but still a very respectable clock speed overall. When all is said and done, our final overclocks were 900MHz/5.4GHz."

You put GDDR4 instead of GDDR5.. not that big of a mistake for people with common sense, but ya never know with some people..

Also, under fan noise you put "quite" instead of "quiet"..

Other than those 2 mistakes the review was awesome.. nice to see a dual-GPU card that can overclock as well as it did.. wonder what kind of overclocks you can get with an Accelero Xtreme heatsink on that beast.. maybe hit the 1GHz mark?
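Out of curiosity, the bandwidth math on that final memory overclock works out like this (a sketch assuming the 5970's published 256-bit bus per GPU; theoretical peak, not a measured number):

```python
# Per-GPU memory bandwidth at the review's 5.4GHz effective GDDR5 clock.
# The 256-bit per-GPU bus is from the published 5970 specs.
effective_gbps = 5.4                   # data rate per pin (Gbps)
bus_width_bits = 256                   # memory interface per GPU
bandwidth = effective_gbps * bus_width_bits / 8
print(f"{bandwidth:.1f} GB/s per GPU")  # 172.8 GB/s vs 128 GB/s at stock 4.0GHz
```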


BTW Brent.. I wonder if the reason you can't see the second GPU temp at idle is because it's turning the second GPU off to save power? Do you think that's a possibility?
 
Hmmm. I want this card, but it appears that 2x5850 is just as good. *Ponders*
So the extra shaders on the 5970 seemed to offer no advantages at all?
3200 shaders on the 5970 vs 2880 across 2x 5850s. *Slightly puzzled*

Still looking for a credible answer to this one.

What is the GPU<->GPU bandwidth like for 5970 vs 2x5850?
Could the GPU<->CPU bandwidth actually be bottlenecking the setup a little?

The 2x5850 has 10% fewer texture units and 10% fewer shader processors. Otherwise, they should be exactly the same at stock clocks.
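Putting rough numbers on that 10% (a sketch from the published specs; both parts clock their cores at 725MHz, and real-game scaling won't track theoretical FLOPS):

```python
# Theoretical single-precision shader throughput from published specs:
# both the 5970 and the 5850 clock their cores at 725 MHz.
clock_ghz = 0.725
flops_per_shader_per_clock = 2          # one multiply-add per clock
for name, shaders in (("HD 5970", 2 * 1600), ("2x HD 5850", 2 * 1440)):
    tflops = shaders * flops_per_shader_per_clock * clock_ghz / 1000
    print(f"{name}: {shaders} shaders -> {tflops:.2f} TFLOPS")
# HD 5970: 3200 shaders -> 4.64 TFLOPS
# 2x HD 5850: 2880 shaders -> 4.18 TFLOPS (about 10% less on paper)
```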
 
Still looking for a credible answer to this one.

What is the GPU<->GPU bandwidth like for 5970 vs 2x5850?
Could the GPU<->CPU bandwidth actually be bottlenecking the setup a little?

The 2x5850 has 10% fewer texture units and 10% fewer shader processors. Otherwise, they should be exactly the same at stock clocks.


Probably is a bandwidth limitation across the CrossFire bridge on the card itself.. the 3870X2 and 4870X2 both suffered from it.. but looks like it's not as bad with the 5970..
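For a sense of scale, here's the theoretical ceiling of that link (a sketch assuming the PLX bridge gives each GPU a PCIe 2.0 x16 connection, as on the 4870 X2; achievable transfer rates are lower in practice):

```python
# Theoretical per-direction bandwidth of one PCIe 2.0 x16 link -- the sort
# of pipe each GPU has to the bridge, under the assumption above.
lanes = 16
signaling_gt_s = 5.0        # PCIe 2.0: 5 GT/s per lane
encoding = 8 / 10           # 8b/10b line coding overhead
bandwidth_gbs = lanes * signaling_gt_s * encoding / 8   # bits -> bytes
print(f"{bandwidth_gbs:.1f} GB/s per direction")        # 8.0 GB/s
# Compare that with ~128 GB/s of local GDDR5 bandwidth per GPU, and it's
# easy to see why inter-GPU traffic is the expensive path.
```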
 
Hats off to AMD for delivering another choice for those upgrading/building new systems. :)
I suspected the rename to 5970 instead of 5870 X2 was going to mean lower performance than most hoped for.
 
Sad that there was no 295 in the review...


It was there in the apples-to-apples test.. they probably left it out of the full-on test because we already know a single 5870 owns the GTX 295.. so no sense in spending the extra time benchmarking all the games with it when we know it will lose by a large margin..
 
Brent,

Thanks for the review, however I am curious to know if you are able to test this card with CrossFire disabled, using only one GPU. My thought process is: if this is a down-clocked version of a 5870, what does it do as a single GPU? For those extreme gamers who can afford two 5870s, which would be the better deal? I assume I know the answer, but hard numbers are better and I am curious to know.

THANKS,

~GabooN
 
No Eyefinity for me till next year... sigh...
Looks like a great card. I'm surprised by the overclocks; I wish they would have unlocked CCC for the 5870 as well, without having to flash the BIOS. Oh well.
 
It was there in the apples-to-apples test.. they probably left it out of the full-on test because we already know a single 5870 owns the GTX 295.. so no sense in spending the extra time benchmarking all the games with it when we know it will lose by a large margin..

Since when? Last I checked, a 295 was usually 5-10% ahead of a 5870.
 
Something I forgot to mention in the article: the PLX chip in use on the 5970 is the same chip used on the 4870 X2; however, AMD has exposed a few options in the PLX chip that weren't in use on the 4870 X2. What those specific things are I do not know; they apparently weren't worth mentioning to us, and there was no mention of SidePort or anything.
 
Something I forgot to mention in the article: the PLX chip in use on the 5970 is the same chip used on the 4870 X2; however, AMD has exposed a few options in the PLX chip that weren't in use on the 4870 X2. What those specific things are I do not know; no mention of SidePort or anything.

I remember how much hype SidePort got on the 4870X2, and to the best of my knowledge it never got enabled. Lots of people saying they didn't enable it because it pushed up the wattage a ton, and ATI saying it consumed a couple more watts but wouldn't show any additional performance increase. Which raises the question.. why do it in the first place? :confused:
 
Perhaps I'm off on a tangent, but I'm curious about that AMD overvoltage app and whether Brent/Kyle tried it out with the 5850 or the 5870 on the RAM side. Particularly curious about the 5850 since that's what I own, but it has the most spartan power circuitry of the 3 boards.

5970 OTOH... just looked at a naked PCB shot of it, that thing has power circuitry flying out of its ass. And apparently, 1450 MHz GDDR5 too :eek:

The app AMD provided us only allowed two voltage options for the 5970: the stock voltage and the higher voltage indicated on the overclocking page. It was a slider, but it only gave us those two values, so there was not really any way to manually tweak it beyond those settings with this app. I have not tried it on any Cypress cards. When AIBs build their own apps, I'd expect to see much more control over the voltages on the 5970.

- No 4GB card?

- No Hex card?

- No 5950 card that spans the price/performance gap between 1x 5870 and 2x 5870?

- Crossfinity limited to internal 5970 cores on a dozen games, in landscape only? WTF?

I understand that a market leader can't be built in a day, but I expected a bit more. Especially after two months.

- No 4GB card yet. I asked if we might see 4GB boards, and the answer was that it is up to the add-in-board partners; the package is capable of supporting 4GB, now if someone is willing and able to produce one...

- Hex card?

- Nothing yet

- For now yes, for good reason, that was explained

Honestly, AMD has come out with 5 models of their next generation in a two-month span; I'm very impressed. I did not expect to see Hemlock this soon.

Could you do a test in Crysis Warhead?

Multi-GPU setups are having stuttering trouble in the ice level where you start in a tunnel with your squad (right after the first boss is defeated), and on the beach in the beach level.

Is that problem fixed?

I've tested with a GTX 295, GTX 260 SLI, 4870 CF, and 4890 CF. All of them have the same problem...

Just wondering if the problem still exists on the 5970; if not, I might grab one :D

I can look at it in the next eval. I did check out those levels at the playable settings shown, and they were at or above 30 FPS in those areas.

Kinda disappointing review, especially when you have only 4 games to test and choose two games with the same engine. OK, we understood there is a problem with Unreal Engine AA lol

Time was not on our side; I would have loved to include several more games.

A bit confused with the Batman AA results; it's clearly frame-rate capped at the start, which will massively skew comparisons between the speeds of cards.

I know it's the stupid Unreal Engine frame smoothing rubbish (which is no doubt some lame feature left in from the consoles to try and keep it from looking rubbish). You can force that off through the ini, and I'm sure many enthusiasts do, because it's not desirable: it tends to actually cause less smooth gameplay and, for me, odd mouse movements (lag or something).

Yeah, I'll have to play with disabling bSmoothFrameRate, which I did disable in Borderlands.

BTW Brent.. I wonder if the reason you can't see the second GPU temp at idle is because it's turning the second GPU off to save power? Do you think that's a possibility?

Perhaps, though the GPU isn't turned OFF, it is just put in a sleep state, so there should be some slight power running to it, methinks, and it should register at least room temperature, one would think. As more utilities become available, we'll take a closer look at that.

Sad that there was no 295 in the review...

There was; we used it in apples-to-apples testing, and we did use it as a baseline in Borderlands and Batman when we performed highest-playable testing on the 5970. I would have loved to include it throughout the evaluation in the highest-playable tables and graphs, but time was not on my side with this one.

Decent review Brent, but where are the overclocked game results??? :confused:

Again, I did not have time to go back and perform these tests, though I had wished to. We will include overclocked game testing in retail evaluations.

Brent,

Thanks for the review, however I am curious to know if you are able to test this card with CrossFire disabled, using only one GPU. My thought process is: if this is a down-clocked version of a 5870, what does it do as a single GPU? For those extreme gamers who can afford two 5870s, which would be the better deal? I assume I know the answer, but hard numbers are better and I am curious to know.

THANKS,

~GabooN

Negative, "CrossFire" is hard coded ENABLED on the 5970, just like the 4870 X2. There is no CrossFire tab in Catalyst Control Center to disable or enable it.
 
Negative, "CrossFire" is hard coded ENABLED on the 5970, just like the 4870 X2. There is no CrossFire tab in Catalyst Control Center to disable or enable it.

I'm pretty sure disabling Catalyst AI is meant to disable CrossFire? I remember some AMD guy saying at some point to leave Catalyst AI enabled for it to work.

Also, any chance of Kyle doing a video on this, showing it in action?
 
Can my PCP&C 750W handle a 5970 overclocked? I've also got my i7 920 OC'd to 4.1Ghz. Also is there a program I can download that will measure/monitor my system's power consumption?
 
BTW, did you guys notice that the 5870 just passed the GTX 295 in Crysis? Not only is the 5970 faster than the GTX 295; the 5870 is now also doing a better job with the latest drivers.
 
I love how you guys managed to snag a processor from the future! An i7 9120! AMAZING!
;)

Second page, second paragraph, first sentence.


P.S. You have to say amazing in the manner of this gentleman.
 
Great and informative review as usual, guys! Just letting you know about a typo in your test setup info: you have the Core i7 920 listed as the i7 9120.

And LULZ at video above... :p
 
Kyle:

Any likelihood of seeing these cards with more variety in outputs? À la 6x DP? Or perhaps 1 DVI and 2 DP? Or just 3 regular DP?

So long dental plan! (HDTV output, lol)
 
AA on ATI cards and PhysX on the CPU are intentionally crippled in Batman: AA; it is a severely flawed tool to use as a benchmark, as it intentionally and artificially favors NVIDIA.

http://www.techpowerup.com/104868/B...nables_AA_Only_on_NVIDIA_Hardware_on_PCs.html
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs
Anti-aliasing has been one of the most basic image-quality enhancements available in today's games. PC graphics hardware manufacturers regard it as more of an industry standard, and game developers echo them by integrating anti-aliasing (AA) features into the game as part of its engine. This allows the game to selectively implement AA in parts of the 3D scene, so even as the overall image quality of the scene is improved, so is performance, by making sure that not every object in the scene is given AA. It seems that one of the most well-marketed games of the year, Batman: Arkham Asylum, doesn't like to work with ATI Radeon graphics cards when it comes to its in-game AA implementation.

Developed under NVIDIA's The Way It's Meant to be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton said in his recent blog post that they had confirmed this with an experiment in which they ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that the option is not available for the retail game as it is protected by SecuROM.

With no in-game AA available to ATI Radeon users, although the feature does technically work on ATI Radeon hardware, the only way AA can be used is by forcing it in Catalyst Control Center. This causes the driver to use AA on every 3D object in the scene, reducing performance compared to when the game's in-game AA engine is used. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.

http://www.youtube.com/watch?v=AUOr4cFWY-s

http://forum.beyond3d.com/showthread.php?p=1332461
 
Hmm, is it me or is the article missing pieces? I do not see any of the GTX 295's stats in any of the games' performance charts; the GTX 295 is completely missing from all the comparison charts and graphs. All 3 of the ATI cards are in the charts clearly, but the GTX 295 is missing for me??

Using Windows 7 and the IE 8 it comes with.
 
Fun fact of the day..

The 5970 is 1288 times faster than 3dfx's Voodoo 1, released in 1996 (comparing pixel fillrate).
 
(Never mind, someone already asked about tri-fire)
 
Fun fact of the day..

The 5970 is 1288 times faster than 3dfx's Voodoo 1, released in 1996 (comparing pixel fillrate).

Moore's law is a beautiful thing...

Until it grinds to a screeching halt at sub-10nm dimensions using current litho techniques.
 
Ah... for a while I thought for sure AMD/ATI understood what was going on. For a while I was starting to believe they were forcing NVIDIA to be competitive, that they were collectively pushing the PC gaming world down the right path. That's what I get for having hope.

It would not surprise me in the least if, like the previous card, you find very limited quantities of this video card. In our lovely world of today, perception is reality, so if ATI only builds 10,000 of these cards and sells them all, they can go to the stockholders and say "See!! See!! We can't build them fast enough... they are flying off the shelves!!" Shareholders get their greedy rocks off, and the company prospers.

If ATI saturates the market with a $600 card everyone wants but virtually no one can afford, then they lose. It is the same thinking that got car companies, finance companies, and a lot of other industries in trouble in the last couple of years: gambling that your 1% die-hard faithful will save you.

I still consider myself a PC gamer, but I'm slowly losing interest because of these companies' continued failure to consider what is slowly choking this part of gaming to death... you can swear against the consoles as long as you live, but we all need to realize, like it or not... they are simply stomping PCs into the ground. PC gaming must become more economical. The money one will pay for this behemoth will land you not one but both the 360 and PS3 now... and because more and more games are becoming multi-platform... someone explain to me how companies like ATI and NVIDIA can justify my spending this ridiculous amount of money on something that will be outdated before I can get done with this post?

If you're an avid PC gamer, chances are during the single lifespan of a console (5 years, give or take), you'll probably build/re-build/upgrade your gaming PC at least twice, and probably for no less than $1000, especially when you're dropping $600 for a freakin' video card. There is absolutely no way this business model can continue. It's becoming a larger and larger waste of money, and for people like me, I will only bend over for so long. There simply isn't enough lube to make it feel right.
 