AMD's ATI Radeon HD 5870 Video Card Review @ [H]

Goodnight America and other places. I'm slightly drunk; my second team won our Aussie rules Grand Final (it's our Super Bowl: 99,000 at the MCG in the pouring rain!!, $200 for a bleachers ticket, and I got to see it for free, yehah!!)

5870 rulz, GL [h] and all you [h]erz....I will sleep well, hope you all do too t'nite!!!!!!!!


PS, the team is Geelong (the Cats, meoooow!!), AFL Premiers for 2009. WD Catters!!!!!!!!
 
I realize this. The only reason I'm focusing on Crysis is that, essentially, it's still the only game my 275 doesn't run well at max settings. Nothing else is really pushing our cards - ArmA II, yes, but it's more or less CPU-limited.

And it looks this way for the foreseeable future too. I don't see any games on the horizon that will push the technology envelope, aside from... Crysis 2. Games like Modern Warfare 2 and BioShock 2 are going to be console ports, meaning they will run similarly to their predecessors.

I suppose if there were more demanding games out there, splurging on a 5870 wouldn't be a hard decision. But that's not the case.

So spend $200, buy another 275, and enjoy. Here are my results with SLI 260s:

9/24/2009 9:35:11 PM - Vista 64

Run #1- DX10 1920x1080 AA=No AA, 64 bit test, Quality: High ~~ Overall Average FPS: 47.405

9/24/2009 9:39:31 PM - Vista 64
Beginning Run #1 on Map-island, Demo-benchmark_gpu
DX10 1920x1080, AA=No AA, Vsync=Disabled, 64 bit test, FullScreen
Demo Loops=3, Time Of Day= 5
Global Game Quality: VeryHigh
==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
Play Time: 59.37s, Average FPS: 33.69
Min FPS: 22.66 at frame 1948, Max FPS: 49.82 at frame 88
Average Tri/Sec: -7525541, Tri/Frame: -223397
Recorded/Played Tris ratio: -4.10
!TimeDemo Run 1 Finished.
Play Time: 54.93s, Average FPS: 36.41
Min FPS: 22.66 at frame 1948, Max FPS: 50.14 at frame 109
Average Tri/Sec: -7646426, Tri/Frame: -210018
Recorded/Played Tris ratio: -4.36
!TimeDemo Run 2 Finished.
Play Time: 54.51s, Average FPS: 36.69
Min FPS: 22.66 at frame 1948, Max FPS: 52.42 at frame 103
Average Tri/Sec: -7747647, Tri/Frame: -211158
Recorded/Played Tris ratio: -4.34
TimeDemo Play Ended, (3 Runs Performed)
==============================================================

Completed All Tests

<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

9/24/2009 9:39:31 PM - Vista 64

Run #1- DX10 1920x1080 AA=No AA, 64 bit test, Quality: VeryHigh ~~ Overall Average FPS: 36.55
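As a side note, the summary's "Overall Average FPS: 36.55" appears to be the mean of runs 1 and 2 only, treating run 0 as a warm-up loop. That's a guess from the numbers in the log above, not documented tool behavior, but it checks out:

```python
# Per-run averages from the TimeDemo log above.
run_fps = [33.69, 36.41, 36.69]

# Naive mean over all three loops.
naive = sum(run_fps) / len(run_fps)
print(round(naive, 2))   # 35.6 -- does not match the summary line

# Mean with the first (warm-up) loop dropped.
warmed = sum(run_fps[1:]) / len(run_fps[1:])
print(round(warmed, 2))  # 36.55 -- matches the reported Overall Average FPS
```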
 
After reading this review and Anand's, I was left feeling a bit let down by ATi's card. That being said, I'm definitely not the target audience for this card; there is no way I'd drop $300 on a card and then another $400 on two more 24" monitors so I can play my games on 3 screens. Besides, it seems like Anand had some real problems getting his games to play properly with Eyefinity, which I'm sure ATi will address ASAP.

On the other hand this should help drive prices down on other cards and that makes me :)

This is because Eyefinity will only work if you plug in at least one DP-capable monitor. The 5870 only has two clock generators, and once you plug in two DVI or DVI/HDMI connections, you have used up all the clock generators on the card. Kind of a shame they came up with something that has limited support and adds cost to what you're after.

From Dave Baumann at B3D
"The chip only has clock generators for two "legacy" (any non-DP output) outputs, so as soon as you plug any two of either the DVI or HDMI in there are no more clocks for the final one. The additional display output has to come via the DisplayPort output. We are working with vendors to qualify and help bring to market active adapter solutions that convert DP into other outputs."
 
[H] doesn't use the canned benchmark test.

My performance in the game isn't any different. I use Fraps but don't have a registered version, so I can't record the FPS over a period of time. Now, if someone wants to share their info with me so I can register it, I'll be happy to redo it with an actual game map/zone. I've completed the game twice now and still go back to play. I also have Warhead installed, but my brother has my game disk till turkey day, as he wanted to play over the summer.
 

My 4870X2 gets that at 1920x1200 with 4xAA / Very High, all on the tougher assault_map benchmark. GTX 260 Core 216s?
 
Are they overclocked?

No, stock speeds: 576/1350. My little room gets very warm when I play, so I wait till winter to OC more, as I keep my window open then and the nice cold air helps keep the temps where I prefer them, in the 65-70C range. FYI, I tend to run it at 626/1600 in the winter; if you're interested, I could redo the benchmark once night temps stay consistently low with the OC.
 
Everyone sees it differently. I see it as these guys telling it like it is; it seems a great little card. Yes, there may (or may not?) be a touch of pro-ATI bias, and the front page is littered with ATI advertisements, which I don't know (or care) if [H] gets paid for. But irrespective of that, it's not like Kyle and co. are telling fibs (like I did about my age - I'm really 23, hehe) and then saying 'quick, buy it before it's gone!!'

Hehe, we get our first AMD advertising contract in 7 years or so and we are biased? We have been advertising for NVIDIA or its partners nonstop for the last 7.

I have lost ad contracts because of what we say about products, and personally, I don't give a shit. We say what we say because it needs to be said, not because of an advertisement contract. Those come and go, but they always come back, because they KNOW our readers are intelligent and trust us to tell them the truth.
 
Objective journalistic integrity and not afraid to defend it. Why I come back to this site before any other.

Not to blow smoke up anyone's ass, but you guys are great. Keep on keepin' on.

Thank you sir, your readership and support are appreciated.
 
This is the reason I and many others here enjoy [H] so much. I've been reading here since 2000, and I've personally seen that this place has no loyalty to any brand, only to performance. It's a bit unsettling to see all the baseless accusations thrown your way. If the fanboys would just look over your reviews from over the years, the truth would smack them upside the head. Please keep bringing us unbiased, quality reviews. This is the only place on the web that I can trust not to lead us down the wrong path due to business alliances.

My man, wow, where to start. First of all, get your head out of the owners' asses here. OK, great, now that that's out of the way, let me commence.

I don't remember anyone here saying Kyle or any [H] staff was biased or in favor of AMD products in any way, shape, or form. I really think some of you guys are blowing it way out of proportion. On one hand you have [H]: great site, great reviews, good journalistic integrity and all that. They go out of their way to let us all know they won't just say or put up shit for a company because of money or any kind of incentive, and that's great, and it's one of the reasons I have been a reader and poster here for YEARS. On the other hand you have people who, the minute they see anyone not be in love with every morsel and juicy detail of a review, say you're bashing [H] or Kyle or whatever. What kind of nonsense and bullshit is that?

Take me as an example: the only issue I had with Kyle was that I personally felt he was being a little pushy with his enthusiasm for the 5870, that is all. I never questioned his integrity or his intent in any way, and I didn't imply it either. In fact it was a great review, and I like the [H] review style, but I disagreed with the impression I got that anyone with a current-gen card should upgrade to this now rather than wait to see what NVIDIA has to offer. That's not a bad thing; it's simply my view and my opinion. If that offends some people, then grow up and learn the difference between criticism and accusation and innuendo.

I have been a loyal fan, reader and poster at [H] for a long time and I usually agree with most of the reviewers here but just because I (or others) disagree with one aspect of a review doesn't suddenly mean we hate Kyle or Hardocp or are accusing anyone of anything.

/endrant
 
Kyle, about GT300: I've been in touch with a few of the NVIDIA folks, and they claim that next month they will have a paper launch, with the hard launch in November. Not that I question your information, but I wonder if the PR people are lying, as they always do at any company :D, and just covering things up so I won't write that their cards won't be on the market for half a year. Just want to ask if you are absolutely positive about your GT300 info (I trust [H] more than any PR stuntman).

If your info is correct, then I'm switching to ATI and their 5870 :)

Also got a question about the different connector types. Did you notice any differences in quality if you used DVI, HDMI, or DP?
 
Time for me to rest:

[attached image: Holycrap2.png]
 
GT200b (GTX 285) was a die shrink, and it took 8 months for it to be released after tape-out (from May '08 to January '09).

If scuttlebutt says tape out in August for GT300, wouldn't a November launch be extremely optimistic?
 
First off, thank you for the hella fine Xman smackdown. Gave me a big fatty grin. :)

Thanks for the launch info confirming the rumors; I'd been hearing a lot too from various sources, but too much noise to make sense of the signal. Hearing that you don't feel the launch is imminent carries significant weight with me.


Recently, and with who I think? :eek:

Oh yes, that was so epic, it had me quaking in my boots the moment I read it. Oye, I've read better.
 
Just for quick reference, without having to go back several pages in this thread, Brent has replied to this:
Hey Brent and Kyle, I've been thinking about the blurry-texture problem with SSAA... perhaps there would be a way to force negative LOD (Level of Detail) on the textures to balance it out? Nvidia's CP allows us to choose "Clamp" instead of "Allow", but Clamp does not allow the LOD to be negative, or to go below 0 (anisotropic filtering takes care of the rest). RivaTuner allows manual tweaking of the LOD when Clamp is chosen, but AF does not allow for negative LOD, right? Anyway, ATI could implement a -2 LOD when 2xSSAA is used, -4 LOD for 4xSSAA, and -6 or the maximum negative LOD for 8xSSAA, or something like that.

2xSSAA effectively doubles AF, while 4xSSAA gives 4x AF by itself if no AF is being used (or quadruples it anyway). Say, if we choose 16x AF with 2xSSAA, we're actually getting 32x AF as shown in the D3D AF Tester program, but if the textures are being blurred out by the supersampling, then the only logical solution is a much sharper LOD to try to alleviate the problem, no?

I'm looking forward to SSAA because I just cannot stand the shader aliasing in Unreal 3-engine games (Mass Effect, R6 Vegas, BioShock, etc.). It's as if FSAA is actually "HSAA" or half-screen AA, because only half of the things on the screen are being AA'ed. It was horrible with Lost Planet too!
with this:

It is possible that what we are seeing is a bug, but I have tried it in several games with the same result. I'll be expanding that testing in my follow-up to see how many games exhibit this behavior with SSAA. There is no way, AFAIK, to change the LOD manually.

Yep, SSAA should not be blurry, but it is, it could be a bug, or it could be how they have implemented it. I'll be testing it on more games in the follow-up.

I must have forgotten about ATI Tray Tools. It included a slider for tweaking LOD settings. Mer1 below just confirmed this with a link from PCGH which states "In comparison to the internal Oversampling, which can be forced on Geforce graphics cards with the tool Nhancer, the texture LoD is not adjusted with Ati's SSAA. To put it bluntly this means that the amount of AF is not increased."

Regarding the SSAA blurriness, I found this in PCGH's 5870 review:

There are also screenshots showing the difference between default and adjusted LOD.

In another review there is a comparison between the SSAA 2x/4x/8x modes on RV870 and GT200 in Oblivion and FEAR.
Basically, ATI's SSAA (RGSSAA) mode is better than the one used by Nvidia (OGSSAA), but you have to change the LOD. Bug or feature?

Hey Brent, do you still plan on doing an in-depth coverage of the new SSAA modes?

Would using 2x SSAA in combination with Edge-Detect result in 6x effective AA for polygon edges, so that it's identical to ATI's old 6x MSAA mode? I am actually more interested in 4x SSAA, but the performance hit appears to be too great for most of the newer games. Using nHancer with Nvidia cards allows for 3x3 SSAA (works only with some DirectX games), which is ordered-grid. Perhaps ATI would let us use 3x SSAA in future drivers as a balance against the severe performance hit?

SSAA is perhaps the only way to get rid of the horrible shader aliasing that is becoming a huge problem with many of the newer games, if not the majority. When playing UT3, any high-contrast edges (due to HDR lighting) completely undermine the AA that is done on the polygon edges, to the point that the screen appears full of jaggies.
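For what it's worth, the usual rule of thumb for restoring texture sharpness under ordered-grid supersampling is a LOD bias of -0.5 x log2(sample count), since each doubling of the sample count doubles effective resolution along one axis; the -2/-4/-6 values suggested above would over-sharpen. A quick sketch of that rule (a textbook heuristic, not a statement about how ATI's driver actually behaves):

```python
import math

def ssaa_lod_bias(samples):
    """Texture LOD bias that restores mip selection to roughly what a
    non-supersampled frame would use, for ordered-grid SSAA."""
    return -0.5 * math.log2(samples)

for k in (2, 4, 8):
    print(f"{k}x SSAA -> LOD bias {ssaa_lod_bias(k):+.1f}")
# 2x SSAA -> LOD bias -0.5
# 4x SSAA -> LOD bias -1.0
# 8x SSAA -> LOD bias -1.5
```

This matches the PCGH observation quoted earlier: without this bias applied, SSAA samples finer mip levels' worth of detail through a mip chain selected for the base resolution, so textures come out blurrier than they need to be.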
 
I would rather have 3D than wrap-around monitors. Cheaper, too. Multi-monitor is easier for ATI, with fewer compatibility problems, so I can see why they're going that route. Just too bad, since games are nearly 3D-ready anyway. But we won't see developers coding for it except as an afterthought if both teams aren't on board with it.
 
Support for large resolutions and FOVs is built into all modern game engines.

Yes, but getting access to them in any reasonable way is a pain in the arse, if it's possible at all. Many game engines still implement widescreen as vert- rather than horz+, so all of you people thinking of tri-monitor setups are going to run into all the problems that people using Matrox TH2G setups have been struggling with for years now.

Basic single-monitor widescreen is still a pretty tough deal for most developers at the moment; a lot of them still cannot get horz+ right, and until they do, gaming in widescreen is going to be awkward and gaming on ultra-wide 3-monitor displays is going to be impossible.

Why do you think places like widescreengamingforum.com and pcgamingstandards.com exist?
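For readers unfamiliar with the terms: horz+ means the engine holds the vertical FOV fixed and derives the horizontal FOV from the aspect ratio, so wider displays see more to the sides; vert- holds the horizontal FOV fixed and crops the top and bottom instead. The horz+ trig, as a sketch:

```python
import math

def horizontal_fov(vfov_deg, aspect):
    """Derive the horizontal FOV (degrees) from a fixed vertical FOV
    and a width:height aspect ratio (horz+ scaling)."""
    half_v = math.radians(vfov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half_v) * aspect))

# Vertical FOV equivalent to the classic 90-degree horizontal at 4:3
# (passing the reciprocal aspect inverts the conversion).
vfov = horizontal_fov(90, 3 / 4)
print(round(vfov, 2))                          # 73.74
print(round(horizontal_fov(vfov, 16 / 9), 2))  # 106.26 on one 16:9 screen
print(round(horizontal_fov(vfov, 48 / 9), 2))  # 151.93 across three 16:9 screens
```

A vert- engine on that same triple-wide 48:9 setup would instead keep the 90-degree horizontal and shrink the vertical FOV to a sliver, which is exactly the TH2G problem described above.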
 

+1
RacerS over at WSGF has come up with more fixes, hacks, and workarounds for widescreen and triple-head users than I can remember. I don't run a triple-head setup, but I have gone to him for widescreen fixes many times over the years. I really wonder how all these new triple-screen people jumping on the Eyefinity wagon will handle it. Well, we will see, I guess.
 
I have a 47" LG that I have been gaming on for 2 years, first with an ATI 2900 Pro, now two ATI 4870s. No problems.
 
Ok Brent and Kyle, you convinced me. I just ordered a 5870 from Newegg after clicking the banner of course. I'm off work next week so I should have lots of time to give it a shakedown cruise.
 
... I like that, "shakedown cruise," lol.

I'm assuming the spring refresh will also bring us a 5890 2GB or 5GB card or something?
 
I hope Eyefinity support works with the 20-30-20 setup I have. That is a deal-breaker at this point.
 
Was there any mention of mixing 120Hz and 60Hz monitors? I'm assuming they'd all just run at 60Hz in that case. I've only had my 3008WFP for a year, if that... but from my short time using a friend's 22" 120Hz monitor, I am a tad envious. Guess I'll just have to wait until large 120Hz displays are available :)
 
How much of a performance increase is there between a 5850 and a 5870?
 
 
I wanted the Asus HD 5870 at first, but due to the severe shortages I'm not sure when one will become available. I have the option of snagging a Sapphire right now, but I've never used a Sapphire product before. Is there any particular reason I should avoid them?

I *may* decide to do a very mild overclock, maybe ~900/1250.
 
VENT/ Seriously, my biggest beef, the thing I cannot stand the most, is the GD F'ing vacuum-cleaner fan. Drivers will improve, they always do, so no biggie, but I cannot stand that GD fan on the OEM HSF. WTF are they thinking? Even at 40% it is annoying, and ramped up to 100% it seems to be at least 65-70 dB. By now, as part of being competitive manufacturers, both NV and ATI should be able to devise a better-performing, quieter cooling solution instead of throwing a god-awful loud fan on a big piece of pot metal. Aftermarket companies can and do, with nowhere near the resources. To me it is a total-package concept.
Anyway, there are NO aftermarket HSFs out for the 5870... at least none capable of cooling everything that needs to be cooled, aka the VRMs. You can cool the GPU and RAM no problem, but the VRMs will go nuts without active cooling or the big hunk of metal the stock HSF uses to displace the heat. I went through this same thing when I got the 4870 and a V-2: core temps were great, but the measly sinks couldn't keep the card's VRMs cool enough and it continuously locked up. Spending $400 on a WC system as I post this... /VENT
 