AMD's ATI Radeon HD 5870 Video Card Review @ [H]

Man, these are some great cards. Can't wait to pick up two of them this Friday.

Glad to know that PhysX is going to be useless compared to these cards.
 
After reading a lot of reviews around the net, I'm simply not all that impressed by this card. I really was expecting more. The whole time I spent reading, I kept wondering: if Nvidia had been the one to come to market with its next-gen card first, would it have been met with the same reception? Or is Nvidia held to a higher standard for some reason? I can't help but feel people would have been bashing Nvidia for such a mediocre jump in performance after all the hype. Am I wrong? Sadly, I doubt it.

When Nvidia brings its GT300 to market, I bet it will generate a lot of interest as well. At least it will for me, and [H] has a history of welcoming new Nvidia cards too.

Sadly, I don't think you have given the 5800 any welcome. You've complained about the lack of a PhysX evaluation and about how unimpressed you are by the card. Perhaps you'd prefer that [H] talk about Nvidia while evaluating an ATI card?

PhysX itself is not very impressive in its current state. It has potential, but it hasn't shown it yet. We've already seen cloth in UFC 2009: Undisputed, which isn't scripted and runs entirely on the CPU. Flying papers and leaves might appeal to some, but I'm more impressed with Far Cry 2's in-game physics, where the weather impacts gameplay and wind direction affects your flamethrower and how grass burns, etc.:
http://www.youtube.com/watch?v=cfSBHJRU9_Q

Eyefinity, however, seems to give more to gameplay than any of this physics does, even in Batman:
http://www.youtube.com/watch?v=ujf6P6iGcfc&feature=channel_page

You might argue about cost, but that's up to everyone's wallet. If you buy a cheap graphics card, you'd probably turn off PhysX anyway to run the game decently.

Current PhysX is sadly only a side note in my opinion... If the same effects shown with GPU PhysX were simply scripted, they wouldn't leave much to be desired. Some people might only be impressed because the effects are turned off entirely otherwise.

As long as PhysX isn't an open standard on the GPU, you'll probably never see anything major done with it anyway. Perhaps when games with either Bullet on OpenCL or Havok on OpenCL arrive, we'll see some GPU-accelerated physics that's worth a second thought.
 
It's great to see a new card hit the market, that's never a bad thing at all.
I too am a little disappointed by the 5870's performance when compared to existing cards. I think I may have been expecting too much, but I really wanted to glance at one of these benchmark charts and be able to instantly spot the 5870. Think back to the release of the Radeon 9700 Pro, or the 8800 GTX.

Sadly, I am looking at AnandTech's graphs, and the only way I can tell it's the 5870 is not by its extra-long bar on the graph, but by the fact that they colored the bar neon green. :(
To make matters worse, we still can't run Crysis.

I'm currently running a Sapphire 4890 and I can seriously say that I will not be upgrading to the 5870, the dollar to performance ratio isn't even there for me. I originally only bought the 4890 to tide me over (I had an 8800GTX) until the next release of graphics cards, but now sadly, it looks as if I'll have to hang on to this thing for a little bit longer.
 
Crysis is an awesome game. Nothing wrong with the coding; it was designed to push hardware, and honestly it's the only game benchmark worth a damn, since almost every other game is a 360 port that presents almost no challenge to last generation's hardware.

Qft.

When a game won't run at max settings on a card spitting out over 2 teraflops and a Core i7 clocked at 3.6 GHz, then there is obviously something wrong with the card and/or the CPU. I am glad we have all these experts on coding in here. /sarcasm
 
Hmm... I think Nvidia might be turning red just about now ;)

This is the first time in about 6 years that I can actually say that I REALLY want a new video card!
The 4800 series already had my interest, but the 5800 series has totally won me over. When are we gonna see these available?

WHOA!!! Finally 64 posts after 5 years of noobness :D
 
On AMD, 2X, 4X, and 8X AA are MSAA modes; 12X and 24X are CFAA modes. CFAA uses the shaders to perform edge antialiasing, so no extra memory bandwidth is required, and it does not affect textures or make them blurry; it works only on polygon edges.

The 256-bit bus does not hurt it, since the high-speed GDDR5 keeps the bandwidth high.
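As a rough sanity check (using the commonly quoted reference memory clocks, which individual boards may deviate from), bandwidth is just bus width times effective data rate, so a 256-bit GDDR5 bus ends up in the same ballpark as a 512-bit GDDR3 one:

```cpp
#include <cstdio>

// Rough estimate: bandwidth (GB/s) = (bus width in bits / 8) * effective rate per pin (Gbps).
// The clock figures below are commonly quoted reference specs; real boards may differ.
double bandwidth_gbs(int bus_width_bits, double effective_gbps_per_pin) {
    return (bus_width_bits / 8.0) * effective_gbps_per_pin;
}

int main() {
    // HD 5870: 256-bit GDDR5 at ~4.8 Gbps effective -> ~153.6 GB/s
    std::printf("HD 5870 : %.1f GB/s\n", bandwidth_gbs(256, 4.8));
    // GTX 285: 512-bit GDDR3 at ~2.48 Gbps effective -> ~159 GB/s
    std::printf("GTX 285 : %.1f GB/s\n", bandwidth_gbs(512, 2.484));
    return 0;
}
```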

It handles MSAA well, just like the 4800 series did. I saw almost no drop in performance between 2X and 4X AA in NFS: Shift, and 8X AA, as usual, was a lot faster than NV's 8xQ MSAA.

The 5870 trounced the GTX 285 in our tests, so yes, it trounces a GTX 280 which is clocked slower.

Thanks for the helpful response.

Some games only show a 10 fps difference (roughly), and I already run everything maxed with good performance in most games. Some other games show upwards of 20 fps+ more. Of course, that's comparing [H]'s testing with some others I'm seeing around.

But in general, it looks as if moving from the GTX 280 to the 5870 at 1920 res would be a relatively huge boost in both visuals and performance, then?
Generally speaking...

Is that a safe assumption to make? Or would that be over-playing things?
 
Hmm... I think Nvidia might be turning red just about now ;)

This is the first time in about 6 years that I can actually say that I REALLY want a new video card!
The 4800 series already had my interest, but the 5800 series has totally won me over. When are we gonna see these available?

WHOA!!! Finally 64 posts after 5 years of noobness :D

I ordered one from the Egg today. Keep looking online; they tend to come in and out of stock at the various e-tailers. If you have a Micro Center or something like that near you, call and check whether they have a card in stock.

So, I suppose that moving from the GTX 280 to the 5870 running at 1920 res would be a relatively huge boost in both visuals and performance then?

Is that a safe assumption to make? Or would that be over-playing things?

Check out the numbers for yourself:

[attached benchmark charts comparing the GTX 285 and HD 5870]

http://en.inpai.com.cn/doc/enshowcont.asp?id=7014
 

Holy crap...

I never go by 3D Mark scores, but the 50+ fps difference between the 285 and 5870... damn!

Well, the fact that it runs Crysis: Warhead on Enthusiast and gets the same fps I get with a GTX 280 on Gamer pretty much tells me it's worth it, I'd think. Hell, if I ran the 5870 on Gamer in Crysis: Warhead, I'd bet it'd be the first single GPU to hit 50-60 fps, considering the massive difference between Enthusiast and Gamer.

Eh, I guess I'm pretty well sold. Heh.

Again, the only catch for me is that most games I play (mainly FPS) I can already max out with solid performance.

I top out at 60 fps (always have vsync on) in Fallout 3 and almost never drop.
In L4D I'm almost always around 60 fps, but when there's a ton of action I sometimes drop into the 40s, which is a bit odd compared to Fallout 3, which is far superior in visuals (can't explain that one).
HL2 is always maxed at 60 fps with only a few exceptions.

But, between the massive performance difference, which should always keep me at a solid 60fps, and the fact that ATi's visuals have always seemed a little better to me (though I've been using Nvidia for years now), I'd have to say... seems worth the move.
 
This bodes well for my future video card: at least 4890 performance at 4770 temps and power.
 
This is the first card since the X800 XT PE that makes me want to trade camps. I'll give it a few weeks until retail samples hit the shelves and we see more reviews. On that note, Nvidia has some work to do. Their next card better be a beast or else!!! lol
 
I am always impressed when a single-GPU card equals the performance of a dual-GPU card.
Well yeah, if they came out at the same time. That's all those damn dual-GPU cards are: another way to milk the system. No thanks, I'm not paying for the same piece of video real estate twice.
 
Well yeah, if they came out at the same time. That's all those damn dual-GPU cards are: another way to milk the system. No thanks, I'm not paying for the same piece of video real estate twice.

That makes no sense. If the dual-GPU card and the single-GPU card came out at the same time with the same performance, what would be the point of having a dual-GPU card?
 
Great review as always

The blurring with supersampling reminds me of Nvidia's 2x Quincunx. I recall a game that used its own supersampling via shaders (can't remember its name); it also blurred textures a lot.

I think it's been a while since there was an issue with AF, quality- or speed-wise. Other than the circles, I've seen no IQ difference in the review.
 
Well, if DX11 really does provide a performance boost, we might see the 58xx's lead over the previous gen increase even more:

[attached 1920x1200 benchmark chart]
 
I've got a 4870X2, I run games at 1920x1200, and I'm still considering an upgrade. The reason is simple: Eyefinity. Old games can be played with more immersion, and though TH is out with a DP edition, this implementation seems like less fuss (though some initial bugs and missing features with Eyefinity, like bezel management, need to be dealt with; TH has those fixed already). I have enough performance for the games on the market now, but I don't have Eyefinity.

DX11 hardware support for some features I want, like:
Tessellation
DirectCompute on SM5 (I only have support for 4.1 now)

New feature support in a future OpenCL 1.1 version

Here's a list:
http://pcper.com/article.php?aid=783

I'm considering buying some cheap 26" (25.5") TN panels just for added immersion in existing games, plus a new 5800 series card (preferably an X2, since I've grown fond of them, and I hope Gainward comes out with a custom-cooling edition like the one I have).

Looking forward to [H]'s Eyefinity review!
PhysX's little bit of eye candy was underwhelming, but this:
http://www.youtube.com/watch?v=ujf6P6iGcfc&feature=player_embedded

Now that's adding to the game, even though Eyefinity is only in its early days!

Edit: Nice review as always from [H] btw. Tnx!


This is adding to the game:
http://www.youtube.com/watch?v=6GyKCM-Bpuw
 
When Nvidia brings its GT300 to market, I bet it will generate a lot of interest as well. At least it will for me, and [H] has a history of welcoming new Nvidia cards too.

Sadly, I don't think you have given the 5800 any welcome. You've complained about the lack of a PhysX evaluation and about how unimpressed you are by the card. Perhaps you'd prefer that [H] talk about Nvidia while evaluating an ATI card?

PhysX itself is not very impressive in its current state. It has potential, but it hasn't shown it yet. We've already seen cloth in UFC 2009: Undisputed, which isn't scripted and runs entirely on the CPU. Flying papers and leaves might appeal to some, but I'm more impressed with Far Cry 2's in-game physics, where the weather impacts gameplay and wind direction affects your flamethrower and how grass burns, etc.:
http://www.youtube.com/watch?v=cfSBHJRU9_Q

Eyefinity, however, seems to give more to gameplay than any of this physics does, even in Batman:
http://www.youtube.com/watch?v=ujf6P6iGcfc&feature=channel_page

You might argue about cost, but that's up to everyone's wallet. If you buy a cheap graphics card, you'd probably turn off PhysX anyway to run the game decently.

Current PhysX is sadly only a side note in my opinion... If the same effects shown with GPU PhysX were simply scripted, they wouldn't leave much to be desired. Some people might only be impressed because the effects are turned off entirely otherwise.

As long as PhysX isn't an open standard on the GPU, you'll probably never see anything major done with it anyway. Perhaps when games with either Bullet on OpenCL or Havok on OpenCL arrive, we'll see some GPU-accelerated physics that's worth a second thought.

Show me scripted PhysX GPU physics?

I would rather play the game like this:
http://www.youtube.com/watch?v=6GyKCM-Bpuw

Than like this:
http://www.youtube.com/watch?v=ujf6P6iGcfc
 
Regarding the SSAA blurriness, I found this in PCGH's 5870 review:
In comparison to the internal oversampling, which can be forced on GeForce graphics cards with the tool Nhancer, the texture LoD is not adjusted with ATI's SSAA. Put bluntly, this means that the amount of AF is not increased. With a third-party application like the ATI Tray Tools you can adjust the LoD by hand in order to get the best texture sharpness. For 2x SSAA we don't recommend any changes yet, but for 4x SSAA you should set the LoD to -1, and for 8x SSAA -2 or -3 is flicker-free, depending on the texture content.
There are also screenshots showing the difference between default and adjusted LOD.

In another review there is a comparison of the 2x/4x/8x SSAA modes on RV870 and GT200 in Oblivion and FEAR.
Basically, ATI's SSAA mode (RGSSAA) is better than the one Nvidia uses (OGSSAA), but you have to change the LOD. Bug or feature?
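For what it's worth, PCGH's -1 recommendation for 4x lines up with the textbook LOD bias for supersampling, -0.5 * log2(sample count); their -2/-3 suggestion for 8x is more aggressive than the formula gives, presumably an empirical flicker-vs-sharpness tradeoff. A tiny sketch of the calculation (illustrative only):

```cpp
#include <cmath>
#include <cstdio>

// Textbook mip LOD bias for N-sample supersampling: each axis is effectively
// sampled about sqrt(N) times denser, so shift mip selection by -0.5 * log2(N).
double ssaa_lod_bias(int samples) {
    return -0.5 * std::log2(static_cast<double>(samples));
}

int main() {
    const int sample_counts[] = {2, 4, 8};
    for (int n : sample_counts) {
        std::printf("%dx SSAA -> suggested LOD bias %.2f\n", n, ssaa_lod_bias(n));
    }
    return 0;  // prints -0.50, -1.00, -1.50
}
```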
 
I was more surprised that the 4890 hung in there *very well* against the $100+ more expensive GTX 285 in the most demanding games.
I guess the 4890 drivers have really matured since launch; very nice.

Oh yeah 5870 is good too :)
 
And Age of Conan still has the "best" MMORPG graphics ;) yet HardOCP is one of the few sites that ever used it as a benchmark.


Except that no one plays it anymore because, despite being officially released, it still felt like it was in beta.
 
Show me scripted PhysX GPU physics?

I would rather play the game like this:
http://www.youtube.com/watch?v=6GyKCM-Bpuw

Than like this:
http://www.youtube.com/watch?v=ujf6P6iGcfc

And if that's your preference that's fine. That doesn't make PhysX better though. You and Silus have this holy crusade going on to try to prove PhysX is the best shit ever.

Guess what: IT'S NOT!

PhysX in its current form is proprietary bullshit.

I would rather have this along with Eyefinity.

Courtesy of the [H]
http://www.viddler.com/player/32928b1b/

Shows that damn good physics doesn't have to be limited to the GPU like Nvidia wants you to believe.

Pair the HD5870 up with Eyefinity and a physics engine that damn near maxes out a multi-core CPU and you're in gaming heaven.
 
Hey Brent, is the CrossFire test/review coming soon? I don't really trust other reviews ;)
 
Hey, if you want to play at 30 fps with some extra eye candy we've already seen from the Havok engine in a bajillion games (done better there, at that), be my guest - in another thread. We'll be playing at full details with 60 fps minimum framerates, with Havok putting the CPU to good use, kthx.
 
I'm still concerned about the Linux support (not sure about the rest of you guys). So far I've read a little on Phoronix.com, and this is what I found:

In regards to open-source support for the Radeon HD 5800 series, we would expect initial mode-setting support to appear within the xf86-video-ati driver within weeks, and hopefully we will also see kernel mode-setting support shortly thereafter or around the time of the Linux 2.6.33 kernel. We would expect it to be at least a couple of months before there is any open-source 3D acceleration support for these new graphics cards.

Their Linux support sucks, and this is precisely why I've been avoiding GPUs from the red team in my rigs :(
 
Hoping ATI will sort out their drivers so we get something like 10-15% more performance from CrossFire ;)
I won't even mention how well 4890s scale in CrossFire.
 
Why don't game companies just grab a full core on my quad core and use that for physics? Or maybe two full cores? Oh yeah cause Nvidia is paying them moneys... /slap
 

That is only a demonstration of how little PhysX gives. They just took away effects that you can get in games that are several years old. Take a look at this:
http://www.youtube.com/watch?v=kOmnm5eyUjY

Smoke, destructible objects etc. All without physx.

Calling that Batman video a demonstration of GPU-accelerated physics? I bet if they removed the cape and then called the cape a PhysX effect, you'd be all over it saying that "PhysX gives Batman his cape".

What you see in that video is effects that have been in games for years. They added those effects and run them as real-time physics instead of scripting them (the YouTube link I gave you is real-time physics as well, on the CPU).

PhysX gives nothing special to games, and they have to remove effects that other games have without PhysX just to try to sell it. Pathetic.

Show me scripted PhysX GPU physics?

What are you talking about? I'm talking about having GPU PhysX effects in games vs. scripted physics effects that show the same thing.

Like Mirror's Edge's banners. With PhysX it had to be either "banners on the GPU" or no banners at all, even though the most clueless game developers are able to add banners without PhysX.

Eyefinity, on the other hand, actually adds something new to the game that's worthwhile.
 
Why don't game companies just grab a full core on my quad core and use that for physics? Or maybe two full cores? Oh yeah cause Nvidia is paying them moneys... /slap

Games are getting better and better at this. I agree that there is a lot of room left in quad-core CPUs for physics; they are very underutilized. One thing DX11 does have is multithreading, so the driver, the DX runtime, and the application can run in separate threads, which should naturally make DX11 games more multithreaded.
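Just to make the "give physics its own core" idea concrete, here is a minimal, hypothetical C++ sketch (stubbed-out simulation, not any real engine's API) of running the physics loop on a dedicated thread while the main thread keeps rendering:

```cpp
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

std::atomic<bool> running{true};
std::atomic<long> physics_steps{0};

// Physics worker: all simulation work lives on its own thread (and usually its
// own core), independent of the render loop. The "simulation" here is a stub.
void physics_thread() {
    using namespace std::chrono;
    const auto fixed_step = duration<double>(1.0 / 60.0);
    while (running.load()) {
        // ... integrate rigid bodies, cloth, particles on the CPU ...
        physics_steps.fetch_add(1);
        std::this_thread::sleep_for(duration_cast<milliseconds>(fixed_step));
    }
}

int main() {
    std::thread physics(physics_thread);
    // Main thread stands in for the render loop; it only reads completed physics state.
    for (int frame = 0; frame < 180; ++frame) {
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    running.store(false);
    physics.join();
    std::printf("physics steps simulated: %ld\n", physics_steps.load());
    return 0;
}
```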
 
That is only a demonstration of how little PhysX gives. They just took away effects that you can get in games that are several years old. Take a look at this:
http://www.youtube.com/watch?v=kOmnm5eyUjY

Smoke, destructible objects etc. All without physx.

Rigid bodies on that small a scale don't impress me... this isn't 2006 anymore.
And none of the physics in Crysis surpasses PhysX:
http://www.youtube.com/watch?v=XBc3AR-Dl10
Older than Crysis, I might add... don't confuse bling-bling IQ with physics...

Calling that Batman video a demonstration of GPU-accelerated physics? I bet if they removed the cape and then called the cape a PhysX effect, you'd be all over it saying that "PhysX gives Batman his cape".

Ad hominem is not a valid argument... it only wastes your time and mine.

What you see in that video is effects that have been in games for years. They added those effects and run them as real-time physics instead of scripting them (the YouTube link I gave you is real-time physics as well, on the CPU).

I countered your link; your premise is false.

PhysX gives nothing special to games, and they have to remove effects that other games have without PhysX just to try to sell it. Pathetic.

No, what is "pathetic" is people trying to sell unstable, 1/5th-accurate physics as the same thing as stable, 100% accurate physics... and people buying into that lie and even promoting it...



What are you talking about? I'm talking about having GPU PhysX effects in games vs. scripted physics effects that show the same thing.



Like Mirror's Edge's banners. With PhysX it had to be either "banners on the GPU" or no banners at all, even though the most clueless game developers are able to add banners without PhysX.

FUD... they didn't before PhysX. It's only after AGEIA that the focus really came onto physics, and I love how you try to ignore this... it's bordering on desperate:
http://www.youtube.com/watch?v=6GyKCM-Bpuw

Lots of the effects cannot be scripted... as the objects are INTERACTIVE *cough*

Eyefinity, on the other hand, actually adds something new to the game that's worthwhile.

The cost of more monitors, fisheye vision, zoomed-in gameplay... or nothing at all... all documented.
 
Kudos to AMD, but my modded 4850 still runs everything I throw at it and everything I plan on running the rest of the year.

The low power consumption when idle is very attractive though. Impressive and long overdue.
 
FUD... they didn't before PhysX. It's only after AGEIA that the focus really came onto physics, and I love how you try to ignore this... it's bordering on desperate:
http://www.youtube.com/watch?v=6GyKCM-Bpuw

Lots of the effects cannot be scripted... as the objects are INTERACTIVE *cough*

The cost of more monitors, fisheye vision, zoomed-in gameplay... or nothing at all... all documented.

So wait, you mean nobody cared about physics in games before something named AGEIA? And you start off this sentence with FUD? Is that a warning or something..?

And why can't they do the physics with my unused processor cycles on my unused processor cores? Brent says that DX11 will allow this to happen, and there is only one gen of cards right now that supports it... AMD's. But you want me to waste my precious GPU flops running physics?

And what was I supposed to see in that Batman video? Batman's cape? I should waste my GPU flops so Batman's cape can flutter? How about I use my GPU to get more AA or better effects and game developers use up my completely unused CPU cycles for physics?

I really don't understand the reason to have physics done on my GPU.
 