AMD Radeon HD 6870 & HD 6850 Video Card Review @ [H]

Great review guys. Really appreciate all the effort and time that went into making it. Answered all my questions so far. Really looking forward to seeing how they OC too.

And I appreciate the concise writing style of your reviews.

Getting really itchy from the upgrade bug; damn you, student loans!
 
Right now, counting shipping, I'm seeing a $0 difference in price.
MSI 5850
MSI 6870
And that's NOT counting the $60 MIR for the 5850.

A 6 W difference in power draw at full load (6870: 183 W vs. 5850: 189 W), and the 6870 only allowed higher res/settings by about one AA notch on average.
I'm sorry, not quite up to [G]old in my book.
 
Nice review guys, really helpful.

I was waiting for a review like this before deciding on a video card; I've decided on the ASUS HD 6870.
 
Love the reviews, but I just gotta say: with the price, performance, and efficiency of the 5850 and 6870 being way too close, I can't see the 6870 getting a gold. On the other hand, the 6850's gold was well deserved.

Remember that the 5850's price point was $399.00 at one point, at least here in Canada. By comparison, the 6870's price point is $240. So yes, the performance is quite similar, but the same performance at 60% of the cost is impressive and possibly gold-star worthy.

It's like Hyundai putting out a BMW-quality car with all the features/performance of the BMW, plus a few new features (better UVD, HD3D, HDMI 1.4a, etc.), for the cost of the current Elantra instead of the cost of an entry-model BMW.
 
Look at the price hole between the new GTX 470 price and the GTX 480.

Price holes don't matter at all unless someone else is selling a lot of cards there.

Cayman is supposed to be faster than GTX 480 which is still well over $400.

Depends on what AMD is up to. It may be they want to force Nvidia to sell at a loss across the consumer graphics market.

AMD appears confident they can start shipping Llano chips in 2H 2011, which means Llano products start appearing in 3Q 2011, which means Nvidia starts losing what remains of their OEM graphics market share even as they continue to lose market share across the AIB market segments.

With no prospect of regaining their steadily shrinking market share in the low-end and OEM graphics markets, and with AMD continuing to be a ferocious competitor in the mid- and high-end consumer segments, Nvidia will lose the capacity to recoup the cost of developing new graphics architectures that are competitive in the consumer market while also achieving their design goals in the professional and HPC markets. They may be forced to scrap their 'straddle' architectural strategy and focus their designs solely on the professional and HPC markets.
 
I've seen some good articles out there. I pay the most attention to [H]ard's results because they actually play the games and offer charts that show FPS over time. I like to know how often a card dips to its min FPS. If it's once when a map loads, I couldn't care less. If it's regularly below 30, I care. Say card Y hits 15 FPS when a map first loads and then never does it again, while card X has a 19 FPS min that it dips to fairly regularly during gameplay. Give me card Y despite the lower reported min.

Or, for example, if a card hits 19 FPS while you're standing still looking at something like an ocean in a scene with a great depth of field, I don't care as much as if it hits 19 FPS every time you enter a firefight. The first is a tad annoying; the second is unacceptable.
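Just to illustrate what I mean, here's a rough sketch in Python; the log format, the 30 FPS threshold, and all of the numbers are made up purely for the sake of the example:

[code]
# Toy example: given a per-second FPS log, count how often a card dips
# below a playability threshold, ignoring the first couple of seconds
# where the map is still loading. All of the numbers here are invented.

def count_dips(fps_log, threshold=30, skip_seconds=2):
    """Seconds of actual gameplay spent below `threshold` FPS."""
    return sum(1 for fps in fps_log[skip_seconds:] if fps < threshold)

# Card Y: one 15 FPS dip at map load, then steady
card_y = [15, 20, 45, 50, 48, 47, 52, 49, 50, 51]
# Card X: 19 FPS dips that keep coming back during gameplay
card_x = [40, 42, 19, 41, 19, 43, 19, 40, 19, 42]

print(count_dips(card_y))  # 0 -> the scary "min FPS" never shows up in play
print(count_dips(card_x))  # 4 -> the dips happen regularly
[/code]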

The 6870 and 6850 are solid products for their positioning.

I won't be surrendering my manually overclocked 5870s until Cayman hits, however.
 
Right now, counting shipping, I'm seeing a $0 difference in price.

And that's NOT counting the $60 MIR for the 5850.

That's because the 5850 came down in price due to the performance of the 6870. The 6870 drove the 5850's price down to where it's currently sitting. If you had tried to buy one a month ago, you'd have found another $100 on the 5850's price.
 
Yeah. I can understand why Nvidia wants it used and AMD doesn't though.

Yep. The 460 is faster than the 6870/5870 no matter the resolution. We can easily see that the tessellator is holding the Radeons back.
 
Right now, counting shipping, I'm seeing a $0 difference in price.
MSI 5850
MSI 6870
And that's NOT counting the $60 MIR for the 5850.

A 6 W difference in power draw at full load (6870: 183 W vs. 5850: 189 W), and the 6870 only allowed higher res/settings by about one AA notch on average.
I'm sorry, not quite up to [G]old in my book.

I think the naming thing is at work here (thanks, AMD, good job). These are now AMD's MIDRANGE cards; the 6800s are taking the place of the 5700s in this generation. If you think of it in terms of the card that would have been the new 6700 matching the 5800, it makes sense. So it's actually a hell of a card.

And yes, you can expect lots of deals on the 5850/5870 coming down the line; they have to clear out the older inventory. I would not be the least bit surprised to see the 5850 retail for less than the 6850 at some point (thinking rebates here). Out with the old, in with the new.
 
Yep. The 460 is faster than the 6870/5870 no matter the resolution. We can easily see that the tessellator is holding the Radeons back.

What reviews are you reading? Did PRIME1 do the reviews you looked at?
 
What reviews are you reading? Did PRIME1 do the reviews you looked at?

It's not clear from his quotation, but he's quoting a post about the H.A.W.X. 2 benchmark, which currently shows the 460 beating the 6870 at every resolution. Of course, there's that whole controversy surrounding the H.A.W.X. 2 benchmark.

I do think you're correct, though, that PRIME1's benchmark suite, if he were to put one together, would contain only three pieces of software: Batman: AA with AA turned way up, H.A.W.X. 2, and the Unigine benchmark with extreme tessellation.
 
Are there other examples besides H.A.W.X. 2 where a game has been specifically designed to screw one hardware maker? I can't think of any off the top of my head... not since the 3dfx Glide days, anyway...
 
It's not clear from his quotation, but he's quoting a post about the H.A.W.X. 2 benchmark, which currently shows the 460 beating the 6870 at every resolution. Of course, there's that whole controversy surrounding the H.A.W.X. 2 benchmark.

I do think you're correct, though, that PRIME1's benchmark suite, if he were to put one together, would contain only three pieces of software: Batman: AA with AA turned way up, H.A.W.X. 2, and the Unigine benchmark with extreme tessellation.

Don't forget the DX10.1 patch Ubisoft said was bugged, which showed AMD cards beating Nvidia cards.

Ubisoft is in bed with Nvidia, which explains the H.A.W.X. 2 benchmark, and why Kyle will not use it.
 
Don't forget the DX10.1 patch Ubisoft said was bugged, which showed AMD cards beating Nvidia cards.

Ubisoft is in bed with Nvidia, which explains the H.A.W.X. 2 benchmark, and why Kyle will not use it.

and that it is a canned benchmark.
 
That was a very special case: they had the only DX11 boards in existence, and those were already in the hands of developers. I'll definitely let that one slide because they weren't crapping out PR about it when NV was about to release or hype up something of their own.


It's not clear from his quotation, but he's quoting a post about the H.A.W.X. 2 benchmark, which currently shows the 460 beating the 6870 at every resolution. Of course, there's that whole controversy surrounding the H.A.W.X. 2 benchmark.

I do think you're correct, though, that PRIME1's benchmark suite, if he were to put one together, would contain only three pieces of software: Batman: AA with AA turned way up, H.A.W.X. 2, and the Unigine benchmark with extreme tessellation.

Nvidia sent that out themselves, with this in the email:

As a member of the ultra-secret H.A.W.X. 2 squadron, you are one of the chosen few, one of the truly elite. You will use finely honed reflexes, bleeding-edge technology and ultra-sophisticated aircraft--their existence denied by many governments--to dominate the skies.

And, in fact, you really are one of the truly elite, thanks to Ubisoft providing an early press preview of the H.A.W.X. 2 benchmark prior to its public release this Friday.

H.A.W.X. 2 has been optimized for DX11 enabled GPUs and has a number of enhancements to not only improve performance with DX11 enabled GPUs but also greatly improve the visual experience while taking to the skies.

AMD responded with:

It has come to our attention that you may have received an early build of a benchmark based on the upcoming Ubisoft title H.A.W.X. 2. I'm sure you are fully aware that the timing of this benchmark is not coincidental and is an attempt by our competitor to negatively influence your reviews of the AMD Radeon HD 6800 series products. We suggest you do not use this benchmark at present as it has known issues with its implementation of DirectX 11 tessellation and does not serve as a useful indicator of performance for the AMD Radeon HD 6800 series. A quick comparison of the performance data in H.A.W.X. 2, with tessellation on, and that of other games/benchmarks will demonstrate how unrepresentative H.A.W.X. 2 performance is of real world performance.

AMD has demonstrated to Ubisoft tessellation performance improvements that benefit all GPUs, but the developer has chosen not to implement them in the preview benchmark. For that reason, we are working on a driver-based solution in time for the final release of the game that improves performance without sacrificing image quality. In the meantime we recommend you hold off using the benchmark as it will not provide a useful measure of performance relative to other DirectX 11 games using tessellation.

It's a bum benchmark and should not be used. Don't trust benchmarks unless the executable comes from a neutral third party; but hey, [H] has time and again demonstrated that it's better than that. Be wary reading reviews on other sites, even more so if they claim a card is better because it has a vendor-optimized Folding client or does outrageously exaggerated tessellation you will never have any use for.
 
The difference with tessellation on and off seems to be so minute to me.
 
HAWX2 is a really, really bad game anyway. Somehow it's worse than the first one which was just a poor man's Ace Combat.
 
HAWX2 is a really, really bad game anyway. Somehow it's worse than the first one which was just a poor man's Ace Combat.

The first one had moments where it could be fun. It was, at the very least, competent. I heard people call it soulless, which seems appropriate. It feels like no one involved with it really cared; it was just done because they were told to make it. It doesn't help that they tried to have this really stupid sim/arcade mix that just did not work as well as it should have. After a few levels I had had enough and never touched it again.
 
Are there other examples besides H.A.W.X. 2 where a game has been specifically designed to screw one hardware maker? I can't think of any off the top of my head... not since the 3dfx Glide days, anyway...

Half Life 2 (meant to screw Nvidia over with Gabe being an asshat about the whole thing) and Doom3 (meant to screw ATI over with their terrible OpenGL performance).

Ah 2003. You are missed.
 
Half Life 2 (meant to screw Nvidia over with Gabe being an asshat about the whole thing) and Doom3 (meant to screw ATI over with their terrible OpenGL performance).

Ah 2003. You are missed.

From what I remember, Nvidia tried to pass off the FX 5800 as DX9 spec when it wasn't, which is why it would not run in DX9 mode in HL2.

And Nvidia tried to blame Valve. It was not Valve's fault that ATI was DX9 compatible and Nvidia wasn't.
 
Yep. The 460 is faster than the 6870/5870 no matter the resolution. We can easily see that the tessellator is holding the Radeons back.

I think you meant to say that the 460 is slower than the 6870 no matter how we look at it. :D
 
I'm confused. I didn't think the numbering scheme switching over would be that big of a deal.

So basically, AMD has increased computing performance through shader performance and overall architectural design while decreasing overall die size, and is passing the savings on to us?

From looking at the benchmarks here, the 6870 is a tad slower than the 5870; however:

$240 --- $340
6870 --- 5870

So, depending on the game, shave off about 10-20 FPS on average but save $100.

Is this about right?
 
I'm confused. I didn't think the numbering scheme switching over would be that big of a deal.

So basically, AMD has increased computing performance through shader performance and overall architectural design while decreasing overall die size, and is passing the savings on to us?

From looking at the benchmarks here, the 6870 is a tad slower than the 5870; however:

$240 --- $340
6870 --- 5870

So, depending on the game, shave off about 10-20 FPS on average but save $100.

Is this about right?

Everything here makes sense...I think you've assessed it spot on.
 
Depends on what AMD is up to. It may be they want to force Nvidia to sell at a loss across the consumer graphics market.

AMD appears confident they can start shipping Llano chips in 2H 2011, which means Llano products start appearing in 3Q 2011, which means Nvidia starts losing what remains of their OEM graphics market share even as they continue to lose market share across the AIB market segments.

With no prospect of regaining their steadily shrinking market share in the low-end and OEM graphics markets, and with AMD continuing to be a ferocious competitor in the mid- and high-end consumer segments, Nvidia will lose the capacity to recoup the cost of developing new graphics architectures that are competitive in the consumer market while also achieving their design goals in the professional and HPC markets. They may be forced to scrap their 'straddle' architectural strategy and focus their designs solely on the professional and HPC markets.


If anything, it would force Nvidia to rethink their architecture. Instead of this bullcrap brute-force, power-hungry attempt to get the same performance as AMD, maybe it's time to fix the problem like ATI did almost seven years ago when development of the RV870 started, and even before that with the HD 3000 series. Instead of creating a monster GPU like Nvidia has been doing for years, they went the other route and tried to get as much performance out of the architecture while using as little power as they could. Then maybe Nvidia will be able to compete with AMD, in my opinion.

Right now, counting shipping, I'm seeing a $0 difference in price.
MSI 5850
MSI 6870
And that's NOT counting the $60 MIR for the 5850.

A 6 W difference in power draw at full load (6870: 183 W vs. 5850: 189 W), and the 6870 only allowed higher res/settings by about one AA notch on average.
I'm sorry, not quite up to [G]old in my book.

Really... not gold?

Hmm, let's see:

6870:
HDMI 1.4 support for 3D Blu-ray playback
native 4-display Eyefinity
6 W less power usage at full load, 5 W less at idle... do the math:
6 W × 24 × 365 ≈ 52.6 kWh less energy per year if the card ran at full load 24/7
5 W × 24 × 365 ≈ 43.8 kWh less per year if it sat idle 24/7; compare that to what you pay per kWh on your electricity bill and you'll see the big picture
higher resolution and AA support
priced to compete from day 1

5850:
no 3D Blu-ray playback support
no native 4-display Eyefinity
uses more power under load
uses more power at idle
runs lower res/AA
and was never priced to compete, even a year later, thanks to price-gouging e-tailers and B&M stores


So please tell me where it doesn't deserve the gold status?
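
If you want to sanity-check that power math yourself, here's a rough Python sketch; the electricity rate is just an assumed figure, so plug in whatever your utility actually charges:

[code]
# Rough sketch: convert a wattage difference into kWh/year and dollars.
# The $0.11/kWh rate is an assumption; use the rate from your own bill.

def annual_kwh(watts_saved, hours_per_day=24):
    """Energy saved per year in kWh for a constant wattage difference."""
    return watts_saved * hours_per_day * 365 / 1000.0

PRICE_PER_KWH = 0.11  # assumed, roughly a US-average residential rate

for label, watts in [("6 W less at full load, 24/7", 6),
                     ("5 W less at idle, 24/7", 5)]:
    kwh = annual_kwh(watts)
    print(f"{label}: {kwh:.1f} kWh/year, about ${kwh * PRICE_PER_KWH:.2f}/year")

# 6 W less at full load, 24/7: 52.6 kWh/year, about $5.78/year
# 5 W less at idle, 24/7: 43.8 kWh/year, about $4.82/year
[/code]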
 
In regard to the 6870: if you currently own a 5870, the 6870 is not an upgrade, but if you own a 5850 or below, it is.

How the hell is an 8-9% (on average) improvement over the 5850 "an upgrade"? What a load of rubbish.
 
How the hell is an 8-9% (on average) improvement over the 5850 "an upgrade"? What a load of rubbish.


Because that's technically what an upgrade is: more performance. Whether it counts as an upgrade per your definition of an upgrade doesn't matter.

Now the question is, if you have a 5850, should you upgrade to a 6870? It really depends on what you're looking for. If quad-display Eyefinity or HDMI 1.4 is something you want, then yeah, it's a good upgrade: more features with slightly better performance. If those aren't things you want, then obviously it wouldn't be the most economical upgrade.
 
How the hell is an 8-9% (on average) improvement over the 5850 "an upgrade"? What a load of rubbish.

AMD asked for these cards to be compared to the GTX 460 768 MB/1 GB models.
Reviewers across the Internet are comparing it to the 5850 instead.
The 6870 isn't intended to be a replacement for the 5850, but it seems reviewers keep making that comparison.
IMO, anything below 15%, just keep your current card, especially if you OC.

These cards are meant to outperform Nvidia's midrange. AMD would have released the 6950/6970 first if they were trying to replace the 5850/5870.
 
6870:
HDMI 1.4 support for 3D Blu-ray playback
native 4-display Eyefinity
6 W less power usage at full load, 5 W less at idle... do the math:
6 W × 24 × 365 ≈ 52.6 kWh less energy per year if the card ran at full load 24/7
5 W × 24 × 365 ≈ 43.8 kWh less per year if it sat idle 24/7; compare that to what you pay per kWh on your electricity bill and you'll see the big picture
higher resolution and AA support
priced to compete from day 1

5850:
no 3D Blu-ray playback support
no native 4-display Eyefinity
uses more power under load
uses more power at idle
runs lower res/AA
and was never priced to compete, even a year later, thanks to price-gouging e-tailers and B&M stores

I don't think the majority of people care about 3D Blu-ray support, 4-screen Eyefinity, or support for higher resolutions than the 5850 is capable of pushing. A 5 W difference is only going to end up costing about $5 a year if you have your computer on 24/7, but the far more likely case is maybe 4-8 hours a day and $1-2 a year, so no real biggie. But it is a bit cheaper; the examples the guy above used were the cheapest 5850s and the most expensive 6870.
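
For what it's worth, those numbers check out if you assume an electricity rate of roughly $0.11/kWh (substitute your own rate):

[code]
# Yearly cost of a constant 5 W difference at different usage levels.
# The $0.11/kWh rate is an assumption; plug in the rate from your bill.
WATTS = 5
PRICE_PER_KWH = 0.11

for hours_per_day in (24, 8, 4):
    kwh = WATTS * hours_per_day * 365 / 1000.0
    print(f"{hours_per_day} h/day: {kwh:.1f} kWh, about ${kwh * PRICE_PER_KWH:.2f}/year")

# 24 h/day: 43.8 kWh, about $4.82/year
# 8 h/day: 14.6 kWh, about $1.61/year
# 4 h/day: 7.3 kWh, about $0.80/year
[/code]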

Overall, I'm not sad that I bought a GTX 460. Yeah, the 6850 is a bit better, but not a lot better, and I got my 460 cheaper than that anyway. I'll save my next upgrade for something with better price/performance than the 6800 series.

Nothing has really blown me away since the 4870/GTX 260; it seems like since then neither ATI nor Nvidia has delivered any large gain in price/performance.
 
I don't think the majority of people care about 3D Blu-ray support, 4-screen Eyefinity, or support for higher resolutions than the 5850 is capable of pushing. A 5 W difference is only going to end up costing about $5 a year if you have your computer on 24/7, but the far more likely case is maybe 4-8 hours a day and $1-2 a year, so no real biggie. But it is a bit cheaper; the examples the guy above used were the cheapest 5850s and the most expensive 6870.

Overall, I'm not sad that I bought a GTX 460. Yeah, the 6850 is a bit better, but not a lot better, and I got my 460 cheaper than that anyway. I'll save my next upgrade for something with better price/performance than the 6800 series.

Nothing has really blown me away since the 4870/GTX 260; it seems like since then neither ATI nor Nvidia has delivered any large gain in price/performance.

Good to see you got a GTX 460 1 GB for under $180; that was a damn good steal for you.

But I'm not sure how you can not be blown away by the 5870. It was twice as fast as a 4870... on a single card. That's a damn good jump from one generation to another.
 
Tbh, it's all a bit meh! Some things I like: the power usage, and the price, which has caused a price war. Some things I hate: the name. Even Nvidia has never brought out a card with the same badge but a generation up that is slower. 6870 my ass; it's a 6770.

Some things worry me: it's a very focused but very compromised design. It's well down on shader power vs. the 5 series, and its tessellator is still rubbish; it's basically the same as the 5 series except at low tessellation factors (e.g. http://www.geeks3d.com/20100826/tessmark-opengl-4-gpu-tessellation-benchmark-comparative-table/). Hence, while these cards might work OK in most of today's games, they might struggle in the future, where the balance of shaders/tessellation/etc. will be different.

ATI's solution to this is driver *optimisations*, which basically means shader replacement: using a lower-performance shader with a slight IQ loss instead of the one the game's developers wrote and hence intended to be used. Doing this isn't so bad, as most people won't notice the IQ loss if ATI are smart. The real problem is the dependence this puts on ATI's drivers. Most would agree that ATI drivers have problems; there are already lots of people stuck on various Catalyst versions for various games with the 5 series. Expect this to get worse with the 6 series, and worse again once ATI brings out the 7 series and hence stops concentrating on 6-series drivers.

Still, the price war is great; no one's forcing you to buy one of these. The GTX 470, for example, is a much more future-proof design and has just gotten cheap.
 
Tbh, it's all a bit meh! Some things I like: the power usage, and the price, which has caused a price war. Some things I hate: the name. Even Nvidia has never brought out a card with the same badge but a generation up that is slower. 6870 my ass; it's a 6770.

Some things worry me: it's a very focused but very compromised design. It's well down on shader power vs. the 5 series, and its tessellator is still rubbish; it's basically the same as the 5 series except at low tessellation factors (e.g. http://www.geeks3d.com/20100826/tessmark-opengl-4-gpu-tessellation-benchmark-comparative-table/). Hence, while these cards might work OK in most of today's games, they might struggle in the future, where the balance of shaders/tessellation/etc. will be different.

ATI's solution to this is driver *optimisations*, which basically means shader replacement: using a lower-performance shader with a slight IQ loss instead of the one the game's developers wrote and hence intended to be used. Doing this isn't so bad, as most people won't notice the IQ loss if ATI are smart. The real problem is the dependence this puts on ATI's drivers. Most would agree that ATI drivers have problems; there are already lots of people stuck on various Catalyst versions for various games with the 5 series. Expect this to get worse with the 6 series, and worse again once ATI brings out the 7 series and hence stops concentrating on 6-series drivers.

Still, the price war is great; no one's forcing you to buy one of these. The GTX 470, for example, is a much more future-proof design and has just gotten cheap.


Explain how a GTX 470 is more future-proof?
 
But I'm not sure how you can not be blown away by the 5870. It was twice as fast as a 4870... on a single card. That's a damn good jump from one generation to another.

It wasn't quite twice as fast, and it cost heaps more than the 4870 did at release. I'm not denying that performance has improved; price/performance just hasn't improved all that much. The 6850 and 6870 are priced around what the 5850 and 5870 should have cost in the first place.
 