ATI Radeon HD 3870 X2 @ [H]

I'm frustrated that all the [H] regulars (and especially Kyle) seem to think I'm "[H] Bashing".

Re-read my initial post here http://www.hardforum.com/showpost.php?p=1031985071&postcount=638 and please try to understand that I'm coming from the perspective that I DO agree with you guys and I WANT to see you convince the rest of the net of the serious problems at hand.

I'm not criticizing because I disagree... I'm trying to offer constructive criticism from the perspective of a fellow concerned enthusiast who is seeing more and more communities across the net pushing [H]'s philosophies aside because [H] isn't doing a good enough job of explaining itself.

This has nothing to do with agreeing or disagreeing with the results found here... it has everything to do with adopting an approach that doesn't scare people away but instead informs and educates.

Sticking your head in the sand, throwing up little disclaimers, and lumping everyone who's even remotely critical of this review into some 'hater bucket' is not enough to regain popularity... in fact, it's turning people away. You need to accept that if you're doing things differently, there's going to have to be a concerted effort to explain what's going on... and I just don't see enough of that here. That's just my opinion, and getting angry at me or calling me names isn't going to change it. I guarantee you that showing some actual canned benchmark scores from your same setup for comparison would have a profound effect on the rest of the review community.

Now, if you choose to think about everything I'm trying to say here and then wish to formulate a rebuttal, please do read the entire thread. I did, and I wouldn't have posted without doing so. There are numerous examples of the people who "don't get it"... people I'm trying to help with my ideas... people you shouldn't be swearing at or calling names... and my viewpoints are in direct response to all of these kinds of opinions, so it's unfair to judge my views without first seeing everything they were based on.

I'm not asking you to change much... in all honesty, the biggest change that needs to be made is the attitude, nothing technical.
 
Yes, I know - I didn't say I wasn't impressed with it, I was just trying to point out why it might not be considered as impressive as a single-GPU card with the same power. 'tis all, man. :)

But still, the thing with SLI and Crossfire being 'broken' in the Crysis demo wasn't the result of missing profiles as I understand it - anyone know how this card would handle a situation like that?

I think Crysis is certainly to blame for the poor multi-GPU performance; it isn't necessarily ATI's fault, and future driver updates may help improve things.
 
I'm not criticizing because I disagree... I'm trying to offer constructive criticism from the perspective of a fellow concerned enthusiast who is seeing more and more communities across the net pushing [H]'s philosophies aside because [H] isn't doing a good enough job of explaining itself.

Actually, I think [H] has explained itself far more thoroughly than any site I've ever seen. For one, Kyle just linked a four-year-old article in another thread explaining the whole reason for the change, what had been changing, and what new methods were going to be used. He's also been personally repeating the same philosophy all over the forums, especially during controversial events such as this.

All of the actual [H] evaluations contain paragraphs upon paragraphs of detailed explanations of their gameplay experiences, not just a huge set of bar graphs and numbers with a one-liner or a paragraph of "no duh" content. I've seen reviews where a paragraph said "According to the bar graphs, X card came out on top," which is a total waste of words because the numbers spoke for themselves. The author basically reiterated what had already been shown without adding anything useful or noteworthy.

The only people who think that [H] isn't explaining itself very well are the ones who are either too lazy to find out exactly *why* [H] uses such a methodology, or are genuinely ignorant and need the explanations handed to them.

EDIT: In response to suggestion #1, I think it would be asking too much to play at different configurations over different games and over different cards. Remember that the evaluations aren't about running 3-minute timedemos or benchmarks. The game gets tweaked manually with card #1, which may take a long time for a single game. Then you'd have to play through the entire game (while evaluating the most GPU-intensive levels) to designate a "good testing zone." Then you'd have to train yourself to make moves and do things in a consistent manner for card #1. After that you have to swap in a different card, tweak settings for card #2 manually, then play through the designated evaluation zone as close to the card #1 run as possible. That's just testing two cards with one game. Now how about testing three or so cards over five games? Extremely complex and time-consuming. Asking for different resolutions/eye-candy levels across two or more cards over several games is not only too demanding, but not terribly useful to begin with.
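
To put rough numbers on why this doesn't scale, here's a quick back-of-the-envelope sketch in Python (every per-step figure below is my own assumption for illustration, not [H]'s actual workload):

[code]
# Back-of-the-envelope estimate of the hands-on testing workload.
# All per-step hours below are assumptions for illustration only.
cards = 3
games = 5
hours_tweaking = 2.0     # finding highest playable settings, per card per game
hours_playthrough = 1.5  # replaying the designated zone consistently

total_hours = cards * games * (hours_tweaking + hours_playthrough)
print(f"{cards} cards x {games} games = {cards * games} combos, ~{total_hours:.0f} hours")

# Now add 3 resolutions and 3 eye-candy levels per combination, as suggested:
print(f"Expanded matrix: ~{total_hours * 3 * 3:.0f} hours")
[/code]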

For example, how many people would want to play a game at 640x480 with all settings maxed and 16x AA? There may be some, but the amount of time and energy spent testing that particular configuration just isn't worth it, even if it stresses a card about as much as high resolution, medium settings and medium AA. The bottom line is that you can't please everyone.

For suggestion #2, I think it's pretty unnecessary to explain absolutely everything, just the areas that need explaining (which are usually the extreme peaks and valleys in the line graph). Low-fps areas, namely where it hits 0, are usually explained as automated save points that cause the game to halt for a second. I recall Brent explaining a high-fps area in Battlefield 2, where when you die, the camera looks at the sky. I don't recall if it was in an actual article or something I read in the forums. Those are the particular areas that *may* need explanations, but not every action taken.
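
For what it's worth, spotting those peaks and valleys doesn't have to be done by eye. A minimal sketch, assuming a plain text log with one frames-per-second value per line (not an exact FRAPS file format, and "run1_fps.txt" is a hypothetical file name):

[code]
# Flag the extreme valleys and peaks in a per-second FPS log so the reviewer
# knows which moments need an explanation (autosave stalls, sky-facing death
# cameras, etc.).

def flag_outliers(path, low=10.0, high_factor=2.0):
    with open(path) as f:
        fps = [float(line) for line in f if line.strip()]
    avg = sum(fps) / len(fps)
    for second, value in enumerate(fps):
        if value <= low:
            print(f"t={second}s: {value:.0f} fps valley -- autosave/load stall?")
        elif value >= avg * high_factor:
            print(f"t={second}s: {value:.0f} fps peak -- camera at empty sky?")

flag_outliers("run1_fps.txt")
[/code]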
 
Great review, Kyle. [H] is pretty much the only website I trust in terms of how well a video card will actually perform.

Noseguard: How's your 2900XT treating you? I'd heard Anand returned his. ;)

Concerning others: you people constantly spouting "Omg, the results are so different from other websites! [H]'s review must be wrong!!!", listen very, very closely. The reason the [H] review shows drastically different numbers is that [H] uses a different testing methodology. I will repeat it for the slow, the stupid and the mentally handicapped: [H] USES A DIFFERENT TESTING METHODOLOGY. If you can make any sort of logical leap whatsoever, that is where the difference in scores comes from: one is "real-world" and the other is canned/synthetic. Capisce?

Also, for the people who complain that "real-world testing is not accurate due to variables": you're idiots; this is how car magazines test cars (in real-world conditions, not wind tunnels) and how the latest pharmaceutical drugs and medical studies are evaluated (in vivo > in vitro). And please, take a basic statistics class, for god's sake, so you understand how trends work and that tiny variables are really negligible in the grand scheme of things.
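
If you want to see the statistics point for yourself, here's a quick simulation (the "true" framerate and the noise figure are made-up assumptions; the shrinking error as runs are averaged is the point):

[code]
# Why tiny run-to-run variables are negligible in the grand scheme of things:
# averaging repeated runs shrinks the error of the estimate like 1/sqrt(n).
import random
import statistics

random.seed(42)
true_fps = 60.0   # the card's "real" performance (assumed)
run_noise = 3.0   # per-run variation from AI, timing, etc. (assumed)

for n_runs in (1, 3, 10, 30):
    runs = [random.gauss(true_fps, run_noise) for _ in range(n_runs)]
    print(f"{n_runs:2d} runs: mean {statistics.mean(runs):5.1f} fps, "
          f"expected error ~±{run_noise / n_runs ** 0.5:.1f} fps")
[/code]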
 
During the review, did you try 8GB of RAM to see if the transparent CrossFire had issues with that amount of memory? Currently two individual cards are unable to run CrossFire with 8GB of RAM.
 
Hey Kyle. Just a thought here. Have you considered using a program like EZ Macros to record keyboard and mouse movements? Using a program like this for your benchmarking runs would eliminate the timing issues and give you consistent movement, firing, jumping, etc. - issues that currently crop up from doing the runs manually. You could even have it set up to start and stop the FRAPS benchmark log in exactly the same place on every run.
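
The concept would look something like this rough Python sketch using the pyautogui library (not EZ Macros itself; the event list, coordinates and the F11 FRAPS hotkey are all made-up assumptions):

[code]
# Deterministic input playback for repeatable benchmark runs.
# The event timeline below is entirely hypothetical.
import time
import pyautogui

EVENTS = [
    (0.0,  "press",   "f11"),             # assumed FRAPS benchmark hotkey
    (0.5,  "keydown", "w"),               # start moving forward
    (5.0,  "moveto",  (1200, 500, 0.8)),  # turn toward a landmark
    (9.5,  "keyup",   "w"),               # stop moving
    (10.0, "press",   "f11"),             # stop FRAPS logging
]

start = time.time()
for at, action, arg in EVENTS:
    time.sleep(max(0.0, at - (time.time() - start)))  # keep original timing
    if action == "press":
        pyautogui.press(arg)
    elif action == "keydown":
        pyautogui.keyDown(arg)
    elif action == "keyup":
        pyautogui.keyUp(arg)
    elif action == "moveto":
        x, y, dur = arg
        pyautogui.moveTo(x, y, duration=dur)
[/code]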

It's a fairly lightweight program, though I don't know if it would skew the results just by running; you might see a slight drop in framerate versus not using it, then again you might not. Also, any computer AI might not be in exactly the same place during each run if the AI is dynamic enough to try different things to shoot back at you.

Just a thought.

I doubt that will work very well in games like Crysis, where the AI is more autonomous. If the AI did as little as take one step in another direction, a clear headshot would become an air shot. This will become more and more apparent over time as scripted AI becomes more dynamic and complex. Rather than have a program do all the work for you, it's best to be personally responsible for producing the data, since you can see and play everything that goes on.
 
Taken from techPowerUp!:

The HD 3870 X2 is a better value than the GTX, while at the same time the 8800GT 512MB, HD3850 256MB and 8800GTS 512MB beat it on price vs. performance. For most people the 8800GT 512MB and 8800GTS 512MB are definitely going to be the best choice. The HD 3870 X2 is certainly viable for folks on a 30" screen.
 
I doubt that will work very well in games like Crysis, where the AI is more autonomous. If the AI did as little as take one step in another direction, a clear headshot would become an air shot. This will become more and more apparent over time as scripted AI becomes more dynamic and complex. Rather than have a program do all the work for you, it's best to be personally responsible for producing the data, since you can see and play everything that goes on.

Umm, duh, that's why I mentioned it... :mad:
 
I have gotten called an ATI fanboy for this... even though I haven't owned an ATI card since the pre-Doom 3 era...

However... with the [H] review weighted at 50%, and all other reviews combined weighted at 50%...

final answer...

The 3870 X2 is the fastest single card you can buy at the moment, and the high-end crown has swung to ATI.

If you believe the numbers the other sites are putting out, then yeah, it is faster - in benchmarks that don't represent actual gameplay. If, however, you believe [H] and their real-world in-game benchmarks, then no, it hasn't.

I tend to believe [H] on this one. Hell, the 2900XT beat the 8800GTX and Ultra in 3DMark but couldn't overtake them in games; I think the same applies here.

At any rate, with that new SLI board you got, you're going to be better off going with an NV-based card anyway, especially if you're gaming at 1680x1050 or above, which seems to be the breaking point between enthusiast-level cards and the high end.

Me, I would buy the higher-end card if at all possible, simply due to the fact that I know future games are going to push the hardware even more.

One last note. If everybody is patting ATI on the back for putting out this card, they should realize one thing: it took them nearly 15 months to make a card that would match/beat the 8800GTX, which was released on November 8th, 2006. That's actually pretty sad.
 
Well, I'm adding the 3870 X2 to my list of parts after reading two different reviews and coming to the same conclusion - that the card is the fastest single card on the market for the price - so I'm buying one. Yeah, yeah, I know Nvidia is bringing out a dual-GPU card as well, but judging by their current prices they'll want your firstborn as payment when it does arrive. I'm going to buy whatever is the best value for the money, and when someone brings out a card faster than the competitor's top card for less money, I won't hesitate to buy it (regardless of who made it or how many GPUs it has)!

It's about time Nvidia did something about the pricing on a card that has been around for well over a year; maybe they would do themselves a favour and gain a few more customers.
 
... One last note. If everybody is patting ATI on the back for putting out this card, they should realize one thing: it took them nearly 15 months to make a card that would match/beat the 8800GTX, which was released on November 8th, 2006. That's actually pretty sad.

And it took Nvidia 11+ months to beat themselves, and then only marginally. What the heck have they done since the Ultra that comes into the same ballpark? Nada! You keep yanking theirs, and pretty soon we'll see gaming graphics fall into a holding pattern and prices stay artificially high. That would be the case except for the ATI card you're trying to denigrate.
 
If you believe the numbers the other sites are putting out, then yeah, it is faster - in benchmarks that don't represent actual gameplay. If, however, you believe [H] and their real-world in-game benchmarks, then no, it hasn't.

I tend to believe [H] on this one. Hell, the 2900XT beat the 8800GTX and Ultra in 3DMark but couldn't overtake them in games; I think the same applies here.

At any rate, with that new SLI board you got, you're going to be better off going with an NV-based card anyway, especially if you're gaming at 1680x1050 or above, which seems to be the breaking point between enthusiast-level cards and the high end.

Me, I would buy the higher-end card if at all possible, simply due to the fact that I know future games are going to push the hardware even more.

One last note. If everybody is patting ATI on the back for putting out this card, they should realize one thing: it took them nearly 15 months to make a card that would match/beat the 8800GTX, which was released on November 8th, 2006. That's actually pretty sad.

I keep hearing about how [H] said the 2900XT wasn't as good while every other site using standard benchmark techniques said it was king...

The other site I read (www.hothardware.com) is always honest in its benches. In fact, you claim everyone was showing the 2900XT as "THE CARD" except for [H], but that isn't true at all; in the summary section of the 2900XT review on hothardware.com he says it's almost as good as an 8800GTS, but more often the GTS is better. Oh, and it doesn't beat the GTX or Ultra in 3DMark.

BTW, they use custom timedemos there, which is what I've been wanting to see from [H], so we can see for ourselves whether any of those results disagree with real-world results. I don't need it spoon-fed to me. I will decide what is playable.
 
Time demo != Real World - why are you failing to understand this?

Why are you failing to understand that in that example I was talking about a custom timedemo? If you had read on in the same post, you'd have discovered:

They also had a FRAPS test against the Ultra in UT3, where they played the game themselves. The methodology is similar to [H]'s here, and the 3870 X2 beat the Ultra at the same settings.
http://techreport.com/articles.x/13967/9

or as you would say:

Real World = Real World

The numbers are wrong. I don't buy that crap that the only reason the 3870 X2 beats the Ultra is that the Ultra is bad in timedemos, even custom timedemos. It beat the Ultra in real-world gaming using FRAPS as well.

[H] using a beta Vista 64-bit driver vs. TechReport using beta Vista 32-bit drivers is a more plausible explanation for this.
 
Pepsiennis, that's one of the worst arguments I've heard in this thread: "it took Nvidia 11 months to beat themselves." Are you serious? Am I missing sarcasm here? It's public knowledge that NV is not releasing better products because:

There are only one or two games that actually need more power than the Ultra provides.
There is no reason for NV to spend more money beating a dead horse.

You can slap a honey-coated view over stuff all you want, but ATI is slipping. I expected that, though, and I don't even expect ATI to become competitive until R700, which is something I thought back when the 8800 series dropped.
 
Not really: Nvidia hasn't risen to its next level since the Ultra, which is apparently only a souped-up GTX. After a year, the GTX is still at US$450-500+. They could have moved on and captured the enthusiast dollar by providing an even better experience, but instead they have been peddling the exact same experience for a year now. And oh, I don't see anyone kicking Crysis' arse. Didn't Nvidia have Crytek's thumb up its arse as well? And what is still up with Nvidia's Vista drivers?
 
Why are you failing to understand that in that example I was talking about a custom timedemo? If you had read on in the same post, you'd have discovered:



or as you would say:

Real World = Real World

The numbers are wrong. I don't buy that crap that the only reason the 3870 X2 beats the Ultra is that the Ultra is bad in timedemos, even custom timedemos. It beat the Ultra in real-world gaming using FRAPS as well.

[H] using a beta Vista 64-bit driver vs. TechReport using beta Vista 32-bit drivers is a more plausible explanation for this.


When the 2900XT or a single 3870 didn't beat the ULTRA in any TIME DEMOS... no one said a thing.

Now, when it beats it... TIME DEMOS mean nothing.

Weird.
 
Not really: Nvidia hasn't risen to its next level since the Ultra, which is apparently only a souped-up GTX. After a year, the GTX is still at US$450-500+. They could have moved on and captured the enthusiast dollar by providing an even better experience, but instead they have been peddling the exact same experience for a year now. And oh, I don't see anyone kicking Crysis' arse. Didn't Nvidia have Crytek's thumb up its arse as well? And what is still up with Nvidia's Vista drivers?

They apparently don't have to when the only real competition is themselves. :p

BTW, I've owned Nvidias and Voodoos and am currently running an X1900XT, so I'm far from a fanboy; just a realist who prefers to get the best hardware for the price.
 
[RCKY] Thor;1031980127 said:
I've been a huge proponent of the real world reviews since you guys started doing them Kyle. Maybe this will enlighten a few more people.

Thanks for taking the time to give us what we need to make an informed decision.

/agree
 
@ keflex... And since when has NOT competing with yourself resulted in success?

Oh, I don't know... /looks at Microsoft

Think about this logically for a split second: if your closest competitor hasn't offered anything that forces you to push the envelope, and you're already making a tidy profit *and* own most of the market share, why would you pour more money into R&D and make your previous products obsolete?

From a business standpoint, competing with yourself is pretty stupid.
 
And it took Nvidia 11+ months to beat themselves, and then only marginally. What the heck have they done since the Ultra that comes into the same ballpark? Nada! You keep yanking theirs, and pretty soon we'll see gaming graphics fall into a holding pattern and prices stay artificially high. That would be the case except for the ATI card you're trying to denigrate.

Actually, the 8800GTX was beaten when the Ultra was released, if only by a little, and the Ultra is over six months old now. ATI took over a year to come up with something competitive with the 8800GTX and Ultra. That's SAD. NVIDIA hasn't had to do anything else because, until Crysis came out, a single 8800GTX could chew up and spit out almost any game out there at nearly any resolution. At least ATI releasing something competitive will force NVIDIA to get the ball rolling on releasing technologies that have been in development all this time.

Not really: Nvidia hasn't risen to its next level since the Ultra, which is apparently only a souped-up GTX. After a year, the GTX is still at US$450-500+. They could have moved on and captured the enthusiast dollar by providing an even better experience, but instead they have been peddling the exact same experience for a year now. And oh, I don't see anyone kicking Crysis' arse. Didn't Nvidia have Crytek's thumb up its arse as well? And what is still up with Nvidia's Vista drivers?

Actually, it is a new revision of the G80 core with higher shader clocks, higher overall clock speeds and slightly higher memory clocks. After a year the 8800GTX is still nearly $500 because there was nothing to compete with it until the release of the 3870 X2. This isn't NVIDIA's fault but rather the fault of ATI, who has only brought out one sub-par product after another. As for NVIDIA's Vista drivers, they have been fine for me for some time now; I don't know what you are talking about there.

They apparently don't have to when the only real competition is themselves. :p

BTW, I've owned Nvidias and Voodoos and am currently running an X1900XT, so I'm far from a fanboy; just a realist who prefers to get the best hardware for the price.

No, they don't. Instead they get to continue developing their future architectures and cards. If the GeForce 9-series is what it is rumored to be, it is only a stop-gap designed to maintain their leadership in the marketplace until their real successor to the G80 GPUs is released.

In my opinion the 3870 X2 is a great card. The performance is great; however, it is hardly compelling for those individuals who already have an 8800GTX, Ultra, 8800GTS 512MB or any kind of G80/G92 SLI setup. So again, while it is a great product with some nice improvements to CrossFire itself, it isn't the GTX killer it needs to be.
 
from a business standpoint, competing with yourself is pretty stupid.

To a degree this is true. However, when sales start to fall you need to give consumers a reason to keep handing your company money. That is the time to release newer, compelling products.
 
When the 2900XT or a single 3870 didn't beat the ULTRA in any TIME DEMOS... no one said a thing.

Now, when it beats it... TIME DEMOS mean nothing.

Weird.

Because the 2900XT actually *IS* worse than the Ultra? Just maybe?

Of course, if you want to pretend that the 2900XT does better than the Ultra in real-world gaming too, go ahead. But you need to lay off eating the paint chips.
 
Actually, the 8800GTX was beaten when the Ultra was released, if only by a little, and the Ultra is over six months old now. ATI took over a year to come up with something competitive with the 8800GTX and Ultra. That's SAD. NVIDIA hasn't had to do anything else because, until Crysis came out, a single 8800GTX could chew up and spit out almost any game out there at nearly any resolution. At least ATI releasing something competitive will force NVIDIA to get the ball rolling on releasing technologies that have been in development all this time.



Actually, it is a new revision of the G80 core with higher shader clocks, higher overall clock speeds and slightly higher memory clocks. After a year the 8800GTX is still nearly $500 because there was nothing to compete with it until the release of the 3870 X2. This isn't NVIDIA's fault but rather the fault of ATI, who has only brought out one sub-par product after another. As for NVIDIA's Vista drivers, they have been fine for me for some time now; I don't know what you are talking about there.



No, they don't. Instead they get to continue developing their future architectures and cards. If the GeForce 9-series is what it is rumored to be, it is only a stop-gap designed to maintain their leadership in the marketplace until their real successor to the G80 GPUs is released.

In my opinion the 3870 X2 is a great card. The performance is great; however, it is hardly compelling for those individuals who already have an 8800GTX, Ultra, 8800GTS 512MB or any kind of G80/G92 SLI setup. So again, while it is a great product with some nice improvements to CrossFire itself, it isn't the GTX killer it needs to be.

The ability to go 4-GPU CrossFire in the near future is attractive to buyers, I would assume.
 
The ability to go 4-GPU CrossFire in the near future is attractive to buyers, I would assume.

They can only do four GPUs now, which means up to four 3870 cards or two 3870 X2 cards.

So right now, 3-way 8800 Ultras are still the solution to beat for absolute performance.
 
Because the 2900XT actually *IS* worse than the Ultra? Just maybe?

Of course, if you want to pretend that the 2900XT does better than the Ultra in real-world gaming too, go ahead. But you need to lay off eating the paint chips.

Well, to be fair, the 2900XT failed to beat the 8800GTX or Ultra in real-world gameplay and in the timedemos, so basically everyone knew it pretty much failed. The ATI fanboys cried "It isn't supposed to be a match for the Ultra or GTX, it is supposed to beat the 8800GTS 640MB" - which it actually couldn't do half the time.

However, when compared to the 8800GTS, many believed the 2900XT was the faster card because many timedemos said it was. According to [H], the real-world gameplay testing showed otherwise, if I recall correctly. So [H] took a lot of flak back then for that review.

Again, anytime anyone disagrees with the masses they come under fire. This is normal; some people just don't want to look at the data. However, after several driver revisions and several months, most people will agree that the 2900XT was an inferior product and the 8800GTS 640MB was the way to go all around. Again, this corresponds to what HardOCP said all along.

Real testing over synthetic testing is always the way to go. Whatever best duplicates real-world use is the valid method in my mind.
 
Actually, the 8800GTX was beaten when the Ultra was released, if only by a little, and the Ultra is over six months old now. ATI took over a year to come up with something competitive with the 8800GTX and Ultra. That's SAD. NVIDIA hasn't had to do anything else because, until Crysis came out, a single 8800GTX could chew up and spit out almost any game out there at nearly any resolution. At least ATI releasing something competitive will force NVIDIA to get the ball rolling on releasing technologies that have been in development all this time.

Exactly my point; they haven't been, or else they'd have been ready to counter the ATI launch. Instead, they now have to overcome their own inertia.


Actually, it is a new revision of the G80 core with higher shader clocks, higher overall clock speeds and slightly higher memory clocks. After a year the 8800GTX is still nearly $500 because there was nothing to compete with it until the release of the 3870 X2. This isn't NVIDIA's fault but rather the fault of ATI, who has only brought out one sub-par product after another. As for NVIDIA's Vista drivers, they have been fine for me for some time now; I don't know what you are talking about there.

I agree, ATI could have and should have done better during the period under discussion, but OC'ing your own chip doesn't strike me as terribly progressive; if they charged so much for a chip that could do so much more, then they were taking people to the cleaners. As for drivers, a year after Vista's release Nvidia's lack the polish and integration that ATI's have. I've used both, and Nvidia's pale in comparison.



No, they don't. Instead they get to continue developing their future architectures and cards. If the GeForce 9-series is what it is rumored to be, it is only a stop-gap designed to maintain their leadership in the marketplace until their real successor to the G80 GPUs is released.

Technologies they have had in development for some time? Is being matched the time to begin your development? Or have they been holding back on their fans and the public? Do they have it or not? Either they're selling old product or sitting complacently - it seems poor either way.

...it isn't the GTX killer it needs to be.

There are... a few... opinions to the contrary.

Regards, Dan.
 
I sent him an email asking why there's no love for [H]; let's see what he says. :D


Likely that it was a simple error. Sure...


I emailed the INQ about that; they replied (they are very good at getting back to me, so kudos to them) and said they didn't intentionally just use "choice reviews." It seems weird to me that they could forget about HardOCP, but I'll give them the benefit of the doubt - there are a lot of review sites out there, and I'm sure they read through them pretty thoroughly before they post 'em on their site. So it is possible they just haven't gotten to it yet.


I believe that like I believe Peter Petrelli can fly. :p



I hope [H]ard doesn't start using canned benchmarks... or include them.
That would be a waste of their time and my time, as canned sites are a dime a dozen.
I want gaming performance, not 3D wannabe numbers...


Agreed, I would hate to see [H] lower themselves to the level of some other sites' review methods.


Too bad it took them two GPUs on a single PCB to do it. =/ Not as impressive, in my honest opinion.

-DarkLegacy


Agreed, but still better than nothing.


@ keflex... And since when has NOT competing with yourself resulted in success?


Since AMD has fallen down and can't get up like Nvidia has.
 
Yep, stale competition = Nvidia gets lazy. The reason they have stuck with the G80 architecture for so long is that R&D'ing a new architecture to beat your previous one takes an IMMENSE amount of resources. Nvidia isn't stupid enough to waste that much money on R&D for a better-than-GTX/Ultra card when they don't see the need - that need being ATI bringing out something just as good. Hopefully the 3870 X2 and R700 will beat Nvidia into getting off their ass and doing something new.
 
Yep, stale competition = Nvidia gets lazy. The reason they have stuck with the G80 architecture for so long is that R&D'ing a new architecture to beat your previous one takes an IMMENSE amount of resources. Nvidia isn't stupid enough to waste that much money on R&D for a better-than-GTX/Ultra card when they don't see the need - that need being ATI bringing out something just as good. Hopefully the 3870 X2 and R700 will beat Nvidia into getting off their ass and doing something new.

My point is the same: why would they sit around not developing? Why not do it now and have it in the bag, ready for the next need? Does the boxer wait to drop a troubled opponent? Is R&D really going to be cheaper as the months roll along? They haven't done it, which means they are milking it to the last moment, or... they can't? Is it possible they are having trouble with new requirements (Vista, DX10.x)?
 
It is pretty exciting, but pairing them with a lame-duck Phenom doesn't sound right.


I would suggest a Phenom has more than enough power to push you past the point where pretty much any 3D shooter is CPU-limited at resolutions that are common today. A new CPU scaling article is on the schedule for [H] as well, and I think that is something important to visit again.
 
I will suggest this thread has played out and is now sufficiently off topic. It was fun for sure, and I think everyone was heard.
 