ATI Radeon HD 3870 X2 @ [H]

Are there really that many people out there with $500 to burn on a video card? Is this review THAT important? This card is catering to like 5-10% of the enthusiast market. The bottom line is this: it took ATi 18 months to beat Nvidia's flagship card. That still isn't good. At $400 it might be, and at $500 it isn't. What price point do you guys think should be set for this card?


P.S. I have no real personal gain in this situation, as I refuse to spend more than $300 on a video card and am already set. This is purely speculation about who's better and the joy of seeing a new piece of hardware on the market.
 
Someone give this guy a medal. I was just about to further explain his post but he completely nailed it and it would just be redundant.

I've always liked HardOCP's new methodology since they first introduced it, minor problems and all. I haven't been keeping up on tech/computer news in the last year, but clearly there's some backlash to your testing methods. Using the points Tamale made in future reviews, especially creating an article about why you test this way, would go a long way toward turning the tide in favor of more realistic real-world testing instead of alienating readers.. which, for whatever reason, it seems to be doing.


I think he meant something like that...
 
Er... why? The [H] review already has an order of magnitude more information and written text than any of the other sites. Anand's page on CoD4 had more graphs than sentences, and only a very quick, very brief, very easy to miss note that the results are from a freaking cutscene! By comparison, the [H] review has 6 paragraphs of text on their CoD4 page. And you have the nerve to say that they need MORE to bring them in line with other sites? WHAT?

Oh, and they even mentioned the sharp hit in performance from grenades:

I'm beginning to wonder if you read the same review I did...

There's a lot of good stuff in this [H] review.. I don't mean to imply that none of my ideas are already in place.. that's why I said "keep the review 90% the same".

You're right.. I'm just trying to make some suggestions that would keep the site from attracting so much negative attention.
 
I think he meant something like that...

that article was perfectly acceptable for the reviews that [H] published between the time it was released and this one.. but it doesn't account for the massive difference in performance we're seeing here. Something else is going on. [H]'s reviews have historically shown differences from other sites, but never like this, 'ya know?
 
that article was perfectly acceptable for the reviews that [H] published between the time it was released and this one.. but it doesn't account for the massive difference in performance we're seeing here. Something else is going on. [H]'s reviews have historically shown differences from other sites, but never like this, 'ya know?

So what's your diagnosis?
 
So what's your diagnosis?

I have no idea. I'd love to see Kyle and the gang run some basic canned benchmarks with their setup and with a few more cards and more games... then do some more of their trademarked real-gameplay crunching too with a few more titles.. nothing as exhaustive as they've done already, just more data. I hope they just have a bad unit.. but if not, they might have uncovered a HUGE 'dirty little secret' in this industry. Can you imagine the kick-back to the community if they do this right and turn the GPU review world upside-down as a result?
 
that article was perfectly acceptable for the reviews that [H] published between the time it was released and this one.. but it doesn't account for the massive difference in performance we're seeing here. Something else is going on. [H]'s reviews have historically shown differences from other sites, but never like this, 'ya know?

I agree.

First and foremost, I am disappointed in the [H] review.
Done and I will not beat a dead horse.

Second, I think something else is going on as well.

Third, as stated earlier in the thread, we do NOT all have high-end monitors to splurge our cash on.
I operate a Chicago-based LAN party.
95% of the people there had monitors that ran at an acceptable 1280x1024.
There were one or two people whose monitors went beyond 1440x900.
At most, I saw ONE computer running an 8800 Ultra.

Fourth, there has been WAY too much ATI/AMD bashing as of late. OK, Intel has a niche, but I think Nvidia has a way to go in bringing more "HTPC" and cheaper cards to the market that let you stretch your "multimedia / gaming dollar."
CrossFire's multi-monitor support over SLI, better image quality, power consumption, overall price, and native HD decoding are pretty sweet if you ask me.
I think many of these features are being overlooked and not being tallied into the overall review. THESE ARE HUGE FEATURES.

I've been a fan of [H] for many years, and I've met and talked to Kyle many times in person... I just don't believe this review.
And specifically I think drivers and test setups have a lot to do with it.

I'd like to see a mid-to-high-grade processor benched with these new cards for a change. E.g., a 4800+ X2, Q6600, etc.
I don't know ONE person outside of enthusiast websites or a place like QuakeCon who has the cash to spend 1,000 dollars on a processor, 400 dollars on RAM, 300 on a motherboard, 2-3,000 dollars on an LCD, and most of all 5-600 dollars on video card(s).

To be honest I've been REALLY TURNED OFF by [H] reviews because for some reason, I think the "common man/woman" has been left out of the equation concerning product reviews.

my 2 cents.
 
Can someone explain something a bit more to me?

I understand the desire to use "real" gameplay vs using cut scenes or timedemos/flybys included with games.

What would be the problem however, with recording a custom timedemo (of the real intense stressful stuff) of the first run through on video card A, and then playing it back on video card B? Or make a custom demo of whatever real gameplay you want, and playback on both.

Is playing back a timedemo just not the same kind of stress on a video card as playing it in real time or something?

It would appear to me, not knowing any different, that this would allow for use of real gameplay testing, while eliminating the variable of different run throughs on different cards.

Hopefully Kyle or Brent or someone will answer it, not using some shitty football analogy like that other guy.
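
To make what I'm asking concrete, here's roughly the comparison I'd want to see, sketched in Python. The FRAPS-style log format (one cumulative millisecond timestamp per rendered frame) and the file names are just assumptions for illustration:

[code]
# Hypothetical comparison of a live run vs. a timedemo playback of the
# same route. Assumes FRAPS-style logs: one cumulative ms timestamp per frame.

def load_frame_times(path):
    """Return per-frame durations (ms) from a cumulative-timestamp log."""
    with open(path) as f:
        stamps = [float(line.split(',')[-1]) for line in f if line[:1].isdigit()]
    return [b - a for a, b in zip(stamps, stamps[1:])]

def summarize(durations_ms):
    fps = sorted(1000.0 / d for d in durations_ms if d > 0)
    return {
        'avg_fps': sum(fps) / len(fps),
        'min_fps': fps[0],
        'low_1pct_fps': fps[max(0, len(fps) // 100 - 1)],  # ~1st percentile
    }

print('live:', summarize(load_frame_times('live_run_frametimes.csv')))
print('demo:', summarize(load_frame_times('timedemo_frametimes.csv')))
[/code]

If the playback run shows noticeably better minimums than the live run, the demo is skipping work (AI, physics, netcode) that real play incurs, which would answer my own question.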
 
that article was perfectly acceptable for the reviews that [H] published between the time it was released and this one.. but it doesn't account for the massive difference in performance we're seeing here. Something else is going on. [H]'s reviews have historically shown differences from other sites, but never like this, 'ya know?
The "massive difference in performance" is explained by the significant difference in testing approach between H and other sites. There is no discrepancy whatsoever. If you fail to understand this you shouldn't engage in a technical discussion.

"Something else is going on"... yeah, some people like drama, that's it.
 
This thread reminds me of the [H] GF 8600 review. Every other review site thought it sucked hard donkey balls, but [H] thought it was the best thing since sliced bread. Funnily enough, the same degree of chaos and flaming ensued. :D
 
I didn't have the patience to go through all 34 pages; I made it to page 14 and saw that nobody else had mentioned ExtremeTech's review. What they had to say was "wait and see." They tested pretty much the same way the [H] crew does, i.e. real-world performance, and got the same results. I think [H]'s results are consistent with their methodology. Let's wait for driver updates and hope for the best. Nobody wants to see one company dominate; we all wanna see healthy competition here. F the canned benchmarks, play some games and see what's better.




edit: Anybody else notice that noseguard20's join date was the 27th? I'm not implying trolling, but a little fishy, don't you think?
 
This card is catering to like 5-10% of the enthusiast market. The bottom line is this: it took ATi 18 months to beat Nvidia's flagship card. That still isn't good.

I was thinking similar thoughts today. Then I also remembered when the situation was reversed with Nvidia's 5x00 series vs. ATI's 9x00. Previous-gen tech was beating it then too. It took Nvidia a whole generation to come back. Sure, it's currently taking AMD a little longer, but they are at least competitive (to my mind) in bang for buck.

For Kyle: Hurry up and test two of these puppies in CrossFire X or whatever it's called :D
 
What would be the problem however, with recording a custom timedemo (of the real intense stressful stuff) of the first run through on video card A, and then playing it back on video card B? Or make a custom demo of whatever real gameplay you want, and playback on both.

It sounds like a good way to narrow down the numbers a bit, but I think the primary focus on real-world gameplay is far more subjective than objective (which is intended). What matters is that the game is being played, and evaluations are based on gameplay experiences rather than fps output. I don't think [H] cares if the run-through is 100% consistent between cards A and B. Maybe about 90% or so. The graphs shown in countless articles display similar peaks and valleys between the compared cards, so I trust that Brent did well in attempting to reproduce the same actions. Whether the run-throughs are 90% consistent or 99%, the number one priority is to actually play the game.

To me it would sound rather embarrassing if any reviewer said "This is our evaluation of the video card based on our timedemo experiences..." What exactly do you experience when you sit back and have a timedemo doing everything for you?

The almighty FPS shouldn't be the deciding factor in which video card is better. It's the overall impression one gets after playing the game at comfortable framerates through tweaking settings (a.k.a. subjective real-world experience). Combine that with the cost of the card compared to other cards, and that's where its overall "worth" is evaluated. Taking shortcuts with automated systems pretty much defeats the purpose.

I'm glad to know that Brent, Kyle, and the other individuals involved in the reviewing process represent themselves as [H]ard gamers as well as [H]ard computer enthusiasts. Most other computer "enthusiasts" out there just look for the highest numbers. Things have become so mechanical, they don't even bother showing off their e-peen with high fps scores, but rather the elusive "marks." This is pretty much the only place where I'm confident that the reviews I'm reading are from real gamers, not some "enthusiast" who collected some popular timedemos and benchmarks.
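
For what it's worth, the "highest playable settings" idea is simple enough to sketch in Python. The preset names, the 25 fps floor, and the run_benchmark() hook (standing in for an actual manual play-through) are all just assumptions on my part:

[code]
# Toy sketch of a "highest playable settings" search: step down through
# quality presets until the worst-case frame rate clears a playability floor.

PRESETS = ['very_high', 'high', 'medium', 'low']   # assumed preset names
PLAYABLE_FLOOR_FPS = 25.0                          # subjective cutoff

def highest_playable(run_benchmark):
    """run_benchmark(preset) is a hypothetical hook returning the
    per-frame FPS samples from one manual play-through."""
    for preset in PRESETS:
        fps_samples = run_benchmark(preset)
        avg = sum(fps_samples) / len(fps_samples)
        worst = min(fps_samples)
        print(f'{preset}: avg {avg:.1f} fps, min {worst:.1f} fps')
        if worst >= PLAYABLE_FLOOR_FPS:
            return preset          # first preset whose lows stay playable
    return None                    # nothing playable; drop the resolution
[/code]

The output is a setting you'd actually play at, not a bare number, which is the whole point.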
 
I agree.

First and foremost, I am disappointed in the [H] review.
Done and I will not beat a dead horse.

Second, I think something else is going on as well.

Third, as stated earlier in the thread, we do NOT all have high-end monitors to splurge our cash on.
I operate a Chicago-based LAN party.
95% of the people there had monitors that ran at an acceptable 1280x1024.
There were one or two people whose monitors went beyond 1440x900.
At most, I saw ONE computer running an 8800 Ultra.

Fourth, there has been WAY too much ATI/AMD bashing as of late. OK, Intel has a niche, but I think Nvidia has a way to go in bringing more "HTPC" and cheaper cards to the market that let you stretch your "multimedia / gaming dollar."
CrossFire's multi-monitor support over SLI, better image quality, power consumption, overall price, and native HD decoding are pretty sweet if you ask me.
I think many of these features are being overlooked and not being tallied into the overall review. THESE ARE HUGE FEATURES.

I've been a fan of [H] for many years, and I've met and talked to Kyle many times in person... I just don't believe this review.
And specifically I think drivers and test setups have a lot to do with it.

I'd like to see a mid-to-high-grade processor benched with these new cards for a change. E.g., a 4800+ X2, Q6600, etc.
I don't know ONE person outside of enthusiast websites or a place like QuakeCon who has the cash to spend 1,000 dollars on a processor, 400 dollars on RAM, 300 on a motherboard, 2-3,000 dollars on an LCD, and most of all 5-600 dollars on video card(s).

my 2 cents.

People know about multi-monitor CrossFire; people know about native HD decode (which the G92 8800 GTs have as well). Image quality is a toss-up at this point; read the Feb 2008 edition of Maximum PC.

The whole point of the review is to "test" and "stress" the video card. You can't stress a 3870 X2 or an 8800 Ultra with a 3200+; you can't stress those cards with a 17-inch CRT and a gig of Corsair Value Select. Some people argued that the X6800 (2.93GHz) wasn't pushing it enough, that they needed to do quad cores. The test setup they use has been around for what looks to be over a year; back then Corsair Dominator, an X6800, and a high-end X38 motherboard were top of the line. Now it's mainstream: 266 for the processor, 100 (tops) for the memory, and the motherboard costs about the same, but w/e. If you strap the 3870 to a slow-ass processor, you're not effectively testing its potential.

What I don't understand is why people think these results aren't reliable, aren't reflective of the true performance of these cards at these resolutions, and choose to believe something else is to blame for the poor performance. You buy a video card (at least the one in question) to play video games, so why not test it with the actual games people will use with this video card (Crysis, HL2 Ep2, WiC, and others), using actual gameplay instead of some cut-and-dried benchmark that runs through the exact same 5096 frames, with the exact same graphics in exactly the same order? It's kind of like the FEAR benchmark. It says my video card kicks ass with 4x AA and 16x AF, never drops below 25fps, not even close. But when I actually play the game, I'm running and gunning through parts getting 10 fps at times.
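
You can actually put numbers on that FEAR point. A small made-up sketch in Python: a canned run and a live run can share the exact same average FPS while one of them spends over half its wall-clock time below 25 fps:

[code]
# Made-up frame times (ms) showing how an average hides the dips.

def pct_time_below(frame_ms, floor_fps=25.0):
    """Share of wall-clock time spent on frames slower than floor_fps."""
    ceiling_ms = 1000.0 / floor_fps
    slow = sum(d for d in frame_ms if d > ceiling_ms)
    return 100.0 * slow / sum(frame_ms)

canned = [20.0] * 1000                      # rock-steady 50 fps
live = [10.0] * 900 + [110.0] * 100         # mostly 100 fps, but 100 frames
                                            # take 110 ms (~9 fps) each

# Both runs render 1000 frames in 20 seconds: identical 50 fps average.
print(f'canned: {pct_time_below(canned):.0f}% of time below 25 fps')  # 0%
print(f'live:   {pct_time_below(live):.0f}% of time below 25 fps')    # 55%
[/code]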

Smoke grenades, flashbangs, fire, guns, and let's not forget ejected shells are all part of the game, and I want the video card that can put more of those items on the screen, and faster. I don't just want to see which card can shit out a frame faster; I want to see the detail. If the 3870 X2 is really a better card than the 8800 Ultra, it can render the same frame the 8800 can, no matter what is on that frame, do it faster, and stay faster even in the most intense scenes.

[H]ardOCP is doing a good job reviewing the material given, which is why I continue to read their reviews and post on their forums. I always thought Anandtech did a good job as well, but they're sticking up for the tools that make their jobs and their lives easier, when [H] is getting the job done and done right, even if that means taking longer and paying people more to put out a quality review.
 
I'm sure as hell not sorting through 34 pages of comments. But I will input my opinion; if it's repeated refer to the beginning of my post.

Don't lose hope in AMD, everybody. Yes, nVidia is currently much better. And so is Intel. But I have hope for AMD. If you look at historical data, AMD kicked ass during the last US recession. There will likely be a recession this year. Intel will suffer greatly from a recession if one does occur; they have MUCH more riding on their quarterly sales and inventories.

Everyone thinks AMD is a sinking ship right now. Not even close to the truth.

And to be honest, I may be a computer nerd who spends hours on Newegg, but this is my investing opinion. Just looking at the charts, AMD will be back... soon. I wouldn't be surprised if the stock went below $4/share, but on a 5-year frame I predict the price will be near $20 again.

Then again this is an internet forum so don't take my opinion too literally :p
 
Something is wrong with the numbers at [H]. It's not only that every other review has different numbers; even Tech Report used a custom demo of a multiplayer gaming session pitting this card against an 8800 Ultra, and it came out on top.
http://techreport.com/articles.x/13967/6
vs
http://enthusiast.hardocp.com/article.html?art=MTQ1NCw0LCxoZW50aHVzaWFzdA==

Also note the usage of AA, and that they used Vista 32-bit.

They also had a FRAPS test against the Ultra in UT3, where they played the game themselves. The methodology is similar to [H]'s here, and the 3870 X2 beat the Ultra at the same settings.
http://techreport.com/articles.x/13967/9


[H] is one of my trusted sites when it comes to unbiased reviews, so I don't suspect foul play. But something must be wrong with [H]'s setup, considering that even where they don't use the games' benchmarks the numbers differ a lot, and where they do use benchmarks the Nvidia cards lose.
 
The whole point of the review is to "test" and "stress" the video card. You can't stress a 3870 X2 or an 8800 Ultra with a 3200+; you can't stress those cards with a 17-inch CRT and a gig of Corsair Value Select. Some people argued that the X6800 (2.93GHz) wasn't pushing it enough, that they needed to do quad cores. The test setup they use has been around for what looks to be over a year; back then Corsair Dominator, an X6800, and a high-end X38 motherboard were top of the line. Now it's mainstream: 266 for the processor, 100 (tops) for the memory, and the motherboard costs about the same, but w/e. If you strap the 3870 to a slow-ass processor, you're not effectively testing its potential.

To play a little devil's advocate :D :D
So you are saying that the average homeboy/girl who has a spare 400 dollars to spend on a video card and pairs it with a somewhat old FX, 4800+ X2, or Core 2 Duo E6600 is at a hardware/bandwidth disadvantage because they don't have the dollars to buy an EXTREME-cored proc-based setup?
Where is this proven in hard specs?
If you really put this into perspective, there's a TON of gamers out there who don't have the money to upgrade core hardware.
Many people I talk to just upgrade their video card because their procs, memory, etc. are adequate.
I'd KILL to see an article written that scales back hardware for a change and pits it against powerhouse $1K processors/hardware.
All the while using newer-generation video cards in dual and single card formations.
Sure, it was proven in the past when people talked about it with the 6800-series Nvidia cards and such...
Things are a bit different now if you are at least running a somewhat respectable dual-core system.
I'm really curious.

Call me "oldschool," but I'd also rather see older hardware paired against newer hardware.
For example: dual X1900s and dual 7900s vs. 3870s and 8800s.
Many of us are just now upgrading from our 6800s and X1800s... let's see how they still stack up.

Also, feel free to prove me wrong, but I also think benchmarking on an Intel-only system is rather discriminatory.
I'd like to see some of these cards run on 6000+ procs or, dare I say, a Phenom setup.

To touch a bit on the driver issue: I know ATi and Nvidia have always delivered massive performance gains with driver revisions.
Feb and March will show this, I'm quite sure, with respect to ATI's "Spider"-based drivers and Nvidia's 3-way SLI coming.
Being humble, as a consumer, I'd rather see matured drivers for a product benchmarked before going out to purchase.
For example, most of us did not switch over to XP till SP1.
It's the same with us IT nerds and Vista, patiently awaiting SP1 before jumping off a cliff.

Maybe things will be different for a change.

There, that's my 4 cents now.
 
To play a little devil's advocate :D :D
So you are saying that the average homeboy/girl who has a spare 400 dollars to spend on a video card and pairs it with a somewhat old 4800+ X2 or Core 2 Duo E6600 is at a hardware/bandwidth disadvantage because they don't have the dollars to buy an EXTREME-cored proc-based setup?
Where is this proven in hard specs?
Sure, we talked about it with the 6800-series Nvidia cards and such...
I'm really curious.

I would say yes; with a powerful GPU you'd be somewhat disadvantaged with an X2 4800+ (an E6600 not so much, as it's quite a bit faster). You don't necessarily need a Core 2 Extreme, of course; there is a thing called overclocking, you may have heard of it. ;)

[H] actually did an article on 8800GTX SLI CPU scaling... http://www.hardocp.com/article.html?art=MTI2MiwsLGhlbnRodXNpYXN0

You can see the FX-60 being a bottleneck in a few of the tests. The X6800 fared better in the CPU bound situations.
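
The bottleneck effect in that article boils down to something like this toy model, where whichever of CPU or GPU takes longer per frame sets the frame rate. The per-frame costs below are illustrative, not measured:

[code]
# Toy model: frame rate is capped by the slower of CPU and GPU per-frame cost.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

GPU_MS = 8.0                                 # a fast card: 125 fps if unbound
for name, cpu_ms in [('slow CPU', 16.0), ('fast CPU', 7.0)]:
    print(f'{name}: {fps(cpu_ms, GPU_MS):.0f} fps')
# slow CPU: 62 fps  -> the card sits idle waiting on the CPU
# fast CPU: 125 fps -> the GPU is now the limit
[/code]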
 
Something is wrong with the numbers at [H]. It's not only that every other review has different numbers; even Tech Report used a custom demo of a multiplayer gaming session pitting this card against an 8800 Ultra, and it came out on top.
http://techreport.com/articles.x/13967/6
vs
http://enthusiast.hardocp.com/article.html?art=MTQ1NCw0LCxoZW50aHVzaWFzdA==

Also note the usage of AA, and that they used Vista 32-bit.

They also had a FRAPS test against the Ultra in UT3, where they played the game themselves. The methodology is similar to [H]'s here, and the 3870 X2 beat the Ultra at the same settings.
http://techreport.com/articles.x/13967/9


[H] is one of my trusted sites when it comes to unbiased reviews, so I don't suspect foul play. But something must be wrong with [H]'s setup, considering that even where they don't use the games' benchmarks the numbers differ a lot, and where they do use benchmarks the Nvidia cards lose.

I can't wait to hear an explanation, if they don't decide to ignore your post. If [H] cannot get over their cognitive dissonance regarding this matter, then there is no point in prolonging this subject, because while it is very clear that [H]'s test is filled with errors, they will continue to deny the facts just to reinforce their own belief that their method is the gospel truth. They need to get over themselves and deal with this problem.
 
I can't wait to hear an explanation, if they don't decide to ignore your post. If [H] cannot get over their cognitive dissonance regarding this matter, then there is no point in prolonging this subject, because while it is very clear that [H]'s test is filled with errors, they will continue to deny the facts just to reinforce their own belief that their method is the gospel truth. They need to get over themselves and deal with this problem.

They are ordinary people like everyone else and do what they believe benefits their users most. Using benchmark tests would be less time-consuming, and nobody would react if they didn't use their own test methodology. They chose a more time-consuming method because they care and genuinely believe it's best. :) Brent strikes me as a reflective person from what I have read.

I think there might be a plausible explanation for the differences. Most reviews have used 32-bit versions of the drivers, and since the drivers themselves are beta, the 64-bit ones might be inferior. This could cause serious hits in fps.
 
I didn't have the patience to go through all 34 pages; I made it to page 14 and saw that nobody else had mentioned ExtremeTech's review. What they had to say was "wait and see." They tested pretty much the same way the [H] crew does, i.e. real-world performance, and got the same results. I think [H]'s results are consistent with their methodology. Let's wait for driver updates and hope for the best. Nobody wants to see one company dominate; we all wanna see healthy competition here. F the canned benchmarks, play some games and see what's better.




edit: Anybody else notice that noseguard20's join date was the 27th? I'm not implying trolling, but a little fishy, don't you think?

Sadly, it seems you are not quite right, as their benchmarks are mostly canned.
Descriptions of the tests are on this page:
http://www.extremetech.com/article2/0,1697,2252573,00.asp
However, they did manage to obtain some results similar to [H]'s.

Noseguard might need a noseguard if he continues to troll :)
 
There's a lot of good stuff in this [H] review.. I don't mean to imply that none of my ideas are already in place.. that's why I said "keep the review 90% the same".

You're right.. I'm just trying to make some suggestions that would keep the site from attracting so much negative attention.


But that is the problem. You're thinking this is about attention. It's not. It is about REAL WORLD GAMEPLAY!

Anand's article is set up for the "quick scanner": a person who just looks at graphs, maybe a line of text or two (that's usually all that's provided anyway), and can SEE a difference. They see a graph and think OMG look at teh powah!

[H]'s way requires the person to think, to devote some resources to the article. Why did they choose this setting? What caused the frames to dip badly at this part of the graph? And the answers are right there in the many paragraphs of text. :eek:


Sadly, most of the comments I'm seeing are from totally ignorant, short-sighted individuals who won't give [H]'s article the brainpower it deserves.

It's not about attention; it's not about mud-slinging, name-calling, etc. It is about what card will deliver the best gameplay, period...


Ply
 
I hope [H]ard doesn't start using canned benchmarks.. or include them.
It would be a waste of their time and my time.. canned-benchmark sites are a dime a dozen.
I want gaming performance, not 3D-wanna-be numbers...
 
But that is the problem. You're thinking this is about attention. It's not. It is about REAL WORLD GAMEPLAY!

Anand's article is set up for the "quick scanner": a person who just looks at graphs, maybe a line of text or two (that's usually all that's provided anyway), and can SEE a difference. They see a graph and think OMG look at teh powah!

[H]'s way requires the person to think, to devote some resources to the article. Why did they choose this setting? What caused the frames to dip badly at this part of the graph? And the answers are right there in the many paragraphs of text. :eek:


Sadly, most of the comments I'm seeing are from totally ignorant, short-sighted individuals who won't give [H]'s article the brainpower it deserves.

It's not about attention; it's not about mud-slinging, name-calling, etc. It is about what card will deliver the best gameplay, period...


Ply
QFT. I keep coming back to this thread and it's not changing, just the names of the posters. There are a few people who don't understand the difference between a timedemo, benchmarks, and real-time gameplay. Real-time playing introduces a level of randomness that SHOULD be taken into account.

And again, since there are new people here I want to bring everyone's attention to one irrefutable success in the 3870x2: The card design and engineering. The amount of technology that they created and implemented here, with nominal heat and power, and a standard size, is incredible.
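
And if anyone wanted to quantify the randomness of live play instead of ignoring it, the fix is just repetition. A hedged sketch, where do_run() is a hypothetical hook returning the average FPS of one manual play-through:

[code]
from statistics import mean, stdev

def benchmark_with_spread(do_run, passes=5):
    """Repeat a manual run-through and report the mean and spread of avg FPS."""
    results = [do_run() for _ in range(passes)]
    return mean(results), stdev(results)

# If the spread is small next to the gap between two cards, the randomness
# of live play doesn't change the verdict.
[/code]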
 
And again, since there are new people here I want to bring everyone's attention to one irrefutable success in the 3870x2: The card design and engineering. The amount of technology that they created and implemented here, with nominal heat and power, and a standard size, is incredible.

I totally agree with this. Unless I am incorrect, this is the first dual-GPU card to be recognized as a single card in Device Manager. All of nVidia's offerings are recognized twice, and ATi's old two-GPU offering, the ATi Rage Fury MAXX, was also recognized as two cards.

For analogy purposes, the concept of the IMC in the Athlon64 is not a failure because the Core2 has superior IPC. With the design of Nehalem, even Intel recognizes AMD was ahead of the game in applying this type of memory controller to their design.

Perhaps nVidia will take heed of ATi's design and the 9800GX2 will be the last two PCB dual-GPU space heater we see.
 
Kinda, but right there: in every review, on every page with results.

note that most respectable sites explain how they get their numbers on every page too.. but when you're one of the only sites on the net doing things a certain way, you HAVE to go out of your way to explain why your results look different and how you got them.

the 'disclaimer' at the end of [H]'s review is a nice attempt at this, but it should be on every page, accompanying every graph... and personally I think it should be contrasted with the more readily accepted canned scores to show any discrepancies.

You must not read the pages, because the run-throughs represented on each page are described in depth. But we can surely go even more in-depth. And we do add:

Please be aware we test our video cards a bit differently from what is the norm. We concentrate on examining the real-world gameplay that each video card provides. The Highest Playable section shows the best Image Quality delivered at a playable frame rate.
 
Are there really that many people out there with $500 to burn on a video card? Is this review THAT important? This card is catering to like 5-10% of the enthusiast market. The bottom line is this: it took ATi 18 months to beat Nvidia's flagship card. That still isn't good. At $400 it might be, and at $500 it isn't. What price point do you guys think should be set for this card?


P.S. I have no real personal gain in this situation, as I refuse to spend more than $300 on a video card and am already set. This is purely speculation about who's better and the joy of seeing a new piece of hardware on the market.


Hehe, you don't know our readers, do you? THESE ARE THE GUYS that buy at this price point.
 
I have no idea. I'd love to see Kyle and the gang run some basic canned benchmarks with their setup and with a few more cards and more games... then do some more of their trademarked real-gameplay crunching too with a few more titles.. nothing as exhaustive as they've done already, just more data. I hope they just have a bad unit.. but if not, they might have uncovered a HUGE 'dirty little secret' in this industry. Can you imagine the kick-back to the community if they do this right and turn the GPU review world upside-down as a result?


Hmm, run some canned benchmarks that don't represent real gameplay in order to prove that our actual gameplay results are right/wrong? No, not today. Go buy your own card and find out what we already know.

I have said what I have to say on all of this and need to get back to work. Either our evaluation has value for you or it does not. If it does not, we have linked plenty of other sites on our main page for you to check out. :)
 
I have gotten called an ATI fanboy for this.. even though I haven't owned an ATi card since the pre-Doom 3 era...

however... with the [H] review weighted at 50%... and all other reviews combined weighted at 50%...

final answer...

the 3870 X2 is the fastest single card you can buy at the moment, and the high-end crown has swung to ATi.
 
Too bad it took two GPUs on a single PCB to do it =/. Not as impressive, in my honest opinion.

-DarkLegacy
 
Too bad it took two GPUs on a single PCB to do it =/. Not as impressive, in my honest opinion.

-DarkLegacy

just like when the RAGE FURY MAXX beat the GEFORCE 256 in UNREAL TOURNAMENT..... not as impressive because it takes two GPUs

but the fact remains
 
Too bad it took two GPUs on a single PCB to do it =/. Not as impressive, in my honest opinion.

-DarkLegacy

WHO CARES?? I suppose you don't like Intel CPUs for the same reason, right?? "It's not TRUE Dual/Quad Core!!!!!" That makes NO sense... who cares how they do it, as long as it's competitively priced and works properly?

It's like saying "Well.... Honda has to use DOHC instead of a pushrod engine to get the same power out of a 4-cyl as GM does out of a V6." Do they produce the same amount of HP? Are they competitively priced? Does it negatively affect my driving experience? If not... then who cares?
 
WHO CARES?? I suppose you don't like Intel CPUs for the same reason, right?? "It's not TRUE Dual/Quad Core!!!!!" That makes NO sense... who cares how they do it, as long as it's competitively priced and works properly?

It's like saying "Well.... Honda has to use DOHC instead of a pushrod engine to get the same power out of a 4-cyl as GM does out of a V6." Do they produce the same amount of HP? Are they competitively priced? Does it negatively affect my driving experience? If not... then who cares?

Because it has negative effects; in this case, it doesn't scale well. I would be curious to see if it would work just as well with, say, version 1.0 of Crysis, where multi-GPU was broken. It would really suck if a new game came out that had a similar problem and you're SOL because you bought the two GPUs on one card instead of the single GPU that had the same performance...
 
I emailed the INQ about that; they replied (they are very good at getting back to me, so kudos to them) and said they didn't intentionally just use "choice reviews." Seems weird to me that they could forget about HardOCP, but I'll give them the benefit of the doubt; there are a lot of review sites out there, and I'm sure they read through them pretty thoroughly before they post 'em on their site. So it's possible they just didn't get to it yet.

He replied to me too. Said it should be up today. Here's what he said:

"Actually if you take a look at the presentation we posted, there are a couple of benchmarks where AMD admits falling behind the Ultra - so there's no "choice reviews" here. It wasn't a "conscious decision". It just didn't pop up on our radar. *bangs machine that goes ping*

Frankly, I'd be a lot more conservative about X2 than most editors are being - there's a lot of boasting going on, but GX2 is coming soon and I don't think AMD will like the result. On the other hand it's a pretty "cheap" card if you think of it. Also, AMD did hand down some drivers at the last minute, which did improve gameplay substantially in some notorious cases, like Brent mentioned. I'll try and do Brent... uhm... justice, in today's roundup."

We'll be waiting for Justice for Brent :D

that article was perfectly acceptable for the reviews that [H] published between the time it was released and this one.. but it doesn't account for the massive difference in performance we're seeing here. Something else is going on. [H]'s reviews have historically shown differences from other sites, but never like this, 'ya know?

Umm... yes they have. Like someone said, go back and read the 8600 reviews. People flamed [H] for giving it a high grade because it actually played games better than the X1950 Pro, when other sites were showing lower FPS than they thought it should get. Or the ever-famous 2900XT review: [H] showed how broken anti-aliasing was on the card, but so many other websites, save a select few, were praising the 2900XT, saying it was better than the GTS, because they tested without AA, without filters, with canned benchmarks like 3DMark, where the 2900XT shined. That's nice, but the 2900XT still blew monkey chunks in gaming, and that's why I personally saw a few posters here on [H] go out and buy a 2900XT - against what multiple articles said - because they trusted those other websites more.... and end up selling that card and buying a GTX :rolleyes:

That's why I base 90% of my opinion from [H] articles. They haven't steered us wrong yet.

Hopefully Kyle or Brent or someone will answer it, not using some shitty football analogy like that other guy.

Your questions have been answered so many times RIGHT HERE in this thread it's not even funny. Learn to read, go back and find it, because it's there, nobody needs to answer it yet again.

I hope [H]ard doesn't start using canned benchmarks.. or include them.
It would be a waste of their time and my time.. canned-benchmark sites are a dime a dozen.
I want gaming performance, not 3D-wanna-be numbers...

Yeah, don't worry, they won't. If you've been keeping up with the past articles and the threads that have ensued, this isn't the first time people have joined just to bash them. They're not changing anytime soon, because as Kyle said earlier, they haven't been proven wrong yet; it's just a bunch of people whining that they aren't showing a piece of hardware the way they want it to be seen.
 
Because it has negative effects - in this case, it doesn't scale well. I would be curious to see if it would work just as well with, say, version 1.0 of Crysis where multi-GPU was broken. It would really suck if a new game came out that had a similar problem and you're SOL because you bought the two GPUs on one card isntead of the single GPU that had the same performance...


That's the trade-off you make, and you should know this when you buy the card. It all boils down to ATI getting more profiles into their drivers. R700 is meant to be multi-GPU, so it's in ATI's interest to get as many games as possible done now with the X2.
 
That's the trade-off you make, and you should know this when you buy the card. It all boils down to ATI getting more profiles into their drivers. R700 is meant to be multi-GPU, so it's in ATI's interest to get as many games as possible done now with the X2.

Yes, I know - I didn't say I wasn't impressed with it, I was just trying to point out why it might not be considered as impressive as a single-GPU card with the same power. 'tis all, man. :)

But still, the thing with SLI and Crossfire being 'broken' in Crysis demo wasn't the result of missing profiles as I understand it - anyone know how this card would handle a situation like that?
 
WHO CARES?? I suppose you don't like Intel CPUs for the same reason, right?? "It's not TRUE Dual/Quad Core!!!!!" That makes NO sense... who cares how they do it, as long as it's competitively priced and works properly?

It's like saying "Well.... Honda has to use DOHC instead of a pushrod engine to get the same power out of a 4-cyl as GM does out of a V6." Do they produce the same amount of HP? Are they competitively priced? Does it negatively affect my driving experience? If not... then who cares?

Never said I didn't like it. It's just sad that this is their answer to the 8800 Ultra. This is not the same ATI it once was. Maybe we are heading toward the generation of multi-GPU cards, especially with the 9800 GX2 just around the corner. I, however, feel that we are not in need of it just yet. I just had more faith in AMD/ATi to deliver a single-GPU design that could compete with and surpass the 8800 Ultra, or at least the GTX. Looks like that is not going to happen. I have very little faith that AMD/ATI can truly compete anymore if this is their solution for high-end graphics. This card will sell well, but just imagine if the 9800 GX2 greatly outperforms it? They are damn near as dead as 3dfx if that's the case. It's only a matter of time now till we find out.

Just to give an idea: according to Steam, 65% of users who took their system survey are using Nvidia cards, while 35% are using ATI. This is just on Steam, so the numbers could very likely be much different in the real world. It has to be noted, though, that a huge chunk of ATI's sales has come from the AGP end, since they continue to support it; a lot of the systems in that survey are using AGP ATI cards. I want ATI to bring solid competition to Nvidia on all fronts. If they don't, we could be looking at some serious problems and limited options down the road.

-DarkLegacy
 
The bottom line is that Vista 64-bit and Vista 32-bit are completely different operating systems with completely different video drivers. There is a very real possibility that AMD FUBARed the Vista 64-bit drivers and the performance gains that all the other sites using Vista 32-bit or XP are seeing simply do not apply to the current 64-bit drivers.

Until [H] runs a few tests using Vista 32-bit or XP like nearly ALL of these other sites did, the benchmark results simply cannot be compared. It's exactly like comparing gaming benchmarks between Linux and Windows XP, apples to oranges.
 