Sapphire Radeon HD 2900 XT CrossFire @ [H]

something must be wrong......... 2 2900xt = 1 8800 gts? ...........
Yeah, something is wrong. ATI's product. nvidia got pwned by ati during the FX fiasco, and now ATI is getting beat by nvidia. Similarly, AMD was beating up on intel in price/performance for quite a while... until intel came to their senses and discontinued the Pentium 4, canceled plans for a 4GHz P4, AND stopped using the Pentium brand name altogether (which I think was silly of them), and brought the Pentium M technology to the desktop.
Now intel is beating up on AMD.
It happens in real life, various products come out that aren't always as good as the competition. If the companies involved are smart, they try harder and come back with something that beats the competition.
Which is good news for gamers, competition means better stuff for us.
 
I probably won't even then. I'm looking forward to SP3 for XP, as long as MS doesn't abandon its userbase.

You know it's not even planned until 1H 2008, right?

So it will probably slip or be canned completely.

Vista is definitely the way for the [H] to go. Contrary to popular opinion it's not actually as bad as people make out (admittedly my PC isn't a slouch), and with DX10 being Vista-only and DX10 games like World in Conflict imminent, I can't see how they can avoid the switch much longer. And god knows I would prefer Brent spend his time reviewing 2 products than doing the same product on 2 OSes.

4 Years active posting and still a limp gawd, ffs.
 
Interesting review, I was thinking we'd see a bit of change as there have been other reviews floating around the net showing newer drivers are giving big improvements. Too little too late though I think.

It's a game like Stalker where the [H] really shines. You show us what settings you use, and clearly show that the 2900XT STILL can't do full dynamic lighting. It's really too bad because that's what makes STALKER so good! (BTW on that page of the review you wrote S.T.A.L.E.R. twice.... making fun of the 2900XT stuttering and stalling? Ruthless....)

Another good review, and I'm looking forward to the reviews coming out for Vista from here on out, since I'm on Home Premium 32-bit. So far there are a couple of annoyances, but overall it's pretty nice compared to XP; I don't think it's as bad as some have made it out to be. Been using it for over 4 months.
 
You know it's not even planned until 1H 2008, right? So it will probably slip or be canned completely.
...
Vista is definitely the way for the [H] to go...
Yeah, I'm sorta expecting MS to can it; they are always pulling stuff like that. But it would be nice anyway.

Anyway, I was under the impression that Lost Planet was supposed to be a nice DX10 title from all the hype; then it comes out and apparently DX9 mode on it is about the same as DX10 mode.
I wouldn't be surprised if it's the same with World in Conflict or Crysis. Both of them will almost certainly have DX9 modes so they won't miss out on sales to the 90%+ of gamers using XP, and it will be interesting to see how much actual difference there is between the modes.

Anyway, even though I don't like it and will probably complain, it's probably a good idea for [H] to go to Vista to stay on the cutting edge. I mean, isn't that sorta what being [H] is about anyway?
 
Between the DX10 review and this review... I'm not sure there is a real winner other than the people taking our money. It is starting to remind me of the benchmark differences between 100FPS and 120FPS in Quake.

I believe the 8800 series has served out its useful life and is starting to show its weaknesses. I think, for the gaming community in general, we need the new hardware out in the next few months, or the only card that will be able to give any challenge to DX10 will be the 8800 Ultra.
 
Something just doesn't add up. The "increase" under crossfire was negligible. I'm wondering if the 680i is giving the X2900 a hard time.

I would really rather see CrossFire, and SLI for that matter, evaluated on true x16/x16 boards like the ASUS Crosshair (SLI) and whatever the latest AMD-based CrossFire-enabled board is.

What I would REALLY like to see is crossfire and sli run heads up on the Crosshair with a 600+ Windsor @ 3.25Ghz.

But the numbers are just plain weird, and I really think there is some kind of bottleneck in the PCIe on the 680i. Without the technical docs, we have no idea just how much ATI is depending on the PCIe bus during CrossFire. Sure, they have a bridge like the SLI cards, but the numbers just scream "something's weird"; the bump between ONE 2900 and TWO 2900s should be close to the kind of numbers seen between ONE 8800GTS and TWO 8800GTSs.

Yes, the 2900 gets its ass kicked, we all know that, oh well. But I'm really curious about the specifics of what's happening with this CrossFire. The ONE X1900 to TWO X1900 CrossFire setup gave a higher net FPS improvement than we see with these 2900 tests.
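(To put a number on what I mean by "the bump": here's a quick Python sketch of how I'd compute the single-card to dual-card scaling. The FPS values are made-up placeholders for illustration, not figures from the review.)

# Multi-GPU scaling: percentage FPS gained by adding a second card.
# The FPS values below are hypothetical placeholders, NOT review numbers.
def scaling_gain(single_fps: float, dual_fps: float) -> float:
    """Return the percent FPS improvement from one card to two."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Hypothetical example: one 8800GTS at 40 FPS scaling to 68 FPS in SLI,
# versus one 2900XT at 38 FPS scaling to only 42 FPS in CrossFire.
print(f"8800GTS SLI gain:      {scaling_gain(40, 68):.0f}%")   # ~70%
print(f"2900XT CrossFire gain: {scaling_gain(38, 42):.0f}%")   # ~11%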
 
I saw another review where it seemed favorable compared to the Ultra. Strange :confused:
 
Something just doesn't add up. The "increase" under crossfire was negligible. I'm wondering if the 680i is giving the X2900 a hard time.
...

Guess you missed this tidbit? From the article:

"...and an Intel Bad Axe 2 for CrossFire."
 
Something just doesn't add up. The "increase" under crossfire was negligible. I'm wondering if the 680i is giving the X2900 a hard time.
...
I would really rather see CrossFire, and SLI for that matter, evaluated on true x16/x16 boards like the ASUS Crosshair (SLI) and whatever the latest AMD-based CrossFire-enabled board is.
...

Hehe, yeah, we would like for the 680i to support CrossFire too, but you will have to talk to NVIDIA and AMD about that, we have no control over it.

And just to point things out, we used the Intel Badaxe 2 i975 motherboard for CF testing.....680i does not support CF.
 
I finally see why HardOCP's results don't mirror anyone else's.

HardOCP uses certain titles, and they are all titles in which the 8800 series dominates the 2900 series.

Also HardOCP used an overclocked 8800GTS (running 80MHz above reference clocks) yet they used a reference 2900XT (instead of a pre-oc'ed one).

Where is Quake 4?
Where is Company of Heroes?
Where is F.E.A.R.?

Those are just a few of the many games the Radeon HD 2900XT has a distinct advantage over the 8800GTS 640MB (even with AA turned on).

It all makes sense now... it's called selective benchmarking and I'm going to prove my theory right.

I have acquired a Radeon HD 2900XT 512MB and I'll be benchmarking it on its own vs an 8800GTS 640MB. I'll downclock the 8800GTS to reference clocks since the 2900XT I got was a reference-clocked version and not a pre-oc'd version.

Another thing that comes to mind... WHY NO VISTA SCORES??? We all know AMD/ATi have been tweaking VISTA far more than they have XP, and the opposite is true for nVIDIA.

I'll also show both on that end. Once and for all I'll aim to prove that HardOCP, much like Fudzilla, doesn't paint both sides of the story (whether inadvertently or on purpose remains a mystery).
 
I saw another review where it seemed favorable compared to the Ultra. Strange :confused:

Why is that strange? It's dropped about $250 since it was first released, right into where it should be: more expensive than the GTX, but able to overclock higher and save some watts. It's (slightly) more than just a GTX ;)
 
From what I've seen in other CrossFire vs SLI reviews, the trend was similar, though CrossFire did win in 1 or 2 benchmarks. AA really kills the HD 2900 XT. AMD/ATI really needs to do something about that. Having a $400 card and not being able to crank AA without a huge performance decrease is quite sad.

But I was impressed with the Ultra's performance. It is surely related to CrossFire and SLI scalability in games, but the Ultra beating or coming very close to CrossFire and SLI setups, while running higher image quality settings, was impressive.

Good review as always.

Not reading the entire thread (this thing blew up fast!)

But that option may be out of ATI's hands right now, who knows... they should take what they can get from this GPU that's worth taking (they do support a lot of cool technology) and just carry it over to the R700.

It's not like ATI's drivers are bad; something is just borked on this card and isn't working.
 
I finally see why HardOCP's results don't mirror anyone else's...
...
It all makes sense now... it's called selective benchmarking and I'm going to prove my theory right.
...
Once and for all I'll aim to prove that HardOCP, much like Fudzilla, doesn't paint both sides of the story (whether inadvertently or on purpose remains a mystery).

Can we all say "vendetta"?

Hope you enjoyed that revelation you had... and have fun benchmarking.
 
I finally see why HardOCP's results don't mirror anyone else's...

HardOCP uses certain titles, and they are all titles in which the 8800 series dominates the 2900 series.

Also HardOCP used an overclocked 8800GTS (running 80MHz above reference clocks) yet they used a reference 2900XT (instead of a pre-oc'ed one).
...
Another thing that comes to mind... WHY NO VISTA SCORES???

A) They are using the same games they have used for a long time.
B) That OC is hardly anything on the 8800. I have a stock one (A2 core), and it does 621 easily vs 580 on this $50-extra OCed version from BFG. My memory is also higher (1800 vs 1700).
C) They already said that Vista will be next, but as shown in this thread thanks to the Steam numbers, Vista users with this hardware make up less than 2% of the total gamer population.
D) You sound horribly biased, as you have set out to prove that [H] is wrong, not to see what is what.
 
I finally see why HardOCP's results don't mirror anyone else's...

Where is Quake 4?
Where is Company of Heroes?
Where is F.E.A.R.?
...
Also HardOCP used an overclocked 8800GTS (running 80MHz above reference clocks) yet they used a reference 2900XT (instead of a pre-oc'ed one).
...
Another thing that comes to mind... WHY NO VISTA SCORES???
...
Once and for all I'll aim to prove that HardOCP, much like Fudzilla, doesn't paint both sides of the story (whether inadvertently or on purpose remains a mystery).

Good luck with that and all your "proof."

As for the games we use, we try our best to use the newest games that represent the major 3D engines, and we also watch the top-selling games in the USA and go from there. We also try to pick games that we think really push the technology and leave behind games that do not matter anymore, sort of like the ones listed above. We do not want to concern ourselves with, say, FEAR running fully maxed at 129FPS on one card and 149FPS fully maxed on another and declaring one a winner. IMO they are both winners, so we move on. We have a 2600 article coming up this week that will move to Vista and a newer set of games. Sadly, this year has been short on games overall.

http://biz.gamedaily.com/charts/?id=404 is where we source our sales numbers from.

These will be making an appearance in our next evaluation that moves to Vista.

World Of Warcraft: Burning Crusade
Lord of the Rings Online: Shadows Of Angmar
The Elder Scrolls IV: Shivering Isles
Battlefield 2142
S.T.A.L.K.E.R. Shadow of Chernobyl
Lost Planet: Extreme Condition

Lastly, NVIDIA out-of-the-box overclocked cards are the norm and HUGE sellers in North America. If ATI had the same, we would use those as well.
 
Not to mention it's not a big OC and it's priced closer to the 2900XT. Have to be fair, right? :p
 
It all makes sense now... it's called selective benchmarking and I'm going to prove my theory right.
Well, just be sure to let us know when you do, because we certainly wouldn't want to miss benchmark results from an individual forum member with an obvious agenda; those have a tendency to be particularly valuable and meaningful.

I do suggest, however, posting said benchmarks at Rage3D, where they will be met with congratulatory back-patting and the sort of venomous anti-NVIDIA commentary you may feel right at home with (so long as your results agree with your expectations). Posting them here may only lead to well-justified criticism by hardcore individuals who can find flaws in testing methodology where so few others can.

In any case, good luck, and I mean that quite sincerely.
 
I finally see why HardOCP's results don't mirror anyone else's...
...
Also HardOCP used an overclocked 8800GTS (running 80MHz above reference clocks) yet they used a reference 2900XT (instead of a pre-oc'ed one).

Well, perhaps they used it because, if they didn't use a factory overclocked card, the 8800GTS is a fair bit cheaper than the 2900XT; and in all honesty, when comparing cards it makes sense to compare ones that have similar costs, which helps in concluding which is better (particularly at that price point).

But, since you want to compare the 2900XT to a REFERENCE 8800GTS, thereby comparing a more expensive ATI card to a less expensive NVIDIA card, it would only be fair to compare a more expensive NVIDIA card directly to the 2900XT.

Using that logic, instead of comparing your 2900XT only to the reference 8800GTS, how about you compare your 2900XT to an 8800GTX?
 
One thing I noticed: the ATI CrossFire bridges do work if you have two slots between your PCIe x16 slots on your motherboard, they just don't allow any gaps larger than that. With the Bad Axe 2, you'd want to use them right against each other anyway, since the far slot only runs at x4. They just don't allow 3 slots between like the reference 680i boards use, but I haven't seen a CrossFire motherboard that would need that large of a gap yet, at least not one that doesn't run with an x16 and an x4 slot.

Looking forward to the Vista based reviews in the future, I thought the 2900xt vs 8800 GTS 320 was supposed to be the last XP based review...
 
...
[H] video card reviews are simply some of the best out there. Showing graphs of the game actually being played, minimum, average, and max frames per second, what eye candy is being used and such is just outstanding. Let's not even mention how time-consuming that is. Canned benchmarks simply do not cut it...


QFT.
Running a 3dmark bench takes only a few mins.
I have no clue how long [H] spends on benchmarking this stuff, but I'm pretty sure it's a lot. Let's say it takes 4 hours of setup time for each computer, hardware and software. I would imagine they probably run each benchmark a couple of times to make sure there are no anomalies, and they spend time tweaking each setting until they get a playable one. I bet they do a minimum of another 8 hours per game per card setup. If they do 6 games and 3 card setups, that's something like 150 hours of solid work to do this. Three solid weeks.
Even if I'm off by, say, a factor of two, 75 solid hours of work to produce this is still nothing to sneeze at.
And often, with graphics card reviews, reviewers are given only a couple days before the NDA expires. Squeezing 75 hours of work into 2 days is a little tough.

I'm doing some WAGs here for the numbers, but it wouldn't surprise me in the least if it took that long.
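(If anyone wants to sanity-check those WAGs, here's a quick Python sketch of the same back-of-the-envelope math. The hour figures are just my guesses from above, nothing [H] has confirmed.)

# Back-of-the-envelope estimate of benchmarking time per evaluation.
# Every number here is a guess from the post above, not a confirmed figure.
setup_hours_per_rig = 4       # hardware + software setup per card configuration
hours_per_game_per_rig = 8    # repeated runs plus tweaking settings until playable
games = 6
card_setups = 3               # e.g. single card, CrossFire/SLI pair, comparison card

total_hours = card_setups * (setup_hours_per_rig + games * hours_per_game_per_rig)
print(f"Estimated total: {total_hours} hours (~{total_hours / 8:.0f} work days)")
# Prints 156 hours (~20 work days), right around the "150 hours / three solid
# weeks" ballpark above; halving everything still leaves roughly 75 hours.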
 
Wow, I'm late to the party. After an article like that I can understand why, too. I would have loved to see some better OC numbers out of the CrossFired rig, but you did have a good point regarding the [heat:space:connector:price:modification] relationship, even if it was underlying.

Anyways, not the results I was looking for, but I'm still clickin' on every ad yous have, 50 times each. :D

nice read!
 
Pretty amazing results. I was shocked when I saw the 640 beating the XT CrossFire... I assumed that was 640 SLI... but it was a single card. Crazy stuff.
 
Pretty amazing results. I was shocked when I saw the 640 beating the XT CrossFire... I assumed that was 640 SLI... but it was a single card. Crazy stuff.

Yeah, I'm surprised that most of the games tested didn't show much improvement at all... pretty sad for Crossfire.
 
This review shows how well an 8800 GTS does in pixel shader 2.0 based games (aka DX9.0). We all know that G80 does very well with SM2.0 (which is why some regard 3DMark06 so highly, since it provides both an SM2.0 and an SM3.0 measurement). In this review the pixel shader 2.0 games used are:
Oblivion
BF2142
MS Flight Simulator

The only game in this review that is SM3.0 is Stalker. However, is this game intended for both ATI and Nvidia? The system requirements do not mention ATI video cards. If there is another website that shows system requirements please feel free to post it.
 
This review shows how well an 8800 GTS does in pixel shader 2.0 based games (aka DX9.0)...
...
The only game in this review that is SM3.0 is Stalker. However, is this game intended for both ATI and Nvidia? The system requirements do not mention ATI video cards.

Come on man, get off the whole "it's not optimized for ATI" thing. The games are games, made to be played on video cards, not just NVIDIA's. Get over it.
 
The only game in this review that is SM3.0 is Stalker. However, is this game intended for both ATI and Nvidia? The system requirements do not mention ATI video cards. If there is another website that shows system requirements please feel free to post it.

Well, let me check the box my copy of STALKER came in...

"Recommended Requirements:

...
nVIDIA GeForce 7900/ATI Radeon x1850/ 256mb DirectX 9c compatible video card"


Even the minimum requirements mention the Radeon 9600. This is from the box the game came in, so the source you quoted is wrong (why not go to the manufacturer?).
 
Well, let me check the box my copy of STALKER came in...

"...nVIDIA GeForce 7900/ATI Radeon x1850/ 256mb DirectX 9c compatible video card"
...
Even the minimum requirements mention the Radeon 9600. This is from the box the game came in, so the source you quoted is wrong (why not go to the manufacturer?).
The STALKER website does not have any information regarding system requirements that I could find. I am not sure if you read my post; I notice that you did quote part of it. I did ask for any additional information on system requirements. What is an X1850? Also, what ATI video card is recommended for this game? I don't recall an X1850/256MB video card :confused:

In any case we still have 3 pixel shader 2.0 games and a single SM3.0 game benchmarked in this review.
 
http://www.stalker-game.com/en/?page=faq#q85

Like I said, the system requirement is a 3D Hardware Accelerator Card - 100% DirectX 9.0c compatible, 256MB. That's a VIDEO CARD, not an NVIDIA card. Just because they list a 7800, for example, doesn't mean only NVIDIA cards will play it well.

Please stop blaming reviews you don't like for your card's poor performance.
 
...
In any case we still have 3 pixel shader 2.0 games and a single SM3.0 game benchmarked in this review.

And the single SM3.0 game did the worst out of them all. Also, as mentioned quite a few times, they benchmark the popular games, and their next reviews will be using Vista and different games.
 
I want to be as civilized here as possible, but come on. The consensus for some people seems to be that if it's not optimized for ATI hardware or isn't in the TWIMTBP program, then "oh snap!". If it's such a downer, crank some e-mails to ATI to get their GITG program off its feet and get more games signed up. :rolleyes:
 
Come on man, get off the whole "it's not optimized for ATI" thing. The games are games, made to be played on video cards, not just NVIDIA's. Get over it.


We purchase all our own games....off the shelf....like any other end user out there. It is just a retail product to us. It is of no matter to us if it is in one camp or the other in terms of marketing. We pick them as described above and put no merit into "optimizations."
 
I want to be as civilized here as possible, but come on. The consensus for some people seems to be that if it's not optimized for ATI hardware or isn't in the TWIMTBP program, then "oh snap!". If it's such a downer, crank some e-mails to ATI to get their GITG program off its feet and get more games signed up. :rolleyes:

Funny thing is that over the years we have seen NV-supported games play better on ATI and vice versa. NVIDIA has done a better job overall with game dev support, and the devs will tell you that themselves. If this is making an impact on performance, so be it, but as stated above, that is ATI's problem, not the consumer's... unless of course you are still buying an ATI card after reading our evaluations. ;)
 
Thank you for an excellent product comparison.

As to the posts above concerning "didn't use this game or that game, this OS or that OS, didn't OC this or that":

Look at the cost comparison of the two solutions! Even if the ATI cards had marginally WON in every test, how in the world could any sane reviewer recommend the ATI solution unless this was some website for millionaires only?

and even more important to my thinking,

After looking at the physical limitation that short connector places on the boards, and how it must impact the cooling of the smothered board, I would have considered calling the ATI solution unacceptable for that reason alone and stopped right there. It astounds me that you all will argue over "tests" and "methods" when it was clearly mentioned/shown why the ATI xfire solution is unacceptable before it is even put into 3D mode. Anyone wishing to argue that that is not the case must first turn off your pump or place socks over every fan intake before I will listen.
 
...What is an X1850? Also, what ATI video card is recommended for this game? I don't recall an X1850/256MB video card :confused:

The X1850 was classic ATI vaporware.
 
The game's recommended requirements are a 7900/ATI X1850 256MB card and a dual-core CPU; at least that is what the box reads.
 
In any case we still have 3 pixel shader 2.0 games and a single SM3.0 game benchmarked in this review.
And?

SM3.0 was a fairly small evolutionary step. Unless R600 particularly excels at geometry instancing, or some other SM3.0 feature that doesn't substantially impact performance, I don't see anything worth worrying about. If you recall the SM3.0 FarCry patch, you'll know how slight the impact of moving from SM2.0 to SM3.0 can be, which is why developers have understandably been slow to adopt it.

If you're using the utilized Shader Model as some sort of modernness barometer, you're fairly far off-base.
 