ATI Radeon HD 2900 XT @ [H]

WTF. I see end user reviews like this and it leaves me scratching my head. How could [H]'s review be so completely off?

http://www.nvnews.net/vbulletin/showthread.php?p=1265358#post1265358

I would guess, and it is just a guess, that they weren't running with HDR+AA on. From what I've read, you need to enable AA outside of the game if you're planning on running HDR as well. So max in-game settings may not be taking AA into consideration. (Which is where the 2900 takes its biggest hit.)

I've also read a few things about the latest drivers having some problems enabling HDR + AA, but I'm not 100% sure on that.
 
Fair enough then. I've not actually been reading a whole lot of the posts happening now, on any forums - they're mostly turning into bickering back and forth, heh.

Edit:

Having just read through that thread now, it looks as if the only real difference in settings is the resolution. Wonder how much of a hit the higher res will cause on the card. (If it's tested at higher resolutions in that thread.)
 
WTF. I see end user reviews like this and it leaves me scratching my head. How could [H]'s review be so completely off?

http://www.nvnews.net/vbulletin/showthread.php?p=1265358#post1265358

So you're questioning [H], but not this guy?
A certain reputable source of information, called "Gibbo", also said the HD 2900 XT was better than even the GTX. And not only that, but it consumed less power and produced less heat, while being quieter than the GTX.
Some called this Gibbo a "hardly ever wrong" guy. I'd say his reputation is somewhat different today, IMHO.

You believe what you want in the end I guess.
 
So you're questioning [H], but not this guy?
Well err no... though I suppose it sounds like that. I don't know what to believe at this point. I don't believe Xion is a liar, or a fool. But then I don't believe [H] would intentionally rig their review either. So what am I supposed to believe?
 
Well err no... though I suppose it sounds like that. I don't know what to believe at this point. I don't believe Xion is a liar, or a fool. But then I don't believe [H] would intentionally rig their review either. So what am I supposed to believe?

You believe what you want.
I believe in the majority, which basically says the same thing [H] did. The only difference is the conclusion. Some sugar-coat it, saying that despite its flaws (same performance as the GTS 640, while costing more, consuming more power and producing more heat and noise), it's a good card. [H] doesn't follow this sugar-coating trick. If it has more disadvantages than advantages, it surely is not something you recommend to anyone. In fact, even though they didn't call it a flop, DriverHeaven doesn't recommend the XT to anyone either, which, IMHO, is just another way of saying the same thing.
 
So what am I supposed to believe?

What your brain tells you to believe.

What I like most about the [H] is that they take a stance based upon a "gaming crowd". They don't benchmark for the sake of benchmarks; they benchmark certain configurations to figure out what will be of value to "gamers".

1. The 2900 is a "high end card", therefore it is reasonable to take the resolution points of 1680x1050, 1600x1200, and 1920x1080 as "typical".

2. Which games are gamers playing? Probably the ones that have been selling the most as of late. That should be an indicator of which games to benchmark.

3. How will the gamer configure the card? The typical answer is "whatever gives the best experience at the resolutions above".

To me, this is the [H] review method. However, the [H] method is not the typical method, and therefore is bound to come up with significantly different answers than "the mob" from time to time. It doesn't mean they are wrong; it means the set of answers to the question they asked gave a result that you may not like.
 
Well err no... though I suppose it sounds like that. I don't know what to believe at this point. I don't believe Xion is a liar, or a fool. But then I don't believe [H] would intentionally rig their review either. So what am I supposed to believe?

Notice in the first paragraph that you quote, he mentions the "alpha driver." The [H] review specifically mentions that driver--it was made for Oblivion, and it cuts a lot of corners on IQ to fix bad framerates. IIRC, even ATi recommended not using it for comparative benching.
 
He's running 8.37.4.2. Are those the drivers you're talking about? And do you have a link where ATI recommends not using them for benchmarking?

Here's a link to the [H] review page that mentions the alpha driver, no version given but the term used is the same.

The part about ATi not recommending them may have come from a forum post or another site's review, or it may just be a fig-newton of my imagination, but I was sure I saw it somewhere... oh well. I did search this thread to see if it was posted here by Brent or Kyle, but no luck. I did confirm, though, that the version of the "alpha driver" was 8.37.4.2, just like your source. [H] tested with it and saw no significant improvement. So, back to who do you trust I guess.
 
I'll be curious to see what happens when he benchmarks at the same resolution and on his own system instead of his friend's.
 
It never dipped below 35 last night on my 2900XT. Like I said, it's hard to do an apples-to-apples comparison just yet, because I was playing on a 1280 res monitor compared to 1680 in that screenshot on a single GTX. But I was also on a slower setup compared to my Conroe rig that nets about 10fps or better in that game and had grass shadows maxed (didn't have grass shadows on in the GTX bench), so it kind of balances things out. I'll be able to make a better comparison today and try to set every graphic setting exactly the same (AF level, no grass mod, etc), but so far things look very close.

So he is using different settings and resolution on the two cards... hardly anywhere near "apples to apples".

Also, some of the 2900 drivers are not doing HDR or AA correctly, so don't believe any reviews that don't show screenshots or are just benchmark numbers.
 
I would wait until I saw some more benchmarks from some respectable sites with newer drivers; until then, I wouldn't touch this card with a 10-foot pole... literally, it might burn some of the pole
 
The comparison ought to be made between the 8800 GTS 640MB and the HD 2900 XT. Most folks considering these cards are after price vs performance. Honestly, I'd wait and see what happens with the drivers too. It's not appropriate to compare a maxed-out XT on air vs a standard GTS 640MB, because they OC too. Many GTSes are doing 550-580 on the core; XFX even sells them at said clock. Personally, I'd OC it myself rather than paying for one factory-set. BTW, GTXs OC too :p (http://service.futuremark.com/compare?3dm06=2014598)

Below is one comparison that can be made between an air-overclocked GTS 640MB and an air-overclocked HD 2900 XT.

This is from DH: FEAR, 1920x, 4xAA|16xAF. The GTS 640MB is not at its max OC, rather 550 on the core, and the XT is @ 863 on the core. You may open them in separate windows.

http://www.driverheaven.net/reviews/XFX8800GTS/fearenthusiast.php
http://www.driverheaven.net/reviews/r600reviewz/overclocking.php

DH got that particular XFX card to do 580 on the core; however, the above bench is not at that speed.

I am eager to see X-bit's complete review in the future. In any case, I hope someone takes the time to bench both cards overclocked vs one another.
 
Pretty much owns the GTS except in a few cases. Good review.


Err, games: 8 wins and 9 losses vs the GTS; read the last two pages. There are some bugs in the tests though, but still.

Lawgiver, you're a Judge Dredd fan? Make your judgement based on the facts :D.

j/k, where it beats the GTS it beats it good, but there are a lot of games where it gets beaten or ties, more so than when it wins. But the card is more expensive to begin with, and uses more power, which adds that much more to the cost once you factor in your electric bill.
 
Pretty much owns the GTS except in a few cases. Good review.

The GTS gets beaten and it's a good review. The 2900 gets beaten in the [H] review and it's a bad review.

I still laugh at those morons who said the 2900 is so hot it will limit your system-wide overclocking. :D They were so desperate to criticize the 2900 they would say anything, no matter how ridiculous it sounded.

Who is ridiculous now?
 
Err, in 12 of the 19 tests it loses to or ties the GTS; read the last two pages. There are some bugs in the tests though, but still.

Lawgiver, you're a Judge Dredd fan? Make your judgement based on the facts :D.

j/k, where it beats the GTS it beats it good, but there are a lot of games where it gets beaten or ties, more so than when it wins. But the card is more expensive to begin with, and uses more power, which adds that much more to the cost once you factor in your electric bill.

lol pls. People who buy $300+ video cards do not care about their electric bill; if they do, they have no business spending so much money on a part in the first place. And it LOSES to the GTS in only 5 cases, 3 of which I think are driver bugs. And in 4 different games it competes with the GTX.
 
Err, in 12 of the 19 tests it loses to the GTS; read the last two pages. There are some bugs in the tests though, but still.

Lawgiver, you're a Judge Dredd fan? Make your judgement based on the facts :D.

j/k, where it beats the GTS it beats it good, but there are a lot of games where it gets beaten, more so than when it wins. But the card is more expensive to begin with, and uses more power, which adds that much more to the cost once you factor in your electric bill.

Errr, Battlefield: it beats the GTS, and in the highest res its minimum frames are faster than the average framerate of the GTS.

Call of Juarez: it beats the GTS
Far Cry: it has a 39.5 fps lead over the GTS !!!
Far Cry Research: it beats it by 61 fps !!!
Far Cry HDR tests: they are virtually tied
FEAR: ATI leads in 2 of the 3 resolutions, with the lowest and highest res in their favor
Ghost Recon: once again they are virtually tied
Half-Life 2: the GTS leads slightly
Stalker: it gets its ass handed to it, but the review blames drivers and the game itself, citing problems on nVidia cards also
Hitman: it looses
Tomb Raider: it looses
Splinter Cell: it wins
Gothic: it looses
NWN2: it looses
Oblivion: it wins
Oblivion: it wins
X3: it wins
C&C: tied
CoH: it wins by a commanding lead
Supreme Commander: it gets its ass kicked again
I don't know, looks to me like it wins a lot more than you claimed.

These are older drivers also, not the 8.38s or Cat 7.5s; these are some form of the 7.4s.

Performance will only go up, and right now, aside from a few bugs and a few games where it just dies from either poor drivers or poor design choices, it's on the GTS level or greater.
 
Performance will only go up, and right now, aside from a few bugs and a few games where it just dies from either poor drivers or poor design choices, it's on the GTS level or greater.

Did you at last accept the truth? That the XT and the GTS 640 are more or less the same, which is what the majority of reviews have said thus far?
And that, if they are more or less the same performance-wise, one should pick the cheaper one with fewer drawbacks in terms of power, heat and noise, i.e. the GTS 640?
 
This horse is 100 ft underground, it's so beat.

Look, the ATI card WILL catch up by the time DX10 comes around, but I still have to DING ATI for taking FOREVER to release a card that has so many dang issues upon launch. Believe me, I really want ATI to come on strong b/c prices are getting nuts. Still $420 for a 2900XT? Come on, please, it should be $340 max.
 
lol pls. People who buy $300+ video cards do not care about their electric bill; if they do, they have no business spending so much money on a part in the first place. And it LOSES to the GTS in only 5 cases, 3 of which I think are driver bugs. And in 4 different games it competes with the GTX.
How would you know? Have you done a survey? How long have you been building PCs?

The 2900XT may be scoring higher in some benchmarks because the IQ is subpar compared to NVIDIA's. They mentioned blurring issues when AA is applied, and who knows what else is being sacrificed for speed. Newer drivers may actually cause the card to run slower if they care about IQ. But that's not a bad thing if IQ is important to you.
 
Errr, Battlefield: it beats the GTS, and in the highest res its minimum frames are faster than the average framerate of the GTS.

Call of Juarez: it beats the GTS
Far Cry: it has a 39.5 fps lead over the GTS !!!
Far Cry Research: it beats it by 61 fps !!!
Far Cry HDR tests: they are virtually tied
FEAR: ATI leads in 2 of the 3 resolutions, with the lowest and highest res in their favor
Ghost Recon: once again they are virtually tied
Half-Life 2: the GTS leads slightly
Stalker: it gets its ass handed to it, but the review blames drivers and the game itself, citing problems on nVidia cards also
Hitman: it looses
Tomb Raider: it looses
Splinter Cell: it wins
Gothic: it looses
NWN2: it looses
Oblivion: it wins
Oblivion: it wins
X3: it wins
C&C: tied
CoH: it wins by a commanding lead
Supreme Commander: it gets its ass kicked again
I don't know, looks to me like it wins a lot more than you claimed.

These are older drivers also, not the 8.38s or Cat 7.5s; these are some form of the 7.4s.

Performance will only go up, and right now, aside from a few bugs and a few games where it just dies from either poor drivers or poor design choices, it's on the GTS level or greater.

Anything above 5% I considered a win or a loss, depending. I grouped Far Cry as one title, and only used the HDR modes, since without HDR both cards get more than enough frame rates.

BF: win for XT
COJ: win for XT
Far Cry: tie
FEAR: win for XT, but the mins are much lower, to the point it becomes unplayable, so the GTS wins.
Ghost Recon: tie
HL2: win for GTS
Prey: win for XT
Stalker: win for GTS, but because of a bug, must revisit later
Hitman: win for GTS
Tomb Raider: win for GTS
Splinter Cell: win for XT
Gothic: win for GTS
NWN2: win for GTS
Oblivion: win for XT
X3: win for XT
C&C: tie
CoH: win for XT
Supreme Commander: win for GTS


7 wins for the GTS, 3 ties, 7 wins for the XT. OK, does that sound better? These cards right now are pretty much equal, hence AMD's pricing of the XT. Add in the extra power: it really doesn't sound like much, but 60 watts over a year can add up to a lot, like $100 or more depending on how much a person games. About Stalker, it's not a bug; it's a problem with the shadows in the game, which go into software rendering for the XT. It will get around a 20% increase in performance once that is fixed, so it's not going to really change the position of the card in that game, I think.
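
(For what it's worth, that yearly figure is just watts x hours x price per kWh, so it swings a lot with how many hours the card is under load and what you pay for electricity. A rough sketch follows; the hours of use and the rate in it are assumptions for illustration only, not measured figures.)

# Back-of-the-envelope cost of an extra 60 W of GPU power draw.
# hours_per_day and price_per_kwh below are assumptions, not measurements.
def annual_cost(extra_watts=60, hours_per_day=4, price_per_kwh=0.12):
    kwh = extra_watts / 1000 * hours_per_day * 365
    return kwh, kwh * price_per_kwh

print(annual_cost())                                      # ~87.6 kWh/year, ~$10.50 at 4 h/day of gaming
print(annual_cost(hours_per_day=24, price_per_kwh=0.20))  # ~525.6 kWh/year, ~$105 for a 24/7 box at a higher rate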
 
Important to remember too that there are some fishy things about X-Bit's methods/results. They have the XT winning in Oblivion because they contend that you can't enable HDR and AA simultaneously, but it's well-known by any enthusiast that you CAN find ways to enable both, and that the XT loses by a wide margin when you do.

They also make a big point of disabling every optimization on the nVidia cards, a harkening back to the old drum-beat of "all of nVidia's optimizations are tools of the debbil." The ATi control panel doesn't expose nearly as many optimization settings, but that does not mean that the optimizations are not there--in fact, it's been proven in the past that they are. I prefer [H]'s policy of only disabling optimizations that have a visible impact on image quality.

Take into account that these tests were run under Vista, which supposedly is ATi's home turf, and the fact that the GTS beats it as much as it does is even more evidence of which card is the better value.
 
Interesting review @ X-bit.

It looks solid except for Gothic 3 and NWN 2, which are slower than the X1950 XTX. Hopefully they can tune that. I won't mention the bugged tests.

For its price point, strong arguments may be made for it. I still want to see a review that puts an overclocked GTS vs the XT. It would be interesting to see which one benefits more from the increased clocks.
 
Important to remember too that there are some fishy things about X-Bit's methods/results. They have the XT winning in Oblivion because they contend that you can't enable HDR and AA simultaneously, but it's well-known by any enthusiast that you CAN find ways to enable both, and that the XT loses by a wide margin when you do.

They also make a big point of disabling every optimization on the nVidia cards, a harkening back to the old drum-beat of "all of nVidia's optimizations are tools of the debbil." The ATi control panel doesn't expose nearly as many optimization settings, but that does not mean that the optimizations are not there--in fact, it's been proven in the past that they are. I prefer [H]'s policy of only disabling optimizations that have a visible impact on image quality.

Take into account that these tests were run under Vista, which supposedly is ATi's home turf, and the fact that the GTS beats it as much as it does is even more evidence of which card is the better value.

What is up with that? In the Far Cry test they quote 4x AA and 16x aniso, but in-game you can only take the aniso to 8, so you would have to override from the driver. In Oblivion they don't use AA with HDR, but everyone who owns an X18 or X19 series card knows you can force AA in the driver. They also fail to mention whether or not they use transparency AA. I guess it's time to stop reading benchmarks from questionable sources and start testing on my own. Unfortunately my 2900 bumps into the south bridge of my Biostar board. Whoops.
 
Important to remember too that there are some fishy things about X-Bit's methods/results. They have the XT winning in Oblivion because they contend that you can't enable HDR and AA simultaneously, but it's well-known by any enthusiast that you CAN find ways to enable both, and that the XT loses by a wide margin when you do.

They also make a big point of disabling every optimization on the nVidia cards, a harkening back to the old drum-beat of "all of nVidia's optimizations are tools of the debbil." The ATi control panel doesn't expose nearly as many optimization settings, but that does not mean that the optimizations are not there--in fact, it's been proven in the past that they are. I prefer [H]'s policy of only disabling optimizations that have a visible impact on image quality.

Take into account that these tests were run under Vista, which supposedly is ATi's home turf, and the fact that the GTS beats it as much as it does is even more evidence of which card is the better value.

I was just about to mention that... wtf?

Like some of the games above, The Elder Scrolls IV: Oblivion doesn’t support FSAA and HDR simultaneously, so we tested it in HDR mode only.
 
Oh pls, everyone knows that it is impossible to enable FSAA and HDR in oblivion, IMPOSSIBLE! Because then the 2900 would lose horribly! And the 2900 must win at least one of the more popular games!

If you're spending $300+, you're going to want as much IQ as possible, and that means as much AA as possible!

LawGiver, I understand your argument with the power sucking of the 2900; it's true that when you buy a high-end card, you would expect your electric bill to go up a bit. But how you're defending it is ridiculous; you're basically promoting throwing money away. You should be willing to pay for extra electricity IF you are getting improved performance back. With the 2900, for the most part, you get GTS perf at more than GTX electric bill cost.

All that money spent on electricity could have been saved for your next ATI card that is worth the money :p
 
Do people actually count their power usage? I have an Opteron 170 clocked at 2.7GHz and two X1900XTs clocked at 755/629, all power-saving features disabled, and my rig is on 24/7/365, so I'd imagine it's eating up a fair bit of power, but I've absolutely no idea how much; therefore I'm pretty sure I wouldn't care if it went up a little. My electricity bills seem cheap enough - £36/$72 a month - and even if they went up to, say, triple that, it'd still only be about £1250/$2500 a year, which is still a pretty small number when you consider the timeframe.
 
Do people actually count their power usage? I have an Opteron 170 clocked at 2.7GHz and two X1900XTs clocked at 755/629, all power-saving features disabled, and my rig is on 24/7/365, so I'd imagine it's eating up a fair bit of power, but I've absolutely no idea how much; therefore I'm pretty sure I wouldn't care if it went up a little. My electricity bills seem cheap enough - £36/$72 a month - and even if they went up to, say, triple that, it'd still only be about £1250/$2500 a year, which is still a pretty small number when you consider the timeframe.

You wouldn't mind paying $72*3=$216 or £36*3=£108 a month on just electricity???

I understand when people make a lot of money, and I'm not a cheap person, but seriously, what's the point of just throwing money away?
 
Seriously... you could use that extra money on another video card or monitor or wife/gf/husband/bf :p
 
...
Hitman: it looses
Tomb Raider: it looses
Splinter Cell: it wins
Gothic: it looses
NWN2: it looses
...

Don't mean to be an ass, but... can't you spell "lose"? What is loose? Are there some bolts on the 2900XT loose? Maybe that will help with the temps/power sucking!
 
You wouldn't mind paying $72*3=$216 or £36*3=£108 a month on just electricity???

I understand when people make a lot of money, and I'm not a cheap person, but seriously, what's the point of just throwing money away?

I don't consider £100/$200 a month a large amount to spend on electricity - in fact I find the level of my current bills pretty cheap considering all the electrical gizmos I've got (not that I'm complaining). I did make a mistake though - they're currently £72/$144 monthly, not £36/$72. OK, £210 a month would raise an eyebrow; about £125 is my limit before I'd start turning things off :rolleyes: I have bought energy-saving bulbs for everywhere, as the benefit is obvious, but when it comes to my PC I prefer sacrificing economy for power.

Seriously... you could use that extra money on another video card or monitor or wife/gf/husband/bf

I can't fit any more video cards in here, and the current gf might get cross if I spent the money on another one :D A new monitor would be nice but I still can't find a 24'' with proper HDMI/component scaling :(
 
I don't consider £100/$200 a month a large amount to spend on electricity - in fact I find the level of my current bills pretty cheap considering all the electrical gizmos I've got (not that I'm complaining). I did make a mistake though - they're currently £72/$144 monthly, not £36/$72. OK, £210 a month would raise an eyebrow; about £125 is my limit before I'd start turning things off :rolleyes:



I can't fit any more video cards in here, and the current gf might get cross if I spent the money on another one :D A new monitor would be nice but I still can't find a 24'' with proper HDMI/component scaling :(

I didn't say that last quote ;)
 
Oh pls, everyone knows that it is impossible to enable FSAA and HDR in oblivion, IMPOSSIBLE! Because then the 2900 would lose horribly! And the 2900 must win at least one of the more popular games!

If you're spending $300+, you're going to want as much IQ as possible, and that means as much AA as possible!

LawGiver, I understand your argument with the power sucking of the 2900; it's true that when you buy a high-end card, you would expect your electric bill to go up a bit. But how you're defending it is ridiculous; you're basically promoting throwing money away. You should be willing to pay for extra electricity IF you are getting improved performance back. With the 2900, for the most part, you get GTS perf at more than GTX electric bill cost.

All that money spent on electricity could have been saved for your next ATI card that is worth the money :p

sorry, I don't agree with any of that.
 
Funny that Anandtech was able to turn on AA + HDR, and look at those frames... losing to an 8800 GTS 320MB.

There's the first point you don't agree with.

So what do you not agree with about the electricity bill? Other than it doesn't matter? Because that's not an excuse.
 