ATI Radeon HD 2900 XT @ [H]

Status
Not open for further replies.

Brent_Justice

Moderator
Joined
Apr 17, 2000
Messages
17,755
ATI Radeon HD 2900 XT @ [H] - Does it have what it takes to compete?

ATI’s DirectX 10-capable GPU is finally here in the form of the Radeon HD 2900 XT. Does this $399 video card have what it takes to compete with NVIDIA’s 8800 series? We explore the architecture, image quality, and real-world gaming, which shows a different experience than canned benchmarks.

Please DIGG to share! Thanks!
 
Thanks for the great review.
It is akin to the FX-series, but not quite as bad. At least the 2900XT can actually do DX10. :rolleyes:
I look forward to having a choice for my next video card. nVidia already won me over this cycle. Good luck, ATi.
 
Already bought my GTS 640 several months ago... guess my jumping the gun wasn't really that after all :p... more like just getting something better, sooner :D!
 
This is one AMD/ATi fanatic that has become physically sick. Not so much because Nvidia soundly kicked the R600's ass, but because of how the people in charge consistently lied to the public about what we could expect.

Nvidia has clearly won this round in dominant fashion and I for one will be eating crow for the next few decades. This is just plain a sad day for everyone with no real competition.

Congrats, AMD, you have just lost one of your biggest fans. 8800GTX, here I come.

Thanks for the review Brent and Kyle you guys rule as always.
 
So, you can overclock with Powerstrip/ATITool without the 8-pin PCIe power cable? That sentence got cut off...

article said:
Third party apps, such as Powerstrip, will let you overclock your 2900 XT regardless of PCIe power connections, but

Sorry about that. We had some fixes we were doing at the last minute tonight and I left that out. Should read - "Third party apps, such as Powerstrip, are currently limited as well to being usable only with the 6-pin and 8-pin connected." Thanks for the heads up. - Kyle
 
Wow, this round goes to Nvidia... let's see if they can keep the ball rolling on that.
 
I wish I'd known a week ago. I'd have my GTX now instead of waiting for it to ship this week. Oh well, at least now I can be 100% positive in my decision.
 
Can't say I didn't see this coming. In any case, I hope the midrange can perform better than the 8600s, or this is the greatest debacle in video card history.

Slightly off topic: if AMD can't get a processor to beat Penryn, then these are bad, bad times ahead.

edit: In ATi's slight defense, it can not be helped that Nvidia hit a home run (more like a grand slam) with the G80 architecture. To the people asking what ATi was doing for the 6 months from G80 launch till now: consider that these GPUs have been in development for years, and 6 months is not a lot of time to try and redo your product because your competitor made a monster product. In a lot of people's eyes this is still no excuse, but meh. I hope that after they saw what the G80 could do in November, they quickly started making plans for the revision. Still, it's really sad to see AMD/ATI in this position, getting beat down on both sides :(.
 
I just always HATED Nvidia image quality. So is it really true the 8800 cards have better image quality than the X1950 line?
 
owned.

Anyways, 3DMark indicates theoretical performance, right? Unless ATI is cheating akin to Nvidia with the 42 Dets, IIRC. So it's possible the drivers are 100% terrible, right?
 
Great review Brent. I can't wait to read the follow up articles, especially the comparison to the 320mb 8800.
 
I just always HATED Nvidia image quality. So is it really true the 8800 cards have better image quality than the X1950 line?

Yes, even the 2D quality of the 8800 is the best I've ever seen.
 
I am not surprised at the review. The ATi X1xxx series ran a lot hotter than the Nvidia 7xxx series. ATI's stock cooling was also subpar; aftermarket cooling is almost a must-have. I have gone from Nvidia to ATI, back to Nvidia, back to ATI, and now it looks like I am going to be going back to Nvidia.

The thing I find most funny is the HL2 certificate. Everyone that bought the older cards that came with the HL2 certificates was disappointed that they got the most stripped-down package of all. Looks like Steam/Valve is at it again, but I think anyone buying this card won't be doing it to get the certificate. But who is actually going to buy this card based on its own merits? Even a die-hard ATI fan is going to question this one.
 
Why did they even bother releasing this card? If it consumed 120 watts at most and cost ~270ish it would be a great card, but as it is I can't see who in their right mind would consider buying this thing.
 
Drivers are still immature. Give ATi some time to crank out some drivers, then make your decision. The latest beta drivers give 10-30% more performance in games but sacrifice IQ.
 
eVGA 640mb GTS.....here I come!

Not sure how much drivers will help the overall picture.

Even if you got more mature drivers and brought performance up to par with the GTS... you would still be using more power, and it would still cost more.

Remember... as the many early reviews show, ATi is fighting the war on 3 fronts - FPS/Power/Cost. Take 1 out of the equation, and you still have the other two factors weighing down the 2900 XT.
 
owned.

Anyways, 3DMark indicates theoretical performance, right? Unless ATI is cheating akin to Nvidia with the 42 Dets, IIRC. So it's possible the drivers are 100% terrible, right?

3DMark still uses the drivers, it doesn't bypass them or anything, so whatever it is, it is. It just goes to show how far off 3DMark is at predicting real-world gaming performance comparisons.
 
Drivers are still immature. Give ATi some time to crank out some drivers, then make your decision. The latest beta drivers give 10-30% more performance in games but sacrifice IQ.

Give someone a good reason to... I don't see one. Why should someone give ATI even more time than 7 months, or even wait at all? It's not like ATI's giving them anything in return. Yay, you boost performance by sacrificing image quality... not really a boost. Who cares? Not even sure what you're talking about, but even if it were true... WHO CARES?
 
Your statement is not conducive to adult conversation. You will be banned if I see you do it again. Kyle
 
I feel really bad for the ATI fans that waited so long and didn't go buy a GeForce 8800.

PS: They should have called it the FX2900XT, sounds better. Joking aside, I hope ATI quickly recovers, I don't like monopolies (except the game).
 
I had high hopes for this but wow.....I'm really shocked to see ATi bite the dust so hard. I now know how Nvidia fans felt when the 5800 bombed and the 9700 was released. I swear this is deja vu.

I'm at a loss here. I've been using ATi cards since the 9200 came out; I was hooked, and proceeded to buy the 9800 PRO and then the X850 XT PE. I'm really at a loss for words.

I guess I'm going to have to bite the bullet and end up buying the 8800GTS 640MB.

I'll admit, I'm an ATi fan and I want to say drivers will fix this but....=/

Thanks Brent for taking the time and posting the review in time after the NDA had expired. I can't imagine the pressure you had to endure with this card and the NDA, and us the fans. It was very much appreciated.
 
I am not surprised at the review. The ATi X1xxx series ran a lot hotter than the Nvidia 7xxx series. ATI's stock cooling was also subpar; aftermarket cooling is almost a must-have. I have gone from Nvidia to ATI, back to Nvidia, back to ATI, and now it looks like I am going to be going back to Nvidia.

The thing I find most funny is the HL2 certificate. Everyone that bought the older cards that came with the HL2 certificates was disappointed that they got the most stripped-down package of all. Looks like Steam/Valve is at it again, but I think anyone buying this card won't be doing it to get the certificate. But who is actually going to buy this card based on its own merits? Even a die-hard ATI fan is going to question this one.

The Valve Black Box is a sweet deal, but we have been down this road before...
 
3DMark still uses the drivers, it doesn't bypass them or anything, so whatever it is, it is. It just goes to show how far off 3DMark is at predicting real-world gaming performance comparisons.

Any standard, non-proprietary thing like 3DMark is a victim of cheating "optimizations"... other than unique timedemos on a per-site basis, or manual run-throughs, a company can "optimize". I guess even they know their product's not good enough :eek:!
 