ATI Radeon HD 2900 XT @ [H]

Status
Not open for further replies.
Thank you for the extensive review; it must have taken weeks to make, and I did read every part. Since it looks like 8800 GTS > HD 2900 XT, I'm hoping a price drop for the 2900 will boost sales, especially if it's in the 320 market. I'm an AMD guy, but AMD got their asses kicked.
 
I wonder if the 3.38 drivers have changed anything... supposedly they improve performance...
 
*slow golfclap* Way to go, you jackasses. Way to go. Six months late, and this is the fricken turd you lay on your loyal fans. Nice. I highly doubt these "320 Stream Processors" will actually be used for anything other than Folding. Sounds like "TRUFORM" and "48 pixel shaders" all over again. More marketing BULLSHIT. I waited 6 months for this POS instead of just getting an 8800 GTX. My bad. Won't do that again. Assholes. You didn't just nudge me to Intel/Nvidia, you full-on shoved me! Thanks! I SEE THE LIGHT, BROTHERS/SISTERS!!!! :mad: :mad:
 
It's surely disappointing to see this card fail like it did,

but one learns from one's mistakes.. that's how Nvidia did it,

and most likely ATi will too.. you fall, but you need to get up to fight another day..

Now we have to pay attention to their midrange card.. that's where the money lies, not the high end.

Good review.
 
The thing I don't understand is this whole "wait" mentality. Wait for better drivers, wait for r650, wait for...

Didn't ATi fans just spend 6 months waiting? For what? A card that consumes more power and for all intents and purposes is slower than the cheaper 8800GTS that people bought in November of 06, about six months ago?

Waiting in the hardware community is silly anyway, but it's exceptionally silly when you're waiting for something that ends up not even being as good as what you could have had a long time ago.

Besides, no one knows for sure if better drivers are going to fix anything. I tend to side with Kyle on this and I think we'll be seeing a rather fast refresh of the core.

The last thing I'd recommend anyone do is wait at this point. It isn't as if ATi didn't have very much time to release a competitive product, I mean look how late this card is.
 
I usually don't care for HardOCP reviews, but this review is of the utmost high quality. :D


I was telling Brent the other day that even though we went to this "real world gaming" format over 4 years ago, it is just now truly coming into its own, and I think this evaluation is the crown jewel when it comes to showing why the way we do it gives our readers better information than canned benchmarks and timedemos.

Interestingly enough, it was ATI's image quality superiority that pushed me to make the changes back then.
 
Thank you Brent and Kyle, for quelling the fanboy flames, and showing us what it can really do.

It's a good card in its own right, but compared to the competition it can't hold a candle.

PS: Are we going to see overclocked CPU results (>= 3.2 GHz)?
 
It's surely disappointing to see this card fail like it did,

but one learns from one's mistakes.. that's how Nvidia did it,

and most likely ATi will too.. you fall, but you need to get up to fight another day..

Now we have to pay attention to their midrange card.. that's where the money lies, not the high end.

Good review.

I just hope there will be "another day" for AMD/ATI.
 
Great review, Kyle and Brent. Looks like I'll be buying a GTS soon!
 
It will be interesting to see if this same shit happens with the new AMD CPUs.... :eek:
 
If that happens, there will be no words to describe the rage, distress, and shock I would feel all at the same time.

Not to mention losing all hope in AMD/ATi. I'm holding out. I REALLY want to see their new shit wipe the floor with C2D without being super-high-clocked heat maniacs :( like the A64 did to the P4 :(
 
I'm not too worried about ATI. The theoretical performance is there, everything looks great on paper, and DX10 games aren't even out yet (that is the #1 reason most of us are getting these 8800s and the x2900... it seems like you guys keep forgetting).

Anyway, I think it's too soon to say they failed; it was just released, for Pete's sake. A couple of driver revisions and DX10 games, and performance between the 8800s and the latest ATI cards will be right there.
 
The Ultra made me decide on a GTS 320/640. The 2900 XT reaffirms that decision. But given that the DX10 games I'm waiting for are still months away, I just decided to get a MountainMods U2-UFO case instead. I need it more than a new vid card right now, and I can start enjoying it immediately. Cases DON'T need driver updates!!
 
By the time they get the bugs shaken out and it's "right there," NV will crap on them with something else out the door.
 
Not to mention losing all hope in AMD/ATi. I'm holding out. I REALLY want to see their new shit wipe the floor with C2D without being super-high-clocked heat maniacs :( like the A64 did to the P4 :(

Don't see that happening. Netburst architecture sucked ballz. Core architecture doesn't. It's either going to be slightly faster/as fast or..... well, let's not think about the other....

I'm not too worried about ATI. The theoretical performance is there, everything looks great on paper, and DX10 games aren't even out yet (that is the #1 reason most of us are getting these 8800s and the x2900... it seems like you guys keep forgetting).

Anyway, I think it's too soon to say they failed; it was just released, for Pete's sake. A couple of driver revisions and DX10 games, and performance between the 8800s and the latest ATI cards will be right there.

Except by the time DX10 is the norm, we will be in the 2nd or 3rd generation of DX10 cards. Also, ATI fans already had to wait 7 months, and now they have to wait even longer for the performance to catch up?
 
I have waited for this review; I may end up going back to Nvidia..... I'll be deciding within the next month or so. (I have a bud who will buy my X1900 XTX right now for $175.00; should I take it?) Then I may end up getting the 8800 GTX, but what about Vista support? I have read rumblings about Nvidia having poor Vista support for their cards.... how is ATI's support for Vista?
 
I'm not too worried about ATI. The theoretical performance is there, everything looks great on paper, and DX10 games aren't even out yet (that is the #1 reason most of us are getting these 8800s and the x2900... it seems like you guys keep forgetting).

Anyway, I think it's too soon to say they failed; it was just released, for Pete's sake. A couple of driver revisions and DX10 games, and performance between the 8800s and the latest ATI cards will be right there.

I think that's assuming quite a bit.
 
Don't see that happening. Netburst architecture sucked ballz. Core architecture doesn't. It's either going to be slightly faster/as fast or..... well, let's not think about the other....

I know it sucked, but the A64 was pretty damn nifty. The Core 2 is a P3 on steroids, IIRC. I'm hoping they do something magical with the new chips...

(attached screenshot)


Brent, you wanted to know if we saw anything different... that tree is missing parts on the 2900

Every review site's benchmarks are different. In the [H] review, for example, the 2900 XT looks especially bad; in techpowerup's it looks especially good. See for yourself:

http://www.techpowerup.com/reviews/ATI/HD_2900_XT/5

Honestly, if you hadn't realized, [H] does things much differently than everyone else (except bit-tech :p they're pretty damn close, hence why [H] and B-T are the only two I trust). Everyone else uses apples-to-apples comparisons. The [H] crew uses apples to oranges, or rather bananas in this case, as the settings achieved at an acceptable frame rate are way different.

With [H], it isn't FPS, it's quality. All the FPS are right near each other, but the image quality changes.

It weeds out inefficiencies in certain things, like AA processing, AF filtering, and how a card copes with things like high resolution.
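The "highest playable settings" approach described above is essentially a search: for each card, step down a ranked list of quality presets until the measured average framerate clears a playability threshold, then compare the settings each card sustains at roughly equal FPS. A minimal sketch in Python, where all preset names and FPS numbers are made up for illustration:

```python
# Sketch of a "highest playable settings" comparison: step down a
# ranked list of quality presets until the measured average FPS
# clears a playability threshold, then report the best preset each
# card can sustain. All presets and numbers here are hypothetical.

PLAYABLE_FPS = 30.0

# Quality presets, best first (hypothetical).
PRESETS = [
    "1600x1200 4xAA/16xAF",
    "1600x1200 2xAA/8xAF",
    "1280x1024 2xAA/8xAF",
    "1280x1024 noAA/noAF",
]

# Made-up measured average FPS per card per preset.
MEASURED = {
    "Card A": {"1600x1200 4xAA/16xAF": 24.0, "1600x1200 2xAA/8xAF": 33.0,
               "1280x1024 2xAA/8xAF": 41.0, "1280x1024 noAA/noAF": 55.0},
    "Card B": {"1600x1200 4xAA/16xAF": 31.0, "1600x1200 2xAA/8xAF": 38.0,
               "1280x1024 2xAA/8xAF": 47.0, "1280x1024 noAA/noAF": 60.0},
}

def highest_playable(card):
    """Return the best preset whose average FPS meets the threshold."""
    for preset in PRESETS:
        if MEASURED[card][preset] >= PLAYABLE_FPS:
            return preset
    return None  # nothing playable at any tested preset

for card in MEASURED:
    print(card, "->", highest_playable(card))
```

The comparison then reads off which card sustains the higher-quality preset at a similar framerate, rather than which card posts more FPS at identical settings.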
 
I think once ATI works out the driver issues of its shaky release, the 2900 XT will well outperform the 8800 GTS. I don't think it stands a chance against the 8800 GTX, but it really wasn't meant to.
 
I know it sucked, but the A64 was pretty damn nifty. The Core 2 is a P3 on steroids, IIRC. I'm hoping they do something magical with the new chips...

(attached screenshot)


Brent, you wanted to know if we saw anything different... that tree is missing parts on the 2900

There is stuff missing in both shots if you look them over.
 
I know it sucked, but the A64 was pretty damn nifty. The Core 2 is a P3 on steroids, IIRC. I'm hoping they do something magical with the new chips...

The A64 was a damn good architecture. I don't know about Barcelona (come on AMD, show us something!).

PS: Actually, you could call the Core 2 Duo a Pentium Pro on steroids, but that doesn't take away any of its performance.
 
In the [H] review, for example, the 2900 XT looks especially bad; in techpowerup's it looks especially good. See for yourself.
It's difficult to say. They didn't have an 8800 GTS for testing, so making comparisons is difficult. The only title the 2900 particularly excels in is X3, which seems to cause some serious issues for all NVIDIA cards, for whatever reason (though the GTX is still faster overall).
 
Every review site's benchmarks are different. In the [H] review, for example, the 2900 XT looks especially bad; in techpowerup's it looks especially good. See for yourself:

http://www.techpowerup.com/reviews/ATI/HD_2900_XT/5

I can't comment on their review method, but will say that we actually play the games just like gamers do when evaluating performance.

Brent, you wanted to know if we saw anything different... that tree is missing parts on the 2900

Thanks, looks like ADAA is way oversampling there or something, interesting.
 
Ouch, the 2900 XT sure doesn't fare well in [H]'s tests. It's just another piece of this bigass puzzle where 2900 XT performance is all over the board. In one game it does well on performance vs. the 8800 GTX at higher AA/AF but wasn't doing very well with the low/no AA/AF settings. But then in others it holds its own vs. the GTX at low/no AA/AF and then totally tanks vs. even the GTS when you dial up the IQ settings. Very weird.

It just doesn't seem to be ready for prime time at this point. Whether it's drivers or what? *shrug*

So any thoughts on where that large gap in IQ between the G80 and R600 AA is going to show up? Or is that tunnel test tool maybe just measuring something that doesn't come up in a meaningful way, because game developers figured out how to avoid those situations since past cards didn't handle them well?
I think I have seen that before.....
:cool:
Yeah, I can see that debate opening up again with the extent of configuring that hardware is taking on. But frankly, as far as I'm concerned, if a card manufacturer is willing to do a game developer's job of optimizing for games that already exist when new hardware comes out, I say more power to them. For hardware that exists when the game is written, there is actually a LOT of work done by the developer to tweak toward the hardware.

Of course, that means you have to be careful trying to extrapolate performance in one game into expected performance in another.
 
there is stuff missing in both shots if you look over it.

Yes, but it is much less. There are a few branches that are more defined on the nV; I only highlighted the first thing I noticed. The nV is just missing sporadic pixels :p
 
It's just that, based on techpowerup, the 2900 XT offers nearly the same perf as the GTX but costs $150 less. When reading your review, even the thought of getting the 2900 XT over even the GTS seems silly.
 
This goes out to anyone who puts any faith into 3DMark, or any other synthetic benchmark or scripted timedemo, those number simply mean nothing in relation to real gaming performance. If you care about knowing how these video cards compare in actual games there is no other way to get that information than playing the games themselves just as you would do as a gamer.

Shouldn't it be...
This goes out to anyone who puts any faith into 3DMark, or any other synthetic benchmark or scripted timedemo, those numbers simply mean nothing in relation to real gaming performance.

If nothing else hopefully their low-end cards will have the same HDMI adapter. I plan on getting a new TV soon and my HTPC needs a new CPU/Motherboard. It would be a waste to hook up a nice HTPC to a nice TV via a component hookup.
 
There is stuff missing in both shots if you look them over.

The gray parts are the transparency supersampling AA patterns; they are supposed to fill in the broken tree parts, but in some cases there may be slight errors of over- or undersampling.
 
I think that's assuming quite a bit.

How am I assuming too much? It looks great on paper, and it performs great in certain games and applications, some of which weren't tested here. I just think you guys are shutting it down too quickly. A month or so from now, if performance doesn't improve, we will have a better judgement, and even then we don't know how these different architectures perform in DX10. Maybe the x2900xt kills the 8800 line in DX10 games, who knows. It's way too early to make assumptions; today is launch day...
 
I just wanted to add my thanks to Brent and Kyle for providing a great and very thorough review. This has been the most anxiously awaited review of the last 4 years.... maybe ever.

With 22", 24", and 30" widescreen monitors becoming more popular, super-high-res and high-AA/AF details become increasingly important aspects of a detailed review process. From here on out, anyone who looks at just 3DMark and canned benchmarks will be very uninformed.
 
Wow, just when I thought I was becoming an ATi loyalist. I am going to get a new card whenever I get back from my work/vacation trip to Florida. By then (middle of July), the entire 2900 line should be 65nm, and I will pick one up if Nvidia hasn't dropped their prices by too much. A 680i board with an 8800 GTS 320MB for under $400 would be very nice.
 
So any thoughts on where that large gap in IQ between the G80 and R600 AA is going to show up? Or is that tunnel test tool maybe just measuring something that doesn't come up in a meaningful way, because game developers figured out how to avoid those situations since past cards didn't handle them well?

The only speculation I can make on that is that in extreme cases, where there are very high-resolution textures, you may see a slight difference far off in the distance at certain angles. At 16X AF, though, the mipmaps are pushed back quite far, so it will only show in the far distance. The difference may be so small that it really doesn't matter. I honestly don't know; so far I haven't seen anything that really stands out.

Of course we have no idea how DX10 games will behave.
 