Radeon HD 2900 XT: Flop or Not?

Radeon HD 2900 XT

  • Flop! Votes: 586 (63.4%)
  • Not really that bad! Votes: 316 (34.2%)
  • It's AWESOME! Votes: 22 (2.4%)

  Total voters: 924
Well, if that's true, then owners of GTX and Ultra cards are also SOL and have killed any overclocking headroom, since those cards use similar wattage, or just a bit less than the XT. Although I don't think you would agree with that, would you? ;)

It's a fact. The XT consumes the same or more power under load than even the GTX and the Ultra, and it is considerably slower, so I fail to see how that is a saving grace for the XT...
Besides, the Ultra is a silicon revision of G80, so they have plenty of headroom for overclocks.
 
When the best anyone can say about a card that's been hyped as the second coming for half a year is that it's nearly as good as an older, cheaper card, I call that a flop.

No one, not even ATI fanboys, could convince me that ATI considers this card a success.

What this means is that for the first time in many years ATI isn't even in the game for the "Best Gaming Graphics Card" title. That's a major prestige hit right there.

I lol'd at Nvidia when they released the ridiculous Ultra recently (which I also considered a flop, by virtue of being completely fucking pointless). I now suspect that Nvidia only released it so people wouldn't forget they existed in the middle of all the R600 press coverage, not because they thought they needed to compete.

To those people saying "wait for drivers" or "wait for the refresh", I say: "Seriously, you think Nvidia R&D was doing nothing for the last 6 months?"

Don't get me wrong: if ATI can squeeze out some driver-based improvements and do a quick manufacturing revision that lets them run it cooler and clock it higher, it might be a contender and more competitive. But no matter how you spin it, it's still a piss-poor showing out of the gate for a supposedly high-end card, and NV will almost certainly have a response if they need one.
 
What this means is that for the first time in many years ATI isn't even in the game for the "Best Gaming Graphics Card" title. That's a major prestige hit right there.

It's still a piss-poor showing out of the gate for a supposedly high-end card, and NV will almost certainly have a response if they need one.

It's not even in the running for number two or three: both the Ultra and the GTX beat it, and most of the time so does Nvidia's third-best, the 8800GTS 640. It manages to beat the #4 from Nvidia, the 8800GTS 320, for the most part. Big whoop! :rolleyes:
 
I voted Flop as well. ATi's new card draws more power and runs hotter, yet it has a hard time keeping up with the GTS cards. For those who said it was gonna beat the GTX by 15-20%: oops!
 
That is with the release drivers.
So it looks bad, and the [H] review makes it look bad, because it benches in a way the R600 can't handle. Other reviews show the same thing where there is overlap, but they also test settings the R600 can handle, where it does much better.

I, the gamer, choose whether HDR, AF and AA are on, not a reviewer, so I want to know how it performs at those settings too. And I do know, because other reviews tested them. That's why [H] differs from the others: it's restricted.

Because of the raw drivers, its performance ranges from around the GTS 320 to the Ultra, depending on what you do and which drivers you use, with an average around the GTS.

In some benches it surpasses the GTX and reaches the Ultra, while in a few it dives way below the GTS 320 and even the R580+.

There's a really big chance it's a driver issue, since nV and ATI optimize per game. Because the G80 has been out for a long time, nV is further along with its drivers and is still optimizing away, while ATI has some serious catching up to do.

The only question for me is where this driver maturing ends for the 2900: can it reach the GTX, with steadier performance across mixes of AA and AF settings?

How will it do in DX10 with mature drivers and real, new DX10 games?

How will its Havok FX performance be?

It looks bad now; it will get better, but how much is a gamble.
 
No one was talking about anything being guaranteed. I was obviously pointing out his silly logic and bias towards Nvidia cards: that somehow your "whole system" will not have "ANY" OC headroom simply because you install a 2900. Or did you miss that? What are you even talking about?

Your entire premise for some time now has been that a poorly performing card can be improved by drivers and/or overclocking, which is hardly guaranteed either way. That is a flawed premise for spending a great deal of money on something when there is guaranteed performance for the same cost or less.

Now then, his points are actually valid, as he adjusts them a post or two later, so I'm not going to spend time explaining for him. But you seriously need to look at why you're favoring a card that is more expensive, underperforms, is inefficient, and has a very questionable future.
 
Your entire premise for some time now has been that a poorly performing card can be improved by drivers and/or overclocking, which is hardly guaranteed either way. That is a flawed premise for spending a great deal of money on something when there is guaranteed performance for the same cost or less.

Now then, his points are actually valid, as he adjusts them a post or two later, so I'm not going to spend time explaining for him. But you seriously need to look at why you're favoring a card that is more expensive, underperforms, is inefficient, and has a very questionable future.

QFT - 8800GTS performance is guaranteed, why take a chance on the 2900XT improving (or not)?
 
Why do people excuse ATI and blame poor performance on young drivers?

They had MONTHS, for Christ's sake... the drivers should be as mature as Jennifer Lopez's butt!
 
That is with the release drivers.
So it looks bad, and the [H] review makes it look bad, because it benches in a way the R600 can't handle. Other reviews show the same thing where there is overlap, but they also test settings the R600 can handle, where it does much better.

So you're saying that because they turned the options as high as they could while maintaining 30+ fps for the most part, that shows a bias against the ATI card?

You really, really need to rethink that logic. So what if it beats the Nvidia card at lesser settings? The guy with the Nvidia card isn't going to be playing at those; he'll have the options cranked to the max. All the ATI guy can be sure of is dominance in the medium-settings range.
 
I don't see that new drivers are going to have much effect on the card's power consumption and heat. And if they release a more powerful version to try and match the GTX and Ultra using the same design, can you imagine how much power it will draw and how hot it will run? Sorry, I expected better after a 6-month wait. AMD had better do better with their reply to the Core 2.
 
All this talk about power consumption, heat output... blah blah blah. If you don't want any of that, buy a laptop. Obviously this was somewhat expected, as PSU manufacturers keep bringing out 1K+ watt PSUs. If it runs hot but still performs at its rated speed, I don't see what the problem is. Yes, this was a disappointment this time around, but if it's not for you, then go with another solution. NOBODY is forcing you to buy anything.
 
Doesn't stop anyone from arguing that such high power consumption is unnecessary when the competition can yield better results with less power, does it?
 
Add me to the Flop group as well...

A few words:

- Too late.
- Too power hungry.
- Bad die design that leaks power (think Prescott).
- Too expensive for its performance point.

However, I think that with a die shrink and a proper redesign, it could be a very good contender.
 
Doesn't stop anyone from arguing that such high power consumption is unnecessary when the competition can yield better results with less power, does it?

Then go with the competition... it's been a known fact that ATi cards run hotter and consume more power... why do you think they chose red as their color? :D

Additional comments: if nVidia had ever let us run SLI on Intel chipsets, I would have gotten another 8800 GTS 640MB a long time ago, but I will go with 2900XT CrossFire eventually.
 
I was a true ATI supporter for a long time (since I am Canadian), but I am already with the competition. I am not some blind, fanatical fanboy who buys because of a brand name. I think it's crazy that ATI is leaving the high-end market entirely in the hands of its competitor, even 9 months after the original release of the G80 series... it doesn't make any sense.

I really don't understand the strategy here...
 
I was a true ATI supporter for a long time (since I am Canadian), but I am already with the competition. I am not some blind, fanatical fanboy who buys because of a brand name. I think it's crazy that ATI is leaving the high-end market entirely in the hands of its competitor, even 9 months after the original release of the G80 series... it doesn't make any sense.

I really don't understand the strategy here...

Maybe the AMD-ATI merger came at a bad time, or maybe it had an effect on this whole ordeal! I'm still not convinced that this video card is a total flop; that's why I'm getting one and trying things out for myself, as I've always found ATi cards easier to OC than nVidia cards.
 
I was a true ATI supporter for a long time (since I am Canadian), but I am already with the competition. I am not some blind, fanatical fanboy who buys because of a brand name. I think it's crazy that ATI is leaving the high-end market entirely in the hands of its competitor, even 9 months after the original release of the G80 series... it doesn't make any sense.

I really don't understand the strategy here...

It's not a strategy. They simply don't have a choice in the matter. For a huge list of probable reasons they can't seem to keep up with NVIDIA's development cycle. Short development times are what NVIDIA is good at. They've always been like that.

I remember when NVIDIA made the statement that they were committed to bringing out new graphics cards every six months. Granted, they were already doing that when the statement was made, but no one thought they could maintain it, myself included. I was wrong: they've almost always stuck to that cycle over these last several years.

I think ATI was again caught off guard by the power of G80, and it took a long time to design a GPU that could match it. I think they tried really hard and realized that they really couldn't match the high-end 8800GTX. So they immediately set their sights lower and decided to concentrate on the mid-range. They released the HD 2900XT to gain some sales at the higher end, and to save some face. I would bet that the HD 2900XT failed to meet their expectations just as it has with many of us. The worst part is that NVIDIA has basically a 9-month head start on them at the very least. That's a lot of time that ATI has to make up if they want to be competitive again.

Notice how the X800 series and the GeForce 6 series came out within a few weeks of each other? Since then ATI has continued to fall behind, and now their development and product release cycles are 9 months behind NVIDIA's. That's not going to get made up very easily.
 
Don't nV and ATI use different design teams? A new GPU isn't made from scratch in 6 months, I think. There is overlap, so two main teams are needed:
Team one: R3xx, R5xx, R7xx
Team two: R4xx, R6xx, R8xx

I would think.

So you're saying that because they turned the options as high as they could while maintaining 30+ fps for the most part, that shows a bias against the ATI card?

You really, really need to rethink that logic. So what if it beats the Nvidia card at lesser settings? The guy with the Nvidia card isn't going to be playing at those; he'll have the options cranked to the max. All the ATI guy can be sure of is dominance in the medium-settings range.
The logic is that [H] chooses which settings they like and uses only those.
I want to see them all, like other reviewers do. So I'm not alone.

It can't handle games?

Nitpicking. It can't handle the settings [H] chooses for me. That sucks.

Nice that there are a dozen reviews which do bench all the options.

Especially the AF-only scaling in Oblivion with all AF settings.
 
The logic is that [H] chooses which settings they like and uses only those.
I want to see them all, like other reviewers do. So I'm not alone.
[H] gives real-world results: what you as the user would actually see when playing the game. If you don't understand why that is important, I am sorry.

[H] extensively tests each card with each game so that each card is played at the maximum playable settings at each specific resolution. This gives you (the consumer) more information about the product than any other website gives you. That is why you don't see [H] having 20 games in every evaluation: it would take too bloody long to do.
 
The logic is that [H] chooses which settings they like and uses only those.
I want to see them all, like other reviewers do. So I'm not alone.
No, they choose the highest settings that remain playable. Peak performance is not how we play games; we play games at the highest settings we can play them with.
Nitpicking. It can't handle the settings [H] chooses for me. That sucks.

Nice that there are a dozen reviews which do bench all the options.

Especially the AF-only scaling in Oblivion with all AF settings.


And the other reviews agree: it's not competing with the high end parts.

Do you play games by fiddling with AF settings? No, you play the GAME.
 
The logic is that [H] chooses which settings they like and uses only those.
I want to see them all, like other reviewers do. So I'm not alone.



Nitpicking. It can't handle the settings [H] chooses for me. That sucks.

Nice that there are a dozen reviews which do bench all the options.

Especially the AF-only scaling in Oblivion with all AF settings.

The whole point of the [H] review style is that the playable settings achieved are the result, not the fps.

Those settings are the best the card can do and still be playable; logically, what else do you really need to know?

If you want to know what fps the card can do, you can just look at the settings [H] achieved and work backwards: lower the resolution and the fps will go up, turn off effect X and the fps will go up, make the game uglier and the fps will go up...

There are a zillion other sites doing exactly what you ask, and guess what, they're free for you to read too. It's not like there's any restriction on what or how many sites you visit looking for info. :p
 
i havent even read all the posts in this thread but according to the inquirer's review they got higher fps on a higher resolution yes i know in this forum the inquirer is often times regarded as a rumor monger a biased even a fanboy from what ive observed anyway id just thought id mention this seems very strange would someone care to explain this heres the linq
http://www.theinquirer.net/default.aspx?article=39580
 
i havent even read all the posts in this thread but according to the inquirer's review they got higher fps on a higher resolution yes i know in this forum the inquirer is often times regarded as a rumor monger a biased even a fanboy from what ive observed anyway id just thought id mention this seems very strange would someone care to explain this heres the linq
http://www.theinquirer.net/default.aspx?article=39580


Periods are evil They come out of nowhere and leave people wondering when the next sentence is Oh my god, did one almost attack Help they're coming
 
i havent even read all the posts in this thread but according to the inquirer's review they got higher fps on a higher resolution yes i know in this forum the inquirer is often times regarded as a rumor monger a biased even a fanboy from what ive observed anyway id just thought id mention this seems very strange would someone care to explain this heres the linq
http://www.theinquirer.net/default.aspx?article=39580

First of all, the Inquirer does not indicate whether or not they use transparency AA. This mode of anti-aliasing is more demanding than regular AA; it cleans up jaggies on finer-detailed textures like chain-link fences, leaves, and fur. Both ATI and Nvidia cards that support Smart Shader 3 and above can enable transparency AA; the ATI console refers to this method of AA as adaptive AA. If you are spending upwards of $300 on a video card and you are not enabling this feature, you are essentially wasting half of the performance potential of that card.

In reference to STALKER, this is a game that can be played with dynamic lighting turned on or off. You cannot run dynamic lighting and AA at the same time. The dynamic lighting features can make an inferior video card crawl into single-digit frame rates; dynamic lighting is even more intensive than running standard lighting with high AA. The Inquirer test does not include the dynamic lighting mode for STALKER, as they are running high AA. The [H] test concludes that in STALKER the 2900XT is only able to run dynamic lighting at a lower resolution than the 8800GTS.

The 2900XT is late to market, more expensive than an 8800GTS 640 (at the time of this posting, $375.48 including shipping to California), and slower in the newest, most demanding game available at this time. If the card performs this poorly in STALKER, how will it perform in potentially more demanding titles like Unreal Tournament 3 and Crysis?
 
Taking six extra months to produce a "second place" card....

I call that a big phat FLOP. :)


In short.... ATI 2900XT = NVIDIA 5800
 
How well does this thing FOLD?
Given beta drivers, it might be forgivable.
 
Firing the engineers who created the R300 (aka 9700 Pro), R400 and R500 architectures does not seem like a very wise course of action to me :confused: Every company has its moments in the gutter; ATI/AMD is no different.

I'm not so sure that AMD's stockholders would agree with you on that point. "6 months late" and "uncompetitive" are not words that are well received during quarterly earnings conference calls. The glory days of the ATI 9X00 series are long gone and if ol' Hector doesn't get the engineers crapping competitive products... the board will bring in someone who will.
 
"Not that bad..."

HD 2900 XT and its delayed launch, hype, and not launching the entire familly of R6?? as promised isn't a flop to me. It isn't a desirable situation, but it isn't quite a NV30 scenario (they still sold quite well with the refresh).
 
Semi-flop. It performs OK, but I think we need to give it time to mature as far as drivers are concerned to really see its potential.

Thanks.
 
The only way the HD 2900XT could possibly be considered a success in my eyes is if new drivers were to suddenly get the card to the point where it could beat the 8800GTX 75% of the time and match the Ultra at least 50% of the time in benchmarks and real-world testing. Since ATI has stated that they are shooting for the 8800GTS's target market and performance levels, this seems unlikely.

ATI doesn't have confidence that the card can match the 8800GTX, which is 7 months old now. The HD2900XT isn't a failure the way the NV30 was, but I wouldn't call it a huge success either.
 
While I would not call it a failure, it is overpriced. It's a $300 card at a $400 price.
 
While I would not call it a failure, it is overpriced. It's a $300 card at a $400 price.
ATI fanboys will disagree, as they're banking on better drivers to get the card closer to 8800GTX speeds. Somehow this card is going to get better, or so they say.
 
more like totally f***ing pathetic.

whoopeee!

my 8800 SPANKS ALL ATI CARDS EVER MADE!!!!

yessssssssss:)
 
Semi-flop. It performs OK, but I think we need to give it time to mature as far as drivers are concerned to really see its potential.

Thanks.
how???????

the 8800 drivers will get better at the same rate... You think the 8800 has room for more performance with better drivers? DUH!!!!!!!!

SO. The ATI will still suck.....

anyone catch that?

oh wait what? you forget that for every 1 driver update ATI makes, nvidia has 20


ATI was, is, and will always be, a joke.

Are you people living under a ROCK? I don't even benchmark or get into video cards. I was just SMART enough to snag an 8800GTS Superclocked, and it's better than any ATI card made. WOW. What is the big problem??? Just throw anything ATI in the trash, and call mommy for a 2-week advance on your lunch money.

get a damn clue.
 
how???????

the 8800 drivers will get better at the same rate... You think the 8800 has room for more performance with better drivers? DUH!!!!!!!!

SO. The ATI will still suck.....

anyone catch that?

oh wait what? you forget that for every 1 driver update ATI makes, nvidia has 20


ATI was, is, and will always be, a joke.

Are you people living under a ROCK? I don't even benchmark or get into video cards. I was just SMART enough to snag an 8800GTS Superclocked, and it's better than any ATI card made. WOW. What is the big problem??? Just throw anything ATI in the trash, and call mommy for a 2-week advance on your lunch money.

get a damn clue.

Why are you getting so bent out of shape that not everyone is completely trashing the 2900XT? Do you have a personal stake in how the card sells? Does ATI selling another 2900XT somehow make your 8800 perform worse? Do your framerates plunge every time someone goes with ATI? No? Then kindly lay off the vitriol.

And about ATI being a joke - have you heard of a little card called the 9700Pro? No? How about the X1900XTX? No? Then how about the X1950XTX? Each card was arguably the fastest of its time (at least, the 1950 was until the 8800 was released). How does that make ATI a joke?

Why am I even responding to you?

Oh, and one more thing: Didn't Nvidia need about 20 driver releases to get their "Vista-ready" 8800's to actually work properly in Vista?
 
Why are you getting so bent out of shape that not everyone is completely trashing the 2900XT? Do you have a personal stake in how the card sells? Does ATI selling another 2900XT somehow make your 8800 perform worse? Do your framerates plunge every time someone goes with ATI? No? Then kindly lay off the vitriol.

And about ATI being a joke - have you heard of a little card called the 9700Pro? No? How about the X1900XTX? No? Then how about the X1950XTX? Each card was arguably the fastest of its time (at least, the 1950 was until the 8800 was released). How does that make ATI a joke?

Why am I even responding to you?

Oh, and one more thing: Didn't Nvidia need about 20 driver releases to get their "Vista-ready" 8800's to actually work properly in Vista?

Well, you are pretty much right on the money, especially about NVIDIA needing 20 Vista drivers. They needed nearly that many, it seemed.
 