Newer 2900XT reviews? (people saying it matches a GTX)


MrGuvernment (Fully [H], joined Aug 3, 2004, 21,818 messages)
Hey all.

It seems I may be confused, or perhaps others are, when they say the 2900XT is on par with the GTX and beating the GTS.

My understanding was that the XT is meant to compete with the GTS, and that ATI didn't have a "high end" card to compete with the GTX and Ultra cards.

From all the original reviews I read, the 2900XT is on par with or worse than the GTS 320MB and 640MB cards, and doesn't even play in the same field as a GTX card... but others are saying the opposite.

Are the new drivers helping that much now?

Has any site redone their reviews with newer drivers?
 
Most are not "redoing" reviews... they're doing driver-to-driver comparisons, 7.8 to 7.9, etc. So far their cards still take a large hit when any AA is applied, and performance hasn't improved very much. Their DX10 framerates in most DX10 games are about half to a third of their DX9 framerates. This is not the case for NVIDIA.

And there are maybe one or two cases I've seen where it matches a GTX, and those were games no one cares to play :)
 
I'd like to see reviews as new drivers come out. The 2900XT sure has the hardware power to compete. Since NVIDIA doesn't want to lower prices or allow SLI on X38, maybe I'll just tell them to stick it and get a CrossFire setup. Just hoping ATI gets some good drivers out the door for DX10.
 
Some over on OCForums are saying the 2900XT is shining at high res (1680+), where the memory bandwidth starts to kick in. The issue of crappy AA is still present, but since it performs well at higher res, AA is not as needed... but finally, the massive power consumption makes for a big con.
 
The 2900XT is too slow for most games, but it performs really well with the UT3 engine. The card architecture needs a redesign; let's hope the 2950XT can have that obliterating performance in all games, just like it does with the UT3 engine, without the loud noise.
 
Regarding the "massive power consumption": it drains 60 more watts than the GTS; I can't call that massive...
 
The 2900XT is not a GTX-level card. It is priced to compete with the GTS, and the GTS does have better performance in most cases with AA enabled. But it will be interesting to see how the new multi- and super-sampling modes perform in the upcoming Cat 7.10.
 
On the "massive power consumption" being a big con:

2900XT load = 220-240W
8800GTX load = 270-290W

So... it's a "big con" for ATI, but perfectly acceptable for NVIDIA?
 
^^ Damn, I thought the 2900XT drained a lot more, like 100+ watts more... thanks for clearing that up. But if the 2900XT performed like a GTX... perhaps it would be more acceptable.

What about the GTS, since that is the card it was meant to compete with?
 
Granted, the power consumption/performance ratio is worse with the 2900XT... but bear in mind: the GTX uses (on average) about 25% more power under load than the XT, but it also performs over 25% better than the XT in most games. The exact margin varies, but the GTX's performance lead over the XT is widely known.

The wattage numbers could be off, though. I googled for results and came across several sites giving 200+ watt figures for both cards, and a bunch of sites saying the power consumption was half that.

Bit-Tech:
http://www.bit-tech.net/hardware/2007/05/16/r600_ati_radeon_hd_2900_xt/21

AnandTech:
http://www.anandtech.com/video/showdoc.aspx?i=2988&p=30

[H]ardOCP:
http://enthusiast.hardocp.com/article.html?art=MTM0MSwxNSwsaGVudGh1c2lhc3Q=

TechReport:
http://techreport.com/articles.x/12458/15
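For what it's worth, here's a quick back-of-the-envelope sketch of that power/performance point in Python. The wattages are the rough system-load ranges quoted in this thread, and the ~25% GTX performance lead is just the ballpark figure mentioned above (the real margin varies by review), so treat this as an illustration rather than a measurement:

```python
# Back-of-the-envelope performance-per-watt comparison. Wattages are
# midpoints of the system-load ranges quoted in this thread (2900XT
# ~220-240 W, 8800GTX ~270-290 W); the ~25% GTX performance lead is an
# assumed ballpark figure, not a measured average.

def perf_per_watt(relative_perf: float, load_watts: float) -> float:
    """Relative performance score divided by load wattage."""
    return relative_perf / load_watts

xt_perf, gtx_perf = 1.00, 1.25    # GTX assumed ~25% faster on average
xt_watts, gtx_watts = 230, 280    # midpoints of the quoted load ranges

xt_ppw = perf_per_watt(xt_perf, xt_watts)
gtx_ppw = perf_per_watt(gtx_perf, gtx_watts)

print(f"2900XT : {xt_ppw:.4f} perf/W")
print(f"8800GTX: {gtx_ppw:.4f} perf/W")
print(f"GTX perf/W advantage: {gtx_ppw / xt_ppw - 1:.1%}")  # roughly 3%
```

In other words, if the power gap and the performance gap really are both around 25%, the two cards come out close to a wash on performance per watt.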
 
Check your sites again: the GTX uses less wattage under load than the XT. Read the single-card comparisons.
 
The point of the post wasn't supposed to be a direct analysis, but to point out that the 2900XT's power draw over the 8800's isn't "massive," as MrGuvernment put it earlier.
 

Depends whether that's total system load or just the card... the card is probably around half of the system draw :)

The 2900XT is the single most power-hungry card you can get, period. That doesn't mean it's not a good card, but if you only have a 450-watt PSU, or even 500 watts, you should probably factor in an extra 75-100 dollars for a new PSU with your 2900.

The 2900 is getting slightly better with each driver release, though it still doesn't compare to the 8800GTX... there are some games where it can come close, but often those are games that don't support AA (where it really shines).
 
To the OP's question: it depends on whether you use AA or not. Without AA (from what I've seen) it is usually around GTX levels, with low AA it's around GTS levels, and with high AA it loses to both. Unfortunately, it's hard to find comparisons that don't use AA.
 
That's not true... it's simply not near the GTX in very many games. BioShock (DX9 only) and MOHA come to mind.
With AA it's usually below GTS levels; without AA it's typically at or just above GTS levels.
 
Looks like the 2900XT still lags. In certain cases it gets the stuffing beaten out of it, and in other situations it fares well.
 
The XT trades blows in Call of Juarez, and gets fairly close to the GTX in UE3 titles, but its performance in the majority of games is still at sub-GTS/GTS levels.

As exclusively shader-based AA becomes more widespread, the XT is going to start consistently hovering around GTX performance levels with AA (typically -- there will always be exceptions), but we aren't there yet.

The XT is really just all over the map due to its complexity, and there are still significant driver issues holding the card back when it should otherwise be performing well.
 
I agree with your thinking. There is plenty of room for improvement.
 
Actually, drivers haven't really made any difference; performance has stayed about the same (with even some slower results).

TweakTown 7.9 Review said:
Final Thoughts

There we go - another month and a whole bunch of results. Unless you are planning on jumping into the Enemy Territory demo, it’s not exactly a driver we would recommend.

The Catalyst Control Center also seems to have dramatically slowed down and feels very sluggish and the performance results are well... nonexistent.

We hope AMD are hard at work as we come into the fourth quarter of the year as it’s going to be the holiday season in no time with many new games just around the corner.

Until next month Happy Gaming on the Catalyst 7.8 drivers unless you are playing ET:QW!
http://www.tweaktown.com/articles/1184/12/page_12_final_thoughts/index.html

I did a full comparison of 7.6-7.8:
Results of 7.6-7.8 said:
"High Quality" Change from 7.6 - 7.8
3.14 fps gain in HL2
5.1 fps gain in Prey

And that's with AA and AF, but what about without AA/AF?

1.27 LOSS in HL2 @ 1920x1200
1.2 LOSS in Prey @ 1920x1200
1.7 LOSS in min CoH @ 1920x1200
2.7 LOSS in avg CoH @ 1920x1200
http://www.hardforum.com/showthread.php?t=1215427

Note: I am not trying to be a "downer" or anything, I am only showing their results in an easy-to-read way. Also, I have a friend who uses the 2600XT for video stuff, so I like to keep up on all things ATI as well as NVIDIA :p
 
Here's a comparison between the 2900XT and GTS, GTX, and Ultra, starting with 3DM06 and HL2: Lost Coast. This was on older 7.5 drivers. http://www.tweaktown.com/articles/1122/9

I'd say even in this older test, the 2900XT acquits itself pretty well and a lot of the negative comments about it are either huge assumptions or simply bullshit. One thing some don't realize is that for the 2900XT to start showing its stuff you have to use resolutions of 1600x1200 or more, and at those res', the benefit of higher levels of AA is dubious. OTOH, for lower res' you might as well save some money and get smaller VRAM.
 
And which version of HL2 was it rendering?

http://endeavorquest.net:8880/2900xtvs8800gts/HL2.htm
http://www.rage3d.com/board/showthread.php?t=33891682

Also, performance for HL2 (or any other game they test) hasn't really changed from drivers 7.5 through 7.9.
 
Doesn't really matter, because the point is that the 2900XT does much better than some like to give it credit for.

What I'm saying is that people who might have previously written off the card based on a few early reports, and who are looking for a new GPU, should reconsider the 2900XT. The 8800s are far from the only game in town.
 
Actually it completely matters.

If a game is not being rendered properly, people need to know about it. Like I stated, nothing has changed in the newer reviews, as the drivers themselves haven't really changed anything (better or worse) performance-wise.
 
Actually you're trying to damn this card with faint praise or strawman arguments. You've been doing so constantly since the card hit. However, there is no basis in real-world use for anyone to dislike this card.

I think you should try to find another occupation, j.
 
Please point out where I lied or provided false information.

I said that if there are problems with rendering, reviewers need to take the time to find them and talk about them.

Hell, plenty of people have posted on these forums about problems with their 2900XT, ditto for the 8 series or ANY other card. You can't sweep them under the rug and pretend they don't exist, though.

Ask Blackstone how he enjoyed his 2900XT, or look at the stuff I posted above from an ATI forum (Rage3D) about problems with their 2900 cards with regard to rendering.

What about the BioShock rendering problems when it first came out, or AF not working correctly in quite a few games, which many reviewers missed?

Feel free to argue if you wish or personally attack me, but I'm just providing the facts.
 
Faint praise or strawmen don't necessarily mean 100% falsehood, but those tactics do tend to leave out important issues or highlight unimportant ones. You and I have been through this before many times, j, so digging up old posts from months past isn't needed.

Most if not all of the problems you've mentioned about the 2900XT no longer apply, so you should stop flogging the dead horse with a dead cat.
 
So you bringing up a review with 7.5 drivers to show performance, and me asking whether the test was bugged by providing posts from 2900XT users who had problems on older drivers, is beating a dead horse?

I see... it's OK for you to provide old information, but not for me. Gotcha.

Feel free to point out whatever strawman arguments you think I used, too.
 
Like I said, you're yammering on about issues that no longer exist. You can't seem to accept the fact that the 2900XT is a great card, probably because it makes you feel superior to think the 8800 is better. Well, it didn't last long, did it?
 
You just don't get it...

You pointed out benchmarks using 7.5 drivers. I pointed out flaws with older drivers that caused stuff to render incorrectly.

Now you are saying that I am the one living in the past? Heelllllllooo, you brought it up. Yes, those issues have been addressed, but that doesn't mean you can backdate the fixes to possibly broken tests.
 
I'll try one more time: it doesn't matter whether it was an "old" test or not, because the problems that may have existed with the card and/or the rendering have been fixed, so any newer results will only be equal or better. I posted that link because it used all four cards in the test and made it easy to compare them against each other.

You're trying to cloud the issue with non-issues (as usual), but the fact remains that the 2900XT can run with the GTX. You should accept it and move on.
 
One thing some don't realize is that for the 2900XT to start showing its stuff you have to use resolutions of 1600x1200 or more, and at those res', the benefit of higher levels of AA is dubious. OTOH, for lower res' you might as well save some money and get smaller VRAM.

Dubious? I want to know who is constantly perpetuating the myth that at high resolution you don't really need AA. You may need less AA, but speaking from experience, I still see the need for at least 4xAA even at 1920x1080. Granted, that isn't really that high of a resolution, but it's becoming commonplace. Seeing a review examining what happens at high resolution with AA is not only important to me, but to anyone that wants max eye-candy.

Have you gone back and read any of the old 2900XT reviews? IIRC, at least in one benchmark, AF was not being applied correctly. That's why old benchmarks matter. Even FiringSquad's latest review states that although the 2900XT is faster than the 8800GTX (+20%), it's doing so with 0 AA. That's an old review that will surely matter when ATI finally implements AA in BioShock.

MrGuvernment,
I think most of us are keeping an eye on what ATI is doing. So far, UE3 performance has been great. I don't know of many more games, besides UT3 and Gears, that are coming out soon to "turn the tide".
 
The 2900XT is NOT on par with the GTX once you enable AF and AA. Period. It will never be.
 
I would have bought a 2900XT, but at 4xAA and above it doesn't do well; that is why I got the GTX. The 2900XT is an admirable performer without AA, but at this point in time it just does not hold up.
 
Actually, with the shift to LCDs and bigger screens... AA is getting more important, not less. Our dot pitch over the past three years has gotten worse... not better. ;) The people who have 37" Westies need AA quite significantly.
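To put a rough number on that, here's a small Python sketch of how pixel pitch works out for a few panel sizes. The specific sizes and resolutions below are illustrative examples I picked, not figures from this thread:

```python
# Rough pixel-pitch arithmetic behind the "bigger LCDs need AA more"
# point. The panel sizes/resolutions below are illustrative examples.
import math

def pixel_pitch_mm(diagonal_inches: float, width_px: int, height_px: int) -> float:
    """Approximate pixel pitch in millimetres for a given panel."""
    diagonal_mm = diagonal_inches * 25.4
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_mm / diagonal_px

panels = [
    ("19in 1280x1024 LCD", 19, 1280, 1024),
    ("24in 1920x1200 LCD", 24, 1920, 1200),
    ("37in 1920x1080 panel", 37, 1920, 1080),
]

for name, size, w, h in panels:
    print(f"{name}: {pixel_pitch_mm(size, w, h):.3f} mm per pixel")

# Coarser pitch (bigger pixels) makes jaggies easier to see at a given
# viewing distance, which is why big panels benefit more from AA.
```

Same pixel count spread over a much bigger panel means bigger pixels, so the aliasing is more visible even though the resolution is nominally "high".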
 
Aztec, you gotta rely on something other than TweakTown for your info. You sound like the Tiger Direct review show guy when you talk about the 2900XT, lol. Every post you make is about how awesome the 2900XT is... it's a good card! We get it! Shut up already! It's still about on par with a 640MB 8800GTS, and priced accordingly, so take it easy and chill.
 