2900XT Ain't That Bad...

My system goes to 11.

My amplifier goes to 11.

And yes, it is a good reason for them to buy one. Oblivion is basically the most graphically demanding game out there.
Plus:

Crossfire is more efficient than SLI; this means that two ATI cards gain more of a % increase than two nVidia cards (rough numbers sketched after this list).
Crossfire can be done on Intel boards, not just those magma-spewing 680i boards.
It costs less than a 640MB GTS at $319.99.
It overclocks very nicely.

The bad:
Red PCB :o) more of a preference thing
More power draw
Hotter? I dunno about this one, just from observation.
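To put rough numbers on the Crossfire scaling point, here is a quick sketch; the scaling efficiencies and single-card frame rates below are invented for illustration, not measured figures:

    # Hypothetical dual-card scaling (all numbers invented for illustration).
    def dual_card_fps(single_card_fps, scaling_efficiency):
        # The second card contributes only a fraction of its full performance.
        return single_card_fps * (1 + scaling_efficiency)

    xt_single, gts_single = 45.0, 50.0     # assumed single-card averages in some game
    crossfire_eff, sli_eff = 0.80, 0.70    # assumed scaling efficiencies

    print(dual_card_fps(xt_single, crossfire_eff))  # 81.0 fps from two XTs
    print(dual_card_fps(gts_single, sli_eff))       # 85.0 fps from two GTS cards

Better scaling only closes the gap if the single-card numbers are reasonably close to begin with, which is really what the rest of this thread argues about.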
 
Is Oblivion the only game you play?

Every other game I play, the XT gets its ass handed to it by the GTS cards.
 
I also have an X2900XT and have had it for about 3 weeks now. The noise levels are good (it's quieter than my Freezer 7 Pro CPU cooler when it speeds up during games), temperatures are good, and the performance is fantastic; it totally whoops my friend's 7800 BFG OC SLI setup in DX9 games.

As for the 8800 series cards, it performs better than the sub-£240 cards (which is the price I paid for it), and it seems to be on par with the £280-region Nvidia lineup, aside from Lost Planet, which is an Nvidia-released game that ATI didn't get to see before launch.

I can't wait until I get another one in Crossfire for less than the price of an 8800 Ultra and blow its framerate scores away by up to 30-40%.

Previously I have always had Nvidia, and I am extremely impressed by the image quality of this card, a very noticeable difference between the two. I look forward to using the x2900's 1080p + surround sound HDMI output!

So, moral of the story: don't listen to all of Nvidia's claims (and their supporters). The ATI is a good card and excellent value for money as per usual. Hats off to you, ATI.

Regards,

Gilj

(This is not a dig at Nvidia; it is an informative post letting users know that the x2900XT is a good card with great value for money.)

You think two 2900XTs will beat an Ultra by 40%?
 
That's like a 5 FPS increase in one game, while the XT gets owned in every other game.

That isn't a very good argument for anyone to buy the XT.
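For what it's worth, whether "40%" and "5 FPS" are even describing the same thing depends entirely on the baseline frame rate, which neither post states. A quick sketch with assumed baselines:

    # What a 30-40% lead works out to at different assumed baseline frame rates.
    for ultra_fps in (15, 30, 60):            # hypothetical single-Ultra averages
        for lead in (0.30, 0.40):
            gain = ultra_fps * lead
            print(f"{ultra_fps} fps baseline, {lead:.0%} lead -> +{gain:.1f} fps")
    # At a 15 fps baseline, a 30-40% lead is only ~4.5-6 fps;
    # at a 60 fps baseline it would be 18-24 fps.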

Yeah, agreed. It's like a guy with a minivan saying "I HAVE MORE CUPHOLDERS!!" to a Ferrari owner.

Do you have one? No, you don't.

XT beats GTS in Oblivion.
There is quite a large amount of stupidity in this thread, and at $319.99, the XT is the more appealing choice.

If you aren't afraid to overclock and don't mind the extra heat/power draw, the XT is not bad.

I do have a GTS. Do you?

Is Oblivion the only game you play?

Every other game I play, the XT gets its ass handed to it by the GTS cards.

Haha, that made me chuckle. Pwnt.

Define "getting its ass handed to it."
5%? 10%?

For the huge cost increase of the XT, yes. Also, funny how Best Buy is sold out, if you catch my drift.
 

A few problems here.

The same scene is not being rendered in both.

Screen 1) The GTS is looking farther to the right; an extra tree is in this screen. The XT has a huge stone structure on the left. Just guessing, but rectangles are easier to render than trees. The XT is looking toward the ground more than the GTS.

Screen 2) The GTS has extra grass (check the bottom right). The XT also isn't displaying shadows from trees.

Screen 3) Once again, more trees on the GTS, as it's looking toward the forest more than toward the building on the right. And again, the XT is not displaying shadows from the trees.
 
A few problems here.

The same scene is not being rendered in both.

Screen 1) The GTS is looking farther to the right; an extra tree is in this screen. The XT has a huge stone structure on the left. Just guessing, but rectangles are easier to render than trees. The XT is looking toward the ground more than the GTS.

Screen 2) The GTS has extra grass (check the bottom right). The XT also isn't displaying shadows from trees.

Screen 3) Once again, more trees on the GTS, as it's looking toward the forest more than toward the building on the right. And again, the XT is not displaying shadows from the trees.


+1



So the XT is barely inching ahead with an unfair comparison AND lower settings.
 
A few problems here.

The same scene is not being rendered in both.

Screen 1) The GTS is looking farther to the right; an extra tree is in this screen. The XT has a huge stone structure on the left. Just guessing, but rectangles are easier to render than trees. The XT is looking toward the ground more than the GTS.

Screen 2) The GTS has extra grass (check the bottom right). The XT also isn't displaying shadows from trees.

Screen 3) Once again, more trees on the GTS, as it's looking toward the forest more than toward the building on the right. And again, the XT is not displaying shadows from the trees.

LOL@fanboys with rigged results.
 
Flawed. No overclocking results, slick. You have to overclock the R600 to reap the benefits, it seems.

Comparing stock to stock is useful. Comparing OC to OC is useful. Comparing an OC 2900 to a non-OC 8800 is stupid.

The 2900 doesn't become magical and lose all of its problems when you overclock it.
 
Comparing an OC 2900 to a non-OC 8800 is stupid.

Look at the results again. Those screenshots were taken with a massively overclocked GTS (mine, @ 660/2000) and an overclocked XT.

And yes, Xion X2's screenshots were slanted more to the right. Same save location. You can download the same saves we were using here.
 
Some people just won't accept that the 2900 sucks compared to the 8800, no matter how you spin it!

And some people are so thick-headed that they won't even look at the countless overclocking results that prove otherwise!


I'm done with this thread. I'll let you shills and fanboys bicker all you want.

 
My amplifier goes to 11.

And yes, it is a good reason for them to buy one. Oblivion is basically the most graphically demanding game out there.
Plus:

Crossfire is more efficient than SLI; this means that two ATI cards gain more of a % increase than two nVidia cards.
Crossfire can be done on Intel boards, not just those magma-spewing 680i boards.
It costs less than a 640MB GTS at $319.99.
It overclocks very nicely.

The bad:
Red PCB :o) more of a preference thing
More power draw
Hotter? I dunno about this one, just from observation.

LOL... you're going to compare the sale price of a card to its usual price? Best Buy will on occasion sell new cards at a good price. I remember buying a 6800GT for $350 ($50 less than everywhere else), and the price of a GT didn't go down for a long time. I haven't kept up with GTS 640 prices, but a few weeks ago there were offers approaching the $300 AR range (the best I could find right now was $329.00 AR, but I didn't look very hard). For that price, of course the 2900XT is a good buy.
 
Look at the results again. Those screenshots were taken with a massively overclocked GTS (mine, @ 660/2000) and an overclocked XT.

And yes, Xion X2's screenshots were slanted more to the right. Same save location. You can download the same saves we were using here.

Xion had to correct his initial results because of that. The XT still won, though. Like I said, if $320 were the starting retail price, I'd agree that it was a nice card.
 
A few problems here.

The same scene is not being rendered in both.

Screen 1) The GTS is looking farther to the right; an extra tree is in this screen. The XT has a huge stone structure on the left. Just guessing, but rectangles are easier to render than trees. The XT is looking toward the ground more than the GTS.

Screen 2) The GTS has extra grass (check the bottom right). The XT also isn't displaying shadows from trees.

Screen 3) Once again, more trees on the GTS, as it's looking toward the forest more than toward the building on the right. And again, the XT is not displaying shadows from the trees.

I am not seeing what you are seeing. Have another look here
 
I am not seeing what you are seeing. Have another look here

Then you are flipping blind.

Here, I merged the images into one and circled the obvious differences.

[merged 8800GTS / 2900XT screenshot comparison with the differences circled]


And you can't compare people with different systems directly and say it's only the graphics card that makes a difference. Memory, applications running in the background, the CPU, and everything else can alter scores a lot, which is why reviewers use the same system and just swap out the card.
 
Then you are flipping blind.

Here, I merged the images into one and circled the obvious differences.

And you can't compare people with different systems directly and say it's only the graphics card that makes a difference. Memory, applications running in the background, the CPU, and everything else can alter scores a lot, which is why reviewers use the same system and just swap out the card.

I agree, that is not a fair comparison.
 
jmackay
It is a fair comparison when there is more than one set of screenshots to look at. :eek: :p :)
Your crying about one screenshot doesn't help your case; deal with it. ;)
PS: Nice try, I see what you did there, making it look like there is only one set of photos in the thread. E for effort. :D

Edit: What evidence have you presented that shows a 9-10 FPS difference in an 8800GTS's frame rate when those screenshots are reproduced? What was that? You don't have it... well, then it's kind of hard to cry about the screenshots being unfair. If you have no actual proof that those two screenshots create a 9-10 FPS difference on an 8800GTS, then the "it's not fair" argument is just blowing in the wind. :D Now don't get me wrong, it's ideal to have both screenshots at the exact same spot, but there are other screenshots in that thread doing just that. ;)

LOL
 
Too bad yet another 2900 thread turns into an Nvidia marketing fest. I dunno why the true fanboys always feel the urge to crawl over to the other side of the fence. I honestly think that all this thread needs are comments like:

@OP

Glad you enjoy the card!

PS: Alternative opinions I understand, but to go on bashing the OP for his card choice/opinion for pages is not in good taste. Sure, the 8800 can be better for the majority of users out there, but considering the OP lives outside of the US (there are countries outside of the US?!?) and got a good deal in his opinion, who are we to say he's dumb as a rock.
 
Edit: What evidence have you presented that shows a 9-10 FPS difference in an 8800GTS's frame rate when those screenshots are reproduced? What was that?


The fact that there is ~33% less to render on the 8800GTS screenshot?

Less to render = higher FPS.
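To sketch the "less to render = higher FPS" intuition with numbers (the per-frame costs below are assumptions for illustration, not measurements): FPS is just the inverse of frame time, so trimming geometry out of a scene shortens the frame and inflates the number.

    # Toy frame-time model: frame time grows with the amount of geometry drawn.
    base_ms = 10.0        # assumed fixed per-frame cost in milliseconds
    per_unit_ms = 0.20    # assumed cost per unit of scene geometry in milliseconds

    def fps(geometry_units):
        frame_ms = base_ms + per_unit_ms * geometry_units
        return 1000.0 / frame_ms

    print(fps(100))   # full scene: ~33 fps
    print(fps(67))    # ~33% of the geometry trimmed: ~43 fps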
 
There are two things that sadden me about this thread:

1 - Nvidia fanboys bashing the card when it clearly sits comfortably between the GTS and GTX with a great price, while still having a lot of performance to unlock.

2 - ATI fans can't even "defend" the card properly and keep using stupid arguments; the card is actually good, but you're doing a horrible job (don't be a lawyer please :p ).

So, I challenge any Nvidia fanboy to point me to a RECENT review of the HD 2900XT and tell me it isn't the superior card compared to the GTS (don't bother posting launch-day reviews, as those are already outdated).
 
I fail to see how that last screenshot can be an accurate comparison when the GTS screenshot includes more of the tree canopies (the MOST demanding visual element in any game, without a doubt) and the 2900XT's screenshot includes more "solid objects" (much easier to render than trees). I have played this game on both cards (well, GTX), and in both cases framerates radically change as you pan around even a little bit.

Further, the image quality is clearly better on the GTS by an order of magnitude.

I tried running this game on the 2900XT with 4xAA. The bottom line is that you can't run 4xAA and transparency AA; it simply brings it to its knees. IT SUCKS.

There is only one meaningful way to test a graphics card, and that is to look at maximum playable settings, just like [H] did. The bottom line is that the 2900XT does well in some tests with little or no AA, but whatever performance lead it has is totally MEANINGLESS because the 8800 series cards do better with higher levels of AA.

Hypothetically, just hypothetically: if you test a game with no AA and the 2900XT wins by 30 frames per second, and then you test the same game and find that the most the 2900XT can do is 4xAA before the game is unplayable, but the GTS can playably do 8xAA with supersampling or 16xAA with super- or multisampling, which is the better card? Raw frame rates mean nothing, nothing at all, because they don't translate into better image quality.
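A minimal sketch of that "maximum playable settings" logic (the FPS figures are invented and the 30 FPS floor is an assumption): you keep the highest AA level that still clears a playability floor, instead of comparing raw averages.

    # Pick the highest AA level whose measured FPS stays above a playability floor.
    PLAYABLE_FPS = 30

    # Invented benchmark results, ordered from lowest to highest AA level.
    results_2900xt  = {"0xAA": 75, "2xAA": 55, "4xAA": 28}
    results_8800gts = {"0xAA": 65, "2xAA": 52, "4xAA": 40, "8xAA": 31}

    def max_playable(results):
        best = None
        for aa, fps in results.items():   # dicts keep insertion order (ascending AA)
            if fps >= PLAYABLE_FPS:
                best = (aa, fps)
        return best

    print(max_playable(results_2900xt))    # ('2xAA', 55): 4xAA dips below the floor
    print(max_playable(results_8800gts))   # ('8xAA', 31): still playable at higher AA

On that metric, a card can post the bigger no-AA number and still come out behind.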

In World of Warcraft I am running 1680x1050 with 16xQ supersampling AA on the GTX. What can the 2900XT do in that game?

By the way, I am a lawyer. :)
 
2 - ATI fans can't even "defend" the card properly and keep using stupid arguments; the card is actually good, but you're doing a horrible job (don't be a lawyer please :p ).

This is the problem I have: why do I need to explain myself to anyone or even justify my purchase?
 
This is the problem I have: why do I need to explain myself to anyone or even justify my purchase?

You really shouldn't have to, but there are a lot of people on here out for the most performance for their dollar, and I'm one of them.

Although, I've taken a liking to ATI lately because they've been releasing a new set of drivers every month without fail. Now, whether those drivers actually did much or not doesn't matter; at least they're releasing them.

It looks like, at the moment, the 8800GTS is indeed the superior card (if not by much). However, if you did indeed get the 2900XT, it's not like the world is over for you.

Then there are the preliminary DX10 benchmarks, which show the 2900XT right up there with the 8800GTS for the most part. Crysis will tell, though.

An enthusiast doesn't care about power or noise, so those don't matter, except when you're talking performance per watt, which in the enthusiast's eyes shouldn't matter either; it's just whichever has the supreme performance.
 
The fact that there is ~33% less to render on the 8800GTS screenshot?

Less to render = higher FPS.

Based on what? Where is the information (data, etc.) that shows the 8800GTS losing 9-10 FPS when those screenshots are reproduced on that video card? Without the actual information, there is no proof the 8800GTS lost or gained anything other than what was posted in the screenshots. Saying that one screenshot is rendered slightly differently from another doesn't mean there is a 9-10 FPS loss. Also, don't forget there are other screenshots taken in that link that show similar scenes.

:)
 
So basically we can sum up this whole thread in one sentence:

When compared to older ATI cards, the new HD 2900XT is not that bad of a card.

Hallelujah, he's seen the light! Yes, Sir, that's what I was saying... as an encouragement for other 2900XT owners to join me and talk about the card. Yes, even the bad things and what (if anything) improves from driver to driver, settings tried, and such.
 
...
So, I challenge any Nvidia fanboy to point me to a RECENT review of the HD 2900XT and tell me it isn't the superior card compared to the GTS (don't bother posting launch-day reviews, as those are already outdated).

Nice point, Shadow. It seems to be a trend that more recent reviews show the 2900XT in a stronger position than the launch-day ones did.
 
http://www.ocforums.com/showthread.php?t=513445

The 1GB GDDR4 cards beat everything except heavily overclocked Ultras.

Sorry shills, but I only go by overclocked results with heavy AA and AF at large resolutions. Can't deny concrete evidence.

Of course not. No one ever denied that the HD 2900 XT is amazing @ the 3DMark "game". Unfortunately, in actual games, it doesn't do so well. 1 GB of GDDR4 is not going to fix the problems the R600 has now. Also, these 1 GB cards will have their cores cherry-picked, which means they'll be quite rare and expensive, or AMD/ATI will just have to reduce their profit margin a lot just to stay competitive.
 
What? The 2900XTs are all selling for at least $400. It had better be a good amount faster for being almost $100 more than an 8800GTS 640, but it's not.

Paid $281.59 plus tax for mine; haven't found a GTS that cheap yet. Best Buy sale and Father's Day 12% off coupons FTW.

Figured I'd give it a shot since it was fairly cheap. My X1950XTXs run great in Crossfire in Vista, but this card should outperform them. I'll probably actually stick it in my Precision 690, though, since it's running Vista x64 and the two FX 4500s in it don't have SLI drivers yet. I'm not doing anything at the moment that requires the Quadros, so I'll give the 2900 a shot paired with two 5160s.
 
Holy crap, you only paid $281? Where? Link now, please!

(Wholesaler)

Based on what? Where is the information (data, etc.) that shows the 8800GTS losing 9-10 FPS when those screenshots are reproduced on that video card? Without the actual information, there is no proof the 8800GTS lost or gained anything other than what was posted in the screenshots. Saying that one screenshot is rendered slightly differently from another doesn't mean there is a 9-10 FPS loss. Also, don't forget there are other screenshots taken in that link that show similar scenes.

Well, for one, it just is. There are more objects in trees to be rendered, and therefore more surfaces. Not only that, but trees also have rounded parts (i.e., branches), which can be difficult to render.

A giant block with three visible sides versus a canopy with possibly hundreds of shapes to be rendered.

Who do you think is going to have to work harder?

(You don't even have to answer that one. It should be painfully obvious.)
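To give the block-versus-canopy point a rough sense of scale (the tree figure is a ballpark assumption, not taken from Oblivion's actual assets):

    # Very rough triangle budgets; the tree count is an assumed ballpark figure.
    cube_triangles = 12      # 6 faces x 2 triangles each
    tree_triangles = 4000    # assumed mid-detail tree model
    extra_trees = 3          # assumed extra trees visible in one screenshot

    print(extra_trees * tree_triangles, "extra triangles vs", cube_triangles, "for a plain block")
    # 12000 extra triangles vs 12 for a plain block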
 