ATI Radeon X1900 Series Evaluation Discussion Thread

RoffleCopter

[H]ard|Gawd
Joined
Oct 8, 2005
Messages
1,910
Well, after reading this article I am very impressed with the X1900 Series.

I'm quite surprised it was able to do HDR and AA on SS2 without that bug. I was unaware ATi fixed the problem.

I thought the article was well made, so kudos to Brent for another good evaluation.

FEAR 1600x1200 at max settings, that's VERY impressive.

Thoughts?
 
I was a bit lost when I read "Much like F.E.A.R. and Call of Duty 2, we received a performance increase with CrossFire." before I had seen the COD2 results :D


But very nice evaluation.
 
I find it pretty amazing that we've finally reached the stage where our hardware is beyond our software in the gaming industry.

I remember a short while ago, AA and AF were exotic options on their own. Running at 16x12 was also quite an achievement. Just recently it seemed like HDR was a luxury for any card. Now we're running all of these simultaneously. All that's left is to tackle soft shadows!

Anyway, it's great to see this action between nVidia and ATi, with these constant nippings at each other's heels. It really pushes the tech forward fast.
 
Kudos to Brent and Kyle for a great review. I have to admit I only play CS:S, and that already runs great. If I were to sell my current X1800 XT to upgrade, I'm still wondering whether that would be a worthwhile investment. I can't think of another game that I would play at high res with all the eye candy. Anyway, I hope this generation is supported with drivers that consistently squeeze more frames out of games with every revision. :cool:
 
No o/c results... are there no s/w updates out yet? This is a [H]ard site after all!
 
mmynett said:
No o/c results... are there no s/w updates out yet? This is a [H]ard site after all!

I believe they O/C retail cards to reflect the kind of experience the consumer would get when overclocking.
 
On the Monarch site the X1900 XT is $550 and the X1900 XTX is $700. The review doesn't seem to justify this price, since the XTX seemed to have only a slight edge over the XT. A simple OC of your XT would save you $150. I don't see any reason for buying the XTX. I'm also not sure I like where this naming scheme is going. xx2000xtxtxt :(
 
Good review.
And Lord Almighty, it's good to see cards available on launch day.

Maybe I've had my head in a box the last few times I've read a video card article, but this time I really appreciated the "Gameplay advantages" part of your performance tables. It's a good way to highlight where one card is superior to another.

Also, Brent, save yourself the trouble of benching Sims 2. I realize this is a very popular game, but I thought the whole point of that game's graphics is that it can be run on a huge range of hardware. Maybe a more relevant game engine like Splinter Cell: CT could take its place?


 
It has been my opinion, and the opinion of many, that [H]ard, while never publicly stating so, is an Nvidia cheerleader.

As far as the X1900 is concerned, it's a complete 180-degree turn from last year. The 512MB GT is nowhere to be found, and the X1900 is cheaper, faster, and available! Great job ATI! I can't wait to see what these two have in store for us in the next generation. :D
 
ThirtySixBelow said:
On the Monarch site the X1900 XT is $550 and the X1900 XTX is $700. The review doesn't seem to justify this price, since the XTX seemed to have only a slight edge over the XT. A simple OC of your XT would save you $150. I don't see any reason for buying the XTX. I'm also not sure I like where this naming scheme is going. xx2000xtxtxt :(

Newegg has the XTX for $599

jebo_4jc said:
Also, Brent, save yourself the trouble of benching Sims 2. I realize this is a very popular game, but I thought the whole point of that game's graphics is that it can be run on a huge range of hardware. Maybe a more relevant game engine like Splinter Cell: CT could take its place?

I agree, since Sims 2 is heavily dependent on CPU power rather than GPU power.
 
I liked the review a lot. I had wished for CS:S (HL2) scores, but I guess I will find them somewhere else.

I do have one little piece of beef with the review's conclusion:

Editorializing a little bit here, I would like to see the Anisotropic slider in the control panel and in games disappear. There really should be only 2 options regarding Anisotropic filtering: There should be “Off” and there should be “On.” When you turn it on, it should operate in every game at 16X with High Quality AF. There is no reason not to have AF enabled in today’s games with today’s video cards. The only quality setting we should really be worrying about these days is anti-aliasing. I hope that one day we see AF as a given where it is always on, at the highest level, at the best quality.

I understand Kyle or Brent's point about HQ AF being easily attainable "in today’s games with today’s video cards". Well, the problem the author seems to have neglected is that not everybody in the world who wants to play FEAR will have $700 to dish out for a new GFX card. I for one am not willing to pay that much.
Given that, I understand that I would always have the option to turn AF "off", but I think I should be given the option of a less demanding AF setting so that I can maximize my utility from the game.

Apart from that little part, I loved the review, though I had wished that the X1900 XTX would demolish the 7800 GTX 512, considering that it requires quite a hefty investment.
 
drizzt81 said:
Apart from that little part, I loved the review, though I had wished that the X1900 XTX would demolish the 7800 GTX 512, considering that it requires quite a hefty investment.
Keep in mind that the 512GTX requires "quite a hefty investment" also, and can't be purchased at this point, from what I hear.
 
drizzt81 said:
Apart from that little part, I loved the review, though I had wished that the X1900 XTX would demolish the 7800 GTX 512, considering that it requires quite a hefty investment.

The X1900 demolishes the 7800 in shader-intensive games such as FEAR, and games like FEAR are what the X1900 series is meant to shine in.
 
Man, I don't know what they did, but they got the GTX 256 to pull some INSANE scores in EQII! :O

http://hardocp.com/images/articles/1137898263zPDySkv4SS_10_2.gif

(Kyle - Fixed this missing decimal place.)

Interesting read... although it confuses me that at one point in EQII you said CF increased performance, but the settings stay the same and performance drops :S

http://hardocp.com/image.html?image=MTEzNzg5ODI2M3pQRHlTa3Y0U1NfMTBfNF9sLmdpZg==

Also, in one of the COD2 screens, for the X1900 XTX I believe, the FRAPS frame rate is showing 55, but none of the X1900 graphs ever hit 55 :S

http://hardocp.com/image.html?image=MTEzNzg5ODI2M3pQRHlTa3Y0U1NfOF82X2wuanBn

Anyway, keep 'em coming :p
 
I'm a bit puzzled that apparently nobody has the specs of the new All in Wonder x1900 beyond it having 256 megabytes of RAM.

On ATI's site, their gallery shots of the card show that it's still using the same awful single-slot cooling solution of the x1800XL. Considering that many people seem to say that x1800XL's can get absolutely roasting on the stock cooler, I've got to wonder how hot the more powerful x1900 AIW might get using the same stock cooler...

EDIT: Well, looks like Anandtech's got the specs of the x1900 AIW. Looks pretty solid to me.
 
krizzle said:
I find it pretty amazing that we've finally reached the stage where our hardware is beyond our software in the gaming industry.
Everquest II above the Balanced setting, anyone? :D

I still can't get over the fact that even the newest vid cards can't do much in that game above the Balanced graphical quality settings at 1600x1200. :eek:
 
electech98 said:
Everquest II above the Balanced setting, anyone? :D

I still can't get over the fact that even the newest vid cards can't do much in that game above the Balanced graphical quality settings. :eek:

Could be just bad coding.

Now we need to work toward 8x AA with a single card!
 
The card is rocking no matter what anyone says! :p I want one for my COD2!
 
drizzt81 said:
I do have one little piece of beef with the review's conclusion:

I understand Kyle or Brent's point about HQ AF being easily attainable "in today’s games with today’s video cards". Well, the problem the author seems to have neglected is that not everybody in the world who wants to play FEAR will have $700 to dish out for a new GFX card. I for one am not willing to pay that much.
Given that, I understand that I would always have the option to turn AF "off", but I think I should be given the option of a less demanding AF setting so that I can maximize my utility from the game.

Apart from that little part, I loved the review, though I had wished that the X1900 XTX would demolish the 7800 GTX 512, considering that it requires quite a hefty investment.

I meant it for all cards, value and up, AF should be a given. The only options that should be available are AA levels, in my opinion.
 
Great review! Glad to see both the 7800 GTX 256MB and 7800 GTX 512MB compared, HDR+AA tested in Serious Sam 2, and a few mentions of "hey, these are totally new drivers" when a couple of problems came up (instead of just saying this feature doesn't work with ATI cards and they are doomed for life). Lots of info. I love the new layout you started with the last review, stating which settings are higher after each test.

Next time you run a CrossFire setup it would be nice to see some 8xAA and 14xAA, even if you have to drop the resolution a notch. Just to see how it runs. Still do your highest playable settings.

Some overclocking results would have been nice, but these were beta drivers.

No heat or power info? Well, there is a ton of benching in this review. I guess time was a factor.

Some CS:S or HL2 numbers would have been nice, especially to show off CrossFire's 8xAA and 14xAA modes. I mean, this is an ATi review. You probably want to show off one of their strong suits.
 
RoffleCopter said:
Well, after reading this article I am very impressed with the X1900 Series.

I'm quite surprised it was able to do HDR and AA on SS2 without that bug. I was unaware ATi fixed the problem.

I thought the article was well made, so kudos to Brent for another good evaluation.

FEAR 1600x1200 at max settings, that's VERY impressive.

Thoughts?

A new patch was released for SS2 a while ago; it fixed the problem.
 
Druneau said:
I was a bit lost when I read "Much like F.E.A.R. and Call of Duty 2, we received a performance increase with CrossFire." before I had seen the COD2 results :D


But very nice evaluation.

lol, yeah, actually when I was writing the review that's the order I was writing it in; looks like I reversed the order a bit when I uploaded it though, whoops, a small problem hehe
 
krizzle said:
I find it pretty amazing that we've finally reached the stage where our hardware is beyond our software in the gaming industry.

I remember a short while ago, AA and AF were exotic options on their own. Running at 16x12 was also quite an achievement. Just recently it seemed like HDR was a luxury for any card. Now we're running all of these simultaneously. All that's left is to tackle soft shadows!

Anyway, it's great to see this action between nVidia and ATi, with these constant nippings at each other's heels. It really pushes the tech forward fast.

Which is exactly the point, we are so very content limited with these cards.
 
mmynett said:
No o/c results... are there no s/w updates out yet? This is a [H]ard site after all!

We like to wait for retail products to do OC'ing on; there's no point in overclocking a reference card, as it probably won't represent what you get in retail in terms of overclocking, and as you know, even then it's the luck of the draw.
 
While I was very impressed by the results in the F.E.A.R. benchmarks, I begin to wonder. Once you've hit 1600x1200 and 4xAA and 4xAF, what's the point of going further? When I look at the IQ comparison, I can see differences, but they're very minor from level to level of AA or AF. IMO, at this point, higher polygon count and better animation should get more attention than the next AA or AF implementation. Think long grass blowing in the breeze, or maybe (some developer please do this!) a decent animation of a person walking. Make the feet stick to the floor--no sliding. Granted, this puts more weight on the developer than on nVidia or ATi, but still...

One of the things that has turned me off to the X000 and X1000 series is the two-slot cooling. Perhaps someone can enlighten me on this. Do the ATi cards of the current/last generation generate that much more heat that they require a two-slot cooler, or is it just ATi's intention to exhaust the air immediately instead of putting it back into the case (like recent nVidia offerings)?

Alright. Enough griping. It's nice to see cards out now that can handle (just about) anything you can throw at them. Brent, a while ago we had a poll/survey on widescreen gaming. I realize that this review was probably in the pipeline already when you made that poll, but are you planning on adding widescreen gaming performance to the GPU reviews? If yes, when?

 
electech98 said:
Everquest II above the Balanced setting, anyone? :D

I still can't get over the fact that even the newest vid cards can't do much in that game above the Balanced graphical quality settings at 1600x1200. :eek:

Amazing, huh. I tried running the XTX at 1600x1200, with no AF and with 16X HQ AF, at "High Quality" and I saw frames in the 30s and lower. I guess it'll be the next-gen cards that can do it, maybe. This game was built with the future in mind for sure; it should scale well with new cards.
 
Mohonri said:
While I was very impressed by the results in the F.E.A.R. benchmarks, I begin to wonder. Once you've hit 1600x1200 and 4xAA and 4xAF, what's the point of going further? When I look at the IQ comparison, I can see differences, but they're very minor from level to level of AA or AF. IMO, at this point, higher polygon count and better animation should get more attention than the next AA or AF implementation. Think long grass blowing in the breeze, or maybe (some developer please do this!) a decent animation of a person walking. Make the feet stick to the floor--no sliding. Granted, this puts more weight on the developer than on nVidia or ATi, but still...

One of the things that has turned me off to the X000 and X1000 series is the two-slot cooling. Perhaps someone can enlighten me on this. Do the ATi cards of the current/last generation generate that much more heat that they require a two-slot cooler, or is it just ATi's intention to exhaust the air immediately instead of putting it back into the case (like recent nVidia offerings)?

Alright. Enough griping. It's nice to see cards out now that can handle (just about) anything you can throw at them. Brent, a while ago we had a poll/survey on widescreen gaming. I realize that this review was probably in the pipeline already when you made that poll, but are you planning on adding widescreen gaming performance to the GPU reviews? If yes, when?


Yes, when I get a 2405FPW, need to find a good deal on one (if anyone finds one PM me). I'm going to be starting work on another project where it won't be used, but after that, look for it to be used.
 
Xeero said:
also, why was there no benchmark using Splinter Cell Chaos Theory?

One of the criteria we use to decide what games to evaluate with is whether or not they are still important to gamers in terms of what is selling. Here is a list of what sold best in 2005.

This list is no end-all, be-all, as we take into account engines used and technology as well as genre.

Overall, we dropped Splinter Cell a while ago as it seemed to move from game to "tech demo." Now of course you can argue the game's validity all day and be right on point; we just thought other games are more important to a wider group of HardOCP readers.
 
Great review/evaluation! Since I just bought a Westinghouse 27" LCD monitor, I figured I would get a new graphics card too, and I was curious as to what exactly is different between the XT and XTX versions that justifies the $150 price hike?
 
4% core overclock, 7% RAM overclock. Not worth it IMO. The factory OC made an IQ/playability difference in one scenario in the [H] preview
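For context, those percentages fall straight out of the two cards' clock speeds. A quick sketch of the arithmetic (the 625/650 MHz core and 1450/1550 MHz effective memory clocks are assumed from the launch specs, not taken from this thread):

```python
# Rough arithmetic behind the ~4% core / ~7% RAM figures.
# Clock speeds are assumed launch specs (X1900 XT vs. XTX),
# not numbers quoted in this thread.
def pct_gain(base_mhz: float, higher_mhz: float) -> float:
    """Percentage increase going from the XT clock to the XTX clock."""
    return (higher_mhz - base_mhz) / base_mhz * 100

core = pct_gain(625, 650)    # XT 625 MHz core -> XTX 650 MHz
mem = pct_gain(1450, 1550)   # XT 1450 MHz -> XTX 1550 MHz effective

print(f"core: +{core:.1f}%, memory: +{mem:.1f}%")
```

That works out to +4.0% core and roughly +6.9% memory, which is where the 4%/7% figures come from.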
 
DougLite said:
4% core overclock, 7% RAM overclock. Not worth it IMO. The factory OC made an IQ/playability difference in one scenario in the [H] preview

Yeah, when they showed us these cards, the XTX honestly made no sense to me in terms of pricing. Paying ATI $100 to OC the card for you, with a warranty, is how I see it.
 
Brent_Justice said:
Amazing, huh. I tried running the XTX at 1600x1200, with no AF and with 16X HQ AF, at "High Quality" and I saw frames in the 30s and lower. I guess it'll be the next-gen cards that can do it, maybe. This game was built with the future in mind for sure; it should scale well with new cards.
Do you know any more detail about why this game just up and chokes even the newest cards to death above the Balanced setting? Does the game just have billions of polygons or 15 thousand layers of shaders when the higher graphics options are chosen? Or is it, as someone else suggested, maybe bloated or inefficient coding?

Regardless, I can't wait to see what kind of card will one day be able to run EQII at Extreme quality at 1600x1200!
:cool:
 
Nice review, but I must admit that I am finding all these different settings (adaptive, etc.) confusing. Now if someone would release a quiet version, I will upgrade.
 
ClearM4 said:
I read the X1900 XTX is shipping with 1.1ns RAM right now. It should overclock well. I assume the X1900 XT is shipping with lower-rated RAM. The $100 price difference does seem odd.
Isn't 1.1ns GDDR3 rated for ~1.8GHz, while 1.2ns is rated for around 1.65GHz? Or is my math way off?
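The math checks out: a memory chip's ns rating is its minimum clock period, so for double-data-rate memory the rated effective speed is 2 divided by the period. A minimal sketch of that conversion:

```python
# Convert a memory chip's ns rating (minimum clock period) into its
# rated effective data rate in GHz. DDR transfers twice per clock.
def rated_ddr_ghz(period_ns: float) -> float:
    clock_ghz = 1.0 / period_ns  # e.g. a 1.0 ns period -> 1 GHz clock
    return 2.0 * clock_ghz       # double data rate doubles it

print(f"1.1 ns -> ~{rated_ddr_ghz(1.1):.2f} GHz effective")
print(f"1.2 ns -> ~{rated_ddr_ghz(1.2):.2f} GHz effective")
```

That gives roughly 1.82 GHz for 1.1ns parts and 1.67 GHz for 1.2ns parts, in line with the ~1.8/1.65 figures.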
 
This is a question that only Kyle or Brent could answer. Putting the slight differences in image quality at extremely high resolutions aside, is there a noticeable difference at all between the two cards? Can you actually see or feel the X1900 performing better?

As with the XT PE and the Ultra of last generation, in a blind taste test no gamer would likely be able to tell the difference. This raises the question of who cares whether the 7800 512MB or the X1900 XTX has even a 20fps advantage. If you can't see or feel the difference, there is no difference!
 