ATI Radeon X1950 XTX / CrossFire Evaluation

Great read.

I'm impressed, this new card is POWERFUL. But I'd still wait for Quad SLI drivers to mature a little before it's set in stone that two X1950s in CF beat two GX2s in Quad.
 
Tigerblade said:
This kinda statement always bewilders me. Do you frequently stare at the rear of your case admiring it? I see no other reason why you'd have a problem with it......

WHY do people have this obsessive problem with the dongle configuration? So much so that they will not buy a product because of it? I always thought price/performance was the main issue with video cards.....maybe I was wrong :confused:
Well, maybe I don't like them from an aesthetic point of view (hurray for the SLI bridge) and maybe I don't like using a separate mastercard for my setup. Not only that, but I think it's easier to sell two standard cards than a standard and mastercard once next-gen cards are around.
 
I enjoyed the review very much. However on this page:
http://enthusiast.hardocp.com/article.html?art=MTE0NCwyLCxoZW50aHVzaWFzdA==

You were listing the ways in which the X1950 XTX differs from the X1900 XTX, and you left out
one fact: the X1950 XTX is HDCP compliant. A lot of people don't think that HDCP is a big
deal yet, and that may be true, but I think it deserves a mention because it is a new
feature in the X19__ line of video cards.

I have read 6 X1950 XTX reviews today and the only site that even used the word HDCP was
DriverHeaven.

HDCP is mentioned in the 8th paragraph of the ATI Press Release.

I just wanted to bring that to your attention, and I fully understand the pressure you guys
are under to get your reviews out on time.
 
If anything, the X1950s will put extra pressure on nV to get their Quad SLI drivers up to snuff in more games.

Great review.
 
Suddenly Quad-SLI looks like just a gimmick.

You have to pay twice the price (just for the video cards) to get close to X1950 XTX CrossFire performance.
 
I hope this means the X1900 Crossfire cards will come down in price.
 
roflcopter said:
Not only that, but I think it's easier to sell two standard cards than a standard and mastercard once next-gen cards are around.

There is a very clear statement in the HardOCP review about this one, which effectively refutes that point.

edit:

OK, I must be going senile or something, since I cannot find the line in the [H] review, so I'll quote Anandtech instead:

"Before we close, one reminder to people who really want the X1950 XTX: don't buy it. Pick up the X1950 CrossFire instead. For the same price and performance you get a much more versatile solution. If you really need both DVI outputs, the CrossFire dongle supports that as well, so all you're doing is adding a small amount of cable clutter."
 
Seems like a nice card, but I'll probably wait for the next gen. Kyle or Brent, any chance you boys will do a run with the X1950 CrossFire on a C2D X6800 just for shits and giggles??? Would love to see if the new cards let the C2D stretch its legs a bit more @ higher res.
 
Only reason I support the master card is for its ability to keep Super AA performance high.
 
I enjoyed this article, but I'm interested...you made mention of the upgrades to the X1600 and X1300 series, and the price points are amazing, but will they come in AGP?

Before all of you slam me, I still have clients who haven't made the leap to PCI Express, so it's a viable question when we consider that these are 'budget' cards; if the client's cost has to include a new motherboard, the budget starts going out the window.
 
I thought all the X1K-series cards were "HDCP compliant," but none of the AIB partners actually built in support for it. The chips themselves have always had the capability; they just needed to be licensed and activated.

Edit: Being a little more specific.
 
solobaricsrock said:
I want what he's smoking...... :D :cool:

Coolman, re-read the review and you'll see that the XTX only wins 2 or 3 of the framerate scores (2 of the 3 in CrossFire, at that).......Where are you basing this from???? I'm not flaming, as I'm impressed, but the 7950, while not able to do AA+HDR in Oblivion (which I don't play anyways), still has an edge.



Look at the details: sure, the ATI card may be behind in max FPS, but the ATI card also has AA and HDR on when the NVIDIA card does not, or simply cannot. Thus the ATI card is more powerful, and that is a single-GPU card beating out a dual-GPU card...

It is very impressive that they took an old core and just added new RAM. I do hope NVIDIA will add some GDDR4 to their cards now to see what it will do, especially in the case of the X2 cards.....
 
Great review. Good to see ATI address the noise issue from last gen and give us a better value!

I agree it's great to compare the two fastest from each company.

I still would caution that there are some cases where it takes some time or work for a "single-card SLI" setup to run at its highest potential versus a true single-GPU card. What I meant is that a single-GPU card doesn't require users to create a profile for a brand-new game, or a really old one, when that game isn't covered by a current profile, which is not true for a card using SLI tech. Not saying it's a lot of extra work, but it could be a slight difference. I know, splitting hairs....
 
roflcopter said:
Not only that, but I think it's easier to sell two standard cards than a standard and mastercard once next-gen cards are around.

I agree with that from a price standpoint, but from what I've read the x1950 mastercard is going to retail at the same price as the slave. Time will tell if this is true, but at least it's a big step in the right direction.
 
D4hPr0 said:
Seems like a nice card, but Ill probably wait for the next gen. Kyle or Brent, any chance you boys will do a run with the X1950 Crossfire on a C2D X6800 just for shits and giggles??? Would love to see if the new cards let the C2D stretch its legs a bit more @ higher res.

Yes, we will move to that soon. Dont have enough processors to have three of us doing motherboard testing and video card testing yet. But yes, it will be upgraded. Then again, we will likely be keeping AMD as the ATI CrossFire basis in the future. We will see what happens.
 
Anarchist4000 said:
I thought all the X1K-series cards were "compliant," but none of the AIB partners actually built in support for it. The chips themselves have always had the capability; they just needed to be licensed and activated.

I was guessing that the x1900 might be pumping out too much data for the AGP bus to handle, so that would be a reason why not to go AGP with it. Either that, or the board manufacturers are trying to eliminate the AGP board process because the money's just not there anymore.
 
Lothar the Lotharian said:
I was guessing that the x1900 might be pumping out too much data for the AGP bus to handle, so that would be a reason why not to go AGP with it. Either that, or the board manufacturers are trying to eliminate the AGP board process because the money's just not there anymore.
Both. The 7800 GT vs. the 7800 GS was no contest, and AGP lost out. Unfortunately, I still know a good number of people using AGP. Actually, out of all my friends, I am the only one using PCI-E.
 
sam0t said:
There is a very clear statement in the HardOCP review about this one, which effectively refutes that point.

edit:

OK, I must be going senile or something, since I cannot find the line in the [H] review, so I'll quote Anandtech instead:

"Before we close, one reminder to people who really want the X1950 XTX: don't buy it. Pick up the X1950 CrossFire instead. For the same price and performance you get a much more versatile solution. If you really need both DVI outputs, the CrossFire dongle supports that as well, so all you're doing is adding a small amount of cable clutter."
Yeah, that's just what I need: more cable clutter (my PC is in my living room). And when I have to sell the card, I'll have to sell it with the message: free cable clutter included! I wouldn't buy one second-hand.
 
Good for ATI.

I would have liked to see comparison screenshots, though. Your reviews are a tad on the subjective side; since you are using different settings, it would be nice to have a picture next to the number so we can see the whole equation.
Without those screenshots, I'm left saying "oh, OK, it has slightly lower FPS but higher settings, therefore better image quality," but I have no way of seeing that.

Second, I don't game on a 30" screen...and I doubt many of us do. Despite our enjoyment of spending tons of money on our computers, 30" LCD screens are still out of reach for a lot of us.....or simply not worth the money.
Therefore, I think we could have benefited from a second section comparing the dual-GPU and quad-GPU setups on a smaller screen or at a lower resolution, so we don't have to guess at the performance differences.

Overall though, it was a great read.

Cheers.
 
razor1 said:
The increased memory frequencies really won't make much of a change. I'm surprised that in games like BF2 the bandwidth didn't help much at all; I really expected that game to be bandwidth-bottlenecked. It seems like most games are still fillrate- or shader-bottlenecked, which is very surprising even with the increased texture size.

Actually, I saw the memory increase greatly help in BF2. I was able to do 6X "Quality ADAA" with the X1950 XTX versus 6X Perf ADAA (which is 4X on alpha textures) with the X1900 XTX. That is a bump from 4X to 6X Alpha Texture AA. On the X1900 XTX with 6X Quality ADAA I was seeing framerates in the teens around grass and trees, but with the X1950 XTX with 6X Q ADAA it was now 40 FPS! I was shocked how large the improvement was.
 
Great review, guys. Nice to see ATi putting out some seriously competitive hardware.
 
roflcopter said:
That will probably result in a lot of flaming and banning, so no thanx.

Nice to see they reworked the dreaded HSF. The performance gain isn't that spectacular, but okay, and it's good to see that ATI is trying new stuff like GDDR4 and not selling it at a premium price. Now lose the dongles and I might even buy an ATI card next upgrade.

As if all your peripherals (including monitor) are wireless.

I had a dongle for my 3dfx cards back in the day and didn't give a damn, because the dongle is on the back of the PC with a rat's nest of other wires.

Seriously, though, why the dislike of dongles? Is it just that they're a smidge inelegant compared to an on-mobo or in-case communication link?
Certainly not a make-or-break issue for me.

edit: (I just saw that Tigerblade also commented on this Today, 10:18 AM #40).
 
Puterguru said:
Meh, good review as far as numbers but there were a few things I didn't like.

Quote "Image Quality
We can’t say enough about the great image quality produced by the ATI Radeon X1900/X1950 series"

A few screenshots would have been nice?

Quote PREY
"Wow does this game look incredible at 2560x1600!"

A few screenshots would have been nice?

Also, there was no 1920x1200 "Dual GPU Review." This is going to be the sweet spot, with upcoming games being more graphically intense and requiring even more memory bandwidth, yet we can only guess what kind of framerates we might get with an X1950 XTX CrossFire setup at this resolution.

Time limitations. I had to scale back a whole bunch from what I really wanted to do. We will do more with add-in-board partner cards.
 
roflcopter said:
That's a legit question since HardOCP didn't review the X1800XT before it was available for us consumers. Maybe Kyle and Brent changed their minds on this subject. Good for them and good for us.
Yeah, I seem to remember that they were not going to cater to paper launches. I guess that's changed. They did not even mention availability. :rolleyes:

Still looks like a nice card; it's just that you can't even buy one.
 
Skirrow said:
I'm a little confused by the single-card Oblivion scores. 'Cause I play on a 2405 with an X1900 XTX at 1920x1200 and get an average of approx 30 fps outdoors, 40 fps in towns and 60 fps in dungeons. The only time it dips to around 20 fps is in the meadows near Anvil. I DON'T have AF on, though. But I'd rather have grass and shadows than AA or AF. Looks far nicer. Plus my system (in sig) ain't as fast as the review's.

Wonder if the beta drivers could be the problem?

Also, I wonder if you can use a CrossFire X1950 with an older X1900 XTX?

You just stated the reason: your settings are different than ours, and most likely your hardware as well. We push IQ as high as it will go on both cards while still keeping playable performance. I don't see how you can play at 19x12 without AF, yuck.
 
Tigerblade said:
This kinda statement always bewilders me. Do you frequently stare at the rear of your case admiring it? I see no other reason why you'd have a problem with it......

WHY do people have this obsessive problem with the dongle configuration? So much so that they will not buy a product because of it? I always thought price/performance was the main issue with video cards.....maybe I was wrong :confused:

http://enthusiast.hardocp.com/article.html?art=MTA4OSwzLCxoZW50aHVzaWFzdA==
 
DoomRulz said:
Great read.

I'm impressed, this new card is POWERFUL. But I'd still wait for Quad SLI drivers to mature a little before it's set in stone that two X1950s in CF beat two GX2s in Quad.

That's how it is, right now. I agree Quad SLI will mature over time, but I cannot predict the future; all I can do is say this is how it is at this very moment. Side by side, this is what you will experience on August 23rd, 2006.
 
PRIME1 said:
Yeah, I seem to remember that they were not going to cater to paper launches. I guess that's changed. They did not even mention availability. :rolleyes:

Still looks like a nice card; it's just that you can't even buy one.

Actually you do not have your story straight at all. Please do not simply make up issues as if we had stated them. I suggest you quote us when telling us what our policy is. ;)


As for the availability, yes, that needs to be stated on the conclusion page.
 
Lothar the Lotharian said:
I enjoyed this article, but I'm interested...you made mention of the upgrades to the X1600 and X1300 series, and the price points are amazing, but will they come in AGP?

Before all of you slam me, I still have clients who haven't made the leap to PCI Express, so it's a viable question when we consider that these are 'budget' cards; if the client's cost has to include a new motherboard, the budget starts going out the window.

I haven't heard anything about any new AGP products. But interestingly as a side note, I've got a new PCI (yes PCI, not PCIe) X1300 here :D
 
Tigerblade said:
I agree with that from a price standpoint, but from what I've read the x1950 mastercard is going to retail at the same price as the slave. Time will tell if this is true, but at least it's a big step in the right direction.

Yes, it is a good thing that the CrossFire Edition is now the same price as the X1950 XTX.

At the same time, though, this is a bad thing if, say, you purchase an X1900 XT for $280 and want a dual-GPU solution; you then have to spend much more money to get the CrossFire Edition at $450. It is too bad you can't just get two X1900 XTs at the same price and "CrossFire" them, which is how NVIDIA SLI works.
 
Hey Brent and Kyle, in your honest opinion, is it worth upgrading from a single X1900 XTX to this new card? Especially if you're playing at 1920x1200 on a Dell 2405 in the games mentioned in your review.

For many of us ATI X1900 XTX owners, that is the crux of the matter. Should we buy a refresh when a DX10 part will be out, perhaps at year's end? I agree with your statement completely about enjoying games in the "now," but if I am not going to notice any real measured increase, then it's not worth the $450 bucks.

What I would literally kill for is a benchmark of an X1900 XTX CrossFire setup vs. the X1950 XTX. Because instead of going out and buying this new card, we could plop in a greatly reduced-price X1900 XT CrossFire card and still see gains higher than an X1950 XTX, which could hold us off until DX10.

So is it $450 bucks for a new X1950 XTX with small or decent gains, versus $250+ for another X1900 XT CrossFire card and a good leap in performance? Or just bust the bank and get two new X1950 XTXs :D
 
Brent_Justice said:
Yes, it is a good thing that the CrossFire Edition is now the same price as the X1950 XTX.

At the same time, though, this is a bad thing if, say, you purchase an X1900 XT for $280 and want a dual-GPU solution; you then have to spend much more money to get the CrossFire Edition at $450. It is too bad you can't just get two X1900 XTs at the same price and "CrossFire" them, which is how NVIDIA SLI works.


Could you still use an X1900 CrossFire card? It obviously wouldn't be as fast, but it would be cheaper. Plus the heatsinks would match, so your friends not in "the know" wouldn't give you crap about having one cool heatsink and one bad heatsink. :p
 
Brent_Justice said:
Actually, I saw the memory increase greatly help in BF2. I was able to do 6X "Quality ADAA" with the X1950 XTX versus 6X Perf ADAA (which is 4X on alpha textures) with the X1900 XTX. That is a bump from 4X to 6X Alpha Texture AA. On the X1900 XTX with 6X Quality ADAA I was seeing framerates in the teens around grass and trees, but with the X1950 XTX with 6X Q ADAA it was now 40 FPS! I was shocked how large the improvement was.


Ah, I missed a few key points there ;) That makes more sense; I was expecting a nice increase in that game :)
 
It seems convenient that the ATI guys keep throwing the AA+HDR thing in our faces. Well, technically, ATI can't do it either, except in Oblivion (which I don't even play), and only with a patch.......how is that an advantage??? I'm not going to spend 5 hours looking over a picture of gameplay to say "OMFG that one texture is faded!!!!" or some shit like that. Usually I play a game to PLAY THE GAME.......see what I'm saying????

Just to inform; Catalyst 6.8 Chuck enables HDR+AA in the Call of Juarez demo too, not only in Oblivion. :)
 
Brent_Justice said:
Time limitations, I had to scale back a whole bunch from what I really wanted to do. We will do more with add-in-board partner cards.
Ok but.....

You guys took time to take screenies of the cards themselves, the in-game menus for each game, etc., but not a single gameplay screenshot. Yet in at least 3 instances we were told "how beautiful the games were at those resolutions"???

Also, I would think a 1920x1200 dual-GPU review would have been much more beneficial to most users (instead of the 2560x1600 one), as I would bet more members here have a 24-inch LCD than a 30.

I'm not complaining; it was a good review, but like I said, there were certainly some things I felt were missing.
 
John Bo said:
As if all your peripherals (including monitor) are wireless.

I had a dongle for my 3dFX cards back in the day and didn't give a damn because the dongle is on the back of the PC with a rat's nest of other wires.
Ah, good old 3dfx...when dinosaurs roamed the earth :rolleyes: In those days you couldn't expect any better, but now, in the year 2006, I would expect a more elegant approach to multi-GPU, I guess.
I'm sorry, it's just my opinion that I don't like the dongles and mastercard stuff. And I'm entitled to have one.
 
Any of you guys using 16:10 aspect ratio screens have any display issues while the system is booting? I picked up the new 19" widescreen display from Gateway (1440x900 native resolution) a month ago or so. On my old X800 XL (AGP version) I get no video signal on the DVI port until Windows has booted to the login screen and the Catalyst drivers have loaded. No signal at all. Can't see the system POST. Can't access my BIOS setup screen. Nothing. I have to revert to a VGA connection to see anything.

The ATI tech support guy I talked to said that no ATI cards will display a signal through the DVI port until the Catalyst drivers have been loaded in Windows. If true, that would effectively make it impossible to use DVI in any operating environment outside Windows. Another tech said it may be because widescreen displays don't use standard 4:3 resolutions and/or refresh rates, and there may be some kind of video sync problem going on.

Reading the Rage3d.com forums reveals that a lot of others have had similar issues with 4:3 LCD screens when trying to use DVI. The issue seems to go back as far as the Radeon 9600 series. I've never seen anyone mention issues like this in any hardware reviews anywhere, and I'm wondering why. I'm not the only one having these issues. Not supporting DVI in all operating environments seems like a big problem. Is this something that was limited to AGP cards and has been addressed since PCI-E came out? I haven't heard anything about bus type being a factor.
 