Unimpressed / Disappointed with the X1800XT/XL

EXreaction said:
...the 7800 GT will be able to do the same thing...but for a lot cheaper...you can get a 7800 GT for $350 easily

No, the 6800/7800s cannot apply AA to float render targets.
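For anyone wondering how a game actually discovers this, here's a rough Direct3D 9 sketch (my own illustration, not code from any particular engine): you ask the runtime whether a surface format can be multisampled, and on the 6800/7800 the FP16 float format fails the check while plain RGBA8 passes.

```cpp
// Sketch: asking Direct3D 9 whether a format supports multisampling.
// On GeForce 6/7 hardware this succeeds for X8R8G8B8 but fails for the
// FP16 format (A16B16G16R16F) that OpenEXR-style HDR renders into.
#include <d3d9.h>   // link against d3d9.lib
#include <cstdio>

bool SupportsMsaaOnFormat(IDirect3D9* d3d, D3DFORMAT fmt)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, fmt,
        FALSE /* Windowed=FALSE, i.e. fullscreen */,
        D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);
    return SUCCEEDED(hr);
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    printf("4x MSAA on RGBA8 : %s\n",
           SupportsMsaaOnFormat(d3d, D3DFMT_X8R8G8B8)      ? "yes" : "no");
    printf("4x MSAA on FP16  : %s\n",
           SupportsMsaaOnFormat(d3d, D3DFMT_A16B16G16R16F) ? "yes" : "no");
    d3d->Release();
    return 0;
}
```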
 
EXreaction said:
...the 7800 GT will be able to do the same thing...but for a lot cheaper...you can get a 7800 GT for $350 easily

Unfortunately the 7800 doesn't do 2560x1600 over two dual-link DVI ports, or else I would have gone with the 7800.

The 7800 only does 2048x1536 and has only ONE dual-link DVI output, vs. the 2x dual-link DVI outputs that ATI has.
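The dual-link requirement is just TMDS bandwidth: a single DVI link is capped at a 165 MHz pixel clock, and 2560x1600 at 60 Hz needs roughly 270 MHz once blanking is counted. A back-of-the-envelope check (the blanking totals below are approximate CVT reduced-blanking figures, my assumption):

```cpp
// Back-of-the-envelope pixel-clock estimate for 2560x1600 @ 60 Hz.
// Single-link DVI is capped at a 165 MHz pixel clock, so this mode
// needs a second TMDS link.
#include <cstdio>

int main()
{
    // Approximate CVT reduced-blanking totals (assumed figures).
    const double hTotal = 2720.0;   // 2560 active + ~160 horizontal blanking
    const double vTotal = 1646.0;   // 1600 active + ~46 vertical blanking
    const double refreshHz = 60.0;

    const double pixelClockMHz = hTotal * vTotal * refreshHz / 1e6;
    printf("Required pixel clock: %.0f MHz\n", pixelClockMHz);  // ~269 MHz
    printf("Single-link DVI cap : 165 MHz -> dual link required\n");
    return 0;
}
```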
 
Happy Hopping said:
Unfortunately the 7800 doesn't do 2560x1600 over two dual-link DVI ports, or else I would have gone with the 7800.

The 7800 only does 2048x1536 and has only ONE dual-link DVI output, vs. the 2x dual-link DVI outputs that ATI has.
You have two LCDs that run 2560x1600?

DAH-UM
 
{NG}Fidel said:
What driver problem? When did you own an ATI card? I deal with their drivers every month on more than three different machines, and I have yet to have a problem that was caused by the drivers. Every time I read about how their drivers "crash" all the time, I think it's a user problem.

I sell a lot of Compaq/HP machines. Back then, the Rage Fury and other Rage products had a problem where, when the system went into screensaver mode, the graphics card would put crash output on the screen.

I have also seen many other ATI video cards with many other software problems.

Back in 2001, I was using an Nvidia video card with only 16 MB of RAM. I switched to a Radeon 7500 and it crashed in numerous games. Mind you, nothing else in the system was changed besides the card.

I was so glad I got rid of that card and switched back to an Nvidia FX5200.

Now I am using the X300, with all kinds of driver problems. The 2D display in certain applications simply crashes, to the point that I have to reformat the hard drive and reinstall the whole OS.

In general, I have had no good experience with ATI.

Having said that, the Quadro FX 4500 would cost me over $4,000 to get 2x dual-link at 2560x1600. With ATI, I can get it for $249 or less. So for $249, I don't hate ATI that much; I can put up with it.
 
Happy Hopping said:
I sell a lot of Compaq/HP machines. Back then, the Rage Fury and other Rage products had a problem where, when the system went into screensaver mode, the graphics card would put crash output on the screen.

I have also seen many other ATI video cards with many other software problems.

Back in 2001, I was using an Nvidia video card with only 16 MB of RAM. I switched to a Radeon 7500 and it crashed in numerous games. Mind you, nothing else in the system was changed besides the card.

I was so glad I got rid of that card and switched back to an Nvidia FX5200.

Now I am using the X300, with all kinds of driver problems. The 2D display in certain applications simply crashes, to the point that I have to reformat the hard drive and reinstall the whole OS.

In general, I have had no good experience with ATI.

Having said that, the Quadro FX 4500 would cost me over $4,000 to get 2x dual-link at 2560x1600. With ATI, I can get it for $249 or less. So for $249, I don't hate ATI that much; I can put up with it.

Yeah, the drivers have been a sticking point for me personally. Several friends of mine have ATI cards, and just about every game I play has at least one crash-to-desktop issue with ATI drivers, mostly MMOs in my experience, but they've had issues with several popular games nonetheless. There's nothing wrong with ATI as such; it just seems like some of the games I play don't play nice with them. Luckily it's mostly my friends who hit the issues, but it makes me hold back on the company as a whole.

Anyhoo, I don't run resolutions over 1024x768 on my CRTs, and never have, even on my 20+ inch ones, since it doesn't really make games look better and it decreases performance. Eye candy is a huge seller for most people, though; look at which movies do well at the box office, for instance :p.
 
lithium726 said:
You have two LCDs that run 2560x1600?

DAH-UM

No. I don't want to buy the Apple 30" due to the lack of an OEM PC driver, so I'm waiting for some PC vendor to release a 30", which requires a video card that supports two dual-link DVI outputs at 2560x1600.
 
Happy Hopping said:
Unfortunately the 7800 doesn't do 2560x1600 over two dual-link DVI ports, or else I would have gone with the 7800.

The 7800 only does 2048x1536 and has only ONE dual-link DVI output, vs. the 2x dual-link DVI outputs that ATI has.
You could get a Quadro.

NVIDIA has been making dual-link DVI cards for Apple, but the only ones I know of for the PC are the Quadro line.
 
PRIME1 said:
You could get a Quadro.

NVIDIA has been making dual-link DVI cards for Apple, but the only ones I know of for the PC are the Quadro line.

Surely you're not asking him to opt for a Quadro rather than an X1800 or 7800 card?
 
lithium726 said:
7800 doesn't do dual dual-link

The point I was making was: why should he have to go for a card designed for CAD work to run two 30" Apple Cinema Displays, when the Mac cards and the X1x00 series (or maybe it's just the X1800s) show you can have that and a good gaming card...
 
Some displays have no scaler. For those who own the X1800XT/XL or the X1600 series, how many interpolation methods does ATI offer?

1:1
fullscreen
fullscreen with correct aspect ratio
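For what it's worth, the three modes listed just differ in how the source rectangle is mapped onto the panel. A toy sketch of the math (my own illustration of the concept, not anything from ATI's driver):

```cpp
// Toy illustration of the three scaling modes: centered 1:1, stretch-to-fill,
// and fill-while-preserving-aspect (letterbox/pillarbox). Not driver code.
#include <algorithm>
#include <cstdio>

struct Rect { int x, y, w, h; };

// 1:1 — center the image unscaled on the panel.
Rect Scale1to1(int srcW, int srcH, int panW, int panH)
{
    return { (panW - srcW) / 2, (panH - srcH) / 2, srcW, srcH };
}

// Fullscreen — ignore aspect ratio, stretch to fill the panel.
Rect ScaleFullscreen(int, int, int panW, int panH)
{
    return { 0, 0, panW, panH };
}

// Fullscreen with correct aspect — scale by the smaller ratio, center.
Rect ScaleAspect(int srcW, int srcH, int panW, int panH)
{
    double s = std::min(double(panW) / srcW, double(panH) / srcH);
    int w = int(srcW * s), h = int(srcH * s);
    return { (panW - w) / 2, (panH - h) / 2, w, h };
}

int main()
{
    // Example: a 1280x1024 (5:4) mode on a 1920x1200 (16:10) panel.
    Rect a = Scale1to1(1280, 1024, 1920, 1200);
    Rect b = ScaleFullscreen(1280, 1024, 1920, 1200);
    Rect c = ScaleAspect(1280, 1024, 1920, 1200);
    printf("1:1        -> %dx%d at (%d,%d)\n", a.w, a.h, a.x, a.y);
    printf("fullscreen -> %dx%d at (%d,%d)\n", b.w, b.h, b.x, b.y);
    printf("aspect     -> %dx%d at (%d,%d)\n", c.w, c.h, c.x, c.y);
    return 0;
}
```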
 
I have decided to do the same, and I can't wait. I am waiting to get two X1800XTs and maybe a DFI CrossFire mobo. I am kinda worried about the Radeon southbridge problems I hear about with SATA and USB, but if it's the best overclocking mobo for CrossFire, I guess I'll take it.
 
Ir0n_SamuraI said:
I have decided to do the same, and I can't wait. I am waiting to get two X1800XTs and maybe a DFI CrossFire mobo. I am kinda worried about the Radeon southbridge problems I hear about with SATA and USB, but if it's the best overclocking mobo for CrossFire, I guess I'll take it.
I am on the same page as you, bro. Who the hell uses USB for anything other than plugging in a mouse/keyboard and an iPod?
 
R1ckCa1n said:
I am on the same page as you, bro. Who the hell uses USB for anything other than plugging in a mouse/keyboard and an iPod?

Seriously, if you have an external HD/burner and you're not using FireWire, you don't know what you're doing. It's such an overblown non-issue.
 
EXreaction said:
...the 7800 GT will be able to do the same thing...but for a lot cheaper...you can get a 7800 GT for $350 easily

No, the Nvidia cards can't run HDR and AA together, so if you want to do that you have to go X1800.
 
Bar81 said:
No, the Nvidia cards can't run HDR and AA together, so if you want to do that you have to go X1800.

That's a shame, because I have a 6800GT AGP 8x and a 7800GTX PCI-E, and they both run Lost Coast with full HDR and AA. Nvidia isn't the one slacking on new technologies; it's ATI. They are just now coming to market with a product that supports full Shader Model 3.0, while Nvidia has had it since their last-gen GPUs. Please know what you're talking about before posting bogus facts.

EDIT: In any case, the main reason I'd never buy an ATI product is their driver support. Nvidia isn't super amazing at it, but ATI is just terrible.
 
Bassist-X said:
That's a shame, because I have a 6800GT AGP 8x and a 7800GTX PCI-E, and they both run Lost Coast with full HDR and AA. Nvidia isn't the one slacking on new technologies; it's ATI. They are just now coming to market with a product that supports full Shader Model 3.0, while Nvidia has had it since their last-gen GPUs. Please know what you're talking about before posting bogus facts.

:rolleyes:
Maybe you should take your own advice and learn about the different forms of HDR.
Nvidia cards can do FP16 HDR, but they can't apply AA on top of it, and many consider FP16 to be the better form of HDR.
The X1800 series cards are the only ones that can do both.

Please know what you are talking about before arguing posts :D
 
I didn't realize he meant HDR and AA together in that way. Yes, I know there are different types, but most games right now don't do much with HDR except bloom effects; by the time it's widely implemented, I'll already have upgraded again. Also, I'll bet Nvidia could support that if they were as late to market as the new ATI cards. In benchmarks of modern games they still break about even with the Nvidia cards, and in many cases are a few unnoticeable FPS behind the Nvidia competition.

I'll make a generalization some may not like: why is it that, between cards with the same number of pixel pipelines and the same amount of memory, the Nvidia card with lower core and memory clocks spanks the prior ATI X8xx series cards? Driver support. It makes all the difference in the world, and as late to market as these new cards are, their driver support is months behind Nvidia's.
 
Bassist-X said:
Also, I'll bet Nvidia could support that if they were as late to market as the new ATI cards.

What's going to be your meme when the 512MB 7800s come out and still don't do float render targets and AA? Or when the refreshes of the GTX/GT come out next spring and still don't? Those parts are being released months later than this summer's 7800s, so why can't NVIDIA just magically add that functionality with the extra time, since by your logic that's how ATI was able to do it?
 
Bassist-X said:
EDIT: In any case, the main reason I'd never buy an ATI product is their driver support. Nvidia isn't super amazing at it, but ATI is just terrible.

That's just crap, if you're talking about Windows. For Linux, I agree with you. On Windows, ATI's drivers are as good as Nvidia's and usually better, as ATI doesn't need about 40 leaked beta drivers before they fix something. We could go back and forth, but that statement of yours is just ignorant. Get what you want, but don't try that BS argument; it doesn't hold water with the informed.
 
I'm with Bar81 on that one. I have a rig with a 6800GT and one with an X850XT PE, and driver-wise I absolutely favor the ATI unit. nVidia's drivers completely suck compared to ATI's. Official nVidia drivers come out every three months or so and contain tons of problems (shimmering, anyone?); then you have some weird leaked beta drivers every three or four days which, when you're lucky, improve 3DMark by 100 points (whoopdie) but don't fix any serious issues.

On the other side, you have a steady once-a-month driver release from ATI that actually fixes things and doesn't have the problems nVidia's drivers are riddled with.

And that's not even talking about image quality, which is much better on the ATI card than on the nVidia part.

Even though I'll skip this generation of cards, I'll be more likely to purchase another ATI card than another nVidia card. If the ATI costs $50 more for the same performance, I'm more than glad to pay the premium for better image quality and better drivers.
 
Bar81 said:
That's just crap, if you're talking about Windows. For Linux, I agree with you. On Windows, ATI's drivers are as good as Nvidia's and usually better, as ATI doesn't need about 40 leaked beta drivers before they fix something. We could go back and forth, but that statement of yours is just ignorant. Get what you want, but don't try that BS argument; it doesn't hold water with the informed.

Your individual results may vary, but I based this opinion on my own real-world experience, switching between identical hardware configurations save for the GPU. I wasn't happy with ATI's products compared to Nvidia's in the past, and maybe one day they'll change my mind back, because I've had some bad Nvidia experiences as well. To be perfectly honest, I almost wish there were a third major player in the GPU market again, like 3dfx with the Voodoo; remember that card? As far as drivers go, I think the gap between hardware specs and actual performance in prior generations is more than enough proof that drivers make all the difference in the world. And yes, for the most part I am also talking about cross-platform driver support, because I switch back and forth between various builds of Debian and Windows, and I really need a card that does both pretty well. So for all the people ignoring the fact that there are other OSes out there, I urge you to widen your scope a bit; there's a lot of great open source stuff out there, as I'm sure you're aware.


ScYcS said:
I'm with Bar81 on that one. I have a rig with a 6800GT and one with an X850XT PE, and driver-wise I absolutely favor the ATI unit. nVidia's drivers completely suck compared to ATI's. Official nVidia drivers come out every three months or so and contain tons of problems (shimmering, anyone?); then you have some weird leaked beta drivers every three or four days which, when you're lucky, improve 3DMark by 100 points (whoopdie) but don't fix any serious issues.

We all know (or at least should know) that 3DMark05 scores don't really mean much of anything in terms of real-world applications, as it's just a simulated next-gen game engine not used by anyone. I'd definitely never buy a card based on benchmarks like that alone.

ScYcS said:
Even though I'll skip this generation of cards, I'll be more likely to purchase another ATI card than another nVidia card. If the ATI costs $50 more for the same performance, I'm more than glad to pay the premium for better image quality and better drivers.

That sounds more like an ATI fanboy talking, not the level-headed "go with the best available card at the time" approach it should be. Really, I went Nvidia because they delivered back in the summer, whereas ATI is only now delivering something comparable, marginally better if you like, but not really in most applications. Perhaps if ATI had been ready then, I would have considered them quite a bit more this next time around, but let's remember that the X850XTs get rocked, if I'm not mistaken, by the 7800GT hands down. I'll be completely re-evaluating both manufacturers as new technologies arrive next time around; they both make good products, and I'm glad we have such fierce competition between the two rivals to drive development.

For the longest time this R520 was supposed to be an Nvidia 7800GTX killer, and the early benchmarks from every publication I've seen seem to contradict that. Basically it looks to me like the ATIs are still better with AA, as always; on image quality I'm not so sure, as I've seen that one go both ways enough that I don't know yet; and the shader round goes to Nvidia. Power consumption and heat are still in Nvidia's favor, but to many that isn't really a concern, and it really shouldn't be if all you care about is squeezing 2-3 more FPS over the other card in your favorite game. The least we can do is just buy whatever we want; I think they are all great cards, but for crying out loud, please come at me with some constructive criticism instead of just flaming me. Thanx guys, happy gaming.
 
Bassist-X said:
For the longest time this R520 was supposed to be an Nvidia 7800GTX killer, and the early benchmarks from every publication I've seen seem to contradict that. Basically it looks to me like the ATIs are still better with AA, as always; on image quality I'm not so sure, as I've seen that one go both ways enough that I don't know yet; and the shader round goes to Nvidia. Power consumption and heat are still in Nvidia's favor, but to many that isn't really a concern, and it really shouldn't be if all you care about is squeezing 2-3 more FPS over the other card in your favorite game. The least we can do is just buy whatever we want; I think they are all great cards, but for crying out loud, please come at me with some constructive criticism instead of just flaming me. Thanx guys, happy gaming.

No, the R520 wasn't going to be the GTX killer. People on other forums predicted that if ATI pushed out a core with 32 pipes it would rox0rz our boxors, but nowhere did any publication say it was going to "own" the 7800. ATI even said at a press conference that the R520 would be able to match it in speed if they could hit the target clocks they wanted (which they ended up doing).

ATI is (marginally) better in AA quality with their adaptive form, and their AF quality is VERY noticeable compared to the 7800's. So it's not going "both ways"; it's going one way, towards ATI, right now. It's undisputed that the R520 simply has the best image quality at the moment.

Power consumption/heat dissipation is very similar between the two cards; my XL doesn't heat up my case any more than my 6800GT did. And on driver support, I favor ATI's; Nvidia's weren't horrible, but some of the issues I was having, compared to the ease I've had so far with ATI's, just make me want to stick with ATI.

If you can't take a jab, especially one that netrat33 threw at you, then I suggest growing some thicker skin; nothing here so far was really "flaming" you.
 
I've heard that some games, such as Lost Coast, can run HDR + AA at the same time, but other games, such as Far Cry, can't. Is there any truth to this? Does it have anything to do with the game engine design, or is it just that the 7800 can't apply AA to the HDR output specifically?
 
Bassist-X said:
I've heard that some games, such as Lost Coast, can run HDR + AA at the same time, but other games, such as Far Cry, can't. Is there any truth to this? Does it have anything to do with the game engine design, or is it just that the 7800 can't apply AA to the HDR output specifically?

There are many ways to do HDR. The method that Far Cry and SC:CT use is based on OpenEXR; the method Lost Coast uses does NOT use the OpenEXR formats. Valve actually had some slides on the pros and cons of four different HDR methods they considered for their engine/games, and those slides had some good info. There is nothing in the OpenEXR format itself that prevents AA, though; it's strictly a hardware limitation. When David Kirk was asked about this, he gave the impression that it was "too expensive" to do given their design goals and transistor budget...
 
Bassist-X said:
I've heard that some games, such as Lost Coast, can run HDR + AA at the same time, but other games, such as Far Cry, can't. Is there any truth to this? Does it have anything to do with the game engine design, or is it just that the 7800 can't apply AA to the HDR output specifically?

HDR is really only being used in a few games right now (Splinter Cell 3, Age of Empires 3, Far Cry, Lost Coast... and I think that's it right now; I can't remember).

Lost Coast is the only one allowing AA right now, because of the method it implements.

Which is kind of sad, because for how long the different methods of HDR have been around, only a handful of games really use it.
 
So, in other words, what you're saying is that HDR can co-exist with AA on an Nvidia 7-series GPU if the game developers choose to implement it that way in their engine? Or am I off target here?
 
Bassist-X said:
So, in other words, what you're saying is that HDR can co-exist with AA on an Nvidia 7-series GPU if the game developers choose to implement it that way in their engine? Or am I off target here?

Sadly, that is not the case. He was referring to the point that there are different ways of doing HDR in games, and only the HL2 method supports HDR + AA.

Nvidia is currently hardware-limited: its current hardware cannot apply AA to that kind of (FP16/OpenEXR) HDR at the hardware level, and there is nothing programmers can do about it. You can have HDR on Nvidia hardware, but not HDR + AA. This is how I understand it; wiser people, feel free to correct me.
 
Bassist-X said:
We all know (or at least should know) that 3DMark05 scores don't really mean much of anything in terms of real-world applications, as it's just a simulated next-gen game engine not used by anyone. I'd definitely never buy a card based on benchmarks like that alone.

Guess what my "whoopdie" was meant for...


Bassist-X said:
That sounds more like an ATI fanboy talking, not the level-headed "go with the best available card at the time" approach it should be. Really, I went Nvidia because they delivered back in the summer, whereas ATI is only now delivering something comparable, marginally better if you like, but not really in most applications. Perhaps if ATI had been ready then, I would have considered them quite a bit more this next time around, but let's remember that the X850XTs get rocked, if I'm not mistaken, by the 7800GT hands down. I'll be completely re-evaluating both manufacturers as new technologies arrive next time around; they both make good products, and I'm glad we have such fierce competition between the two rivals to drive development.


Why is that ATI !!!!!! talking? I already mentioned I have cards from both manufacturers. If I consider an ATI card over a nVidia card, it's because of past experience, not because I'm married to ATI. If the nVidia card has better image quality, then nVidia it is (well, and if they finally get their drivers sorted out and fixed). But seeing that nVidia's image quality has been worse than ATI's for the last couple of years, I have my doubts that it'll happen anytime soon.
Oh, and of course the X850 gets rocked by a next-gen card... :rolleyes:

I want to be completely honest with you here: you sound more like a !!!!!! than anyone else in this thread. ATI came out with a decent piece of hardware. It was late and it's expensive, no doubt. It DOES, however, deliver good performance and neat ideas.
Will I buy it? Hell no... way too much money. Will I buy the 7800GTX? Hell no as well. Way too much money.
 
Alright, I was really looking for some answers on how HDR + AA varies with game engines, but something irrelevant works too. I'm over the ATI / Nvidia thing; I just want to know what is hardware-limited and what is software-limited.
 
Bassist-X said:
Alright, I was really looking for some answers on how HDR + AA varies with game engines, but something irrelevant works too. I'm over the ATI / Nvidia thing; I just want to know what is hardware-limited and what is software-limited.

That's been explained quite clearly. If you weren't so busy cheerleading you might have actually gleaned some information from this thread.
 
Bassist-X said:
Alright, I was really looking for some answers on how HDR + AA varies with game engines, but something irrelevant works too. I'm over the ATI / Nvidia thing; I just want to know what is hardware-limited and what is software-limited.

There are many ways to do HDR. Far Cry and SC:CT, for example, use OpenEXR HDR. No Nvidia card can do OpenEXR HDR with AA, due to a hardware limitation. Only the X1K can.

Valve used a different approach to HDR, which allows HDR without OpenEXR (thus it can also run on the GeForce FX, X800, etc.). This also allows AA with their HDR.

HDR is not Bloom.
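To make the difference concrete, here's a rough Direct3D 9 sketch (my own simplification, not Valve's or Crytek's actual code) of the render-target choice each approach implies. The OpenEXR-style path needs an FP16 surface, which GeForce 6/7 hardware can't multisample; the Valve-style path stays on an ordinary integer surface and compresses the dynamic range in the shaders, so MSAA keeps working:

```cpp
// Rough sketch of the two render-target choices (device setup omitted,
// error handling elided). OpenEXR-style HDR needs an FP16 target, which
// GeForce 6/7 cannot multisample; the Valve-style path stays on RGBA8.
#include <d3d9.h>

// OpenEXR-style: the scene is rendered into a float surface and
// tone-mapped afterwards in a post-process pass.
IDirect3DSurface9* CreateExrStyleTarget(IDirect3DDevice9* dev, UINT w, UINT h)
{
    IDirect3DSurface9* rt = nullptr;
    // Note D3DMULTISAMPLE_NONE: requesting multisampling on this FP16
    // format fails on NV4x/G70, which is the whole "no HDR + AA" issue.
    dev->CreateRenderTarget(w, h, D3DFMT_A16B16G16R16F,
                            D3DMULTISAMPLE_NONE, 0, FALSE, &rt, nullptr);
    return rt;
}

// Valve-style: an ordinary integer target; the pixel shaders compress
// the dynamic range before writing, so the surface format stays standard
// and 4x MSAA is still available.
IDirect3DSurface9* CreateValveStyleTarget(IDirect3DDevice9* dev, UINT w, UINT h)
{
    IDirect3DSurface9* rt = nullptr;
    dev->CreateRenderTarget(w, h, D3DFMT_A8R8G8B8,
                            D3DMULTISAMPLE_4_SAMPLES, 0, FALSE, &rt, nullptr);
    return rt;
}
```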
 
Thank you for giving me a much better, concise answer instead of a wisecrack. I wonder now which HDR method will be more popular with developers: the more commonplace OpenEXR HDR, or a different method like the one Valve created, since that supports the NV 6/7 series plus the X800 and X1K series cards?
 
Can someone provide a link to the power requirements of one or two X1800 XTs?

Also, a link to the maximum DVI digital resolution that each port supports on the X1800 XL?
 
Netrat33 said:
HDR is really only being used in a few games right now (Splinter Cell 3, Age of Empires 3, Far Cry, Lost Coast... and I think that's it right now; I can't remember).

Lost Coast is the only one allowing AA right now, because of the method it implements.

Which is kind of sad, because for how long the different methods of HDR have been around, only a handful of games really use it.
Just wanted to add Serious Sam 2 to the FP16 HDR list. HL2 doesn't bloom the HDR effect and is the ONLY game that can do HDR+AA on the 7800, because of its hardware limitations. I also agree: I find ATI's monthly driver updates in Windows XP much nicer than Nvidia's beta drivers every few weeks, but I still use my older Nvidia card for my Linux box.
 