$422 6800GT (Best Buy) or $435 X800XT (Gateway)

burningrave101 said:
That may be true, but if you're not one of those people running less than a name-brand 350W, then there is no point in making it an issue of whether or not to buy one. Everyone keeps bringing up the power issue as one of the negatives of the 6800U, and I bet there is hardly a single one of you that has less than a 350W PSU installed. So to continuously bring it up is a bit ironic, given that very few of you will have an issue that needs addressing.

Most likely ATI will increase their transistor count in their next release this fall or whenever, and when they do it's likely it will draw as much power as the 6800U. You're better off just keeping your power supply upgraded above all else, because it can wreck your whole system if it's not good quality and able to supply sufficient power. The more transistors they pack into the GPU core, the more power it's going to draw. nVidia has 220 million in the 6800U compared to the X800XT PE, which only has 160 million.

Maximum PC ran a test on the PSU issue with the 6800 Ultra. They tested a PC Power & Cooling 410W and 510W, a Fortron 400W, a 250W, and a 400W generic PSU. Every PSU passed the test except for the generic ones. Understand this: not everyone buys a name-brand PSU. Most people bought their computer pre-built from a manufacturer, which uses generic PSUs to cut corners. How you manage to ignore the facts is beyond me.

Fine, you have a nice system. How about those who don't?
 
trungracingdev said:
Maximum PC ran a test on the PSU issue with the 6800 Ultra. They tested a PC Power & Cooling 410W and 510W, a Fortron 400W, a 250W, and a 400W generic PSU. Every PSU passed the test except for the generic ones. Understand this: not everyone buys a name-brand PSU. Most people bought their computer pre-built from a manufacturer, which uses generic PSUs to cut corners. How you manage to ignore the facts is beyond me.

Fine, you have a nice system. How about those who don't?

I agree with your comments. However, I see a bigger problem with the 6800GT. There are a lot of comments from people saying "just buy the 6800GT and overclock it to Ultra levels," but with one less molex connector you are going to run into some power problems. Even if you overclock the 6800GT core to 400MHz, you may only see a frame or two of extra performance. Nvidia has even confirmed that the 6800GT will "clock down" if there is not enough power available, and without the 6800U's second power connector I doubt there will be.

I'd like to see some overclocked X800 Pro vs. 6800GT numbers to verify my thoughts on this....
 
The GeForce 6800 GT at default clock speeds is doing 350 MHz core and 2x500 MHz memory (1000 effective).
Overclocked, it was capable of running a 401 MHz core and a 530 MHz (1060 effective) memory frequency. Now that is bang for your buck, as that is roughly Ultra speed. Memory was a tad disappointing, but hey, can't blame NVIDIA there; otherwise it would be competing with its own product.

http://www.guru3d.com/article/Videocards/135/7/

6800GT overclocked to 425 MHz core and 1.2 GHz RAM. 6800 Ultra Extreme overclocked to 475 MHz core and 1.255 GHz RAM.

http://www.firingsquad.com/hardware/geforce_6800_ultra_extreme/page22.asp

How well it will overclock will likely depend on the card you get hold of. XFX usually has one of the top overclockers.

I agree with your comments. However, I see a bigger problem with the 6800GT. There are a lot of comments from people saying "just buy the 6800GT and overclock it to Ultra levels," but with one less molex connector you are going to run into some power problems. Even if you overclock the 6800GT core to 400MHz, you may only see a frame or two of extra performance. Nvidia has even confirmed that the 6800GT will "clock down" if there is not enough power available, and without the 6800U's second power connector I doubt there will be.

Once again, the 6800U has been tested and proven to run at 450/1200 with just the single primary power connector. Anything higher than 450 MHz will need the secondary power connector. This is all according to those who actually have 6800 Ultras and have tested it for themselves.

There are several of you here on HardOCP with a PNY 6800 Ultra, so why don't you test it for yourself and post back the results! I want to know how high it will clock and still run a 3D application with just its primary power connector.
 
jarman said:
Okay...PCFormat is reporting the exact opposite (PCFormat, July 2004, 163, page 55). I'm interested to see what Kyle and the boys have to say about this with retail boards. :confused:

Those clocks go with the FiringSquad review. Guru3D got 401/1060 on their 6800GT. How high you can OC is going to vary from board to board.

And what is PCFormat? I don't think I've ever looked at that one.
 
burningrave101 said:
And your point is?

nVidia had held the lead for a long time, and ATI only finally started to take the lead with the 9500 Pro and 9700 Pro. They had only just begun to be in the running and actually competitive against nVidia's top tier.

Do I need to repeat it again?



SM 3.0 will be implemented in quite a few games long before six months from now, I'm sure, and OpenGL games like Doom 3 will likely perform a whole lot better on an NV40.

This isn't FUTURE tech. This is "the next 2-3 months" tech.

NV40 also has UltraShadow II technology, which is used in Doom 3.

It is still arguable whether the feeble early implementations of 3.0 will have any advantage in image quality over 2.0, and even the performance gains will be arguably negligible. It is my understanding that it will be at least a year before 3.0 is a real issue, and by then I will be purchasing the latest Nvidia or ATI card, whichever gives me the best performance for the buck.

Five will get you ten it will be ATI... I am not an ATI fanboy; I was an Nvidia customer for years until the 9700 Pro came around. I hope someday Nvidia can entice me back.
 
burningrave101 said:
Those clocks go with the FiringSquad review. Guru3D got 401/1060 on their 6800GT. How high you can OC is going to vary from board to board.

And what is PCFormat? I don't think I've ever looked at that one.

LOL, I'm in the UK right now on TDY. They are the same company that writes Linux Format magazine; they are both UK publications.

I'm dying over here...all of you lucky bastards are getting to play with all of this new hardware in the States, and I'm stuck over here waiting for the same hardware to trickle over the ocean.

Try this on for size: the XFX 6800U in the UK is $737.89 after VAT!!!!!!!!!

There's no place like home...
:D
 
agar said:
I just don't see how you can go wrong with $434.99 for an XT/PE. You can argue 2-3 fps here or there, but at that price nothing else can compete.


BFG 6800 GT OC for $299.99, one of the best bang-for-the-buck deals ever. I'll never notice the 5 fps difference.
 
randsom said:
BFG 6800 GT OC for $299.99, one of the best bang-for-the-buck deals ever. I'll never notice the 5 fps difference.

Are you talking about the one-time Best Buy sale that's long over? If not, please link me to the price.
 
Thanks for the link, but it doesn't look like either card was the clear winner. At high res and max settings it was 6 for Nvidia, 5 for ATI, and 1 tie. It almost seemed like they intentionally tried to skew the results in Nvidia's favor by using so many Q3-based games (Q3, Wolfenstein, and Jedi Academy). I'm still gonna buy the X800XT since it won all of the games that really matter to me, the ones I play (like Halo, UT2004, and Far Cry).
 
Blackwind said:
Catching up? LOL

They lapped them on the playing field the last two releases running....

9700 Round 1 ....Winner!

9800 Round 2 ....Winner!

If anyone has some redemption to do or "catching up," it's nVidia. I mean really, that little doohickey to force trilinear that nVidia now has in their control panel, you think that got there by accident? :rolleyes: It was a direct response to them getting banged over the head by users, hardware review sites, vendors, and grandmothers who know a thing or two about vid cards.






Ok...maybe not the grandmothers. ;)


nVidia appears to be back on track, but I think you are wearing blinders if you can't see the accomplishments of ATI. There's no shame in simply saying nVidia got their butts handed to them and now they are putting in more GREAT effort to make amends. More power to 'em, and I as a consumer have more options...



LOL. So now everyone should wait till Doom 3 to decide, eh? :D The life buoy is right in front of you... it has ATI written on it.

Stop flaying the excuses already.


Well said in all comments. This guy is just a MEGA NVIDIA FANBOY; he can't give ATI any props. Hey burningrave101, have you ever owned an ATI card? Maybe the 9700 Pro, a card everyone knows was the best you could get for like a year, or did you blindly follow your fanboy instincts?
 
Dyslexic said:
Well said in all comments. This guy is just a MEGA NVIDIA FANBOY; he can't give ATI any props. Hey burningrave101, have you ever owned an ATI card? Maybe the 9700 Pro, a card everyone knows was the best you could get for like a year, or did you blindly follow your fanboy instincts?

Oh yeah, I'm a MEGA NVIDIA FANBOY just because I can see the truth when the rest of you seem to ignore it, and the rest of you are totally unbiased altogether because you like nVidia and ATI both the same. :rolleyes:

I never denied the poor performance of the NV30, because ATI definitely had the better cards with the 9700 Pro/9800 Pro. nVidia turned it around a lot with the 59xx cores, though, and a few solid driver updates to help in DX9 games. ATI deserves recognition for their accomplishments over the last two years, but that doesn't just dismiss all their years of suckage while nVidia held the top crown.

Some of you guys really crack me up, I tell ya. Just because none of you can come up with anything better, you resort to calling people a fanboy, lol. I love it :D.

Dyslexic = xSyzygy666x lol

http://www.hardforum.com/showthread.php?t=769943

ToastyBoy said:
Thanks for the link, but it doesn't look like either card was the clear winner. At high res and max settings it was 6 for Nvidia, 5 for ATI, and 1 tie. It almost seemed like they intentionally tried to skew the results in Nvidia's favor by using so many Q3-based games (Q3, Wolfenstein, and Jedi Academy). I'm still gonna buy the X800XT since it won all of the games that really matter to me, the ones I play (like Halo, UT2004, and Far Cry).

Why would it be skewed in nVidia's favor just because there were OpenGL games? It's not anyone's fault but ATI's that their cards don't perform as well in OpenGL. The point is, though, you say the X800XT PE won in all the games that are important to you at 1600x1200 w/ 4x AA + 4x AF, but when the X800XT did win, how much did it win by? 1-5 FPS? You really think you can notice that small a difference in-game? The 6800U supports pixel shader 3.0, vertex shader 3.0, 32-bit floating point, and UltraShadow II technology (in Doom 3). What does the X800XT PE bring in terms of new technology that will actually be used in upcoming games this year? 3Dc? I doubt it gets used by hardly anyone, just like when ATI tried to push DXT5 on game developers and they wouldn't use it either. And even if they do use 3Dc, there will have to be a fallback to DXT5 for all those that don't have cards that support it, and there are very minimal differences between the two. And all it is is texture compression.

I'm sure you will be happy with an X800XT PE, but I don't see why everyone keeps saying we need to give so much props to ATI like they did something incredible when all they did was put their 9800 Pro on steroids. Absolutely nothing new in the way of technology in upcoming games. Maybe it will bring a small improvement to the 6800s, maybe it will bring a big improvement, maybe it will bring nearly nothing. Either way, me and whoever else buys a 6800 will have something to look forward to having implemented, while the X800XT PE users will not.
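For what it's worth, the DXT5 fallback being argued about here is a small piece of engine-side code; the bigger cost is authoring and shipping both texture versions. Here's a rough sketch, assuming a Direct3D 9 renderer (the helper name and the desktop format are made up for illustration, not taken from any shipping game):

#include <d3d9.h>

// Hypothetical helper: pick a normal-map format at load time.
// 3Dc shows up in Direct3D 9 as the FOURCC 'ATI2'; if the driver
// doesn't report it, fall back to DXT5 (with the normal packed into
// the alpha and green channels, the usual DXT5 trick).
D3DFORMAT ChooseNormalMapFormat(IDirect3D9* d3d)
{
    const D3DFORMAT fmt3Dc = (D3DFORMAT)MAKEFOURCC('A', 'T', 'I', '2');

    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8,   // assumed desktop format
                                        0, D3DRTYPE_TEXTURE,
                                        fmt3Dc);

    return SUCCEEDED(hr) ? fmt3Dc : D3DFMT_DXT5;
}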
 
burningrave101 said:
Oh yeah, I'm a MEGA NVIDIA FANBOY :p



And even if they do use 3Dc, there will have to be a fallback to DXT5 for all those that don't have cards that support it, and there are very minimal differences between the two. And all it is is texture compression.

Humus Demo, using 3Dc.

With shadows:
No compression: 125fps
3Dc: 146fps (+17%)
DXT: 136fps (+9%)
3Dc & DXT: 158fps (+26%)

Without shadows:
No compression: 164fps
3Dc: 210fps (+28%)
DXT: 195fps (+19%)
3Dc & DXT: 239fps (+46%)


Available here: http://esprit.campus.luth.se/~humus/ and this is a thread discussing its advantages over DXT: http://www.beyond3d.com/forum/viewtopic.php?t=13579. Looks like there will be performance and visual differences between the two. The funny thing is, burning, you just spew crap from reviews without really researching what it is you are spewing. It's like you download Nvidia corporate docs and cut and paste them here.

"The 6800U supports pixel shader 3.0, vertex shader 3.0, 32-bit floating-point, and UltraShadow II Technology" ALL BOW DOWN TO THE HIGH-TECH ADVANCES THAT HAVEN'T BEEN USED IN ANY GAMES AND HAVE YET TO BE USED IN ANY GAMES. It's like people flamebaiting others about TruForm and how it was going to change graphics. Give me a break. This is your classic quote of the week: "I can see the truth when the rest of you seem to ignore it and the rest of you are totally unbiased." Wow! :D

I think you should re-think your comments and previous posts before you make statements like that.
 
Another vote for ATI. :)

From ATI's website: "ATI reports record revenues of US $491.5 million in Q3"
Thanks to the help of the Nvidia GeForce FX series. ;)
 
I am surprised that the difference in performance was so great between 3Dc and DXT5. I guess ATI's shader performance is giving them a boost here, because they decompress 3Dc in shaders but DXT5 in the texture unit. The performance hit would be lessened in a game situation where normal mapping is only one of the many things going on in a scene. The IQ difference isn't too bad; I expect with tweaking it can be lessened some, perhaps at a slight performance deficit. If either MS or the ARB would add 3Dc to their spec it would be nice, because then we would probably get other IHVs to include 3Dc support in their designs, same with FP blending and filtering.
 
I would take the X800XT, is there any doubt? Really, it should only be a consideration if the other card is a 6800 Ultra, but it's not, it's a 6800GT.

http://www.spodesabode.com/content/article/nv40r420p2/5

Thanks for the link. ATi seems unconcerned with 3DMark scores as usual, and Far Cry (which should be optimized for Nvidia) is right on the same level as a 6800 Ultra. Otherwise, it's quite a bit faster than a 6800 Ultra. Nothing new to see, but it's a nice small one-chart page which pretty well sums it all up.
 
Hmm, well, I hate to jump into this debate, but my understanding of the whole PS 3.0 support thing is that the X800 will be able to duplicate PS 3.0 visuals. PS 3.0 is just a faster way of doing PS 2.0, along with unlimited pixel shader length (I know there is more to it, but it ain't much more). The 3.0 vertex shader has displacement mapping, which 2.0 can emulate... Anywho, I don't think the X800 is at a big disadvantage, but that's just my opinion. I personally went ahead and ordered an X800XT. I went to a friend's house who recently bought an Ultra. It looks good and seems to be a smokin' card, but I think the IQ seems crisper on my old 9700 Pro. I'm banking on the XT being just as crisp...
 
Most things done in PS 3.0 can be multipassed in PS 2.0 at a performance decrease. VS 3.0, on the other hand, cannot be emulated in VS 2.0 (you cannot do texture lookups in VS 2.0, so no displacement mapping).
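To put that in concrete terms: under Direct3D 9 an engine would have to gate true displacement mapping on the vertex shader version it finds at startup, since vertex texture fetch only exists from VS 3.0 up. A rough sketch with a made-up helper name (a real engine would also query vertex-texture support for the displacement map's format, e.g. via D3DUSAGE_QUERY_VERTEXTEXTURE, not just the shader version):

#include <d3d9.h>

// Hypothetical check: can this device do vertex-texture displacement mapping?
bool SupportsVertexTextureDisplacement(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    // VS 2.0 hardware has no texture lookups in the vertex shader, so
    // displacement mapping can't be emulated there; require VS 3.0 caps.
    return caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
}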
 
I'd go with the GT. Take SM 3.0 and new technology any time. It's just like the 9700 Pro with the first DX9 support. Games with SM 3.0 are coming soon; by the end of this year, I think nearly a dozen games will support it.
 
Yup, as mentioned many times before, PS 3.0 is a speed issue and not an image quality or capability issue.

Now, it totally depends on who you want to go by, but "rumors" say a game coded specifically for 3.0 may see up to a 35 percent performance increase over a game coded for 2.0. It does not add any new features, except the possibility of displacement mapping (which ATi just doesn't seem to support).

Contrary to FUD spread around the net, ATi's X800 does support Multiple Render Targets (MRTs): http://www.ati.com/developer/demos/rx800.html

The new "Crowd" demo shows this... so Nalu the Mermaid's hair could be just as fast on an ATi.
 
And unfortunately, it sounds like Doom 3 will not be able to use "ambient occlusion", which is ATi's way of doing NVidia Smartshading for all the shadows...

1400 soldiers with moving limbs crossing over one another, and with full shadows, running >30fps in a real game-type situation is a very impressive feat IMO. I don't think Carmack (or anyone) can use the new shadows, as he'd probably have to rewrite the engine. It is something to look forward to a few years in the future, though (if it ever catches on; I'm not a huge fan of fog or perfect shadows, never mind 1400 of them).

BTW, the X800 does seem to add a few features over the 9800. I'm not sure if the 9800 does sub-surface scattering (or a different way of doing it) or MRTs. So there were some core enhancements other than just a "doubling up" on the old tech.
 
BTW, the X800 does seem to add a few features over the 9800. I'm not sure if the 9800 does sub-surface scattering (or a different way of doing it) or MRTs. So there were some core enhancements other than just a "doubling up" on the old tech.

The 9800 does have sub-surface scattering. You can see it in the demo for SSS that many downloaded from ATI's site for the X800. You have to apply the wrapper to get the demo to run.

phas3d
 
burningrave101 said:
Yeah right, lol. Yeah, you're not an ATI fanboy. You've just been going on about ATI non-stop this whole thread, reaching for the skies for reasons to get the X800XT PE and not a 6800U, even though the 6800U is the better card in performance and technology. If your only reason is a $70 price difference, it's not a very good one.

Are you really that clueless?
 
Merlin45 said:
Most things done in PS 3.0 can be multipassed in PS 2.0 at a performance decrease. VS 3.0, on the other hand, cannot be emulated in VS 2.0 (you cannot do texture lookups in VS 2.0, so no displacement mapping).


Hmm, I'll have to look that up because I was pretty certain 2.0 does emulate displacement mapping, but it isn't as precise as 3.0. I'll edit when I find a link...

Edit:
Well I stand corrected.
http://www.hardocp.com/article.html?art=NjA5LDE=

Still interesting comparing those screenies in that article.
 
It can't do it in vertex shaders, because VS 2.0 doesn't allow for texture lookups, which is central to displacement mapping.
On another note, no video card in the world can do real-time sub-surface scattering. What is used in ATI's demo is a precomputed SSS model; there's no way you could do that for an animated model, because you couldn't precompute the situation. It does have possible applications with static models in games, though, but it is rather limited still.
 
agar said:
Humus Demo, using 3Dc.

With shadows:
No compression: 125fps
3Dc: 146fps (+17%)
DXT: 136fps (+9%)
3Dc & DXT: 158fps (+26%)

Without shadows:
No compression: 164fps
3Dc: 210fps (+28%)
DXT: 195fps (+19%)
3Dc & DXT: 239fps (+46%)


Available here: http://esprit.campus.luth.se/~humus/ and this is a thread discussing its advantages over DXT: http://www.beyond3d.com/forum/viewtopic.php?t=13579. Looks like there will be performance and visual differences between the two. The funny thing is, burning, you just spew crap from reviews without really researching what it is you are spewing. It's like you download Nvidia corporate docs and cut and paste them here.

You're the one that doesn't seem to understand what you're even talking about. You keep on about 3Dc, and you don't realize that most likely it will not even be used by game developers, and if it is, it will be in a small portion of games because of the work it takes to implement. Do you not realize that ATI has already tried to push DXT5 for a long time now and nobody wanted to use it? Give me a break. Why would they all start using it now?

"The 6800U supports pixel shader 3.0, vertex shader 3.0, 32-bit floating-point, and UltraShadow II Technology" ALL BOW DOWN TO THE HIGH-TECH ADVANCES THAT HAVEN'T BEEN USED IN ANY GAMES AND HAVE YET TO BE USED IN ANY GAMES. It's like people flamebaiting others about TruForm and how it was going to change graphics. Give me a break. This is your classic quote of the week: "I can see the truth when the rest of you seem to ignore it and the rest of you are totally unbiased." Wow! :D

I think you should re-think your comments and previous posts before you make statements like that.

They haven't been used in any games yet because the cards just got released, lol. If ATI had all this new tech and nVidia didn't, you know for a fact you would be trying to hold that over everyone's head. There have ALREADY been around a dozen games announced to use SM 3.0 this year, with more to follow, and Doom 3 WILL use UltraShadow II technology.

Even if a FEW game developers decide to put in all that extra work to use 3Dc, they will still have to add DXT5 support for the other cards, and there is only a minimal difference in performance between the two. I can't believe you guys are even getting hyped at all over 3Dc when ATI wasn't able to get anyone to use DXT5, lol.

Here are some pics of DXT5 vs 3Dc and some information on both.

http://www.nvnews.net/vbulletin/showthread.php?t=30772

Blackwind said:
Are you really that clueless?

Are you? I've provided more than enough facts and links to back up what I'm saying. So far none of you have produced anything but speculation, personal ATI fanboy opinions, and a whole lot of BS. How about you provide a couple of reviews using the NEW 61.34 drivers or higher and then show me where the X800XT PE is dominating the 6800U in any way, shape or form. :rolleyes:

I would take the X800XT, is there any doubt? Really, it should only be a consideration if the other card is a 6800 Ultra, but it's not, it's a 6800GT.

http://www.spodesabode.com/content/article/nv40r420p2/5

Thanks for the link. ATi seems unconcerned with 3DMark scores as usual, and Far Cry (which should be optimized for Nvidia) is right on the same level as a 6800 Ultra. Otherwise, it's quite a bit faster than a 6800 Ultra. Nothing new to see, but it's a nice small one-chart page which pretty well sums it all up.

They're still using the old drivers there, bud. Take a peek at Call of Duty, which is OpenGL. The beta 60.72 drivers hurt the 6800's performance badly, and that's exactly what spodesabode is using, because those performance results don't even come close to matching up with the reviews that have been released using the new 61.34 drivers and higher. That's just another worthless review.

Here are some of the games that have announced support for SM 3.0.

The Lord of the Rings: The Battle for Middle-earth
STALKER: Shadows of Chernobyl
Vampire: Bloodlines
Splinter Cell 3
Tiger Woods 2005
Madden 2005
Driver 3
Grafan
Medal of Honor: Pacific Assault
Unreal Engine 3
Painkiller Patch
FarCry Patch
Half Life 2 Patch

And there are links with pictures in this thread:

http://www.nvnews.net/vbulletin/showthread.php?t=30736
 
burningrave101 said:
Snip because it's so damn long!
Those pics are cool, but they don't compare them to Shader 2.0. They seem to show HL2 screenshots as if it was running SM 3.0 when it doesn't even have SM 3.0 implemented yet. I'm not taking away from SM 3.0 tech at all; I just don't believe that you will see a big visual quality difference. That's just my opinion. I could be totally wrong, but I haven't seen any evidence either way.
 
gsboriqua said:
Those pics are cool, but they don't compare them to Shader 2.0. They seem to show HL2 screenshots as if it was running SM 3.0 when it doesn't even have SM 3.0 implemented yet. I'm not taking away from SM 3.0 tech at all; I just don't believe that you will see a big visual quality difference. That's just my opinion. I could be totally wrong, but I haven't seen any evidence either way.

I don't think those pics are meant to show SM 3.0. I think they are just pics to give you an idea of what some of those upcoming titles look like in-game.
 
Getting performance out of their drivers? Something ATI has never been able to do.

Burningrave, you have got to be the most annoying Nvidia fanboy to pop its head up for some time :rolleyes:
For starters, your above sentence is the biggest load of crap ever spouted ;) Ever hear of or use a card called the Radeon 8500? The performance gained out of those cards with later driver releases was nothing short of excellent, and certainly as good as if not better than anything nVidia has done to date.
They were still getting performance out of that card nearly two years later through driver releases.
As for all this constant badgering you keep spouting in every goddamned thread that mentions the X800XT: STFU unless you own said card and know for sure how it overclocks and plays games. And if you want to buy the 6800GT, do it and stop busting everyone else's chops about it :rolleyes: Freedom of choice: some prefer ATI, some prefer NVIDIA!
As for the original thread topic, my opinion is go for the X800XT; it's the top of ATI's line of cards for the same price as Nvidia's middle-of-the-road card!
IT'S NOT THE BEST, IT'S NOT THE WORST, IT'S JUST AS GOOD AS ANYTHING NVIDIA CURRENTLY HAS, FULL STOP!
 
Every time I hear someone talk about the expected performance increase of upcoming drivers, it makes me wonder: are they really performing some miracle, or did they just rush the product to market and are now scrambling to get it right?
 
It is a combination of the second one and the fact that it takes time for people to figure out new ways to optimise things.
 
burningrave101 said:
Oh yeah, I'm a MEGA NVIDIA FANBOY just because I can see the truth when the rest of you seem to ignore it, and the rest of you are totally unbiased altogether because you like nVidia and ATI both the same. :rolleyes:

I never denied the poor performance of the NV30, because ATI definitely had the better cards with the 9700 Pro/9800 Pro. nVidia turned it around a lot with the 59xx cores, though, and a few solid driver updates to help in DX9 games. ATI deserves recognition for their accomplishments over the last two years, but that doesn't just dismiss all their years of suckage while nVidia held the top crown.

Some of you guys really crack me up, I tell ya. Just because none of you can come up with anything better, you resort to calling people a fanboy, lol. I love it :D.

Dyslexic = xSyzygy666x lol.

No, the reason I call you a fanboy is that you will not admit that both the 6800 and the X800 are great cards. Each has good points and bad points. All of the reviews, new and old, that I have seen say they perform very well and are fast as hell. NVidia has had ups and downs like any GPU company; they used to get spanked by 3dfx back in the day, so they were not always on top before the 9700 Pro came along. They got good when the first GeForce cards came out and held the lead till the GeForce 4s got slammed by the 9700s, so it wasn't for all that long. It's a tough thing to stay on top; no one does it forever. So stop, let your nose pull free from the collective asses of NVidia, and just take a break.

http://www.anandtech.com/cpu/showdoc.html?i=2091&p=1

By the way, this review that you posted proves that the X800 XT is faster with AA and AF enabled in all but old Q3-engine games. I don't know anyone with a decent rig that doesn't use AA and AF on every game they play.
 
Burningrave, you have got to be the most annoying Nvidia fanboy to pop its head up for some time
For starters, your above sentence is the biggest load of crap ever spouted. Ever hear of or use a card called the Radeon 8500? The performance gained out of those cards with later driver releases was nothing short of excellent, and certainly as good as if not better than anything nVidia has done to date.
They were still getting performance out of that card nearly two years later through driver releases.
As for all this constant badgering you keep spouting in every goddamned thread that mentions the X800XT: STFU unless you own said card and know for sure how it overclocks and plays games. And if you want to buy the 6800GT, do it and stop busting everyone else's chops about it. Freedom of choice: some prefer ATI, some prefer NVIDIA!
As for the original thread topic, my opinion is go for the X800XT; it's the top of ATI's line of cards for the same price as Nvidia's middle-of-the-road card!
IT'S NOT THE BEST, IT'S NOT THE WORST, IT'S JUST AS GOOD AS ANYTHING NVIDIA CURRENTLY HAS, FULL STOP!

Good job blowing off some more ATI BS in this thread, lol.

nVidia does a far better job than ATI of squeezing performance out of their cards with driver updates. Not to mention ATI's HORRID driver support for Linux. I bet an FX 5200 outpaces a 9800 Pro in Linux, lol.

Here is an article that tracks performance from the Catalyst 3.4 drivers through the 3.9s:

http://www.madshrimps.be/?action=getarticle&articID=104

And the X800XT PE is not the same price as nVidia's middle-of-the-road card; it and the 6800 Ultra are exactly the same price at $499 MSRP. CompUSA is offering their PNY 6800Us and GTs at a 30% discount. There are extremely few places you're going to find an X800XT PE for less than $500, if you can even find one in stock.


Dyslexic said:
No, the reason I call you a fanboy is that you will not admit that both the 6800 and the X800 are great cards. Each has good points and bad points. All of the reviews, new and old, that I have seen say they perform very well and are fast as hell. NVidia has had ups and downs like any GPU company; they used to get spanked by 3dfx back in the day, so they were not always on top before the 9700 Pro came along. They got good when the first GeForce cards came out and held the lead till the GeForce 4s got slammed by the 9700s, so it wasn't for all that long. It's a tough thing to stay on top; no one does it forever. So stop, let your nose pull free from the collective asses of NVidia, and just take a break.

In half the posts I've made in this thread I've referred to how the 6800 and X800 are both great cards. What I have been trying to get across to some rather thick-minded people is that the 6800U has more to offer you than the X800XT PE. THAT is why I'm raving about it being the better card. Not because it's an nVidia or ATI card, but because IMO it's the better card.

And now nVidia has released SLI technology for their 6800 PCI-E cards. Just one more reason why the 6800 is a better choice for future performance.

I've never once said the X800 Pro or X800XT PE performed badly compared to the 6800s. But I don't think the cards are evenly matched, because the 6800GT and 6800U have more going for them right now. And the 6800GT is quite a bit faster than the X800 Pro according to recent benchmarks. Add in the fact that it's a full 16-pipeline card and that it will OC to Ultra speeds.

X800Pro vs 6800GT

http://www.ixbt-labs.com/articles2/gffx/nv40-2.html

And for those of you disputing which card runs hotter, take a look here:

http://www.guru3d.com/article/Videocards/135/4/

X800 - 66 Degrees C
6800 GT - 49 Degrees C

Both cards were tested with a room temperature of 22 degrees C. The temperature measured, of course, is not the core temperature. Both cards were tested in the same time frame, after three runs of Aquamark 3 in the heaviest mode. So unless you have some seriously good cooling, you need to take note of the fact that the card will create higher ambient temperatures inside your case, which can have an adverse effect on other components, especially an overclocked processor (CPU).

I've verified this with several 6800U owners.

The 6800GTs are in stock and shipping now as well. Those that pre-ordered will soon have them in their hot little hands, if they don't already.

This ATI fanboyism has got to stop unless you can actually provide some evidence to back up your BS. I can fanboy all day and night if I want to, as long as I can support what I'm saying with factual information.

http://www.anandtech.com/cpu/showdoc.html?i=2091&p=1

By the way, this review that you posted proves that the X800 XT is faster with AA and AF enabled in all but old Q3-engine games. I don't know anyone with a decent rig that doesn't use AA and AF on every game they play.

Show me where the X800XT PE blows the 6800U away at 1600x1200 when AA + AF is enabled! You don't seem to realize that a pitiful 1-5 fps lead once in a while means nothing when the 6800U supports all this new technology that will be used in upcoming games, and the fact is the 6800U will blow the X800XT PE away in OpenGL games. It won't just be a 1-5 fps difference there, either.
 
That's funny, 'cause the article you posted says the following:

"With this article, we were also trying to put an end to the ATI vs. NVIDIA PCI Express debate. Our conclusion? The debate was much ado about nothing - both solutions basically perform the same. ATI's native PCI Express offering does nothing to benefit performance and NVIDIA's bridged solution does nothing to hamper performance. The poor showing of NVIDIA under Far Cry and Warcraft III is some cause for concern, which we will be looking into going forward. We will keep you all updated on any and all findings with regards to that issue as we come across them."

Doesn't sound like they think the 6800 Ultra is a better card.
 
Dyslexic said:
That's funny, 'cause the article you posted says the following:

"With this article, we were also trying to put an end to the ATI vs. NVIDIA PCI Express debate. Our conclusion? The debate was much ado about nothing - both solutions basically perform the same. ATI's native PCI Express offering does nothing to benefit performance and NVIDIA's bridged solution does nothing to hamper performance. The poor showing of NVIDIA under Far Cry and Warcraft III is some cause for concern, which we will be looking into going forward. We will keep you all updated on any and all findings with regards to that issue as we come across them."

Doesn't sound like they think the 6800 Ultra is a better card.

What does that say in the first sentence? Yeah, that's right: PCI Express. What in the hell does that have to do with AGP 8x? Both PCI-E cards performed horribly compared to the AGP 8x cards, and it's most likely because of the poor drivers currently available from both sides for PCI-E. The ATI PCI-E card is even native PCI-E and it still performed badly.
 