X1800XT wipes the floor with 7800GTX!! DAMN!!!!

Status
Not open for further replies.

Saeid

Limp Gawd
Joined
Aug 19, 2005
Messages
435
WOW... a picture is worth a thousand words...

ATi has proved that you don't need 24 pipes to get performance... their X1800XT has only 16 pipes, and it kills!

Btw, this review was done with Catalyst 5.9, which isn't even optimized for the X1K cards! :p

P.S. Damn, the X1600, which is only 128-bit, beats the X800XL!

[Attached image: 3dm.png (3DMark05 score chart)]


Link: http://www.hothardware.com/viewarticle.cfm?articleid=734&cid=2
 
Funny how it's only 3DMark 2005... yay for synthetic crap that means nothing in real life.

In real games the margin is way tighter, even at ultra-high resolutions with all the pretty stuff turned on.

get a life
 
Saeid said:
WOW... a picture is worth a thousand words...

ATi has proved that you don't need 24 pipes to get performance... their X1800XT has only 16 pipes, and it kills!

Btw, this review was done with Catalyst 5.9, which isn't even optimized for the X1K cards! :p

P.S. Damn, the X1600, which is only 128-bit, beats the X800XL!

Link: http://www.hothardware.com/viewarticle.cfm?articleid=734&cid=2
OMG :eek: :eek: It's teh 3d 733t m4rk 05!! Love 3DM05 multiplayer.
 
That is a perfect example of why 3DMark is not a game, or even a decent representation of gameplay.

The 6800 Ultra, the standard 6800, and even the X800XL all provide a much better gameplay experience than the X1600XT, despite its higher score.
 
That review is crap for one simple reason: whose 7800GTX scores 7600 in 3DMark05? Definitely not my XFX with an X2 4400+. You can wipe the floor with that review. ;)
 
ATi has the X1800XT's core clocked to the max just to beat nVidia's 7800GTX. If nVidia OCs their core to the max, nVidia will come out on top. Some experts on the SteamPowered forum, and even an ATi employee, said ATi clocked the core to the max and it "might not" even OC anymore. :(

"FLAME SHIELD ON" :D
 
Um, this thread is worthless. That's all I can say. It's 3DMark05, for crying out loud!

Stereophile, we all know that the X1800XT wins in FEAR. How about the other games that were tested?
 
ATi can mop the floor when it actually has the mop, not just the friggin design for the mop...
 
ReDgUaRd008 said:
ATi has the X1800XT's core clocked to the max just to beat nVidia's 7800GTX. If nVidia OCs their core to the max, nVidia will come out on top. Some experts on the SteamPowered forum, and even an ATi employee, said ATi clocked the core to the max and it "might not" even OC anymore. :(

"FLAME SHIELD ON" :D

If ATi can clock their cores so high and still be stable, why not do it? Besides, XFX and others have a GTX at 490MHz core, which is just about as high as you're going to go on air. So your argument is moot. Adding more MHz to the core wouldn't get 2x the frames in F.E.A.R.

pxc said:
I found another graph with longer bars for the X1800XT vs the 7800GTX:

http://techreport.com/reviews/2005q4/radeon-x1000/power.gif

LOL

44W hotter at idle, 25W hotter under load.

What? 44W hotter? You don't make any sense.

tranCendenZ said:

You can't compare different reviews like that. We have no idea if the in-game settings are the same or not. Hopefully newer drivers will get better performance (without bugs this time), but who knows.
 
Is nobody concerned with why "their" 7800GTX used in this benchmark only scored 7600 in 3DMark05? My XFX scores 8700... and that's not with an FX processor, but an X2 4400+. Something smells here.
 
Congrats on being a retarded !!!!!!. Maybe you should read the entire review. At the END of the review they state:

"Overall though, we'd consider the GeForce 7800 GTX the "faster" all-around card in terms of general gaming performance."
http://www.hothardware.com/viewarticle.cfm?articleid=734

Oh, and I sure am glad you can go buy two of them to wipe the floor with my setup. :)

Oops, no you can't! ;)

I find it nice how you left out all of the results where the GTX beats the XT. My biggest problem is that ATI releases a product 5 months after nVidia and it merely keeps pace. To me, that is a failure on their part. In 5 months they should at least kill it in EVERY game, not just a few, or even half. Better luck next time, ATI.
 
Joe Fristoe said:
Is nobody concerned with why "thier" 7800GTX used in this benchmark only scored 7600 in 3DMark05? My XFX scores 8700....and that's not with an FX processor...but an X2 4400+. Something smells here.

Your card is at 490MHz core and 1300MHz memory? That's much faster than their card is clocked. It's 7700, not 7600, btw. 7800 seems to be about average for a stock-clocked GTX in reviews. The CPU matters little, so that's moot. There are also different drivers, different driver options, etc.
 
fallguy said:
Your card is at 490MHz core and 1300MHz memory? That's much faster than their card is clocked. It's 7700, not 7600, btw. 7800 seems to be about average for a stock-clocked GTX in reviews. The CPU matters little, so that's moot. There are also different drivers, different driver options, etc.

So they're benching the 7800GTX cards with the lowest core speed on the market? Makes sense to me. :rolleyes:
 
Maybe we'll see a difference in FEAR benchmarks when the 512MB version of the GTX comes out?? :rolleyes:
 
It makes a little sense. I mean, ATI has been developing the card for some time now, so why would ATI release something that is weaker than the 7800GTX? But I would also like to see some in-game benches from a legit site like HardOCP.
 
Mnx4 said:
ATi can mop the floor when it actually has the mop, not just the friggin design for the mop...
hahahahahaha

How true, how true. I also hate this thread title.
 
I would like to see how it compares to my XFX 7800GTX @ 490.

And 3DMark05 is useless for this comparison...
 
fallguy said:
What? 44W hotter? You don't make any sense.
I hoped that most could make the mental leap that more power consumed = more heat generated. I overestimated, obviously.
 
The basic 3DMark run (locked at a relatively low resolution) is significantly processor-limited once you hit 8000 or so 3DMarks. After that, you're benching the processor more than the video card.

I have the longest bar on the graph, so my e-peen is still intact. :)
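The CPU-limit point above can be sketched with a toy model: a frame can't finish faster than the slower of the CPU and GPU stages, so once the GPU outruns the CPU, extra GPU speed stops moving the score. All the millisecond figures below are made up for illustration, not real benchmark numbers.

```python
# Toy model of a CPU-limited benchmark: frame rate plateaus once
# the GPU stage is faster than the CPU stage.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    # The frame is gated by the slower stage.
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

cpu = 8.0  # hypothetical CPU cost per frame at 3DMark's low resolution

for gpu in (16.0, 10.0, 8.0, 5.0):  # progressively faster GPUs
    print(f"GPU {gpu:4.1f} ms/frame -> {fps(cpu, gpu):5.1f} fps")
# Once gpu < cpu (8 ms here), fps is stuck at 125 -- past that point
# you're benching the processor, not the video card.
```

This is why two very different high-end cards can post near-identical scores on the same CPU.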
 
Joe Fristoe said:
So they're benching the 7800GTX cards with the lowest core speed on the market? Makes sense to me. :rolleyes:

I didn't look at the speed of their card, so I don't know. The stock clocks are 430/1100 for a GTX. Sure, a lot are sold faster than that, but those are the stock clocks, and reviews should include a stock card. Personally, I would rather see both a stock-clocked card and an overclocked card, because that benefits the reader the most: overclocked cards generally cost more, and the reader could decide whether they are worth the cash.
 
fallguy said:
I didn't look at the speed of their card, so I don't know. The stock clocks are 430/1100 for a GTX. Sure, a lot are sold faster than that, but those are the stock clocks, and reviews should include a stock card. Personally, I would rather see both a stock-clocked card and an overclocked card, because that benefits the reader the most: overclocked cards generally cost more, and the reader could decide whether they are worth the cash.

The reference speeds for a GTX are actually 430/1200. In addition, I would be surprised to see third-party manufacturers make overclocked X1800XT cards; these cards really have been OC'ed to the max, so how much more could they do? If so, nVidia would be able to provide the faster card, mainly because theirs can still be overclocked.
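As a quick sanity check on the clocks being argued over: peak memory bandwidth follows directly from the effective memory clock and the bus width. A minimal sketch, using the 430/1200 vs 430/1100 figures from the posts above and the GTX's 256-bit bus:

```python
# Peak memory bandwidth in GB/s from effective memory clock (MHz)
# and bus width (bits): bytes/s = MHz * 1e6 * bits / 8.

def bandwidth_gbs(effective_mem_mhz, bus_width_bits):
    return effective_mem_mhz * bus_width_bits / 8 / 1000.0

print(bandwidth_gbs(1200, 256))  # reference 430/1200 GTX: 38.4 GB/s
print(bandwidth_gbs(1100, 256))  # the 1100 figure quoted earlier: 35.2 GB/s
```

So the 1100 vs 1200 disagreement is worth about 3.2 GB/s of theoretical bandwidth, which is why the reference spec matters when comparing reviews.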
 
Saeid said:
WOW... a picture is worth a thousand words...

ATi has proved that you don't need 24 pipes to get performance... their X1800XT has only 16 pipes, and it kills!

Btw, this review was done with Catalyst 5.9, which isn't even optimized for the X1K cards! :p

P.S. Damn, the X1600, which is only 128-bit, beats the X800XL!

[Attached image: 3dm.png (3DMark05 score chart)]


Link: http://www.hothardware.com/viewarticle.cfm?articleid=734&cid=2

Big fuckin deal. Do you buy top-of-the-line vid cards to run synthetic benchmarks all day, or to play games at their max settings?

But then again, I forgive you, since you appear to be a noob who is new to the PC gaming world.
 
I just really tire of people not understanding the pipe situation. It is no miracle in the slightest that 16 pipes can beat 24. AT ALL. Graphics is a parallelized task, and pipe counts have always been used as theoreticals. When you calculate the theoreticals for those pipes, it goes 16*600MHz = 24*400MHz, egg-fugging-xactly.

When you read articles like The Tech Report's, they show you how the balance of all these separate parts of the chip has been changing. As these functions get more and more "decoupled" from one another, things get more confusing, so the stats may not spell everything out exactly... true. This may lead people to say pipelines don't matter to them, and that's fine.

To me, the stats as given are still all we have until the cards come out, and once they do get benched, knowing them helps quantify things like improvements made to each pipe. When the paper specs line up and one card is faster, it's easier to call that one more efficient. But beyond that, we just have a bunch of people piggybacking Brent's notion that pipelines don't matter, with "half-cooked" ideas as to why. Things like ATi's smaller fragment size give far more credit to pipeline counts not being self-defining... the 16=24 notion, given the clockrate increase, makes that a straight-up wash. If anything, using that argument proves there is more to be understood before the blanket statements get made.
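The pipes-times-clock arithmetic in the post above is easy to verify: theoretical pixel fillrate is just pipelines multiplied by core clock, so 16 pipes at 600MHz and 24 pipes at 400MHz come out identical. A quick sketch using the post's round numbers (actual shipping clocks differ slightly):

```python
# Theoretical pixel fillrate (Mpixels/s) = pixel pipelines * core clock (MHz).
# This is exactly the 16*600 = 24*400 wash described above.

def fillrate_mpix(pipes, core_mhz):
    return pipes * core_mhz

x1800xt_ish = fillrate_mpix(16, 600)  # 16 pipes at 600 MHz -> 9600 Mpix/s
gtx_ish = fillrate_mpix(24, 400)      # 24 pipes at 400 MHz -> 9600 Mpix/s
print(x1800xt_ish, gtx_ish, x1800xt_ish == gtx_ish)
```

In other words, neither pipe count nor clock speed is self-defining; only their product sets the theoretical ceiling.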
 
Rob94hawk said:
Big fuckin deal. Do you buy top-of-the-line vid cards to run synthetic benchmarks all day, or to play games at their max settings?

But then again, I forgive you, since you appear to be a noob who is new to the PC gaming world.

And it's not a real card that you're able to purchase? :eek:
 
Seems like a competitive card, though it's really lacking in OGL.

I'm happy with these cards. Now let's see them in stores :)
 
lithium726 said:
Seems like a competitive card, though it's really lacking in OGL.

I'm happy with these cards. Now let's see them in stores :)
Yeah, just as soon as Crossfire is out.
 
Rob94hawk said:
Big fuckin deal. Do you buy top-of-the-line vid cards to run synthetic benchmarks all day, or to play games at their max settings?

That all depends on the individual.

Some people (like myself) couldn't give a rat's ass about 3DMark tests, while others' sole purpose on this earth is to have the biggest "E" penis and nothing more.

In other words, to each his or her own...
 
animosity said:
Yeah, just as soon as Crossfire is out.

If the card isn't good at OpenGL, it will only fall behind that much more when using 2 cards (it will never touch the GTX in SLI at OpenGL). A weakness is a weakness no matter how you put it.
 
Saeid said:
WOW... a picture is worth a thousand words...

ATi has proved that you don't need 24 pipes to get performance... their X1800XT has only 16 pipes, and it kills!

Btw, this review was done with Catalyst 5.9, which isn't even optimized for the X1K cards! :p

P.S. Damn, the X1600, which is only 128-bit, beats the X800XL!

[Attached image: 3dm.png (3DMark05 score chart)]


Link: http://www.hothardware.com/viewarticle.cfm?articleid=734&cid=2

Ahem... cough...

BWAHAHAHAHAHAHA

Sorry, slipped.

The funny thing is that it looks like this is pretty much the same situation as the 6800 vs. X800: the 6800 has better OGL, the X800 better DX. But it's not like a few frames here or there will make any game unplayable.

So, meh, nothing really changes.
 
brucedeluxe169 said:
Funny how it's only 3DMark 2005... yay for synthetic crap that means nothing in real life.

In real games the margin is way tighter, even at ultra-high resolutions with all the pretty stuff turned on.

get a life

Especially when a 6800 Ultra SLI setup can score just as high as 7800GTXs in SLI, which are CLEARLY faster. Yeah, 3DMark is a great benchmark. :rolleyes:
 
Here is what confuses me so much.

ATI did this time what is, or was, considered business as usual in the graphics game.

nVidia broke with tradition and did the paper launch and the real launch at the same time, so now the world expects that all the time?

Come on, it's one time they did it, and now everyone hates ATI because instead of changing their business structure they just kept on trucking?

Is it right, the way they do things? Sure; it was just fine for years, so why is everyone so wound up now?

The card is pretty much the GTX's equal, and who honestly expected more or less?

You guys are really silly. Even the people that run the websites are stuck on this "nVidia gives it to us at the same time, neener neener neener" crap. It's childish to me.

ATI did what is, or was (perspective), the norm in the GPU business, and now everyone calls them a failure?

Come on.
 
We've all heard the reports of the new drivers too... boosting FEAR performance by 30%.
 
The 7800GT & GTX are out now; where has ATI been while nVidia has been destroying any card ATI produces? How long do we have to wait to see if ATI can even compare to anything nVidia produces? I feel bad for ATI, sitting all this time watching nVidia dominate the graphics business. They'd better have one hell of a card to sway people over to ATI.
 