prerelease benchmarks r520

^eMpTy^ said:
No offense Netrat...but you'd believe anything positive that anyone said about ATi...

And likewise to you about negative news. Neither of you is right until the card actually gets released.
 
Shifra said:
you'd have to be mildly retarded to take those benchmarks for fact.

or maybe you missed the laughable +/- 5-10% difference from an X850.

eMpTy, I'm restraining myself from putting your awful comments down. Please try to be a little less biased and gullible.

Biased and gullible?

The benchmarks show the 16-pipe R520 performing much like a 16-pipe R420 in PS2.0 games...and getting a much better increase in performance in newer, more graphically intensive titles...isn't that about what you expected to see? Hell, the thing is nearly twice as fast as an X850 XT PE in Doom 3...
 
SnakEyez187 said:
And likewise to you about negative news. Neither of you is right until the card actually gets released.

And that means oh so much coming from you... :rolleyes:
 
^eMpTy^ said:
I doubt ATi would call it a hoax outright if there wasn't something seriously wrong...
Yeah, the R520 probably did perform too well in those benchmarks :p ;) :D
 
bah. i was excited till i actually read the thing. totally unreliable info. i'm calling B/S on this.
 
SatinSpiral said:
bah. i was excited till i actually read the thing. totally unreliable info. i'm calling B/S on this.

yeah...have no choice but to throw these benchies out...no biggie...we'll have more in a couple weeks...:)
 
^eMpTy^ said:
And that means oh so much coming from you... :rolleyes:

I didn't realize my comment was so offensive; it seems completely tame to me, but thanks for your heartwarming response.

Anyways, do you actually think you're right, and that you can take that stance, when doubt is being cast on the article from so many different sources? Do you, personally, have any proof of this, or are you just using it to parade your stance all over this thread?

Once again, neither of you is right until the actual review comes out. You're saying you don't agree with this? Why?
 
SnakEyez187 said:
I didn't realize my comment was so offensive; it seems completely tame to me, but thanks for your heartwarming response.

Anyways, do you actually think you're right, and that you can take that stance, when doubt is being cast on the article from so many different sources? Do you, personally, have any proof of this, or are you just using it to parade your stance all over this thread?

You have such a wild imagination.

I didn't find your comment offensive in the least...I just said that it didn't mean much coming from one of the most biased people on this board...

Also...I already agreed that the benchmarks are unreliable...I just noted that given the doom3 performance jump, it's still possible that they're real...but with ATi calling them a "hoax" and the fact that they came from a third party...I'd be hard pressed to take them at face value...
 
Riptide_NVN said:
It's not just about features... I need more performance at higher resolutions. The current cards, even the 7800GTX, probably aren't going to be able to run modern games at the resolution I play (1920x1200) at full detail with everything turned on (HDR, AA, AF) and all the way up.

Granted people that run in 1600x1200 or higher are still probably the minority but I'm talking about my own needs here. ;)

I'm right there with ya, I want more performance as well. I was just making sure people understood "Next-gen" doesn't always mean more performance.

But ya, I want high rez, AA/AF, HDR, the works, with 60+ consistent framerates :D
 
Let's all take a deep breath, stop the fighting, and try to have a civilized conversation. Personally, I'm really enjoying the drama introduced by ATi's apparent denial that these benchmarks are legit.
 
jkrafcik said:
Especially given that 90nm parts tend to run VERY hot and that this part is running at a high clock, I doubt many enthusiasts who are interested in cool and quiet workstations or HTPCs will jump on this product.

Let me first say: my knowledge of semiconductor manufacturing is limited. I learned quite a bit in my first semiconductor class about 3 years ago and have read this and that on the web.

From all I have ever heard, it appears to me that up to a certain point, a reduction in feature size, e.g. going from 130nm to 110nm, will lead to lower temperatures due to the reduced capacitance of the channel. It also allows for faster clockspeeds for the same reason. Having said that, reducing the feature size increases the leakage current as well, hence the need for SOI and other techniques.

So why exactly should 90nm parts run hotter than 110nm parts at the same clockspeed? Given that the X1800 runs faster, it could require more cooling than the X850 XT PE, but that is not an inherent characteristic of the 90nm manufacturing process; it's due to the clockspeed increase.
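
To put a rough first-order sketch on that (textbook scaling only, nothing ATi-specific, and the symbols are just the usual ones): dynamic switching power goes roughly as

P_dyn ≈ a * C * V^2 * f

where a is the activity factor, C the switched capacitance, V the supply voltage, and f the clock. Shrinking the process lowers C (and usually lets V drop a little), so at the same clock the switching power goes down. Total power is more like

P_total ≈ a * C * V^2 * f + V * I_leak

and it's the I_leak term that gets worse as the gates get thinner, which is why a 90nm part can still end up hot even though the per-transistor switching power dropped.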
 
^eMpTy^ said:
Very true statement...though I suspect that all the delays gave ATi ample time to do the typical first round of driver enhancements...hopefully more info about that will be available at launch...
One could think that the lack of final silicon and cards prevented the driver team from being able to optimize the drivers. Your point is that the long delay allowed for more driver development time, but I somehow doubt that this is the case, as you need to know exactly what you are working with, i.e. the final product, in order to optimize the drivers for it.
 
Reducing the feature size can also reduce the physical size of the chip, right? If you do that, there is less surface area for the heat to dissipate through, hence a potentially hotter-running core, because the same power is concentrated in a smaller area.
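
A quick hypothetical number to illustrate (made up, just to show the idea, not real R520 figures): if a chip dissipating around 100W shrinks from roughly 280mm^2 to roughly 180mm^2, the power density goes from about 0.36 W/mm^2 to about 0.56 W/mm^2, so even at the same total wattage the heatsink is pulling the same heat out of a smaller, hotter patch of silicon.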
 
Some of the things people say just piss me off.....

BTW I am neither!!!!!!... just sold my 9800pro for a 6800nu


TAKE THIS SHIT WITH A GRAIN OF SALT...this isn't even an official review (as in from a reputable site, e.g. Hard, Anandtech, etc.)


Just wait till the fucker comes out and then make stupid comments like....


"Wow ATI is FUCKED"
"Wow I could see this coming"
"Wow what the fuck was ATI thinking"

give it some fuckin' time (yes, I know it's been delayed 23432725 times LOL)
 
Well, I wouldn't put it past ATI to shoot down those benchmarks, especially if they don't showcase the card the way they'd like. Of course they're going to say it's an OC'd X850, though the numbers could also be complete fiction. Why you'd use an OC'd X850 rather than just making them up seems stupid; nothing is stopping someone from just inputting random values into Excel.

We could easily get to the bottom of this by looking at similar benches on a highly OC'd X850 machine, which I'm sure someone has kicking around.

Personally, I'm not too sure the OC'd x850 ever competed with the GTX on that level, but whatever.
 
Netrat33 said:
"This is a complete hoax - done with an OC'd X850, we think. Call our partners yourself and ask if they have R520XTs in house. The numbers aren't even close."


With those numbers... I believe it is an OC'd X850

What do you expect an ATI PR guy to tell you? "Yeah, it's true, our card isn't as fast"? And with NDAs and junk, board partners aren't going to say a word even if you did ask them.
 
PRIME1 said:
They are launching in 2 weeks and none of their board partners have one yet???? :eek:

Agreed. I've had a 9800XT and an X850XT, so I'm kind of hoping that ATi pulls out a GTX dominator, but those benchmarks do look reasonable. Personally, I think they'll ramp up the clock speeds right before launch just to win a couple of benchies.

The guy explained how he got the board and how ATi wouldn't give him one; it doesn't necessarily imply that he hates ATi and is incapable of writing an unbiased review. This is just some random dude on the internet who probably has a friend of a friend of a cousin who runs a manufacturing plant in Taiwan that PROBABLY has an R520 by now, considering how soon the launch date is. Quite frankly, people like Shifra just won't accept any benchmark or comparison that gives ATi the short end of the stick. Two-slot cooling solution, equal performance at best, new 90nm process, delays up the wazoo: it just doesn't look good for ATi to take the performance crown this round (although I'm sure they're making big cash on OEMs/most consumers/XBOX 360).

To all the flamers: step outside, embrace the sunlight, game on your already capable video card that can max anything but F.E.A.R., and just hold your breath for more sources. You WILL NOT benefit either way, whether ATi or Nvidia has the better card, unless you own stock. :)
 
revenant said:
I saw this link over on the guru3d forums...

http://www.clubic.com/actualite-22488-les-premiers-benchs-du-radeon-x1800.html

it's French, but the numbers look more realistic... granted, Half-Life 2 loves the ATi architecture... but anyways...
You do realize that's a chart from the HA "review," right?

Personally, I think the numbers are suspect, but after everything ATI has said about the performance, I kind of expect numbers somewhere in the ballpark of the ones in the "review."

I'm still going to wait for the more reputable sites to put up their reviews soon.
 
I'm disappointed. I know these are not the final numbers, but really they probably won't change much. From the looks of it, nVidia has won this round. Too bad.
 
[BB] Rick James said:
I'm disappointed. I know these are not the final numbers, but really they probably won't change much. From the looks of it, nVidia has won this round. Too bad.


hello I'm a moron, ATI why even bother releasing, obviously your suckage is huger than their suckage, this person on the internet said so
 
drizzt81 said:
One could think that the lack of final silicon and cards prevented the driver team from being able to optimize the drivers. Your point is that the long delay allowed for more driver development time, but I somehow doubt that this is the case, as you need to know exactly what you are working with, i.e. the final product, in order to optimize the drivers for it.

My understanding was that the respins were related to the manufacturing process...not to the design of the chip...so early silicon and final silicon would be the same from a driver perspective...though I could totally be wrong...
 
Brent_Justice said:
I'm right there with ya, I want more performance as well. I was just making sure people understood "Next-gen" doesn't always mean more performance.

But ya, I want high rez, AA/AF, HDR, the works, with 60+ consistent framerates :D
If that is the case, then you will probably want a next-gen console, 'cause that is probably the only way you can hope to get 60 fps consistently.
 
Shifra said:
hello I'm a moron, ATI why even bother releasing, obviously your suckage is huger than their suckage, this person on the internet said so

How much is ATi paying you per post? I mean, it must take a lot of work to research, manipulate facts, deny other facts, and flame-bait every logical and rational person on the forums. That must require a lot of effort; I hope they pay well.

In other non-slandering news, David Orton's comments go hand in hand with the benchmarks released (the R520 may or may not beat the GTX depending on clock speed). It's competitive; that's all you can really say. A driver boost here and some optimization there can easily account for +/- 10-20% before the day we see the cards in stores.

EDIT: one more thing, does anyone know if this card will by any chance be able to do AA + HDR? Because that would definitely be worth upgrading for.
 
PRIME1 said:
They are launching in 2 weeks and none of their board partners have one yet???? :eek:

Since HKEPC already got a card from an AIB partner and they had pictures of it...I'm betting AIB partners do in fact have the cards...dunno why ATi would say that...
 
ATI's (Chris Hook's) first reaction to this article was: "Fiction. I don't believe these numbers were ever run on a 520."

A second response just came in:

"This is a complete hoax - done with an OC'd X850, we think. Call our partners yourself and ask if they have R520XTs in house. The numbers aren't even close."

Okay, wait. Aren't they launching in a couple of weeks? If so, then why don't their AIB partners have any R520 XTs in house? Shouldn't they have them in production now if they want to do a hard launch?

Color me confused... wait on that. Color ATI confused cause something doesn't look right here no matter how you look at it.
 
trudude said:
If that is the case, then you will probably want a next-gen console, 'cause that is probably the only way you can hope to get 60 fps consistently.

LOL no... let's not start that shit
 
CrimandEvil said:
Okay, wait. Aren't they launching in a couple of weeks? If so, then why don't their AIB partners have any R520 XTs in house? Shouldn't they have them in production now if they want to do a hard launch?

Color me confused... wait on that. Color ATI confused cause something doesn't look right here no matter how you look at it.
R520 XTs are not launching in a couple of weeks, IIRC. The mid-range parts are coming out first, and the high-end cards are coming out after that.
 
Well lets just say this...

If those benchmarks are correct, then it is obvious that someone who would actually spend money on an X1800XT is either a DIEHARD fanATic or just ignorant. Let's hope that those benchmarks are not correct and the X1800XT is more competitive with the 7800GTX, so that prices on the cards drop.
 
jebo_4jc said:
R520 XTs are not launching in a couple of weeks, IIRC. The mid-range parts are coming out first, and the high-end cards are coming out after that.
Well that explains things. ;)
 
jebo_4jc said:
R520 XTs are not launching in a couple of weeks, IIRC. The mid-range parts are coming out first, and the high-end cards are coming out after that.
That is a really bad idea on ATi's part, because even the lower-end X1x00 cards will be competing directly with the X850 cards. In the general public's eye, the higher model numbers and higher clock speeds, coupled with the lower price on the new line, will hurt X850 sales.
 
Shifra said:
hello I'm a moron, ATI why even bother releasing, obviously your suckage is huger than their suckage, this person on the internet said so

Hey I'm in the same boat as you, last I heard we were both consumers. You must be sleeping with mother ATi to take such offense to something I said. I bet you buy the card just to get a little of your dribble on it. Maybe procreate and make an even better card. :rolleyes:
 
CrimandEvil said:
Well that explains things. ;)
Yeah I think it might, actually. I hunted around for the info, and this was all I could find:
http://www.anandtech.com/video/showdoc.aspx?i=2501
It says X1800XL will be launched soon, with the XT version to follow. The fact that XL will be released first makes this "preview" seem a little less credible, imo, since it only mentions Pro and XT. Also, that would explain why ATi Guy says that no board partners have XT's in their hands yet.

But hey, what do we know, right?
 
neubspeed said:
How much is ATi paying you per post? I mean, it must take a lot of work to research, manipulate facts, deny other facts, and flame-bait every logical and rational person on the forums. That must require a lot of effort; I hope they pay well.

In other non-slandering news, David Orton's comments go hand in hand with the benchmarks released (the R520 may or may not beat the GTX depending on clock speed). It's competitive; that's all you can really say. A driver boost here and some optimization there can easily account for +/- 10-20% before the day we see the cards in stores.

EDIT: one more thing, does anyone know if this card will by any chance be able to do AA + HDR? Because that would definitely be worth upgrading for.

You're doing the same thing eMpTy has been doing in a lot of threads: misquoting Orton. Once again, since I've had to correct people four times now in this forum, what was said exactly was,

"The R520 should exceed the 7800GTX in all benchmarks but recognizes that the G70 has room for higher clocks."

That is not what is represented in that mockery of a preview's benchmarks, which show the R520 totally crapping out in everything vs the GTX. And once higher resolutions come in, you may as well use it as a paperweight. It's almost the exact OPPOSITE of what was said, yet you seem to think you can link the two, okay?

Some people seem to be missing a huge freakin' part of their brain, since the XT isn't even going to be the opposition for the GTX; it's supposed to put pressure on the GTX so Nvidia has to respond. The card targeted for direct GTX competition is the 1800XL. But you apparently think the XL and XT are both going to be more expensive and perform worse than the GTX, because that makes a load of sense, right? ATI wouldn't release them that way, because no one would buy them.
 
[BB] Rick James said:
Hey I'm in the same boat as you, last I heard we were both consumers. You must be sleeping with mother ATi to take such offense to something I said. I bet you buy the card just to get a little of your dribble on it. Maybe procreate and make an even better card. :rolleyes:

Rick James made a funny :p :D
 
Shifra said:
You're doing the same thing eMpTy has been doing in a lot of threads: misquoting Orton. Once again, since I've had to correct people four times now in this forum, what was said exactly was,

"The R520 should exceed the 7800GTX in all benchmarks but recognizes that the G70 has room for higher clocks."

That is not what is represented in that mockery of a preview's benchmarks, which show the R520 totally crapping out in everything vs the GTX. And once higher resolutions come in, you may as well use it as a paperweight. It's almost the exact OPPOSITE of what was said, yet you seem to think you can link the two, okay?


Did you know that what a person says and what the thing actually does are two different things?

Did you also know that ON PAPER Matrox's Parhelia would have owned nVidia and ATi with its 512-bit bus...and look where it is now...

The R520 = not impressive. And you STILL can't buy it...

So get over your ego and stfu plz kthnx
 