R600 Pics

After closer examination, the GPU doesn't look oversized at all.

It looks about the same size as an X1900 XT.

If you were to remove that fan and have just the graphics card itself, it doesn't look too bad ;)
 
Man, I am wondering if a 750W PSU will be enough to power this card and all the other stuff I have in my sig.

This thing is a MONSTER!!!!
 
Ugh, why does this thing have to be so long! I'm not giving up on SFF, but this card doesn't really help much.
 
I'm still wondering if it will even use as much power as an 8800GTX does. The 6+8 setup could simply be there for compatibility: you could use either 6+6 or 6+8 to power it, based on what your PSU has. Allowing for some engineering margin, you want to avoid using over ~80% of the maximum power, and with 6+6 maxing out at 240W (which makes the math easy) you're looking at 20A (which was mentioned earlier) as the maximum that can go through the connectors and slot. Assuming the worst case, where you wouldn't want to use more than ~80% of that, you have roughly 200W max. More than likely it won't even approach that kind of usage.

Other than the amount of memory on the card, an 8800 is simply built to use more power. R600 should use a smaller process, which means less power draw. If that old die shot was accurate, the dies are roughly the same size for both G80 and R600, and that doesn't include NVIO, which will suck down even more power.

Lower those power figures even further when you consider that the blower alone should be able to suck down 2A, as well as small children.
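To put rough numbers on that budgeting argument, here's a quick back-of-the-envelope sketch in Python (my own, assuming everything is delivered on the 12V rail and using the 20A and ~80% derating figures from the post above):

# Back-of-the-envelope budget for a 6+6 setup, assuming a 12 V rail.
# The 20 A ceiling and ~80% derating are the post's figures, not spec.
RAIL_VOLTAGE = 12.0   # volts
MAX_CURRENT = 20.0    # amps, total through the slot plus both connectors
DERATING = 0.80       # stay under ~80% of the maximum for safety margin

max_power = RAIL_VOLTAGE * MAX_CURRENT   # 240 W ceiling
safe_power = max_power * DERATING        # 192 W, i.e. roughly "200W max"
print(f"ceiling: {max_power:.0f} W, safe budget: {safe_power:.0f} W")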
 
I'm still wondering if it will even use as much power as an 8800GTX does. More than likely it won't even approach that kind of usage.

M8, you would be right if ATI were smart enough to make it run efficiently; odds are it will have a non-3D mode and a 3D mode, where it will go all out and use every bit of its power lol
 
Ugh, why does this thing have to be so long! I'm not giving up on SFF, but this card doesn't really help much.

You DID actually read this thread didn't you? That is the OEM VERSION. The retail version is 9", not the 12 you see there.
 
M8, you would be right if ATI were smart enough to make it run efficiently; odds are it will have a non-3D mode and a 3D mode, where it will go all out and use every bit of its power lol

Except from an engineering standpoint you can only use what you're given, and to actually meet spec and follow safety regulations they are required to do things a certain way. There's only so much power they can use, so exceeding that figure isn't very likely. And the power connectors are standardized, so you make do with what's available.

The only reason Nvidia used less power last gen was because the chips were significantly smaller and less capable, a difference that likely won't exist this time, since Nvidia had a lot of features to improve for G80.
 
Here is a better quality picture: http://vr-zone.com/?i=4622

Also: "VR-Zone has learned about some new details on 80nm R600 today and there will be 2 SKUs at launch; XTX and XT. There will be 2 versions of R600XTX; one is for OEM/SI and the other for retail. Both feature 1GB DDR4 memories on board but the OEM version is 12.4" long to be exact and the retail is 9.5" long.The above picture shows a 12.4" OEM version. The power consumption of the card is huge at 270W for 12" version and 240W for 9.5" version. As for R600XT, it will have 512MB of GDDR3 memories onboard, 9.5" long and consumes 240W of power. Lastly, there is a cheaper R600XL SKU to be launched at a later date."

The OEM is crazy long, but the retail is a lot more down to earth. It will be interesting, to say the least, to see what this thing can do, but I wonder what pricing will be for the XTX :eek:
 
I'm a tad confused about VR-Zone's comments. Why would the retail version draw 30W less than the OEM? Will OEM clock speeds be radically higher?

30W is a very large chunk of power, and it seems strange that they would differ so substantially.
 
Phide, pix of the OEM fan show it pulls up to 2 amps, so that is 24W (!!!) right there. If there is a different fan on the retail cards, that could account for part of it. Early rumours also had the OEM cards using rev A12 silicon, with retail using A15, so who knows for sure.
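(For reference, that 24W figure presumably assumes the fan runs off the 12V rail: 2A × 12V = 24W.)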
 
I've been going off the assumption that the current OEM card is the A13 revision and the retail will be the A15 revision: same die, but the retail cards, and eventually the OEMs, will get newer, improved silicon that is otherwise identical.

VR-Zone's numbers seem slightly off the mark as well. Those numbers would likely be the maximum power draw for the connectors, not the actual usage: so 150W through the PCI slot, then 15W per pair of pins on the additional connector. Oddly enough, the fan that was pictured was rated at 24W as well. Assuming that's only on the OEM cards, that could be a significant part of the power difference between OEM and retail.
 
Are you sure that A15 will be available at launch? I heard it was a ways away... like a future revision with a 1.0 GHz clock.
 
I agree, I think VR-Zone's numbers were for the maximum watts the card will use...
Correct me if I'm wrong, but...
PCIe slot = 75W, 6-pin = 75W. 75 / 6 = 12.5W per pin; 12.5 × 8 = 100W. 75 + 75 + 100 = 250W...
Now I don't know if that's exactly how power works, but it seems logical to me :D :p
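For what it's worth, the arithmetic checks out under its own assumptions. Here is the same extrapolation spelled out (a sketch of the post's per-pin reasoning, not the official spec, which rates connectors as a whole rather than per pin):

# The per-pin extrapolation from the post above. Note this is speculation:
# the actual PCIe spec rates the 8-pin connector at 150 W, not 100 W.
SLOT_W = 75.0              # PCIe x16 slot limit
SIX_PIN_W = 75.0           # 6-pin PCIe power connector limit

per_pin_w = SIX_PIN_W / 6  # 12.5 W per pin, by the post's logic
eight_pin_w = per_pin_w * 8  # 100 W for an 8-pin, by extrapolation

total_w = SLOT_W + SIX_PIN_W + eight_pin_w
print(f"estimated maximum draw: {total_w:.0f} W")  # 250 W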
 
Phide, pix of the OEM fan show it pulls up to 2 amps, so that is 24W (!!!) right there.
I've never seen a PC fan draw 24W continuously, save for the ultra-high-performance 120x120x72 twin-rotary Deltas, which can draw around 30 watts and require two to three amps to start. Centrifugal fans are less efficient, given that they necessarily weigh more, but 24 watts?

Even if the maximum wattage was 24 watts, surely the retail version uses a similar fan. There's very little accounting for such a high difference between the OEM and retail power draw.
 
Most of the length is the cooler. I have a water-cooling setup, so I can just remove that and it will work fine for me. And I have a Mountain Mods case, so I have plenty of room.

Now if we just knew the cost....
 
Just to let you guys know, I saw that pic a long time ago.

No clue how it made its way back into the world again.

I wasn't sure if it was real before, and now I'm thinking it's fake, because it's an old pic from before any revision was in the talks.
 
Phide, that's what I said ("could account for part of it"). Other than using an earlier silicon revision, as some are claiming, I think this particular rumor might be a bit of FUD.
 
Except from an engineering standpoint you can only use what you're given, and to actually meet spec and follow safety regulations they are required to do things a certain way.


Unless, of course, from an engineering standpoint there is no way to satisfy the marketing checkboxes. Then you go back and engineer something to meet both the tolerances and the marketing requirements. There you go: a new product is born.
 
Unless, of course, from an engineering standpoint there is no way to satisfy the marketing checkboxes. Then you go back and engineer something to meet both the tolerances and the marketing requirements. There you go: a new product is born.

Albeit at a much higher cost.
 
http://uk.theinquirer.net/?article=37581

R600XTX.jpg
 
Would you people shut the fuck up about the length? It's been said over and over again that the retail version is NINE INCHES while the OEM is TWELVE. At 9 inches, the R600 is shorter than the G80.
 
Would you people shut the fuck up about the length? It's been said over and over again that the retail version is NINE INCHES while the OEM is TWELVE. At 9 inches, the R600 is shorter than the G80.

I thought the R600 was 9.5 inches and the G80 was 9?
I could be wrong though, I just thought nVidia's was 'short and stubby and not satisfying to women' and the R600 was long and strong.

Oh no he di'nt!
 
It's really not the size that counts, it's how you use it and what you use it for... But an extra inch never hurt. :D
 
I have never seen so much sissy pansy bullshit in my entire life.

Last time I checked this is HARD OCP where sheer raw power matters most.

If you want to talk about sissy pansy low wattage and a "Mini Me" form factor size then go to sites like Anand.

Who cares if the thing is as long as my penis in a cold shower.
Who cares if the thing draws more power than some small towns.

All I care about is how fast it can go and how good it makes a game look.

That's all that freaking matters.

Jesus Christ, what the hell happened to this place :eek:. It used to be that you would come here and see how some guy rigged his Celeron to an outside generator, all the while keeping it stuffed into his wife's refrigerator for an extra 300MHz. Now I come here and half of you fellas are bitching about an extra 100 watts or so? :rolleyes: Oh, cry me a river, then jump in it with your plugged-in toaster.

You know what I am going to do? I am going to put two of these bad boys in my system, and instead of getting a bigger PSU, I will be getting some homeless guy on a bike pedaling his ass off at a small generator in order to power those two cards. All the while I will be playing Oblivion in crazy high resolution while sucking down a triple cheeseburger (from a styrofoam container) and pumping out more methane to destroy the ozone from my arse.

This is HARD OCP not Hello Kitty Eco-PC Island Adventure.
 
^^^^^thread p0wned! good stuff!

(I still don't want to buy a new PSU though :D )
 
Yeah, because caring about energy efficiency is unmanly and uncool. What's wrong with you people? [/sarcasm]
 
I have never seen so much sissy pansy bullshit in my entire life.

Last time I checked this is HARD OCP where sheer raw power matters most.

This is HARD OCP not Hello Kitty Eco-PC Island Adventure.

AMEN!!!!!
 
Yeah, because caring about energy efficiency is unmanly and uncool. What's wrong with you people? [/sarcasm]

Keep up the sarcasm and I will steal your precious Honda Hybrid and rig IT to power my two ATi R600's instead of some homeless guy on a generator bike.

Expecting energy efficiency from a new high end graphics card is like expecting a great blow job from Bea Arthur.

Bea_Arthur.jpg

aka... it isn't possible.
 
There is an awful lot of overreacting going on here.
The bottom line is: we will probably never be able to buy the card in the picture. Rumor has it the enormous foot-long version of R600 is for OEMs and system integrators only. So it's an enormous thing that is probably nothing more than a superclocked version of the R600XTX that companies like Dell and Alienware can offer in pre-configured systems. We will be buying the smaller card, which is similar in size to the 8800GTX.

nVidia and Dell did the same thing with the 7800GTX. There are pics of it in this thread around post #35.
 
Expecting energy efficiency from a new high end graphics card is like expecting a great blow job from Bea Arthur.

I don't know what you're talking about, but Bea gives the mad head... you should feel it when she takes her teeth out.
 
I have never seen so much sissy pansy bullshit in my entire life.

All I care about is how fast it can go and how good it makes a game look.

This is HARD OCP not Hello Kitty Eco-PC Island Adventure.

Good post, but give it a couple more pages and they will be back to bitching. I'm in the same boat: I don't give a shit how long it is, nor do I care what it looks like.
For all I care the card could look like tubgirl; as long as it runs my games maxed, I'm a happy camper.
The people bitching are the same ones that would buy a car with a V8 and then whine about the gas mileage.

The one thing I don't understand is this: if you can afford a $600+ video card, you should be able to afford a longer case (not that you would need one, as the card pictured is an OEM card) and a new PSU.
 
I have never seen so much sissy pansy bullshit in my entire life.

Last time I checked this is HARD OCP where sheer raw power matters most.

blah blah blah
This is HARD OCP not Hello Kitty Eco-PC Island Adventure.

It has nothing to do with energy efficiency; it has everything to do with people wanting to hate ATI or promote Nvidia. Last time I checked, the difference between 200W and 240W isn't so drastic that everyone drops their panties =p unless, of course, you are trying to make your choice of an 8800GTX look better?
 
What's with the "throw power consumption out the window" mentality? I care about power consumption, and I somewhat care about card length (owning a UFO has its advantages). Does this make me anti-ATI or pro-NVIDIA?

Last I checked, 200+ watts isn't "sissy pansy low wattage". 200 watts is a typical Dell rig running full tilt.

It seems like all the nuts have finally arrived in this basket of a thread. Rest in peace, once-valuable thread.
 
It has nothing to do with energy efficiency; it has everything to do with people wanting to hate ATI or promote Nvidia. Last time I checked, the difference between 200W and 240W isn't so drastic that everyone drops their panties =p unless, of course, you are trying to make your choice of an 8800GTX look better?

No, it's not about promoting NVIDIA...
If these rumors are true, the power consumption of these cards is edging toward a limit we thought would not be reached for a while.
NVIDIA had a problem with their anisotropic filtering and with AA+HDR in most games. They fixed it. ATI had a problem with heat and power consumption, and according to these rumors they didn't fix it (at least as far as power is concerned). We're simply discussing what these rumors mean for us, the consumers. And power consumption IS an issue.
Stop trying to turn this into a "flag waving" contest, because it's not.
 