R600 Pics

Hell, it may be time to put that AS/400 case to good use... if I could get a couple of guys to help me bring it inside.

LMAO!

The DEC Alpha case I have in storage would fit a few of these too, and it would go *so* well with the decor.

Seriously, you can fit a card this size in an ATX case that follows the specification. I had to cram a high end array controller in one, once. Desperation is the true mother of invention. I won't say it's an experience I'd like to repeat with any regularity.

Does anyone know how to calculate the amperage requirement from the wattage requirement? I'm still interested to know how much this will take off the 12V rail. For comparison, it would be nice to know what the GTX takes.

EDIT: I answered part of my own question.

"Thanks to all those transistors, a single e-GeForce 8800 GTX draws a whopping 165 watts of power. NVIDIA recommends at least a 450 watt power supply with a minimum of 30 amps on the +12v rail—and I emphasize again, that's just for a single card"

Link for reference
 
Oh, and how much does a GTX take?

GTX uses almost 200 watts peak, so your requirement of no higher than 200 watts for R600 seems a little unrealistic.
Especially if it ends up 10-20% faster.

I'll be surprised if R600 consumes more than 230-240 watts. Not sure on the current requirement; I'd guess around 40 amps?
 
By extrapolating off of the 8800 GTX numbers, I arrived at the need for at least 44 amps off the 12V rail.

Math:

165 / 30 = 5.5; 240 / 5.5 ≈ 43.6

A quick check of power supplies at Newegg seems to indicate that even some of the high end Silverstones won't work, provided that my calculations are correct. A Seasonic 600W though, looks like it will be fine. That's about the lowest I'd consider, really, and Seasonics are wicked efficient. Something crappy would probably necessitate 850-900W. Though I have no idea how accurate my theory is about the amperage needed.
 
GTX uses almost 200 watts peak, so your requirement of no higher than 200 watts for R600 seems a little unrealistic.
Especially if it ends up 10-20% faster.

I'll be surprised if R600 consumes more than 230-240 watts. Not sure on the current requirement; I'd guess around 40 amps?

They are unrealistic. I missed the part about the retail's power consumption being ~240W. I read it again a few minutes ago, and caught it that time. So yeah, 240W for the retail.
 
You can see there are some nvidia !!!!!!s in this thread. The given card is an OEM, so there's no reason to say "omgsh it like wont fit in my case wtfz0r, its pos!" And it's expected to be considerably more powerful than an 8800 GTX... hence the higher power draw. Efficiency will eventually improve on current products, but the leading-edge technology that draws the most power will never be efficient at launch. Only after the technology has matured can the product gain efficiency (obviously not the actual product, but future revisions). Think of engines.

I think we've all seen the secondary power supplies that were under development for standalone GPU applications, and they might be forthcoming.

As far as the 8-pin power connector... it's happened, and it's going to keep happening. First video cards never required a power connector, then they needed the floppy connector, then the standard Molex, then the six-pin, then dual six-pins, now a six-pin and an eight; next is probably dual eights, maybe further. Nothing unexpected, judging from previous trends...

This card should kick ass, but with raw power comes raw power consumption. Although I'll never own one, I'm definitely interested in what it can do. ATI always comes out with products later and hammers nvidia, and I'm assuming the same thing is going to happen this round.
 
Even for a pre-release that's icky :eek: I want my crimson red hawtness!! :D
 
You can see there are some nvidia !!!!!!s in this thread. The given card is an OEM, so there's no reason to say "omgsh it like wont fit in my case wtfz0r, its pos!" And it's expected to be considerably more powerful than an 8800 GTX... hence the higher power draw. Efficiency will eventually improve on current products, but the leading-edge technology that draws the most power will never be efficient at launch. Only after the technology has matured can the product gain efficiency (obviously not the actual product, but future revisions). Think of engines.

I think we've all seen the secondary power supplies that were under development for standalone GPU applications, and they might be forthcoming.

As far as the 8-pin power connector... it's happened, and it's going to keep happening. First video cards never required a power connector, then they needed the floppy connector, then the standard Molex, then the six-pin, then dual six-pins, now a six-pin and an eight; next is probably dual eights, maybe further. Nothing unexpected, judging from previous trends...

This card should kick ass, but with raw power comes raw power consumption. Although I'll never own one, I'm definitely interested in what it can do. ATI always comes out with products later and hammers nvidia, and I'm assuming the same thing is going to happen this round.

You sir, are the man.
I'd like to buy you a brand new car.
 
By extrapolating off of the 8800 GTX numbers, I arrived at the need for at least 44 amps off the 12V rail.

Math:

165 / 30 = 5.5; 240 / 5.5 ≈ 43.6

No, it's much simpler than that: Amps = Watts/Voltage. So we're talking 20 amps if the estimated 240W is correct.
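
Here's that formula as a quick Python sketch, in case anyone wants to plug in their own numbers (the 240W is just this thread's estimate, and 165W is the GTX figure quoted above):

[code]
# Current on a DC rail: amps = watts / volts
def rail_amps(watts, volts=12.0):
    return watts / volts

print(rail_amps(240.0))  # estimated R600 draw -> 20.0 A
print(rail_amps(165.0))  # 8800 GTX figure     -> 13.75 A
[/code]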
 
No, it's much simpler than that: Amps = Watts/Voltage. So we're talking 20 amps if the estimated 240W is correct.

So 20 amps if 240 is correct, but due to the lack of 100% efficiency it would draw, say, 25A?
And how much would, say, a Windsor OC needing 1.375 V add to the draw?

EDIT: In any case, thank you. I looked it up, and it's true, Amps = Watts/Voltage, but I'm curious how a power supply's efficiency plays into this, as well as what the rest of the system draws.
 
I think 240 is realistic. ATi has historically done far worse than NVIDIA at pulling good performance per watt (although ATi GPUs typically pull less power at idle). My main concern is the configuration of the power headers; I don't like the eight-pin.

I still want an external power supply, but I expect that AMD is hesitant to make the jump until NVIDIA does so. If R600 is, on the whole, around 15% (a reasonable estimation) faster than the GTX, and comes with an external PSU for ~$700, I'd probably bite.

As it is... I'm thinking the 8900 GTX (or whatever) may end up being the better buy, but the VRAM factor is a huge consideration for me at this point.
 
Methinks that in order to jump-start that fan, you feed the gerbil some NoDoz-laced cheese, with some Bawls to wash it down. The unsuspecting rodent goes into a running spasm that turns the wheel at about 1100 RPM, pushing about 25 CFM, creating negative pressure so that it's easier to spin up that OEM fan....

:::Thinking about opening up the power of MS Paint and creating a Rube Goldberg illustration:::


<-- I admit that my above post added nothing to this thread, but I'm tired, had this image in my head, and couldn't help myself
 
You sir, are the man.
I'd like to buy you a brand new car.



Did you cash your viral marketing paycheck yesterday? Or was it put into your checking account, as per the usual agreement with AMD/ATI? Do they pay per post, or per hour logged into each forum you're a member of? :p
 
So 20 amps if 240 is correct, but due to the lack of 100% efficiency it would draw, say, 25A?
If it consumes 240W it draws 20A at the "inlet", i.e. the power connectors on the card side. It may draw a bit more from the PSU, but that depends on many factors, not least the connection between the two ends of the power connector, the length and thickness of the cable etc.
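
To put a rough number on that cabling factor, here's a sketch; the 25 milliohm round-trip resistance is a made-up value purely for illustration, not a measurement:

[code]
# Loss in the run between PSU and card: V = I * R, P = I^2 * R
current_a = 20.0        # amps at the card's connectors (240 W / 12 V)
resistance_ohm = 0.025  # assumed round-trip cable + connector resistance
volt_drop  = current_a * resistance_ohm       # 0.5 V lost along the run
power_loss = current_a ** 2 * resistance_ohm  # 10 W burned in the wiring
print(volt_drop, power_loss)
[/code]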
 
As far as the 8-pin power connector... it's happened, and it's going to keep happening. First video cards never required a power connector, then they needed the floppy connector, then the standard Molex, then the six-pin, then dual six-pins, now a six-pin and an eight; next is probably dual eights, maybe further...

I predict that by 2011, videocards will not be powered by your PSU. You'll have to plug them into a wall socket.
 
Is anyone really looking at the pics? The PCB is ~9" long, with the cooler overhanging. All reliable sources suggest that only OEMs will be getting this cooler type; retail will get a card-length cooler.

I thought this was fairly well known.
 
Did you cash your viral marketing paycheck yesterday? Or was it put into your checking account, as per the usual agreement with AMD/ATI? Do they pay per post, or per hour logged into each forum you're a member of? :p

I wish!
I don't think I meet the criteria though. I've owned nVidia cards too.
I know, I know...
 
By extrapolating off of the 8800 GTX numbers, I arrived at the need for at least 44 amps off the 12V rail.

Math:

165 / 30 = 5.5; 240 / 5.5 ≈ 43.6

A quick check of power supplies at Newegg seems to indicate that even some of the high end Silverstones won't work, provided that my calculations are correct. A Seasonic 600W though, looks like it will be fine. That's about the lowest I'd consider, really, and Seasonics are wicked efficient. Something crappy would probably necessitate 850-900W. Though I have no idea how accurate my theory is about the amperage needed.

What kind of math are you using? :p To calculate wattage you multiply voltage by amperage. So it would be 240 / 12 = 20 amps.

So 20 amps if 240 is correct, but due to the lack of 100% efficiency it would draw, say, 25A?
And how much would, say, a Windsor OC needing 1.375 V add to the draw?

EDIT: In any case, thank you. I looked it up, and it's true, Amps = Watts/Voltage, but I'm curious how a power supply's efficiency plays into this, as well as what the rest of the system draws.

There's inefficiency when the PSU converts AC -> DC. And from the wall it's 115V, so it's a whole different ball game. If you want to know how much it draws on the AC side with, say, an 80% efficient PSU:

240 / 0.8 = 300 watts AC; 300 watts / 115V ≈ 2.6 amps
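
Same thing in code, if anyone wants to try other efficiency figures (the 80% is just an assumption):

[code]
# Wall-side (AC) draw for a given DC load and PSU efficiency
dc_watts = 240.0
efficiency = 0.80                 # assumed 80% efficient PSU
ac_watts = dc_watts / efficiency  # 300 W pulled from the wall
ac_amps = ac_watts / 115.0        # ~2.6 A at 115 V mains
print(ac_watts, ac_amps)
[/code]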
 
If Terminator ran on one of these, he would have been much faster, but them little battery things inside him would have piddled out after 3 weeks, not 100 years (or whatever lifespan was mentioned in T3).
 
If it consumes 240W it draws 20A at the "inlet", i.e. the power connectors on the card side. It may draw a bit more from the PSU, but that depends on many factors, not least the connection between the two ends of the power connector, the length and thickness of the cable etc.

Resistance would be another factor... one I hadn't considered until you brought it up, Drizzt. So we have resistance, due to the length the current travels, and we have power supply efficiency, too.

I'll guess, based on these numbers, and on numbers drawn from a few reviews suggesting a 5000+ system pulls somewhere in the neighborhood of 210W under load, that it's going to take a highly efficient 600W PSU, maybe even 650W if you're running that type of processor. I'm not sure what the Core 2 numbers would look like, since I don't have one and didn't look them up. Anyone else have thoughts on this?
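
Back-of-the-envelope version of that, with everything here being estimates from this thread:

[code]
# Ballpark PSU sizing: GPU + rest of system, plus headroom
gpu_watts = 240.0     # estimated R600 load (from this thread)
system_watts = 210.0  # 5000+ system under load, per the reviews
total_dc = gpu_watts + system_watts  # 450 W combined
headroom = 1.3                       # ~30% margin so the PSU isn't maxed out
print(total_dc * headroom)           # 585 W -> a quality 600-650 W unit
[/code]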
 
What kind of math are you using? :p To calculate wattage you multiply voltage by amperage. So it would be 240 / 12 = 20 amps.



There's inefficiency when the PSU converts AC -> DC. And from the wall it's 115V, so it's a whole different ball game. If you want to know how much it draws on the AC side with, say, an 80% efficient PSU:

240 / 0.8 = 300 watts AC; 300 watts / 115V ≈ 2.6 amps

As I said, the math came from numbers extrapolated from the 8800 GTX power draw, but that's why I asked how it's calculated. Now that I know, I see that it was erroneous. I always like to learn new things, and if it pertains to a subject as fascinating as graphics cards, then even more so. ;)
 
As far as the 8-pin power connector... it's happened, and it's going to keep happening. First video cards never required a power connector, then they needed the floppy connector, then the standard Molex, then the six-pin, then dual six-pins, now a six-pin and an eight; next is probably dual eights, maybe further...

At this rate, I think they should just bite the bullet and slap a bloody 240V IEC connector on it.

Power consumption is beginning to get absolutely ridiculous. A year ago I thought overclockers.com were being alarmist; these days I tend to agree. CPUs seem to stay at sane levels, power-wise, but GPUs keep skyrocketing up the heat graph.
 
Oh come on, guys, the card length is fine; it just has a very Dell-esque cooler on it. Anyone who has worked with Dell hardware will know that thing would be right at home in one of their rigs.
 
Oh come on, guys, the card length is fine; it just has a very Dell-esque cooler on it. Anyone who has worked with Dell hardware will know that thing would be right at home in one of their rigs.

Don't worry, it's just the people who aren't going to buy one who are complaining.
 
Pfft - who cares how big it is, if you can afford one of these you can afford a new case for it and a PSU to power it. I'll be building a whole new PC once they finally launch. Performance is key, especially since this is aimed at the high-end market.
 
With the specs in my signature, will my newer PC Power and Cooling 750 watt be enough to handle it and my system with no problems??
 
WTF? 12 inches?! I can only just fit my X1950 Pro into my case.
What a huge piece of ugly crap; I really hope they can size it down.
 
WTF? 12 inches?! I can only just fit my X1950 Pro into my case.
What a huge piece of ugly crap; I really hope they can size it down.

Dude, freaking read. That's the OEM version; the retail version is shorter.
 
Wow, just in time for all this damn snow in Michigan: my own personal snow blower!
:D

One of these bad boys is so tempting. If they eat an 8800 GTX for breakfast, I may have to consider getting one.
 
That looks just like the horrible X1800 cooler, except the cooler is on the opposite side now. :D I hope it doesn't sound as bad....
 