Next Gen (DX10) GPUs to require separate PSU!?!

quadnad said:
lol...your last comment reminds me of the Voodoo 5 6000.

Well, that seems far more practical than having to cram an additional $100+ component into my bulging case. In fact, that's what some companies have done with dual NVIDIA GPUs on a single PCB.

see here:

http://www.hardocp.com/article.html?art=OTQyLCwsaGVudGh1c2lhc3Q=
 
The jump in power is undoubtedly due to the huge number of transistors we'll be seeing on D3D10 GPUs to meet its high requirements. As the article suggests, the following generation, once red and green have had more time to work on it and optimize some of the features, should consume less power.
 
External power kits will definitely be better. Making the case heavier and bigger certainly isn't favorable for everyone.
 
I wouldn't mind external power supplies, even for 70-100W cards.
 
pxc said:
I wouldn't mind external power supplies, even for 70-100W cards.

Ditto. I would like it, and having the choice between the two would be even better.

My PSU could still handle it, but having the choice would be nice. Hopefully both NV and ATi offer this, so it's one less thing for certain people to try to pick apart the other company with.
 
External power would be much better for me. I can't imagine the heat with 2 PSUs in one case.
 
I think they should always try to make products that are more powerful but consume less energy. It can be done.
 
fallguy said:
Ditto. I would like it, and having the choice between the two would be even better.

My PSU could still handle it, but having the choice would be nice. Hopefully both NV and ATi offer this, so it's one less thing for certain people to try to pick apart the other company with.

QFT
 
I welcome external power. I'd feel more comfortable knowing that my PSU has spare watts available. What's with everyone saying it would be so inconvenient? You already have to plug in your monitor and PSU... what's one more thing?
 
Faction said:
I welcome external power. I'd feel more comfortable knowing that my PSU has spare watts available. What's with everyone saying it would be so inconvenient? You already have to plug in your monitor and PSU... what's one more thing?

Because your monitor and PC use the same standard power cable; these cards will likely use proprietary power supplies, and we'll have all sorts of new issues (cables breaking, things catching fire, etc.).

This PC gaming industry is getting ridiculous, and I'm not liking where it's going.
 
I would hope that they give people the option of external or internal... like the ASUS dual 7800 GT card that came out a while ago.


But I think I'm fairly safe with my 1.1kW PSU :D
 
I dunno, to me the 300W figure seems very far-fetched. In the past few generations both ATi and NVIDIA have been able to increase performance by 50-100% without increasing power usage at all (or not by much, anyway). While DX10 cards are going to be a lot more complex and higher power usage is to be expected, I can't see why power usage would almost TRIPLE for a DX10 card unless it's going to be 200-300%+ faster than the fastest cards we have available today. I think that 300W figure will die very, very quickly.

After all, let's not forget there were rumors going around before launch saying the good old GeForce 7800 GTX was going to use up to 225 watts, and it turned out to be a 110W max-draw card pulling ~90 watts in most gaming.

I'll start worrying about what kind of extra PSU my video card might need when ATi and NVIDIA tell me I'm going to need one.
 
At least there is a good part:

As depressing as this news is, there is a small light at the end of the tunnel. Our sources tell us that after this next generation of GPUs we won’t see an increase in power consumption, rather a decrease for the following generation. It seems as if in their intense competition with one another, ATI and NVIDIA have let power consumption get out of hand and will begin reeling it back in starting in the second half of next year.

That sounds like good news, but it means we'll have to wait till the end of 2007 for lower-power GPUs.

For now I like the idea of a separate power block you plug into the back of your card rather than having to upgrade to a 1kW PSU or having two PSUs inside your box, ugh.
 
Brent_Justice said:
For now I like the idea of a separate power block you plug into the back of your card rather than having to upgrade to a 1kW PSU or having two PSUs inside your box, ugh.

Considering the price of the 1kW PSUs:

PC Power & Cooling Turbo-Cool 1KW EPS12V 1000W Power Supply - Retail
http://www.newegg.com/Product/Product.asp?Item=N82E16817703003 - $469.99

PC Power & Cooling Turbo-Cool 1KW-Quad SLI T1KW-4E EPS12V 1000Watts Power Supply - Retail
http://www.newegg.com/Product/Product.asp?Item=N82E16817703004 - $499.99

I would be happy to have an external power supply.
 
Brent_Justice said:
That sounds like good news, but it means we'll have to wait till the end of 2007 for lower-power GPUs.
Intel and AMD are going down to 65W (or less) for mainstream dual-core CPUs, while NVIDIA and ATI are going up to 150W (and higher) for performance video cards. I was hoping for 80-90W performance DX10 cards. :(
 
I'm wondering if that 300W top end was for a graphics setup, not a graphics card.
X1900 XT[X] CrossFire is ~230-250W; a pair of 7900 GTXs comes close to 200W.
A 300W SLI setup at the end of the year is by no means unreasonable. Hell, ATI's 3-GPU CrossFire + physics setup could easily break 300W today.

I bet we'll still be seeing 40-60W mid-range cards, some cut-down versions of the top-end cards (GS / XL) in the sub-100W range, and the absurdly powerful parts breaking the PSU at over 100W and up to maybe 150W for the full card. Especially if we start seeing 1GB of RAM pushing 2GHz DDR.

EDIT
Yes, I know AnandTech specifically said 'per card', but doubling power dissipation in a single generation just doesn't make sense. Geometry shaders and other extras will add some work to the GPU, but that's a massive jump from 50-60W to 130W for a decent-performing card.
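For a rough sense of how "per setup" can hit 300W while individual cards stay far lower, here's a back-of-the-envelope sketch using only the ballpark figures tossed around in this thread (not measured numbers):

```python
# Back-of-the-envelope: per-card draw vs. whole-graphics-subsystem draw.
# All wattages are the rough figures quoted in this thread, not measurements.

setups = {
    "X1900 XT[X] CrossFire pair": [125, 125],   # ~250W for the pair
    "7900 GTX SLI pair":          [100, 100],   # ~200W for the pair
    "rumored DX10 high-end pair": [130, 130],   # two cards at the rumored ~130W each
}

for name, draws in setups.items():
    print(f"{name}: {sum(draws)}W for the setup, {max(draws)}W per card")
```

Either reading of the 300W figure changes the picture a lot: per setup it's nearly here already, per card it's a doubling.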
 
FreiDOg said:
I'm wondering if that 300W top end was for a graphics setup, not a graphics card.
X1900 XT[X] CrossFire is ~230-250W; a pair of 7900 GTXs comes close to 200W.
A 300W SLI setup at the end of the year is by no means unreasonable. Hell, ATI's 3-GPU CrossFire + physics setup could easily break 300W today.

I bet we'll still be seeing 40-60W mid-range cards, some cut-down versions of the top-end cards (GS / XL) in the sub-100W range, and the absurdly powerful parts breaking the PSU at over 100W and up to maybe 150W for the full card. Especially if we start seeing 1GB of RAM pushing 2GHz DDR.

EDIT
Yes, I know AnandTech specifically said 'per card', but doubling power dissipation in a single generation just doesn't make sense. Geometry shaders and other extras will add some work to the GPU, but that's a massive jump from 50-60W to 130W for a decent-performing card.

Check out this power graph from a TechPowerUp article... ridiculous stuff.
 
FreiDOg said:
EDIT
Yes, I know AnandTech specifically said 'per card', but doubling power dissipation in a single generation just doesn't make sense.
I don't really believe the 300W per card figure either. But look at the 7950 GX2. nvidia calls it a single card even though it has 2 PCBs.

If nvidia made a 150W 8800GTX Ultra and somehow managed to solve the cooling problem of having 2 full speed 8800GTX Ultra PCBs so close together, a 300W "card" would be possible from that perspective. I still think that's very unlikely. Even nvidia toned down the 7950GX2's cores to make it a 140W card.
 
pxc said:
I don't really believe the 300W per card figure either. But look at the 7950 GX2. nvidia calls it a single card even though it has 2 PCBs.

If nvidia made a 150W 8800GTX Ultra and somehow managed to solve the cooling problem of having 2 full speed 8800GTX Ultra PCBs so close together, a 300W "card" would be possible from that perspective. I still think that's very unlikely. Even nvidia toned down the 7950GX2's cores to make it a 140W card.

That makes sense. ~130+W for a single pretty high-end card with 1 GPU, closing in on 300W for a 'GX2'-type card. I can see that happening, I suppose.

Though I'm left to wonder: how would you cool something like that? You'd need a two-slot-high cooler on both sides of the card - or water cooling - I'd think.
 
I think the boy blunder is wrong. I know I read somewhere that the DX10 ATi card was stated to produce less heat than current cards and to outperform them. So... I guess his statement that DX10 cards could use from 130W to 300W is technically true, but today's cards consume up to 130W anyway when overclocked, not to mention SLI or Crossfire.

Then he goes on to show a bunch of low-power cards with HDCP connectors on them, like they're not new DX10 models?

Don't get misled... wait until you have it in your hands to see the power consumption and power supply requirements.

Remember, all those external and slot power supplies were designed for older cards in less capable systems like SFF chassis, and/or for old 3dfx cards.
 
I'm disappointed with the news on the power consumption too. While I doubt that 300W per card can even be achieved, what bothers me is that power consumption is increasing at all.

CPUs have abandoned their performance-at-any-cost rise in clock speeds and are now placing emphasis on efficiency. Why can't video cards do that now and help prevent the further proliferation of insane cooling and power requirements?
 
I think this power stuff is almost ready to cross the line into ridiculous. Why is it that my dual-core 2GHz CPU only uses 20-30% more power than my previous single-core 1.8GHz CPU, but video cards are heading toward 300W+? Pretty soon you're going to have to install new industrial 30A circuits in your house just to power these things! Figure that a 15A line can handle approx. 1700W; 12W for the CPU, 100W for other devices, and 1200W for quad-SLI is getting very close to that limit. Add monitors, printers, networking gear, etc.
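To put that budget in perspective, here's a rough sketch against a 15A circuit, taking the figures above at face value as wall-side draw (a real PSU's conversion losses would push it higher; the extra 200W for monitors and peripherals is my own guess):

```python
# Rough branch-circuit budget check, using the figures from the post above
# treated as wall-side draw for simplicity (real PSU losses would add more).

CIRCUIT_LIMIT_W = 15 * 115          # ~1725W for a typical 15A / 115V circuit

loads = {
    "CPU": 12,                              # as quoted above
    "other devices": 100,
    "quad-SLI graphics": 1200,
    "monitors, printers, networking": 200,  # assumed extra, not from the post
}

total = sum(loads.values())
print(f"{total}W of ~{CIRCUIT_LIMIT_W}W ({total / CIRCUIT_LIMIT_W:.0%} of the circuit)")
```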

Why can't they do what Intel and AMD have done and make the chips both faster AND more efficient?
 
I'm not worried. When Quad SLI was first announced, they were saying you would need upwards of a kilowatt to run it on a high-end rig. Now they are finding the opposite: with these new 7950 GX2s, for example, AnandTech had two running stable in Quad SLI with beta drivers on a 480-watt power supply. I think this is all speculation for now, and when the products are finalized they will consume more than what's out there, but not much more. If you have at least a 550-watt PSU you should be fine for G80 SLI or R600 Crossfire. Maybe not quad Crossfire or Quad SLI, but if you're going to go that route you should be able to afford a bigger power supply.
 
Lord_Exodia said:
If you have at least a 550-watt PSU you should be fine for G80 SLI or R600 Crossfire. Maybe not quad Crossfire or Quad SLI, but if you're going to go that route you should be able to afford a bigger power supply.

You'd better have a freakin small system if you're going to be running Crossfire with a 550...
 
ASUS has already had an external power supply for a long time. I like it; I never need to worry about whether my internal power supply can handle the load.
 
I think a power brick à la the Xbox 360 might not be a bad idea. Then you'd literally plug into the card at the back of the case. This would be a pain-free way to add a PSU. The other good thing is that if you make it, say, 300 watts, you probably wouldn't have to replace it for several generations of cards.

I'm sure some will require an in-case solution. For that, the unit that fits in your optical drive bay is slick. In fact, all things considered, I'd probably prefer that to the power brick idea.
 
External PSU eh? Hm...

That sounds particularly tasty, especially for us V2000 owners. I say that because those cases can get REALLY heavy. I mean, 100% aluminum plus all the stuff inside... an external PSU would be nice.
 
defiant007 said:
http://anandtech.com/tradeshows/showdoc.aspx?i=2770&p=1

If true that's just fucking insane, starting to really get out of hand. Imagine what a person will need to run SLI... or *gasp* quad SLI. :eek:

Why can't they just make a power block that you plug into the back of the video card :confused:
Not just a separate PSU, but a circuit all to itself:

1000 W / 110 V ~= 9.1 A

Since a lot of the indoor circuits in the apartments I have lived in were/are 10A circuits, as soon as PSUs hit 1100W on the input side it's time to upgrade the interior wiring and the circuit breaker.
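As a sketch of how that scales once you account for PSU efficiency (the ~80% figure below is an assumption, not from the article):

```python
# Approximate wall-side current for a PSU delivering a given DC load.
# Assumed values for illustration: 80% efficiency, 110V mains.

def wall_amps(dc_load_watts, efficiency=0.80, mains_volts=110.0):
    """Current drawn from the outlet; PSU losses show up on the input side."""
    return (dc_load_watts / efficiency) / mains_volts

for load in (500, 750, 1000):
    print(f"{load}W DC load -> ~{wall_amps(load):.1f}A at the wall")
```

Under those assumptions, a PSU only has to deliver around 880W of DC load before a 10A circuit is already at its limit.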
 
quadnad said:
You'd better have a freakin small system if you're going to be running Crossfire with a 550...

Could you define "freakin small" please?

I'm running X1900 Crossfire with a 535W just fine, along with an OC'd X2, 5 120mm case fans, a PCI audio card, three 7800 RPM drives, 2 DVD drives, 2GB of RAM, and multiple USB devices drawing power from the bus.
 
Croak said:
Could you define "freakin small" please?
I didn't read what he quoted for that reply at first, either. The reply was to "G80 SLI or R600 Crossfire" and the topic of this thread. 550W may no longer be enough if power consumption increases as much as the AT article states.

But I agree that 550W is fine for current XF and SLI systems.
 
Croak said:
I'm running X1900 Crossfire with a 535W just fine, along with an OC'd X2, 5 120mm case fans, a PCI audio card, three 7800 RPM drives, 2 DVD drives, 2GB of RAM, and multiple USB devices drawing power from the bus.

Don't mess with this guy, he overclocked his hard drives to 7800 RPM. Now that's [H] :p
 
Croak said:
I'm running X1900 Crossfire with a 535W just fine, along with an OC'd X2, 5 120mm case fans, a PCI audio card, three 7800 RPM drives, 2 DVD drives, 2GB of RAM, and multiple USB devices drawing power from the bus.

I just mean that after having seen this chart, I'd be mighty wary of running my system on a 535W PSU. That's not to say it can't be done (the systems used in that test were FX-60 based), but if you're thinking of adding much else to the system, or running a current Intel rig with it, it'll most likely be a problem. Tell me how those 7.8K RPM drives do, btw; I've been looking for a new set.
 