225W & 300W Graphics Spec

Why do dual-core CPUs only draw about 100W or less when video cards will be putting out almost 300W?
 
Turkish621 said:
Why do dual-core CPUs only draw about 100W or less when video cards will be putting out almost 300W?
CPUs do not use 700M transistors.
 
Why do they continue producing video cards with 700M transistors? Why do they not change over to using smaller transistors like CPUs use?
 
Turkish621 said:
Why do they continue producing video cards with 700M transistors? Why do they not change over to using smaller transistors like CPUs use?

Well, they are using SMALLER transistors, but GPUs have MORE transistors than CPUs because they need them in order to process more things, faster... get the idea?
 
So this basically means all sub 1000W power supplies will not be able to do SLI or Crossfire for DX10 cards. :eek:
 
baronzemo78 said:
So this basically means all sub 1000W power supplies will not be able to do SLI or Crossfire for DX10 cards. :eek:

Well, sort of, IMO. Current sub-1000W PSUs can't deliver the required amps on the 12V rail(s); 50A will be required just for graphics alone! I am only assuming SLI here; single GPUs should be fine.

I thought the 300W figure was a joke/rumor, but with the info coming from AnandTech and now Intel, I don't know what to think.

There is no way to air-cool a 300W device at a reliable temperature in a DUAL slot space, anyone care to disagree?
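The "50A just for graphics" figure above is easy to sanity-check. This assumes a dual-card SLI/Crossfire setup at the rumored 300W-per-card ceiling, which is the scenario discussed in the thread, not a confirmed spec:

```python
# Sanity check of the "50A on the 12V rail(s) just for graphics" claim,
# assuming two 300W cards in SLI/Crossfire (thread assumption, not spec).
RAIL_VOLTAGE = 12.0    # volts
CARD_POWER = 300.0     # watts per card (rumored spec ceiling)
NUM_CARDS = 2          # dual-card setup

total_watts = CARD_POWER * NUM_CARDS
amps_needed = total_watts / RAIL_VOLTAGE

print(f"{total_watts:.0f} W across the 12V rail(s) = {amps_needed:.1f} A")
# 600 W / 12 V = 50.0 A, matching the figure above
```

So the number is internally consistent: it is simply two full-spec cards drawing entirely from 12V.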
 
kleox64 said:
There is no way to air-cool a 300W device at a reliable temperature in a DUAL slot space, anyone care to disagree?
No, I wouldn't disagree, but heat dissipation is a tricky concept. It depends not only on heat output, but other factors. Transistor density (fabrication process) and die size are both important aspects to consider. Also keep in mind that the GPUs themselves won't consume 300W, but rather the entire video card. If heat is spread across a large area, a large HSF can more easily dissipate that heat. Expect power consumption of the GPU itself to be in the upper 100's range (though feasibly somewhat more as GDDR4 is no power hog).

It's possible for an effective water cooling system to dissipate this much heat effectively without taking up too much space. An air cooler? I don't think the math favors it, but it's far from the realm of impossible. I definitely see air coolers improving, but not mega-substantially. Copper, heat pipes, large, quickly spinning fan? Reasonably good dissipation. The question is: good enough?

Keep in mind that only the specification here is in discussion. I don't truly expect 300W video cards until late 2007/early 2008. At that time, we'll be well prepared.
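Whether air can handle it comes down to thermal resistance. A rough back-of-envelope estimate, where the ambient and die temperature limits are illustrative assumptions rather than spec values:

```python
# Back-of-envelope thermal check: what heatsink-to-ambient thermal
# resistance would a cooler need to dissipate a full 300W?
# Temperature limits below are illustrative assumptions, not spec values.
card_power = 300.0     # watts dissipated, worst case (whole card)
ambient_temp = 40.0    # deg C inside a warm case (assumed)
max_die_temp = 95.0    # deg C as a typical GPU thermal limit (assumed)

required_resistance = (max_die_temp - ambient_temp) / card_power
print(f"Required thermal resistance: {required_resistance:.3f} C/W")
# ~0.183 C/W -- a very demanding target for a dual-slot air cooler
```

Under those assumptions the cooler needs well under 0.2 °C/W, which is why water cooling looks much more comfortable at this power level.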
 
phide said:
No, I wouldn't disagree, but heat dissipation is a tricky concept. It depends not only on heat output, but other factors. Transistor density (fabrication process) and die size are both important aspects to consider. Also keep in mind that the GPUs themselves won't consume 300W, but rather the entire video card. If heat is spread across a large area, a large HSF can more easily dissipate that heat. Expect power consumption of the GPU itself to be in the upper 100's range (though feasibly somewhat more as GDDR4 is no power hog).

It's possible for an effective water cooling system to dissipate this much heat effectively without taking up too much space. An air cooler? I don't think the math favors it, but it's far from the realm of impossible. I definitely see air coolers improving, but not mega-substantially. Copper, heat pipes, large, quickly spinning fan? Reasonably good dissipation. The question is: good enough?

Keep in mind that only the specification here is in discussion. I don't truly expect 300W video cards until late 2007/early 2008. At that time, we'll be well prepared.

agreed.
 
phide said:
No, I wouldn't disagree, but heat dissipation is a tricky concept. It depends not only on heat output, but other factors. Transistor density (fabrication process) and die size are both important aspects to consider. Also keep in mind that the GPUs themselves won't consume 300W, but rather the entire video card. If heat is spread across a large area, a large HSF can more easily dissipate that heat. Expect power consumption of the GPU itself to be in the upper 100's range (though feasibly somewhat more as GDDR4 is no power hog).

It's possible for an effective water cooling system to dissipate this much heat effectively without taking up too much space. An air cooler? I don't think the math favors it, but it's far from the realm of impossible. I definitely see air coolers improving, but not mega-substantially. Copper, heat pipes, large, quickly spinning fan? Reasonably good dissipation. The question is: good enough?

Keep in mind that only the specification here is in discussion. I don't truly expect 300W video cards until late 2007/early 2008. At that time, we'll be well prepared.

I'm afraid I disagree. I think it's simply both ATI and nVidia having to build a card that's faster in both DX9 and DX10, thus the need for a huge number of transistors. Also, having to do a serious revamp of all their previous technology could easily result in far less efficient chips.

I think by the end of 2007 power consumption will be down from initial release with a 65nm refresh.
 
Is 225W for the nvidia card, and the 300W for the ATI card?

If one card, maybe the R600, requires 300W, then could the PCIe slot supply 75W, with the remaining 225W supplied by one of the 12V molex plugs at 18.75A? It looks like it is cutting it really close; maybe the R600 will have two molex plug adapters?

I'm just wondering if a 550W high efficiency power supply could do the job, or will we need at least 600W+ for a SINGLE R600?
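The connector math in the question above works out like this (the 300W requirement and 75W slot figure are the thread's assumptions; the R600 numbers were not confirmed at the time):

```python
# Checking the single-card power budget: the PCIe x16 slot supplies 75W,
# and the rest must come from auxiliary connectors on the 12V rail.
# The 300W card figure is the thread's assumption, not a confirmed number.
CARD_POWER = 300.0    # watts, rumored requirement
SLOT_POWER = 75.0     # watts from the PCIe x16 slot
RAIL_VOLTAGE = 12.0   # volts

aux_watts = CARD_POWER - SLOT_POWER
aux_amps = aux_watts / RAIL_VOLTAGE
print(f"{aux_watts:.0f} W from aux connectors = {aux_amps:.2f} A at 12V")
# 225 W / 12 V = 18.75 A -- right at the edge for one connector,
# which is why a second plug seems likely
```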
 
I would suggest waiting until the cards are released or nvidia updates their PSU list, for single and DUAL G80 configurations.
 
mentok1982 said:
I was planning on getting the 850 Watt Galaxy from Enermax, but now maybe I will have to
consider the full 1 Kilowatt Galaxy.

Yikes I hope those prices go down. Froogle's best price for the 1000 Watt is $307.10 and
their best price for the 850 Watt is $254.04.
Ouch, that is a pretty pricey PSU. I wonder how efficient it is and whether it is really needed.

Let's also not forget that the finalization of the spec is planned for early 2007, which does not mean that graphics cards right then will be using that amount of power.
 
If these power consumption numbers are true, then I see the card makers coming out with external power adapters for the video card itself. Sure, it's one more power cord and brick, but at least you don't need to spend an absurd amount on a 1kW+ PSU.
 
Blauman said:
If these power consumption numbers are true, then I see the card makers coming out with external power adapters for the video card itself. Sure, it's one more power cord and brick, but at least you don't need to spend an absurd amount on a 1kW+ PSU.

True, and this isn't even considering the strain on current PSUs from feeding 300W down one line. This number is more worrisome to me for its implications about heat production: 300W of heat is a freaking lot of heat to dissipate.
 
drizzt81 said:
Ouch, that is a pretty pricey PSU. I wonder how efficient it is and whether it is really needed.

Let's also not forget that the finalization of the spec is planned for early 2007, which does not mean that graphics cards right then will be using that amount of power.

The Galaxy PSUs are over 80% efficient (85%, I think), and all the reviews I have read can be classified as "rave reviews".
Bjorn Galaxy 850 Watt review.
Hardware Secrets Galaxy 1000 Watt review.
Big Bruin Galaxy 1000 Watt review.
Club Overclocker Galaxy 1000 Watt review.
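For what an efficiency figure like that means in practice, here is the wall-draw arithmetic (the 85% number is the poster's recollection from the reviews above, so treat it as approximate):

```python
# What an ~85%-efficient PSU pulls from the wall at full rated load.
# The 85% efficiency figure is approximate, quoted from memory above.
rated_output = 850.0   # watts DC, Enermax Galaxy 850
efficiency = 0.85      # approximate, from the reviews

wall_draw = rated_output / efficiency
print(f"Wall draw at full load: {wall_draw:.0f} W")
# 850 / 0.85 = 1000 W from the outlet
```

So even the 850W model would pull roughly a full kilowatt from the outlet when maxed out.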
 
lol, that is one reason why I am getting a Mountain Mods UFO. It has dual PSU slots. Looks like I'll need a dedicated PSU for the GPUs. Jeez.
 
Blauman said:
If these power consumption numbers are true, then I see the card makers coming out with external power adapters for the video card itself. Sure, it's one more power cord and brick, but at least you don't need to spend an absurd amount on a 1kW+ PSU.

That wouldn't surprise me.
 
A quality external PSU brick isn't exactly cheap; you still pay for it through the cost of your video card. I don't trust external power bricks, period, especially after the laptop battery recall fiasco. If the batteries are that bad, imagine the bricks themselves.
 
Blauman said:
If these power consumption numbers are true, then I see the card makers coming out with external power adapters for the video card itself. Sure, it's one more power cord and brick, but at least you don't need to spend an absurd amount on a 1kW+ PSU.

QFT
 
My opinion? 100W is too much for a graphics card, let alone 300W. Just when air cooling was starting to catch up, the graphics card makers had to go and throw this one into the fray.

The good news is, air cooling CAN'T handle that, even in dual-slot configurations, so I don't expect to see such products except in the extreme high end. Hopefully, the video card manufacturers will come to their senses, or else consumers will.

Myself, I won't be helping fuel the insanity. I was incredibly happy when Nvidia released the 7900 GT: same performance as the previous top-end card with only 50W power usage. I bought the 7900 GT, and I will continue to buy cards that use reasonable amounts of power for the performance delivered.

If the next-gen cards are going to use 200W, I'll wait for the 65nm refresh, thank you.
 
What do you expect, with all the people out there bitching and whining that they can't get 90fps in Oblivion? It's only going to get worse, and it's our fault. If people would game at reasonable resolutions and detail levels, this wouldn't be a problem.

I've tested this with many games and video cards. I've even run a few on a Radeon 9800 Pro at stock speeds, and they were almost unplayable at default settings. Five minutes and a few minor tweaks later, it was totally playable on that particular card.

People must be getting lazy these days and just popping in games and leaving all the settings at defaults. :confused:
 