Does underclocking save power?

SpeedyVV
Supreme [H]ardness · Joined: Sep 14, 2007 · Messages: 4,210
I have an 8800GTX and a 7600GT driving 3 monitors. 80% of the time I am running an IDE, email, Word, and IMs, so I tend to underclock my GTX.

I was wondering 2 things: is it worth it to save power, and how low can I underclock?
 
Underclocking the GTX isn't going to save as much as turning off monitors when not in use.
 
I have an 8800GTX and a 7600GT driving 3 monitors. 80% of the time I am running an IDE, email, Word, and IMs, so I tend to underclock my GTX.

I was wondering 2 things: is it worth it to save power, and how low can I underclock?

Are you underclocking just to save power, or are you trying to reduce noise?

I don't think it is going to make a difference on your electric bill or anything.
 
Underclocking the GTX isn't going to save as much as turning off monitors when not in use.

For sure!! I have my monitors set to turn off after 10 minutes of idle.

Are you underclocking just to save power, or are you trying to reduce noise?

I don't think it is going to make a difference on your electric bill or anything.

No, not noise. I cannot hear my GTX at 60% default fan speed. Unfortunately I cannot slow the fan on my 7600GT and that damn thing is noisy.

Well I was hoping to save some electricity, but if it does not make a difference I guess I won't bother underclocking.
 
I'm not convinced that underclocking saves any power, unless the voltage feeding the card is adjusted on the fly, which I don't think it is.
 
I'm not convinced that underclocking saves any power, unless the voltage feeding the card is adjusted on the fly, which I don't think it is.
All energy going in must come out in some form. In this case, that form is heat. Less heat out => less energy in, thanks to the first law of thermodynamics. I think you're right about voltage being fixed, which means the increased power is due to increased current.

That said, there's not a huge difference in idle temps, so we're probably talking about a pretty negligible difference in power draw. In the time you spend finding a stable underclock, you'll probably burn through your first six months' power savings :D .

Is it worth it to save power, and how low can I underclock?
There's no reason not to, even if the benefits are pretty minor. My x1950pro craps out below about 70% of stock GPU clocks and 85% of memory clocks, though that probably has little relevance for an 8800GTX user...
 
So why are there 2D and 3D profiles with lower clocks for 2D? Surely if the card produces less heat it's using less power? I think the current drawn is different at different clocks, and when idle versus under load.

For cards that support it, I think the 2D profile has lower voltage too, and I bet it's set to as slow a clock as possible without causing slowdowns in apps, past which it doesn't have any further effect on power draw, heat dissipation, or whatever.

But that's just my guess.
 
Ok, so all those thoughts make sense to me.

Especially the thing about less heat. The card does go down in temperature, so that has to mean less power is being consumed.

I guess the only way to tell for sure is to have some sort of power meter to test it in practice.

Anyone got one and is willing to give it a try?
 
I can't tell you for sure with video cards, but underclocking CPUs definitely reduces power consumption, especially with loads on the CPU.

I bought a Kill-A-Watt and couldn't believe how much power my computer was drawing - about 170 watts... I calculated it with my most recent power bill, and my PC, which runs 24/7, costs me about $20 a month to run. I now try to use sleep mode (which drops power usage to 30 watts), but so far I can only do sleep mode when I force it manually - my PC won't do it automatically (see my thread if you can help me with my XP insomnia problem here).

A lot of people have the Kill-A-Watt and it's easy to use - ask around and see if any of your friends have one... if not, spend the $25 to get one... I found it quite useful + have lent it out to a lot of other people to play with too.
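If you want to sanity-check a Kill-A-Watt number against your bill, it's just watts x hours x rate. A quick Python sketch (the $0.16/kWh rate is an assumption - plug in your own):

Code:
# Back-of-envelope monthly cost from a Kill-A-Watt reading.
# The $/kWh rate is an assumption -- read yours off your own bill.
watts = 170.0               # measured draw from the Kill-A-Watt
rate = 0.16                 # assumed $/kWh
hours_per_month = 24 * 30

kwh = watts / 1000.0 * hours_per_month
print(f"{kwh:.1f} kWh/month -> ${kwh * rate:.2f}/month")
# 122.4 kWh/month -> $19.58/month, right around the $20 figure above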
 
It should already be in 2D mode if you're just surfing and stuff... that's pretty well underclocked as it is.
 
For the GT, I underclock the vid to GPU/MEM 300/600 with no ill effects. Even Supreme Commander plays well at this setting. I used RivaTuner and created a 2D profile that loads this setting upon starting Windows.


Ok, I have a question about CPU overclocking:
I have my 6750 running at 1.35 volts @ 3.2GHz. I can also run my 6750 @ 1.35V @ 2.6GHz.

If both settings are using the same voltage, then why would running at 3.2GHz give higher temperatures?

Which of my assumptions is wrong?
 
I can't tell you for sure with video cards, but underclocking CPUs definitely reduces power consumption, especially with loads on the CPU.

I bought a Kill-A-Watt and couldn't believe how much power my computer was drawing - about 170 watts... I calculated it with my most recent power bill, and my PC, which runs 24/7, costs me about $20 a month to run. I now try to use sleep mode (which drops power usage to 30 watts), but so far I can only do sleep mode when I force it manually - my PC won't do it automatically (see my thread if you can help me with my XP insomnia problem here).

A lot of people have the Kill-A-Watt and it's easy to use - ask around and see if any of your friends have one... if not, spend the $25 to get one... I found it quite useful + have lent it out to a lot of other people to play with too.

Hey teststrips, can you underclock your GPU and tell me what the Kill-A-Watt says? Please.
 
For sure!! I have my monitors set to turn off after 10 minutes of idle.



No, not noise. I cannot hear my GTX at 60% default fan speed. Unfortunately I cannot slow the fan on my 7600GT and that damn thing is noisy.

Well I was hoping to save some electricity, but if it does not make a difference I guess I won't bother underclocking.

If you really want to save on electricity, use the power-saving features built into Windows to shut down your monitor and hard drives, or even sleep mode.

Just kind of off topic, but my old big-screen TV consumes far more watts than my computer. I think you would save more in the long run watching less TV :)
 
Intel defines the thermal output of their chips as:

Wth = Cg x V^2 x F [where Cg = gate capacitance and F = frequency]

You can see that voltage has the largest effect on thermal output, but frequency is also a factor.

So, yes, lowering clockspeed will save you a bit of power. Not all that much though.

Ok, I have a question about CPU overclocking:
I have my 6750 running at 1.35 volts @ 3.2GHz. I can also run my 6750 @ 1.35V @ 2.6GHz.

If both settings are using the same voltage, then why would running at 3.2GHz give higher temperatures?

Which of my assumptions is wrong?

3.2GHz: 0.02 * (1.35^2) * 3200 = 116.64W
2.6GHz: 0.02 * (1.35^2) * 2600 = 94.77W

So, yes... there's about a 22W difference between the two speeds. See the equation I listed above. Obviously, when transistors charge and discharge, they lose power to heat. The faster they switch, the more often this loss occurs.
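For anyone who wants to plug in their own clocks and voltages, here's the same calculation as a small Python sketch (the 0.02 constant is the illustrative figure from this post, not a real chip's gate capacitance):

Code:
# Dynamic power per the formula above: W = Cg * V^2 * F.
# Cg = 0.02 is the illustrative constant used in this post, not a
# datasheet value for any real chip.
def dynamic_power(cg, volts, mhz):
    return cg * volts**2 * mhz

p_hi = dynamic_power(0.02, 1.35, 3200)
p_lo = dynamic_power(0.02, 1.35, 2600)
print(f"3.2GHz: {p_hi:.2f}W, 2.6GHz: {p_lo:.2f}W, delta: {p_hi - p_lo:.2f}W")
# 3.2GHz: 116.64W, 2.6GHz: 94.77W, delta: 21.87W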
 
Intel defines the thermal output of their chips as:

Wth = Cg x V^2 x F [where Cg = gate capacitance and F = frequency]

You can see that voltage has the largest effect on thermal output, but frequency is also a factor.

So, yes, lowering clockspeed will save you a bit of power. Not all that much though.



3.2GHz: 0.02 * (1.35^2) * 3200 = 116.64W
2.6GHz: 0.02 * (1.35^2) * 2600 = 94.77W

So, yes... there's about a 22W difference between the two speeds. See the equation I listed above. Obviously, when transistors charge and discharge, they lose power to heat. The faster they switch, the more often this loss occurs.

Arcygenical, thanks a lot for that. So according to the formula, the power is directly proportional to the frequency: 1/2 the freq, 1/2 the power. Very cool.

And BTW, to me 22W is definitely worth it. Not just in dollars and cents, it is just the right thing to do.

Since I run SpeedStep, which lowers the multi to 6, 6/9 of 116.64W is 77.76W, a saving of 38.88W. Less power, less heat, less $$, less waste. Most excellent.

Also, while I work, 80% of my computer use, the multi stays pegged at 6x. :)

My GTX is OCed to 625, and right now I UC it in 2D to 400, so if it uses 50W, dropping to 400/625 would give me a saving of 18W.

So that's a total of 56.88W combined CPU/GPU power savings. Whoa, I hope that is right. My mama would be proud of me. :D
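Here's that arithmetic in one Python sketch, in case anyone wants to check or reuse it; the 116.64W and 50W figures are the estimates from above, and it assumes power scales linearly with clock:

Code:
# Checking the combined savings, assuming power scales linearly with clock.
# 116.64W (CPU) and 50W (GPU) are the figures guessed at above, not measurements.
cpu_full = 116.64
cpu_saving = cpu_full * (1 - 6 / 9)          # SpeedStep: multi drops from 9x to 6x
gpu_full = 50.0                              # assumed GTX draw at 625MHz
gpu_saving = gpu_full * (1 - 400 / 625)      # 2D underclock: 625MHz down to 400MHz

print(f"CPU: {cpu_saving:.2f}W + GPU: {gpu_saving:.2f}W = {cpu_saving + gpu_saving:.2f}W")
# CPU: 38.88W + GPU: 18.00W = 56.88W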
 
So according to the laws of thermodynamics, my PSU must be feeding my CPU more voltage than the stated 1.35V in the BIOS in order to overclock from 2.6GHz -> 3.2GHz?
 
So according to the laws of thermodynamics, my PSU must be feeding my CPU more voltage than the stated 1.35V in the BIOS in order to overclock from 2.6GHz -> 3.2GHz?

Why? Wouldn't more amps going through at 1.35V give off more heat?
 
I also have a Kill-A-Watt. With my HD 3850 running at 300MHz, my system consumes 76W at idle. When I forced it to run at 669MHz, power consumption increased to 102W at idle.
 
Clock frequency is such a minute factor in temps that it's not even worth bothering with. Voltage makes up the majority of the heat difference.

If you're not dropping the voltage w/ the clock frequency, then don't bother.
 
Why? Wouldn't more amps going through at 1.35V give off more heat?

Ah, I see what you're saying, I think.

So even though the voltage is the same, you can have more amperage getting there. So voltage doesn't control amperage?
 
So that's a total of 56.88W combined CPU/GPU power savings. Whoa, I hope that is right. My mama would be proud of me. :D
I'm pretty sure those figures would be for a 100% load. The whole idea of underclocking, as far as I'm aware, is to do it when your system is idle. The difference in temperature (and thus thermal output in the equation above) is a lot less at this point - a whopping 0.75C between my underclock and overclock.
 
Ah, I see what you're saying, I think.

So even though the voltage is the same, you can have more amperage getting there. So voltage doesn't control amperage?

No, the load sort of defines what current the voltage (power) source has to supply. In the case of processors, every time a MOSFET switches, you need to charge and discharge the gate capacitance... and charge over time = current. So higher frequencies require larger currents to sustain the new switching speed, which results in more power usage.

Basically.
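To put rough numbers on "charge over time = current": the average switching current works out to I = C x V x F, which is just the power formula divided by V. A toy Python sketch with made-up values (the 20nF happens to match the 0.02 constant used earlier in the thread):

Code:
# Each switch moves Q = C*V of charge, and moving that charge F times a
# second is a current of I = C * V * F. Values are illustrative only.
c_total = 20e-9      # assumed total switched capacitance, farads
v = 1.35             # core voltage, volts
f = 3.2e9            # switching frequency, Hz

i_avg = c_total * v * f
print(f"I = {i_avg:.1f}A, P = V*I = {i_avg * v:.2f}W")
# I = 86.4A, P = V*I = 116.64W -- raise F and the current rises with it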
 
I have a good idea to save power. Don't use high-end graphics cards!
 
Ah, I see what you're saying, I think.

So even though the voltage is the same, you can have more amperage getting there. So voltage doesn't control amperage?

Voltage and amperage are the two independent parts of power. Watts, a measure of power, are volts x amps, so raising or lowering either one will raise or lower the total power. People usually talk about equipment drawing xx number of amps - the greater a device's need for power, the more amperage it takes to run, while the voltage normally stays constant.
 
I also have a Kill-A-Watt. With my HD 3850 running at 300MHz, my system consumes 76W at idle. When I forced it to run at 669MHz, power consumption increased to 102W at idle.

Whoa, Jeep333. That is an amazing reduction in power. Thank you very much. I like measurements much more than our collective theoretical knowledge. :p

Clock frequency is such a minute factor in temps that it's not even worth bothering with. Voltage makes up the majority of the heat difference.

If you're not dropping the voltage w/ the clock frequency, then don't bother.

While it's true that voltage is more important (power goes with the square of the voltage), according to the CPU example frequency is still a factor (directly proportional). So if I halve my frequency, I halve my power consumption. I will still bother.
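A quick Python comparison of the two knobs, using the same illustrative constant as above, shows why voltage is the bigger lever:

Code:
# Compare a 20% clock cut vs a 20% voltage cut with W = Cg * V^2 * F.
# A 20% clock cut saves 20%; a 20% voltage cut saves 36% (0.8^2 = 0.64).
base = 0.02 * 1.35**2 * 3200
freq_cut = 0.02 * 1.35**2 * (3200 * 0.8)
volt_cut = 0.02 * (1.35 * 0.8)**2 * 3200

print(f"base: {base:.1f}W, -20% clock: {freq_cut:.1f}W, -20% volts: {volt_cut:.1f}W")
# base: 116.6W, -20% clock: 93.3W, -20% volts: 74.6W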

Ah, I see what you're saying, I think.

So even though the voltage is the same, you can have more amperage getting there. So voltage doesn't control amperage?

Well, if voltage is fixed, other things control amperage, for example resistance. From my physics course memory (a long time ago), V = R * I, or more interestingly I = V / R, where I is the current measured in amps.

So if you have a fixed voltage, say a 9V battery, and a very conductive path, you can get some pretty awesome amps.

[DON'T TRY THIS] Take a piece of thin electrical wire and touch each end of it to the contact points of a 9V battery. You will get a nice demonstration of the heat created by high amperage through a low resistance. [/DON'T TRY THIS] Again folks, even if you are curious, don't try this. You WILL burn yourself, among other worse potential accidents.
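If you'd rather see the numbers than singe your fingers, here's the same demonstration as a Python sketch (the wire and battery internal resistances are rough assumptions):

Code:
# The 9V demo as arithmetic instead (DON'T TRY the real thing).
# Resistances are rough assumptions; a 9V battery's internal resistance
# is what keeps the current from being truly enormous.
v = 9.0
r_wire = 0.1         # assumed thin-wire resistance, ohms
r_internal = 1.5     # assumed battery internal resistance, ohms

i = v / (r_wire + r_internal)    # I = V / R
p_wire = i**2 * r_wire           # heat dumped into the wire itself
print(f"I = {i:.1f}A, wire heating = {p_wire:.1f}W")
# I = 5.6A, wire heating = 3.2W -- enough to burn fingers fast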

I have a good idea to save power. Don't use high-end graphics cards!

QFT.

But ZodaEX, my goal is not to save power. My goal is to have a powerful gaming system that does not waste power, especially when not gaming. No one has ever called me an environmentalist, 'cause I am not. I hate waste of any kind. I like cars with lots of power, but they should be as efficient as possible, especially when not at the track.
 