I have an 8800GTX and a 7600GT driving 3 monitors. 80% of the time I am running an IDE, email, Word, and IMs, so I tend to underclock my GTX.
I was wondering two things: is it worth it to save power, and how low can I underclock?
Underclocking the GTX isn't going to save as much as turning off monitors when not in use.
Are you underclocking just to save power, or are you trying to reduce noise?
I don't think it is going to make a noticeable difference on your electric bill or anything.
I'm not convinced that underclocking saves any power, unless the voltage feeding the card is adjusted on the fly, which I don't think it is.
All energy going in must come out in some form. In this case, that form is heat. Less heat out means less energy in, thanks to the first law of thermodynamics. I think you're right about voltage being fixed, which means any increased power is due to increased current.
Is it worth it to save power, and how low can I underclock?
There's no reason not to, even if the benefits are pretty minor. My X1950 Pro craps out below about 70% of stock GPU clocks and 85% of memory clocks, though that probably has little relevance for an 8800GTX user...
I can't tell you for sure with video cards, but underclocking CPUs definitely reduces power consumption, especially under load.
I bought a Kill-A-Watt and couldn't believe how much power my computer was drawing - about 170 watts. I calculated it against my most recent power bill, and my PC, which runs 24/7, costs me about $20 a month to run. I now try to use sleep mode (which drops power usage to 30 watts), but so far I can only use sleep mode when I force it manually - my PC won't do it automatically (see my thread if you can help me with my XP insomnia problem here).
A lot of people have the Kill-A-Watt and it's easy to use - ask around and see if any of your friends have one. If not, spend the $25 to get one. I found it quite useful and have lent it out to a lot of other people to play with too.
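If you want to turn a Kill-A-Watt reading into dollars, the math is just watts x hours x rate per kWh. A quick Python sketch, assuming an example rate of $0.16/kWh (use whatever your bill actually says):

def monthly_cost(watts, rate_per_kwh=0.16, hours_per_day=24, days=30):
    # watt-hours -> kWh, then multiply by the rate
    kwh = watts * hours_per_day * days / 1000.0
    return kwh * rate_per_kwh

print(monthly_cost(170))   # ~170 W running 24/7 -> roughly $19.60/month
print(monthly_cost(30))    # sleep mode at ~30 W -> roughly $3.50/month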
For sure!! I have my monitors turn off if idle for 10 minutes.
No, not noise. I cannot hear my GTX at 60% of the default fan speed. Unfortunately I cannot slow the fan on my 7600GT, and that damn thing is noisy.
Well, I was hoping to save some electricity, but if it does not make a difference I guess I won't bother underclocking.
Ok I have a question about CPU overclocking:
I have my 6750 running at 1.35 volts @ 3.2 GHz. I can also run it at 1.35 volts @ 2.6 GHz.
If both settings are using the same voltage, then why would running at 3.2 GHz give higher temperatures?
Which of my assumptions is wrong?
Intel defines the thermal output of their chips as:
Wth = Cg x V^2 x F [where Cg is gate capacitance, V is voltage, and F is frequency]
You can see that voltage has the largest effect on thermal output (it's the squared term), but frequency is also a factor.
So, yes, lowering clockspeed will save you a bit of power. Not all that much though.
3.2 GHz: 0.02 x (1.35^2) x 3200 = 116.64 W
2.6 GHz: 0.02 x (1.35^2) x 2600 = 94.77 W
So, yes... there's about a 22 W difference between the two speeds. See the equation I listed above. When transistors charge and drain, they lose power as heat; the faster they switch, the more often that loss is incurred.
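If you want to plug in your own numbers, here's the same formula as a small Python sketch (the 0.02 effective-capacitance figure is just the value used above, not an official spec):

def dynamic_power(cg, volts, freq_mhz):
    # Wth = Cg * V^2 * F, with F in MHz so the result comes out in watts for this Cg
    return cg * volts**2 * freq_mhz

p_32 = dynamic_power(0.02, 1.35, 3200)   # ~116.64 W at 3.2 GHz
p_26 = dynamic_power(0.02, 1.35, 2600)   # ~94.77 W at 2.6 GHz
print(p_32, p_26, p_32 - p_26)           # difference of roughly 22 W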
So according to the laws of thermodynamics, my PSU must be feeding my CPU more voltage than the stated 1.35 V in the BIOS in order to overclock from 2.6 GHz to 3.2 GHz?
Why? Wouldn't more amps going through at 1.35 V give off more heat?
I'm pretty sure those figures would be for a 100% load. The whole idea of underclocking, as far as I'm aware, is to do it when your system is idle. The difference in temperature (and thus thermal output in the equation above) is a lot less at that point - a whopping 0.75°C between my underclock and overclock.
So a total of 56.88 W combined CPU/GPU power savings. Whoa, I hope that is right. My mama would be proud of me.
Ah, I see what you're saying, I think.
So even though the voltage is the same, more amperage can be getting there. So voltage doesn't control amperage?
I also have a Kill-A-Watt. With my HD 3850 running at 300 MHz, my system consumes 76 W at idle. When I forced it to run at 669 MHz, power consumption increased to 102 W at idle.
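Back-of-the-envelope, that 26 W idle difference adds up over a year. Rough Python, again assuming an example rate of $0.16/kWh:

idle_diff_watts = 102 - 76                        # measured difference at idle
kwh_per_year = idle_diff_watts * 24 * 365 / 1000  # ~228 kWh
print(kwh_per_year * 0.16)                        # roughly $36 per year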
Clock frequency is such a minute factor in temps that it's not even worth bothering with; voltage makes up the majority of the heat difference.
If you're not dropping the voltage along with the clock frequency, then don't bother.
I have a good idea to save power: don't use high-end graphics cards!