# Help me explain to a friend about cooling vs total heat production.

#### evanesce

##### Limp Gawd
My friend is telling me that if you have more efficient cooling on a system (i.e. higher fan speeds, etc.), the total amount of heat "produced" by the CPU/GPU, etc. is less.

Every bone in my body is telling me that the same heat is still there: it's all still produced, it's just being dissipated out into the room more effectively. Overall, the same amount of heat is being produced, all else being equal.

Who is right? Can you prove it? I'm so curious!

#### JS_

##### Weaksauce
I believe you are both somewhat right. In an ideal world the heat is all still there, just being dissipated more efficiently. However, electronics do seem to use less power at lower temperatures. See this thread for the details of power draw vs. temp on a 2600K. The changes aren't huge when the temp change isn't huge, but they're there.

#### evanesce

##### Limp Gawd
It may be true that it uses less power, even if the difference is microscopic, but it still produces the same amount of heat at whatever power it's currently drawing, no?

#### Tsumi

##### [H]F Junkie
It's basic thermodynamics. In this case, you are right. Although it's true that most electronics operate more efficiently (produce less heat) at lower temperatures, the difference is marginal.

One way you can test this is to get a Kill A Watt meter and put your computer under a GPU load like FurMark. Then manually adjust the fan speed so it operates at different temperatures, and see what the power draw is.

#### JLangevin

##### [H]ard|Gawd
The same heat is being generated; it's just making it to the atmosphere faster through more adequate cooling. I remember building a completely custom water loop for a friend. He went from low-end air cooling to a high-end 1/2" loop. He took the PC home, and later that night I got a call from him: "Hey, I don't get it... I have all this cooling now but the room still gets just as hot when I play games, wtf??"

High-end cooling gets the heat OUT of the component; it doesn't magically remove it altogether.

#### Forceman

##### [H]F Junkie
It's the same thing that confused so many people about Ivy Bridge. A higher temperature doesn't necessarily mean higher power (which is what the heat really is). A hot plate uses a heck of a lot more power than a candle, but the candle flame is a lot hotter.

The temperature is dependent on how efficiently you can remove the heat from the system, even if the amount of heat is the same.
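As a rough sketch (all numbers below are invented for illustration, not measurements), the steady-state picture is T_die = T_ambient + P × θ, where θ is the cooler's total thermal resistance in °C/W. Better cooling shrinks θ, not P:

```python
# Simple steady-state thermal-resistance model (illustrative numbers only).
# Die temperature = ambient + power * total thermal resistance.

def die_temp(power_w, ambient_c, theta_c_per_w):
    """Steady-state die temperature for a given heat load and cooler."""
    return ambient_c + power_w * theta_c_per_w

power = 95.0     # W of heat produced by the CPU (identical in both cases)
ambient = 25.0   # deg C room temperature

stock_air = die_temp(power, ambient, 0.50)   # assumed 0.50 C/W stock heatsink
water_loop = die_temp(power, ambient, 0.25)  # assumed 0.25 C/W custom loop

print(stock_air)   # 72.5
print(water_loop)  # 48.75
# Either way, the same 95 W ends up in the room; only the die temperature differs.
```

The water loop halves θ and the die runs much cooler, but the 95 W heating the room is unchanged, which is exactly what the friend with the new loop discovered.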

#### Johnx64

##### Supreme [H]ardness
> It's basic thermodynamics. In this case, you are right. Although it's true that most electronics operate more efficiently (produce less heat) at lower temperatures, the difference is marginal.
>
> One way you can test this is to get a Kill A Watt meter and put your computer under a GPU load like FurMark. Then manually adjust the fan speed so it operates at different temperatures, and see what the power draw is.

Different fan speed = different power usage...

#### RamonGTP

##### Supreme [H]ardness
> Different fan speed = different power usage...

While technically true, the difference is inconsequential.

#### alex_wu

##### n00b
TDP is the max power draw of that thing.
Nobody has mentioned the efficiency of the CPU. For example, how much of that power turns into heat?
This is a similar concept, but totally different from performance per watt.

Let's talk about converting electrical energy into heat.
Perhaps IB is hotter because more of the power it draws is converted into heat compared to SB. At the same time, because of a better design, it has higher performance per watt.
So comparing a 95W IB to a 95W SB, the IB will have higher performance per clock and run hotter.

Back to the main question, my take would be that it is negligible.

Air cooling is about the temperature difference between the heatsink and the ambient air. It is more efficient if the heatsink is much hotter than ambient, because the thermal gradient is higher; you can move air at a lower CFM and the temperature will hold steady.

If you try to reduce the heatsink temperature further by increasing the CFM, the amount of electricity spent increases exponentially relative to the temperature drop gained, until a terminal point is reached where the cooling cannot reduce the temperature below ambient.

#### RamonGTP

##### Supreme [H]ardness
TDP = Thermal Design Power, not power draw; it's the amount of heat energy that needs to be dissipated.

#### jojo69

##### [H]F Junkie
some people you just can't teach

I had a guy...wanted to put windmills on his truck and get free power

#### Tsumi

##### [H]F Junkie
> TDP is the max power draw of that thing.
> Nobody has mentioned the efficiency of the CPU. For example, how much of that power turns into heat?
> This is a similar concept, but totally different from performance per watt.
>
> Let's talk about converting electrical energy into heat.
> Perhaps IB is hotter because more of the power it draws is converted into heat compared to SB. At the same time, because of a better design, it has higher performance per watt.
> So comparing a 95W IB to a 95W SB, the IB will have higher performance per clock and run hotter.
>
> Back to the main question, my take would be that it is negligible.
>
> Air cooling is about the temperature difference between the heatsink and the ambient air. It is more efficient if the heatsink is much hotter than ambient, because the thermal gradient is higher; you can move air at a lower CFM and the temperature will hold steady.
>
> If you try to reduce the heatsink temperature further by increasing the CFM, the amount of electricity spent increases exponentially relative to the temperature drop gained, until a terminal point is reached where the cooling cannot reduce the temperature below ambient.

No... just... no.

First section: already covered in the post above. TDP is the minimum amount of heat that the heatsink design must be able to draw away, and the minimum amount of power the MOSFETs/motherboard must be able to supply. The i7 3770K could actually be rated as a 77W TDP chip, but that might lead some motherboard manufacturers to create 77W TDP motherboards compatible with IB only, not with both IB and SB with its 95W TDP.

Second section: this topic has been beaten to death already. Poorer TIM (leading to greater temperature differences needed to transfer the same amount of heat) and a smaller die (leading to more concentrated heat) led to the higher temperatures. Performance per clock (or IPC) is completely dependent on architecture design, not temperature or anything else. IB's power consumption and actual heat produced are lower compared to SB.

Third section: a little convoluted, but you've got the right idea in terms of physics. It has nothing to do with efficiency. A higher temperature gradient means more heat transferred per unit time and area. It also means the CPU will run hotter if the heatsink is hotter.

Fourth section: once again, nothing to do with efficiency. Increased CFM lowers temperatures because it lowers the air temperature within the heatsink: air heats up as heat is transferred to it from the heatsink, so moving that air away and replacing it with fresh air increases the temperature gradient, which increases the rate of heat transfer and lowers CPU temps. There does come a point, however, where the rate at which air is replaced is so high that, for all practical purposes, the air around the heatsink is constantly at ambient temperature, and the cooler has reached its maximum cooling capacity. This has nothing to do with more electricity being used by the fan, or efficiency of anything.
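The gradient argument can be put in numbers with the basic convection relation Q = h × A × (T_surface − T_air); every value here is an assumed, made-up figure, just to show the proportionality:

```python
# Convective heat transfer sketch: Q = h * A * (T_surface - T_air).
# h, area, and temperatures are illustrative assumptions, not measurements.

def heat_transfer_w(h_w_per_m2k, area_m2, t_surface_c, t_air_c):
    """Heat moved from the heatsink fins to the surrounding air, in watts."""
    return h_w_per_m2k * area_m2 * (t_surface_c - t_air_c)

area = 0.25     # m^2 of fin area (assumed)
t_fins = 55.0   # deg C fin temperature

# Low CFM: air lingers between the fins and heats up to 45 C.
q_stale = heat_transfer_w(10.0, area, t_fins, 45.0)
# High CFM: air is replaced fast enough to stay near 25 C ambient.
q_fresh = heat_transfer_w(10.0, area, t_fins, 25.0)

print(q_stale)  # 25.0
print(q_fresh)  # 75.0
# Tripling the gradient triples the heat moved per unit time at the same h and A.
```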


#### Deleted whining member 223597

##### Guest
Well, I have noticed that I seem to get a few more stutters when my CPU starts to get hot compared to when it is cold, but it is most likely just something I think I see, or it would happen either way and is just occurring because the PC has been on longer.

#### jamsomito

##### 2[H]4U
> some people you just can't teach
>
> I had a guy...wanted to put windmills on his truck and get free power

#### XacTactX

##### Supreme [H]ardness
> I believe you are both somewhat right. In an ideal world the heat is all still there, just being dissipated more efficiently. However, electronics do seem to use less power at lower temperatures. See this thread for the details of power draw vs. temp on a 2600K. The changes aren't huge when the temp change isn't huge, but they're there.

This was HIGHLY informative. Thank you for posting.

#### Cheetoz

##### [H]ard|Gawd
Somewhat true; it depends on how much you want to nitpick.

An increase in temperature increases the resistance of the silicon, which is a factor in the heat generated.

P = I^2 * R
R(T) = R(T0) * (1 + TCR1 * (T - T0)), where R(T0) is the room-temperature resistance and TCR1 is the first-order temperature coefficient of resistance.
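Plugging made-up numbers into that first-order model shows the size of the effect at a constant current (the coefficient, resistance, and temperatures below are illustrative assumptions, not measured silicon data):

```python
# First-order resistance-vs-temperature model: R(T) = R(T0) * (1 + TCR1 * (T - T0)).
# All values are illustrative assumptions, not real CPU figures.

def resistance_at(t_c, r_room_ohm, tcr1_per_c, t_room_c=25.0):
    """Resistance at temperature t_c under the first-order TCR model."""
    return r_room_ohm * (1.0 + tcr1_per_c * (t_c - t_room_c))

r25 = 1.0        # ohm at 25 C (assumed)
tcr = 0.004      # per deg C, a metal-like coefficient (assumed)
current = 10.0   # A, held constant for the comparison

p_cool = current**2 * resistance_at(45.0, r25, tcr)  # chip at 45 C
p_hot = current**2 * resistance_at(85.0, r25, tcr)   # chip at 85 C

print(round(p_cool, 1))  # 108.0
print(round(p_hot, 1))   # 124.0
# Hotter silicon -> higher resistance -> more power dissipated at the same current.
```

In a real CPU the dominant temperature-dependent term is leakage current rather than resistive TCR, but the direction of the effect is the same: higher temperature means slightly higher power at the same workload.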

#### evilsofa

##### [H]F Junkie
> if you have more efficient cooling on a system (i.e. higher fan speeds, etc.), the total amount of heat "produced" by the CPU/GPU, etc. is less.

Yes, but is it significant? The study that JS_ linked to gives you an idea.

If you have a server farm with hundreds or thousands of CPUs or GPUs, then absolutely yes, it is your #1 problem to solve.

If you are doing bitcoining or folding, then maybe; how big is your operation?

If you have one or two PCs that you game on for a few hours a day in a home with an electric bill of around $100 a month, then not even a little bit.