The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of heat generated by the CPU (or GPU) that the cooling system in a computer is required to dissipate in typical operation. Rather than specifying the CPU's (or GPU's) real power dissipation, TDP serves as the nominal value for designing CPU (or GPU) cooling systems.[1]
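To make the distinction in that definition concrete, here is a minimal sketch (every wattage below is a made-up placeholder, not any specific card's figure) of TDP as a cooler-sizing number rather than a live power reading:

```python
# Hypothetical illustration: TDP is a fixed design rating used to size the cooler;
# instantaneous power draw is whatever the card happens to be pulling at the moment.
TDP_RATING_W = 165        # placeholder vendor TDP rating
COOLER_CAPACITY_W = 180   # placeholder: heat the chosen cooler can dissipate

def cooler_is_adequate(tdp_w: float, cooler_capacity_w: float) -> bool:
    """The cooling solution is picked against the TDP rating, not a wall reading."""
    return cooler_capacity_w >= tdp_w

# Draw at the wall swings with load (idle, gaming, transient spikes),
# but none of these readings change the rating itself.
for draw_w in (12, 140, 177, 260):
    print(f"draw={draw_w}W  TDP={TDP_RATING_W}W  "
          f"cooler adequate: {cooler_is_adequate(TDP_RATING_W, COOLER_CAPACITY_W)}")
```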
Which is exactly what I illustrated. The TDP can be a fixed number regardless of the actual power-draw of a component, and TDP will always be lower than the actual power draw of the component by some amount.
The question is as applicable regardless of the extremity of the example.
I never said the TDP wasn't a constant (of COURSE it's a constant, it's a single number). You're arguing against a point I never made.
No. The TDP is a constant: it is defined by NVIDIA. Its TDP is 145W regardless of whether the card draws 6W (idling) or 260W (transient load).
If a card draws 177w of power, it will expend SOMEWHAT LESS THAN 177w as heat.
A graphics card drawing 177W expends approximately 177W of energy as heat. It's not some magical ratio of its power consumption to its rated TDP.
Yes, see above. TDP (assuming it was rated correctly) will always be lower than actual power consumption, because a graphics card does not convert 100% of the power it consumes into heat.
No. See above.
Incorrect. Just to start with, some of it is also converted into mechanical energy, and some of it is converted into electromagnetic radiation.
You're dead wrong. ALL of the electrical energy consumed by the graphics card is directly converted to heat.
I DID NOT, anywhere, say that any power gets "lost".
Unless the graphics card is breaking the laws of physics this holds true. Basically what you're saying is 100% power goes in and gets lost somewhere.
At what? Yourself?
I'm still laughing.
Or it has a leaf-blower attached to it. Either way, it draws LOTS of power but doesn't put out much heat.
If a card truly had a TDP limit of 177W and drew 800W it would go up like a Roman candle.
I never said that. I simply said that overclocked cards will always put out more heat and draw more power than reference models.
Nice try trying to say that these O/C cards aren't converting the extra power they are using into heat.
Still waiting for an answer.
What work would this theoretical graphics card be doing with the equivalent of one horsepower of energy?
While definitions vary (and the methods used to compute it similarly vary), they do all fundamentally describe the same thing.
While you argue over the definition of TDP, please be aware that the marketing divisions of Intel, AMD and Nvidia have each come up with their own definitions.
Already said, it was a hypothetical configuration for illustration purposes only.
Still waiting for an answer.
And it illustrated the point perfectly. Just because a part draws 800w doesn't mean it can't also have a 100w TDP. Similarly, the less-extreme example of a part that draws 177w and has a 145w TDP can also pan out. This is pretty simple stuff, man...
And I already gave a hypothetical answer as to what could cause such a rating to actually pan out: a leafblower attached to the card. Suddenly it can draw 800w AND have a 100w TDP, and those numbers are both correct. So... I'm not sure what you're waiting on; the ridiculous hypothetical situation already has a ridiculous hypothetical answer. Guess you just missed it, just like you missed the ENTIRE point that example was illustrating: that TDP and actual power consumption are NOT 1:1.
As I said previously, plug in real-world numbers (like the GTX 970 in the previously-posted table that was pulling 177w, and has a 145w TDP rating). Not all of the power it's drawing is being converted into heat; THEREFORE, the TDP is lower than the power consumption.
Fine, it has an A/C unit attached to it. You realize you're totally missing the point, right? WHY the hypothetical, for-illustration-purposes-only example part in question draws 800w of power and only puts out 100w of heat is irrelevant; the point being made is that such figures are POSSIBLE.
Per your sample, saying it draws 800W and it has a 100W TDP, the card either has an A/C unit attached to it or someone fudged on the calculations...
No. Your claim was that a card drawing 800W with a 100W TDP was efficient:
And I already gave a hypothetical answer as to what could cause such a rating to actually pan out: a leafblower attached to the card.
You're asserting that this theoretical graphics card is efficient not at rendering images for its consumed power, but at blowing leaves?
As I said, a card with a 100w TDP that consumes 800w of power would imply a VERY efficient card that uses most of the power for actual work.
For thermal design power, the power consumed by the cooling system should not be factored in, no. TDP is intended to define the requirements of the cooling system itself.
The other argument seems to be around TDP, which is a design specification, usually for a chip or system, that you match your cooling solution to. That seems pretty straightforward... I agree with most posts; I don't think the fan wattage should be factored in for design purposes...
Assuming that power is being put towards doing something useful and not producing waste heat, then yes, that would be quite efficient.
No. Your claim was that a card drawing 800W with a 100W TDP was efficient
Fucked up what, exactly?
You fucked up monstrously. You know it; everyone else knows it. It's why they're laughing at you.
Admit what, exactly? That my example worked perfectly to illustrate my point that TDP and actual power consumption do not have a 1:1 relationship?
Why not just save face and admit to it?
I have a basic understanding of physics, so I'm not sure what you're taking issue with.
Protip: if you have zero scientific background and don't understand basic thermo, don't try to fake it.
My point stands: a graphics card does not convert 100% of the power it consumes into heat, therefore TDP will always be lower than actual power consumption.
This isn't a complicated concept. Various components (like coils) vibrate, emit electromagnetic waves, or (in the case of the fan) convert electricity into kinetic energy. All of these require power, but convert it into something OTHER than heat. Thus, actual power consumption ends up higher than TDP. Simple.
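A rough sketch of the accounting that post describes, with made-up placeholder numbers for each term (none of these values are measurements):

```python
# Hypothetical energy accounting for a card drawing a fixed electrical power.
# Every number here is a placeholder for illustration only.
power_in_w = 177.0       # electrical power drawn (placeholder)

fan_kinetic_w = 3.0      # placeholder: fan motor output as moving air
em_radiation_w = 0.5     # placeholder: radiated electromagnetic energy
vibration_w = 0.2        # placeholder: coil/board vibration

# Whatever is not converted to the other forms ends up as heat;
# conservation of energy means the terms must sum to the input power.
heat_w = power_in_w - (fan_kinetic_w + em_radiation_w + vibration_w)
assert abs(heat_w + fan_kinetic_w + em_radiation_w + vibration_w - power_in_w) < 1e-9

print(f"{power_in_w}W in -> {heat_w}W heat, {power_in_w - heat_w}W everything else")
```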
A little bit, but the vast majority of the power consumed by the fan is converted into kinetic energy, not heat. Ergo, the vast majority of the power the fan consumes does not contribute to TDP in any way.
The fan motor also gets warm!
Luckily, to everyone's amusement, that wasn't the only thing it illustrated.
That my example worked perfectly to illustrate my point
To what are you referring, exactly? Point was made, point is valid, point has not been disproven in any way.
Luckily, to everyone's amusement, that wasn't the only thing it illustrated.
No, it means that the designers have rated the part to be handled by a cooling solution that can dissipate a maximum of 165w of heat.
The bottom line is, if the TDP is 165 watts, and the thing is drawing 270 watts, TDP means diddly squat.
Like I said, if the power is being used to do something useful rather than generate a waste product, that's pretty much the definition of efficient usage of resources.
And in the case of your 800-watt air conditioner, that takes power too. Now you have the card draw, PLUS the refrigeration unit power draw. Not exactly efficient.
right?
Doesn't mean shit? TDP is a very practical figure to keep in mind. It means only 165w of heat being dumped into my case.
Again, it doesn't mean shit, it's still sucking 270 watts at my wall.
The decrease in TDP seems like a pretty big win to me. Much easier to run a cool and quiet GTX 980 (165w TDP) than to try and tame a GTX 780 Ti (250w TDP). That's a nice 85w reduction in heat output.
So, basically the same or ostensibly slightly better performance for a 30 watt decrease in POWER USAGE is by no means the panacea the Nvidia camp has been crowing about.
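For reference, a quick sketch of the arithmetic behind the two deltas being compared here, using only the figures already quoted in this thread (variable names are just labels):

```python
# Figures as quoted earlier in the thread (TDP ratings vs. reported wall draw).
gtx_780_ti_tdp_w = 250
gtx_980_tdp_w = 165
r9_290_wall_w = 300      # wall draw quoted above for the 290
gtx_980_wall_w = 270     # wall draw quoted above for the 980

print(f"TDP reduction: {gtx_780_ti_tdp_w - gtx_980_tdp_w}W")       # 85W
print(f"Wall draw difference: {r9_290_wall_w - gtx_980_wall_w}W")  # 30W
```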
My next thread.
AMD 390x may lose the performance crown to NVIDIA's next generation card in 12 months. I wonder if it will troll along to 6 pages of dogshit.
So if I buy a 95w CPU and overclock it from 3.4GHz to 5.0GHz then it is only dumping 95w of heat into my case. Awesome! Thx!
Again, it doesn't mean shit, it's still sucking 270 watts at my wall.
Vs 300 watts for 290.
So, basically the same or ostensibly slightly better performance for a 30 watt decrease in POWER USAGE is by no means the panacea the Nvidia camp has been crowing about.
No, he is saying it is easier to cool because it is more efficient! LoL
I don't feel like reading this whole thread, but are we talking about the GTX 980 consuming only 30 watts less than a 290 or 290X?

http://www.hardocp.com/article/2014/09/18/nvidia_maxwell_gpu_geforce_gtx_980_video_card_review/13#.VDSRkhZRzms

It's more like 100+ watts.

What is embarrassing for AMD is that the R9 285 draws more power than a GTX 980.

http://www.hardocp.com/article/2014/09/02/msi_radeon_r9_285_gaming_oc_video_card_review/10#.VDSR2xZRzms

With power consumption like this, it's no wonder AMD needs water to beat Nvidia.
Telling you would rather defeat the purpose of being intentionally cryptic.
To what are you referring, exactly?
You have your graphics cards plugged directly into the wall, do you? Remarkable, truly.
Again, it doesn't mean shit, it's still sucking 270 watts at my wall. Vs 300 watts for 290.
Only if you buy the model with the attached leaf blower!
So if I buy a 95w CPU and overclock it from 3.4GHz to 5.0GHz then it is only dumping 95w of heat into my case. Awesome! Thx!