GTX 980 May Lose The Performance Crown To AMD’s R9 380X in February 2015

TDP = Thermal Design Point

It's the theoretical maximum power that the GPU package/board design generates at spec.
 
The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of heat generated by the CPU (or GPU), which the cooling system in a computer is required to dissipate in typical operation. Rather than specifying CPU's (or GPU) real power dissipation, TDP serves as the nominal value for designing CPU (or GPU) cooling systems.[1]

From Wikipedia.
 
The question is as applicable regardless of the extremity of the example.
Which is exactly what I illustrated. The TDP can be a fixed number regardless of the actual power-draw of a component, and TDP will always be lower than the actual power draw of the component by some amount.

No. The TDP is a constant: it is defined by NVIDIA. Its TDP is 145W regardless of whether the card draws 6W (idling) or 260W (transient load).
I never said the TDP wasn't a constant (of COURSE it's a constant, it's a single number). You're arguing against a point I never made.

A graphics card drawing 177W expends approximately 177W of energy as heat. It's not some magical ratio of its power consumption to its rated TDP.
If a card draws 177w of power, it will expend SOMEWHAT LESS THAN 177w as heat.

I never said anything about magic ratios, not sure what you're on about with that.

No. See above.
Yes, see above. TDP (assuming it was rated correctly) will always be lower than actual power consumption, because a graphics card does not convert 100% of the power it consumes into heat.

Simple as that, argue all you want, nothing you can say will change this fact.
 
You're dead wrong. ALL of the electrical energy consumed by the graphics card is directly converted to heat.
Incorrect. Just to start with, some of it is also converted into mechanical energy, and some of it is converted into electromagnetic radiation.

More than 10w goes towards driving the fan(s) on the GTX 970, which already creates a minimum deficit between the actual power consumption of the card and its TDP (when the fan is running full speed). It's very, very, VERY obvious that not-all of the power these cards draw becomes heat.

Unless the graphics card is breaking the laws of physics, this holds true. Basically, what you're saying is 100% of the power goes in and gets lost somewhere.
I DID NOT, anywhere, say that any power gets "lost".

It's used for other things, but not lost. No laws of physics are being broken, conservation of energy is still perfectly in-play, and all WITHOUT the card converting 100% of the power it receives into heat.

I'm still laughing.
At what? Yourself?

What I said remains true: TDP doesn't directly equate to power consumption. TDP should always be lower than power consumption.
 
If a card truly had a TDP limit of 177W and drew 800W it would go up like a Roman candle.
Or it has a leaf-blower attached to it. Either way, it draws LOTS of power but doesn't put out much heat.

And that would make even the ridiculous notion of a card drawing 800w of power and only having a TDP rating of 177w pan out.

Nice try trying to say that these O/C cards aren't converting the extra power they are using into heat.
I never said that. I simply said that overclocked cards will always put out more heat and draw more power than reference models.
 
While you argue over the definition of TDP, please be aware that the marketing divisions of Intel, AMD and Nvidia have each come up with their own definitions. There's an interesting post about it here. It had the literal definition along with Intel's, AMD's, and Nvidia's definitions for TDP, but some of the links are gone and the post's writer even seems to have gotten AMD's and Intel's definitions switched.

The literal definition: "the maximum amount of heat that a processor will generate" (I don't know the post's source but it seems similar to Wikipedia's definition posted above by PcZac).

Intel's definition in 2011 (link goes to .pdf): "The upper point of the thermal profile consists of the Thermal Design Power (TDP) and the associated Tcase value. Thermal Design Power (TDP) should be used for processor thermal solution design targets. TDP is not the maximum power that the processor can dissipate. TDP is measured at maximum TCASE."

Intel's description of AMD's definition (from the same .pdf): "the maximum power a processor can draw for a thermally significant period while running commercially useful software"

Nvidia's new, changed definition as described in a 2010 blog: "a measure of maximum power draw over time in real world applications". That article gave the typical definition as "the amount of power (heat) that a cooler must dissipate in order to keep a silicon chip within its operating temperatures" (oh great, power = heat) and then went on to say that both Intel and AMD agree that it's "a measurement of waste heat output from a chip".

I will further note that the marketing divisions made their changes a few years ago, so if you look at older articles and forum posts, you'll find different descriptions of how TDP works, like this one from 2009.

I briefly tried to find current pages from AMD and Nvidia with their definitions of TDP but I gave up quickly because, honestly, I'm feeling lazy tonight and they don't seem to be easy to find.

We're all totally clear about the definition of TDP now, right?
 
While you argue over the definition of TDP, please be aware that the marketing divisions of Intel, AMD and Nvidia have each come up with their own definitions.
While definitions vary (and the methods used to compute it similarly vary), they do all fundamentally describe the same thing.
 
While definitions vary (and the methods used to compute it similarly vary), they do all fundamentally describe the same thing.

1. "The maximum amount of heat that a processor will generate"
2. "TDP is not the maximum power that the processor can dissipate"
3. "The maximum power a processor can draw..."
4. "The amount of power (heat) that a cooler must dissipate..."
5. "a measurement of waste heat output from a chip"

Okay, #3 is not like all the others, so AMD was wrong?
 
AMD's definition for thermal design power isn't directly defined in terms of thermals, which does make it a pretty goofy definition relative to the others. Those are defined with the aim of providing a target value to those who design cooling systems ("my system needs to be able to dissipate this much heat energy over sustained periods"). It does, however, clarify that the figure is a measure of a "thermally significant period", rather than any instantaneous power draw, so it should adequately reflect the needs of a cooling system.
 
Still waiting for an answer.
Already said, it was a hypothetical configuration for illustration purposes only.

And it illustrated the point perfectly. Just because a part draws 800w, doesn't mean it can't also have a 100w TDP. Similarly, the less-extreme example of a part that draws 177w and has a 145w TDP can also pan-out. This is pretty simple stuff man...

And I already gave a hypothetical answer as to what could cause such a rating to actually pan out: a leafblower attached to the card. Suddenly it can draw 800w AND have a 100w TDP, and those numbers are both correct. So... I'm not sure what you're waiting on, the ridiculous hypothetical situation already has a ridiculous hypothetical answer. Guess you just missed it, just like you missed the ENTIRE point that example was illustrating: That TDP and actual power consumption are NOT 1:1

As I said previously, plug in real-world numbers (like the GTX 970 in the previously-posted table that was pulling 177w, and has a 145w TDP rating). Not all of the power it's drawing is being converted into heat, THEREFORE, the TDP is lower than the power consumption.
 
Already said, it was a hypothetical configuration for illustration purposes only.

And it illustrated the point perfectly. Just because a part draws 800w, doesn't mean it can't also have a 100w TDP. Similarly, the less-extreme example of a part that draws 177w and has a 145w TDP can also pan-out. This is pretty simple stuff man...

And I already gave a hypothetical answer as to what could cause such a rating to actually pan out: a leafblower attached to the card. Suddenly it can draw 800w AND have a 100w TDP, and those numbers are both correct. So... I'm not sure what you're waiting on, the ridiculous hypothetical situation already has a ridiculous hypothetical answer. Guess you just missed it, just like you missed the ENTIRE point that example was illustrating: That TDP and actual power consumption are NOT 1:1

As I said previously, plug in real-world numbers (like the GTX 970 in the previously-posted table that was pulling 177w, and has a 145w TDP rating). Not all of the power it's drawing is being converted into heat, THEREFORE, the TDP is lower than the power consumption.

Per your sample, saying it draws 800W and it has a 100W TDP, the card either has an A/C unit attached to it or someone fudged the calculations...
 
Per your sample, saying it draws 800W and it has a 100W TDP, the card either has an A/C unit attached to it or someone fudged the calculations...
Fine, it has an A/C unit attached to it :rolleyes: You realize you're totally missing the point, right? WHY the hypothetical, for-illustration-purposes-only example part in question draws 800w of power and only puts out 100w of heat is irrelevant; the point being made is that such figures are POSSIBLE.

All that was meant to illustrate is the fact that TDP doesn't directly equate to power consumption. There are all kinds of things that can draw power without converting it into heat, and every one cumulatively adds to the deficit between TDP rating and actual power consumption.
 
And I already gave a hypothetical answer as to what could cause such a rating to actually pan out: a leafblower attached to the card.
No. Your claim was that a card drawing 800W with a 100W TDP was efficient:
As I said, a card with a 100w TDP that consumes 800w of power would imply a VERY efficient card that uses most of the power for actual work
You're asserting that this theoretical graphics card is efficient not at rendering images for its consumed power, but at blowing leaves?

You fucked up monstrously. You know it; everyone else knows it. It's why they're laughing at you. Why not just save face and admit to it?
 
The problem is that the useful work done by a card is a minimal output, <5W(?), to a monitor. Everything else is waste one way or another. Whether it's blowing hot air or vibrations from the gates flipping... it's all waste.

That's why we generally break it down to GFLOPs per watt or some kind of measurement. For me, FPS/watt might make sense in a given game :).


The other argument seems to be around TDP, which is a design specification, usually for a chip or system, that you match your cooling solution to. That seems pretty straightforward... I agree with most posts; I don't think the fan wattage should be factored in for design purposes...

What really matters is work performed (GFLOPs or what have you) / overall system draw, including cooling systems.
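
For a rough sense of what that kind of perf-per-watt metric looks like, here's a toy Python sketch; every number in it is a made-up placeholder, not a measured figure:

# Toy perf-per-watt comparison; all figures below are hypothetical placeholders.
cards = {
    # name: (GFLOPS, average FPS in some game, total system draw in watts)
    "Card A": (4600.0, 62.0, 310.0),
    "Card B": (5600.0, 60.0, 390.0),
}

for name, (gflops, fps, watts) in cards.items():
    print(f"{name}: {gflops / watts:.1f} GFLOPS/W, {fps / watts:.2f} FPS/W")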

By the way, this whole conversation boggles my mind.

And who cares if the 380X is > the 980 months and months after release? What matters is what nVidia is releasing around February. I like to look forward.
 
Thank you wonderfield for being the voice of reason here.

Protip: if you have zero scientific background and don't understand basic thermo, don't try to fake it.
 
The other argument seems to be around TDP, which is a design specification, usually for a chip or system, that you match your cooling solution to. That seems pretty straightforward... I agree with most posts; I don't think the fan wattage should be factored in for design purposes...
For thermal design power, the power consumed by the cooling system should not be factored in, no. TDP is intended to define the requirements of the cooling system itself.
 
No. Your claim was that a card drawing 800W with a 100W TDP was efficient
Assuming that power is being put towards doing something useful and not producing waste heat, then yes, that would be quite efficient.

Not seeing what you're taking issue with. Oh, and you're STILL missing the entire point of that example, in case you hadn't realized :rolleyes:

You fucked up monstrously. You know it; everyone else knows it. It's why they're laughing at you.
Fucked up what, exactly?

I used a hypothetical set of numbers as an example to make a point, and that example makes my point perfectly. TDP does not directly equate to power consumption.

Do you not understand that the chosen numbers are of NO consequence for the sake of the point being made, except for one being a higher number than the other? I could have said "a card that draws 110w and has a 100w TDP," and the example works EXACTLY the same way. Still illustrates a perfectly valid situation where TDP doesn't directly equate to actual power consumption.

Point made, point stands, no fuck-up. The only fuck-up is that you're hung up on an inconsequential aspect of the example used, which does nothing to negate the point made by the example.

Why not just save face and admit to it?
Admit what, exactly? That my example worked perfectly to illustrate my point that TDP and actual power consumption do not have a 1:1 relationship?
 
Protip: if you have zero scientific background and don't understand basic thermo, don't try to fake it.
I have a basic understanding of physics, so I'm not sure what you're taking issue with.

My point stands: a graphics card does not convert 100% of the power it consumes into heat, therefore TDP will always be lower than actual power consumption.

This isn't a complicated concept. Various components (like coils) vibrate, emit electromagnetic waves, or (in the case of the fan) convert electricity into kinetic energy. All of these require power, but convert it into something OTHER than heat. Thus, actual power consumption ends up higher than TDP. Simple.
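
To make that accounting concrete, here's a minimal sketch with purely hypothetical numbers; whether the non-heat terms are actually large enough to matter is exactly what's being disputed in this thread:

# Toy energy accounting for the argument above; every figure is hypothetical.
total_draw_w = 180.0   # assumed total power draw at the connectors
fan_w        = 10.0    # assumed power spent spinning the fan (kinetic energy)
other_w      = 2.0     # assumed vibration / EM / signal output

heat_w = total_draw_w - fan_w - other_w   # what the cooler would have to handle
print(f"Heat load under this accounting: {heat_w:.0f} W of the {total_draw_w:.0f} W drawn")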
 
I have a basic understanding of physics, so I'm not sure what you're taking issue with.

My point stands: a graphics card does not convert 100% of the power it consumes into heat, therefore TDP will always be lower than actual power consumption.

This isn't a complicated concept. Various components (like coils) vibrate, emit electromagnetic waves, or (in the case of the fan) convert electricity into kinetic energy. All of these require power, but convert it into something OTHER than heat. Thus, actual power consumption ends up higher than TDP. Simple.

The fan motor also gets warm! :)
 
The fan motor also gets warm! :)
A little bit, but the vast majority of the power consumed by the fan is converted into kinetic energy, not heat. Ergo, the vast majority of the power the fan consumes does not contribute to TDP in any way.

It does contribute to overall power consumption, though. Again, this makes up some of the difference we see between TDP and actual power consumption.
 
Luckily, to everyone's amusement, that wasn't the only thing it illustrated.
To what are you referring, exactly? Point was made, point is valid, point has not been disproven in any way.

The only thing that's amusing is how hung up you are on an inconsequential aspect of the example that was used. As I said, the numbers in the example could literally be ANY two numbers. Just so long as the one representing TDP is the lower of the two numbers, the point being made remains unchanged. You're literally trying to make an argument about something that doesn't matter. It doesn't affect the validity of the example and it doesn't affect the validity of the point made using the example. No matter how you slice it, TDP should always be lower than actual power consumption by some amount. There's nothing wrong with TDP being lower than actual power consumption, and it should be expected.

If you're about done with your tangent, the point that example was used to make (you know, the thing that was actually being discussed) still stands, and will continue to stand.
 
The bottom line is, if the TDP is 165 watts, and the thing is drawing 270 watts, TDP means diddly squat. It's a marketing term meant to fool the unwary into thinking they are buying something low-power.

Power draw is power draw; it still has to dissipate (close to) 270 watts regardless of what the TDP is.

And in the case of your 800-watt air-conditioner, that takes power too. Now you have the card's draw, PLUS the refrigeration unit's power draw. Not exactly efficient.
 
The bottom line is, if the TDP is 165 watts, and the thing is drawing 270 watts, TDP means diddly squat.
No, it means that the designers have rated the part to be handled by a cooling solution that can dissipate a maximum of 165w of heat.

And in the case of your 800-watt air-conditioner, that takes power too. Now you have the card's draw, PLUS the refrigeration unit's power draw. Not exactly efficient.
Like I said, if the power is being used to do something useful rather than generate a waste product, that's pretty much the definition of efficient usage of resources.

If WHATEVER it is pulls 800w of power to do useful work, and only produces 100w of waste, that sounds pretty efficient to me :p

Also... you realize you're still nitpicking an inconsequential aspect of the given example, right?
 
My next thread.

AMD 390x may lose the performance crown to NVIDIA's next-generation card in 12 months. I wonder if it will troll along to 6 pages of dogshit.
 
No, it means that the designers have rated the part to be handled by a cooling solution that can dissipate a maximum of 165w of heat.


Like I said, if the power is being used to do something useful rather than generate a waste product, that's pretty much the definition of efficient usage of resources.

right?

Again, it doesn't mean shit; it's still sucking 270 watts at my wall.

Vs. 300 watts for the 290.

So, basically the same or ostensibly slightly better performance for a 30-watt decrease in POWER USAGE is by no means the panacea the Nvidia camp has been crowing about.
 
Again, it doesn't mean shit; it's still sucking 270 watts at my wall.
Doesn't mean shit? TDP is a very practical figure to keep in mind. It means only 165w of heat being dumped into my case.

I don't care (much) about how much power a card draws from the wall, but I DO care how easy it is to keep cool and quiet. A card with a 165w TDP should be easier to keep cool and quiet than a card with a 250w or a 270w TDP, regardless of how much power it actually consumes.

So, basically the same or ostensibly slightly better performance for a 30-watt decrease in POWER USAGE is by no means the panacea the Nvidia camp has been crowing about.
The decrease in TDP seems like a pretty big win to me. Much easier to run a cool and quiet GTX 980 (165w TDP) than to try and tame a GTX 780 Ti (250w TDP). That's a nice 85w reduction in heat output.
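
As a back-of-the-envelope illustration of why a lower TDP is easier to cool, here's a sketch using the common airflow rule of thumb CFM ≈ 1.76 × watts / ΔT(°C); the TDP figures are from the post above, and the 10 °C air temperature rise is an assumption:

# Rough case airflow needed to carry away a given heat load at an assumed
# air temperature rise. Rule of thumb: CFM ~= 1.76 * watts / delta_T_celsius.
def airflow_cfm(heat_w: float, delta_t_c: float = 10.0) -> float:
    return 1.76 * heat_w / delta_t_c

for name, tdp_w in [("GTX 980, 165 W TDP", 165.0), ("GTX 780 Ti, 250 W TDP", 250.0)]:
    print(f"{name}: ~{airflow_cfm(tdp_w):.0f} CFM at a 10 deg C rise")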
 
My next thread.

AMD 390x may lose the performance crown to NVIDIA's next-generation card in 12 months. I wonder if it will troll along to 6 pages of dogshit.

Hilarious!!
 
So if I buy a 95w CPU and overclock it from 3.4GHz to 5.0GHz then it is only dumping 95w of heat into my case. Awesome! Thx!
 
Again, it doesn't mean shit; it's still sucking 270 watts at my wall.

Vs. 300 watts for the 290.

So, basically the same or ostensibly slightly better performance for a 30-watt decrease in POWER USAGE is by no means the panacea the Nvidia camp has been crowing about.

I don't feel like reading this whole thread but are we talking about the gtx 980 consuming only 30 watts less than a 290 or 290x?

http://www.hardocp.com/article/2014...rce_gtx_980_video_card_review/13#.VDSRkhZRzms

It's more like 100+ watts.

What is embarrassing for AMD is the r9 285 draws more power than a gtx 980.

http://www.hardocp.com/article/2014...5_gaming_oc_video_card_review/10#.VDSR2xZRzms

With power consumption like this, it's no wonder AMD needs water to beat Nvidia.
 
I don't feel like reading this whole thread but are we talking about the gtx 980 consuming only 30 watts less than a 290 or 290x?

http://www.hardocp.com/article/2014/09/18/nvidia_maxwell_gpu_geforce_gtx_980_video_card_review/13#.VDSRkhZRzms

It's more like 100+ watts.

What is embarrassing for AMD is the r9 285 draws more power than a gtx 980.

http://www.hardocp.com/article/2014/09/02/msi_radeon_r9_285_gaming_oc_video_card_review/10#.VDSR2xZRzms

With power consumption like this, it's no wonder AMD needs water to beat Nvidia.

Hey now, we don't need your facts here. This is about arguing semantics of someone's post. You take your test data and fuck straight off. /sarcasm
 
To what are you referring, exactly?
Telling you would rather defeat the purpose of being intentionally cryptic.

Again, it doesn't mean shit; it's still sucking 270 watts at my wall. Vs. 300 watts for the 290.
You have your graphics cards plugged directly into the wall, do you? Remarkable, truly.

So if I buy a 95w CPU and overclock it from 3.4GHz to 5.0GHz then it is only dumping 95w of heat into my case. Awesome! Thx!
Only if you buy the model with the attached leaf blower!
 
Nevermind. Selling my GTX780 and patiently waiting for a 390x. Going to buy a new motherboard to support onboard graphics until it arrives. I feel this will give me the most bang for the buck in 6 months.
 
Wow, this thread is so full of crap. A card coming 6 months late should supersede the performance of its counterpart.

Why do people stoop to fanaticism?
 