Discussion in 'AMD Flavor' started by fanboy, Sep 29, 2014.
air, vehicle, vibrations = Work
Heat/Light =/= work.
TDP = Thermal Design Power
It's the theoretical maximum power that the GPU package/board design generates at spec.
Which is exactly what I illustrated. The TDP can be a fixed number regardless of the actual power-draw of a component, and TDP will always be lower than the actual power draw of the component by some amount.
I never said the TDP wasn't a constant (of COURSE it's a constant, it's a single number). You're arguing against a point I never made.
If a card draws 177w of power, it will expend SOMEWHAT LESS THAN 177w as heat.
I never said anything about magic ratios, not sure what you're on about with that.
Yes, see above. TDP (assuming it was rated correctly) will always be lower than actual power consumption, because a graphics card does not convert 100% of the power it consumes into heat.
Simple as that, argue all you want, nothing you can say will change this fact.
Incorrect. Just to start with, some of it is also converted into mechanical energy, and some of it is converted into electromagnetic radiation.
More than 10w goes towards driving the fan(s) on the GTX 970, which already creates a minimum deficit between the actual power consumption of the card and its TDP (when the fan is running full speed). It's very, very, VERY obvious that not all of the power these cards draw becomes heat.
I DID NOT, anywhere, say that any power gets "lost"
It's used for other things, but not lost. No laws of physics are being broken, conservation of energy is still perfectly in-play, and all WITHOUT the card converting 100% of the power it receives into heat.
At what? Yourself?
What I said remains true: TDP doesn't directly equate to power consumption. TDP should always be lower than power consumption.
Or it has a leaf-blower attached to it. Either way, it draws LOTS of power but doesn't put out much heat.
And that would make even the ridiculous notion of a card drawing 800w of power and only having a TDP rating of 177w pan out.
I never said that. I simply said that overclocked cards will always put out more heat and draw more power than reference models.
While you argue over the definition of TDP, please be aware that the marketing divisions of Intel, AMD and Nvidia have each come up with their own definitions. There's an interesting post about it here. It had the literal definition along with Intel's, AMD's and Nvidia's definitions for TDP, but some of the links are gone and the post's writer even seems to have gotten AMD's and Intel's definitions switched.
The literal definition: "the maximum amount of heat that a processor will generate" (I don't know the post's source but it seems similar to Wikipedia's definition posted above by PcZac).
Intel's definition in 2011 (link goes to .pdf): "The upper point of the thermal profile consists of the Thermal Design Power (TDP) and the associated Tcase value. Thermal Design Power (TDP) should be used for processor thermal solution design targets. TDP is not the maximum power that the processor can dissipate. TDP is measured at maximum TCASE."
Intel's description of AMD's definition (from the same .pdf): "the maximum power a processor can draw for a thermally significant period while running commercially useful software"
Nvidia's new, changed definition as described in a 2010 blog: "a measure of maximum power draw over time in real world applications". That article defined the typical definition as "the amount of power (heat) that a cooler must dissipate in order to keep a silicon chip within its operating temperatures" (oh great, power = heat) and then goes on to say that both Intel and AMD agree that it's "a measurement of waste heat output from a chip".
I will further note that the marketing divisions made their changes a few years ago, so if you look at older articles and forum posts, you'll find different descriptions of how TDP works, like this one from 2009.
I briefly tried to find current pages on AMD and Nvidia for their definitions of TDP but I gave up quickly because, honestly, I'm feeling lazy tonight and they don't seem to be easy to find.
We're all totally clear about the definition of TDP now, right?
Still waiting for an answer.
While definitions vary (and the methods used to compute it similarly vary), they do all fundamentally describe the same thing.
1. "The maximum amount of heat that a processor will generate"
2. "TDP is not the maximum power that the processor can dissipate"
3. "The maximum power a processor can draw..."
4. "The amount of power (heat) that a cooler must dissipate..."
5. "a measurement of waste heat output from a chip"
Okay, #3 is not like all the others, so AMD was wrong?
AMD's definition of thermal design power isn't directly framed in terms of thermals, which does make it a pretty goofy definition relative to the others. Those are defined with the aim of providing a target value to those who design cooling systems ("my system needs to be able to dissipate this much heat energy over sustained periods"). It does, however, clarify that the figure is a measure over a "thermally significant period" rather than any instantaneous power draw, so it should adequately reflect the needs of a cooling system.
Already said, it was a hypothetical configuration for illustration purposes only.
And it illustrated the point perfectly. Just because a part draws 800w doesn't mean it can't also have a 100w TDP. Similarly, the less-extreme example of a part that draws 177w and has a 145w TDP can also pan out. This is pretty simple stuff, man...
And I already gave a hypothetical answer as to what could cause such a rating to actually pan out: a leaf-blower attached to the card. Suddenly it can draw 800w AND have a 100w TDP, and those numbers are both correct. So... I'm not sure what you're waiting on; the ridiculous hypothetical situation already has a ridiculous hypothetical answer. Guess you just missed it, just like you missed the ENTIRE point that example was illustrating: that TDP and actual power consumption are NOT 1:1.
As I said previously, plug in real-world numbers (like the GTX 970 in the previously-posted table that was pulling 177w and has a 145w TDP rating). Not all of the power it's drawing is being converted into heat; THEREFORE, the TDP is lower than the power consumption.
Per your sample, saying it draws 800W and has a 100W TDP, the card either has an A/C unit attached to it or someone fudged the calculations...
Fine, it has an A/C unit attached to it. You realize you're totally missing the point, right? WHY the hypothetical, for-illustration-purposes-only example part in question draws 800w of power and only puts out 100w of heat is irrelevant; the point being made is that such figures are POSSIBLE.
All that was meant to illustrate is the fact that TDP doesn't directly equate to power consumption. There are all kinds of things that can draw power without converting it into heat, and every one cumulatively adds to the deficit between TDP rating and actual power consumption.
No. Your claim was that a card drawing 800W with a 100W TDP was efficient:
You're asserting that this theoretical graphics card is efficient not at rendering images for its consumed power, but at blowing leaves?
You fucked up monstrously. You know it; everyone else knows it. It's why they're laughing at you. Why not just save face and admit to it?
The problem is the useful work done by a card is a minimal output <5W(?) to a monitor. Everything else is waste one way or another. Whether it's blowing hot air or vibrations from the gates flipping... it's all waste.
That's why we generally break it down to GFLOPs per watt or some similar measurement. For me, FPS/watt might make sense in a given game.
The other argument seems to be around TDP, which is a design specification, usually for a chip or system, that you match your cooling solution to. That seems pretty straightforward... I agree with most posts; I don't think the fan wattage should be factored in for design purposes...
What really matters is work performed (GFLOPs or what have you) / overall system draw, including cooling systems.
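The perf-per-watt idea above is just a division. A minimal sketch of what that comparison looks like, using made-up placeholder figures since no benchmark numbers are quoted in this thread:

```python
# Back-of-the-envelope perf-per-watt comparison.
# All figures below are illustrative placeholders, not measured values.
cards = {
    "Card A": {"gflops": 5000.0, "watts": 180.0},
    "Card B": {"gflops": 5600.0, "watts": 290.0},
}

for name, c in cards.items():
    perf_per_watt = c["gflops"] / c["watts"]
    print(f"{name}: {perf_per_watt:.1f} GFLOPs/watt")
```

The same shape works for FPS/watt in a given game: swap the GFLOPs figure for an average frame rate and divide by measured system draw.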
By the way, this whole conversation boggles my mind.
And who cares if the 380X is faster than the 980 months and months after release. What matters is what Nvidia is releasing around February. I like to look forward.
Thank you wonderfield for being the voice of reason here.
Protip: if you have zero scientific background and don't understand basic thermo, don't try to fake it.
For thermal design power, the power consumed by the cooling system should not be factored in, no. TDP is intended to define the requirements of the cooling system itself.
Assuming that power is being put towards doing something useful and not producing waste heat, then yes, that would be quite efficient.
Not seeing what you're taking issue with. Oh, and you're STILL missing the entire point of that example, in case you hadn't realized
Fucked up what, exactly?
I used a hypothetical set of numbers as an example to make a point, and that example makes my point perfectly. TDP does not directly equate to power consumption.
Do you not understand that the chosen numbers are of NO consequence for the sake of the point being made, except for one being a higher number than the other? I could have said "a card that draws 110w and has a 100w TDP," and the example works EXACTLY the same way. Still illustrates a perfectly valid situation where TDP doesn't directly equate to actual power consumption.
Point made, point stands, no fuck-up. The only fuck-up is that you're hung up on an inconsequential aspect of the example used, which does nothing to negate the point made by the example.
Admit what, exactly? That my example worked perfectly to illustrate my point that TDP and actual power consumption do not have a 1:1 relationship?
I have a basic understanding of physics, so I'm not sure what you're taking issue with.
My point stands: a graphics card does not convert 100% of the power it consumes into heat; therefore, TDP will always be lower than actual power consumption.
This isn't a complicated concept. Various components (like coils) vibrate, emit electromagnetic waves, or (in the case of the fan) convert electricity into kinetic energy. All of these require power, but convert it into something OTHER than heat. Thus, actual power consumption ends up higher than TDP. Simple.
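Taking this poster's premise at face value, the accounting being argued reduces to one subtraction. A sketch using the hypothetical figures quoted in this thread (these are the thread's numbers and the thread's claimed split, not measurements):

```python
# Energy accounting as argued in this thread, using the thread's own
# hypothetical figures. The claim being made:
#   total draw = heat output (TDP) + power converted to other forms
total_draw_w = 177.0  # wall draw cited in the thread for the GTX 970
tdp_w = 145.0         # Nvidia's rated TDP for the same card

# Power the poster attributes to non-heat outputs (fan kinetic energy,
# EM emissions, vibration). Note this is just the residual of the two
# quoted figures, not a measured breakdown.
non_heat_w = total_draw_w - tdp_w
print(f"Claimed non-heat power: {non_heat_w:.0f} w")  # 32 w
```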
The fan motor also gets warm!
A little bit, but the vast majority of the power consumed by the fan is converted into kinetic energy, not heat. Ergo, the vast majority of the power the fan consumes does not contribute to TDP in any way.
It does contribute to overall power consumption, though. Again, this makes up some of the difference we see between TDP and actual power consumption.
Luckily, to everyone's amusement, that wasn't the only thing it illustrated.
To what are you referring, exactly? The point was made, the point is valid, and the point has not been disproven in any way.
The only thing that's amusing is how hung up you are on an inconsequential aspect of the example that was used. As I said, the numbers in the example could literally be ANY two numbers. So long as the one representing TDP is the lower of the two, the point being made remains unchanged. You're literally trying to make an argument about something that doesn't matter. It doesn't affect the validity of the example, and it doesn't affect the validity of the point made using the example. No matter how you slice it, TDP should always be lower than actual power consumption by some amount. There's nothing wrong with TDP being lower than actual power consumption, and it should be expected.
If you're about done with your tangent, the point that example was used to make (you know, the thing that was actually being discussed) still stands, and will continue to stand.
The bottom line is, if the TDP is 165 watts and the thing is drawing 270 watts, TDP means diddly squat. It's a marketing term meant to fool the unwary into thinking they're buying something low-power.
Power draw is power draw, it still has to dissipate (close to) 270 watts regardless of what the TDP is.
And in the case of your 800w card's air conditioner, that takes power too. Now you have the card's draw PLUS the refrigeration unit's power draw. Not exactly efficient.
Thermal design power is not a "marketing term". It's an engineering term.
No, it means that the designers have rated the part to be handled by a cooling solution that can dissipate a maximum of 165w of heat.
Like I said, if the power is being used to do something useful rather than generate a waste product, that's pretty much the definition of efficient usage of resources.
If WHATEVER it is pulls 800w of power to do useful work, and only produces 100w of waste, that sounds pretty efficient to me
Also... you realize you're still nitpicking an inconsequential aspect of the given example, right?
My next thread.
AMD 390X may lose the performance crown to NVIDIA's next generation card in 12 months. I wonder if it will troll along to 6 pages of dogshit.
Only one way to find out
Again, it doesn't mean shit; it's still sucking 270 watts at my wall.
Vs 300 watts for the 290.
So, basically the same or ostensibly slightly better performance for a 30 watt decrease in POWER USAGE is by no means the panacea the Nvidia camp has been crowing about.
Doesn't mean shit? TDP is a very practical figure to keep in mind. It means only 165w of heat being dumped into my case.
I don't care (much) about how much power a card draws from the wall, but I DO care how easy it is to keep cool and quiet. A card with a 165w TDP should be easier to keep cool and quiet than a card with a 250w or a 270w TDP, regardless of how much power it actually consumes.
The decrease in TDP seems like a pretty big win to me. Much easier to run a cool and quiet GTX 980 (165w TDP) than to try and tame a GTX 780 Ti (250w TDP). That's a nice 85w reduction in heat output.
So if I buy a 95w CPU and overclock it from 3.4GHz to 5.0GHz then it is only dumping 95w of heat into my case. Awesome! Thx!
No, he is saying it is easier to cool because it is more efficient! LoL
I don't feel like reading this whole thread but are we talking about the gtx 980 consuming only 30 watts less than a 290 or 290x?
It's more like 100+ watts.
What is embarrassing for AMD is the r9 285 draws more power than a gtx 980.
With power consumption like this, it's no wonder AMD needs water to beat Nvidia.
Ha ha ha!
Hey now, we don't need your facts here. This is about arguing semantics of someone's post. You take your test data and fuck straight off. /sarcasm
Telling you would rather defeat the purpose of being intentionally cryptic.
You have your graphics cards plugged directly into the wall, do you? Remarkable, truly.
Only if you buy the model with the attached leaf blower!
Nevermind. Selling my GTX780 and patiently waiting for a 390x. Going to buy a new motherboard to support onboard graphics until it arrives. I feel this will give me the most bang for the buck in 6 months.
Wow, this thread is so full of crap. A card coming 6 months late should supersede the performance of its counterpart.
Why do people stoop to fanaticism?