AMD Announces The Radeon RX 480 Starting At Just $199

How do you know they aren't dual 390s, possibly OC?

It's an old game version as well. Ideally we want to see a recent 1080/1070 review and hope to find a 390X 1080p Crazy preset result. Either that, or we can ask someone on the forum to test for us.

I could try downclocking my card as close to 1050MHz as possible and testing that way.

I highly doubt someone sporting an i7-5820K is also only using a 390, and the benefit the 390X has got from the game version change is very little.

As far as we know, the 1070 on the default power limit does not exceed 150W in power consumption beyond some margin of error.

The 1070 was measured at a maximum of 224 watts. I found a more detailed measurement for the 1080's power usage.

So they mentioned the TDP of the highest-clocked card but demoed the performance of a lower/cut GPU?

AMD didn't mention TDP at all. They simply listed the maximum power it's rated for. 75W PCIe + 75W 6-pin connector.
 
TDP is an average power usage; the 1070 was measured at a maximum of 224 watts. I found a more detailed measurement for the 1080's power usage.

Your first chart is using overall system power consumption, so it's misleading at least that way. I mean, you are not going to claim a video card is using 70+ watts in idle state after all.

The latter one is interesting but is a freaking mess.

But yes, you are correct, the "Average" term should be used.
 
Your first chart is using overall system power consumption, so it's misleading at least that way. I mean, you are not going to claim a video card is using 70+ watts in idle state after all.

The latter one is interesting but is a freaking mess.

But yes, you are correct, the "Average" term should be used.

TDP is not power consumption. TDP is necessarily lower than power consumption because of the laws of thermodynamics.

TDP is a measure of the thermal output for the die. The energy wasted as heat is necessarily smaller than the energy input to the system.
 
just a small correction

the definition of TDP

TDP is typically defined as Thermal Design Power, the amount of power (heat) that a cooler must dissipate in order to keep a silicon chip within its operating temperatures. While Intel and AMD disagree as to what test to run to measure this, both agree that it's a measurement of waste heat output from a chip.

This has nothing to do with the power used to power a chip...

Although there is a 1:1 relationship between the heat produced by a chip and the power used by the chip.

So let's say the chip uses 150 watts: it will output 150 watts of heat, which means you need a cooler rated to move at least 150 watts of heat away from the chip; otherwise the chip overheats.
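
As a minimal sketch of that point (the 10% margin is just an assumed bit of headroom, not part of any spec):

```python
# At steady state essentially all of the electrical power a chip draws leaves it
# as heat, so the cooler has to be able to move at least that much heat away
# to keep the die within its operating temperatures.

def required_cooler_tdp(chip_power_w: float, safety_margin: float = 1.1) -> float:
    """Heat to dissipate equals electrical power in; add some headroom on top."""
    heat_output_w = chip_power_w          # ~1:1 conversion of input power to heat
    return heat_output_w * safety_margin  # pick a cooler rated at or above this

print(required_cooler_tdp(150))  # -> 165.0, i.e. a 150 W chip needs a >=150 W cooler plus margin
```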
 
Don't take our word for it guys. We are hired by Nvidia to misrepresent the laws of physics :p
 
TDP is not power consumption. TDP is necessarily lower than power consumption because of the laws of thermodynamics.

TDP is a measure of the thermal output for the die. The energy wasted as heat is necessarily smaller than the energy input to the system.
I am fairly certain they are equal, actually. I mean, what other job is actually done by that energy for it to transform into something other than heat?
just a small correction

the definition of TDP

TDP is typically defined as Thermal Design Power, the amount of power (heat) that a cooler must dissipate in order to keep a silicon chip within its operating temperatures. While Intel and AMD disagree as to what test to run to measure this, both agree that it's a measurement of waste heat output from a chip.

This has nothing to do with the power used to power a chip...

Although there is a 1:1 relationship between the heat produced by a chip and the power used by the chip.

Well, yes, that has to be remembered. It is, however, reasonable to talk about average power consumption in the context of rated TDP, isn't it?
 
Yes, it definitely is. Average power consumption should be taken in the context of TDP (there is a relationship between the two); max power delivery should not be, though.
 
Your first chart is using overall system power consumption, so it's misleading at least that way. I mean, you are not going to claim a video card is using 70+ watts in idle state after all.

The latter one is interesting but is a freaking mess.

But yes, you are correct, the "Average" term should be used.

The first chart isn't overall system power consumption. It's measuring max consumption of the graphics card. It lists the 1080 at 274W max, which lines up with Tom's Hardware's findings.
 
The first chart isn't overall system power consumption. It's measuring max consumption of the graphics card. It lists the 1080 at 274W max, which lines up with Tom's Hardware's findings.
No, it's average system power consumption. A single look at idle power consumption should tell you that.
 
There is absolutely no way for the 1080 to use 274 watts... the max power delivery is 225 watts.
 
The first chart isn't overall system power consumption. It's measuring max consumption of the graphics card. It lists the 1080 at 274W max, which lines up with Tom's Hardware's findings.

No way in hell that's correct rofl
 
Also, TDP isn't the same as power consumption, because it's not just the GPU die consuming power. The memory ICs are factored into power consumption, and waste energy from the VRMs affects power consumption, but neither affects GPU TDP.
 
No, it's average system power consumption. A single look at idle power consumption should tell you that.

You're right. Well, no one has done a test on the 1070 similar to what Tom's Hardware did for the 1080, then.

There is absolutely no way for the 1080 to use 274 watts... the max power delivery is 225 watts.

You can see the spikes yourself in the Tom's Hardware review.

Power Consumption Results - Nvidia GeForce GTX 1080 Pascal Review
 
Also, TDP isn't the same as power consumption, because it's not just the GPU die consuming power. The memory ICs are factored into power consumption, and waste energy from the VRMs affects power consumption, but neither affects GPU TDP.
Strictly speaking, TDP is given for the whole card, including the VRM, memory, and whatever the hell else is crammed onto the PCB.

And when put like that, it does correlate with power consumption almost 1:1 (obviously some power goes into the fans, but whatever).
You will get spikes, but the card can't sustain that.
Well, who told you the PCI-E spec is anything but a worst-case requirement :)
 
Yeah, I was just gonna say the spikes do not matter; it could be a question of measurement and the granularity at which the value is calculated.

By that token a Fury X consumes 400W.
 
If it spikes that high then you have to buy a large enough power supply to cover the spike or risk stressing your system and starving it. Interesting find. Thank you.
Not at all, your PSU allows for spikes lol. There are inrush currents to account for, etc.

That 280W spike could last less than 1ms lol
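
A quick sketch of why a sub-millisecond spike barely registers in an average figure (all numbers here are made up for illustration):

```python
# One 280 W transient lasting 1 ms inside a 1-second window of ~180 W steady
# draw: the peak is dramatic, the average is not.

samples_per_s = 10_000                          # assume 0.1 ms sampling granularity
baseline_w = 180.0                              # assumed steady gaming draw
spike_w, spike_ms = 280.0, 1.0                  # one 280 W spike lasting 1 ms

spike_samples = int(samples_per_s * spike_ms / 1000)
samples = [spike_w] * spike_samples + [baseline_w] * (samples_per_s - spike_samples)

average_w = sum(samples) / len(samples)
print(f"peak = {max(samples):.0f} W, average = {average_w:.1f} W")  # peak = 280 W, average = 180.1 W
```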
 
If we are talking just TDP, the 1070 was measured by Guru3D at 161W and the 1080 at 184W.
Dude. Guru3D doesn't use a calorimeter. That's power consumption.

Mind you, the system wattage is measured at the wall socket side and there are other variables like PSU power efficiency. So this is an estimated value, albeit a very good one. Below, a chart of relative power consumption. Again, the Wattage shown is the card with the GPU(s) stressed 100%, showing only the peak GPU power draw, not the power consumption of the entire PC and not the average gaming power consumption.
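
For what it's worth, a wall-socket estimate like the one described above usually boils down to something like this (the baseline, load figure and PSU efficiency below are assumptions for illustration, not Guru3D's actual data or method):

```python
# Estimate the card's DC power draw from two wall-socket readings:
# subtract the system baseline, then correct for PSU conversion losses.

def estimate_card_power(wall_load_w: float, wall_idle_w: float,
                        psu_efficiency: float = 0.90) -> float:
    delta_at_wall = wall_load_w - wall_idle_w   # extra wall draw caused by the GPU load
    return delta_at_wall * psu_efficiency       # approximate DC power fed to the card

print(estimate_card_power(wall_load_w=280.0, wall_idle_w=75.0))  # -> 184.5
```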
 
Not at all, your PSU allows for spikes lol. There are inrush currents to account for, etc.

That 280W spike could last less than 1ms lol
Yep.
If people want to see that in action, look to hi-fi audio gear, as the bursts go well over the spec of the hardware, let alone cold-start inrush current for hi-fi gear.
Cheers
 
You know, Nvidia didn't put that 8-pin connector on the 1070 out of their goodwill... they put it there because some 1070s can and will consume more than PCI-E + 6-pin can deliver, and OEMs will say "me no like" if it doesn't adhere to the specs.

But at the same time, some peeps in here are saying that the 480 will have a 150W TDP, which is the max of what PCI-E + 6-pin can deliver. So the 480 needs the whole power budget and the 1070 just 70%. I think some re-education is needed if people want to troll properly.

Now to the fun fact of the day. AMD didn't say what the card's TDP is going to be, but they hinted at it by saying that 2x 480 is more power efficient than a 1080. We know that the 1080 consumes around 160-180W avg, so my humble estimate is that the 480 is going to consume around 90-120W.

My troll estimate is ~75W, and AMD also put that 6-pin connector there out of their goodwill, just like Nvidia did. :troll:

Here's the hint before someone asks:
[image: kEhZQ9W.jpg]
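
For reference, the connector budgets being thrown around in this thread come straight from the spec ratings (slot 75W, 6-pin 75W, 8-pin 150W); a quick sketch of the arithmetic:

```python
# Rated power limits per source; a board's spec-compliant ceiling is just the sum.
CONNECTOR_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_spec_power(*connectors: str) -> int:
    return sum(CONNECTOR_W[c] for c in connectors)

print(max_spec_power("slot", "6-pin"))  # RX 480 style board -> 150 W ceiling
print(max_spec_power("slot", "8-pin"))  # GTX 1070/1080 style board -> 225 W ceiling
```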
 
Assuming the 1070 is 35-40% faster than the 480, that would put the 480 at 105-110W for 1:1 perf/W scaling.
Now, remembering that smaller/weaker chips are usually more efficient, the 480 should probably be sub-100W.
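
Back-of-the-envelope version of that estimate (assuming the 1070 sits at ~150W and roughly 1:1 perf/W between the two parts, which is an assumption, not a measurement):

```python
gtx1070_power_w = 150.0
for perf_advantage in (0.35, 0.40):            # assumed 35-40% performance gap
    rx480_power_w = gtx1070_power_w / (1.0 + perf_advantage)
    print(f"1070 {perf_advantage:.0%} faster -> RX 480 ~ {rx480_power_w:.0f} W")
# -> ~111 W and ~107 W, i.e. the same ballpark as above
```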
 
Latest rumor on B3D is ~96 watts or so. If that is the case, then we could see some interesting partner boards maybe 1300 - 1400 range? Would be interesting.
 
...The energy wasted as heat is necessarily smaller than the energy input to the system
Uh, what other energy output is there from a video card? Are you suggesting there is appreciable power leaving through the cable going to the display? It's not creating potential energy. It's not creating appreciable kinetic or acoustic energy. The electrical energy input is basically all being turned into heat.
 
Uh, what other energy output is there from a video card? Are you suggesting there is appreciable power leaving through the cable going to the display? It's not creating potential energy. It's not creating appreciable kinetic or acoustic energy. The electrical energy input is basically all being turned into heat.
Some of it gets turned into work.
 
What sort of work? Graphical output is not work in the physics sense of the word.

Uh, what other energy output is there from a video card? Are you suggesting there is appreciable power leaving through the cable going to the display? It's not creating potential energy. It's not creating appreciable kinetic or acoustic energy. The electrical energy input is basically all being turned into heat.

Driving the gates is. Most of the energy is wasted as heat, yes, but there is useful work being done; routing current flow through various gates consumes energy as well, and most of that energy is simply radiated out as heat.

I don't get the quip about cables at all. We're talking about transistors.
 
Driving the gates is. Most of the energy is wasted as heat, yes, but there is useful work being done; routing current flow through various gates consumes energy as well, and most of that energy is simply radiated out as heat.

I don't get the quip about cables at all. We're talking about transistors.


It doesn't "consume" energy, I don't think, at least, because there is no mechanical wastage. What you are losing there to heat is leakage. I should add resistance in there too, but that correlates with leakage.
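
For anyone following along, the usual first-order breakdown of where a chip's power goes is dynamic (switching) power plus static (leakage) power, and both end up as heat; a sketch with purely illustrative numbers:

```python
def dynamic_power_w(activity: float, capacitance_f: float, voltage_v: float, freq_hz: float) -> float:
    """P_dyn ~= alpha * C * V^2 * f: charging/discharging gate and wire capacitance."""
    return activity * capacitance_f * voltage_v ** 2 * freq_hz

def leakage_power_w(voltage_v: float, leakage_current_a: float) -> float:
    """P_leak = V * I_leak: current that flows even when nothing is switching."""
    return voltage_v * leakage_current_a

p_dyn = dynamic_power_w(activity=0.2, capacitance_f=4e-7, voltage_v=1.0, freq_hz=1.2e9)
p_leak = leakage_power_w(voltage_v=1.0, leakage_current_a=20.0)
print(f"dynamic ~ {p_dyn:.0f} W, leakage ~ {p_leak:.0f} W, total heat ~ {p_dyn + p_leak:.0f} W")
# -> dynamic ~ 96 W, leakage ~ 20 W, total heat ~ 116 W
```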
 
You know, Nvidia didn't put that 8-pin connector on the 1070 out of their goodwill... they put it there because some 1070s can and will consume more than PCI-E + 6-pin can deliver, and OEMs will say "me no like" if it doesn't adhere to the specs.

But at the same time, some peeps in here are saying that the 480 will have a 150W TDP, which is the max of what PCI-E + 6-pin can deliver. So the 480 needs the whole power budget and the 1070 just 70%. I think some re-education is needed if people want to troll properly.

Now to the fun fact of the day. AMD didn't say what the card's TDP is going to be, but they hinted at it by saying that 2x 480 is more power efficient than a 1080. We know that the 1080 consumes around 160-180W avg, so my humble estimate is that the 480 is going to consume around 90-120W.

My troll estimate is ~75W, and AMD also put that 6-pin connector there out of their goodwill, just like Nvidia did. :troll:

Here's the hint before someone asks:

Latest rumor on B3D is ~96 watts or so. If that is the case, then we could see some interesting partner boards maybe 1300 - 1400 range? Would be interesting.

Hmmm, IDK... their own slide states power as a 150W part.

[image: AMD RX 480 spec slide (150W, >5 TFLOPS)]
 
It doesn't "consume" energy, I don't think, at least, because there is no mechanical wastage. What you are losing there to heat is leakage.

Basically the amount of energy we are talking about is inconsequential compared to the thermal energy, but there are minute amounts of energy effectively going towards driving the gates.
 
Driving the gates is. Most of the energy is wasted as heat, yes, but there is useful work being done; routing current flow through various gates consumes energy as well, and most of that energy is simply radiated out as heat.
FETs don't work like that, but even if they did, that would turn into heat via friction. The chip is turning virtually all the electrical power it consumes into heat. There is no appreciable output of energy from the chip other than heat.
 
FETs don't work like that, but even if they did, that would turn into heat via friction. The chip is turning virtually all the electrical power it consumes into heat. There is no appreciable output of energy from the chip other than heat.

Yes, I said it was minuscule and insignificant compared to the heat produced. It is not appreciable, but it exists.
 
Hmmm, IDK... their own slide states power as a 150W part.

[image: AMD RX 480 spec slide (150W, >5 TFLOPS)]

You see one number but you fail to understand the logic. That power is not for a set number of TFLOPS; I am sure I could explain it away, but you will only see what you wanna see. They are stating max power and they are stating TFLOPS >5, which means they are telling you the worst-case scenario depending on where you clock the card. If they said TFLOPS = 5 at the same power, then yes, your argument would be correct. But you are looking at one number and not making sense of it. That's what I understand from the slide. They probably gave it the max TDP depending on where you clock the card.

What I am seeing is they probably know what the max overclocked cards will be at, and they will probably stay under 150W. They did say it was up to 2.8x more efficient than last gen with process and AMD enhancements.
 
2.8x is in specific circumstances. Check the video: Raja specifically states the perf/watt numbers, then he says he hasn't been talking about games yet. That perf/watt number is for VR situations.

But yeah, for the rest of it, don't expect the RX 480 to use 150 watts; expect it to use less.
 
2.8x is in specific circumstances. Check the video: Raja specifically states the perf/watt numbers, then he says he hasn't been talking about games yet. That perf/watt number is for VR situations.

But yeah, for the rest of it, don't expect the RX 480 to use 150 watts; expect it to use less.

Probably, yeah. Some people are looking at the slide but don't see the top portion, where they are basically saying "we are not going to tell you clocks yet, but don't expect it to use more than 150W".
 