AMD's Statement On Reports Of Radeon RX 480 Excessive Power Draw

Fixed it for you. I know you're an engineer, but GFCIs are mostly just in kitchens and bathrooms. :)

You would be surprised what we EEs tie together when doing typical unit plans. Let's say you are using 20 A breakers; we naturally want to use the fewest breakers possible without overloading them, since that reduces the cost to the homeowner. When a house has a powder room (a bathroom with no shower), code requires a GFCI outlet there, but we often tie the hallway lighting into that circuit since it has a lot of headroom.

It's entirely possible that the circuit he is using is tied into a GFCI. Not likely, but certainly possible. Also, it would be a good idea for him to have a friend watch and listen at the breaker box while he starts the stress test, to see if one of the breakers is "chattering," which they do when they get hot. They will literally dance, and it's easy to see and hear. If it trips immediately, he needs a licensed electrician to come out and inspect that circuit, because something is badly overloaded and could easily be a fire hazard.
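For a rough sense of why an immediate trip is such a red flag, here's a back-of-the-envelope check. This is a minimal sketch only: the 120 V / 20 A circuit, the 80% continuous-load rule of thumb, and every wattage below are assumptions for illustration, not measurements of his setup.

```python
# Rough numbers only, assuming a 120 V / 20 A branch circuit and the 80% continuous-load rule.
breaker_amps = 20
line_volts = 120
continuous_limit_watts = 0.8 * breaker_amps * line_volts   # ~1920 W sustained

# Hypothetical loads sharing the circuit -- purely illustrative values.
loads_watts = {
    "gaming PC under stress test": 450,
    "monitor": 40,
    "hallway lighting": 120,
}
total_watts = sum(loads_watts.values())
print(f"Continuous limit: {continuous_limit_watts:.0f} W, estimated load: {total_watts} W")
# Plenty of headroom in this example, so a breaker that trips the instant the stress
# test starts points to a fault or a badly overloaded shared circuit, not the GPU itself.
```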

It looks like the new 16.7.1 driver is doing exactly what AMD said it would, making this "PowerGate" issue a non-issue. AMD's New Radeon RX 480 Driver Fixes Power Issues

*This information is provided solely for discussion, and does not represent any sort of professional advice from myself as a licensed engineer in the great state of Maryland!
 
Actually, my house is very well built, but the electrical was done by a drunk, maybe. It's been that way in most places I've lived. My computer room is connected to the bathroom GFCI circuit; found that out with an electric radiator and a hairdryer running at the same time.
 
I have been reading a review of the 480: http://www.gamespot.com/articles/amd-radeon-rx-480-review/1100-6441354/

What I want to know is how a 4-year-old GTX 660 Ti can beat a GTX 980. Per the benchmarks shown in the link: 980, 64.9 fps; 660, 68.1 fps.
The 480 did 46.8, which seems about right considering it's in an older Q6600 system. What I don't know is the settings they used: low, medium, or high.
These are at 1080p.
New lenses will probably help; there isn't a GTX 660 result there, and no mention of one either.
:p
 
No, the 660 result is my own testing: 1080p, low settings, A10-7870K, 16 GB 2133, Intel 240 GB SSD, Valley benchmark. The 480 was also mine: Q6600, 8 GB DDR2-800, Intel 80 GB SSD. I'm not sure something is right here; it just doesn't seem right for the 660 to be that high. I checked the resolution etc. All settings look okay.
 
Different systems, different settings, maybe a different benchmark version.
I doubt the Valley benchmarks would use low settings.
 
AMD Polaris 10/11 : RX 480/470/460 (14nm FinFET) [Topic Unique] - Page : 256 - Carte graphique - Hardware - FORUM HardWare.fr

XFX preview/testing card. Currently drawing 250W, half through the 8-pin connector and half through the motherboard's PCI-E slot. Basically, an OC'd reference card with an 8-pin added.

Obviously, this isn't final, but they had better fix it before they finalize the card. I will not be buying any AIB card until I see how they handle the power delivery. This one reeks of a lazy attempt. The MSI variant seems to have severely downgraded the power delivery, but at least they made alterations.
 
In the Heaven benchmark on ultra, 1080p, tessellation normal, the 660 only got 45.4 and the 480 got 71.8. It just seems weird...
 

Not really surprised. Nvidia's statement was that the 1060 is as fast as a 980 in VR. They used the same "in VR" that they used at the 1080 announcement. I expect these two cards to be VERY close.
 
In the Heaven benchmark on ultra, 1080p, tessellation normal, the 660 only got 45.4 and the 480 got 71.8. It just seems weird...
You aren't making any sense.
Forgive us, we haven't mastered ESP on this forum.
 
AMD Polaris 10/11 : RX 480/470/460 (14nm FinFET) [Topic Unique] - Page : 256 - Carte graphique - Hardware - FORUM HardWare.fr

XFX preview/testing card. Currently drawing 250W, half through the 8-pin connector and half through the motherboard's PCI-E slot. Basically, an OC'd reference card with an 8-pin added.

Obviously, this isn't final, but they had better fix it before they finalize the card. I will not be buying any AIB card until I see how they handle the power delivery. This one reeks of a lazy attempt. The MSI variant seems to have severely downgraded the power delivery, but at least they made alterations.
7 amps over a bus rated for 5.5 amps was bad enough, but more than 10 amps? I want whatever the engineers over at XFX are smoking.
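For anyone wondering where the "more than 10 amps" comes from, here's the quick arithmetic. This is only a sketch: it assumes the reported ~250 W is split evenly with the slot, and that the slot draw rides the 12 V rail, whose commonly cited CEM limit is the 5.5 A mentioned above.

```python
# Back-of-the-envelope check on the reported numbers (assumptions noted inline).
slot_12v_rated_amps = 5.5        # commonly cited PCIe CEM limit for the slot's 12 V pins
slot_volts = 12.0

reported_slot_watts = 125        # half of the ~250 W this XFX sample reportedly draws
implied_amps = reported_slot_watts / slot_volts
print(f"Implied slot current: {implied_amps:.1f} A vs. {slot_12v_rated_amps} A rated")
# ~10.4 A -- roughly double the 5.5 A rating, which is exactly why the report sounds dubious.
```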
 
It's not an issue if they fix it before launch, so nothing to worry about yet, lol. Not a bad-looking card at least... seems to be a longer PCB?
 
I really have to give it to AMD for being honest and upfront about drivers. PCPerspective couldn't see any distinguishable difference between 16.2.1 and 16.7.1 with compatibility mode on. They could have easily not bothered and just said, "Here is the new driver that puts it within 150-ish watts with little to no performance hit."

I would have rather had them make compatibility mode the default, since there is barely any performance difference, and let people toggle to a performance mode.

Oh well. I guess at least they fixed the main issue.
 
First round is supposed to be 8-pin. Supposedly, a later round of AIB cards will have 8+6 pin and cost about $300. Not sure that I'm willing to wait that long. So long as power delivery is acceptable, I'm diving in on one of these cards above.
 

It'll be interesting to see what the 480 would bring at a $300 price point. I think such a piece of kit would need to perform a good deal better than the reference cards to be worth $300. I have to admit, I'm amazed by all of the interest in this part. I've bought lower-end cards before to put in older boxes, but never really thought of them for full-time gaming. Since I built this new rig I have an i7-980X sitting in the next bedroom. The thing still runs fine, and I was actually getting some great gaming out of it from a single 1080 while I assembled the parts for this new rig. I'm going to rebuild it into a smaller case, and I was considering a 480. But I think I'll go with a 1060 if it's that much cooler and close in performance to the 480.

The other thing for me is that I actually do use Nvidia's 3D tech, not a lot for gaming but quite a bit for 3D BD. I'm going to swap out the monitors I've been using for the last six years for newer 3D monitors with DisplayPort connectors. These DP to dual-DVI converters suck at dealing with 3D, and with only two DVI ports now on the two 1080s, I need to switch over to DisplayPort monitors. So if I go with a 1060, I'd have another 3D BD-capable system that should also be VR capable.

I guess this is how people become fanboys.
 
Sounds like you're speaking from experience? lol :whistle: You've got to remember many of us interested in this card currently use much less powerful cards... so this card is a decent upgrade and relatively cheap. Your whole line of BS about the 3D can be done on AMD cards as well (I guess it might require a paid app?); it's not like AMDs can't do 3D, lol.
 
Looks like MSI is the only one so far confirmed to use a custom PCB, with a 4+2 phase design. Probably the one to pick if all the other manufacturers decide to go with the AMD design.

ASUS, MSI, Gigabyte, PowerColor, XFX and Sapphire custom Radeon RX 480 pictured | VideoCardz.com

The Sapphire one looks the best to me; anyway, it says it's a custom PCB. The Red Devil from PowerColor looks to be cooled the best. Asus is also a custom, longer PCB. So this is probably a wait-and-see on the power usage. I expect them to be more compliant.
 
Sounds like you're speaking from experience? lol :whistle: You've got to remember many of us interested in this card currently use much less powerful cards... so this card is a decent upgrade and relatively cheap. Your whole line of BS about the 3D can be done on AMD cards as well (I guess it might require a paid app?); it's not like AMDs can't do 3D, lol.

Indeed, I was one of the people around here using a much less powerful card than an RX 480; I was running 4-year-old GTX 680s until a month ago. A 480 would be a significant upgrade in the old rig, so I'm not discounting that at all. As far as 3D is concerned, I never heard much about it with AMD hardware. I know they had something when it became a thing about 6 years ago, but it was kind of a fad. Nvidia doesn't talk about it much anymore either; however, if you go to their site they still offer info and support for it, like how to set up 3D Surround even on the 1080. In looking for new monitors with 3D support and DisplayPort connectors, of what's out there little is mentioned of Radeon support. I think I found what I was looking for in the Asus VG248QE, which does explicitly mention support for Nvidia 3D Vision. Zip about AMD support.
 
I seriously doubt support is needed one way or the other for normal Blu-ray 3D movies, lol... either your display is capable or it's not. I bet my current ancient AMD card would play 3D movies exactly the same as brand-new AMD or Nvidia cards. For all I know, AMD's built-in APUs might manage the same function as well. I think it's 99% about the display. There have got to be a gazillion media players that offer 3D BD support.
 
AMD Polaris 10/11 : RX 480/470/460 (14nm FinFET) [Topic Unique] - Page : 256 - Carte graphique - Hardware - FORUM HardWare.fr

XFX preview/testing card. Currently drawing 250W, half through the 8-pin connector and half through the motherboard's PCI-E slot. Basically, an OC'd reference card with an 8-pin added.

Obviously, this isn't final, but they had better fix it before they finalize the card. I will not be buying any AIB card until I see how they handle the power delivery. This one reeks of a lazy attempt. The MSI variant seems to have severely downgraded the power delivery, but at least they made alterations.

Hmm, will XFX be dumb enough to release a card on the reference board design that sucks 125 W out of the PCI-E slot? I mean, they would be totally ignorant and stupid to do that, and who in their right mind would? Maybe they are desperate since they don't have the resources to actually make a custom PCB? But if they are using the reference design, the driver should automatically limit the PCI-E slot to less than 75 W, so the other portion will probably be forced over to the 8-pin, no? Talking all possible scenarios here, but I think the latter will probably be true: the driver limiting PCI-E draw to 75 W or less.
 
Shit yeah... that's some nice bling! It makes the card faster as well! I guess this thing has heat pipes, but we just can't see them from this angle?
I can imagine it'd have at least one or two but at 150 watts it wouldn't need much.
 
[Image: PowerColor Radeon RX 480 Red Devil]

I just don't see how it could spread the heat out without pipes or a chamber. I like this and the Asus the best. I know it's kind of silly; from the worst to the best 480 is probably 6 fps at most. :woot: The guys with the cheapest 4 GB reference cards will get the exact same gameplay experience, even compared to this thing of beauty.
 
Except when they run out of RAM. ;)
 
I just don't see how it could spread the heat out without pipes or a chamber. I like this and the Asus the best. I know it's kind of silly; from the worst to the best 480 is probably 6 fps at most. :woot: The guys with the cheapest 4 GB reference cards will get the exact same gameplay experience, even compared to this thing of beauty.

I like those triple fans too; sometimes they get a bit tall with the excessive heat pipes that come out the top, though. My favorite is probably the Asus DCU coolers; those use anywhere between 2 and 4 heat pipes and have badass cooling.
 
Hmm, will XFX be dumb enough to release a card on the reference board design that sucks 125 W out of the PCI-E slot? I mean, they would be totally ignorant and stupid to do that, and who in their right mind would? Maybe they are desperate since they don't have the resources to actually make a custom PCB? But if they are using the reference design, the driver should automatically limit the PCI-E slot to less than 75 W, so the other portion will probably be forced over to the 8-pin, no? Talking all possible scenarios here, but I think the latter will probably be true: the driver limiting PCI-E draw to 75 W or less.

XFX is owned by Pine Technology Holdings, one of the largest companies in the PC components world. They have the resources they need. I refuse to take a random French forum's word that the card is drawing 10 A on the PCI-E slot. That is nearly double the rated spec, and very few, if any, motherboards are going to be able to do that. It's either something being lost in translation, or just inaccurate testing.
 
Well, a stock card was drawing something like 7+ A on PCIe; I don't see why an overclocked one wouldn't hit 10 A.

Don't forget that as the power draw increases, the voltage decreases due to droop effects, so the amperage must increase further to meet the power demand.
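The droop point is really just I = P / V bookkeeping; a toy illustration with made-up numbers (nothing here is measured from an actual card):

```python
# Constant power, sagging voltage: the current has to rise to compensate.
power_watts = 90.0                  # hypothetical 12 V power pulled through the slot
for volts in (12.0, 11.7, 11.4):    # nominal rail vs. progressively drooping rail
    amps = power_watts / volts
    print(f"{volts:4.1f} V -> {amps:.2f} A")
# Same wattage, but the amperage (what the connector pins and traces care about)
# creeps upward as the voltage droops.
```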
 

Are you talking about AIB cards here? If you are, I think you are basing that on every AIB partner using the reference design, which I doubt is the case here.
 
No no, I'm talking specifically about this XFX card, which is a reference board. The new driver with the fix will change the distribution from 1:1 to some other ratio in favor of the 6-pin, but it's still at the limit based on the testing done, so even a small OC should push it back over spec on the PCIe slot.

It's weird, though: that XFX card has an 8-pin, and it makes no sense. Meh. Did I misunderstand something here? Are we really going to see AIB cards with 8-pin connectors on reference boards with the exact same issue?
 
Are we really going to see AIB cards with 8-pin connectors on reference boards with the exact same issue?

Of course. Unless they start to ship with a newer, corrected BIOS, then no matter how many power connectors they add to the board, it will keep a 1:1 balance. I've said this a lot of times before: if the card tries to pull 150 W from the PCI-E power connector, it will try to do the same from the PCI-E slot. So yes, as someone said above, a card pulling 250 W will indeed mean 125 W from each power source, which is awfully bad.
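To put numbers on that balance (a sketch with illustrative figures; the 250 W total and the alternative 30/70 ratio are assumptions, not measured or announced values):

```python
# How a fixed total splits between the slot and the power connector at different ratios.
total_board_watts = 250

def split(total, slot_share):
    return total * slot_share, total * (1 - slot_share)

slot_w, conn_w = split(total_board_watts, 0.5)   # the 1:1 balance described above
print(f"1:1 split   -> slot {slot_w:.0f} W, connector {conn_w:.0f} W")

slot_w, conn_w = split(total_board_watts, 0.3)   # hypothetical rebalance toward the connector
print(f"30/70 split -> slot {slot_w:.0f} W, connector {conn_w:.0f} W")
# A 30/70 split keeps the slot at ~75 W, but the connector then carries 175 W -- over the
# nominal 150 W of an 8-pin -- so at 250 W total, something is out of spec either way.
```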
 
AIBs should use a better power delivery system for the GPU; otherwise there is little point moving to an 8-pin other than to remain compliant.
Currently part of the GPU gets power from the PCIe bus and the rest from the 6-pin.
This cannot make effective use of more power from an 8-pin connector, because part of the GPU won't receive it.

i.e., use the PCIe bus power for memory and other menial circuits only.
The 8-pin connector powers the GPU only.
If more than 150 W is needed for the GPU, add another 6-pin.
But however much power is needed, stay bloody compliant!
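A rough sketch of the kind of partitioning I mean (the wattages are made-up illustrative numbers, not anything an AIB has announced):

```python
# Hypothetical compliant budget: menial loads on the slot, GPU core on the connector.
budget_watts = {
    "PCIe slot (memory, fans, aux logic)": 50,    # under the slot's 66 W / 5.5 A 12 V allowance
    "8-pin connector (GPU core VRM)": 150,        # the 8-pin's nominal rating
}
for rail, watts in budget_watts.items():
    print(f"{rail:<38} {watts:>4} W")
print(f"{'Total board power':<38} {sum(budget_watts.values()):>4} W  (each source within spec)")
```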
 
No no, I'm talking specifically about this XFX card, which is a reference board. The new driver with the fix will change the distribution from 1:1 to some other ratio in favor of the 6-pin, but it's still at the limit based on the testing done, so even a small OC should push it back over spec on the PCIe slot.

It's weird, though: that XFX card has an 8-pin, and it makes no sense. Meh. Did I misunderstand something here? Are we really going to see AIB cards with 8-pin connectors on reference boards with the exact same issue?
Sounds like XFX will just redo the BIOS, since the VRMs are programmable, as proven through the driver. The question then becomes: will the driver override the BIOS? Are the drivers looking for specific BIOS versions to apply the corrections? (I expect that to be the case.)
 
There will more than likely be a flag that tells it when to switch the VRMs.
 