AMD's Statement On Reports Of Radeon RX 480 Excessive Power Draw

HardOCP News

We promised an update today (July 5, 2016) following concerns around the Radeon™ RX 480 drawing excess current from the PCIe bus. Although we are confident that the levels of reported power draws by the Radeon RX 480 do not pose a risk of damage to motherboards or other PC components based on expected usage, we are serious about addressing this topic and allaying outstanding concerns. Towards that end, we assembled a worldwide team this past weekend to investigate and develop a driver update to improve the power draw. We’re pleased to report that this driver—Radeon Software 16.7.1—is now undergoing final testing and will be released to the public in the next 48 hours.

In this driver we’ve implemented a change to address power distribution on the Radeon RX 480 – this change will lower current drawn from the PCIe bus.

Separately, we’ve also included an option to reduce total power with minimal performance impact. Users will find this as the “compatibility” UI toggle in the Global Settings menu of Radeon Settings. This toggle is “off” by default.

Finally, we’ve implemented a collection of performance improvements for the Polaris architecture that yield performance uplifts in popular game titles of up to 3%. These optimizations are designed to improve the performance of the Radeon RX 480, and should substantially offset the performance impact for users who choose to activate the “compatibility” toggle.

AMD is committed to delivering high quality and high performance products, and we’ll continue to provide users with more control over their product’s performance and efficiency. We appreciate all the feedback so far, and we’ll continue to bring further performance and performance/W optimizations to the Radeon RX 480.
 
The 8GB card looks like a decent card for 1080p.


Glad I am a late, late, late adopter... I don't have to worry about companies messing up their product launches.

 
I vaguely remember overclocking AGP ports back in the day. So, dumb question of the day... has anyone (forum sites etc., not nimrods from some troll post) had any issues with the standard RX 480? I assumed there was always some wiggle room in specs just for these kinds of issues. It seems like the card shouldn't be able to draw more juice than the slot allows, like hooking up 300 watt speakers to a 100 watt amp: the speakers just won't play as loud. I'm sure it is something stupid I am not understanding, but I haven't really seen anything explaining it in terms that I understand.
 
I vaguely remember overclocking AGP ports back in the day. So, dumb question of the day... has anyone (forum sites etc., not nimrods from some troll post) had any issues with the standard RX 480? I assumed there was always some wiggle room in specs just for these kinds of issues. It seems like the card shouldn't be able to draw more juice than the slot allows, like hooking up 300 watt speakers to a 100 watt amp: the speakers just won't play as loud. I'm sure it is something stupid I am not understanding, but I haven't really seen anything explaining it in terms that I understand.


Let us know when you've found a voltage/power regulator for the PCI-E slot power
 
Finally, we’ve implemented a collection of performance improvements for the Polaris architecture that yield performance uplifts in popular game titles of up to 3%. These optimizations are designed to improve the performance of the Radeon RX 480, and should substantially offset the performance impact for users who choose to activate the “compatibility” toggle.

So, the "minimal performance impact" caused by the compatibility toggle can be "substantially offset" by improving game performance by up to 3%?

Is this "substantially" the kind of substantial you talk about when discussing how much faster your new generation of video cards is than the previous generation?
 
How can a driver control the amount of power the card takes from the slot vs the power socket?
 
How can a driver control the amount of power the card takes from the slot vs the power socket?
Same way people in the community have already figured out how to do the same thing via BIOS mods, is my guess (there's already some info posted in other threads)... Guess we'll see how effective it is. Overloading the 6-pin is not so much a big deal for most of us. (Not to mention I believe AMD is using one of the sense (black) wires as a load conductor as well.)
 
How can a driver control the amount of power the card takes from the slot vs the power socket?

The VRMs are programmable. The bottom three come from the PEG, and the top three for the GPU come from the PCIe. Reprogram the VRMs with new values and you have a new power distribution... PROVIDED THE OUTPUT IS ON THE SAME POWER PLANE SUPPLYING THE COMPONENTS.

If they aren't all running the same duty cycle (which appears to be what this fix does), it will make the top three run hotter, as they will have a higher duty cycle. But this shouldn't be an issue, as the VRMs are rated at 100 watts, so they have some headroom.

It appears the board had put an extra VRM on the PEG, which exacerbated the problem.
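To make that concrete, here is a back-of-the-envelope sketch (in Python) of what rebalancing buys you. It assumes the 3/3 phase split described above, 12V rails, and made-up current numbers; it is not AMD's actual VRM controller interface.

# Sketch: how rebalancing per-phase current shifts rail draw.
# Assumes a 6-phase GPU VRM with 3 phases fed by the slot and 3 by the
# 6-pin connector, all at 12 V. Numbers are illustrative, not measured.

VOLTS = 12.0

def rail_draw(total_gpu_amps, slot_share):
    # slot_share is the fraction of GPU input current steered to the phases
    # fed by the slot (0.5 = even split; lower values favor the 6-pin).
    slot_watts = total_gpu_amps * slot_share * VOLTS
    plug_watts = total_gpu_amps * (1.0 - slot_share) * VOLTS
    return slot_watts, plug_watts

# Even 50/50 split on a ~164 W GPU load: roughly 82 W per rail.
print(rail_draw(13.7, 0.50))
# Steer more of the load to the 6-pin-fed phases: the slot falls back toward
# the 66 W its 12 V pins are specced for, and the plug picks up the difference.
print(rail_draw(13.7, 0.40))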
 
I'll reply when I see an [H] review of the fix. ;)
Looks like the "fix" lowers performance by less than 5%, and AMD seems to say that the new drivers bump performance by less than 5%.

Quite frankly, I see a lot of this as a non-issue. It is likely to only be an issue on ancient systems, and if AMD is willing to take on the liability for that, it will probably not be an enthusiast issue. We have our CrossFire article coming up next week, using the current driver. I do not see doing all that work over again for a performance increase that falls inside the margin of error for most tests.
 
GN has posted a video re: the incoming patch. Should be here by Thursday...

 
The VRMs are programmable. The bottom three come from the PEG, and the top three for the GPU come from the PCIe. Reprogram the VRMs with new values and you have a new power distribution... PROVIDED THE OUTPUT IS ON THE SAME POWER PLANE SUPPLYING THE COMPONENTS.

If they aren't all running the same duty cycle (which appears to be what this fix does), it will make the top three run hotter, as they will have a higher duty cycle. But this shouldn't be an issue, as the VRMs are rated at 100 watts, so they have some headroom.

It appears the board had put an extra VRM on the PEG, which exacerbated the problem.

I don't mean to question you like some of the trolls do, since I believe you also hold a double E, but shouldn't the VRMs be rated at 125C? I haven't bothered looking at any bare PCB shots to see what components the reference boards sourced, but I am fairly certain they should be rated for 40A @ 125C each...

My last 3 AMD cards were all rated @ 125C, which is why I find it so funny when forum newbs go "Oh Nooooez, my VRMs are 85~90C, they are going to fry themselves!! Damn you AMD for allowing us to read the VRM temperature in GPU-Z... Nvidia said it best (like Cypher in The Matrix): Ignorance IS BLISSSSSSS!!" :p:p:p
 
Provided the driver keeps the PCIE bus under its proper power limit, just how far is it overdrawing the 6-pin?

I'm finding it super hard to believe that anyone would bother worrying about a few extra watts on a 6-pin. It's the MB we're trying to keep from killing here. There's plenty of current headroom in the 6-pin wiring.

(I suppose to answer my own question, the one exception would be business use where you MUST meet all relevant industry specs or you can't use the part. But if whatever driver gets loaded automatically by Windows doesn't select the by-the-book power options no one is going to spec any 480 cards for any mission critical applications.)
 
Lots of miners on the various coin forums are pretty upset about this.

However, in the grand scheme of things, the RX 480 hashes very well for its power draw.
 
Provided the driver keeps the PCIE bus under its proper power limit, just how far is it overdrawing the 6-pin?

I'm finding it super hard to believe that anyone would bother worrying about a few extra watts on a 6-pin. It's the MB we're trying to keep from killing here. There's plenty of current headroom in the 6-pin wiring.

(I suppose to answer my own question, the one exception would be business use where you MUST meet all relevant industry specs or you can't use the part. But if whatever driver gets loaded automatically by Windows doesn't select the by-the-book power options no one is going to spec any 480 cards for any mission critical applications.)

It's simple: if the card is currently drawing ~85W from the PCI-E slot and AMD limits it to ~74W, then the card will draw another ~11W from the 6-pin connector... As Kyle said, it really isn't a big deal (despite what some forum trolls will have you believe) unless you are running an old PCI-E 1.1 system...

I don't own any 480s at the moment, but I will be getting three once GPU blocks are out for them, and I have zero worries about the mobo, even though that will be a 24/7 mining rig. The fact that 6-pin connectors are limited to 75W by "spec" is insane in this day and age, when they can easily pull triple that and still be within the rated current draw spec.
 
Everyone is getting this wrong. They fixed the PCIe power issue as the default, BUT if you toggle the compatibility setting it will reduce the PCIe plug overdraw to put the card into spec. So far GamersNexus and TPU have gotten it wrong.

6-pin connectors have two 12V cables; each cable should be rated for 5.5 amps with +8% wiggle for any regular PSU. That tells you that the 75W-per-plug figure in the spec is yet another one of those massively over-engineered standards that we love.

12V * 5.5A * 2 = 132W, and 132W * 1.08 = 142.56W, which should be the maximum amount that is safe to pull from a 6-pin plug. The 5.5 amp and 8% numbers were given in a different thread directly by a motherboard/PSU manufacturer.

The wiggle room is big enough that, when overclocking, even if people increase the allocated power by 20-50% it should still fall safely within that range.
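For what it's worth, here's a quick sanity check of that math in Python, using the poster's assumed 5.5 A per wire and +8% margin rather than any official PCI-SIG figure:

# Sanity check on the 6-pin math above. The 5.5 A per 12 V wire and the +8%
# margin are the figures quoted in this thread, not official PCI-SIG numbers.

VOLTS = 12.0
AMPS_PER_WIRE = 5.5   # assumed rating per 12 V wire
WIRES = 2             # 12 V wires in a 6-pin connector, per the post above
MARGIN = 1.08         # +8% wiggle room

practical_limit = VOLTS * AMPS_PER_WIRE * WIRES * MARGIN
print(f"practical 6-pin limit: {practical_limit:.2f} W")   # 142.56 W

# Even a 20-50% bump over the 75 W spec budget stays well under that figure.
spec_budget = 75.0
for bump in (1.2, 1.5):
    oc_draw = spec_budget * bump
    verdict = "within" if oc_draw <= practical_limit else "over"
    print(f"+{(bump - 1) * 100:.0f}% power limit: {oc_draw:.1f} W ({verdict} the assumed limit)")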
 
Under full performance the card is still out of spec, only now on the PCIe cable. AMD is doing this to maintain speeds and weasel out of a real fix. The loss of 5 percent or more(?) in compatibility mode means, to me, that I'm not getting what I paid for. I will be getting a brand new Sapphire 8GB tomorrow, ordered from Newegg. It will not be opened, and I will get a full refund even if I have to dispute the charge. You guys might accept this. I didn't pay for a cob job. No thanks, AMD.....
 
Under full performance the card is still out of spec, only now on the PCIe cable. AMD is doing this to maintain speeds and weasel out of a real fix. The loss of 5 percent or more(?) in compatibility mode means, to me, that I'm not getting what I paid for. I will be getting a brand new Sapphire 8GB tomorrow, ordered from Newegg. It will not be opened, and I will get a full refund even if I have to dispute the charge. You guys might accept this. I didn't pay for a cob job. No thanks, AMD.....
So you base this final decision on absolutely no proof? Wouldn't it be better to wait and see how the driver affects performance before lighting the torches?
 
Under full performance the card is still out of spec, only now on the PCIe cable. AMD is doing this to maintain speeds and weasel out of a real fix. The loss of 5 percent or more(?) in compatibility mode means, to me, that I'm not getting what I paid for. I will be getting a brand new Sapphire 8GB tomorrow, ordered from Newegg. It will not be opened, and I will get a full refund even if I have to dispute the charge. You guys might accept this. I didn't pay for a cob job. No thanks, AMD.....
Why would you ever run it in compatibility mode, though? As long as they fix it so the extra power draw comes over the PCI-E cable, it really doesn't matter. That's what this fix appears to do.
 
Because I was going to use it in an older Core 2 Quad motherboard. Still not sure I would trust it. Honestly, this whole deal ticks me off. I won't pay for or accept this... my right as a consumer....
 
Investment-wise these puppies are amazing... Think about it: if you sold one used right now, how much depreciation would it have? Absolutely none! Go ahead, send the card back, LOL. Even with this whole "Power Gate" drama, these cards have lost zero value. They might have actually gained popularity, lol
 
Investment-wise these puppies are amazing... Think about it: if you sold one used right now, how much depreciation would it have? Absolutely none! Go ahead, send the card back, LOL. Even with this whole "Power Gate" drama, these cards have lost zero value. They might have actually gained popularity, lol
The Tim Allen more power edition.
 
Investment-wise these puppies are amazing... Think about it: if you sold one used right now, how much depreciation would it have? Absolutely none! Go ahead, send the card back, LOL. Even with this whole "Power Gate" drama, these cards have lost zero value. They might have actually gained popularity, lol

Seems like new GPUs are all the rage this summer. Both nVidia's and AMD's new offerings are in short supply. How much of it is true demand versus constrained supply I have no idea, but I'm sure it's some of both.
 
Because I was going to use it in an older Core 2 Quad motherboard. Still not sure I would trust it. Honestly, this whole deal ticks me off. I won't pay for or accept this... my right as a consumer....
lol, even in compatibility mode there is zero chance a Core 2 Quad will be even remotely GPU-limited... not possible... not even close
 
6-pin connectors have two 12V cables; each cable should be rated for 5.5 amps with +8% wiggle for any regular PSU. That tells you that the 75W-per-plug figure in the spec is yet another one of those massively over-engineered standards that we love.

12V * 5.5A * 2 = 132W, and 132W * 1.08 = 142.56W, which should be the maximum amount that is safe to pull from a 6-pin plug. The 5.5 amp and 8% numbers were given in a different thread directly by a motherboard/PSU manufacturer.

The wiggle room is big enough that, when overclocking, even if people increase the allocated power by 20-50% it should still fall safely within that range.

Yeah, and 142W is probably conservative, IMO. I think it'd start to get warm around 400W.

I'd personally have zero concerns as long as the card is pulling the power over the 6-pin.
 
Because I was going to use it in an older Core 2 Quad motherboard. Still not sure I would trust it. Honestly, this whole deal ticks me off. I won't pay for or accept this... my right as a consumer....


Go buy a Strix GTX 960, oh wait, that's out of spec. Go buy a GTX 750 Ti, wait, that's out of spec too. I am sure we can pick 10 GPUs on Newegg right now and at least 2 of them would be borderline or out of spec in some cases. If you keep your case clean and free of dust, have a PCIe 2.0 or higher slot, and a decent PSU, it's.. not.. an.. issue.
 
It's sounding like this fatal, motherboard-destroying flaw wasn't nearly as serious as it was portrayed to be.
 
Go buy a Strix GTX 960, oh wait, that's out of spec. Go buy a GTX 750 Ti, wait, that's out of spec too. I am sure we can pick 10 GPUs on Newegg right now and at least 2 of them would be borderline or out of spec in some cases. If you keep your case clean and free of dust, have a PCIe 2.0 or higher slot, and a decent PSU, it's.. not.. an.. issue.

The GTX 960 is still in spec (as PCPer has shown). TomsHardware is logging 1ms peaks (which all GPUs produce), and that made it look like it was way more over spec than it actually was.

Power Consumption Concerns on the Radeon RX 480 | Evaluating the ASUS GTX 960 Strix

The out-of-spec card, the AMD RX 480 (the GPU is doing a 50/50 split):
Power Consumption Concerns on the Radeon RX 480 | Overclocking, Current Testing

At the bottom of that article is how it should work (you'll notice it keeps the x16 slot power draw around 50W):
AMD R9 380
Inline | PC Perspective
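A toy example of why peak logging exaggerates things (made-up numbers in Python, not PCPer's or Tom's actual measurement setup): brief millisecond spikes barely move the sustained average, but they dominate a peak readout.

# Toy illustration: short current spikes inflate a "peak" reading while the
# sustained average stays near spec. The samples below are made up.

samples_w = [72, 74, 71, 73, 95, 72, 70, 74, 96, 73]  # pretend 1 ms slot-power samples

average = sum(samples_w) / len(samples_w)
peak = max(samples_w)

print(f"average slot draw: {average:.1f} W")  # what the rail actually sustains (77.0 W)
print(f"1 ms peak reading: {peak} W")         # what a peak-logging setup would report (96 W)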
 
So, all in all, a non-issue, and again it was only something that affected computers with bargain-basement crap parts, and it was easily fixed to calm fears.

Just like I said two days ago.

Methinks the Nvidia shills are working heavy overtime. (And I own a GTX 1080, to boot.)
 
So, all in all, a non-issue, and again it was only something that affected computers with bargain-basement crap parts, and it was easily fixed to calm fears.

Just like I said two days ago.

Methinks the Nvidia shills are working heavy overtime. (And I own a GTX 1080, to boot.)

This wasn't a huge issue, but this card was WAY overhyped for what it is. Sure, the performance is good for a new part at this price point. We'll see how the GTX 1060 stacks up soon enough. The thing is, compared to what we've seen from Pascal in the 1080 and 1070, Polaris is just getting its ass kicked in power efficiency, and this goes back to that op-ed Kyle wrote last month. If the 1060 comes in with the right price and performance and is drawing substantially less power, that's going to be a problem for AMD.
 
So much of a non-issue that Raja had a driver team working overtime during the 4th of July weekend.

It was certainly more than a non-issue, just not a big one. AMD has a lot of reputational problems and I think they very well know it. This is the kind of action that, even though the issue isn't big, looks very good in the PR column. There was a problem, and soon after discovery AMD addressed it. They did exactly what they had to in this case.
 
So much of a non-issue that Raja had a driver team working overtime during the 4th of July weekend.

When you suddenly, out of nowhere, have people/websites claiming your video card will destroy everyone's computers, remarks which could and possibly would tank your company, you're damn right they'll work over the 4th of July to clear their name.
 
Whether the power issue is valid or not, AMD doesn't need this kind of press. Any Intel fan should understand that without competition, Intel would be a monopoly. That means much longer times between product releases, and much higher prices.

I loved my AMD Barton Athlon 2500+, but since then I've pretty much stuck to Intel.

I know that I'm talking about CPUs, and the kerfuffle is about GPUs, but, for now, it's all AMD.
 
It was certainly more than a non-issue, just not a big one. AMD has a lot of reputational problems and I think they very well know it. This is the kind of action that, even though the issue isn't big, looks very good in the PR column. There was a problem, and soon after discovery AMD addressed it. They did exactly what they had to in this case.
I agree, and I also don't think anyone was making it out to be a huge deal. These days, if someone brings up an issue with a product, you are likely to get "OMG, this is such a non-issue! Stop trying to make it out to be such a big issue! Blah blah."

Either way, AMD did the right thing and kept to their own timeline, so no one can fault them.
 
Hmmm, I wonder if the new driver will mess up other cards...?
 