Nvidia/RTG Powergate

No I didn't. He said at stock clocks it wasn't a cause for concern, only when overclocking. What did I misread about that?


Well for one thing, XFX is already selling a 1328MHz version, and PCPer and Tom's reported 100W at just 1310MHz.

The cause for concern starts at 95W according to PCPer, and it's doing 86W at stock, 14% above spec in terms of power on the 12V rail.
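For context, the slot-power numbers being thrown around come from the PCIe CEM spec, which (as commonly cited) allows 5.5A on the slot's 12V pins and 3A on 3.3V. A quick back-of-the-envelope check of the figures above (the spec values are my assumption; the 86W measurement is from the posts):

```python
# PCIe CEM slot power budget (commonly cited values; treat as assumptions)
SLOT_12V_W = 5.5 * 12.0   # 66 W max from the slot's 12 V pins
SLOT_3V3_W = 3.0 * 3.3    # 9.9 W max from the 3.3 V rail
SLOT_TOTAL_W = SLOT_12V_W + SLOT_3V3_W  # ~75.9 W total slot budget

measured_12v_w = 86.0     # stock-clock 12 V slot draw reported above

# Percent over the total slot budget (the "~14% above spec" figure)
over_total = (measured_12v_w / SLOT_TOTAL_W - 1) * 100
# Percent over the 12 V pins' own limit, which is what actually heats the traces
over_12v = (measured_12v_w / SLOT_12V_W - 1) * 100

print(f"{over_total:.0f}% over the 75.9 W total, {over_12v:.0f}% over the 66 W 12 V limit")
```

Whether you call it ~13% or ~14% over depends on rounding and on whether you compare against the ~75W total budget or the 66W 12V limit; against the 12V pins alone it's roughly 30% over.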
 
At stock speeds the card shouldn't break the ATX spec. For overclocking one can expect this to happen, and the onus is on the end user to know the risks of overclocking a reference card. AMD could just as easily have added an 8-pin connector to mitigate the issue.

Why does everything come as an afterthought to AMD's hardware designers?
 
Well for one thing, XFX is already selling a 1328MHz version, and PCPer and Tom's reported 100W at just 1310MHz.

The cause for concern starts at 95W according to PCPer, and it's doing 86W at stock, 14% above spec in terms of power on the 12V rail.

I see 80.5W at stock, not 86. Am I wrong?

[attached image: upload_2016-6-30_16-6-12.png]
 
At stock speeds the card shouldn't break the ATX spec. For overclocking one can expect this to happen, and the onus is on the end user to know the risks of overclocking a reference card. AMD could just as easily have added an 8-pin connector to mitigate the issue.

Why does everything come as an afterthought to AMD's hardware designers?

But like someone said, some Nvidia cards do it too. Does it only matter when AMD does it? Or should we just brush that aside because they are older cards and this is new?
 
But like someone said, some Nvidia cards do it too. Does it only matter when AMD does it? Or should we just brush that aside because they are older cards and this is new?

Two wrongs don't make a right. Both should be equally investigated.

Too bad for AMD, but this time they got caught with their pants down. Shit just happens, and they seem to be in the middle of the storm. I pity them, but feel they should pay for it so they don't repeat their mistakes.
 
But like someone said, some Nvidia cards do it too. Does it only matter when AMD does it? Or should we just brush that aside because they are older cards and this is new?
Again, it's about sustained draw vs. spikes.
 
OMG... a bit much, don't ya think??? Not a poor man's CrossFire card... Move the fuck on already.
 
Watch, this is just going to go away in a week when the AIB cards come out and people are all over getting them tested.

What AMD is going to do is throttle the boost clock so it boosts less to stay below 150W. Problem solved.

They will be like 'boost clock is boost clock, it ain't guaranteed,' because by then the main focus will be aftermarket cards anyway.
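For what it's worth, that kind of fix is basically a power governor in firmware or drivers: sample board power each interval and step the boost state down as draw approaches the cap. A toy sketch of the idea (all numbers and names are hypothetical illustrations, not AMD's actual PowerTune logic):

```python
# Toy power-limit governor: step boost down as draw nears the cap.
# All values and names are illustrative, not AMD's real firmware.
POWER_CAP_W = 150.0
HEADROOM_W = 5.0            # start backing off 5 W below the cap
STEP_MHZ = 10               # granularity of one boost step
BASE_MHZ, BOOST_MHZ = 1120, 1266

def next_clock(current_mhz: int, board_power_w: float) -> int:
    """Return the boost clock to use for the next sampling interval."""
    if board_power_w >= POWER_CAP_W - HEADROOM_W:
        return max(BASE_MHZ, current_mhz - STEP_MHZ)   # throttle down
    return min(BOOST_MHZ, current_mhz + STEP_MHZ)      # recover toward boost

# Example: heavy load pushes draw to 148 W, so the clock backs off a step
clk = next_clock(1266, 148.0)   # -> 1256
```

The point is that "relaxing the boost clocks" is a cheap software change: the card never advertises a guaranteed clock, only a cap it opportunistically reaches when power allows.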
 
Watch, this is just going to go away in a week when the AIB cards come out and people are all over getting them tested.

What AMD is going to do is throttle the boost clock so it boosts less to stay below 150W. Problem solved.

They will be like 'boost clock is boost clock, it ain't guaranteed,' because by then the main focus will be aftermarket cards anyway.

#ThrottleGate
 
#ThrottleGate

How so? Isn't the point of boost clocks that they can adjust according to usage? Then I could call Nvidia 'ThrottleGate' too, because they certainly do that; I have seen Nvidia implement it pretty well to stay under TDP. It seems like in situations where this card is getting close to 150W, it could boost less, to around 1220-1230MHz. That should take care of everyone going crazy.

I don't think this card is adjusting that well. It seems to try to stay at boost clock speeds all the time. They can easily relax those settings.
 
Apparently undervolting improves performance as well as reducing consumption
 
Watch, this is just going to go away in a week when the AIB cards come out and people are all over getting them tested.

What AMD is going to do is throttle the boost clock so it boosts less to stay below 150W. Problem solved.

They will be like 'boost clock is boost clock, it ain't guaranteed,' because by then the main focus will be aftermarket cards anyway.

What if pricing reaches $280-300 for AIB cards? No one will care.
Any AIB 480 within $100 of a GTX 1070 is a bad buy. Hell, a 480 may be a bad buy altogether two weeks from now, after the 1060 launch.
 
What if pricing reaches $280-300 for AIB cards? No one will care.
Any AIB 480 within $100 of a GTX 1070 is a bad buy. Hell, a 480 may be a bad buy altogether two weeks from now, after the 1060 launch.

Meh, you can't say that without knowing how far they are clocked. I'd withhold judgment on that.
 
Apparently undervolting improves performance as well as reducing consumption

Hmm, say what? Any link? So it seems like it is not adjusting voltage properly with their new adaptive-voltage thing? Could that be? Maybe it's not working as efficiently as they would like?
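There's a plausible mechanism for undervolting helping: dynamic power scales roughly as f·V², so dropping voltage at the same clock cuts power, and the reclaimed headroom means the card throttles less, which shows up as better performance. A rough estimate under that approximation (all numbers are made-up examples, not measured RX 480 values):

```python
# Rough dynamic-power scaling: P ~ f * V^2 (standard CMOS approximation).
# All numbers here are illustrative, not measured RX 480 values.
def scaled_power(p0_w: float, f0: float, v0: float, f1: float, v1: float) -> float:
    """Estimate power at clock f1 / voltage v1, given power p0_w at (f0, v0)."""
    return p0_w * (f1 / f0) * (v1 / v0) ** 2

# Same 1266 MHz clock, voltage undervolted from a hypothetical 1.15 V to 1.08 V:
p = scaled_power(150.0, 1266, 1.15, 1266, 1.08)
# ~132 W: the saved ~18 W of headroom means less power-based throttling,
# which is how a lower voltage can show up as higher average performance.
```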
 
What if pricing reaches $280-300 for AIB cards? No one will care.
Any AIB 480 within $100 of a GTX 1070 is a bad buy. Hell, a 480 may be a bad buy altogether two weeks from now, after the 1060 launch.

Let's say ASUS or XFX drops an 8-pin card with better components and a better cooler. The reference 8GB cards are going for $240-260,
so we know $260-280 is a real possibility. Even at 1400MHz it still hasn't reached 980 levels; you would need 1450MHz+ for that, and the power draw would be insane.

Also, a beefed-up $280-300 card is a far cry from the initial $200 MSRP card AMD tried to sell us, and I can't really call it a budget card anymore. The power-draw-to-performance ratio will not yield any benefits for users with budget PCs.
 
Meh, you can't say that without knowing how far they are clocked. I'd withhold judgment on that.

32 ROPs vs. 64 ?

The 1070 can be clocked even higher, if that counts for something. So it's a valid argument that AIBs shouldn't charge within $100 of a 1070.
 
AMD's gonna have a CLA up its ass if it decides to release a BIOS that throttles power usage by reducing performance. This will be identical to the 3.5GB fiasco, but worse, since flashing a VBIOS is risky and technically voids the warranty, and AFAIK this capability is not in the drivers.

So basically AMD's gonna have to do a recall, or risk losing certification, which means not getting into big OEMs.
 
AMD's gonna have a CLA up its ass if it decides to release a BIOS that throttles power usage by reducing performance. This will be identical to the 3.5GB fiasco, but worse, since flashing a VBIOS is risky and technically voids the warranty, and AFAIK this capability is not in the drivers.

So basically AMD's gonna have to do a recall, or risk losing certification, which means not getting into big OEMs.

LOL, recall? They can easily adjust the boost clocks. Boost clocks are not promised, remember? I think you are just getting way ahead of yourself. They can easily say the boost clocks were adjusting according to power draw, bam, end of story.
 
LOL, recall? They can easily adjust the boost clocks. Boost clocks are not promised, remember? I think you are just getting way ahead of yourself. They can easily say the boost clocks were adjusting according to power draw, bam, end of story.

Well... depends what you mean; the boost clocks have been announced. They are promised, and if they suddenly make the boost clock 1200MHz that won't be cool.

You're also forgetting the overclocked versions. XFX sells a 1328MHz one out of the box. According to PCPer and Tom's, that's drawing 100W or more from the 12V alone, enough to damage something.
 
So can it be concluded that AMD was caught cheating?

I think there's definitely a deliberate aspect to this. AMD is not so naive or stupid as to leave this issue unresolved. The more I read, the more it becomes clear that they thought no one would catch this.
 
Makes me wonder about those rumors of the AIBs being unhappy. I can see why they might be.
 
Since this now seems to be an issue with both companies, what's people's take on both Nvidia and RTG failing the PCIe spec? Regardless of spikes or usage, both companies make cards that go over the PCIe spec.

AMD Radeon RX 480 8GB Power Consumption Results (RTG 480)

Power Consumption: Gaming - GeForce GTX 750 Ti Review: Maxwell Adds Performance Using Less Power (750ti Maxwell)

Tom's Hardware complains about RTG having the issue, but totally ignored the 750 Ti doing the same.

So that shows bias from Tom's Hardware; sure, we can just ignore that part...

What I want to know, since we know both companies do it: how do people really feel?
I think trying to dig up dirt on a card released a really long time ago is frankly a pathetic defense.
 
So can it be concluded that AMD was caught cheating?

I think there's definitely a deliberate aspect to this. AMD is not so naive or stupid as to leave this issue unresolved. The more I read, the more it becomes clear that they thought no one would catch this.

Nvidia lied to you for months about 4GB of RAM and sold you 3.5GB as 4GB. I didn't see people cooking that shit up and suing Nvidia; Nvidia told everyone to shut the fuck up and move on, and everyone did, lol. They both do it. Caught cheating for what, though? It looks like the card uses too much power at max voltage; adjusting that, or adjusting the boost clocks, will be fine. There is no cheating if the boost clocks just aren't adjusting according to power properly.
 
I think trying to dig up dirt on a card released a really long time ago is frankly a pathetic defense.

What? Not if it supports someone's argument that it violates the spec, just like everyone's claiming the RX 480 does. Fair is fair, no? So you throw away evidence just because it's old?
 
Makes me wonder about those rumors of AIB being unhappy. I can see why they might be unhappy

Then why did XFX decide to OC a card with the reference cooler? That looks to me like it's all on XFX if they knew about it and still decided to OC it.
 
Nvidia lied to you for months about 4GB of RAM and sold you 3.5GB as 4GB. I didn't see people cooking that shit up and suing Nvidia; Nvidia told everyone to shut the fuck up and move on, and everyone did, lol. They both do it. Caught cheating for what, though? It looks like the card uses too much power at max voltage; adjusting that, or adjusting the boost clocks, will be fine. There is no cheating if the boost clocks just aren't adjusting according to power properly.

Again, two wrongs don't make a right, and this is a really pathetic defense. People still quote the VRAM fiasco, and so should they with the 480's power-gate issue.

It's either cheating or utter stupidity. Either way, it doesn't bode well for AMD. They are going to pay one way or another.
 
What? Not if it supports someone's argument that it violates the spec, just like everyone's claiming the RX 480 does. Fair is fair, no? So you throw away evidence just because it's old?
No, the fact that it exists is not a defense of AMD; it's like one kid saying the other kid stole a cookie too. It's misdirection, and it's also really old, so it's irrelevant to what AMD is doing now. Now, if the 1060 releases doing the exact same thing, yeah, we should look into that.
 
Again, two wrongs don't make a right, and this is a really pathetic defense. People still quote the VRAM fiasco, and so should they with the 480's power-gate issue.

It's either cheating or utter stupidity. Either way, it doesn't bode well for AMD. They are going to pay one way or another.

I am just saying: Nvidia always gets a pass from its fanboys and everyone wants to burn AMD! I understand everyone is mad because AMD doesn't have a high end. But people take it up the ass when Nvidia fucks them over, and then when this card goes over spec we all forget that this isn't the first or the last card that will do so.

Two wrongs don't make a right, true, but no one should get a pass on it. Not Nvidia, not AMD. Nvidia literally lied about the RAM on the GTX 970, straight up, and fanboys tried to argue it doesn't matter, not a big deal!
 
No, the fact that it exists is not a defense of AMD; it's like one kid saying the other kid stole a cookie too. It's misdirection, and it's also really old, so it's irrelevant to what AMD is doing now. Now, if the 1060 releases doing the exact same thing, yeah, we should look into that.

Of course, I am just pointing out the passes Nvidia gets, to be honest. Everyone is on a witch hunt. Wait a week and see what they do to fix it, but what I said is what's going to happen. Relaxing the boost clocks a little when it gets close to 150W, or relaxing the voltage a notch, should easily fix it.
 
Nvidia lied to you for months about 4GB of RAM and sold you 3.5GB as 4GB. I didn't see people cooking that shit up and suing Nvidia; Nvidia told everyone to shut the fuck up and move on, and everyone did, lol. They both do it. Caught cheating for what, though? It looks like the card uses too much power at max voltage; adjusting that, or adjusting the boost clocks, will be fine. There is no cheating if the boost clocks just aren't adjusting according to power properly.

I thought they were sued over the 970?

Nvidia lawsuit over GTX 970

And it was a pretty big deal around here...

The 480 power issue is a real concern because overcurrenting your mobo isn't the best thing to do. Hell, my Asus Rampage has an extra power connector for multi-GPU to help power the PCIe slots. What happens with the budget mobos that these cards will go into?

I'd also have to wonder if the cards are getting clean power off the PCIe slot. I guess it shouldn't matter, since the VRMs should clean it up.

I still think it was just a fuck-up, which they can probably fix, since I doubt anyone in their right mind would deliberately overcurrent the mobo instead of the 6-pin.
 
I thought they were sued over the 970?

Nvidia lawsuit over GTX 970

And it was a pretty big deal around here...

The 480 power issue is a real concern because overcurrenting your mobo isn't the best thing to do. Hell, my Asus Rampage has an extra power connector for multi-GPU to help power the PCIe slots. What happens with the budget mobos that these cards will go into?

I'd also have to wonder if the cards are getting clean power off the PCIe slot. I guess it shouldn't matter, since the VRMs should clean it up.

Meh, it's happened before. They will plug it with a driver quick, just watch. You might see a slight difference in performance in games that tax the GPU more, because it will probably boost less, but I doubt it makes a difference at 1080p.
 
I don't know what XFX is thinking. The fact is both PCPer and Tom's tested with an OC at ~1310MHz, and they reported 100W on the 12V, so... If you know anyone who bought an XFX 480 BE, tell them to video their first gaming session. Should be electrifying.
 
So can it be concluded that AMD was caught cheating?

I think there's definitely a deliberate aspect to this. AMD is not so naive or stupid as to leave this issue unresolved. The more I read, the more it becomes clear that they thought no one would catch this.


One person was bringing attention to AMD pushing the clocks hard before launch:

Kyle Bennet:
Because AMD is trying to buy enough time to try and get the clocks up on production GPUs.
This could have been the reason why it took a whole month for them to bring the card to market, and the whole reason why the cards run out of spec: a possible desperate attempt to raise and stabilize the boost clocks to at least come close to a GTX 970 in overall performance.
 
Since this now seems to be an issue with both companies, what's people's take on both Nvidia and RTG failing the PCIe spec? Regardless of spikes or usage, both companies make cards that go over the PCIe spec.

AMD Radeon RX 480 8GB Power Consumption Results (RTG 480)

Power Consumption: Gaming - GeForce GTX 750 Ti Review: Maxwell Adds Performance Using Less Power (750ti Maxwell)

Tom's Hardware complains about RTG having the issue, but totally ignored the 750 Ti doing the same.

So that shows bias from Tom's Hardware; sure, we can just ignore that part...

What I want to know, since we know both companies do it: how do people really feel?

I say give AMD a pass since nVidia did it first.

Did I win?
 
What? Not if it supports someone's argument that it violates the spec, just like everyone's claiming the RX 480 does. Fair is fair, no? So you throw away evidence just because it's old?
I really don't want to defend Nvidia in RAMgate, but here goes. The card had 4GB of RAM, period, no lie there, end of story; what they did was use slower RAM for the last 512MB, which is certainly misleading but is not the same as what AMD may be doing here.
 