AMD Radeon RX 580 PowerColor Red Devil Golden Sample

Interesting that they got the clocks significantly higher on Polaris on the new node. I would hope that would transfer over to Vega? I'm no razor1, though; I just casually follow video cards.

Unless they are basically just OC'ing it at stock... sounds like a little bit of both (process improvement and OC'ing).

The voltage has gone up considerably, and the process hasn't improved all that much. I guess the chips can take the extra voltage, but it's not like this was a "free" upclock. I wouldn't call it stock OC'ing (except for the AIB cards that deliberately advertise it) so much as AMD taking advantage of every ounce of minor process improvement they can.
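As a rough back-of-the-envelope for why the extra voltage isn't "free": dynamic power scales roughly with frequency times voltage squared. The clock and voltage numbers below are placeholders for illustration, not figures from the review.

```python
# Rough sketch only: CMOS dynamic power scales roughly with f * V^2.
# The clock/voltage numbers are illustrative placeholders, not measured
# values from the review.
def dynamic_power_ratio(f_old, v_old, f_new, v_new):
    """Approximate ratio of new dynamic power to old dynamic power."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Example: ~8% higher clock paired with ~6% more voltage
ratio = dynamic_power_ratio(f_old=1266, v_old=1.06, f_new=1366, v_new=1.12)
print(f"~{(ratio - 1) * 100:.0f}% more dynamic power")   # ~20%
```

Which is why a modest clock bump paired with more voltage can show up as a disproportionately larger jump at the wall.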

It says nothing about Vega one way or another. Totally different uarch.
 
I tell ya, I'm really not all too shocked at this. So they did re-spin the GPU, eh? Interesting... I wonder how much further they can overclock.

Good article, thanks Kyle.
 
These seem to be rather affordable ways to get GPUs with beefy amounts of VRAM. I like that part! DOOM alone is... demanding of such things.
 
Looks over at my R9 390 sitting on the table -

"Looks like you're still in the same category as the top dog of your family, buddy."

Still no regrets about getting that card two years ago. If AMD wants to keep renaming and tweaking their cards, then good for them. They are great cards. But really, we want something new here, guys.
 
Interesting that they got the clocks significantly higher on Polaris on the new node. I would hope that would transfer over to Vega? I'm no razor1, though; I just casually follow video cards.

Unless they are basically just OC'ing it at stock... sounds like a little bit of both (process improvement and OC'ing).
Yeah... that has nothing to do with the Vega architecture.
 
AMD had better get Vega here quick, even if it is slower than Pascal (who knows?). They are really hurting right now.

Sometimes I wonder if all this trouble over at AMD started when they tried to be both a CPU and a GPU company. Maybe they'd have been better off to stick with making CPUs and let the ATI division operate more or less independently. Seems like they have good ideas in both markets, but just fail to execute on time.

I don't know what "independently" means here, but the graphics division was reorganized in 2015 into the Radeon Technologies Group under Raja Koduri.

The newly formed Radeon Technologies Group will be responsible for the graphics technology used in discrete GPUs, APUs, and semi-custom products like the chips used in the Xbox and Playstation. In his new role, Koduri will oversee everything from hardware and software development to product management, marketing, and developer relations.

http://techreport.com/news/29003/amd-shakes-up-radeon-biz-gives-koduri-full-responsibility

http://www.anandtech.com/show/9612/...ion-radeon-whole-once-more-led-by-raja-koduri

The AnandTech article talks about how AMD integrated ATI by spreading its graphics groups throughout the company. This particular passage was really interesting to me.

Though I imagine most long-time AMD followers understand just what "graphics being an integral part of AMD" means at a high level, it wasn’t until this announcement that even I truly understood just how spread out AMD’s various graphics-related sub-groups have been. Between the various groups, AMD has had departments reporting to CTO Mark Papermaster, CVP of Global Marketing John Taylor, CVP and GM of graphics Matt Skynner, VP of Visual Computing Raja Koduri, and other executives within the AMD structure. The end result is that graphics is truly everywhere within AMD, but at times it is also nowhere.

Edit: Now, RTG under Koduri reports to CEO Lisa Su.
 
I don't know what "independently" means here, but the graphics division was reorganized in 2015 into the Radeon Technologies Group under Raja Koduri.

Didn't know that. But that sounds potentially promising.

http://techreport.com/news/29003/amd-shakes-up-radeon-biz-gives-koduri-full-responsibility

http://www.anandtech.com/show/9612/...ion-radeon-whole-once-more-led-by-raja-koduri

The AnandTech article talks about how AMD integrated ATI by spreading its graphics groups throughout the company. This particular passage was really interesting to me.

And this might explain some of the problems AMD has had on the GPU side.
 
Yeah, I really don't get this card at all. Like others stated, calling it a 485 would have made sense, but giving it a new series number does not. The slightly lower MSRP means nothing to me, because you can probably find RX 480s for less than you can find 580s. So then it comes down to whether your existing RX 480 can perform the same as this new card. I'm guessing some will.

If I were looking for a budget card, I just don't see the appeal of this one. You can get a GTX 1060, which is faster, costs about the same, and only requires one 6-pin PCIe connector. The power consumption difference between the two is not even close, and you're not gaining anything from the 580 other than a space heater.
 
If I were looking for a budget card, I just don't see the appeal of this one. You can get a GTX 1060, which is faster, costs about the same, and only requires one 6-pin PCIe connector. The power consumption difference between the two is not even close, and you're not gaining anything from the 580 other than a space heater.

To be fair, to some extent with the 480, and definitely with the 580, the Radeon is faster in DX12 than the 1060. The 1060 wins in DX11. So if you're playing a lot of newer games that take advantage of DX12, the Radeon is the better choice on pure performance, and probably better from a future-proofing perspective too. If you mostly play older stuff and intend to swap the card in a year or so, the 1060 is a better buy. This card definitely has a comfortable place in the market, and a compelling argument for buying it over a 1060. It's just that AMD's labeling of it as a "580" was kind of silly. 485 or 480XT (as Kyle suggested), or something similar would have been better.
 
Didn't know that. But that sounds potentially promising.



And this might explain some of the problems AMD has had on the GPU side.

If I remember correctly, RTG revamped its GPU drivers in November 2015 and hired Andrej Zdravkovic, an ATI veteran, as "Corporate VP of Software and Platform Engineering" to make better drivers. It's been working pretty well ever since; AMD's driver game has been approaching parity with Nvidia's.

Edit: An interview with the guy.
 
The GPU-Z screenshot shows power peaking at about 270W on the video card. TPU shows the 1080 Ti peaking at 267W.

But if you want a direct comparison of testing methodologies, here is TPU's chart for the Sapphire RX 580 Nitro+:
[chart: maximum power draw]

I think the average is more interesting and indicative of real world performance.
[chart: average power draw]
https://www.techpowerup.com/reviews/Sapphire/RX_580_Nitro_Plus/28.html

They measure power draw for the video card directly from the slot and power connectors.

Don't need to go anywhere else. *Everything* else about the test system is exactly the same except for the video card in both of the [H] reviews. Power measurements between the two are absolutely comparable. The FE Ti consumes substantially more power, and it should: 25% more power for what, 125-150% more performance? Easily worth it. No need to cherry-pick points from various other reviews. It's available directly, right here.
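Rough arithmetic behind that trade-off, using the approximate multipliers quoted above rather than exact measured numbers:

```python
# Quick sanity check of the perf-per-watt argument. The multipliers are
# the rough figures quoted in this thread, not exact review numbers.
rx580_power, ti_power = 1.00, 1.25   # Ti draws roughly 25% more power
rx580_perf,  ti_perf  = 1.00, 2.25   # Ti is roughly 125% faster

print("RX 580 perf/W :", rx580_perf / rx580_power)   # 1.00
print("1080 Ti perf/W:", ti_perf / ti_power)         # 1.80
# The Ti ends up around 80% ahead in performance per watt in this rough take.
```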
 
I kinda feel for my 750-watt power supply with this card OC'd along with a 1700 or 1800 OC'd...

At least AMD stuff idles low, which is where the system will be 99% of the time.
 
If I remember correctly, RTG revamped its GPU drivers in November 2015 and hired Andrej Zdravkovic, an ATI veteran, as "Corporate VP of Software and Platform Engineering" to make better drivers. It's been working pretty well ever since; AMD's driver game has been approaching parity with Nvidia's.

Edit: An interview with the guy.

Well, let's hope this Vega project isn't a failure. I'm skeptical, naturally. But maybe if they've got a better team of folks working on it this time around...

I guess we'll see. This is a critical time for AMD, though. If they fail here... they could be done. And while I've generally preferred Nvidia cards (though I've had my share of Radeons too), I'd hate to see the industry become a monopoly.
 
Assuming that's true, why is Vega so late? AMD is going to be in the same position again as soon as NVIDIA launches Volta: having nothing in the high-end space.
 
Honestly, sir, I suspect they need Vega to be more impressive than that. I mean, I guess, if they price it right, that could sell for a while. But they really need a solid performance win somewhere. If they could at least match the 1080 Ti, I think Team Red would load up on Vegas, and AMD could make a tidy profit.
He said where he THINKS Vega will land. Obviously, AMD needs it to be as fast as possible, but that may not be how the chips actually land.
 
He said where he THINKS Vega will land. Obviously, AMD needs it to be as fast as possible, but that may not be how the chips actually land.

Of course. I didn't dispute what he said; I only pointed out that AMD really needs more than that. In other words, he's probably right (after all, Kyle is a lot smarter than I am), but I hope for AMD's sake that he's wrong.
 
Hmm... not a huge boost, but a better factory OC than I've seen on the 480. My question is: do these new Polaris revisions OC further than the originals did? I've been able to get an easy 1400MHz on pretty much any non-reference 480 or 470 I've laid hands on, so I'd like to see if that will change with the 500 series...

You must be lucky. I have never gotten above 1350-1375MHz. Heck, I like to torture my shit for stability, and I never felt comfortable going higher. My RX 470 in my other rig is rock stable at 1340MHz. Anything higher may run, but it's not really stable.
 
Compared to two 290Xs these are power misers :pompous:. What I would choose here depends on the system: SFF, the 1060; mid to full tower, the 580. To me it looked like the 580 did outperform, but not in a very noticeable way unless you constantly play Serious Sam VR. Got The Second Encounter and it is better than the first in VR. Supposedly Serious Sam 3 VR is right around the corner - awesome. All the Serious Sam VR titles do CFX/SLI in VR - I'd be interested to see how the Radeon does in CFX VR - future high-definition VR will probably need two cards to work well. Nvidia SLI VR is kinda weak in my experience.

As for Vega, who knows; if it OCs well over default (damn the power), it may be a great performer in the end.
 
GPU-Z power draw is not a reliable thing to go by; I would rather look at the average than the max. It records any spikes in power, so your average may be less than 200W, but because of spikes you are stuck with one high maximum. My RX 470 will record spikes up to 160-170W or even 200W, but on average it stays around 120-130W or so.
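A quick illustration of that point; the sample values below are made up, not real GPU-Z logs:

```python
# A handful of short transient spikes pull the recorded maximum way up
# while barely moving the average. Sample values are invented.
samples = [125] * 995 + [170, 180, 190, 200, 210]   # watts

average = sum(samples) / len(samples)
maximum = max(samples)

print(f"average: {average:.1f} W")   # ~125.3 W
print(f"maximum: {maximum} W")       # 210 W
```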
 
Some of these cards are not priced badly at all, though. The Gigabyte version is $260, and it runs at a 1440MHz boost stock as well.
 
The reprojection % numbers in the article impressed me. AMD managed to reach sub-1% reprojection in most games with just a clock speed bump.

Am I missing something?
 
I'm not sure I understand people "not getting" this card...?

It's a slight refresh with a new number: a little faster, at the same price as the old model... it was never intended to be a new architecture or process. It gives them a few months of having the fastest mid-range card before NV releases their refresh. Makes perfect sense to me.
 
At $229 it seems pretty damn solid, IMO. I may get one for the wife's rig. That massive voltage jump, though, makes me curious whether it can be undervolted at all and remain stable, to drop that power consumption.

Might as well buy a 480 then..
 
I'm not sure I understand people "not getting" this card...?

It's a slight refresh with a new number: a little faster, at the same price as the old model... it was never intended to be a new architecture or process. It gives them a few months of having the fastest mid-range card before NV releases their refresh. Makes perfect sense to me.

The card itself is a fine product. The name/number AMD's marketing moonies came up with was silly, however.
 
The reprojection % numbers in the article impressed me. AMD managed to reach sub-1% reprojection in most games with just a clock speed bump.

Am I missing something?

You really want 1070 performance or better so you can crank AA or resolution; otherwise it's hard to even read text.
 
I think this is where most people expect it to be. I'm hoping for Vega to be really close to the 1080 Ti, but at 1080 pricing.

If it performs closer to a GTX 1080 Ti, it will be priced like one, not like the GTX 1080. This is absolutely certain.

The GTX 1080 Ti performs more than 100% faster than an RX 580.

Thus, if Vega performed that well at that kind of price, the RX 580 would have to drop to a lower price.

Flagship products always have the worst price-to-performance. Vega priced at $499 for double the performance is just too low a price and makes the value proposition of the RX 580 look bad.

An example of this is the 280X vs. the 290X: 25% better performance for 83% more money. Same story with the Fury X vs. the 390X. A product with 100+% more performance is going to cost a lot more than just double the price.

So if it has GTX 1080 Ti performance, expect the pricing to be $650+. Plus, look at the die size, the HBM, and the water cooling: at $499, you're just not making money.
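Rough perf-per-dollar math behind that argument. The performance index and the hypothetical Vega prices below are assumptions for illustration, not leaks or measured data:

```python
# Perf-per-dollar comparison. Performance numbers are rough relative
# indices (RX 580 = 1.0) and the Vega entries are purely hypothetical.
cards = {
    "RX 580 ($229)":  {"price": 229, "perf": 1.0},
    "1080 Ti ($699)": {"price": 699, "perf": 2.1},   # "more than 100% faster"
    "Vega at $499":   {"price": 499, "perf": 2.0},   # hypothetical
    "Vega at $649":   {"price": 649, "perf": 2.0},   # hypothetical
}

for name, c in cards.items():
    print(f"{name:15s} perf per $100: {100 * c['perf'] / c['price']:.2f}")
# RX 580: 0.44, 1080 Ti: 0.30, Vega@$499: 0.40, Vega@$649: 0.31
# A $499 Vega at ~2x performance would nearly match the midrange card's
# perf-per-dollar, which flagships almost never do; $649+ fits the usual
# flagship premium.
```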
 
I'm in for two at between 1080 and 1080 Ti performance.

Same here. I just want something in the ballpark of 1080 Ti performance that I can take advantage of FreeSync with. As long as it's 80-90% of what the 1080 Ti offers in performance, I would be very happy with that. And if they go beyond that, even better.
 
You really want 1070 performance or better so you can crank AA or resolution; otherwise it's hard to even read text.

What I meant is that in every single VR game tested, the 580 had 50% to 2700% less reprojection than the 480. I did not understand those results coming from just a 10% clock bump.
 
The only way this "rebrand" would've been acceptable is if AMD released VEGA with it. Tired of Rebrandeon not even trying to compete on the GPU side.
 
Returning my RX 470 to Best Buy as soon as I get the RX 580. Pulled the trigger on the Gigabyte AORUS for $260. Will let you all know how it goes.
 
What I meant is that in every single VR game tested, the 580 had 50% to 2700% less reprojection than the 480. I did not understand those results coming from just a 10% clock bump.

Reprojection happens when the frame rate drops under 90 FPS. If you're borderline, it doesn't take much extra to stay above it; 10% is around 10 FPS here. So if you're at a 110 FPS average vs. 100 (or what have you), your dips won't go under the 90 mark nearly as often.
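A toy simulation of that threshold effect; the frame-rate distribution below is synthetic, not data from the review:

```python
# Frames that dip under 90 FPS get reprojected, so a ~10% bump in average
# FPS can wipe out most of the dips when you're sitting near the threshold.
import random

random.seed(0)
THRESHOLD = 90.0   # HMD refresh rate the frame rate has to stay above

def reprojection_pct(mean_fps, jitter=12.0, n=10_000):
    """Percent of synthetic frames landing under the 90 FPS threshold."""
    below = sum(1 for _ in range(n) if random.gauss(mean_fps, jitter) < THRESHOLD)
    return 100.0 * below / n

print(f"avg 100 FPS: {reprojection_pct(100):.1f}% reprojected")   # ~20%
print(f"avg 110 FPS: {reprojection_pct(110):.1f}% reprojected")   # ~5%
```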
 
So I should pretty much still buy an R9 Fury for $240 over this. Damn, AMD.

No way. If [H] actually included Fury results in those benches, you would find that RX 480 + 10% is about where the Fury is. There are also major outliers in Fury performance; NV-sponsored games typically run badly on it, like Mafia 3, Dishonored 2 (the RX 480 is faster than the Fury X) and Mass Effect Andromeda. Basically, Fury X performance can be gimped badly by heavy tessellation, but Polaris-based GPUs don't care about tessellation.
 
Soooo, a fair question for Kyle:

The PowerColor Red Devil RX 580 Golden runs at a consistent boost of 1425MHz and has a list price of $269, yet didn't get an award.

What exactly were your price/performance criteria/expectations for this card in order for it to have received either a Bronze, Silver, or Gold award?

From the outside looking in, some readers could fairly conclude you already had a preconceived opinion before the first test was even run.
 