[TPU] Radeon RX 480 Cards Can Successfully be Flashed to RX 580

Araxie | Supreme [H]ardness | Joined: Feb 11, 2013 | Messages: 6,463
Essentially, this may allow you to bypass some artificial overclocking limitations on your graphics card, likely via increased voltages on the card's different power states. You should do this at your own risk, and remember, the only guaranteed way of getting an RX 580 is... you guessed it, buying an RX 580. However, this might also give you an extra performance boost, and free performance is always good, right?

RX 480 can successfully be flashed to RX 580

So, any RX 480 owners here at [H] wanting to have some fun?...
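
For anyone who hasn't done a BIOS flash before, the sequence is basically: back up the current ROM, then program the new one. Here is a minimal sketch of that workflow driven from Python; the atiflash switches (-i to list adapters, -s to save, -p to program, -f to force) are the commonly cited ones, and the adapter index and ROM filenames are assumptions, so check your own ATIFlash version's help output before trusting any of it.

```python
# Rough sketch of the usual ATIFlash back-up-then-flash sequence.
# Assumed CLI switches: -i = list adapters, -s = save BIOS, -p = program BIOS, -f = force.
# The adapter index and filenames are hypothetical; verify against your own setup.
import subprocess

ADAPTER = "0"                      # first GPU as reported by "atiflash -i"
BACKUP_ROM = "rx480_backup.rom"    # keep this somewhere safe
NEW_ROM = "rx580_target.rom"       # BIOS you intend to flash (hypothetical name)

def run(*args):
    """Run ATIFlash and stop immediately if any step fails."""
    print(">", " ".join(args))
    subprocess.run(args, check=True)

run("atiflash", "-i")                         # confirm the adapter index first
run("atiflash", "-s", ADAPTER, BACKUP_ROM)    # dump the current RX 480 BIOS
run("atiflash", "-p", ADAPTER, NEW_ROM)       # program the RX 580 BIOS
# Some cross-model flashes reportedly need "-f" to override subsystem ID checks:
# run("atiflash", "-p", ADAPTER, NEW_ROM, "-f")
```

The force override is the one people lean on for cross-model flashes, and also the one most likely to leave you with a brick if the ROM is wrong.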
 
I'm more concerned about the PCI-E power draw than anything, which has probably been completely fixed on the RX 580.

Yep, and they have. When you look at Tom's power distribution for gaming and also the torture test, the PCIe motherboard current is now pretty stable and within margins.
And importantly, this is for a 1450MHz 580 model.

http://media.bestofmicro.com/0/E/668750/original/07-Boost-Mode-Gaming-Power-Consumption.png

http://media.bestofmicro.com/0/C/668748/original/09-Boost-Mode-Torture-Power-Consumption.png

http://media.bestofmicro.com/0/H/668753/original/10-Boost-Mode-PEG-Utilzation.png


So yeah looks much better with the 5xx.

Cheers
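
To put "within margins" in numbers: the PCIe slot's 12 V pins are specced for 5.5 A, roughly 66 W, and the reference RX 480 was famously pulling past that. A quick sanity check, with purely illustrative current values (not read off the charts above):

```python
# Quick margin check against the PCIe slot's 12 V limit (5.5 A per the CEM spec,
# roughly 66 W). The measured_amps values below are purely illustrative.
SLOT_12V_LIMIT_A = 5.5
SLOT_VOLTAGE = 12.0

def slot_margin(measured_amps: float) -> None:
    watts = measured_amps * SLOT_VOLTAGE
    limit_w = SLOT_12V_LIMIT_A * SLOT_VOLTAGE
    headroom = limit_w - watts
    status = "within spec" if measured_amps <= SLOT_12V_LIMIT_A else "OVER spec"
    print(f"{measured_amps:.1f} A -> {watts:.1f} W of {limit_w:.0f} W ({status}, "
          f"{headroom:+.1f} W headroom)")

slot_margin(6.8)   # illustrative: a card leaning too hard on the slot
slot_margin(5.0)   # illustrative: most of the board power shifted to the 6/8-pin
```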
 
That in itself probably helped get the clocks higher. GCN has just run its course; it's really time for a new architecture.
 
Would this even be reliable without the extra power cable for the card? I'm not sure I'm [H] enough to risk it.
 
I would not bother wasting time on this for a couple of hundred megahertz.
Then again, I am just a "Limp Gawd," so I may not have valuable input.
 
A lot of people wonder why we didn't see more than 2300 shaders on Polaris. I think AMD knows the limits of these shaders and current-gen GCN, and they probably realized a while back that they had to make some changes to the CUs. There is a reason the Vega presentations talk about being designed for higher IPC and higher clocks, and give it a new name, NCU. AMD could have easily tried more shaders and had an RX 490 or 590, but I think they didn't want to waste any more time on Polaris trying to turn it into high end where it would have diminishing returns.
 
A lot of people wonder why we didn't see more than 2300 shaders on Polaris. I think AMD knows the limits of these shaders and current-gen GCN, and they probably realized a while back that they had to make some changes to the CUs. There is a reason the Vega presentations talk about being designed for higher IPC and higher clocks, and give it a new name, NCU. AMD could have easily tried more shaders and had an RX 490 or 590, but I think they didn't want to waste any more time on Polaris trying to turn it into high end where it would have diminishing returns.

The name NCU is pretty much slideware. AMD refers to it as GCN5.

Polaris in the Scorpio console got 44 CUs (2816 SP) with 40 enabled (2560 SP).

The reason is more likely that AMD can't afford to make a bigger Polaris chip on their own. It would also likely kill Vega 11 directly and put Vega 10 in a very bad spot if we imagined a Polaris with 44 or more CUs and a 384-bit bus. Internal conflict of interest as well.
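
For reference, the shader counts being thrown around here fall straight out of GCN's 64 stream processors per CU, so the "2300 shaders" above is really 2304 (36 CUs). A quick check:

```python
# GCN packs 64 stream processors (shaders) into each Compute Unit,
# so the shader counts quoted in the thread are just CUs * 64.
SP_PER_CU = 64

for label, cus in [
    ("Polaris 10 / RX 480 (36 CUs)", 36),   # the "2300 shaders" above, exactly 2304
    ("Scorpio die, full (44 CUs)", 44),     # 2816 SP
    ("Scorpio, enabled (40 CUs)", 40),      # 2560 SP
]:
    print(f"{label}: {cus * SP_PER_CU} stream processors")
```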
 
The name NCU is pretty much slideware. AMD refers to it as GCN5.

Polaris in the Scorpio console got 44 CUs (2816 SP) with 40 enabled (2560 SP).

The reason is more likely that AMD can't afford to make a bigger Polaris chip on their own. It would also likely kill Vega 11 directly and put Vega 10 in a very bad spot if we imagined a Polaris with 44 or more CUs and a 384-bit bus. Internal conflict of interest as well.

Spot on!!
 
The name NCU is pretty much slideware. AMD refers to it as GCN5.

Polaris in the Scorpio console got 44 CUs (2816 SP) with 40 enabled (2560 SP).

The reason is more likely that AMD can't afford to make a bigger Polaris chip on their own. It would also likely kill Vega 11 directly and put Vega 10 in a very bad spot if we imagined a Polaris with 44 or more CUs and a 384-bit bus. Internal conflict of interest as well.
That is too bad, instead of using Vega. Another skipped console for me. Let me guess, special Bulldozer cores too?
 
That is too bad, instead of using Vega. Another skipped console for me. Let me guess, special Bulldozer cores too?

It uses evolved Jaguar cores at 2.3GHz. Why is it too bad that it doesn't use Vega? What if Vega barely brings anything to the table besides compute features? Just a waste of die space then.

The Scorpio also got draw-call offloading to the GPU/DSP via a feature MS had added. Something you won't see in PC products for the time being, unfortunately. But a clever way to solve a software problem with hardware.
 
It uses evolved Jaguar cores at 2.3GHz. Why is it too bad that it doesn't use Vega? What if Vega barely brings anything to the table besides compute features? Just a waste of die space then.

The Scorpio also got draw-call offloading to the GPU/DSP via a feature MS had added. Something you won't see in PC products for the time being, unfortunately. But a clever way to solve a software problem with hardware.
Well, it kinda hints that Vega is not the cat's meow. I guess we will find out shortly.
 
Well, it kinda hints that Vega is not the cat's meow. I guess we will find out shortly.

Not really. The Scorpio chip was in development for a while, a few years most likely; chances are Vega was just a pipe dream at the time the contracts were signed and the hands were shaken.
 
If you want to flash one, it had better not be a reference card. There is nothing [H] in doing that, just plain stupidity due to the power limitations.
 
If you want to flash one, it had better not be a reference card. There is nothing [H] in doing that, just plain stupidity due to the power limitations.
Yea, I'm thinking the triple-fan Asus and Devil models would be ideal for these experiments.
 
Lol, yea, either be comfortable with blind flashing to fix it or make sure it has a dual BIOS before considering doing this.

Blind flashing? If you experiment with video cards a lot, you generally have a couple lying around for just such situations... a PCIe x1-to-x16 connector with power can be your very best friend when you have limited slots. I imagine guys with single-slot mini-ITX builds and no built-in GPU on their procs, or with multi-GPU setups, might be flashing blind, but that's all that comes to mind at the moment.

I haven't been without a spare video card, well, since 1985 or so.

I do agree with others though, there's always a risk of cooking whatever you experiment on; if you're not willing to take the risk, don't play the game.
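
And if you do rely on a backup ROM as your safety net, it costs nothing to sanity-check the dump before flashing anything. A minimal sketch, with hypothetical filenames; the size threshold and the 0x55AA option-ROM signature check are just basic plausibility tests, not a guarantee the dump is good:

```python
# Minimal pre-flash sanity check: hash the BIOS backup and make sure it looks
# like a plausible video ROM before trusting it as your recovery image.
# Filenames are hypothetical; dump the BIOS twice and compare hashes if unsure.
import hashlib
from pathlib import Path

def check_backup(path: str) -> str:
    data = Path(path).read_bytes()
    if len(data) < 64 * 1024:
        raise ValueError(f"{path}: only {len(data)} bytes, not a full ROM dump")
    if data[:2] != b"\x55\xaa":
        raise ValueError(f"{path}: missing the 0x55AA option-ROM signature")
    digest = hashlib.sha256(data).hexdigest()
    print(f"{path}: {len(data)} bytes, sha256 {digest[:16]}...")
    return digest

check_backup("rx480_backup.rom")   # compare this hash against a second dump
```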
 