RX 480 post mortem - the good, the bad, and the ugly.

Here is how it goes. People see $200 and $229, and they see another card at $299. Even if the review says it's good performance for the money but it uses a lot of power, at that point people forget about power, trust me. Money is always first!

It depends on what nVidia does with the 1060. I'm certain it will be priced competitively with the 480 depending on where it lands on performance: if it's a little faster it will be a little more expensive, and if it's a little slower it will be a little cheaper. And power bills cost money as well, so power-efficient products have a big edge. If the power-efficiency comparison between the 480 and the 1070/1080 scales down to the 1060, the 480 is headed for a flop.
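To put rough numbers on the power-bill point, here is a back-of-envelope sketch. The 60W gap, 3 hours of gaming per day, and $0.12/kWh rate are illustrative assumptions, not figures from any review:

```python
# Rough electricity-cost difference between two cards (illustrative numbers).
watt_gap = 60          # assumed extra watts drawn by the less efficient card
hours_per_day = 3      # assumed daily gaming time
rate_per_kwh = 0.12    # assumed USD per kWh

kwh_per_year = watt_gap / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f}/year")
```

At those rates the gap is only a few dollars a year; whether that matters more than sticker price is exactly the argument in this thread.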
 
We will see how much of a "flop" the 480 will be. My local Microcenter sold all 35 cards by 1pm today, and I was told there was a line at the door before they opened. I was about 5 minutes too late to get one. It will be interesting to see how many of those are returned, if any. AMD seems to have a good-performing product that people can afford. I can only assume Nvidia will follow suit, maybe with the 1060. It will be fun to wait and see. I have been away from serious PC gaming for a while, but the new GPUs from both AMD and Nvidia are getting me excited again.
 

Exactly, people care about price. AMD can just block overclocking on reference cards and put a strict 150W limit on the board.

I mean, people here said the whole time, even the ones complaining now, that you can pull more power from the PCIe slot and it's not going to blow up over 10 watts. I don't have the time to search, but I read plenty. It's whatever; it's cheap, let people spend their money. All AMD has to do is patch the drivers and set up a warning disclaimer before people overclock. That's all. I think they already do that when you unlock overclocking.
 
Hmm, I think getting a custom AIB 970 right now is a better deal if the prices are close.

Most reviews I'm seeing where the 480 wins are against a "stock" 970. Well, most 970s have another 25% or so of headroom over stock.

A $239 8GB stock 480 or a $289 ($259 after MIR) custom AIB GTX 970.

We'll have to see how well AIB 480 does, and at what price.
 
No one in the $200 range of video cards gives two shits about power draw. The fastest card for the money they have is what they care about, and that's about it. I've been to Best Buy and Fry's and have seen that mentality first hand. Price always wins when it comes to consumers. Also, people will buy it over a 970 since it's a newer card and the 970 is an older card, and people prefer their electronics to be on the cutting edge. I expect these cards to sell like mad until Nvidia can launch the 1060, and then it will all depend on price and how well it performs.
 
Hmm, I think getting a custom AIB 970 right now is a better deal if the prices are close.

Most reviews I'm seeing where the 480 wins are against a "stock" 970. Well, most 970s have another 25% or so of headroom over stock.

A $239 8GB stock 480 or a $289 ($259 after MIR) custom AIB GTX 970.

We'll have to see how well AIB 480 does, and at what price.

With how Nvidia obsoletes cards? Buying a 970 now wouldn't be a wise investment. It'll probably be a legacy card within a year of the 1060 being released.
 
This has been disproven how many times already...
Actually, all that was proven was that there wasn't a regression as drivers progressed. Granted, this could just mean the cards were tapped out, but it in no way proves they didn't simply stop optimizing drivers for the 7xx series. I read each of those reviews; they set out to prove there was no regression, which they did, but at the same time they showed there was no positive growth at all, and they did not mention that.
 

Or it could just mean that AMD drivers were behind compared to Nvidia's. AMD's cards since GCN have always had more raw horsepower compared to their Nvidia counterparts, oftentimes by a big margin.

The performance gains you're seeing might very well just be untapped potential on existing AMD cards, wasted by poor release drivers.
 
Just saying, if you're going to do it, don't complain when I call you out by doing the EXACT SAME THING to point out the flaw in your logic. You pointed out the flaw in my logic. What you failed, and still fail, to realize is that I wasn't using my logic. I was using yours.

I have a different version of this that I use on people in person. It goes like this. "I'm going to repeat to you what you just said. Now, YOU tell ME how stupid I sound."
Are you daft? You weren't using any logic. The only things comparable between the 970 and the 780 are the brand and the fact that both product names feature the number "7". They are two cards launched at different price points, performance levels, and market segments. The 480 is comparable to the GTX 970 in performance and nothing else. The fact that this card is comparable to a mid-level card launched two years ago by the competition is the point. How you draw any conclusion from that other than "this launch is a massive failure" is beyond me.
 
And yet, it hasn't stopped the faster card (or even the slower card sometimes) from outselling the more efficient competition. See - Fermi.

Power only becomes relevant when your favored brand is suddenly the most efficient. Because I don't have a preferred brand, I don't care about power draw. I don't care if the RX 480 draws 50W, 100W, 200W, or even 250W. My PSU can handle it. I care about the performance and the price.

Not quite true.

That sentiment holds water (no pun intended if you use water cooling) when you KNOW your PSU is up to snuff. I.e., if you know your PSU can handle 1000W of GPU power, then whether the GPU draws 50W or 250W may not be at the top of your list of concerns, short of ensuring that your cooling is up to the task.

The ordinary layman probably doesn't have a clue what their PSU is rated for, let alone what quality it is (many people equate the 80 Plus rating with quality); all they care about is replacing a GPU without making their computer go nuclear. So a lower power draw is an advantage for THAT market segment, as the less power your GPU draws, the more people it is an option for (e.g. there will be a massively greater number of computers that can handle a 50W GPU than a 250W GPU).
 
Quite frankly, I don't care about power consumption. I care about the experience. If it takes 500 watts of power to push 4K at 60Hz or 120Hz, then so be it.

However, this is a double-edged sword, because if company A can provide the same experience while using less power than company B, that gives me a reason to consider it over the other option.

The RX 480 pushes 1080p and 1440p just fine while using less power than the 970. It costs the same as the previous R9 380X or Nvidia 950/960, yet it provides a better experience, on par with a card a tier higher (the 970). In terms of performance per dollar and performance per watt, the RX 480 is superior at its price point.

Sure, the 1070 uses a similar amount of power, but it also costs more. With the AIB RX 480 cards holding more promise than the reference card, they will be cheaper while offering an experience closer to that of the 1070. Granted, these cards will consume more power, but who really cares if it's a bit more? Again, it boils down to experience per dollar.
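The experience-per-dollar argument above boils down to two ratios. A quick sketch, using placeholder performance, price, and power numbers (not measured results), shows how you might compare cards on both axes:

```python
# Back-of-envelope perf-per-dollar and perf-per-watt comparison.
# All numbers are illustrative placeholders, not benchmark results.
cards = {
    "RX 480 8GB": {"perf": 100, "price": 239, "watts": 165},
    "GTX 970":    {"perf": 100, "price": 280, "watts": 160},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price']:.3f} perf/$, "
          f"{c['perf'] / c['watts']:.3f} perf/W")
```

With equal performance assumed, the cheaper card wins perf/$ and the lower-draw card wins perf/W; plugging in real review numbers is what settles the argument.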
 
This is pretty much the 5770, which led to me coughing up more money for a 470. So it looks like a 1070 is in my future. Heh.
 
I've seen speculation that Zen would surpass HW-E and go head to head with BW-E.

Check that - now up to even with Skylake
LOL, random forum comments are pretty far from even WCCF-style speculation. You might as well go full-bore strawman mode and just make stuff up if you're going to do that.

The actual realistic speculation hasn't changed since the "40% improved IPC over Excavator" commentary came out, which is where the comparisons to Haswell and Broadwell come from (Broadwell is generally 5% or less faster than Haswell per clock).

It won't meet or beat Skylake (which is 5-10% faster than Haswell), and that goes double for Kaby Lake, which is supposed to be 10% faster per clock than Skylake. If they can actually achieve Haswell-esque performance per clock and get 3-4GHz clocks (as rumored) without insane power usage, then AMD has a clear-cut winner on their hands. Yes, they'll lose the synthetic benchmarks against Skylake and Kaby Lake, but in games and general use you won't notice a difference. They'll have to price them lower than the respective Intel chips of course, but they would still get their ASPs much higher than they are now, so it would be a big financial win for AMD, and for end users too.

I'm hoping, HOPING, not expecting, we'll see ~3GHz 8C/16T chips for $200-300 with default TDPs of 90-120W and Haswell-ish single-thread performance. Honestly though, the information we're seeing so far about the RX 480's power usage isn't a good sign for GF's new process. Let's all just hope it's new-process teething issues plus GCN's tendency to run hot and eat lots of power while getting ho-hum clockspeeds at work here. Otherwise Zen will end up using as much or more power than the top-clocked Vishera chips (220W).
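The per-clock deltas quoted above compound multiplicatively, so here is the arithmetic spelled out. The exact percentages (5%, the midpoint of the quoted 5-10%, and the rumored 10%) are this post's rough figures, not benchmarks:

```python
# Compounding the rough per-clock (IPC) deltas quoted in this thread.
# Haswell is the baseline (1.00); all deltas are the post's rough figures.
haswell   = 1.00
broadwell = haswell * 1.05    # "5% or less faster than Haswell"
skylake   = haswell * 1.075   # midpoint of the quoted 5-10% over Haswell
kabylake  = skylake * 1.10    # rumored +10% per clock over Skylake

for name, ipc in [("Haswell", haswell), ("Broadwell", broadwell),
                  ("Skylake", skylake), ("Kaby Lake", kabylake)]:
    print(f"{name}: {ipc:.3f}x Haswell per-clock performance")
```

So by these rumors a Haswell-level Zen would trail Kaby Lake by roughly 18% per clock, which is why pricing below Intel would still be necessary.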
 
A cheap card that draws a lot of power that's significantly slower than top end models just looks bad and hurts the value proposition of a low end card.
The RX 480 is their new mid-to-low-range product though, so of course it'll lose to the top-end models no matter what.

Why the hell do people keep trying to compare the RX 480 to the 1070 or 1080? It's the 1060 that it will have to go up against. If Nvidia follows through with their current trend of pricing cards higher this gen (very vague rumors put the 1060 around $300; we'll have to wait and see), I don't think AMD has much to worry about, really, since the mid-to-lower-end market is very price sensitive.

Vega is what you'll have to wait for if you want something on the high end to compare to though.
 
For the ordinary layman,
Ordinary laymen aren't in the dGPU market though. They go with whatever their computer came with when they bought it new, and when something starts to get out of kilter or isn't up to snuff anymore, they shrug and buy a new PC.

The sort of person who is interested in and able to swap out dGPUs for a new model is also the sort of person who doesn't mind upgrading their PSU if necessary, so long as the cost is in line with their budget, of course.

But even though the RX 480 is drawing more power than expected, the amount it pulls isn't all that high, really. HardOCP's review has total gaming system load at 249W for the RX 480; TechReport has it at 262W. So an OK 300W PSU should be up to the task, and most current enthusiast PSUs will be more than up to it, so this isn't an issue at all.
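A quick sanity check on that claim, using the HardOCP (249W) and TechReport (262W) total-system figures. The percentages are simply load relative to PSU rating; a common rule of thumb is to keep sustained load under roughly 80% of rating:

```python
# PSU headroom check using the reviewer-measured total system draws.
def load_fraction(system_watts, psu_rating):
    """Fraction of the PSU's rated output a given system draw represents."""
    return system_watts / psu_rating

for psu in (300, 400):
    for draw in (249, 262):   # HardOCP / TechReport total-system figures
        print(f"{draw}W draw on a {psu}W PSU: "
              f"{load_fraction(draw, psu):.0%} load")
```

By the 80% rule a 300W unit is cutting it close (83-87% load) while a 400W unit is comfortable, which roughly matches the conclusion above for typical enthusiast PSUs.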
 
The problem is not its power draw but where it is drawing it from: it pulls too much power from the PCIe slot.
 
That 290 must be at 330+ watts at those speeds, no? Damn, that is actually a good overclock.
1250MHz is an exceptional overclock on a 290. Most can't get much over 1100MHz without LN2. And yes, it'll probably chew up 300W+ unless you got a golden die, so good water cooling will be a necessity.
 
The problem is not its power draw but where it is drawing it from: it pulls too much power from the PCIe slot.
The person I was replying to was talking about upgrading the PSU though. Thread and reply context is something you HAVE to consider, or else the thread turns to crap!

And yes, it's drawing a bit too much power through the PCIe slot, but I wouldn't worry about that in a single-card system, which is what nearly everyone will use with these cards. CF is very uncommon, and THAT is where the PCIe slot power draw could indeed be a problem. If you're really worried about it, I'd say wait for AIB cards with an 8-pin power connector, which should be coming soon.
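For reference, the in-spec power budget comes straight from the PCIe CEM limits: 75W from the x16 slot, plus 75W per 6-pin or 150W per 8-pin auxiliary connector. A tiny sketch of the budget math:

```python
# In-spec board power budgets under the PCIe CEM limits:
# x16 slot = 75W total, 6-pin aux = 75W, 8-pin aux = 150W.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def budget(aux_connectors):
    """Max in-spec board power for a card with the given aux connectors."""
    return SLOT_W + sum(aux_connectors)

print("6-pin card:", budget([SIX_PIN_W]), "W")    # reference RX 480 layout
print("8-pin card:", budget([EIGHT_PIN_W]), "W")  # expected AIB layout
```

This is why an 8-pin AIB card sidesteps the issue: the same board power fits easily inside a 225W budget with far less of it pulled through the slot.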
 
That, and AMD can easily put out an updated BIOS for those doing CF and wanting to be on the safe side. And like already mentioned, for single cards it's a non-issue.
 
First full 480 Crossfire review. The short of it is, I'm glad I went with a single 1080 over the two 480s I was considering. Even a single 1070 looks to be a more stable pick.

RX 480 Crossfire Performance: GTX 1070 Killer?
I used to run SLI systems. Frankly, all multi-GPU setups are a pain that requires tweaking almost every game to get optimum performance, so for the first time in a long time I went with one card, one GTX 1080, and while I won't get perfect 4K, I get perfect 1440p. DX12 may make multi-GPU great, but that is definitely a wait-and-see.
 
First full 480 Crossfire review.

Single vs. dual card is always like that; how could you be surprised? AMD didn't say anything about "fixing" Crossfire with Polaris 10, so all the same old limitations apply here. The situation will improve some as the drivers improve, but two RX 480s will still frequently lose to the 1070 or 1080 in lots of games because CF isn't working due to lack of support.

Maybe DX12-enabled games will shake things up with CF and SLI, but I wouldn't count on it, and even if they do, such a shake-up seems years away at this point.
 


After experience using Crossfire and SLI over the past 12 years (my last card, a 295X2, I very reluctantly bought used at a steal), I'm not surprised at all. What I do find surprising is that every new gen there are people who buy into multi-GPU as a cheaper route to top-end performance. It's like people see the high FPS and forget about everything else. I've seen tons of "I might get dual 480s because it'll be cheaper" posts over the past months here.

Hopefully there will be a bunch of Crossfire/SLI reviews this round (and I'm sure there will be people looking at 1060 SLI as a "1080 beater").

As for DX12, we've heard these sorts of promises over and over; they never pan out.
 
I'm not surprised at all.
OK sorry, misread the tone of your post then.

What I do find surprising is every new gen there are people who buy into multi-GPU as a cheaper solution for top-end performance. It's like people see the high FPS and forget about everything else.
If it happens to work with the games you care about and you NEED more performance in them, then SLI or CF does make some sense, so long as you know what you're getting into. I've run CF before myself too, but it was a long time ago and only because I got the cards cheap as hell.

As for dx12, we've heard these sorts of promises over and over - they never pan out.
DX12 has a halfway decent shot at fixing multi-card performance issues because much of the work needed to make it happen is going to be done by the game engine designers. If the game engine supports it, the work needed to make a game multi-adapter compatible should be dramatically reduced. The rub is that game developers still need to do some extra work to make it happen, and it's entirely possible they still won't bother, since so few people use CF or SLI. So cynicism is certainly warranted here IMO, but I don't think it's total BS.
 
This is the main reason why I don't believe it will get much better. As it is, we are lucky if a dev bothers optimizing their game well, let alone building multi-GPU support into the engine. Right now, the few DX12 titles we have gotten perform worse than their DX11 counterparts.
 

This is the reason for my extreme scepticism as well. Who is more motivated to ensure Crossfire/SLI support: the game designer (for whom multi-GPU users are a minute fraction of the market) or the company trying to sell the hardware? I'm pretty sure it's the latter, and if multi-GPU is left in the hands of devs, support will be an even more decentralized mess. People will scream, I'm sure, on that game's boards, but ultimately many devs/publishers will look at the profit/loss angle and say, "We looked at doing multi-GPU support and decided not to support it in order to provide the best experience for the gamer, blah, blah, blah..." Look at the MSFT Store for a forward-looking example: they've entirely disabled multi-GPU for games on their platform.
 
Yeah, pretty much.

But I think a lot of people would've ponied up $330 for a GTX 970 in September 2014, instead of $230 for an RX 480 two years later in 2016, if they had known the mid range wouldn't move much.
I wish I could have known what I know today!
 
The only time this has happened was with Fermi, and that's because it was still the fastest card by a large margin. Actually, at the Fermi 480 launch, nV lost market share for one quarter... So yeah, power matters.
It is not power alone. The Fermi 480 was also as loud as a vacuum cleaner going full throttle. The same thing plagued the 290X/290.

The RX 480's power draw is admittedly higher than I would expect for a 150W TDP card(*), but its consumption is still squarely in the range most systems are ready for. And at stock the fan is reasonably quiet. This complete package is what counts.

At OC, the fan-and-cooler combo does not fare well, but neither really do the 1080/1070.

(*) I would not expect it to hit 150W all the time. It looks to me like 1266MHz is already in the zone of climbing the power wall, and they had to employ some Nano-style power-management techniques to not break PCIe spec.
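The "power wall" intuition can be made concrete with the classic dynamic-power relation, P ≈ C·V²·f: pushing frequency usually also requires more voltage, so power rises much faster than clock speed. The voltages below are illustrative, not measured RX 480 values; only the 1120MHz base and 1266MHz boost clocks come from the card's spec:

```python
# Dynamic power scales roughly with frequency times voltage squared.
# Baseline: RX 480 base clock (1.12 GHz) at a nominal 1.00V (assumed).
def rel_power(f_ghz, volts, f0=1.12, v0=1.00):
    """Power relative to the baseline operating point, via P ~ V^2 * f."""
    return (f_ghz / f0) * (volts / v0) ** 2

print(f"1.266 GHz @ 1.08V: {rel_power(1.266, 1.08):.2f}x base power")
print(f"1.350 GHz @ 1.15V: {rel_power(1.35, 1.15):.2f}x base power")
```

Under these assumptions, roughly +7% clock at +7% voltage costs on the order of 20% more power, which is why the boost clock sits right at the edge of the card's power budget.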
 
No one in the $200 range of video cards gives two shits about power draw. The fastest card for the money they have is what they care about, and that's about it. I've been to Best Buy and Fry's and have seen that mentality first hand. Price always wins when it comes to consumers. Also, people will buy it over a 970 since it's a newer card and the 970 is an older card, and people prefer their electronics to be on the cutting edge. I expect these cards to sell like mad until Nvidia can launch the 1060, and then it will all depend on price and how well it performs.

They will, when they get home, slap the card in their eMachine and realize it's not going to work because *ding ding ding* power draw.

Not knocking the card, as it does represent great value for a 1080p gamer. What's interesting is the whole out-of-spec power draw debacle, which might turn into a big thing. Those 970s might still have some life left in them, because most people know you can crank that bastard to a 20% overclock rather easily.
 
Definitely not an HTPC card. Will have to wait to see what the GTX 1060 brings to the table.
 
And this totally kills the point of a performant budget card. That kind of product simply shouldn't be in the same power envelope as the fastest cards out right now.
This sounds more like the fashion police than anything else. Are you wearing sneakers with that outfit when you know the only suitable things are flip-flops? So a performant budget card now has to have the best power draw ever or should not be released? That is a somewhat screwed-up view.

The 1060 will be price and performance competitive with the 480 while drawing substantially less power. I think that's all but certain.
Not only that, it will do the dishes, the drivers will never crash, it will be $150, and it will have half the power draw of the RX 480, because, you know, Nvidia...
 
Definitely not an HTPC card. Will have to wait to see what the GTX 1060 brings to the table.
It should be fine on a decent SFF PSU. I have a 450W SFF unit and have been running an R9 285 on it, which is rated at 190W board power, with zero issues. I've even overclocked it from the mid-900s to 1060MHz with no problems. This is in a mITX NCASE M1 case with a single fan, too.
 
This sounds more like the fashion police than anything else. Are you wearing sneakers with that outfit when you know the only suitable things are flip-flops? So a performant budget card now has to have the best power draw ever or should not be released? That is a somewhat screwed-up view.

Performance per watt and power consumption are pretty important; otherwise, why do reviewers always look at them? I'm not saying that lower-end cards need to be the most efficient, but it looks like this iteration of Polaris is being badly beaten in performance per watt by this iteration of Pascal. In a case where other factors like initial cost and overall performance are similar or close, be it budget or high end, why go with the thing that's delivering significantly less performance per watt?
 

Because, to put it bluntly, high-cost graphics cards are typically not worth what they cost or what a person pays for them. (That is not subjective unless you are a billionaire or something.) It is a shame: cards used to cost less at the high end so more folks could enjoy them, but now it is just "let's make all we can off the PC gamer" while not bothering to make the games fully optimized for the PC platform.
 

If the 1080 was the best- or fastest-selling card for Nvidia, how are prices too high? There always was and always will be a premium for the best-performing card; that was true even back in the day. The 8800 Ultra was $830+, the 8800 GTX $600+.
 