PowerColor Devil 13 Dual Core R9 390 Video Card Review @ [H]

how do you figure?

my microwave sucks 1500 watts (12.5A) on a 15-amp circuit...

30A at 120V is 3600 watts...
 
This further highlights an issue that I've seen creeping up for a few months now on nVidia's part. The 390 and 390X are doing quite well as a performance/price value proposition, and nVidia has no good answer in that price range.

The 970 has performed well as a midrange soldier for a long time, but its weaknesses are starting to show. Sure, it's cheaper than the 390, but only by about $20 or so. Not a bad buy at $280 to $300, but now it's suddenly getting caught with its 3.5 GB pants down.

If an nVidia customer decides, "Okay, I'll just step up--the 970 was never envisioned for play at 4K anyway," what's his option? There's nothing but barren sand and blowing tumbleweeds in the nVidia product stack all the way up to the 980 at $480. For that kind of gap, unless HDMI 2.0 is a concern, the 390X makes better sense.

I think nVidia needs a 970Ti at around $400 to fill this hole in their lineup. Make it a slightly cut-down 980 to soak up some defective 980 chips, but have the PCB with a "true" 4GB design. Of course, if they wanted to just price-cut the 980 to $400, I don't think anyone would complain.
 
...Of course, if they wanted to just price-cut the 980 to $400, I don't think anyone would complain.

We will probably see some price wars in the not too distant future. I was really hoping it would have started by now, since I would like a vanilla Fury at a cheaper price point.
 
This further highlights an issue that I've seen creeping up for a few months now on nVidia's part. The 390 and 390X are doing quite well as a performance/price value proposition, and nVidia has no good answer in that price range.

The 970 has performed well as a midrange soldier for a long time, but its weaknesses are starting to show. Sure, it's cheaper than the 390, but only by about $20 or so. Not a bad buy at $280 to $300, but now it's suddenly getting caught with its 3.5 GB pants down.

If an nVidia customer decides, "Okay, I'll just step up--the 970 was never envisioned for play at 4K anyway," what's his option? There's nothing but barren sand and blowing tumbleweeds in the nVidia product stack all the way up to the 980 at $480. For that kind of gap, unless HDMI 2.0 is a concern, the 390X makes better sense.

I think nVidia needs a 970Ti at around $400 to fill this hole in their lineup. Make it a slightly cut-down 980 to soak up some defective 980 chips, but have the PCB with a "true" 4GB design. Of course, if they wanted to just price-cut the 980 to $400, I don't think anyone would complain.

I don't think that hole is as prominent as you think, but I see your point. As much as I've dabbled here or there in SLI in the past, or bought top-end cards until recent years, at this point I just want the fastest cost-effective card I can get with a single GPU. Basically puts me in the mid to upper-mid range. I'm happy playing at 1920x1080, but I want to be able to push settings up to where I want them. I'm not a big AA/AO type of person either. I'm more interested in pushing shadows, lighting, shaders, geometry etc. I'm probably somewhat in the minority, but I know there are plenty of people like me still. The 970 is perfect for this. Probably even ok for 1440 in "most" cases. I haven't played a game yet that I've had a problem with given these criteria.

If I really wanted to upgrade (having already spent around $300 originally on the 970), $400-500 seems reasonable for the next step or two up in the current generation (like if I decided to get a higher-resolution monitor or something). I don't think a middle-step 970Ti would cut it if I really wanted to go to a higher resolution and still push the options I like. The 980 or the mid-range part from the next generation would make a lot more sense, I think.

I'm all for more options though, so it would be totally fine if they did release a 970Ti (better yet if it's Maxwell 2...); I just don't think it's very important the way things currently sit. My opinion, of course. I think my 970 will do just fine until the Pascal equivalent is released, at which point I'll pick one of those up. Games are definitely demanding more memory now, but there are cards for the resolutions that require it. I'm still not seeing it at 1080. Maybe my view is skewed a bit, but I see the 970 as squarely a 1080 card. A very capable one, but still 1080.
 
Another awesome review, guys. I expected the price to be higher; these custom dual cards can be insanely priced, but this one is priced right. When xfire works, this card is a beast. 8GB of VRAM is awesome for 4K too.
 
I don't think that hole is as prominent as you think, but I see your point. As much as I've dabbled here or there in SLI in the past, or bought top-end cards until recent years, at this point I just want the fastest cost-effective card I can get with a single GPU. Basically puts me in the mid to upper-mid range. I'm happy playing at 1920x1080, but I want to be able to push settings up to where I want them. I'm not a big AA/AO type of person either. I'm more interested in pushing shadows, lighting, shaders, geometry etc. I'm probably somewhat in the minority, but I know there are plenty of people like me still. The 970 is perfect for this. Probably even ok for 1440 in "most" cases. I haven't played a game yet that I've had a problem with given these criteria.

If I really wanted to upgrade (having already spent around $300 originally on the 970), $400-500 seems reasonable for the next step or two up in the current generation (like if I decided to get a higher-resolution monitor or something). I don't think a middle-step 970Ti would cut it if I really wanted to go to a higher resolution and still push the options I like. The 980 or the mid-range part from the next generation would make a lot more sense, I think.

I'm all for more options though, so it would be totally fine if they did release a 970Ti (better yet if it's Maxwell 2...); I just don't think it's very important the way things currently sit. My opinion, of course. I think my 970 will do just fine until the Pascal equivalent is released, at which point I'll pick one of those up. Games are definitely demanding more memory now, but there are cards for the resolutions that require it. I'm still not seeing it at 1080. Maybe my view is skewed a bit, but I see the 970 as squarely a 1080 card. A very capable one, but still 1080.

Your observations make sense from a usability standpoint. I just see that gap when you sort GPUs by price on Newegg: from the mid-$300s to the mid-$400s it's page after page of AMD 390 products with nothing from nVidia, and from a market standpoint that seems like a mistake to me. Especially since the gap seems to bear out in performance, not just price.

Of course, I say this as a 670 user (playing behind the curve on older games) who will be looking forward to what Pascal offers in the mid-range next year. So what do I know? :)
 
We will probably see some price wars in the not too distant future. I was really hoping it would have started by now, since I would like a vanilla Fury at a cheaper price point.

Yeah, I saw the note a couple of days ago about the supply problems for Fury chips loosening up, and it makes me wonder if they might start lowering prices. That could change things up all over the place.
 
It would have been appropriate to compare other top-end SINGLE CARDS, such as the 980 Ti/Titan X and Fury X. That's the point of making it a single card.
 
Thanks for the review here. Always liked how PowerColor went their own way with their Devil series. Moreover, they have exceptional customer service in my experience.

Results tell me the $450 I spent on a 295x2 last month was worth it, especially since I'm "only" playing at 2560x1440 and not limited by only 4GB. Disappointing to see CrossFire wackiness still going on in some games -- and it's inexcusable AMD doesn't even deign to answer your questions here.

Just curious, where did you get a 295x2 for $450? That's a pretty hot deal.
 
It would have been appropriate to compare other top-end SINGLE CARDS, such as the 980 Ti/Titan X and Fury X. That's the point of making it a single card.

There are no single cards that are $800. TITAN X is $200 more expensive, and 980 Ti and Fury X are $150 less expensive.

You can always compare performance by looking at our latest reviews using these video cards and extrapolating performance comparisons between them and the PowerColor Devil 13 Dual Core R9 390.
 
There are no single cards that are $800. TITAN X is $200 more expensive, and 980 Ti and Fury X are $150 less expensive.

You can always compare performance by looking at our latest reviews using these video cards and extrapolating performance comparisons between them and the PowerColor Devil 13 Dual Core R9 390.

So what? It would be even better to show how pointless this card is against something like a 980 Ti Gaming/G1/Matrix.
 
It's weird that you guys just noticed a performance impact from the 3.5GB of RAM.
 
It's weird that you guys just noticed a performance impact from the 3.5GB of RAM.

I still haven't. It's not really impacting me at all. Maybe some people are just trying to do too much with it or stretch it too far. At 1080 I've yet to play something that I couldn't push the settings to where I want them, and have it run smooth as silk. I can easily see it hitting a wall at higher resolutions with lots of AA etc. I guess I have different expectations of what a mid-range card should be able to do though.
 
It was necessary that they used 390s rather than 390Xs if someone ever wanted to build a quadfire setup on a single PSU and circuit. This beast consumed 716 watts (802W minus 86W for the rest of the system). Quadfire would draw 716 watts + 716 watts + 86 watts for the system = 1518 watts total. We are talking a 1600 watt TITANIUM PSU!

The even more power-hungry 390X pair consumes 756 watts (845W minus 86W for the system). Quadfire would draw 756 watts + 756 watts + 86 watts for the system = 1598 watts. Sorry, no PSU is that efficient.

Of course, quadfire sucks and always will anyhow, but the ePeen would be massive.

FYI: the 295x2 used "only" 640 watts (730W minus 90W for the system). Quadfire was around 1400 watts.
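
For anyone who wants to sanity-check the math, here is a quick back-of-the-envelope sketch in Python. The wattage figures are the ones quoted in the post above; the function name is just illustrative:

```python
# Rough quadfire wall-draw estimates, using the load figures quoted above
# (per-card draw = measured wall load minus the system baseline).

def quadfire_draw(card_pair_watts, baseline_watts):
    """Two dual-GPU cards plus the rest of the system, in watts."""
    return 2 * card_pair_watts + baseline_watts

print(quadfire_draw(716, 86))  # dual 390:  2*716 + 86 = 1518 W
print(quadfire_draw(756, 86))  # dual 390X: 2*756 + 86 = 1598 W
print(quadfire_draw(640, 90))  # 295x2:     2*640 + 90 = 1370 W
```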
 
So what? It would be even better to show how pointless this card is against something like a 980 Ti Gaming/G1/Matrix.
Because time is money and running all of those benchmarks on another card takes more time.
 
As noted, US electrical code calls for circuits to be designed for 80% continuous load. So a 15A-rated circuit is meant to carry 12A continuously. (Meaning, if you have 13 amps' worth of continuous load, say, lights, then you need a higher-rated circuit.) A 15A circuit can carry a 15A load; it may not be the smartest thing to do, but you could do it.

For grins, a 240V, 20A circuit would be designed for a continuous load of 0.8 * 240 * 20 = 3840 watts.
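
If it helps, the same 80% rule as a few lines of Python (a sketch only; the function name and circuit list are illustrative, not from any code in the review):

```python
def continuous_load_budget(volts, amps, derate=0.8):
    """Continuous-load limit under the 80% rule described above."""
    return volts * amps * derate

for volts, amps in [(120, 15), (120, 20), (240, 20)]:
    watts = continuous_load_budget(volts, amps)
    print(f"{amps}A @ {volts}V -> {watts:.0f} W continuous")
# 15A @ 120V -> 1440 W
# 20A @ 120V -> 1920 W
# 20A @ 240V -> 3840 W
```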
 
80% of a standard 20A/120V circuit is still almost 2000 watts.

And no one ever pays attention to code.

If the breaker trips continually, most people would just replace it with a slow-blow breaker.
 
80% of a standard 20A/120V circuit is still almost 2000 watts.

And no one ever pays attention to code.

If the breaker trips continually, most people would just replace it with a slow-blow breaker.

Or tape it so it won't trip. ;)
 
I facepalmed when I saw all the nonsense extras included... another bad decision added to the long list of bad decisions surrounding AMD products.
 
Kinda cool seeing this done with 16GB of VRAM. PowerColor has been doing some unique stuff, and it is good they get some attention. As for the card, I think some will find it crazy enough to own. At least AMD is making some headway with their drivers on GameWorks titles.

Love how the review tells it how it is with and without GameWorks, just the facts/data. If you own an Nvidia card, GameWorks may work for you. If you own an AMD card, GameWorks tends to bring poor performance and issues, which makes sense to me given the closed nature of how it is handled. How HardOCP did this review, and a few others with GameWorks titles, is perfect; I see no issue with using GameWorks titles that are very good games while presenting the results both with and without GameWorks. This lets readers make a better-informed choice on their own.

Now I wonder, percentage-wise, how many of the titles that need something like CFX/SLI to really max out can each solution actually handle? There are thousands of games old and new, and many could use the extra boost, especially at higher resolutions, to play well. I have no feel for this even though I run a CFX configuration. How much better is SLI or CFX than the other across a broad base of games? The data here is too small to support any kind of conclusion, which makes deciding whether to purchase a dual-GPU card harder.
 
Kinda cool seeing this done with 16GB of VRAM. PowerColor has been doing some unique stuff, and it is good they get some attention. As for the card, I think some will find it crazy enough to own. At least AMD is making some headway with their drivers on GameWorks titles.

Love how the review tells it how it is with and without GameWorks, just the facts/data. If you own an Nvidia card, GameWorks may work for you. If you own an AMD card, GameWorks tends to bring poor performance and issues, which makes sense to me given the closed nature of how it is handled. How HardOCP did this review, and a few others with GameWorks titles, is perfect; I see no issue with using GameWorks titles that are very good games while presenting the results both with and without GameWorks. This lets readers make a better-informed choice on their own.

Now I wonder, percentage-wise, how many of the titles that need something like CFX/SLI to really max out can each solution actually handle? There are thousands of games old and new, and many could use the extra boost, especially at higher resolutions, to play well. I have no feel for this even though I run a CFX configuration. How much better is SLI or CFX than the other across a broad base of games? The data here is too small to support any kind of conclusion, which makes deciding whether to purchase a dual-GPU card harder.
It primarily depends on what resolution you play at and what your target frame rate is.

If you are playing at 4K resolution, or have a 144Hz 1440p monitor, you will need CFX or SLI to play at max settings.
 
80% of a standard 20A/120V circuit is still almost 2000 watts.

I don't think I've ever seen a house wired for that.

Around here it's pretty much all standard 15A circuits, usually a single circuit shared between several outlets.

Maybe this is just because most of the homes around here are older construction; there is very little new construction here. My house was built in 1909, which is a little older than average. Not sure when the electrical was installed/updated, but it was more recently than that, as I don't have knob-and-tube wiring. (My previous house was built in 1830; it had newer wiring, but whoever did it was lazy and ran very few circuits, meaning I couldn't run the AC and the PC at the same time.)

Most homes around here are 60s-70s vintage. Oftentimes you have homes with just NEMA 1-15 ungrounded outlets :p

There is very little that is newer than that.
 
Great review! Can you CrossFire 2 of these cards? :D

Well, we've been talking about whether this would be possible on standard 120V 15A home wiring, without using dual power supplies, one each plugged into a different outlet on a different circuit :p

Consider this.

With a single card in the system, it drew 802W at load.

As has been mentioned before, you are only supposed to draw 80% of a circuit's rated power as a constant condition, only spiking into full load.

This means the total available power you can draw at the wall is 1440VA. Assuming we have an 80 Plus PSU (that is actually performing within spec), the requirement is at least a 0.9 power factor, meaning the most you can draw from a single 15A wall outlet into a PC power supply is about 1296W.

Not sure you'd be able to add a second one, without a second PSU, on a separate outlet residing on a separate circuit.
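
To put that reasoning in one place, a minimal sketch (assuming the 0.9 power factor and the review's wattage numbers quoted earlier in the thread):

```python
# Wall-side power budget for one 120V/15A circuit, per the reasoning above.
wall_va = 120 * 15 * 0.8        # 80% continuous-load rule -> 1440 VA
power_factor = 0.9              # 80 Plus minimum power factor assumed above
psu_input_watts = wall_va * power_factor  # ~1296 W of real power

two_cards = 2 * 716 + 86        # two Devil 13s plus the system baseline
print(psu_input_watts, two_cards)  # 1296.0 vs 1518 -> over budget
```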
 
Great review! Can you CrossFire 2 of these cards? :D

Even if it is possible, unfortunately we cannot test it right now. The card has to be returned; not something we like to do, but in these circumstances it was necessary. The chance of obtaining two of these cards is next to zero, and after all the electrical talk I'm not even sure it would work without tripping the breaker :eek:
 
Zarathustra[H];1041924114 said:
Not sure you'd be able to add a second one, without a second PSU, on a separate outlet residing on a separate circuit.
I can see testing done by begging your next-door neighbor for his socket so you can test the video cards without causing any shutdowns ;)
 
I can see testing done by begging your next-door neighbor for his socket so you can test the video cards without causing any shutdowns ;)

I can't read your user name without thinking of the Mentat Piter De Vries, and hearing these words:

"It is by will alone I set my mind in motion. It is by the juice of Sapho that thoughts acquire speed, the lips acquire stains, the stains become a warning. It is by will alone I set my mind in motion."

:)
 
Zarathustra[H];1041924085 said:
I don't think I've ever seen a house wired for that.

Around here it's pretty much all standard 15A circuits, usually a single circuit shared between several outlets.

Maybe this is just because most of the homes around here are older construction; there is very little new construction here. My house was built in 1909, which is a little older than average. Not sure when the electrical was installed/updated, but it was more recently than that, as I don't have knob-and-tube wiring. (My previous house was built in 1830; it had newer wiring, but whoever did it was lazy and ran very few circuits, meaning I couldn't run the AC and the PC at the same time.)

Most homes around here are 60s-70s vintage. Oftentimes you have homes with just NEMA 1-15 ungrounded outlets :p

There is very little that is newer than that.

Most likely due to old wiring.

Most new/updated construction is 15A for lighting, 20A for receptacles (or mixes), and 30A for appliances.
 
Most likely due to old wiring.

Most new/updated construction is 15A for lighting, 20A for receptacles (or mixes), and 30A for appliances.

So you actually have NEMA 5-20 receptacles in your home?

[Image: NEMA 5-20R receptacle]

I've only ever seen these in labs and manufacturing areas, never in a home.

And I have never actually seen a NEMA 5-20 plug in the flesh anywhere, ever.

[Image: NEMA 5-20P plug]
 
What would happen in games that do not support CrossFire/SLI? Does it just utilize one GPU?
 
Yes, it will revert to running as a single 390, although you have to turn CrossFire off manually; otherwise it runs terribly, hence why you see CFX running worse than the single card.
 