ASUS STRIX R9 380X DirectCU II OC Video Card Review @ [H]

FrgMstr

Just Plain Mean
Staff member
ASUS STRIX R9 380X DirectCU II OC Video Card Review - AMD has let loose the new AMD Radeon R9 380X GPU. Today we evaluate the ASUS STRIX R9 380X OC video card and find out how it compares to a 4GB GeForce GTX 960 and GeForce GTX 970 for a wide picture of where performance lies at 1440p.
 
This card at this price, even at $230, just doesn't make sense to me. At $260 it really doesn't make sense, not when people are finding deals on GTX 970s at $250 on eBay and EVGA B-Stock.

If they had released it at $200-$210 and marketed it toward 1080p gaming, they could pretty much guarantee no one would buy a 960.
 
Mid range pricing has been disappointing for a good while now so no surprise there.
 
this card doesn't even deserve a silver award =) it's pointless. would be good to see a follow-up versus its predecessor, the 280X, OC vs. OC.
 
this card doesn't even deserve a silver award =) it's pointless.

Quite frankly, I do not think it deserves an award either, but Brent thought it did. One of those instances where I am going to bow to his thoughts since he has all the hands-on experience with the card and I was kind of on the fence about it.

As always... the award really means nothing, or at least it should not to our readers. We give you guys all the data we have and hopefully enough information to form your own opinion.

That all said, at its MSRP it is a much more solidly placed product.
 
My thoughts on the award: The GTX 960 always seemed underpowered for its price point. AMD is introducing the 380X, with 4GB, at a lower price than NVIDIA launched the 4GB 960 at, and the 380X is faster than the 4GB 960 to boot. Add up all those facts and AMD positioned the card right for a change. Perhaps a bit too late; if it had come out right alongside the 960 it would have been perfect. Still, it is a competitive product and gives gamers in the lower $200s a great 1080p experience. It redeems, somewhat, the space the 280X owned. Let's face it, AMD might have released too good of a product in the 280X :p They've never been able to come close to offering a product that appealing again. AMD's only problem is themselves, dropping prices on 390s and such. Sometimes AMD releases things a bit later than it should to take full advantage of the positioning.

I am looking forward to its overclocking potential.
 
This card at this price, even at $230, just doesn't make sense to me. At $260 it really doesn't make sense, not when people are finding deals on GTX 970s at $250 on eBay and EVGA B-Stock.

If they had released it at $200-$210 and marketed it toward 1080p gaming, they could pretty much guarantee no one would buy a 960.

Comparing MSRP to eBay and refurb deals isn't a realistic way to evaluate a product like this. I think at $230 (likely down to ~$200 with rebates and deals) this is a good entry into the current hierarchy. It does somewhat replace the 280X/7970, which, while an older card, was a great deal at the prices it has been hovering at. The 960 always seemed like a pointless card to me at the prices they wanted, especially with only a 128-bit bus. I think this is about the right step down in price/performance from the 390/970 level.
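The "right step down in price/performance" argument is really just dollars-per-frame arithmetic. Here's a throwaway sketch of how that comparison works; every price and frame rate below is a made-up placeholder, not a measured benchmark result:

```python
# Dollars per average frame for a few hypothetical cards.
# All numbers are illustrative placeholders, not benchmark data.
cards = {
    "380X":    {"price": 230, "avg_fps": 45.0},
    "960 4GB": {"price": 220, "avg_fps": 40.0},
    "970":     {"price": 310, "avg_fps": 60.0},
}

for name, card in cards.items():
    cost_per_frame = card["price"] / card["avg_fps"]
    print(f"{name}: ${cost_per_frame:.2f} per average frame")
```

With placeholder numbers like these, a pricier card can still be the better value if its frame rate scales more than its price does, which is the whole point of the step-down argument.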
 
If your wish had been to receive a "reference" card at MSRP, why not simply underclock the Strix card to match "reference" specs?

I liked the review, but I also don't think there is anything award winning about the GPU.

It's at a price premium and the GTX 970 blows it out of the water.
 
If your wish had been to receive a "reference" card at MSRP, why not simply underclock the Strix card to match "reference" specs?

Well, first of all we needed to review the card we did receive. It would not be fair to review a $270 card at "reference" speeds. Second, AMD gave us about 48 hours to get this done, which is bullshit, but it is what it is.
 
Kind of disappointed in the review. I expected more from [H].

It was pretty much obvious the 380X is not a 1440p card, just as the Fury is not a 4K card.

So how does it perform at the resolution where it does make sense? Well, I'll never know, because [H] didn't test at 1080p, so I don't know if it's better than the GTX 960 at that res.
 
Well, nVidia, your move--if you've been sitting on a 960Ti, now's the time to show it, at this price point, with appropriate cuts all the way down the stack.

May be too late in the 9xx product cycle for them to bother, but the 660Ti was a nice surprise in its day.

I don't see going beyond 1080p any time soon, so such a card as the 380X maxing out all the candy would be a compelling choice if my gaming ever catches up to newer titles.
 
Kind of disappointed in the review. I expected more from [H].

It was pretty much obvious the 380X is not a 1440p card, just as the Fury is not a 4K card.

So how does it perform at the resolution where it does make sense? Well, I'll never know, because [H] didn't test at 1080p, so I don't know if it's better than the GTX 960 at that res.

As mentioned above, we literally had 48 hours with the card.

Sorry it was obvious to you that AMD marketing was lying. Sadly, we have to have facts behind our statements unlike forum replies.

Reading is fundamental.
 
Kind of disappointed in the review. I expected more from [H].

It was pretty much obvious the 380X is not a 1440p card, just as the Fury is not a 4K card.

So how does it perform at the resolution where it does make sense? Well, I'll never know, because [H] didn't test at 1080p, so I don't know if it's better than the GTX 960 at that res.

Fully agree. After the first results, it would have been more useful to switch to 1080p for further testing.
 
Fully agree. After the first results, it would have been more useful to switch to 1080p for further testing.


Yes, I agree. Had AMD given us more than 48 hours with the card, you might have seen something a little more complete in our first look at the 380X. Given that the 380X is being billed by AMD as a card for 1440p gaming, we thought we should start there to qualify its marketing statements.
 
Let's face it, AMD might have released too good of a product in the 280X :p They've never been able to come close to offering a product that appealing again. AMD's only problem is themselves, dropping prices on 390s and such. Sometimes AMD releases things a bit later than it should to take full advantage of the positioning.

Right, the only problem here is:

1. We're still stuck on 28nm, and
2. AMD has lost that loving feeling.

ATI has a long history of releasing cost-reduced parts:

The 9500 series was followed up by the 9600 series. It was slower, but had a simpler core design and a smaller process node. Reviewers were critical, but gradually the market accepted the fact that ATI couldn't sell R300 at budget prices forever.

The HD 3870 was a cost-reduced 2900 XT with half the bus width on a smaller process.

The HD 5770 was a cost-reduced 4870 with half the bus width and a smaller process node. The DX11 support wasn't really all that big a change from the features the 4870 already supported, so I consider Juniper XT a cost-reduced RV770.

AMD planned to replace the expensive-to-make Tahiti XT (384-bit) with a die-shrunk Tonga XT (256-bit), but the shrink never happened. So they were delayed and couldn't provide the XT at release. I think what really sank AMD's Tonga release was that Maxwell took 28nm to new heights while the 285 wasn't even treading water (it was slower than the 280X). Basically, AMD lost their mojo designing new parts. And as much as idiots here like to pan the GTX 960 and its "paltry" 128-bit bus, it seems not to be holding it back from competing with the 380, while costing Nvidia a whole lot less to make.

The other problem was AMD continued to sell the old 280X for WAY too long! You can still buy FOUR competitively priced models on Newegg TODAY, which is simply insane for a "discontinued" GPU. This is why I have completely stopped trying to figure out what AMD management is doing, because it makes my head asplode.
 
In the opening of the article, you state that you roll with what you feel are the highest settings for an enjoyable play experience, and as such stray away from the notion of "max this game out and see what we get." What I can't figure out is how that 4.8GHz overclocked i7 fits in with the idea of getting a somewhat repeatable experience at home. Generally speaking, 4.8GHz is nowhere near common, let alone on an i7. Correct me if I'm wrong, but if someone has a golden egg of a chip like that, and the money to spend on an i7 platform, isn't a $260 midrange graphics card the wrong target audience?
In my thoughts, if one wants to present data as a somewhat repeatable experience, why not attempt to do so on a more appropriate platform? Or at least bring that nice OC closer to stock clocks. By all means, show the OC as well.
In my own case, that i7, especially at those clocks, is nowhere near a repeatable experience for me. I'm not sure how big a difference it makes in other games, but I know BF4 is CPU hungry, and it'd be nice to have a more realistic comparison. Surely I'm not the only one with this thought.

Thanks for the review though!
 
In the opening of the article, you state that you roll with what you feel are the highest settings for an enjoyable play experience, and as such stray away from the notion of "max this game out and see what we get." What I can't figure out is how that 4.8GHz overclocked i7 fits in with the idea of getting a somewhat repeatable experience at home. Generally speaking, 4.8GHz is nowhere near common, let alone on an i7. Correct me if I'm wrong, but if someone has a golden egg of a chip like that, and the money to spend on an i7 platform, isn't a $260 midrange graphics card the wrong target audience?
In my thoughts, if one wants to present data as a somewhat repeatable experience, why not attempt to do so on a more appropriate platform? Or at least bring that nice OC closer to stock clocks. By all means, show the OC as well.
In my own case, that i7, especially at those clocks, is nowhere near a repeatable experience for me. I'm not sure how big a difference it makes in other games, but I know BF4 is CPU hungry, and it'd be nice to have a more realistic comparison. Surely I'm not the only one with this thought.

Thanks for the review though!

From a video card testing perspective you absolutely want to be running a system that will not limit the video card in any way.

It would just be silly to test cards with a lower end system only to get results that show a CPU or other limitation that doesn't have anything to do with the video card.
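The reasoning above can be sketched as a toy model: delivered frame rate is roughly capped by whichever of the CPU or GPU is slower, so a fast CPU lets the cards being compared actually separate. All the numbers below are hypothetical, purely for illustration:

```python
# Toy bottleneck model: observed frame rate is limited by the
# slower of the CPU and GPU stages. Numbers are hypothetical.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is capped by whichever stage is slower."""
    return min(cpu_fps, gpu_fps)

# With a slow CPU, two different GPUs look identical (CPU-bound)...
slow_cpu = 60.0
print(delivered_fps(slow_cpu, 75.0))   # card A -> 60.0
print(delivered_fps(slow_cpu, 95.0))   # card B -> 60.0, same result

# ...while a fast (overclocked) CPU lets the GPUs show their difference.
fast_cpu = 150.0
print(delivered_fps(fast_cpu, 75.0))   # card A -> 75.0
print(delivered_fps(fast_cpu, 95.0))   # card B -> 95.0
```

This is exactly why a review rig runs the CPU as fast as possible: any CPU-imposed ceiling would hide the gap between the cards under test.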
 
At $229 it's faster than the $199 GTX 960. Yes, Zotac is getting rid of their stock of 970s via Newegg and eBay, but still, at $229 I think it's a good deal compared to the 960. I think AMD shouldn't have sent out a card so far over MSRP, though. I suppose they wanted the cream of the crop to be tested.

Also, the time frame to test was just wrong. You can't put that type of pressure on the reviewers. It's just not right.
 
In the opening of the article, you state that you roll with what you feel are the highest settings for an enjoyable play experience, and as such stray away from the notion of "max this game out and see what we get." What I can't figure out is how that 4.8GHz overclocked i7 fits in with the idea of getting a somewhat repeatable experience at home. Generally speaking, 4.8GHz is nowhere near common, let alone on an i7. Correct me if I'm wrong, but if someone has a golden egg of a chip like that, and the money to spend on an i7 platform, isn't a $260 midrange graphics card the wrong target audience?
In my thoughts, if one wants to present data as a somewhat repeatable experience, why not attempt to do so on a more appropriate platform? Or at least bring that nice OC closer to stock clocks. By all means, show the OC as well.
In my own case, that i7, especially at those clocks, is nowhere near a repeatable experience for me. I'm not sure how big a difference it makes in other games, but I know BF4 is CPU hungry, and it'd be nice to have a more realistic comparison. Surely I'm not the only one with this thought.

Thanks for the review though!

I think you're missing the point. That i7 at 4.8GHz removes the CPU from the equation as a performance hindrance. They're reviewing the card, not what the card will do on an i3, i5, or whatever else.
 
I think you're missing the point. That i7 at 4.8GHz removes the CPU from the equation as a performance hindrance. They're reviewing the card, not what the card will do on an i3, i5, or whatever else.

I'm not missing the point, as I understand this completely.
However, the card is only as capable as the rest of the computer is. Also, as I mentioned, [H] reviews are supposed to be about reproducible results. So data from both ends seems to be more appropriate, yea?
 
I'm not missing the point, as I understand this completely.
However, the card is only as capable as the rest of the computer is. Also, as I mentioned, [H] reviews are supposed to be about reproducible results. So data from both ends seems to be more appropriate, yea?

Yeah, I'd be all for a Core i3/i5/i7 review once in a while for these midrange cards (to see just how much CPU you really need), but [H] just seems to pretend CPU limits don't exist.

They haven't done a Core i3 review EVER. Either they don't have the time, or they just don't think we would care.

Well, [H], if you think we care enough about sub-$200 graphics cards to review them fairly often, then you'd better start understanding that we care about sub-$200 CPUs as well.

Just pick a single midrange card, and run game benchmarks with 2-3 processors.
 
Well, in 99.9% of gaming titles there is little to no difference in performance from a 2500K to a 5960X. In most games a dual core will suffice. Of course this is now changing, with the new consoles finally stretching their legs a little bit. Really, we need some finished DX12 games before it matters.

I'd rather see an article about memory speed and latency, DDR3 vs. DDR4 with regard to frame rate, or PCIe bus speeds, than a chart with:

CPU1 75fps
CPU2 75fps
CPU3 75fps
CPU4 75fps
CPU5 75fps

Just way too many articles showing those results on the web.
 
Fully agree. After the first results, it would have been more useful to switch to 1080p for further testing.

I think it depends what your frame rate goals are. If you were expecting 120+ FPS I would have to agree.
 
Frankly, the whole $150-$250 price point is a mess.

The 950, the 960, the 380, and the 380X all need to retail for at least $20 less than their current MSRPs. Both sides are screwing people over right now.

This particular card for $260 where there's a 970 next door for nearly the same price? Do they think we're dumb?
 
I'm not missing the point, as I understand this completely.
However, the card is only as capable as the rest of the computer is

We know what you mean. Such a comprehensive review would take a lot of time, but to what benefit? Exactly which midrange CPU / RAM speed do you settle on? Yeah, it would be interesting if at least one article investigated this from time to time, which does happen with RAM speed at least, occasionally.

Changing topics: $200-$250 just doesn't buy much more than it did two years ago; very incremental. NV killed off the mainstream x60 Ti branding to push consumers to the $330 level. AMD just plays along.
 
It works, though. Slight price difference, brand name.

Corsair CX power supplies? You know they're bad when people go out of their way to warn people.

Razer?

Though the 970 has sold and sold. They'll throw in a game.
 
My thoughts on the award: The GTX 960 always seemed underpowered for its price point. AMD is introducing the 380X, with 4GB, at a lower price than NVIDIA launched the 4GB 960 at, and the 380X is faster than the 4GB 960 to boot. Add up all those facts and AMD positioned the card right for a change. Perhaps a bit too late; if it had come out right alongside the 960 it would have been perfect. Still, it is a competitive product and gives gamers in the lower $200s a great 1080p experience. It redeems, somewhat, the space the 280X owned. Let's face it, AMD might have released too good of a product in the 280X :p They've never been able to come close to offering a product that appealing again. AMD's only problem is themselves, dropping prices on 390s and such. Sometimes AMD releases things a bit later than it should to take full advantage of the positioning.

I am looking forward to its overclocking potential.

Still, it is a competitive product and gives gamers in the lower $200's a great 1080p experience.

That's what I was missing, the 1080p results. I don't want to make any assumptions about 1080p performance, we all know some cards are better at certain resolutions. So that would have been nice to see.

Both sides do the same thing when it comes to overstating performance; if you jump on Nvidia's site, take a look at the GPUs-by-Games comparison. You will see 960s listed for 1440p on there as well. AMD claims High settings at 1440p, and in general that's the case, just like a 960 can run Medium/High. Neither is meant to max out the game. I could tell from the start of the article, with how dry the content was, that this would be focused on making a point. Noted.

Off note: if anyone is looking at gaming laptops, the G751 with a GTX 970M and G-Sync is flat-out amazing. I'm playing on Ultra, 100Hz, overclocked to 1173/5400, and it's smooth as butter. Battlefront is really fun on Endor. Super impressed with the 970M; moving from an 860M was a massive jump and well worth it.
 
Well, in 99.9% of gaming titles there is little to no difference in performance from a 2500K to a 5960X. In most games a dual core will suffice. Of course this is now changing, with the new consoles finally stretching their legs a little bit. Really, we need some finished DX12 games before it matters.

I'd rather see an article about memory speed and latency, DDR3 vs. DDR4 with regard to frame rate, or PCIe bus speeds, than a chart with:

CPU1 75fps
CPU2 75fps
CPU3 75fps
CPU4 75fps
CPU5 75fps

Just way too many articles showing those results on the web.

You would be surprised how much games have changed over the last two years. Most of the time you are right, but with every game release I see 4c/4t parts starting to become a bottleneck in more titles. However, I also find that kind of testing pointless and unnecessary; a 4.8GHz 3770K is more than enough to remove CPU bottleneck scenarios. In my opinion, testing with different CPUs for different reviews just makes any comparison over time impossible. We could easily end up saying things are better now because the CPUs are better, instead of because of driver improvements. :)

I'm not missing the point, as I understand this completely.
However, the card is only as capable as the rest of the computer is. Also, as I mentioned, [H] reviews are supposed to be about reproducible results. So data from both ends seems to be more appropriate, yea?

4.8GHz on a mainstream i7 (4c/8t) was always a reproducible target. With Sandy Bridge i7s, 4.8GHz was in fact common, 5GHz was doable for everyday use, and 5.2GHz was also achievable. That number dropped a bit with Ivy Bridge, to a more modest 4.6GHz common and 4.8GHz doable, though that usually required delidding and liquid-metal TIM. With Haswell things were more or less the same, and they improved with Devil's Canyon (the Haswell refresh): the increased base clocks and lower stock voltage made 4.8GHz fairly common again across 4790Ks. That held with Skylake too, where above 4.6GHz is common and good chips go above 4.8GHz. The thing is, those results are always reproducible. Reviewers need as few bottlenecks as possible in each GPU test to squeeze the most out of the card. This is especially helpful with AMD cards, as they tend to suffer a bit more from CPU overhead, so the stronger the CPU, the better the results they can obtain, while still staying within a real-world environment.
 
At $229 it's faster than the $199 GTX 960. Yes, Zotac is getting rid of their stock of 970s via Newegg and eBay, but still, at $229 I think it's a good deal compared to the 960. I think AMD shouldn't have sent out a card so far over MSRP, though. I suppose they wanted the cream of the crop to be tested.

Also, the time frame to test was just wrong. You can't put that type of pressure on the reviewers. It's just not right.

Not necessarily; 2GB GTX 960s are really common in the ~$150-$160 range, and 4GB models are also common at sub-$200 after rebate. The GTX 970 is getting cheaper every day; it's now common to see it below $280-$290, with some offers even at $250-$260 after rebate. That kind of offer makes this 380X even more irrelevant and pointless. Man, the card isn't even able to entirely outperform the 280X, which is the card it's replacing... I probably agree with Brent: the 280X was just too good. Things could have been even worse if they had tested a reference 380X model.
 
Wow, so much hate for this card.
WHAAAAA "it is 28nm, I hate it." To me, architecture is more important, and this mature Tonga did well. $230 is the starting point and there will be rebates. It is priced right and is a much better deal than the 960. Overclocking should be nice, especially with voltage.

I would love to see these in Xfire. Great 1440p gaming and some 4K with games like Battlefront.
 
Wow, so much hate for this card.
WHAAAAA "it is 28nm, I hate it." To me, architecture is more important, and this mature Tonga did well. $230 is the starting point and there will be rebates. It is priced right and is a much better deal than the 960. Overclocking should be nice, especially with voltage.

I would love to see these in Xfire. Great 1440p gaming and some 4K with games like Battlefront.

What are you talking about? I don't recall seeing any of what you're saying, it must be a fig-newton of your imagination...
 
Wow, so much hate for this card.
WHAAAAA "it is 28nm, I hate it." To me, architecture is more important, and this mature Tonga did well. $230 is the starting point and there will be rebates. It is priced right and is a much better deal than the 960. Overclocking should be nice, especially with voltage.

I would love to see these in Xfire. Great 1440p gaming and some 4K with games like Battlefront.

man, your... your... fanboyism is [H] in your veins... :eek:
 
I wonder if Nvidia will finally get around to releasing one of the 960 Ti cards they were prototyping a long time ago. Stop making 4GB 960s, make the 960 Ti only with 4GB, keep the 2GB 960 price where it's at, and price the 4GB 960 Ti to compete with the 380X directly.

But I don't know if it's worth doing that, or ever was, due to the price of the GTX 970. If AMD's goal was to fight with the 4GB 960, they picked the wrong target.
 
This card at this price, even at $230, just doesn't make sense to me. At $260 it really doesn't make sense, not when people are finding deals on GTX 970s at $250 on eBay and EVGA B-Stock.
Oh COME ON! Eric on Craigslist is selling a GTX 970 for $190.
 
Kind of disappointed in the review. I expected more from [H].

It was pretty much obvious the 380X is not a 1440p card, just as the Fury is not a 4K card.

So how does it perform at the resolution where it does make sense? Well, I'll never know, because [H] didn't test at 1080p, so I don't know if it's better than the GTX 960 at that res.

If your wish had been to receive a "reference" card at MSRP, why not simply underclock the Strix card to match "reference" specs?

I liked the review, but I also don't think there is anything award winning about the GPU.

It's at a price premium and the GTX 970 blows it out of the water.


I know these questions have been answered, but I wanted to chime in, this hits me personally.

We have to test a manufacturer's claims first, and we evaluate the card as is, out of the box. AMD claimed this card is positioned for 1440p; that is its intended place in the market. As such, we have to test that claim; we would be remiss, and missing the point, if we did not. We have to have data to back up our opinions, so testing at 1440p is necessary to show you hard proof. We also have to review the card as it ships.

We had a very limited time to evaluate the product for this initial review, so we had to focus on one resolution and a specific testing scenario. We chose to test the card as it was intended to be used, which is the logical way to start a first look.

Naturally, we do follow-up articles that cover what we were not able to in the initial one. It isn't like this will be the only 380X review we ever do; there will be more reviews, as always, so I'm not quite sure where that thinking comes from. If this article didn't answer your question, and you can't extrapolate performance, perhaps future ones will. As mentioned, we had 48 hours to do this review, which is not easily done, and not everyone can do it.
 
I wonder if Nvidia will finally get around to releasing one of the 960 Ti cards they were prototyping a long time ago. Stop making 4GB 960s, make the 960 Ti only with 4GB, keep the 2GB 960 price where it's at, and price the 4GB 960 Ti to compete with the 380X directly.

But I don't know if it's worth doing that, or ever was, due to the price of the GTX 970. If AMD's goal was to fight with the 4GB 960, they picked the wrong target.

A 4GB 960 Ti would mean they use that crap 128-bit bus again. If they used a 192-bit bus with 3GB of VRAM, that would be an amazing 960 Ti at $220-$250.
 