MSI R9 390X GAMING 8G Video Card Review @ [H]

Zarathustra[H];1041674678 said:
Playable settings are very important, I agree.

At the same time, in 100% of the titles, from an apples-to-apples perspective, the GTX 980 beat the 390X in both average and minimum FPS.

Based on that, I would consider it more accurate to say that the 390X performs very close to, but slightly behind, a 980.

Hence my choice of the term "competitive" as opposed to "matched" or "beat" in my OP that led to this thread of conversation, which is silly.

A better question is this: when will AMD realize that power-hungry, hot cards are NOT what people want? Let alone new versions of old cards that require more power and deliver only linear performance increases from simple clock boosts?

Related: is there a chart showing which AMD cards belong to which architecture family? With all the rebranding over the years, I'm beginning to suspect that the R5 series is a rebranded 9800 Pro...
 
On the 50C idle temp, I heard a reviewer say that at idle the card uses passive cooling only.
This was in the review video posted on Overclock3D.net.
 
Zarathustra[H];1041674678 said:
Playable settings are very important, I agree.

At the same time, in 100% of the titles, from an apples-to-apples perspective, the GTX 980 beat the 390X in both average and minimum FPS.

Based on that, I would consider it more accurate to say that the 390X performs very close to, but slightly behind, a 980.

whilst consuming far more power, the same old problem the 290X faced.

The Fury series might be the only ones to look out for. Hopefully 4GB of VRAM doesn't cripple its performance too much at 4K.
 
I'm surprised that AMD was able to get this close to the 980 with just a refresh. After it was confirmed not to have Tonga features, I thought it was just another footnote in history. Pretty impressive to be able to boost the speed like that as the R9 200 series wasn't the greatest of overclockers. I can't wait to see the overclocking potential.

Awesome article as always. Thx for the review!

It's not good. I just read a review of the Sapphire Tri-X, and they got it to bump from 1055MHz to 1137MHz.

That's it.
 
On the 50C idle temp, I heard a reviewer say that at idle the card uses passive cooling only.
This was in the review video posted on Overclock3D.net.

Yes, I've been curious about that green ZeroCore LED. I first thought it was for CrossFire use, and maybe for when the monitor goes to sleep, but it seems AMD may have extended ZeroCore to desktop and maybe other uses?
 
Great review, but given the price difference relative to the 290X and real-world VRAM usage in single-card applications, this rebrand is not a good one in my opinion. (8GB is sort of getting shoved down the customer's throat.)

If the card goes down in price it will be a better deal, and I am sure it will go down in price very fast.

Fiji will be the king!!!! Too bad we have to wait a while for the air-cooled version.
 
It's not good. I just read a review of the Sapphire Tri-X, and they got it to bump from 1055MHz to 1137MHz.

That's it.
A single sample isn't enough to definitively claim they are good or bad overclocking boards. We need more data.

I'll be surprised if they manage to squeeze that much more out of Hawaii though. NVIDIA's most recent parts overclock very well because they were designed for mobile first and scaled up, which I don't believe is AMD's design philosophy.
 
Fiji will be the king!!!! Too bad we have to wait a while for the air-cooled version.

It looks like it will be one kickass video card.

It's really too bad it doesn't have HDMI 2.0, and thus won't get good quality on TV-based 4K screens, which seem to be the most popular choice in 4K right now.

:(
 
On the 50C idle temp, I heard a reviewer say that at idle the card uses passive cooling only.
This was in the review video posted on Overclock3D.net.

Yes, that's a trend that started with the Maxwell cards. Asus was the first to introduce the zero-fan feature with the Strix, then MSI followed with the Twin Frozr V, then EVGA with a BIOS update.
 
Zarathustra[H];1041674720 said:
It looks like it will be one kickass video card.

It's really too bad it doesn't have HDMI 2.0, and thus won't get good quality on TV-based 4K screens, which seem to be the most popular choice in 4K right now.

:(

Yeah, I am surprised that AMD did not do HDMI 2.0 on the new Fiji cards; that is a real bummer for all the 4K TV owners out there.
Maybe the air-cooled versions will have HDMI 2.0?
 
Yeah, I am surprised that AMD did not do HDMI 2.0 on the new Fiji cards; that is a real bummer for all the 4K TV owners out there.
Maybe the air-cooled versions will have HDMI 2.0?

Is it possible they will include DP-to-HDMI 2.0 adapters? Of course, I am speculating. Who knows.

I would expect it, to be honest. Otherwise it is a turn-off for TV owners.

Although I would think the majority of people don't use 4K TVs. I think it's a very, very small market, especially for PC gamers, since the input lag is HORRIBLE on TVs.

I think 4k monitors are more common for PC owners.
 
Great review overall. I would have liked to see some synthetic test results to help compare the 390X to my existing hardware, as I do not have the same games available to benchmark. I know that my 7970s in CrossFire are 25% faster than my son's R9 290 even when overclocked, as I can run standard benchmarks on all the machines I have. (Cinebench 11.5, AvP 1.3, Stalker, 3DMark, 3DMark Vantage, 3DMark 11, Unigine Heaven 2.5, Unigine Valley 1.0, Resident Evil 5)
Will you be doing other testing? BTW, I am waiting to see what the Fury X can do. It would look nice in my case if it can beat my CrossFire setup.
 
Since nV frequently has Gameworks working for it, any chance of seeing a BF4 run with Mantle on?

I'm hoping someone will run it with their Fury X review, too, as I'm hoping that's an indication of DX12 performance.
 
Great review overall. I would have liked to see some synthetic test results to help compare the 390X to my existing hardware as I do not have the same games available to benchmark.

Thanks for the kind words.

We will not be using any synthetics for GPU testing.
 
Since nV frequently has Gameworks working for it, any chance of seeing a BF4 run with Mantle on?

I'm hoping someone will run it with their Fury X review, too, as I'm hoping that's an indication of DX12 performance.

Our recent experience using Mantle in BF4 is that it is slower than using DX. We will of course look at this with Fury and see which is the best option.
 
Please start your own thread on this topic as it does not pertain to our review. - Kyle
 
A single sample isn't enough to definitively claim they are good or bad overclocking boards. We need more data.

I'll be surprised if they manage to squeeze that much more out of Hawaii though. NVIDIA's most recent parts overclock very well because they were designed for mobile first and scaled up, which I don't believe is AMD's design philosophy.

Considering that the worst Maxwell cards still overclock 20% instead of 25-30%, I'd say it doesn't bode well.
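For perspective, here is a quick back-of-the-envelope sketch (purely illustrative; the 1055MHz and 1137MHz figures are the Sapphire Tri-X result quoted earlier in the thread, and 20-30% is the rough Maxwell range cited above):

```python
# Rough overclocking-headroom comparison using the figures quoted in this thread.
# 1055 MHz -> 1137 MHz is the Sapphire Tri-X 390X result mentioned above;
# 20-30% is the ballpark range this thread cites for Maxwell cards.

def headroom_pct(stock_mhz: float, oc_mhz: float) -> float:
    """Return overclock headroom as a percentage over the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

hawaii = headroom_pct(1055, 1137)
print(f"390X (Tri-X sample): {hawaii:.1f}% headroom")  # prints: 390X (Tri-X sample): 7.8% headroom
print("Maxwell (per this thread): 20-30% headroom")
```

So the single Hawaii sample quoted above works out to roughly 7.8%, well short of even the low end of the Maxwell range being discussed.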
 
Kyle or Brent: did AMD learn not to include a "quiet mode", which was used by many review sites to produce reduced frame rates?
 
That seals the deal for me. Short of something odd happening, my 7970 will be replaced with a 290X once they start to bottom out in price.

Unless the 390X happens to get into the mid-$300s, then I might go there. Otherwise, I just don't see it.
 
Zarathustra[H];1041674790 said:
You wouldn't want to use adapters. They would have to be active, and thus would introduce lag.

Which would suck, since 4K TVs have a huge amount of input lag.

So gaming on 4K TVs would be useless anyway if you are a serious gamer.

Edit: I would rather use DSR for 4K gaming than a 4K TV right now. /hugs my large 46-inch 1080p NEC monitor.
 
That seals the deal for me. Short of something odd happening, my 7970 will be replaced with a 290X once they start to bottom out in price.

Unless the 390X happens to get into the mid-$300s, then I might go there. Otherwise, I just don't see it.

You might have missed that.

Most 290X cards were deeply discounted in anticipation of the 390X release, but as soon as the NDA lifted they shot back up.

A 290x OC model with 8GB (equivalent to a 390x) will now run you ~$420 :(
 
That seals the deal for me. Short of something odd happening, my 7970 will be replaced with a 290X once they start to bottom out in price.

Unless the 390X happens to get into the mid-$300s, then I might go there. Otherwise, I just don't see it.

I would also look at other reviews around the net. Most of them are showing different results than [H].

Now, I do not think the 390X is a good buy right now. You should wait for the Fury Nano. It is supposed to be faster than a 290X and use WAY less power. It only has one 8-pin for power, if that tells you anything.

My advice: stay away from the 290X or even a 390X, and wait for the Fury Nano.
 
I would also look at other reviews around the net. Most of them are showing different results than [H].

First, it is not unusual to see real gameplay show different results than canned benchmarks. That aside, the cards were delivered with one driver, but just before launch AMD put out an updated driver. I would look to differences in the driver used first and foremost. We used the latest driver available when we did all the work. It is noted on the Test Setup page.
 
Hence my choice of the term "competitive" as opposed to "matched" or "beat" in my OP that led to this thread of conversation, which is silly.

A better question is this: when will AMD realize that power-hungry, hot cards are NOT what people want? Let alone new versions of old cards that require more power and deliver only linear performance increases from simple clock boosts?

Related: is there a chart showing which AMD cards belong to which architecture family? With all the rebranding over the years, I'm beginning to suspect that the R5 series is a rebranded 9800 Pro...
I think an argument can be made that people want performance over efficiency. Not many care about power use or temps so long as the performance is there. Maybe not what you want, but that's a different argument.
 
The clock-for-clock numbers, increased power, increased price (it's cheaper to OC a 290X yourself), and no HDMI 2.0.

No award, or Bronze at best.
 
Fair, but heat is not included, and it's making a zero-sum assumption. That said, I guess if all AMD has is power-hungry, hot cards (which, given the water-cooled nature of the Fury, seems to be the case), then that's what we'll get.
At least they got the price point right. It needed to be less than the 980 by a good bit, and I think they got that.
 
As for the review, all I saw in the 390X was mediocrity; as indicated, a rebadge. Did the card deserve an award? Yeah, but Silver was too high IMO.

Well, it is comparable to the 980 in speed and not insignificantly cheaper + good build quality + 8GB RAM = I would say Silver is deserved.

@Zarathustra[H] Jesus, man, must every sentence be its own paragraph?
 
@Kyle I find one thing puzzling and wonder why you missed the chance to examine it further. In the Witcher 3 test you state:

We ran two separate tests here because we couldn't believe how much faster the MSI R9 390X was over the R9 290X, but it was that much faster. It comes much closer to GTX 980 performance in this game, and that is with HairWorks enabled in both tests above. There just might be something to that improvement in tessellation performance noted in the introduction.

but in the clock-for-clock, apples-to-apples comparison you change the settings and miss the opportunity to see if there really is something to the possible tessellation improvements. How come?
 
Well since I don't have 4K or a 30 inch LCD.....

I think what most people have with CrossFire is an EyeFinity setup.

I'd like to see what two 390X could do running three 24 or 27 inch monitors.

I'd wager no better than my 290Xs.......

so I'm waiting to see what all the "Fury" is about. But I think sitting this one out will be OK. Nothing new here.
 
Wouldn't the most logical explanation for the 390X's HairWorks performance be that AMD has put an effective tessellation factor limit in the 15.15 driver, perhaps in the Witcher 3 profile? You know, as in the well-known end-user trick to get HairWorks to run better on AMD than on NVIDIA.
 
Kyle or Brent: did AMD learn not to include a "quiet mode", which was used by many review sites to produce reduced frame rates?

There is no Quiet or Uber mode like there was with 290X series, no BIOS switches.

However, MSI has its own unique app called MSI Gaming App - http://gaming.msi.com/article/msi-gaming-app-article

Through this software program you can click a button for a pre-programmed clock speed. In Quiet Mode it reduces the clock speed so the fan does not have to run as fast, etc. This is the sort of thing you could do manually anyway with Afterburner; of course, decreasing the clock speed is going to mean less performance. There really is no reason at all to do this IMO.

However, I found that in this card's default factory OC mode it is quite quiet anyway at 1100MHz/6.1GHz.
 
@Kyle I find one thing puzzling and wonder why you missed the chance to examine it further. In the Witcher 3 test you state:



but in the clock-for-clock, apples-to-apples comparison you change the settings and miss the opportunity to see if there really is something to the possible tessellation improvements. How come?

Probably driver optimization for The Witcher 3? That may be why AMD has exclusive drivers for the 390X that refuse to install on a 290X machine.
 
Probably driver optimization for The Witcher 3? That may be why AMD has exclusive drivers for the 390X that refuse to install on a 290X machine.

It is hard to test "driver to driver" since the versions are different; that is also something to keep in mind: 15.5 Beta on the 290X, 15.15 Beta on the 390X.

Even with that difference though it was interesting it didn't cause any differences in the clock versus clock testing.

Also, to answer the OP's question, time was a factor, meaning I did not have a lot of it to do extra testing. It was close enough getting the clock-versus-clock testing done, and I thought that would be more important.
 
Did you read the same review I did? The 390X had to turn settings down in most of the games compared to the 980 to get frame rates within 5-10% at 1440p.

I can already feel the usual shitstorm coming down on [H] when their reviews differ from everyone else. Most other sites I have read shows the 390X sometimes as much as 15% faster than the GTX 980. Makes me excited for the Fury X review saga coming next week :D.

I think the big one he's referencing is the GTA V benchmarking. The 980 had to run Very High grass quality but could run high-detail shadows at nearly the same frame rate, and somehow the 980 "is in a league of its own." Personally, neither option really seems all that important to me in GTA V, so I consider it a wash. But I'm not the one reviewing it. That's the only notable difference I saw in the review.
 
Disappointed and frustrated: it runs hotter, pulls a little more power, and costs more than a 290X 8GB version :(

Bring on the real cards AMD wtf!!

Fury w HBM Please Hurry Up.

I'm looking to get something better from either manufacturer, but unless AMD pushes NVIDIA with something worthwhile we're gonna sit in the current performance/pricing purgatory.

Worse than all of this, it's obvious this card is already overclocked to the tits, so further overclocking will help little to nothing at all. At the same time, either of my 980s overclocked to 1500+MHz would crush this card in most scenarios below 4K, even if the 390X were overclocked further.

So when are the real cards coming from AMD? They have to be able to engineer something better than this.
 
Review on release day, nice.

I'm happy AMD caught up and has again clearly established a performance-per-dollar lead over NVIDIA.

On the other hand, I can't help but wonder: if the 200 series had just had better cooling to accommodate an extra ~100W, it could have been run at 1100 core / 1500 memory and equaled this.
 