XFX R9 390X Double Dissipation Review

HardOCP News

[H] News
The XFX R9 390X Double Dissipation video card is on the test bench at Vortez today.

Today we’ll be looking at XFX’s take on the 390X to see what benefits they can bring to the table. The 390X Double Dissipation graphics card utilises the Ghost 3.0+ thermal design with an all-new GPU heatsink, a VRM heatsink and dual 90mm cooling fans.
 
I wish there were more reviews on the 390 (non-X) cards. They seem reasonably priced and are only slightly slower than their 390X brother. In most scenarios, OC versions appear to beat the 290X.
 
It's a 290 with an overclock, who cares?
Even the "Nitro" is just a Tri-X with a new shroud.

If I yawn any harder my head will fall off. It's not better than a 290X or a 970 unless the person who buys it doesn't know how to drag a slider from "1000" to "1100".
 
LOL, well put. Why are we still wasting time hemming and hawing over these rebrands?
 
Look at those VRM heatsinks! Good to see XFX learned from their mistake on the 290/290x DDs.
 
It's a 290 with an overclock, who cares?
Even the "Nitro" is just a Tri-X with a new shroud.

If I yawn any harder my head will fall off. It's not better than a 290X or a 970 unless the person who buys it doesn't know how to drag a slider from "1000" to "1100".

Mine only OCs from 974 to 1045, so the "rebrands" are much nicer than what I have, as they start off higher than what my card is capable of. I was thinking of buying a water cooler and a shroud, but that's another $100+.
 
That's a dinky heatsink. What's up with all the missing fins?

[Image: 390X DD heatsink (jQY6AHQ.jpg)]


290X:

[Image: 290X heatsink (TUfYpGz.jpg)]
 
The fin density was too high to push enough airflow to the VRMs, and the little air they could manage to get there was already warm from passing through those dense fins. They also added a dedicated heatsink to the VRM area. They either had to cut that section away to allow more cool air through, or decrease the overall fin density, which would have hurt cooling performance everywhere else. So I think they made the right decision.
 
VRMs will be why. I had ridiculous VRM temps on an old Gigabyte 6970 'wangforce'. Totally crappy design on those, and by the looks of things they kept doing it. The blow-through air was already red hot, so much so that even components behind the VRMs, on the other side of the card, could burn you instantly on contact. External fan cooling did nothing to help.
An Accelero and heatsinks on everything hot did.
 
Mine only OCs from 974 to 1045, so the "rebrands" are much nicer than what I have, as they start off higher than what my card is capable of. I was thinking of buying a water cooler and a shroud, but that's another $100+.

You do know why these rebrands are higher clocked, right? It's because AMD upped the 390X voltage to some ungodly number just to increase the max clock speeds across the board. Last time I checked, a 10% core clock bump doesn't increase power consumption by 90W, but in this case it does!

http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/28.html

These aren't "improved yields," they're just increasing the voltage OEMs can apply so they can overclock the chip even further. They pulled the same stunt with the 7970 GHz Edition: performance went up a little over 10% while power rose 40W (30%), which required a 0.043V boost voltage increase.

http://www.anandtech.com/show/6025/radeon-hd-7970-ghz-edition-review-catching-up-to-gtx-680/16

And no, the 390 doesn't overclock any more than the old 290 did (when you apply enough voltage):

http://www.techpowerup.com/reviews/Powercolor/R9_390_PCS_Plus/33.html

They are getting a much lower power consumption on the 390 at 1GHz (10% faster for the same power), although I'm sure the majority of that power savings is because of the custom cooler versus the crappy stock cooler (25C difference = massive reduction in leakage):

http://www.techpowerup.com/reviews/Powercolor/R9_390_PCS_Plus/34.html
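The voltage/power argument above can be sanity-checked with the usual first-order model, where dynamic switching power scales roughly as P ∝ f·V². A minimal sketch, assuming a ~1.175 V stock voltage for the 7970 (an assumption; the +0.043 V delta and the clock speeds come from the posts and links above):

```python
def relative_dynamic_power(f0, v0, f1, v1):
    """Ratio of dynamic switching power, using P ~ f * V^2."""
    return (f1 / f0) * (v1 / v0) ** 2

# 7970 (925 MHz) -> 7970 GHz Edition (1050 MHz, +0.043 V boost).
# The 1.175 V baseline is an assumption, not from the reviews.
v0 = 1.175
ratio = relative_dynamic_power(925, v0, 1050, v0 + 0.043)
print(f"~{(ratio - 1) * 100:.0f}% more dynamic power")
```

With those numbers the dynamic term alone comes out around +22%, so the observed ~30% (40W) jump only closes once you add leakage, which rises with both voltage and temperature. That is also why the 25C-cooler custom cooler mentioned above shaves off real watts.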
 
I applied tons of voltage. All I got was a ton of artifacts and heat. I need to try a H80 + Corsair shroud I suppose. At 1050 I get artifacts every so often. I end up using 1025 as that is safe.
 
Not digging that "two paragraphs a page" approach to getting more page views.
 
Well that's the difference in the Fury X and 980ti. YMMV. :)

At 4k, it's pretty badass, and CFX should be competitive with SLI. However, not very attractive for the 1440p and 1080p 144 Hz users. Difference in some games can top 20-30%!

Thus the split decision from the community. The ultra-high-end gamers are pretty split on display preference. Also, the lack of a DL-DVI port kills it for the 1080p 144Hz people, and the HDMI 1.4a port limits you on 4K monitors.
 
You do know why these rebrands are higher clocked, right? It's because AMD upped the 390X voltage to some ungodly number just to increase the max clock speeds across the board. Last time I checked, a 10% core clock bump doesn't increase power consumption by 90W, but in this case it does!

http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/28.html

Strange... Guru3D, KitGuru and Hardware Canucks show the 390X with less, or only marginally more, power draw than a 290X, with much better performance.

http://www.hardwarecanucks.com/foru...646-amd-r9-390x-8gb-performance-review-3.html

http://www.guru3d.com/articles_pages/msi_radeon_r9_390x_gaming_8g_oc_review,8.html

http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390x-tri-x-8gb-review/22/

The XFX in the Vortez review draws less power (about 40 watts) than a non-reference 290X, whilst distancing itself in overall performance:

http://www.vortez.net/articles_pages/xfx_r9_390x_double_dissipation_review,7.html

http://www.vortez.net/articles_pages/xfx_r9_390x_double_dissipation_review,18.html

I guess it depends on the model.
 
Strange... Guru3D, KitGuru and Hardware Canucks show the 390X with less, or only marginally more, power draw than a 290X, with much better performance.

http://www.hardwarecanucks.com/foru...646-amd-r9-390x-8gb-performance-review-3.html

http://www.guru3d.com/articles_pages/msi_radeon_r9_390x_gaming_8g_oc_review,8.html

http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-390x-tri-x-8gb-review/22/

The XFX in the Vortez review draws less power (about 40 watts) than a non-reference 290X, whilst distancing itself in overall performance:

http://www.vortez.net/articles_pages/xfx_r9_390x_double_dissipation_review,7.html

http://www.vortez.net/articles_pages/xfx_r9_390x_double_dissipation_review,18.html

I guess it depends on the model.

I guess so. I won't credit Vortez for anything, but those 3 other reviews are quite plain: similar power consumption :D

I suppose this is because these are already OC cards (like the MSI). Sorry for not doing more research :)
 
The thing I don't understand is the wild difference in power consumption for the MSI 390X card when comparing the Guru3D and TechPowerUp reviews. TweakTown has somewhat similar results to TechPowerUp (though not as extreme). I have to chalk it up to testing methodology.
 
At 4k, it's pretty badass, and CFX should be competitive with SLI. However, not very attractive for the 1440p and 1080p 144 Hz users. Difference in some games can top 20-30%!

Thus the split decision from the community. The ultra-high-end gamers are pretty split on display preference. Also, the lack of a DL-DVI port kills it for the 1080p 144Hz people, and the HDMI 1.4a port limits you on 4K monitors.

Well said.

I only somewhat disagree with the bold part - you can do 120Hz/144Hz with DisplayPort. I hear what you're saying though - "legacy" 120Hz/144Hz monitors may not have DP.
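The port limits in this sub-thread are easy to eyeball with a rough pixel-clock estimate. A sketch, assuming ~5% reduced-blanking overhead and the commonly cited interface ceilings (DL-DVI ~330 MHz, HDMI 1.4a ~297 MHz TMDS, DisplayPort 1.2 well above both); none of these figures come from the review:

```python
def pixel_clock_mhz(width, height, refresh_hz, blanking=1.05):
    """Approximate pixel clock in MHz, with ~5% reduced-blanking overhead."""
    return width * height * refresh_hz * blanking / 1e6

modes = {"1080p @ 144 Hz": (1920, 1080, 144),
         "4K @ 60 Hz": (3840, 2160, 60)}
for label, (w, h, hz) in modes.items():
    print(f"{label}: ~{pixel_clock_mhz(w, h, hz):.0f} MHz")
```

1080p at 144 Hz lands around 314 MHz, inside DL-DVI's ~330 MHz but past HDMI 1.4a's ~297 MHz, while 4K at 60 Hz needs over 500 MHz, which is DisplayPort-only territory on this generation of ports.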
 
At 4k, it's pretty badass, and CFX should be competitive with SLI. However, not very attractive for the 1440p and 1080p 144 Hz users. Difference in some games can top 20-30%!

Thus the split decision from the community. The ultra-high-end gamers are pretty split on display preference. Also, the lack of a DL-DVI port kills it for the 1080p 144Hz people, and the HDMI 1.4a port limits you on 4K monitors.

How many 4K monitors don't have display port? Any?
 
Mine only OCs from 974 to 1045, so the "rebrands" are much nicer than what I have, as they start off higher than what my card is capable of. I was thinking of buying a water cooler and a shroud, but that's another $100+.

My XFX 290 DD set to 1100 right out of the box. Not sure if it goes higher; I just set it there and did nothing else. Came from CF XFX 7770s, so it really didn't take much to impress me. Didn't OC the memory or adjust the voltage.
 
My XFX 290 DD set to 1100 right out of the box. Not sure if it goes higher; I just set it there and did nothing else. Came from CF XFX 7770s, so it really didn't take much to impress me. Didn't OC the memory or adjust the voltage.

I'll probably grab an XFX card whenever I upgrade again. Used to have an XFX 5850 back in the day. Not in a rush to upgrade now.
 