wait for the benchmarks :)

You have a strange take on things.
Anything to try to make the 390 look better, it seems.

Direct quotes from that review

I got a higher overclock with my old 290x + AC Extreme III and it was silent.
The 390x isn't such good value unless you use two of them for 4K on a monitor, not a TV.
But even then, two well-clocked 290x cards might be faster!

Sounds like those 290x's are some good deals.

Awesome stuff! Better snap up that deal before they sell out, similar to the 780/Tis when the 980 came out, right?

After that, the 390x still offers better value than the 980, being priced ~$100 less.

And if you don't like 'old technology', AMD is also leading the 'new technology' train as well...
 
Really.
They haven't got HDMI 2.0, and that's been out for about a year.
They still haven't got a game library that lets developers use GPU-accelerated physics.
And this is despite being late to the party.

The [H] review is out.
The GPU is the same as or worse than the 290x; it needs higher vcore to get the small overclock, and that's with a better cooler that should allow for less voltage.
My 290x didn't need any voltage increase to get to 1070MHz.
That's zero increase in efficiency for the 390x. The power-use chart shows it's actually a lot worse!
They show that the power used by the whole system is 50% higher for the 390x vs the 980.
Isolating the video cards, it looks like the 390x uses about 2x the power of the 980.
So the GPUs used in the 390x look like rejects from the 290x that were waiting for a better cooler.
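(For anyone checking the arithmetic: a 50% gap at the wall does imply roughly 2x at the card once the rest of the system is subtracted out. With an assumed ~150W non-GPU draw and illustrative wall figures of 330W vs 495W, which are assumptions, not the review's exact numbers:)

\[
\frac{495 - 150}{330 - 150} = \frac{345}{180} \approx 1.9\times
\]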

AMD is sadly shafting 290x owners: the 390x has higher tessellation performance on the same hardware.

I don't think the 390x is better value.
It is missing capabilities and features, isn't faster than the 980, and uses a ton more power!
The 980 has had a price drop too.
And that's without considering the 980's higher overclock headroom.
 
First review in!

http://www.guru3d.com/articles_pages/msi_radeon_r9_390x_gaming_8g_oc_review,21.html

On average, 5-10% faster than a 290x...

Right on par with a 980, losing at some, winning at others, but neck and neck most of the time... All at ~$100 less, and with a more future-proof 8GB of RAM that benefits SLI/Crossfire setups.

The market has no place for a $400 980-like performer, right?! :p

(Special Note: Games not tested in SLI/Crossfire, so the true 8GB benefits aren't even explored yet)

(Special Note #2: The real boost is still yet to come, when DX12/Vulkan exploit the 290/390's higher compute unit count compared to most GPUs (44 CUs on the 290x vs 16 SMs on the 980, for example))

Salient points from the above-linked review:

Any Radeon R9 390X in most scenarios will be performing roughly at GeForce GTX 980 and Titan like performance, if priced right that is a pretty okay position to be in. Performance wise all modern games up-to 2560x1440 will run pretty good, and that is at the good image quality settings. For Ultra HD the 8 GB comes in handy, then again one card is not powerful enough to drive that resolution for gaming with high image quality settings and proper AA levels. What I am trying to say is that the 8GB graphics memory is nice and welcome, but might be a little irrelevant for most end-users while you do pay a price premium for it.

Drivers then, I decided I MUST make a comment on them. My main concern with the 290 and 390 series products are not so much the hardware and the performance the GPU can deliver, no, my main concern is proper driver support. For the past year AMD's driver support has been sub-par, and that's the honest truth. WHQL driver releases are slow and often the Beta releases are released too late. A good example here was the recently released 'The Witcher 3', it took AMD four weeks before they released a driver that enabled Crossfire support. This is a problem that is hindering and bothering the end-users, AMD needs to step up their game and release 0-day driver releases with at the very least AAA rated titles. As hey, you want to play that game at launch day without the worry that your graphics card isn't optimized or multi-GPU enabled.
 
Really.
They haven't got HDMI 2.0, and that's been out for about a year.
They still haven't got a game library that lets developers use GPU-accelerated physics.
And this is despite being late to the party.

The [H] review is out.
The GPU is the same as or worse than the 290x; it needs higher vcore to get the small overclock, and that's with a better cooler that should allow for less voltage.
My 290x didn't need any voltage increase to get to 1070MHz.
That's zero increase in efficiency for the 390x. The power-use chart shows it's actually a lot worse!
They show that the power used by the whole system is 50% higher for the 390x vs the 980.
Isolating the video cards, it looks like the 390x uses about 2x the power of the 980.
So the GPUs used in the 390x look like rejects from the 290x that were waiting for a better cooler.

AMD is sadly shafting 290x owners: the 390x has higher tessellation performance on the same hardware.

I don't think the 390x is better value.
It is missing capabilities and features, isn't faster than the 980, and uses a ton more power!
The 980 has had a price drop too.
And that's without considering the 980's higher overclock headroom.

Do you run a 4K living room TV for gaming? If not, why do you care about HDMI 2.0... That market doesn't really apply to the 390x, so it's irrelevant. People aren't throwing together massive Crossfired/SLI living room rigs...

AMD is shafting 290x owners? Once again AMD has assured card buyers of extended driver support by continuing to use the tech in future lines; how is that screwing 290x owners? AMD does WAY better for last-gen buyers than NVidia, so your gripes are becoming comical at best...

And with the power thing... Power usage matters to some, but performance is what you buy a graphics card for; otherwise a console would make the most sense... Most of the time people's rigs are idle or off, so the extra power draw for the performance is worth it while in use... How long does that 980 have to be used to make up for the $100 price difference through power savings? -_-
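(Back-of-the-envelope, since that question is genuinely answerable; the 100W gap and $0.12/kWh rate below are assumptions for illustration, not measured figures from the reviews:)

```python
# All inputs are assumptions for illustration, not figures from the reviews.
extra_watts = 100        # assumed extra draw of the 390x under gaming load
price_gap_usd = 100      # the ~$100 street-price difference discussed above
rate_usd_per_kwh = 0.12  # assumed electricity price

hours = price_gap_usd / (extra_watts / 1000 * rate_usd_per_kwh)
print(f"{hours:,.0f} hours")  # ~8,333 hours, i.e. 4+ hours/day for 5 years
```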

And I'm also waiting for DX12, where the compute unit and processor difference will really allow AMD's 'old tech' to shine like it was 'future-proofed' for. :)
 
Lemme know when the 390x can match my reference 980 clocked at 1590MHz and we can talk :D
 
Do you run a 4K living room TV for gaming? If not, why do you care about HDMI 2.0...
Odd question. I will upgrade to 4K, possibly in the lifetime of this card.

That market doesn't really apply to the 390x, so it's irrelevant. People aren't throwing together massive Crossfired/SLI living room rigs...
I guess you missed the 390x promotion from AMD where they tout its 4K ability :p
It's a good job people aren't using CF 390x in the living room, because they would be very disappointed.

AMD is shafting 290x owners? Once again AMD has assured card buyers of extended driver support by continuing to use the tech in future lines; how is that screwing 290x owners? AMD does WAY better for last-gen buyers than NVidia, so your gripes are becoming comical at best...
Yes, AMD is withholding performance increases from the 290x.
Check the beginning of the [H] review.
One of the features of the 390x on page 1 is "Faster Tessellation".
Then [H] noted a surprising amount of extra performance compared to the 290x on page 3 and stated:
We ran two separate tests here because we couldn't believe how much faster the MSI R9 390X was over the R9 290X, but it was that much faster. It comes much closer to GTX 980 performance in this game, and that is with HairWorks enabled in both tests above. There just might be something to that improvement in tessellation performance noted in the introduction.
They are the exact same GPU core, yet the 290 isn't getting the performance increase.

And with the power thing... Power usage matters to some, but performance is what you buy a graphics card for; otherwise a console would make the most sense... Most of the time people's rigs are idle or off, so the extra power draw for the performance is worth it while in use... How long does that 980 have to be used to make up for the $100 price difference through power savings? -_-
You have no argument; I covered performance already.
There are many points where the 390x isn't as good. The argument doesn't even need power for people to make the best choice.
Yet there it is, another down point.

Also consider that it matters when people need to upgrade their PSU to use a single 390x.
Crossfire needs a serious PSU.
The value is diminishing.
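(Ballpark PSU math, for the sake of argument; the 275W per board and 150W rest-of-system figures are assumptions, as is the ~30% headroom:)

\[
(2 \times 275\,\mathrm{W} + 150\,\mathrm{W}) \times 1.3 \approx 910\,\mathrm{W}
\]

so a CF 390x build is realistically shopping in the 850-1000W range.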
 
Odd question. I will upgrade to 4K, possibly in the lifetime of this card.


I guess you missed the 390x promotion from AMD where they tout its 4K ability :p
It's a good job people aren't using CF 390x in the living room, because they would be very disappointed.

People aren't going to be using a single 390x or 980 for 4K gaming... so a 4K TV (not a monitor, which has, you know, DisplayPort) isn't likely to be used when Crossfiring or SLIing at this level... Not the 390x's market.

Yes, AMD is withholding performance increases from the 290x.
Check the beginning of the [H] review.
One of the features of the 390x on page 1 is "Faster Tessellation".
Then [H] noted a surprising amount of extra performance compared to the 290x on page 3 and stated:

They are the exact same GPU core, yet the 290 isn't getting the performance increase.

So now you are mad that the 390x actually did improve at something? As opposed to being mad that it didn't improve at anything?

Seriously man, which way is it? Are you just going to be pissed off no matter what?

Also consider that it matters when people need to upgrade their PSU to use a single 390x.
Crossfire needs a serious PSU.
The value is diminishing.

The alternative is either SLI 980 Tis, which would also need a new PSU and would cost a ton more... Or SLI 980s, which wouldn't require a new PSU, but in SLI for 4K/multi-monitor it would be limited by its 4GB of GDDR5... Point being, if you want to Crossfire/SLI for insane-level gaming (4K/multi-monitor/VR), the 390x has the same power requirements as the next 'over 4GB' option...

You just keep on stretching with your 'logic' and misapplied scenarios.
 
When someone flashes their 290X to a 390X and their tessellation performance goes up, then I'll be angry.
Until then... Meh. Kudos to AMD for fixing it.
 
When someone flashes their 290X to a 390X and their tessellation performance goes up, then I'll be angry.
Until then... Meh. Kudos to AMD for fixing it.

Really? The 390x is a real 980 competitor for less money, AND 290x owners get 'free upgrades' via BIOS/drivers (as well as fire sales), and it angers you?

I could see it if you were mad that "AMD isn't making new tech, they're just reusing old tech" and AMD weren't releasing Fiji... But hey, they are leading the pack in 'tech'.

NVidia spends time neutering their chips for multiple SKUs while ignoring old tech, while AMD spends time making real 'new tech' and updating old tech... and you're mad at AMD?

Where is the logic in your anger?
 
The 390x is a real 980 competitor if you don't overclock, which with the 980 is as simple as "click slider, slide it to +100MHz, click power, slide it to 110%" and all of a sudden it's game over. What I've just outlined here will work on literally any 980 out there. Hell, right out of the box mine actually did, I believe, +170 and +300 to memory. The actual resulting clocks from +170 base ended up being +255 core clock.
 
When someone flashes their 290X to a 390X and their tessellation performance goes up, then I'll be angry.
Until then... Meh. Kudos to AMD for fixing it.

I thought similar, but it's such a large increase, it's a bit nuts.

In the last Witcher 3 test on page 3, the 390x is 40% faster than the 290x!!
The most the overclock could give is 20%, and that's only if performance were solely dependent on memory speed, which it won't be.
So on this game we have to conclude that at least a ~17% performance increase, and realistically more like 27%, is being denied to 290x owners.
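(Treating the speedups as multiplicative; the 10% core and 20% memory overclock figures below are assumptions read off the factory clocks, not measured scaling:)

\[
\frac{1.40}{1.20} \approx 1.17, \qquad \frac{1.40}{1.10} \approx 1.27
\]

Even crediting the overclock with the full memory-bound 20%, roughly 17% of the gain is unexplained by clocks; with the more realistic ~10% core bump, it's closer to 27%.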

I sold my 290x to a friend; we have already agreed to flash a 390x BIOS onto it when 4GB-modded ones appear.
If there is code to prevent it working on a 290x, modders should be able to disable it.
 
The 390x is a real 980 competitor if you don't overclock, which with the 980 is as simple as "click slider, slide it to +100MHz, click power, slide it to 110%" and all of a sudden it's game over. What I've just outlined here will work on literally any 980 out there. Hell, right out of the box mine actually did, I believe, +170 and +300 to memory. The actual resulting clocks from +170 base ended up being +255 core clock.

AMD's offerings seem to overclock ~100MHz over AIB manufacturers' factory overclocks... While less than NVidia's in clock frequency, the gains look very similar... Big overclock numbers look nice, but how they translate into performance matters more...
 
People aren't going to be using a single 390x or 980 for 4K gaming... so a 4K TV (not a monitor, which has, you know, DisplayPort) isn't likely to be used when Crossfiring or SLIing at this level... Not the 390x's market.
Precisely, yet the 390x was marketed by AMD for this exact use.
I will buy another 980 if performance isn't good enough.
I'm not that bothered about the 4GB issue because I tend to use low AA anyway.

So now you are mad that the 390x actually did improve at something? As opposed to being mad that it didn't improve at anything?
I'm not mad, I don't care, because I got a 980 to get rid of the driver issues. My gaming life is a whole lot better :)
I wouldn't blame 290x owners for being annoyed.

Seriously man, which way is it? Are you just going to be pissed off no matter what?
It's not me who's pissed off, dude lol.

The alternative is either SLI 980 Tis, which would also need a new PSU and would cost a ton more... Or SLI 980s, which wouldn't require a new PSU, but in SLI for 4K/multi-monitor it would be limited by its 4GB of GDDR5... Point being, if you want to Crossfire/SLI for insane-level gaming (4K/multi-monitor/VR), the 390x has the same power requirements as the next 'over 4GB' option...
Already covered the 4GB issue.
For first gen 4K, I'll reduce settings a little.
It will make me look forward to my next upgrade.

You just keep on stretching with your 'logic' and misapplied scenarios.
Strange that you should state your exact problem :p
 
AMD's offerings seem to overclock ~100MHz over AIB manufacturers' factory overclocks... While less than NVidia's in clock frequency, the gains look very similar... Big overclock numbers look nice, but how they translate into performance matters more...

And there's the problem: only an extra 9% is on the table, for a total overclock of 14%.
I can get a 30% overclock from a stock 980 without pushing the voltage much, with only a little fan noise.
I run it at 26% and my card is almost silent.
 
Precisely, yet the 390x was marketed by AMD for this exact use.

4K monitors have DisplayPort. Same with FreeSync, etc. I don't think AMD marketed the 390x as an HTPC solution for the living room, did they? But they did market 4K, yes.

I will buy another 980 if performance isn't good enough.
I'm not that bothered about the 4GB issue because I tend to use low AA anyway.

You like your lower settings at the extreme end, and similar performance everywhere else, for more money... And you want the world to know you think the less expensive, higher-settings option is not as good... Got it.

I'm not mad, I don't care, because I got a 980 to get rid of the driver issues. My gaming life is a whole lot better :)

Gotcha, you got no gripes but just want to spend time griping...

I wouldn't blame 290x owners for being annoyed.

For driver and possible BIOS improvements? That would be odd to be annoyed about.

Already covered the 4GB issue.
For first gen 4K, I'll reduce settings a little.
It will make me look forward to my next upgrade.

Again, gotcha. More money for lower settings and the same performance is the best! The 390x being less money for higher settings and the same performance sucks!

Or whatever... :p

And there's the problem: only an extra 9% is on the table, for a total overclock of 14%.
I can get a 30% overclock from a stock 980 without pushing the voltage much, with only a little fan noise.
I run it at 26% and my card is almost silent.

If a 390x sees the same magnitude of gains from a 15% overclock that a 980 sees from a 30% overclock, why does it matter that the NVidia card clocks higher? Wouldn't that just further balance the power difference between the cards?
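(A toy illustration of that hypothetical; every number here is made up to show the shape of the argument, not taken from any review:)

```python
# Toy numbers only: the point is that fps gained, not MHz gained, is what counts.
def fps_after_oc(base_fps, oc_fraction, scaling):
    """Final fps given an overclock fraction and a perf-per-clock scaling factor."""
    return base_fps * (1 + oc_fraction * scaling)

base = 60.0                               # assume both cards start roughly equal
r390x  = fps_after_oc(base, 0.15, 0.90)   # 15% OC, assumed 0.90 scaling -> 68.1 fps
gtx980 = fps_after_oc(base, 0.30, 0.45)   # 30% OC, assumed 0.45 scaling -> 68.1 fps
print(r390x, gtx980)                      # identical despite 2x the clock headroom
```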
 
4K monitors have DisplayPort. Same with FreeSync, etc. I don't think AMD marketed the 390x as an HTPC solution for the living room, did they? But they did market 4K, yes.
Both you and AMD said living room lol.

You like your lower settings at the extreme end, and similar performance everywhere else, for more money... And you want the world to know you think the less expensive, higher-settings option is not as good... Got it.
Most games will run at max.
A few will have to run at just below max.
I'm not bothered, not sure why you are.

You need to cut down on the psychedelics; CF AMD gaming isn't possible in the living room at 4K (unless you like 30fps 4:2:0).
We've covered this a few times.

Gotcha, you got no gripes but just want to spend time griping...
Since you started this thread you have twisted reality.
You called it speculation at one point.
Now that we have the facts you are still doing it.
I find it fun to correct you :p

For driver and possible BIOS improvements? That would be odd to be annoyed about.
It's the same hardware.
The improvement won't be in the BIOS, other than changes in default voltage; it will be the driver that contains the performance benefits.
Like I said, I wouldn't blame 290x owners for being miffed if they didn't get to see the same benefits.

Again, gotcha. More money for lower settings and the same performance is the best! The 390x being less money for higher settings and the same performance sucks!
Making things up again.
It's higher settings, higher performance, more capabilities, and more features for only a little more money.
Perhaps less money if you have to buy a PSU for the 390x.
The value for money isn't there.
I think you forgot to figure in the 980 price drop, but even without it, I wouldn't consider getting or recommending the 390x.
And that's without considering my main bugbear, driver problems!

jamesgalb needs to be poofed "AMD White Knight".
Hehe
 
It cuts it for me. Same price as a Ti and slightly better performance. No reason for me to go with the 980 Ti personally.

Less RAM, no DVI, and no HDMI 2.0 kill the card, on top of that.

That's also vs a stock card; most 980 Tis hit 1300MHz pretty easily.
 
They still havent got a game library to allow developers to use GPU accelerated Physics.
Do you actually know what you are talking about? Developers can use DirectCompute or OpenCL to write whatever GPU-accelerated physics they want. They also offer TressFX, which is GPU-accelerated hair physics. Plus they designed Mantle, which is being used as the basis for pretty much every low-level graphics API on PC, which is a lot more important than physics middleware.
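(To make that concrete: a minimal, vendor-neutral GPU physics step in OpenCL via pyopencl; the kernel is an illustrative sketch, not code from TressFX or any shipping middleware:)

```python
# Illustrative only: a trivial particle "physics" step written against the
# vendor-neutral OpenCL API (via pyopencl), runnable on AMD or NVidia GPUs.
import numpy as np
import pyopencl as cl

KERNEL = """
__kernel void integrate(__global float4 *pos,
                        __global float4 *vel,
                        const float dt)
{
    int i = get_global_id(0);
    vel[i].y -= 9.81f * dt;   /* gravity */
    pos[i] += vel[i] * dt;    /* explicit Euler step */
}
"""

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
prog = cl.Program(ctx, KERNEL).build()

n = 1 << 16  # 65k particles
pos = np.zeros((n, 4), dtype=np.float32)
vel = np.random.rand(n, 4).astype(np.float32)

mf = cl.mem_flags
d_pos = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pos)
d_vel = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=vel)

prog.integrate(queue, (n,), None, d_pos, d_vel, np.float32(1.0 / 60.0))
cl.enqueue_copy(queue, pos, d_pos)  # read the updated positions back
```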
 
Mantle is not "being used as the basis for pretty much every low-level graphics API on PC".

If anything it's the other way around; DX12 didn't come out of thin air.
 
Vulkan is nearly a copy/paste of Mantle. From my understanding DX12 is very similar to both of them, and was announced only after AMD spurred them on with Mantle.
 
Vulkan is nearly a copy/paste of Mantle. From my understanding DX12 is very similar to both of them, and was announced only after AMD spurred them on with Mantle.

Other way around: Mantle was the Vulkan alpha.

And either way it's not showing much promise; even Apple is dumping Vulkan.
 
Do you actually know what you are talking about? Developers can use DirectCompute or OpenCL to write whatever GPU-accelerated physics they want. They also offer TressFX, which is GPU-accelerated hair physics. Plus they designed Mantle, which is being used as the basis for pretty much every low-level graphics API on PC, which is a lot more important than physics middleware.

I'm not sure you know what you are talking about.
I said Game Library.
 
Less RAM, no DVI, and no HDMI 2.0 kill the card, on top of that.

That's also vs a stock card; most 980 Tis hit 1300MHz pretty easily.

Don't care about DVI or HDMI 2.0. I have DP on my monitors. We'll see if the RAM hurts the performance.
 
http://www.maximumpc.com/3dmark-benchmarks-show-amd-fury-x-and-nvidia-titan-x-are-neck-and-neck/

Not looking good for the Red team... slightly slower than a 6-month-old card isn't going to cut it.

So all you see is memory, that's all? Lol, a $649 card running neck and neck with a $1k card is not good enough? Seriously, talk about total blindness to what's in front of your eyes; you don't have to be a fanboy of either camp to admit the truth. If $350 is worth nothing to you then buy what makes you happier, but don't deny that at least the Fury X is a card that's good value for money.
 
So all you see is memory, that's all? Lol, a $649 card running neck and neck with a $1k card is not good enough? Seriously, talk about total blindness to what's in front of your eyes; you don't have to be a fanboy of either camp to admit the truth. If $350 is worth nothing to you then buy what makes you happier, but don't deny that at least the Fury X is a card that's good value for money.

Except there's already a $650 card that runs neck and neck w/ a Titan X, and it's available now, with DL-DVI and HDMI 2.0.

And it doesn't need an external radiator.
 
Except there's already a $650 card that runs neck and neck w/ a Titan X, and it's available now, with DL-DVI and HDMI 2.0.

And it doesn't need an external radiator.

LOL, people make it seem like the radiator is actually required, lmao. The board has a 275W TDP and now people bitch about a quiet cooling solution. Seriously, I am all for the 980 Ti and Fury X, but on what basis people justify their purchase is beyond me. One could argue the same thing, that a watercooled 980 Ti costs $100 more :rolleyes:.

So now a stock watercooler on a card that is the same price as a 980 Ti is somehow a bad thing. The 290x had a higher TDP than the Fury X and that didn't require a watercooler. The watercooler isn't required; people need to stop making stupid statements.
 
Except there's already a $650 card that runs neck and neck w/ a Titan X, and it's available now, with DL-DVI and HDMI 2.0.

And it doesn't need an external radiator.

Realiturd, it needs a rad because the VRAM is on an interposer with the die.
And for those Envydiots and Envyturds who are complaining about VRAM, AMD had a demo that showcased their superiority over NVidia at higher resolutions again. They ran Dirt Rally at 3×4K (12K Eyefinity) at 60fps ultra on a single Fury X card.
And please stop spreading FUD. The watercooled Fury is the Fury X, aka the flagship; the air-cooled Fury is Fiji Pro (a cut-down chip); and the R9 Nano is a full-fat Hawaii (3072 SPs) with Fiji improvements + enhancements.
 
Realiturd, it needs a rad because the VRAM is on an interposer with the die.
And for those Envydiots and Envyturds who are complaining about VRAM, AMD had a demo that showcased their superiority over NVidia at higher resolutions again. They ran Dirt Rally at 3×4K (12K Eyefinity) at 60fps ultra on a single Fury X card.
And please stop spreading FUD. The watercooled Fury is the Fury X, aka the flagship; the air-cooled Fury is Fiji Pro (a cut-down chip); and the R9 Nano is a full-fat Hawaii (3072 SPs) with Fiji improvements + enhancements.

LOL

This is what a butthurt AMD post looks like; someone screenshot this.
 
The 980 Ti sits between the 980 and Titan X, so not really.
And it has more RAM, DVI, HDMI 2.0, overclocks well, etc.

But your argument was...

...slightly slower than a 6-month-old card isn't going to cut it.

The 980 Ti fits that bill, too.

Now regardless of specs (well, except for the lack of support for non-DP high-resolution or high-refresh-rate displays... that's just lame), if the performance is there, how is it any different from the 980 Ti?
 
2GB less RAM and 6 months late?

This is the card that's going to be fighting Pascal; if they want any hope of winning, it needs to be fast.

Also, it relies heavily on drivers, and AMD driver support hasn't been the greatest.
 