AMD Fury Series is here. Discussion thread.

Anyone who has one of these getting the pump whine that was supposedly only in review units? I've seen a few posts from people with retail cards with the same issue.
 
Anyone who has one of these getting the pump whine that was supposedly only in review units? I've seen a few posts from people with retail cards with the same issue.
http://wccftech.com/amd-radeon-fury-x-reportedly-suffering-buzzing-coil-whine/

According to them, CM has confirmed it's a pump issue and is fixed in new batches.
But we were also told it was ONLY review samples, which was incorrect.

AMD’s Antal Tungler has confirmed that the problem exists in early production units. However a fix (for the pump whine) has been applied by Cooler Master USA and it is hoped that the problem has been resolved for future R9 Fury X units.
 
I've seen the info about the pump issues. What I find annoying is that some etailers are refusing to replace the card for this issue. Some of them are basically lumping it into the coil whine category and saying deal with it, when according to AMD and Cooler Master it's an issue that should have been resolved for the retail boards.
 
This is why you don't early-adopt a product that was obviously rushed to market.
See also: Gigabyte G1 980 Ti.
 
People buying the Fury X cards: can you explain the rationale for buying one? Just wondering, given what I have read. It seems to be out of stock everywhere, like the 980 Ti, so people must be seeing something I am not.

Would be great to hear it.
 
Thanks. Very interesting link. Had not seen that review :).
Another question that I have asked multiple times but without an answer: given that I have a 650D case, is there any way to fit two AIO GPUs plus an AIO CPU cooler in that case, or will I have to buy a new one? If a new one, what would typically fit two Fury X cards? (I am going to install an H110i for my CPU soon.)
 
Thanks. Very interesting link. Had not seen that review :).
Another question that I have asked multiple times but without an answer: given that I have a 650D case, is there any way to fit two AIO GPUs plus an AIO CPU cooler in that case, or will I have to buy a new one? If a new one, what would typically fit two Fury X cards? (I am going to install an H110i for my CPU soon.)

Honestly I think it would work out fine if you got a single-fan 120mm cooler for the CPU as well. Then you could fit two coolers on top of the case and one on the back. There's just not room on top for two coolers if you're using a double-size CPU AIO, but it might work with the smaller one.
 
PLP? Dinosaurs are roaming your desktop.

I think it's time for you to upgrade to higher resolution monitors.

If you point me to a 40" 21:9 4K monitor, then I might upgrade, because I wouldn't lose that much in the FOV department. But getting a 34" 21:9 4K means I lose in every way except resolution compared to my 30" 1600p monitor.

A 40" 16:9 4K monitor is vertically too big, and it has a shit aspect ratio :(
 

New review, same as every review..

"the problem AMD faces is that the GTX 980 Ti is the safer bet. On average it performs better at every resolution, it has more VRAM, it consumes a bit less power, and NVIDIA’s drivers are lean enough that we aren’t seeing CPU bottlenecking that would impact owners of 144Hz displays. To that end the R9 Fury X is by no means a bad card – in fact it’s quite a good card – but NVIDIA struck first and struck with a slightly better card, and this is the situation AMD must face. At the end of the day one could do just fine with the R9 Fury X, it’s just not what I believe to be the best card at $649."
 
People buying the Fury X cards: can you explain the rationale for buying one? Just wondering, given what I have read. It seems to be out of stock everywhere, like the 980 Ti, so people must be seeing something I am not.

Would be great to hear it.

Freesync monitor + FuryX = $1,100
G-Sync monitor + 980 Ti Hybrid = $1,600
 
Honestly I think it would work out fine if you got a single-fan 120mm cooler for the CPU as well. Then you could fit two coolers on top of the case and one on the back. There's just not room on top for two coolers if you're using a double-size CPU AIO, but it might work with the smaller one.

He could also grab a drill. ;)

Do you need all those HDD cages? Rip one out if not. It's easy to drill out rivets; I did it on my 330R.
 
If you point me to a 40" 21:9 4K monitor, then I might upgrade, because I wouldn't lose that much in the FOV department. But getting a 34" 21:9 4K means I lose in every way except resolution compared to my 30" 1600p monitor.

A 40" 16:9 4K monitor is vertically too big, and it has a shit aspect ratio :(

You can do a custom 21:9 resolution on a 40" 4K monitor: 3840x1607. I game like this when playing GTA V and Project Cars.

After I discovered this, standard 21:9 monitors became obsolete.
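For anyone who wants to derive the height for other custom letterboxed resolutions, the arithmetic is just width divided by aspect ratio; a quick sketch (note that 3840x1607 works out to roughly the 2.39:1 cinema ratio that "21:9" marketing approximates):

```python
def letterbox_height(width, ratio):
    """Vertical resolution for a given width and aspect ratio (width/height)."""
    return round(width / ratio)

print(letterbox_height(3840, 21 / 9))  # 1646 -- exact 21:9 at 4K width
print(letterbox_height(3840, 2.39))    # 1607 -- cinema 2.39:1, the resolution used above
```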
 
Grasping. You don't need a Hybrid 980 Ti.


Well, I was fixated on running quiet and cool, but after some research I saw the Hybrid has its own noise issues (albeit not as piercing). I just exchanged my Fury for two 980 Ti ACX 2.0+ cards.

I can appreciate the innovation of HBM and the cost effectiveness of AMD/FreeSync, but I didn't want to deal with the pump whine issue. The prospect of RMAing right off the bat for a known issue kind of soured me on it all.
 
Gsync is far superior in every way but price. At max overclock on each, a 980 Ti is about 20% faster.

Must suck to be loyal to only one company; you end up with inferior stuff. I only buy the best, no matter the maker.
 
Gsync is far superior in every way but price. At max overclock on each, a 980 Ti is about 20% faster.

Must suck to be loyal to only one company; you end up with inferior stuff. I only buy the best, no matter the maker.

You do realize that "the best" really is subjective. And price is a factor for most people. So far there are a few Adaptive-Sync monitors getting relatively decent reviews and being hailed as competitive with G-Sync, and of course there are others needing more work. But also consider the simple fact that Adaptive-Sync is quickly surpassing G-Sync in numbers, and choice always helps with pricing.
 
Gsync is far superior in every way but price. At max overclock on each, a 980 Ti is about 20% faster.

Must suck to be loyal to only one company; you end up with inferior stuff. I only buy the best, no matter the maker.

I agree that the 980 Ti is faster overall. But G-SYNC and FreeSync are really on the same level. I guess you could knock FreeSync for not doubling frames below the lower limit of the monitor, but I think a person should have the mental capacity to choose settings that keep the game within the FreeSync range of the monitor.

AMD has FRTC to limit higher frame rates to the FreeSync range, or you can use RadeonPro and others to do it. According to PCPer, as long as you stay in the FreeSync range, there isn't a difference between it and G-SYNC other than FreeSync being cheaper.

Maybe Nvidia will add some new features to differentiate the two technologies at a later date. Competition is always good.
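To illustrate the idea behind a frame-rate target like FRTC (this is just a sketch of the concept, not AMD's implementation): each frame gets padded out to the cap's frame-time budget so the output never exceeds the monitor's FreeSync range.

```python
def frame_delay_ms(frame_time_ms, fps_cap):
    """Sketch of frame-rate capping: how long to wait after a frame
    so the effective frame rate never exceeds fps_cap."""
    budget_ms = 1000.0 / fps_cap
    return max(0.0, budget_ms - frame_time_ms)

# On a 75 Hz FreeSync panel, cap at 75 fps: a frame rendered in 8 ms
# gets padded out to the ~13.3 ms budget.
print(round(frame_delay_ms(8.0, 75), 1))  # 5.3
print(frame_delay_ms(20.0, 75))           # 0.0 -- already below the cap
```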
 
Got my Fury X

-Pump whines. Sounds like coil whine, but it's the pump; I tested by unplugging the pump's power cord.
-Doesn't OC for shit at stock voltage: 1120 MHz max stable. We'll see how it fares after we get voltage control. My reference 290X didn't go over 1100 MHz at stock voltage, but it was game-stable at 1250 MHz after playing with the voltage.

And I'm still excited because 4960x1600p PLP actually works beautifully :D
God how long have I waited for this feature to emerge. And now I have a card with hardware support for it and which actually has the horsepower to drive that resolution. And if I run out of juice, I can just get a second one for some CF sweetness (or badness). This wasn't possible before because PLP gaming had to be done in windowed mode and CF doesn't support windowed gaming.
 
Got my Fury X

-Pump whines. Sounds like coil whine, but it's the pump; I tested by unplugging the pump's power cord.
-Doesn't OC for shit at stock voltage: 1120 MHz max stable. We'll see how it fares after we get voltage control. My reference 290X didn't go over 1100 MHz at stock voltage, but it was game-stable at 1250 MHz after playing with the voltage.

And I'm still excited because 4960x1600p PLP actually works beautifully :D
God how long have I waited for this feature to emerge. And now I have a card with hardware support for it and which actually has the horsepower to drive that resolution. And if I run out of juice, I can just get a second one for some CF sweetness (or badness). This wasn't possible before because PLP gaming had to be done in windowed mode and CF doesn't support windowed gaming.

PLP, how does the 980 Ti run that, btw? :D
The pump sounds are from an early batch, and if you're troubled by it you could ask for a replacement card. Voltage support should be coming soon.
Waiting for the 14th of July for more options with the Fury myself. :)
 

4K Surround != PLP :rolleyes:

There are only two cards on the market which support this little feature: the Fury X and the R9 285.

In case you don't know what PLP actually means: Portrait-Landscape-Portrait monitor setup.

[Image: PLP monitor setup]

Btw, I would have bought a 980 Ti in a heartbeat if it actually supported that little feature, but heck, those green cards still don't even support 10-bit color depth on their consumer cards, while AMD has 12-bit support in its consumer cards.
 
4K Surround != PLP :rolleyes:

There are only two cards on the market which support this little feature: the Fury X and the R9 285.

In case you don't know what PLP actually means: Portrait-Landscape-Portrait monitor setup.

[Image: PLP monitor setup]

Btw, I would have bought a 980 Ti in a heartbeat if it actually supported that little feature, but heck, those green cards still don't even support 10-bit color depth on their consumer cards, while AMD has 12-bit support in its consumer cards.

Assuming 980 ti or titan x can't do this, it's probably the first valid reason I've seen for anyone really picking fury x over 980 ti.
 
Assuming 980 ti or titan x can't do this, it's probably the first valid reason I've seen for anyone really picking fury x over 980 ti.

It's the only reason why I didn't order an EVGA GeForce GTX 980 Hydro Copper instead. I actually ordered the Fury before even seeing the review results, because I knew it would have the PLP Eyefinity support I've been waiting far too long for.
 
We shouldn't expect performance to be that much less, probably 10-15% less than Fury X. But, without that huge radiator and pump

lol.. what are they, smurfs? It's not huge by any metric. The DirectCU II cooler on my 7970s, now that could be considered huge. I've been reading a lot of funny shit today. Good times, good times :D
 
Assuming 980 ti or titan x can't do this, it's probably the first valid reason I've seen for anyone really picking fury x over 980 ti.
In addition to supporting PLP, Fury has much, much lower idle power consumption than NVIDIA cards when driving three (non-identical) monitors.

http://www.computerbase.de/2015-06/...10/#diagramm-leistungsaufnahme-gesamtsystem_4

A Fury X system will consume 87 W, while a 980 Ti system consumes 136 W at idle with three monitors.
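That ~49 W idle gap adds up if the machine sits at the desktop a lot. A rough sketch of the yearly electricity cost difference (the $0.12/kWh rate and 8 idle hours a day are assumptions; substitute your own):

```python
def yearly_cost(extra_watts, hours_per_day, price_per_kwh=0.12):
    """Rough yearly cost of extra idle power draw (assumed $0.12/kWh)."""
    return extra_watts / 1000 * hours_per_day * 365 * price_per_kwh

# 136 W (980 Ti system) vs. 87 W (Fury X system) at idle
print(round(yearly_cost(136 - 87, 8), 2))  # 17.17 -- dollars per year
```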
 
4K Surround != PLP :rolleyes:

There are only two cards on the market which support this little feature: the Fury X and the R9 285.

In case you don't know what PLP actually means: Portrait-Landscape-Portrait monitor setup.

[Image: PLP monitor setup]

Btw, I would have bought a 980 Ti in a heartbeat if it actually supported that little feature, but heck, those green cards still don't even support 10-bit color depth on their consumer cards, while AMD has 12-bit support in its consumer cards.
That's really cool. I wish I had enough desk space to go back to three monitors :( I need a bigger house. :p
 
So apparently Nvidia has been cheating by defaulting to lower AF quality compared to AMD, even in their own GameWorks games. It took a regular user running cards from both vendors to catch this.
"Professional" review sites should be ashamed.
 
So apparently Nvidia has been cheating by defaulting to lower AF quality compared to AMD, even in their own GameWorks games. It took a regular user running cards from both vendors to catch this.
"Professional" review sites should be ashamed.

It's nothing new; Nvidia has been doing this for years. I thought it was common knowledge?

I have known about this since the 500 series days... shrug.

But AMD does it as well. You have to change settings in CCC to not use game optimizations.
 
Look at the blurry mess a Titan X displays on screen compared to Fury X

https://www.youtube.com/watch?v=wyYeT9Wvy1A

Pause it at 47 seconds and scan the image from bottom to top. Notice that the ground textures start out nearly equally sharp, but as you move up, the Nvidia textures get progressively blurrier while the AMD textures remain sharp.

[Image: Titan X vs. Fury X texture filtering comparison]
 
I think AMD/ATI has always had better image quality.

Then why didn't review sites mention this? We assume it's a fair comparison when we read benchmarks run at the same in-game settings, except it looks worse on NV GPUs, to the point where the blurry textures mean it doesn't even deserve to be in "Ultra" comparisons.
 
AF these days is practically free, I've only seen like a 1-2% performance hit with it enabled in most games. Back in the day, it used to take more of a toll.

The notion that NV is cheating to save 1-2% performance at a massive IQ reduction is completely ridiculous. More than likely it's a driver bug in that particular game. Or it could be a user issue.

If you have a high end card, should always just force 16x AF anyway.

Here is the AF performance hit in GTA V:

[Image: GTA V anisotropic filtering performance chart]
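To put that 1-2% figure in concrete FPS terms (the numbers below are made up purely for illustration):

```python
def perf_hit_pct(fps_setting_off, fps_setting_on):
    """Percentage cost of enabling a setting, from average FPS with it off/on."""
    return (fps_setting_off - fps_setting_on) / fps_setting_off * 100

# hypothetical: 60 fps with AF off vs. 59 fps with 16x AF on
print(round(perf_hit_pct(60.0, 59.0), 1))  # 1.7
```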
 