AMD Radeon R9 Fury X Video Card Review @ [H]

What good does any of this bickering do? The fact is, Fury X loses. AMD... loses. Nvidia wins here, big time. The real questions people should be asking are "What is my budget?" and "How does the fact that flagship GPUs will easily cost $1,000 in the future affect me, as a consumer?"

Indeed. No true enthusiast on here is happy that AMD has been faltering so badly lately. Competition is what keeps card prices reasonable.
 
Actually, we all lose. If AMD had put out a superior card it would have been better for everyone.
 
It seems to me that those who disagree with the review are in denial. We are judging the Fury X based on what it offers for the asking price ($650) compared to the competition. Since it is priced exactly the same as the GTX980 Ti, that is what we're comparing it against. In that regard, it does not look good for the Fury X.
- 4GB VRAM (vs GTX980 Ti's 6GB)
- less performance (than GTX980 Ti)
- no HDMI 2.0 (which the GTX980 Ti has)
- requires space in a case for a WC radiator (no such requirement for GTX980 Ti)

This is why the Fury X fails at $650. If you want to spend your hard-earned $650 on the Fury X (over a GTX980 Ti), by all means go ahead.

Fury X @ $650 does nothing to help us consumers. If you want GTX980 Ti performance, it will still cost you $650. If you want slightly less than GTX980 Ti performance (Fury X), it will still cost you $650. How do consumers win? We don't. Like another poster said, the only winner here is Nvidia.
 
It seems to me that those who disagree with the review are in denial. We are judging the Fury X based on what it offers for the asking price ($650) compared to the competition. Since it is priced exactly the same as the GTX980 Ti, that is what we're comparing it against. In that regard, it does not look good for the Fury X.
- 4GB VRAM (vs GTX980 Ti's 6GB)
- less performance (than GTX980 Ti)
- no HDMI 2.0 (which the GTX980 Ti has)
- requires space in a case for a WC radiator (no such requirement for GTX980 Ti)

This is why the Fury X fails at $650. If you want to spend your hard-earned $650 on the Fury X (over a GTX980 Ti), by all means go ahead.

Fury X @ $650 does nothing to help us consumers. If you want GTX980 Ti performance, it will still cost you $650. If you want slightly less than GTX980 Ti performance (Fury X), it will still cost you $650. How do consumers win? We don't. Like another poster said, the only winner here is Nvidia.

Fully agree here. It's yet another in a long list of botched executions by AMD. What's even more baffling here is AMD held ALL the cards. They knew the performance of their card, they knew the performance and price of their most direct competition, the 980Ti. They literally went into this launch in the most informed way possible, and still managed to fuck it up.


The level of incompetence at AMD is on another level right now. I mean, how can you not even get your drivers working properly for one of the most anticipated launches in recent history? Now, I'm only assuming the odd performance variations are driver-related and not an inherent issue with the card itself. It just boggles my mind how they managed to mess up the launch despite having all the pieces of the puzzle.
 
I prefer most things about NV: hardware, drivers, feature set, etc., but I really wanted to see this card do well. And honestly it DID do pretty well, just not $650-well, and not 4K-well. I don't even game at 1440, let alone 4K, but I did want to see them reach their goal there.

I'm hoping they manage to squeeze more performance out of it with new drivers, game optimizations, etc. Hopefully they can tip things in favor of their bandwidth over VRAM amount, but this is definitely a puzzling product. If it wasn't so expensive, I'd pop one in my living-room PC just because, but at that price, it doesn't really entice me to do my usual AMD experimentation for this cycle.
 
It seems to me that those who disagree with the review are in denial. We are judging the Fury X based on what it offers for the asking price ($650) compared to the competition. Since it is priced exactly the same as the GTX980 Ti, that is what we're comparing it against. In that regard, it does not look good for the Fury X.
- 4GB VRAM (vs GTX980 Ti's 6GB)
- less performance (than GTX980 Ti)
- no HDMI 2.0 (which the GTX980 Ti has)
- requires space in a case for a WC radiator (no such requirement for GTX980 Ti)

This is why the Fury X fails at $650. If you want to spend your hard-earned $650 on the Fury X (over a GTX980 Ti), by all means go ahead.

Fury X @ $650 does nothing to help us consumers. If you want GTX980 Ti performance, it will still cost you $650. If you want slightly less than GTX980 Ti performance (Fury X), it will still cost you $650. How do consumers win? We don't. Like another poster said, the only winner here is Nvidia.

Initially, I was very disappointed with the Fury X but upon further reading, I think AMD may be justifying the price because of the added AIO cooler. People like myself have purchased a Kraken G10 and an AIO cooler (H55 for me) to cool their GPUs (R9 290 here). The added cost was $25 for the G10 and $55 for the H55 for me.

For me, I think the Fury X should have been slotted in at $550 to $600, a reasonable price compared to the GTX 980 Ti when you consider the overall performance gap. If you view the AIO cooler included with the Fury X to have a value of $55 to $75, perhaps it makes sense at $650.

With that said, AMD should have released an air cooled version at $550 to $600 for those who want to customize their cooling solution.
 
I've taken some time to answer this thread, especially taking into consideration all the butthurt guys; I'm certainly glad to see them suffering badly over this card.

First of all, as always: excellent review, Brent, and congratulations on keeping up your honest work, together with Kyle, who once again showed patience in this thread no matter what kind of attacks he took. This kind of review is what has made [H]OCP my choice since 2005.

And again I have to say I'm glad I went with a 980 Ti three weeks ago: three weeks of no longer waiting in pain for this card to turn out to be this kind of fail. I'm one of those who had high hopes for this card, but I've survived the AMD marketing team. This time, like every other year, they will only sell me one or two mid-range cards and nothing more.

Brent, I have to ask something about GTX 980 Ti boost clocks in games; that's a very important thing for the comparison, especially for those of us who use factory-OC'd cards. Are you also thinking of putting a non-reference card up against the Fury X? Monsters like the Gigabyte G1 have a huge advantage even over the Titan X out of the box due to their high clocks, which makes the 980 Ti even more appealing and an obvious choice over the Fury X.
 
Initially, I was very disappointed with the Fury X but upon further reading, I think AMD may be justifying the price because of the added AIO cooler. People like myself have purchased a Kraken G10 and an AIO cooler (H55 for me) to cool their GPUs (R9 290 here). The added cost was $25 for the G10 and $55 for the H55 for me.

For me, I think the Fury X should have been slotted in at $550 to $600, a reasonable price compared to the GTX 980 Ti when you consider the overall performance gap. If you view the AIO cooler included with the Fury X to have a value of $55 to $75, perhaps it makes sense at $650.

With that said, AMD should have released an air cooled version at $550 to $600 for those who want to customize their cooling solution.

If I design a car that requires a larger radiator to cool the engine, that costs more money, and the car is generally slower than its major competitor, that would be foolish.

If I don't sell a version of the car with a cheaper/alternate radiator, that would be foolish.
This all assumes that the car could run with a cheaper or alternate radiator.

In reality it probably can't, or we don't know, so it's not really a valid argument or consideration.

It's just another what-if for AMD, which is really disappointing, because that's all anyone is saying.

What if AMD lowered the price?
What if AMD sold it with an air cooler?
What if DX12 increases the performance?
What if you turn on Mantle?
 
What if you turn on Mantle?

Mantle is actually crap. In BF4 it offers decreasing performance over time, and lower performance than DX11 to begin with. In Thief, the same story, and even worse, because it's accompanied by severe stuttering.
 
What if AMD lowered the price?
What if AMD sold it with an air cooler?
What if DX12 increases the performance?
What if you turn on Mantle?

There's that, then there is what actually happened.
 
Didn't AMD say that Mantle on BF4 would need to be optimized for GCN 1.2 when you guys reviewed the 285? I don't think that ever happened, and Fury is 1.3.
 
Didn't AMD say that Mantle on BF4 would need to be optimized for GCN 1.2 when you guys reviewed the 285? I don't think that ever happened, and Fury is 1.3.


All I know is, if you BUY a Fury X card and want to play BF4 today, you will want to run DX. Beyond that, I really do not care what anyone or anything has to say about it. Results are what's conclusive.
 
I'm no expert but AMD can help themselves now by doing the first 2.

Personally, I'm sticking with my R9 290 at least for another year.

My suggestion would be yes. But what do I know?
 
It seems to me that those who disagree with the review are in denial. We are judging the Fury X based on what it offers for the asking price ($650) compared to the competition. Since it is priced exactly the same as the GTX980 Ti, that is what we're comparing it against. In that regard, it does not look good for the Fury X.
- 4GB VRAM (vs GTX980 Ti's 6GB)
- less performance (than GTX980 Ti)
- no HDMI 2.0 (which the GTX980 Ti has)
- requires space in a case for a WC radiator (no such requirement for GTX980 Ti)

This is why the Fury X fails at $650. If you want to spend your hard-earned $650 on the Fury X (over a GTX980 Ti), by all means go ahead.

Fury X @ $650 does nothing to help us consumers. If you want GTX980 Ti performance, it will still cost you $650. If you want slightly less than GTX980 Ti performance (Fury X), it will still cost you $650. How do consumers win? We don't. Like another poster said, the only winner here is Nvidia.

While all these things are pretty true, some are less relevant than others, imho.

The real loss for Fury is the 4GB hard limit holding back performance, and the lack of overclocking headroom to match the 980 Ti. The Fury X should have cost less to make up for not being able to match the 980 Ti. Adapters will be out for HDMI 2.0, and opinions on AIOs vary, but the 4GB hard limit is the real glaring problem that can't be fixed.
 
[H] is the only place I go for reviews, because of the real world testing. And this review does everything it should.

That being said, as consumers it would still be helpful to know what's going on under GameWorks' hood. If it does artificially damage a competitor's offering: first, that would likely be illegal, and second, it damages the market. If AMD were to go out of business, or fall as far behind as they are in CPUs, innovation would slow as one side could move purely at its own pace and maintain whatever pricing it wants.

Not saying it is [H]'s place to do that, but I would like to see someone with the technical chops take it on.
 
Fully agree here. It's yet another in a long list of botched executions by AMD. What's even more baffling here is AMD held ALL the cards. They knew the performance of their card, they knew the performance and price of their most direct competition, the 980Ti. They literally went into this launch in the most informed way possible, and still managed to fuck it up.


It just boggles my mind how they managed to mess up the launch despite having all the pieces of the puzzle.

I totally agree. Why would you make audacious claims about performance a week before launch, when you know it's not going to match your hype?

They'd be better off if they'd said nothing, released it at $599, and possibly said nothing about 4K (but since I haven't used the card, I don't know).
 
Priced $100 lower and without that stock watercooling, this thing would have been worth buying. Even if it were priced the same as a 980, MANY could not or would not purchase the Fury because of the watercooling requirement. I need to run at least two cards as it is. I absolutely do not have room for one additional WC radiator in my case, let alone two.

People on here saying HardOCP should have shown overclocking results on this thing are absolutely insane. The card needs the stock watercooling because without it, it would burn up. There is no overclocking room here. Maybe 10% at most. Maybe. But I'd be very scared doing so given how hot it is already running.

I can't even imagine how bad their OEM sales are going to be with this card. Again, the stock WC requirement is going to kill them in this realm. It's less of an issue of price/performance and more of an issue that this card will simply not work in many OEM configurations / cases.
 
So for you guys it DOES NOT MATTER that dev A develops game B...

1. Using proprietary code from nVIDIA.
2. Sometimes receiving money from nVIDIA.
3. Optimizing the game for nVIDIA GPUs because the game is using nVIDIA proprietary code.
4. nVIDIA has more time AND KNOWLEDGE to polish their drivers, given they have access to 100% of the game code.
5. AMD has NO access to 100% of the game code.
6. AMD has less time to polish their drivers... because sometimes the proprietary code is added a week or two before release day.
7. ...and... what if that proprietary code... that "secret code"... that "black box"... detects the presence of AMD GPUs and hinders their performance (!?)

...so for you guys all this is business as usual... cool ! :cool:


Ronan, I'm with you on GameWorks, but the Fury X still got beaten in BF4, which is not GameWorks. It's OK that they test GameWorks games so long as we still see numbers from NON-GameWorks titles that do NOT have the black-box code that is harder to optimize for, or perhaps tilted toward Nvidia's specific hardware advantages over AMD's (like compute throughput).

AMD needed a clear win to justify the same MSRP as the 980 Ti, and they did not get it, and this cannot be blamed on GameWorks, because we have non-GameWorks titles that still show the edge to the 980 Ti. It's that simple. It's still not a bad card, but like I've always said, AMD needs better than that to pull ahead.

The AMD fans will buy the Fury X as their chosen upgrade, the swing vote will break toward the 980 Ti (the dumber ones with money to light on fire, the Titan X), and the Nvidia fanboys will either stick with what they have or wait until Pascal next year.

The problem?

The swing vote is smaller than many people suspect, I think, but worse, AMD has a smaller fanbase that will support them than Nvidia does, so not getting the edge hurts them more than it does Nvidia. They needed a win, and did not get it. It may change with DX12 and Windows 10, or with more drivers, but by that time people will already have bought their cards. Day-one performance is important; that is what moves people, and Nvidia contorts their driver team into knots to get the most inflated numbers possible, and it works.
 
While all these things are pretty true, some are less relevant than others, imho.

The real loss for Fury is the 4GB hard limit holding back performance, and the lack of overclocking headroom to match the 980 Ti. The Fury X should have cost less to make up for not being able to match the 980 Ti. Adapters will be out for HDMI 2.0, and opinions on AIOs vary, but the 4GB hard limit is the real glaring problem that can't be fixed.

Problem here is, in this and most reviews, there is no proof of 4GB being an issue. None. Every benchmark I have seen at 4K has the Fury equal to the 980 Ti/Titan X: 4GB against 6/12GB. That's not to say it isn't an issue, but it alone doesn't explain why the Fury is neck and neck with its NVIDIA competitors. The complaint that 4GB is the problem for AMD, when 4K is the one tier where it puts up a solid performance, undermines itself. It doesn't explain why the NVIDIA counterparts struggle against the same card they easily outperformed at lower resolutions. Of course there is bus width and bandwidth, but the issue being raised was the 4GB, which seems unfounded, or at least needs a lot more investigation before such claims are made.
 
Fixed, thanks for the extra eyes. - Kyle
 
Kyle, not sure if you'll see this or if it's been answered. Was the 980 Ti boosting? Or were the cards "heated up" before testing to stabilize clocks? Just trying to get a better idea of what's going on.

Yes, as Kyle stated, we always take our results after the card has been used for at least 15-30 minutes of gameplay. Since we actually play the games before taking FRAPS data to find the highest playable settings, this allows the card to warm up sufficiently before the actual FRAPS grabs, so it works out naturally.

It was consistently boosting over the rated boost clock while gaming: 1201 MHz.
 
So there are three differently dated Catalyst 15.15 Fury drivers, and not all review sites got the same one... oh AMD, what have you done :eek:
 
Just asking again in case it got missed during all the needless fighting/trolling:

Did you explain how in the Dying Light tests you find that at 1440p you think it is VRAM limited, but then at 4K it isn't? Or are you saying that it is at both...?

Doesn't add up is all.

It was, at both resolutions. At 4K the game can use up to 6GB of VRAM, proof here - http://www.hardocp.com/article/2015/06/15/nvidia_geforce_gtx_980_ti_video_card_gpu_review/10

In the 4K test in this review, I lowered it to "Best Quality" instead of maximum settings, mainly because the 290X was way too slow to get usable data; with the framerates so low at maximum in-game settings, it bottlenecked the card severely, so I had to lower the game to "Best Quality" (thus lowering the overall VRAM usage for that particular test) just so I could get data.
 
[...] They needed a win, and did not get it. It may change with DX12 and Windows 10, or with more drivers, but by that time people will already have bought their cards. Day-one performance is important; that is what moves people, and Nvidia contorts their driver team into knots to get the most inflated numbers possible, and it works.

True; also, by the time DX12 is any kind of big deal, this card will be old.

The review here really expresses what this feels like: a proof of concept. Very poorly positioned and, IMHO, dead on arrival. There is not a single reason for me to buy it over a 980 Ti.

Regarding the 4GB: it's fine for right now, but that's also what was said about 2GB cards two years ago. 4GB is not looking to the future, especially when I can get 6GB (or 8GB from cheaper AMD cards) today.
 
It was, at both resolutions. At 4K the game can use up to 6GB of VRAM, proof here - http://www.hardocp.com/article/2015/06/15/nvidia_geforce_gtx_980_ti_video_card_gpu_review/10

In the 4K test in this review, I lowered it to "Best Quality" instead of maximum settings, mainly because the 290X was way too slow to get usable data; with the framerates so low at maximum in-game settings, it bottlenecked the card severely, so I had to lower the game to "Best Quality" (thus lowering the overall VRAM usage for that particular test) just so I could get data.

Thank you. Things didn't jibe as I was looking it over, but that makes sense.
 
I wish they had made it single slot instead of dual slot. It wouldn't have been too hard to do!
 
Problem here is, in this and most reviews, there is no proof of 4GB being an issue. None. Every benchmark I have seen at 4K has the Fury equal to the 980 Ti/Titan X: 4GB against 6/12GB. That's not to say it isn't an issue, but it alone doesn't explain why the Fury is neck and neck with its NVIDIA competitors. The complaint that 4GB is the problem for AMD, when 4K is the one tier where it puts up a solid performance, undermines itself. It doesn't explain why the NVIDIA counterparts struggle against the same card they easily outperformed at lower resolutions. Of course there is bus width and bandwidth, but the issue being raised was the 4GB, which seems unfounded, or at least needs a lot more investigation before such claims are made.

I don't know, man. I read the review; here are some quotes from it:

" HBM is limited right now in its first iteration to just 4GB of VRAM on a single GPU. We think the decision to constrain your flagship high-end video card, at a $649 price point, with only 4GB of VRAM today is a shortsighted, confusing and ultimately bottlenecking choice."

"It is the wrong move to make. 4GB constrains and limits the potential of your flagship high-end video card especially at the resolution and display sizes the AMD Radeon Fury X are being marketed as excelling at, 4K. "

"The memory constraints are real, and this is going to spell trouble for the AMD Radeon Fury and Fury X when it comes to running multiple cards at 4K, the performance will be there, but the capacity for high settings that leverage physical VRAM won't be. Unless of course we see VRAM pooling across multiple GPUs that we have heard AMD hint at earlier this year."

Sorry if I am misrepresenting that, but my takeaway was that the 4GB is a hard limit that is affecting the card and will affect it in the near future.
 
I'm not disappointed in the new AMD Radeon R9 Fury X.
Like others here, I too was really hoping AMD would give us a competitive product, but it's their first product with HBM, so I'm sure the next version will be much better, and hopefully cheaper, once they move to a smaller fab process. After all, at roughly two inches square it's a very big chip, which means they don't get very many chips from a 12-inch silicon wafer, and that keeps costs up.
I do not think it's a big fat flop.

The Pros
1: It's a small package compared to the GTX 980 Ti, which is a monster.
2: It runs cooler than the GTX 980 Ti, unless you add water cooling to the GTX 980 Ti, which jacks up the cost a lot.
3: It's a perfect solution for smaller cases, like a mini-ITX gaming system.

So it has some Cons
1: 4GB vs the GTX 980 Ti's 6GB. To me this is no big deal, as I will not be going 4K any time soon.
2: Less performance than the GTX 980 Ti, though not by much, as it all depends on the game.
3: No HDMI 2.0, which the GTX 980 Ti has. But what's the point, when DisplayPort is still better anyway?
4: It requires space in a case for a WC radiator (no such requirement for the GTX 980 Ti), but that doesn't account for the GTX 980 Ti needing good venting to stay cool, which is hopeless in a mini-ITX case, and it's a dust magnet besides.

Hopefully it will amount to something once we see some Windows 10/DX12 game performance numbers.
I say it's 50/50, as it's still too early to say what will happen as they make driver improvements.
Just think of the day when CPUs go down this route.
 
In the 4K test in this review, I lowered it to "Best Quality" instead of maximum settings, mainly because the 290X was way too slow to get usable data; with the framerates so low at maximum in-game settings, it bottlenecked the card severely, so I had to lower the game to "Best Quality" (thus lowering the overall VRAM usage for that particular test) just so I could get data.

Thanks Brent. This is probably the answer most heartbroken AMD fanboys need to just shut up. Also, thanks for the 980 Ti clocks; 1201 MHz is not bad for those temps...
 
Yes, as Kyle stated, we always take our results after the card has been used for at least 15-30 minutes of gameplay. Since we actually play the games before taking FRAPS data to find the highest playable settings, this allows the card to warm up sufficiently before the actual FRAPS grabs, so it works out naturally.

It was consistently boosting over the rated boost clock while gaming: 1201 MHz.

And the Fury X is locked at 1050 MHz, so that's a 14.38% difference in clock speed.

In an apples-to-apples comparison at 4K, the Fury never loses to the Ti by more than that 14.38% difference... did AMD actually achieve higher IPC than Nvidia? Maybe it was the lack of a new process (defunct 20nm/incoming 16nm FF) and HBM1/4GB that killed this architecture after all.
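For what it's worth, that 14.38% figure checks out; here is a quick sanity-check using only the clocks quoted in this thread (1201 MHz observed boost on the 980 Ti, 1050 MHz locked on the Fury X):

```python
# Clock speeds quoted in this thread (MHz)
gtx_980_ti_observed = 1201  # observed in-game boost clock
fury_x_locked = 1050        # Fury X's fixed clock

# Relative clock advantage of the 980 Ti over the Fury X
diff_pct = (gtx_980_ti_observed - fury_x_locked) / fury_x_locked * 100
print(f"{diff_pct:.2f}%")  # → 14.38%
```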
 
Ever heard of this thing called DirectX? Maybe, just maybe, if Nvidia's hardware could run it well, they would not have to make their own library to make their hardware run better... :p

Actually, if Nvidia were anywhere near halfway decent, they would let AMD see what is inside it, just like AMD did with TressFX...

All GW features run in DX11. That is why they can run on AMD hardware. They use standard DX API calls.
 
And the Fury X is locked at 1050 MHz, so that's a 14.38% difference in clock speed.

In an apples-to-apples comparison at 4K, the Fury never loses to the Ti by more than that 14.38% difference... did AMD actually achieve higher IPC than Nvidia? Maybe it was the lack of a new process (defunct 20nm/incoming 16nm FF) and HBM1/4GB that killed this architecture after all.

You cannot compare the GPUs clock to clock. Way too many differences.
 
And the Fury X is locked at 1050 MHz, so that's a 14.38% difference in clock speed.

In an apples-to-apples comparison at 4K, the Fury never loses to the Ti by more than that 14.38% difference... did AMD actually achieve higher IPC than Nvidia? Maybe it was the lack of a new process (defunct 20nm/incoming 16nm FF) and HBM1/4GB that killed this architecture after all.

Clock speeds in cross-brand comparisons are just useless and not comparable, as architectures vary: ROPs, TMUs, shader count, etc. Take those factors into consideration more than the 14.38% difference in clock speed alone.
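To put rough numbers on that point, here is a sketch of theoretical FP32 peak throughput for the two cards, using the commonly published shader counts (4096 for Fury X, 2816 for the 980 Ti) and the clocks discussed in this thread, with the usual 2 FLOPs per shader per clock FMA assumption. The fact that the Fury X comes out well ahead on paper yet still loses in games is exactly why clock-for-clock (or even FLOPS-for-FLOPS) comparisons across architectures tell you little:

```python
def peak_tflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical FP32 peak: shaders x clock x 2 FLOPs (FMA) per cycle."""
    return shaders * clock_mhz * 1e6 * 2 / 1e12

# Published shader counts; clocks as quoted in this thread
fury_x = peak_tflops(4096, 1050)      # ~8.6 TFLOPS
gtx_980_ti = peak_tflops(2816, 1201)  # ~6.8 TFLOPS
print(f"Fury X: {fury_x:.1f} TFLOPS, 980 Ti: {gtx_980_ti:.1f} TFLOPS")
```

Theoretical peaks ignore ROP/TMU throughput, geometry rates, memory behavior, and driver quality, which is where the real gap lives.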
 
For me it seems Fury owns Nvidia, so I am buying it.
Can't rely on the way HardOCP does their testing, as I never bought stuff based on that.
I look at the way I game, and for my setup Nvidia doesn't offer me anything better.
Fury however does, which is why I buy AMD, as it's superior stuff.

Fanboyism at its very best. Flopper, your opinion on these matters counts less every day now.
 
Sigh. All the hype and then this. The lower RAM clocks didn't help the overall gluttonous power consumption, and the card was playing catch-up in the benchmarks. Combine that with AMD's historically horrific drivers and we have this mess. I don't think it's a bad card, but I'm guessing a plain ole GTX 980 would give it some competition.
 
Problem here is, and in most reviews, there is no proof of 4Gb being an issue. None. Every benchmark I have seen @4k has the Fury = 980Ti/TitanX, 4Gb against 6/12Gb. It's not to say it isn't an issue but it alone doesn't explain why the Fury is neck and neck with its NVIDIA competitors. Therefore the complaint that 4Gb is the issue for AMD when that is the one tier it puts up a solid performance, invalidates the complaint. It doesn't explain why the NVIDIA counter parts struggle against the same card they easily out performed at lower resolutions. Of course there is bus width and bandwidth but the issue being made was 4Gb which seems unfounded or needs a lot more investigation before making such claims.

You have been shown proof multiple times that less than 4GB matters.
I can show it to you again if you like, but I don't think it will help you.
You either struggle with reading, with understanding, or have something wrong up top.
As such, what you say has limited value.

Fury is not neck and neck with the 980 Ti; it was soundly beaten.
If you wish to contradict this, provide direct evidence so it can be scrutinised.
 