AMD Radeon R9 Fury X Video Card Review @ [H]

There was nothing wrong with the article. It was bold and to the point which is the [H] way of writing reviews.

The same type of articles have been written for other vendors including NVidia and you guys didn't have a problem then. So why now?

Is it because [H] has disturbed the established order by writing against a company that you guys love so much?

I wasn't comparing to any other article of theirs. I was commenting on the merits of this sole article. If you want I can go and take parts of the article and reword them so you can better understand.

Also, for those talking about the 4GB, at no point in the entirety of the article was the 4GB of HBM shown to be an issue. Commenting/alluding that one game used 5GB on a 6GB product is not conclusive proof that 4GB is an issue. Maybe it is, maybe it isn't. But to claim that it is an issue at 1440p, when at 4K it didn't seem to be one, or at least not nearly as great a one as whatever was afflicting the 980 Ti, was premature. Scientifically this is what one might call a setback to the 4GB hypothesis, and it therefore requires a more in-depth look. The funniest part is that they were harsh about AMD's comments on which resolutions the cards were for, but in the end it looked as though AMD was correct. At 1440p the Fury didn't look too impressive against its competition, but at 4K it was a contender.

At any rate, for now it is just an OK performer with some tantalizing features: HBM, water cooling at stock, small form factor... I am interested to see, with a little passing time, whether it gets better or whether this is truly its performance level.
 
Just give all your money to nVidia. I honestly look forward to 1,000-dollar flagship GPUs. It will weed out the plebs who cannot afford 2 grand for a dual-card setup.
 
I got that impression too... A lot of subjectivity in the review. Was this review Nvidia-endorsed? :confused:

I don't think that the review was anti-AMD. I think that they took the card into areas with high demands for resources such as VRAM, ROPs, etc., as detailed in the review, and they reported their findings.

Initially I was pissed at AMD for not trying hard enough, or not having all the resources on the card to compete better. But after reading other reviews around the web, I concluded that since I boycott 99% of GameWorks games, a great deal of the issues wouldn't pertain to me. I play stuff like Dragon Age Inquisition, Witcher 3, the Devil May Cry series, Civilization Beyond Earth, GTA V and a lot more.

In those games the card doesn't falter much. I don't mind one notch down in GTA V. So for me the Fury X isn't a bad proposition. The problem is that there is a better overall performer in the GTX 980ti. If you're thinking of playing the games in the [H]ardocp suite, then they laid out what the results were.

If anything AMD should contact Kyle and ask him what sections of the games in the testing suite were having issues and correct them. Undoubtedly there will be more cards to be tested if AMD eventually allows AiB partners to create custom cards in the future.

And I agree with them about the overclocking report. If AMD doesn't unlock the voltage, then what's the point in overclock testing? AMD already said memory overclocks were pointless. A voltage locked core is pointless to test for overclocking performance. I do believe, going by past performance, that AMD will eventually unlock the voltage and allow users to have fun with the cards.
 
I wasn't comparing to any other article of theirs. I was commenting on the merits of this sole article. If you want I can go and take parts of the article and reword them so you can better understand.

Also, for those talking about the 4GB, at no point in the entirety of the article was the 4GB of HBM shown to be an issue. Commenting/alluding that one game used 5GB on a 6GB product is not conclusive proof that 4GB is an issue. Maybe it is, maybe it isn't. But to claim that it is an issue at 1440p, when at 4K it didn't seem to be one, or at least not nearly as great a one as whatever was afflicting the 980 Ti, was premature. Scientifically this is what one might call a setback to the 4GB hypothesis, and it therefore requires a more in-depth look. The funniest part is that they were harsh about AMD's comments on which resolutions the cards were for, but in the end it looked as though AMD was correct. At 1440p the Fury didn't look too impressive against its competition, but at 4K it was a contender.

At any rate, for now it is just an OK performer with some tantalizing features: HBM, water cooling at stock, small form factor... I am interested to see, with a little passing time, whether it gets better or whether this is truly its performance level.

Did you look at the Dying Light 1440p portion? Brent did a good job elaborating there. What I'd really want is for someone like PCPer to use Brent's settings and generate some data. Generally the two sites, [H] and PCPer, jibe, even though they use completely different review methods.

I always figured 4GB of VRAM would be OK for the majority of games with a single card. It's multi-card setups where it'll be an issue, since you can jack up the IQ.

I'd be surprised if AMD unlocks this card. With the core already at 60C on an AIO and VRM temps at 100C, it'd be a huge risk of increased RMAs for AMD. I think people are confusing suitable AIO temps with air-cooler temps.

You know what? I hope [H] does an OC vs. OC review. I always loved that kind of review; you don't see it often, and the Fury X would get annihilated. Maybe it's best for AMD if they don't. :)
 
Hardforum will always report the facts as they are. I can usually count on Kyle and Brent to report them. It is not their fault AMD is a no show. Certainly, in 5 years when there is zero competition and the GPU advances are nil, they will be complicit. Truth be told, they were sideline commentators. Not players.
 
To all the people claiming GameWorks is the problem.

Why are you buying video cards? Do you play PC games?

Are you going to ignore a game and simply not play it because it has GW features? Do you just not enjoy doing a lot of PC gaming?

I'm trying to understand what you want us to do, because it sounds like you want us to cherry-pick games based on features and ignore one brand's features, thus creating a bias and not evaluating the new, common, popular games people are playing on the PC today.

Ever heard of this thing called DirectX? Maybe, just maybe, if Nvidia's hardware could run it, they would not have to make their own library to make their hardware run better... :p

Actually, if Nvidia were anywhere near halfway decent, they would let AMD see what is inside of it, just like AMD did with TressFX...
 
Ever heard of this thing called DirectX? Maybe, just maybe, if Nvidia's hardware could run it, they would not have to make their own library to make their hardware run better...

Actually, if Nvidia were anywhere near halfway decent, they would let AMD see what is inside of it, just like AMD did with TressFX...



What?
 
Thanks for the review.

I worried about AMD's future right before they came out with the slot-cartridge Athlon, and I do now. Unfortunately, I don't think their new Athlon (the Fury) can save them this time... it will be the end of an era for sure...
 
The AMD fans in this thread make Apple fans look tame. The card is way overpriced for the performance it offers, and Nvidia just dropped prices. So everyone needs to calm down and accept the fact that at best this is a first-gen card with its fancy HBM memory and the like. The next chance team red gets will be at a die shrink and perhaps a new core. There's nothing left to discuss; this is a letdown from team red. I anticipate prices will drop to sub-$500.

Remember when Kyle was rocking dual 290x's? They're not biased
 
WTF......resources to simply overclock a video card? What the hell is happening here?

I could have been clearer in that statement. When I refer to resources in that sentence, I am talking about the time and money it takes to produce the entire overclocking article.

Given that we are seeing excellent overclocks out of the 980 Ti (and this would be the card we would directly compare it to), spending the time to do all the testing and write the review around a minuscule Fury X overclock would be a waste of time. If we see that we can overclock the Fury X the way Joe Macri led us to believe, we will certainly cover that.

Time is also a resource, and it's not just about overclocking a card; it's pretty much redoing an entire review just to fully see the difference the higher MHz makes.

Exactly.


As for follow-up articles that have a lot of focus on overclocking, here are just a few we have done since December... so instead of making up wild accusations of bias, I would suggest we will stay with our current way of covering overclocking when it makes sense.

http://hardocp.com/article/2015/04/14/nvidia_geforce_gtx_titan_x_video_card_review/

http://hardocp.com/article/2015/04/06/gigabyte_gtx_960_g1_gaming_video_card_review/

http://hardocp.com/article/2015/03/30/powercolor_pcs_r9_290x_video_card_review/

http://hardocp.com/article/2015/03/23/asus_strix_gtx_960_directcu_ii_oc_video_card_review/

http://hardocp.com/article/2015/03/16/asus_rog_poseidon_gtx_980_platinum_vs_amd_r9_295x2/

http://hardocp.com/article/2015/03/03/asus_rog_poseidon_gtx_980_platinum_video_card_review/

http://hardocp.com/article/2015/02/17/gigabyte_gtx_980_g1_gaming_video_card_review/

http://hardocp.com/article/2015/02/02/msi_geforce_gtx_960_gaming_overclocking_review/

http://hardocp.com/article/2014/12/29/asus_gtx_980_strix_dc_ii_oc_video_card_review/

http://hardocp.com/article/2014/12/01/msi_gtx_980_gaming_4g_video_card_review/
 
WTF......resources to simply overclock a video card? What the hell is happening here?

It doesn't overclock worth a shit at stock voltage, and the VRMs on the back are already running in excess of 100C under load. Unless you do some cooling mods to get those VRM temps under control, there's not going to be much overclocking headroom.
 
Remember when Tomb Raider came out and AMD owned it? Nvidia users were crying like little babies, and AMD helped them get it right on their hardware in about a week's time.

Source?

Oh, and BTW... GW features are either extensions of DX or work in tandem with DX. Even without them, DX games and programs run perfectly fine on nVidia products, just as they typically do on AMD products... given that the developer's coding isn't shit.

https://developer.nvidia.com/what-is-gameworks
 
It's actually less hot than the other 2; that's surprising. Wait... is it the only one of the 3 that has water cooling?

Too bad power consumption still isn't all that great. And the price.... sigh. They need to up their game. Or at least price them more competitively.
 
The AMD fans in this thread make Apple fans look tame. The card is way overpriced for the performance it offers, and Nvidia just dropped prices. So everyone needs to calm down and accept the fact that at best this is a first-gen card with its fancy HBM memory and the like. The next chance team red gets will be at a die shrink and perhaps a new core. There's nothing left to discuss; this is a letdown from team red. I anticipate prices will drop to sub-$500.

Remember when Kyle was rocking dual 290x's? They're not biased

1st: Wrong! They are sold out everywhere... the card is at the right price and it comes with water cooling.
2nd: You should wait for new drivers. The 390X ("290X"), an almost 3-year-old card, is currently trading blows with the GTX 980, and the best part is the Fury X is trading blows with the Titan X and 980 Ti. What do you think is going to happen when they release better drivers?
3rd: HBM is not "fancy", it is actually new tech; that's why Nvidia wants to use it...
4th: Stop being a fanboy. They aren't paying you, right?
 
So you are against forward-looking technologies like realistic hair simulation?

Proprietary technologies IN PC games? YES, I'm against them! And EVERY PC gamer should be against them!

If you like proprietary technologies in games... buy a console!

And that is what nVIDIA is doing to PC gaming... it is destroying it! It is dividing PC gaming in two... "PC nVIDIA games" and "PC AMD games"...

Is that what you want, Brent? Two more consoles on the market????
 
Funny how some people refuse to accept the outcome and instead choose to attack the [H] reviewers and editors.

That is nothing new. We get called biased for Green, biased for Red....many times both on the same day. It is such an easy, proof-less, and childish form of attack that it does not bother us. We can give direct and detailed answers to such allegations, but it never makes any difference to the accusers. Water, duck's back, all that...

Our passion lies in testing the hardware, collecting REAL WORLD GAMEPLAY data, repeatable data, sharing that data with our readers, analyzing that data and forming a conclusion based on how good/bad our gameplay experience is, then sharing that.

However, if you look at gaming as a whole, there are business practices which are beneficial and practices which are detrimental to it. In console gaming, for example, moneyhatting exclusives is one of the detrimental practices.
You may want to call it bias, but if you look beyond just game popularity, you can provide value for more readers, namely those who are critical of GameWorks/TWIMTBP as something which helps game creators but also hurts gaming in a way.

There are ethical ways to make money and there are unethical ways. Not all companies are the same here. I'm not saying AMD are saints (they are not), but there is a distinct difference in how they approach things vs. NVidia: open standards and technologies to encourage competition vs. closed ones to lock out the competition.

This review is not about business practices. It is about GPUs, playing real games, and giving our opinions. If you are expecting more in a GPU review from HardOCP, your expectations are misplaced.

The review was in no way professionally written, sorry. I didn't post because I was in conflict with the results; I didn't ask for any of the results/facts to be changed. I simply stated that, to be professional, the format or placement of certain information needed to be different, along with the word usage.

Some of you apparently have little understanding of format and connotation when it comes to article writing and being a reporter altogether. The way this particular article is written invites conspiracy theories about bias, whether any exists or not. A lot like WCCF does to get page hits, which most here would agree is deplorable.

I already conceded that the facts seemed in line with most reviews, with the exception of BF4, so I am not accusing anyone of bias here. I would just rather read an article based on nothing but facts and findings UNTIL the CONCLUSION, where, if the author feels the need to "pull no punches" and "tell the truth" of the situation, he can do it there. That format is less likely to invite scrutiny over bias. But then again it doesn't add to page hits or water-cooler talk, hence why quite a few sites employ the tactics we saw in this article.

Your thoughts are noted. They were the first time you made the post. I would suggest you not ever visit HardOCP again if you truly have this much disdain for our review format.

Thanks for the review!

While it is not the card I was expecting/hoping for, I still went ahead and bought one. It's a big step up from what I've got now, and I'm getting tired of CrossFire. Going to stick to one card for the time being :cool:

Thanks for the kind words. The Fury X was absolutely not what we were expecting either. That said, it is far from a bad product. It is mispriced and a bit misconfigured in our opinion. There are truly not any "4K" single-GPU cards in our eyes yet. At $550 retail this review would have read very differently. We still would have beaten on the 4K claims, however... pretty much like we have done with every other single-GPU card at this time as well.
 
Here's my background, I'm an IT professional working for a big name retailing company.

I used to be an in-store tech building rigs of all shapes and sizes, prices and such.
I still tinker around on the side with these things due to the nature of my business.

I used to be such a great fan of AMD and ATI, I fell in love with both companies' price per performance compared to Intel/Nvidia.
I have fond memories of gaming on an Athlon XP all the way up to my overclocked Opty 165 on a DFI LANParty mobo. There was a time when there was a CLEAR difference in image clarity between an ATI card and an NVIDIA card.

The day I heard my two loves were merging, I felt a great surge of excitement, thinking, THIS IS IT! This will be the best company ever and both will rise higher than ever bringing joy and good competition to all!

Sadly, none of this has shaped up to be anything more than mere desire.
AMD/ATI is more like an ex-girlfriend you mess around with from time to time, trying to bring back what you felt back once upon a time, yet realize your current high maintenance girlfriend (Intel/Nvidia), is definitely a better bet for the long term.

The laurels that Intel/Nvidia are resting on are beginning to affect us all, and pricing/competition is terrible these days, even with all these companies' promises of better power efficiency and production costs.

I agree, AMD definitely would have had a better hit if they had come at this more humbly, at a cheaper price point, perhaps targeting the 1080p/1440p performance sector and making fewer false "4K READY" promises that end up invoking investors' and consumers' ire.

Both divisions are definitely not interested in winning anyone's dollar but rather in shaking a stick at the competition.

Oh, well, there's always next year. :(

*Edit: Sorry about the rant, Kyle; the anticipation built up around this card was surreal and the delivery less than stellar.
I am definitely sure that once Windows 10 is out and a few DirectX 12 benchmarks/games are available, we may have a different view as to what AMD was hollering about in its press materials.
But you're right, the current performance in DirectX 11 games is... fair; it comes off as lukewarm and is definitely NOT what they needed this round to stir the Green Giant.
This has left me wondering... Where did they get those numbers they keep touting around? What are their marketing guys smoking up there at AMD Corp? :cool:
 
Proprietary technologies IN PC games? YES, I'm against them! And EVERY PC gamer should be against them!

If you like proprietary technologies in games... buy a console!

And that is what nVIDIA is doing to PC gaming... it is destroying it! It is dividing PC gaming in two... "PC nVIDIA games" and "PC AMD games"...

Is that what you want, Brent? Two more consoles on the market????

Like the poster above said, you are talking about the business market as it applies to GPUs and gaming. We are reviewing hardware and how well it plays games, not penning editorials about said market. Don't confuse the two.
 
Good review. Sad outcome. AMD dropped a turd worthy of being flushed after a triple wipe.
 
Hardforum will always report the facts as they are. I can usually count on Kyle and Brent to report them. It is not their fault AMD is a no show. Certainly, in 5 years when there is zero competition and the GPU advances are nil, they will be complicit. Truth be told, they were sideline commentators. Not players.

It's not really AMD's fault that they can't bring it with their budget and smaller personnel pool.




A lot of people like to bitch at AMD and in turn give NVidia their money. Anyone remember the GTX 970 bullshit? If you guys really like AMD, stop expecting a miracle and a graphics card that can walk on water and beat the crap out of NVidia. How is AMD supposed to do that when it has no money?


You guys want AMD to be competitive? Buy their shit.
 
Here's a mock dev scenario:

Dev says we need a PC version to maximize revenue. Calls up a Port Studio.
Dev says we need money to pay the Port Studio; let's see if Nvidia or AMD will pay.

Nvidia gladly pays for a popular AAA title and gives the Port Studio the tools (Gameworks) to make porting easier.

Nvidia then advertises the game as a GameWorks title and promises an awesome gaming experience.

The Port Studio isn't equipped to handle an AAA port and drops the ball.

Dev and Port Studio deliver a shit experience, even on Nvidia hardware with 30fps caps, missing advanced features etc.

Nvidia or AMD takes the blame. Dev counts money.

Rinse and repeat.

This has been happening for 20 years. Ever buy a game that required Glide (3Dfx) but all you had was D3D (Riva)? Or you had a Rendition card and were limited to RRedline games?
 
You guys want AMD to be competitive? Buy their shit.

For me it seems the Fury owns Nvidia, so I am buying it.
I can't rely on the way HardOCP do their testing, as I have never bought stuff based on it.
I look at the way I do my gaming, and for my setup Nvidia don't offer me anything better.
The Fury however does, which is why I buy AMD, as it's superior stuff.
 
Supreme butthurtness :D

Good review, even surprising since [H] had a very favourable opinion of the 290(X) despite its 95 degrees.
Ordered a Zotac 980 Ti, thanks.
 
{H}ardology is the search for FACT. Not truth. If it's truth you're interested in, the Softforums are down the hall.
 
For me it seems the Fury owns Nvidia, so I am buying it.
I can't rely on the way HardOCP do their testing, as I have never bought stuff based on it.
I look at the way I do my gaming, and for my setup Nvidia don't offer me anything better.
The Fury however does, which is why I buy AMD, as it's superior stuff.


That's brand loyalty talking. If my thought is misplaced, I would love to hear some reasoning behind your statements above.
 
For me it seems the Fury owns Nvidia, so I am buying it.
I can't rely on the way HardOCP do their testing, as I have never bought stuff based on it.
I look at the way I do my gaming, and for my setup Nvidia don't offer me anything better.
The Fury however does, which is why I buy AMD, as it's superior stuff.

It's not a bad card. I would have priced it a little lower... but from AMD's perspective it makes sense to see if people will buy it where it is.

I have a hard time understanding what you wrote, though. Why does it own nVidia? Perhaps some constructive criticism? What would you rather see? How do you verify the Fury X is better for the way you game? Why do you consider it superior? Etc...

Not saying anyone would care, but it might sway someone to buy AMD or at least have a discussion. I am honestly curious.

Edit: Kyle beat me to it, in a much shorter, to-the-point manner, haha.
 
For me it seems the Fury owns Nvidia, so I am buying it.
I can't rely on the way HardOCP do their testing, as I have never bought stuff based on it.
I look at the way I do my gaming, and for my setup Nvidia don't offer me anything better.
The Fury however does, which is why I buy AMD, as it's superior stuff.

Go ahead, you might as well put your money where your mouth is; you deserve what you get :p
I've made the sensible decision and ordered a 980 Ti.
That you can conclude the Fury is better is amusing.
There is, of course, no evidence from you.
Good luck.
 
I don't get the people complaining about the lack of overclocking in the article. Are you sure you want to go down the road of overclocking the Fury X and then overclocking the 980 Ti? The 980 Ti overclocks by about 30%, the Fury X by 5-10%; it will not be pretty for the Fury X.
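As a rough back-of-the-envelope on what that gap would mean (a minimal sketch; the stock clocks and headroom figures below are illustrative assumptions, not numbers from the review):

```python
# Hypothetical illustration of overclocking headroom, not measured data.
# Assumed stock core clocks: Fury X ~1050 MHz, 980 Ti ~1000 MHz.
cards = {
    "Fury X": {"stock_mhz": 1050, "headroom": 0.075},  # assume ~5-10%, take the midpoint
    "980 Ti": {"stock_mhz": 1000, "headroom": 0.30},   # assume ~30%
}

for name, card in cards.items():
    oc_mhz = card["stock_mhz"] * (1 + card["headroom"])
    print(f"{name}: {card['stock_mhz']} MHz -> ~{oc_mhz:.0f} MHz "
          f"(+{oc_mhz - card['stock_mhz']:.0f} MHz)")
```

Even before worrying about how well each architecture scales with clock speed, that works out to roughly +300 MHz on one side versus well under +100 MHz on the other, which is why an OC-vs-OC comparison would look so lopsided.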
 
I don't get the people complaining about the lack of overclocking in the article. Are you sure you want to go down the road of overclocking the Fury X and then overclocking the 980 Ti? The 980 Ti overclocks by about 30%, the Fury X by 5-10%; it will not be pretty for the Fury X.


Oh yeah, my GUESS is that the 980 Ti is going to smoke the Fury X when it comes to overclocking results. I would love to be proved wrong on that, however. AMD?
 
But...I'm still worried about the Fury X's VRM temperatures.

Please educate me. If the VRM temps are already at 100C, I don't think there is room for OC with the current cooling solution, even assuming AMD unlocks the voltage.

How high can VRM temps go?

Sorry, this is important to me. I'm not planning on overclocking, but the VRM temps at 100C are making me nervous about buying this card. I'm seriously thinking this is a design flaw that could potentially become a problem after a few months of usage.

Sorry I'm not expert on GPU components :D
 
Yet Kyle, you didn't speak directly to any point that I made. That usually signifies my stance and points are correct and you have no defense. I had hoped you would take heed.
 
It's not really AMD's fault that they can't bring it with their budget and smaller personnel pool.

A lot of people like to bitch at AMD and in turn give NVidia their money. Anyone remember the GTX 970 bullshit? If you guys really like AMD, stop expecting a miracle and a graphics card that can walk on water and beat the crap out of NVidia. How is AMD supposed to do that when it has no money?

You guys want AMD to be competitive? Buy their shit.

The problem with Fury wasn't that AMD had no money for development, lol. They did manage to come up with a solid GPU for the segment it is in. The problem is that they misrepresented the product. They gave the impression that this was going to be that miracle GPU you mentioned.

The Fury X sits in between the 980 Ti and the 980. If it had been priced for that segment, i.e. $550, it would have been well received. That would have made it a clear winner over the 980, while also being a cheaper alternative to the 980 Ti with slightly lesser performance.
 
When the first 3D graphics card came out...I wonder if this same childish proprietary argument was going on?

This is an outrage folks. All the processor makers are doing 3D graphics via direct CPU calculation. Now this company comes out and creates proprietary HW to allow this work to be offloaded from the CPU. They even have the audacity to court companies to craft code in a way which makes their HW perform better. We cannot stand for this! I want a level playing field and eliminating HW 3D acceleration is the only way.
 
Kyle, not sure if you'll see this or if it's been answered. Was the 980 Ti boosting? Or were the cards "heated up" before testing to stabilize clocks? Just trying to get a better idea of what's doin'.
 
This is what I'm worried about: Will this create issues in the long run? Even w/o OC?

[Image: Fiji_Cooler_Master_Heat_Full_Load_380Watt-pcgh.jpg — Fiji cooler thermal capture at full load (~380 W), via PC Games Hardware]

[Chart: IMG0047700.png, from the hardware.fr article (937, temperature/noise results)]
 