AMD Radeon R9 Fury X Video Card Review @ [H]

Maybe someone mentioned this already, but looking at this page: http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/9#.VYuktle4SSo

It would appear to me that the FPS-over-time for the Fury X tracks quite well with the 980 Ti, with no sudden, unexpected nosedives that would indicate the VRAM limit being breached and textures getting thrashed. This is so even in supposedly VRAM-heavy games like GTA V and Dying Light. Does this mean the 4GB framebuffer isn't a huge problem so far, or am I reading the graphs completely wrong?
 
The real indicator would be VRAM usage statistics; it would be nice if Brent posted these during an apples-to-apples comparison. We've seen several scenarios where ultra-high textures result in close to 8GB of memory use. The simple lack of a spike might mean that not all of the Fury's memory is being used up, and some games automatically adjust settings down based on the amount of VRAM available to avoid thrashing. DirectX programming gives me a choice of where memory for a texture is allocated (system or GPU). AMD's texture compression might apply here to save some memory space, but that doesn't mean 95% of the memory isn't already in use even with compression. So it's hard to say. Still, a lack of memory doesn't bode well for the future.
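
To make the DX point concrete, here's a minimal Direct3D 11 sketch (the 1024x1024 size and the create_texture wrapper are made up purely for illustration): the Usage field is what steers the driver toward GPU-local memory versus CPU-accessible system memory.

```cpp
#include <d3d11.h>

// Hypothetical wrapper, purely for illustration: the same 1024x1024
// texture, placed either in GPU-local memory (USAGE_DEFAULT) or in
// CPU-accessible system memory (USAGE_STAGING), depending on the flag.
HRESULT create_texture(ID3D11Device* dev, bool gpu_local,
                       ID3D11Texture2D** out) {
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = 1024;
    desc.Height           = 1024;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    if (gpu_local) {
        desc.Usage     = D3D11_USAGE_DEFAULT;        // GPU owns it: fast to render with
        desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    } else {
        desc.Usage          = D3D11_USAGE_STAGING;   // system memory: CPU can map it
        desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ |
                              D3D11_CPU_ACCESS_WRITE;
        desc.BindFlags      = 0;                     // staging resources cannot bind
    }
    return dev->CreateTexture2D(&desc, nullptr, out);
}
```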

There's also a disadvantage to compressing textures. If I want to retrieve a specific set of pixels from a texture to modify, it becomes problematic: the retrieve, decompress, modify, recompress, and write cycle could leave memory badly fragmented.
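
A rough sketch of that cycle, assuming BC1-style block compression (Bc1Block, decompress_bc1, and compress_bc1 are hypothetical stand-ins, not a real API). Each 4x4 pixel block lives in 8 bytes, so editing even one pixel costs a full decompress and recompress of its block:

```cpp
#include <cstdint>

// Hypothetical helpers standing in for a real BC1 codec.
struct Bc1Block { uint16_t color0, color1; uint32_t indices; }; // 8 bytes per 4x4 block
void decompress_bc1(const Bc1Block& blk, uint32_t out_rgba[16]); // assumed
void compress_bc1(const uint32_t rgba[16], Bc1Block* out);       // assumed

// Editing one pixel of a block-compressed texture: the whole
// retrieve/decompress/modify/recompress/write cycle runs per block.
void set_pixel_bc1(Bc1Block* blocks, int tex_width, int x, int y, uint32_t rgba) {
    const int blocks_per_row = tex_width / 4;
    Bc1Block& blk = blocks[(y / 4) * blocks_per_row + (x / 4)];

    uint32_t pixels[16];
    decompress_bc1(blk, pixels);          // 1. retrieve + decompress the 4x4 block
    pixels[(y % 4) * 4 + (x % 4)] = rgba; // 2. modify a single pixel
    compress_bc1(pixels, &blk);           // 3. recompress + write back in place
}
```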

I hate to see AMD stumble like this. I'm a long-time supporter: ATi 8500, ATi 9800 Pro, and AMD 7970 (I also owned an NVIDIA TNT2, MX200, and 560). Given equal performance and "future-proofness" I would lean toward AMD, just because I don't like NVIDIA's marketing tactics. But I can't justify it. Besides, I'm waiting for the next, smaller process node.
 
Based on my understanding of compression, it has nothing to do with how much VRAM is used. It is compressed between the VRAM and GPU so that it uses less bandwidth than it would otherwise need. In other words, 4GB of assets is 4GB of assets regardless of memory type, in this case, GDDR5 vs HBM. The compression advantage comes into play when the GPU is accessing VRAM.
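
A back-of-the-envelope sketch of what I mean (the compression ratio is a made-up number; 512 GB/s is the Fury X's quoted HBM peak): the resident footprint stays put, and only the effective bus throughput moves.

```cpp
#include <cstdio>

int main() {
    const double resident_gb = 4.0;   // 4 GB of assets stays 4 GB in VRAM
    const double raw_bw_gbs  = 512.0; // Fury X's quoted HBM peak bandwidth
    const double comp_ratio  = 1.25;  // hypothetical average lossless ratio

    // Compression sits on the path between VRAM and GPU, so it stretches
    // bandwidth; it cannot shrink what has to be resident.
    std::printf("Resident footprint: %.1f GB (unchanged)\n", resident_gb);
    std::printf("Effective bandwidth: %.0f GB/s\n", raw_bw_gbs * comp_ratio);
}
```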
 
First, I have read the article and all the posts here. I am not overly fond of the article, as it lends itself to really unprofessional writing. I have said before that an article reviewing any piece of equipment should save personal remarks and words with negative connotations for the CONCLUSION. As some (few) have mentioned earlier, it is somewhat hard to read, as it lacks objectivity from minute one. The section about the R9 300 series and the remarks about VRAM size and resolution for gaming would have been better suited to the conclusion, as they read more as slant than anything substantive. That section set the tone for the rest of the article, and it is what makes it hard not to feel there was an agenda, whether there is one or not.
Then there is the line in the 1440p BF4 benchmark about giving the middle finger. That was quite infantile and, again, better suited to the conclusion, where opinions belong. I think my point about word usage and placement is made, so on to another point.

I couldn't have said it better. It's as if a jaded high school kid wrote this review.
 
OK, now that I am at home and have time to comment, here it is.

First, I have read the article and all the posts here. I am not overly fond of the article, as it lends itself to really unprofessional writing. I have said before that an article reviewing any piece of equipment should save personal remarks and words with negative connotations for the CONCLUSION. As some (few) have mentioned earlier, it is somewhat hard to read, as it lacks objectivity from minute one. The section about the R9 300 series and the remarks about VRAM size and resolution for gaming would have been better suited to the conclusion, as they read more as slant than anything substantive. That section set the tone for the rest of the article, and it is what makes it hard not to feel there was an agenda, whether there is one or not.

Then there is the line in the 1440p BF4 benchmark about giving the middle finger. That was quite infantile and, again, better suited to the conclusion, where opinions belong. I think my point about word usage and placement is made, so on to another point.

For the most part, the review's findings of fact are in line with most others', so I am not debating them, though I think BF4 needs another look (it was the only test that seemed off). That said, the test suite does seem shallow, which lends some credence to the scrutiny. It was stated that the latest games were used, and that this was the reason for the suite being shallow (as in small in number, not intellect). I get not using Skyrim, though it is still a widely played game, but there are other recent games that fit the criteria and don't seem to get used. Ryse: Son of Rome released October 10, 2014, and is therefore a recent game. It also happened to be a game that played very well on AMD; scrutiny is warranted both for why a game is used and why it is not. Then you also have Dragon Age: Inquisition, which didn't seem to show affinity for one side or the other, released at the end of 2014, and was quite an anticipated release. Again, it fits the criteria. The choice of games, however, is limited largely to ones that inherently favor NVIDIA. NOT A CONSPIRACY, but a statement of fact. I would rather have them exist in reviews than not: my argument is not against their presence in the [H] bench suite but against the exclusion of others, or in this case the dismissal of the concern by the authors.

And, as was raised early on in the first few pages, there was concern over a statement in the review:

We saw up to 5GB of usage when the VRAM capacity was there to support it. This game is bottlenecked on every 4GB video card at 1440p with maximum in-game settings.

But when you look at the 4K bench tests, the same Fury X that was behind the 6GB 980 Ti at 1440p is now neck and neck with it. The concern, therefore, was with the facts, not the statement, and it was unfortunately blown off by the author and editor. The point being: if 4GB was indeed a bottleneck and a reason for concern at just 1440p, then what was going on at 4K? That was the question, and a good one, seeing as the facts didn't back up the statement. And even if 4GB is a concern and a factor in the Fury's results, then what was the constraint affecting the 980 Ti?

Anyway, those are a few of my concerns and observations. The sad part is that PCPer has had excellent, objective reviews of the 300 series (the R9 390 in particular) and of Fury. There was a time I couldn't stand their reviews for the same reason I dislike this one: too much opinion throughout, with little objectivity. I was fine with the PCPer conclusion, which wasn't a great deal different from most: the Fury X was a bit less than stellar and didn't live up to the hype. But at least they gave the positives and negatives without the flaming.

Thanks for this, durq. This sums up my opinion to a tee.
 
So this took two years to get here. I'm pretty sure that's not going to hold them over until their next flop of a release... They have, what, nine quarters of operating cash left?

I really don't get it. The 290X was competitive as hell; this is like a POS science project you half-assed 20 minutes before class and got lucky with a C.
 
First off, the article was written professionally. Anyone can see this. Of course, if you are some bro-type tweener, those would be the kind of words you would use in an attack, and no doubt this thread has many. And now we have some clown suggesting the review was written by a jaded high school kid. That's pretty rude and, again, a clear example of the only thing actually wrong with this review: it brings out the dimwits who try to get in a few cheap shots.

A lot of you guys have tunnel vision. You see a few things and think that's all that matters. The review was written for a larger, broader audience. Sorry your feelings are hurt, but the majority of people here do care that the price seems out of touch with the features. There are millions of people with DVI displays who cannot use this card; fact: adapters introduce latency, go research it yourself. There are a lot of people who have invested $1,000-$2,000+ in HDMI 2.0 4K displays, and this card is useless to them. Then there is the question of having only 4GB of RAM. I could make other points as well.

If you're not mentally capable of contributing to this discussion in a neutral or positive manner, then don't. Be man enough to come back when you're ready; we'll save a seat for you. Don't come here raging with tunnel vision. Look at the bigger picture as it relates to the larger collective here, not just you. Empathy and impartiality go a long way.
 
A lot? 4K display owners who threw down over a grand are not even 1%. They are the 1% of the 1% of the 1%.

The performance issues are a legit letdown. We won't know whether it was the ROPs or the 4GB of RAM until AMD releases an updated Fury X.
 
If you're not mentally capable of contributing to this discussion in a neutral or positive manner, then don't. Be man enough to come back when you're ready; we'll save a seat for you. Don't come here raging with tunnel vision. Look at the bigger picture as it relates to the larger collective here, not just you. Empathy and impartiality go a long way.

Bravo, well said. Based on your stunning example, we can assume that ad hominem attacks are the defining factor of being a man with intellect, yes?
 
Some of you guys (especially those here 9+ years) should know by now that the reviews here pull no punches. If PR sells a line of shit, then they're going to get called out for it.

I can understand being butthurt if you're some AMD evangelist who thought this was the second coming of Christ, but it's not. Too bad. I sure hope you weren't here singing Kyle and Brent's praises when the 290X was released, got a Gold award, and became the setup Kyle ran for months as his CrossFire system at home.

I mean, look at these pro-nVidia reviewers at work! http://www.hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review/16#.VYuGGvlVhBc
 
Can't speak for others, but I personally will not give one cent to purchase or play TWIMTBP/GameWorks games, or NVIDIA and Intel products, because I simply DO NOT condone intentionally gimping a competitor to make your own products look better. I DO NOT condone or agree with intentionally gimping test suites/benchmarks to adversely affect your competitors, à la NVIDIA's and Intel's unethical/monopolistic tricks, past and present.

I have yet to see this proven. I think it's more an easy excuse for when a card fails to live up to the hype.

In any review I personally read, GameWorks/TWIMTBP results get discarded, because unless and until the code paths are transparent enough for the world to verify there is no trickery, equal footing CANNOT be assumed, given NVIDIA's and Intel's past and current unethical behavior in this regard.

The code isn't likely to be given out, because it represents intellectual property. If the code were open, their competitor would likely use it in their own "built for AMD" program. Not likely, and AMD wouldn't share code like that either.

The reason many games are GameWorks games is that NVIDIA is out there working with game developers a hell of a lot more than AMD is. NVIDIA does it to add cool features to games, often features the devs wouldn't have had time to build on their own. AMD puts its name on games just for the advertising. I can't recall a new game in the "built for AMD" program that had some cool new feature that wouldn't have been possible otherwise. Hell, there have been games in the AMD program that couldn't even run CrossFire at launch. What's the point of a dev being in the AMD program then? NVIDIA made FXAA, which is just about the best AA for the cost regardless of card brand. And it works on AMD cards to boot!

But you can stop being so closed-minded and buy the GameWorks games, because, guess what? You can turn those features off! Don't want HairWorks because it was written by NVIDIA? Untick the box!

Move the slider all the way down, whatever. But boycotting a game because it has those features just hurts PC gaming.

If you can show that the code intentionally hampers an AMD GPU, by all means, please do.

Turning those GameWorks features off and then measuring AMD vs. NVIDIA performance has shown the same relative differences. So, GameWorks on or off, card X is Y% slower than card Z. That disproves the argument.
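
To put numbers on that (FPS figures made up purely for illustration): if card A trails card B by the same percentage with the feature on and off, the feature isn't what creates the gap.

```cpp
#include <cstdio>

// Percent by which card A trails card B at a given setting.
double gap_percent(double fps_a, double fps_b) {
    return (fps_b - fps_a) / fps_b * 100.0;
}

int main() {
    // Hypothetical numbers purely for illustration.
    std::printf("GameWorks on : A trails B by %.1f%%\n", gap_percent(45.0, 52.0));
    std::printf("GameWorks off: A trails B by %.1f%%\n", gap_percent(61.0, 70.5));
    // Same ~13.5% gap either way -> the feature isn't the cause.
}
```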

[offtopic a bit]
Fanboy psychology: http://lifehacker.com/the-psychology-of-a-fanboy-why-you-keep-buying-the-sam-1300451596
 
WTF... resources simply to overclock a video card? What the hell is happening here?

Time is also a resource, and it's not just about overclocking a card; it's pretty much redoing an entire review just to fully see the difference the higher clocks make.
 
If only this were a review site dedicated to hardware reviews, particularly at the enthusiast level... If only...


Honestly, if we get the tools to mess with voltage and really test the limits of overclocking, and [H] doesn't bother with it, it would be a clear sign (to me) of a very extreme bias.
 
you mad bro?



wtf is up with you people? The card is slower than we had hoped. Get over it.
 
Not mad at all; a little reading comprehension goes a long way. Look at my sig first, and you'll see I'm (currently) an NVIDIA user.


This is a flagship card and is competitive with the 980 Ti, though priced just a bit too high for what it delivers. The card isn't bad; the price is. If overclocking becomes an option, things will get very interesting, and I'd like to see the results. [H] not bothering to test overclocking on a flagship card would be silly of them, considering the type of site they are. It's that simple. Bro.
 
They addressed the overclocking back on page 2 or 3 or something. If they're able to unlock the voltage and get some real juicy OCs, they'll do a follow-up review.

I had just read quite a few scathing comments from fanboys implying impropriety on Brent's part for his choice of games.

I overreacted and thought you were one of those fanboys. My bad :)

Yeah, the card should be priced about $50-$100 lower.
 
Honestly, if we get the tools to mess with voltage and really test the limits of overclocking, and [H] doesn't bother with it, it would be a clear sign (to me) of a very extreme bias.

Don't hold your breath; this is a water-from-a-stone situation. If AMD felt there was a point to unlocking voltage, they would have (or should have) made damn sure it was unlocked for reviewers. There is obviously some technical problem. It's also apparent there was a significant disconnect between engineering and marketing when they made all those big boasts during the E3 reveal about 'overclocking like no tomorrow'.

That said, your "if they don't do X, then they're extremely biased" statement is absurd. Above all else, these guys are technology enthusiasts; it's very apparent in every review. I remember Brent in particular defending AMD vehemently when Maxwell launched last year and people were calling it the death knell for AMD. His response was basically, "Why would you want that? We're all screwed if that happens." And he was right.
 
A lot? 4K display owners who threw down over a grand are not even 1%. They are the 1% of the 1% of the 1%.

However, the people putting down over a grand for a 4K TV for gaming are also the people willing to put $600+ toward a premium, enthusiast-level video card. The average midrange gamer doesn't usually have that kind of money; it is enthusiasts with bigger budgets this card should be catering to. Leaving out an important feature like HDMI 2.0 puts the card in quite a severe identity crisis.
 
They addressed the overclocking back on page 2 or 3 or something. If they're able to unlock the voltage and get some real juicy OCs, they'll do a follow-up review.

That's all I'm asking: that when overclocking becomes a viable option, they give the card a fair shake. Besides, these are probably the last high-end cards we are going to see for quite some time. We need to milk as much content as we can out of them.
 
A lot of you guys have tunnel vision. You see a few things and think that's all that matters. The review was written for a larger, broader audience. Sorry your feelings are hurt, but the majority of people here do care that the price seems out of touch with the features. There are millions of people with DVI displays who cannot use this card; fact: adapters introduce latency, go research it yourself. There are a lot of people who have invested $1,000-$2,000+ in HDMI 2.0 4K displays, and this card is useless to them. Then there is the question of having only 4GB of RAM. I could make other points as well.

Exactly my thoughts on the farcical, semantic defense of what is, in a very real sense, a disappointing, underperforming, overpriced, inflexible, and feature-lacking product. Way too much treehouse-club blindfolded bro-fisting (the kind without touching fists) going on.
 
I just saw the TweakTown review; the Fury X performs slower than or similar to the 390X in a lot of their tests, which just doesn't make any sense. I have no idea what the hell is lacking in this card, or whether some cards are just duds. Hopefully AMD comes out with a newer architecture and finds where the bottleneck is; it seems like GCN is on its last legs, or it just isn't performing at these clock speeds and is held back by 28nm. I'm thinking those shaders might be begging for higher clock speeds, but who knows. I bet AMD is already done with this experiment and probably has all its attention on next-gen HBM2 and its next-gen chips. It seems like these chips are begging for 14nm, and 28nm just isn't cutting it for GCN.
 
That said, your "if they don't do X, then they're extremely biased" statement is absurd. Above all else, these guys are technology enthusiasts; it's very apparent in every review. I remember Brent in particular defending AMD vehemently when Maxwell launched last year and people were calling it the death knell for AMD. His response was basically, "Why would you want that? We're all screwed if that happens." And he was right.

That's your opinion, and you're of course entitled to it. It doesn't change my mind, though. If the option becomes available, I'd like to see [H] explore it, and not just [H] either; I'd expect the same from AT, Guru3D, OCN, etc. If it doesn't become an option, then there's no issue.
 
I like the noobies posting here accusing the review of bias. Strange that these people only come out of the woodwork to post when a negative review comes out.
 
Couldn't possibly be that reasoned minds are calling out a suspect review...
 
Funny how some people refuse to accept the outcome and instead choose to attack the [H] reviewer and editors.

It's unfortunate that this GPU isn't what we all hoped it would be. Whether the author sugarcoats the article or gets straight to the harsh truth makes no difference to the outcome. But from a consumer's perspective, I do not want to read an article that tries to be nice about a product that isn't good. If it's a spade, call it a spade.
 
TBH, AMD didn't do that bad a job considering their lack of R&D funds, but NVIDIA still beats them on most occasions. Some sites also test 4K with older games, and the Fury X comes off in a better light in those tests.

But still... it's weird that their old rebranded card has 8GB and is sold for 4K gaming, and then they bring out a new "super" card with only 4GB of memory. Eh.

I'm not that interested in the 980 Ti or the Fury X. What I'm really interested in is the Fury Nano, but I'll have to wait a while for it to come out and be tested.
 
I'm trying to understand what you want us to do, because it sounds like you want us to cherry-pick games based on features and ignore one brand's features, thus creating a bias, and thus not evaluating the new, common, popular games people are playing on the PC today.
Fair enough. It is of course a valid approach to make this strictly about the currently popular games and which card performs better in them. A lot of gamers care only about that. [H] has chosen this path, and of course everybody needs to respect that choice.

However, if you look at gaming as a whole, there are business practices that are beneficial to it and practices that are detrimental. In console gaming, for example, moneyhatting exclusives is one of the detrimental practices.
You may want to call it bias, but if you look beyond mere game popularity, you can provide value for more readers, namely those who are critical of GameWorks/TWIMTBP as something that helps game creators but also hurts gaming in a way.

Okay... I just don't get why you even care what these companies are doing to their competitors, unless you work for one of the affected companies. This is AMD's problem to deal with, not yours. Trust me, no AMD rep is gonna be white knighting for you. They just want your money, same as Nvidia.
There are ethical ways to make money and there are unethical ways, and not all companies are the same here. I'm not saying AMD are saints (they are not), but there is a distinct difference in how they approach things vs. NVIDIA: open standards and technologies that encourage competition vs. closed ones that lock out the competition.

I have yet to see this proven. I think it's more an easy excuse for when a card fails to live up to the hype.
It is hard to prove, because only NVIDIA knows what is going on inside GameWorks.

However, there are a few instances where it became quite obvious that cooperation between NVIDIA and game developers had the specific goal of hurting the performance of competing cards. The Crysis 2 tessellation comes to mind; Batman: Arkham Asylum's anti-aliasing is another.

I like the noobies posting here accusing the review of bias. Strange that these people only come out of the woodwork to post when a negative review comes out.
The people critical of this review are not only noobies.
 
The review was in no way professionally written, sorry. I didn't post because I was in conflict with the results, and I didn't ask for any of the results or facts to be changed. I simply stated that, to be professional, the format and placement of certain information needed to be different, along with the word usage.

Some of you apparently have little understanding of format and connotation when it comes to article writing and reporting in general. The way this particular article is written invites accusations of bias whether any exists or not, a lot like what WCCF does to get page hits, which most here would agree is deplorable.

I already conceded that the facts seemed in line with most reviews, with the exception of BF4, so I am not alleging bias here. I would just rather read an article based on nothing but facts and findings UNTIL the CONCLUSION, where, if the author feels the need to "pull no punches" and "tell the truth" of the situation, he can do it there. That format is unlikely to invite scrutiny over bias. But then again, it doesn't add to page hits or water-cooler talk, hence why quite a few sites employ tactics like the ones we saw in this article.
 
Well, going by one of Kyle's earlier posts today, he hasn't heard of any OC tools being released. Does AMD even plan to unlock the Fury's voltage officially? It seems it's all rumor if [H] hasn't been briefed on it...

If the thermal camera images (covers removed) are accurate, I most certainly would not touch the voltage.
 
There was nothing wrong with the article. It was bold and to the point, which is the [H] way of writing reviews.

The same type of article has been written about other vendors, including NVIDIA, and you guys didn't have a problem then. So why now?

Is it because [H] has disturbed the established order by writing critically about a company you guys love so much?
 
The review was harsh but fair. It doesn't attempt to suck up; it actually delves into the real issues with AMD's Fiji product and its HBM.

HBM is an exciting new technology that definitely needed to be introduced, especially with future GPUs in mind.

That said, looking at the benchmarks and features, the 980 Ti just outperforms it. Yes, the 980 Ti doesn't have HBM, but at the end of the day people are going to look at the performance and features a product has, not at fancy new tech that AMD never really showed making up for everything else that came up subpar.

It's sad to badmouth AMD, considering they've done a lot of good, innovative things for the industry, like the push for Mantle, which forced Microsoft's hand to come out with DX12, and FreeSync as a cheaper, more available alternative to NVIDIA's G-Sync.

NVIDIA is basically two steps ahead of AMD at this point. Their high-end product performs better, and they just waited for AMD to release Fiji first and see how it fares; they'll release Pascal slightly later to steal AMD's thunder, after learning from AMD's mistakes and hiccups at the Fiji launch.

So what we learned is that 4GB of HBM doesn't cut it versus 6GB or more of VRAM. Hopefully Pascal's HBM2 will have 8GB... heck, I'd be happy with even 6GB.
 
Thanks for the review!

While it's not the card I was expecting or hoping for, I still went ahead and bought one. It's a big step up from what I've got now, and I'm getting tired of CrossFire. Going to stick to one card for the time being :cool:
 