AMD Radeon R9 Fury X Video Card Review @ [H]

Lazy console game developers are the reason cards need 8 GIGS OF RAM right now! The console architecture, with its 8GB of shared memory, is the reason for this atrocity, and why cards with a whopping 4GB of VRAM run slow.
 
Man, that HDMI move is baffling. Why would they do that? It's not like 4K TVs come with DisplayPort.
 
> You guys are catching a lot of criticism because you test only a few games. Testing such a low number has, perhaps, caused you to reach different conclusions, or at least far more negative ones, from pretty much every other review site on the planet. Perhaps in the future you could do 10 or 15 tests instead of just a couple?

Your assertion that they are getting different results from other review sites is 100% false. The only site showing the Fury X performing better is THG, and their "testing methodology" can only be loosely labelled as such.

There is no reason to test 10 or 15 games. Test the latest ones that make GPUs go up in smoke, and only those, because they are the only tests that truly matter... unless you really think running WoW or Team Fortress 2 at 300FPS+ actually matters.
 
Dislike the very end of the review:

> In terms of gaming performance, the AMD Radeon R9 Fury X seems like better competition for the GeForce GTX 980 4GB video card, rather than the GeForce GTX 980 Ti. GTX 980 cards are selling for as low as $490 today. This is not a good thing since the AMD Radeon R9 Fury X is priced at $649, the same price as the GeForce GTX 980 Ti.

This seems like a bit of an exaggeration. There is a pretty big gap between the 980 and the 980 Ti, and the Fury X is much closer to the 980 Ti than to the 980. Seems like an unwarranted dig, especially considering the 980 was not part of the review.

A heavily overclocked 980 would smoke this thing, and you know it.
 
I am not talking about price; I am talking about tech...

Who the fuck cares about tech?

If there's a video card out there that has a tiny monkey inside banging rocks together, but it somehow spits out twice the framerate of a Titan X in an apples-to-apples comparison while consuming a similar level of power at a reasonable noise level (quiet down, monkey!) and sells for under $1000, I'm buyin' that sumbitch!
 
> You guys are catching a lot of criticism because you test only a few games. Testing such a low number has, perhaps, caused you to reach different conclusions, or at least far more negative ones, from pretty much every other review site on the planet. Perhaps in the future you could do 10 or 15 tests instead of just a couple?

You do realize that you're asking for 2x-3x the workload for the review, right? Also, they don't recycle old benches (to ensure that you're getting a like-for-like comparison of the products/drivers at the time of review), which means a lot of testing. You get a very straightforward snapshot of the overarching system (software and hardware).

This is all for better or worse, but at least HardOCP is open and forthright about this. Given that GFX cards and their relationship to games is constantly evolving, it seems a pretty fair compromise.

All told, though, you have to remember that the design pipeline for ASICs like these is several years long, and that means some major speculation about the future landscape. This looks like an architecture designed years ago for a smaller node, and then aspects of it had to be pushed and pulled to fit into a 28 nm space. Likewise, the present computing demands of modern games may or may not fit well with the capabilities of the chip (I honestly don't know, but it *seems* that NVidia's chips are better aligned). It was already clear that HBM was not going to make a huge difference performance-wise, given that the wide GDDR5 pipes of NVidia's cards aren't bandwidth limited. Then you have what look like rather immature drivers.

None of that is an excuse, mind you; I could probably write something similar for NVidia. This stuff is hard.
 
Brent and Kyle can continue to claim that the blackbox crippletastic garbage that is Gameworks makes no difference in the results of AMD cards if they so choose. The simple truth is that AMD cannot fully de-cripple the performance of AMD cards in Gameworks games--because they cannot see the code in them to do so--and THAT---3/5 of [H] tested games being GW and 4/5 being TWIMTBP---is the reason the [H] review skews so heavily in the 980Ti's favor versus many other sites which show the Fury X basically on par with the 980Ti/Titan.
 
Early drivers, same ballpark as the 980 Ti.
Awesome.

Indeed. After the drivers mature things will be different. I believe that is the biggest point missing from this review. Other than that, a good review, and a disappointing initial showing from AMD.

I'm pretty sure this is going to be a non-issue overall, though, because the Fury X is sold out all across Newegg. Clearly, people want to buy it. Just wait for better drivers; that is a situation anyone buying cutting-edge hardware should be familiar with.
 
> Brent and Kyle can continue to claim that the blackbox crippletastic garbage that is Gameworks makes no difference in the results of AMD cards if they so choose. The simple truth is that AMD cannot fully de-cripple the performance of AMD cards in Gameworks games--because they cannot see the code in them to do so--and THAT is the reason the [H] review skews so heavily in the 980Ti's favor versus many other sites which show the Fury X basically on par with the 980Ti/Titan.

I have to agree with this.
 
Lots of "fury" in this thread. It's not a bad card, it's just positioned wrong. If they'd priced and positioned it to compete with the 980 the reviews would be singing its praises. Up against the 980ti is where the fail rage is coming from. AMD made a big mistake. It's not like they didn't know what the 980ti could do. Even if they change the price a week from now the damage will already have been done.
 
> Brent and Kyle can continue to claim that the blackbox crippletastic garbage that is Gameworks makes no difference in the results of AMD cards if they so choose. The simple truth is that AMD cannot fully de-cripple the performance of AMD cards in Gameworks games--because they cannot see the code in them to do so--and THAT is the reason the [H] review skews so heavily in the 980Ti's favor versus many other sites which show the Fury X basically on par with the 980Ti/Titan.

So your solution is to ignore and not use common, popular games that gamers are playing right now on the PC? Just weed out, ignore, and cherry-pick games based on the features they support?

That, to me, spells bias.
 
> Brent and Kyle can continue to claim that the blackbox crippletastic garbage that is Gameworks makes no difference in the results of AMD cards if they so choose. The simple truth is that AMD cannot fully de-cripple the performance of AMD cards in Gameworks games--because they cannot see the code in them to do so--and THAT is the reason the [H] review skews so heavily in the 980Ti's favor versus many other sites which show the Fury X basically on par with the 980Ti/Titan.

In the end that isn't going to matter because they are games that people play. If the card doesn't perform in the games that I play, why would I care if it wins in some games I don't play? You buy what performs in what you play, no matter what the reasons are.
 
Performance is lacking, and the missing HDMI 2.0 is a serious fail. I mean, really? How can you focus on 4K gaming and leave that out? The last AMD card I had was the 6870, a great bang for the buck. The drivers were buggy, but the card was ahead of its time. If this were priced at $400 or so it might fit into those shoes, and people could live with the lower performance, memory, and buggy drivers.
 
> Brent and Kyle can continue to claim that the blackbox crippletastic garbage that is Gameworks makes no difference in the results of AMD cards if they so choose. The simple truth is that AMD cannot fully de-cripple the performance of AMD cards in Gameworks games--because they cannot see the code in them to do so--and THAT is the reason the [H] review skews so heavily in the 980Ti's favor versus many other sites which show the Fury X basically on par with the 980Ti/Titan.

Well, if AMD would get off its ass and actually work with developers again like they used to, we wouldn't see so many GameWorks games. A couple of years ago AMD was really aggressive about courting developers, but lately they've done fuck all.
 
> You do realize that you're asking for 2x-3x the workload for the review, right? Also, they don't recycle old benches (to ensure that you're getting a like-for-like comparison of the products/drivers at the time of review), which means a lot of testing. You get a very straightforward snapshot of the overarching system (software and hardware).
>
> This is all for better or worse, but at least HardOCP is open and forthright about this. Given that GFX cards and their relationship to games is constantly evolving, it seems a pretty fair compromise.

Yep, the [H] provides us with a unique look at video card reviews that goes beyond canned benchmarks and straight numbers comparisons. Go to practically any other review site if you want to see canned benchmark runs for a larger variety of games.
 
To all the people claiming GameWorks is the problem.

Why are you buying video cards? Do you play PC games?

Are you going to ignore a game and simply not play it because it has GW features? Do you just not enjoy doing a lot of PC gaming?

I'm trying to understand what you want us to do, because it sounds like you want us to cherry-pick games based on features and ignore one brand's features, thus creating a bias and not evaluating the new, common, popular games people are playing on the PC today.
 
> Are you going to ignore a game and simply not play it because it has GW features? Do you just not enjoy doing a lot of PC gaming?
It doesn't bother me, but you would be surprised how many AMD owners intentionally avoid purchasing GameWorks games. Well, that's what they say. They might actually buy and play the game when nobody is looking.
 
> Brent and Kyle can continue to claim that the blackbox crippletastic garbage that is Gameworks makes no difference in the results of AMD cards if they so choose. The simple truth is that AMD cannot fully de-cripple the performance of AMD cards in Gameworks games--because they cannot see the code in them to do so--and THAT---3/5 of [H] tested games being GW and 4/5 being TWIMTBP---is the reason the [H] review skews so heavily in the 980Ti's favor versus many other sites which show the Fury X basically on par with the 980Ti/Titan.

Right or wrong, people still want to see performance in these titles from this GPU.

Brent has said in this thread that the games are chosen for pushing graphics, not for ulterior motives of painting anything in a different light.

What would be the point of the review if, say, all they showed us was Tomb Raider, Sleeping Dogs, Hitman, Deus Ex, and Dirt? All 'AMD' games... and also two years old on average.
 
Lots of baseless accusations being thrown around; good lord, grow up (GameWorks, no GameWorks, it's better than the review shows, etc., etc.).

1. This chipset was supposedly going to blow everything out of the water, according to everyone from AMD's PR campaign to the fanbois in the forums.

2. It didn't; it's sort of on par with the 980 Ti. If I were an AMD fan I would be pissed.

3. AMD just has worse driver support, even without HairWorks; it always has (I run both). If you want a powerhouse of a machine that you don't have to babysit, go Nvidia; if you want to be forever tweaking that one game that won't run well, go AMD, because it used to be faster and is now merely on par.

4. People get mad about graphics downgrades, day-one patches, and DLC, yet a GPU gets a free pass because the drivers need tweaking?
 
> Are you going to ignore a game and simply not play it because it has GW features? Do you just not enjoy doing a lot of PC gaming?

I personally might, because I don't agree with nVidia's way of doing things, but that's not a reason to take them out of reviews. I'm glad [H] uses the most graphically demanding games out there; why buy a monster card to play WoW?
 
Once I read The Tech Report's in-depth article on the architecture, I knew it would be a bust.

Same number of ROPs, but more shaders and texture units? That screams unbalanced architecture, which would mean very inconsistent performance compared to the 290X.

They also revealed that the die size was limited by the interposer, which is probably why they couldn't fit more ROPs. That, combined with the total fuckup of an 8GB 390 versus a 4GB Fury, makes it look quite stupid.

It doesn't matter how fucking fast your VRAM is if you have to swap over PCIe. Who the fuck approved this architecture? GDDR5 made the cut on the 4870 because the density was finally comparable to GDDR3.
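A quick back-of-the-envelope sketch of that point, using public spec-sheet numbers (peak theoretical figures, not anything measured in the [H] review):

```python
# Rough spec-sheet figures (peak theoretical values, not measured data)
fury_x = {"shaders": 4096, "rops": 64, "mem_bw_gbps": 512.0}   # HBM1: 4 stacks x ~128 GB/s
r9_290x = {"shaders": 2816, "rops": 64, "mem_bw_gbps": 320.0}  # GDDR5 on a 512-bit bus

# Same ROP count but far more shading power per ROP on Fiji -> the "unbalanced" complaint
print("Fury X shaders per ROP:", fury_x["shaders"] / fury_x["rops"])    # 64 : 1
print("290X   shaders per ROP:", r9_290x["shaders"] / r9_290x["rops"])  # 44 : 1

# Once a 4GB card has to page assets over the bus, the effective bandwidth is the
# PCIe link speed, not the HBM speed.
pcie3_x16_gbps = 15.75  # ~985 MB/s per lane x 16 lanes, theoretical peak
print("HBM vs PCIe link, roughly:",
      round(fury_x["mem_bw_gbps"] / pcie3_x16_gbps), "x")
```

In other words, on these rough numbers the HBM is on the order of 30x faster than the PCIe link it would have to swap over.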
 
Like I said earlier, the fact that many new games have GW features means something. It just is what it is. Our job is to show you how games perform, find the gameplay experience, and show you which video card does the best job for the money. In order to do that we must evaluate the game, no matter what features it supports; a game is a game is a game. To show you how Batman performs, we must test Batman. That's it, really. No motives, no agendas; we just want to show you which cards deliver the best experience. If you guys want games to use more AMD features, petition AMD to convince developers; it's all in the devs' hands. NVIDIA plays ball hard, and AMD has to do the same.
 
> Brent and Kyle can continue to claim that the blackbox crippletastic garbage that is Gameworks makes no difference in the results of AMD cards if they so choose. The simple truth is that AMD cannot fully de-cripple the performance of AMD cards in Gameworks games--because they cannot see the code in them to do so--and THAT---3/5 of [H] tested games being GW and 4/5 being TWIMTBP---is the reason the [H] review skews so heavily in the 980Ti's favor versus many other sites which show the Fury X basically on par with the 980Ti/Titan.

So go and ball ache AMD for not providing code that can best leverage their gfx cards.
Then developers would be able to include it.
Until then, buy NVidia if it bugs you so much.
 
> It doesn't bother me, but you would be surprised how many AMD owners intentionally avoid purchasing GameWorks games. Well, that's what they say. They might actually buy and play the game when nobody is looking.

Actually, most AMD people would just pirate it.

There's no GameWorks game out right now that piques my interest. They are all garbage (e.g. Arkham Knight). If I really, really had to play an Nvidia-sponsored game, I would just buy the console version, which is obviously more polished than most PC games.
 
What a major disappointment.

Really, the same price as the 980 Ti... no thanks.
 
GameWorks effects are meh at best, and the performance cost per visual improvement is way too high (The Witcher's hair while riding a horse is hilarious).
DX11 multithreading should have been a game-side optimization that hardware makers write a driver for, not hardware makers writing multithreaded drivers for individual games. GG, lazy devs and greedy publishers!
 
> Brent and Kyle can continue to claim that the blackbox crippletastic garbage that is Gameworks makes no difference in the results of AMD cards if they so choose. The simple truth is that AMD cannot fully de-cripple the performance of AMD cards in Gameworks games--because they cannot see the code in them to do so--and THAT---3/5 of [H] tested games being GW and 4/5 being TWIMTBP---is the reason the [H] review skews so heavily in the 980Ti's favor versus many other sites which show the Fury X basically on par with the 980Ti/Titan.

Every other site, including The Tech Report, shows the Fury X trailing the 980 Ti by at least 10%. [H] is spot on with its review.
 
If you're building a high-end system and don't pick a good case to begin with, then you are an idiot.

Also, high-end GPUs and CPUs require high-end power supplies. That also means a custom build, most likely with a good case.

Or are you dumb enough to try to fit one of these into a Dell Latitude?

There are thousands of chassis to pick from out there and, like my builds, a DIY or custom performance enthusiast system can certainly come in a uATX or mini-ITX form factor these days. Besides, enthusiasts are but a sliver of the overall consumer market. Enthusiasts that are ultra-picky about their chassis choice represent but a sliver of all enthusiasts. And those that want water cooling are but a sliver of that sliver. But let's just keep defending AMD's stupid choice to market a niche product in the mainstream segment with semantics. :rolleyes:
 
> Actually, most AMD people would just pirate it.
>
> There's no GameWorks game out right now that piques my interest. They are all garbage (e.g. Arkham Knight). If I really, really had to play an Nvidia-sponsored game, I would just buy the console version, which is obviously more polished than most PC games.

How do you explain the GTA V benchmarks, with many sites reporting a 15 to 20% advantage for the 980 Ti?

Just look at the [H] review for some insight.
 
> You guys are catching a lot of criticism because you test only a few games. Testing such a low number has, perhaps, caused you to reach different conclusions, or at least far more negative ones, from pretty much every other review site on the planet. Perhaps in the future you could do 10 or 15 tests instead of just a couple?

Yawn, head back over to the ADF echo chamber at AT. If you want canned benchmarks, you are the problem, not [H].
 
> To all the people claiming GameWorks is the problem.
> ...
> Are you going to ignore a game and simply not play it because it has GW features?

Can't speak for others, but I personally will not give one cent to purchase or play TWIMTBP/GameWorks games or Nvidia and Intel products, because I simply DO NOT condone intentionally gimping a competitor to make your own products look better. I DO NOT condone or agree with intentionally gimping test suites/benchmarks to adversely affect your competitor(s), a la Nvidia's and Intel's unethical/monopolistic tricks, past and present.

In any review that I personally read, GameWorks/TWIMTBP results are discarded, because unless and until the code paths are transparent enough for the world to verify that nobody is playing tricks, equal footing CANNOT be assumed, given Nvidia's and Intel's past and current behavior in this regard.

If I do not agree with or condone the tactics that Nvidia and Intel choose to perpetrate, I choose not to support them economically in any fashion whatsoever.

Simple.
 
What a puzzling card. So, maybe if you want something a little better than a 980 but a little worse than a 980 Ti, and you want it to be small and quiet and not dump hot air into the case, say if you're building an SFF gaming rig or something... then maaaybe buy this card? (Or spend half as much on a mini 970 and call it a day, since a 970 is still a great value for the money?)

That seems to be the only niche this fills. Which seems weird for a supposed new flagship card.
 
This might not be the appropriate place to ask, but seeing how BF4 is two years old now, will you guys be replacing it with something else, such as Star Wars Battlefront? The rest of the games seem fine since they are popular, but Battlefield 4 doesn't seem to be too stressful. If Frostbite must be included, would Dragon Age: Inquisition be a viable alternative?
 
> It doesn't bother me, but you would be surprised how many AMD owners intentionally avoid purchasing GameWorks games. Well, that's what they say. They might actually buy and play the game when nobody is looking.

I have an nVidia card and I avoid those games on principle, but I may grab them when they're on clearance in a few years if they look like fun.

My concern is that, if this keeps going, at some point I will not be buying my next video card by choice.
 
> To all the people claiming GameWorks is the problem.
>
> Why are you buying video cards? Do you play PC games?
>
> Are you going to ignore a game and simply not play it because it has GW features? Do you just not enjoy doing a lot of PC gaming?
>
> I'm trying to understand what you want us to do, because it sounds like you want us to cherry-pick games based on features and ignore one brand's features, thus creating a bias and not evaluating the new, common, popular games people are playing on the PC today.

I think the games you picked are fine. If nvidia manages to snag popular AAA titles for GW, that's not your problem.
 
Zarathustra[H] said:
> Just because I am curious. Which titles would you have tested on?

Essentially, the conclusion that "Fury X is more evenly matched against the GTX 980 (not the Ti), which costs $150 less" is one that has been repeated in just about every review I have read thus far.

I think the only exceptions to this have possibly been Crysis 3 and Metro Last Light, in which the Fury X performed better than in other titles. Choosing specific titles to make the Fury X look good would, however, not be an appropriate review methodology.

The titles that HardOCP ran with are representative of what people are playing, and representative of the titles in this generation, and I think they did a good job.

For the record, I'm not an AMD apologist. They've really screwed up this product cycle. Re-releasing old GPUs and throwing out a half-assed prototype that wasn't ready as a flagship are some seriously terrible decisions. They most certainly should have waited until they could put Fiji GPUs in the whole series and had a more mature driver state. They fucked up.

But at the same time, despite what someone else said earlier, it's not just Tom's Hardware that is showing contrary results.

But to answer your question: Alien: Isolation, Assassin's Creed (any), BioShock, perhaps Civ (though it probably wouldn't have amounted to much relevant info), Crysis 3 (or even better, 1), Dragon Age 3, Metro, Murdered: Soul Suspect, DiRT 3, Sleeping Dogs, Hitman, Company of Heroes, Watch Dogs, Star Citizen... OK, I'm getting bored with this. You get the point.
 
> There are thousands of chassis to pick from out there and, like my builds, a DIY or custom performance enthusiast system can certainly come in a uATX or mini-ITX form factor these days. Besides, enthusiasts are but a sliver of the overall consumer market. Enthusiasts that are ultra-picky about their chassis choice represent but a sliver of all enthusiasts. And those that want water cooling are but a sliver of that sliver. But let's just keep defending AMD's stupid choice to market a niche product in the mainstream segment with semantics. :rolleyes:

Look, even with uATX and mini-ITX chassis, if you are relying on your PSU fan or a small 80mm to keep high-end hardware cool, then you are an idiot.

Big power = big heat. Get a damn case that can handle it.

Here are some choices with 120mm radiator support:
http://us.coolermaster.com/product/Lines/case-120mm.html
 
> Basically, anyone who points their finger at GameWorks as the blame wants us to not use games to evaluate the performance of these video cards because they have GW 3D effects the developer has chosen to use in their games.
>
> You know how ridiculous that is?
>
> If you don't want to see how games perform between video cards, then what the heck are you buying your video cards to do?
>
> I really don't get it. The only way to find out the kind of gameplay experience a video card delivers in games is, well, to use those games. Theoretical benchmarks ain't gonna tell you how Batman performs; the only way to find out how Batman performs on video cards is to test Batman.
>
> Maybe I'm thinking with too much common sense.

The Gameworks effects are an issue, but not the main problem, in my view at least.

Whether any added effects are on or off, we can see the performance penalty on different systems and draw our own conclusions. The business of locking out the competition brings us back to consoles, mind you, not PC gaming.

Now, to discuss gameplay, and this is where I see bigger issues coming into play. GameWorks STILL dictates what the baseline gameplay performance will be for all systems, and hides this behind NDA documents. Nobody can tell what is happening. If Nvidia decides to optimize THEIR performance, OK; that is not good for other hardware, to be sure, but Nvidia bought this privilege and the developer agreed to be bought out. But maybe, just maybe, they do this in a way that reduces performance on other systems. What then? Nobody is allowed to see it. So the baseline for the competition is lowered while Nvidia's own baseline is raised simultaneously. Best case scenario, the competition is able to reverse engineer the instruction sequence and hack in a patch, somewhat blindly, to offset some of the penalty, taking extra time and resources just to get back to square one.

Bottom line is nobody except one corporation knows, and dictates, what exactly the performance is behind the code locked in by threat of legal action.


Now, when [H] or other review sites gather data, it is real, valid, real-world data. But it was potentially manipulated to begin with. So as a consumer one should be wary of these game titles. You have no choice, mind you, if you want to play them, but you do have a choice about whether you want to financially support this type of practice and encourage this type of corporate behavior. So the more GameWorks-heavy a review's game selection is, the more a consumer has to take this into consideration when looking at the overall review.
 
In an SFF build you would not only have to worry about where to put the radiator, but where to run 15 inches of hose.
 
> In an SFF build you would not only have to worry about where to put the radiator, but where to run 15 inches of hose.

That's what she said
 