ASUS STRIX R9 Fury DC3 Video Card Review @ [H]

I don't disagree, but is there a single card that can run 4xAA at 1440p with 60+ fps? 1440p is still a pretty high resolution, and quite demanding for cards.
My 980 Ti runs Witcher 3 at 2560x1440, Ultra settings, HBAO+, at a 60+ fps average. Very few dips into the 50s. HairWorks destroys it, but otherwise I think maxing out games at 2560x1440 with a single card at 60 fps is now possible.
 
Not when they cripple performance on the competitor's hardware (and in some cases their own). See HairWorks vs. TressFX. That's not good for competition, and thus not good for us.

Nvidia performance with TressFX was abysmal when it first came out, but only a couple of drivers later it was faster than AMD.
 
I don't disagree, but is there a single card that can run 4xAA at 1440p with 60+ fps? 1440p is still a pretty high resolution, and quite demanding for cards.
With the exception of Witcher 3, I can maintain an average of 60 fps with max details and 4xAA in almost all the latest games, including GTA V, Dying Light, Far Cry 4, and Batman AK, with an overclocked 980 Ti. It does dip to 45 fps, but for the most part it stays around 60 fps in these games.

P.S. This is with all the Nvidia options turned up, and it's one of the reasons I buy Nvidia. Playing the new Batman with fog, smoke, and debris was awesome.
 
Yeah, pretty sure he just auto-posted that. By itself, it loses across the board to the 980 in power and cost, yet is mostly even in performance. CrossFire, however, should be stellar for 4K, as we have seen with the Fury X.

Here is the one thing that concerns me about the Crossfire stuff.

The CrossFire/SLI market is so slim; does it really make financial sense to sell cards that offer value only in that market?

Not sure what the actual market share is, but it seems like a risky move on AMD's part.

I'm sure it would push a good number of [H] users to a multicard solution if the scaling issues were resolved by AMD.

It could also be part of a long-term strategy where they realize that a single-card solution with DDR RAM just isn't the future. If you have the driver support and the IP, then even in a liquidation or sale the company would be worth money.
 
Anandtech's review shows Fury solidly ahead in Crysis 3 as well - though not by 25%.

I can't find anywhere in the reviews a statement of how they performed the testing.
They overclocked the card by 7.5% and 10%, too.
 
Is AMD really trying to price themselves out of the market? I felt that the Fury X was about $100 overpriced, and it just seems like the vanilla Fury should really be a $480 card. I miss the days when AMD's $300 ($500 in 2015 GPU prices) 4870 (the Fury X of its day) could compete with nVidia's top-of-the-line $650 ($1,000 in 2015 GPU prices) GTX 280 (the Titan X of its day).

I'm not the only one smelling price-fixing in the air between nVidia and AMD again; I hope someone at the SEC is paying attention. Why else would you release a somewhat competitive product for the same price or more than your better-performing competition?

AMD and nVidia were found to be price-fixing in 2006, and the current prices lead me (and many I've spoken to) to believe they are price-fixing again. Why else would AMD be charging $550 for a card that is at best equal to their competition's $500 card? It's as if they don't really want to compete in the market, in an effort to artificially increase GPU prices.

I'm just saying, it's funny how in 2009 AMD's top-of-the-line single GPU, the Radeon 5870, retailed at launch for $330, and just six short years later that price has almost doubled to $650. I understand inflation (which is why I say the Fury X was $100 overpriced).
 
AMD used to be the best-value proposition. What happened to them with the current generation of cards?

The pricing at $550 looks ludicrous.
 
Is AMD really trying to price themselves out of the market? I felt that the Fury X was about $100 overpriced, and it just seems like the vanilla Fury should really be a $480 card. I miss the days when AMD's $300 ($500 in 2015 GPU prices) 4870 (the Fury X of its day) could compete with nVidia's top-of-the-line $650 ($1,000 in 2015 GPU prices) GTX 280 (the Titan X of its day).

I'm not the only one smelling price-fixing in the air between nVidia and AMD again; I hope someone at the SEC is paying attention. Why else would you release a somewhat competitive product for the same price or more than your better-performing competition?

AMD and nVidia were found to be price-fixing in 2006, and the current prices lead me (and many I've spoken to) to believe they are price-fixing again. Why else would AMD be charging $550 for a card that is at best equal to their competition's $500 card? It's as if they don't really want to compete in the market, in an effort to artificially increase GPU prices.

I'm just saying, it's funny how in 2009 AMD's top-of-the-line single GPU, the Radeon 5870, retailed at launch for $330, and just six short years later that price has almost doubled to $650. I understand inflation (which is why I say the Fury X was $100 overpriced).

The die size of the 5870 was half that of the Fury, and then there are the R&D costs of HBM on top.
 
AMD used to be the best-value proposition. What happened to them with the current generation of cards?

The pricing at $550 looks ludicrous.

It really seems like the 980 Ti and the price drop on the 980 caught them by surprise and they simply can't react to it. With them selling every single chip that is being produced it's hard to blame them for not bothering right now but it'll be interesting to see what happens once supply catches up with the current demand.
 
If you are buying strictly on a performance-per-dollar ratio, it might not make sense. However, having a factory water-cooling setup is nice and worth it to those who like having one.

Sorry, I'm not following. The $579 STRIX R9 Fury isn't water cooled.

The pricing at $550 looks ludicrous.

Agreed... even if just compared against the Fury X. $100 more for significantly better performance and a CLC to boot makes the vanilla Fury hard to justify. And it gets even worse when the cards with better cooling (like this ASUS STRIX) are priced significantly higher than that "base" price.
 
Would love to know if the 980 was running the same IQ as the Fury card. I don't care about driver default BS.
 
Wow... talk about a bummer of a card. It should be priced at $499 like the 980.

Just my $0.02.
 
It really seems like the 980 Ti and the price drop on the 980 caught them by surprise and they simply can't react to it. With them selling every single chip that is being produced it's hard to blame them for not bothering right now but it'll be interesting to see what happens once supply catches up with the current demand.

You know, I too was puzzled by their pricing initially, but if you think it through, it actually makes sense.

You charge as much as the market is willing to bear. Right now limited supply means every card in stock is being sold at asking price. Until demand dwindles and they stop moving product in volume, it makes no sense for AMD to charge a penny less.
 
The die size of the 5870 was half that of the Fury, and add in the R&D costs of HBM.

Which is why I said the Fury X should have been a $500/$550 card. Take pure inflation (leaving out die size) and it should be a $417 card today. Considering the die-size increase and the HBM memory (which really shouldn't count, as GDDR5 was a newish type of memory for the 5870 too), we come to $500/$550.
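For what it's worth, here is the arithmetic behind that $417 figure as a minimal Python sketch; the ~4%/year rate is just what the figure implies, not an official CPI number.

Code:
# Rough check on the $417 claim: $330 in 2009 compounded over six years.
# The ~4%/year rate is implied by the $417 figure, not an official CPI number.
launch_price = 330          # Radeon 5870 launch MSRP, 2009
annual_rate = 0.04          # assumed, purely illustrative
adjusted = launch_price * (1 + annual_rate) ** 6
print(round(adjusted))      # ~418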
 
ASUS STRIX R9 Fury DC3

... We really need to get them to simplify their naming scheme. That's a mouthful.
 
Some points from [H]'s Fury review:

1. Witcher 3 apples-to-apples: with GameWorks features/options ON, the 980 wins by 15%, while with GameWorks features/options OFF, the Fury wins by 6%. That is a 21-point swing for the Fury depending on whether the GameWorks features are enabled. Interesting.

2. Dying Light apples-to-apples: with NVIDIA/GameWorks features/options ON, the Fury beats the 980 by 3%, while with the NVIDIA/GameWorks features OFF, the Fury's margin of victory over the 980 increases to 31%. That is a 28-point swing for the Fury depending on whether the NVIDIA/GameWorks features are enabled. Interesting.

3. Far Cry 4 apples-to-apples: with GameWorks features/options ON, the 980 and Fury are TIED in performance, while with the GameWorks/NVIDIA features/options OFF, the Fury outperforms the 980 by 7%. That is a 7-point swing for the Fury depending on whether the NVIDIA/GameWorks features are enabled (the swing arithmetic is sketched after this list). Interesting.

4. A new AMD driver has resulted in performance gains for the Fury series. Interesting.

5. Many declared the possibility of GameWorks features/options gimping AMD cards' performance laughable, conspiracy-theory nonsense, yet measured proof of exactly that result is shown in [H]'s testing here.

6. Many declared that AMD driver improvements would not make a significant difference to the Fury series cards' game performance. Yet here again, [H] documents exactly that occurring (most notably in Far Cry 4) only 16 days after the official release of the Fury X.
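To make the swing arithmetic in points 1 through 3 explicit, a minimal sketch (the margins are the [H] numbers quoted above; a negative margin means the 980 wins):

Code:
def swing(margin_on: float, margin_off: float) -> float:
    """Shift in the Fury-vs-980 margin, in percentage points, when features are toggled."""
    return margin_off - margin_on

print(swing(-15, 6))   # Witcher 3: 980 wins by 15% ON, Fury by 6% OFF -> 21
print(swing(3, 31))    # Dying Light: Fury +3% ON, +31% OFF -> 28
print(swing(0, 7))     # Far Cry 4: tied ON, Fury +7% OFF -> 7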
 
I wish people would get off the idea that the R&D for HBM came out of AMD's pocket. Hynix, a memory developer and producer, wasn't just making HBM for AMD. It's going to be in everything from cellphones to CPUs/GPUs, and it didn't put a dent in AMD's R&D budget. There are a lot of companies investing in it. Don't let AMD fool you. Sure, it costs them some money to put it on their chips, so they cut costs on the GPU. Give Hawaii 4096 cores and I bet it's the exact same GPU as Fury. Really, they are as close to bankruptcy protection as a company can get... and there's no supply of cards. I can't imagine what 400-series production will be like if they get there.
 
Too bad [H] couldn't compare Asus' STRIX variations of both cards with the current drivers and the same games.

In the review of the 980 STRIX, the full-load power draw of the 980 STRIX falls right in between the reference 980 and the STRIX R9 Fury.
At 65C, the full-load temperature of the 980 STRIX is significantly lower than that of either card in today's review, which is to be expected.

Since games/game areas, resolutions and driver versions are different, however, a direct comparison of the two reviews can't be made.
 
Perfect card.
It sits in between the 980 Ti and Fury X and can match them.
My gaming will be silky smooth once I get my hands on this baby.
Lol, $579 is ridiculous for this card, as it will be slower than the 980 when both are OC'd. And it's not like the 980 for 500 bucks is a good value anyway. If you can spend 580 bucks on a card, then you would be a fool not to save up 70-80 bucks more for a 980 Ti. An OC'd 980 Ti would be well over 30% faster than an OC'd Fury, and it has more VRAM.
 
Lol, $579 is ridiculous for this card, as it will be slower than the 980 when both are OC'd. And it's not like the 980 for 500 bucks is a good value anyway. If you can spend 580 bucks on a card, then you would be a fool not to save up 70-80 bucks more for a 980 Ti. An OC'd 980 Ti would be well over 30% faster than an OC'd Fury, and it has more VRAM.

I think I would get this card over the 980. OC to OC they are probably close to even, at least at 1440p. I think HBM will show some benefit with future games too.

The problem, of course, is the 980 Ti. When you're already spending $550+ on a graphics card, I don't know why you wouldn't spend another $100 for 15-20% more performance, and it's more like 25-30% when factoring in how well the Ti overclocks.
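A quick back-of-envelope on that, using the rough prices and overclocked performance delta from this thread (estimates, not measured data):

Code:
# Hypothetical perf-per-dollar comparison using the thread's rough figures.
fury_price, ti_price = 550, 650
ti_perf_gain = 0.27                      # midpoint of the ~25-30% OC'd estimate
extra_cost = ti_price / fury_price - 1   # ~0.18, i.e. about 18% more money
print(f"{extra_cost:.0%} more money for ~{ti_perf_gain:.0%} more performance")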

So, yeah, this card is in a weird spot at $550. It needs to be $450, that would put pressure on NV and would prevent the 980 Ti comparisons due to a much lower price.
 
I do appreciate the benches with NVIDIA GameWorks tech turned off, but I don't think the performance advantage of the Fury with non-NVIDIA-specific technology was highlighted enough in the conclusion. When not using technology designed by and for NVIDIA, the Fury can perform up to 30% faster than the 980, and you conclude it's a wash on performance and overpriced?

Certainly I do not begrudge NVIDIA for developing GameWorks, it working better on NVIDIA cards, or game developers using it. But not all games will use these NVIDIA optimizations, and some may even optimize for the Fury cards. Given the performance advantage the Fury has over the 980 using just DX11 features, I can only imagine how much better it would perform if games were optimized for the Fury instead of NVIDIA cards. Maybe it's not so overpriced after all.

I like to play my games with the image quality turned up, as do most people. If the goal is a better gameplay experience, that means using the built-in image quality features the game developer has chosen to use in the game. In that light, it is basically a wash between the 980 and Fury where it matters. Most people don't play games stripped down to the lowest quality settings, unless you are doing competitive BF4 or something.
 
The card seems to perform OK; the market will dictate its price in the coming weeks. As for the review, I'm still not enthused by the game selection, but showing GameWorks features both on and off helps achieve a more accurate performance representation, though it can still be deceiving.

I have to say, after seeing the size of a Fury X, all other graphics cards look huge and dated now, and this one is no exception. Dated is the best way I can describe it: the Fury X looks a step ahead compared to the big honking cards.

We are using the latest games there are: popular games people are playing and giving positive feedback on. Our entire suite has now been updated, except for BF4.

Perhaps the complaint shouldn't be with us, but rather with the state of the game industry and the choices game developers make about feature use in their own games, if you aren't happy with those choices.
 
Good card! Thanks for the review, guys. Also check out the Fury Tri-X; amazing acoustics, apparently.

Too bad you guys didn't get to bench 4K; I was looking forward to those scores. Checking other reviews, they show it's as per usual: AMD does better the higher the resolution. Looking forward to the CrossFire review, as the Fury looks to scale really well with all that bandwidth on board. I think DX12 is really going to open these cards up!

Sorry, without testing at 4K this review is lacking. Thanks for the effort, though...

Curious that [H]ardocp tested the GTX 980 at 4K upon launch, but declined to do so with the R9 Fury. Why is that?

http://www.hardocp.com/article/2014...rce_gtx_980_video_card_review/12#.VZ_5-mZVhBc


"4K performance is not bottlenecked on the GeForce GTX 980. For the most part it is faster than the GeForce GTX 780 Ti and Radeon R9 290X. There are some games where it will break even on performance to 780 Ti"..........Sept 18, 2014


The problem with 4K is not that the card can't render it, but that it does so at framerates most gamers wouldn't be comfortable playing at; you'd have to turn your game quality down to low settings to enjoy games at 4K. Single-GPU video cards aren't ready for 4K yet. Maybe next generation, but not this one.

If you were going to go 4K on Fiji, you'd go with the Fury X anyway, not the Fury.

If you really wanted a single GPU for the best 4K experience right now, though, I'd go with the TITAN X.
 
Not when they cripple performance on the competitor's hardware (and in some cases their own). See HairWorks vs. TressFX. That's not good for competition, and thus not good for us.

Every time this gets brought up, I swear people are just retarded. Mostly because they forget that with TressFX, only one character used it.

It's true that TressFX doesn't use 64x tessellation; however, if you had as many characters on screen in Tomb Raider as in The Witcher 3, you would see just as huge a drop.
 
We are using the latest games there are: popular games people are playing and giving positive feedback on. Our entire suite has now been updated, except for BF4.

Perhaps the complaint shouldn't be with us, but rather with the state of the game industry and the choices game developers make about feature use in their own games, if you aren't happy with those choices.

I think the problem is that the past 8 months or so have been pretty heavy on GameWorks titles, and AMD has had almost nothing to show for it.

The rest of the year's heavy hitters in terms of graphics tech all seem to be AMD-sponsored: Star Wars Battlefront, Hitman, and Deus Ex 4. This should be a prime opportunity for AMD to showcase how well its cards run on its own tech.
 
We are using the latest games there are: popular games people are playing and giving positive feedback on. Our entire suite has now been updated, except for BF4.

Perhaps the complaint shouldn't be with us, but rather with the state of the game industry and the choices game developers make about feature use in their own games, if you aren't happy with those choices.

I will start with a positive, at least from the brief moment I had to look the review over: this review was far more professional than the Fury one a few weeks ago. Much appreciated.

Now the negative, not about the review but about your comment here. It was pointed out that Dying Light is far from popular, as it is barely clinging to the top 100 on Steam. The funny part is that it is probably clinging to the top 100 just from the number of reviewers using it. Anyway, what's more interesting is that someone pointed out some games, as did I, that meet different criteria, such as Skyrim (the other guy's choice) or newer games like Ryse: Son of Rome and Dragon Age: Inquisition. Now, I don't think Skyrim is necessary, although it is still top 10 (amazing, I know). Dragon Age is top 10 as well and is quite new, released within the last year. Ryse isn't quite as popular, but it's new as well.

Anyway, it wreaks havoc with your stated reasons for using this particular set. Of the two newer games, Ryse seems to favor AMD and Dragon Age favors neither, which would go far in alleviating the complaints.

Also, a thought: why not run the canned benchmarks as well? Most run in two minutes or less, so it isn't too taxing on the whole run time. One key negative of manual runs is that when you play a game yourself you can't guarantee exact duplication between runs, which is why canned benchmarks are quite popular, that and the time they take to run. Just a thought.
 
I think the problem is that the past 8 months or so have been pretty heavy on GameWorks titles, and AMD has had almost nothing to show for it.

The rest of the year's heavy hitters in terms of graphics tech all seem to be AMD-sponsored: Star Wars Battlefront, Hitman, and Deus Ex 4. This should be a prime opportunity for AMD to showcase how well its cards run on its own tech.

Yep, it is up to AMD to evangelize its features to developers. It is up to game developers to choose what technologies they incorporate. It is up to AMD to optimize its drivers for games. It is up to us to find out how video cards perform in games and which ones allow the best gameplay experience for the money.

If a lot of games recently have used GameWorks features, then it just is what it is; it sure as heck isn't in my control what game developers choose to use in their games, only to evaluate it and see how it looks and performs.

I am getting very tired of the GameWorks discussion. It is an old and tired argument that people are now using as a punching bag or conspiracy theory to put AMD down. Get over it, I say. I for one like my games to have cool, forward-looking image quality features that make the gameplay experience better. I don't care who it comes from; just get it in there and make my jaw drop.

I am highly looking forward to the next Deus Ex game.
 
I will start with a positive, at least from the brief moment I had to look the review over: this review was far more professional than the Fury one a few weeks ago. Much appreciated.

Now the negative, not about the review but about your comment here. It was pointed out that Dying Light is far from popular, as it is barely clinging to the top 100 on Steam. The funny part is that it is probably clinging to the top 100 just from the number of reviewers using it. Anyway, what's more interesting is that someone pointed out some games, as did I, that meet different criteria, such as Skyrim (the other guy's choice) or newer games like Ryse: Son of Rome and Dragon Age: Inquisition. Now, I don't think Skyrim is necessary, although it is still top 10 (amazing, I know). Dragon Age is top 10 as well and is quite new, released within the last year. Ryse isn't quite as popular, but it's new as well.

Anyway, it wreaks havoc with your stated reasons for using this particular set. Of the two newer games, Ryse seems to favor AMD and Dragon Age favors neither, which would go far in alleviating the complaints.

Also, a thought: why not run the canned benchmarks as well? Most run in two minutes or less, so it isn't too taxing on the whole run time. One key negative of manual runs is that when you play a game yourself you can't guarantee exact duplication between runs, which is why canned benchmarks are quite popular, that and the time they take to run. Just a thought.
No bias here at all :)
I play canned benchmarks all the time, great idea.
 
So why would you buy one of these when you can get similar performance from a 390X for $120 less, with double the VRAM AND overclocking headroom? Or from an 8GB 290X with the same perks for $170 less?

...I guess I should be fair and say I would, since I sidegraded from a pair of 780s to a pair of 970s. Although the cost of each 970 was the reverse of the AMD situation: I paid $170 less for each 970 compared to what I paid for each 780.
Sorry, but the Fury and 980 couldn't play most of the games in the [H] testing suite at 4K with what most would consider acceptable frame rates. Thanks for your thoughts, though.

I'd say the biggest benefit is power usage over the 390X, if that's something you care about, but then again the 980 still uses a bit less than the Fury, so that's still a hard sell. The only other one would have been size, but the heatsink's friggin' massive, so that's not a reason either. The only other selling point I can see justifying it is FreeSync.
 
Did anyone else here know exactly how this review was going to conclude the instant you saw it on the front page?...
 
No bias here at all :)
I play canned benchmarks all the time, great idea.

You apparently have a hard time understanding concepts, so I will spell it out for you.

[H] current review practice:

Albeit closer to real performance, it also cannot be identical from one run to the next. It is easy to skew results one way or another if one so desires. Not saying that is the case here.

Canned benchmarks:

Repeatable each and every time (except StarSwarm). However, not always indicative of real-world gameplay.

Both have positive points and negatives. Mentioning the simple addition of one does not preclude the other, nor does it alone indict anyone of ulterior motives. Wait, those words may be too big. Asking them to add canned benchmarks does not mean I wanted them to stop testing the way they do now; it was just an addition. Nor did I mention it because I believed they were cheating or intentionally crippling one set of cards.
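To illustrate the repeatability point, a minimal sketch with made-up FPS numbers (purely hypothetical, not taken from any review):

Code:
from statistics import mean, stdev

manual_runs = [58.2, 61.7, 55.9, 63.1, 59.4]   # hand-played run-throughs vary
canned_runs = [60.1, 60.0, 60.2, 60.1, 60.0]   # a scripted benchmark barely varies

for label, runs in (("manual", manual_runs), ("canned", canned_runs)):
    print(f"{label}: mean = {mean(runs):.1f} fps, stdev = {stdev(runs):.2f}")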

It was just a conversation with a legitimate concern. This is how a forum works.

Oh, and maybe you can try to point out the bias. I mean spell it out, not like you did here, being vague. That means you didn't read, or couldn't/wouldn't comprehend.
 
For all of you harping on how [H] does their reviews:

1) Buy your own video cards
2) Review the cards and post your results on the internet the way you want them to be reviewed
3) Leave [H] behind and see how many people follow

I can add more steps but they're not polite and they're not politically correct.

Thanks for the review. Looks like AMD has some pricing work to do, or more work on their drivers, or both.
 