ASUS STRIX R9 Fury DC3 Video Card Review @ [H]

This stupid site is the only one calling the GTX 980 a faster card or even on par with the Fury. You have a suite which is basically loaded with GameWorks titles. To top it off, your apples-to-apples comparison with GameWorks features is basically unplayable in Witcher 3 and Far Cry 4 at 1440p. You make those useless comparisons to create the illusion that the GTX 980 can compete once GameWorks features are turned on. But the truth is the GTX 980 is damn unplayable with those GameWorks features turned on. :rolleyes:

Right now HardOCP has become an Nvidia Maxwell PR site for good. Almost every site is calling the Fury a much better product in terms of positioning, as it beats the GTX 980 convincingly.

http://www.pcper.com/reviews/Graphi...ew-CrossFire-Results/Power-Noise-Pricing-and-

http://www.hardwarecanucks.com/foru.../69792-amd-r9-fury-performance-review-19.html

http://anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/19

http://www.sweclockers.com/test/20792-amd-radeon-r9-fury-fran-asus-och-sapphire/17#content

The Fury is 10-15% faster than the GTX 980 on average. But hey, not here, because [H] has already decided to shill for Maxwell. Anyway, keep up the Maxwell PR effort. :p
 
Another flop. As predicted, this card loses to the 980 in most key areas.
AMD doesn't care about the poor value on these cards; they're manufacturing so few Fiji GPUs that all of them are selling out anyway.

I hope it's worth ruining their reputation to sell a handful of over-priced GPUs to dedicated fanboys.
Eh? It's not that bad at all, lots of reviews have it looking like this

In Talos Principle, FC4 & Shadow of Mordor, and Metro it beats the 980 silly...
In BF4, when you get to 4K it does better than the 980, so CrossFire there may be great.
In Crysis 3 and Total War it beats the 980
yada yada
 
So why would you buy one of these when you can get similar performance in a 390X for $120 less with double the VRAM AND overclocking headroom? Or an 8GB 290X with the same perks for $170 less?

...I guess I should be fair and say I would, since I sidegraded from a pair of 780s to a pair of 970s. Although the cost of each 970 was the reverse of the AMD situation: I paid $170 less for each 970 compared to what I paid for each 780.
Sorry, without testing at 4K this review is lacking. Thanks for the effort though......
Sorry, but Fury and 980 couldn't play most of the games in the [H] testing suite at 4k with what most would consider acceptable frame rates. Thanks for your thoughts, though.
 
Eh? It's not that bad at all, lots of reviews have it looking like this

In Talos Principle, FC4 & Shadow of Mordor, and Metro it beats the 980 silly...
In BF4, when you get to 4K it does better than the 980, so CrossFire there may be great.
In Crysis 3 and Total War it beats the 980
yada yada

Yes, almost every other review I've seen has the Fury beating the 980 in most games by around 10-15%. Strange, since HardOCP had poor results for the Fury X too.

On GameWorks results, the AMD cards can potentially have a certain advantage when HairWorks is turned ON. The Catalyst Control Centre has an option to reduce tessellation complexity if required, and we know for sure Witcher 3 benefits from this greatly.
 
Yes, almost every other review I've seen has the Fury beating the 980 in most games by around 10-15%. Strange, since HardOCP had poor results for the Fury X too.

Not really.
[H] are one of the few review sites that do extended gameplay testing instead of short runs or static benchmarks.
That's why I am here; I trust these results.
They are more closely related to how we play the games.
 
The card seems to perform a lot better at 4K just like the Fury X.

If you are on a budget then two of these in CF seem worth it for 4K. Budget and 4K don't really go together in my opinion though :p If I had a 4K display I'd be getting a pair of the top cards.
 
Well, this seems to be very good; even PCPer gave it a Gold Award. It beats the 980 in all games, and CF scaling is superior to SLI.
 
...This looks to be a great card for folks who want quiet. But then I suppose the issue is that you can't fit these beasts into the majority of HTPCs, if any. So many 'for the quiet' folks can't use it...

not necessarily. the design of most if not all of the more popular small form factor chassis takes into account the expanding length and girth of the modern up-market video card. the fractal design node 304 or the cooler master elite are perfect examples. part of what's a bit flummoxing about the fury x is that it really isn't as accommodating for small form factor building as amd's literature would suggest. most of my clients want to cool their cpus with 120mm clcs in mitx builds, and that takes up the one available 120mm radiator space in a chassis like a node 304 or an elite.

You want to use one of these at 4K?
What quality levels do you use?

yes, this is true. in this generation of premium cards, as with the last, 60 fps and a lot of eye candy is still the province of multi-gpu. something i suspect amd still has a bit of an advantage in, coming from last gen.
 
$579...ugh. No joy.

If you are buying strictly on a performance-per-dollar basis, it might not make sense. However, a factory water-cooling setup is nice and worth it to those who like having one.
 
AMD really needs to cut 75-100 bucks off everything this generation. I feel like a broken record. The cards are nice but at the wrong price points.
 
Yes, almost every other review I've seen has the Fury beating the 980 in most games by around 10-15%. Strange, since HardOCP had poor results for the Fury X too.

On GameWorks results, the AMD cards can potentially have a certain advantage when HairWorks is turned ON. The Catalyst Control Centre has an option to reduce tessellation complexity if required, and we know for sure Witcher 3 benefits from this greatly.

Notice the game selection? I've already written about finding [H]'s game selection unbalanced, so I won't repeat myself any further.

But the other reviews seem to further support my observation.
 
10% faster than the 980.
10% more expensive than the 980.
Start overclocking and it's a KO.
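
For anyone who wants the back-of-the-envelope version of that value argument, a minimal sketch below uses the rough "10% faster / 10% more expensive" ratios quoted above; these are the poster's round numbers, not measured results:

```python
# Perf-per-dollar sketch, normalized to the GTX 980. The 1.10 ratios are the
# rough "10% faster / 10% more expensive" figures above, not benchmark data.
gtx980_perf, gtx980_price = 1.00, 1.00
fury_perf,   fury_price   = 1.10, 1.10

print(gtx980_perf / gtx980_price)  # 1.0
print(fury_perf / fury_price)      # 1.0 -> identical performance per dollar,
                                   # before overclocking headroom is factored in
```

Of course the picture shifts once street prices or overclocking headroom enter the equation.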

What is AMD doing? Who is going to buy this?
AMD hasn't competed 1-to-1 in years. I can't even name the last time they did it. Before the GeForce 8800 series.
 
Did you even read the review or are you just making shit up?

Great review. Cool card, but again AMD is late to the party and has another overpriced product.

Really looking forward to seeing the updated Fury X drivers, and the CrossFire reviews, the cards look to scale quite well from what I have seen.

Would be great to get a [H] review of the equipment.

Yeah, pretty sure he just auto-posted that. By itself, it loses across the board in power and cost to the 980, while being mostly even in performance. CrossFire, however, should be stellar for 4K, as we have seen with the Fury X.
 
Sorry, but Fury and 980 couldn't play most of the games in the [H] testing suite at 4k with what most would consider acceptable frame rates. Thanks for your thoughts, though.

While this is true, I still want to see that side of the card's behaviour, even just a one-page 4K results kind of thing. Oh well, probably in an upcoming review; I'm hoping for one on Fury CrossFire performance. The Tri-X looks to be an amazingly cool and quiet card.

I'm betting 2x Furys will trade blows with 2x 980 Tis thanks to the new architecture... it scales amazingly in the reviews I've read thus far, and for $200 less.
 
Not really.
[H] are one of the few review sites that do extended gameplay testing instead of short runs or static benchmarks.
That's why I am here; I trust these results.
They are more closely related to how we play the games.
Umm, except I wouldn't touch GTA V, terribad game. Nor does Dying Light interest me. What if I don't want 60 FPS because it's not a multiplayer game, and maybe I'd use different settings than what [H] uses? I hate grass in games, as a great example.

I like their gameplay reviews, but it doesn't mean they are the most accurate or best; it's just a different method, and it's all based upon the specific reviewer's personal preference, as they showed by turning various settings off and on.

Kitguru, surprisingly, gave it a great review in Witcher 3, and they did a play test, not a canned one.

"I have played The Witcher 3 for around 85 hours and I have completed the single player campaign. I tested the game today by playing 4 different save game stages for 5 minutes each, then averaging the frame rate results for a real world indication of performance – one of the map sections we tested is one of the most demanding in the game and our results can be considered strictly ‘worst case’. The Witcher 3 is a dynamic world, so it is important to run tests multiple times to remove any discrepancies.

This is one of the greatest PC games ever released in my opinion, so I spent around a total of 48 hours benchmarking it for this review alone – it should be on your must have list, if you don’t have it already."
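
As an aside, the averaging Kitguru describes is easy to reproduce from frame-time logs. Here's a minimal sketch of that idea; the frame times below are hypothetical, purely for illustration, not from any review:

```python
# Average FPS per run from per-frame times (ms), then a mean across runs,
# mirroring the "average the frame rate results" approach quoted above.
def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

runs = [
    [16.7, 18.2, 17.5, 19.0],  # save game 1 (hypothetical frame times)
    [21.0, 20.3, 22.1, 19.8],  # save game 2
    [14.9, 15.4, 16.0, 15.2],  # save game 3
    [25.1, 24.4, 26.0, 23.7],  # save game 4
]

per_run = [avg_fps(r) for r in runs]
overall = sum(per_run) / len(per_run)
print([round(f, 1) for f in per_run], round(overall, 1))
```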
 
This has been a very confusing product launch, to say the least. Not what I was expecting.
 
Umm, except I wouldn't touch GTA V, terribad game. Nor does Dying Light interest me. What if I don't want 60 FPS because it's not a multiplayer game, and maybe I'd use different settings than what [H] uses? I hate grass in games, as a great example.

I like their gameplay reviews, but it doesn't mean they are the most accurate or best; it's just a different method, and it's all based upon the specific reviewer's personal preference, as they showed by turning various settings off and on.

Kitguru, surprisingly, gave it a great review in Witcher 3, and they did a play test, not a canned one.

"I have played The Witcher 3 for around 85 hours and I have completed the single player campaign. I tested the game today by playing 4 different save game stages for 5 minutes each, then averaging the frame rate results for a real world indication of performance – one of the map sections we tested is one of the most demanding in the game and our results can be considered strictly ‘worst case’. The Witcher 3 is a dynamic world, so it is important to run tests multiple times to remove any discrepancies.

This is one of the greatest PC games ever released in my opinion, so I spent around a total of 48 hours benchmarking it for this review alone – it should be on your must have list, if you don’t have it already."
I'm sorry you have a large number of issues.
They aren't mine, though.
 
Not really.
[H] are one of the few review sites that do extended gameplay testing instead of short runs or static benchmarks.
That's why I am here; I trust these results.
They are more closely related to how we play the games.

Yes, and right now it happens that every game in [H]'s suite is a GameWorks game, other than BF4 of course. It skews the review pretty badly. Brent, I think, even said that he wanted to include Batman and Project CARS in the future, so I don't expect any glowing reviews for AMD here going forward, however good the GPUs are.
 
Yes, and right now it happens that every game in [H]'s suite is a GameWorks game, other than BF4 of course. It skews the review pretty badly. Brent, I think, even said that he wanted to include Batman and Project CARS in the future, so I don't expect any glowing reviews for AMD here going forward, however good the GPUs are.

Tell them to pull the games from sale then.
Or tell AMD to get off their arses and make a gaming library optimised for their GPUs that devs can use to get the best from AMD hardware.
 
Curious that [H]ardocp tested the GTX 980 at 4K upon launch, but declined to do so with the R9 Fury. Why is that?

http://www.hardocp.com/article/2014...rce_gtx_980_video_card_review/12#.VZ_5-mZVhBc


"4K performance is not bottlenecked on the GeForce GTX 980. For the most part it is faster than the GeForce GTX 780 Ti and Radeon R9 290X. There are some games where it will break even on performance to 780 Ti"..........Sept 18, 2014
 
Tell them to pull the games from sale then.

Eh??

Or tell AMD to get off their arses and make a gaming library optimised for their GPUs that devs can use to get the best from AMD hardware.

Yes, sure, one more company that should adopt anticompetitive strategies. What could possibly go wrong for us there, right?

Anyway, again, the [H] review is the only one that is so negative. Even PCPer is saying the Fury's better than the 980. That should mean a lot.
 
Since the release of the TITAN, any card under $999 seems to be a reasonable price now.... -___-

People are like, wow!! only $650 to get the performance of something worth $1000!?? what a deal..
 
Since the release of the TITAN, any card under $999 seems to be a reasonable price now.... -___-

People are like, wow!! only $650 to get the performance of something worth $1000!?? what a deal..
"Titan X" halo products used to be $500 and then it was around $300 for the sensible version.
We've effectively doubled those prices today... :rolleyes:

If it were 2008 right now the 980 Ti would be around $400 at most.

GPU market sucks. And you can't even stay budget anymore to avoid it, the $200 price point sucks even more.
 
The games are for sale and are worth playing.
It makes sense to review them; after all, we want to see how the cards perform, for better or for worse.
It helps us make the correct purchasing decision ;)

Yes, sure, one more company that should adopt anticompetitive strategies. What could possibly go wrong for us there, right?
I'll help you.
NVidia make a game library that allows devs to produce optimised code for NVidia cards in a shorter time.
AMD don't.
Guess which cards get optimised code for advanced features.

You are acting like only one manufacturer's optimisations can be used in a game.
 
It will be interesting to see which architecture does better in DX12. I suspect that one will solidly beat the other; I'm just not sure which.
 
I'll help you.
NVidia make a game library that allows devs to produce optimised code for NVidia cards in a shorter time.
AMD don't.
Guess which cards get optimised code for advanced features.

Not when they cripple performance on the competitor's hardware (and in some cases their own performance). See HairWorks vs TressFX. That's not good for the competition, and thus not good for us.
 
Curious that [H]ardocp tested the GTX 980 at 4K upon launch, but declined to do so with the R9 Fury. Why is that?

http://www.hardocp.com/article/2014...rce_gtx_980_video_card_review/12#.VZ_5-mZVhBc


"4K performance is not bottlenecked on the GeForce GTX 980. For the most part it is faster than the GeForce GTX 780 Ti and Radeon R9 290X. There are some games where it will break even on performance to 780 Ti"..........Sept 18, 2014

Because at the time the only viable competition at 4K was the 780 Ti, and the 980 was the best 4K single-card solution. Now the minimum for single-card 4K is the Fury X or the 980 Ti; it's an entirely different ballgame. I would personally never recommend a single Fury, 980, or 390X for 4K.
 
4K with reduced settings is achievable on any moderately high-end video card. Some people choose 4K displays because they value the extra workspace and PPI, even if they have to play new games at medium to achieve 60 FPS.

But it goes both ways. 4K may make the Fury look better, but 1080p might make the 980 look stronger. HardOCP never claimed to be the only source in town, so you'll have to turn elsewhere for other resolutions, naturally.
 
I know it's been said before, but this does feel like a repeat of the 8800GTX vs HD 2900XT era.

The HD 2900XT wasn't bad either, but when a product is that late, you expect it to soundly beat the competition, not just match it (or lose in some cases).

AMD managed to bounce back relatively quickly from that, so here's hoping for another HD 4870 revolution.
 
Not when they cripple performance on the competitor's hardware. See HairWorks vs TressFX. That's not good for the competition, and thus not good for us.

So tell AMD to make an optimised library for their cards.
Then AMD users won't be forced to use the code optimised for NVidia if they want to use the same game features.
 
Good card! Thanks for the review guys. Also check out the Fury Tri-X, amazing acoustics apparently.

Too bad you guys didn't get to bench 4K; I was looking forward to those scores. Checking other reviews, it's as per usual: AMD does better the higher the res. Looking forward to the CrossFire review, as the Fury looks to scale really well with all that bandwidth on board. I think DX12 is really going to open these cards up!
To me, benching 4K for a single card is a complete waste of time and resources. Not sure why anyone would be interested in 30-45 FPS gaming on a PC. There is no single card that can run 4K at maxed details and keep 60 FPS in the latest games.

Also, I would take 4xAA at 1440p with 60+ FPS over no AA at 4K with sub-60 FPS any day of the week.

Just my opinion.
 
Not when they cripple performance on the competitor's hardware (and in some cases their own performance). See HairWorks vs TressFX. That's not good for the competition, and thus not good for us.

No, they don't cripple performance on the competitor's hardware; Nvidia cards also take a big performance hit when GameWorks features are enabled. How else can you explain AMD cards performing much better than Nvidia cards in Far Cry 4, a GameWorks title? The card with better tessellation will have better performance with HairWorks and most GameWorks effects, just as the card with better compute abilities has better performance with TressFX... so where is the problem? Nvidia uses tessellation to its advantage and AMD uses compute to its advantage; how hard can that be to understand? AMD shouldn't care how NVIDIA cards perform, and NVIDIA shouldn't care how AMD cards perform.
 
So tell AMD to make an optimised library for their cards.
Then AMD users won't be forced to use the code optimised for NVidia if they want to use the same game features.

You mean like how Witcher 3 used both Hairworks and TressFX?
 
To me, benching 4K for a single card is a complete waste of time and resources. Not sure why anyone would be interested in 30-45 FPS gaming on a PC. There is no single card that can run 4K at maxed details and keep 60 FPS in the latest games.

Also, I would take 4xAA at 1440p with 60+ FPS over no AA at 4K with sub-60 FPS any day of the week.

Just my opinion.

I don't disagree, but is there a single card that can run 4xAA at 1440p with 60+ fps? 1440p is still a pretty high resolution, and quite demanding for cards.
 