ASUS STRIX R9 Fury DC3 Video Card Review @ [H]

You apparently have a hard time understanding concepts, so I will spell it out for you.

[H] current review practice:

Albeit closer to real performance, it also cannot be identical from one run to the next. It is easy to skew results one way or another if one so desires. Not saying that is the case here.

Canned benchmarks:

Repeatable each and every time (except StarSwarm). However not always indicative of real world gameplay.

Both have positive points and negatives. Mentioning the simple addition of one does not preclude the other, nor does it alone indict anyone of ulterior motives. Wait, those words may be too big. Asking them to add canned benchmarks to some games does not mean I wanted them to stop doing things the way they do now; it was just an addition. Nor did I mention it because I believed they were cheating or intentionally crippling one set of cards.

It was just a conversation with a legitimate concern. This is how a forum works.

Oh, and maybe you can try to point out the bias. I mean spell it out, not like you did here, being vague. That means you either didn't read it or couldn't/wouldn't comprehend it.

That's great if you play canned benchmarks.

The bias is that you want to change the review to favour AMD more because you aren't happy.
Have fun.
 
The problem with 4K is not that the card can't render it, but that it does so at framerates most gamers wouldn't be comfortable playing at; you'd have to turn your game quality down to low settings to enjoy games at 4K. Single-GPU video cards aren't ready for 4K yet. Maybe in the next generation, but not this one.

If you were going to go with 4K on Fiji, you were going to go Fury X anyway, not Fury.

If you really wanted a single-GPU for the best 4K experience right now though I'd go with the TITAN X.

That still doesn't explain why you tested the 980 at 4K when it was released, and just 20 days ago you tested the 390X at 4K??????
 
That still doesn't explain why you tested the 980 at 4K when it was released, and just 20 days ago you tested the 390X at 4K??????

As I recall, when Brent did the 4K testing, the settings he used were not playable by any means. I interpreted that testing as mainly being done to see if the cards are bottlenecked at 4K.

Sure, Brent could have done testing at 4K, and it probably would have revealed that the Fury is better at 4K than the 980, but what is the point if it isn't at any acceptable, playable framerates?
 
Well that's it, I'm not buying any AMD card this time around. I don't know what the hell they are doing over at AMD headquarters, but this stuff is a joke. Higher power consumption, even if it's not as bad this time around, and less performance while costing more or the same. I don't mind if the card is more expensive, IF it actually performs better. But this shit is a joke. Looks like GTX 970 or 980 it is. It's sad how these days Intel and Nvidia are a better deal than AMD.
 
I think testing any single card for 4K (even a Titan X) in current games is a pointless effort, unless people like console-like cinematic framerates on PC (read: sub-60 fps). Turning down options at 4K to make it playable is also crap.
 
I was kinda hoping to finally ditch my two 7870s, but I think I'll keep them through the year and wait for Pascal/Fury 2 now.
 
Sure, Brent could have done testing at 4K, and it probably would have revealed that the Fury is better at 4K than the 980, but what is the point if it isn't at any acceptable, playable framerates?

Getting a Fury launch baseline at 4K would be beneficial for future comparisons.
Also, why [H]ard saw the need/benefit in testing every other recent high-end card (780, 780 Ti, 290X, 970, 980, 390X) at 4K at launch, yet now inexplicably decides to skip it on the Fury, is puzzling.
 
"Instead what happened is that the sound of a thousand hearts exploding around the globe were heard as gamers were met with disappointment"
Oh [H], you have fallen so, so far :eek:
 
Getting a Fury launch baseline at 4K would be beneficial for future comparisons.
Also, why [H]ard saw the need/benefit in testing every other recent high-end card (780, 780 Ti, 290X, 970, 980, 390X) at 4K at launch, yet now inexplicably decides to skip it on the Fury, is puzzling.

Time could have been a factor. Reviewers don't seem to get a ton of time to test cards before the embargo lifts, and the style of review [H] does takes longer than running benchmarks. If they don't feel there is much point in testing it at 4K, then why waste the little time they have on it?
 
Time wasn't a factor as Kyle said this on page 1:



What was their reason to explore 4k in the reviews of: 780, 780ti, 290x, 970, 980, 390x?

I'm not going to pretend to read their minds, but I'll reiterate what I said earlier: the 4K landscape when the 980 came out was different. The 980 was the first semi-viable 4K card at that point. That isn't the case now. Right now the bare minimum for 4K is the Fury X or the 980 Ti. Anything less is not a good solution without going into SLI or CFX.

As for the 390X *shrugs*, AMD's marketing was painting it as the perfect 4K card, and people still seem to have this weird notion that because it has more VRAM it must be better for 4K.
 
I'm not going to pretend to read their minds, but I'll reiterate what I said earlier: the 4K landscape when the 980 came out was different. The 980 was the first semi-viable 4K card at that point. That isn't the case now. Right now the bare minimum for 4K is the Fury X or the 980 Ti. Anything less is not a good solution without going into SLI or CFX.

As for the 390X *shrugs*, AMD's marketing was painting it as the perfect 4K card, and people still seem to have this weird notion that because it has more VRAM it must be better for 4K.

I would agree... also the 980 brought a new tier of performance. It wasn't a big leap, but it was faster than anything out at the time.

We already know performance of 980 Ti and Fury X at 4K and neither are enough to drive 4K with a single GPU, so it doesn't make much sense testing the Fury. I would like to see a Fury CF @ 4K review however.
 
I would agree... also the 980 brought a new tier of performance. It wasn't a big leap, but it was faster than anything out at the time.

We already know performance of 980 Ti and Fury X at 4K and neither are enough to drive 4K with a single GPU, so it doesn't make much sense testing the Fury. I would like to see a Fury CF @ 4K review however.
Yeah it was not a big jump at all so not sure why you would call it a new tier of performance. The 980 was only about 5-6% faster than the 780 Ti at 1440 and about 7-8% faster at 1080. Of course now the gap between the 780 Ti and 980 is much larger.
 
Getting a Fury launch baseline at 4K would be beneficial for future comparisons.
Also, why [H]ard saw the need/benefit in testing every other recent high-end card (780, 780 Ti, 290X, 970, 980, 390X) at 4K at launch, yet now inexplicably decides to skip it on the Fury, is puzzling.

It is certainly true that Brent could have done it to get a baseline, but one can infer from their previous testing of the Fury X a sense of just how well it will perform at 4K. As for whatever reason Brent decided to skip it: who knows, it is his decision and we just have to live with that.

As for 4K testing, a meaningful test would be CrossFire Fury X vs. SLI 980 Ti, or CrossFire Fury vs. SLI 980, since we're getting into the realm of playable frame rates there. Hopefully we will see something from [H] in the future regarding that type of testing.
 
Yeah it was not a big jump at all so not sure why you would call it a new tier of performance. The 980 was only about 5-6% faster than the 780 Ti at 1440 and about 7-8% faster at 1080. Of course now the gap between the 780 Ti and 980 is much larger.

Yes, I just mean it raised the performance ceiling (even if only slightly). So it was worth seeing how it performed in 4K, at the time.
 
AMD had typically been the price/performance leader in the past. Interesting that Nvidia is beating them here as well. I don't see the incentive to choose this over the 980 at this point if I were to upgrade from my current 280X (which I won't at 1080P for the foreseeable future).
 
Not bad. I like it. Wish it were $500 exact though.

I don't see the incentive to choose this over the 980 at this point if I were to upgrade from my current 280X (which I won't at 1080P for the foreseeable future).

Yep. Seems my Asus DC2 280x was slightly better than I anticipated. I'll sit on it a bit.
 
Nice review as always.

Some points from [H]'s Fury review:

1. Witcher 3 Apples-To-Apples with Gameworks features/options ON, the 980 wins by 15%, while with Gameworks features/options OFF, the Fury wins by 6%. A 21% difference in performance for the Fury between having Gameworks features/options enabled or not. Interesting.

2. Dying Light Apples-To-Apples with Nvidia/Gameworks features/options ON, the Fury beats the 980 by 3%, while with Nvidia/Gameworks features OFF, the Fury's margin of victory over the 980 increases to 31%. A 28% difference for the Fury between having the Nvidia/Gameworks features enabled or not. Interesting.

3. Far Cry 4 Apples-To-Apples with Gameworks features/options ON, the 980 and Fury are TIED in performance, while with Gameworks/Nvidia features/options OFF, the Fury outperforms the 980 by 7%. A 7% difference in performance for the Fury between Nvidia/Gameworks features/options enabled or not. Interesting. (The math on these swings is sketched out after this list.)

4. A new AMD driver has resulted in performance gains for the Fury series. Interesting.

5. Many declared the possibility of Gameworks features/options gimping AMD card's performance laughable, conspiracy-theory nonsense, yet measured proof of exactly that result is shown in [H] testing here.

6. Many declared that AMD Driver improvements would not make a significant difference for the Fury series card's game performance. Yet here again, [H] documents exactly that occurring (most notably in Far Cry 4) only 16 days after the official release of the Fury X.
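
The percentage swings in points 1-3 can be sanity-checked with some quick arithmetic. A minimal sketch in Python, using only the win percentages quoted above; the underlying framerates are not given, so treating each quoted lead as a Fury/980 performance ratio is my own assumption about how the swing is computed:

```python
# Hypothetical helper to show how the "swing" figures can be derived from the
# win percentages quoted above. The raw framerates are not given in the post,
# so this just converts each quoted lead into a Fury/980 performance ratio
# and compares the two modes.

def fury_to_980_ratio(winner, lead_pct):
    """Return Fury performance relative to the 980, given who wins and by how much."""
    return (1 + lead_pct / 100) if winner == "fury" else 1 / (1 + lead_pct / 100)

games = {
    "Witcher 3":   (("980", 15.0), ("fury", 6.0)),    # (GameWorks on, GameWorks off)
    "Dying Light": (("fury", 3.0), ("fury", 31.0)),
    "Far Cry 4":   (("980", 0.0),  ("fury", 7.0)),
}

for name, (gw_on, gw_off) in games.items():
    r_on, r_off = fury_to_980_ratio(*gw_on), fury_to_980_ratio(*gw_off)
    swing = (r_off / r_on - 1) * 100
    print(f"{name}: Fury/980 = {r_on:.2f} (GW on) vs {r_off:.2f} (GW off), swing ~{swing:.0f}%")
# Prints swings of roughly 22%, 27% and 7%, in line with the ~21/28/7% quoted above.
```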

It is interesting.

However before you jump to the conclusion that gameworks is purposely slower on AMD (which I still think is not true), look at the texture and pixel fillrates.

980 pixel fillrate is 21.9% faster than the Fury.

Fury Texture fillrate is 55.6% faster than the 980.

What do these numbers tell me? Gameworks features are pixel-fillrate dependent, probably based on a "particalized" method vs. a "texture" method. And likely that's because it is the only way to do it. You can't make realistic-looking hair with textures, or with a really high texture fillrate. But with a lot of smaller particles, you can.

The texture fillrate probably comes more into play at higher resolutions. So the GPU designers will be balancing their GPUs' texture and pixel fillrates to achieve great-looking effects. Nvidia has this balance figured out; AMD is playing catch-up. So to your point 5, I say that AMD gimped themselves.

If the texture fillrate were all that important, the Fury would be totally smoking the 980 by 55% on average. It doesn't. AMD beefed up a spec that didn't need it, although in SLI/Crossfire and at higher resolutions that higher texture fillrate might come into play.
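
For what it's worth, those fillrate percentages fall straight out of ROP/TMU counts multiplied by clock speed. A minimal sketch, with the unit counts and clocks as my own assumed reference specs; the post doesn't say which clocks were used, so the results bracket the quoted 21.9% and 55.6% rather than hitting them exactly:

```python
# Back-of-the-envelope fillrate math. Assumed reference specs (not from the post):
#   R9 Fury : 64 ROPs, 224 TMUs, 1000 MHz
#   GTX 980 : 64 ROPs, 128 TMUs, 1126 MHz base / ~1216 MHz typical boost
# Pixel fillrate = ROPs * clock ; Texture fillrate = TMUs * clock.

def fillrates(rops, tmus, clock_ghz):
    return rops * clock_ghz, tmus * clock_ghz  # GPixel/s, GTexel/s

fury_px, fury_tx = fillrates(64, 224, 1.000)
g980_px_base, g980_tx_base = fillrates(64, 128, 1.126)
g980_px_boost, g980_tx_boost = fillrates(64, 128, 1.216)

print(f"980 pixel advantage:    {g980_px_base/fury_px - 1:6.1%} (base) to {g980_px_boost/fury_px - 1:6.1%} (boost)")
print(f"Fury texture advantage: {fury_tx/g980_tx_boost - 1:6.1%} (vs boost) to {fury_tx/g980_tx_base - 1:6.1%} (vs base)")
# Roughly a 13-22% pixel-fillrate advantage for the 980 and a 44-55% texture-fillrate
# advantage for the Fury, bracketing the 21.9% / 55.6% figures quoted above.
```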
 
Thats great if you play canned benchmarks.

The bias is that you are want to change the review to favour AMD more because you arent happy.
Have fun.

Let's try this... what was I responding to the first time?

Well, if you attempted to read and comprehend without preconceived disdain, you would have seen I responded to the criteria they set up for why they chose the games they did. Originally, in another thread, they said it was two parts: new and popular. Now, of the 5 games, one, BF4, isn't quite so new, but it is popular. The other 4 are newish, released within the last year. Of those 4, Dying Light is not all that popular, really. Originally, when a poster asked about Skyrim and other games he played, the response was: new and (it could be implied here) popular. Dying Light was barely in the top 100, whereas Skyrim is usually top 10. Skyrim is just an argument on the popularity side, and it makes a good point; however, its age, without mods, makes it a little too dated to be very impactful with new hardware. Moving on, Ryse: Son of Rome is new but not very popular (not sure how it ranks against Dying Light), but it does lean quite a bit toward AMD hardware, which, given that 3 of the 5 lend themselves directly to Nvidia, doesn't seem to be altering the WHOLE of the suite toward AMD as you attempt to allude I am up to. Then there is Dragon Age: Inquisition, which fits both criteria, being very popular and released within the last year. It does not seem to favor either side in most reviews' findings, so it does not tip the scales either way.

Now, all that was a discussion, rational and mature, with adequate reasons for it having merit. I am not requiring them to change a thing if they do not wish; it was just a remark and a suggestion that fit within their own criteria.

As far as canned benchmarks go, sorry, there is no reasonable argument against ADDING them other than time, and even that one is slim. But again, I wasn't requiring anything, just asking why they don't add a few; not every game has one anyway. The pros and cons of each method I explained in the previous post, and they are adequate reasons why both could be used and neither precludes the other.

Now, if you wish to debate the merits and the reasons either method has a place in reviews, then by all means let's discuss it. But attempting to dismiss it simply because you like being difficult, or are biased in your own way, doesn't add anything to the debate at all.
 
Nice review as always.



It is interesting.

However before you jump to the conclusion that gameworks is purposely slower on AMD (which I still think is not true), look at the texture and pixel fillrates.

980 pixel fillrate is 21.9% faster than the Fury.

Fury Texture fillrate is 55.6% faster than the 980.

What do these numbers tell me? Gameworks features are pixel-fillrate dependent, probably based on a "particalized" method vs. a "texture" method. And likely that's because it is the only way to do it. You can't make realistic-looking hair with textures, or with a really high texture fillrate. But with a lot of smaller particles, you can.

The texture fillrate probably comes more into play at higher resolutions. So the GPU designers will be balancing their GPUs' texture and pixel fillrates to achieve great-looking effects. Nvidia has this balance figured out; AMD is playing catch-up. So to your point 5, I say that AMD gimped themselves.

If the texture fillrate were all that important, the Fury would be totally smoking the 980 by 55% on average. It doesn't. AMD beefed up a spec that didn't need it, although in SLI/Crossfire and at higher resolutions that higher texture fillrate might come into play.

Good point. I think this whole thing is getting lost in translation. Most reasonable people don't think it is an INTENTIONAL crippling, but rather a purposeful steering of games to their strengths. Like I have said so many times, "this isn't inherently evil," but it does open the door to being so. The problem up front is the closed nature of the features, which in all honesty is reasonable, but that does not mean every consumer should be fine with it.

TressFX is open, so Nvidia can optimize quite easily, but that does not mean the code runs most efficiently on their hardware. Chances are, with great odds, TressFX will run most efficiently on AMD, just like HairWorks will run best on Nvidia. The only difference here is the open/closed nature of the code.

Granted, we would all prefer it didn't come to situations like this, but this is the reality we are faced with. From THIS review you can see it isn't all doom and gloom. Add to that the fact that a great deal of the issue with HairWorks, and likely other features that require heavy tessellation, can be altered in AMD's drivers (CCC), and it is less of a problem, for now.
 
All these reviews make me want to find a 290x for a reasonable price.

Agreed......for 1440p.

For marginal 4k play, the Fury is hands down a better choice than the 980. Unfortunately, you won't see any 4k benchmarks here.

http://www.tweaktown.com/reviews/72...deo-card-review-hbm-water-cooler/index11.html

"R9 Fury has me enjoying AMD's offering, welcoming it if you're an AMD fan. This is the card you should buy. Not the R9 390X, not the Fury X - but the R9 Fury. SAPPHIRE's Tri-X R9 Fury is a great card, with stylish looks and great performance. I'd love to see what two of these in CrossFire are capable of"........

http://hothardware.com/reviews/amd-radeon-r9-fury-review?page=10

"This is the card that takes the fight to NVIDIA, delivering results that are on average only 4% to 7% behind Team Green's GeForce GTX 980 Ti, but for $100 less. And for $50 more than the GeForce GTX 980, the Fury consistently defeats that card by a decisive margin".......

http://hexus.net/tech/reviews/graphics/84512-sapphire-radeon-r9-fury-tri-x-oc/?page=15

"We come away from this editorial with the feeling that R9 Fury is a better bet than the Fury X - a card that is blighted by the impressive performance of the GTX 980 Ti. R9 Fury occupies that barren price/performance space between Nvidia GeForce GTX 980 and GTX 980 Ti"......

http://anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/19

"On the other hand R9 Fury needs to compete with just the older GTX 980, and while it’s by no means a clean sweep for AMD, it’s a good outcome for AMD. The R9 Fury offers between 8% and 17% better performance than the GTX 980, depending on if we’re looking at 1440p or 4K. I don’t believe the R9 Fury is a great 4K card – if you really want 4K, you really need more rendering power at this time – but even at 1440p this is a solid performance lead"......
 
Solid review... The Fury performs a fair amount better on air than I expected. I really don't understand AMD's strategy right now though, they had a chance to really knock it out of the park by just lowering the msrp of the Fury and Fury X by $50 or so. The only thing I can think of is that they plan on having such a limited stock available until the next generation releases that they'll sell out regardless of price.
 
Solid review... The Fury performs a fair amount better on air than I expected. I really don't understand AMD's strategy right now though, they had a chance to really knock it out of the park by just lowering the msrp of the Fury and Fury X by $50 or so. The only thing I can think of is that they plan on having such a limited stock available until the next generation releases that they'll sell out regardless of price.

Agreed, and let's be honest: if they are selling every one of the GPUs they produce, then I see nothing wrong with it. Despite the number of green-bleeding members here, we need AMD to survive and prosper. The reason you saw the 980 get a price cut? It wasn't Nvidia cutting into their own bottom line out of goodwill...

As for the other bickering, I do think that [H]ard could add 1-3 more games that aren't sponsored by Nvidia for Nvidia, but that is up to them to decide. I would certainly love to see something beyond Brent's lust to add Project Cars and Batman to the lineup.
 
I do not choose games based on who sponsors a game or whose technology the developer has chosen to include in the game. What you are asking for is, in itself, a bias.
 
Let's try this... what was I responding to the first time?

Well, if you attempted to read and comprehend without preconceived disdain, you would have seen I responded to the criteria they set up for why they chose the games they did. Originally, in another thread, they said it was two parts: new and popular. Now, of the 5 games, one, BF4, isn't quite so new, but it is popular. The other 4 are newish, released within the last year. Of those 4, Dying Light is not all that popular, really. Originally, when a poster asked about Skyrim and other games he played, the response was: new and (it could be implied here) popular. Dying Light was barely in the top 100, whereas Skyrim is usually top 10. Skyrim is just an argument on the popularity side, and it makes a good point; however, its age, without mods, makes it a little too dated to be very impactful with new hardware. Moving on, Ryse: Son of Rome is new but not very popular (not sure how it ranks against Dying Light), but it does lean quite a bit toward AMD hardware, which, given that 3 of the 5 lend themselves directly to Nvidia, doesn't seem to be altering the WHOLE of the suite toward AMD as you attempt to allude I am up to. Then there is Dragon Age: Inquisition, which fits both criteria, being very popular and released within the last year. It does not seem to favor either side in most reviews' findings, so it does not tip the scales either way.

Now, all that was a discussion, rational and mature, with adequate reasons for it having merit. I am not requiring them to change a thing if they do not wish; it was just a remark and a suggestion that fit within their own criteria.

As far as canned benchmarks go, sorry, there is no reasonable argument against ADDING them other than time, and even that one is slim. But again, I wasn't requiring anything, just asking why they don't add a few; not every game has one anyway. The pros and cons of each method I explained in the previous post, and they are adequate reasons why both could be used and neither precludes the other.

Now, if you wish to debate the merits and the reasons either method has a place in reviews, then by all means let's discuss it. But attempting to dismiss it simply because you like being difficult, or are biased in your own way, doesn't add anything to the debate at all.

I'll repeat myself because you keep repeating.
Extra bits added because you are clueless :p

You can't play canned benchmarks.
Mfrs can build cheats into the driver just for benchmarks that don't reflect game performance.
And they often don't reflect game performance anyway.
Why are you even at [H]? This is SPECIFICALLY what they DON'T do.

Trying to change reviews so your favourite mfr wins is bias.
I'll sum it up with this:

I do not choose games based on who sponsors a game or whose technology the developer has chosen to include in the game. What you are asking for is, in itself, a bias.
 
I do not choose games based on who sponsors a game or whose technology the developer has chosen to include in the game. What you are asking for is, in itself, a bias.

Don't give me that crap that you do not choose games based on who sponsors them. BTW, your game suite does not reflect the games people play the most. Heck, it does not even reflect the games that are most popular, most widely played, and most critically acclaimed. Games like Shadow of Mordor and Dragon Age: Inquisition satisfy the above criteria much better than Dying Light or Far Cry 4. The difference is that both of those games play well on both AMD and Nvidia hardware. But that won't matter to you, because you are more interested in loading up your test suite with even more one-sided Gameworks titles like Project Cars. Way to go. :p

BTW, most Gameworks titles in the last year have been an unoptimized, buggy mess at launch and took months to run stably even on Nvidia hardware. In fact, you tore apart Ubisoft for FC4 and AC Unity, and here you are still promoting the same pathetic titles in your test suite while shying away from well-designed and very well-optimized games like Dragon Age: Inquisition or Shadow of Mordor.

Anyway, let's see what you are going to do when three major, highly anticipated AAA games in the next 6 months are all AMD Gaming Evolved titles: Deus Ex: Mankind Divided, Hitman, and Star Wars Battlefront. All three games are pushing the envelope in PC graphics, according to the trailers seen at E3. Deus Ex: Mankind Divided looks awesome with DX12 and TressFX 3.0 support.

http://wccftech.com/deus-mankind-divided-revealed-trailer-features-dx12-tressfx-support-dawn-engine/

http://techreport.com/news/28471/take-in-five-glorious-minutes-of-star-wars-battlefront-gameplay

http://www.ign.com/articles/2015/06/23/watch-the-new-hitman-agent-47-trailer

Trying to change reviews so your favourite mfr wins is bias.

On the contrary: just so that you can rubbish a manufacturer, [H] chooses Gameworks games which are intentionally designed to skew the performance comparisons in reviews. Look at what Project Cars does to the average score in a review. I give Project Cars as a worst-case example.

http://www.techpowerup.com/reviews/ASUS/R9_Fury_Strix/31.html

If a single game can hurt average performance across a game suite of more than 20 games, that says a lot about how badly the developer has optimized for AMD cards and how Gameworks licensees effectively act as an extension of Nvidia PR and marketing. The [H] suite is loaded with 4 Gameworks titles which, when all Gameworks features are turned on (even if laughably unplayable in games like Witcher 3 or Far Cry 4, even on a GTX 980), present a badly skewed view of the performance competitiveness of Nvidia and AMD cards.
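
To put a rough number on how much a single outlier title can drag a suite-wide average, here is a minimal sketch with invented performance ratios (these are not TechPowerUp's actual figures): 20 games where the Fury matches the 980, plus one Project Cars-like title where it trails badly.

```python
# Hypothetical illustration of how a single badly skewed title drags down a
# suite-wide average. The ratios below are invented, not TechPowerUp's data.

baseline = [1.00] * 20          # 20 games where the Fury matches the 980
outlier = 0.65                  # one title where the Fury is ~35% behind

def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def geometric_mean(xs):
    prod = 1.0
    for x in xs:
        prod *= x
    return prod ** (1 / len(xs))

suite = baseline + [outlier]
print(f"arith mean: {arithmetic_mean(suite):.3f}")   # ~0.983: a ~1.7% hit overall
print(f"geo mean:   {geometric_mean(suite):.3f}")    # ~0.980: the geometric mean is hit a bit harder
# Shrink the suite or add more such titles and the skew grows quickly.
```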
 
Or let's benchmark Crysis 3 again since that's what we keep playing.

You gotta roll with the wave. AMD hasn't had decent developer relations recently due to budget shortfalls.
 
Or let's benchmark Crysis 3 again since that's what we keep playing.

You gotta roll with the wave. AMD hasn't had decent developer relations recently due to budget shortfalls.

Their developer relations are still mostly intact, we've just seen more GameWorks titles lately due to coincidental release schedules of NVIDIA-sponsored games. Project CARS, Witcher 3 and Batman: AK were all originally supposed to be released last year, but were delayed to nearly the same timeframe this year, which kinda amplified the whole GameWorks 'controversy'.

AMD-sponsored games are coming later this year and early next (Star Wars, Hitman, Total War, Deus Ex, Mirror's Edge, Robinson the Journey, etc.)
 
I do not choose games based on who sponsors a game or whose technology the developer has chosen to include in the game. What you are asking for is, in itself, a bias.

But when you exclude other games that meet or exceed your criteria, then that in and of itself is bias. Do you concur that of your 5 games, 3 inherently favor Nvidia hardware, with a 4th close to the same? Recognizing that any game can have an affinity for one vendor or another, again, isn't necessarily reason to preclude or include it. But when the results seem quite skewed, more so in your suite of games than with others, then that again opens the way for scrutiny and accusations of bias, EVEN WHEN NONE EXISTS.

Anyway, I was originally just responding to your criteria for what games you use, as I have seen you mention them time and time again, yet you give no credence to other games that definitely fit your criteria better than some you have.
 
But when you exclude other games that meet or exceed your criteria, then that in and of itself is bias. Do you concur that of your 5 games, 3 inherently favor Nvidia hardware, with a 4th close to the same? Recognizing that any game can have an affinity for one vendor or another, again, isn't necessarily reason to preclude or include it. But when the results seem quite skewed, more so in your suite of games than with others, then that again opens the way for scrutiny and accusations of bias, EVEN WHEN NONE EXISTS.

Anyway, I was originally just responding to your criteria for what games you use, as I have seen you mention them time and time again, yet you give no credence to other games that definitely fit your criteria better than some you have.

AMD is beating the GTX 980 at stock, but once both are OC'ed the GTX 980 runs away with the game.

It could be that the 64 ROPs on the Fury are actually a bottleneck for its massive shader array. It's just a theory. Maybe DX 12.0 will help with more parallelism, but right now the Fury is just not the best choice out there, seeing what the competition has.
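
The "64 ROPs as a bottleneck" theory is easy to put in perspective with the published unit counts. A minimal sketch, assuming reference specs (3584 stream processors and 64 ROPs at 1000 MHz for the R9 Fury; 2048 CUDA cores and 64 ROPs at roughly a 1216 MHz boost for the GTX 980); this is only ratio arithmetic, not proof of where the bottleneck actually sits:

```python
# Rough shader-to-ROP balance comparison. Specs are assumed reference values.
cards = {
    # name: (shaders, rops, clock_ghz)
    "R9 Fury": (3584, 64, 1.000),
    "GTX 980": (2048, 64, 1.216),   # typical boost clock
}

for name, (shaders, rops, clock) in cards.items():
    pixel_rate = rops * clock                # GPixel/s
    print(f"{name}: {shaders / rops:.0f} shaders per ROP, {pixel_rate:.1f} GPixel/s")
# Fury: 56 shaders per ROP at 64 GPixel/s; 980: 32 per ROP at ~77.8 GPixel/s.
# The Fury feeds far more shader output into the same 64 ROPs at a lower clock,
# which is the sense in which the ROPs could hold back its shader array.
```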
 
Not when they cripple performance on the competitor's hardware (and in some cases their own). See HairWorks vs. TressFX. That's not good for the competition, and thus not good for us.

Please see how Mantle runs on GCN 1.2 and you'll understand the answer. :rolleyes:
 
Solid review... The Fury performs a fair amount better on air than I expected. I really don't understand AMD's strategy right now though, they had a chance to really knock it out of the park by just lowering the msrp of the Fury and Fury X by $50 or so. The only thing I can think of is that they plan on having such a limited stock available until the next generation releases that they'll sell out regardless of price.

While I agree the pricing should be lower, if AMD can sell it for $550 and get away with it, they will. It will be interesting to see if they can sustain that pricing once supply is no longer an issue.
 
But when you exclude other games that meet or exceed your criteria then that in and of itself is bias. Do you concur that of your 5 games 3 are inherently Favoring Nvidia hardware with a 4th close to the same? Recognizing that any game can have affinity to one group or another, again, isn't necessary to preclude or include. But when the results seem quite skewed, more so in your suite of games than with others, then that again opens the way for scrutiny and accusations of bias, EVEN WHEN NONE EXISTS.

Any way I originally was just responding to your criteria for what games you use as I have seen you mention time and time again, yet giving no credence to others that definite fit your criteria better than some you have.

They pick games that people actually play. Is it sad that a ton of new AAA games run much better on Nvidia? Sure, but if you exclude those games you'll just add bias and useless info. I don't care about some random game from 3 years ago; I want to see benchmarks of games like GTA V or Witcher 3. I don't base my game purchases around my GPU; I base my GPU purchase around my games.
 
You apparently have a hard time understanding concepts, so I will spell it out for you.

[H] current review practice:

Albeit closer to real performance, it also cannot be identical from one run to the next. It is easy to skew results one way or another if one so desires. Not saying that is the case here.

Canned benchmarks:

Repeatable each and every time (except StarSwarm). However not always indicative of real world gameplay.

Both have positive points and negatives. Mentioning the simple addition of one does not preclude the other, nor does it alone indict anyone of ulterior motives. Wait, those words may be too big. Asking them to add canned benchmarks to some games does not mean I wanted them to stop doing things the way they do now; it was just an addition. Nor did I mention it because I believed they were cheating or intentionally crippling one set of cards.

It was just a conversation with a legitimate concern. This is how a forum works.

Oh, and maybe you can try to point out the bias. I mean spell it out, not like you did here, being vague. That means you either didn't read it or couldn't/wouldn't comprehend it.

I see your point about using in-game benchmarks; I just don't get why it keeps being brought up. This site has made it pretty clear they don't run canned benchmarks, and there are plenty of other hardware review sites out there that do post benchmark results. Currently, Brent doesn't place any great value on canned benchmark results, and no amount of convincing from you will change his position. If this is something you cannot accept, then this site may not suit you.
 
They pick games that people actually play. Is it sad that a ton of new AAA games run much better on Nvidia? Sure, but if you exclude those games you'll just add bias and useless info. I don't care about some random game from 3 years ago; I want to see benchmarks of games like GTA V or Witcher 3. I don't base my game purchases around my GPU; I base my GPU purchase around my games.

No biggie really, just a discussion of criteria. But I gather you missed the part of an earlier post where Dying Light is not popular, and the part about how Dragon Age: Inquisition is, top 10 most days. Anyway, I wasn't really trying to get any games removed or added, just debating the criteria and bringing to light the glaring hole in the criteria as it pertains to the current suite.
 
No biggie really, just a discussion of criteria. But I gather you missed the part of an earlier post where Dying Light is not popular, and the part about how Dragon Age: Inquisition is, top 10 most days. Anyway, I wasn't really trying to get any games removed or added, just debating the criteria and bringing to light the glaring hole in the criteria as it pertains to the current suite.


You are over a decade too late to join that debate.

Your thoughts are noted. No need to repeat those yet again.
 
Actually, being a forum, it is expected that even age-old dilemmas get debated from time to time. Avoiding any such discourse defeats the point of a forum. Unless this is more of a club, in which case the name is very misleading. At any rate, I always point out curious details and thought processes when they seem faulty or flimsy. Again, I was just discussing some thoughts, which at no point requires or necessitates any change. But this dodge-and-weave every time it gets brought up is disheartening.

But let me point out one thing from this review which adds to the point I made about the positives and negatives of benchmark runs as done at [H]. Look at Witcher 3: the graph goes through almost identical peaks and valleys. This shows that the runs are nearly identical, or at least that the same occurrences are evident in both runs at equal times during the gameplay. Now look at Grand Theft Auto V: in this case the graph deviates toward the end (I saw this in the Fury review with Dying Light, I think, but with the vendors switched), with Nvidia dropping as AMD goes up. At that point it may be that the gameplay deviated, and this is exactly what can skew results: the negative of gameplay benchmarking.
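
To illustrate how a deviation late in a run can skew a real-world average, here is a minimal sketch with synthetic frame times; none of these numbers come from the actual review data.

```python
# Synthetic example of how two "identical" manual runs diverge when the
# gameplay deviates near the end of the run. Frame times are in milliseconds.

run_a = [20.0] * 100                        # steady ~50 fps for the whole run
run_b = [20.0] * 90 + [28.0] * 10           # same run, but the last 10% got heavier

def avg_fps(frame_times_ms):
    total_s = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_s

print(f"run A: {avg_fps(run_a):.1f} fps")   # 50.0 fps
print(f"run B: {avg_fps(run_b):.1f} fps")   # ~48.1 fps
# A deviation covering only the last tenth of the run shifts the average by a
# few percent, which is why multiple runs are taken and compared.
```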


I know this will probably surprise you, but yes, we are aware of the negatives and positives of real-world in-game performance evaluations. We have been doing this for quite a while and have turned it into somewhat of a science on our end. And we take a lot of time to make sure the results shown are not simply "first run and done" data. We take multiple runs and look at those to decide which is best representative of real-world gaming performance.

That said, you are late to this discussion. We decided our focus and direction on this well over 10 years ago and are constantly refining our techniques on data collection.

Yes, we could run canned benchmarks, which we do not believe in, and show those to you. But we simply do not need to. There are many other sites that already take the easy way out to show you those. HardOCP is simply better than that. You may disagree.

 