FineWine: Reference 980Ti vs Fury X revisited

A little independent testing shows that at least some of these tests weren't actually run at all, for any of the three cards. In some of the games (Doom/Vulkan @ 4k being a prime example), *none* of the cards performs that poorly in actual use.
 

What?

What do you mean independent testing?

How do you know where they performed the benchmark run in DOOM to compare it to others?
 
Because I have access to two of those cards, with a weaker core system, and both the 1070 and Fury X perform better at 4k maxed than what is shown in the listed article. Substantially better. There is no way they actually ran those tests with those drivers on that test system and got the results they posted. Not without something being *very* wrong with the PC itself.

Edit: this is specifically in response to the Doom query.
 

Doom doesn't have a built in benchmark, and you have no idea what scene they used to test.

There's literally nothing there for you to reach any conclusion whatsoever.

I have run extensive performance tests on Doom previously, and I can tell you that depending on the scene I average 75 fps or 125.
 
So, by your own argument in this specific case, because nothing can be reproduced by settings alone, the linked article you posted (for purposes of discussion on Doom) is completely worthless and non-representative of any type of gameplay. Good to know. So the rest of the verified items that *weren't* Doom. Are those allowed?
 

That's a straw man, and many here know how dull I find them.

I'll entertain it though.

What I said was that the results cannot be compared to other sources unless they test a very similar run. This is not intended to be a performance review of a game; the purpose of collecting this data is to compare the cards tested.

It means precisely fuck all that in your test with a fury x you get X change in average framerate.

If you had all the cards tested and performed the same benchmark run on all of them then we could compare relative performance of the cards in your data set to the one compiled by babeltech.

I don't know what "so the rest of the verified items that weren't doom." means, it reads like you wrote half a sentence then abandoned it.

I'm guessing what you're trying to say is that since the Doom numbers aren't representative of the performance of the game as a whole, all the other results suffer from the same limitation and thus are all invalidated. No, because the point isn't to represent the performance of games, but to compare that of different cards.

No benchmark is representative of the game as a whole unless there's very little variation in the complexity of the different scenes.

Unless you have some actual evidence and justification for why this review contains flawed data your argument is null and void.
 

Nope. Just needed to see you write that the actual performance numbers of the cards themselves don't matter, just the relation of performance of equal loops or benchmarks performed with each card. Essentially, that it's not the absolute number for each card, it's the differences between the cards when run through the same tests. Am I understanding that correctly?
 

No you're not.

They are testing three cards with old and new drivers.

The point is to see if the relative performance of, say, the Fury X and 980ti has changed over the course of a year.

The actual performance numbers do matter because they form the basis of the comparison...
 
It's just that if in their run the Fury X averages 60 fps and on your identical system you average 100 fps, then unless you're running the exact same benchmark run, you've said nothing at all that could call the results into question.
 
So, the point of the test is specifically to determine whether maturation of the drivers has significantly impacted the performance of each card, solely. Meaning that, because of the scope of the tests, it can't even be inferred that the same levels/loops were used for each card, unless the test itself is said to use a benchmark with a given settings list. Right?
 

Why are you talking about a settings list?

Man...

I'm Mr Babeltech.

I slot in a Fury X and install a driver from last year, then benchmark Doom using a predetermined loop of my choosing. I install the latest driver and repeat the test.

I slot in a 980 Ti, install last year's driver, and run the same loop I used on the Fury. Install the latest driver, repeat.

Last year the Fury X was 85% of a 980 Ti.

Using the latest drivers, the Fury X is 93% of a 980 Ti.

The Fury X improved X%, the 980 Ti improved Y%.

Conclusion.
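The arithmetic behind that kind of comparison can be sketched in a few lines. The fps values below are invented purely so the ratios line up with the hypothetical 85% / 93% figures above; they are not measurements from any review:

```python
# Invented average fps for the same benchmark loop under each driver era,
# chosen so the Fury X / 980 Ti ratios match the 85% / 93% example above.
old = {"fury_x": 51.0, "980ti": 60.0}
new = {"fury_x": 65.1, "980ti": 70.0}

def relative(a, b):
    """Card a's average as a fraction of card b's (same run, same settings)."""
    return a / b

def improvement(before, after):
    """One card's gain from the driver update, in percent."""
    return (after / before - 1) * 100

print(f"old ratio: {relative(old['fury_x'], old['980ti']):.0%}")          # 85%
print(f"new ratio: {relative(new['fury_x'], new['980ti']):.0%}")          # 93%
print(f"Fury X improved {improvement(old['fury_x'], new['fury_x']):.1f}%")
print(f"980 Ti improved {improvement(old['980ti'], new['980ti']):.1f}%")
```

The absolute numbers only need to be internally consistent; the conclusion is carried entirely by the two ratios and the per-card deltas.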
 
The problem arises when someone other than Mr Babeltech with an identical system decides to test DOOM, the benchmark loop won't be the same unless you agreed on it beforehand.

Like I said, I previously ran many, many benchmark runs on Doom on my card; I'll link them when I find them here on the forums.

Depending on which level/scene I tested, the averages could change by up to 60%.
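To put a number on that, here is the swing implied by the two scene averages quoted earlier in this post (75 fps and 125 fps, same card, same settings):

```python
# The two scene averages quoted above (fps): same card, same settings,
# different scene. Only the benchmark loop differs.
light_scene = 125.0
heavy_scene = 75.0

# Treating averages from different (or unknown) loops as comparable is the
# whole problem: scene choice alone moves the number by this much.
swing = (light_scene / heavy_scene - 1) * 100
print(f"scene choice alone shifts the average by {swing:.0f}%")
```

A gap that large between loops is far bigger than any plausible driver-to-driver delta, which is why the loop has to be held constant.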
 

??? Because being able to reproduce results is the definition of validating a test. You have to have the settings list to be able to verify that your own observations are in line. Say, if I screw up and don't max my FoV in my settings in DE:MD, then my test results wouldn't be comparable. I used Doom as an example of how things are messed up because, even in a full playthrough, I don't remember a point where a section yielded average framerates that tanked that far. Such incredibly low average framerates *across all three cards* has me wondering if that specific test actually took place. Without the definition of the section/loop, as you pointed out, it seems difficult to believe *any* of those quoted changes between driver revisions, because the numbers used to establish the amount of change seem to be suspect. For the tests he performed that *did* use built-in benchmarks, some of those are also pretty significantly different from what can easily be reproduced.
 

You not remembering performance being that low isn't proof of anything... How do you know you played the entirety of the game using that one driver?

Were you playing the game or constantly watching the framerate and taking note of sections where it was relatively low?

Do you even have the same system they used?

If some of the games they tested have built-in benchmarks, and you have the same system, and you can easily produce results that are very different from theirs, then go ahead. That's exactly what I thought you would be doing when you said their data was off.
 
Wait, are you now saying they didn't keep the settings constant across tests?
 
LOL, I'd trust WCCF over Babeltech any day. AMD blacklisted that site back in the 7970 days. You should have seen their old site, ABT, before it split up. Reminded me of AMDZone, is all I will say.
 

AMD blacklisted this site back in the few months ago days.

WCCF's posts are 99% rumors and speculation. Babeltech runs tests; I honestly see no reason why you assume they falsify their data.
 


Yeah, but the company was in a wayyyy different position in 2011-2012 than it was in 2015. Just saying... AMD is constantly doing damage control for their image now. Back then, you'd have had to repeatedly attempt to sour their reputation. I'll let you google it since you are a pro here and always give AMD a fair shake.


Do I trust this site for reviews? Yes I do. I do NOT trust ABT or BTR or whatever it is called. It's owned by the same guy. It's my opinion; I am not stating it as fact, but you can search and find your own facts. :)
 
No. I have access to the same cards, with a weaker core system ([email protected]) and I get higher results. So unless we're willing to entertain that an 8320 is somehow superior to an i7-4790 in a gaming benchmark (which I don't think anyone, including myself, is), then something is wrong with the numbers being used to establish the degree of change.

As one very easily reproducible example, the SoM numbers are from the benchmark tool included with the game. We know this because the in-game engine is capped at 100 FPS, so the 980 ti and 1070 reporting framerates higher than 100 FPS means that the benchmark tool was being used.
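That inference is just a threshold check, assuming the 100 FPS in-game cap claimed above holds:

```python
ENGINE_FPS_CAP = 100.0  # the in-game cap claimed above for SoM

def must_be_benchmark_tool(reported_avg_fps):
    """A reported average above the in-game cap cannot have come from capped
    gameplay, so the standalone (uncapped) benchmark tool must have been used.
    A result below the cap is consistent with either method."""
    return reported_avg_fps > ENGINE_FPS_CAP

print(must_be_benchmark_tool(112.0))  # True: above the cap
print(must_be_benchmark_tool(39.7))   # False: consistent with either source
```

Note the asymmetry: the check can only prove the tool *was* used, never that it wasn't.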

Now I suppose it's possible that the Fury X tests were performed with actual gameplay and that the NV tests were just benchmark tool runs, but now we're treading into making excuses for changes in testing methods within the same game, although not necessarily within the same card's suite. At the very least, there should be a note saying that some driver comparisons for a game were done with included benchmark tools for some cards, where others were done with actual gameplay. When it comes to rolling out reported performance changes by driver version for each card, those notes mean something.

Since the benchmark tool was used for the NV cards, I'll make the presumption that it was also used with the AMD card. A stock XFX Fury X using 16.11.5 drivers, paired with the [email protected], using all of the settings completely maxed in SoM's benchmark produces a result of 39.7 FPS @ 4k. At 1440p, it's 73.9 and at 1080p, it's 98.5. The numbers are substantially higher than the reported output above, with a substantially weaker core system behind the card.

I'm not saying they changed the settings between the tests. I'm doubting whether they actually ran the tests as they stated in the first place. At least some of them.

Edit: for typo on CPU in their test system.
 

Link your benchmark score in SoM on your system with a Fury X. Use the same driver they did; I'm assuming your Fury X is stock, i.e. not overclocked, undervolted, overvolted, running reduced memory timings, etc.

Do we know that they're not using PresentMon, btw? And are you sure Shadow of Mordor is capped?
[attached image: benchmark chart]


Yeah, there's definitely something up there considering this is 1440p. They do say they max out all settings in all games tested, so maybe there was an updated texture pack or something?

Honestly man, I struggle to believe they'd literally make up numbers; that's utterly retarded if true.
 
There is no mention of using PresentMon anywhere in their posting and yes, I'm sure that in-game SoM is capped at 100 FPS.

Edit: Found it. There's an option for Camera Blur and for Camera + Motion Blur. Turning motion blur on does have a significant impact. Not enough to make my 8320 system slower than their 4790 system, but it's not 10% faster at each resolution now; it's around 4% faster at each. Still pretty suspect.

After having found that, I rechecked the other in-game benchmark they used, RotTR. Their numbers are still strangely low, and it's a lot easier to tell if a setting isn't maxed out in the options screen there, as the setting toggles/sliders don't wrap around to low when you click past ultra/highest.
 
So, by your own argument in this specific case, because nothing can be reproduced by settings alone, the linked article you posted (for purposes of discussion on Doom) is completely worthless and non-representative of any type of gameplay. Good to know. So the rest of the verified items that *weren't* Doom. Are those allowed?

I read the same post you did, but didn't read the same words. What are you going on about exactly?
 
It's bad when websites make up numbers to get clicks. It's so easy to run benchmarks or just do it the right way. Why bother lying?
 