The Fury X vs. the GTX 980 Ti Revisited with 36 Games including Watch Dogs 2



http://www.babeltechreviews.com/fur...ted-36-games-including-watch-dogs-2/view-all/



Overall, the GTX 980 Ti is still significantly faster than the Fury X in the majority of our games, although the Fury X has gained a little ground – 7 additional benches out of the 75 we originally tested are now in the Fury X's favor. We don't see the GTX 980 Ti losing any ground to the GTX 1070 in the older games, although the newer card pulls further ahead in some of the newest titles.

We continue to see good optimizations being made for the GTX 980 Ti, although they appear smaller than those for the GTX 1070. And as AMD's flagship, the Fury X continues to get attention from AMD's driver team, which is managing its limited 4GB of vRAM rather well. The games where the Fury X had issues at 4K – especially Assassin's Creed Syndicate and GTA V – are now playing much better than they were 6 months ago.
 
If the 980 Ti smokes the Fury X, then the 1070 is going to smoke the non-X Fury even more.
 
Picking an Nvidia-biased website as proof is like super trolling.

Just like your troll post on Anandtech right now about video cards. When will it end, Medium?

God, and you posted this over there. How much is Nvidia paying you now?
 
It's a lot closer than you think. ;) But the 1070 won of course. Still not too bad for $239.
He tested Heaven, GTA 5, and two AMD partnered games. I am sure it appears a lot closer than it actually is.
TPU's summary shows the 1070 about 25% faster at 1080p. Fury also has half the VRAM and basically no OC headroom.

In a market where 980 Ti's are going for $300-$350, Fury is worth about $240.
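That $240 figure is just the competing card's street price scaled by relative performance. A rough back-of-the-envelope sketch: the 25% and $300 inputs come from the posts above; the function name and the assumption that fair price scales linearly with frame rate are illustrative, not anything the poster specified:

```python
# Back-of-the-envelope value estimate: scale a reference card's street
# price by relative performance. Assumes price should track $/frame.
def value_price(reference_price: float, relative_speedup: float) -> float:
    """Price at which the slower card matches the faster card's $/frame.

    relative_speedup: how much faster the reference card is (1.25 = 25% faster).
    """
    return reference_price / relative_speedup

# GTX 1070 ~25% faster at 1080p; 980 Ti street price ~$300 (low end of range).
print(round(value_price(300, 1.25)))  # -> 240
```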
 

Of course. I never said that they were equal, did I?
 
And as AMD's flagship, the Fury X continues to get attention from AMD's driver team, which is managing its limited 4GB of vRAM rather well. The games where the Fury X had issues at 4K – especially Assassin's Creed Syndicate and GTA V – are now playing much better than they were 6 months ago.

That does not sound like a biased web site to me.

Nvidia Bias website to show proof

proof of what?
I quoted the review.
 

Which everyone knows is an Nvidia-biased website. Their numbers are off. Even people at Anandtech called you out on it. You know what you did, just like you're trying to justify your "buy a $1000 high-end GPU for poor people" article.
 
The problem for Fiji is that Polaris fixed one of its weakest points vs. Nvidia by adding primitive discard. Unless you prefer old games, Fiji isn't going to age well.

[Watch Dogs 2 benchmark chart]
 

I don't see what you are trying to say here. Please explain.
 
As a 980 Ti owner, all I can say is thank god I don't give a shit about DX12. It's really just a reason for MS to push Win 10.
 
lol, look at those Doom Vulkan numbers. You know they just phoned that shit in. They didn't test anything, just scribbled in whatever numbers they wanted.

Sad thing is, it's easily verifiable that they didn't actually do some of those tests. The Doom 4k *minimum* doesn't even drop that low in Vulkan.

RotTR is the only other one from that list I still have installed, and paired with my FX-8320, a stock Fury X is 10% faster at every resolution, using the same settings, than what is shown in those charts. And I'd expect there to be at least some CPU limitation from my 8320 compared to their i7, so...yeah. A fair number of questions should be leveled at those "results".

Edit: I loaded up the rest of the titles overnight and did a few more quick tests this morning. Easy enough when all of the settings are listed for each title.

Shadow of Mordor is largely the same as RotTR: 10% lower than reality across the board. They obviously used the built-in benchmark, because the in-game engine is capped at 100 FPS, so the 1070 numbers mean they couldn't have done in-game testing. Benchmark it is, then. 4K avg was 39.7, 1080p was 101.3.

DE:MD was rough on the Fury. This test reminded me why I don't usually try to play it at 4K. The benchmark was *not* used here, because with all the listed settings at 4K, the benchmark couldn't break 10 fps. In-game play was only slightly better, averaging 14 fps at 4K and 34 fps at 1440p.

DA:I was one I had to futz with a little, mainly because there is no 1xMSAA setting in the game – just off, 2x, and 4x. So I ran my tests with 2x and still saw average fps slightly higher than reported above, although only 4K was more than 5% different, at 28.7 fps.
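The comparisons in this post boil down to a percent delta between a locally measured average and the chart's published figure. A minimal sketch of that check, with made-up sample values rather than the poster's actual runs:

```python
# Percent difference between a locally measured FPS average and a
# published chart figure. The sample numbers below are hypothetical,
# not the poster's real measurements.
def pct_delta(measured: float, reported: float) -> float:
    """Positive result means the local run was faster than the chart."""
    return (measured - reported) / reported * 100

# e.g. a local run averaging 44 fps against a chart showing 40 fps
delta = pct_delta(44.0, 40.0)
print(f"{delta:+.1f}%")  # -> +10.0%
```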
 
I don't see what you are trying to say here. Please explain.

It's an example of Polaris beating Fury due to primitive discard. The geometry bar simply got raised (never an issue on Nvidia), and pre-GCN-1.3 cards fall short right away.
 
Using reference clocks on a 980 Ti is a bit of a waste, as most overclock something like 20%.
 
lol, I saw this OP posted this at anand and they locked it... not surprised.


That's because some of the mods over there are AMD-biased, it's obvious.
In fact the forum is run by AMD marketing now.
People keep asking why no reviews? Watch how fast Vega gets reviewed.
 
I think it's awesome that my RX 480 is faster in Doom than a Fury X. It must be my superior FX-9370 giving me all the extra frame rate under Vulkan. I knew Intel processors were overrated!

http://www.hardocp.com/article/2016...x_1060_founders_edition_review/4#.WETO5fArKCo


[HardOCP Doom benchmark charts from the GTX 1060 Founders Edition review]

Not sure how your FX came into it. But it shows the issue of reusing old numbers against other game versions than the current one – including you, for using very outdated Doom numbers. Not to mention different settings, hardware, bench runs, drivers, and so on. There is a mix of stock 4790K and 6700K; [H] for example uses a 4.7GHz 6700K. I thought people had learned never to compare directly across sites.

http://www.hardocp.com/article/2016...0_g1_gaming_vs_msi_gtx_1060_x/12#.WEVPsYWcEuU

[HardOCP benchmark chart from the follow-up review linked above]


This doesn't look to be the original run either, or something changed a lot. And what's up with the huge swings on the 480 in both cases?
 
If the 980 Ti smokes the Fury X, then the 1070 is going to smoke the non-X Fury even more.

Who the hell cares. I have two of those in an mGPU configuration and they run really well. For those games that do not support mGPU, one is usually sufficient for 4k at reasonable settings.
 
If you look at the graph it literally did that once the entire test. Maybe it didn't like a decal on the wall. ;)
Are we looking at the same benchmark... On the HOCP bench you posted, the RX 480's graph looks like a failed polygraph test.
We know AMD has had problems with frame consistency in the past and, frankly, seeing shit like this makes me hesitate on Vega.

What do these spikes/dips look like in-game? Stutters? Or is it just a visual error on HOCP's monitoring tools?

[screenshot of the frame-time graph in question]
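The dip being asked about is just one long frame in a frame-time trace. A small sketch of how such a trace maps to the FPS spikes and dips on these graphs – the numbers here are invented for illustration, not [H]'s actual capture:

```python
# Turn a frame-time trace (milliseconds per frame) into instantaneous FPS
# and flag dips below a threshold -- the kind of one-off spike discussed
# above. The trace here is made-up data, not a real capture.
frame_times_ms = [16.7, 16.9, 17.1, 16.8, 35.7, 16.6, 17.0]

# 1000 ms / frame time = instantaneous frames per second
fps = [1000.0 / ft for ft in frame_times_ms]

# A single 35.7 ms frame registers as a dip to roughly 28 fps,
# even though the surrounding frames all sit near 60 fps.
dips = [(i, round(f, 1)) for i, f in enumerate(fps) if f < 30.0]
print(dips)
```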
 
I was talking about the one 28 fps dip on the chart, the one all the way to the right. The tall spikes for the RX 480 are where the card is getting really fast. It doesn't dip lower than the GTX 1060 except for that one instance where, as I said, there must have been a bad decal on the wall. It just runs away from the GTX 1060 in general. If the GTX 1060 had more horsepower it would have taller spikes like the RX 480, but it doesn't, because it couldn't hang in this title at that point in time.

Luckily things have gotten a lot better for both cards over time. Check out this review. The GTX 1060 simply doesn't have the horsepower for this game. Not sure why not; probably architecture differences that make some games faster on Nvidia and others faster on AMD. Maybe AMD and Nvidia like to be fed different ways?

GTX 1060 vs. RX 480 - An Updated Review
http://www.hardwarecanucks.com/foru.../73945-gtx-1060-vs-rx-480-updated-review.html

The OC'd EVGA card is getting stomped by a reference RX 480 – probably architecture differences rearing their head. Both are great cards, and the GTX 1060 wins in other games.

[Hardware Canucks updated GTX 1060 vs. RX 480 benchmark charts]
 
Who the hell cares. I have two of those in an mGPU configuration and they run really well. For those games that do not support mGPU, one is usually sufficient for 4k at reasonable settings.

Only in older games. There aren't many new games that run well at 4K with higher graphics settings on one card. I'm hoping this will change soon, but the best games always push the cutting edge. SLI/Crossfire support seems to be disappearing. Maybe DX12's mGPU support will change all this, but I'm not expecting it to.
 