FuryX aging tremendously badly in 2017

I was hoping this dumpster fire would fizzle out. Looks like it came around to bite the OP's ass. Hameeedo is like what you'd get if Shintel and Juan had a shill child. Love how they posted this shit on reddigg too, as if they have an agenda or something.
 
Funny how certain people, when faced with hard numbers, simply shrug them off like it's nothing instead of putting up a valid argument against them. I wonder how they even came to the conclusion that Kepler aged badly? Probably their imagination, I guess, since they hate numbers and benches so much!


People seem to forget how the FuryX performs horribly with VR games:

VR-HOCP.jpg
 
What point does this thread serve? Furys have long been EOL and therefore nowhere to be found new, so do you think you are saving people? It is obvious you have an agenda, solely to bash AMD, so you selectively pick graphs and games that will back you up, again on a point that serves no purpose. This thread should have been locked and deleted from minute one.
 
What point does this thread serve? Furys have long been EOL and therefore nowhere to be found new, so do you think you are saving people? It is obvious you have an agenda, solely to bash AMD, so you selectively pick graphs and games that will back you up, again on a point that serves no purpose. This thread should have been locked and deleted from minute one.

But it isn't just about the FuryX: the FuryX is an example of an AMD halo product that is aging poorly, contrary to the widely held expectation that AMD products are supposed to age like 'fine wine'.

Definitely food for thought for anyone who is considering buying AMD and holding on to it for a period of time.
 
But it isn't just about the FuryX: the FuryX is an example of an AMD halo product that is aging poorly, contrary to the widely held expectation that AMD products are supposed to age like 'fine wine'.

Definitely food for thought for anyone who is considering buying AMD and holding on to it for a period of time.

Food for thought: people who run an older video card with certain limits are smart enough to adjust settings, which would make things like choking on the 4 GB limit less of a factor.

I haven't seen anyone but people who are flame-baiting on purpose forcefully pushing that fine-wine argument.
It is obvious that you can skew these results and cherry-pick all you want, but then again, who would bother making a case for the Fury X and fine wine to begin with? Find the hundreds of posts claiming that the Fury X was in need of fine wine at launch time.

Only people who enjoy spending their time framing their own chain of thought. As for why you see topics on this forum about the drivers still sucking, it is all done by questionable people on this forum.
 
What point does this thread serve? Furys have long been EOL and therefore nowhere to be found new, so do you think you are saving people? It is obvious you have an agenda, solely to bash AMD, so you selectively pick graphs and games that will back you up, again on a point that serves no purpose. This thread should have been locked and deleted from minute one.

There's a point a lot of people aren't taking into consideration: GCN was pretty naively known as *FineWine* tech, with all kinds of *positive* things said about AMD taking care of their users for longer than Nvidia, and a large etc., but people never connected that to the fact that AMD had been using the same architecture since the HD 7900 series. Since Polaris, however, AMD has basically entered the *same realm* as Nvidia, which is to provide optimizations and support only for newer *tech* and newer generations of GPUs; since the RX 480, most patches, fixes, and optimizations have left behind its more powerful older brothers (the R9 390X/Fury/Fury X at that time).

However, that is not the case today: the RX 480/580 are cards that easily stomp an R9 390X and perform mostly on par with the Fury X, and now even surpass it. Strange, right? They are the same architecture, and what kept the 7970/280X/290X/390X relevant was the fact that they shared that architecture with the Fury X, so most optimizations for the Fury X also improved the performance of the older cards. That's where Fiji started to *age* differently. When Nvidia launched Pascal, Fiji was still AMD's most powerful GPU; devs knew that, devs optimized for that, and AMD optimized for that. Yet with the introduction of Vega, even though it's basically the same architecture aside from higher clocks, everything is completely different: all optimizations for Vega should improve Fiji, Polaris, Hawaii, and Tahiti (and the long list of rebranded GCN names). Strange, right? FineWine tech that no longer ages the same once *newer architectures* are released. The truth is, AMD doesn't have the money to keep optimizing its huge range of GCN GPUs; there are simply too many of them for specific optimizations, so they focus on the newer ones, exactly as Nvidia has always done. Are they bad for doing that? No, I understand it. But people have to stop believing that AMD cards age better, which I think is the goal of the OP. I may be wrong, but that's how I see AMD since Polaris, and that's what I feel with all my AMD hardware with each release of new games and drivers; I know it from my own experience.
 
There's a point a lot of people aren't taking into consideration: GCN was pretty naively known as *FineWine* tech, with all kinds of *positive* things said about AMD taking care of their users for longer than Nvidia, and a large etc., but people never connected that to the fact that AMD had been using the same architecture since the HD 7900 series. Since Polaris, however, AMD has basically entered the *same realm* as Nvidia, which is to provide optimizations and support only for newer *tech* and newer generations of GPUs; since the RX 480, most patches, fixes, and optimizations have left behind its more powerful older brothers (the R9 390X/Fury/Fury X at that time).

However, that is not the case today: the RX 480/580 are cards that easily stomp an R9 390X and perform mostly on par with the Fury X, and now even surpass it. Strange, right? They are the same architecture, and what kept the 7970/280X/290X/390X relevant was the fact that they shared that architecture with the Fury X, so most optimizations for the Fury X also improved the performance of the older cards. That's where Fiji started to *age* differently. When Nvidia launched Pascal, Fiji was still AMD's most powerful GPU; devs knew that, devs optimized for that, and AMD optimized for that. Yet with the introduction of Vega, even though it's basically the same architecture aside from higher clocks, everything is completely different: all optimizations for Vega should improve Fiji, Polaris, Hawaii, and Tahiti (and the long list of rebranded GCN names). Strange, right? FineWine tech that no longer ages the same once *newer architectures* are released. The truth is, AMD doesn't have the money to keep optimizing its huge range of GCN GPUs; there are simply too many of them for specific optimizations, so they focus on the newer ones, exactly as Nvidia has always done. Are they bad for doing that? No, I understand it. But people have to stop believing that AMD cards age better, which I think is the goal of the OP. I may be wrong, but that's how I see AMD since Polaris, and that's what I feel with all my AMD hardware with each release of new games and drivers; I know it from my own experience.
I think you're getting ahead of yourself: Vega has been out for barely two months and you're already claiming AMD no longer optimizes for older GCN revisions. I'd say it's a bit too soon to claim that; give it another six months and then we'll see.
 
Funny how certain people, when faced with hard numbers, simply shrug them off like it's nothing instead of putting up a valid argument against them. I wonder how they even came to the conclusion that Kepler aged badly? Probably their imagination, I guess, since they hate numbers and benches so much!


People seem to forget how the FuryX performs horribly with VR games:

VR-HOCP.jpg

That's true for basically all AMD cards.

Perhaps Brent could extend the fine-wine article that [H] did. Basically there was an initial improvement in some games (due to shitty optimization to start, IMO), then it flatlined.

https://m.hardocp.com/article/2017/01/30/amd_video_card_driver_performance_review_fine_wine/14


Fine wine is like taking nVidia's initial performance in Forza 7 and celebrating it. It's complete dog shit.
 
I think you're getting ahead of yourself: Vega has been out for barely two months and you're already claiming AMD no longer optimizes for older GCN revisions. I'd say it's a bit too soon to claim that; give it another six months and then we'll see.

It's been happening since Polaris, so way longer than that. And if you compare games and performance from the RX 480's launch up to today, you will see that most games today perform better on the RX 480/580 than on the Fury/Fury X.
 
I think you're getting ahead of yourself: Vega has been out for barely two months and you're already claiming AMD no longer optimizes for older GCN revisions. I'd say it's a bit too soon to claim that; give it another six months and then we'll see.

Seems fair, given that Vega is basically big Polaris, and Polaris has been out for quite some time.
 
But it isn't just about the FuryX: the FuryX is an example of an AMD halo product that is aging poorly, contrary to the widely held expectation that AMD products are supposed to age like 'fine wine'.

Definitely food for thought for anyone who is considering buying AMD and holding on to it for a period of time.
Then how about the 290X or the 7970, previous halos? You are just looking at one card which is different from all the others (HBM) and claiming a trend or some conclusion in spite of the larger data set.
 
Fine wine is like taking nVidia's initial performance in Forza 7 and celebrating it. It's complete dog shit.

Bringing up initial performance as a decision-making metric is very poor logic; worse is bringing up a game where the developer basically broke the engine. And as we saw, this was quickly remedied, and performance tended back toward the mean (as expected).
 
Then how about the 290X or the 7970, previous halos? You are just looking at one card which is different from all the others (HBM) and claiming a trend or some conclusion in spite of the larger data set.

...where did I put 'HBM' in that quote? Where did I say that HBM, as a technology, hurts AMD products?
 
This is why it doesn't make any sense to blindly upgrade to the "latest and greatest" drivers all the time. Current drivers are only intended to optimize the current generation of products. AMD doesn't want Fury owners keeping their cards forever; they don't make money that way. Nvidia and AMD both do this now and have for a long time. I specifically use older drivers that were optimized for my Fury, and I'm still enjoying the same performance the card had in the past.
 
...where did I put 'HBM' in that quote? Where did I say that HBM, as a technology, hurts AMD products?
Then read your quote again. You lumped all AMD products into your asinine point, devoid of all other facts. Fury is not at all indicative of the whole, as the 7970 and 290X/390X prove. And as Stashix pointed out, time is needed before jumping to the conclusions you made.
 
The one where I didn't mention HBM?
Holy crap, I seriously hope you are being intentionally dense. I mentioned HBM because, of the previous halo cards from AMD, ONLY the Fury has it, and it is therefore not indicative of every other AMD card or its performance over time.
 
Perhaps Brent could extend the fine-wine article that [H] did. Basically there was an initial improvement in some games (due to shitty optimization to start, IMO), then it flatlined.
Yep, do an article with all of these listed games, and let's see once and for all how the FuryX stacks up against the 980Ti now.
What point does this thread serve? Furys have long been EOL and therefore nowhere to be found new, so do you think you are saving people? It is obvious you have an agenda, solely to bash AMD, so you selectively pick graphs and games that will back you up, again on a point that serves no purpose. This thread should have been locked and deleted from minute one.
Locked? LOL! Is this how you handle technical arguments? I come up with solid numbers and you demand they be locked up and deleted? How very informative of you.
Oh, and it's not cherry-picking when the FuryX is bad in all of the late-2017 AAA titles and VR games; that's called horrible performance.
which I think is the goal of the OP, I may be wrong
The goal is to show the architectures that age worse, and to stop the nonsense that NVIDIA gimps their old architectures on purpose; some AMD fans even think Fermi and Maxwell have already aged badly!! Someone here already said the 1070 is aging badly! AMD fans go to huge lengths to fantasize about a magical fine-wine thing for their GPUs, which, on the current evidence, clearly doesn't exist.
 
I don't understand why people buy AMD GPUs when nVidia ones at the moment perform clearly better.

It's because of marketing and price; there are no AMD cards capable of matching any nV cards in all metrics, but they still sell because of the former. They are cheaper for the same performance (RX 580/480 and below), and as for the other cards, Vega, people are hoping for those magic drivers which will never come.

People say nV has mindshare... This is why nV has mindshare: they don't come out with a turd (relative to the competition) every other generation, or in this case for two generations. Just wait for next gen; AMD GPU mindshare will be at an all-time low.
 
Hell, if someone is interested in VRR and there's an AMD card in their range, right now I'll recommend AMD every time.

Of course, that will take into account total cost and some of the shenanigans played by monitor manufacturers with FreeSync ranges and feature support, but if the product is there, I won't overlook it just 'because AMD'.
 
Can't you guys see that all the NV shills are upset about the fact that they have nothing new to play with? The hate is strong in the [H] forums. The 1070 Ti is just a buzz-creating card, nothing to write home about. Would anyone like to talk about the 970 and its 4 GB of memory? How about the performance of the Fury in dual mining vs. the 1080Ti?
The Vega cards are a great buy vs. nV. Did anyone mention mining XMR on two threads with Vega while running HBCC? Can any nV card do that? NV cards are boring and a simple refresh of Maxwell. Yesterday's tech. Maybe these reviews need to evolve and take a more detailed look at what the card in question is capable of, not only in terms of gaming.
 
Can't you guys see that all the NV shills are upset about the fact that they have nothing new to play with? The hate is strong in the [H] forums. The 1070 Ti is just a buzz-creating card, nothing to write home about. Would anyone like to talk about the 970 and its 4 GB of memory? How about the performance of the Fury in dual mining vs. the 1080Ti?
The Vega cards are a great buy vs. nV. Did anyone mention mining XMR on two threads with Vega while running HBCC? Can any nV card do that? NV cards are boring and a simple refresh of Maxwell. Yesterday's tech. Maybe these reviews need to evolve and take a more detailed look at what the card in question is capable of, not only in terms of gaming.


Monero (XMR) mining, even with its upswing in the past month, is still not as profitable as the other altcoins. A Vega 64 is about the same as a GTX 1080 Ti in XMR mining (hashrate prior to overclocking/undervolting/power limiting), but it uses more power; once you do the rest, it's not even close. And again, on other altcoins that aren't memory-bound, AMD cards just can't hold a candle to Nvidia's. There is a reason why NiceHash dropped AMD support in its latest miner...

If you want to mine with AMD cards, power consumption goes up, simple as that. There is no competition. Right now I have 10% of my systems mining ETH, 10% Monero, and the rest on NiceHash, because I want Bitcoins. Soon I'm going over to all ASICs; I'll use my GPU miners as investments and the ASICs as a business, and the money that comes in fuels my other businesses.
 
I'm guessing that Hawaii-based GPUs aged well because optimizations are still carrying over from the PS4/XB1, which share the same GCN revision.

As for Tahiti, people seem to have forgotten that the launch drivers for the 7970 were complete and utter shit, wasting as much as 20% of its performance potential in games at the time, which is what allowed a smaller, lower-specced chip like GK104 to match and even beat it in some games by the time it launched a couple of months later. Fine wine for Tahiti just came in the form of fixing the launch drivers that held it back (the "Never Settle" driver), releasing an up-clocked GHz Edition, and investing in ISV relations with their Gaming Evolved initiative. Actually, in 2012, the 680 was punching above its weight class, and it could only do so much over time.
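A side note on that 20% figure: drivers that "waste 20% of the performance potential" imply roughly a 25% uplift once fixed, because the gain is measured from the crippled baseline. A quick sanity check with made-up FPS numbers (nothing here is from an actual benchmark):

```python
# Hypothetical numbers only: if launch drivers deliver 80% of the
# card's potential, fixing them yields a 25% gain over launch FPS.
potential_fps = 100.0
launch_fps = potential_fps * (1 - 0.20)  # 20% "wasted" at launch

uplift_pct = (potential_fps / launch_fps - 1) * 100
print(f"{uplift_pct:.0f}% faster after the driver fix")  # -> "25% faster after the driver fix"
```

That asymmetry (20% wasted vs. 25% gained) is worth keeping in mind whenever driver uplift numbers get thrown around.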
 
Fury still rules in dual mining. Mining whatever is most profitable at the moment is not an entirely perfect approach. To each his own... :D
 
I'm guessing that Hawaii-based GPUs aged well because optimizations are still carrying over from the PS4/XB1, which share the same GCN revision.

As for Tahiti, people seem to have forgotten that the launch drivers for the 7970 were complete and utter shit, wasting as much as 20% of its performance potential in games at the time, which is what allowed a smaller, lower-specced chip like GK104 to match and even beat it in some games by the time it launched a couple of months later. Fine wine for Tahiti just came in the form of fixing the launch drivers that held it back (the "Never Settle" driver), releasing an up-clocked GHz Edition, and investing in ISV relations with their Gaming Evolved initiative. Actually, in 2012, the 680 was punching above its weight class, and it could only do so much over time.

I was on AMD at the time. Having had it with the drivers, as patient as I am, Kepler brought me back over to the green. Fast and efficient, and back to Nvidia's more consistently solid drivers? It was time.

I've kept a hard eye on AMD since, but the story has been much the same, bar a noticeably better driver program: late to the table, no top-end competition, and hotter/louder overall. FreeSync is certainly a plus for the moment, though.
 
Some threads are biased, and that is expected. This one is completely dishonest, and it is wonderful to see the cheerleading squad for this thread, who have no chance of ever thinking for themselves.


Project Cars 2
980Ti is 43% faster than FuryX @1080p and 25% faster @1440p!
Or 13%
https://overclock3d.net/reviews/software/project_cars_2_pc_performance_review/10


Total War Warhammer 2
980Ti is 40% faster than FuryX @1080p, and 35% faster @1440p; even a GTX 1060 is equal to or faster than the FuryX
And the FuryX matches the 1070 in Total War: Warhammer
https://www.techspot.com/review/1476-amd-radeon-vega-64/page10.html

Call Of Duty WW2
980Ti is 35% faster than FuryX @1080p, and 25% faster @1440p
Yep, Guru3D had WAY better results, and so did TPU
https://www.techpowerup.com/reviews/Performance_Analysis/Call_Of_Duty_WW_II/4.html

Shadow Of War
980Ti is 20% faster @1080p, and 50% faster @1440p
Or like 2% better at 1440p, but who is keeping track?
https://www.techpowerup.com/reviews/Performance_Analysis/Middle_Earth_Shadow_of_War/5.html

toilet.PNG
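For anyone checking these dueling percentages at home, "X% faster" depends both on the review's FPS numbers and on which card you treat as the baseline; a quick sketch (the FPS values below are made up, not taken from any linked review):

```python
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, as a percentage of B."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical FPS values for illustration only:
fury_x, gtx_980ti = 70.0, 80.0

print(round(percent_faster(gtx_980ti, fury_x), 1))  # 14.3  (980Ti over FuryX)
print(round(percent_faster(fury_x, gtx_980ti), 1))  # -12.5 (FuryX deficit vs 980Ti)
```

The same pair of cards can thus produce a "14% faster" or a "12% slower" headline depending on the baseline, before you even account for different test scenes and settings between reviews.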
 
Fury still rules in dual mining. Mining whatever is most profitable at the moment is not an entirely perfect approach. To each his own... :D


Really? I'm making $2.50 per card on my 1070s; hmm, can't do that on a Fury ;). Vega, yeah, it can do it with double the power usage; actually Vega can do $3.25 a card right now, but once ya factor in power it goes down, way down, to $2.25.

I don't mine what's most profitable; I mine on NiceHash, and it does that for me. The reason I want Bitcoin is that the potential for Bitcoin to go past 10k this coming year is high. ETH, yeah, it can gain another 50%; Monero, I don't think it is going to go much more than 25% higher. If you know anything about ICOs or market changes based on XMR, let me know, because I don't see anything earth-shattering. There is a reason why Bitcoin, Litecoin, Dash, and Ethereum are the major coins; Ethereum has the potential to surpass Bitcoin in market volume because of the way it's structured. Monero, I don't see that yet; even its white papers don't really show that it's possible for it to take on these other coins. Bitcoin forks are driving its price up right now, and those forks are quite profitable at the moment, at least for mining. And now there is a Bitcoin fork based on Equihash, Zcash's algorithm: better on nV hardware, relatively poor on AMD hardware.

If you are in it for investing only and not cashing out, I would stick with ETH; Monero doesn't seem to have the potential to match ETH, not yet at least.
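The "factor in power" step above is plain arithmetic; here's a minimal sketch, where the gross-per-day figures, wall wattage, and electricity rate are all hypothetical placeholders, not measurements:

```python
def daily_net_usd(gross_usd: float, watts: float, usd_per_kwh: float) -> float:
    """Daily mining profit per card after subtracting electricity cost."""
    electricity = (watts / 1000.0) * 24.0 * usd_per_kwh
    return gross_usd - electricity

# Hypothetical figures for illustration:
print(daily_net_usd(3.25, 300.0, 0.10))  # Vega-like card: 3.25 - 0.72 = 2.53
print(daily_net_usd(2.50, 150.0, 0.10))  # 1070-like card: 2.50 - 0.36 = 2.14
```

With made-up numbers like these, a card with a higher gross can still net less once its wall draw is accounted for, which is the whole argument being made above.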
 
There's a point a lot of people aren't taking into consideration: GCN was pretty naively known as *FineWine* tech, with all kinds of *positive* things said about AMD taking care of their users for longer than Nvidia, and a large etc., but people never connected that to the fact that AMD had been using the same architecture since the HD 7900 series. Since Polaris, however, AMD has basically entered the *same realm* as Nvidia, which is to provide optimizations and support only for newer *tech* and newer generations of GPUs; since the RX 480, most patches, fixes, and optimizations have left behind its more powerful older brothers (the R9 390X/Fury/Fury X at that time).

However, that is not the case today: the RX 480/580 are cards that easily stomp an R9 390X and perform mostly on par with the Fury X, and now even surpass it. Strange, right? They are the same architecture, and what kept the 7970/280X/290X/390X relevant was the fact that they shared that architecture with the Fury X, so most optimizations for the Fury X also improved the performance of the older cards. That's where Fiji started to *age* differently. When Nvidia launched Pascal, Fiji was still AMD's most powerful GPU; devs knew that, devs optimized for that, and AMD optimized for that. Yet with the introduction of Vega, even though it's basically the same architecture aside from higher clocks, everything is completely different: all optimizations for Vega should improve Fiji, Polaris, Hawaii, and Tahiti (and the long list of rebranded GCN names). Strange, right? FineWine tech that no longer ages the same once *newer architectures* are released. The truth is, AMD doesn't have the money to keep optimizing its huge range of GCN GPUs; there are simply too many of them for specific optimizations, so they focus on the newer ones, exactly as Nvidia has always done. Are they bad for doing that? No, I understand it. But people have to stop believing that AMD cards age better, which I think is the goal of the OP. I may be wrong, but that's how I see AMD since Polaris, and that's what I feel with all my AMD hardware with each release of new games and drivers; I know it from my own experience.

Supposition from start to end.

The amount of optimization needed for newer architectures is not the same as for older GCN-based products; you can't optimize for "new" "GCN" features in older versions of GCN. It also makes sense that several things have changed: people seem to think GCN is one overall feature set with the exact same hardware features, which of course it is not. What is used is an approach that allows compatibility, but also allows optimizing for the changes in the newer hardware.

The rate of improvement slows down as the GPU features change, and therefore the same path for improvement is no longer shared, or is less used, which means less of an update in newer drivers for an older GPU. Nothing more, nothing less.
 

Destiny 2 2160P - Fury X 12% faster

http://www.guru3d.com/articles_pages/destiny_2_pc_graphics_performance_benchmark_review,5.html

Resident Evil 7 1440P - Fury X 11% faster

http://www.guru3d.com/articles_pages/resident_evil_7_pc_graphics_performance_benchmark_review,7.html

Battlefield 1 1440P - Fury X 4% faster

http://www.guru3d.com/articles_pages/battlefield_1_pc_graphics_benchmark_review,7.html

Prey 2160P - Fury X 3% faster

http://www.guru3d.com/articles_pages/prey_pc_graphics_performance_benchmark_review,7.html

We can hand-pick these things all day long. If you're gonna sit here and disagree that these two cards will trade blows in different games and at different resolutions (won't even mention driver differences), then your bias and agenda here are clear.
 
We can hand-pick these things all day long. If you're gonna sit here and disagree that these two cards will trade blows in different games and at different resolutions (won't even mention driver differences), then your bias and agenda here are clear.
LOL, 3 or 4 or 12% is not 30 or 40%. These games have the 980Ti and FuryX within normal range of each other; all the links listed in the OP have the FuryX within the range of the RX 580/1060, far below the 980Ti. BIG DIFFERENCE.
 
And we gave you links for your cherry-picked games where the FuryX does as well as the 980Ti, such as Shadow of War and WW2.
We didn't even bother posting AMD cherry-picked games such as Dirt.

So LOL all you want, because you haven't really shown us anything.
 
You nerds are still arguing about a card from 2 years ago; let's just sum this up in one sentence: the Fury X does well in some games and not so well in others, end of story.
 
I don't understand why people buy AMD GPUs when nVidia ones at the moment perform clearly better.

ROI. I paid $200 for my Fury X, mined with it for months, then sold it for $350. I am trying Nvidia this time around, but $500 up front sucks. Loved buying up 6-pin RX 480 4GBs for $159 brand new. Gaming-wise, I don't see a lick of difference going from a Fury X to a 1080 OC, but my benchmark scores are nice... :rolleyes:

I've been on team red a long time. My 4870, 5870s, 7970, and 290X all aged really well. I too was a bit worried about the 4 GB on the FuryX, so selling them for a profit was an easy choice. Kyle did a review comparing Vega to the Fury X clock for clock... weren't they within 3-5%?
 
Well, he has a point and you do not; calling him out on what he did just shows that he validated his point about your post.
6 links vs. 1 link; having a point against this kind of consensus is the same as saying Vega is going to crush Volta based on an AMD video scene with a "Poor Volta" sign!
 
Fury still rules in dual mining. Mining whatever is most profitable at the moment is not an entirely perfect approach. To each his own... :D

My 1070 says otherwise. That said, it is winter now and I could use a space heater...
 
6 links vs. 1 link; having a point against this kind of consensus is the same as saying Vega is going to crush Volta based on an AMD video scene with a "Poor Volta" sign!

It is supposition: you make a claim which, according to you, is true because of your limited links, then you complain about someone linking different results that make your links look bad, so you dismiss them. Dismissing links that don't support your narrative just shows you are not here to prove anything; you are here to troll people.

Either all the links support your claim, and then you have a point, or your post is just limited to the few examples you posted. Your claim about fine wine is also supposition, because it is not a blanket framerate improvement but rather limited to certain games and engines.

In the end, you proved nothing but your bias...
 