FuryX aging tremendously bad in 2017

PontiacGTX is talking out of his ass. That much we understand.


PontiacGTX is kinda correct, but it wasn't a matter of optimizations being done for both vendors; developers can spend the time to optimize for all hardware.

They also had optimization routines for the Xbox and PS4, so it was easier for them to port those over using shader intrinsics. Now, if we look at the performance of the RX 580, which is close to the same chip that's in the Xbox and PS4, it holds its standing against the competition's cards better than Vega does. That is what shader intrinsics are about: same hardware, different platform, and you still get the same optimizations to port over.

With Vega, those shader intrinsics will not carry over well.

If Anarchist4000 was still around, we would have a nice flashback to discussions about this and how he wanted to say shader intrinsics were the end-all for nV, since consoles would force optimization paths. Of course, I stated quite the opposite: it will hurt AMD in the long run, because it will really only work for a similar level and generation of hardware; once that is different, it all falls apart. This is a good example of that.
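
To make the mechanism concrete, here's a minimal C++ sketch of the idea. Everything in it - the ISA enum, the shader variant file names, the selection function - is hypothetical, not any real engine's API: a console-derived intrinsics fast path only gets selected when the PC GPU matches the console's ISA generation, and everything else falls back to the portable shader.

Code:
// Hypothetical sketch of the argument above - the enum, file names, and
// function are invented for illustration, not any real engine's API.
#include <cstdio>
#include <string>

// Coarse ISA buckets; the consoles sit in the GCN gen3/gen4 range.
enum class GpuIsa { GcnGen3, GcnGen4, Vega, Other };

std::string SelectShaderVariant(GpuIsa isa) {
    switch (isa) {
        // RX 480/580-class parts are close enough to the console chips
        // that the intrinsics path ported from Xbox/PS4 is used as-is.
        case GpuIsa::GcnGen3:
        case GpuIsa::GcnGen4:
            return "lighting_gcn_intrinsics.cso";
        // Per the argument above, Vega (a different generation) and
        // non-AMD parts get the portable path - no free console speedup.
        case GpuIsa::Vega:
        case GpuIsa::Other:
        default:
            return "lighting_generic.cso";
    }
}

int main() {
    for (GpuIsa isa : {GpuIsa::GcnGen4, GpuIsa::Vega, GpuIsa::Other})
        std::printf("variant: %s\n", SelectShaderVariant(isa).c_str());
}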

As far as tessellation goes, they can't push polycounts on consoles, so game assets and tessellation levels have to be made for AMD hardware. That will automatically favor AMD PC hardware, or actually in this case, as PontiacGTX stated, it "doesn't hurt" AMD hardware.
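
Same caveat as above - a hypothetical sketch, with an invented tessellation budget rather than measured data - but it shows what "made for AMD hardware" looks like in practice: the asset pipeline clamps tessellation factors to a console budget, so the PC build never reaches the factor range where NVIDIA's tessellators pulled ahead.

Code:
// Hypothetical again - the budget constant is an illustrative assumption,
// not a measured value. Console asset pipelines cap tessellation factors
// in the range where GCN-era parts stop scaling, and the PC build
// inherits the cap.
#include <algorithm>
#include <cstdio>

constexpr float kConsoleTessBudget = 16.0f;  // assumed, for illustration

float AuthoredTessFactor(float distanceToCamera) {
    // Naive distance-based LOD: closer surfaces get more subdivision...
    float raw = 64.0f / std::max(distanceToCamera, 1.0f);
    // ...but the console budget clamps it well below the 64x API maximum.
    return std::clamp(raw, 1.0f, kConsoleTessBudget);
}

int main() {
    for (float d : {1.0f, 4.0f, 32.0f})
        std::printf("distance %5.1f -> tess factor %4.1f\n", d, AuthoredTessFactor(d));
}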
 
The real question is will Vega age as well as Fury? :D

Talk about a love/hate relationship with a video card - Vega. SteamVR yesterday was a slideshow with the two Vega FEs on the 18.2.1 Pro drivers, in both game mode and Pro mode, with or without CFX. Then I discovered that the Bluetooth driver for the breakout box was not loaded; after that, smooth sailing in CFX and VR (Serious Sam VR: The Last Hope). Driver-wise, I had a much better experience with Fiji than with Vega.

Pascal cards overall have just been stellar - not perfect but pretty darn good.
 
I don't understand this thread, it's been completely debunked, can we close it?
It was never debunked; it was confirmed every step of the way, actually, and by multiple sources:

Dozens of 2017 games where the FuryX is barely faster than an RX580, and way behind the 980Ti

https://hardforum.com/threads/furyx-aging-tremendously-bad-in-2017.1948025/

[GameGPU] 980Ti is 30% faster than FuryX in 2017 games

http://gamegpu.com/test-video-cards/podvedenie-itogov-po-graficheskim-resheniyam-2017-goda

[HardwareUnboxed] GTX 980Ti is Aging Considerably Better Than FuryX!



[ComputerBase] GTX 1070 (~980Ti) is considerably ahead of the Fury X

https://www.computerbase.de/2017-12...marks_in_1920__1080_2560__1440_und_3840__2160
 
It was never debunked; it was confirmed every step of the way, actually, and by multiple sources: [...]


Jesus you haven't been banned yet?
 
Lol, the Fury X (and most AMD cards) are obsolete by the time they are released. They have been behind the ball for over a decade.
 
I made no threats; I don't have the power to ban anyone. I merely expressed my shock.

How about I make a thread that the GTX 680 is aging terribly in 2018? It would be pretty fucking useless, right?

A more apt analogy would be creating the same thread about the 980 Ti as they were released within the same timeframe and at the same price point.

EDIT: Up to this point it was not obvious that you don't have ban privileges. Your badge carries the staff moniker, and it's a reasonable assumption that someone with that tag has more powers than 'normal' users. Furthermore, I'm pretty sure it's against forum policy to imply that a person should be banned. If you feel that strongly about someone's post, I'm pretty sure that's what the report button is for.
 
If the Fury X had been released at a $499 MSRP on launch, with variants at $379/$429, I feel like the time continuum would have somehow been mended by some Back to the Future voodoo.
 
If the Fury X had been released at a $499 MSRP on launch, with variants at $379/$429, I feel like the time continuum would have somehow been mended by some Back to the Future voodoo.

At $499 it would have been a seller, but at $649 while barely keeping up with a 980 Ti, it was a no-brainer which one to get.
 
If the Fury X had been released at a $499 MSRP on launch, with variants at $379/$429, I feel like the time continuum would have somehow been mended by some Back to the Future voodoo.
It would have needed HDMI 2.0 though.
 
Fury X to me felt like a bit of a proof of concept / trial run for HBM. As an R9 290 owner, I remember getting the upgrade itch at the time and feeling a bit let down by Fury and Fury X. For one, I wanted something with more VRAM, and the performance uplift didn't seem worthwhile enough to justify the upgrade. The performance was certainly there for 1440p, but the VRAM left it in an iffy spot, and I think we're seeing a bit of that now. You wonder what some of these benchmark results would look like if they were rerun at 1080p.

I remember AMD specifically saying they had assigned a few software engineers to work on drivers for the Fury/X to optimize memory usage, and you have to wonder how long those engineers would remain dedicated to GPUs that probably didn't sell in large numbers. Also, the Fury/X was based on GCN3, which, just like the R9 285, seemed to get the short end of the stick in terms of driver optimizations. I remember certain games performing better on the 280/280X (7950/7970) than on the 285 (Tonga), specifically because a lot of games hadn't been optimized for GCN3 while they had been for the earlier GCN revisions (Tahiti/Hawaii). I think this sort of remains true, in that AMD is probably prioritizing driver optimizations and bug fixes for the most widely used cards, and the Fury cards certainly aren't all that common.

I worry a bit about Vega, not so much in terms of AMD's driver work but more on the developer side: with so few cards in actual gaming machines and more of them in mining farms, you have to wonder how high a priority it will be for developers to make sure their games run well on the AMD side.

I think Fury at $399 and Fury X at $549 would have been competitive. I also think they should have allowed AIBs to sell the Fury X with a beefy air cooler, since there were definitely some cooling solutions that could have handled it, and default watercoolers are a no-go for some users.
 
Once the Nano dropped in price, that made it competitive, IMO. RTG's mistake was thinking many people would look at an SFF card and decide the small size was worth the $200 premium they charged for it.
 
Once the Nano dropped in price, that made it competitive, IMO. RTG's mistake was thinking many people would look at an SFF card and decide the small size was worth the $200 premium they charged for it.

I have absolutely no problem with this if the performance is there :)

[i.e., if Nvidia had decided to put the requisite amount of HBM on a GP102 or GP104 package (as a performance metric) and shipped a small version, assuming they could keep pricing reasonable, that would be extremely attractive...]
 
If the Fury X had been released with 8GB of HBM, the cards would have still been competitive.

Slightly more competitive; they'd still be slower and hotter, and they'd be even more expensive to produce.

Still would have been nice to see though!
 
I made no threats; I don't have the power to ban anyone. I merely expressed my shock.

How about I make a thread that the GTX 680 is aging terribly in 2018? It would be pretty fucking useless, right?

Well, a long, long time ago the idea of only relevant threads flew out the window on this forum. Most of the largest threads on here have absolutely zero relevance, or value for that matter.
 
Lol, the Fury X (and most AMD cards) are obsolete by the time they are released. They have been behind the ball for over a decade.
This is just completely false. The 7970 GHz put a beating on the GTX 680, and the gap only widened with time. Now the GTX 680 loses to the 7950 most of the time. Plenty of other cards follow the same trend at their respective price points.
 
The 7970 GHz put a beating on the GTX 680

Most certainly not a beating; slightly faster, sure (if you liked AMD's drivers :ROFLMAO:), but it did beat it in noise: four times as loud!

...and that's compared to higher-spec'd GTX 680s that actually competed with AMD's late-to-market heat blower.

Now the GTX 680 loses to the 7950 most of the time. Plenty of other cards follow the same trend at their respective price points.

That's a nice assertion you have there!
 
That's a nice assertion you have there!

In fact, he is right about how the GTX 680 plays modern games versus the 7970 or 7950. There are cases like Far Cry 4, where the 7970 miserably beat the GTX 780. One of the little advantages of GCN being rehashed so many times is that most improvements and optimizations keep benefiting older GCN parts. However, once AMD releases a new architecture, all of their users will be forced to upgrade soon, as they will face the same fate; we've even seen that from AMD since Polaris, as optimizations are made exclusively for the newer GPUs while the older ones are forgotten.
 
I don't see any facts - I see assertions (and I'm repeating myself). I also don't really care, as just about everything of that age is relatively 'slow'.

Fast fact, lol.. yeah, everything of that age is actually slow, but while at 1080p an HD 7970/R9 280X (especially when overclocked) can still handle most recent games decently at acceptable quality, a GTX 680/GTX 770 cannot, and that's a fact.

[attached benchmark screenshot]
 
I see a screenshot of them 'handling' it just fine, which refutes your 'fact' with your own 'support', lol...

Also, those Nvidia cards can overclock too ;)
 
This is just completely false. The 7970 GHz put a beating on the GTX 680, and the gap only widened with time. Now the GTX 680 loses to the 7950 most of the time. Plenty of other cards follow the same trend at their respective price points.
In a recent benchmark of recent titles, the 280X is just 10% faster than the 680, and that's with a heavily skewed game selection that favors AMD (their words, not mine). So with a balanced selection, both would be comparable.
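
The "skewed selection" point is just arithmetic over the game list. A toy C++ sketch with invented per-game ratios (not real benchmark numbers) shows how the same two cards can look ~10% apart or effectively tied depending on which games you average:

Code:
// Toy numbers, invented purely to show the arithmetic of the argument:
// the same pair of cards averaged over an AMD-leaning list vs. a
// balanced one. Ratios are (280X fps / 680 fps); >1.0 means 280X ahead.
#include <cmath>
#include <cstdio>
#include <vector>

// Geometric mean of per-game performance ratios.
double GeoMean(const std::vector<double>& ratios) {
    double logSum = 0.0;
    for (double r : ratios) logSum += std::log(r);
    return std::exp(logSum / static_cast<double>(ratios.size()));
}

int main() {
    std::vector<double> amdLeaning = {1.25, 1.20, 1.15, 1.10, 0.95};
    std::vector<double> balanced   = {1.15, 1.05, 1.00, 0.95, 0.90};
    // Prints roughly +12% for the skewed list and ~+1% for the balanced one.
    std::printf("AMD-leaning list: 280X ahead by %.0f%%\n", (GeoMean(amdLeaning) - 1) * 100);
    std::printf("balanced list:    280X ahead by %.0f%%\n", (GeoMean(balanced) - 1) * 100);
}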



In fact, just a year ago both had the same performance:

 
I don't think the Fury X is aging any better, or worse for that matter, than any other product in the past.
 
And an RX 580 is the slowest on your list, and a 1060 is faster than a 980. That particular benchmark is either jacked up or has so many anomalies that I wouldn't consider it wholly representative.
It's also Far Cry 2. A DX9/10 title. A game that was released in 2008. No reputable benchmarking site would bother with that test.

Basically, that benchmark has zero relevance, because the result could be anything from drivers to the GPU architecture not being optimized for the older DX version.
 
If I had bought a Sapphire Nitro Fury as my second card in late 2016, I might have just stuck with the two Nitros, but I had gone with a Tri-X and they just did not look good side by side. As far as mGPU support went, they worked great when supported. Oh well, I am on a Vega 56 now and it works really well.

When the Fury series is properly supported, they work really well, even today.
 
One of the little advantages of GCN being rehashed so many times is that most improvements and optimizations keep benefiting older GCN parts. However, once AMD releases a new architecture, all of their users will be forced to upgrade soon, as they will face the same fate; we've even seen that from AMD since Polaris, as optimizations are made exclusively for the newer GPUs while the older ones are forgotten.
Finally, someone who's actually been paying attention instead of just blindly repeating that stupid "fine wine" crap!
And every time I see AMD fans wishing for a new architecture, I always think to myself, "be careful what you wish for"... LOL
 
Finally, someone who's actually been paying attention instead of just blindly repeating that stupid "fine wine" crap!
And every time I see AMD fans wishing for a new architecture, I always think to myself, "be careful what you wish for"... LOL

Eh, not really, dude. I would rather base my purchases on my personal experience, whether you agree with me or not. Fury still works well, as long as the title is not a Gimp... err, "GameWorks" title. :D :)
 
Eh, not really, dude. I would rather base my purchases on my personal experience, whether you agree with me or not.

Meaning that you abandon all logic, we know :ROFLMAO:

[What's funnier is that GameWorks is implemented using the API - it may offend your religion, but at least it's standards compliant!]
 