AMD, Roy Taylor, the Nano, and the Press @ [H]

"sooner or later some company is going to pull a full-on screw-the-pooch moment and continue to spiral down even after someone realizes the mistake"

This sort of rhetoric might just see HardOCP suffer the same fate!

Not that I would bat an eyelid.

No doubt Nvidia has a somewhat better flagship GPU, but your review of the Fury-X was, to put it mildly, biased, scathing, and REEKED of Nvidia fanboyism.

So what did you expect?

Good thing there are better tech sites out there.

Enjoy the downward spiral.
 
This is your example?! Behold, the absolute slaughter of DX12-incapable Nvidia cards.


94_400_amd-radeon-r9-fury-crossfire-triple-4k-eyefinity-11-520x2160.png

11Kx2.5K res, how many resolutions did they go through before they found one where Fury X won? And they used a medium preset - who wins on a high preset? Finally, how did two Fury X give a better than 2X increase in minimum frame rate over one -- unless the 4G memory limit on the Fury X is holding it back ...

One game, one rarely used and somewhat extreme resolution, medium presets and an anomalous result ... I guess if you twiddle the knobs enough, you can find a setting that gives you the winner you were looking for.

HardOCP would never insult the intelligence of its readers by posting a worthless benchmark result like that.
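The anomaly called out above — two cards more than doubling the minimum frame rate — can be sanity-checked with a few lines of arithmetic. A sketch with made-up frame rates: any superlinear result suggests the single card was bottlenecked by something other than GPU horsepower (e.g. its VRAM).

```python
# Multi-GPU scaling sanity check: two GPUs can deliver at most ~2x the
# frame rate of one. A jump beyond 2x in the minimums usually means the
# single-card run was throttled by something else (e.g. a VRAM limit).
# All frame-rate numbers here are hypothetical, for illustration only.

def scaling_factor(single_fps: float, dual_fps: float) -> float:
    """Ratio of dual-card to single-card frame rate."""
    return dual_fps / single_fps

def looks_anomalous(single_fps: float, dual_fps: float) -> bool:
    """True when scaling exceeds the theoretical 2x ceiling."""
    return scaling_factor(single_fps, dual_fps) > 2.0

# Hypothetical minimums: one card vs. two of the same card.
print(scaling_factor(11.0, 27.0))   # ~2.45 -- better than 2x
print(looks_anomalous(11.0, 27.0))  # True
print(looks_anomalous(30.0, 55.0))  # False -- ordinary sub-2x scaling
```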
 
From the data I provided, the Fury X matches and even beats 980 Ti SLI in performance at 4K, proving that the Fury X's 4GB is more than enough for high-resolution gaming.

Running at medium quality presets doesn't sound like "more than enough" to me. In fact, I'd say it sucks. Why bother with hi res graphics if you have to destroy the quality of the graphics to do so?
 
"sooner or later some company is going to pull a full-on screw-the-pooch moment and continue to spiral down even after someone realizes the mistake" [...] Enjoy the downward spiral.
Lol. You've been here 7 days. This website has been here more than 15 years. Tom's sold out, Anand sold out. This is one of the biggest and most popular tech forums on the net, and it's still owned by its creator. Idiot trolls like you come and go, but this website keeps going.

I laugh even harder that you cry bias. This website has always called out every manufacturer when they fuck up. But of course you wouldn't know that, since you just joined to ignorantly troll. I bet you didn't know that KB himself ran CrossFire in his own personal rig for several years?
 
"sooner or later some company is going to pull a full-on screw-the-pooch moment and continue to spiral down even after someone realizes the mistake" [...] Enjoy the downward spiral.

That's it, Roy--keep trying...
 
"sooner or later some company is going to pull a full-on screw-the-pooch moment and continue to spiral down even after someone realizes the mistake" [...] Enjoy the downward spiral.


Bwahahahaha!!!!

We love [H]ate. Keep pouring it on!
 
"sooner or later some company is going to pull a full-on screw-the-pooch moment and continue to spiral down even after someone realizes the mistake" [...] Enjoy the downward spiral.

Oh, hi there Roy.
 
"sooner or later some company is going to pull a full-on screw-the-pooch moment and continue to spiral down even after someone realizes the mistake" [...] Enjoy the downward spiral.

Damn, you just made me spit out my coffee. 10/10 would read again.
 
"sooner or later some company is going to pull a full-on screw-the-pooch moment and continue to spiral down even after someone realizes the mistake" [...] Enjoy the downward spiral.


By better tech sites I assume you mean the ones that glorify AMD no matter what? The other tech sites that use canned benchmarks?
 
Again, the Fury X in CrossFire wins some and loses some. That's versus a stock-clocked 980 Ti, also.

Here are some 4K benchmarks from various websites and various games. The second set of benchmark links compares Fury X CrossFire to overclocked 980 Ti SLI, where the Fury X gets hammered by the overclock.

http://www.hardwareluxx.com/index.p...cf-vs-gtx-980-ti-sli-overclocked.html?start=5

http://www.hardwareluxx.com/index.p...cf-vs-gtx-980-ti-sli-overclocked.html?start=6

http://www.hardwareluxx.com/index.p...f-vs-gtx-980-ti-sli-overclocked.html?start=10

http://www.hardwareluxx.com/index.p...f-vs-gtx-980-ti-sli-overclocked.html?start=11

https://www.youtube.com/watch?v=rUY32Mq4dlY

Fury X Crossfire vs TitanX SLI
 
11Kx2.5K res, how many resolutions did they go through before they found one where Fury X won? And they used a medium preset - who wins on a high preset? Finally, how did two Fury X give a better than 2X increase in minimum frame rate over one -- unless the 4G memory limit on the Fury X is holding it back ...

One game, one rarely used and somewhat extreme resolution, medium presets and an anomalous result ... I guess if you twiddle the knobs enough, you can find a setting that gives you the winner you were looking for.

HardOCP would never insult the intelligence of its readers by posting a worthless benchmark result like that.

Running at medium quality presets doesn't sound like "more than enough" to me. In fact, I'd say it sucks. Why bother with hi res graphics if you have to destroy the quality of the graphics to do so?

I posted more than one review site... How about you check them out as well before you start jumping on me and TweakTown over "tweaking" graphics settings to paint the Fury X in a better light.
 
By better tech sites I assume you mean the ones that glorify AMD no matter what? The other tech sites that use canned benchmarks?

Most websites by now post videos of their benchmark runs or YouTube videos of gameplay. I think Anandtech, [H]ardOCP, and Tom's should step it up.
 
Most websites by now post videos of their benchmark runs or YouTube videos of gameplay. I think Anandtech, [H]ardOCP, and Tom's should step it up.

Can you show me one? The only YouTube benchmark runs I have seen are people making videos of themselves running... canned benchmarks.
 

Did you take the time to look at these reviews before you make such ridiculous statements?

The first one, in Korean, is just an aggregate score of accumulated frame rates. There is no detail there, and it does not provide an accurate representation of performance.

It's cool, but he needs to go into more detail to really make his review worthwhile.

On the second site, the games are basically all unplayable at that resolution, with the exception of BF4 and Tomb Raider.

The Fury X is:
Slower in the benchmark utility.
Faster in BF4.
About the same in Metro.
Arguably faster in Shadow of Mordor, though you could also say the 980 Ti is the playable one.
The same story in Thief; I would lean toward the 980 Ti as faster and almost playable.
Slower in Tomb Raider.
Faster in BioShock.

So it's faster in two, about the same in one, slower in two, and in another two it's up for debate, but I would give it to the 980 Ti for coming closest to being playable.

The third site is useless because they just test the Fury X by itself in CrossFire and don't compare it to any SLI cards; also, we see some terrible minimum frame rates, which I would say points to the 4GB memory being an issue.

And the fourth site, Techspot, says in its conclusion that the 980 Ti is better than the Fury X.

The Fury X does seem to have the potential to scale better than SLI, which is cool, but I will reserve my opinion until I see the [H] review.

From what I can tell so far, an OC'd 980 Ti, which you can buy today, is faster than a Fury X, which is out of stock or horribly overpriced.
 
Running at medium quality presets doesn't sound like "more than enough" to me. In fact, I'd say it sucks. Why bother with hi res graphics if you have to destroy the quality of the graphics to do so?

Having upgraded to a 4K monitor 2 months ago, my impression in a wide range of games (from WoW to GTA5) is that as long as you can maintain the highest level of textures, resolution trumps settings.

I'd rather play a game at 4K with max textures and everything else low than at 1080p with everything max.

Just my personal preference though.
 
Did you take the time to look at these reviews before you make such ridiculous statements?

The first one, in Korean, is just an aggregate score of accumulated frame rates. There is no detail there, and it does not provide an accurate representation of performance.

It's cool, but he needs to go into more detail to really make his review worthwhile.

On the second site, the games are basically all unplayable at that resolution, with the exception of BF4 and Tomb Raider.

The Fury X is:
Slower in the benchmark utility.
Faster in BF4.
About the same in Metro.
Arguably faster in Shadow of Mordor, though you could also say the 980 Ti is the playable one.
The same story in Thief; I would lean toward the 980 Ti as faster and almost playable.
Slower in Tomb Raider.
Faster in BioShock.

So it's faster in two, about the same in one, slower in two, and in another two it's up for debate, but I would give it to the 980 Ti for coming closest to being playable.

The third site is useless because they just test the Fury X by itself in CrossFire and don't compare it to any SLI cards; also, we see some terrible minimum frame rates, which I would say points to the 4GB memory being an issue.



We must be looking at things through different glasses, because I judge things by how the data is represented. What's wrong with the first link? That it's not an American site?!

The data for 10-13 different games is there; Fury X CrossFire is equal at 2560p and ahead of 980 Ti SLI at 4K resolution. Please note that there are hardly any Shamuworks games tested.






And the fourth site, Techspot, says in its conclusion that the 980 Ti is better than the Fury X.

The Fury X does seem to have the potential to scale better than SLI, which is cool, but I will reserve my opinion until I see the [H] review.

From what I can tell so far, an OC'd 980 Ti, which you can buy today, is faster than a Fury X, which is out of stock or horribly overpriced.

Hey, I can cherry-pick conclusions too. Let's take this from TweakTown, whose data you obviously don't agree with for whatever reason.

Final Thoughts
Here we are again, in our final thoughts. Last time around, I wasn't too impressed with what the single AMD Radeon R9 Fury X had to offer at 11,520x2160. But with two of them in CrossFire, I'm blown away. We're looking at equal to, or better than GTX 980 Ti SLI or Titan X SLI, which is really saying something.

Read more: http://www.tweaktown.com/tweakipedi...re-triple-4k-eyefinity-11-520x2160/index.html
 
We must be looking at things through different glasses, because I judge things by how the data is represented. What's wrong with the first link? That it's not an American site?!

The data for 12 different games is there; Fury X CrossFire is equal at 2560p and ahead of 980 Ti SLI at 4K resolution. Please note that there are hardly any Shamuworks games tested.

You are ignoring all of the English reviews that tell you precisely what settings they ran, etc. You are not addressing any of the reviews presented in English that show the Fury X +/- a 980 Ti, and that once the 980 Ti is overclocked, the Fury X gets smoked. Instead you are grasping at a single review done by a Korean guy on a forum, and a review at 11500x2160 that shows the nice multi-card scaling of the Fury X but is meaningless otherwise.

And stop with the 11000+ resolution comparison. 4K is what matters here. Just look at this review: stock Fury X CrossFire vs. stock 980 Ti SLI is neck and neck in most games. Overclock the 980 Ti and the Fury X gets crushed.

Stock clock vs. stock clock? They are generally close. Pardon the images; I've cropped and linked them directly since some folks seem to be having issues seeing them.

http://www.hardwareluxx.com/index.p...-fury-x-cf-vs-gtx-980-ti-sli-overclocked.html


[eight cropped benchmark charts from the Hardwareluxx review]
 
We must be looking at things through different glasses, because I judge things by how the data is represented. What's wrong with the first link? That it's not an American site?!

The data for 10-13 different games is there; Fury X CrossFire is equal at 2560p and ahead of 980 Ti SLI at 4K resolution. Please note that there are hardly any Shamuworks games tested.




I thought he explained what was wrong with the first link pretty well. The information provided is entirely worthless: total combined frame rates and percentages mean pretty much jack shit. A game-by-game breakdown is essential. Example: at 4K, how many of those games were actually running at playable frame rates? What was the game-to-game performance difference between single, dual, triple, and quad?
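The objection above — that a combined frame total hides per-game playability — is easy to demonstrate with toy numbers. A sketch; all titles and frame rates below are invented for illustration:

```python
# A single aggregate frame-rate total can crown a "winner" that is
# unplayable in most of its games. Titles and numbers are made up.

PLAYABLE_MIN = 30.0  # a common 30 fps playability threshold

def summarize(results):
    """Return (aggregate fps total, list of unplayable titles)."""
    aggregate = sum(results.values())
    unplayable = [game for game, fps in results.items() if fps < PLAYABLE_MIN]
    return aggregate, unplayable

card_a = {"Game 1": 150.0, "Game 2": 15.0, "Game 3": 13.0}
card_b = {"Game 1": 90.0, "Game 2": 32.0, "Game 3": 31.0}

total_a, bad_a = summarize(card_a)  # 178.0 total, two unplayable games
total_b, bad_b = summarize(card_b)  # 153.0 total, zero unplayable games
print(total_a > total_b, bad_a, bad_b)
```

Card A "wins" the aggregate while being unplayable in two of three games; the per-game breakdown tells the opposite story.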
 
Alien, Battlefield 4, BI, Crysis 3, Sleeping Dogs, and Sniper Elite 3 are all Gaming Evolved games. I can see it a mile away; I don't even need to check AMD's website.
Whoever made that list went out of their way to favor AMD, which makes me skeptical of the other results. Why didn't they test anything released within the last year? Because they're all GameWorks games.

People gotta try at least a little bit harder, sheesh.
 
Alien, Battlefield 4, BI, Crysis 3, Sleeping Dogs, and Sniper Elite 3 are all Gaming Evolved games. I can see it a mile away; I don't even need to check AMD's website.
Whoever made that list went out of their way to favor AMD, which makes me skeptical of the other results. Why didn't they test anything released within the last year? Because they're all GameWorks games.

People gotta try at least a little bit harder, sheesh.

So the review is invalid because it isn't stacked with nVIDIA games? I don't see people crying foul when the only games reviewed are GameWorks titles...
 
Two things.

1) This write-up does come off as really bitchy; prefacing it with a claim that it isn't doesn't change the content.

2) I hate AMD, but without them the consumer will be in real trouble. We need them. They need to get their shit together and start getting back in the game. I do not want to pay $1000 for just a regular old GTX 980-type card in the future. It's already dangerously close, and that's with some form of competition.
 
So the review is invalid because it isn't stacked with nVIDIA games? I don't see people crying foul when the only games reviewed are GameWorks titles...

You don't do yourself any credit.
A balance of game reviews that is not one-sided.
In case you didn't spot it, [H] reviewed GameWorks games with GameWorks enabled/disabled and showed the difference.

Not to worry, the [H] review is inbound.
Enjoy.
 
You are ignoring all of the English reviews that tell you precisely what settings they ran, etc. You are not addressing any of the reviews presented in English that show the Fury X +/- a 980 Ti, and that once the 980 Ti is overclocked, the Fury X gets smoked. Instead you are grasping at a single review done by a Korean guy on a forum, and a review at 11500x2160 that shows the nice multi-card scaling of the Fury X but is meaningless otherwise.

And stop with the 11000+ resolution comparison. 4K is what matters here. Just look at this review: stock Fury X CrossFire vs. stock 980 Ti SLI is neck and neck in most games. Overclock the 980 Ti and the Fury X gets crushed.

Stock clock vs. stock clock? They are generally close. Pardon the images; I've cropped and linked them directly since some folks seem to be having issues seeing them.

http://www.hardwareluxx.com/index.p...-fury-x-cf-vs-gtx-980-ti-sli-overclocked.html




[two cropped benchmark charts from the Hardwareluxx review]



It's all here.
 
So the review is invalid because it isn't stacked with nVIDIA games? I don't see people crying foul when the only games reviewed are GameWorks titles...
I would rather see a review stacked with games that people are actually playing than reviewers intentionally going back three years to cherry-pick AMD titles for the express purpose of slanting their review. Where's GTA V? It's not a GameWorks or GE game, but it still favors Nvidia... excluded from their review.

If someone is going to be biased, then they at least need to attempt to hide it. They're only missing Tomb Raider and Hitman to complete the GE line-up. Unless you plan on solely playing Gaming Evolved games and zero GameWorks games, that review is useless to everybody.

It's just another example where all I can say is: How dumb do you think we are? It's almost insulting.
 
David, do you ever post videos of your test runs? It'd be nice to see what's considered playable vs unplayable (as I suspect different players have different standards). Note this really has nothing to do with this specific thread.

Thanks

For as long as I've been a reader here (and for as long as I've been a reviewer here, which is a shorter period than my time as a reader), we have not posted videos of playable vs. not. I would also say that in many cases you cannot visualize a game not being playable, as a lot of it has to do with the response and feel of how the game plays. That's not something that can be easily measured and reported (or even captured on video). That judgment comes from us being gamers and knowing how games should respond to our inputs.

If anything, our best playable settings are more conservative than they could be, as our goal with them is that any of our readers can take that particular card (and comparable system setup) and be able to play through the entire game at those settings with a good experience.
 
It's all here.


Still not seeing minimum frames. Look at the minimum frames in The Witcher 3: Fury X CrossFired and overclocked, 31 FPS; overclocked 980 Ti in SLI, 61 FPS.

You are delusional if you think crossfired Fury X can keep up with overclocked 980 Ti in SLI. No use trying to convince you otherwise. Keep learning Korean and playing 4 year old games, I guess.
 
You are delusional if you think crossfired Fury X can keep up with overclocked 980 Ti in SLI. No use trying to convince you otherwise. Keep learning Korean and playing 4 year old games, I guess.
I don't know about that review in particular, but the general consensus is that multiple Fury Xs scale better than 980 Tis. Not sure about overclocked; I would assume it would help close the gap.

I haven't seen anything to prove otherwise. That Korean review (or whatever language it is) was just a bad choice.
 
Still not seeing minimum frames. Look at the minimum frames in The Witcher 3: Fury X CrossFired and overclocked, 31 FPS; overclocked 980 Ti in SLI, 61 FPS.

You are delusional if you think crossfired Fury X can keep up with overclocked 980 Ti in SLI. No use trying to convince you otherwise. Keep learning Korean and playing 4 year old games, I guess.

You really want me to? Ok.... I don't see this lopsided performance that you speak of.

Witcher_01.png


Witcher_02.png


The Witcher 3: Wild Hunt shows competitive performance between the SLI and Crossfire setups. Both delivered the same minimum frame rate though the R9 Fury X Crossfire cards were just 4% slower when comparing the average frame rate.
:D
 
We must be looking at things with different glasses because I see things for the way data is represented. What's wrong with the first link? It's not an American site?!

The data for 10 -13 different games is there; Fury X Crossfire is equal at 2560p and ahead of 980TI SLI at 4K resolution. Please note that there is hardly any games with Shamuworks tested.
Hey, I can cherrypick conclusions too. Lets take this from TweakTown, whose data you obviously don't agree with for whatever reason.

No problems with it being a non American site, I just was using that as a frame of reference.
But the data is totally useless: it gives us no min/max, just a total frame score; it offers no data that is useful in describing the user experience.

I never said I disagree with TweakTown's data; I disagree with their conclusion.

As I said, the only two games that are playable are BF4 and Tomb Raider; the Fury X wins one and the 980 Ti wins the other.

And the 980 Ti is the only card that comes close to playable in Thief and Shadow of Mordor at that res.

You can't play a game that runs at 7 or 13 FPS, though you come close at 28 and 27. Averaged FPS doesn't mean anything if the game chops up.

So for them to say the Fury X is better is wrong; at best it's about the same as the 980 Ti.
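The point about averages hiding chop can be shown with a toy frame-rate trace. A sketch; the fps samples are invented for illustration:

```python
# An average fps figure can look healthy while the game still chops:
# a handful of deep dips barely move the average but wreck the minimum.
# The fps samples below are invented for illustration.

samples = [60.0] * 57 + [7.0, 9.0, 13.0]  # 60 one-second fps samples

average = sum(samples) / len(samples)
minimum = min(samples)

print(f"average: {average:.1f} fps")  # 57.5 -- looks fine on paper
print(f"minimum: {minimum:.1f} fps")  # 7.0  -- chops up in practice
```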

Not to even bring up the overclocking, but that's completely different.

EDIT: I guess I have to...
 
You really want me to? Ok.... I don't see this lopsided performance that you speak of.

:D

How about you read the conclusion from the site you just linked the graphs for?

http://www.techspot.com/review/1033-gtx-980-ti-sli-r9-fury-x-crossfire/page7.html

If we go back and look at the average frame rate performance of each game while also taking note of the minimum frame rates we see that the GTX 980 Ti SLI setup delivered very playable performance in seven of the 10 games, the Fury X Crossfire cards on the other hand provided what we consider to be very playable performance in six of the 10 games while remaining playable in the rest.

Gamers wanting to play at 4K will be happy with either setup overall, but we feel Nvidia offers a more consistent gaming experience while allowing for an additional 15% performance bump through overclocking. Normally we don't place so much emphasis on overclocking, but we feel those seeking an enthusiast multi-GPU setup are probably able and willing to enjoy the benefits of overclocking.
 
You really want me to? Ok.... I don't see this lopsided performance that you speak of.

I said the 980 Ti SLI and Fury X CrossFire traded blows in games at 4K, to which you replied that the Fury X crushed the 980 Ti and Titan X. Now you are linking articles supporting my very argument. At stock clocks, Fury X CrossFire and 980 Ti SLI are +/- a few frames.

I'm glad you agree now.

That being said, the graph below is more or less the Techspot performance at stock clocks, except the review actually overclocked the 980 Ti to a reasonable level instead of the factory OC you demonstrated. Showing what the cards will do with an everyday enthusiast overclock is important, I think. There is 10-20% performance headroom on the 980 Ti. This easily overcomes any competition there would have been from the Fury X, as it's already topped out.

[cropped Hardwareluxx 4K benchmark chart]
 
It's especially important when you consider the Fury X comes with factory water cooling and no other options, while most reviews compare it to a stock, reference 980 Ti, which is already suffering from severe limitations. Too many people take everything at face value.

Still having trouble convincing the AMD masses that Maxwell's overclockability is not just a talking point. The typical response is "Well, the Fury X can be overclocked, too." It's a huge part of the reason I have a 980 Ti in my rig right now; the purchase was made almost solely on Maxwell's headroom. The 980 Ti gains upwards of 30% from overclocking (over reference), and you just can't ignore that.
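The headroom argument above is just percentage arithmetic. A sketch: the ~30% figure is the poster's claim for the 980 Ti; the shared stock frame rate and the 5% figure for a card near its ceiling are hypothetical.

```python
# Why overclocking headroom changes the comparison: two cards that tie
# at stock diverge once each one's remaining headroom is applied.
# The stock frame rate and the 5% headroom figure are hypothetical.

def with_overclock(stock_fps: float, headroom_pct: float) -> float:
    """Projected fps after applying a percentage overclock gain."""
    return stock_fps * (1.0 + headroom_pct / 100.0)

stock_fps = 60.0  # suppose both cards average 60 fps at stock
print(with_overclock(stock_fps, 30.0))  # 78.0 -- the claimed ~30% headroom
print(with_overclock(stock_fps, 5.0))   # 63.0 -- a card already near its ceiling
```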
 
I would rather see a review stacked with games that people are actually playing, than reviewers intentionally going back 3 years to cherry pick AMD titles for the expressed purpose of slanting their review. Where's GTA V? It's not a GameWorks nor GE game, but it still favors Nvidia... Excluded from their review.

If someone is going to be biased then they at least need to attempt to hide it. They're only missing Tomb Raider & Hitman to complete the GE line-up. Unless you plan on solely playing Gaming Evolved games and zero GameWorks games, that review is useless to everybody.

It's just another example where all I can say is: How dumb do you think we are? It's almost insulting.

This right here. Give this man a cigar.
 
"sooner or later some company is going to pull a full-on screw-the-pooch moment and continue to spiral down even after someone realizes the mistake" [...] Enjoy the downward spiral.

dr-strangelove-stanley-kubrick-2.jpg
 

I read the conclusions of this and the Hardware Slave review. I think the problem I have with their final conclusions (and those of the other sites so far) is that they are allowing themselves to follow the AMD script.

"Imagine a m-itx case where only a card this size would fit. Now, isn't it the most powerful card you can buy?"

"Why, yes! Yes it is! Top Award!"

Coil whine, check. No HDMI 2.0, check. Price, check. (Price check on aisle 3?) But according to the rules set by the script, none of these things matter.

Can't wait to see what a real [H] review reveals, when the script goes in the toilet where it belongs.
 
"sooner or later some company is going to pull a full-on screw-the-pooch moment and continue to spiral down even after someone realizes the mistake" [...] Enjoy the downward spiral.

When the world ends it's going to be Kyle, Steve, and Keith Richards. No one else will survive.
 