AMD Radeon R9 Fury X CrossFire at 4K Review @ [H]

I've seen reviews of curved TVs that say they don't offer any benefit in a living room setup because you sit too far away to get any peripheral effect. The screen would have to be huge for it to be noticeable.

Does the closer viewing distance of a monitor make the curve effect, ummm, effective? ;)

Thanks to you and Brent for personally responding and giving me some big-picture (so to speak) insight on the market today.

I see no benefit to a curved TV in a living-room-type experience where you are sitting 12 to 15 feet or more away.

Sitting two feet away, I would suggest it does make a positive difference for me.

With the quality of 4K screens improving for gaming, it is fairly hard to argue for buying into a new Eyefinity or Surround setup. I have been waiting 6 years to replace my setup with a 4K screen that gives me excellent PPI, is over 40 inches, and does not give me mouse lag. That said, I am sure there are some gamers who will still find benefit in the "ultrawide" field of view that triple displays can provide. 5760 pixels wide can still give you a gaming advantage in many titles.

There is a ton of up-to-date information in this thread in our Displays forum.
 
I know this is nitpicking about not using the latest drivers, but Cat 15.9.1 fixes the CrossFire problems in Far Cry 4 under Win10, even though there's no release note for that.

Don't know if they are just getting lazy with those release notes :confused:
 
A curved LCD in a living room seems like it would only exacerbate viewing-angle issues, since multiple viewers would be watching the TV from different angles.

But a 4K curved TV as your personal monitor, where you're dead center and sitting close? Probably pretty great.

Can't wait for a single-GPU 4K solution with a big-ass 4K TV. I'm still concerned about 4K LCD price/image quality/black levels/response time/Hz/backlight uniformity.

Man, I wish people still made plasmas, in all their heavy, power-sucking glory.
 
For future reference, the calculation is easy: horizontal x vertical resolution = total pixels.

Literally, 3840 multiplied by 2160 = 8,294,400 total pixels.

Well, in my case, it was more a matter of not remembering what the usual resolution numbers were for surround and being too lazy to go back and look them up, LOL!
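
For anyone comparing setups, here's a quick back-of-the-envelope sketch (plain Python, nothing from the review; the surround figures assume standard triple 1920-wide panels):

```python
# Total pixels = horizontal resolution x vertical resolution.
resolutions = {
    "4K UHD": (3840, 2160),
    "Triple 1080p surround": (5760, 1080),
    "Triple 1200p surround": (5760, 1200),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w} x {h} = {w * h:,} pixels")

# 4K UHD: 3840 x 2160 = 8,294,400 pixels
# Triple 1080p surround: 5760 x 1080 = 6,220,800 pixels
# Triple 1200p surround: 5760 x 1200 = 6,912,000 pixels
```

So a single 4K panel actually pushes more pixels than either common surround layout.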
 
I know this is nitpicking about not using the latest drivers, but Cat 15.9.1 fixes the CrossFire problems in Far Cry 4 under Win10, even though there's no release note for that.

Don't know if they are just getting lazy with those release notes :confused:

We had already started this review before that driver was released. Had we had any communication from AMD on FC4 changes, we would have used that driver for the review. That said, we currently have no idea whether it fixes the problem or not.
 
I stopped looking at the review after Project Cars. How can a Titan X with only 9% more cores outperform the 980 Ti by up to 19%?
 
I stopped looking at the review after Project Cars. How can a Titan X with only 9% more cores outperform the 980 Ti by up to 19%?

Texture units also have an effect (and it can be huge in some games). In the end, the 980 Ti is down a total of 256 shaders but also 16 TMUs, and we don't know exactly what the clock-for-clock situation was at the moment of the test.
 
I stopped looking at the review after Project Cars. How can a Titan X with only 9% more cores outperform the 980 Ti by up to 19%?

It DOES have twice the RAM, you know? Maybe it's a RAM-limited title that turns RAM into frames? (Stop laughing, I'm being serious.)

Seriously, when a game engine loads, it checks the amount of GPU memory available and then loads it up as it sees fit. Obviously there will be less swapping and more consistency when you have a huge tank of RAM space and the game engine is able to take advantage of it.
 
Texture units also have an effect (and it can be huge in some games). In the end, the 980 Ti is down a total of 256 shaders but also 16 TMUs, and we don't know exactly what the clock-for-clock situation was at the moment of the test.
Yes, of course I know it also has 9% more TMUs, but that changes nothing at all. It is still just 9% more resources at most. If a card of the same architecture has 50% more of everything, then it's typically going to be 50% faster at most, and in reality it does not even scale that well. Each metric of a card does not get to magically scale on its own, or cards would be many, many times faster than they are today. If anything, those 9% more cores and TMUs stand basically no chance of even making a full 9% difference, since ROPs and memory bandwidth do not change. Nearly every review shows a 4-5% overall difference between a Titan X and a 980 Ti, even at 4K, which makes sense. There is simply no logical way the Titan X can be 19% faster than the 980 Ti.

Here you can see not even a 1% difference in Project Cars at 4K between the 980 Ti and the Titan X: http://www.techpowerup.com/reviews/AMD/R9_Nano/20.html
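
For reference, a quick sketch of the raw spec ratios (public spec-sheet numbers; this assumes equal clocks, which real-world boost behavior won't honor exactly):

```python
# Public spec-sheet numbers for the two GM200 cards.
titan_x  = {"cores": 3072, "tmus": 192, "rops": 96, "mem_bw_gb_s": 336.5}
gtx980ti = {"cores": 2816, "tmus": 176, "rops": 96, "mem_bw_gb_s": 336.5}

for key in titan_x:
    ratio = titan_x[key] / gtx980ti[key]
    print(f"{key}: +{(ratio - 1) * 100:.1f}%")

# cores: +9.1%
# tmus: +9.1%
# rops: +0.0%
# mem_bw_gb_s: +0.0%
```

At equal clocks, identical ROPs and bandwidth cap the theoretical shader-bound uplift at about 9%; VRAM capacity (12 GB vs. 6 GB) is the only spec that differs by more.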
 
Yes, of course I know it also has 9% more TMUs, but that changes nothing at all. It is still just 9% more resources at most. If a card of the same architecture has 50% more of everything, then it's typically going to be 50% faster at most, and in reality it does not even scale that well.
You don't know what you are talking about.

Each metric of a card does not get to magically scale on its own, or cards would be many, many times faster than they are today. If anything, those 9% more cores and TMUs stand basically no chance of even making a full 9% difference, since ROPs and memory bandwidth do not change.
What do you mean? Metric? If you mean texture power vs. pixel-pushing power, those differ from GPU family to GPU family and between AMD and NVIDIA. And that affects the performance of different games differently.

The higher the resolution, the more important VRAM capacity and speed are. The Titan X has double the VRAM of the 980 Ti. Does that mean it should be twice as fast? By your faulty logic, yes.

Nearly every review shows a 4-5% overall difference between a Titan X and a 980 Ti, even at 4K, which makes sense. There is simply no logical way the Titan X can be 19% faster than the 980 Ti.

Here you can see not even a 1% difference in Project Cars at 4K between the 980 Ti and the Titan X: http://www.techpowerup.com/reviews/AMD/R9_Nano/20.html

Go read the article again, because nowhere does it say the Titan X is 19% faster than the 980 Ti in Project Cars.
 
You don't know what you are talking about.


What do you mean? Metric? If you mean texture power vs. pixel-pushing power, those differ from GPU family to GPU family and between AMD and NVIDIA. And that affects the performance of different games differently.

The higher the resolution, the more important VRAM capacity and speed are. The Titan X has double the VRAM of the 980 Ti. Does that mean it should be twice as fast? By your faulty logic, yes.



Go read the article again, because nowhere does it say the Titan X is 19% faster than the 980 Ti in Project Cars.
No, you are the one who does not know what you are talking about. You also seem to fail at basic reading comprehension. :rolleyes:

And here it's just a hair under 19% faster. Anyway, unsubscribing, so knock yourself out with a useless reply.


http://s18.postimg.org/uelpu832h/1444048041_Pa_XLI3_YOE9_3_6.gif
 
Shooters and driving sims are still going to benefit from those 5760-wide resolutions.
 
You could get a really big 4K curved screen and use a letterbox resolution to give a similar effect.
There would be a performance benefit too.
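
To put a rough number on that performance benefit, a quick sketch (the 3840x1600 letterbox height is my assumption for a ~2.4:1 picture, not a standard game setting):

```python
# Full 16:9 UHD panel vs. a ~2.4:1 letterboxed slice of it.
full      = 3840 * 2160   # 8,294,400 pixels
letterbox = 3840 * 1600   # 6,144,000 pixels

print(f"Letterboxing renders {1 - letterbox / full:.0%} fewer pixels")
# Letterboxing renders 26% fewer pixels
```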
 
I stopped looking at the review after Project Cars. How can a Titan X with only 9% more cores outperform the 980 Ti by up to 19%?

The extra VRAM in a multi-GPU setup helps a lot, since multi-GPU uses more VRAM than a single-GPU setup due to its nature.

Not sure what you are getting butt-hurt about...? Regret not getting a Titan X? :p
 
You could get a really big 4K curved screen and use a letterbox resolution to give a similar effect.
There would be a performance benefit too.

Those Korean FreeSync monitors are supposed to come in 65" sizes in the future. They already have the 55" ones on eBay.
 
No, you are the one who does not know what you are talking about. You also seem to fail at basic reading comprehension. :rolleyes:

And here it's just a hair under 19% faster. Anyway, unsubscribing, so knock yourself out with a useless reply.

http://s18.postimg.org/uelpu832h/1444048041_Pa_XLI3_YOE9_3_6.gif

You realize that is with DS2X enabled, which doubles the resolution and downsamples it back to the rendered resolution: essentially SSAA at freaking 4K. This is a very VRAM-heavy setting; if you think VRAM capacity isn't affecting performance at a setting like this, you are mistaken. We know the 980 Ti is at best within 10% of the TITAN X in single-GPU, so in a best-case GPU-limited scenario a 20% gap in SLI makes logical sense, since we are doubling everything. This game scales very well. The result is what it is.
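
For a sense of scale, a sketch of what one render target costs (assuming DS2X renders at twice the resolution per axis; the setting isn't documented in detail, so treat this as illustrative):

```python
# Rough cost of one 32-bit render target at 4K vs. a 2x-per-axis supersample.
def buffer_mb(w, h, bytes_per_pixel=4):
    return w * h * bytes_per_pixel / 2**20

print(f"4K target:   {buffer_mb(3840, 2160):.0f} MB")  # ~32 MB
print(f"DS2X target: {buffer_mb(7680, 4320):.0f} MB")  # ~127 MB
```

Multiply that by the several G-buffer and post-processing targets a modern renderer keeps resident, and the gap between 6 GB and 12 GB starts to matter.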
 
You could get a really big 4K curved screen and use a letterbox resolution to give a similar effect.
There would be a performance benefit too.

I wonder when we'll see 2160p ultrawide (5120x2160) monitors... I could imagine a 45" ultrawide with 110 PPI... THAT would be amazing...
 
An 8K 120Hz Oculus Rift would be amazing.

Once we have graphics cards capable of pushing 90+ frames to that kind of resolution, I'd agree. Right now, it'd probably be an introduction to hating life brought to you by instant nausea, lol.

OOOO NEW USE FOR RIFT! Interrogation device! Could you imagine how terrible it would be if they pinned your eyelids open and stuck you in some sort of 'vomit simulation'!
 
Have a video going on the Rift of people drowning around you, place a wet towel across your nose and mouth, then slowly trickle water onto the towel to simulate drowning. ;)
 
Once we have graphics cards capable of pushing 90+ frames to that kind of resolution, I'd agree. Right now, it'd probably be an introduction to hating life brought to you by instant nausea, lol.

OOOO NEW USE FOR RIFT! Interrogation device! Could you imagine how terrible it would be if they pinned your eyelids open and stuck you in some sort of 'vomit simulation'!

For 8K 120Hz, you would need the video card pushing 240fps at that resolution (120Hz per eye).

As we are having issues even getting close to 240fps at 1080p, massive improvements in CPU and GPU power would be required (~30x better GPU performance than the current top end). You are talking about a long time away unless some insane breakthrough occurs.
 
For 8K 120Hz, you would need the video card pushing 240fps at that resolution (120Hz per eye).

As we are having issues even getting close to 240fps at 1080p, massive improvements in CPU and GPU power would be required (~30x better GPU performance than the current top end). You are talking about a long time away unless some insane breakthrough occurs.

That's not quite right. We typically cut a resolution in half on VR devices because a single eye only gets about half the screen. The refresh is happening, from top to bottom, across the entire surface, so even if one eye only gets half the screen, it's still getting refreshed at the full panel refresh rate.

Regardless, it still holds true that greater power in GPUs is needed (as well as a cable to transfer all that data to the display) to deliver 8K at an acceptable frame rate for VR.
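
To put rough numbers on that (a sketch following the correction above: the panel refreshes once per frame with both eyes' views in it):

```python
# Total throughput is panel pixels x refresh, not 2x the refresh.
def gpix_per_s(w, h, hz):
    return w * h * hz / 1e9

print(f"8K @ 120Hz VR: {gpix_per_s(7680, 4320, 120):.2f} Gpix/s")  # 3.98
print(f"4K @ 60Hz:     {gpix_per_s(3840, 2160, 60):.2f} Gpix/s")   # 0.50
```

That's roughly 8x the raw pixel throughput of today's 4K 60Hz target, before counting the geometry that still has to be rendered once per eye.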
 
It would be interesting to see a review aimed at the 144Hz 1440p crowd.

I agree with this. We should be obsessed with Hz, not resolution.

I get that, from a GPU review standpoint, you're more likely to have CPU bottlenecks pushing 100Hz-plus.
 
I agree with this. We should be obsessed with Hz, not resolution.

I get that, from a GPU review standpoint, you're more likely to have CPU bottlenecks pushing 100Hz-plus.

GPU-wise, there isn't much difference between 1440p 144Hz and 4K 60Hz. Though I would love to see identical systems tackle both and review the data...
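
The raw pixel rates back that up (a quick sketch, fill rate only; per-frame CPU cost is where 144Hz gets harder):

```python
# Raw pixel rates for the two targets.
r_1440p_144 = 2560 * 1440 * 144   # 530,841,600 px/s
r_4k_60     = 3840 * 2160 * 60    # 497,664,000 px/s

print(f"Ratio: {r_1440p_144 / r_4k_60:.2f}x")  # ~1.07x, within about 7%
```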
 
Nice review; it does look like the games selected make for big differences in outcomes. Plus, Win 10 has its fair share of growing pains.

I can see why Project Cars is an outstanding title to use now: rapid development, great testing options to really push cards, and future enhancements as time goes on.

Considering the results, plus NVIDIA in general having a better OCing record/guarantee, it may be prudent for AMD to mix things up on pricing (since availability is also getting better):
  • FuryX $599
  • Fury $499
  • Nano $579
  • Fury X2 $1049
  • 390X $399

The only problem I see above is that NVIDIA could just match prices and start a price war, which I think AMD would lose.
 
I guess I get to mention the elephant in the room: the Fury X overclocks very little, while the 980 Ti overclocks significantly. I would guess that the places where the Fury X was on par or better in this review would be negated by that fact. Still, good to see the Fury X making a solid showing. I hope AMD can update drivers to unleash its true potential.

Now that we have talked about the elephant in the room, let's discuss the 800 lb. gorilla: none of the R9 and Fury cards support HDMI 2.0. 4K @ 30Hz on HDMI 1.4a is the best they can do. So if you don't have a DisplayPort monitor, your options are limited.

Some Radeon ads tout 4096 x 2160 (HDMI; DP), which is deceptive, as the two interfaces run at different max refresh rates.
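
For those wondering why HDMI 1.4a stops at 30Hz, a quick sketch using the standard CTA-861 4K timing totals (4400 x 2250 including blanking):

```python
# Pixel clock = total width x total height x refresh rate.
# HDMI 1.4a tops out around a 340 MHz TMDS clock; HDMI 2.0 allows 600 MHz.
def pixel_clock_mhz(total_w, total_h, hz):
    return total_w * total_h * hz / 1e6

print(f"4K @ 30Hz: {pixel_clock_mhz(4400, 2250, 30):.0f} MHz")  # 297 -> fits 1.4a
print(f"4K @ 60Hz: {pixel_clock_mhz(4400, 2250, 60):.0f} MHz")  # 594 -> needs 2.0
```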

It would also be nice to do a Crysis comparison, although Battlefield 4 is usually close enough.
 
Now that we have talked about the elephant in the room, let's discuss the 800 lb. gorilla: none of the R9 and Fury cards support HDMI 2.0. 4K @ 30Hz on HDMI 1.4a is the best they can do. So if you don't have a DisplayPort monitor, your options are limited.

Some Radeon ads tout 4096 x 2160 (HDMI; DP), which is deceptive, as the two interfaces run at different max refresh rates.
...and colour qualities.
Even at 4K 30Hz, Fury cannot do 4:4:4; it can only display compressed colour.
 
...and colour qualities.
Even at 4K 30Hz, Fury cannot do 4:4:4; it can only display compressed colour.

HDMI 1.x should only be used for lower resolutions.

If you're using any R9 Fury version for 4K without DisplayPort, please go sit down in the Total Fail section of the room...
 
HDMI 1.x should only be used for lower resolutions.

If you're using any R9 Fury version for 4K without DisplayPort, please go sit down in the Total Fail section of the room...
Speaking of total fails...
AMD tried to pull a fast one on their own userbase, but why are you telling me this?
I made it clear I am wise to them.
 