Why would AMD release a 4K video card without HDMI 2.0 and with 4 GB of VRAM

We get that you don't consider them good. But get this: no one is going to agree that turning down settings counts as maxed out. Most people don't even agree with how you view these features.

Actually, most people are ignorant as to what these features do, but when you show them examples and poll them, most will respond negatively to these features.

And there are plenty of people who agree with me that a game can be considered "maxed out" with that motion blur/DoF/CA shit turned off.

The fact that these are standard options that can be disabled, and that people actively bitch when they are unable to turn them off, proves that they are undesirable.

You don't see an endless array of "fixes" to lower the resolution of textures, downgrade the lighting, etc., like you do to eliminate motion blur/DoF/CA, now do you?

Anyways, I did a Batman: Arkham Knight video on SLI 980 Ti's per someone's request in this thread. Surprise! It runs like shit:
http://youtu.be/sWCQZf_0tEY

It also eats up all 6 GB of VRAM on the card. I can't imagine how badly the Fury X is going to struggle with it.
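
If anyone wants to check the VRAM numbers themselves, here's roughly how I log it while the game runs. Just a quick Python sketch that shells out to nvidia-smi (it ships with the driver); the one-second poll interval is arbitrary.

[CODE]
# Rough VRAM logger: polls nvidia-smi once a second and prints used/total memory.
# Assumes the NVIDIA driver's nvidia-smi tool is on the PATH.
import subprocess
import time

def query_vram():
    """Return a list of (used_mb, total_mb) tuples, one per GPU."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ]).decode()
    rows = [line.split(",") for line in out.strip().splitlines()]
    return [(int(used), int(total)) for used, total in rows]

if __name__ == "__main__":
    while True:
        for gpu_index, (used, total) in enumerate(query_vram()):
            print(f"GPU {gpu_index}: {used} / {total} MiB")
        time.sleep(1.0)
[/CODE]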
 
24fps min lol
 
Looks like SLI is not working. I got pretty much the same numbers @ 4K on a single 980 Ti.

At my usual gaming res though, 3440x1440, it's very playable. Around 50 fps average.
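
Worth noting that "min" and "average" depend on how you crunch the frametimes. Here's a rough Python sketch of how I'd pull an average and a 1%-low style minimum out of a log; it assumes a plain frametimes.csv with one frame time in milliseconds per line, which is just an assumption about the log format, not any particular tool's output.

[CODE]
# Compute average fps and a percentile-based "minimum" from a frametime log.
# Assumes a plain text/CSV file with one frame time in milliseconds per line.
import statistics

def fps_stats(path):
    with open(path) as f:
        frame_ms = [float(line.strip()) for line in f if line.strip()]
    avg_fps = 1000.0 / statistics.mean(frame_ms)
    # "Minimum" as the fps of the slowest 1% of frames
    # (less noisy than taking the single worst frame).
    worst_1pct = sorted(frame_ms)[int(len(frame_ms) * 0.99)]
    min_fps = 1000.0 / worst_1pct
    return avg_fps, min_fps

if __name__ == "__main__":
    avg, low = fps_stats("frametimes.csv")
    print(f"average: {avg:.1f} fps, 1% low: {low:.1f} fps")
[/CODE]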
 
Yep; SLI is broken and so is Crossfire.

Just in time for the highly anticipated title Batman: Arkham Knight this new GeForce Game Ready driver ensures you'll have the best possible gaming experience. With support for GeForce SLI technology and one-click game setting optimizations within GeForce Experience, you'll have the best possible performance and image quality during gameplay.

Game Ready
Best gaming experience for Batman: Arkham Knight, including support for SLI Technology and GeForce Experience 1-click optimizations

"Including support for SLI Technology" my ass, Nvidia.
 
As far as I am concerned, good riddance to the big and clumsy DVI. This thing has been around since 1999 and is currently on its way out as it is being replaced by DP. For the old timers stuck with a DVI-only monitor, please note that a DP to DVI adapter costs around $10-$15.

Active DP to DVI adapters are required for 1440p, because passive adapters only carry single-link DVI. They cost around $100.

EDIT: And you can't overclock your monitor's refresh rate through 95% of the active adapters out there.
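
For anyone wondering why it has to be active: it's a pixel clock thing. Single-link DVI tops out at a 165 MHz pixel clock, and 2560x1440 at 60 Hz is already past that before you even add blanking. Quick back-of-envelope in Python (these are lower bounds, since real timings include blanking):

[CODE]
# Back-of-envelope pixel clock check: why 2560x1440 @ 60 Hz needs dual-link DVI.
# Real display timings add horizontal/vertical blanking, so actual pixel clocks
# are somewhat higher than these lower-bound numbers.
SINGLE_LINK_DVI_MHZ = 165.0   # single-link TMDS limit (what passive DP adapters carry)
DUAL_LINK_DVI_MHZ = 330.0     # two TMDS links

def min_pixel_clock_mhz(width, height, refresh_hz):
    return width * height * refresh_hz / 1e6

for w, h, hz in [(1920, 1080, 60), (2560, 1440, 60), (1920, 1080, 144)]:
    clk = min_pixel_clock_mhz(w, h, hz)
    link = ("single-link OK" if clk <= SINGLE_LINK_DVI_MHZ
            else "dual-link (active adapter) required" if clk <= DUAL_LINK_DVI_MHZ
            else "beyond dual-link DVI")
    print(f"{w}x{h} @ {hz} Hz: >= {clk:.0f} MHz -> {link}")
[/CODE]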
 
24fps min lol

And that is just the built-in benchmark; I heard the game itself runs even worse.
Worst game of the year, in my opinion.

Edit: Here is how many people worked on the PC port. Yes, only 12.
[Image: iron_galaxy_batman.jpg]
 
It is as I expected. [H] confirms it. The 4 GB of VRAM on Fury X is crippling it:

http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/6#.VYuDv9pFuUk

The VRAM bottleneck is like a cap on performance, keeping the true performance of the GPU from shining through. This game is a shining example of how VRAM capacity bottlenecks can lower performance and power demand. Sorry AMD, 4GB isn't enough in this game at 1440p, what do you think is going to happen at 4K?

This card should have had 8 GB of VRAM. If HBM couldn't handle it then AMD should have stuck with GDDR5 and held off until HBM2 arrived to allow more than 4 GB of VRAM.

The GTX 980 Ti's 6 GB of GDDR5 is outperforming the Fury X's 4 GB of HBM.
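
Just to put "what happens at 4K" into rough numbers: 4K is 2.25x the pixels of 1440p, so everything resolution-dependent grows by that factor. Back-of-envelope Python sketch below; the buffer list and byte sizes are made-up illustrative assumptions, not anything from Arkham Knight's actual renderer.

[CODE]
# Rough scaling estimate: render-target memory at 1440p vs 4K.
# The buffer list and formats are illustrative assumptions, not any engine's real setup.
RESOLUTIONS = {"1440p": (2560, 1440), "4K": (3840, 2160)}

# (name, bytes per pixel) -- a hypothetical deferred-rendering buffer set
RENDER_TARGETS = [
    ("back buffer (RGBA8)", 4),
    ("depth/stencil (D24S8)", 4),
    ("G-buffer: albedo (RGBA8)", 4),
    ("G-buffer: normals (RGBA16F)", 8),
    ("G-buffer: material (RGBA8)", 4),
    ("HDR light accumulation (RGBA16F)", 8),
]

for label, (w, h) in RESOLUTIONS.items():
    total_bytes = sum(bpp * w * h for _, bpp in RENDER_TARGETS)
    print(f"{label}: ~{total_bytes / 2**20:.0f} MiB of render targets")

# 4K has 2.25x the pixels of 1440p, so resolution-dependent buffers grow by the
# same factor, on top of textures and geometry that already fill the card.
[/CODE]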
 
This card should have had 8 GB of VRAM. If HBM couldn't handle it then AMD should have stuck with GDDR5 and held off until HBM2 arrived to allow more than 4 GB of VRAM.

The 4 GB is not an HBM1 limitation; it's an AMD design limitation. They seem to have designed Fiji for a smaller process than what they were ultimately able to get manufactured. Had Fiji hit the smaller process, there would've been room for another 4 GB.

Their hearts were in the right place in terms of trying to forward-engineer, but they flew too close to the sun gambling on the smaller node; they should have stuck with GDDR5 for one more generation, as long as the GPU is still the bottleneck.
 
The fanboy arguments that it would give higher minimum fps than the 980 Ti didn't materialise at any resolution either.
 
I just want to point out that when people buy top-end GPUs, they are often buying them because they are very specific about their needs and desires for a GPU and its outputs. We can't control what ports and options the common OEMs put on their junk; all we can do is ask our high-end GPUs to be able to output to anything and plow through a demanding load. HDMI 2.0 will be a popular connection even if I hate it, and I would want the option on any brand-new >$300 GPU I purchased. I would also like DVI, given that HDMI and DP have both done such a shit job keeping up with the needs of displays, which has forced many display makers to use DL-DVI.

Anyone can bitch and moan about how DVI is outdated, but the fact of the matter is that people are still shipping millions of displays with the port, and just a couple of years ago all the most popular displays many of us were buying could only reach their full potential over DVI. You are welcome to get rid of an output connection when monitors have not shipped with it for 5+ years. Too bad that isn't even remotely close to the reality of the times. HDMI 2.0 literally just released and isn't even on many 4K TVs yet, let alone the rest of the monitors. DP is totally inconsistent: it might be on a monitor, it might not, and it is often not on TVs. WTF are we supposed to do? We need flexibility in ports to deal with this mess the industry made for itself.
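
To put some rough numbers on the port mess, here's a quick Python comparison of the commonly quoted effective video bandwidths against what a given mode needs at 24 bits per pixel. Blanking is ignored, so the requirements are lower bounds; the capacity figures are the usual published ones, not anything I measured.

[CODE]
# Rough link-bandwidth sanity check: effective video bandwidth of common ports
# versus the minimum a given mode needs at 24 bits per pixel (blanking ignored,
# so real requirements are a bit higher).
LINKS_GBPS = {
    "single-link DVI": 3.96,
    "dual-link DVI": 7.92,
    "HDMI 1.4": 8.16,
    "HDMI 2.0": 14.4,
    "DisplayPort 1.2": 17.28,
}

MODES = [("2560x1440 @ 60 Hz", 2560, 1440, 60),
         ("3840x2160 @ 30 Hz", 3840, 2160, 30),
         ("3840x2160 @ 60 Hz", 3840, 2160, 60)]

for name, w, h, hz in MODES:
    need = w * h * hz * 24 / 1e9  # Gbps at 24 bpp, no blanking
    ok = [link for link, cap in LINKS_GBPS.items() if cap >= need]
    print(f"{name}: needs >= {need:.1f} Gbps -> {', '.join(ok)}")
[/CODE]

Which is the whole problem with the card in this thread: a 4K TV that only has HDMI needs HDMI 2.0 to get past 30 Hz.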
 