Why would AMD release a 4K video card without HDMI 2.0 and with 4 GB of VRAM?

AMD probably didn't anticipate this backlash, which makes me wonder if they've got it together over there.
 
Good companies stay ahead of the curve and the bad ones are always trying to catch up.

That statement is flawed on so many levels. For one, "good" is a relative term, and what defines good? Good could mean a company with high profitability, i.e. Apple, and in Apple's case they generally don't stay ahead of the curve: they wait for others to innovate, then copy that idea and polish it with great success. Come to think of it, so does Nvidia in many cases. One such case could be HBM, because they're not trying to get ahead of the curve there (some would argue that HBM isn't needed, but the same argument could be made about HDMI 2.0, since the adoption rate of 4K is minuscule at the moment).

I could go on and cherry-pick, but the point remains that just because you may favor one company, or they happen to have the niche compatibility you want at the moment, doesn't mean another company is not good. Try to keep your bias in check.

That said, AMD may want to look at the gaming experience, because they do tend to be a little behind in that regard (frame times took a few sites complaining before they realized it was an issue, while Nvidia was looking at it in advance; likewise the QC Nvidia does for G-Sync compared to what they've allowed in the past for FreeSync, though that may be fixed, etc.). And given that enthusiasts who go for 4K gaming may choose TVs rather than monitors, they probably should have looked at getting HDMI 2.0 in there if it isn't already.


Though IMO 90Hz-144Hz has far more impact on the gaming experience than 4K, and I would rather wait until 4K at at least 120Hz is available with IPS panels or better before I even think of trying 4K. This is coming from someone who is still waiting on a good non-G-Sync 144Hz IPS to come out -.-.
 
Dear AMD

You're releasing products in (mid!) 2015, why don't they at *least* support 2014 standards!?? (DP 1.3, HDMI 2.0)

Thanks

The Fucking (Nerdy) World
 
I could go on and cherry-pick, but the point remains that just because you may favor one company, or they happen to have the niche compatibility you want at the moment, doesn't mean another company is not good. Try to keep your bias in check.

I don't understand how I can have a bias when there is a clear choice. One has HDMI 2.0/DVI and one doesn't.
Now if both had HDMI 2.0/DVI and I chose Nvidia because "they have better drivers IMO", then you would have a point.
 
AMD could have taken more time to add HDMI 2.0 and 8 gigs of VRAM. Everyone would have appreciated that release. It's not that the Fury X is bad in any way, but it's missing a few key ingredients that will turn away a lot of buyers apart from the usual red team fans.
 
AMD could have taken more time to add HDMI 2.0 and 8 gigs of VRAM. Everyone would have appreciated that release. It's not that the Fury X is bad in any way, but it's missing a few key ingredients that will turn away a lot of buyers apart from the usual red team fans.

Agreed, seems rushed but will still be a bloody good product!! Maybe it's all they could do given the aggressive competition, and in that regard I understand :)
 
They don't have a choice on the VRAM. HBM1 is limited to 4GB. HBM2 allows more density but is not in production.
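For anyone wondering where that 4GB ceiling comes from, here's the back-of-the-envelope math, assuming the commonly reported first-gen HBM configuration (2 Gbit DRAM dies, stacked 4-high, with 4 stacks next to the GPU); treat it as a sketch, not an official spec sheet:

# HBM1 capacity, napkin math. Assumes the commonly reported first-gen spec:
# 2 Gbit per DRAM die, 4 dies per stack, 4 stacks on the interposer.
GBIT_PER_DIE = 2
DIES_PER_STACK = 4
STACKS = 4

total_gbit = GBIT_PER_DIE * DIES_PER_STACK * STACKS  # 32 Gbit
total_gb = total_gbit / 8                            # 4 GB
print(f"HBM1 capacity: {total_gb:.0f} GB")           # -> HBM1 capacity: 4 GB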
 
There's no proof those adapters will come out this summer. Also, what I saw from that one manufacturer (BizLink) said they planned to release it in Q4 of 2015 (I can't find where).

Those active adapters usually cost at least $100 and introduce problems like lag.

BizLink DisplayPort to DVI-D Dual Link Adapter, ACTIVE, Powered by USB Port, Brand NEW, Dell P/N: XT625 (Electronics; £29.99)

I bought 2.
 
Can someone please explain to me why prime and xizer aren't permabanned yet? Most blatant trolls I have ever seen, yet they continue to be allowed to post.
 
I made several videos of the latest games running on 980 Ti's as requested by posters here:

http://hardforum.com/showthread.php?t=1866004

As you can see, they hold 60 fps most of the time, as I said. You can also see that most new games are exceeding 4 gigabytes of VRAM usage. Assassin's Creed Unity, Dying Light, Evolve, and Grand Theft Auto V require over 5 GB of VRAM at 4K resolution, so with the 980 Ti's 6 GB no swapping is necessary. Tomb Raider needs 4.6 GB.

I still have not heard anyone properly explain to me how the Fury X's 4 GB of VRAM is supposed to be able to stop the stuttering that occurs when new textures have to be swapped into the VRAM from slower sources (RAM, HDD) in the middle of gameplay.

With 6 GB of VRAM, all necessary textures can be held in the VRAM without the need for swapping in new ones in the middle of the game. The act of swapping in new textures when the VRAM threshold is hit is what causes significant lag spikes (stutter).

I will do testing on other games, like COD: Advanced Warfare and the Battlefields, later.

Fury X may be able to achieve higher average framerates than 980 Ti but I am skeptical that it will be able to maintain those framerates smoothly without lag spikes occurring when it hits its 4 GB VRAM limit. To me, a stutter-free experience is more important.

When I played these same games on my GTX 980s I encountered a lot of stuttering, and the 4096 MB of VRAM was always full. These games do not fill up the full 6144 MB of the 980 Ti's VRAM, so they do not lag spike like the GTX 980s did.
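If anyone wants to put numbers on that stutter instead of eyeballing the videos, here's a rough sketch of how you could flag lag spikes from a frame-time log. It assumes a plain text log with one frame time in milliseconds per line (the kind FRAPS-style tools can dump); the file name and the 2.5x-median threshold are just placeholders I picked, not anything official:

import statistics

# Flag "lag spikes": frames that take much longer than the typical frame.
# The 2.5x-median threshold is an arbitrary rule of thumb, tweak to taste.
def find_spikes(path, ratio=2.5):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    median = statistics.median(frame_ms)
    spikes = [(i, ms) for i, ms in enumerate(frame_ms) if ms > ratio * median]
    pct = 100 * len(spikes) / len(frame_ms)
    print(f"{len(frame_ms)} frames, median {median:.1f} ms, "
          f"{len(spikes)} spikes ({pct:.2f}%)")
    return spikes

# find_spikes("gtav_4k_frametimes.txt")  # hypothetical log dumped during a run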
 
Man, 980 Ti owners are out in full force this week.
Gotta prevent that early-onset buyers' remorse.

AMD wouldn't market a new flagship as a 4K card when it's not even capable of handling the games being tested in the launch benchmarks. Stop comparing it to bandwidth-starved GDDR5 Nvidia cards.
 
Gotta compare it to nV's cards; it's positioned that way. We can always compare it using medium settings, heh, but that won't get us very far.

Marketing and reality are two different things. TechReport said on their latest podcast that they're going to push the Fury X with higher levels of AA and AF, so let's wait and see.
 
There is really no evidence that NVIDIA's cards are bandwidth starved.

The primary benefits of HBM seem to be board size and power efficiency, with memory bandwidth as a bonus; the trade-off is the complexity of producing the chips.
 
Adapters are a horrible solution. Do some research; they add about 20 to 30ms of latency.

Honestly, keep what you have, especially if you guys are at 1080p. There is literally no point in upgrading to the Fury X, Pro, etc.

The entire community is stunned that these cards do not have DVI or HDMI 2.0, on top of not having enough memory to do high-to-ultra settings at 4K.

I'm going to wait for Pascal in 2016.

Besides, we all know the driver situation / support from AMD is still a nightmare.
 
Single-card GPU drivers from AMD are totally fine. CrossFire support has been lacking in 2015, unfortunately, but was in general much improved over previous releases. I wouldn't hesitate to run a single AMD GPU at all. My girlfriend's PC is running an XFX 290 and she has no problems with any games.
 
I have had a ton of problems with my 280X that I know are specifically caused by this GPU and yet other people say I am crazy because their card is "fine".
 
Adapters are a horrible solution. Do some research; they add about 20 to 30ms of latency.

Honestly, keep what you have, especially if you guys are at 1080p. There is literally no point in upgrading to the Fury X, Pro, etc.

The entire community is stunned that these cards do not have DVI or HDMI 2.0, on top of not having enough memory to do high-to-ultra settings at 4K.

I'm going to wait for Pascal in 2016.

Besides, we all know the driver situation / support from AMD is still a nightmare.

Wait, WHAT? So people with these TVs, which already have high latency compared to monitors, are now worried about latency?
 
I have had a ton of problems with my 280X that I know are specifically caused by this GPU and yet other people say I am crazy because their card is "fine".
I'm not saying you are crazy, but I have played most of the major releases this year on my 290X CF setup and my girlfriend games all the time on her 290, and we haven't had many major problems. Even NVIDIA's current driver set for the 980 Ti has some issues for many users; nothing is perfect, unfortunately. My point was more along the lines that AMD's driver quality is substantially better now than common internet opinion reflects.

Wait, WHAT? So people with these TVs, which already have high latency compared to monitors, are now worried about latency?
Not all TVs have terrible input latency; you get what you pay for.
 
I'm not saying you are crazy, but I have played most of the major releases this year on my 290X CF setup and my girlfriend games all the time on her 290, and we haven't had many major problems. Even NVIDIA's current driver set for the 980 Ti has some issues for many users; nothing is perfect, unfortunately. My point was more along the lines that AMD's driver quality is substantially better now than common internet opinion reflects.

Not all TVs have terrible input latency; you get what you pay for.

Here are input lag findings for TVs: http://www.hdtvtest.co.uk/news/input-lag
It seems fairly up to date and you can filter to only show 4K TVs. Only 6 TVs have under 40ms of input lag at 4K, and only 4 of those are in the 20ms range. Those all cost $2000+ and offer no better than 21ms of input lag.

So everyone who has been talking about input lag in this thread has a $2000 4K TV in their house and enjoys 21ms of input lag?

Here is another site that has more input lag results: http://www.displaylag.com/display-database/

Again, nothing much under 20ms (two TVs are at 17ms that were 20ms on the other site I listed), several $1000+ TVs with 27ms of lag (released in 2015), and only two of them are 40" models, at $1000 and $700. Link to the $700 TV; the reviews are not so great.
And then we get back to the 40+ms of lag.
 
Here are input lag findings for TVs: http://www.hdtvtest.co.uk/news/input-lag
It seems fairly up to date and you can filter to only show 4K TVs. Only 6 TVs have under 40ms of input lag at 4K, and only 4 of those are in the 20ms range. Those all cost $2000+ and offer no better than 21ms of input lag.

So everyone who has been talking about input lag in this thread has a $2000 4K TV in their house and enjoys 21ms of input lag?

Funny, isn't it? I mean, they have a legit reason for concern, but their argument is going from reasonable to absurd just to bash and whine. While I agree that not having HDMI 2.0 is a bit disconcerting and I have no rational explanation for it, if true, the incessant whining and complaining is not fruitful.
 
Wait, WHAT? So people with these TVs, which already have high latency compared to monitors, are now worried about latency?

Latency is an additive kind of thing, so while it can be sensible not to worry about it below a certain level, that doesn't mean any level is OK or that introducing more into the chain is OK. A lot of people are OK with 30ms of latency or so (others disagree), as it is only about two frames. OK, so you get a TV that has latency around that and you're happy... but then if you add another 30ms from an adapter you are now up to around 60ms, which fewer people are happy about.

It isn't an either/or situation where you either "care about latency and have to minimize it in absolutely everything," or "don't care and will deal with whatever you get."

Also, you have to be a little careful with the Leo Bodnar tester on 4K displays, as it is a 1080p-only device. Sometimes a TV's 1080p performance is a fair bit worse because the signal passes through its scaler. With a computer you can bypass that, even if playing at 1080p, by just having your GPU handle the scaling and send a 4K signal.
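To make the additive point concrete, here's the arithmetic spelled out (these are the ballpark figures being thrown around in this thread, not measurements of any specific TV or adapter):

# Total input lag is just the sum of every stage in the chain.
# Numbers are the rough figures quoted in this thread, not measured values.
def total_lag_ms(stages):
    return sum(stages.values())

tv_only = {"TV processing (game mode)": 30}
tv_plus_adapter = {"TV processing (game mode)": 30, "active adapter (claimed)": 30}

print(total_lag_ms(tv_only), "ms")          # 30 ms: about two frames at 60 Hz
print(total_lag_ms(tv_plus_adapter), "ms")  # 60 ms: same TV, far fewer people are happy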
 
Here are input lag findings for TVs: http://www.hdtvtest.co.uk/news/input-lag
It seems fairly up to date and you can filter to only show 4K TVs. Only 6 TVs have under 40ms of input lag at 4K, and only 4 of those are in the 20ms range. Those all cost $2000+ and offer no better than 21ms of input lag.

So everyone who has been talking about input lag in this thread has a $2000 4K TV in their house and enjoys 21ms of input lag?

Here is another site that has more input lag results: http://www.displaylag.com/display-database/

Again, nothing much under 20ms (two TVs are at 17ms that were 20ms on the other site I listed), several $1000+ TVs with 27ms of lag (released in 2015), and only two of them are 40" models, at $1000.
And then we get back to the 40+ms of lag.
Bear in mind that input lag is somewhat subjective and some people do not notice it as much as other people. For couch gaming, and certain types of games, it's not as important. Would I play COD or Battlefield on 20+ ms of input lag? No. Would I play turn-based RPGs or strategy games? Probably.

Either way, I don't really see this as a valid reason to not include HDMI 2.0 support. The standard was released 2 years ago and is backwards compatible. It's a lack of foresight if they really didn't include it, given how well these cards would work in SFF media PCs and Steamboxes otherwise.
 
Bear in mind that input lag is somewhat subjective and some people do not notice it as much as other people. For couch gaming, and certain types of games, it's not as important. Would I play COD or Battlefield on 20+ ms of input lag? No. Would I play turn-based RPGs or strategy games? Probably.

Either way, I don't really see this as a valid reason to not include HDMI 2.0 support. The standard was released 2 years ago and is backwards compatible. It's a lack of foresight if they really didn't include it, given how well these cards would work in SFF media PCs and Steamboxes otherwise.

Well, my reason for posting this is that the people spouting about 4K TVs for gaming seem to think they have no input lag. I keep quoting the 40ms number because that seems to be the cut-off, according to these sites, for good input lag in terms of gaming.
Since I only found two 40" 4K sets (the size I keep hearing thrown around) that have less than 40ms (27ms to be exact), if the people spouting this lag stuff don't own those sets, what the hell are they talking about?

There was no reason for AMD not to include HDMI 2.0, and it is an unfortunate oversight. I am just trying to get the people making baseless claims to quit making them.
 
Man, 980 Ti owners are out in full force this week.
Gotta prevent that early-onset buyers' remorse.

AMD wouldn't market a new flagship as a 4K card when it's not even capable of handling the games being tested in the launch benchmarks. Stop comparing it to bandwidth-starved GDDR5 Nvidia cards.

My detector is broken; I'm hoping that was a bad attempt at sarcasm.

I have had a ton of problems with my 280X that I know are specifically caused by this GPU and yet other people say I am crazy because their card is "fine".

I had issues with my 7850 and 7950; the 290 is flawless compared to them, and to my 660 Ti.
 
Too much input lag is a problem. Anything under 40ms is fine. So most televisions are fine as long as you turn on their game mode setting.

But add an adapter that adds another 30ms on top of that and that is terrible and definitely noticeable.
 
Man, 980 Ti owners are out in full force this week.
Gotta prevent that early-onset buyers' remorse.

AMD wouldn't market a new flagship as a 4K card when it's not even capable of handling the games being tested in the launch benchmarks. Stop comparing it to bandwidth-starved GDDR5 Nvidia cards.

What are you talking about, AMD shill?

They were not able to achieve anything greater than 4GB with HBM1. And 4GB is not enough. Period.
 
They were not able to achieve anything greater than 4GB with HBM1. And 4GB is not enough. Period.
Unless you have a Fury X, I will wait for official reviews before getting my pitchfork, if it's all the same to you. :D
As I said before, I don't really care if 8-year-old technology isn't capable of handling modern resolutions.

If AMD's engineers claim they can handle it, I'll give them their chance. I'm not as eager to shit on AMD as some people around here... I prefer to base my outrage on facts, which can't be proven wrong, unlike baseless speculation. Better to be late and correct than early and wrong.

Only fanboys would get outraged over something that isn't even released yet.

What are you talking about, AMD shill?
Now you've hurt my feelings.
 
Only fanboys would get outraged over something that isn't even released yet.

I agree with that part.

I consider calling someone else a shill basically a form of name-calling when you can't argue effectively otherwise.

Also, I don't agree with most of what TS said. I've shifted my Titan Xs from 6600 MHz to 8000 MHz on the memory and only gained 1% at 3.5GB usage. Others have reported more of a gain in other apps, though (I used Firestrike Ultra). I never remember to bother with it. This is why I am skeptical that HBM matters at 4GB. At 8GB it'll probably have a greater effect.
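To put rough numbers on that overclock (napkin math only, assuming the Titan X's 384-bit GDDR5 bus; peak bandwidth is just bus width times effective data rate):

# Peak memory bandwidth = (bus width in bits / 8) * effective data rate.
# Assumes a 384-bit bus (Titan X); the clocks are the two I tested above.
def peak_bandwidth_gbps(bus_bits, effective_mtps):
    return bus_bits / 8 * effective_mtps / 1000  # GB/s

stock = peak_bandwidth_gbps(384, 6600)  # ~317 GB/s
oc    = peak_bandwidth_gbps(384, 8000)  # ~384 GB/s
gain  = 100 * (oc / stock - 1)

print(f"{stock:.0f} GB/s -> {oc:.0f} GB/s (+{gain:.0f}% bandwidth, for ~1% more fps in my test)")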
 
He mentioned earlier that AMD didn't feel a need to put more than 4 GB of VRAM on the card, whereas clearly they were limited to it by HBM1.
If I actually said that, please link me the post so I can correct it.
I would never claim that AMD intentionally put 4 GB on these cards because we know they were limited to gen1 HBM for months.

If you can't find the post (which you can't, because I never said that) please edit that statement out. Misquoting someone for the purpose of pushing your agenda is a dick move.
 
There have been wrong facts presented by Tainted Squirrel throughout. He mentioned earlier that AMD didn't feel a need to put more than 4 GB of VRAM on the card, whereas clearly they were limited to it by HBM1.

They would have loved to add more, considering the Fury X is an ultra-high-end card.

A series of deceits and lies leaves only one conclusion in my mind: he's on a personal agenda, and I rightly called him a shill.

Maybe he owned 970 SLI and is on a revenge scheme? Not a shill unless you get paid; to me those are two different levels. Revenge is OK in my book, whereas being a shill is like selling your soul. :D

I personally think HBM at this capacity was premature. As far as I can tell the bandwidth was not needed yet and the capacity is a hindrance. I am too lazy to find it, but someone did tri-SLI Titan Xs at 5K and 9 of the 15 games used over 4GB (up to 10GB IIRC) with playable framerates. I use DSR and that slaughters VRAM. But we shall see on the 24th. [H] has definitely been paying attention to VRAM, and the Fury X is the first card with the power to go over 4GB at playable rates (as far as I can tell).

I lied; I took the time to find it, I couldn't help myself. And 5K is relevant to me because I use DSR (or VSR on AMD's side) up to 6880x2880:

The original 5K benchmarks w/ 4-Way SLI turned out to be quite bad because of the horrendous 347.88 drivers.

If you see the 3-Way SLI review at 5K (here: https://youtu.be/NQIc9MuP8ck), the performance is way better.

3-Way SLI Titan X w/ 350.05: [benchmark chart]
 
If I actually said that, please link me the post so I can correct it.
I would never claim that AMD intentionally put 4 GB on these cards because we know they were limited to gen1 HBM for months.

If you can't find the post (which you can't, because I never said that) please edit that statement out. Misquoting someone for the purpose of pushing your agenda is a dick move.

OK, that was LordEC911 on the 16th of June in a different thread. I just dug up all the posts.
 
Maybe he owned 970 SLI and is on a revenge scheme? Not a shill unless you get paid; to me those are two different levels. Revenge is OK in my book, whereas being a shill is like selling your soul. :D

I personally think HBM at this capacity was premature. As far as I can tell the bandwidth was not needed yet and the capacity is a hindrance. I am too lazy to find it, but someone did tri-SLI Titan Xs at 5K and 9 of the 15 games used over 4GB (up to 10GB IIRC) with playable framerates. I use DSR and that slaughters VRAM. But we shall see on the 24th. [H] has definitely been paying attention to VRAM, and the Fury X is the first card with the power to go over 4GB at playable rates (as far as I can tell).

I lied; I took the time to find it, I couldn't help myself. And 5K is relevant to me because I use DSR (or VSR on AMD's side) up to 6880x2880:

But isn't the memory used there the total across 3 cards, so individual card memory is 1/3 of the total? Hell, I have yet to go over 3.5GB using VSR at 1800p, the highest being modded Skyrim. The Witcher 3 used a max of 2.2GB of VRAM at full Ultra. Now, I don't use AA with VSR, or at most maybe 2x, so that could be why mine is low.

Add: honestly, from that chart I can't tell if it is 3x the per-card number or if they took that into account and that is what each card is using. Some show 3GB, so it's not likely each card is only using 1GB.
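For what it's worth, one way to settle the per-card vs. total question is to just ask the driver for each GPU separately while the game is loaded; something like this rough sketch (assumes NVIDIA's nvidia-smi tool is installed and on the PATH):

import subprocess

# Ask the driver for memory usage per physical GPU, so there's no
# ambiguity about whether a number is per card or summed across cards.
out = subprocess.check_output(
    ["nvidia-smi",
     "--query-gpu=index,name,memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    text=True,
)

for line in out.strip().splitlines():
    idx, name, used, total = [field.strip() for field in line.split(",")]
    print(f"GPU {idx} ({name}): {used} MiB / {total} MiB used")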
 
But isn't the memory used there the total across 3 cards, so individual card memory is 1/3 of the total? Hell, I have yet to go over 3.5GB using VSR at 1800p, the highest being modded Skyrim. The Witcher 3 used a max of 2.2GB of VRAM at full Ultra. Now, I don't use AA with VSR, or at most maybe 2x, so that could be why mine is low.

Add: honestly, from that chart I can't tell if it is 3x the per-card number or if they took that into account and that is what each card is using. Some show 3GB, so it's not likely each card is only using 1GB.

It's not 3x. I used over 3GB all the time on my single 980 when I had it. It's not crazy for a system 4x as powerful to use those numbers.

Anywho, in this case, when I see data from high-end systems using over 4GB, I prefer a "prove to me 4GB is OK" perspective, not an "I hope AMD will pull magic out of their ass and all is OK" one. If this card does slaughter a Titan X, it's not beyond me to switch to team red.
 
AMD is about to be broken up in two. They are just phoning it in at this point. This explains the lack of forethought.
 
It's not 3x. I used over 3GB all the time on my single 980 when I had it. It's not crazy for a system 4x as powerful to use those numbers.

Anywho, in this case, when I see data from high-end systems using over 4GB, I prefer a "prove to me 4GB is OK" perspective, not an "I hope AMD will pull magic out of their ass and all is OK" one. If this card does slaughter a Titan X, it's not beyond me to switch to team red.

That graph is misleading. Many engines will use whatever VRAM you can give them; that doesn't mean you can't play those games on a 4 GB card with no paging. A better test would be to have someone with a 4 GB card run the same benchmarks and note the stutter. I'm not arguing that 4 GB isn't pathetic for a flagship card in 2015, but that graph is just not a good representation of how much VRAM is actually necessary.
 