Why would AMD release a 4K video card without HDMI 2.0 and with 4 GB of VRAM?

Xizer

What is wrong with this company?

Being a little faster than a 980 Ti doesn't mean jack squat when you are capped at 30 Hz the moment you hook it up to your 4K TV, because the card only has HDMI 1.4. All of Nvidia's 900-series Maxwell cards can drive 4K @ 60 Hz thanks to HDMI 2.0.
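
For anyone who wants the napkin math on why HDMI 1.4 tops out at 4K @ 30 Hz, here's a rough sketch, assuming the standard CTA-861 4K timing totals (4400 x 2250 pixels including blanking) and 8 bits per color; the figures are approximate:

```python
# Back-of-envelope check of the HDMI pixel-clock limits (approximate figures).
H_TOTAL, V_TOTAL = 4400, 2250        # 3840 x 2160 active area plus blanking
HDMI_1_4_MAX_MHZ = 340               # max TMDS clock for HDMI 1.4
HDMI_2_0_MAX_MHZ = 600               # max TMDS clock for HDMI 2.0

for refresh_hz in (30, 60):
    pixel_clock_mhz = H_TOTAL * V_TOTAL * refresh_hz / 1e6
    print(f"4K @ {refresh_hz} Hz needs ~{pixel_clock_mhz:.0f} MHz "
          f"-> fits HDMI 1.4: {pixel_clock_mhz <= HDMI_1_4_MAX_MHZ}, "
          f"fits HDMI 2.0: {pixel_clock_mhz <= HDMI_2_0_MAX_MHZ}")
# 4K @ 30 Hz needs ~297 MHz -> fits HDMI 1.4: True, fits HDMI 2.0: True
# 4K @ 60 Hz needs ~594 MHz -> fits HDMI 1.4: False, fits HDMI 2.0: True
```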

Being a little faster than a 980 Ti also doesn't matter when there's only 4 GB of VRAM. Fury X will constantly be hitting that 4 GB ceiling, and every time it does there will be huge stuttering as the GPU swaps in new textures. This was the biggest problem that plagued the GTX 980, the one the Titan X / 980 Ti corrected, and AMD is repeating it. The minute I swapped my 980s for 980 Tis is when the stuttering in all the games I played at 4K stopped, because the 980 Tis finally had enough VRAM for 4K. It wasn't a horsepower issue that made 4K gaming on SLI 980s a troubling experience; it was the 4 GB VRAM ceiling that modern games at 4K were constantly hitting. The speed of the VRAM is not what prevents stuttering at 4K; it is solely about the amount of VRAM available. So AMD's HBM technology is useless as long as it is restricted to 4 GB.
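
Rough numbers to back that up (an illustrative sketch, not measurements from any specific game): the 4K render targets themselves are tiny next to 4 GB; it's the textures and other assets that overflow the card.

```python
# Napkin math: 4K framebuffers barely dent 4 GB of VRAM.
width, height = 3840, 2160
bytes_per_pixel = 4                            # 8-bit RGBA color buffer
buffer_mb = width * height * bytes_per_pixel / 2**20
print(f"One 4K color buffer:   ~{buffer_mb:.0f} MB")      # ~32 MB
print(f"Triple-buffer + depth: ~{4 * buffer_mb:.0f} MB")  # ~127 MB
# Textures, geometry, and shadow maps compete for the remaining ~3.9 GB,
# which is what high-res texture packs at 4K exhaust first.
```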

As an Nvidia owner this makes me mad, because I'm tired of Nvidia holding a de facto monopoly due to AMD's constant incompetence at presenting a real alternative. Nvidia correctly recognized the necessity of HDMI 2.0 and more VRAM for smooth 4K gaming when they designed Maxwell. AMD might be able to put out cards that run a little faster than Nvidia's, but what they fail to realize is that raw speed is not the deciding factor; it's the critical features Nvidia GPUs offer that convince gamers to pay more for a slightly slower Nvidia card over the AMD alternative.
 
I understand you guys are upset, but this doesn't need to be a new thread.
 
I don't see the reason people are coming from the Nvidia sub forum to let everyone know what the Fury cards do or don't have. It has been discussed in at least 6 threads in the AMD sub forum, and somehow everyone thinks they need to open a new thread about every little detail (even if it is big to you) of the Fury cards.
Well, you don't.
 
This isn't a "little detail."

Calling insufficient VRAM and the inability to output 4K @ 60 Hz on a flagship GPU marketed for 4K a "little detail" is like calling a car that is missing its wheels a "minor issue."
 
Lack of HDMI 2.0 is certainly shaping up to be a "bababooey" moment for AMD, if for no other reason than VR is about to blow wide open at the end of the year and the VR headsets take an HDMI input. So if Fiji is the platform AMD will be coasting on for the next 2-3 years, they're basically handing VR over to Nvidia. Or they have to pray that the VR HMDs eventually go DisplayPort once they move to 4K panels.

Yet at the same time AMD obviously has an interest in VR since they're developing LiquidVR, so it really makes no sense, unless they're just disorganized like Microsoft where the left hand and right hand don't talk.
 
This isn't a "little detail."

Calling insufficient VRAM and the inability to output 4K @ 60 Hz on a flagship GPU marketed for 4K a "little detail" is like calling a car that is missing its wheels a "minor issue."

DisplayPort can do 4K @ 60 Hz. If anyone owns a 4K TV this may be bad for them, although some 4K TVs are coming with DisplayPort now.
If you were planning to buy this card your complaint has merit, but then you would already have seen that this was discussed like 2 days ago and confirmed some time yesterday or this morning.

We don't need over 9000 threads with the exact same thing repeated over and over for the people who had no plans to even buy this card in the first place.
 
I'm pretty sure that the fact that AMD has been partnering up with VR companies means that all the ports required will be available in the products designed to support them. Early leaks suggest that 4 GB isn't an issue even with ultra settings and the high-resolution texture pack for Shadow of Mordor. It does seem odd that HDMI 2.0 is missing, but I don't think that will break the product.
 
DisplayPort can do 4K @ 60 Hz. If anyone owns a 4K TV this may be bad for them, although some 4K TVs are coming with DisplayPort now.
If you were planning to buy this card your complaint has merit, but then you would already have seen that this was discussed like 2 days ago and confirmed some time yesterday or this morning.

We don't need over 9000 threads with the exact same thing repeated over and over for the people who had no plans to even buy this card in the first place.

Besides some of the high-end Panasonic 4K TVs, I cannot think of any others coming out with DP. The vast majority use HDMI 2.0 only, and there are quite a few people who prefer to use large 4K TVs as their monitors, as shown in the Samsung 4K thread. AMD just lost a lot of potential customers with this move.
 
Why the hell would someone post another thread about this? I guess it's too hard posting in someone else's thread.
 
If you want my opinion, it is the Nvidia focus group members out in full force.

Their job is to put down AMD in all forums across the internet. So anything negative about the new cards, they will post and start threads about.

Case in point: 1 thread about 4 GB of VRAM, 1 thread about no DVI, 1 thread about no HDMI 2.0; next will be a thread about only supporting DX12 feature level 12_0.

Next will be power usage... etc., etc.
 
Why the hell would someone post another thread about this? I guess it's too hard posting in someone else's thread.

It's called marketing. The OP is in 4 different threads all talking about the same thing. Apparently the explosion of 4K gaming on a TV has only just now occurred. Never mind that the frame rate at that resolution absolutely sucks.

Few if any 780s (like mine) support HDMI 2.0, but apparently it's such a standard that I'm supposed to throw away this video card ($600 when I bought it) just so I can game at 40 fps if I'm lucky. :rolleyes:

This is just getting stupid.
 
If you want my opinion, it is the Nvidia focus group members out in full force.

Their job is to put down AMD in all forums across the internet. So anything negative about the new cards, they will post and start threads about.

You are mistaken. I really do not like Nvidia. I want AMD to be successful so Nvidia's monopoly is broken. But I cannot bring myself to purchase an AMD product because they keep making so many stupid decisions. Even though I buy Nvidia, I encourage others to buy AMD when possible. Most do not have my high standards, so they do not require the higher-quality product Nvidia offers. I also find Nvidia's business practices deplorable, and I do not like that I am supporting them.

But no HDMI 2.0 is a deal breaker because my 4K TV doesn't have DisplayPort. AMD does not look like they will have a single viable alternative to Maxwell this generation. It will be at least another year before AMD cards are viable for 4K gaming.

Why the hell would someone post another thread about this? I guess it's too hard posting in someone else's thread.

Maybe because there's no thread specifically about these issues? There are various "general" threads about Fury where this issue is brought up, but no thread devoted solely to this problem, like this one is.

I think it's a big enough issue to be worthy of its own thread.

It's called marketing. The OP is in 4 different threads all talking about the same thing. Apparently the explosion of 4K gaming on a TV has only just now occurred. Never mind that the frame rate at that resolution absolutely sucks.

I get a stable 60 FPS in everything I play at 4K on my 980 Tis, mate... and I set all settings to ultra. The Witcher 3, Evolve, Sleeping Dogs, Dying Light, etc. 4K is here and ready to go. It's no longer some mythical resolution that is out of reach.
 
You are mistaken. I really do not like Nvidia. I want AMD to be successful so Nvidia's monopoly is broken. But I cannot bring myself to purchase an AMD product because they keep making so many stupid decisions. Even though I buy Nvidia, I encourage others to buy AMD when possible. Most do not have my high standards, so they do not require the higher-quality product Nvidia offers. I also find Nvidia's business practices deplorable, and I do not like that I am supporting them.

But no HDMI 2.0 is a deal breaker because my 4K TV doesn't have DisplayPort. AMD does not look like they will have a single viable alternative to Maxwell this generation. It will be at least another year before AMD cards are viable for 4K gaming.

You are incorrect. You mean 4K gaming on TVs with horrible input lag.

They are perfectly fine for 4K gaming on monitors.

People who want to game on 4K TVs are a very, very, very small minority. And those people already have Nvidia cards with HDMI 2.0.

Otherwise, real gamers use monitors. It's not like Kyle or Brent are reviewing games on a 4K TV....
 
*viable for 4K gaming on TVs

If you have a DisplayPort 4K monitor - meaning every 4K monitor, AFAIK - you'll be okay. This sucks for HTPC users or people who game on TVs, though.
 
You are incorrect. You mean 4K gaming on TVs with horrible input lag.

They are perfectly fine for 4K gaming on monitors.

People who want to game on 4K TVs are a very, very, very small minority. And those people already have Nvidia cards with HDMI 2.0.

Otherwise, real gamers use monitors. It's not like Kyle or Brent are reviewing games on a 4K TV....
Is there really a need to put other people down based on what they choose to game on? I have 3 gaming PCs in my house; one of them is hooked up to a TV. I don't see how sitting down to game on my couch instead of at my desk makes me less of a gamer.
 
You are mistaken. I really do not like Nvidia. I want AMD to be successful so Nvidia's monopoly is broken. But I cannot bring myself to purchase an AMD product because they keep making so many stupid decisions. Even though I buy Nvidia, I encourage others to buy AMD when possible. Most do not have my high standards, so they do not require the higher-quality product Nvidia offers. I also find Nvidia's business practices deplorable, and I do not like that I am supporting them.

But no HDMI 2.0 is a deal breaker because my 4K TV doesn't have DisplayPort. AMD does not look like they will have a single viable alternative to Maxwell this generation. It will be at least another year before AMD cards are viable for 4K gaming.

That thing I highlighted. Yeah, everyone just read that.
I don't care how well Nvidia is doing, or that you bought the 980 Ti or Titan X. I care that there are far too many threads for something that affects all the people who want an extra 10 fps at 4K on their super 4K TVs that have input lag up the ass.
 
You are incorrect. You mean 4K gaming on TVs with horrible input lag.

They are perfectly fine for 4K gaming on monitors.

People who want to game on 4K TVs are a very, very, very small minority. And those people already have Nvidia cards with HDMI 2.0.

Otherwise, real gamers use monitors. It's not like Kyle or Brent are reviewing games on a 4K TV....

Most people buying into 4K screens are buying a screen where the increased resolution actually matters -- a real 4K TV, not a TV for ants (a monitor). You don't have to be a neckbeard who crouches over a tiny display at a desk in mom's basement to be a "real gamer." A lot of us have living rooms, big surround sound speaker setups, friends... families... and for us, having a big screen is important.

Of the people I know PC gaming at 4K, I see way more using a TV than a monitor. All the monitor people are too busy circlejerking over their 120/144 Hz and 1440p resolution. People interested in 4K are the same kind of people who are fine with 60 Hz, so they are going for TVs. These are the kind of people who don't give a shit about having 20-40 ms of input lag. The 120 Hz crowd is the crowd that cares about that stuff, and they're still stuck at 1440p.
 
Is there really a need to put other people down based on what they choose to game on? I have 3 gaming PCs in my house; one of them is hooked up to a TV. I don't see how sitting down to game on my couch instead of at my desk makes me less of a gamer.

Wasn't putting anyone down, just being honest. 4K TVs are not made for gaming. I am not putting TVs down at all; I have two myself in my own house (not 4K), not including my large-format monitor.

What I am saying is people are bitching about no HDMI 2.0 on the new AMD card. They already have an Nvidia card, and are just complaining to complain.

Everyone knows it was a huge mistake not to have it, but it seems like all of a sudden the whole world is running 4K 60 Hz TVs for gaming for some odd reason, when that isn't even the case.
 
At least a lot of us recognize it, but some newbies won't. Pretty sad, IMO.
I haven't even surfed the other sites, but I am sure it is all over the net.
 
Is there really a need to put other people down based on what they choose to game on? I have 3 gaming PCs in my house; one of them is hooked up to a TV. I don't see how sitting down to game on my couch instead of at my desk makes me less of a gamer.

I think there is, because the discussion has gotten ridiculous. Especially when the only card that you can really use to game at 4K with a reasonable frame rate and at least some of the features turned on is the Titan. And we all know that not everyone here is running one. Or they could have SLI, so are we now going to believe that everyone has SLI in their HTPC with a 4K TV?!?! Yeah, no.
 
Ok, thanks for the warnings. I will make sure my next TV has a DisplayPort.

I didn't expect 60+ FPS at 4K in this generation anyway. Not even on crossfired Fury Xs.

I'm waiting on next-generation AMD; maybe waiting for a proven, mature platform has its merits.
 
I think there is, because the discussion has gotten ridiculous. Especially when the only card that you can really use to game at 4K with a reasonable frame rate and at least some of the features turned on is the Titan. And we all know that not everyone here is running one. Or they could have SLI, so are we now going to believe that everyone has SLI in their HTPC?!?! Yeah, no.

Spot on post!
 
I get a stable 60 FPS in everything I play at 4K on my 980 Tis, mate... and I set all settings to ultra. The Witcher 3, Evolve, Sleeping Dogs, Dying Light, etc. 4K is here and ready to go. It's no longer some mythical resolution that is out of reach.

You do realize those games were tested here, right? Go look at the benches. You ARE NOT running 60 FPS stable on the games you listed.
 
I think there is, because the discussion has gotten ridiculous. Especially when the only card that you can really use to game at 4K with a reasonable frame rate and at least some of the features turned on is the Titan. And we all know that not everyone here is running one. Or they could have SLI, so are we now going to believe that everyone has SLI in their HTPC with a 4K TV?!?! Yeah, no.
The reason I find it disappointing is that a Fury Nano would fit perfectly into my NCASE M1 given the thermals, board size, and power requirements, while most likely ending up significantly faster than any of the other mITX-sized cards on the market. But without HDMI 2.0, I won't be able to hook it up to a 4K receiver. I realize I'm in the minority of gamers interested in this, but it's still a bummer. That card is the perfect gaming HTPC card.
 
Well, maybe the Nano will include HDMI 2.0 for those wanting a card for an HTPC. It does seem like they're missing out on a certain market share by not having this feature at all... this I agree with, even though I have no need for it just yet.
 
Wasn't putting anyone down, just being honest. 4K TVs are not made for gaming. I am not putting TVs down at all; I have two myself in my own house (not 4K), not including my large-format monitor.

What I am saying is people are bitching about no HDMI 2.0 on the new AMD card. They already have an Nvidia card, and are just complaining to complain.

Everyone knows it was a huge mistake not to have it, but it seems like all of a sudden the whole world is running 4K 60 Hz TVs for gaming for some odd reason, when that isn't even the case.

You mean that people interested in the latest, most expensive, fastest graphics card from a company are ahead of the curve when it comes to what display they are gaming on?

I am SHOCKED, truly SHOCKED by this revelation!

Ok, thanks for the warnings. I will make sure my next TV has a DisplayPort.

I didn't expect 60+ FPS at 4K in this generation anyway. Not even on crossfired Fury Xs.

I'm waiting on next-generation AMD; maybe waiting for a proven, mature platform has its merits.

Hope you like Panasonic... and only Panasonic. Because they are literally the ONLY 4K TV manufacturer that puts a DisplayPort on their sets.

You do realize those games were tested here, right? Go look at the benches. You ARE NOT running 60 FPS stable on the games you listed.

Do I need to start making some 4K videos in Shadowplay with a PlayClaw FPS/CPU/GPU/VRAM usage overlay in the top left corner of the screen and uploading them to YouTube? Because I'll totally do it. Don't push me! ;)
 
Do I need to start making some 4K videos in Shadowplay with a PlayClaw CPU/GPU usage overlay in the top left corner of the screen and uploading them to YouTube? Because I'll totally do it. Don't push me! ;)

Are you running them in SLI? If not, then feel free to show me a single 980 Ti locked at 60 FPS @ 3840 x 2160, which is the standard 4K TV resolution. You said stable 60, so this means it shouldn't dip below that, and please don't record a video with every feature turned off.
 
Um, I kind of said earlier that it's limited to the Titan and SLI. You kinda need to read upwards... maybe you missed it. Could it be your screen is too small? :D
Yeah, the guy originally said that he had 980 Tis; looks like we both need to get 4K TVs. :p

Either way, even 980 Tis or Titan Xs in SLI don't get smooth 60 FPS gameplay at max settings, even in some older games.

Scroll down; it shows all the min frame rates too. (It really must be time for a monitor upgrade. :D)
 
Case in point: 1 thread about 4 GB of VRAM, 1 thread about no DVI, 1 thread about no HDMI 2.0; next will be a thread about only supporting DX12 feature level 12_0.

Next will be power usage... etc., etc.

Maybe if there were not so many problems, there would not be so many threads. Simple logic really.
 
Scroll down; it shows all the min frame rates too. (It really must be time for a monitor upgrade. :D)

OK, so we have GTA V with a min of 22.7 FPS (so even movies run faster), Metro at 47.5, etc... and that's in SLI.

Oh, I'm going to buy a new TV, just at 1080. :D I kind of want to play a game that doesn't look like stop-motion animation.
 
Maybe if there were not so many problems, there would not be so many threads. Simple logic really.

I think if people were genuinely interested in buying a product they would have found this information in the AMD forum and decided if it was a problem for them or not. Simple logic really.
 
Here's a note, the guy (AMDMatt) is actually checking to confirm whether or not he misspoke.

So, as usual, we don't actually know the full specs of the Fury X.
 
Here's a note, the guy (AMDMatt) is actually checking to confirm whether or not he misspoke.

So, as usual, we don't actually know the full specs of the Fury X.
Yeah, I would be REALLY surprised if Fury didn't support HDMI 2.0. I was surprised when I read that it did not.
 
Maybe if there were not so many problems, there would not be so many threads. Simple logic really.

Simple logic is to go to the AMD sub forum where all this information is.

Just like the 3.5 + 0.5 GB VRAM issue with the GTX 970.
 
I think if people were genuinely interested in buying a product they would have found this information in the AMD forum and decided if it was a problem for them or not. Simple logic really.

People looking to compare cards might not go to the AMD subforum. Not to mention they might not want the overly biased point of view presented there. At least here you can get a more balanced and realistic point of view.
 