Why would AMD release a 4K video card without HDMI 2.0 and with 4 GB of VRAM?

The cards with HDMI 2.0 can't even manage 60 fps with respectable settings in most modern games.

The ideal for gaming right now is 1440p at the high end. And at certain distances a 1440p monitor shows as much detail as a 4K TV, but the monitor would be much better for PC gaming.

Additionally, DisplayPort with adaptive sync support is a far bigger issue, and I saw NOBODY complaining that NVIDIA was not adopting it in the past. Yet people are crying about something that offers little for PC gaming.
 
My videos clearly show that new AAA games are using between 5 and 6 GB of VRAM when running at 4K resolution.

It remains to be seen whether they really need that much. If the VRAM is there, games will use it, sometimes keeping data resident that is rarely touched.
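To illustrate that usage-vs-allocation point, here is a minimal sketch (assuming an NVIDIA card and the nvidia-ml-py / pynvml package) of where the numbers overlays and monitoring tools show actually come from; NVML reports memory that is allocated on the device, not the working set a game actually touches each frame.

```python
# Minimal sketch: reading the VRAM figure that monitoring tools report.
# Assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) package.
# The "used" number is memory *allocated* on the device, not what a
# game actively needs per frame.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)        # first GPU
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)         # total / free / used, in bytes
    print(f"Total VRAM : {mem.total / 2**30:.2f} GiB")
    print(f"Allocated  : {mem.used / 2**30:.2f} GiB")    # what overlays call 'usage'
    print(f"Free       : {mem.free / 2**30:.2f} GiB")
finally:
    pynvml.nvmlShutdown()
```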

I think it's properly silly to be turning down graphics settings to get high fps just because you were fooled by the 4K hype. High FPS plus max settings at a respectable resolution is still king, not 4K with turned-down effects.
 
OK, that was LordEC911 on the 16th of June in a different thread. I just dug up all the posts.

Yes, and what I said is factually accurate.
There is no technical limitation of HBM that would have prevented them from adding more stacks. Theoretically, they could have had 6-8 stacks if they had used a larger interposer.

Four stacks of HBM got them the bandwidth they needed and the minimum VRAM capacity they thought was required at this time.
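As a back-of-the-envelope check (using first-gen HBM's published per-stack figures: a 1024-bit bus at 500 MHz DDR, i.e. 1 Gbps per pin, and 1 GB per stack), four stacks land exactly on Fiji's headline numbers:

```latex
% Per stack: 1024 bits/transfer * 1 Gbps per pin / 8 bits-per-byte = 128 GB/s
\[
4 \text{ stacks} \times 128\,\mathrm{GB/s} = 512\,\mathrm{GB/s},
\qquad
4 \text{ stacks} \times 1\,\mathrm{GB} = 4\,\mathrm{GB}
\]
```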

AMD is about to be split in two. They are just phoning it in at this point. That explains the lack of forethought.

Yep, that is why they have made the most complex GPU ever created and helped develop a brand new memory standard to replace GDDR5...
 
That is correct. Also, the GPU/fab reticle limit is pretty much reached with Fiji, so using more stacks would increase the leads from the GPU and increase its size too.
 
I have my reservations about the 4 GB as well, but it will probably only matter with two cards. The cards with more VRAM won't be faster solo.
 
Alright, we get it, you love Nvidia. But why not wait for the official benchmarks before passing judgment on whether it's a good or bad 4K gaming GPU?

But then the trolls might not feel as superior after the facts come out. What fun is that?

Although they can always fall back on the "gotta have HDMI 2.0 or the card is junk" excuse.
 
Here is my beef with NVIDIA:
NO PLP support

Here is my beef with AMD:
NO HDMI 2.0/DL-DVI

Other than that, both have AMAZING technology!

EDIT:
Thinking about this, here is my compromise solution...
PLP: AMD R9 285 in CrossFire for my work PC, and driving/older games, on old Dell 2007FP+3007WFP+2007FP

HDMI 2.0: NVIDIA 980ti in SLI for my HTPC/Gaming PC, on 58" Samsung 4K TV.
 
When all is said and done, no one will find the Fury X running out of VRAM in any game at 4K using available in-game settings. NVIDIA fans will cry and then start talking about unreleased games.
 
I feel that the reason Fiji doesn't have HDMI 2.0 might be that it's about as old as Tonga.
 
I feel that the reason Fiji doesn't have HDMI 2.0 is because it's about as old as Tonga.
Some of the 300-series custom cards have HDMI 2.0 on them.
The 380 models have HDMI 2.0. Grenada and Pitcairn, too. The 7870 is 3 years old.
 
So only having 4 GB matters now, but during the 970 3.5 GB issue it was more than fine?
That's what I'm getting here.

For the record, I do believe 4 GB will show its limit, but I think it's odd that only 4-5 months ago it was perfectly fine when it was the only thing Nvidia had.
We shall see what special sauce they have cooked up with this card, and if it's no good then they had better hope to god they can get it up to 8 GB soon.
 
So only having 4 GB matters now, but during the 970 3.5 GB issue it was more than fine?
That's what I'm getting here.

You lack reading comprehension; that's what I'm getting here.

The GTX 970's VRAM has never been acceptable for 4K gaming, and the card has never been considered by anyone an acceptable 4K solution. The GTX 970 has always been a budget card, never a flagship meant for pushing 4K games.

4 GB of VRAM is fine for 1080p gaming. It is not acceptable for 4K gaming.

You are damn right having only 4 GB of VRAM matters when it is on the company's highest-end flagship card.
 
Nothing out there today is relevant to 4K gaming. Not from NV or AMD. Next generation video cards in 2016 will be where 4K takes off. Crapping on Fury is pointless when you can't crank all the eye candy with today's cards at 4K/60fps.
 
Nothing out there today is relevant to 4K gaming. Not from NV or AMD. Next generation video cards in 2016 will be where 4K takes off. Crapping on Fury is pointless when you can't crank all the eye candy with today's cards at 4K/60fps.

Guess what, 2016 cards won't be able to keep up with 2016 games. It's a rat race. Thinking that the next card will somehow hit 4k ultra with all games is foolish. You will always have to lower settings for the next gen games. That's how they keep selling new GPUs.
 
Guess what, 2016 cards won't be able to keep up with 2016 games. It's a rat race. Thinking that the next card will somehow hit 4k ultra with all games is foolish. You will always have to lower settings for the next gen games. That's how they keep selling new GPUs.

I absolutely do not agree with that. Today's cards can run 1080p on Ultra, which, believe it or not, is still the standard resolution. Video cards will catch up to 4K and eventually 8K. That's not even throwing DX12 into the mix. You are also forgetting the continued consolization factor holding games back, and it's going to get worse in the short term as PC graphics continue to evolve. But go ahead and continue the bitch fest.
 
Guess what, 2016 cards won't be able to keep up with 2016 games. It's a rat race. Thinking that the next card will somehow hit 4k ultra with all games is foolish. You will always have to lower settings for the next gen games. That's how they keep selling new GPUs.
PC tech is currently scaling faster than gaming tech. Maybe blame the new consoles.
Diminishing returns also make new resolutions less appealing.

We're approaching a plateau for 1080p, which is still the most popular resolution by a LARGE margin. I'm planning on buying a 980 Ti or Fury/X for my 1200p panel and keeping it until it dies. I have no interest in 4K or even 1440p, and at this rate it will still push 1080/1200p at 60 fps for at least 3 years.

A few years ago people were still fighting for 1080p60, now we're 2 YEARS into the new consoles and we're already fighting for 4K60. It's just insane.

Enthusiast-tier cards will always be on the cutting edge, thus always pushing the limits. But for the vast majority of PC gamers (where the most profit is!), the reasons to upgrade are becoming slimmer and slimmer.
 
So you're all telling me that Nvidia and AMD's flagships are targeting the 1080P crowd?
 
So you're all telling me that Nvidia and AMD's flagships are targeting the 1080P crowd?
Nah, but performance scales over time.
Look where relative 1080p performance is today compared to a few years ago.

Performance is going up faster than people's desire to increase their resolution. The cost of being a fringe PC gamer is just going to get exponentially worse since they are becoming more isolated.

VR will be mainstream before 4K is. So I guess that counts for something.
 
Nothing out there today is relevant to 4K gaming. Not from NV or AMD. Next generation video cards in 2016 will be where 4K takes off. Crapping on Fury is pointless when you can't crank all the eye candy with today's cards at 4K/60fps.

How many times do I have to repeat myself?

My videos clearly show that new AAA games are using between 5 and 6 GB of VRAM when running at 4K resolution.

4K gaming is here, it's viable, and you can max out almost every game out there and maintain 60 fps.

I post videos giving definitive proof of this and they are ignored by butthurt AMD fanboys.
 
I post videos giving definitive proof of this and they are ignored by butthurt AMD fanboys.
Most people are tired of explaining the difference between usage and allocation, and what changes HBM might bring.
I could tell you why your numbers are meaningless, but I'd just be eating away at my membrane keyboard's lifespan. Pass.

It's going to be an ink blot test until Wednesday. You see a swan, other people see a naked woman. Just chill. 3 days people, you can make it without murdering each other, I promise.
 
How many times do I have to repeat myself?



4K gaming is here, it's viable, and you can max out almost every game out there and maintain 60 fps.

I post videos giving definitive proof of this and they are ignored by butthurt AMD fanboys.

Of course viable 4K gaming is here with dual 980 Tis in SLI. I bet dual 390Xs would do it as well. Congratulations on being the 0.01% able to play 4K games with eye candy. That is, when you can get working SLI profiles. What about the other 99.9% who are waiting for 4K monitors/TVs at reasonable prices and single video cards that can do it? Use some common sense.
 
Of course viable 4K gaming is here with dual 980 Tis in SLI. I bet dual 390Xs would do it as well. Congratulations on being the 0.01% able to play 4K games with eye candy. That is, when you can get working SLI profiles. What about the other 99.9% who are waiting for 4K monitors/TVs at reasonable prices and single video cards that can do it? Use some common sense.

Remember, though, that there are a lot more people gaming at "4K" since DSR/VSR came along -- it sharpens up the image and doesn't require a 4K display. So the use cases have expanded, and many more people are now interested in 4K gaming who don't necessarily have a 4K display.
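For a rough sense of the pixel cost of that DSR/VSR route (just comparing the standard 4K and 1440p pixel counts), rendering internally at 3840x2160 and scaling down to a 2560x1440 panel is 2.25x the shading work of native 1440p:

```latex
\[
\frac{3840 \times 2160}{2560 \times 1440} = \frac{8\,294\,400}{3\,686\,400} = 2.25
\]
```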
 
So true. Even I do on a lot of my Steam games: DSR 4K on my 2560x1440 monitor.
 
I was re-watching the AMD presentation and caught this slide.
Take note of all the games listed there.

A lot of Gaming Evolved games, yeah. Far Cry 4 is thrown in there... GTA V, too.

[Attached slide: TKPVRV4.png]
 
I can't stand 4K DSR on my 1440p monitor because it makes my UIs tiny.
 
4K gaming is here, it's viable, and you can max out almost every game out there and maintain 60 fps.
I post videos giving definitive proof of this and they are ignored by butthurt AMD fanboys.

Dude, no one is butthurt except for you, from the looks of it. You said maxed out, and I watched your videos: the settings were not maxed out in certain games (on top of that, one of your games was running at 40-50 fps). Furthermore, you own an SLI setup that costs $1300. Cost aside, it is SLI; dual-card setups waste a lot of power, don't scale to the cards' full potential, and more often than not the drivers are not the best.
4K gaming is here for those who want to compromise on drivers, power, cost, and settings to achieve it. Everyone else is waiting for a single, efficient, no-compromise solution for 4K. End of story.
 
Dude, no one is butthurt except for you, from the looks of it. You said maxed out, and I watched your videos: the settings were not maxed out in certain games (on top of that, one of your games was running at 40-50 fps). Furthermore, you own an SLI setup that costs $1300. Cost aside, it is SLI; dual-card setups waste a lot of power, don't scale to the cards' full potential, and more often than not the drivers are not the best.
4K gaming is here for those who want to compromise on drivers, power, cost, and settings to achieve it. Everyone else is waiting for a single, efficient, no-compromise solution for 4K. End of story.


This post 100% sums it up.
I think we are about two generations from that.
Pascal might be able to do it later on, but, like with any other card, we have to wait and see.
 
I still think the metal/Radeon logos look cheap as hell.
Like something a kid would make in shop class.

Maybe it's the rigid edges?
 
Most people buying into 4K screens are buying a screen where the increased resolution actually matters -- a real 4K TV, not a TV for ants (a monitor). You don't have to be a neckbeard who crouches over a tiny display at a desk in mom's basement to be a "real gamer." A lot of us have living rooms, big surround sound speaker setups, friends... families... and for us, having a big screen is important.

Of the people I know PC gaming at 4K, I see way more using a TV than a monitor. All the monitor people are too busy circlejerking over their 120/144 Hz and 1440p resolution. People interested in 4K are the same kind of people fine with 60 Hz so they are going for TVs. These are the kind of people who don't give a shit about having 20-40 ms of input lag. The 120 Hz crowd is the crowd that cares about that stuff, and they're still stuck at 1440p.

I have no clue why they would do this. It seems pretty fucking stupid. But the quote above seems pretty damn accurate IMO
 
I have no clue why they would do this. It seems pretty fucking stupid. But the quote above seems pretty damn accurate IMO

The quote above is not accurate; many of its points hold no water in the real world.
The guy you quoted is running a $1300 SLI setup and not getting smooth 4K 60 fps in games without turning the settings down. Yet this is supposedly a viable solution for everyone who PC games (make sure you go spend $1300 today), and on top of that no one should own monitors, because let's face it, monitors suck (you wouldn't want to use something made for ants, would you?).

HDMI 2.0 is LIFE! No one forget that!

Oh and input lag, that thing is not important... just forget it exists. ;)
 
I know I'm late, but I just skimmed through the thread... this is great comedy, gentlemen. I see the usual suspects coming out of the woodwork again, and I must say, they've spent some time working on their message board posting expertise.

I should buy more popcorn to fully enjoy these last two days of speculation.
 
The quote above is not accurate; many of its points hold no water in the real world.
The guy you quoted is running a $1300 SLI setup and not getting smooth 4K 60 fps in games without turning the settings down. Yet this is supposedly a viable solution for everyone who PC games (make sure you go spend $1300 today), and on top of that no one should own monitors, because let's face it, monitors suck (you wouldn't want to use something made for ants, would you?).

HDMI 2.0 is LIFE! No one forget that!

Oh and input lag, that thing is not important... just forget it exists. ;)

Everyone should get out of their bubbles.

There are those who prefer 144 Hz at 1080p. There are those who prefer 4K/60. There are those who feel 60 Hz is too low, and there are those who have no issues gaming at 30-60 Hz. There are those who feel 12 ms or less is absolutely required, and there are those who see no input lag at all even at 40 ms. There are those who play FPS all the time, so they need low input lag and 144 Hz and don't care about 4K fidelity. Then there are those who prefer 4K fidelity and don't care about 144 Hz, because they play mostly RPGs, strategy, MMORPGs, etc. Not everyone prefers the same thing, and not everyone plays shooters and FPS exclusively! Let's not discount the ones who work from home doing photo editing, programming, and web design, who prefer 4K on a 40" or bigger display. That's a huge audience as well, and only TVs come in that size. Input lag is not a big deal to them.

There are 4K TVs now that do sub-30 ms input lag, display quality superior to IPS monitors, and even quantum dots. They lose in some areas but make up for it in others, the scaler for example. Some 4K TVs display 1080p content so well that it looks native. No monitor can compete in the scaler department. Then there are others that optionally provide 1080p/120 Hz in addition to 4K. Many of you are not keeping up, but TVs are now very competitive with monitors.

There are two camps: the ones who play at 144 Hz and dismiss the 4K big-screen gamers, and those who play at 4K on 40-48" screens with superb contrast and better-than-IPS quantum-dot color. Neither is wrong in their choices.

However, from a business standpoint, these two groups are both targets of Nvidia's and AMD's flagships. They are not normal gamers. They are the ones with money, willing to spend ungodly amounts to feed the hobby. These flagship cards are not shooting for the 750 Ti crowd. The 4K TV and 144 Hz audiences are small in the grand scheme of things, but for the flagship GPU audience, they make up a bigger piece of the purchasing pie.

From a business standpoint, alienating a part of that potential group of spenders is idiotic. There's no need to get into a fanboy defensive posture about it. It boils down to a business decision and how to gain market share. Let's assume that AMD makes 100k of these GPUs over its lifetime. If they lose 10% of their potential sales, that could be the difference between losing money after fixed costs and R&D and making a profit. From a business standpoint, not having HDMI 2.0 is a big mistake.
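As a purely hypothetical illustration of that break-even argument (the 100k unit count is the post's own assumption; the $200 contribution margin and $19M in fixed costs are made-up numbers, not anything from AMD):

```latex
% Hypothetical: $200 contribution margin per card, $19M fixed costs (R&D, tooling, etc.)
\[
100{,}000 \times \$200 - \$19\,\mathrm{M} = +\$1\,\mathrm{M} \quad\text{(profit)}
\]
\[
90{,}000 \times \$200 - \$19\,\mathrm{M} = -\$1\,\mathrm{M} \quad\text{(loss after a 10\% sales shortfall)}
\]
```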
 
HDMI is an old, unnecessary connection technology that should have been deprecated 2 years ago. Just sayin'.
 