Why would AMD release a 4K video card without HDMI 2.0 and with 4 GB of VRAM?

HDMI is an old, unnecessary connection technology that should have been deprecated 2 years ago. Just sayin'

Display Port *mic drop*


If they lose 10% of their potential sales, that could be the difference between losing money after fixed costs and R&D, and making a profit. From a business standpoint, not having HDMI 2.0 is a big mistake.

Obviously their marketing and engineers feel differently.
 
Can you all please cut the bullshit and mudslinging about who is a real gamer or not, or criticizing people's display choices?

It's 2015, the boards include an HDMI port - it should be 2.0. Period. There is no excuse.
 
It's 2015, the boards include an HDMI port - it should be 2.0. Period. There is no excuse.

It's 2015, 4K HDTVs should include DisplayPort 1.2 minimum. There's no excuse. Same song, but no one's singing it.
 
Can you all please cut the bullshit and mudslinging about who is a real gamer or not, or criticizing people's display choices?

It's 2015, the boards include an HDMI port - it should be 2.0. Period. There is no excuse.

Can you all please cut the bullshit and mudslinging about who is a real video card manufacturer or not, or criticizing company's display connection choices?

FIFY
 
TVs should include DisplayPort.

I think, though, that AMD had to include DisplayPorts for the sake of FreeSync multi-monitor setups. It would be somewhat stupid if their flagship lacked that ability. Regular DVI is solved with an adapter.
 
Everyone should get out of their bubbles.

There are those who prefer 144 Hz 1080p. There are those who prefer 4K/60. There are those who feel 60 Hz is too low, and there are those who have no issues gaming at 30-60 Hz. There are those who feel 12 ms or less is absolutely required, and there are those who see no input lag at all even at 40 ms. There are those who play FPS all the time, so they need low input lag and 144 Hz and don't care about 4K fidelity. Then there are those who prefer 4K fidelity and don't care about 144 Hz, because they play mostly RPGs, strategy, MMORPGs, etc. Not everyone prefers the same thing, and not everyone plays shooters and FPS exclusively! Let's not discount the ones who work from home doing photo editing, programming, or web design who prefer 4K on a 40" or bigger display. That's a huge audience as well, and only TVs come in that size. Input lag is not a big deal to them.

There are 4K TVs now that do sub-30 ms input lag, beat IPS monitors on display quality, and even use quantum dots. They lose in some areas but make up for it in others, the scaler for example. Some 4K TVs display 1080p content so well that it looks native. No monitor can compete in the scaler department. Then there are others that optionally provide 1080p/120 Hz in addition to 4K. Many of you are not keeping up, but TVs are now very competitive with monitors.

There are two camps: the ones who play at 144 Hz and dismiss the 4K big-screen gamers, and those who play at 4K on 40-48" sets with superb contrast and quantum-dot color better than IPS. Neither is wrong in their choices.

However, from a business standpoint, these two groups are both targets of Nvidia's and AMD's flagships. They are not normal gamers. They are the ones with money, willing to spend ungodly amounts to feed that hobby. These flagship cards are not shooting for the 750 Ti crowd. The 4K TV and 144 Hz audiences are small in the grand scheme of things, but for the flagship GPU market, they make up a bigger piece of the purchasing pie.

From a business standpoint, alienating part of that potential group of spenders is idiotic. There's no need to get into a fanboy defensive posture about it. It boils down to a business decision and how to gain market share. Let's assume that AMD makes 100k of these GPUs for sale over its lifetime. If they lose 10% of their potential sales, that could be the difference between losing money after fixed costs and R&D, and making a profit. From a business standpoint, not having HDMI 2.0 is a big mistake.
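
To make that concrete, here is a rough sketch of the break-even math with purely hypothetical numbers (the unit volume, street price, build cost, and fixed-cost figures are all assumptions, not AMD's actual numbers):

```python
# Break-even sketch with made-up numbers; none of these figures come from AMD.
UNITS_PLANNED = 100_000      # assumed lifetime sales volume
PRICE = 650.0                # assumed street price per card (USD)
UNIT_COST = 450.0            # assumed per-unit build cost (USD)
FIXED_COSTS = 19_000_000.0   # assumed R&D and other fixed costs (USD)

def profit(units_sold: int) -> float:
    """Profit after covering fixed costs for a given number of cards sold."""
    return units_sold * (PRICE - UNIT_COST) - FIXED_COSTS

print(f"Full volume:       ${profit(UNITS_PLANNED):>12,.0f}")             # +$1,000,000
print(f"Lose 10% of sales: ${profit(int(UNITS_PLANNED * 0.9)):>12,.0f}")  # -$1,000,000
```

Under those made-up numbers, a 10% shortfall is exactly the swing between a profitable product and a money-loser, which is the point.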

Thanks for the lesson. My point has been and will continue to be that 4K gaming right now is full of compromise (for both camps) and costs a lot to achieve, regardless of whether you prefer TVs or monitors.
I can only assume the number of people who will want to CrossFire or SLI while paying $1300 and getting less than a smooth 60 fps at 4K on a TV (unless settings are turned down) is a very, very limited few, even in the enthusiast community.
So, that being said, this is a minimal issue. Yes it sucks, and it was a horrible oversight, but at the end of the day all 100+ people who were considering doing this (using HDMI 2.0 on a 4K TV with $1300 worth of graphics cards) will just get Nvidia cards if they feel the need to. How much money did AMD lose? $130,000. Big deal.

Everyone is saying this like it is the end of the company and the worst possible thing that could have happened, etc...

Well, once again, it is NOT!
 
It's 2015, 4K HDTVs should include DisplayPort 1.2 minimum. There's no excuse. Same song, but no one's singing it.

Market confusion and additional cost (albeit minimal) are always deciding factors.

You have to remember, DP was a royalty-free alternative developed mostly by computer manufacturers to avoid paying royalty costs. HDMI is promoted by the home entertainment industry, which TVs are aimed at.
 
The simple fact of the matter is 4K gaming is not mainstream yet. Support is very new, and currently no single-GPU system is going to run 4K well enough. We are talking about a video card here, not a cable box/DVD player/Blu-ray player, etc. Why would AMD put more R&D toward a card just for HDMI 2.0 when 90% of people who buy a video card are going to be connecting it to a monitor using DisplayPort or DVI? While, yes, it would be nice to have HDMI 2.0 on the GPU, we are only talking about a small niche of gamers who would need this option. Also, cost would be higher, albeit maybe not by much.
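
For context on why the 4K TV crowd cares so much, here is a rough bandwidth comparison (approximate figures; uncompressed 24-bit color assumed, and the link rates shown are the usable data rates after 8b/10b encoding):

```python
# Rough check of whether each link can carry uncompressed 4K at 60 Hz, 24-bit color.
# The pixel-data figure ignores blanking intervals, so real requirements are a bit higher.
WIDTH, HEIGHT, REFRESH, BITS_PER_PIXEL = 3840, 2160, 60, 24

required_gbps = WIDTH * HEIGHT * REFRESH * BITS_PER_PIXEL / 1e9
print(f"4K/60 pixel data: ~{required_gbps:.1f} Gbit/s")  # ~11.9 Gbit/s

links = {
    "HDMI 1.4 (8.16 Gbit/s usable)": 8.16,    # why 4K is capped around 30 Hz on HDMI 1.4
    "HDMI 2.0 (14.4 Gbit/s usable)": 14.4,
    "DisplayPort 1.2 (17.28 Gbit/s usable)": 17.28,
}
for name, capacity in links.items():
    verdict = "enough" if capacity >= required_gbps else "not enough"
    print(f"{name}: {verdict} for 4K/60")
```

That gap is the whole argument: a 4K TV with only HDMI inputs can't be driven at 60 Hz by a card that tops out at HDMI 1.4, no matter how fast the GPU itself is.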
 
I see.

The original thread title is, and I quote, " Why would AMD release a 4k card without HDMI 2.0 and with 4 GB VRAM?"

http://hardforum.com/showpost.php?p=1041683213&postcount=217

Why is it so difficult for many of you to take the time to present evidence that the OP's assumptions are indeed false? What evidence have you all presented to prove the OP incorrect or misguided pertaining to real-world benchmarks?

You haven't.
 
The only info we have right now is:
1) specifications
2) benchmarks provided by AMD
3) NVIDIA launched the 980 Ti earlier than expected at the relatively competitive price of $650, which suggests that NVIDIA views AMD's new offering as a worthy threat

While the results are promising, it should be noted that:
1) Games and settings were decided by AMD, so the results are possibly biased.
2) We do not know whether image quality is similar to NVIDIA's (e.g. shimmering).
3) Stability is unknown (drivers!).
4) OC potential is unknown.
 
Dude, no one is butthurt except for you, from the looks of it. You said maxed out, and I watched your videos, and the settings were not maxed out in certain games (on top of that, one of your games was running at 40-50 fps).

Nope, they were maxed out. You can clearly see in the image quality menus all options set to maximum except for the retarded image-quality ruining options like "depth of field," "film grain," and "motion blur." OF COURSE those were disabled, they always should be disabled. I don't consider a game "maxed out" with that trash enabled.

All view distance/shadows/textures/etc. were set to maximum values in my videos. I didn't disable that post-processing shit for better performance; I disabled it because it looks terrible. Why would I intentionally turn on a graphical option that would make my game look and run worse?

It's 2015, 4K HDTVs should include DisplayPort 1.2 minimum. There's no excuse. Same song, but no one's singing it.

What the hell are you talking about? There is not a single PC gamer using their TV as a display who doesn't wish it also had a DisplayPort. We're all singing it, and we all know there's nothing we can do about it. So we have to deal with this HDMI trash.

No HDMI 2.0 user thinks it is superior to DisplayPort. We're just being held at gunpoint by the television manufacturers into using this shitty standard. I don't know where you get off thinking just because we want HDMI 2.0 capability it means we actually like this standard.
 
I see.

The original thread title is, and I quote, " Why would AMD release a 4k card without HDMI 2.0 and with 4 GB VRAM?"

http://hardforum.com/showpost.php?p=1041683213&postcount=217

Why is it so difficult for many of you to take the time to present evidence that the OP's assumptions are indeed false? What evidence have you all presented to prove the OP incorrect or misguided pertaining to real-world benchmarks?

You haven't.

We haven't? It's confirmed the card will not have HDMI 2.0.
Why has no one posted any benches to prove 4 GB is enough? Because no benches have been released by hardware sites. The NDA supposedly lifts on Wednesday.

As far as 60 fps 4K goes, the OP himself proved that it cannot be done without lowering settings and using at least two cards in SLI, $1300 worth of graphics cards.

Why is it so difficult for people to actually look at the facts on the 60 fps 4K situation, or wait for benchmarks in regards to 4 GB, instead of posting things like "What evidence have you all presented to prove the OP incorrect or misguided pertaining to real-world benchmarks?"
 
Nope, they were maxed out. You can clearly see in the image quality menus all options set to maximum except for the retarded image-quality ruining options like "depth of field," "film grain," and "motion blur." OF COURSE those were disabled, they always should be disabled. I don't consider a game "maxed out" with that trash enabled.

All view distance/shadows/textures/etc. were set to maximum values in my videos. I didn't disable that post-processing shit for better performance; I disabled it because it looks terrible. Why would I intentionally turn on a graphical option that would make my game look and run worse?



What the hell are you talking about? There is not a single PC gamer using their TV as a display who doesn't wish it also had a DisplayPort. We're all singing it, and we all know there's nothing we can do about it. So we have to deal with this HDMI trash.

No HDMI 2.0 user thinks it is superior to DisplayPort. We're just being held at gunpoint by the television manufacturers into using this shitty standard. I don't know where you get off thinking just because we want HDMI 2.0 capability it means we actually like this standard.

Well, you had AA off in some titles as well. Not to mention your preferences do not mean the game is maxed out. Hell, if that were the case, someone might argue that they like medium settings while ultra settings look like trash to them, and that is what they consider maxed out.

And the newest game is running at like 40-50 fps, so all the new games will follow suit; there goes your 60 fps 4K setup. Was a nice 2 months, wasn't it?
 
The simple fact of the matter is 4K gaming is not mainstream yet. Support is very new, and currently no single-GPU system is going to run 4K well enough.

I guess that could be true but then we shouldn't give serious consideration to 4K benchmarks.
 
The simple fact of the matter is 4K gaming is not mainstream yet. Support is very new, and currently no single-GPU system is going to run 4K well enough. We are talking about a video card here, not a cable box/DVD player/Blu-ray player, etc. Why would AMD put more R&D toward a card just for HDMI 2.0 when 90% of people who buy a video card are going to be connecting it to a monitor using DisplayPort or DVI? While, yes, it would be nice to have HDMI 2.0 on the GPU, we are only talking about a small niche of gamers who would need this option. Also, cost would be higher, albeit maybe not by much.

Well over 90%, I think. They did include an HDMI port, and that is used by a crap ton of people. HDMI 2.0 is only beneficial for 4K, and that is not going to be mainstream any time soon. Not when one would need a real high-end GPU (more often two) to get decent performance.
 
I guess that could be true but then we shouldn't give serious consideration to 4K benchmarks.

It matters for people on monitors with DP, but I personally prefer 1440p benches. That is the sweet spot, IMO. Manageable, and some great screen tech there.
 
I guess that could be true but then we shouldn't give serious consideration to 4K benchmarks.

Agreed, we should not. 4K right now is used as a marketing tool. AMD was using it to market the 290X, and Nvidia had to follow suit or look like a fool. Now we have all these people going 4K this, 4K that, and either they compromise to experience 4K, refuse to run multi-card setups, or can't even afford to run 4K.

When the Batman game comes out we can see how well 980 Tis in SLI still work at 4K... I'm betting drops to 30 fps with regular 40-50 fps gameplay at 4K. A win for everyone that bought two 980 Tis; nothing like a smooth 60 fps LOL :p
 
I wonder what it's like to hate something so much that you'll never actually buy it? Seems like a waste of time to me, much less making multiple posts about it.
 
This all comes down to functionality. We do not know if AMD simply did not have the extra money to budget the development time that DVI and HDMI 2.0 needed. We do not know if this was a way for AMD to push their customers to new monitors and away from their current displays. I am not saying that's why; I am saying these are very good questions, and they need discussion, with the ultimate goal being to draw AMD's attention and hopefully never have this happen again. There is literally nowhere you can go and look where you won't find this discussion. I can name many, many threads across any number of web forums.
 
As far as I am concerned, good riddance to the big and clumsy DVI. It has been around since 1999 and is currently on its way out as it is replaced by DP. For the old-timers stuck with a DVI-only monitor, please note that a DP-to-DVI adapter costs around $10-$15.

The lack of HDMI 2.0 is a bit disappointing, but PC monitors featuring HDMI 2.0 are rare. I suspect that only a few gamers use their 4K HDTV's HDMI 2.0 input for gaming. Thus, no big deal here.

Now, the 4 GB limitation...
 
Nope, they were maxed out. You can clearly see in the image quality menus all options set to maximum except for the retarded image-quality ruining options like "depth of field," "film grain," and "motion blur." OF COURSE those were disabled, they always should be disabled. I don't consider a game "maxed out" with that trash enabled.

Depth of field is pretty damn important, and turning it off is only something you do to minimize the amount of content being rendered; that's literally why the option is there. Motion blur utilizes the shaders, and again, the option exists so you can minimize the performance impact by turning it off.

The reason why you don't see everyone moving to 4K is precisely because of these trade-offs. I would MUCH rather have depth of field on than render an image at a higher resolution and watch an object pop in or fade in unrealistically when the human eye can see for miles unobstructed. And we aren't talking about ultra settings versus one feature here or there. For people running 780s, 980s, and 290s, you are literally talking about the difference between running truly maxed out with no dips below 60 and running medium with constant dips below 60 (and more often if you don't have a second card), which is the whole point of needing HDMI 2.0. If you aren't going to maintain 60, then you might as well stick with a hard 30 with most features turned on, which HDMI 1.4 provides.

The decision whether or not to include it isn't as clear-cut as some make it out to be. Should it be there? Probably, to avoid marketing pitfalls. However, including it is no slam dunk when NO single card can pull 4K off and two cards can barely do it with all features on, if at all. Pair that with no G-Sync or FreeSync and you are really talking about gaming at a level that, for me, is at least a couple of generations back, and further if the game doesn't include textures meant for that resolution. That's just not my cup of tea... and I buy Nvidia cards almost exclusively.
 
Depth of field is pretty damn important, and turning it off is only something you do to minimize the amount of content being rendered; that's literally why the option is there. Motion blur utilizes the shaders, and again, the option exists so you can minimize the performance impact by turning it off.

Depth of field is shit, it looks like shit, and it has no business in video games. Why the hell would I want to turn that shit on and blur most of the objects on my screen? This cancer ruins console exclusive games because the peasant boxes don't allow you to turn it off like PC games do.

MORE content is rendered on-screen with depth of field OFF. Motion blur is shit created for console peasants so 30 FPS doesn't look as bad. It has no business in 60 FPS+ PC gaming. Why would I want to enable graphical effects that simulate flaws in human vision? I already have all the flaws of the human eye when I'm viewing a game through my human eyes; I don't need to double up on the human defects by turning on motion blur and DoF.

All these post-processing effects have been created to hide the flaws of console versions of games. Motion blur covers up low framerates, and depth of field covers up poor draw distances. Chromatic aberration is the latest in this line of eye cancers, joining "depth of field" and "motion blur" as an image-quality-ruining "feature." Chromatic aberration is like a really shitty form of anti-aliasing, because the peasant boxes can't handle real anti-aliasing, so they use this garbage instead to muddy up the screen and hide jaggies and bad textures.

It is disingenuous to run benchmarks with this crap turned on, because no one who cares about visual fidelity would ever enable it in the first place.
 
We get that you don't consider them good. But get this: no one is going to agree that turning down settings counts as maxed out. Most people don't even agree with how you view these features.
 
I don't like motion blur and turn it off myself. I didn't pay $800 for a 144 Hz G-SYNC monitor to have the games blur themselves.
 
No HDMI 2.0 user thinks it is superior to DisplayPort. We're just being held at gunpoint by the television manufacturers into using this shitty standard. I don't know where you get off thinking just because we want HDMI 2.0 capability it means we actually like this standard.

Sums up the thread.
 
Depth of Field is a stupid feature. Your eyes do it for you naturally. Hey, guess what? When I'm aiming at some guy, those trees way in the background aren't super sharp, because that's how eyes work. If you do focus on them, they become sharp. Unless, of course, you're using Depth of Field.
 
Depth of Field is a stupid feature. Your eyes do it for you naturally. Hey, guess what? When I'm aiming at some guy, those trees way in the background aren't super sharp, because that's how eyes work. If you do focus on them, they become sharp. Unless, of course, you're using Depth of Field.

Your monitor doesn't have depth, so no, your eyes don't do it for you, lol.
 
Depth of Field is a stupid feature. Your eyes do it for you naturally. Hey, guess what? When I'm aiming at some guy, those trees way in the background aren't super sharp, because that's how eyes work. If you do focus on them, they become sharp. Unless, of course, you're using Depth of Field.

In 3D/real life space, yes. But looking at a monitor, no.
 