Why would AMD release a 4K video card without HDMI 2.0 and with 4 GB of VRAM?

The 0.01% are the rich guys who will give you their money for extreme cards.
 
The apologists in here are hilarious. Terrible defense of AMD. AMD is marketing this thing for 4K, and the highest-end 4K panels right now happen to be TVs. I guarantee there's a large overlap between people buying 4K TVs for PC use and those who buy high-end GPUs. You say only .01% of the population is buying 4K TVs for PC, but how many people buy high-end cards to begin with? Not many, so why would you alienate part of your already small market base? I've seen more people buying, or saying they're buying, a 980 Ti after finding out about the lack of 2.0 than I've seen state that they're buying a Fury.

Also, to the people acting like 4K performance is out of reach: it's not. Only a handful of games make 4K difficult with all settings maxed. Older and non-intensive games like CS:GO and LoL look great at 4K. HardOCP didn't even bother testing 1080p for the 980 Ti, probably because it's assumed that if you're buying a $600+ GPU, you've probably moved on from 2005 resolutions.
 
I too am very disappointed with AMD's decision to exclude HDMI 2.0 support on their new cards. It just doesn't make sense to me. I mean, why? If NVIDIA and other manufacturers are implementing HDMI 2.0 on their products, I don't see why AMD couldn't. I guess they don't want to cater to 4K TV users who prefer a bigger screen, and most 4K TVs lack a DisplayPort, so HDMI 2.0 is the only way to get 60Hz with 4:4:4. And if they insist on pushing DisplayPort, why aren't they using DisplayPort 1.3 instead of sticking with legacy DP 1.2 (from 2009, I believe) alongside the inferior HDMI 1.4a, which is already old and outdated? Great move, AMD; this will only decrease your sales.
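For anyone wondering why the port version matters so much, here is a rough back-of-the-envelope check (my own numbers, all approximate: 8-bit colour, full 4:4:4 chroma, 8b/10b link encoding; blanking overhead pushes the real requirement a bit higher):

```python
# Rough bandwidth math for 4K @ 60 Hz with full 4:4:4 chroma at 8 bits per channel.
# Active-pixel payload only; blanking intervals add extra overhead on top of this.

def active_data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Data rate of the visible pixels in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

needed = active_data_rate_gbps(3840, 2160, 60)  # ~11.9 Gbit/s before blanking

# Usable payload after 8b/10b encoding (raw link rate * 0.8), all approximate.
links = {
    "HDMI 1.4  (10.2 Gbit/s TMDS)": 10.2 * 0.8,  # ~8.2 Gbit/s  -> 4K tops out at 30 Hz 4:4:4
    "HDMI 2.0  (18.0 Gbit/s TMDS)": 18.0 * 0.8,  # ~14.4 Gbit/s -> 4K60 4:4:4 fits
    "DP 1.2    (21.6 Gbit/s HBR2)": 21.6 * 0.8,  # ~17.3 Gbit/s -> 4K60 4:4:4 fits
}

print(f"4K60 4:4:4 needs roughly {needed:.1f} Gbit/s of active pixel data")
for name, usable in links.items():
    verdict = "enough" if usable > needed else "not enough"
    print(f"{name}: ~{usable:.1f} Gbit/s usable -> {verdict}")
```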
 
If you want my opinion, it's the Nvidia focus group members out in full force.

Their job is to put down AMD in all forums across the internet, so anything negative about the new cards, they will post about or start threads about.

Case in point: one thread about 4GB of VRAM, one thread about no DVI, one thread about no HDMI 2.0; next will be a thread about only supporting DX12 feature level 12_0.

Next will be power usage...etc etc etc etc etc

Holy shit man, I am an AMD owner and have proudly purchased almost a dozen ATI products, I've been anti-Nvidia for ages because they overprice their products and push closed standards.

AMD IS FUCKING RETARDED for not including HDMI 2.0!
 
You are incorrect. You mean 4K gaming on TVs with horrible input lag.

They are perfectly fine for 4k gaming on monitors.

People who want to game on 4K TVs are a very, very, very small minority, and those people already have Nvidia cards with HDMI 2.0.

Otherwise, real gamers use monitors. It's not like Kyle or Brent are reviewing games on a 4K TV...

Are you paying attention to what people are buying? No one is buying 4K monitors; they're too small. But the 4K TV owners' threads are hundreds of pages long.
 
Holy shit man, I am an AMD owner and have proudly purchased almost a dozen ATI products, I've been anti-Nvidia for ages because they overprice their products and push closed standards.

AMD IS FUCKING RETARDED for not including HDMI 2.0!


This.

I hadn't used an Nvidia product since my Diamond TNT Ultra until I got my Samsung 4K, which forced me off of my Tri-X 290 and into a couple of 970s. I would go back to AMD no problem if Fury X came out with a huge advantage over the Ti, but that's irrelevant now that AMD made that choice for me.
 
This.

I hadn't used an Nvidia product since my Diamond TNT Ultra until I got my Samsung 4K, which forced me off of my Tri-X 290 and into a couple of 970s. I would go back to AMD no problem if Fury X came out with a huge advantage over the Ti, but that's irrelevant now that AMD made that choice for me.

TDSlam is the perfect example of the kind of customer AMD can't afford to lose. This is someone who isn't buying just one AMD card; he BOUGHT 3! Then he sold them to buy 970s because he didn't have a choice.
 
As I've said elsewhere, this is when AMD would benefit greatly from Samsung releasing those Freesync 4K monitors.
 
As I've said elsewhere, this is when AMD would benefit greatly from Samsung releasing those Freesync 4K monitors.

I saw a reviewer receive a few of them. No results yet, though.
The issue is that the sizes are limited to 24-31.5".

Most of us who have 4K sets are using 40-48" panels and will not downgrade to anything smaller just for FreeSync, G-Sync, 120Hz, etc.
HDMI 2.0 gives us access to large 4K panels.
 
Are you paying attention to what people are buying? No one is buying 4K monitors; they're too small. But the 4K TV owners' threads are hundreds of pages long.

There's a CRT thread with over 3 million views and 12000 responses. Your point?
 
I feel like upgrading to a 40"+ 4K monitor kind of defeats the point of greater pixel density, but to each their own.
 
So apparently AMD has stated that the lack of HDMI 2.0 is due to the architecture, so AIB partners will not be able to add it even if they wanted to. Fury X2? No HDMI 2.0 either. HDMI 2.0 from AMD will have to wait until next year at the earliest. Shame.

Yeah, I'm a little salty. I was looking forward to having a choice. Now my choice is Nvidia or Nvidia.
 
I feel like upgrading to a 40"+ 4K monitor kind of defeats the point of greater pixel density, but to each their own.

Increasing the scaling to 200% on smaller panels defeats the purpose as well.
At 40", you get the clarity of a 110ppi monitor at 100% scaling. Before 4K, anything this size was blurry, or the image seemed zoomed in with large icons.
 
I also had to sell my 290Xs to move to HDMI 2.0 with the 970s in SLI.

AMD is out of the loop BIG TIME.

You really honestly have to ask who these cards are for. To me, they are strictly 1080p cards. Clearly, 4K is just marketing to them and an afterthought.

And now, a new card with no HDMI 2.0?

Sorry, I just will never game on a microscopic 27" 4K panel... this is absolute retardation.
 
The apologists in here are hilarious. Terrible defense of AMD. AMD is marketing this thing for 4K, and the highest-end 4K panels right now happen to be TVs. I guarantee there's a large overlap between people buying 4K TVs for PC use and those who buy high-end GPUs. You say only .01% of the population is buying 4K TVs for PC, but how many people buy high-end cards to begin with? Not many, so why would you alienate part of your already small market base? I've seen more people buying, or saying they're buying, a 980 Ti after finding out about the lack of 2.0 than I've seen state that they're buying a Fury.

Also, to the people acting like 4K performance is out of reach: it's not. Only a handful of games make 4K difficult with all settings maxed. Older and non-intensive games like CS:GO and LoL look great at 4K. HardOCP didn't even bother testing 1080p for the 980 Ti, probably because it's assumed that if you're buying a $600+ GPU, you've probably moved on from 2005 resolutions.

Very sobering, with some excellent points. And even IF the number of people buying 4K60 TVs were just a small percentage, this chip is AMD's platform for the next 2-3 years, and the number of people buying those TVs will only increase year over year. Plus you've got the VR headsets coming out that will require HDMI, and Valve is about to break into the living-room console space at the end of the year with Steam Machines, which means even more gaming PCs connected to living-room TVs.

And even if someone doesn't have an HDMI 2.0 display just yet, you want your GPU to be future-proof, especially when you're sinking $650 into it and the competition has a card that does HDMI 2.0 for less than $200.

The condescending attitude from some, "just get a DisplayPort monitor, peasant," is a little out of touch with trends and reality here. All you're doing is alienating potential customers.
 
So the new argument is that monitors are microscopes and should be used by ants. Come on, guys, please try harder; these nonsensical arguments are getting more and more far-fetched.
If you personally need HDMI 2.0, that's fine for you; everyone else on planet Earth seems to think this is a sad oversight by AMD, not nuclear bombs raining everywhere.
 
I actually got in contact with an AMD rep and have not heard back from him. I think this is a subject they just want to go away. I sent another email this morning asking them to address this issue and giving them links to the HDMI 2.0 / Samsung 4K thread, along with performance numbers and cost.
 
I actually got in contact with an AMD rep and have not heard back from him. I think this is a subject they just want to go away. I sent another email this morning asking them to address this issue and giving them links to the HDMI 2.0 / Samsung 4K thread, along with performance numbers and cost.

Good to have the Nvidia heralds checking this out for us. I kid, I kid.
Seriously, people have been informed now and can make a decision based on their needs. Was that not the purpose? To let people know so they could make informed decisions?
 
So the new argument is that monitors are microscopes and should be used by ants. Come on, guys, please try harder; these nonsensical arguments are getting more and more far-fetched.
If you personally need HDMI 2.0, that's fine for you; everyone else on planet Earth seems to think this is a sad oversight by AMD, not nuclear bombs raining everywhere.

You're pretty much missing the point.

There is no DVI or HDMI 2.0, and there are no adapters that support 4K@60Hz... this card is basically for 1080p, as there is only 4GB of RAM.

You have to look at the bigger picture.

A lot of people are really confused by this product. It just doesn't fit in many places. Let's face it, it's not a forward-looking, modern product as far as connectivity and frame buffer memory go. What few great attributes this card has are held back in many other areas. That's the core argument.
 
You're pretty much missing the point.

There is no DVI or HDMI 2.0, and there are no adapters that support 4K@60Hz... this card is basically for 1080p, as there is only 4GB of RAM.

You have to look at the bigger picture.

A lot of people are really confused by this product. It just doesn't fit in many places. Let's face it, it's not a forward-looking, modern product as far as connectivity and frame buffer memory go. What few great attributes this card has are held back in many other areas. That's the core argument.

I am looking at the bigger picture...
First of all, you need two of either team's cards to get decent 4K gaming (above 30-40 fps), and even then it is not smooth gameplay without turning several settings down.
SLI and CrossFire suck.
Why is everyone getting so riled up, then? Was everyone planning to buy two 980 Tis or two Fury Xs?
Everyone that bought a 980 or a 970 wasted their money, because 4GB is not good enough for the future.

The only place this product fits is in the PCs of people who were interested in buying it and, after looking at what it offered, decided to buy it or not.

Now what was that about the future? The new HBM Pascal is coming out in about a year, so why is everyone buying cards now that will be really outdated, when we are not only getting a node change next year but new memory across the board?

Some things to think about.
 
As there is no more room on the interposer for more memory stacks (and more stacks would also mean more lanes), the only way to go is increasing the memory per stack.

I would say getting 8GB of stacked memory had much lower than expected yields.

I agree HDMI 1.4 is a disappointment. DP will handle it.

To answer your question: according to AMD's PR, there's a whole family coming out based on the new Fury architecture, so you will likely see a refresh that fixes both issues, possibly in fall or spring next year.
 
So apparently AMD has stated that the lack of HDMI 2.0 is due to the architecture.

That's a surprising explanation. I wouldn't think something like that would go all the way down to the architecture. I just figured it was an interface thing.

Do these cards at least have HEVC and VP9 decode?
 
Well, the cycle is endless; new products will be released all the time... If you don't get a card now, you wait for Pascal. How about something new after Pascal? Wait for that card? The cycle goes on...

Again, this cycle is different, because not only is the node improving after 3+ years of 28nm, we are also getting new memory tech in all cards, which will more than likely make every current card really obsolete.
All of this is a year away, and yet the argument is that enough people will buy 4K TVs or monitors in that year, and also buy two GPUs and either CrossFire or SLI them to play at 4K, in such numbers that any company that does not support HDMI 2.0 is going to fail. That is why this HDMI 2.0 thing is so big: THE FUTURE.
Well, neither the 980 Ti nor the Fury X is in any way going to be something great in a year, and not everyone is going to buy a 4K set, even at the enthusiast level, in the next year.
 
I am curious to see whether, with the 4096-bit memory bus along with the 4GB of HBM, the high/ultra textures will not be so bad at 4K when being swapped in and out.
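For what it's worth, the raw bandwidth on that bus works out roughly like this; a sketch that assumes the reported 500 MHz HBM clock:

```python
# Back-of-the-envelope peak memory bandwidth for a 4096-bit HBM bus.
bus_width_bits = 4096
clock_hz = 500e6          # reported HBM clock; treat this as an assumption
transfers_per_clock = 2   # double data rate

peak_bytes_per_s = bus_width_bits / 8 * clock_hz * transfers_per_clock
print(f"Peak bandwidth: ~{peak_bytes_per_s / 1e9:.0f} GB/s")  # ~512 GB/s
# For comparison, a 384-bit GDDR5 bus at 7 Gbit/s per pin works out to ~336 GB/s.
```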
 
First of all, you need two of either team's cards to get decent 4K gaming (above 30-40 fps)

The preliminary benchmarks show Fury X capable of >30 fps for newer games at high quality settings. It's kind of being positioned as a good 4K card; you seem to be arguing against that.

If it's not for 4K then maybe we'll just focus on the lower-resolution comparison benchmarks.
 
Again, this cycle is different, because not only is the node improving after 3+ years of 28nm, we are also getting new memory tech in all cards, which will more than likely make every current card really obsolete.
All of this is a year away, and yet the argument is that enough people will buy 4K TVs or monitors in that year, and also buy two GPUs and either CrossFire or SLI them to play at 4K, in such numbers that any company that does not support HDMI 2.0 is going to fail. That is why this HDMI 2.0 thing is so big: THE FUTURE.
Well, neither the 980 Ti nor the Fury X is in any way going to be something great in a year, and not everyone is going to buy a 4K set, even at the enthusiast level, in the next year.

Strawman. You don't actually need two GPUs to get 4K60. A single 980 Ti can drive it fine in all but the most demanding games, and even in the most demanding ones the settings can be tuned down. Hell, I've been playing some of my old favorites at 4K60 on a single 970.
 
Keep this in mind: AMD and nVidia would prefer you buy monitors. Specifically Freesync/Gsync enabled monitors. Then they can better control your experience via drivers/features/etc.

They do not want you to use TVs.
 
Keep this in mind: AMD and nVidia would prefer you buy monitors. Specifically Freesync/Gsync enabled monitors. Then they can better control your experience via drivers/features/etc.

They do not want you to use TVs.

Interesting theory, but the tinfoil hat is sitting a little crooked, since everything from the Titan X down to the $199 GTX 960 has both DVI and HDMI 2.0.
 
Strawman. You don't actually need two GPUs to get 4K60. A single 980 Ti can drive it fine in all but the most demanding games, and even in the most demanding ones the settings can be tuned down. Hell, I've been playing some of my old favorites at 4K60 on a single 970.

No, a 980 Ti cannot do that, unless you are playing at medium settings or something to that effect (in modern games). I do however agree that older games will run at 4K 60Hz; I was considering playing Fallout 3 again, and there's no better way to play than at 4K 60Hz.
 
Interesting theory, but the tinfoil hat is sitting a little crooked, since everything from the Titan X down to the $199 GTX 960 has both DVI and HDMI 2.0.

Right, but what TVs have their tech branding on them? Now, which monitors do?

0 vs. !0

Just because they support legacy ports doesn't mean they are actively encouraging them. They have money invested in G-Sync; you don't think they want people using those monitors?
 
Keep this in mind: AMD and nVidia would prefer you buy monitors. Specifically Freesync/Gsync enabled monitors. Then they can better control your experience via drivers/features/etc.

They do not want you to use TVs.

They can't force people to do anything. People with the money to buy these high-end cards will buy the best display possible. If that display only has HDMI 2.0, then they're going with whoever has the best card with HDMI 2.0.
 
They can't force people to do anything. People with the money to buy these high-end cards will buy the best display possible. If that display only has HDMI 2.0, then they're going with whoever has the best card with HDMI 2.0.

PREFER. Not force.

Limiting variables allows for fewer troubleshooting problems.

They would both prefer everyone use monitors. Hence the Samsung FreeSync monitors that were announced, and the continued rollout of new G-Sync monitors. How many TVs?
 
Do these cards at least have HEVC and VP9 decode?

I'd bet against it, tbh; I reckon that Fury is nothing but a big Tonga with HBM bolted on (to save money on investing in a new architecture before 14/16nm hits).
 
Right, but what TVs have their tech branding on them? Now, which monitors do?

0 vs. !0

Just because they support legacy ports doesn't mean they are actively encouraging them. They have money invested in G-Sync; you don't think they want people using those monitors?

I really don't understand your point.
Just because they have a proprietary feature like G-Sync, you think Nvidia will only recommend G-Sync enabled monitors?

Only a stupid company would do that.
Living room PC gaming is growing, and Nvidia knows that.
 
http://forums.overclockers.co.uk/showthread.php?t=18677121&page=24
Just to confirm.

The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.

In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.
 
http://forums.overclockers.co.uk/showthread.php?t=18677121&page=24
Just to confirm.

The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.

In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.

Most of us have never seen a good DP-to-HDMI adapter.
They're basically telling the customer to spend an extra $50-100 to get their product to work as desired.

So the price is really $749 for 4K TV users, and that's if the adapter works.

I would like to know if they will have a list of recommended vendors whose adapters are certified to work.
I think that's the least they can do.
 
http://forums.overclockers.co.uk/showthread.php?t=18677121&page=24
Just to confirm.

The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.

In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.

Active. Even if it works, I wonder how many frames of lag this introduces.
 
There's no proof those adapters will come out this summer. Also, what I saw from that one manufacturer (Bizlink) said that they planned to release theirs in Q4 2015 (I can't find where).

Those active adapters usually cost at least $100 and introduce problems like lag.
 