Wow, looks like there are a lot of users who don't know about adapters.
If you want my opinion, it's the Nvidia focus group members out in full force.
Their job is to put down AMD in every forum across the internet, so they'll post or start threads about anything negative they can find on the new cards.
Case in point: one thread about 4GB of VRAM, one thread about no DVI, one thread about no HDMI 2.0; next will be a thread about only supporting DX12 feature level 12_0.
After that it'll be power usage... etc., etc.
You are incorrect. You mean 4K gaming on TVs with horrible input lag.
They are perfectly fine for 4K gaming on monitors.
People who want to game on 4K TVs are a very, very small minority, and those people already have Nvidia cards with HDMI 2.0.
Otherwise, real gamers use monitors. It's not like Kyle or Brent are reviewing games on a 4K TV...
Holy shit, man. I am an AMD owner and have proudly purchased almost a dozen ATI products; I've been anti-Nvidia for ages because they overprice their products and push closed standards.
And even I think AMD completely screwed up by not including HDMI 2.0!
This.
I hadn't used an Nvidia product since my Diamond TNT Ultra until I got my Samsung 4K, which forced me off my Tri-X 290 and onto a couple of 970s. I would go back to AMD no problem if the Fury X came out with a huge advantage over the 980 Ti, but that's irrelevant now that AMD made that choice for me.
As I've said elsewhere, this is when AMD would benefit greatly from Samsung releasing those Freesync 4K monitors.
Are you paying attention to what people are buying? No one is buying 4K monitors (they're too small), but the 4K TV owners' threads are hundreds of pages long.
I feel like upgrading to a 40"+ 4K panel as a monitor kind of defeats the point of greater pixel density, but to each their own.
The apologists in here are hilarious. Terrible defense of AMD. AMD is marketing this thing for 4K, and the highest-end 4K panels right now happen to be TVs. I guarantee there's a large overlap between people buying 4K TVs for PC use and those who buy high-end GPUs. You say only .01% of the population is buying 4K TVs for PC, but how many people buy high-end cards to begin with? Not many, so why would you alienate part of your already small market base? I've seen more people say they're buying a 980 Ti after finding out about the lack of HDMI 2.0 than I've seen say they're buying a Fury.
Also, to the people acting like 4K performance is out of reach: it's not. Only a handful of games make 4K difficult with all settings maxed. Older and non-intensive games like CS:GO and LoL look great at 4K. HardOCP didn't even bother testing 1080p for the 980 Ti, probably because it's assumed that if you're buying a $600+ GPU, you've moved on from 2005 resolutions.
I actually got in contact with an AMD rep and have not heard back from him. I think this is a subject they just want to go away. I sent another email this morning asking them to address this issue and giving them links to the HDMI 2.0 / Samsung 4K thread along with performance numbers and costs.
So the new argument is that monitors are microscopes and should be used by ants. Come on, guys, please try harder; these nonsensical arguments are getting more and more far-fetched.
If you personally need HDMI 2.0, good for you, but everyone else on planet Earth seems to think this is a sad oversight by AMD, not nuclear bombs raining everywhere.
You're pretty much missing the point.
There is no DVI or HDMI 2.0, and there are no adapters that support 4K@60Hz... this card is basically for 1080p, as there is only 4GB of VRAM.
You have to look at the bigger picture.
A lot of people are really confused by this product. It just doesn't fit in many places. Let's face it: it's not a forward-looking, modern product as far as connectivity and frame buffer memory go. What few great attributes this card has are held back in many other areas. That's the core argument.
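For rough context on the frame buffer part of that argument, here's a back-of-the-envelope sketch (my own numbers, not anything from AMD): the framebuffer itself is tiny at 4K; the 4GB pressure comes from textures and render targets scaling with game settings.

```python
# Back-of-the-envelope VRAM math for the 4GB argument.
# Rough illustrative estimates only, not official figures.

WIDTH, HEIGHT = 3840, 2160   # 4K UHD resolution
BYTES_PER_PIXEL = 4          # 32-bit RGBA output buffer

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1024**2
print(f"One 4K framebuffer: {frame_mb:.1f} MB")      # ~31.6 MB

# Even triple-buffered, the display buffers are a rounding error
# against 4GB; it's textures and intermediate render targets that
# eat VRAM at 4K with settings maxed.
print(f"Triple-buffered: {3 * frame_mb:.1f} MB")     # ~94.9 MB
```

So the 4GB question is really about how heavily a given game leans on high-resolution assets, not about the resolution of the output buffer itself.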
So apparently AMD stated that the lack of HDMI 2.0 is due to the architecture.
Well, the wait is endless; new products will be released all the time. If you don't get a card now, you wait for Pascal. How about something new after Pascal? Wait for that card too? The cycle goes on.
First of all, you need two of either team's cards to get decent 4K gaming (above 30-40 fps).
Again, this cycle is different because not only is the node improving after 3+ years of 28nm, we are also getting new memory tech in all cards that will more than likely make every current card obsolete.
All of this is a year away, and yet the argument is that enough people will buy 4K TVs or monitors in that year, and also buy two GPUs and either CrossFire or SLI them to play at 4K, in such numbers that any company that does not support HDMI 2.0 is going to fail. That's supposedly why this HDMI 2.0 thing is so big: THE FUTURE.
Well, neither the 980 Ti nor the Fury X is in any way going to be something great in a year, and not everyone is going to buy a 4K set in the next year, even at the enthusiast level.
Keep this in mind: AMD and nVidia would prefer you buy monitors. Specifically Freesync/Gsync enabled monitors. Then they can better control your experience via drivers/features/etc.
They do not want you to use TVs.
Strawman. You don't actually need two GPUs to get 4K60. A single 980 Ti can drive it fine in all but the most demanding games, and even in those the settings can be tuned down. Hell, I've been playing some of my old favorites at 4K60 on a single 970.
Interesting theory, but the tinfoil hat is sitting a little crooked, since everything from the Titan X down to the $199 GTX 960 has both DVI and HDMI 2.0.
They can't force people to do anything. People with the money to buy these high-end cards will buy the best display possible. If that display only has HDMI 2.0, then they're going with whoever has the best card with HDMI 2.0.
Do these cards at least have HEVC and VP9 decode?
Right, but what TVs have their tech branding on them? Now which monitors do?
0 vs. !0 (none vs. plenty).
Just because they support legacy ports doesn't mean they are actively encouraging them. They have money invested in G-Sync; you don't think they want people using those monitors?
http://forums.overclockers.co.uk/showthread.php?t=18677121&page=24
Just to confirm.
The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.
In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.
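The adapter claim checks out on bandwidth grounds. Here's a quick link-rate sanity check (my own arithmetic using the published effective rates and the standard 4K60 video timing, not an AMD statement):

```python
# Back-of-the-envelope link-rate check behind the adapter claim.
# Interface rates are the published effective figures; the raster is the
# standard CTA-861 4K60 timing. Illustrative arithmetic only.

def link_rate_gbps(h_total, v_total, fps, bpp=24):
    """Pixel data rate for a video mode, including blanking intervals."""
    return h_total * v_total * fps * bpp / 1e9

# 3840x2160@60 uses a 4400x2250 total raster (active pixels + blanking)
needed = link_rate_gbps(4400, 2250, 60)
print(f"4K@60Hz at 8 bits per channel needs about {needed:.2f} Gbps")  # ~14.26

HDMI_1_4 = 8.16   # Gbps effective -> why passive DP adapters top out at 4K@30Hz
HDMI_2_0 = 14.40  # Gbps effective -> just enough for 4K@60Hz
DP_1_2A  = 17.28  # Gbps effective (HBR2) -> headroom for 4K@60Hz

print(needed > HDMI_1_4)   # True: HDMI 1.4 can't carry it
print(needed <= DP_1_2A)   # True: DP 1.2a can
```

That's why the adapter has to be *active*: the DP 1.2a link carries enough data for 4K@60Hz, but a passive dongle can only re-time the signal into HDMI 1.4 rates, so the protocol conversion to full HDMI 2.0 bandwidth needs powered conversion hardware.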