AMD has an active adapter for HDMI 2.0 debuting this summer

I've used 4K at 100% scaling at 39 and 40 inches. It's small, not ultra tiny, but small enough that I'll use browser zoom a lot. It would literally be unreadable at 27 inches without a ton of DPI scaling.


I'm reading it just fine at 28" but I did chuckle. Think some of you folks need to go to the eye doctor.
 
Even giving him the benefit of the doubt for not knowing these monitors exist, how could he possibly say that 27" is the max DP can support? Does he have a crystal ball that showed him no one building larger monitors would ever include a DP? Well, clearly he needs a new crystal ball.

We really need Dell, HP, Asus, Acer and NEC to release larger 4K monitors, in the 37-42" range.
 
Even giving him the benefit of the doubt for not knowing these monitors exist, how could he possibly say that 27" is the max DP can support? Does he have a crystal ball that showed him no one building larger monitors would ever include a DP? Well, clearly he needs a new crystal ball.
I know that DP can go bigger than 27 inches; I should have worded that better. What I meant is that when you look for a 4K DP monitor, 95% of them are 27 inches. Yeah, there are 32-inch models, but they cost a lot of money when you can get a bigger TV for less. Also, if I'm going to drop $1000+ on a monitor, I'm looking for more than a 2-inch increase in screen size, and there is no way in hell I could "upgrade" to a smaller screen.
 
It's not just the DPI. Try reading text at 4K on a 27". You must all have Superman vision. Try any text-heavy games. DPI won't save you at 100% font size.
 
Come on, this logic is crazy. I have a 5-inch smartphone with a 2560x1440 screen (538 PPI). Is that for ants too, or do the guys who only use a TV for gaming not use smartphones and instead carry their 4K TVs around? :D
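Side note for anyone who wants to sanity-check the density numbers being thrown around: PPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch in plain Python (the screen sizes below are only example figures, not anyone's exact hardware):

Code:
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch = diagonal resolution / diagonal size in inches.
    return math.hypot(width_px, height_px) / diagonal_in

# Example figures only -- plug in your own screens.
for name, w, h, d in [('1440p phone', 2560, 1440, 5.5),
                      ('27" 4K monitor', 3840, 2160, 27),
                      ('40" 4K TV', 3840, 2160, 40)]:
    print(f'{name}: {ppi(w, h, d):.0f} PPI')

The point being that density and usable text size are different arguments: the phone wins on PPI by miles, but nobody runs it at 100% desktop scaling at arm's length.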

I have a PC connected to my TV (very low input lag, according to reviews), and the input lag compared to my 1ms monitor is very noticeable in some games. So there is in fact a difference there, and anyone who has experienced it can confirm this.

Now, the 4GB thing everyone has been pointing out does have some merit, but we still need to see the reviews to learn how much of a limit 4GB actually is in games for the Fury X.
 
It's not just the DPI. Try reading text at 4K on a 27". You must all have Superman vision. Try any text-heavy games. DPI won't save you at 100% font size.
I have, and it's not even slightly an issue.
This perfectly illustrates the futility of arguing over the optimal screen size at 4K. It's clearly a matter of preference.
More on topic...
I know the BizLink adapter has been discussed. Maybe we can bombard them with inquiries?
http://www.bizlinktech.com/industries/list-sub.aspx?Type=100&CId=1
I sent them an email about availability ([email protected], [email protected], [email protected]). I don't really expect to hear back, but it's worth a shot.
 
I feel like I'm the only person who wants a smaller, maybe 30-32", 4K monitor. I primarily use my monitor for gaming, and most games have UI scaling, so to me pixel density is most important.
 
I feel like I'm the only person who wants a smaller, maybe 30-32", 4K monitor. I primarily use my monitor for gaming, and most games have UI scaling, so to me pixel density is most important.

I am with you. 32" monitors are the best; it's a shame most of the 4K ones are $1000+.
 
What I really want is a 21:9 5120x2160 34-40" monitor with FreeSync. Who's going to release it first?

Oh, and with DisplayPort...
 
What I really want is a 21:9 5120x2160 34-40" monitor with FreeSync. Who's going to release it first?

Oh, and with DisplayPort...

You can run custom 21:9 resolutions on 40-48" TVs/monitors.
It will be a while before you'll see 21:9 in 5K. For me to buy one, it will have to be at least 40".
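For what it's worth, the math on a custom 21:9 mode is simple: keep the panel's native width and shrink the height to the ultrawide ratio (the "21:9" that 5120x2160 and 3440x1440 actually use is 64:27). A rough sketch in plain Python, raw pixel rates only (blanking overhead not included):

Code:
# Height of a "21:9" (really 64:27) mode at a given panel width, plus the
# raw pixel data rate at 60 Hz / 24 bpp. Blanking overhead is not included.
def ultrawide_height(width, ratio=64 / 27):
    h = round(width / ratio)
    return h - (h % 2)  # keep an even number of lines

for width in (5120, 3840):
    h = ultrawide_height(width)
    gbps = width * h * 60 * 24 / 1e9
    print(f'{width}x{h}: {width * h / 1e6:.1f} MP, ~{gbps:.1f} Gbit/s raw at 60 Hz')

So a custom 3840x1620 mode on a 40-48" 4K set gives you the 21:9 shape today, comfortably within DP 1.2, while the 5120x2160 panel everyone wants is nearly double the pixels.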
 
It would have been easier if AMD had just implemented HDMI 2.0 on their cards... was that so hard, AMD? Instead we have to use an adapter, which may cause problems.
 
The lie is that you need HDMI 2.0 on the card, when adapters solve that issue and let you use HDMI 2.0 displays without a problem.
Acting like you can never use your monitor or TV because of this is horseshit.
I want my card to have the most advanced video port around, and DP is that port. Adapters have made non-issues out of any incompatibility with DP.
But let's talk about it again and pretend this isn't a bunch of paid forum shills doing their thing.
 
I buy what performs best at any given time. I was set on a GTX 980 Ti and have kept recommending that card to friends as time passes.
The most recent build I completed had a 960 in it.
I do like AMD, but I'm not some blind fanboy, and I'm not being paid to promote anything.
I also think not adding HDMI 2.0 from the get-go is stupid, but for anyone to act like it's a deal breaker...
Just wow. Fucking cable adapters are coming, and they make this "issue" pointless and obvious marketing BS.
And for you to talk is just laughable.
 
I can't wait for the fanboy response to this. Instead of including the tech nVidia already provides right out of the box, we're going to sell something separate.

It was the plan from the beginning. Team green for yet another generation. Nice shot in the foot, AMD.

HDMI just needs to die. What does it offer over DP anyway?
 
Anyone else find it strange that, with the amount of backlash AMD has gotten over this, people don't direct some of it equally at the TV manufacturers?
 
Yup, HDMI is just a proprietary format to collect royalties, otherwise known as the "HDMI tax". It offers absolutely no value over the free-to-adopt VESA DisplayPort.

The HDMI founders include heavy corporate hitters: Hitachi, Sanyo, Philips, Toshiba, Sony, Silicon Image, and Technicolor.
 
Anyone else find it strange that, with the amount of backlash AMD has gotten over this, people don't direct some of it equally at the TV manufacturers?

AMD has been pushing DisplayPort [H]ard for years, dating back to the 6x-DisplayPort-only Eyefinity 5870. It's nothing new.

Yes, the backlash is highly suspicious, as to why it's happening now. I can only figure it's guerrilla-type counter-marketing.
 
Yup, HDMI is just a proprietary format to collect royalties, otherwise known as the "HDMI tax". It offers absolutely no value over the free-to-adopt VESA DisplayPort.

The HDMI founders include heavy corporate hitters: Hitachi, Sanyo, Philips, Toshiba, Sony, Silicon Image, and Technicolor.

Reminds me of Adobe Flash (pay to use) vs. HTML5.

But anyway, HDMI is important in this situation for a few reasons:

People use stereo receivers that don't accept DisplayPort,
so if they're gaming on a 1080p HDTV they're fine... though they've wasted money on an overpowered card for that situation.

However, 4K/60Hz over HDMI (which almost all 4K HDTVs rely on) requires HDMI 2.0.
I'm aware of the Panasonic 4K HDTV that has a DisplayPort input, but reports are that the DP connection is buggy.

"So just use a 4K LCD monitor w/ DisplayPort!"
Almost all 4K LCD monitors are TN panels; the HDTV counterparts are usually IPS or VA panels, which are preferred for their black levels and wider viewing angles.

DP to HDMI 2.0 adapter? No; active adapters induce input lag and are expensive (and buggy).

1440p/120+Hz panels, which have been very common for the past couple of years, require a DL-DVI (dual-link) connection. HDMI 1.4a (Fury X) is not physically compatible with DL-DVI (an adapter does nothing), as HDMI is only pin-compatible with single-link DVI, which tops out around 1920x1200 at 60Hz.
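To put rough numbers on the bandwidth side of this (back-of-the-envelope only, using the standard CTA 3840x2160@60 timing; exact needs vary with blanking, bit depth, and chroma):

Code:
# Usable video data rate of each link vs. what 4K/60 at 8 bpc 4:4:4 needs.
# Standard 4K60 timing: 4400x2250 total (incl. blanking) at 60 Hz = 594 MHz pixel clock.
required = 4400 * 2250 * 60 * 24 / 1e9  # ~14.3 Gbit/s of video data

links = {
    'HDMI 1.4 (340 MHz TMDS, 8b/10b)': 340e6 * 3 * 8 / 1e9,       # ~8.2 Gbit/s
    'HDMI 2.0 (600 MHz TMDS, 8b/10b)': 600e6 * 3 * 8 / 1e9,       # ~14.4 Gbit/s
    'DP 1.2 (4 lanes HBR2, 8b/10b)':   4 * 5.4e9 * 8 / 10 / 1e9,  # ~17.3 Gbit/s
}

print(f'4K/60 at 8 bpc 4:4:4 needs ~{required:.1f} Gbit/s')
for name, usable in links.items():
    verdict = 'enough' if usable >= required else 'not enough'
    print(f'{name}: ~{usable:.1f} Gbit/s -> {verdict}')

Which is the whole story in three lines of output: HDMI 1.4 tops out well short of 4K/60 at full chroma, HDMI 2.0 just clears it, and DP 1.2 has been able to do it for years.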
 
HDMI just needs to die. What does it offer over DP anyway?

Copy Protection!!!



I don't want to go OT but since we have some smart people in the thread... I was wondering if anyone knows how much latency / lag a KVM switch would introduce (to either the display or K+M)?
 
HDMI just needs to die. What does it offer over DP anyway?

Want an Oculus Rift? Yeah, HDMI. Want to use an HTPC? Yeah, HDMI. Want to connect your computer to a projector? Yeah, HDMI. Want to connect your computer to a real home theater surround system? Yeah, HDMI. Blu-ray? HDMI.

The question isn't what HDMI can do; the question is why bother with other standards. I wish the new video cards had two HDMI ports: one for audio to my receiver (which is far more capable than a shitty sound card) and another with HDMI 2.0 for 4K adult-sized screens (40 inches+).
 
I can't wait for the fanboy response to this. Instead of including the tech nVidia already provides right out of the box, we're going to sell something separate.

It was the plan from the beginning. Team green for yet another generation. Nice shot in the foot, AMD.

Nvidia still doesn't support adaptive sync, which is FAR more important for PC gaming than HDMI 2.0 on HDTVs. We game on monitors.

The vast majority of people using the Fury X won't be using a TV. They'll be using a fancy 144Hz 1440p or 4K monitor, possibly with FreeSync. The wannabe console gamers will be disrespecting their PCs with a TV.
 
adult sized screens
?
I'm pretty sure 'adults' use 4K for work more than play. Who in their right fucking mind puts 3+ 4K screens on a desk that's used 75%+ for work? I know I'd rather not wrench my damned neck looking all over the place trying to read, write, and work in vCenter.
40"+ 4k screens are fine, just don't tell me how to use my hardware. It's a damned preference.
More on point, who else has contacted these guys with questions on functionality in an effort to build their own? http://www.displayport.org/faq/ask-displayport/
 
Anyone else find it strange that, with the amount of backlash AMD has gotten over this, people don't direct some of it equally at the TV manufacturers?

Does DP support HDCP? If not, it will never replace HDMI.

Looks like DP 1.3 supports HDCP 2.2, so I'm going with royalties and established infrastructure as what's keeping HDMI relevant.
 
I don't think the adapter release timing matters, because who is going to spend $750-$775 for a Fury X 4GB plus adapter vs. just $650 for a 980 Ti 6GB with similar performance and no adapter headaches?

The adapter is a band-aid on a broken arm, because it raises the total cost unreasonably.

While yes, it's a band-aid for those who happen to own TVs that do 4K @ 60Hz, it's also a cost savings for AMD and for the 90% of other consumers who don't have a 4K 60Hz TV and give zero shits about HDMI 2.0. Do you think they'd just put this on a card and happily pay the bullshit royalty fees without passing the cost on to the customer?

The move to HDMI 2.0 and how the licensing is set up is a friggin' joke. You can't just get a license to use HDMI 2.0; because HDMI 2.0 is backwards compatible with 1.4 and older, you also have to pay for the license to use HDMI 1.4 as well. So you're paying double the royalty fees, with two completely different licenses, just so a small percentage of your customers will be happy. If they had put it on the card, you'd be paying that unreasonable cost anyway, and so would everyone else who doesn't care whether the card has it or not.
 
While yes, it's a band-aid for those who happen to own TVs that do 4K @ 60Hz, it's also a cost savings for AMD and for the 90% of other consumers who don't have a 4K 60Hz TV and give zero shits about HDMI 2.0. Do you think they'd just put this on a card and happily pay the bullshit royalty fees without passing the cost on to the customer?

The move to HDMI 2.0 and how the licensing is set up is a friggin' joke. You can't just get a license to use HDMI 2.0; because HDMI 2.0 is backwards compatible with 1.4 and older, you also have to pay for the license to use HDMI 1.4 as well. So you're paying double the royalty fees, with two completely different licenses, just so a small percentage of your customers will be happy. If they had put it on the card, you'd be paying that unreasonable cost anyway, and so would everyone else who doesn't care whether the card has it or not.

It has nothing to do with a price increase due to royalties. We haven't heard Nvidia talking about HDMI 2.0 royalties or charging us extra for it.
This Fury chip has been sitting on AMD's shelves, ready for release, since last year. While waiting for their precious HBM, their product became dated.

You're giving AMD PR without billing them.
 
Want an Oculus Rift? Yeah, HDMI. Want to use an HTPC? Yeah, HDMI. Want to connect your computer to a projector? Yeah, HDMI. Want to connect your computer to a real home theater surround system? Yeah, HDMI. Blu-ray? HDMI.

The question isn't what HDMI can do; the question is why bother with other standards. I wish the new video cards had two HDMI ports: one for audio to my receiver (which is far more capable than a shitty sound card) and another with HDMI 2.0 for 4K adult-sized screens (40 inches+).

It would be better if those devices had DP inputs. How many of the devices you've listed use HDMI 2.0? :rolleyes:

Why should a company pay to develop something that DP already offers? We'll have DP 1.3 soon, and HDMI 2.0 will still be offering us what DP 1.2 offered ages ago.
 
Want an Oculus Rift? Yeah, HDMI. Want to use an HTPC? Yeah, HDMI. Want to connect your computer to a projector? Yeah, HDMI. Want to connect your computer to a real home theater surround system? Yeah, HDMI. Blu-ray? HDMI.

The question isn't what HDMI can do; the question is why bother with other standards. I wish the new video cards had two HDMI ports: one for audio to my receiver (which is far more capable than a shitty sound card) and another with HDMI 2.0 for 4K adult-sized screens (40 inches+).

Yep. HDMI is the de facto standard for A/V equipment. The availability of HDMI ports does matter to me when I look at cards. YMMV.
 
And how much is that adapter going to cost on top of the Fury X price tag??? Definitely going to end up far more than a good GTX 980 Ti, which already includes all of that.
 
OK, context... For those who have to use HDMI 2.0, it looks like you'll have to buy NVIDIA. For those with DL-DVI, if it doesn't show up on Fury, you're also stuck with NVIDIA. The rest of us have choices. And I have to add, there's quite a group that has been asking AMD for multiple DP outputs on a GPU for a few years. They're real happy now.
 
Why do people have to buy Nvidia when adapters make it a non-issue?
My only beef with this is how quickly these topics make that leap of logic.
What makes it a 'have to' situation when there is a solution to the problem that lets you get 4:4:4 chroma and 4K at 60Hz with the Fury?

If the performance weren't there at 4K, I would understand someone saying go Nvidia, since there'd be no solution to fix that. This is a fucking display connector, though, with an obvious solution.
 