Major compatibility oversight

capt_cope

Gawd
Joined
Apr 12, 2009
Messages
948
Just looked at the latest kitchenaid stand mixer, go figure they forgot to make it compatible with the thinset/grout paddle I've got.:mad:

Are all these threads about kids with cheap korean monitors that only support DVI or "HDMI 2.0 is the ONLY connection I'll use" legit, or has nVidia discovered guerrilla marketing?

Does anyone really base the purchase of a $600-$700 computer component solely on whether or not it supports a legacy connection their $200 monitor relies on? I could almost understand the HDMI 2.0 argument since it'd be nice to upgrade to a 40" screen, but we're still talking about televisions here, not computer monitors. Different products, different audiences, different connections.
 
Just looked at the latest kitchenaid stand mixer, go figure they forgot to make it compatible with the thinset/grout paddle I've got.:mad:

Are all these threads about kids with cheap korean monitors that only support DVI or "HDMI 2.0 is the ONLY connection I'll use" legit, or has nVidia discovered guerrilla marketing?

Does anyone really base the purchase of a $600-$700 computer component solely on whether or not it supports a legacy connection their $200 monitor relies on? I could almost understand the HDMI 2.0 argument since it'd be nice to upgrade to a 40" screen, but we're still talking about televisions here, not computer monitors. Different products, different audiences, different connections.

when the monitor they just bought a year ago, two years ago tops, only has one connection (DL-DVI, ala Qnix, Catleap 1440p panels), yeah it's a big deal that this card won't fucking work w/ their monitors

how is this even a discussion, of course people would be pissed off.
 
I have an $8000 TV that has HDMI 2.0 ports, because nearly 100% of 4K TVs do not support DisplayPort; I wish they did.

I gave up four 290Xs in quad CrossFire for a 980 just because I wanted to use my PC with my TV. I would love to get three Fury X cards, so much so that I'm going to wait until Wednesday to get the final verdict on whether my $8000 TV will work with $2k worth of Fury X video cards.

So yes, I base the purchase of 2 grand worth of video cards on the 8 grand TV I've already bought.

Sorry to feed another thread.
 
I have an $8000 TV that has HDMI 2.0 ports, because nearly 100% of 4K TVs do not support DisplayPort; I wish they did.

I gave up four 290Xs in quad CrossFire for a 980 just because I wanted to use my PC with my TV. I would love to get three Fury X cards, so much so that I'm going to wait until Wednesday to get the final verdict on whether my $8000 TV will work with $2k worth of video cards.

So I base the purchase of 2 grand worth of video cards on the 8 grand TV I've already bought.

Sorry to feed another thread.

Nope, it is all good :) You are a minority, and this is an example of Fury failing for some customers.

You are correct: the only way for you to have proper 4K 60Hz gaming is with Nvidia.

Unless that HDMI 2.0 to DisplayPort adapter comes out soon, which I highly doubt, Nvidia is your only option. And in that case, AMD doesn't get your [H]ard earned cash.
 
By the time I buy a new card it will be time to upgrade my monitor, so I couldn't care less.
 
I guess an adapter is too much work for some people. Who would have known? I've been running two adapters with no issue for the last two years. And we all know AIBs are going to make custom cards with more options, but who gives a fuck, let's just make this into an issue anyway.
 
I guess an adapter is too much work for some people. Who would have known? I've been running two adapters with no issue for the last two years. And we all know AIBs are going to make custom cards with more options, but who gives a fuck, let's just make this into an issue anyway.


It's a little hard to base buying three Fury X cards on the chance of an adapter that may or may not be available anytime this year. I know about the BizLink and Mega DP 1.2 to HDMI 2.0 adapters, but there's no news on when they will release or how hard they will be to find. It's not an issue for you, but don't make it seem like it's a non-issue for others.
 
HDMI 2.0 needs to be paired with HDCP 2.2 (for 4K Blu-ray content) for future-proofing.

If it's just HDMI 2.0 without HDCP 2.2, no good.
 
Just looked at the latest kitchenaid stand mixer, go figure they forgot to make it compatible with the thinset/grout paddle I've got.:mad:

Are all these threads about kids with cheap korean monitors that only support DVI or "HDMI 2.0 is the ONLY connection I'll use" legit, or has nVidia discovered guerrilla marketing?

Does anyone really base the purchase of a $600-$700 computer component solely on whether or not it supports a legacy connection their $200 monitor relies on? I could almost understand the HDMI 2.0 argument since it'd be nice to upgrade to a 40" screen, but we're still talking about televisions here, not computer monitors. Different products, different audiences, different connections.

Get off your high horse. Just because it isn't a problem for you doesn't mean it's not a problem for a lot of people who may be interested in buying a Fury. DVI may be legacy, but it is also still very common. I assume AIBs will be able to put the connector on their custom cards, but if that doesn't happen it's definitely going to hurt AMD's sales, especially since Nvidia's products have DVI (and HDMI 2.0, for that matter).

And yes, technically there are adapters available. Unfortunately, we're talking about a $100 one here instead of a $10 one. Why even bother with that when you can get a 980 Ti that will work out of the box?
 
Just looked at the latest kitchenaid stand mixer, go figure they forgot to make it compatible with the thinset/grout paddle I've got.:mad:

Are all these threads about kids with cheap korean monitors that only support DVI or "HDMI 2.0 is the ONLY connection I'll use" legit, or has nVidia discovered guerrilla marketing?

Does anyone really base the purchase of a $600-$700 computer component solely on whether or not it supports a legacy connection their $200 monitor relies on? I could almost understand the HDMI 2.0 argument since it'd be nice to upgrade to a 40" screen, but we're still talking about televisions here, not computer monitors. Different products, different audiences, different connections.

This made my day by superbly illustrating the point.
 
when the monitor they just bought a year ago, two years ago tops, only has one connection (DL-DVI, ala Qnix, Catleap 1440p panels), yeah it's a big deal that this card won't fucking work w/ their monitors

how is this even a discussion, of course people would be pissed off.

Just keep in mind it's not AMD's fault that these people bought a cheap-ass $200 monitor with 1 connector type.
 
Get off your high horse. Just because it isn't a problem for you doesn't mean it's not a problem for a lot of people who may be interested in buying a Fury. DVI may be legacy, but it is also still very common. I assume AIBs will be able to put the connector on their custom cards, but if that doesn't happen it's definitely going to hurt AMD's sales, especially since Nvidia's products have DVI (and HDMI 2.0, for that matter).

And yes, technically there are adapters available. Unfortunately, we're talking about a $100 one here instead of a $10 one. Why even bother with that when you can get a 980 Ti that will work out of the box?

These people bought cheap (price-wise) monitors with 1 connector. Maybe they should have spent an extra hundred to get one with DP so they wouldn't have to get a $100 adapter
 
Just keep in mind it's not AMD's fault that these people bought a cheap-ass $200 monitor with 1 connector type.

Elitist and condescending snark aside, just keep in mind it's ultimately AMD losing a sale that's the net effect here. This isn't any skin off the DVI monitor owner's back - and they don't "have to buy a $100 adapter" - they'll just buy Nvidia. You guys don't seem to grasp the bigger picture here.

AMD really should've waited one more generation before throwing the DVI baby out with the bathwater, especially at a time where they really need every sale.
 
Just looked at the latest kitchenaid stand mixer, go figure they forgot to make it compatible with the thinset/grout paddle I've got.:mad:

Are all these threads about kids with cheap korean monitors that only support DVI or "HDMI 2.0 is the ONLY connection I'll use" legit, or has nVidia discovered guerrilla marketing?

Does anyone really base the purchase of a $600-$700 computer component solely on whether or not it supports a legacy connection their $200 monitor relies on? I could almost understand the HDMI 2.0 argument since it'd be nice to upgrade to a 40" screen, but we're still talking about televisions here, not computer monitors. Different products, different audiences, different connections.

What a ridiculous argument. The most popular thread in Hardforum's Displays section (which is the most popular forum section here) is about using the Samsung 4K TVs as computer monitors. A LOT of us NEED HDMI 2.0 to make that work. Less than 40 inches at 4K is a waste of time; go big or go home!
 
I guess an adapter is too much work for some people. Who would have known? I've been running two adapters with no issue for the last two years. And we all know AIBs are going to make custom cards with more options, but who gives a fuck, let's just make this into an issue anyway.

There is no adapter from HDMI 2.0 to anything else, and there probably won't be due to the updated HDCP that is part of the standard. I waited 6 months for Fury X and now I have to buy a 980 Ti because AMD has their head up their ass.
 
These people bought cheap (price-wise) monitors with 1 connector. Maybe they should have spent an extra hundred to get one with DP so they wouldn't have to get a $100 adapter

I'm afraid I also do not really understand the logic behind spending the least possible amount on a display, only to wind up with complaints about not being able to attach it to your $650 video card.

Elitist and condescending snark aside

I don't really see notarat's comment as snarky at all. There seem to be a lot of people foaming at the mouth over the lack of a DVI connection on a video card that costs triple what these people invested in a display, when it's the display that will outlast multiple gens of video card, not the other way around.

It's no different imo than people who will spend literally thousands on gear and then bitch about the fact that none of it seems to work right with the $30 power supply they nicked from a dumpster behind Best Buy.

You get what you pay for, and pissing at AMD for not being considerate enough to include a DVI connector for single-connector-type displays that happen to be DVI is pretty disingenuous.
 
Elitist and condescending snark aside, just keep in mind it's ultimately AMD losing a sale that's the net effect here. This isn't any skin off the DVI monitor owner's back - and they don't "have to buy a $100 adapter" - they'll just buy Nvidia. You guys don't seem to grasp the bigger picture here.

AMD really should've waited one more generation before throwing the DVI baby out with the bathwater, especially at a time where they really need every sale.

Yep, turning away potential customers when your market share is so low is a genius strategy. Lol at all the emotional AMD drones, getting upset that people are rejecting the Fury because they can't use it. How dare someone not desire a video card they're unable to use!
 
I'm afraid I also do not really understand the logic behind spending the least possible amount on a display, only to wind up with complaints about not being able to attach it to your $650 video card.



I don't really see notarat's comment as snarky at all. There seem to be a lot of people foaming at the mouth over the lack of a DVI connection on a video card that costs triple what these people invested in a display, when it's the display that will outlast multiple gens of video card, not the other way around.

It's no different imo than people who will spend literally thousands on gear and then bitch about the fact that none of it seems to work right with the $30 power supply they nicked from a dumpster behind Best Buy.

You get what you pay for, and pissing at AMD for not being considerate enough to include a DVI connector for single-connector-type displays that happen to be DVI is pretty disingenuous.

I bet you that 90% of the PC gaming audience still uses DVI as the primary connector type for their monitor. I'd call ignoring the vast majority of users a pretty bad idea.
 

I bet you that 90% of the PC gaming audience still uses DVI as the primary connector type for their monitor. I'd call ignoring the vast majority of users a pretty bad idea.

I actually wonder what the numbers really are. I cannot possibly imagine it's as high as 90%. I can understand people being pissed about HDMI 2.0 issues with their 4K TVs, but DVI (or the lack thereof) has been a non-issue for every display I've laid hands on for over two years. That may only be my personal experience, but when I wandered around Microcenter a couple of weekends ago, I would have been hard-pressed to find a display that didn't have DisplayPort.
 
I actually wonder what the numbers really are. I cannot possibly imagine it's as high as 90%. I can understand people being pissed about HDMI 2.0 issues with their 4K TVs, but DVI (or the lack thereof) has been a non-issue for every display I've laid hands on for over two years. That may only be my personal experience, but when I wandered around Microcenter a couple of weekends ago, I would have been hard-pressed to find a display that didn't have DisplayPort.

Because monitors aren't replaced anywhere near as frequently as most computer parts. I know five people with overclocked Korean panels because it was the only way to get 100+ Hz IPS panels at 1440p for the last two years. We didn't buy them because we're poor, we bought them because nothing else fit our needs. The average consumer is not like me; I know TONS of people with monitors that are 5+ years old and see nothing wrong with them, but now they won't be able to use them with Fury. It's short-sighted and will cost AMD customers.
 
Because monitors aren't replaced anywhere near as frequently as most computer parts. I know five people with overclocked Korean panels because it was the only way to get 100+ Hz IPS panels at 1440p for the last two years. We didn't buy them because we're poor, we bought them because nothing else fit our needs. The average consumer is not like me; I know TONS of people with monitors that are 5+ years old and see nothing wrong with them, but now they won't be able to use them with Fury. It's short-sighted and will cost AMD customers.

You make valid points I had not considered. I always think of monitors as long-term investments as well, but have consequently always tried to get ones with the best connection spectrum for their spec. I've never picked up anything that drives past 60Hz either except my TV with its 'cinemotion' type stuff.
 
Are all these threads about kids with cheap korean monitors that only support DVI or "HDMI 2.0 is the ONLY connection I'll use" legit, or has nVidia discovered guerrilla marketing?

I would definitely believe the latter; social marketing is growing by leaps and bounds. The same goes for guerrilla social marketing, which, incidentally, Nvidia has already been caught doing in Asia.

It's quite distasteful, but also much harder to pin down and trace a direct link back to Nvidia. Though, as was done with forum posters on Chiphell, not impossible.
 
I would definitely believe the latter; social marketing is growing by leaps and bounds. The same goes for guerrilla social marketing, which, incidentally, Nvidia has already been caught doing in Asia.

It's quite distasteful, but also much harder to pin down and trace a direct link back to Nvidia. Though, as was done with forum posters on Chiphell, not impossible.

You realize that most of us who are complaining only started posting in the Video Card section recently, because prior to that we were busy posting in the Displays subforum, which is the most popular on [H]. You know, the 191-page Samsung 4K TV owners thread, full of happy customers being forced to buy Nvidia video cards...

http://hardforum.com/showthread.php?t=1853884
 
You realize that most of us who are complaining only started posting in the Video Card section recently, because prior to that we were busy posting in the Displays subforum, which is the most popular on [H]. You know, the 191-page Samsung 4K TV owners thread, full of happy customers being forced to buy Nvidia video cards...

http://hardforum.com/showthread.php?t=1853884

Yeah, so that means everyone in the world is using a Samsung TV as a display for their PC. :rolleyes:

YOU ARE A MINORITY. Your big thread on [H] doesn't change that. The Sony FW900 CRT thread has over 12000 replies and over 3 million thread views. Haven't seen anyone complaining about the lack of VGA outputs on the Fury X.
 
AMD, along with a bunch of other major players, declared DVI a legacy connector 5 years ago. How much more notification do you need that relying on DVI is not a good plan?

On the other hand, the HDMI 2.0 spec was released in September 2013. Video cards do get "taped out" anywhere from six months to a year before release, and you don't just drop in a new connector with double the speed at the last minute, but by the time Fury comes out it will have been two years and they can't get HDMI 2.0 in? If that is true (and the guy who said it is double-checking on it), then I don't understand why it doesn't support it. It just doesn't make sense. I'm quite expecting to hear that the original statement was in error.

And that's the reaction from an Nvidia fanboy; check my posting history.
 
HDMI to DVI is a cheap adapter. I use one on my TV and have no issues, though I don't game on it.
 
Oh dang it, the brake pads from my 2015 Subaru Outback won't fit on my 2003 Subaru Outback. Stupid Subaru for not including backwards compatibility on my new car...

Y'all see how silly you sound now?
 
http://forums.overclockers.co.uk/showthread.php?t=18677121&page=24

Just to confirm.

The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.

In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.
 
So AMD shouldn't have to give its potential customers a way to connect to their existing video displays (something absolutely necessary to even use their product at all) yet NVIDIA are dicks for not going out of their way to make optional GameWorks features run flawlessly on AMD cards...gotcha.
 
So AMD shouldn't have to give its potential customers a way to connect to their existing video displays (something absolutely necessary to even use their product at all) yet NVIDIA are dicks for not going out of their way to make optional GameWorks features run flawlessly on AMD cards...gotcha.

Nope, not when they can sell you a $100 adapter after the fact. Why should they give it to you?
 
http://forums.overclockers.co.uk/showthread.php?t=18677121&page=24

Just to confirm.

The AMD Radeon™ Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.

In addition, Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.

The other problem is that even if they make a DP 1.2 to HDMI 2.0 adapter, it won't support HDCP 2.2, which is required for 4K Blu-ray. There's also the question of whether the adapter will work with the Oculus Rift, which uses HDMI 2.0.
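For what it's worth, a quick back-of-the-envelope check (a rough sketch only; the payload figures and 4K timing below are approximations, not spec-exact values) shows why an active DP 1.2a to HDMI 2.0 adapter is at least plausible on raw bandwidth, and why HDCP 2.2 is a separate licensing question that no bandwidth math can answer:

```python
# Rough sketch, not a spec-accurate calculation: compare the approximate payload
# bandwidth of DisplayPort 1.2 and HDMI 2.0 against what 4K@60Hz 8-bit RGB needs.
# HDCP 2.2 support is a separate, licensing-level question these numbers can't settle.

GBIT = 1e9

# Raw link rates; 8b/10b line coding leaves roughly 80% as usable payload.
dp12_payload   = 4 * 5.4 * GBIT * 0.8   # 4 lanes at HBR2 -> ~17.28 Gbit/s
hdmi20_payload = 3 * 6.0 * GBIT * 0.8   # 3 TMDS channels -> ~14.4 Gbit/s

# 4K@60Hz, 8-bit RGB, using the common CEA timing of 4400 x 2250 total pixels,
# i.e. a ~594 MHz pixel clock. Treat these figures as approximate.
pixel_clock = 4400 * 2250 * 60
needed = pixel_clock * 24               # 24 bits per pixel -> ~14.3 Gbit/s

for name, cap in [("DP 1.2 (HBR2)", dp12_payload), ("HDMI 2.0", hdmi20_payload)]:
    print(f"{name}: {cap / GBIT:.2f} Gbit/s payload, "
          f"4K60 needs ~{needed / GBIT:.2f} Gbit/s -> "
          f"{'fits' if needed <= cap else 'does not fit'}")
```

Both links have (just barely, in HDMI 2.0's case) enough payload for 4K60 RGB, which is why an active protocol-converting adapter is feasible on paper; whether it passes HDCP 2.2 is entirely up to the adapter and the handshake chain.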
 
The people saying "a real enthusiast would have DisplayPort" have no idea how silly they sound. I have been waiting months (years) for this video card, and the fact that I can't hook it up to the 30-inch Dell that I paid $800 for is retarded. I don't know anybody who has a DisplayPort input on their monitor, and I know a lot of gamers with high-end gear.

And the people that keep saying "just get an adapter" also have no idea what they're talking about. You need HDMI 2.0 in order to carry a dual-link DVI signal, so a DVI-to-HDMI adapter is out of the question if you're playing on anything over 1200p. If I had a "$200 monitor", sure, I could go buy a $5 adapter and play fine at 1080p. But requiring me to buy an adapter that will let me go DisplayPort to dual-link DVI means another $125 I gotta spend just to add lag into the system.

I can see why they didn't include DVI, but there is absolutely no reason they didn't include HDMI 2.0. The engineers have to be kicking themselves right now.

The fanboyism accusations people keep throwing around are silly too. My last 3 cards have been AMD and I have been extremely excited for this card. The plan was to buy one of them and play on my 2560x1600 for a few months, then buy at least one more and the 4K Samsung TV. Now my only option is to go Nvidia, and AMD is insane for forcing me into that decision. I love AMD as a company, but they make some really poor decisions and always find a way to ruin a product for their consumers. This is reflected in their stock price and the debt they keep piling up. Meanwhile Nvidia is in the green...

Can we please stop making threads about this? There are already too many.
 
You don't know anyone with a DP monitor...?

He's not alone; most people only buy one new monitor every 5+ years. Since LCD tech has been stagnant for so long, there really hasn't been much motivation to change. None of my friends are using DisplayPort either, and I'm in the exact same scenario as cortexodus.
 
HDMI to DVI is a cheap adapter. I use one on my TV and have no issues, though I don't game on it.

That's fine if you only need 1200p or less. Anything higher than that and you need DL-DVI. Since HDMI cannot be converted to DL-DVI, you need a DP to DL-DVI adapter. If you're fine with 60Hz, you can get one on Monoprice for $15. If you're running an OC'd Korean monitor or something, you're looking at a $100 adapter.
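To put some rough numbers behind the single-link vs. dual-link argument in the last few posts, here's a small sketch (assuming ~12% blanking overhead and nominal TMDS limits, so treat the figures as approximate rather than spec-exact):

```python
# Rough sketch, not a spec-exact calculation: estimate the pixel clock each mode
# needs (assuming ~12% blanking overhead, roughly CVT reduced blanking) and
# compare it against nominal TMDS link limits. All figures are approximate.

LINK_LIMIT_MHZ = {
    "single-link DVI / early HDMI": 165,
    "dual-link DVI":                330,  # OC'd Qnix/Catleap panels push past this, out of spec
    "HDMI 1.4 (high speed)":        340,
    "HDMI 2.0":                     600,
}

def approx_pixel_clock_mhz(h_active, v_active, refresh_hz, overhead=1.12):
    """Crude estimate: active pixels * refresh rate * blanking overhead."""
    return h_active * v_active * refresh_hz * overhead / 1e6

MODES = {
    "1080p @ 60Hz":      (1920, 1080, 60),
    "1440p @ 60Hz":      (2560, 1440, 60),
    "1440p @ 96Hz (OC)": (2560, 1440, 96),
    "4K    @ 60Hz":      (3840, 2160, 60),
}

for name, mode in MODES.items():
    clk = approx_pixel_clock_mhz(*mode)
    fits = [link for link, limit in LINK_LIMIT_MHZ.items() if clk <= limit]
    print(f"{name}: ~{clk:.0f} MHz -> {', '.join(fits) or 'exceeds all the TMDS links above'}")
```

At roughly 250 MHz, 1440p60 is already well past single-link DVI's 165 MHz ceiling, which is why the cheap passive HDMI/DVI dongles top out around 1080p/1200p, while 4K60 only fits within HDMI 2.0 (or DisplayPort).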
 