AMD Fury X... Wait, no DVI?!

Ahhh, so it's OK to do away with one type of legacy port but not the other? :rolleyes:

Again, there are a lot of enthusiasts who use the Korean 1440p panels, which only have a DVI port. Hell, go find the hundreds upon hundreds of pages on OCN. AMD is basically burning us all to do away with one kind of "legacy" while keeping another kind of "legacy".

HDMI is not legacy, DVI is.

4K HDMI 2.0 TVs are a tiny, tiny, tiny market compared to 1080p.

If you want 4K, use DP or buy an active adapter.
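
For what it's worth, the bandwidth math backs this up. Here's a quick sanity check (a rough sketch in Python, using the published CEA-861 timing for the standard 4K60 mode and each spec's maximum TMDS clock):

```python
# Rough check: can a TMDS link carry 3840x2160 at 60Hz with 8-bit 4:4:4?
# 4400 x 2250 is the CEA-861 total (active + blanking) for the standard 4K60 mode.
H_TOTAL, V_TOTAL = 4400, 2250
REFRESH = 60  # Hz

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH / 1e6
print(f"4K60 pixel clock: {pixel_clock_mhz:.0f} MHz")  # -> 594 MHz

# Spec'd maximum TMDS clocks for each HDMI revision
for name, limit_mhz in {"HDMI 1.4": 340, "HDMI 2.0": 600}.items():
    verdict = "fits" if pixel_clock_mhz <= limit_mhz else "does NOT fit"
    print(f"{name} ({limit_mhz} MHz max): 4K60 {verdict}")
```

That's the whole argument in two numbers: 594 MHz worth of pixels against a 340 MHz link. It's why HDMI 1.4 tops out at 4K30, and why full 4K60 4:4:4 needs HDMI 2.0 or DisplayPort.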
 
HDMI 1.4 is legacy... And an active adapter is $$$$. But continue to wear your AMD blinders, because removing DL-DVI is "not a big deal"...
 
No blinders; I have no use for HDMI 2.0, so I don't care.

And that's fine, but don't assume that no one else does. It's an issue that potentially affects many users and their buying decisions. I may eventually upgrade to the new 144Hz IPS panel, but until I make that jump, there's no sense in buying a Fury like I had been wanting to.
 
HDMI 1.4 is legacy... And an active adapter is $$$$. But continue to wear your AMD blinders, because removing DL-DVI is "not a big deal"...


Edit: No HDMI 2.0 on the new AMD card? Either AMD really is that dumb, or the internet is wrong.


Both are likely.
 
No blinders; I have no use for HDMI 2.0, so I don't care.

Normally I wouldn't, but the LG 34UM95 has sleep/wake issues over DisplayPort, so I need HDMI 2.0 to avoid them while still running a custom 60Hz mode. It doesn't matter if I get a more powerful card if I automatically lose 10 fps or have to put up with the sleep issues.
 
Yay, $35 extra for a shitty adapter that doesn't work for a lot of people and is finicky. All because AMD made the poor decision to phase out DVI when monitors clearly haven't phased it out and DVI is STILL useful.
 
It's time. DVI is a great big connector and the last of the legacy "screw-in" heavy cables. I won't miss its passing. AMD is forward-thinking in this. In another few years DVI won't exist, and it won't be wasted space on the graphics card.
 
Again, not the same monitor.

The QX2710 he is referring to is the DVI-only model. Same with the X-Star. You replied as though you were familiar with the history of the Korean 1440p PLS monitors, but even from the DVI/gaming context of this thread, it could have been inferred that gamers want the DVI-only model for some reason. The reason:

Refresh Rate: 96Hz without dropping frames is common. At 110Hz+ you may see some dropped frames even on an upgraded DVI-D cable. Still, many of these panels make it beyond 120Hz. A custom calibrated color profile is necessary because there is noticeable gamma shift when increasing the refresh rate. Common profiles are available for 96Hz to 120Hz. I wish someone would make a profile for the 72-84Hz range (which I'm limited to when using a 25ft DVI cable). A quick pixel-clock sanity check is sketched at the end of this post.

Input Lag: I don't know, but it's relatively low. I had thought it was usually 1 frame of lag at 60Hz. Forum member "NCX" would know.

Pixel Refresh: 1440p PLS is slightly slower than the Korean 1440p IPS (Achieva Shimian / Yamakasi / Overlord / etc) displays, I think. You'll get some blur. This isn't a fast TN panel.

The bad:

Backlight Bleed and QC: Problematic. Backlight bleed is common. There are fixes, but they require dismantling the monitor... and there will likely still be some leakage. It is also cheaply made. Expected.

Best Gaming monitor? I don't know about that, but it was a good option if you were willing to take the risk.
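
For anyone wanting to sanity-check the refresh overclock numbers above: pixel clock is just total pixels per frame times refresh rate, compared against dual-link DVI's 330 MHz (2 x 165 MHz) spec ceiling. A rough sketch in Python; the 2720 x 1481 totals are a typical CVT reduced-blanking timing for 2560x1440, and overclockers usually tighten the blanking further, so treat the numbers as approximate:

```python
# Estimate the pixel clock for 2560x1440 at various refresh rates and
# compare against the dual-link DVI spec limit of 330 MHz (2 x 165 MHz).
H_TOTAL, V_TOTAL = 2720, 1481  # approximate CVT-RB totals for 2560x1440
DL_DVI_SPEC_MHZ = 330

for hz in (60, 72, 84, 96, 110, 120):
    clock_mhz = H_TOTAL * V_TOTAL * hz / 1e6
    status = "within spec" if clock_mhz <= DL_DVI_SPEC_MHZ else "beyond spec"
    print(f"{hz:3d} Hz -> {clock_mhz:6.1f} MHz ({status})")
```

Note that everything above roughly 82Hz is already out of spec for dual-link DVI, which is exactly why 96Hz+ wants a good, short cable and why a 25ft run caps out around the 72-84Hz range.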

I see, thanks. My personal experience with these Korean monitors was with an earlier version that I had a ton of problems with. I didn't know they'd improved in that regard.

That said, I simply found it ridiculous to assert they are the best gaming monitors when the Acer Predator XB270HU or ASUS MG279Q exist.
 
It's time. DVI is a great big connector and the last of the legacy "screw-in" heavy cables. I won't miss its passing. AMD is forward-thinking in this. In another few years DVI won't exist, and it won't be wasted space on the graphics card.

While, again, I agree with this, why do it now when LCD manufacturers aren't even there yet? Give people options while the transition is made. Lots of monitors have issues with one type of connection or another (as stated above). And limiting 4K @ 60Hz to a single type of connector is pretty dumb.
 
Doesn't every new 120/144Hz monitor come with DP? The only people who are gonna be angry are those with the Korean monitors and maybe those with really old ones.
 
HDMI is not legacy, DVI is.

4K HDMI 2.0 TVs are a tiny, tiny, tiny market compared to 1080p.

If you want 4K, use DP or buy an active adapter.

HDMI 2.0 is backward compatible with HDMI 1.4. It would make sense to add HDMI 2.0 and appease all the owners of monitors and TVs with this port, as well as owners of 4K TVs.

That way you have both current and old tech covered.
 
Disgusting is all I can say to this. I hope that nobody buys this card unless AMD revises it with DVI-I output. I will never use DisplayPort. It is a horrible pile of shit, along with HDMI. There is only one good digital connector, and that is DVI. If I were making monitors, I would only offer DVI, VGA, and BNC input.
 
Wait, there is NO fcking way this is true. No HDMI 2.0 for 4K??

Whhhhaaaaatttttt????????

This can't be true... there is no way this is right.

I basically had to give up on AMD because of their hardware limitations, sold all three of my 290Xs, and bought two MSI 970s for HDMI 2.0.

I have to be missing something here on the new Fury cards.
 
Sadly, the reviews for the $35 adapter have people bitching about no 120Hz 1080p support. It looks like the $100 tier supports it, with USB power... which just adds to the upgrade total if I decide to go that route. Staying on a 970 looks more and more attractive. By the way, this thread isn't about the quality of cheap Korean panels; it's about the Fury X's lack of DVI, and about me seeing if anyone has tried 120Hz @ 1080p over an active DP to DVI-D adapter. Thanks!
 
Wait, there is NO fcking way this is true. No HDMI 2.0 for 4K??

Whhhhaaaaatttttt????????

This can't be true... there is no way this is right.

I basically had to give up on AMD because of their hardware limitations, sold all three of my 290Xs, and bought two MSI 970s for HDMI 2.0.

I have to be missing something here on the new Fury cards.

Well, according to the AMD fanboys in this thread, there's no reason you would need HDMI 2.0, because they provide DisplayPort and there are adapters out there. :rolleyes:
 
So let me get this straight: these folks with Korean monitors knowingly bought cheap monitors where corners were cut to save costs, with only a single DVI input. It was stated in December 2010 that Intel and AMD would be moving to HDMI and DP. AMD is following through on their word, and you are coming down on them? I mean, really, after 4.5 years they keep their word, and people complain at them rather than at the monitor manufacturers that kept using the connector?! The downside is that most of the people with these monitors are likely enthusiasts, but how large a percentage they make up is unknown.

Here is the Intel press release.


If GPUs continue to include DVI, then monitor manufacturers will continue to put it on their products. For the last few years, the only monitors without at least HDMI were very low-end models aimed at corporate/school/etc. buyers who would not be going for a $500 GPU. You can still get some models with DVI and VGA (!), though. There are also the knock-off monitors such as the Korean variants, which are likewise cheap models meant to cut costs down to the basics.

HDMI will probably be the next to go legacy, which is why my next monitor will have DP. I would be more surprised if NV continued to use DVI on their next flagship. Connectors come and go, and it sucks when you are on the legacy end... AGP, VGA, that damn parallel port that took forever to die, serial ports, PS/2 ports that still find their way onto motherboards, ISA, non-e PCI, etc.
 
Any chance that one of the manufacturers puts DVI on one of their custom PCB models?
 
I think people actually need to go and seriously research this adapter claim.

There is a thread with 230k replies, growing daily, that deals with exactly this to a certain degree.

Please show me an adapter that supports 4K at 60Hz and is confirmed to work over DP. "Confirmed."
 
Yay, $35 extra for a shitty adapter that doesn't work for a lot of people and is finicky. All because AMD made the poor decision to phase out DVI when monitors clearly haven't phased it out and DVI is STILL useful.

If you read the reviews, especially the ones where people have the adapter working, it sounds a lot like the major problem is people NOT using the correct type of DVI cable.

I have an active mini-DP to DVI adapter from Monoprice (1920x1200 is the max res), and it has been working great for about a year now. It's on sale for a whopping $18 at Monoprice right now.

When I bought it, I paid about the same. At that point in time, pretty much all the other active adapters were $50+.

Just because something is inexpensive doesn't mean it is a cheap piece of trash.
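
Concretely, the single-link vs. dual-link confusion is just bandwidth: one TMDS link is capped at 165 MHz by the DVI spec, and dual-link doubles that. A rough sketch of which common modes fit where (the h/v totals are approximate CVT reduced-blanking figures, so real timings will vary a bit):

```python
# Which common modes fit single-link DVI, and which need dual-link?
SINGLE_LINK_MHZ, DUAL_LINK_MHZ = 165, 330  # DVI spec limits

modes = {  # name: (h_total, v_total, refresh) - approximate CVT-RB totals
    "1920x1080@60":  (2080, 1111, 60),
    "1920x1200@60":  (2080, 1235, 60),
    "1920x1080@120": (2080, 1111, 120),
    "2560x1440@60":  (2720, 1481, 60),
    "2560x1600@60":  (2720, 1646, 60),
}

for name, (ht, vt, hz) in modes.items():
    clock_mhz = ht * vt * hz / 1e6
    if clock_mhz <= SINGLE_LINK_MHZ:
        link = "single-link OK"
    elif clock_mhz <= DUAL_LINK_MHZ:
        link = "needs dual-link"
    else:
        link = "exceeds DVI entirely"
    print(f"{name}: {clock_mhz:5.1f} MHz -> {link}")
```

That's also why an $18 single-link adapter genuinely is fine up to 1920x1200@60, while 120Hz 1080p or a 30" 1600p panel needs the pricier active dual-link kind.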
 
Several of those things are not known.

But give it a week and then you'll be able to have some justified disdain. Or a mouthful of crow... whatever.



The Fury is a BIG no-buy for those of us who have or want to buy 4K TVs for desktop or HTPC use.
Reviewers have to point this out. It's a very important connection that is missing from a card that boasts of being great for 4K gaming and productivity.
 
A lot of the problems with the DP-to-dual-link-DVI adapters were due to their early firmware. I had a lot of issues with the two I use on my 30" monitors (Accell and BizLink), but flashing them to the latest 2.0 firmware fixed them quite nicely. I've had zero issues with my adapters in the past two years; they are so stable now that I fully intend to wait at least one panel generation before jumping on the 4K bandwagon.

Supposedly DP-to-HDMI 2.0 adapters will be released shortly, but I haven't seen any predictions on pricing.
 
Disgusting is all I can say to this. I hope that nobody buys this card unless AMD revises it with DVI-I output. I will never use DisplayPort. It is a horrible pile of shit, along with HDMI. There is only one good digital connector, and that is DVI. If I were making monitors, I would only offer DVI, VGA, and BNC input.

And what would the res on that monitor be...?
 
Disgusting is all I can say to this. I hope that nobody buys this card unless AMD revises it with DVI-I output. I will never use DisplayPort. It is a horrible pile of shit, along with HDMI. There is only one good digital connector, and that is DVI. If I were making monitors, I would only offer DVI, VGA, and BNC input.

Thank god you don't make monitors.
 
So let me get this straight: these folks with Korean monitors knowingly bought cheap monitors where corners were cut to save costs, with only a single DVI input. It was stated in December 2010 that Intel and AMD would be moving to HDMI and DP. AMD is following through on their word, and you are coming down on them? I mean, really, after 4.5 years they keep their word, and people complain at them rather than at the monitor manufacturers that kept using the connector?! The downside is that most of the people with these monitors are likely enthusiasts, but how large a percentage they make up is unknown.

Here is the Intel press release.


If GPUs continue to include DVI, then monitor manufacturers will continue to put it on their products. For the last few years, the only monitors without at least HDMI were very low-end models aimed at corporate/school/etc. buyers who would not be going for a $500 GPU. You can still get some models with DVI and VGA (!), though. There are also the knock-off monitors such as the Korean variants, which are likewise cheap models meant to cut costs down to the basics.

HDMI will probably be the next to go legacy, which is why my next monitor will have DP. I would be more surprised if NV continued to use DVI on their next flagship. Connectors come and go, and it sucks when you are on the legacy end... AGP, VGA, that damn parallel port that took forever to die, serial ports, PS/2 ports that still find their way onto motherboards, ISA, non-e PCI, etc.

ROFL, get off your high horse. People bought the QNIX because for YEARS it was the only 1440p, 27", IPS-based monitor that could do 120Hz, not because it was cheap; that was just a bonus. I'd also like to know how it was a knockoff of the XB270HU three years before that came out, considering that's the only monitor you can buy that does the same thing. :rolleyes:
 
No DVI = Not a big deal.

No HDMI 2.0 = Mortal goddamn sin. WTF were they thinking?
 
No DVI = Not a big deal.

No HDMI 2.0 = Mortal goddamn sin. WTF were they thinking?

The only thing I can think of is that the spec to put it on a discrete card wasn't ready when they made the card. It's not like they can't add it. I won't be buying the Nano until it adds HDMI 2.0.
 
No DVI = Not a big deal.

No HDMI 2.0 = Mortal goddamn sin. WTF were they thinking?

Both are big deals to people who own those monitors.

Dual-link DVI: the 30" 1600p 16:10 displays that some people love and won't move away from, and the overclockable Korean 27" panels, whose only replacement is the $600 FreeSync Asus.
 
Can't get a better monitor for the price than a Korean 27"; it's as simple as that.

With that said, if an adapter isn't included, it's the 980 Ti for me, out of necessity.
 
It may not be a big deal to some, but for a lot of us, Korean 1440p monitors are a sweet spot. I really love my QNIX, and I'm not in the market to replace it for a few more years (when 4K is the norm rather than the high end). I can't justify spending an extra $100 on an adapter, and as I already mentioned, a new monitor isn't in the cards either, especially not when the competition offers very similar gaming performance with the port that I need. If one of the custom cards comes out with a DVI port, that changes things, but until then, I'll be looking at the 980 Ti instead.
 
Is it safe to say that if the 300 series has adapters for DVI and HDMI 2.0, then the Fury will too?
 
If that post is legit, AMD pretty much shot themselves in the foot by not adding HDMI 2.0 support on their new cards. This will piss off a lot of 4K TV users who want 4:4:4 at 60Hz. I was looking forward to AMD's high-end card, but I'm losing interest now if it lacks HDMI 2.0.
 
HDMI over DisplayPort? No thank you; I prefer DisplayPort.
Adapters are not a big deal; if you are planning on buying a $500+ GPU, this shouldn't be an issue.
If this card equals or outperforms a Titan at its price point and you are worried about an adapter, then you are not thinking correctly.
 
In 2015 it's pretty damn hard to find a monitor without DisplayPort. Furthermore, there are some NVIDIA cards that come with NO HDMI or DVI port whatsoever. If it's not an issue for them, it sure as hell shouldn't be a problem in this instance, on an enthusiast site. At this point, if you are still rocking DVI (a standard that came out in 2000... freaking 15 years ago), I think it's time for an upgrade.

If you bought the latest CPU, overclocked it to hell and back, added water cooling, and went SLI/Crossfire for a total in the thousands (as in, you could almost buy a Ford Fiesta for the same money) and didn't bother to upgrade your monitor, then that's on you. That's just stupid and illogical.

If you are on a 4K HDMI 2.0 TV, because really that's the ONLY segment where DP doesn't appear, then at least there's an argument to be made there, I think. However, ALL 4K monitors support DP, while not all of them support HDMI 2.0. You would literally have to seek out a 4K monitor without DP intentionally. So I'm just not feeling the outrage. It's been known for quite some time that 4K monitors went the DP route and 4K TVs went with HDMI 2.0. So, as with anything, make your purchases accordingly.
 