Are two 1440p DVI DL monitors impossible on a GTX 1080?

I'm going crazy here trying to figure out how to get my second QNIX up and running now that I've upgraded to a GTX 1080. My previous 780 had two dual-link DVI ports, so it was no problem. I assumed a newer card wouldn't struggle, but I can't get HDMI → DVI-DL to work at all. Does anyone know a way to force the signal through the cable, or where I can get an adapter that will actually make this work?

Both QNIX monitors are scaler-less, so there are no inputs other than DVI-DL, and they only support their native resolution.

Link to my graphics card:
https://www.gigabyte.com/Graphics-Card/GV-N1080TTOC-8GD#kf
 
You may need a DP to DVI adapter for the second screen; it's possible the DVI and HDMI outputs share the same clock generator, so you can only use one or the other.
 
Cool. So my $470 videocard is actually a $570 videocard. Super happy about that.

lol, catch up with technology ;) Either way, the reason for it is space: look at how many extra ports can be added with the removal of one DVI port. Most display manufacturers (minus the low-end stuff) have moved on to supporting HDMI and/or DP. I wouldn't be surprised if NVIDIA followed suit with AMD and removed the DVI port from their high-end cards completely; the port takes up way too much space and has become a hindrance to cooling with the stock cooler.
 
Cool. So my $470 videocard is actually a $570 videocard. Super happy about that.

Blame your monitors, not your videocard.

This is like buying a monitor with inputs too new for your ancient/shitty videocard and blaming the monitor, wtf...
 
I wouldn't be surprised if NVIDIA followed suit with AMD and removed the DVI port from their high-end cards completely...

They already did with the 1080 Ti and the Titan Xp. Chances are good they'll do the same on lower-tier products when Volta comes along.

Cool. So my $470 videocard is actually a $570 videocard. Super happy about that.

You can't really blame the product for lacking features if you bought it without doing any research on its capabilities. This is just like the CRT nerds who cried foul when they bought new Pascal cards, only to realize later that they no longer had DVI-I ports for their VGA adapters.
 
If you really want to keep your current displays, go for the Gigabyte mini GTX 1070. It will still let you easily overclock your displays, and from what I have seen it will handle your games, although you may have to lower your eye candy somewhat.
 
It's clear from the pictures, too, that it only has one DVI-DL port, so why buy the card already knowing that? Don't cheap out and get just one adapter; make sure to get two, otherwise you can end up with your two monitors displaying different color temperatures. That way you save yourself the trouble of coming back and complaining that your monitors don't look the same even though they used to.
 
Cool. So my $470 videocard is actually a $570 videocard. Super happy about that.

Adding insult to injury, apparently all DisplayPort to dual-link DVI adapters are terrible and prone to glitching and flaking out.
 
You're the idiot who bit on the fad of QNIX monitors and bought TWO, right when DL-DVI was headed out the door.

I have limited sympathy for anyone who bought a cheap Korean 1440p panel. With the DVI phaseout already on the video card makers' road maps, those monitors never should've existed. A lot of people who didn't do any research are going to get burned in another year or two, when the number of DL-DVI links drops to zero and their displays hit an unexpectedly early end of life. The flip side is that even among more informed consumers, no one at the time expected DP to DL-DVI adapters to remain a permanent train wreck. If you assumed a reliable <$50 adapter would be available when needed, those monitors still looked like reasonable buys.

The death of the last DL-DVI port will also hit people who bought 30" 1600p monitors that predate DisplayPort. The impact has already made itself felt in the used monitor market. My NEC 3090 was ~$2k new; if I could sell it now, I'd probably only get $150ish for it. Part of that is price pressure from cheap 1440p monitors, but knowing it will soon be end of life is dragging it well below the budget competition, despite it being a better display than the cheap 1440p panels in every way except weight and power consumption.
 
The reason those adapters are so expensive is that a more complex conversion is going on.

DisplayPort can cheaply carry DVI SINGLE LINK or HDMI over its four data lanes.

Both of those are only three channels, so they can be transmitted as-is and then just have the signaling converted at the output.

But DVI-DL goes where no other standard did. Instead of upping the bandwidth and sticking with three channels, they just doubled the number of channels to SIX.

THIS MEANS YOU CAN'T SEND DL-DVI over DisplayPort without some high-speed serializing (not enough DP lanes to carry six separate DVI channels). It gets very clunky.

The small market, combined with the trouble of converting the format, makes these adapters destined to a life of high prices.
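
To put rough numbers on that, here is a quick sketch of the pixel-clock arithmetic in Python. The timing totals are CVT reduced-blanking estimates, not values read from an actual QNIX EDID, so treat them as assumptions:

```python
# Why 2560x1440 needs dual-link DVI, and why a passive DP adapter can't carry it.
# The 2720x1481 totals are CVT-RB estimates (assumed, not from a real QNIX EDID).

SINGLE_LINK_MAX_MHZ = 165.0  # per-link TMDS pixel clock ceiling in the DVI spec

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # Pixel clock = total pixels per frame (active + blanking) times refresh rate.
    return h_total * v_total * refresh_hz / 1e6

clk = pixel_clock_mhz(2720, 1481, 60)  # 2560x1440 @ 60 Hz
print(f"pixel clock: {clk:.1f} MHz")                      # ~241.7 MHz
print(f"fits one link? {clk <= SINGLE_LINK_MAX_MHZ}")     # False
print(f"per-link clock on dual link: {clk / 2:.1f} MHz")  # ~120.8 MHz, in spec
```

So the mode needs roughly 242 MHz of pixel clock, well past the 165 MHz single-link ceiling. Dual link splits that across two in-spec links, which is exactly what an active adapter has to reconstruct from the DP packet stream: six synchronized TMDS channels, not three.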

The cheaper 3-channel adapters SOMETIMES work with some HIGHER QUALITY displays whose DVI controllers were pushed to twice the spec's speed, but that does not usually include the QNIX series (most of those were lucky to get over 90 Hz, IIRC?).
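
The same arithmetic shows what those 3-channel adapters are gambling on (a continuation of the sketch above, same assumptions):

```python
# One out-of-spec link instead of two in-spec ones: the display's TMDS
# receiver has to tolerate ~1.5x the 165 MHz single-link ceiling.
clk = 241.7  # MHz, from the 2560x1440 @ 60 Hz calculation above
print(f"overdrive needed on a single link: {clk / 165.0:.2f}x")  # ~1.46x
```

If the receiver was built with that kind of headroom, a single overdriven link carries the mode; per this thread, the QNIX boards evidently weren't.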

See the first review on this product page:

https://www.monoprice.com/product?p_id=12784
 