I'm ashamed; I have no excuse other than being too broke to play with hardware and getting old. I haven't been keeping up with the times and I really don't understand what's going on. Could someone please help educate me and advise the best way to connect these two monitors?
Background: I just upgraded from a GTX 260 to a GTX 970. I have *always* had both monitors plugged into the two DVI ports on the GTX 260. However, when I went to use both DVI ports on the new card today, only one monitor would be recognized at a time, regardless of which port I plugged it into. On this GPU, one port is DVI-D and the other is DVI-I. The GPU also has 3x DisplayPort and 1x HDMI.
The monitors are cheap Dell units: one is an S2409W and the other is an ST2421L. I don't know what type of DVI they support; the only specs I can find just make a general reference to a "DVI" port. They both also have HDMI and VGA ports.
I was finally able to get both monitors working together, one with a DVI cable and one with an HDMI cable. However, I was getting a lot of red pixel dots and lines on the monitor with the DVI cable. I took a good look at that cable and determined it to be DVI-D single link. I swapped it for the only other cable I had, a DVI-D dual link, and that cleared up the problem.
I'm not pleased with the image quality of the monitor currently connected via HDMI: it seems way too bright, the contrast is way off, and I can't find a happy balance with the on-screen controls. It almost looks washed out. On the other hand, the monitor connected via the DVI-D dual link cable to the DVI-D port on the GPU looks excellent.
I can't upgrade monitors at this time, so I'm wondering what the best way is to connect both of these for now.
Any guidance would be appreciated, thanks for reading and your time.