Can I run 2 video cards from different generations?

Florin22xxl

Limp Gawd
Joined
Dec 9, 2011
Messages
216
Hey guys,
I am using a GTX 1060 and want to add another card just to power 1 "special" monitor which needs DVI-I.
Can I use an older generation card like this one:
MSI GeForce 8500 GT DirectX 10 NX8500GT-TD256EH 256MB 128-Bit GDDR2 PCI Express x16 SLI Support Silent Heatsink Video Card-Newegg.com
I was checking NVIDIA's latest driver page, the one I use for my GTX 1060.
Drivers | GeForce
The GeForce 8000 series is not supported... this means I need to get at least something from the
GeForce 400 series up? To get it to fully support my monitor's native resolution of 2560x1600 at 60 Hz.
Thanks.
 
Are you already maxing out the 4 displays on the GTX 1060? If not, you can use a DVI-D cable between the 1060 and your DVI-I monitor. I'd prefer that route over adding another GPU just for one monitor; the extra heat, noise, and power draw should be avoided if possible.
 
I have an old Apple Cinema Display 30" (2560x1600 at 60 Hz) which only runs natively on a DVI-I port. The 1060 has a DVI-D port, which is useless for me. My only solution is a DVI-I port or a DVI-I to DP converter (which I want to avoid because they are expensive and I don't know for sure if it would work).
 
You don't need a DVI-I cable. DVI-I combines the DVI-D signal you need to drive it with 4 extra pins that also allow running an analog VGA signal down the cable. The VGA support baked into DVI-I has only two purposes: allowing backward compatibility from DVI devices to VGA devices, and screwing up picture quality by causing your hardware to decide to use analog instead of digital signaling (fortunately not an issue at 2560x1600, because it will look obviously, catastrophically bad if it happens). On your Cinema Display it's there so you can connect an old computer that doesn't support DVI (but only at a lower resolution like 1920x1200); for running at native resolution, all you need is DVI-D.
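As a sanity check on why native resolution needs dual-link DVI-D, here is a rough pixel-clock estimate. The 165 MHz single-link cap comes from the DVI spec; the ~8% blanking overhead is an assumption, roughly in the ballpark of CVT reduced blanking:

```python
# Rough check: does a given mode fit in single-link DVI?
# Single-link DVI tops out at a 165 MHz pixel clock; dual-link doubles that.
SINGLE_LINK_MHZ = 165.0

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.08):
    """Approximate pixel clock in MHz, padding the active pixel rate
    by ~8% for blanking intervals (an assumed, CVT-RB-ish overhead)."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for mode in [(2560, 1600, 60), (1920, 1200, 60)]:
    clk = pixel_clock_mhz(*mode)
    verdict = "dual-link needed" if clk > SINGLE_LINK_MHZ else "single-link ok"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}: ~{clk:.0f} MHz -> {verdict}")
```

This is why the Cinema Display's native 2560x1600 mode needs both TMDS links, while the 1920x1200 fallback fits in a single link.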
 
Why bother with a DP to DVI converter? That's an expensive solution that would just add more input lag.

If the display cable is a DVI-I with the 4 extra pins (which are there for no good reason) and the video card has only a DVI-D port, just use this:

New High Quality DVI-I Female to DVI-D Male Adapter Connector Converter


I don't have a Cinema 30 so I don't know if the input cable is attached to the monitor or not, but I have an HP LP3065, which has a detachable cable, and I always buy DVI-D cables anyway...
 
Pump the brakes, the old Cinema displays were POWERED by the DVI port. Is that what you have?

The other option, since it sounds like you have the card, is just to add the card to the machine and see what happens.
 
I vote add the card and see what happens. Kinda like... everyone says don't do it, then those who actually try it are surprised!
 
The only thing you have to watch for is that sometimes the NVIDIA driver package does not support both a new card and a much older card within the same driver installation package. Then you would have trouble, because it might not let you install two driver packages at the same time...

As I said, why make things overcomplicated when you can just get a cheap DVI-I to DVI-D adapter and be done?
 
Update:
I got a second graphics card, a GeForce 630, just for this. Everything works flawlessly as of now. I use the 630 just to power the Apple display.
The bad part is, like wildpig1234 said, I always have to check whether new drivers still support the 630. For the moment this will have to suffice.
 
FYI... There is always a chance, even down the road, that NVIDIA will drop support in its driver package. That is why you can always just use the Windows built-in driver, since you are just using it for a 2D desktop. I'm pretty sure you won't have any problems.
 

So are you saying you resolved the issues from this thread here:

Games using second slower graphics card

Don't leave us in suspense, tell us what you did to fix it. That way anybody else who finds this thread on Google can find a solution :D
 
I think my problem with games running on the second card was driver-related. I did a fresh Windows install, got the latest NVIDIA drivers, and everything works as it should now. Windows detects both graphics cards, but games use the 1060. I could not find any setting to actually change which video card applications should use.
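For anyone checking a two-GPU setup like this, a quick sketch of listing which GPUs the NVIDIA driver exposes by parsing `nvidia-smi --query-gpu=index,name --format=csv,noheader` output (those query flags are real nvidia-smi options; the sample string below is hypothetical, standing in for what `subprocess.check_output` would capture on a real system):

```python
# Hypothetical sample of `nvidia-smi --query-gpu=index,name --format=csv,noheader`
# output for a mixed-generation setup like the one in this thread.
sample = "0, NVIDIA GeForce GTX 1060 6GB\n1, NVIDIA GeForce GT 630\n"

def parse_gpus(csv_text):
    """Map GPU index -> name from nvidia-smi CSV (no-header) output."""
    gpus = {}
    for line in csv_text.strip().splitlines():
        idx, name = line.split(",", 1)
        gpus[int(idx)] = name.strip()
    return gpus

print(parse_gpus(sample))
```

If both cards show up here, the driver package supports both; which one a game renders on is then decided by the display it runs on and the driver's defaults, not by this listing.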
 

Why do you not like a simple DVI-I to DVI-D adapter? That was the simplest solution.
 
@wildpig1234 that would still not solve my problem... I needed 2x DVI connections for 2 monitors, and my video card only has one. Adding another video card with 1 more DVI port was my only solution.
 

I see... Well, if you have two monitors with DVI, then that definitely requires a different solution. Your original post mentioned only one Apple Cinema Display, which would easily be solved with a cheap passive DVI-I to DVI-D adapter.

Looks like the BenQ only supports 1920x1080 at 144 Hz over DVI; with HDMI it's limited to 120 Hz. So I guess you do need a second DL-DVI output.
 

The Windows built-in 2D driver is usually limited to 1600x1200 with zero 3D capability. It might not even let you play H.264 with hardware acceleration.

florin, just make sure you remember which driver build it is. Probably in about a year there will be zero cards with DVI anymore... I still have a few 1600p displays with DVI only. Guess I will have to make do with the 10xx cards for now.
 
The 1060 doesn't have an HDMI out and a DVI out?

It does have 1 DVI and 1 HDMI. But the DVI output no longer carries an analog signal, so it's not CRT/D-sub compatible anymore...

The RX 480 doesn't even have DVI anymore, so I think we are heading toward no more DVI.
 

Non-reference 480s can have DVI (e.g. this XFX card), but while the 5-year timeline from 2011(?) for retiring all legacy video outputs has obviously been missed, the clock is ticking, and those of us who have displays that need DL-DVI will probably need to buy one of those expensive adapters in the next year or three.
 
The 30" Cinema Display has a DVI-D cable. You don't need an adapter if your card has a DVI-D port. I used to use that display.

 


According to the OP it has a non-detachable DVI-I cable, which doesn't plug into a DVI-D output. Also, he has another monitor with DVI... so he needs two DVI outputs.
 
Ah. Well, he could run 2 different vendors for sure. Also, that can't be right unless Apple made 2 different models (possible), but that seems weird, as the 30" Cinema Display can't use analog signals. Also, he could just get a DP-to-DVI adapter for the other display, especially if it's an SL-DVI screen.

Hey OP, mind showing us a picture of the connector on the display? I'm curious.
 