DVI to VGA questions

Scryel · n00b · Joined Jun 1, 2015 · Messages: 7
Hi

I need some help concerning DVI and VGA. I have an LCD monitor which has only a VGA input, and my video card has both a VGA and a DVI-I output.

Currently I'm using a simple VGA cable connected from the LCD to the VGA port on my video card. My question is: if I connected the LCD to the DVI-I output on my video card, would there be any quality improvement? If so, my next question is whether there would be any difference between using a DVI-to-VGA adapter or a DVI-to-VGA cable, in terms of quality or any other issues I could encounter. Thanks for your answers.

Best regards
 
If your monitor only has a VGA input, then it can't understand a digital signal.

Even if you used an adapter, or a cable with a DVI plug on one end and a D-SUB connector on the other, you would still get an analogue feed from the PC to the monitor; the DVI-I port simply carries the same analogue signal on dedicated pins.

Yes, there is a difference between a digital and an analogue feed. The former gives a crisper image in my experience and is immune to connector oxidation and other degradation.
 
Aha, I see. Then there is no point in buying an adapter and expecting a nicer picture, so I guess I will have to buy a new monitor :/. Thanks for the answer.
 
You can still try another cable. I've found that the old-school white/grey thick ones are better shielded and generally seem to degrade less than the thin generic stuff they throw in the box with the screen.

Or lower your refresh rate. Refresh rates used to be unlockable, and you might find your monitor operates properly at a lower one.
 
Hmm, I'm using quite a thick black VGA cable, so I don't know if there is anything thicker than what I currently have :). As for the refresh rate, I only have 60 Hz and 59 Hz available, and when I choose 59 Hz the picture gets kind of blurry and is not as sharp as at 60 Hz. My friend has a DVI-to-VGA adapter and, just as an experiment, I will try it to see if there is any noticeable difference.
 
I tried using an adapter, but there is no difference at all. I also tried a thicker gold-plated VGA cable, and surprisingly the picture quality was better: there was no ghosting or interference when watching a video with a black background, as there was with the cable that came with the monitor. I guess I will buy a quality VGA cable, so does anyone know which cable to purchase :)?
 
On older monitors I often used to switch between DVI and VGA. When DVI became the standard over VGA, we did some basic visual tests, unplugging and replugging between the two. The difference was hardly noticeable. Disappointing.

It's not a night-and-day improvement. A decent VGA cable will still give perfectly acceptable results.

Not something I'd sweat over.
 
I guess I kind of agree with both of you. I have seen some horrible monitor + analogue cable combos that produced two or three 'echoes', as I call them.
And I have also seen cases where blind tests were needed to tell the difference. It depends on the digital-to-analogue converter in the graphics card, the difference between grounds, degraded electrolytic capacitors in the monitor's input stage...

Yes, gold-plated contacts don't oxidize, and that's what you want: the lowest possible resistance combined with proper shielding.
Be sure to fasten those screws tightly (preferably with the computer off, in case you bump the card out of its slot), and keep the cable away from the power cable.

There are pieces of software, such as toolboxes possibly bundled with your graphics driver, that may let you unlock the refresh rate completely. I remember a program called PowerStrip; not sure if that's still around.

Be advised: if you set a wrong one, your monitor will yell at you about an unsupported input, and it's safest to turn the monitor off for the 15 or so seconds until it resets itself to the previous setting.

Your monitor may also have a setting related to expected input voltage levels. There are also 'phase' and 'clock' adjustments, IIRC.
There's a lot of fooling around one can do with an analogue connection :) definitely worth exploring!

To stay on the safe side, turn the monitor off when you hit 'input out of range', and don't accidentally hit Enter! You'd confirm the 'bad' setting.
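Tools like PowerStrip work by building a custom modeline for whatever mode you ask for. As a rough, hedged sketch (real modeline generators such as CVT/GTF compute exact blanking intervals; the blanking percentages below are just ballpark assumptions, not the real formulas), the pixel clock a custom mode needs is roughly total pixels per frame times the refresh rate:

```python
# Rough estimate of the pixel clock a custom analogue mode needs.
# Real modeline generators (CVT/GTF) compute exact blanking; here we
# just assume ~30% horizontal and ~5% vertical blanking, which is in
# the right ballpark for CRT-era VESA timings.
def approx_pixel_clock_mhz(width, height, refresh_hz,
                           h_blank=0.30, v_blank=0.05):
    h_total = width * (1 + h_blank)   # active pixels + blanking per line
    v_total = height * (1 + v_blank)  # active lines + blanking per frame
    return h_total * v_total * refresh_hz / 1e6

# 1024x768 at 85 Hz: prints about 91.2 MHz, within a few percent of
# the published VESA pixel clock of 94.5 MHz for that mode.
print(f"{approx_pixel_clock_mhz(1024, 768, 85):.1f} MHz")
```

The point is that raising either resolution or refresh rate drives the required analogue bandwidth up multiplicatively, which is why a cable that is fine at one mode can fall apart at a higher one.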
 
Heh, I'd rather not experiment with unsupported refresh rates, because I don't want to risk damaging the monitor, so I will just stick to the allowed methods where there is no danger of destroying something :p. Do you maybe know which cable is one of the best to buy, or does it not matter as long as it has gold-plated connectors and a thick cable with proper shielding :)?
 
The quality will always be better with a good-quality VGA cable. Also, with the newer 6+3 standard, those extra three wires provide feedback to the video card about the type of monitor connected, and those cables are usually better quality too. I use an HDMI-to-VGA cable myself. There are passive cables, but those will not work unless your monitor has a digital input. There are active DVI-to-VGA cables too, but since the DVI-I port already carries VGA signals, all you need is a small adapter to route the VGA cable to the DVI port, and the signal quality should be the same on both ports. An active HDMI-to-VGA cable converts the digital HDMI signal to analogue, but the quality depends on the converter. Unlike digital signals, with VGA-type cables you have to test it yourself; it is the only case where something like Monster cables actually provides any value.
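The "feedback" wires mentioned above are the DDC lines, over which the monitor sends an EDID data block describing itself (manufacturer, supported modes, and so on). As an illustration of how that format works, here is a small decoder for the three-letter manufacturer ID packed into EDID bytes 8 and 9; the example bytes are the registered Samsung ID:

```python
# Decode the 3-letter PNP manufacturer ID from EDID bytes 8-9.
# The ID is stored big-endian: bit 15 is reserved, then three 5-bit
# letters where 1 = 'A' ... 26 = 'Z'.
def decode_manufacturer_id(hi: int, lo: int) -> str:
    word = (hi << 8) | lo
    letters = []
    for shift in (10, 5, 0):
        code = (word >> shift) & 0x1F
        letters.append(chr(ord('A') + code - 1))
    return "".join(letters)

print(decode_manufacturer_id(0x4C, 0x2D))  # prints "SAM" (Samsung)
```

Without those DDC wires connected, the video card cannot read the EDID and falls back to generic modes, which is one practical reason the fuller cables behave better.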
 
You will only notice the difference at higher resolutions like 1080p... Even at 720p I notice differences, and at 1080p the characters are unreadable over VGA while they are clear over HDMI. DVI and HDMI carry the same signal anyway, so I am sure anything over about 1280x1024 would look noticeably fuzzy.
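There is a plausible physical reason resolution matters so much here: as the pixel clock rises, the analogue signal has less time to settle on each pixel. A small sketch using the published pixel clocks for these standard modes (74.25 and 148.5 MHz are the CEA-861 timings, 108.0 MHz the VESA timing):

```python
# Time available to settle each analogue pixel, per display mode.
# Pixel clocks (MHz) are the standard published timings.
modes = {
    "1280x720@60":  74.25,   # CEA-861
    "1280x1024@60": 108.0,   # VESA
    "1920x1080@60": 148.5,   # CEA-861
}

for mode, clock_mhz in modes.items():
    ns_per_pixel = 1000.0 / clock_mhz  # 1 / MHz = us, so x1000 = ns
    print(f"{mode}: {ns_per_pixel:.1f} ns per pixel")
# prints roughly 13.5, 9.3 and 6.7 ns respectively
```

At under 7 ns per pixel, reflections and poor shielding that went unnoticed at lower modes start to smear adjacent pixels together, which reads as fuzzy text.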
 
Long story short, absolutely no improvement will occur. The only way to change the signal type would be to upgrade to a monitor with DVI. All current monitors support at least DVI or a newer port; VGA is pretty much being phased out, even in office settings.
 
Yeah, I know, but for now I will use this monitor until I save some money and buy a decent 27-inch LED monitor with an IPS panel :).

I tried searching for the best VGA cable, but there are few results. The only cable that actually has a name, on the site where I found a review, is the Unlimited PCM-2230, which the review recommends as top-notch quality. Does anyone know of other quality VGA cables?
 
Quality DOES matter with analogue, unlike digital (e.g. HDMI, where the signal either works or it doesn't). But I do not think spending a decent amount of money on a well-made VGA cable would make much of a difference unless you are currently experiencing interference.
 
I just checked this. I keep a cheap spare VGA cable connected because, even though I am using HDMI, the VGA cable needs to be connected to the monitor to keep a signal present; otherwise, if I power off the monitor, my screen resets to 720p mode and all my windows get moved around. The screen looks pretty bad with this cable: the characters are very fuzzy and I can't read them properly. I can make them out, but I would develop a headache after 15 minutes of use. The original Samsung VGA cable I got with my old RGB monitor, on the other hand, works really well; the text is readable. It doesn't match HDMI's crisp quality, but it's up there. Both cables are really thin and look very similar, but plugging them in, there is a huge difference. Looks alone won't tell you if a cable is good or not. This is not a factor at low resolutions.

I use an HDMI-to-VGA cable to connect my cable box to the monitor, and even there I can see the difference in quality between the two cables. I always had plenty of VGA cables that came with monitors, so I never noticed or had a reason to buy one until now.

HDMI cables, on the other hand, all seem to give the same picture quality, although I have a small problem with the monitor blinking when someone turns on their old-style tube lights. The thin HDMI cable that came with my Cisco cable box works far better than the thick HDMI cables I bought, so thickness is irrelevant for digital signals. I bought three different HDMI cables trying to fix the blinking, because I don't know which neighbour is playing with their lights at night, and I was surprised the thin, flexible Cisco HDMI cable worked so much better and rarely blinked in the evenings. I don't have that problem with VGA, just HDMI... sheesh, if it ain't one thing it's another. I hate ballasts... blink blink blink, drove me nuts.
 
Any VGA cable that has ferrite cores near each connector should be properly shielded as well.

The only ones I have ever seen ghost are the thin ones without ferrite cores near the connectors.

You can easily tell if a cable has them: the part of the cable near each connector will be a lot thicker than the rest of the cable.

The VGA cables that come with Dell and Acer monitors do not have ghosting problems.

Not sure about other current brands.

If you really can't find one where you are, I can send you one for just the cost of shipping, if you are in the US.
 
Thanks for the offer, but I live in Slovenia, so shipping could be a problem, and I don't want to give you the unnecessary work of packing and sending the item, not to mention giving me a cable for free :).

Anyway, guys, thanks for all your help explaining the situations you've noticed in your own setups and what I should look for when buying a quality VGA cable. The next step is to search for these kinds of cables in my country, or to buy one on the internet if I can't find an appropriate one here :).

Best regards
 
In 25 years of computing, the only time I have seen a VGA display that was truly unusable was when the panel or screen was on its way out.

The cable was rarely if ever the issue. And boy, have I used a lot of VGA cables...

We used to run 22-inch 1600x1200 CRT screens at really high refresh rates back in the mid '90s with the VGA cables supplied by Compaq, and they all ran fine. Due to the high refresh rates we would lose a screen every week, but we were a multinational, so it didn't matter.

I've got a customer's machine hooked up to a 1080p monitor over VGA right now... it looks sharp and dandy.

Another case of over-analysing a situation that doesn't warrant it. Just plug in a cable and get on with it.
 
MANY VGA cables cannot handle the frequencies needed for higher resolutions. Finding one that works properly and doesn't give you a headache is hard. With RGB monitors, these cables were often attached to the monitor and could not be removed; with LEDs and LCDs you have to buy them separately, and you have to make sure it is a well-made 6+3 VGA cable. In fact, VGA does not really work at 1600x1080 modes without causing eye strain; it might look sharp, but you will notice it in a few years. With HDMI that's not a problem, even at lower refresh rates. So of course there is a difference; companies like Monster Cable are worth hundreds of millions because they charge an arm and a leg, and people buy their stuff after trying a lot of other cables. So better to buy a decent cable you know will work, and online is not the place for that. Don't end up a Monster Cable customer!
 
From a quick check, it seems that for 1080p monitors there is no longer any real price difference between VGA-only and VGA+DVI, but there are still plenty of VGA-only monitors on the market, so it pays to check when buying.
 
They do, and it's not hard.

By the time this guy has hunted around for a 'high-spec' VGA cable, he'll have the cash for, or want, a new monitor anyway. Put the money towards a better monitor.

Waste of time.
 