DVI vs VGA?

TechBoy

What is the difference between DVI and VGA other than the capability for higher bandwidth? Is the difference really noticeable at 1920x1200? Should I really care at all which one I use?

Information would be much appreciated.
 
With proper cables you'd be hard-pressed to notice any significant difference. VGA has no built-in resolution limit either; it just needs better-quality cables to go along with a higher resolution :)

DVI can even look worse depending on the cables and the processing circuits of the monitor.
 
It depends on the gfx card, VGA cable quality and the monitor you are using.
At that res, VGA can look really bad or extremely good; it's pot luck in many respects.
Whether you need to change depends on how good it is for you already.
 
DVI doesn't look worse if a 'lesser' cable is used. It's a digital signal, so it either works or it doesn't.
A better DVI cable can only extend the signal's range.

You're a fool if you believe that 'gold-plated' DVI cables give a better image than ones made from normal metal/copper.

Thus a DVI cable doesn't affect the image quality the monitor receives, whereas with analog VGA the cable itself can affect the image. Therefore DVI is preferred.
 
My TV accepts both VGA and DVI (HDMI).

The VGA signals the TV accepts are limited to certain frequencies, but DVI is DVI and is always accepted.
 
DVI should always be used if possible. VGA adds unnecessary conversions that soften the image. (This can range from slightly noticeable to somewhat blurry.) VGA on digital displays is a legacy connection that is still included for compatibility.
 
You should always use DVI at the monitor's native resolution. That'll give the best picture. VGA will give you a varying degree of blurriness, depending on the resolution.
 
I'd love to see people try the Pepsi Challenge with a good LCD screen, videocard and VGA & DVI cables. Most of the differences you see are purely in your head.

Perhaps something HardOCP could organise? :)
 
DVI = Digital end to end, VGA = Analogue

That should be all you need to make a decision. With VGA you are introducing two conversions (D/A, then A/D at the other end); depending on the quality of those you can end up with excellent or poor results, but never as good as DVI IMHO.
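As a rough sketch of that round trip (a toy illustration, not a real video pipeline; the 0.7 V full scale matches VGA's nominal swing, but the 3 mV noise figure is just an assumption for the example):

import random

def vga_round_trip(value, noise_mv=3.0, full_scale_mv=700.0):
    # D/A in the graphics card: 8-bit code -> analog voltage (~0.7 V peak for VGA)
    analog = value / 255.0 * full_scale_mv
    # noise picked up along the cable and connectors (assumed Gaussian, made-up magnitude)
    analog += random.gauss(0.0, noise_mv)
    # A/D in the monitor: sample the voltage back to an 8-bit code
    recovered = round(analog / full_scale_mv * 255)
    return max(0, min(255, recovered))

pixel = 200
print("sent over DVI:", pixel)                  # digital link: the exact value arrives, or nothing does
print("sent over VGA:", vga_round_trip(pixel))  # analog link: usually close, not always identical

The point is only that the analog path has somewhere for error to creep in; a clean VGA setup keeps that error below one code step and looks fine.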
 
If you use top-of-the-line equipment, i.e. very expensive VGA cables with high-end monitors, you might not see a noticeable difference, but for average consumer monitors, especially with the bundled VGA and DVI cables, there is a clear difference at high resolutions.

Even from a purely technical point of view, VGA signals are more susceptible to degradation than DVI (two conversions plus the characteristics of analog signal transmission).
 
And of course DVI is impervious to dropped and flipped bits. For very high resolutions there isn't even a comparison possible, as single-link DVI maxes out at roughly 1915 x 1436 pixels (standard 4:3 ratio), 1854 x 1483 pixels (5:4 ratio) or 2098 x 1311 (widescreen 8:5 ratio), something VGA easily reaches as well. Higher resolutions over DVI are only possible by enabling the second link in the connector (dual-link DVI), which essentially means you're running two TMDS links, whereas VGA is then still using only a single connection.
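To put numbers on that single-link limit, here's a back-of-the-envelope check. The 165 MHz single-link pixel clock is from the DVI spec; the timing totals (active pixels plus blanking) are the commonly quoted CVT / CVT-RB figures for 60 Hz, so treat them as approximations:

SINGLE_LINK_MHZ = 165.0

modes = {
    # name: (h_total, v_total, refresh_hz) including blanking
    "1920x1200 @60, CVT standard blanking": (2592, 1245, 60),
    "1920x1200 @60, CVT reduced blanking":  (2080, 1235, 60),
    "2560x1600 @60, CVT reduced blanking":  (2720, 1646, 60),
}

for name, (h_tot, v_tot, hz) in modes.items():
    clock_mhz = h_tot * v_tot * hz / 1e6
    verdict = "fits single link" if clock_mhz <= SINGLE_LINK_MHZ else "needs dual link"
    print(f"{name}: {clock_mhz:.1f} MHz -> {verdict}")

So 1920x1200 at 60 Hz only squeezes onto a single link with reduced blanking (~154 MHz); with standard blanking (~194 MHz) or anything bigger you need the second link.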

In the end the image quality of all but the most expensive, high-end LCDs is saddening anyway :)
 
Came across this not long ago with a friend's new Dell. He had the VGA cable installed instead of the DVI cable. After swapping cables, as much as some may not believe it, there was a difference in image quality. The digital image looked more vibrant. It was quite noticeable.
 