Is HDMI really better than VGA?

Mike63

n00b
Joined
Jan 10, 2014
Messages
34
I first used the HDMI port of my S2440L with a DVI-HDMI adapter, but I eventually switched to the included VGA cable because I couldn't figure out how to force the Intel HD Graphics 3000 to output full-range RGB (0-255). There was no discernible difference in text sharpness after the switch to VGA, and in fact I could find no fault at all with the VGA output. I'm thinking maybe Dell used a high-quality analog-to-digital converter in this instance, since the S2440L has no DVI connection. In any case, what I've seen on this display is certainly evidence against the conventional wisdom that HDMI is always better than VGA. Perhaps I would see some improvement in image quality if I bought a separate video card so that I could get full-range RGB through HDMI, but that seems like a waste of money given that I have no problem with the VGA output. For those of you not able to get HDMI to work right with this display, maybe you should give your VGA cable a try.
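
For context on what full-range vs. limited-range means here: PCs normally use the full 0-255 code range per channel, while the HDMI/consumer-video convention maps black to 16 and white to 235. If the GPU sends limited range to a panel that expects full range (which appears to be what the Intel driver did over HDMI here), blacks turn grey and contrast is crushed. A minimal sketch of the two mappings, my own illustration rather than anything from Intel's driver:

Code:
def full_to_limited(value: int) -> int:
    """Map a full-range (0-255) 8-bit value to limited/video range (16-235)."""
    return round(16 + value * 219 / 255)

def limited_to_full(value: int) -> int:
    """Map a limited-range (16-235) value back to full range, clamping anything outside it."""
    return min(255, max(0, round((value - 16) * 255 / 219)))

# If a limited-range signal is shown as-is on a full-range panel,
# "black" arrives as code 16 and "white" as 235: grey blacks, dim whites.
print(full_to_limited(0), full_to_limited(255))   # 16 235
print(limited_to_full(16), limited_to_full(235))  # 0 255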
 
VGA= less bandwidth, no audio, analog

HDMI= more bandwidth, audio, digital
 
With higher-quality/better cables? VGA can carry a lot of bandwidth (you could even say the standard itself sets no hard limit). But if you define the limit as the point where interference and signal degradation make the image too bad to accept as usable, then IMHO analog VGA at WQXGA, 4K and the like would fare worse than, e.g., HDMI 1.4/2.0. Analog VGA is also limited by practical implementations in video cards and monitors, which don't support very high resolutions and refresh rates over analog VGA because the image quality would be too poor, even though the standard itself has no such restriction.
Pity, though... I miss the simplicity, universality, interoperability and longevity of VGA, which was used for many years on everything from 11" to 21" displays at every price point. I'd love to see another interface like that appear, one that could be used for 10-15 years as-is, with no lame HDCP encryption, available and supported by almost everything... but IMHO nothing will be developed like that anymore. And digital interfaces do handle lossless, no-quality-loss transfers over long distances a bit better.
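
To put rough numbers on the bandwidth question, here's a back-of-the-envelope sketch. The total (with-blanking) timings and link caps below are the commonly cited CEA/CVT-RB and TMDS figures as I recall them, so treat them as approximate illustration rather than a spec reference:

Code:
# Approximate pixel clock needed for a few modes vs. common link limits.
LINK_LIMITS_MHZ = {
    "DVI single-link / HDMI 1.0-1.2 TMDS": 165,
    "HDMI 1.3/1.4 TMDS": 340,
    "HDMI 2.0 TMDS": 600,
    "typical VGA RAMDAC": 400,
}

# (label, horizontal total incl. blanking, vertical total incl. blanking, refresh in Hz)
MODES = [
    ("1920x1080 @ 60 Hz (CEA-861)", 2200, 1125, 60),
    ("2560x1440 @ 60 Hz (CVT-RB)",  2720, 1481, 60),
    ("3840x2160 @ 60 Hz (CEA-861)", 4400, 2250, 60),
]

for label, h_total, v_total, hz in MODES:
    clock_mhz = h_total * v_total * hz / 1e6
    fits = [name for name, cap in LINK_LIMITS_MHZ.items() if clock_mhz <= cap]
    print(f"{label}: ~{clock_mhz:.1f} MHz -> fits: {', '.join(fits) or 'none of the above'}")

So 1080p60 (~148.5 MHz) is easy for everything, 1440p60 (~242 MHz) is beyond single-link TMDS but well within a 400 MHz RAMDAC, and 4K60 (~594 MHz) is digital-only territory. The catch with VGA is that the DAC, the cable and the monitor's ADC all degrade the analog signal at those rates, which is exactly the signal worsening described above.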
 
I do all of my 'non twitch' gaming on my 40" Sony TV, and had previously been using HDMI.

I recently switched to VGA, as apparently this reduces the input lag somewhat (and it noticeably does).

However, the picture is definitely worse, though not by much. Many wouldn't notice, but the colours are slightly off and it's certainly not as sharp. I've gone through 4 cables of varying quality and connectors, and all of them make little difference. The very best one just removes about 99% of the image ghosting (another problem with VGA).

Unless you need the greater bandwidth of VGA, or like me you wanna bypass as much of a TV's signal processing as possible, HDMI is better.
 
VGA: limited mainly by cable and converter quality; the standard itself sets no hard bandwidth cap, so with an excellent cable there's effectively no ceiling.
DVI/DP/HDMI: limited by both the spec and cable quality, so you're locked to whatever the standards bodies have defined.
The answer to your question should be simple. My 16-year-old CRT can do 2048x1536 at 80Hz. And it is 16 years old.
 
On a TV... full HD, that is... the signal degradation is only moderate. Push the resolution higher and the quality loss becomes more noticeable. That's why at some point monitor and GPU vendors stopped improving VGA capability and kept supporting it the way floppy drives were supported: a $5 part included in most products but almost unused (likewise analog VGA, if not as a native connector then at least via a DVI-to-VGA adapter), and very often over VGA you can't even run the monitor's native resolution properly. Like I wrote in my previous post, I'd love to see something like a "VGA 2" happen, just as universal and long-lived, without the content-protection crap, but I'm pragmatically skeptical that there will be one. And comparing where the original VGA stopped (in real hardware implementations) with where digital interfaces are now, the latter are already further along.
 
I too was using a DVI to HDMI adapter, DVI at the computer end and HDMI at the monitor. The blacks were black; there was no detail in dark areas, it was just... black. I switched to the cheap included VGA cable, but had to put a DVI adapter on the computer end (the R7950 doesn't have VGA out) and whoa...! The picture was awesome and I got my dark scenes back. Not sure if straight DVI to DVI would do the same thing or not....
 
VGA= less bandwidth, no audio, analog

HDMI= more bandwidth, audio, digital

That is practically false unless you happen to have a very new 4K-capable HDMI setup. I have a Korean 1440p monitor, and guess what: with HDMI on a GeForce 670 mobile I could NOT pass the signal, but I actually could over VGA. That's right, in that setup VGA had more usable bandwidth than HDMI.

HDMI sucks; it's the shittiest connection ever invented. It doesn't have a way to secure cables, so they slip out. The standards body is ALWAYS behind the times; they never get anything ready ahead of time for an incoming need. As a result, cutting-edge setups often require alternatives such as dual-link DVI or, gasp, VGA. How is it possible to have so much fail in one simple cable?
 
Simple. The industry wanted something under its control, be it royalties for using it or HDCP that supposedly lessens piracy, lol. Then HDMI reached critical mass in the AV world and became self-sustaining, so it is and will be used for many years to come, despite all the drawbacks mentioned and even though better alternatives exist (DP had higher bandwidth, much sooner, and was free of licensing payments). Don't underestimate the power of inertia and legacy.
 
Only thing I can say is that HDMI has given me nothing but problems. DVI, which I have been using exclusively the last 6 years, has been issue free. Really weird since they are supposed to be similar.
 
Only thing I can say is that HDMI has given me nothing but problems. DVI, which I have been using exclusively the last 6 years, has been issue free. Really weird since they are supposed to be similar.

They are the same at the physical layer; it's at the link layer that things start to get a little hazy. But as stated earlier, the manufacturers want "control", which means shit won't work right.
 
I too was using a DVI to HDMI adapter, DVI at the computer end and HDMI at the monitor. The blacks were black; there was no detail in dark areas, it was just... black. I switched to the cheap included VGA cable, but had to put a DVI adapter on the computer end (the R7950 doesn't have VGA out) and whoa...! The picture was awesome and I got my dark scenes back. Not sure if straight DVI to DVI would do the same thing or not....

You should be able to force Limited RGB (16-235) or Full RGB (0-255) in AMD's CCC. It should be under My Digital Panels -> Pixel Format.
 
what the hell is going on in this thread?

Lol, was wondering the same thing. I saw the title and was like "wut?". Then I read the thread and was like "lolwut"?

Now I'm just confused. This seems like one of those threads where people try to argue whether CDs are really better than vinyl, when the answer was long ago settled that "yes, they are" (unless you dig noise and popping but then you could always just digitally corrupt your music).

VGA sucks, CRTs suck, they are all dead-end technology held up mostly for nostalgia's sake. Yes, I realize in a few specific metrics they are better, but a horse-drawn carriage is better in a few specific metrics than a car... so what?

Now, give me FED or SED or something like that with a digital connection and then we're talking...
 
Lol, was wondering the same thing. I saw the title and was like "wut?". Then I read the thread and was like "lolwut"?

Now I'm just confused. This seems like one of those threads where people try to argue whether CDs are really better than vinyl, when the answer was long ago settled that "yes, they are" (unless you dig noise and popping but then you could always just digitally corrupt your music).

VGA sucks, CRTs suck, they are all dead-end technology held up mostly for nostalgia's sake. Yes, I realize in a few specific metrics they are better, but a horse-drawn carriage is better in a few specific metrics than a car... so what?

Now, give me FED or SED or something like that with a digital connection and then we're talking...


The correct analogy would be this:

VGA/CRT/High Hz=FLAC

HDMI/LCD/60Hz=MP3

Until OLED becomes mainstream, CRT > LCD, and when OLED arrives, then OLED > everything else, including FED/SED.
 
VGA quality is also limited by the driver and receiver at each end of the cable, not just the cable itself. I have always been able to tell the difference between a digital and analog connection to 'high' res monitors (>1.5MP).
 
dopple: Hmm, are you sure, that lossy analog signal transfer with loosing quality at higher resolutions/refresh rates/longer cable lengths is comparable to loosless audio compression, and loosless digital signal transfer is comparable to loosy audio compression and not the other way around? :)
 
VGA sucks, CRTs suck, they are all dead-end technology held up mostly for nostalgia's sake.

People claim that VGA will always produce a blurry image in comparison to HDMI or DVI, but I'm just not seeing that with this particular display. I don't know if this is because of the panel type, the size of the display, or what. Whatever the reason, I would argue that it's a waste of money for S2440L owners to purchase a separate video card just to get full-range RGB over HDMI.

I also have a 23" TN display, and I see no difference between VGA and DVI with it. The conventional wisdom that HDMI/DVI > VGA probably goes back to an earlier time when most LCDs had poor-quality ADCs. The idea that IPS > TN/VA probably also goes back to the time when most IPS displays were true 8-bit instead of 6-bit+FRC.
 
VGA sucks, CRTs suck, they are all dead-end technology held up mostly for nostalgia's sake. Yes, I realize in a few specific metrics they are better, but a horse-drawn carriage is better in a few specific metrics than a car... so what?

Now, give me FED or SED or something like that with a digital connection and then we're talking...

I got news for you, hoss. LCDs have yet to be as good as good CRTs were when Trinitron monitors like FW900 were out. "In a few specific metrics", try superior in every single performance metric aside from product weight.
 
I got news for you, hoss. LCDs have yet to be as good as good CRTs were when Trinitron monitors like FW900 were out. "In a few specific metrics", try superior in every single performance metric aside from product weight.

Sorry, but I disagree. Fine text on an LCD driven at native resolution is far superior to CRT, be it aperture grille or shadow mask. I am sooo happy I don't need to worry about convergence adjustments anymore.
 
Sorry, but I disagree. Fine text on an LCD driven at native resolution is far superior to CRT, be it aperture grille or shadow mask. I am sooo happy I don't need to worry about convergence adjustments anymore.

You're absolutely wrong.

Guess why serif fonts are not standard on computers these days...whereas serif fonts dominate the printing industry?

Because common-pixel-density LCDs blow ass at fine text rendering, contrary to your unfounded opinion. Microsoft noticed it and changed the default font in Office from Times New Roman to the sans-serif Calibri because the rendering of serifs is awful on LCDs.
 
If VGA looks better than HDMI in the important metrics (sharpness, contrast, detail) then something is wrong with the system.

You have to remember that VGA on an LCD requires processing of the signal. There can be an unsharp mask, contrast enhancement, color saturation boost (i.e. tint/color) and other adjustments applied to the image after the A/D conversion. You may or may not have access to these controls.

There are specific test patterns you can use to determine what processing has been used, but I leave it up to the reader to discover these (much has been written that will not fit in this small text input box).
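
For anyone who wants to try this, here's a quick sketch of one classic pattern: alternating single-pixel black and white vertical lines. Viewed unscaled at the panel's native resolution, it makes bad VGA pixel clock/phase sampling (shimmer, vertical banding) and overdone sharpening (haloing) obvious, while over DVI/HDMI it should look perfectly uniform; a monitor's VGA auto-adjust generally relies on a busy pattern like this to lock clock and phase. This is just my own illustration using Pillow, not one of the specific patterns alluded to above:

Code:
# Generate a 1-pixel vertical line test pattern and save it as a PNG.
# Requires Pillow: pip install pillow
from PIL import Image

WIDTH, HEIGHT = 1920, 1080  # set to your panel's native resolution

img = Image.new("RGB", (WIDTH, HEIGHT))
pixels = img.load()
for x in range(WIDTH):
    value = 255 if x % 2 else 0      # alternate black and white columns
    for y in range(HEIGHT):
        pixels[x, y] = (value, value, value)

img.save("pixel_phase_test.png")
print("Wrote pixel_phase_test.png; view it fullscreen, unscaled, over each input.")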
 
LCDs have yet to be as good as good CRTs were when Trinitron monitors like FW900 were out. "In a few specific metrics", try superior in every single performance metric aside from product weight.
Compared to LCDs, CRTs have many downsides:

-Bigger. Wall mounting is pretty much out of the question.
-Hotter. Not comfortable during warm summers.
-Heavier. Not easily movable, thus annoying to clean.
-Phosphor burn-in. Needs a screen saver to prevent problems.
-Flickering. 100Hz minimum needed, but not always possible.
-Stronger color shift. Recalibration needed quite regularly.
-Power hungry. Higher energy bills and climate concerns.
-Glass reflection. Matte and semi-glossy don't exist.
-Poor brightness. Can't produce pure, strong whites.
-Moiré interference. Ripples and waves. Hard to deal with.
-Imperfect geometry. No matter the calibration, always fuzzier.
-Terrible ergonomics. Little height adjustability and no pivot.
-Radiation emissions. The leaded glass doesn't block all X-rays.
-Contains toxic waste. Environmental and health concerns.
-Lacks modern inputs, like DVI, HDMI and DisplayPort.
-Phosphor persistence. Looks kinda like motion blur.
-Lack of HDCP support. Can only work with a hack.
-Aesthetically unpleasant. Hardly any design options.
-Used; no warranty/support. New ones aren't sold anymore.
-Affected by magnetic fields. Have to watch out with speakers.
-Relatively small screen size. 24" is basically the maximum.
-No modern innovations, like 3D, G-Sync and FreeSync.

CRTs aren't bad, but they're certainly not "superior to LCDs in every single performance metric aside from product weight." Both screen types have pros and cons.

Because common-pixel-density LCDs blow ass at fine text rendering, contrary to your unfounded opinion. Microsoft noticed it and changed the default font in Office from Times New Roman to the sans-serif Calibri because the rendering of serifs is awful on LCDs.
These fonts only appear better on a CRT because the text is blurred due to the poorer geometry. The perfect geometry of an LCD exposes even the tiniest flaws in the font. But in practice this isn't really a problem anyway with ClearType. Or use a modern font that was designed for the much sharper LCD displays.
 
[X]eltic;1041093298 said:
Compared to LCDs, CRTs have many downsides:

<snip>

:)| tl;dr
 
dopple: Hmm, are you sure, that lossy analog signal transfer with loosing quality at higher resolutions/refresh rates/longer cable lengths is comparable to loosless audio compression, and loosless digital signal transfer is comparable to loosy audio compression and not the other way around? :)

Ask someone who has used a high-Hz LCD panel if they are willing to go back to a 60 Hz panel... just the same way someone with high-end audio equipment will eschew MP3s in favour of FLACs.

Using LCDs after high-end CRTs is like forcing a rich man to live like a poor man.
 
If VGA looks better than HDMI in the important metrics (sharpness, contrast, detail) then something is wrong with the system.

What's wrong with my system is that it has no HDMI connector and the iGPU limits RGB output to 16-235 through a DVI-HDMI adapter. Using the adapter results in poor contrast of course, but text sharpness is more or less the same using the adapter or the VGA connection.
 
[X]eltic;1041093298 said:
Compared to LCDs, CRTs have many downsides:

<snip>

Yet, despite all of that, I have yet to find an LCD which could replace my EIZO Trinitron tube.

I'm not asking for much, just a 60 Hz panel with uniform brightness and decent colors. Haven't found it yet. I paid some $450 for a 24" Asus IPS, but the corner glow was absolutely a pain in the ass and I couldn't deal with it. My old TN actually bothered me less, because at least there the backlighting was more or less uniform. Kinda sucks to have a dark scene everywhere except that bottom-right corner just poking at you. Drove me nuts.
 
Who was that famous actress who said "I've been rich and I've been poor, and rich is better"?

I've used CRT and LCD, and CRT is better. Talking about image quality, not the weight, size and toxic innards. Just image quality. There is no LCD that I have used that looks better, or performs better in input lag and motion blur, than my defunct FW900 or the Viewsonic P225f. The only criterion where this LCD is an upgrade would be screen size. I'm not saying this LCD is terrible, which it isn't, but it is not better. Even the aspect ratio of the FW900 is better, 16:10 rather than 16:9.

The problem is that the CRTs are getting so old and worn that the tubes are reaching the end of their life span. They all have hundreds of hours on them and don't look as sharp as they did when new, or they quit working altogether. That is why I gave up on them and went with LCD.
 
dopple: Hmm, are you sure, that lossy analog signal transfer with loosing quality at higher resolutions/refresh rates/longer cable lengths is comparable to loosless audio compression, and loosless digital signal transfer is comparable to loosy audio compression and not the other way around? :)
*lossless/lossy
 
Nothing could make me go back to my Viewsonic P225F (used this display from like early 2002 to 2005).

I used it right up until I upgraded to a 4k display (Viewsonic VP2290b LCD) as well as added a dell 3007-wfp (right when they came out).

In the end, one of the most important things for me was resolution. On the P225F I was running 2560x1920 @ 63 Hz, and it was definitely getting fairly blurry at that point. The only resolutions that didn't really look blurry were 1600x1200 and below (it did 100 Hz @ 16x12, and that is usually what I gamed at). Even at 2048x1536 @ 79 Hz I still saw quite a bit of blurriness.

When pushing high resolutions, pretty much all higher-resolution LCDs (2560x1440 or better) blow the CRTs out of the water in sharpness/pixel clarity. CRTs are way too blurry when pushed to their limits.

I can see why people who don't like high resolution would still want to use them though.
 
Nothing could make me go back to my Viewsonic P225F (used this display from like early 2002 to 2005).

I used it right up until I upgraded to a 4k display (Viewsonic VP2290b LCD) as well as added a dell 3007-wfp (right when they came out).

In the end, one of the most important things for me was resolution. On the P225F I was running 2560x1920 @ 63 Hz, and it was definitely getting fairly blurry at that point. The only resolutions that didn't really look blurry were 1600x1200 and below (it did 100 Hz @ 16x12, and that is usually what I gamed at). Even at 2048x1536 @ 79 Hz I still saw quite a bit of blurriness.

When pushing high resolutions, pretty much all higher-resolution LCDs (2560x1440 or better) blow the CRTs out of the water in sharpness/pixel clarity. CRTs are way too blurry when pushed to their limits.

I can see why people who don't like high resolution would still want to use them though.

That's the other thing about this LCD: the resolution. It's a 27-inch TN at 1920x1080. But it is fast, with no motion blur and no input lag. In that regard it is just as good as the CRT. I settled for this TN because it was fast (120Hz) and I wasn't willing to live with any motion blur or input lag. And the colors aren't bad at all. But the resolution is a noticeable downgrade from either the P225f's 2048x1536 or the FW900's 2304x1440. That is my main beef with the monitor. Games do not look as sharp or detailed as on my CRTs.
 
I respectfully disagree that CRTs have better picture quality. It really depends on the user's preferences. A well-calibrated CRT has great picture quality, but it really only shines when it comes to contrast, viewing angles and speed. But LCDs beat CRTs when it comes to sharpness, picture stability, brightness and colors. No CRT can come close to the LCD's crystal clear image; LCDs have perfect geometry. The right LCD doesn't suffer from flickering either, which is easier on the eyes. LCDs are able to display brighter whites/colors too, due to more powerful backlighting. And modern LCDs can display more colors as well, with the right (wide gamut) backlight and color depth.

So in short:

Prefer contrast, viewing angles and hardly any motion blur / input lag? CRT.
Prefer crystal clear, bright and colorful image with no flickering? LCD.

No display technology is superior, it all depends on the user's preferences.

Personally, I never would want to have a CRT again. I can't go back to a small 22.5" screen (FW900), and I just don't like the fuzzier CRT image either. I wish LCDs had CRT contrast and speed*, though. These are of course big CRT strengths.

* Although arguably, some LCDs are already as fast with LightBoost/ULMB and low input lag modes.
 
Most decent CRTs fit the sRGB gamut pretty closely. When black levels are set low enough, these colors are rendered at their full natural saturation.

The only thing LCD has on CRT in color is when it comes to wide gamut.

Not to mention CRT destroys LCD in terms of bit depth.
 
Ask someone who has used a high-Hz LCD panel if they are willing to go back to a 60 Hz panel... just the same way someone with high-end audio equipment will eschew MP3s in favour of FLACs.

The correct analogy would be this:

VGA/CRT/High Hz=FLAC
HDMI/LCD/60Hz=MP3

what the hell is going on in this thread?

lolwut indeed. The MP3/FLAC analogy is especially perplexing. The vinyl vs. CD analogy is more appropriate.
 
Why is it always CRT vs LCD?
CRTs are better for gaming and LCDs are better for web/office.
For games, the lack of sharpness or slight convergence issues don't really matter, and using a CRT for text-based work in 2014 is pretty stupid.
They complement each other well; just use both?
 
Lack of HDCP is not an issue; it is a huge benefit. I'd prefer not to have a display laced with DRM.
 
In other words, FED would have been a godsend, but they scrapped it, so GG.
 
I got news for you, hoss. LCDs have yet to be as good as good CRTs were when Trinitron monitors like FW900 were out. "In a few specific metrics", try superior in every single performance metric aside from product weight.

Totally agree. It's 2014 and I'm still waiting for LCDs to catch up to CRTs in the mainstream.

CRT: Every resolution is native, infinite colours, no viewing angle issues, 'nuff said.

LCDs have introduced such metrics as: colour space, black levels, backlight glow, dead pixels, brightness uniformity, viewing angle, pixel response, input processing latency, etc. All of these were already 'solved' by CRT tech.

Ooohhh, it's slimmer! Whoop dee fucking woo.

Only OLED can beat it, but according to the latest word from Samsung et al., we're roughly 5-10 years away from that tech being truly mainstream.
 