TV video connection - VGA (crystal clear) vs HDMI (oddly fuzzy/blurry)

RedJamaX

I'm sure this question has been asked before, but I'm looking for an answer that is not a shot-in-the-dark kind of suggestion, or some passed-off answer like "mine works, something's wrong with your hardware" ...

When connecting a video card via HDMI, the image seems a bit fuzzy, especially noticeable on text. Using the VGA cable instead provides a crystal clear image. It "looks" like the wrong resolution is being used, as though the image is being rendered half a pixel off in either direction... but the output is set to 1080p on a 1080p TV. There is a similar difference between DVI and HDMI when connecting a 24" monitor: DVI is crystal clear and HDMI just seems... "off" somehow, like the pixels are oversaturated with color (reducing brightness and color does not help), or on some monitors it looks similar to the old "convergence" problems we used to see on CRTs...

It's not blurry or fuzzy to the point that something is obviously wrong... it's just "off" somehow, and there is a noticeable difference in clarity when using the VGA port for a TV or the DVI port for a monitor. It's not noticeable when watching movies or playing games, only when navigating the Windows interface (icon labels and menus) or using text-heavy programs (web browsing or word processing). The difference is much less noticeable on a monitor; nonetheless, DVI still provides a clearer, sharper image than HDMI.

The two PCs listed in my signature are connected to actual TVs, plus two others are connected to 24" monitors. Each of these PCs has had a variety of video cards with HDMI out, including: GT640 1GB, GT640 OEM 1GB, GTX260 896MB, GTX550Ti 1GB, GTX560Ti 1GB, GTX750Ti 2GB, GTX960 2GB, GTX970 4GB, Radeon HD 4650 1GB, Radeon 6750 512MB, and R9 280X 3GB. PC builds include machines with various chipsets and processors ranging from Core 2 Duo 1.83 - 3.0, Core 2 Quad 2.4 - 3.0, Xeon X5450, Intel G3220, Core i5-2500 and Core i5-2500K. TVs and monitors include a Samsung 57" projection (8 yrs old), Vizio 55" 480Hz (5 yrs old), Sony 47" (2 yrs old), Vizio 42" 120Hz (3 yrs old), Seiki 32" (1 yr old), AOC 24" monitor (2 yrs old), Dell 24" (7 yrs old), and ASUS 24" (1 yr old).

The Question(s):
1. Is there a way to make HDMI output just as crystal clear as VGA and DVI on a TV and a monitor (respectively)... or is this just the way it is?
2. If it's just the way it is... why? Any links to explain it? Maybe there's something that's simply different about the way HDMI works.
3. Does anybody else notice this as consistently as I do?
4. Does your setup display VGA (and/or DVI) and HDMI with equal quality... by equal, I mean that you see ABSOLUTELY NO DIFFERENCE between the two connections? If so, what kind of system do you have (video card, monitor/TV, etc.)? What are your settings for your video card and your display? Did you notice what I'm talking about and have to make adjustments to fix it, or was it just plug-n-play perfection?

I have looked in other forums and it always seems to be the same list of responses... check the resolution, driver, video card, cable, power interference, TV settings, model of the video card, model of the TV or monitor... No, these are not the answer. This is a phenomenon I have noticed over several years with several TVs and monitors, many different PCs, and even more video cards, and many other people have noticed it as well.

Resolution set to the native 1080p (or 720p for some TVs).
Where applicable, all TV post-processing has been disabled (interpolation, HDR, film mode, light adjustment, etc.) and "Game Mode" enabled.


Let me also add that, while I am new to this forum, I am not new to computers... I have worked 10 years in desktop field support and the most recent 7 years in tier 3 server support, and I have built all of my own computers and game rigs for myself and family over the last 17 years (starting from the days of the Riva 128).
 
Not-so-subtle "hint" here: pick a forum and stick with it; don't spam the same post in several forums.
 
Without knowing the exact models of your TVs, my best guess is that the TVs don't have a PC mode that lets you set them to 1:1 pixel mapping via HDMI, or if they do, you don't have it enabled. Look up overscan and TVs and you'll find many articles about the phenomenon.
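To see why overscan scaling by itself softens text, here's a minimal sketch (assuming NumPy; the 3% stretch is just an illustrative figure, not measured from any of these TVs) of what happens to a one-pixel-wide detail when the TV rescales a 1920-wide image instead of mapping it 1:1:

import numpy as np

# one scanline with a single one-pixel-wide white line on black
src = np.zeros(1920)
src[960] = 1.0

scale = 1.03                         # a few percent overscan, typical TV default
dst_x = np.arange(int(1920 * scale))
# resample the line onto the stretched pixel grid (linear interpolation)
resampled = np.interp(dst_x / scale, np.arange(1920), src)

print(resampled[985:993].round(3))
# the single 1.0 sample now straddles two panel pixels (roughly 0.22 and 0.81)
# instead of landing cleanly on one -- that smearing is the "fuzz" seen on text

With 1:1 pixel mapping (PC mode / "Just Scan" on many TVs) no resampling happens, which is why the picture snaps into focus.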
 
Overscan... found some things on that and will give it a shot when I get home...

Still, I wonder why this has not become something that is auto-adjusted as standard. It's obviously implemented correctly on game consoles... so you would think PC graphics cards would also adjust this correctly by default.

The Xbox 360, PS3, Xbox One, and PS4 all handle this adjustment automatically... you may have to (or be given the option to) adjust the screen for some games... but by default, the console OS interface is just as crystal clear as a PC connected via a VGA or DVI cable...

Seems like I may need to ask Nvidia or ATI directly...
 
Adjusting overscan makes no difference... the TV is correctly identified by the video card, the edges are already lined up properly in the overscan adjustment properties, and the resolution is correctly set at 1920x1080.

No answer from Nvidia yet... and the VGA connection on the same TV and video card is crystal clear... still looking for the reason why...
 
I've seen an issue kind of like this before with an AMD video card.

The color space used by video cards can change based on the output being used. Somewhere in the advanced settings of your video driver there should be something where you can select what color space to use.

The color spaces will be something like 4:4:4 or 4:2:2.

Try changing which color space is in use. I had that clear up a weird problem I had with an HP monitor just not quite looking right. The monitor supported the mode being supplied, but it just looked a little off. Switched to a different color space, and all issues were resolved.

A VGA signal is analog rather than digital. It goes through a digital-to-analog conversion on the video card side and an analog-to-digital conversion on the monitor side, so comparing against it doesn't really provide any information on where the problem may reside.
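As a rough illustration of the 4:4:4 vs 4:2:2 point above (a sketch assuming NumPy and the textbook full-range BT.601 conversion; the driver's real pipeline may differ), here is what 4:2:2 subsampling does to a hard black-to-red edge, like the stroke of colored text:

import numpy as np

# one scanline: 3 black pixels followed by 5 pure-red pixels (R, G, B)
line = np.array([[0, 0, 0]] * 3 + [[255, 0, 0]] * 5, dtype=float)

# RGB -> YCbCr (full-range BT.601 coefficients)
y  = 0.299 * line[:, 0] + 0.587 * line[:, 1] + 0.114 * line[:, 2]
cb = 128 - 0.168736 * line[:, 0] - 0.331264 * line[:, 1] + 0.5 * line[:, 2]
cr = 128 + 0.5 * line[:, 0] - 0.418688 * line[:, 1] - 0.081312 * line[:, 2]

# 4:2:2 -- luma kept per pixel, chroma averaged over each horizontal pixel pair
cb422 = np.repeat((cb[0::2] + cb[1::2]) / 2, 2)
cr422 = np.repeat((cr[0::2] + cr[1::2]) / 2, 2)

# YCbCr -> RGB again
r = y + 1.402 * (cr422 - 128)
g = y - 0.344136 * (cb422 - 128) - 0.714136 * (cr422 - 128)
b = y + 1.772 * (cb422 - 128)

print(np.clip(np.stack([r, g, b], axis=1), 0, 255).round())
# the pixel pair straddling the edge (indices 2 and 3) comes back as dark red
# and washed-out red instead of clean black and clean red -- color fringing
# on text edges, which 4:4:4 output avoids entirely

This is only an illustration of why a 4:2:2 link makes fine colored detail look "off" while full-screen video is unaffected.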
 

Good point about the conversion factor with VGA... that makes perfect sense... but doesn't explain why DVI is better than HDMI... I will try the color space adjustment.

Thanks
 
Doesn't look like "color space" is an available option with the current Nvidia drivers
 
In the Nvidia Control Panel, in the "Change Resolution" section, there is an "Output dynamic range" setting in the bottom right. It should be set to Full instead of Limited. (Full is 0-255, Limited is 16-235.)

Note: somewhere between 347.52 and 352.86, Nvidia moved this from the "Adjust desktop color settings" section to the "Change Resolution" section.
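To make the Full vs Limited difference concrete, here's a tiny worked example in plain Python (no driver calls, just the standard 16-235 video-level mapping) showing why a limited-range signal shown on a full-range display looks washed out:

def full_to_limited(v: int) -> int:
    """Map a full-range (0-255) value onto limited/video range (16-235)."""
    return round(16 + v * 219 / 255)

for v in (0, 128, 255):
    print(f"full {v:3d} -> limited {full_to_limited(v):3d}")
# full   0 -> limited  16   (black is no longer fully black)
# full 128 -> limited 126
# full 255 -> limited 235   (white is no longer fully white)

If the GPU outputs Limited over HDMI while the TV or monitor expects Full (or vice versa), blacks and whites are shifted like this across the whole picture, which reads as the dull, slightly "off" HDMI image compared to DVI or VGA.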
 