Weird ghosting on VGA only?

slaya

[H]ard|Gawd
Joined
Nov 10, 2002
Messages
1,132
Hey, so I have an Acer P191W monitor which I purchased quite some time ago. When I bought it, I only used it through DVI, which worked fine. Many months later I tried to use it via VGA, only to find a weird blurry/ghosting effect.

Through the years I've used many different desktops, video cards, and VGA cables, all known to work fine on other monitors. I've narrowed it down to the monitor itself, but I can't seem to figure out how to fix it; I've scrambled through every setting on the monitor and the computer with no luck.

On the monitor's DVI port, everything works perfectly fine, though.

I've attached an image to show what I am referring to.

Link to image

Thanks
 
Why even bother with VGA when you have DVI? :confused:

ps. you should take a photo, not a screenshot :rolleyes:
 
At this moment, I'm in a position where I only have onboard VGA. I'm trying to avoid purchasing a video card for now, and was hoping I could fix this VGA problem and hold off on the video card purchase.

I'm pretty sure the screenshot captured the effect quite well, no?
 
Looking at the pic, it's not ghosting; it's wonderful MS deciding you need font smoothing, a feature they enabled by default in Windows Vista and 7.

I turned this off a while back, but I forget exactly what I did. Google around for "font smoothing", "ClearType", or "smooth edges of screen fonts", or something.
 
I am looking into that now, thank you.

Though, look at the shadow effect coming off the icons, as well as the fuzzy vertical stripe on the left side of the screen. It's affecting a bit more than just the text.
 
At this moment, I'm in a position where I only have onboard VGA. I'm trying to avoid purchasing a video card for now, and was hoping I could fix this VGA problem and hold off on the video card purchase.

I'm pretty sure the screenshot captured the effect quite well, no?

Let's use some logic here...

If you have nailed down the problem to the monitor...

And the monitor displays what the video card sends to it...

And a screenshot is a capture of a single frame from the buffer in the video card BEFORE it gets sent to the monitor to display...

Then a screenshot isn't capturing the issue if it's due to the monitor itself.

Monitors are terminus devices: anything sent to one pretty much stops there; the link isn't bidirectional. A screenshot doesn't show what artifacts the monitor is adding. If you crack the LCD on your monitor, a screenshot won't capture that. Nor will it capture issues like an entire color channel disappearing because something in the monitor fried (like a red gun going poof on a CRT).
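The reasoning above can be sketched in a few lines. This is a toy model, not any real graphics API: the point is just that a screenshot is a copy of the framebuffer taken before the signal ever reaches the cable or monitor, so distortion added downstream can never appear in it.

```python
# Toy model: a screenshot copies the framebuffer; analog artifacts
# are added later, on the cable/monitor side, and never get captured.
# The "distortion" here is a made-up stand-in, not a real signal model.

def cable_and_monitor(frame):
    """Pretend the analog path corrupts the signal somehow."""
    return [v + 0.2 for v in frame]   # stand-in for ghosting/noise

framebuffer = [0.0, 0.5, 1.0]         # what the video card rendered
screenshot = list(framebuffer)        # screenshot = copy of the buffer
displayed = cable_and_monitor(framebuffer)

print(screenshot == framebuffer)      # True: identical to the buffer
print(screenshot == displayed)        # False: the distortion is invisible to it
```

Which is why a monitor-side fault needs a camera photo, while a software-side effect (like font smoothing) shows up fine in a screenshot.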
 
Considering the screenshot is displaying the problem exactly as I'm seeing it on my screen, your logic leads me to believe there is a problem elsewhere?

I can honestly say that I've had at least 3 different computers on this monitor trying to use it via VGA, all with this same problem.
 
Considering the screenshot is displaying the problem exactly as I'm seeing it on my screen, your logic leads me to believe there is a problem elsewhere?

I can honestly say that I've had at least 3 different computers on this monitor trying to use it via VGA, all with this same problem.

Are you viewing this screenshot on the same monitor that is having the problem? Because, quite frankly, it looks fine to me. The variance in icon quality is to be expected for programs made back when lower-resolution icons were standard.
 
Doh, re-read the post about the cable. Not sure how to delete this post.
 
Yeah... taking a screenshot doesn't show anything about what's actually displayed on the monitor. You need to take a picture of the monitor with a camera.

It's probably VGA ghosting from RF noise. You need to get a better cable that's shielded. It probably looks like this: http://www.vpi.us/images/vga-ghost.jpg
 
Yeah, post an actual picture, not a screenshot.

Another thing to remember is that the analog VGA signal is going to give you a less crisp display compared to DVI, even if you have a well-shielded cable. If that is all that is happening, then there is nothing you can do about it if you must use the VGA signal.
 
Generally speaking, if the VGA cable isn't almost as thick in diameter as a dime, it's junk. It needs that much shielding on top of the actual signal and ground wires just to prevent ghosting over even *moderate* distances when high resolutions are involved.
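The kind of ghosting being described can be pictured as a delayed, attenuated echo of the signal: a reflection from an impedance mismatch or poorly shielded cable arrives a few pixel-clocks late and paints a faint, shifted copy of the image. Here's a minimal toy model; the delay and amplitude numbers are made up purely for illustration.

```python
# Toy model of analog VGA "ghosting" on a single scanline:
# part of the signal reflects, arrives `delay` pixels late, and is
# mixed back in at reduced amplitude, producing a shifted faint copy.

def add_ghost(scanline, delay=8, amplitude=0.25):
    """Return a scanline with a delayed, attenuated echo mixed in."""
    out = []
    for x, v in enumerate(scanline):
        echo = scanline[x - delay] if x >= delay else 0.0
        out.append(min(1.0, v + amplitude * echo))
    return out

# A single white vertical bar on a black scanline...
line = [1.0 if 10 <= x < 12 else 0.0 for x in range(32)]
ghosted = add_ghost(line)
# ...gains a faint second bar `delay` pixels to the right.
print([x for x, v in enumerate(ghosted) if v > 0])  # → [10, 11, 18, 19]
```

Better shielding and matched termination shrink the echo's amplitude, which is why the thick, well-shielded cables mentioned above matter more as resolution (pixel clock) goes up.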
 
After googling the term "VGA ghosting", that is exactly what is happening to my display.

I am using known-good VGA cables with no luck. Could it be an issue with the display itself?
 
Could it be an issue with the display itself?

There is no "could be" about it; it IS an issue with the display. You have effectively ruled out every other possibility. The monitor has subpar or malfunctioning VGA circuitry. Lots of monitors with multiple inputs go really cheap on the VGA input.

There is no way to fix it. Sorry.
 
After googling the term "VGA ghosting", that is exactly what is happening to my display.

I am using known-good VGA cables with no luck. Could it be an issue with the display itself?

Shit cables or shit display. Probably both if you're using the cables that came with other displays.
 
I've been aware of this problem for some time and always disregarded it because it wasn't an issue while I was using DVI. I see now that it isn't a problem I can fix, and am ordering a video card.

Thanks for the help
 