Desktop blurry and oversized in native res on HDTV over DVI-to-HDMI adapter?

uKER

n00b
Joined
Dec 10, 2008
Messages
4
I recently got a full-HD HDTV and I'm trying to get it to work correctly with my PC using a DVI-HDMI adapter to hook up the DVI out of my vid card (an XFX 8800GT) to the TV's HDMI input.

The TV gets the picture with no issues, but when I select 1920x1080 (the TV's native resolution), the desktop stretches about two centimeters past the edge of the screen in every direction, and the picture isn't crystal clear the way it should be on an LCD at its native resolution.

I have tinkered with the overscan/underscan settings, but I haven't been able to get anywhere.
I also tried to force the TV's driver to that of a generic 1920x1080 60Hz flat panel, but nothing changed.

I suspect the EDID is to blame.
It reports no standard 1080p modes, so the PC detects the TV as a 720p set.
I've tried a couple of the OverrideEdidFlags tweaks found here, but haven't gotten very far with those either.
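
In case it helps anyone diagnose the same thing, here's a minimal sketch of how one could check what the EDID actually advertises, assuming the raw EDID has already been dumped to a file (the filename and the use of Python are just for illustration); it only walks the four detailed timing descriptors in the base block:

# Minimal EDID check (sketch): list the detailed timing descriptors in a
# raw EDID dump. The filename 'tv_edid.bin' is hypothetical.

def detailed_timings(edid):
    """Yield (width, height, pixel_clock_mhz) for each detailed timing."""
    for offset in range(54, 126, 18):        # four 18-byte descriptors at bytes 54..125
        d = edid[offset:offset + 18]
        clock = d[0] | (d[1] << 8)           # pixel clock in 10 kHz units, little-endian
        if clock == 0:                       # zero clock = display descriptor, not a timing
            continue
        width = d[2] | ((d[4] & 0xF0) << 4)  # horizontal active pixels
        height = d[5] | ((d[7] & 0xF0) << 4) # vertical active lines
        yield width, height, clock / 100.0

with open("tv_edid.bin", "rb") as f:
    edid = f.read()
for w, h, mhz in detailed_timings(edid):
    print("%dx%d @ %.2f MHz pixel clock" % (w, h, mhz))
# If 1920x1080 never shows up, the TV really is only advertising
# 720p-class modes and an EDID override is the next step.

Note that HDTVs often list their extra video modes in the CEA extension block rather than the 128-byte base block, so this only tells part of the story.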

Can anyone provide any insight?

If anyone requires it, I can provide my TV's EDID.

Any help will be much appreciated.

Thanks in advance.
 
Can you still return it? (EDID can be frustrating. A few years ago, I could get no good fix for this until the TV's motherboard was upgraded...)
 
It sounds like overscan to me. Check your HDTV's settings, look for a zoom option, and play with that. You should be on "Fill" or the equivalent right now; you're looking for a "Standard" setting that makes normal TV content show black bars on each side. If that's not an option, I think the nvidia control panel may have options for overscan (I'm on ATI at the moment so I'm not sure, though CCC has overscan options). Only use that if your TV doesn't have any options of its own.
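
For a sense of scale, a lot of HDTVs apply a fixed overscan (on the order of 5%) to every input by default, and back-of-the-envelope that lines up with the couple of centimeters described in the original post. The panel width here is just an assumed example, not your TV's actual size:

# Back-of-the-envelope: does ~2 cm cropped per side match a typical ~5% overscan?
# The panel width is an assumed example (a 40" 16:9 panel is roughly 88.5 cm wide).

panel_width_cm = 88.5        # assumed panel width
overscan_total = 0.05        # ~5% of the image pushed off-screen overall

cropped_per_side_cm = panel_width_cm * overscan_total / 2
print("~%.1f cm lost on each side" % cropped_per_side_cm)   # ~2.2 cm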
 
I think he's saying that his 1080p TV is telling his computer that it's a 720p set.

On my original DLP set, a screwed-up EDID was a showstopper. However, I remember now that that was because the EDID only reported an analog connection, which effectively disabled the DVI input.

On a Sharp, the EDID reported 1280x720 as its resolution. However, I was still able to feed it its true native resolution of 1366x768 over DVI in dot-by-dot mode, and it got a decent, if not perfect, lock. That was with XP and probably PowerStrip, about three years ago; video card control panels have custom resolution support now as well, of course...

(At least in XP...)
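
If anyone wants to go the custom resolution route, the arithmetic those tools do boils down to picking blanking intervals and a pixel clock. A rough sketch, with assumed reduced-blanking-style figures rather than anything read from this particular TV:

# Rough pixel-clock arithmetic for a hypothetical custom 1920x1080 @ 60 Hz mode.
# The blanking figures are assumed, illustrative values, not this TV's timings.

h_active, v_active = 1920, 1080
h_blank, v_blank = 160, 31      # assumed horizontal/vertical blanking
refresh = 60                    # target refresh rate in Hz

h_total = h_active + h_blank    # 2080 pixels per scanline, including blanking
v_total = v_active + v_blank    # 1111 lines per frame, including blanking
pixel_clock = h_total * v_total * refresh

print("%d x %d @ %d Hz -> %.2f MHz pixel clock"
      % (h_total, v_total, refresh, pixel_clock / 1e6))
# ~138.65 MHz; PowerStrip or the driver's custom resolution dialog
# fills in this same kind of numbers for you.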
 
are you using nvidia's control panel for the settings?
Yep.

It sounds like overscan to me. Check your HDTV's settings, look for a zoom option, and play with that. You should be on "Fill" or the equivalent right now; you're looking for a "Standard" setting that makes normal TV content show black bars on each side. If that's not an option, I think the nvidia control panel may have options for overscan (I'm on ATI at the moment so I'm not sure, though CCC has overscan options). Only use that if your TV doesn't have any options of its own.
Yes, it also seems like overscan to me, but I can't figure out why a digital TV would apply overscan in the first place.
Yesterday I went back and tried nVidia's overscan compensation utility, but nothing seemed to change.
Maybe it's a driver issue.
I guess I'll have to try installing XP just to try that out.
About the 'Fill' option, IIRC (I'm at work ATM) the TV only has 'Fill', '4:3' and some 'Zoom 1' and 'Zoom 2' settings, and yes, 'Fill' is the only one that produces anything close to the expected output.
Anyway, I've played with them enough to be sure they're not the culprit.

Is the screen itself set to display in pixel-to-pixel or dot-by-dot mode?
Yes, the TV is in 'Fill' mode.

I think he's saying that his 1080p TV is telling his computer that it's a 720p set.

On my original DLP set, a screwed-up EDID was a showstopper. However, I remember now that that was because the EDID only reported an analog connection, which effectively disabled the DVI input.

On a Sharp, the EDID reported 1280x720 as its resolution. However, I was still able to feed it its true native resolution of 1366x768 over DVI in dot-by-dot mode, and it got a decent, if not perfect, lock. That was with XP and probably PowerStrip, about three years ago; video card control panels have custom resolution support now as well, of course...

(At least in XP...)
I'm currently on Windows 7, and I have configured the nVidia control panel to treat the TV as a 1080p HDTV.
Also, as I said, I tried changing the TV's driver to that of a 1920x1080 DFP.

Thanks to all of you that replied.
Any further suggestions will be greatly appreciated.
 
If it's related, my Dell XPS M1330 with an 8400M GS had problems with my Sony Bravia 40" LCD as well. The picture was crap and I couldn't center it properly, and I tried fill/wide/1:1/etc. My second M1330, with an X3100, worked properly over HDMI, and the picture was slightly better, but it still wasn't very sharp and clear (fonts looked weird). VGA on both looked perfect. I had a 3470 in a Sony laptop, but it doesn't have HDMI or DVI, so I couldn't test it...

I too am seeing this as a driver problem. I haven't found a solution besides using VGA, and since I no longer have any HDMI laptops, I can't really play around anymore. I never really liked Nvidia... it caused so many problems with my laptop. Mobile ATI worked a lot better for me. Maybe I should try it out with my desktop, which has a 7600GT... I need to find a DVI-to-HDMI adapter. I'll report back my findings.

Hope you find a solution!
 