I just got an LCD TV and am trying to use it with my computer. I don't have cable, so I would like to use a PC for all my TV viewing. The TV is a Sharp Aquos LC-40LE700UN. It's 1080p 120Hz, so it's a decent one and should be able to handle any reasonable input. My video card is a Radeon 9600. Yes, it's old, but it supports 1920x1080. I am using a Monoprice DVI-to-HDMI cable to connect them.
Anyway, I tried a bunch of different resolutions, and most seem to work fine. For some reason 1777x1000 does not work at all, but that doesn't concern me. When I try 1920x1080 at 60Hz (which is what I really want to use), I get a ton of horizontal static. I can make out what is on the screen, but it's definitely far from watchable. The TV classifies the signal as 1080p, so something is right. I also tried 1920x1080 at 30Hz (interlaced). That works, and the TV appropriately calls it 1080i. So why is 1920x1080 at 60Hz not working?
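One theory I had is bandwidth: my rough math on the pixel clock, assuming the standard CEA-861 timing for 1080p60 (2200x1125 total including blanking), says 60Hz progressive sits close to the single-link DVI limit of 165 MHz, while interlaced needs only half as much. Maybe the old card's TMDS transmitter just can't hold a clean signal near the top of its range?

```python
# Rough pixel-clock math, assuming standard CEA-861 timing for 1080p:
# 2200 x 1125 total pixels per frame (1920x1080 active plus blanking).
total_h, total_v, refresh = 2200, 1125, 60

progressive_clock = total_h * total_v * refresh        # pixels per second
interlaced_clock = progressive_clock // 2              # 1080i sends half the lines per field

print(f"1080p60: {progressive_clock / 1e6} MHz")       # 148.5 MHz -- under 165 MHz, but close
print(f"1080i:   {interlaced_clock / 1e6} MHz")        # 74.25 MHz -- lots of headroom
```

That would be consistent with 1080i working perfectly while 1080p60 comes through full of static, but I'm not sure how to confirm it.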
I would ideally try other TVs and video cards, but I do not have access to any at the moment. I thought maybe there was an obvious issue you guys could help me pinpoint.
Thanks in advance.