Laptop not detecting external display

I've gone through every forum and done a lot of troubleshooting. Here's where I am so far:

Display: 37'' Sharp Aquos "monitor" - DVI input only, with no input modes to adjust. I use a DVI-to-HDMI cable to connect it to the laptop. The cable works with this monitor on my desktop, cable boxes, etc. Just not with my laptop.

Laptop: Lenovo Y570 - Intel HD Graphics 3000 & NVIDIA GeForce GT 555M.

I've updated drivers, uninstalled drivers, and disabled the Intel GPU, then the NVIDIA one. I've tried unplugging everything, plugging in the monitor, then booting up. Nothing.

The computer doesn't see the monitor in Device Manager, the BIOS, Control Panel, or the Intel and NVIDIA display managers. I've tried Fn+F4 - nothing. I've tried Windows Key + P (this brings up the list of options [clone, extend, projector] but does not affect the monitor).

Laptop works with all other monitors, TVs, etc. Monitor works with other laptops, desktops, cable boxes, etc.

I've taken every action people came up with on other posts, but nothing has worked so far.

Any input would be great! Please help, thanks.
 
100% works. I use this same laptop with other monitors and TVs. There's only one HDMI port, and it works with no problems. Other monitors pop right up and start cloning the display.
 
Are you sure the cable works then?

Edit: What I mean is for the TV. Maybe there's a pin missing on the DVI-HDMI cable that the TV (in particular) needs to do the "handshake."
 
Yes. I use the same cable with my desktop, cable boxes, other laptops and they all hook up fine. It's some sort of communication error between my laptop and my external display.
 
I do also have an adapter. I've tried both and neither is helping my laptop to detect my monitor. Both the cable and the adapter work with other devices.
 
You said the computer doesn't read the monitor. What shows up on the display then?
 
This is a laptop I'm connecting an external display to. So the internal display is fine, it's the external display that isn't being recognized.

Hence the name of this thread: Laptop not detecting external display - lol
 
This is a laptop I'm connecting an external display to. So the internal display is fine, it's the external display that isn't being recognized.
I know. What does the TV show while it's connected? Are you turning it on, or expecting it to wake from sleep mode?
 
I know. What does the TV show while it's connected? Are you turning it on, or expecting it to wake from sleep mode?
The TV shows nothing, when it's plugged into this laptop. Black screen.

When plugged into other devices, the image pops right up, duplicating.
 
I think it's obvious the cable isn't working. Try another one or another adapter. This isn't a configuration issue if everything you're saying is accurate.
 
The cable and the adapter work with other monitors, and with other desktops.
Dunno. There's really nothing else you can do, because all three of them (cable, laptop, TV) together simply don't work. I'm assuming you're not going to replace either TV or laptop.
 
I just took the DVI-HDMI cable out of the back of the monitor, plugged it into another monitor, and the laptop read it immediately. This has to be some sort of software issue with the laptop not recognizing this particular monitor.
 
I just took the DVI-HDMI cable out of the back of the monitor, plugged it into another monitor, and the laptop read it immediately. This has to be some sort of software issue with the laptop not recognizing this particular monitor.

Maybe, but I doubt it if the laptop works with everything else. There's also the possibility that there's a mechanical problem here, so make sure it plugs in properly.

Edit: Are you sure you don't need an AVC system or whatever that is to use the TV? That might be a thing, too. Good luck.
 
I appreciate your suggestions dude. I just left the monitor plugged in, unplugged the HDMI from the laptop, and plugged it into the HDMI port on my desktop, and the image popped up right away on the monitor.

So we know, it's not the cable (because it works with separate displays and separate machines), it's not the laptop (because other displays work fine) and it's not the monitor (because it works with other machines).

You can imagine my frustration.
 
Whoa, you're good. Let me do some reading here. Thank you. That's exactly what the rear of my "monitor" looks like. I'll update you.
 
Try setting the laptop's resolution to something below 720p, shut it down, plug the external monitor back into the HDMI port, then turn it back on. See if it works then. If it reads it, you should be able to change the resolution, and hopefully it remembers it.
 
Eh, no luck so far man. Because the computer doesn't read the monitor, it's only adjusting my laptop's resolution.
Make sure you know which GPU is controlling the HDMI out, and set it to clone display if you can (connect it to something else first if you have to), then set it to 1280x720. Also try an actual DVI cable with an adapter instead of the DVI-HDMI cable. If what that post says is right, all you have to do is make sure there's a 720p signal going in when the TV is turned on. Good luck, you can figure it out.
 
Sorry if I missed it, but did you try hot-plugging? (connecting while everything's on)

Intel or NVIDIA GPUs sometimes don't detect displays that aren't already active (even trying to force detection can fail in these situations).
 
Sorry if I missed it, but did you try hot-plugging? (connecting while everything's on)

Intel or NVIDIA GPUs sometimes don't detect displays that aren't already active (even trying to force detection can fail in these situations).

Could you rephrase your second comment? What exactly are you suggesting I do? I've plugged in the monitor while it's on, while it's off, with the laptop on, off, unplugged, plugged in. I think just about everything.

I've tried all of the above suggestions and none of them allowed my computer to detect my monitor. I even tried the same setup on a 4th monitor, and that one works as well. Just not this one with this laptop.
 
Is there a way to disable HDCP etc even being attempted? Perhaps it's something fucky with that...
 
It could be an EDID issue with the Sharp Aquos (the monitor fails to report its make and supported resolutions back to the OS). I've had this issue with some older plasma monitors.

There's no real fix except to force the device as a generic VGA display. You'd be limited to 1024x768, and it'll stretch to fill a 16:9 screen, distorting everything.

I would also check your DVI cable and make sure it's DVI-D (digital-only) rather than DVI-I (which also carries analog pins). Single-link vs. dual-link can sometimes be a factor too.
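For anyone curious what "fails to ID its make" means mechanically: EDID is a 128-byte block the display sends over the cable, starting with a fixed 8-byte header, a packed 3-letter manufacturer ID, and ending with a checksum byte that makes the whole block sum to zero mod 256. If the TV sends a corrupt block (bad header or bad checksum), the GPU driver may just ignore it. A minimal sanity-check sketch in Python (the sample bytes in the test are made up, not a real Sharp dump):

```python
# Minimal EDID sanity check: verify the fixed 8-byte header and the
# block checksum, and decode the 3-letter manufacturer PNP ID that is
# packed into bytes 8-9 (three 5-bit letters, 1 = 'A').

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def decode_manufacturer(edid: bytes) -> str:
    # Bytes 8-9 form a big-endian 16-bit word; the top bit is reserved,
    # then three 5-bit fields hold the letters.
    word = (edid[8] << 8) | edid[9]
    return "".join(
        chr(((word >> shift) & 0x1F) + ord("A") - 1)
        for shift in (10, 5, 0)
    )

def edid_is_valid(edid: bytes) -> bool:
    # A valid base block is 128 bytes, starts with the fixed header,
    # and all 128 bytes sum to 0 modulo 256.
    return (
        len(edid) >= 128
        and edid[:8] == EDID_HEADER
        and sum(edid[:128]) % 256 == 0
    )
```

On Windows the raw EDID blocks the OS last saw are cached in the registry under each monitor's entry, so a check like this can tell you whether the TV ever handed over a readable block at all.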
 
Is there a way to disable HDCP etc even being attempted? Perhaps it's something fucky with that...

There's no way to disable HDCP through the monitor, as it has no settings menu. I've had the HDCP issue with this monitor in the past, on an Xbox, but turning it off and back on would work around that issue. It's just simply not being identified by the computer.
 
It could be an EDID issue with the Sharp Aquos (the monitor fails to report its make and supported resolutions back to the OS). I've had this issue with some older plasma monitors.

There's no real fix except to force the device as a generic VGA display. You'd be limited to 1024x768, and it'll stretch to fill a 16:9 screen, distorting everything.

I would also check your DVI cable and make sure it's DVI-D (digital-only) rather than DVI-I (which also carries analog pins). Single-link vs. dual-link can sometimes be a factor too.

I think you may be right about that. I even purchased a USB-to-HDMI adapter just to see if it would work around the issue. I can get the computer to read it, kind of, but only for a moment. The screen on the monitor goes from "black - loss of sync" to displaying somewhat of a picture: it's yellow and green, with "wavelengths" running through it. This lasts for about 5-10 seconds before it goes back to "black - loss of sync". I say loss of sync because that's what the monitor always reads once you turn it on. It's been that way since day one.
 
So let me ask this... why is it that I can connect this monitor to my desktop without an issue, but two laptops won't read it? The only difference I can think of is that my desktop has an AMD CPU and both laptops are Intel.

With that known, what could the workaround be with the Intel chipset? Both the laptop and the desktop are running GeForce graphics cards.
 
So I got two adapters to try to figure this thing out:
1. VGA male to DVI female: using a DVI cable, no detection of the monitor by the laptop.
2. DVI male to VGA female: using a VGA cable, no detection of the monitor.

It seems as though my only options are to buy another 37'' monitor that my Intel laptop will be happy with, or another laptop that the Sharp Aquos will be happy with. Neither "solution" seems like much of a fix.
 
So here is my fix... I just built another desktop. I did an ultimate budget build, strictly for YouTube and whatever else, in the garage. I had a spare mobo and graphics card, and got a cheap tower and hard drive from Micro Center. The monitor seems to work fine plugged into any desktop (via the graphics card).

I'm sure there is a solution out there somewhere, but I was unable to identify what that is.
 