When I turn my computer on, my HP w2207 flat-screen monitor gives the message "no DVI input" and then goes to sleep. Then I have to shut down and restart; it takes about 3-4 tries before the monitor connects correctly. What do I do?
I had trouble when I first got this monitor. The analog D-sub input always worked, and after a lot of time and effort the DVI input worked too. It has been a couple of years with no problems, but now it suddenly stopped working. I updated the driver probably just a few weeks before it stopped. When I select the DVI input, the monitor goes to sleep mode, then the screen goes black.
I had a similar problem where my monitor went to sleep mode after a few minutes. I'm running an HP LP2065 in dual mode with another HP flat screen. However, I did resolve the issue by resetting the output settings.
How to do that:
Right-click on the desktop.
Select "Graphic Options".
Adjusting the settings there may help you too.
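Since the original poster mentioned updating the driver shortly before the DVI input stopped working, it can also help to confirm exactly which display driver and version are installed. A minimal sketch for Windows, assuming the built-in `wmic` tool is available (run it from a Command Prompt; the command queries the standard `Win32_VideoController` WMI class):

```shell
wmic path Win32_VideoController get Name,DriverVersion,DriverDate
```

If the driver date lines up with when the problem started, rolling the driver back from Device Manager is worth trying before anything else.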
There is no signal coming from your PC. Excuse some of these solutions for sounding dumb, but we have to explore all possibilities.
Check the power cable going into the computer.
Check the power disconnect switch on the back of the power supply and make sure it is on.
If the PC is plugged into a power bar, make sure it is turned on and the breaker isn't tripped.
Press power on the PC and make sure the lights come on.
Now for the solutions that may actually help.
Disconnect the power from the PC.
If the video is on a card, remove and re-seat the video card.
Hook the power back up, connect the monitor, and try again.
The CMOS on the computer may need to be reset. I hate totally clearing the CMOS, so what I usually do is remove one stick of RAM and power up. If the system is alive again, power down, put the RAM back in, and power back up. If that doesn't work, turn the system off, move the CMOS jumper to the CLEAR position for 15 seconds, put it back to the Normal position, and then power back up.
Basically, if the monitor works on your friend's system but not on yours, then the problem is your system, not the monitor (assuming it isn't working fine until it gets into Windows; if that is the case, you have a driver or configuration problem).
The "Select" button on the front is supposed to perform this function; usually a monitor defaults to whatever mode it was in when it shut down, and alternate presses of the button switch between the two inputs. I have seen monitors fail to recognize seemingly valid inputs for various reasons: sometimes the computer settings are incorrect, and sometimes the monitor's video input has failed. I have also seen buttons fail or get stuck on; they occasionally get contaminated with screen-cleaning fluid. If you need more help, reply with more specific information about the problem.
I'm guessing you are using the VGA input of this monitor (the smaller "D"-shaped connector with 15 pins on the back). If not, and you are using the DVI input (the larger white connector), then check your PC's video output. I've seen this occur with HP monitors when they accidentally get set to "digital" input. It's easy to set that way (even by mistake), but not as easy to reset to analog (VGA). To do that, you'll need to find someone with a DVI output on their video card. Once you connect to a DVI source, the monitor will come on and stay on. You then need to go into the monitor's menus, find the setting, and change it back to "analog". After you do, of course, you won't get an image on your monitor until you reconnect it to an analog (VGA) source.