DVI and HDMI carry the same video signal; the difference is that DVI doesn't include provision for audio. Essentially, the problem you're having is that the monitor is set to auto-detect its input rather than to a fixed HDMI or DVI input. Setting the monitor to a specific input, instead of letting it scan every input you plug in, should eliminate the hunting delay.
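If you are on Linux and the monitor supports DDC/CI, you can often check or force the input source from software instead of digging through the on-screen menu. Below is a minimal sketch using the ddcutil command and the MCCS Input Source feature (hex code 60); it assumes ddcutil is installed and that your monitor exposes this feature, and the numeric value that means "DVI" or "HDMI" varies by monitor, so check `ddcutil capabilities` first.

```python
# Sketch: read and set a monitor's input source over DDC/CI using ddcutil.
# Assumes Linux with ddcutil installed and a DDC/CI-capable monitor.
# MCCS feature x60 is "Input Source"; the value meanings (which number is
# DVI, HDMI, VGA...) are monitor-specific -- check `ddcutil capabilities`.
import subprocess

def get_input_source() -> str:
    """Return ddcutil's report of the currently selected input."""
    out = subprocess.run(
        ["ddcutil", "getvcp", "60"],  # 60 = hex feature code x60, Input Source
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

def set_input_source(value: str) -> None:
    """Force the monitor to a specific input, e.g. "3" (often DVI-1 on many monitors)."""
    subprocess.run(["ddcutil", "setvcp", "60", value], check=True)

if __name__ == "__main__":
    print(get_input_source())
    # Uncomment to pin the monitor to one input instead of auto-scanning:
    # set_input_source("3")
```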
Any contemporary monitor can switch between analogue and digital mode automatically if the connection from the signal source to the monitor is set up properly.
Connect the digital output of your graphics card to the DVI input of the monitor. Disconnect any other signal cables from the monitor and switch on both devices (PC and monitor). Wait a moment. The monitor checks its inputs and activates the digital DVI input, because it is the only active input.
If no picture appears other than the monitor's own error message, go through these steps: 1) Check whether your monitor is set to automatic input selection. If the monitor buttons do not respond, try the analogue input to wake the screen (any picture, desktop background...) and then open the monitor menu by pressing the menu button.
2) Find out which connector type your PC's graphics card has. The possibilities are DVI, HDMI, DisplayPort and their variants. See Wikipedia for pictures of the connectors if you are not sure (on Linux you can also list the card's outputs in software; see the sketch below). Remember that the blue D-SUB (VGA) connector is analogue, and you cannot activate digital mode through an analogue VGA cable.
3) Use the appropriate cable to connect the monitor to the PC. Example: your PC has an HDMI output, so use an HDMI-to-DVI cable; the monitor's input is DVI, and a common DVI-to-DVI cable will not fit a PC that only has HDMI. Note that DVI-to-DVI, HDMI-to-DVI, DisplayPort-to-DVI, etc. cables are all available; you must buy the right one.
Be aware that the "Auto" (auto-adjust) button is inactive when the monitor is in digital mode.
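For step 2, if the PC runs Linux with X11, a quick way to see which video outputs the graphics card actually exposes, and which one has a monitor detected on it, is to parse `xrandr`. A minimal sketch; the connector names (e.g. "HDMI-1", "DVI-D-0", "VGA-1") depend on the graphics driver.

```python
# Sketch: list the graphics card's video outputs (VGA, DVI, HDMI, DisplayPort)
# and whether a monitor is detected on each. Assumes Linux/X11 with xrandr installed.
import subprocess

def list_outputs() -> list[tuple[str, str]]:
    """Return (output_name, status) pairs, e.g. ("DVI-D-0", "connected")."""
    out = subprocess.run(["xrandr", "--query"], capture_output=True, text=True, check=True)
    results = []
    for line in out.stdout.splitlines():
        parts = line.split()
        # Output lines look like "HDMI-1 connected 1920x1080+0+0 ..." or "VGA-1 disconnected ..."
        if len(parts) >= 2 and parts[1] in ("connected", "disconnected"):
            results.append((parts[0], parts[1]))
    return results

if __name__ == "__main__":
    for name, status in list_outputs():
        print(f"{name}: {status}")
```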
I see that this monitor has a DVI input, in addition to the VGA input. I suggest getting a suitable cable to use the DVI input, to see if the fault replicates. If it doesn't, you have identified the problem - an issue with the analogue signal processing circuitry in either the monitor or the computer. The monitor's analogue auto-sleep circuitry, for example, may not be sensing a signal, and so goes to sleep. BTW, if you use a DVI-to-DVI cable, the video signal never passes through an analogue stage, and the image will also be cleaner.
DVI ==> Digital Visual Interface
If your monitor has both DVI and VGA inputs, and your PC's video card has both DVI and VGA outputs, buy a DVI-to-DVI cable and use it for a better picture.
If your monitor has both DVI and VGA inputs, but your PC's video card only has a VGA output, use a VGA-to-VGA cable. Or, replace the video card with one that has a DVI output, and then use a DVI-to-DVI cable.
On both the graphics card and the monitor, the blue connector is a 15-pin D-Sub (VGA) video connector and the white connector is a DVI video connector.
The monitor should default to the 15-pin D-Sub video input.
If you want to use the DVI video input you need to configure the monitor to use it. DVI-to-DVI gives the best picture quality.
If you want to go from 15-pin D-Sub to DVI you need a converter cable.
Try configuring the monitor for the DVI video input.
You will not benefit from the DVI input unless you have DVI output on your computer. The great advantage of DVI is that the picture is transferred in digital form all the way from your computer to the individual pixels on the monitor. If you only have VGA output on your computer then you will never get a better picture than by using a good quality VGA cable to link that to the VGA input on your monitor.
The DVI connector is designed to carry both digital and analogue (VGA) signals on different pins, so adaptors exist that let a VGA monitor attach to a DVI output on a computer. A monitor may likewise accept an analogue signal through its DVI socket, but the picture quality will never be better than with a good VGA cable, because it is still not using the digital capability of the DVI interface. What's more, because you may need extra adaptors and thinner conductors in the cable, you will probably end up with a marginally poorer picture than with a straight VGA-to-VGA connection.
When you next buy a PC or a new video card, get one with DVI to take advantage of your new monitor. In the meantime, your best bet is a quality VGA to VGA link.
Sorry that's not what you were wanting to hear, but I hope it will help you avoid further frustration.
For HDMI to work, the DVI input must be HDCP compatible. HDCP is a DRM encryption protocol; condolences, and blame Hollywood for it. This monitor model is too old to have HDCP support built into its DVI input.