Monitor will no longer switch to the Analog (VGA) source.
This suddenly happened to me when I tried to play my Xbox 360, which uses VGA. Whenever I press the "Source" button, the monitor switches back to Digital (DVI). The only way I can get into VGA mode is if there is a signal on the DVI input, which means I have to turn on my computer every time I want to play my 360.
Why is this suddenly happening? I tried the "Factory Reset" feature in the menu and even unplugged the power cable for at least 6 hours, to no avail.
Re: Monitor will no longer switch to the Analog (VGA) source.
I just bought the 22" Gateway monitor and am having similar problems.
Here's what I found out: check the recommended cable inputs/outputs for your card and monitor. I know this 22" model calls for a 24-pin DVI-D Dual Link cable, but it works with the others.
I'm currently using an 18-pin DVI-D Single Link cable and think that is the problem. My card supports DVI-I Single and Dual Link, so for my setup I would go with DVI-I Dual Link to make sure it gets the maximum power and signal.
Check which cable you are in fact using, and see this page for background on DVI cables: http://www.datapro.net/techinfo/dvi_info.html
That is where I found out that I didn't have the cable the monitor requires for maximum input, which might be the solution.
I will post back, but this should get you started.
Any contemporary monitor can switch between analog and digital modes automatically if the connection from the signal source to the monitor is set up properly.
Connect the digital output of your graphics card to the DVI input of the monitor. Disconnect any other signal cables from the monitor and switch on both devices (PC and monitor). Wait a moment: the monitor checks its inputs and activates the digital DVI input, because it is the only active input.
If nothing but the monitor's own error message is displayed, go through these steps: 1) Check whether your monitor is set for automatic input selection. If the monitor's buttons do not respond, try using the analog input to get something on screen (any picture, the desktop background...) and then open the monitor menu by pressing the menu button.
2) Find out what connector type your graphics card has. The possibilities are DVI, HDMI, DisplayPort, and their variants; see Wikipedia for pictures of the connectors if you are not sure. Remember that the blue D-SUB (VGA) connector is analog, and you cannot activate digital mode through an analog VGA cable.
3) Use the appropriate cable to connect the monitor to the PC. Example: your PC has an HDMI output and your monitor's input is DVI, so you need an HDMI-to-DVI cable; a plain DVI-to-DVI cable is not suitable in that case. DVI-to-DVI, DisplayPort-to-DVI, and similar cables are all available, so you must buy the right one.
Be aware that the "Auto" button is inactive when the monitor is in digital mode.
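As a side note for step 2, on a Linux PC you can also check which outputs the graphics card exposes from software instead of crawling behind the case. This is only a sketch, assuming the common `xrandr` utility and a running X session; on other systems, inspect the card's ports visually.

```shell
# List the graphics card's outputs (e.g. VGA-1, DVI-D-1, HDMI-1)
# and whether a cable is currently attached to each one.
# Guarded so the snippet is harmless on machines without X or xrandr.
if command -v xrandr >/dev/null 2>&1 && [ -n "$DISPLAY" ]; then
    xrandr --query | grep -E "(dis)?connected"
else
    echo "xrandr or an X session is unavailable; inspect the ports by eye"
fi
```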
I am not an Xbox person, but as far as hooking it up goes, it should work as long as the other device is NOT powered up. That said, you will still have to get audio from the Xbox another way, as DVI only carries a video signal, unlike HDMI, which carries both audio and video.
I'm guessing you are using the VGA input of this monitor (the smaller "D"-shaped connector with 15 pins) on the back. If you are instead using the DVI input (the larger white connector), check your PC's video output. I've seen this happen with HP monitors when they accidentally get set to "digital" input. It's easy to set that way (even by mistake), but not as easy to reset to analog (VGA). To do that, you'll need to find someone with a DVI output on their video card. Once you connect to a DVI source, the monitor will come on and stay on. You then need to go into the monitor's menus, find the setting, and change it back to "analog". After you do, of course, you won't get an image on your monitor until you reconnect to an analog (VGA) source.
To clarify, you know that "analog" refers to the 15-pin VGA video input and "digital" refers to the DVI video input, right? You can't have digital video unless you use the DVI input, which requires a DVI output from your computer (and a DVI cord, of course).
If you already knew this and you mean that it won't switch over to DVI, then there could be several problems, most of them with your DVI signal source (the computer). Try booting the PC with ONLY the DVI cord attached between the PC and the monitor. If you have a VGA cord plugged into either, UNPLUG IT FROM BOTH ENDS.
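If you cannot borrow a DVI source to unstick the input setting, some monitors let you force the input over the DDC/CI channel from any connected PC. The sketch below is an assumption-laden alternative, not a guaranteed fix: it assumes a Linux machine with the `ddcutil` package installed and a monitor that implements MCCS VCP feature 0x60 (Input Select). The value code for the VGA input varies between monitors, so check the capabilities output before setting anything.

```shell
# Ask a DDC/CI-capable monitor to switch to its analog (VGA) input.
# Guarded so the snippet does nothing on machines without ddcutil.
if command -v ddcutil >/dev/null 2>&1; then
    ddcutil detect           # confirm the monitor answers DDC/CI at all
    ddcutil capabilities     # look up the value codes for feature 60 (Input Select)
    ddcutil setvcp 60 0x01   # 0x01 = "Analog video 1" on many (not all) monitors
else
    echo "ddcutil not installed; use the monitor's own menu instead"
fi
```

If the monitor does not support DDC/CI input switching, the only route remains the on-screen menu described above.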
Good troubleshooting. So it most likely is the monitor. Can you talk to your vendor about a replacement? There could be a voltage problem feeding the monitor: is it set for the correct voltage, and can you check the voltage feeding it? Other than that, there is a circuit problem inside the monitor.
I had a similar problem: the monitor was going blank once in a while (since I don't have a VGA input), so I switched the Auto Source feature to "Manual", and it has been working fine for the past 2 days. If I run into the problem again I'll update this message.
On the side panel of your monitor, press the top button to access the menu, scroll down to the Setup menu (by pressing the 3rd button down on the panel), press the 4th button down to enter the Setup menu, and scroll down to "Auto Source". Use the 4th button to select it; "Auto" and "Manual" are your choices. Press the 2nd button down to select "Manual", then the top button to exit the menu(s).
This monitor is set up to be able to do multi-display with the included software, utilizing both digital and analog signals. I'm not using this feature, so I'm not sure how switching Auto Source off will affect the performance of this