First, make sure the DVI output of your computer supports analog video.
Check online for a DVI pinout: the group of four or so pins to one side of the connector (around the flat ground blade) carries the analog VGA signal. If your video card (or cable) doesn't provide those pins, your monitor isn't going to receive the signal it needs.
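As a rough reference, here is a minimal Python sketch of which standard DVI variants carry those analog pins (the table reflects the common DVI-I/DVI-D/DVI-A definitions; treat it as a quick lookup, not an exhaustive spec):

# Which DVI connector variants carry the analog (VGA) pins.
# DVI-I = digital + analog, DVI-D = digital only, DVI-A = analog only.
DVI_VARIANTS = {
    "DVI-I": {"digital": True, "analog": True},
    "DVI-D": {"digital": True, "analog": False},
    "DVI-A": {"digital": False, "analog": True},
}

def supports_vga_adapter(connector):
    # A passive DVI-to-VGA adapter only works if the port has the analog pins.
    caps = DVI_VARIANTS.get(connector)
    return bool(caps and caps["analog"])

print(supports_vga_adapter("DVI-I"))  # True
print(supports_vga_adapter("DVI-D"))  # False - no analog pins, no VGA signal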
The problem could be in the adapter to the video card. There are different types of DVI connections, and if the adapter does not have the proper set of pins, the card will not recognize that a monitor is connected at that port. The monitor itself has no bearing on whether a single- or dual-monitor setup will work. Digital Visual Interface - follow this link; it may help you determine whether the adapter you are using is correct for the graphics card.
In this situation, there are two possibilities: 1. The device connected to your TV via the DVI cable is off. 2. The DVI cable connected to the HP L1950 is burned out.
The HP L1950 offers two ways to connect a device:
Input signal - two connectors: one 15-pin mini D-sub analog VGA and one DVI-D. Try the VGA connector as a test.
Any contemporary monitor can switch between analog and digital mode automatically if the connection from the signal source to the monitor is set up properly.
Connect the digital output of your graphics card to the DVI input of the monitor. Disconnect any other signal cables from the monitor and switch on both devices (PC and monitor). Wait a moment; the monitor checks its inputs and activates the digital DVI input, because it is the only active input.
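If you are on Linux and want to confirm which outputs the card actually reports, the standard xrandr tool can list them; here is a minimal Python wrapper, assuming an X session (output names such as DVI-D-0 or VGA-0 vary by driver):

import subprocess

# Ask the X server which video outputs exist and whether a monitor is attached.
result = subprocess.run(["xrandr", "--query"], capture_output=True, text=True)
for line in result.stdout.splitlines():
    if " connected" in line or " disconnected" in line:
        name, state = line.split()[0], line.split()[1]
        print(name, "->", state)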
If nothing but the monitor's own error message is displayed, go through these steps: 1) Check whether your monitor is set for automatic input selection. If the monitor buttons do not respond, try using the analog input to get a picture on screen (any picture, desktop background...) and then open the monitor menu by pressing the menu button.
2) Find out what connector type your graphics card has. The possibilities are DVI, HDMI, DisplayPort, and variants of them. See Wikipedia for what each connector looks like if you are not sure. Remember that blue D-sub (VGA) connectors are analog, and you cannot activate digital mode through an analog VGA cable.
3) Use the appropriate cable to connect the monitor to the PC, as the sketch below illustrates. Example: your PC has an HDMI output and your monitor input is DVI, so a common DVI-to-DVI cable is not suitable; use an HDMI-to-DVI cable. Note that DVI-to-DVI, DisplayPort-to-DVI, and similar cables are all available; you must buy the right one.
Be aware that the monitor's "Auto" button is inactive in digital mode.
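To make the cable-matching rule in step 3 concrete, here is a small illustrative Python sketch; the table is an assumption for the example, not an exhaustive list:

# Every entry below is a purely digital link; an analog source (VGA)
# cannot reach a digital DVI input with a passive cable.
CABLES = {
    ("DVI", "DVI"): "DVI-to-DVI cable",
    ("HDMI", "DVI"): "HDMI-to-DVI cable",
    ("DisplayPort", "DVI"): "DisplayPort-to-DVI cable",
}

def pick_cable(pc_output, monitor_input):
    return CABLES.get((pc_output, monitor_input),
                      "no simple cable - check for an active converter")

print(pick_cable("HDMI", "DVI"))  # HDMI-to-DVI cable
print(pick_cable("VGA", "DVI"))   # no simple cable - check for an active converter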
You need an HDMI-to-DVI cable, not an HDMI-to-D-sub cable. The sellers of that D-sub cable do not tell you it only works with special video cards that output analog over the HDMI connector. HDMI is digital and so is DVI, so they adapt to each other directly. Keep in mind that this carries video only; sound will not work on the monitor, because the DVI port does not support sound. You will need to add a set of speakers to the Xbox.
Connect the VGA cable to the converter and the component video cable between the converter and the TV, with the computer and TV off. Turn on the TV first and set it to the component video input, then turn on the computer. Then (assuming the VGA port is enabled; adjust it in the BIOS if needed) set the computer's video output to the VGA port. On my Windows 7 laptop, this can be done by right-clicking on the desktop: choose Graphics Properties, set the display to the external monitor or both, click Apply, and then confirm the change (on the external monitor, if using a single monitor). VGA-to-composite-video converters also exist if you do not have an available component video input on your TV.
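If you would rather script the output switch than click through Graphics Properties, Windows 7 and later include the DisplaySwitch.exe utility; a minimal sketch, assuming it is on the PATH (it normally lives in System32):

import subprocess

# DisplaySwitch.exe accepts one of: /internal, /clone, /extend, /external
subprocess.run(["DisplaySwitch.exe", "/clone"])     # mirror onto the external output
# subprocess.run(["DisplaySwitch.exe", "/external"])  # external output only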
If your computer is fairly new, you may have an HDMI port. Your TV may be able to read that signal through an HDMI-to-DVI cable. (It depends on the TV; not all of them will read a computer's signal on the DVI port. Check the documentation for your model. TVs without a VGA port are less likely to support this feature.)
This monitor has two types of video input: a 15-pin VGA D-sub connector and a DVI connector. If the video card has a 15-pin D-sub output, connect the video cable to the 15-pin D-sub connector on the monitor. If the video connectors are DVI, connect the video card and monitor with a DVI video cable. You will also need to configure the monitor for either the D-sub or the DVI video input.
If it's got a DVI input then it can handle a DVI signal.
There is no reason to have two cables plugged in unless you have two screens. If you've got two leads plugged in, the second one is probably turned off by default. When I ran a dual-monitor setup, I had to turn on the second output to get both screens working.
You haven't said whether you removed the D-sub cable while the PC was off, but I get the impression it was on at the time. Try unplugging the D-sub while the PC is off, check that the DVI cable is seated correctly at both ends, and then turn your PC on. It should now be working.
If that doesn't work, turn your PC off again, plug the D-sub back in, and start it up. Go to your display settings and turn on the DVI port, set to display a copy of the first screen; if you extend your desktop instead, you'll only be able to see half of it at a time. This isn't an ideal solution, and your graphics card will be doing double the work, so try the proper way first.
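For what it's worth, on Linux the same mirror-versus-extend choice can be scripted with the standard xrandr tool; a minimal sketch, assuming output names VGA-0 and DVI-0 (yours may differ; run xrandr with no arguments to list them):

import subprocess

# Mirror the DVI output onto the VGA one (a copy of the first screen):
subprocess.run(["xrandr", "--output", "DVI-0", "--auto", "--same-as", "VGA-0"])

# Or extend the desktop instead - each screen then shows part of it:
# subprocess.run(["xrandr", "--output", "DVI-0", "--auto", "--right-of", "VGA-0"])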
Unfortunately, you are trying to mix two different video types. VGA is an analog video interface found on most computers (except for the latest ones with flat-panel monitors, which usually use DVI). Your TV's DVI interface is digital (there is a DVI-A, which is an analog DVI variant, but it is unlikely that your TV has it). So what you are trying to do probably won't work. I assume your computer does not have a DVI output (otherwise you wouldn't have bought the adapter), but some monitors have VGA inputs; does yours? What kind of DVI adapter did you get? Is it a DVI-A? Check out http://en.wikipedia.org/wiki/Dvi for info on DVI interfaces.