I have a GeForce FX AGP 8x card. I got a new 22" wide LCD monitor which has a DVI-D input on it. The cable has 2 sets of 9 pins plus a flat pin. What card do I need to use the DVI connection, and will there be much difference vs. the D-sub signal cable that I am using now?
The problem could be in the adapter to the video card. There are different types of DVI connections, and if the adapter does not have the proper set of pins, the card will not recognize that a monitor is connected at that port. Monitors themselves have no bearing on whether a single- or dual-monitor setup will work. Digital Visual Interface: follow this link, it may help you determine whether the adapter you are using is correct for the graphics card.
DVI is a newer video interface technology designed to maximize the quality of flat-panel LCD monitors and high-end video graphics cards. It is a replacement for the P&D (Plug & Display) standard.
You can connect a high-definition monitor, an LCD projector, or an LCD TV to a DVI port.
There are three kinds of DVI connector:
1. DVI-D (Digital)
2. DVI-A (Analog)
3. DVI-I (Integrated Digital/Analog)
The DVI-D format is used for direct digital connections between a video source
(namely, a video card) and a digital LCD (or, rarely, a digital CRT) monitor.
DVI-A - High-Res Analog
The DVI-A format is used to carry a DVI signal to an analog display, such as a CRT monitor or an HDTV.
DVI-I - The Best of Both Worlds
The DVI-I format is an integrated cable capable of transmitting
either a digital-to-digital signal or an analog-to-analog signal, but it
will not transmit digital-to-analog or analog-to-digital; no conversion takes place in the cable.
How to recognize these three types of DVI cable
There are two variables in every DVI connector cable, and each represents one characteristic.
The flat pin on one side denotes whether the cable is digital or analog:
A flat pin with four surrounding pins is either DVI-I or DVI-A
A flat pin alone denotes DVI-D
The pin sets vary depending on whether the cable is single- or dual-link:
A solid 24-pin set (three rows of eight) for a dual-link cable
Two separated 9-pin sets (each three rows of three) for a single-link cable
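The recognition rules above can be sketched as a small lookup. This is only an illustration of the rules, not part of any real tool; the function name and the layout labels are made up for the example:

```python
def classify_dvi(analog_contacts, digital_layout):
    """Classify a DVI connector from its visible pin layout.

    analog_contacts: True if four pins surround the flat blade
                     (the analog-capable contacts)
    digital_layout:  "solid-24" for a solid 24-pin block (dual link),
                     "two-9" for two separated 9-pin blocks (single link),
                     or None if there are no digital pins at all
    """
    if digital_layout is None:
        # Flat blade with analog contacts but no digital pins: DVI-A
        return "DVI-A" if analog_contacts else "unknown"
    link = "dual link" if digital_layout == "solid-24" else "single link"
    # Analog contacts alongside digital pins make it DVI-I; otherwise DVI-D
    kind = "DVI-I" if analog_contacts else "DVI-D"
    return f"{kind} ({link})"

# The asker's cable (two 9-pin sets, flat pin alone) comes out as:
print(classify_dvi(False, "two-9"))  # DVI-D (single link)
```

By these rules, the cable described in the question (two sets of 9 pins plus a lone flat pin) is a single-link DVI-D cable.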
Any monitor can be used as the second monitor for a computer. However, your computer's graphics card must have two outputs. Some simpler graphics cards, most on-board video chipsets, and many laptops *do not have* two outputs. Please check the back of your computer for two VGA outputs, or one VGA and one DVI output (you can use the DVI output by connecting a DVI cable to your monitor, or by plugging in a DVI-to-VGA adapter).
If your graphics card does not have two outputs, or if your computer uses an on-board video chipset, you can buy another graphics card to replace the one you're using.
DVI ==> Digital Visual Interface
If your monitor has both DVI and VGA inputs, and your PC's video card has both DVI and VGA outputs, then buy a DVI-to-DVI cable and use it for a better picture.
If your monitor has both DVI and VGA inputs, but your PC's video card only has a VGA output, then use a VGA-to-VGA cable. Or, purchase a video card with a DVI output to replace the VGA-only card, and then use a DVI-to-DVI cable.
Some cards can send two outputs through one connector, split by a cable that runs to two monitors. Generally speaking, though, most of these just split one image across the two screens.
If you're not sure whether your output supports multiple monitors from one connector, it almost certainly doesn't. There are also USB 2.0 to DVI/HDMI/VGA multi-display adapters, which support resolutions up to 1600 x 1200 and make it easy to add another monitor for multitasking.
If you are comfortable with opening the case, you can try this:
1. Unplug the power supply and make sure you are constantly grounded (touching metal).
2. Remove the add-in video card.
3. Locate the "Clear CMOS" jumper (usually 3 pins in a row, with two shorted via a plastic jumper).
4. Remove the jumper and short the other 2 pins for a few seconds (if it was originally on pins 1-2 with 3 open, short 2-3 with 1 open), then put it back on the original pins.
5. Plug the monitor into the motherboard video port and power on.
If it works, you can leave it as is, or try to re-install your video card, following the same precautions when opening the case and probably changing BIOS settings to use the add-on card.
Most widescreen monitors come with both DVI and VGA connectors. A VGA connector and a DVI connector differ in many respects: the DVI connector is usually white while the VGA connector is blue, and a VGA connector has 15 pins while a DVI connector has up to 24 pins plus a flat blade (the exact count depends on the DVI type).
Did you try your monitor with another output (VGA)? If not, try it, so you can check where the problem is (display card, monitor, or cable). Try with another cable or another monitor; that way you will know where the problem lies. If you don't have another monitor, take the display card out and try the monitor you have with the built-in VGA output on the motherboard. Thank you, Tan