It seems that the DVI port on the GeForce FX 5200 does not work properly. I had a problem with this card too: I tried to connect it to a Samsung 245B at a resolution of 1920x1200, and the card could not produce a correct picture; the image was stretched beyond the monitor's borders. The analog (VGA) output works fine, though, so perhaps new drivers will solve the problem in your case. I am going to replace this card with a newer one. I found on some forum that the FX5200 has a bug with its DVI output, and large, wide resolutions can be displayed incorrectly. That is not really a surprise, since when the card was made, such resolutions were only in the plans for mid-range computers.
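As a rough illustration (my own back-of-the-envelope numbers, not from the original post): single-link DVI is limited to a 165 MHz pixel clock, and 1920x1200 at 60 Hz sits right at that limit once blanking overhead is included, which is one reason early DVI transmitters struggled with it. The 20% blanking overhead below is an assumed typical value, not a measured one:

```python
# Rough pixel-clock estimate: visible pixels times refresh rate, plus a
# typical blanking overhead. The 1.20 overhead factor is an assumption.
def pixel_clock_mhz(width, height, refresh_hz=60, blanking_overhead=1.20):
    """Approximate pixel clock in MHz for a given display mode."""
    return width * height * refresh_hz * blanking_overhead / 1e6

SINGLE_LINK_DVI_LIMIT_MHZ = 165  # single-link limit per the DVI 1.0 spec

for mode in [(1280, 1024), (1680, 1050), (1920, 1200)]:
    clock = pixel_clock_mhz(*mode)
    verdict = "fits" if clock <= SINGLE_LINK_DVI_LIMIT_MHZ else "exceeds"
    print(f"{mode[0]}x{mode[1]}@60Hz ~ {clock:.0f} MHz -> {verdict} single-link DVI")
```

With these assumptions, 1280x1024 and 1680x1050 fit comfortably, while 1920x1200 lands just over 165 MHz, which matches the pattern people report: lower resolutions work over DVI on these cards, and only the big widescreen modes misbehave. (In practice, monitors use reduced-blanking timings to squeeze 1920x1200 under the limit, so a healthy card can still drive it.)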
All hope is not lost with just the VGA out on your card!
First, every flat panel monitor I have ever seen has a VGA input, so your GeForce2 card will still work!
VGA is still the top analog signal standard for computers. I am writing this on my onboard NVIDIA GeForce, connected to a ViewSonic VA2226w flat panel monitor over VGA at a resolution of 1680x1050!
Almost forgot... just ignore the "TV Out" on your card. It is old school and will give you nowhere near the quality you will get from VGA; it was meant as a "crowd pleaser" to let people connect to their old-school big-screen TV sets.
Get the flat panel display... you will find out that it will work just fine with your VGA connection.
Hello ghinea18, sorry, you can't do that, because it will cause a hardware conflict; that's why the motherboard BIOS automatically disables the integrated graphics when you install a video card in the PCI Express x16 slot (Nvidia GeForce 7300SE). Do you know whether your video card has dual VGA or dual DVI connections?
You can try a dual display, but I'm not sure whether this card supports dual monitors, because Nvidia's online specs do not show full specs for the 7300SE series.
Make sure your monitor is set up to receive whatever signal you want via the monitor's menu. There should be an input button on the monitor to select the signal type. The connection really doesn't matter as long as you are able to use the monitor. There could be something wrong with the connector on the video card. Also make sure you are using the card and not the onboard graphics connection by mistake. If the problem persists, it is more than likely the card.
Use the DVI port, and the VGA port on the ATI Radeon X300SE graphics card.
If you have two monitors that are LCD, and they both have a DVI connector, use a DVI to VGA adapter for the VGA port on the ATI Radeon X300SE.
You need a female DVI to male VGA adapter.
Turn both monitors on. Right-click on an empty spot on the desktop screen. Left-click on Properties, then the Settings tab. Left-click on the monitor icon with the 2 in it. Go down to ->Extend my desktop to this monitor, and left-click in the box to the left of it.
Go down to the right, and left-click on Apply. Now go over to the left, and left-click on OK.
You should now have your desktop screen on both monitors. When you open a window on your Primary monitor, (Monitor icon 1), Left-click on the frame, hold the mouse button down, and drag this screen to the Secondary monitor.
If it doesn't work this way, drag to the other side.
A DVI-D signal is digital (whereas DVI-I is digital AND analogue) and can never be converted into a VGA signal without using expensive hardware boxes that convert the signal. These tend to lose quite a bit of signal quality, though. If I were you, I'd look for a monitor with a digital connection, because I'm afraid this isn't going to work.
This card does not allow you to multi-display with the VGA port and the S-video port. If you put a DVI to VGA adaptor (not supplied) on the DVI port, you can multi-display two VGA monitors, and (presumably) a VGA monitor (from the DVI port) plus an S-video display.
This is not explained anywhere, and the card does not come with a DVI to VGA adaptor, so I was only able to work this out after I bought one.
There are different reasons why this might happen. Let's try some.
1.) Does your board have an onboard video card? If it does, then XP might be using that as the primary monitor. Try plugging into the onboard video when you start up! If that works, then you need to disable the onboard video (try disabling it in the BIOS!).
If not, then:
2.) When booting, at the "select XP or Recovery Console" screen, press F8. You'll get the menu with Safe Mode and so on. Go down to "Enable VGA Mode", then press Enter. If that works, go to Display Settings, lower your resolution, then restart. Usually when this happens your monitor says "Out of Range", but you can still try.
Well if that doesn't work, tell me more!
A. There are two types of cable: DVI-I and DVI-D. Make sure you are using the correct one.
B. Update the following drivers:
1. VGA card driver, version 27.00 or later
2. WDM driver (capture driver), version 1.11 or later
3. Expertool program (overclocking program), version 2.81 or later