Question about Gateway FPD2275W LCD Monitor

1 Answer

Can't use DVI with monitor

Is there a setting that needs to be changed in the hardware or software for the monitor to use DVI? The analog VGA works fine. The video card is an NVIDIA GeForce FX5500 going to a NEC MultiSync LCD 1935NXM. Thanks.



Re: Can't use DVI with monitor

You have 3 ports on your graphics card (VGA 15-pin D-Sub x 1, DVI x 1, and S-Video x 1). If you want to use DVI, just connect the card's DVI port to your monitor's DVI input with a DVI cable, that's all.

Posted on Oct 19, 2007


Related Questions:

1 Answer

What is vga/dvi cables

The difference between VGA and DVI is in how the video signals travel. VGA connectors and cables carry analog signals, while DVI can carry both analog and digital. DVI is newer and offers a better, sharper display compared to VGA. You can easily tell them apart because VGA connectors (and ports) are blue while DVI connectors are white. Click the link below to see an image of them.

May 28, 2014 | Dell E172FP 17" LCD Flat Panel Monitor

1 Answer

How to switch from analog to digital mode

Any contemporary monitor can switch between analog and digital mode automatically if the connection from the signal source to the monitor is set up properly.

Connect the digital output of your graphics card to the DVI input of the monitor. Disconnect any other signal cables from the monitor and switch on both devices (PC and monitor). Wait a moment; the monitor checks its inputs and activates digital DVI, because it is the only active input.

If nothing but the monitor's own error message is displayed, go through these steps:
1) Check whether your monitor is set to automatic input selection. If the monitor's buttons do not respond, use the analog input to activate the screen (any picture, such as the desktop background), then open the monitor's menu by pressing the menu button.

2) Find out what type of connector your graphics card has. The possibilities are DVI, HDMI, DisplayPort, and their variants; see Wikipedia for pictures of the connectors if you are not sure.
Remember that the blue D-SUB (VGA) connector is analog, and you cannot activate digital mode through an analog VGA cable.

3) Use the appropriate cable to connect the monitor to the PC.
Example: your PC has an HDMI output and your monitor's input is DVI, so use an HDMI-to-DVI cable; a common DVI-to-DVI cable is not suitable in that case.
DVI-to-DVI, DisplayPort-to-DVI, and similar cables are all available; you must buy the right one.

Be aware that the monitor's "Auto" button is inactive in digital mode.
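If the PC runs Linux, the "wait a moment" step above can also be checked from a terminal: the kernel's DRM subsystem reports each GPU connector's detected state under /sys/class/drm. A minimal sketch (the exact connector names, such as card0-DVI-D-1, vary by GPU and driver, and the path only exists on Linux systems with DRM graphics):

```shell
# List each GPU connector and whether the kernel detects a live link on it.
for conn in /sys/class/drm/card*-*; do
  [ -e "$conn/status" ] || continue          # skip if nothing matched
  printf '%s: %s\n' "$(basename "$conn")" "$(cat "$conn/status")"
done
# Typical lines look like "card0-DVI-D-1: connected" or
# "card0-VGA-1: disconnected".
```

If the DVI connector reports "disconnected" here, the problem is in the cable or the card's output rather than in the monitor's menu settings.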

Nov 18, 2013 | Belinea 1930 S1 Monitor

2 Answers

Samsung SyncMaster 2033SW monitor: until now I was using a VGA cable for the display, but recently I bought a DVI cable and connected it, and it's not detecting the DVI display. If I remove the VGA...

LCD monitors with multiple inputs have an option to select and switch between the analog (VGA) and digital (DVI) inputs. Go to the LCD's menu and select the input source (DVI).

Hope this helps!


Nov 15, 2010 | Samsung Syncmaster 2033SW Monitor

1 Answer

Monitor wont come out of sleep mode

I'm guessing you are using the VGA input of this monitor (the smaller "D"-shaped connector with 15 pins on the back). If instead you are using the DVI input (the larger white connector), then check your PC's video output. I've seen this happen with HP monitors when they accidentally get set to "digital" input. It's easy to set that way (even by mistake), but not as easy to reset to analog (VGA). To do that, you'll need to find someone with a DVI output on their video card. Once you connect to a DVI source, the monitor will come on and stay on. You then need to go into the monitor's menus, find the input setting, and change it back to "analog". After you do, of course, you won't get an image on your monitor until you reconnect to an analog (VGA) source.

Jun 02, 2010 | HP w1907 LCD Monitor

1 Answer

My monitor says it is displaying in vga. how do I change it to analog?

VGA IS analog. Only DVI-D and HDMI are digital. The only other analog video formats are composite and component.

Apr 12, 2010 | E-Machines eMachines Monitor E15T4

1 Answer

My ViewSonic monitor says analog not digital on

To clarify: analog refers to the 15-pin VGA video input, and digital refers to the DVI video input. You can't have digital video unless you use the DVI input, which requires a DVI output from your computer (and a DVI cord, of course).
If you already knew this and you mean that the monitor won't switch over to DVI, there could be several problems, most of them with your DVI signal source (the computer). Try booting the PC with ONLY the DVI cord attached between the PC and the monitor. If you have a VGA cord plugged into either, UNPLUG IT FROM BOTH ENDS.

Dec 05, 2009 | ViewSonic Computer Monitors

1 Answer

No signal

You don't specify what kind of monitor you're using: CRT, LCD, or even a TV attached to your system. It's likely the monitor is set to receive one kind of signal (analog or digital) but is receiving the other. Before proceeding to the next steps, make sure the connections are solid, the screws are set tightly, and there's no movement.
Get your user's manual and learn how to navigate the monitor's menu to reach the setting where you choose Analog (for a VGA cable connection) or, more commonly with LCD screens, Digital (DVI).
VGA and DVI connectors differ in shape and in the number and layout of their pins. If you're using a video card with a DVI connector (larger, more squarish "D" shape, with three separate sections of pins), your monitor should be set to Digital. Otherwise, your connector is VGA and your monitor should be set to Analog.
If both are in sync and you still get no signal, it's possible the video card has given up the ghost, because if you DO see the NO SIGNAL message displayed on your monitor, the monitor itself is OK.
Good luck!

Aug 17, 2009 | ViewSonic Computer Monitors

2 Answers

Monitor says No signal digital

Check whether the monitor and the video card each have both DVI and VGA ports. There may be a switch on the graphics card that changes it from DVI to VGA, often on the outside of the card.

Also check that the monitor is set to the right input if it can use both VGA and DVI: if using DVI (of any type) it needs to be set to digital, and if using VGA it needs to be set to analog. This may be a physical switch or a setting in the on-screen menu (see the manual for how to change it).

PS: The VGA port is a blue, D-shaped port with 15 pin holes; the DVI port is a white rectangular port with one flat blade slot.

Mar 17, 2009 | ViewSonic VG2030wm LCD Monitor

1 Answer

Cannot get correct resolution

1. It could be a limitation of your video card. Older cards may not support the high native resolution of LCD monitors.
2. Be sure to use the DVI (digital) output of your video card. You will not get as good a resolution with VGA (analog).
3. When installing a driver for a monitor, you are sometimes given a choice between the analog (VGA) and digital (DVI) driver. Choose the correct one.

Good luck!

Jun 04, 2008 | ViewSonic VX2235WM Monitor
