Question about ASUS MW221U LCD Monitor

No display: DVI cable from graphics card to monitor results in no display.

Posted by Anonymous on

1 Answer

  • Contributor
  • 24 Answers

Your DVI cable or graphics card has failed.

Posted on Dec 10, 2014

1 Suggested Answer


budmrtn
  • 11361 Answers

SOURCE: my Asus monitor won't display my desktop! :(

Since you have narrowed the problem down to the monitor, the next thing to do is to open it up. The first place to look is the capacitors in the power supply (a common problem with ALL monitors). Please look here to get some ideas of what to look for inside.
http://s807.photobucket.com/home/budm/allalbums
http://en.wikipedia.org/wiki/Capacitor_plague
Post back so it may help other people also.

Posted on Apr 09, 2010

Testimonial: "Thanks mate, found out there was nothing wrong with the monitor; it turned out my nice new graphics card was busted. Taking it back today, thanks!"

Related Questions:

1 Answer

Samsung Syncmaster 305t Not displaying full resolution with Dual Link DVI cable.


The monitor supports 2560 x 1600 at 60 Hz. If the vertical sync (refresh rate) is set to 75 Hz, you need to check and adjust it in the display settings. A rough sanity check of why dual-link DVI is needed at this resolution is shown below.
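Here is a small Python sketch of that check; the 5% blanking overhead is an assumption (exact timings depend on the video mode used), not something specified by the original answer:

# Rough estimate of the pixel clock needed for 2560 x 1600 at 60 Hz,
# compared with the ~165 MHz limit of a single DVI link.
# The 5% blanking overhead is an assumption (reduced-blanking timings).

SINGLE_LINK_DVI_MHZ = 165.0

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.05):
    return width * height * refresh_hz * blanking / 1e6

clock = pixel_clock_mhz(2560, 1600, 60)
print(f"Required pixel clock: {clock:.0f} MHz")                # about 258 MHz
print("Dual-link DVI required:", clock > SINGLE_LINK_DVI_MHZ)  # True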

Nov 05, 2015 | Samsung Computers & Internet

1 Answer

How to switch from analog to digital mode


Any contemporary monitor is able to switch between analog and digital mode automatically if the connection from the signal source to the monitor is set up properly.

Connect the digital output from your graphics card to the DVI input of the monitor. Disconnect any other signal cables from the monitor and switch on both devices (PC and monitor). Wait a while. The monitor checks its inputs and activates digital DVI, because it is the only active input.

If no picture other than the monitor's error message is displayed, go through these steps:
1) Check whether your monitor is set for automatic input selection. If the monitor buttons do not respond, try using the analog input to activate the screen (any picture, desktop background...) and then access the monitor menu by pressing the menu button.

2) Find out what connector type your graphics card has. The possibilities are DVI, HDMI, DisplayPort and their variants. See Wikipedia for pictures of the connectors if you are not sure.
Remember that blue D-sub (VGA) connectors are analogue and you cannot activate digital mode through an analog VGA cable.

3) Use the appropriate cable to connect the monitor to the PC.
Example: your PC has an HDMI output. Use an HDMI-to-DVI cable; your monitor input is DVI, so a plain DVI-to-DVI cable is not suitable if the PC only has HDMI.
Note that DVI-to-DVI, DisplayPort-to-DVI, etc. cables are available. You must buy the right one.

Be aware that the "Auto" button is inactive when the monitor is in digital mode.

Nov 18, 2013 | Belinea 1930 S1 Monitor

1 Answer

How to add second monitor on ECS RS482-M


It doesn't work that way, Max.

You can use EITHER the OnBoard Graphics,
OR
the Add-On Card graphics (graphics card).

BIOS will NOT support both at the same time.

If you wish to use dual monitors, get an AGP card that has more than one graphics port on it.

For example, although I'm using a graphics card based on the PCI Express technology, and not AGP technology; my card has a VGA port, DVI port, and HDMI port.

1) VGA Connector; http://en.wikipedia.org/wiki/VGA_Connector

http://en.wikipedia.org/wiki/File:Male_VGA_connector.jpg


2) DVI; http://en.wikipedia.org/wiki/Digital_Visual_Interface

http://en.wikipedia.org/wiki/File:Dvi-cable.jpg

http://en.wikipedia.org/wiki/Digital_Visual_Interface#Connector

3) HDMI; http://en.wikipedia.org/wiki/HDMI

I use the VGA port, and the DVI port.

RCA EN-V L26HD31, (26 inch HDTV) to the VGA port.
HP 2009M 20 inch Widescreen monitor to the DVI port.

Had to install drivers for HP 2009M.

Let's say I had two DVI ports on the graphics card.
(ATI Radeon HD5450 is the graphics card I have)

But I have a monitor with a VGA cable, and a monitor with a DVI cable.
I would (And have) use a VGA to DVI Adapter.
One example,

http://www.directron.com/dvi.html

Note that the DVI is Male, and the VGA is Female.
The graphics card DVI connector is Female.
The VGA Cable connector is Male.

So... what is the manufacturer name and model number of the graphics card, plus what type of monitors are you trying to use?
That is, what type of monitor cables do they have?

Know this;
My HDTV I'm using as a monitor is Digital. However I'm using a VGA cable on it. Best quality would be achieved if I got off my lazy rear, and attached a DVI Cable to it. It uses both.

VGA is Analog. A computer however puts out a digital video signal.
The computer has to convert the digital video signal to an Analog one. This slows the video signal down, and the quality is not as good.

Best to use a digital monitor. (Or a digital monitor WITH a digital cable. DVI Cable. Such as I need to do)

You are also using a PCI Express graphics card, NOT an AGP graphics card.

I'm wrong? OK.

In the meantime turn the computer off, unplug from power, and yank that AGP card out of the slot you put it in.

Tell me what expansion slot on the motherboard you put it in.

The Orange expansion slot on the motherboard, is a PCI-Express x16 slot. Uses a PCI Express graphics card -> ONLY

The white expansion slots are PCI.
Can use a PCI graphics card. CANNOT use an AGP graphics card.

Post back in a Comment.

Regards,
joecoolvette

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131338

Feb 18, 2013 | DFI 661GX-MLV SIS661GX CHIPSET SOCKET775...

1 Answer

How to connect from tower to monitor


Several styles of technology have been used, michealnorth;

1) VGA;

http://en.wikipedia.org/wiki/VGA_connector

2) DVI;

http://en.wikipedia.org/wiki/Digital_Visual_Interface

3) HDMI;

http://en.wikipedia.org/wiki/HDMI

4) Mini Display-Port,

http://en.wikipedia.org/wiki/Mini_DisplayPort

Let's say your computer has a DVI female connector on the back, but you have a VGA cable to your monitor. (With male VGA connector)
Use a VGA to DVI adapter,

http://www.directron.com/dvi.html

Other way around? DVI cable on monitor, but VGA port on back of computer?
Use a DVI to VGA adapter,

http://www.directron.com/dviadapter.html

[HD-15 is the proper name of a VGA connector ]

Same for HDMI, or Mini Display-Port.

On the back of your computer is the I/O area.
Input/Output area.
It has a rectangular thin metal shield around it, usually.
The I/O shield.

Keyboard, Mouse, Monitor, Audio {Sound} connectors, and USB ports, are in the I/O area.

There are horizontal slots at the back of the computer also.
These slots are used for expansion cards. A graphics/video card, is an expansion card.

If you have a graphics card in one of these horizontal slots, connect your monitor to it, and NOT to the I/O area.

Graphics comes from the graphics card now, and not from a VGA, or DVI port in the I/O area.

For additional questions please post in a Comment.
Regards,
joecoolvette

Aug 10, 2012 | Packard Bell iMedia Computers & Internet

1 Answer

I'm using a lenovo L195 LCD monitor, and my Graphic Card is Geforce 9300 GE. It is said that both the card and the monitor support DVI. When I try selecting the DVI signal input from the button on the...


If it's got a DVI input then it can handle a DVI signal.

There is no reason to have two cables plugged in unless you have two screens. If you've got two leads plugged in then the second one is probably turned off by default. I remember when I had a dual monitor set up, I had to turn on the second output in order to get both screens working.

You haven't said whether you removed the D-sub cable while the PC was off, but I get the impression that it was on at the time. Try unplugging the D-sub while the PC is off, check that the DVI cable is inserted correctly at both ends, and then turn your PC on. It should now be working.

If that doesn't work then turn your PC off again, plug the D-sub back in and start it up. Go to your display settings and turn on the DVI port; you want it to display a copy of the first screen. If you extend your desktop then you'll only be able to see half of it at a time. This isn't an ideal solution and your graphics card will be doing double the work, so try it the proper way first.

You might not notice the difference though.

Feb 04, 2011 | Lenovo L195 Monitor

1 Answer

TRYING TO CONNECT TWO MONITORS FOR EXTENDING USE


1) You have to have a graphics card with two monitor ports.

You cannot use one port on a graphics card together with the VGA or DVI port on the motherboard of the computer.
BIOS will only let you use the graphics card, or the Integrated Graphics.

2) Using a 'splitter' cable results in very bad graphics on two monitors, or no graphics at all.

3) With a graphics card installed that has two monitor ports on it, connect the monitors to the graphics card.

[ The graphics card can have a VGA port, and a DVI port. Or the graphics card can have two DVI ports.

VGA = Video Graphics Array
Photo of a VGA port,

http://en.wikipedia.org/wiki/File:SVGA_port.jpg

Photo of a VGA cable,

http://en.wikipedia.org/wiki/File:Vga-cable.jpg

DVI = Digital Visual Interface

Information about DVI showing a DVI cable, and connector examples for the DVI port on a graphics card,

http://en.wikipedia.org/wiki/Digital_Visual_Interface ]


4) Turn the two monitors on once they are connected to the graphics card.
Turn on the computer.

5) Once Windows has loaded, right-click on an empty area of the desktop screen.

6) In the list go to the bottom, and click on Properties

7) Click on the Settings tab.

8) You will see two monitor icons.
One icon has a square with a 1 in it.
The other icon has a square with a 2 in it.

The 1 monitor icon is used to represent your Primary monitor.
The one you have been using.
As you can see the 1 monitor icon is sitting to the Left.
This is how your Primary monitor should be sitting on your computer desk.

The 2 monitor icon is used to represent your Secondary monitor.
The one you are going to add.
As you can see the 2 monitor icon is sitting to the Right.
This is how your Secondary monitor should be sitting on your computer desk.

Left-click on the 2 monitor icon.
Go below in the Display Properties window, and view where it states -

Extend my Windows desktop onto this monitor

There is an empty square to the left of it.
Left-click in the empty square.

Now go below to the Right, and click on - Apply
Finally go below to the Left, and click on - OK

Your desktop screen will now be on your Secondary monitor.
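(Side note, not part of the original steps above: on Windows 7 and later the same "extend" setting can also be switched from a script using the built-in DisplaySwitch.exe utility. A minimal Python sketch, assuming a standard Windows install:)

# Minimal sketch: switch the desktop to "Extend" mode on Windows 7 or later
# by calling the built-in DisplaySwitch.exe utility.
# Other valid modes are /internal, /clone and /external.
import subprocess

subprocess.run(["DisplaySwitch.exe", "/extend"], check=True)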

Whatever is on your Primary monitor has to be dragged over to the Secondary monitor when you are on the internet, or in a program. (Such as a game)

Go to the blue frame on the Left side of the Windows screen.
When your cursor turns into a double-headed arrow STOP.

You'll find it's a little tricky your first time, keeping your cursor in the position where it turns into a double-headed arrow.
With the cursor showing as a double-headed arrow, press the left mouse button down and hold it down.
Drag the monitor screen to the Left.

Keep dragging until the window on your Primary monitor is on your Secondary monitor.
It's helpful you'll find to have two of the same size monitors.

I have found occasions, when using the internet on two monitors, where I had to Restore Down the window, THEN drag it over to the Secondary monitor screen.

(Icons at the top right of the monitor screen:
Minimize ( - ), Restore Down/Maximize, Red X.)

Then I Maximized the screen.


For further questions please state in a Comment.
(I believe the Comment link is at the upper right of your page.)

Nov 02, 2010 | Computers & Internet

1 Answer

Changing cable port


The monitor is a Fujitsu VL-15DX5G. 15 inch LCD.

The monitor cable uses a female DFP20 connector.
(Digital Flat Panel connector, 20-pin)

http://auction.thumbnail.image.rakuten.co.jp/@0_auc/image50/cb/4a/00010523850/a1/85/img00149858603.jpg

So you need a male DFP20 to male DVI-D adapter, or adapter cable.

DFP20 male to plug into the monitor cable, and male DVI-D (dual) to plug into the graphics card's female DVI-D connection.

So far, all I have found from a quick search is a Male DFP20 to Female DVI-D adapter, such as this example,

http://www.worldofcables.com/store/viewitem.asp?idproduct=768

Of course this won't do.

IF, this is all that's available for DFP20 adapters, you'll need an adapter of Male DVI-D to Male DVI-D, also. (Yes, I am aware of how crazy that sounds)

(It would appear, adapters are sold to connect a normal DVI-D monitor cable, {Male}, to a computer with a female DFP20 connector)

Should your searching only find DVI-D female to DFP20 male adapters, or adapter cables, then here is an example of a DVI-D male to DVI-D male adapter,

http://www.kvconnection.com/product-p/m-sgc-dvid-mm.htm

(DVI-D male to DVI-D male cable would probably be better as there would be less strain on the graphics card's DVI-D connection)

May 14, 2010 | Computers & Internet

1 Answer

When I connect the monitor to the GeForce 6200 graphics card it will not show a display at all. Connecting to the computer using the regular connection, it shows a display.


On both the graphics card and the monitor, the blue connector is a 15-pin D-sub VGA video connector and the white connector is a DVI video connector.
The monitor should default to the 15-pin D-sub video input.
If you want to use the DVI video input you need to configure the monitor to use it. DVI to DVI gives the best picture quality.
If you want to go from 15-pin D-sub to DVI you need a converter cable.
Try configuring the monitor for a DVI video input.

Jan 30, 2010 | XFX GeForce 6200 Graphic Card

2 Answers

"No signal detected" when LCD connected with DVI-D Cable


Thank you for providing your configuration details.
You have not specified whether or not your LCD display is under warranty.

Perhaps these simple troubleshooting procedures will get the job done.

1. Use the DVI-to-VGA adaptor that came with your card.

Plug a regular CRT monitor into the primary display output port using the DVI-to-VGA adaptor.
(Either port can be designated as the primary by you;
I like to use the port nearest to the motherboard as the primary.)

2. Plug your LCD monitor into the secondary display port using your DVI cable.

At this point you should be able to see your desktop on the Primary CRT Display

3. Go to the display properties and use the Nvidia dual-display setup wizard.

Set both desktops to CLONED at a resolution of 1024x768 and set the refresh rate to 60 Hz,
which is the default for most LCD displays; some have higher refresh rates, so consult your user manual.

If the display resolution and refresh rate are set incorrectly for your CRT or LCD you will see
"Signal Out of Range" (if the display has the ability to display that error message).

At this stage both displays should be showing a Windows Desktop

If you now see your desktop on both displays, you can now put the LCD on the primary port.
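(If you want to confirm from the desktop which displays the operating system actually detects before blaming the panel, here is a quick Python sketch using the third-party screeninfo package; the package and its get_monitors() call are an assumption about your setup, not part of the original procedure:)

# Quick check of which displays the operating system currently reports,
# using the third-party "screeninfo" package (pip install screeninfo).
# If the LCD is missing here, the problem is upstream of Windows:
# cable, graphics port, or the monitor's input selection.
from screeninfo import get_monitors

for monitor in get_monitors():
    print(monitor)  # prints geometry and position of each detected display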



No Signal Detected (this means the display device is not receiving a signal)

If your LCD display is still showing the error message
"No Signal Detected" after performing the troubleshooting procedures,
you may need to read the user manual to learn how to enable DVI signal input.

If you are certain you have done everything correctly but still see only "No Signal Detected",

your LCD display may need replacement.

Loch Hime :-)

Sep 14, 2008 | BenQ FP71E 17" LCD Monitor

1 Answer

Mag monitor doesn't display bios post


I have no experience with the DVI links because such technology is still unavailable here in Argentina. We have DVI video cards but finding a suitable DVI monitor is another (long) story.
I think that your card (which has 2 DVI ports) initializes one port first (the primary one) and then, when Windows starts, it initializes the other and outputs video from both ports.
Try connecting the monitor in the other DVI port.
If your card has a VGA output, the VGA gets initialized first.

May 06, 2008 | Computers & Internet
