Well, I made the decision to call Dell since the PC is still under warranty, and it turned out to be a problem with the graphics card driver.
It took about an hour and a half on the phone, with lots of uninstalling, reinstalling, downloading new drivers, and adjusting display settings and resolution to get it right. Works a treat now. Hope that helps.
Any contemporary monitor can switch between analog and digital mode automatically if the connection from the signal source to the monitor is set up properly.
Connect the digital output of your graphics card to the DVI input of the monitor. Disconnect any other signal cables from the monitor and switch on both devices (PC and monitor). Wait a moment; the monitor checks its inputs and activates the digital DVI input, because it is the only active input.
If nothing but the monitor's own error message is displayed, go through these steps: 1) Check whether your monitor is set for automatic input selection. If the monitor buttons do not respond, try using the analog input to activate the screen (any picture, desktop background...) and then open the monitor menu by pressing the menu button.
2) Find out what connector type your graphics card has. The possibilities are DVI, HDMI, DisplayPort, and some of their variants; check Wikipedia for what each connector looks like if you are not sure. Remember that the blue D-SUB (VGA) connector is analog, and you cannot activate digital mode through an analog VGA cable.
3) Use the appropriate cable to connect the monitor to the PC, as sketched below. Example: your PC has an HDMI output and your monitor input is DVI, so use an HDMI-to-DVI cable; a common DVI-to-DVI cable will not fit in this case. Note that DVI-to-DVI, DisplayPort-to-DVI, and similar cables are all available, so you must buy the right one.
Be aware that the "Auto" button is inactive in the monitor's digital mode.
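To make step 3 concrete, here is a minimal Python sketch of the cable-matching rule. The table and the pick_cable helper are illustrative names of mine, not part of any standard API:

```python
# Illustrative sketch: match the PC output and the monitor input to the
# cable or adapter you would shop for. The rule from the answer above:
# keep the signal digital end-to-end; VGA stays analog.

CABLE_FOR = {
    ("DVI", "DVI"): "DVI-to-DVI cable",
    ("HDMI", "DVI"): "HDMI-to-DVI cable",
    ("DisplayPort", "DVI"): "DisplayPort-to-DVI cable",
    ("VGA", "DVI"): None,  # analog VGA cannot drive the monitor's digital DVI mode
}

def pick_cable(pc_output: str, monitor_input: str) -> str:
    cable = CABLE_FOR.get((pc_output, monitor_input))
    if cable is None:
        return "No digital path: VGA output is analog-only"
    return cable

if __name__ == "__main__":
    print(pick_cable("HDMI", "DVI"))  # -> HDMI-to-DVI cable
```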
The DVI cable transports the display signal from the video card to the monitor.
I think what Ron is asking is whether you used a different cable to connect the monitor to the computer. (I'm assuming you previously used an old 15-pin VGA cable or something similar?)
Without getting into your BIOS:
So on your computer, you probably have an onboard video card (a video card built into the motherboard itself) and a dedicated video card (probably where you're seeing the DVI output).
It may be that the onboard video card (with the traditional connector) is disabled while the dedicated video card is plugged in.
This isn't rocket science, so don't worry. A Phillips-head screwdriver is all you need.
Pull the side panel off your computer (assuming it's a tower... if not, it's the top panel). Once you have it off, you'll see the video card plugged into a slot on the motherboard, with one Phillips-head screw holding it in place (inside, near the back, where the DVI port is).
Pull the screw and the video card should come out. From there, the BIOS should recognize that the card is missing and revert to using the onboard video card.
DVI is a newer video interface technology designed to maximize the quality of flat-panel LCD monitors and high-end video graphics cards. It is a replacement for the P&D (Plug & Display) standard.
You can connect a high-definition monitor, an LCD projector, or an LCD TV to a DVI port.
There are three kinds of DVI connector:
1. DVI-D (Digital)
2. DVI-A (Analog)
3. DVI-I (Integrated Digital/Analog)
The DVI-D format is used for direct digital connections between a video source (namely, a video card) and a digital LCD (or, rarely, a digital CRT) monitor.
DVI-A - High-Res Analog
The DVI-A format is used to carry a DVI signal to an analog display, such as a CRT monitor or an HDTV.
DVI-I - The Best of Both Worlds
The DVI-I format is an integrated cable capable of transmitting either a digital-to-digital signal or an analog-to-analog signal, but it will not convert digital-to-analog or analog-to-digital.
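A tiny Python illustration of that pairing rule (the dvi_i_link_works helper is hypothetical, written only to make the rule explicit):

```python
# The signal-pairing rule above: a DVI-I link carries digital-to-digital
# or analog-to-analog, but never converts between the two.

def dvi_i_link_works(source: str, display: str) -> bool:
    """Both ends must speak the same signal type ('digital' or 'analog')."""
    return source == display and source in ("digital", "analog")

assert dvi_i_link_works("digital", "digital")
assert dvi_i_link_works("analog", "analog")
assert not dvi_i_link_works("digital", "analog")  # DVI-I does not convert
```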
How to Recognize These Three Types of DVI Cable
There are two variables in every DVI connector, and each represents one characteristic (both rules are sketched in code below).
The flat pin on one side denotes whether the cable is digital or analog:
A flat pin with four surrounding pins is either DVI-I or DVI-A
A flat pin alone denotes DVI-D
The pin sets vary depending on whether the cable is single- or dual-link:
A solid 24-pin set (three rows of 8) for a dual-link cable
Two separated 9-pin sets (three rows of 3) for a single-link cable
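Both recognition rules can be written down as a short Python sketch; the classify_dvi helper is an illustrative name of mine:

```python
# A minimal sketch of the recognition rules above: classify a DVI plug
# from the two features you can see on the connector itself.

def classify_dvi(flat_pin_has_four_neighbours: bool, solid_pin_block: bool) -> str:
    """flat_pin_has_four_neighbours: four analog pins around the flat blade.
    solid_pin_block: one unbroken 24-pin grid (dual-link) rather than
    two separated 9-pin groups (single-link)."""
    signal = "DVI-I or DVI-A" if flat_pin_has_four_neighbours else "DVI-D"
    link = "dual-link" if solid_pin_block else "single-link"
    return f"{signal}, {link}"

print(classify_dvi(False, True))   # -> DVI-D, dual-link
print(classify_dvi(True, False))   # -> DVI-I or DVI-A, single-link
```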
If it's got a DVI input then it can handle a DVI signal.
There is no reason to have two cables plugged in unless you have two screens. If you've got two leads plugged in, then the second one is probably turned off by default. I remember when I had a dual-monitor setup, I had to turn on the second output to get both screens working.
You haven't said whether you removed the D-sub cable while the PC was off, but I get the impression it was on at the time. Try unplugging the D-sub while the PC is off, check that the DVI cable is inserted correctly at both ends, and then turn your PC on. It should now be working.
If that doesn't work, turn your PC off again, plug the D-sub back in, and start it up. Go to your display settings and turn on the DVI port; you want it to display a copy of the first screen. If you extend your desktop, you'll only be able to see half of it at a time. This isn't an ideal solution and your graphics card will be doing double the work, so try it the proper way first.
When you connect the computer to the monitor via an analog VGA cable, you get a signal on the analog input. When you connect them via a DVI cable, you feed the monitor's DIGITAL input, so the monitor doesn't get any signal on its analog input and informs you about it. You have to select the digital input from the monitor's on-screen menu and then start the computer up; you should then see the picture fine. Also remember to set the display resolution on your computer to match the optimum (native) resolution of your monitor, as this gives the best picture quality; in the case of the VA712B that is 1280x1024. Good luck, and please come back with a testimonial if this helped.
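As a rough sanity check (my own back-of-the-envelope Python, using the standard VESA total timings of 1688x1066 for this mode), 1280x1024 at 60 Hz sits comfortably under the 165 MHz single-link DVI pixel-clock limit:

```python
# Check that the VA712B's native 1280x1024@60 mode fits single-link DVI.
# The 1688x1066 "total" figures include the standard VESA blanking intervals.

SINGLE_LINK_LIMIT_MHZ = 165  # single-link DVI pixel-clock ceiling

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: int) -> float:
    return h_total * v_total * refresh_hz / 1e6

clock = pixel_clock_mhz(1688, 1066, 60)  # ~108 MHz
print(f"{clock:.1f} MHz, fits single link: {clock <= SINGLE_LINK_LIMIT_MHZ}")
```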
Thank you for providing your configuration details.
You have not specified whether or not your LCD display is under warranty.
Perhaps these simple troubleshooting procedures will get the job done.
1. Use the DVI-to-VGA adapter that came with your card.
Plug a regular CRT monitor into the primary display output port using the DVI-to-VGA adapter
(either port can be designated as the primary by you; I like to use the port nearest the motherboard as the primary).
2. Plug your LCD monitor into the secondary display port using your DVI cable.
At this point you should be able to see your desktop on the primary CRT display.
3. Go to the display properties and use the Nvidia dual-display setup wizard.
Set both desktops to CLONED at a resolution of 1024x768 and set the refresh rate to 60 Hz,
which is the default for most LCD displays; some have higher refresh rates, so consult your user manual.
If the display resolution and refresh rate are set incorrectly for your CRT or LCD, you will see
"Signal Out of Range" (if the display has the ability to show that error message).
At this stage both displays should be showing a Windows desktop.
If you now see your desktop on both displays, you can move the LCD to the primary port.
"No Signal Detected" means the display device is not receiving a signal at all.
If your LCD display is still showing the "No Signal Detected" error message after performing these troubleshooting procedures,
you may need to read the user manual to find out how to enable DVI signal input.
If you are certain you have done everything correctly but still see only "No Signal Detected", the display itself may be at fault, which is why the warranty question matters. A short sketch of the two error states follows.
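Here is an illustrative Python sketch of the two error states; the limits dictionary is made up for the example, as a real panel's limits come from its EDID or user manual:

```python
# Illustrative only: how a monitor decides between the two error messages.
# LCD_LIMITS is invented for this example; check your own panel's manual.

LCD_LIMITS = {"max_width": 1280, "max_height": 1024, "refresh_hz": (56, 75)}

def monitor_response(signal_present: bool, width: int, height: int, hz: int) -> str:
    if not signal_present:
        return "No Signal Detected"   # nothing arriving on the input at all
    lo, hi = LCD_LIMITS["refresh_hz"]
    if width > LCD_LIMITS["max_width"] or height > LCD_LIMITS["max_height"] \
            or not lo <= hz <= hi:
        return "Signal Out of Range"  # signal arrives but the mode is unsupported
    return "picture OK"

print(monitor_response(True, 1024, 768, 60))   # -> picture OK
print(monitor_response(True, 1600, 1200, 60))  # -> Signal Out of Range
print(monitor_response(False, 0, 0, 0))        # -> No Signal Detected
```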
If you still have your old monitor, plug it back in for a bit and load the drivers for the new monitor (from the disk that came with it). Right-click on your desktop, choose Properties, and set your display resolution to something like 1024 x 768. Shut down the PC and connect your new monitor to it with the DVI adapter. If it is recognised OK, you can then raise the resolution from 1024 x 768 to the recommended setting for your new screen. The refresh rate is also important, as some LCD screens only work at a certain level. There should be an Auto button on the monitor; pressing it sets the display to optimum, or should do.