I have an nVidia 8600GT. It was connected to my monitor (via a VGA-to-DVI converter) and to my TV via S-Video. I recently replaced the TV and unplugged the S-Video cable from the back of the old TV with the computer still on. Now my screen looks like this ONLY when I get to the login screen on Windows and ONLY when I am logged into Ubuntu (the boot screen looks fine on both OSes).
I tried uninstalling and reinstalling the drivers, and also removing and reseating the card. No success.
I have reverted to the on-board video and it works fine... did I break my video card?
Re: Artifacts on monitor
Ha. Trippy background and woven look. It seems to happen only in certain video modes, so I suspect what happened was not electrical damage from incaution with the cables, but that a DDR3 trace came unsoldered when the S-Video socket was flexed. You can hunt around on the video card for the bad line (for example, going around the memory chips with a q-tip dampened with deionized water and lightly pressing each chip back into place to see whether the artifacts change), or try reflowing the solder on the card, maybe.
Come to think of it, maybe there's an "S-Video connected" sense signal that is now causing the line-scan circuitry to fail in that regular way; you could also try changing how that connector is hooked up to the card (reconnecting it, using a longer cable, or putting a video switch in line) to see if that affects picture quality.
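Since the corruption only shows up in certain modes, on the Ubuntu side you can step through modes from a terminal to narrow down which ones trigger it. A rough sketch using xrandr (the output name `DVI-0` and the mode numbers are assumptions; substitute whatever your own `xrandr` listing shows):

```shell
# List connected outputs and the modes the driver reports.
# Note the exact output name (DVI-0, DVI-I-1, VGA-0, ...) for use below.
xrandr --query

# Try a lower resolution/refresh. If the woven pattern disappears here
# but returns at the native mode, a flaky memory trace is more likely
# than blanket electrical damage.
xrandr --output DVI-0 --mode 1024x768 --rate 60

# Switch back to the native mode afterwards and compare.
xrandr --output DVI-0 --mode 1280x1024
```

These commands need a running X session with a display attached, so run them at the machine itself, not over a bare SSH session.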
Good job debugging your situation. I hope the card comes right with nothing more than a little re-flexing.
Normally, monitors do not have tuning electronics for capturing TV signals.
Therefore, consider the monitor a "dumb" unit that only displays information coming from your computer. Also, this unit does not have an HDMI input, only VGA.
So this monitor will always need to be connected to your computer.
Now, you may find good software that will "tune in" TV channels streamed over the Internet and show whatever is being broadcast. The same way you can access YouTube and watch videos, these software TV tuners receive the streams that TV stations send over the Internet (not all stations do that) and let you watch the channel on your PC/monitor.
Since I do not have additional information about the computer you use, I cannot give you other ideas.
I hope I have helped you.
Connect the VGA cable to the converter, and the component video cable between the converter and the TV, with the computer and TV off. Turn on the TV first and set it to the component video input, then turn on the computer. Then (assuming the VGA port is enabled; adjust in the BIOS if needed) set the computer's video output to the VGA port. On my Windows 7 laptop this can be done by right-clicking on the desktop: choose Graphics Properties, set the display to the external monitor or both, click Apply, and then confirm the change (on the external monitor, if using a single monitor). VGA-to-composite-video converters also exist if you do not have an available component video input on your TV.
If your computer is fairly new, you may have an HDMI port. Your TV may be able to read the signal with a DVI to HDMI cable. (It depends on the TV. Not all of them will read the computer's signal on the DVI port. Check the documentation for your model. TVs without a VGA port are less likely to support this feature.)
This card does not allow you to multi-display with the VGA port and the S-Video port at the same time. If you put a DVI-to-VGA adaptor (not supplied) on the DVI port, you can multi-display two VGA monitors, and (presumably) a VGA monitor (from the DVI port) together with an S-Video display.
This is not explained anywhere and the card does not come with a DVI-to-VGA adaptor, so I was only able to work this out after I bought the adaptor myself.
Unfortunately you are trying to mix two different video types: VGA is an analog video interface found on most computers (except the latest ones with flat-panel monitors, which usually use DVI). Your TV's DVI interface is digital (there is a DVI-A variant, which is an analog DVI interface, but it is unlikely that your TV has it). So, what you are trying to do probably won't work. I assume your computer does not have a DVI output (otherwise you wouldn't have bought the adapter), but some monitors have VGA inputs; does yours? What kind of DVI adapter did you get? Is it a DVI-A? Check out http://en.wikipedia.org/wiki/Dvi for info on DVI interfaces.