
Hi, I'm connecting a new Dell U2414M monitor to my wife's cheap work computer, which only has a VGA output. What I've seen so far are DVI-to-VGA adapters whose descriptions state that the signal is being converted from its digital form to analog.

Although I can connect the VGA output from the computer into the female part of the adapter and then connect the male DVI part of the adapter into the Dell's female connector, I'm not sure the description holds if you reverse the order. Physically they all connect, but does the analog signal from the cheap computer actually get converted to a digital signal in the passive adapter?

Also, what is the difference between a passive adapter and an active one, other than the higher price? Does either option really work, or is it wishful thinking?

marked as duplicate by Ƭᴇcʜιᴇ007, DavidPostill, Nifle, Ramhound, Raystafarian Jan 5 '15 at 19:30



A passive adapter doesn't change the signal; it just connects the right pins on one connector to the right pins on the other. It's for when the signal is already in the right form and you are just matching connector formats. An active adapter takes the input signal and converts it to another type of signal. Going between analog and digital requires an active adapter.

The conversion works in only one direction. A DVI-to-VGA active adapter has circuitry to take a digital signal and convert it to an analog one; it would take completely different circuitry to do the reverse.

For video content that is generated on your computer or comes from a good digital source, if it starts as digital and stays digital, and you view it at its native resolution, the signal undergoes no degradation from the source until you see it.

A signal that started as analog or is converted to analog is degraded by the conversion process and is further degraded by each processing step between creation or conversion and viewing. That is your starting point with VGA.

Note that with either a digital or analog signal, if you view it enlarged or reduced from its native size, the process of mapping it to another size reduces its sharpness and fine detail.
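To make the scaling point concrete, here is a toy sketch (nothing to do with real video hardware — the sample values, factor, and helper names are invented for illustration): once a high-frequency signal is mapped to a lower resolution, scaling it back up cannot recover the detail that was averaged away.

```python
# Toy illustration: scaling a signal away from its native resolution
# and back loses fine detail permanently.

def downscale(samples, factor):
    """Average each group of `factor` samples into one (lossy)."""
    return [sum(samples[i:i + factor]) / factor
            for i in range(0, len(samples), factor)]

def upscale(samples, factor):
    """Repeat each sample `factor` times (nearest-neighbour)."""
    return [s for s in samples for _ in range(factor)]

# A "native resolution" signal with fine detail: alternating values.
native = [0.0, 1.0] * 8          # 16 samples of high-frequency detail

rescaled = upscale(downscale(native, 2), 2)

# The alternating detail has been averaged away to a flat 0.5.
error = max(abs(a - b) for a, b in zip(native, rescaled))
print(rescaled[:4])  # [0.5, 0.5, 0.5, 0.5]
print(error)         # 0.5 — the fine detail is gone
```

Real scalers use better filters than this, but the principle is the same: resampling away from native resolution trades away sharpness.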

You can convert the analog VGA signal to digital so that the monitor can use it at its DVI port, but the quality won't be any better than the VGA source. In fact, the conversion to digital could introduce additional degradation. For this reason, you might as well just use a VGA cable to connect the computer to the monitor's VGA port; converting to DVI won't provide any benefit.
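A toy numeric sketch of why the digital-analog-digital round trip can't improve anything (the bit depth, noise model, and function names are invented for illustration, not how real DACs/ADCs work): in the best case the original value survives unchanged, and any noise picked up on the analog leg becomes a permanent error after re-digitizing.

```python
# Toy sketch: converting to analog and back to digital cannot recover
# quality; each conversion can only preserve or lose information.

def to_analog(level, bits=8):
    """Pretend DAC: map an integer code to a 0.0-1.0 voltage."""
    return level / (2 ** bits - 1)

def to_digital(voltage, bits=8):
    """Pretend ADC: quantize a voltage back to an integer code."""
    return round(voltage * (2 ** bits - 1))

def add_noise(voltage, amount):
    """Analog links pick up noise; model it as a fixed offset."""
    return voltage + amount

source = 200                           # original digital pixel value
clean = to_digital(to_analog(source))  # ideal link: value survives
noisy = to_digital(add_noise(to_analog(source), 0.02))

print(clean)  # 200 — best case: no improvement, no loss
print(noisy)  # 205 — noise on the analog leg is now baked in
```

The best a VGA-to-DVI converter can do is faithfully re-digitize whatever the analog cable delivered, noise included — which is why it can't beat just using the VGA input directly.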

  • "A signal that starts as digital and stays digital undergoes no degradation..." -- That's not quite true. Scaling, resampling, and other transformations in the digital domain can introduce "degradations" euphemistically called artifacts. Excessive noise can corrupt a digital signal to the point that it cannot be demodulated (i.e., it's then useless), whereas an analog signal can tolerate noise and still be usable, just with lowered quality. "A signal that started as analog or was converted to analog will not be as good." -- That's a questionable generalization. – sawdust Jan 3 '15 at 20:46
  • @sawdust - Perhaps an over-simplification. Granted, it is certainly possible to degrade a digital signal, a low-strength digital signal might be less usable than a noisy analog signal, and an analog signal can have advantages. This answer is in the context of the signal likely to be at the computer's video output, assuming it came from a decent source. I'll qualify the answer. – fixer1234 Jan 3 '15 at 20:59
