Analog to Digital (A/D) Board?

GreatestOne

Limp Gawd
Joined
May 15, 2005
Messages
488
Anyone know why some monitors use Analog-to-Digital (A/D) boards and some don't? In layman's terms, what does this do exactly? I found some online sources, but they don't really help me (my mech E days are loooong gone)

"The Analog-to-Digital (A/D) Board allows the microcomputer to digitally quantify up to four channels of current or voltage inputs from other system units. This card is one of several that can be plugged into the common input/output bus (slots 6 through 13) of the microcomputer."

and they seem to support QVGA, VGA, SVGA, XGA, WXGA, WSXGA, SXGA, SXGA+, WSXGA+, UXGA, WUXGA, QXGA.

Any EE's out there?
 
VGA is analog, CRTs are analog, computers are digital, LCD displays are digital.

computer -> CRT requires DACs on the computer's video card.
computer -> LCD via VGA requires DACs on the computer's video card, and ADCs on the LCD monitor

http://www.webopedia.com/TERM/R/RAMDAC.html

even better:
http://en.wikipedia.org/wiki/RAMDAC

VGA has mostly been made obsolete by DVI-D, HDMI, and DisplayPort, which are all pure digital interconnects.
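The DAC-then-ADC round trip described above can be sketched as a toy model. This assumes an ideal, noiseless link and VGA's nominal 0.7 V full-scale video level; real converters add noise and quantization error:

```python
# Toy model of the analog VGA path for one color channel:
# the video card's RAMDAC (digital -> analog) followed by
# the LCD monitor's A/D board (analog -> digital).
FULL_SCALE_V = 0.7  # VGA's nominal peak video level
BITS = 8

def dac(level: int) -> float:
    """Video-card RAMDAC: map an 8-bit pixel value to an analog voltage."""
    return (level / (2**BITS - 1)) * FULL_SCALE_V

def adc(voltage: float) -> int:
    """Monitor's A/D board: quantize the analog voltage back to an 8-bit value."""
    return round((voltage / FULL_SCALE_V) * (2**BITS - 1))

# With an ideal link, every pixel value survives the round trip.
assert all(adc(dac(v)) == v for v in range(2**BITS))
```

In practice cable loss and noise mean the recovered value can be off by a step or two, which is one reason VGA on an LCD never looked quite as crisp as DVI.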
 
I'm no EE, but my best guess would be the presence of a dedicated scaler/image processor vs a dumb display, or that it has extra inputs.
 

Thanks! But.....

So with the advent of DACs, the conversation about A/D boards became obsolete? So in today's tech, why would any monitor need an A/D board when almost every video card, probably even onboard ones, has a RAMDAC installed, or am I still not getting it? The only reason I ask is because I heard there ARE still some displays that do NOT have A/D boards, but it seems that's unnecessary today anyway, so who cares whether it has one or not?
 
Because some monitors still have VGA inputs.
 
So in modern times, assuming no one uses regular VGA anymore, only HDMI/DVI/DP, is there any other reason for a monitor to need a built-in A/D board?
 
To sum up: to support a VGA input, a digital display has to have three ADCs to convert the three analog color-channel streams to digital values, so they can be loaded into the display controller as discrete numbers.
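As a sketch of what those three ADCs do for each pixel (hypothetical sampled voltages, again assuming VGA's nominal 0.7 V full scale):

```python
FULL_SCALE_V = 0.7  # VGA's nominal peak video level

def quantize(voltage: float, bits: int = 8) -> int:
    """One ADC channel: turn an analog voltage into a discrete code."""
    return round((voltage / FULL_SCALE_V) * (2**bits - 1))

# One ADC per color channel; the display controller stores the three
# results as one discrete RGB pixel.
r_v, g_v, b_v = 0.35, 0.70, 0.0  # hypothetical voltages sampled mid-pixel
pixel = tuple(quantize(v) for v in (r_v, g_v, b_v))
print(pixel)  # (128, 255, 0)
```

The A/D board also has to sample at the right moments (pixel clock recovery), which is why VGA-connected LCDs had "auto adjust" buttons.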

CRTs did not have individually addressable pixels. Instead, three electron beams, one for each color, were swept across the display line by line, with the three VGA color channels directly controlling the amplitude of each beam. The higher the amplitude at any given point along the line, the more "excited" the phosphors intersected by the beam, the more light they gave off, and the brighter the corresponding color.
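The amplitude-to-brightness relationship wasn't linear, by the way: CRT phosphor output follows roughly a power law (the familiar CRT "gamma"). A rough sketch, using 2.2 as a typical gamma value (real tubes varied around 2.2 to 2.5):

```python
GAMMA = 2.2  # typical CRT gamma; actual tubes varied roughly 2.2-2.5

def relative_brightness(amplitude: float) -> float:
    """Relative phosphor light output for a normalized beam drive in [0, 1]."""
    return amplitude ** GAMMA

print(relative_brightness(1.0))              # 1.0
print(round(relative_brightness(0.5), 3))    # 0.218 -- half drive, ~22% light
```

This CRT behavior is why gamma correction exists in video signals and why LCDs still emulate it today.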
 