Return to Peripheral Page
Return to Computer Page

Monitor Page

Computer Video Monitor

If there is one single thing that makes the personal computer exciting, it is the monitor. A CD player has a processor inside. So do many microwave ovens and autos. The difference is that with the computer, we can get to Albuquerque from Boston in seconds rather than hours.

A video monitor works like a television set. When a metal is heated red hot, some of the electrons orbiting its atoms break free. They can then be drawn off the surface by a relatively positive charge. When free electrons strike a phosphorescent material, it glows, fairly brightly in the case of a video monitor. Free electrons readily attach themselves to positively charged atoms. Finally, a moving electron can be steered by applying a magnetic field to it. Taken together, these facts show how a monitor may work.

In a very basic Cathode Ray Tube (CRT), a beam of electrons is focused and aimed by electromagnets, then the electron beam strikes a phosphorescent surface at the other end of the tube. Light is given off by the phosphor at the end of the CRT. The entire tube is kept as close to a vacuum as possible so that the electron stream is not scattered by air molecules.

This basic CRT would be a monochrome (one-color) monitor. Raising the number of electrons that strike the screen makes the display brighter. Relate this to a black and white television or an amber or a green screen computer monitor. On a computer monitor, each dot is addressed individually to make part of a shape or character. This dot is called a pixel.

Color CRT type monitors have three kinds of phosphors to produce Red, Green and Blue (RGB) light. All of the colors that humans can see can be produced by some combination of these colors. The first color monitors used only two brightnesses of red, green and blue to produce colors. Each of red, green and blue could be off or on, and a shared intensity bit made the whole pixel normal or bright. A total of only sixteen colors (including black) could be displayed, because all three channels had to match in intensity.
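The counting argument above can be sketched in a few lines of Python: one on/off bit for each of the three guns plus one shared intensity bit gives sixteen combinations. (This is only the arithmetic of the scheme, not actual CGA hardware register values.)

```python
# Count the colors available with one bit per gun (R, G, B)
# plus a single intensity bit shared by all three channels.
def early_color_palette_size():
    colors = set()
    for red in (0, 1):
        for green in (0, 1):
            for blue in (0, 1):
                for intensity in (0, 1):
                    colors.add((red, green, blue, intensity))
    return len(colors)

print(early_color_palette_size())  # 2 * 2 * 2 * 2 = 16
```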

EGA monitors worked much the same as CGA monitors did but expanded the color range, if not the quantity of colors. There could only be sixteen colors shown at any time. Since EGA monitors could use combined high and low intensities in the same color, the sixteen colors could be selected from 64 colors. This expanded palette gave EGA monitors enhanced color possibilities. Also, the size of the pixel shrank, and so the clarity of the picture grew. This helped to inspire the development of the VGA monitor.
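The 64-color EGA master palette falls out of the same kind of counting: two intensity bits per channel give each of red, green and blue four levels, and 4 × 4 × 4 = 64. A short sketch of that arithmetic:

```python
# EGA gives each of red, green and blue two bits (four levels:
# off, low, high, or low and high combined), so the master
# palette holds 4 * 4 * 4 = 64 colors; any 16 may be shown at once.
from itertools import product

levels = range(4)  # 2 bits per channel -> 4 intensity levels
master_palette = list(product(levels, repeat=3))
print(len(master_palette))  # 64 colors to choose the 16 from
```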

The VGA monitor can display any color. Any limitation is in the video card used. While early VGA video cards were limited to sixteen colors displayed, they improved on EGA video cards both in resolution and in palette depth. The sixteen colors were chosen from 256 basic colors.

Since that time, true color VGA adapters have allowed the VGA and Super VGA monitor to display true color. This is the industry name for 24 bit color. With eight bits per color per pixel, there are 256 levels of red, green or blue per dot. That makes 16,777,216 colors to choose from. It would take a 4096 by 4096 pixel monitor to display all of the colors possible! The largest industry standard monitor these days measures 1280 by 1024 pixels.
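The true color figures are easy to check: 8 bits per channel means 2^8 = 256 levels each of red, green and blue, and a square screen showing every color exactly once would need the square root of that total on each side.

```python
# 24-bit "true color": 8 bits per channel.
levels_per_channel = 2 ** 8
total_colors = levels_per_channel ** 3
print(total_colors)  # 16,777,216 colors

# Side of a square display that could show each color once.
side = round(total_colors ** 0.5)
print(side)  # 4096, since 4096 * 4096 = 16,777,216
```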

Common VGA or SVGA monitors may be capable of 640x480, 800x600, 1024x768 and 1280x1024 pixels. This is horizontal dots by vertical dots. At 640 by 480, the number of pixels is (640x480=) 307,200. The 1280 by 1024 display shows 1,310,720. At 16 colors, each dot has 4 bits of information. At true color, each dot holds 24 bits. So a monitor showing 640x480 dots at 16 colors has (640x480x4=) 1,228,800 bits of information, while a true color 1280x1024 video card holds 31,457,280 bits of information! This is why the newer, better video adapter cards demand so much memory.
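The memory arithmetic above generalizes to a one-line formula: width × height × bits per pixel. A small sketch (it ignores any row padding a real video card might add):

```python
# Bits of video memory needed for a given resolution and color depth.
def framebuffer_bits(width, height, bits_per_pixel):
    return width * height * bits_per_pixel

print(framebuffer_bits(640, 480, 4))        # 1,228,800 bits at 16 colors
print(framebuffer_bits(1280, 1024, 24))     # 31,457,280 bits at true color
print(framebuffer_bits(1280, 1024, 24) // 8)  # 3,932,160 bytes, nearly 4 MB
```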

Screen savers came along because of a problem with cathode ray tubes. Recall from the basic CRT description above that electrons hit the phosphor coating on the front of the CRT. As each electron strikes, it may knock an atom loose! This is called sputter (like words flying out of a mouth taking spit along). When this happens, the phosphor in that spot is gone, so less light is given off there. A display constantly showing the same image will concentrate this effect, producing screen burn.

An early solution to this problem shut off the video monitor after a period of input inactivity; any activity turned it right back on. However, users who thought the computer was off often messed something up. Someone wanting to use the computer might reason that if this switch position is OFF, then the other way must be ON. A screen saver, then, does not shut the screen off completely, but minimizes the portion of the screen in use while the computer is idle. Now any program that starts to change screen characteristics after a period of inactivity may call itself a screen saver. To determine the best of these, consider first whether they truly protect against sputter.
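The idle-then-activate behavior described above amounts to a simple timer. Here is a minimal sketch of that logic; the class and method names are invented for illustration, and a real screen saver would hook into the operating system's input events rather than be called by hand.

```python
# A sketch of screen-saver idle logic: after a period with no input,
# start saving the screen; any input wakes the display immediately.
import time

class ScreenSaverTimer:
    def __init__(self, timeout_seconds=300):
        self.timeout = timeout_seconds
        self.last_input = time.monotonic()
        self.saving = False

    def on_input(self):
        """Any keystroke or mouse movement resets the idle clock."""
        self.last_input = time.monotonic()
        self.saving = False  # restore the normal display at once

    def tick(self):
        """Called periodically; activates the saver after the timeout."""
        if time.monotonic() - self.last_input >= self.timeout:
            self.saving = True
        return self.saving
```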


Updated January 24, 1998, 10:52pm.