The first people ever to gaze into a mirror were astonished to see themselves. Today, thanks to the humble computer monitor, you can look into an entire universe of information. Yet few of the thousands of people in the greater Nashua NH area sitting at their computers right this minute are likely giving much thought to what they are looking at. Except, of course, when their computer display breaks and it's time for monitor repair. Let's take a look at the history and development of the computer monitor: today's mirror into the 'information superhighway.'
A computer monitor, or display, is an electronic visual display used with computers. It comprises the display device, its circuitry and an enclosure. In modern monitors, the display is in most cases a thin film transistor liquid crystal display (TFT-LCD) panel. Older monitors used a cathode ray tube (CRT), which made them about as deep as the screen was wide. Until the 1980s, computers and their displays were used only for data processing, while the TV was used for entertainment; later, the two converged as TVs took on computer functionality. One sign of that convergence: the common aspect ratio of both televisions and computer displays has shifted from 4:3 to 16:9.
Computer displays are often referred to as "monitors," and the reason is the following. Back in the day, early electronic computers were fitted with a panel of light bulbs, where each bulb indicated the on or off state of a particular register bit inside the computer. This let engineers monitor the internal state of the machine, so the panel of light bulbs became known as the "monitor." In those days, the monitor could only keep track of a program's operation, and a line printer was the primary output device.
Later, it was realized that a CRT display was a more flexible and efficient output than a panel of light bulbs. Eventually, once programs could control what was displayed, the monitor became a powerful output device in its own right.
Early Days of the CRT Display
The CRT display is basically a vacuum tube containing one or more electron guns and a fluorescent screen on which images are viewed. It accelerates and deflects the electron beams onto the screen, and this creates the images. These images can take the form of electrical waveforms (oscilloscopes), pictures (televisions, computer displays), radar targets and more.
The first cathode-ray tubes appeared in computers as a form of memory, not as displays. Some time later, it was realized that additional CRTs could be used to show the contents of that CRT-based memory. Designers then adapted radar and oscilloscope CRTs for use as primitive graphical displays, with no color. These were rarely used for text at the time; notable early uses were in the SAGE and PDP-1 systems.
The Time of Composite Video Output
First there were teletype terminals. A teletype is an electric typewriter that communicates with another teletype over wires by means of a special code. After the 1950s, engineers hooked teletypes up to computers directly, using them as output devices. Until the mid-1970s, these were the cheapest way to get a continuous printed record of a computer session. By the mid-1970s, though, it became obvious that a teletype terminal (even a "glass" one) was too expensive for an individual building a personal computer.
Enter Don Lancaster, Lee Felsenstein and Steve Wozniak
These three guys had the same idea at the same time: build a cheap terminal device using an inexpensive CCTV video monitor as a computer display! Shortly afterward, both Wozniak and Felsenstein built video terminals into computers themselves, creating, in 1976, the first computers with factory video outputs: the Apple I and the Sol-20.
Early Plasma Displays
Long before plasma TV sets came into our homes, this technology was invented with computer monitors in mind. It was the 1960s, and the display technology had just emerged. It used a gas trapped between two glass plates; applying a voltage across the plates made the gas glow in a pattern. The earliest computer system to use a plasma display was the PLATO IV terminal. Companies such as IBM and GRiD later experimented with these thin, lightweight displays in portable computers, but the technology never took off for personal computers. It did, however, emerge once more with the introduction of the flat-panel TV sets we all know today.
Important Innovations and LCDs
In the very early days of the IBM personal computer, users needed a different monitor for each display scheme, whether it was MDA, CGA, EGA or another. To address this, NEC invented the first multisync monitor. The MultiSync supported a range of resolutions, scan frequencies and refresh rates all in one, and this soon became the standard for displays in the PC industry. In 1987, IBM introduced the VGA video standard and the first VGA monitors along with its PS/2 line of computers; almost every analog video standard since has been built on VGA and its well-known 15-pin connector.
Short for liquid crystal display, the LCD is a type of display used in many portable computers. These displays use two sheets of polarizing material with a liquid crystal solution between them. An electric current passed through the liquid causes the crystals to align so that they block light from passing through.
When LCDs first appeared, they were low-contrast monochrome affairs with very slow refresh rates. In the 1990s, LCD technology saw major improvement, driven by a market boom, especially in laptop computers. As it improved, the displays gained more contrast, wider viewing angles and better color reproduction. LCDs soon leaped from the portable PC sector onto the desktop.
Computer Monitors Today
Desktop LCD monitors outsold CRT monitors for the first time in 2007, and their sales and market share have continued to climb to this day. LCD monitors have become so inexpensive that it is no wonder many people are experimenting with dual-monitor setups for more display space for work, gaming and other computer activities. Today's industry also emphasizes monitors that support 3D vision through 3D glasses, along with ultra-high refresh rates. Another interesting thing to mention is how the line between a TV and a monitor has blurred. People today are buying 42-inch high-definition flat panel displays for under $999 and using them as computer monitors as well. Imagine telling someone back in the 1940s that we'd be doing this - mind blown.
That said, people planning to use a large screen as a computer monitor would be better off purchasing a 32-inch or even larger computer monitor rather than a TV. The reason is that a monitor will give you a much better image - it was designed to serve as a computer monitor. The main difference between a computer monitor and a TV is that TVs are built to receive over-the-air broadcast signals and carry inputs meant for your satellite or cable provider, along with devices such as game consoles and DVD players. Sure, the screen on a TV may be bigger, but at what cost? Lower effective resolution, which means a worse image. Large-format monitors offer more resolution, and you will be able to read what's on the screen much more easily.
Of course, if you are using an HDMI cable, have a good graphics card and a full high-definition TV, you will most likely enjoy your computer time much as you would with a standard computer monitor. Monitors do tend to be more expensive, but they are strongly recommended for uses such as computer gaming, paired with a very good graphics card.