Computer monitors are a crucial component of the modern world. For many people, a solid computer monitor is their connection to much of their work and social life. But what a lot of people never stop to consider is that the computer monitor is an invention that has evolved considerably since its inception.
Monitors have come a long way from their earliest incarnations, and that evolution has steadily lowered the level of sophistication required of the user compared with the early days.
The Earliest Monitors and Interfaces
Early in the development of computers, a display consisted of nothing more than flashing lights that indicated different functions. For people who thoroughly understood computer systems, these lights could convey crucial information about what part of the computer's memory was being accessed or what functions the computer was performing. Unfortunately, for most people such a display was simply a row of small light bulbs that appeared to flash at random.
Whether this type of display fits the modern definition of a monitor is debatable. However, if a monitor is defined as a dynamic indicator of a computer's status, such interfaces do qualify. Monitors have grown far more complex since those early days, but they are still essentially a series of lights.
Monitors have never been constructed out of paper, but the early printouts that approximated the functions of the modern-day monitor did use that material. During the earliest days of office computing, punch cards and paper tape were the standard media for reading and writing information. At that point, they were also the only means of transferring data from one computer to another.
Punch cards were simply stiff cards, roughly the size of index cards, into which a punch card writer made small holes. A punch card reader would then interpret the information based on the location of those holes. By contrast, a paper tape writer and reader used a long roll of paper that could be perforated in a similar fashion and read the same way, rather like a simplified version of the magnetic tape in a cassette.
Once the information was interpreted, the computer would print out letters and numbers based on the information. While this was not a monitor in the modern sense, it was the precursor to the monitors people use today.
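The idea of reading characters from hole positions can be sketched in a few lines of code. This is only an illustration: the codebook below covers a handful of real Hollerith combinations (the letters A through C, for instance, were a "zone" punch in row 12 plus a digit punch in rows 1 through 3), and the function names are our own, not those of any historical system.

```python
# Toy sketch of a punch card reader: each column of a card is modeled as
# the set of row numbers that were punched, and a codebook maps each
# combination of punches to a character.

# Partial codebook based on the Hollerith code; real cards had 12 rows
# and many more combinations than shown here.
CODEBOOK = {
    frozenset({12, 1}): "A",  # zone punch 12 + digit punch 1
    frozenset({12, 2}): "B",  # zone punch 12 + digit punch 2
    frozenset({12, 3}): "C",  # zone punch 12 + digit punch 3
    frozenset(): " ",         # no punches = blank column
}

def read_card(columns):
    """Decode a card, column by column; unknown punch patterns become '?'."""
    return "".join(CODEBOOK.get(frozenset(col), "?") for col in columns)

print(read_card([{12, 1}, {12, 2}, {12, 3}, set()]))  # -> "ABC "
```

Paper tape worked on the same principle, except that the "columns" ran continuously along a roll rather than being broken into discrete cards.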
Monitors That Show Characters
The CRT, or cathode ray tube, monitors that made television a reality also made the earliest modern-style computer monitors possible. Generally, these monitors used vector graphics, which plotted points directly on the screen. The military put this to work in radar and sonar applications, and scientists used these early monitors in oscilloscopes. Monitors could have been used to display text or other characters at this point, but that was a rare occurrence, and color was not yet an option.
People have been texting one another since 1902, if one counts the teletype. However, the teletype was originally used to communicate over wires or through radio signals. This interface, which resembled a typewriter, was connected directly to computers in the 1950s, and it remained the most cost-effective way of interfacing with a computer until the 1970s.
In the early 1960s, engineers realized they could attach a CRT monitor to a teletype machine and produce a more efficient means of communication. Initially called "glass teletypes," these were the first incarnation of what most people would consider the modern monitor. During the 1970s, the glass teletype exploded in popularity for office use.
A curious fact about these monitors is that they could display only characters; there was as yet no ability to display graphics.
Plasma: Older Than Most People Think
Plasma monitors were also developed in the 1960s. Using a gas trapped between two sheets of glass, such a monitor could apply a charge to the gas and create an image. This is the same technology used in today’s plasma TV displays, but it was originally applied to a small number of computer monitors in the 1970s and 80s.
Another display technology developed during the 1960s was the liquid crystal display, or LCD. These displays were energy efficient, thin and inexpensive, but they were hard to read without either backlighting or direct illumination.
In the early 1970s, computers were still extraordinarily expensive. However, some very bright individuals such as Steve Wozniak and Lee Felsenstein realized that one could simply hook a CCTV monitor up to a computer and display almost anything. Composite monitors became the industry standard on the personal computer market by the early 1980s, and many of these monitors were completely interchangeable with one another. The image quality afforded by these monitors was drastically superior to that of earlier models.
Using an RF converter, some of the first home computers and video game consoles let users display output on their existing TV screens. While there were resolution constraints, this brought widespread color video output into the average home. Nowadays the gaming monitor is a far more high-tech beast than ever, and demand for it is sky high.
Increasingly Sharp Color Monitors
The 1980s saw IBM and Commodore, with its Amiga line, standing atop a large number of companies that developed color monitors for the home PC market. As time progressed, so did the number of colors and the sharpness of the displays. Before 1987, a variety of competing display standards existed, including CGA and EGA. In 1987, IBM's VGA standard won out, and since then nearly every monitor has been built according to it.
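The jump from one standard to the next can be made concrete by counting the memory each one demanded per frame. The resolutions and color counts below are the standards' published figures; the helper function itself is just an illustration of the arithmetic, not part of any of these specifications.

```python
# Rough framebuffer sizes for some classic PC graphics modes.

def framebuffer_bytes(width, height, colors):
    """Bytes needed to store one full frame at the given color count."""
    bits_per_pixel = (colors - 1).bit_length()  # e.g. 16 colors -> 4 bpp
    return width * height * bits_per_pixel // 8

modes = {
    "CGA 320x200, 4 colors": framebuffer_bytes(320, 200, 4),      # 16,000
    "EGA 640x350, 16 colors": framebuffer_bytes(640, 350, 16),    # 112,000
    "VGA 640x480, 16 colors": framebuffer_bytes(640, 480, 16),    # 153,600
    "VGA 320x200, 256 colors": framebuffer_bytes(320, 200, 256),  # 64,000
}
for name, size in modes.items():
    print(f"{name}: {size:,} bytes")
```

In less than a decade, a single frame went from needing 16 KB of memory to roughly ten times that, which is one reason each new standard had to wait for memory prices to fall before it could take over the home market.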
The monitors of today are primarily LCD based. In the 1990s, LCD screens gained richer color, lower prices and faster refresh rates. These features made LCDs a favorite for laptop displays, and eventually the technology made the leap to TVs and desktops. Since 2007, LCD screens have outsold CRT displays, and the margin has grown dramatically every year since.
Current work on monitors involves 3D support. While 3D screens still require the use of specialized glasses, work is being done on stereoscopic and glasses-free alternatives. While the basic premise of using dots to convey information is still in effect, monitors have come an incredibly long way from the blinking lights of the earliest computers.