The Five Generations of Computers
First Generation (1945-1956)
With the onset of the Second
World War, governments sought to develop computers to exploit their
potential strategic importance. This increased funding for computer development
projects hastened technical progress. By 1941 German engineer
Konrad Zuse had developed
a computer, the Z3, to design airplanes
and missiles. The Allied forces, however, made greater strides in developing
powerful computers. In 1943, the British completed a secret code-breaking
computer called Colossus
to decode German
messages. The Colossus's impact on the development of the computer industry
was rather limited for two important reasons. First, Colossus was not a
general-purpose computer; it was only designed to decode secret messages.
Second, the existence of the machine was kept secret until decades after the
war.
American efforts produced a broader achievement. Howard H. Aiken
(1900-1973), a Harvard engineer working with IBM, succeeded in producing a
large-scale electromechanical calculator by 1944. The purpose of the computer was to create
ballistic charts for the U.S. Navy. It was
about half as long as a football field and contained about 500 miles of wiring.
The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I for short,
was an electromechanical relay computer. It used electromagnetic signals to move
mechanical parts. The machine was slow (taking 3-5 seconds per calculation) and
inflexible (in that sequences of calculations could not change); but it could
perform basic arithmetic as well as more complex equations.
Another computer development spurred by the war was the Electronic Numerical
Integrator and Computer (ENIAC),
produced by a partnership between the U.S. government and the
University of Pennsylvania. Consisting
of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the
computer was such a massive piece of machinery that it consumed 160 kilowatts of
electrical power, enough to dim the lights in an entire section of
Philadelphia.
Developed by
John
Presper Eckert (1919-1995) and John W. Mauchly (1907-1980),
ENIAC, unlike the Colossus and Mark I, was a general-purpose computer that
computed at speeds 1,000 times faster than Mark I.
In the mid-1940's John
von Neumann (1903-1957) joined the University of Pennsylvania team,
initiating concepts in computer design that remained central to computer
engineering for the next 40 years. Von Neumann designed the Electronic Discrete
Variable Automatic Computer (EDVAC) in
1945 with a memory to hold both a stored program and data. This "stored
program" technique, together with "conditional control transfer,"
which allowed the computer to be stopped at any point and then resumed, allowed
for greater versatility in computer programming. The key element of the von
Neumann architecture was the central processing unit, which allowed all computer
functions to be coordinated through a single source. In 1951, the
UNIVAC I
(Universal Automatic Computer), built by Remington Rand, became one of the first
commercially available computers to take advantage of these advances. Both the
U.S. Census Bureau and
General Electric owned UNIVACs. One of
UNIVAC's impressive early achievements was predicting the winner of the 1952
presidential election,
Dwight D.
Eisenhower.
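To make the stored-program idea concrete, here is a toy fetch-decode-execute loop written in Python as a modern teaching sketch (nothing like it ran on EDVAC or UNIVAC, and the instruction names are invented for illustration). Instructions and data sit in the same memory, a single program counter coordinates every step, and a conditional jump plays the role of von Neumann's "conditional control transfer."

    # Toy stored-program machine: instructions and data share one memory.
    memory = [
        ("LOAD", 9),         # copy the value at address 9 into the accumulator
        ("ADD", 10),         # add the value at address 10
        ("JUMP_IF_NEG", 5),  # conditional control transfer: branch if negative
        ("STORE", 11),       # write the accumulator back to address 11
        ("HALT", None),
        ("HALT", None),      # address 5: alternative stopping point
        None, None, None,    # unused cells
        7,                   # address 9: data
        -12,                 # address 10: data
        0,                   # address 11: result would be stored here
    ]

    accumulator = 0
    pc = 0  # program counter: the single source coordinating all functions

    while True:
        op, arg = memory[pc]              # fetch the next instruction
        pc += 1
        if op == "LOAD":                  # decode and execute it
            accumulator = memory[arg]
        elif op == "ADD":
            accumulator += memory[arg]
        elif op == "STORE":
            memory[arg] = accumulator
        elif op == "JUMP_IF_NEG":
            if accumulator < 0:
                pc = arg
        elif op == "HALT":
            break

    print(accumulator)  # prints -5: 7 + (-12), and the branch was taken

Because the program itself lives in ordinary memory, swapping those first few cells for different instructions turns the same hardware to a completely different job, which is exactly the versatility the stored-program design bought.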
First generation
computers were characterized by the fact that operating instructions were
made-to-order for the specific task for which the computer was to be used. Each
computer had a different binary-coded program called a machine language that
told it how to operate. This made the computer difficult to program and limited
its versatility and speed. Other distinctive features of first generation
computers were the use of vacuum tubes (responsible
for their breathtaking size) and magnetic drums for data storage.
Second Generation Computers (1956-1963)
The invention of the transistor in 1947 greatly changed the computer's development. The transistor replaced the large, cumbersome vacuum tube in televisions, radios, and computers. As a result, the size of electronic machinery has been shrinking ever since. The transistor was at work in the computer by 1956. Coupled with early advances in magnetic-core memory, transistors led to second generation computers that were smaller, faster, more reliable, and more energy-efficient than their predecessors.
The first large-scale machines to take advantage of this transistor technology were early supercomputers, Stretch by IBM and LARC by Sperry-Rand. These computers, both developed for atomic energy laboratories, could handle an enormous amount of data, a capability much in demand by atomic scientists. The machines were costly, however, and tended to be too powerful for the business sector's computing needs, thereby limiting their attractiveness. Only two LARCs were ever installed: one at the Lawrence Radiation Laboratory in Livermore, California, for which the computer was named (Livermore Atomic Research Computer), and the other at the U.S. Navy Research and Development Center in Washington, D.C.
Second generation computers replaced machine language with assembly language, allowing abbreviated programming codes to replace long, difficult binary codes.
Throughout the early 1960's, there were a number of commercially successful
second generation computers used in business, universities, and government from
companies such as Burroughs, Control Data,
Honeywell, IBM, Sperry-Rand, and
others. These second generation computers were also of solid state design, and
contained transistors in place of vacuum tubes. They also contained all the
components we associate with the modern day computer: printers, tape storage,
disk storage, memory, operating systems, and stored programs. One important
example was the IBM 1401, which was widely adopted throughout industry,
and is considered by many to be the Model T of the
computer industry.
By 1965, most large businesses routinely processed financial information using
second generation computers.
It was the stored program and programming language that gave computers the
flexibility to finally be cost effective and productive for business use. The
stored program concept meant that instructions to run a computer for a specific
function (known as a program) were held inside the computer's memory, and could
quickly be replaced by a different set of instructions for a different function.
A computer could print customer invoices and minutes later design products or
calculate paychecks. More sophisticated high-level languages such as
COBOL (Common Business-Oriented Language)
and FORTRAN
(Formula Translator) came into common use during this time and remain in use to
this day. These languages replaced cryptic binary machine code with
words, sentences, and mathematical formulas, making it much easier to program a
computer. New types of careers (programmer, analyst, and computer systems
expert) and the entire software
industry began with second generation computers.
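As a rough illustration of what those languages offered, here is the kind of payroll formula this paragraph alludes to, written in words and arithmetic rather than binary machine code. Python serves only as a modern stand-in, and the overtime rule and figures are invented for illustration; a second generation programmer would have expressed something similar in COBOL or FORTRAN.

    # Gross pay as a readable formula instead of hand-coded binary instructions.
    # (Illustrative assumptions: a 40-hour threshold and time-and-a-half overtime.)
    def gross_pay(hours_worked, hourly_rate, overtime_threshold=40.0):
        regular = min(hours_worked, overtime_threshold) * hourly_rate
        overtime = max(hours_worked - overtime_threshold, 0.0) * hourly_rate * 1.5
        return regular + overtime

    print(gross_pay(45, 20.0))  # 800.0 regular + 150.0 overtime = 950.0

Swap this program out of memory for another and the same machine can move on to printing invoices, just as the stored program concept promised.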
Third Generation Computers (1964-1971)
Though transistors were clearly an improvement over the vacuum tube, they still generated a great deal of heat, which damaged the computer's sensitive internal parts. The integrated circuit eliminated this problem. Jack Kilby, an engineer with Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined several electronic components onto a single small chip of semiconductor material. Scientists later managed to fit even more components onto a single chip. As a result, computers became ever smaller as more components were squeezed onto the chip.
Another third-generation development was the operating system, which allowed machines to run many different programs at once with a central program that monitored and coordinated the computer's memory.
Fourth Generation (1971-Present)
After the integrated circuit, the only place to go was down - in size, that
is. Large scale integration (LSI) could fit hundreds of components onto one
chip. By the 1980's, very large scale integration (VLSI) squeezed hundreds of
thousands of components onto a chip. Ultra-large scale integration (ULSI)
increased that number into the millions. The ability to fit so much onto an area
about half the size of a U.S. dime helped diminish the size and price of
computers. It also increased their power, efficiency and reliability. The
Intel 4004 chip, developed in 1971, took
the integrated circuit one step further by placing a computer's entire central
processing unit on a single minuscule chip. Whereas previously the integrated circuit had had to be
manufactured to fit a special purpose, now one microprocessor could be
manufactured and then programmed to meet any number of demands. Soon everyday
household items such as microwave
ovens, television sets and automobiles
with electronic fuel
injection incorporated microprocessors.
Such condensed circuitry allowed everyday people to harness a computer's power.
They were no longer developed exclusively for large business or government
contracts. By the mid-1970's, computer manufacturers sought to bring computers
to general consumers. These microcomputers came complete with user-friendly
software packages that offered even non-technical users an array of
applications, most popularly word processing and spreadsheet programs. Pioneers
in this field were Commodore,
Radio Shack and
Apple Computer. In the early
1980's, arcade
video games such as
Pac-Man and home video game
systems such as the Atari 2600 ignited consumer interest in more
sophisticated, programmable home computers.
In 1981, IBM introduced its personal computer (PC) for use in the home,
office and schools. The 1980's saw an expansion in computer use in all three
arenas as clones of the IBM PC made the personal computer even more affordable.
The number of personal computers in use more than doubled from 2 million in 1981
to 5.5 million in 1982. Ten years later, 65 million PCs were being used.
Computers continued their trend toward a smaller size, working their way down
from desktop to laptop computers (which could fit inside a briefcase) to palmtop computers
(able to fit inside a breast pocket). In direct competition with IBM's PC was
Apple's Macintosh line, introduced in 1984. Notable for its user-friendly
design, the Macintosh offered an operating system that allowed users to move
screen icons instead of typing instructions. Users controlled the screen cursor
using a mouse, a device that translated the movement of one's hand into the
movement of the cursor on the screen.
As computers became more widespread in the workplace, new ways to harness
their potential developed. As smaller computers became more powerful, they could
be linked together, or networked, to share memory space, software, and information,
and to communicate with each other. As opposed to a mainframe computer, which was
one powerful computer that shared time with many terminals for many
applications, networked computers allowed individual computers to form
electronic co-ops. Using either direct wiring, called a
Local Area
Network (LAN), or telephone lines, these networks could reach enormous
proportions. A global web of computer circuitry, the
Internet, for example, links
computers worldwide into a single network of information. During the 1992 U.S.
presidential election, vice-presidential candidate
Al
Gore promised to make the development of this so-called "information
superhighway" an administrative priority. Though the possibilities
envisioned by Gore and others for such a large network are often years (if not
decades) away from realization, the most popular use today for computer networks
such as the Internet is electronic mail, or E-mail, which allows users to type
in a computer address and send messages through networked terminals across the
office or across the world.
Fifth Generation (Present and Beyond)
Defining the fifth generation of computers is somewhat difficult because the
field is in its infancy. The most famous example of a fifth generation computer
is the fictional HAL 9000
from Arthur
C. Clarke's novel, 2001:
A Space Odyssey. HAL performed all of the functions currently
envisioned for real-life fifth generation computers. With
artificial
intelligence, HAL could reason well enough to hold conversations with its
human operators, use visual input, and learn from its own experiences.
(Unfortunately, HAL was a little too human and had a psychotic breakdown,
commandeering a spaceship and killing most of the humans on board.)
Though the wayward HAL 9000 may be far from the reach of real-life computer
designers, many of its functions are not. Using recent engineering advances,
computers may be able to accept spoken
word instructions and imitate human reasoning. The ability to translate a
foreign language is also a major goal of fifth generation computers. This feat
seemed a simple objective at first, but proved much more difficult when
programmers realized that human understanding relies as much on context and
meaning as it does on the simple translation of words.
Many advances in the science of computer design and technology are coming
together to enable the creation of fifth-generation computers. One such
engineering advance is parallel processing, which replaces von Neumann's
single central processing unit design with a system that harnesses the power of
many CPUs working as one. Another advance is
superconductor
technology, which allows the flow of electricity with little or no resistance,
greatly improving the speed of information flow. Computers today have some
attributes of fifth generation computers. For example, expert systems assist
doctors in making diagnoses by applying the problem-solving steps a doctor might
use in assessing a patient's needs. It will take several more years of
development before expert systems are in widespread use.
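To give a feel for how such an expert system works, here is a minimal rule-based sketch in Python. The rules and symptoms are invented purely for illustration (they are not medical advice or any real product); the point is only the mechanism of matching if-then rules against reported facts.

    # A toy expert system: apply every if-then rule whose conditions are all met.
    RULES = [
        ({"fever", "cough", "aches"}, "influenza-like illness"),
        ({"sneezing", "runny nose"}, "common cold"),
        ({"headache", "light sensitivity"}, "possible migraine"),
    ]

    def diagnose(symptoms):
        findings = set(symptoms)
        return [conclusion for conditions, conclusion in RULES
                if conditions <= findings]  # rule fires only if all conditions hold

    print(diagnose(["fever", "cough", "aches", "sneezing"]))
    # ['influenza-like illness']

Real diagnostic expert systems work on the same principle, but with thousands of rules and additional machinery for weighing uncertain or conflicting evidence.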