Generations of Computers
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. A magnetic drum, also referred to simply as a drum, is a metal cylinder coated with magnetic iron-oxide material on which data and programs can be stored. Magnetic drums were once used as a primary storage device but have since been implemented as auxiliary storage devices.
They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions. First-generation computers relied on machine language to perform operations, and they could only solve one problem at a time. Machine languages are the only languages understood by computers. While easily understood by computers, machine languages are almost impossible for humans to use because they consist entirely of numbers. Programmers, therefore, use either a high-level programming language or an assembly language. An assembly language contains the same instructions as a machine language, but the instructions and variables have names instead of being just numbers.
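The relationship between the two can be sketched with a toy example. The following Python sketch assumes a hypothetical machine with three made-up instructions and opcode numbers; it only illustrates the idea that an assembler translates human-readable names into the numbers a CPU understands:

```python
# Toy illustration (hypothetical 3-instruction machine): the same
# program expressed as assembly mnemonics and as machine-language numbers.
OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3}  # the assembler's lookup table

def assemble(program):
    """Translate (mnemonic, operand) pairs into numeric machine code."""
    return [(OPCODES[op], operand) for op, operand in program]

asm = [("LOAD", 10), ("ADD", 11), ("STORE", 12)]  # readable for humans
machine = assemble(asm)                           # what the CPU executes
print(machine)  # → [(1, 10), (2, 11), (3, 12)]
```

The assembly form carries the same instructions as the machine form, only with names in place of numbers, which is exactly the distinction drawn above.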
Every CPU has its own unique machine language. Programs must therefore be rewritten or recompiled to run on different types of computers. Input was based on punched cards and paper tape, and output was displayed on printouts.
The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
The first computers of this generation were developed for the atomic energy industry.
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
Silicon is the basic material used to make computer chips, transistors, silicon diodes and other electronic circuits and switching devices because its atomic structure makes the element an ideal semiconductor. Silicon is commonly doped, or mixed, with other elements, such as boron, phosphorus and arsenic, to alter its conductive properties.
Two values describe the power of a microprocessor: its word size in bits and its clock speed. In both cases, the higher the value, the more powerful the CPU. For example, a 32-bit microprocessor that runs at 50 MHz is more powerful than a 16-bit microprocessor that runs at 25 MHz.
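The comparison can be made concrete with a deliberately naive model, assuming (purely for illustration) that a CPU moves its full word width of bits on every clock cycle:

```python
# Naive throughput model (illustration only): assume the CPU handles
# `width_bits` bits per clock cycle, so raw capacity = width * clock.
def raw_capacity(width_bits, clock_mhz):
    return width_bits * clock_mhz * 1_000_000  # bits per second

older = raw_capacity(16, 25)  # 16-bit CPU at 25 MHz
newer = raw_capacity(32, 50)  # 32-bit CPU at 50 MHz
print(newer // older)  # → 4: doubling both width and clock quadruples capacity
```

Real CPU performance depends on much more than these two numbers, but the model captures why both values matter.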
What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer - from the central processing unit and memory to input/output controls - on a single chip.
Among these components is the control unit, which extracts instructions from memory and decodes and executes them, calling on the ALU (arithmetic logic unit) when necessary.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
Currently, no computers exhibit full artificial intelligence (that is, are able to simulate human behavior). The greatest advances have occurred in the field of game playing. The best computer chess programs are now capable of beating humans. In May 1997, an IBM supercomputer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.
In the area of robotics, computers are now widely used in assembly plants, but they are capable only of very limited tasks. Robots have great difficulty identifying objects based on appearance or feel, and they still move and handle objects clumsily.
There are also voice recognition systems that can convert spoken sounds into written words, but they do not understand what they are writing; they simply take dictation. Even these systems are quite limited -- you must speak slowly and distinctly.
In the early 1980s, expert systems were believed to represent the future of artificial intelligence and of computers in general. To date, however, they have not lived up to expectations. Many expert systems help human experts in such fields as medicine and engineering, but they are very expensive to produce and are helpful only in special situations.
Today, the hottest area of artificial intelligence is neural networks, which are proving successful in a number of disciplines such as voice recognition and natural-language processing.
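The basic unit of such a network can be shown in a few lines. This is a minimal sketch of a single artificial neuron (a perceptron) learning the logical AND function from examples; real speech and language networks contain vastly more neurons, but the learn-from-examples principle is the same:

```python
# A single neuron: weighted sum of inputs, then a threshold decision.
def step(x):
    return 1 if x >= 0 else 0

weights, bias, lr = [0.0, 0.0], 0.0, 0.1
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # AND truth table

for _ in range(20):  # repeated passes over the training examples
    for (x1, x2), target in data:
        out = step(weights[0] * x1 + weights[1] * x2 + bias)
        err = target - out          # how wrong was the neuron?
        weights[0] += lr * err * x1  # nudge weights toward the answer
        weights[1] += lr * err * x2
        bias += lr * err

print([step(weights[0] * a + weights[1] * b + bias) for (a, b), _ in data])
# → [0, 0, 0, 1]
```

After training, the neuron reproduces the AND table it was shown; it "learned" the rule from data rather than being programmed with it.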
There are several programming languages that are known as AI languages because they are used almost exclusively for AI applications. The two most common are LISP and Prolog.
Voice Recognition
The field of computer science that deals with designing computer systems that can recognize spoken words. Note that voice recognition implies only that the computer can take dictation, not that it understands what is being said. Comprehending human languages falls under a different field of computer science called natural language processing.
A number of voice recognition systems are available on the market. The most powerful can recognize thousands of words. However, they generally require an extended training session during which the computer system becomes accustomed to a particular voice and accent. Such systems are said to be speaker dependent.
Many systems also require that the speaker speak slowly and distinctly and separate each word with a short pause. These systems are called discrete speech systems. Recently, great strides have been made in continuous speech systems -- voice recognition systems that allow you to speak naturally. There are now several continuous-speech systems available for personal computers.
Most computers have just one CPU, but some models have several. There are even computers with thousands of CPUs. With single-CPU computers, it is possible to perform parallel processing by connecting the computers in a network. However, this type of parallel processing requires very sophisticated software called distributed processing software.
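The core idea of parallel processing is to split one job into pieces, work on the pieces at the same time, and combine the results. A minimal sketch in Python, using local worker processes to stand in for the separate networked computers (real distributed processing software coordinates machines over a network):

```python
# Parallel processing sketch: split one sum across four workers,
# then combine the partial results into a single answer.
from multiprocessing import Pool

def partial_sum(chunk):
    """The work each 'computer' does on its share of the data."""
    return sum(chunk)

if __name__ == "__main__":
    numbers = list(range(1_000_001))            # sum 0..1,000,000
    chunks = [numbers[i::4] for i in range(4)]  # split the work four ways
    with Pool(4) as pool:
        total = sum(pool.map(partial_sum, chunks))  # combine the results
    print(total)  # → 500000500000
```

Each worker only sees a quarter of the data; the coordinating program is what turns four partial answers into one, which is the role the distributed processing software plays at network scale.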
Qubits do not rely on the traditional binary nature of computing. Traditional computers encode information into bits using binary numbers, either a 0 or a 1, and can only do calculations on one set of numbers at once. Quantum computers instead encode information as a series of quantum-mechanical states, such as spin directions of electrons or polarization orientations of a photon. Such a state might represent a 1 or a 0, might represent a combination of the two, might represent a number expressing that the state of the qubit is somewhere between 1 and 0, or might represent a superposition of many different numbers at once. A quantum computer can do an arbitrary reversible classical computation on all the numbers simultaneously, which a binary system cannot do, and it also has some ability to produce interference between the various numbers. By doing a computation on many different numbers at once, then interfering the results to get a single answer, a quantum computer has the potential to be much more powerful than a classical computer of the same size. In using only a single processing unit, a quantum computer can naturally perform myriad operations in parallel.
Quantum computing is not well suited for tasks such as word processing and email, but it is ideal for tasks such as cryptography and modeling and indexing very large databases.
Although research in this field dates back to Richard P. Feynman's classic talk in 1959, the term nanotechnology was first coined by K. Eric Drexler in 1986 in the book Engines of Creation.
In the popular press, the term nanotechnology is sometimes used to refer to any sub-micron process, including lithography. Because of this, many scientists are beginning to use the term molecular nanotechnology when talking about true nanotechnology at the molecular level.
The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.
Here natural language means a human language. For example, English, French, and Chinese are natural languages. Computer languages, such as FORTRAN and C, are not.