History of Computers
Pre-20th century
Devices have been used to aid computation for thousands of years, mostly
using one-to-one correspondence with fingers. The earliest counting device
was most likely a form of tally stick. Later record-keeping aids throughout the
Fertile Crescent included calculi (clay spheres, cones, etc.), which represented
counts of items, likely livestock or grains, sealed in hollow unbaked clay
containers.[a][4] The use of counting rods is another such example.
The abacus was initially used for arithmetic tasks. The Roman abacus was
developed from devices used in Babylonia as early as 2400 BCE. Since then,
many other forms of reckoning boards or tables have been invented. In a
medieval European counting house, a checkered cloth would be placed on a
table, and markers moved around on it according to certain rules, as an aid
to calculating sums of money.[5]
(A slide rule)
The slide rule was invented around 1620–1630, by the English clergyman
William Oughtred, shortly after the publication of the concept of the
logarithm. It is a hand-operated analog computer for doing multiplication and
division. As slide rule development progressed, added scales provided
reciprocals, squares and square roots, cubes and cube roots, as well as
transcendental functions such as logarithms and exponentials, circular and
hyperbolic trigonometry and other functions. Slide rules with special scales
are still used for quick performance of routine calculations, such as the E6B
circular slide rule used for time and distance calculations on light aircraft.
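The principle behind the instrument is that multiplication can be reduced to the addition of logarithms: the scales are marked logarithmically, so sliding one scale against the other adds the logarithms of the two factors and the product is read off directly. As a rough illustration of that idea only (not a model of any particular instrument), the following Python sketch mimics the procedure; the function name and values are purely illustrative:

import math

def slide_rule_multiply(a, b):
    # Each factor corresponds to a physical length proportional to its logarithm.
    length_a = math.log10(a)
    length_b = math.log10(b)
    # Sliding one scale along the other adds the two lengths ...
    combined = length_a + length_b
    # ... and the product is read off the logarithmic scale at that position.
    return 10 ** combined

print(slide_rule_multiply(2.5, 4.0))  # approximately 10.0, limited only by reading precision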
First computer
(Charles Babbage)
(A diagram of a portion of Babbage’s Difference Engine)
Babbage’s analytical engine was about a century ahead of its time. All the parts for his
machine had to be made by hand – this was a major problem for a device
with thousands of parts. Eventually, the project was dissolved with the
decision of the British Government to cease funding. Babbage’s failure to
complete the analytical engine can be chiefly attributed to political and
financial difficulties as well as his desire to develop an increasingly
sophisticated computer and to move ahead faster than anyone else could
follow. Nevertheless, his son, Henry Babbage, completed a simplified version
of the analytical engine’s computing unit (the mill) in 1888. He gave a
successful demonstration of its use in computing tables in 1906.
Analog computers
During the first half of the 20th century, many scientific computing needs
were met by increasingly sophisticated analog computers, which used a
direct mechanical or electrical model of the problem as a basis for
computation. However, these were not programmable and generally lacked
the versatility and accuracy of modern digital computers.[34] The first
modern analog computer was a tide-predicting machine, invented by Sir
William Thomson (later to become Lord Kelvin) in 1872. The differential
analyser, a mechanical analog computer designed to solve differential
equations by integration using wheel-and-disc mechanisms, was
conceptualized in 1876 by James Thomson, the elder brother of the more
famous Sir William Thomson.[16]
The art of mechanical analog computing reached its zenith with the
differential analyzer, built by H. L. Hazen and Vannevar Bush at MIT starting
in 1927. This built on the mechanical integrators of James Thomson and the
torque amplifiers invented by H. W. Nieman. A dozen of these devices were
built before their obsolescence became obvious. By the 1950s, the success
of digital electronic computers had spelled the end for most analog
computing machines, but analog computers remained in use through the 1950s in
some specialized applications such as education (slide rule) and aircraft
(control systems).
Digital computers
Electromechanical
In 1941, Konrad Zuse followed up his earlier machine with the Z3, the world’s first
working electromechanical, programmable, fully automatic digital computer.
[38][39] The Z3 was built with 2000 relays, implementing a 22 bit word
length that operated at a clock frequency of about 5–10 Hz.[40] Program
code was supplied on punched film while data could be stored in 64 words of
memory or supplied from the keyboard. It was quite similar to modern
machines in some respects, pioneering numerous advances such as floating-
point numbers. Using a binary system, rather than the harder-to-implement
decimal system used in Charles Babbage’s earlier design, meant that Zuse’s
machines were easier to build and potentially more reliable, given the
technologies available at that time.[41] The Z3 was not itself a universal
computer but could be extended to be Turing complete.[42][43]
Zuse’s next computer, the Z4, became the world’s first commercial
computer; after initial delay due to the Second World War, it was completed
in 1950 and delivered to the ETH Zurich.[44] The computer was
manufactured by Zuse’s own company, Zuse KG, which was founded in Berlin in
1941 as the first company with the sole purpose of developing computers.[44]
During World War II, the British code-breakers at Bletchley Park achieved a
number of successes at breaking encrypted German military
communications. The German encryption machine, Enigma, was first
attacked with the help of the electro-mechanical bombes which were often
run by women.[48][49] To crack the more sophisticated German Lorenz SZ
40/42 machine, used for high-level Army communications, Max Newman and
his colleagues commissioned Tommy Flowers to build the Colossus.[47] He spent
eleven months from early February 1943 designing and building the first
Colossus.[50] After a functional test in December 1943, Colossus was
shipped to Bletchley Park, where it was delivered on 18 January 1944[51]
and attacked its first message on 5 February.[47]
The ENIAC[54] (Electronic Numerical Integrator and Computer) was the first
electronic programmable computer built in the U.S. Although the ENIAC was
similar to the Colossus, it was much faster, more flexible, and it was Turing-
complete. As with the Colossus, a “program” on the ENIAC was defined by the
states of its patch cables and switches, a far cry from the stored program
electronic machines that came later. Once a program was written, it had to
be mechanically set into the machine with manual resetting of plugs and
switches. The programmers of the ENIAC were six women, often known
collectively as the “ENIAC girls”.[55][56]
Modern computers
The principle of the modern computer was proposed by Alan Turing in his
seminal 1936 paper,[58] On Computable Numbers. Turing proposed a simple
device that he called “Universal Computing machine” and that is now known
as a universal Turing machine. He proved that such a machine is capable of
computing anything that is computable by executing instructions (program)
stored on tape, allowing the machine to be programmable. The fundamental
concept of Turing’s design is the stored program, where all the instructions
for computing are stored in memory. Von Neumann acknowledged that the
central concept of the modern computer was due to this paper.[59] Turing
machines are to this day a central object of study in theory of computation.
Except for the limitations imposed by their finite memory stores, modern
computers are said to be Turing-complete, which is to say, they have
algorithm execution capability equivalent to a universal Turing machine.
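To make the idea concrete, a Turing machine can be simulated in a few lines of code: the “program” is simply a transition table held in memory, and the machine repeatedly reads a symbol, writes a symbol, moves its head, and changes state until it halts. The sketch below is a minimal illustration in Python, not Turing’s own formulation; the names and the example program are hypothetical:

def run_turing_machine(program, tape, state="start", blank="_"):
    # program maps (state, symbol) -> (new_state, symbol_to_write, head_move)
    cells = dict(enumerate(tape))  # sparse tape indexed by integer position
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        state, write, move = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Example program: flip every bit of a binary string, then halt at the first blank.
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_turing_machine(flip_bits, "1011"))  # prints 0100_ (trailing blank included)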
Stored programs
The Manchester Baby was the world’s first stored-program computer. It was
built at the University of Manchester in England by Frederic C. Williams, Tom
Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[60] It was
designed as a testbed for the Williams tube, the first random-access digital
storage device.[61] Although the computer was described as “small and
primitive” by a 1998 retrospective, it was the first working machine to
contain all of the elements essential to a modern electronic computer.[62] As
soon as the Baby had demonstrated the feasibility of its design, a project
began at the university to develop it into a practically useful computer, the
Manchester Mark 1.
The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the
world’s first commercially available general-purpose computer.[63] Built by
Ferranti, it was delivered to the University of Manchester in February 1951.
At least seven of these later machines were delivered between 1953 and
1957, one of them to Shell labs in Amsterdam.[64] In October 1947 the
directors of British catering company J. Lyons & Company decided to take an
active role in promoting the commercial development of computers. Lyons’s
LEO I computer, modelled closely on the Cambridge EDSAC of 1949, became
operational in April 1951[65] and ran the world’s first routine office computer
job.
Transistors
Integrated circuits
The next great advance in computing power came with the advent of the
integrated circuit (IC). The idea of the integrated circuit was first conceived
by a radar scientist working for the Royal Radar Establishment of the Ministry
of Defence, Geoffrey W.A. Dummer. Dummer presented the first public
description of an integrated circuit at the Symposium on Progress in Quality
Electronic Components in Washington, D.C., on 7 May 1952.[88]
The first working ICs were invented by Jack Kilby at Texas Instruments and
Robert Noyce at Fairchild Semiconductor.[89] Kilby recorded his initial ideas
concerning the integrated circuit in July 1958, successfully demonstrating the
first working integrated example on 12 September 1958.[90] In his patent
application of 6 February 1959, Kilby described his new device as “a body of
semiconductor material ... wherein all the components of the electronic
circuit are completely integrated”.[91][92] However, Kilby’s invention was a
hybrid integrated circuit (hybrid IC), rather than a monolithic integrated
circuit (IC) chip.[93] Kilby’s IC had external wire connections, which made it
difficult to mass-produce.[94]
Noyce also came up with his own idea of an integrated circuit half a year
later than Kilby.[95] Noyce’s invention was the first true monolithic IC chip.
[96][94] His chip solved many practical problems that Kilby’s had not.
Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby’s
chip was made of germanium. Noyce’s monolithic IC was fabricated using the
planar process, developed by his colleague Jean Hoerni in early 1959. In turn,
the planar process was based on Carl Frosch and Lincoln Derick’s work on
semiconductor surface passivation by silicon dioxide.[97][98][99][100][101]
[102]
The development of the MOS integrated circuit led to the invention of the
microprocessor,[107][108] and heralded an explosion in the commercial and
personal use of computers. While the subject of exactly which device was the
first microprocessor is contentious, partly due to lack of agreement on the
exact definition of the term “microprocessor”, it is largely undisputed that
the first single-chip microprocessor was the Intel 4004,[109] designed and
realized by Federico Faggin with his silicon-gate MOS IC technology,[107]
along with Ted Hoff, Masatoshi Shima and Stanley Mazor at Intel.[b][111] In
the early 1970s, MOS IC technology enabled the integration of more than
10,000 transistors on a single chip.[81]
Mobile computers
The first mobile computers were heavy and ran from mains power. The 50 lb
(23 kg) IBM 5100 was an early example. Later portables such as the Osborne
1 and Compaq Portable were considerably lighter but still needed to be
plugged in. The first laptops, such as the Grid Compass, removed this
requirement by incorporating batteries – and with the continued
miniaturization of computing resources and advancements in portable
battery life, portable computers grew in popularity in the 2000s.[113] The
same developments allowed manufacturers to integrate computing
resources into cellular mobile phones by the early 2000s.