History of Computers
Pre-20th century
Devices have been used to aid computation for thousands of years, mostly using one-to-one
correspondence with fingers. The earliest counting device was most likely a form of tally stick.
Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones,
etc.) which represented counts of items, likely livestock or grains, sealed in hollow unbaked clay
containers.[a][4] The use of counting rods is one example.
The abacus was initially used for arithmetic tasks. The Roman abacus was developed from
devices used in Babylonia as early as 2400 BCE. Since then, many other forms of reckoning
boards or tables have been invented. In a medieval European counting house, a checkered cloth
would be placed on a table, and markers moved around on it according to certain rules, as an aid
to calculating sums of money.[5]
[Image: The Antikythera mechanism, dating back to ancient Greece c. 150–100 BCE, an early analog computing device.]
The Antikythera mechanism is believed to be the earliest known mechanical analog computer,
according to Derek J. de Solla Price.[6] It was designed to calculate astronomical positions. It was
discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between
Kythera and Crete, and has been dated to c. 100 BCE. Devices of comparable complexity to the
Antikythera mechanism would not reappear until the fourteenth century.[7]
Many mechanical aids to calculation and measurement were constructed for astronomical and
navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early
11th century.[8] The astrolabe was invented in the Hellenistic world in either the 1st or 2nd
centuries BCE and is often attributed to Hipparchus. A combination of the planisphere and
dioptra, the astrolabe was effectively an analog computer capable of working out several
different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical
calendar computer[9][10] and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235.[11]
Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe,[12] an
early fixed-wired knowledge processing machine[13] with a gear train and gear-wheels,[14] c. 1000
AD.
The sector, a calculating instrument used for solving problems in proportion, trigonometry,
multiplication and division, and for various functions, such as squares and cube roots, was
developed in the late 16th century and found application in gunnery, surveying and navigation.
The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it
with a mechanical linkage.
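The principle the planimeter exploits can be sketched in a few lines of code: by Green's theorem, the area of a closed figure equals a line integral around its boundary, which the instrument's linkage accumulates mechanically as the figure is traced. The following Python sketch is illustrative only (the function name and the sampling of the boundary are our own choices); it approximates that integral with the shoelace formula over sampled boundary points.

    import math

    def enclosed_area(points):
        """Approximate the area enclosed by a closed curve from sampled
        boundary points (x, y), using the shoelace formula -- the same
        line-integral principle a planimeter realizes mechanically."""
        n = len(points)
        twice_area = 0.0
        for i in range(n):
            x0, y0 = points[i]
            x1, y1 = points[(i + 1) % n]  # wrap around to close the curve
            twice_area += x0 * y1 - x1 * y0
        return abs(twice_area) / 2.0

    # "Tracing" a unit circle with 1000 sample points recovers pi.
    circle = [(math.cos(t), math.sin(t))
              for t in (2 * math.pi * k / 1000 for k in range(1000))]
    print(enclosed_area(circle))  # ~3.14159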
The slide rule was invented around 1620–1630 by the English clergyman William Oughtred,
shortly after the publication of the concept of the logarithm. It is a hand-operated analog
computer for doing multiplication and division. As slide rule development progressed, added
scales provided reciprocals, squares and square roots, cubes and cube roots, as well as
transcendental functions such as logarithms and exponentials, circular and hyperbolic
trigonometry and other functions. Slide rules with special scales are still used for quick
performance of routine calculations, such as the E6B circular slide rule used for time and
distance calculations on light aircraft.
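The underlying idea is simple to state: because log(ab) = log a + log b, sliding one logarithmic scale along another turns multiplication into the addition of lengths. A minimal Python sketch of the principle (of the mathematics, not of any particular slide rule):

    import math

    def slide_rule_multiply(a, b):
        """Multiply by adding logarithms, as aligning the scales of a
        slide rule does mechanically: log(a*b) = log(a) + log(b)."""
        return 10 ** (math.log10(a) + math.log10(b))

    def slide_rule_divide(a, b):
        """Divide by subtracting logarithms: log(a/b) = log(a) - log(b)."""
        return 10 ** (math.log10(a) - math.log10(b))

    print(slide_rule_multiply(2.0, 3.0))  # ~6.0
    print(slide_rule_divide(6.0, 3.0))    # ~2.0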
In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that
could write holding a quill pen. By switching the number and order of its internal wheels
different letters, and hence different messages, could be produced. In effect, it could be
mechanically "programmed" to read instructions. Along with two other complex machines, the
doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland, and still operates.[15]
The differential analyser, a mechanical analog computer designed to solve differential equations
by integration, used wheel-and-disc mechanisms to perform the integration. In 1876, James
Thomson had already discussed the possible construction of such calculators, but he had been
stymied by the limited output torque of the ball-and-disc integrators.[16] In a differential analyzer,
the output of one integrator drove the input of the next integrator, or a graphing output. The
torque amplifier was the advance that allowed these machines to work. Starting in the 1920s,
Vannevar Bush and others developed mechanical differential analyzers.
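As a rough illustration of the chaining described above, the following Python sketch solves y'' = −y (so y = cos t) with two numerical "integrators", the output of each driving the input of the next. The step size and variable names are illustrative choices, not a model of any particular machine.

    import math

    # Integrator 1 turns y'' into y', integrator 2 turns y' into y, and
    # y is wired back, negated, as the input to integrator 1.
    dt = 0.001
    y, y_prime = 1.0, 0.0                # initial conditions y(0) = 1, y'(0) = 0
    for _ in range(int(math.pi / dt)):   # integrate out to t = pi
        y_double_prime = -y              # the feedback "wiring"
        y_prime += y_double_prime * dt   # integrator 1: y'' -> y'
        y += y_prime * dt                # integrator 2: y' -> y
    print(round(y, 3))                   # ~ -1.0, i.e. cos(pi)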
In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a series of
advanced analog machines that could solve real and complex roots of polynomials,[17][18][19][20]
which were published in 1901 by the Paris Academy of Sciences.[21]
First computer
[Image: Charles Babbage.]
[Image: A diagram of a portion of Babbage's Difference Engine.]
Charles Babbage, an English mechanical engineer and polymath, originated the concept of a
programmable computer. Considered the "father of the computer",[22] he conceptualized and
invented the first mechanical computer in the early 19th century.
After working on his difference engine, designed to aid in navigational calculations, he
announced his invention in 1822 in a paper to the Royal Astronomical Society, titled "Note on
the application of machinery to the computation of astronomical and mathematical tables".[23] In
1833 he realized that a much more general design, an analytical engine, was possible. The input
of programs and data was to be provided to the machine via punched cards, a method being used
at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would
have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto
cards to be read in later. The engine would incorporate an arithmetic logic unit, control flow in
the form of conditional branching and loops, and integrated memory, making it the first design
for a general-purpose computer that could be described in modern terms as Turing-complete.[24][25]
The machine was about a century ahead of its time. All the parts for his machine had to be made
by hand – this was a major problem for a device with thousands of parts. Eventually, the project
was dissolved with the decision of the British Government to cease funding. Babbage's failure to
complete the analytical engine can be chiefly attributed to political and financial difficulties as
well as his desire to develop an increasingly sophisticated computer and to move ahead faster
than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified
version of the analytical engine's computing unit (the mill) in 1888. He gave a successful
demonstration of its use in computing tables in 1906.
In his work Essays on Automatics, published in 1914, Leonardo Torres Quevedo wrote a brief
history of Babbage's efforts at constructing a mechanical Difference Engine and Analytical
Engine. The paper contains a design of a machine capable of calculating formulas such as
a^x(y − z)^2 for a sequence of sets of values. The whole machine was to be controlled by a read-only
program, which was complete with provisions for conditional branching. He also introduced the
idea of floating-point arithmetic.[26][27][28] In 1920, to celebrate the 100th anniversary of the
invention of the arithmometer, Torres presented in Paris the Electromechanical Arithmometer,
which allowed a user to input arithmetic problems through a keyboard, and computed and
printed the results,[29][30][31][32] demonstrating the feasibility of an electromechanical analytical
engine.[33]
Analog computers
During the first half of the 20th century, many scientific computing needs were met by
increasingly sophisticated analog computers, which used a direct mechanical or electrical model
of the problem as a basis for computation. However, these were not programmable and generally
lacked the versatility and accuracy of modern digital computers.[34] The first modern analog
computer was a tide-predicting machine, invented by Sir William Thomson (later to become
Lord Kelvin) in 1872. The differential analyser, a mechanical analog computer designed to solve
differential equations by integration using wheel-and-disc mechanisms, was conceptualized in
1876 by James Thomson, the elder brother of the more famous Sir William Thomson.[16]
The art of mechanical analog computing reached its zenith with the differential analyzer, built by
H. L. Hazen and Vannevar Bush at MIT starting in 1927. This built on the mechanical
integrators of James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of
these devices were built before their obsolescence became obvious. By the 1950s, the success of
digital electronic computers had spelled the end for most analog computing machines, but analog
computers remained in use during the 1950s in some specialized applications such as education
(slide rule) and aircraft (control systems).
Digital computers
Electromechanical
Claude Shannon's 1937 master's thesis laid the foundations of digital computing, with his insight
of applying Boolean algebra to the analysis and synthesis of switching circuits being the basic
concept which underlies all electronic digital computers.[35][36]
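Shannon's correspondence can be illustrated directly: switches in series implement AND, switches in parallel implement OR, and a normally-closed contact implements NOT. A small Python sketch (the circuit and the function names are hypothetical, chosen for illustration):

    def series(a, b):    # current flows only if both switches are closed
        return a and b

    def parallel(a, b):  # current flows if either switch is closed
        return a or b

    def invert(a):       # a normally-closed contact opens when energized
        return not a

    # Example: a two-way light switch (XOR) built from these primitives.
    def two_way_light(s1, s2):
        return parallel(series(s1, invert(s2)), series(invert(s1), s2))

    for s1 in (False, True):
        for s2 in (False, True):
            print(s1, s2, two_way_light(s1, s2))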
By 1938, the United States Navy had developed an electromechanical analog computer small
enough to use aboard a submarine. This was the Torpedo Data Computer, which used
trigonometry to solve the problem of firing a torpedo at a moving target. During World War II
similar devices were developed in other countries as well.
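The fire-control triangle such a device solved can be suggested, in greatly simplified form, by the law of sines: sin(lead) = (target speed / torpedo speed) × sin(track angle). The sketch below illustrates only that geometry; it is not the actual Torpedo Data Computer algorithm, and the function name and scenario are hypothetical.

    import math

    def lead_angle_deg(target_speed, torpedo_speed, track_angle_deg):
        """Lead angle for a target on a straight track, from the law of
        sines: sin(lead) = (target_speed/torpedo_speed) * sin(track)."""
        ratio = (target_speed / torpedo_speed
                 * math.sin(math.radians(track_angle_deg)))
        return math.degrees(math.asin(ratio))

    # Target crossing at 15 knots, 90 degrees to the line of sight,
    # torpedo running at 45 knots: aim about 19.5 degrees ahead.
    print(lead_angle_deg(15.0, 45.0, 90.0))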
Early digital computers were electromechanical; electric switches drove mechanical relays to
perform the calculation. These devices had a low operating speed and were eventually
superseded by much faster all-electric computers, originally using vacuum tubes. The Z2, created
by German engineer Konrad Zuse in 1939 in Berlin, was one of the earliest examples of an
electromechanical relay computer.[37]
[Image: Konrad Zuse, inventor of the modern computer.[38][39]]
In 1941, Zuse followed his earlier machine up with the Z3, the world's first working
electromechanical programmable, fully automatic digital computer.[40][41] The Z3 was built with
2,000 relays, implemented a 22-bit word length, and operated at a clock frequency of about 5–10
Hz.[42] Program code was supplied on punched film while data could be stored in 64 words of
memory or supplied from the keyboard. It was quite similar to modern machines in some
respects, pioneering numerous advances such as floating-point numbers. Rather than the harder-
to-implement decimal system (used in Charles Babbage's earlier design), Zuse's machines used a
binary system, which made them easier to build and potentially more reliable, given the
technologies available at that time.[43] The Z3 was not itself a universal computer but could be
extended to be Turing complete.[44][45]
Zuse's next computer, the Z4, became the world's first commercial computer; after an initial
delay due to the Second World War, it was completed in 1950 and delivered to the ETH Zurich.[46] The
computer was manufactured by Zuse's own company, Zuse KG, which was founded in 1941 as
the first company with the sole purpose of developing computers in Berlin.[46] The Z4 served as
the inspiration for the construction of the ERMETH, the first Swiss computer and one of the first
in Europe.[47]
Purely electronic circuit elements soon replaced their mechanical and electromechanical
equivalents, at the same time that digital calculation replaced analog. The engineer Tommy
Flowers, working at the Post Office Research Station in London in the 1930s, began to explore
the possible use of electronics for the telephone exchange. Experimental equipment that he built
in 1934 went into operation five years later, converting a portion of the telephone exchange
network into an electronic data processing system, using thousands of vacuum tubes.[34] In the
US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested
the Atanasoff–Berry Computer (ABC) in 1942,[48] the first "automatic electronic digital
computer".[49] This design was also all-electronic and used about 300 vacuum tubes, with
capacitors fixed in a mechanically rotating drum for memory.[50]
[Image: Colossus, the first electronic digital programmable computing device, used to break German ciphers during World War II; seen here in use at Bletchley Park in 1943.]
During World War II, the British code-breakers at Bletchley Park achieved a number of
successes at breaking encrypted German military communications. The German encryption
machine, Enigma, was first attacked with the help of the electro-mechanical bombes which were
often run by women.[51][52] To crack the more sophisticated German Lorenz SZ 40/42 machine,
used for high-level Army communications, Max Newman and his colleagues commissioned
Flowers to build the Colossus.[50] He spent eleven months from early February 1943 designing
and building the first Colossus.[53] After a functional test in December 1943, Colossus was
shipped to Bletchley Park, where it was delivered on 18 January 1944[54] and attacked its first
message on 5 February.[50]
Colossus was the world's first electronic digital programmable computer.[34] It used a large
number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to
perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine
Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total).
Colossus Mark I contained 1,500 thermionic valves (tubes), but Mark II, with 2,400 valves, was
both five times faster and simpler to operate than Mark I, greatly speeding the decoding process.[55][56]
[Image: ENIAC, the first electronic, Turing-complete device, which performed ballistics trajectory calculations for the United States Army.]
The ENIAC[57] (Electronic Numerical Integrator and Computer) was the first electronic
programmable computer built in the U.S. Although the ENIAC was similar to the Colossus, it
was much faster, more flexible, and it was Turing-complete. As with the Colossus, a "program"
on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored
program electronic machines that came later. Once a program was written, it had to be
mechanically set into the machine with manual resetting of plugs and switches. The
programmers of the ENIAC were six women, often known collectively as the "ENIAC girls".[58][59]
It combined the high speed of electronics with the ability to be programmed for many complex
problems. It could add or subtract 5000 times a second, a thousand times faster than any other
machine. It also had modules to multiply, divide, and square root. High speed memory was
limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper
Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from
1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200
kilowatts of electric power and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds
of thousands of resistors, capacitors, and inductors.[60]
Modern computers
The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper
On Computable Numbers.[61] Turing proposed a simple device that he called "Universal
Computing machine" and that is now known as a universal Turing machine. He proved that such
a machine is capable of computing anything that is computable by executing instructions
(program) stored on tape, allowing the machine to be programmable. The fundamental concept
of Turing's design is the stored program, where all the instructions for computing are stored in
memory. Von Neumann acknowledged that the central concept of the modern computer was due
to this paper.[62] Turing machines are to this day a central object of study in theory of
computation. Except for the limitations imposed by their finite memory stores, modern
computers are said to be Turing-complete, which is to say, they have algorithm execution
capability equivalent to a universal Turing machine.
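A Turing machine is small enough to sketch in code: a finite rule table drives a read/write head over an unbounded tape. The minimal Python interpreter below is an illustrative sketch (the rule encoding and names are our own choices); a universal machine goes one step further by encoding the rule table itself on the tape.

    def run(rules, tape, state="start", pos=0, halt="halt"):
        """Run a Turing machine: rules maps (state, symbol) to
        (symbol_to_write, move L/R, next_state)."""
        tape = dict(enumerate(tape))      # sparse dict = unbounded tape
        while state != halt:
            symbol = tape.get(pos, "_")   # "_" is the blank symbol
            write, move, state = rules[(state, symbol)]
            tape[pos] = write
            pos += 1 if move == "R" else -1
        return "".join(tape[i] for i in sorted(tape))

    # Example rule table: flip every bit, halting at the first blank.
    flip = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }
    print(run(flip, "10110"))  # -> 01001_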
Stored programs
Early computing machines had fixed programs. Changing a machine's function required rewiring
and restructuring it.[50] With the proposal of the stored-program computer, this changed.
A stored-program computer includes by design an instruction set and can store in memory a set
of instructions (a program) that details the computation. The theoretical basis for the stored-
program computer was laid out by Alan Turing in his 1936 paper. In 1945, Turing joined the
National Physical Laboratory and began work on developing an electronic stored-program digital
computer. His 1945 report "Proposed Electronic Calculator" was the first specification for such a
device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a
Report on the EDVAC in 1945.[34]
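The stored-program idea can be suggested with a toy machine in which instructions and data share one memory and a fetch-decode-execute loop interprets them. The instruction set below is invented purely for illustration; it corresponds to no historical design.

    def run(memory):
        """Toy stored-program machine: program and data share `memory`."""
        acc, pc = 0, 0                      # accumulator and program counter
        while True:
            op, arg = memory[pc]            # fetch and decode
            pc += 1
            if op == "LOAD":                # acc <- memory[arg]
                acc = memory[arg]
            elif op == "ADD":               # acc <- acc + memory[arg]
                acc += memory[arg]
            elif op == "STORE":             # memory[arg] <- acc
                memory[arg] = acc
            elif op == "HALT":
                return memory

    # Program occupies cells 0-3; data occupies cells 4-6.
    memory = [
        ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),
        2, 3, 0,
    ]
    print(run(memory)[6])  # -> 5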
The Manchester Baby was the world's first stored-program computer. It was built at the
University of Manchester in England by Frederic C. Williams, Tom Kilburn and Geoff Tootill,
and ran its first program on 21 June 1948.[63] It was designed as a testbed for the Williams tube,
the first random-access digital storage device.[64] Although the computer was described as "small
and primitive" by a 1998 retrospective, it was the first working machine to contain all of the
elements essential to a modern electronic computer.[65] As soon as the Baby had demonstrated the
feasibility of its design, a project began at the university to develop it into a practically useful
computer, the Manchester Mark 1.
The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first
commercially available general-purpose computer.[66] Built by Ferranti, it was delivered to the
University of Manchester in February 1951. At least seven of these later machines were
delivered between 1953 and 1957, one of them to Shell labs in Amsterdam.[67] In October 1947
the directors of British catering company J. Lyons & Company decided to take an active role in
promoting the commercial development of computers. Lyons's LEO I computer, modelled
closely on the Cambridge EDSAC of 1949, became operational in April 1951[68] and ran the
world's first routine office computer job.
Transistors
The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John
Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first
working transistor, the point-contact transistor, in 1947, which was followed by Shockley's
bipolar junction transistor in 1948.[69][70] From 1955 onwards, transistors replaced vacuum tubes
in computer designs, giving rise to the "second generation" of computers. Compared to vacuum
tubes, transistors have many advantages: they are smaller and require less power, so they give
off less heat. Junction transistors were much more reliable than vacuum tubes and had a longer,
potentially indefinite, service life. Transistorized computers could contain tens of thousands of
binary logic circuits in a relatively compact space. However, early junction transistors were
relatively bulky devices that were difficult to manufacture on a mass-production basis, which
limited them to a number of specialized applications.[71]
At the University of Manchester, a team under the leadership of Tom Kilburn designed and built
a machine using the newly developed transistors instead of valves.[72] Their first transistorized
computer, and the first in the world, was operational by 1953, and a second version was
completed there in April 1955. However, the machine did make use of valves to generate its
125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory,
so it was not the first completely transistorized computer. That distinction goes to the Harwell
CADET of 1955,[73] built by the electronics division of the Atomic Energy Research
Establishment at Harwell.[73][74]
The metal–oxide–semiconductor field-effect transistor (MOSFET), also known as the MOS transistor,
was invented at Bell Labs between 1955 and 1960[75][76][77][78][79][80] and was the first truly compact
transistor that could be miniaturized and mass-produced for a wide range of uses.[71] With its high
scalability,[81] and much lower power consumption and higher density than bipolar junction
transistors,[82] the MOSFET made it possible to build high-density integrated circuits.[83][84] In
addition to data processing, it also enabled the practical use of MOS transistors as memory cell
storage elements, leading to the development of MOS semiconductor memory, which replaced
earlier magnetic-core memory in computers. The MOSFET led to the microcomputer revolution,[85]
and became the driving force behind the computer revolution.[86][87] The MOSFET is the most
widely used transistor in computers,[88][89] and is the fundamental building block of digital
electronics.[90]
Integrated circuits
The next great advance in computing power came with the advent of the integrated circuit (IC).
The idea of the integrated circuit was first conceived by a radar scientist working for the Royal
Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented
the first public description of an integrated circuit at the Symposium on Progress in Quality
Electronic Components in Washington, D.C., on 7 May 1952.[91]
The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at
Fairchild Semiconductor.[92] Kilby recorded his initial ideas concerning the integrated circuit in
July 1958, successfully demonstrating the first working integrated example on 12 September
1958.[93] In his patent application of 6 February 1959, Kilby described his new device as "a body
of semiconductor material ... wherein all the components of the electronic circuit are completely
integrated".[94][95] However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather
than a monolithic integrated circuit (IC) chip.[96] Kilby's IC had external wire connections, which
made it difficult to mass-produce.[97]
Noyce also came up with his own idea of an integrated circuit, half a year later than Kilby.[98]
Noyce's invention was the first true monolithic IC chip.[99][97] His chip solved many practical
problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon,
whereas Kilby's chip was made of germanium. Noyce's monolithic IC was fabricated using the
planar process, developed by his colleague Jean Hoerni in early 1959. In turn, the planar process
was based on Carl Frosch and Lincoln Derick's work on semiconductor surface passivation by
silicon dioxide.[100][101][102][103][104][105]
Systems on a chip (SoCs) are complete computers on a microchip (or chip) the size of a coin.[115]
They may or may not have integrated RAM and flash memory. If not integrated, the RAM is
usually placed directly above (known as Package on package) or below (on the opposite side of
the circuit board) the SoC, and the flash memory is usually placed right next to the SoC. This is
done to improve data transfer speeds, as the data signals do not have to travel long distances.
Since ENIAC in 1945, computers have advanced enormously, with modern SoCs (such as the
Snapdragon 865) being the size of a coin while also being hundreds of thousands of times more
powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of
power.
Mobile computers
The first mobile computers were heavy and ran from mains power. The 50 lb (23 kg) IBM 5100
was an early example. Later portables such as the Osborne 1 and Compaq Portable were
considerably lighter but still needed to be plugged in. The first laptops, such as the Grid
Compass, removed this requirement by incorporating batteries – and with the continued
miniaturization of computing resources and advancements in portable battery life, portable
computers grew in popularity in the 2000s.[116] The same developments allowed manufacturers to
integrate computing resources into cellular mobile phones by the early 2000s.
These smartphones and tablets run on a variety of operating systems and recently became the
dominant computing device on the market.[117] They are powered by systems on a chip (SoCs),
complete computers on a microchip the size of a coin.[115]