Additional Info - Module 1
ABACUS
PASCALINE
JACQUARD’S LOOM
ANALYTICAL ENGINE
Developed by the English mathematician Charles Babbage, often recognized as the Father of Modern Computers, together with Lady Ada Byron (Ada Lovelace), recognized as the first programmer.
Babbage designed two mechanical calculators: the Difference Engine (1823) and the Analytical Engine (1833).
TABULATING MACHINE
Invented by Herman Hollerith. Without it, the 1890 census data would not have been tabulated until well after the 1900 census had been taken.
It was an automatic electrical tabulating machine that read punched cards, similar to those used in Jacquard's loom.
Computer Generations
FIRST GENERATION: VACUUM TUBES
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were
often enormous, taking up entire rooms. They were very expensive to operate and in addition to
using a great deal of electricity, generated a lot of heat, which was often the cause of
malfunctions. First generation computers relied on machine language to perform operations, and
they could only solve one problem at a time. Input was based on punched cards and paper tape,
and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially available computer; it was delivered to its first client, the U.S. Census Bureau, in 1951.
SECOND GENERATION: TRANSISTORS
Transistors replaced vacuum tubes and ushered in the second generation of computers. The
transistor was invented in 1947 but did not see widespread use in computers until the late 1950s.
The transistor was far superior to the vacuum tube, allowing computers to become smaller,
faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors.
Though the transistor still generated a great deal of heat that subjected the computer to
damage, it was a vast improvement over the vacuum tube. Second-generation computers still
relied on punched cards for input and printouts for output.
The first computers of this generation were developed for the atomic energy industry.
THIRD GENERATION: INTEGRATED CIRCUITS
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, made of semiconductor material, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers
through keyboards and monitors and interfaced with an operating system, which allowed the
device to run many different applications at one time with a central program that monitored the
memory. Computers for the first time became accessible to a mass audience because they were
smaller and cheaper than their predecessors.
FOURTH GENERATION: MICROPROCESSORS
The microprocessor brought the fourth generation of computers, with thousands of integrated circuits built onto a single silicon chip. In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use them.
As these small computers became more powerful, they could be linked together to form
networks, which eventually led to the development of GUIs, the mouse and handheld devices.
FIFTH GENERATION: ARTIFICIAL INTELLIGENCE
Fifth-generation computing devices, based on artificial intelligence, are still in development,
though there are some applications, such as voice recognition, that are being used today. The
use of parallel processing and superconductors is helping to make artificial intelligence a reality.
Quantum computation and molecular and nanotechnology will radically change the face of
computers in years to come. The goal of fifth-generation computing is to develop devices that
respond to natural language input and are capable of learning and self-organization.