The history of the computer is the record of an ongoing effort to make computer hardware faster, cheaper and
capable of storing more data. This history traces computing hardware as it evolved from machines that
needed a separate manual action to perform each arithmetic operation, to punched-card machines, and then to
stored-program computers.
The stored-program computer refers to the organization of units to perform input and output, to store data and
to operate as an integrated mechanism. Increases in speed and memory capacity, decreases in cost and size in
relation to computing power, and the evolution of computer languages are major features of this history.
DEFINITION:
What Is a Computer? A computer is an electronic machine which, under the control of stored programs called
software, is capable of accepting data as input, processing the data and giving out information as output. This
definition includes the following operations:
INPUT: A computer accepts data provided by means of an input device, e.g. a keyboard.
PROCESSING: A computer performs operations on the data received and transforms it into information.
OUTPUT: The computer produces the information on an output device, such as a visual
display unit or printer, that shows the results of the processing operations.
STORAGE: A computer stores the results of the processing operations for future use.
This definition is often referred to as the IPOS CYCLE. The four steps of the IPOS cycle do not have to
occur in a rigid I-P-O-S sequence. Under the direction of a program, a computer uses the steps of this
process when needed and as often as needed, as the sketch below illustrates.
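As a rough illustration, the IPOS cycle can be sketched in a few lines of Python; the doubling "processing" step and the names below are invented purely for illustration:

    # A minimal sketch of the IPOS cycle.
    storage = []                        # stands in for secondary storage

    def ipos_cycle(data):
        result = int(data) * 2          # PROCESSING: transform data into information
        print("Result:", result)        # OUTPUT: present the information
        storage.append(result)          # STORAGE: keep the result for future use

    ipos_cycle(input("Enter a number: "))   # INPUT: accept data from the keyboard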
Computing hardware has become a platform for uses other than mere computation, such as process automation,
electronic communications, equipment control, entertainment, education, etc. Each field in turn has imposed its
own requirements on the hardware, which has evolved in response to those requirements, such as the role of the
touch screen in creating a more intuitive and natural user interface.
Aside from written numerals, the first aids to computation were purely mechanical devices which
required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the
device by hand to obtain the result. A sophisticated (and comparatively recent) example
is the slide rule in which numbers are represented as lengths on a logarithmic scale and computation is
performed by setting a cursor and aligning sliding scales, thus adding those lengths. Numbers could be
represented in a continuous "analog" form, in which, for instance, a voltage or some other physical property was
set to be proportional to the number. Analog computers, like those designed and built by Vannevar Bush before
WWII, were of this type. Or, numbers could be represented in the form of digits, automatically manipulated by a
mechanism. Although this last approach often required more complex mechanisms, it made
for greater precision of results.
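The slide rule's trick of multiplying by adding lengths rests on the identity log a + log b = log(ab). A minimal Python sketch of the idea (the function name is invented for illustration):

    import math

    def slide_rule_multiply(a, b):
        # Represent each number as a length on a logarithmic scale,
        # add the two lengths, then read the product back off the scale.
        length = math.log10(a) + math.log10(b)
        return 10 ** length

    print(slide_rule_multiply(2, 8))    # ~16.0, up to floating-point error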
Both analog and digital mechanical techniques continued to be developed, producing many practical
computing machines. Electrical methods rapidly improved the speed and precision of calculating machines, at
first by providing motive power for mechanical calculating devices, and later directly as the medium for
representation of numbers. Numbers could be represented by voltages or currents and manipulated by linear
electronic amplifiers. Or, numbers could be represented as discrete binary or decimal digits, and electrically
controlled switches and combinational circuits could perform mathematical operations.
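For instance, two such switching circuits suffice to add a pair of binary digits. A minimal Python sketch of this combinational circuit, known as a half adder (the names are for illustration only):

    def half_adder(a, b):
        # XOR of the inputs gives the sum bit; AND gives the carry bit.
        return a ^ b, a & b

    print(half_adder(1, 1))   # (0, 1): binary 1 + 1 = 10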
The invention of electronic amplifiers made calculating machines much faster than their mechanical or
electromechanical predecessors. Vacuum tube (thermionic valve) amplifiers gave way to solid state transistors,
and then rapidly to integrated circuits which continue to improve, placing millions of electrical switches
(typically transistors) on a single elaborately manufactured piece of semiconductor the size of a fingernail. By
defeating the tyranny of numbers, integrated circuits made high-speed and low-cost digital computers a
widespread commodity.
HISTORICAL DEVELOPMENT.
1. EARLIEST HARDWARE.
The development of the computer has been an incremental process. Devices have been used to aid
computation for thousands of years, mostly using one-to-one correspondence with our fingers. Humans used
various methods to represent numbers. The first was probably stones: the word "calculate" comes from the
Latin word "calculus", meaning small stone. Eventually these stones were placed in rows and
columns. However, as numbers increased, the number of stones required became prohibitive. This led to the
creation of counting boards. The earliest counting device was probably a form of tally stick; the counting
rod is another example.
2. ABACUS:
The abacus was used early on for arithmetic tasks. The first surviving written record of an abacus is by the
ancient Greek historian Herodotus, who mentioned the use of the abacus by Greeks and Egyptians.
What we now call the Roman abacus was used in Babylonia as early as 2400 BC. Since then, many other
forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered
cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to
calculating sums of money. The Chinese counting board used rods and beads in a wooden frame. In Russia,
the abacus was modified to have colored beads in the middle indicating the decimal place. The abacus is still
widely used at markets in the former Soviet Union.
3. ANALOG COMPUTERS:
An analog computer is a form of computer that uses electronic or mechanical phenomena to model the
problem being solved, by using one kind of physical quantity to represent another. The term is used in
distinction to digital computers, in which physical or mechanical phenomena are used to construct a finite state
machine which is then used to model the problem being solved. Computations are often performed in
analog computers by using properties of electrical resistance, voltages and so on.
The use of electrical properties in analog computers means that calculations are
performed in real time, without the calculation delays of digital computers. This property allows certain useful
calculations that are comparatively "difficult" for digital computers to perform, for example numerical
integration. Any physical process which models some computation can be interpreted as an analog computer.
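By contrast, a digital computer must approximate an integral as a long sequence of discrete steps. A minimal Python sketch of one such step-by-step method, the trapezoidal rule (names and step count are invented for illustration):

    def integrate(f, a, b, n=1000):
        # Approximate the integral of f over [a, b] with n trapezoids.
        h = (b - a) / n
        total = (f(a) + f(b)) / 2
        for i in range(1, n):
            total += f(a + i * h)
        return total * h

    print(integrate(lambda x: x * x, 0, 1))   # ~0.3333; the exact value is 1/3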
Analog computers often have a complicated framework, but they have, at their core, a set of key
electrical components which perform the calculations, which the operator manipulates through the computer's
framework.
Several analog computers were constructed in ancient and medieval times to perform astronomical
calculations. These include the Antikythera mechanism and the astrolabe from ancient Greece (c. 150–100 BC),
which are generally regarded as the earliest known mechanical analog computers. Other early
mechanical devices used to perform one or another type of calculation include the planisphere and other
mechanical computing devices invented by Abū Rayhān al-Bīrūnī (c. AD 1000). Some examples of early
mechanical calculating machines are:
A. PASCALINE: This was a mechanical calculator which performed addition and subtraction. In 1642, while still
a teenager, Blaise Pascal started pioneering work on calculating machines and, after three years of effort
and 50 prototypes, he invented the mechanical calculator (called the Pascaline). He built twenty of these
machines in the following ten years.
B. G.W. LEIBNIZ: Gottfried Wilhelm von Leibniz invented the Stepped Reckoner and its famous stepped cylinders
around 1672, adding direct multiplication and division to the addition and subtraction of the Pascaline.
Leibniz also described the binary numeral system, a central
ingredient of all modern computers. However, up to the 1940s, many subsequent designs (including Charles
Babbage's machines of 1822 and even ENIAC of 1945) were based on the decimal system.
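In the binary system every number is written with the digits 0 and 1 alone; decimal 13, for example, is binary 1101 (one 8, one 4, no 2, one 1). A short Python illustration:

    n = 13
    print(bin(n))           # '0b1101'
    print(int("1101", 2))   # 13 = 1*8 + 1*4 + 0*2 + 1*1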
D. DESKTOP CALCULATORS:
By the 20th century, earlier mechanical calculators, cash registers, accounting machines, and so on were
redesigned to use electric motors, with gear position as the representation for the state of a variable. The word
"computer" was a job title assigned to people who used these calculators to perform mathematical calculations.
Companies like Friden, Marchant Calculator and Monroe made desktop mechanical calculators from the 1930s
that could add, subtract, multiply and divide. In 1948, the CURTA was introduced. This was a small, portable,
mechanical calculator that was about the size of a pepper grinder. Over time, during the 1950s and 1960s a
variety of different brands of mechanical calculators appeared on the market.
The first all-electronic desktop calculator was the British ANITA Mk.VII, which used a Nixie tube
display. In June 1963, Friden introduced the four-function EC-130, which had an all-transistor design. These
machines marked the transition of desktop calculators from mechanical to electronic technology.
4. HYBRID COMPUTERS:
A hybrid computer is an intermediate device in which an analog output is converted into digits. Because
of their ease of use and because of technological breakthroughs in digital computers in the early 1970s,
analog-digital hybrids were replacing analog-only systems. Hybrid computers are used to obtain a very
accurate, but not very mathematically precise, starting value from an analog computer front-end; that value is
then fed into a digital computer, which uses an iterative process to achieve the final desired degree of precision.
Because the analog front-end supplies its highly accurate three- or four-digit result almost instantaneously,
many fewer digital iterations are required, and the total computation time needed to reach
the desired precision is dramatically reduced. Digital electronic computers like the ENIAC spelled the
end for most analog computing machines, but hybrid analog computers, controlled by digital electronics,
remained in substantial use into the 1950s and 1960s, and later in some specialized applications. A sketch of
this seed-then-refine idea follows.
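The idea can be sketched in Python. Here the analog front-end is merely simulated by rounding a square root to three significant digits, and Newton's method stands in for the digital refinement; all names and numbers are invented for illustration:

    def analog_front_end(x):
        # Simulate an analog computer: an almost-instant answer,
        # accurate to only about three significant digits.
        return float("%.3g" % (x ** 0.5))

    def digital_refine(x, seed, iterations=3):
        # Newton's method for the square root of x; a good seed
        # means only a few iterations are needed for full precision.
        r = seed
        for _ in range(iterations):
            r = (r + x / r) / 2
        return r

    seed = analog_front_end(2.0)        # e.g. 1.41, almost instantly
    print(digital_refine(2.0, seed))    # ~1.4142135623730951 after 3 steps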
5. EARLY DIGITAL COMPUTERS:
C. COLOSSUS:
Colossus was built during World War II as a computer specifically designed for
code breaking. It was used by the British to break German coded messages. Colossus
was the world's first totally electronic programmable computing device. It used a large number
of valves (vacuum tubes), had paper-tape input and was capable of being configured to perform a variety
of Boolean logical operations on its data.
D. ENIAC:
In April of 1943, the building of the Electronic Numerical Integrator Analyzer and Computer (ENIAC)
commenced and was developed by colleagues John Mauchley and J. Presper Eckert Junior and built at the
University of Pennsylvania's Moore School of Electrical Engineering. ENIAC was the first general-
purpose, all-electronic, programmable digital computer. It combined, for the first time, the high speed of
electronics with the ability to be programmed for many complex problems. It could add or subtract 5000
times a second, a thousand times faster than any other machine. It also had modules to multiply, divide,
and square root. High speed memory was limited to 20 words (about 80 bytes). The ENIAC also contained
accumulators, special registers used to store data, and in addition the computer used a digital number
system rather than the binary system used in modern computers today.
6. GENERATIONS OF COMPUTERS:
The history of computer development from the era of electronic digital computation is often described
in terms of the different generations of computing devices. Each generation of computer is characterized
by a major technological development that fundamentally changed the way computers operate, resulting in
increasingly smaller, cheaper, more powerful and more efficient devices. A generation refers to the state of
improvement in the product development process. The term is also used for the different advancements of new
computer technology.
With each new generation, the circuitry has become smaller and more advanced than in the generation
before it. As a result of this miniaturization, the speed, power and memory of computers have increased
proportionally. New discoveries are constantly being made that affect the way we live, work and play. The
generations are as follows:
At the heart of the fourth-generation computer is the microprocessor, a single chip that contains the two
central units of the machine (a minimal sketch of how they cooperate follows this list):
The arithmetic logic unit (ALU), which performs arithmetic and logical operations.
The control unit, which extracts instructions from memory and decodes and executes them, calling on
the ALU when necessary.
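A minimal Python sketch of this division of labour, using an invented two-instruction machine purely for illustration:

    def alu(op, a, b):
        # ALU: performs the arithmetic and logical operations.
        return a + b if op == "ADD" else a - b

    def control_unit(program):
        # Control unit: fetches each instruction from memory,
        # decodes it, and calls on the ALU to execute it.
        accumulator = 0
        for op, operand in program:                       # fetch
            accumulator = alu(op, accumulator, operand)   # decode and execute
        return accumulator

    print(control_unit([("ADD", 5), ("ADD", 7), ("SUB", 2)]))   # 10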
Microprocessors also moved out of the realm of desktop computers and into many areas of life as more
and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which
eventually led to the development of the Internet. Fourth-generation computers also saw the development of
GUIs (graphical user interfaces), the mouse and handheld devices.
Companies associated with the fourth generation include Intel, whose 4004 chip was developed in 1971;
IBM, which introduced its first home user computer in 1981; and Apple,
which introduced the Macintosh in 1984. Present-day personal computers are made by companies like HP, Dell,
Compaq etc.
The fifth generation is based on artificial intelligence (AI), which includes several areas of research:
Expert Systems: programming computers to make decisions in real-life situations (e.g. some expert systems
help doctors diagnose diseases based on symptoms)
Natural Language: programming computers to understand natural human languages
Neural Networks: systems that simulate intelligence by attempting to reproduce the types of physical
connections that occur in animal brains (a toy sketch of a single artificial neuron follows this list)
Robotics: programming computers to see and hear and react to other sensory stimuli
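As a toy illustration of the neural-network idea, a single artificial neuron sums weighted inputs and "fires" when the sum crosses a threshold; the weights and names below are invented for illustration:

    def neuron(inputs, weights, threshold):
        # A single artificial neuron: output 1 if the weighted sum
        # of the inputs reaches the threshold, otherwise output 0.
        total = sum(i * w for i, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    # Example: a neuron wired to behave like a logical AND gate.
    print(neuron([1, 1], [0.6, 0.6], threshold=1.0))   # 1
    print(neuron([1, 0], [0.6, 0.6], threshold=1.0))   # 0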
Currently, no computers exhibit full artificial intelligence. The greatest advances have occurred in the
field of game playing, where computer chess programs are now capable of beating humans.
In the area of robotics, computers are now widely used in assembly plants, but they are capable only of
very limited tasks.
Natural-language processing offers the greatest potential rewards because it would allow people to
interact with computers without needing any specialized knowledge. You could simply walk up to a computer
and talk to it. Some rudimentary translation systems that translate from one human language to another are
already in existence.
There are also voice recognition systems that can convert spoken sounds into written words, but they do
not understand what they are writing; they simply take dictation.
In the early 1980s, expert systems were believed to represent the future of artificial intelligence and of
computers in general; however, they have not lived up to expectations. Many expert systems help human
experts in such fields as medicine and engineering, but they are very expensive to produce and are helpful only
in special situations.
There are several programming languages that are known as AI languages because they are used almost
exclusively for AI applications. The two most common are LISP and Prolog.
Basically, the goal of fifth-generation computing is to develop devices that respond to natural-language input
and are capable of learning and self-organization.