CSC 111 Material (Week 1)
FACULTY OF SCIENCE
DEPARTMENT OF PHYSICAL SCIENCES
LECTURE NOTE
1.1 INTRODUCTION
In recent times the computer has become a relevant tool, particularly in the areas of storage
and dissemination of information, because of the ease with which it functions: its speed,
accuracy and readiness. With the usefulness of the computer, it has become fashionable for
organizations to be computerized; that is, a computer department is created to serve the whole
organization and experts or professionals are employed to manage the department. It is today
becoming increasingly difficult for computer illiterates to get good employment, as computer
literacy is now a prerequisite for most jobs. The world is becoming a global village through the
use of the computer, thus there is the need for everyone to be computer literate. The computer age
is characterized by generations of computers, which signify that the computer has passed through
stages of evolution or development. Before arriving at the present-day computer, it underwent
stages of development known as the generations of computers.
1.2 WHAT IS A COMPUTER?
A computer is an electronic device used to store, retrieve and manipulate data. A computer can
also be defined as a programmable electromechanical device that accepts instructions (a program)
to direct its operations. The following key terms can be deduced from the above definition for
further illustration.
Examples
i. Store: To put data somewhere for safe keeping.
ii. Retrieve: To get and bring the data back.
iii. Process: To calculate, compare and arrange the data (see the short sketch after this list).
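As a rough illustration only, the three operations above can be sketched in Python. The names
used here (records, average_score and so on) are illustrative assumptions for teaching purposes,
not part of any required course code:

# A minimal sketch of the store / retrieve / process idea.
# All names here (records, average_score, etc.) are illustrative assumptions.

# Store: put data somewhere for safe keeping (here, an in-memory list).
records = []
records.append({"name": "Ada", "score": 85})
records.append({"name": "Alan", "score": 92})

# Retrieve: get and bring the data back.
first_student = records[0]
print("Retrieved:", first_student["name"], first_student["score"])

# Process: calculate, compare and arrange the stored data.
average_score = sum(r["score"] for r in records) / len(records)   # calculate
best_student = max(records, key=lambda r: r["score"])             # compare
ordered = sorted(records, key=lambda r: r["name"])                # arrange
print("Average:", average_score)
print("Best:", best_student["name"])
print("Ordered:", [r["name"] for r in ordered])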
NOTE: Electromechanical devices are machines that combine electrical and mechanical processes to
perform specific tasks, e.g. generators, which convert mechanical energy into electrical energy;
actuators, which create movement in response to electrical signals and are used in robotics,
industrial machinery and automotive systems (e.g. power seats or windows); and hard disk drives
(HDDs), which use electric motors to spin disks and mechanical arms to read/write data.
1.3 WHAT IS COMPUTER SCIENCE?
Computer Science is the study of computers and computational systems, encompassing both the
theoretical aspects of computation and the practical techniques for implementing and applying
computer systems. It involves designing software, developing algorithms, and understanding the
principles that underlie the processing, storage, and transmission of data.
The field extends beyond just programming; it also includes the study of how data is managed,
how information is communicated across networks, and how software can be made secure and
reliable.
Computer Science is central to nearly every industry today, driving innovation in fields such as
healthcare, finance, education, entertainment, and even social sciences. The skills acquired
through studying Computer Science enable students to solve complex problems, develop new
technologies, and contribute to advancements in almost every aspect of modern life.
1.4 HISTORICAL BACKGROUND OF THE COMPUTER
The history of the computer dates back to the period of the scientific revolution (i.e. 1543 – 1678).
The calculating machines invented by Blaise Pascal in 1642 and by Gottfried Leibniz marked the
genesis of the application of machines in industry.
This progressed up to the period 1760 – 1830, which was the period of the industrial revolution in
Great Britain, when the use of machines for production altered British society and the Western
world. During this period Joseph Jacquard invented the weaving loom (a machine used in the textile
industry). The computer was born not for entertainment or email but out of a need to solve a
serious number-crunching crisis. By 1880, the United States (U.S.) population had grown so large
that it took more than seven years to tabulate the U.S. Census results. The government sought a
faster way to get the job done, giving rise to punch-card based computers that took up entire
rooms. Today, we carry more computing power on our smartphones than was available in these
early models. The following brief history of computing is a timeline of how computers evolved
from their humble beginnings to the machines of today that surf the Internet, play games and
stream multimedia in addition to crunching numbers.
The following are historical events in the development of the computer:
1623: Wilhelm Schickard designed and constructed the first working mechanical calculator.
1673: Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped
Reckoner. He may be considered the first computer scientist and information theorist, for, among
other reasons, documenting the binary number system.
19th Century
1801: Joseph Marie Jacquard, a weaver and businessman from France, devised a loom that
employed punched wooden cards to automatically weave cloth designs.
1820: Thomas de Colmar launched the mechanical calculator industry when he released his
simplified arithmometer, which was the first calculating machine strong enough and reliable
enough to be used daily in an office environment.
1822: Charles Babbage, a mathematician, conceived a steam-powered calculating machine
capable of computing tables of numbers. The “Difference Engine” project failed owing to a lack of
suitable technology at the time.
1848: The world’s first computer program was written by Ada Lovelace, an English
mathematician. Lovelace also included a step-by-step method for computing Bernoulli
numbers using Babbage’s machine.
1885: Herman Hollerith invented the tabulator, which used punched cards to process statistical
information; eventually his company became part of IBM.
1890: Herman Hollerith designs a punch card system to help tabulate the 1890 U.S. Census,
accomplishing the task in just three years and saving the government $5 million. He establishes a
company that would ultimately become IBM.
Early 20th Century
1930: The Differential Analyzer, the first large-scale automatic general-purpose mechanical
analogue computer, was invented and built by Vannevar Bush.
1936: Alan Turing had an idea for a universal machine, which he called the Turing machine,
that could compute anything that is computable. The central concept of the modern
computer is based on his ideas.
1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts
to build the first computer without gears, cams, belts or shafts.
1937: One hundred years after Babbage's impossible dream, Howard Aiken convinced IBM,
which was making all kinds of punched card equipment and was also in the calculator business,
to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's
Analytical Engine, which itself used cards and a central computing unit. When the machine was
finished, some hailed it as "Babbage's dream come true".
1939: Hewlett-Packard was founded in a garage in Palo Alto, California, by Bill Hewlett and
David Packard.
1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29
equations simultaneously. This marks the first time a computer is able to store information on its
main memory.
1943-1945: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert,
built the Electronic Numerical Integrator and Calculator (ENIAC). Considered the grandfather of
digital computers, it filled a 20-foot by 40-foot room, contained 18,000 vacuum tubes, and was
Turing-complete, capable of solving “a vast class of numerical problems” through reprogramming.
Mauchly and Eckert later left the University of Pennsylvania and received funding from the Census
Bureau to build the UNIVAC, the first commercial computer for business and government
applications.
1946: Design work began on the UNIVAC I (Universal Automatic Computer), the first general-purpose
electronic digital computer designed in the United States for corporate applications.
1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invented the
transistor. They discovered how to make an electric switch with solid materials and no need for a
vacuum.
1949: The Electronic Delay Storage Automatic Calculator (EDSAC), developed by a team at the
University of Cambridge, is the “first practical stored-program computer”.
1950: The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC, and it
was the first stored-program computer completed in the United States.
Late 20th Century
1953: Grace Hopper, a computer scientist, creates the first computer language, which eventually
becomes known as COBOL (COmmon Business-Oriented Language). It allowed computer users to give the
computer instructions in English-like words rather than numbers.
1954: John Backus and a team of IBM programmers created the FORTRAN programming
language, an acronym for FORmula TRANslation. In addition, IBM developed the 650.
1958: The integrated circuit, sometimes known as the computer chip, was created by Jack Kilby
and Robert Noyce. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.
1962: The Atlas computer makes its appearance. It was the fastest computer in the world at the
time, and it pioneered the concept of “virtual memory.”
1964: Douglas Engelbart proposes a prototype of the modern computer that combines a mouse and a
graphical user interface (GUI). This marks the evolution of the computer from a specialized
machine for scientists and mathematicians to technology that is more accessible to the general
public.
1969: Bell Labs developers, led by Ken Thompson and Dennis Ritchie, revealed UNIX, an
operating system that addressed program compatibility difficulties. Later rewritten in the C
programming language, UNIX was portable across multiple platforms and became the
operating system of choice among mainframes at large companies and government entities. Due
to the slow nature of the system, it never quite gained traction among home PC users.
1970: Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.
1971: The floppy disc was invented by Alan Shugart and a team of IBM engineers. In the same
year, Xerox developed the first laser printer, which would go on to generate billions of dollars
in revenue and heralded the beginning of a new age in computer printing.
1973: Robert Metcalfe, a member of Xerox’s research staff, created Ethernet, which is
used to connect multiple computers and other hardware.
1974: Personal computers began to appear on the market. Among the first were the Scelbi,
the Mark-8, the Altair, the IBM 5100, and Radio Shack’s TRS-80.
1975: Popular Electronics magazine touted the Altair 8800 as the world’s first minicomputer kit
in January. Paul Allen and Bill Gates offer to build software in the BASIC language for the
Altair.
1976: Apple Computer is founded by Steve Jobs and Steve Wozniak, who introduce the world to
the Apple I, the first computer with a single circuit board.
1977: At the first West Coast Computer Faire, Jobs and Wozniak announce the Apple II. It has
colour graphics and an audio cassette drive for storage.
1978: The first computerized spreadsheet program, VisiCalc, is introduced.
2007: The first iPhone was produced by Apple, bringing many computer operations into the
palm of our hands. Amazon also released the Kindle, one of the first electronic reading systems,
in 2007.
2009: Microsoft released Windows 7.
2011: Google introduces the Chromebook, which runs Google Chrome OS.
2014: The University of Michigan Micro Mote (M3), the world’s smallest computer, was
constructed.
2015: Apple introduces the Apple Watch. Windows 10 was also released by Microsoft.
2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been
any quantum-computing platform that had the capability to program new algorithms into their
system. They're usually each tailored to attack a particular algorithm," said study lead author
Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland,
College Park.
2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new
"Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set
of properties that we may be able to harness for rapid, scalable information storage and
processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a
statement.
"Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure
as well as variables such as shape, size, or even color. This richness provides a vast design space
for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of
current logic-based, digital architectures.
1.5 GENERATIONS OF COMPUTER
The history of the computer is often discussed in terms of the generations of computers, from the
first generation to the fifth generation.
In the 19th century, the English mathematics professor Charles Babbage, often referred to as the
“Father of the Computer”, designed the Analytical Engine, and it is on this design that the basic
framework of today’s computers is based. Generally speaking, computers can be classified into five
generations. Each generation lasted for a certain period of time, and each gave us either a new
and improved computer or an improvement to the existing computer.
The generations of computer are as follows:
1.5.1 First Generation of Computer (1937 – 1946)
In 1937 the first electronic digital computer was built by Dr. John V. Atanasoff and Clifford
Berry. It was called the Atanasoff-Berry Computer (ABC). In 1943 an electronic computer named
the Colossus was built for the military. Other developments continued until, in 1946, the first
general-purpose digital computer, the Electronic Numerical Integrator and Calculator (ENIAC),
was built. It is said that this computer weighed 30 tons and had 18,000 vacuum tubes, which were
used for processing. When this computer was turned on for the first time, lights dimmed in sections
of Philadelphia. Computers of this generation could only perform a single task, and they had no
operating system.