
BENSON IDAHOSA UNIVERSITY, BENIN CITY

FACULTY OF SCIENCE
DEPARTMENT OF PHYSICAL SCIENCES

LECTURE NOTE

Course Title: INTRODUCTION TO COMPUTING


Course Code: CSC 111
Credit Hours: 2 HOURS/WEEK
Course Status: COMPULSORY

1.1 INTRODUCTION
In recent times, the computer has become a vital tool, particularly in the storage
and dissemination of information, thanks to the ease with which it functions: its speed,
accuracy and readiness. Because of this usefulness, it has become fashionable for
organizations to be computerized, that is, a computer department is created to serve the whole
organization and experts or professionals are employed to manage it. It is today
becoming increasingly difficult for the computer-illiterate to find good employment, as computer
literacy is now a prerequisite for most jobs. The world is becoming a global village through the
use of the computer, so there is a need for everyone to be computer literate. The computer age
is characterized by generations of computers, which signifies that the computer has passed through
stages of evolution or development. Before arriving at the present-day computer, it
underwent stages of development known as the generations of computers.
1.2 WHAT IS A COMPUTER?
A computer is an electronic device used to store, retrieve and manipulate data. A computer can
also be defined as a programmable electromechanical device that accepts instructions (a program) to direct
its operations. Three key terms can be deduced from the above definition for
further illustration:
i. Store: To put data somewhere for safekeeping.
ii. Retrieve: To get and bring the data back.
iii. Process: To calculate, compare or arrange the data.
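As a loose illustration of these three operations, the short Python sketch below stores a few scores, retrieves one, and processes them by computing an average. This is purely illustrative; the names (scores, average) are made up for this example.

    # Illustrative sketch of the store / retrieve / process cycle (hypothetical names).
    scores = {}                                   # Store: put data somewhere for safekeeping
    scores["test_1"] = 68
    scores["test_2"] = 74
    first = scores["test_1"]                      # Retrieve: get and bring the data back
    average = sum(scores.values()) / len(scores)  # Process: calculate
    print(first, average)                         # prints 68 71.0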

NOTE: Electromechanical devices are machines that combine electrical and mechanical processes to
perform specific tasks. Examples include generators, which convert mechanical energy into electrical
energy; actuators, devices that create movement in response to electrical signals, used in robotics,
industrial machinery and automotive systems (e.g., power seats or windows); and hard disk drives (HDDs),
which use electric motors to spin disks and mechanical arms to read/write data.

1.3 WHAT IS COMPUTER SCIENCE?


Computer science (sometimes called computation science or computing science, but not to be
confused with computational science or software engineering) is the study of processes that
interact with data and that can be represented as data in the form of programs. It enables the use
of algorithms to manipulate, store, and communicate digital information. A computer scientist
studies the theory of computation and the practice of designing software systems.
Its fields can be divided into theoretical and practical disciplines. Computational complexity
theory is highly abstract, while computer graphics emphasizes real-world applications.
Programming language theory considers approaches to the description of computational
processes, while computer programming itself involves the use of programming languages and
complex systems. Human–computer interaction considers the challenges in making computers
useful, usable, and accessible.

Computer Science is the study of computers and computational systems, encompassing both the
theoretical aspects of computation and the practical techniques for implementing and applying
computer systems. It involves designing software, developing algorithms, and understanding the
principles that underlie the processing, storage, and transmission of data.

The field extends beyond just programming; it also includes the study of how data is managed,
how information is communicated across networks, and how software can be made secure and
reliable.

Key Fields and Disciplines in Computer Science:

1. Algorithms and Data Structures:


o Focuses on developing efficient methods for organizing, processing, and storing
data.
o Involves studying techniques to solve computational problems with the least
amount of time and resources (a short illustrative sketch follows this list).
2. Programming Languages:
o Concerned with the design, implementation, and analysis of programming
languages.
o Includes learning different paradigms like procedural, object-oriented, and
functional programming.
3. Software Engineering:
o Involves designing, developing, testing, and maintaining software systems.
o Emphasizes methodologies to manage software development projects and produce
high-quality software.
4. Computer Networks:
o Studies how data is transmitted between different computer systems.
o Includes concepts like the Internet, wireless networks, network protocols, and
security.
5. Databases and Information Systems:
o Focuses on how data is stored, retrieved, and manipulated.
o Covers topics like database design, SQL, data warehousing, and data mining.
6. Artificial Intelligence (AI) and Machine Learning:
o AI focuses on making computers perform tasks that require human intelligence,
such as speech recognition or game-playing.
o Machine Learning, a subset of AI, deals with the development of algorithms that
allow computers to learn from and make predictions based on data.
7. Human-Computer Interaction (HCI):
o Studies the ways humans interact with computers and designs user-friendly
software interfaces.
o Involves understanding usability and developing accessible technology.
8. Computer Graphics and Visualization:
o Focuses on creating visual content using computers.
o Involves rendering images, animations, and developing simulations and virtual
reality systems.
9. Operating Systems and System Software:
o Concerned with the design and development of software that manages computer
hardware.
o Includes topics like process management, memory management, and file systems.
10. Computer Architecture and Embedded Systems:
o Examines how computer hardware works and how it can be optimized.
o Involves studying processors, memory hierarchies, and specialized systems like
embedded devices.
11. Cybersecurity:
o Focuses on protecting computer systems from unauthorized access, attacks, and
damage.
o Includes the study of cryptography, secure communication, and network security.
12. Theoretical Computer Science:
o Explores the mathematical foundations of computation.
o Includes areas like computational complexity, automata theory, and formal
languages.
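As a small taste of field 1 above (Algorithms and Data Structures), the sketch below shows binary search, a classic algorithm that finds an item in a sorted list while examining as few elements as possible. This is an illustrative sketch only; the function name binary_search is our own.

    # Illustrative sketch: binary search over a sorted list (hypothetical names).
    def binary_search(items, target):
        """Return the index of target in the sorted list items, or -1 if absent."""
        low, high = 0, len(items) - 1
        while low <= high:
            mid = (low + high) // 2      # examine the middle element
            if items[mid] == target:
                return mid
            elif items[mid] < target:
                low = mid + 1            # discard the lower half
            else:
                high = mid - 1           # discard the upper half
        return -1

    print(binary_search([2, 3, 5, 7, 11, 13], 7))   # prints 3

Because each comparison halves the remaining search range, binary search needs only about log2(n) steps for a list of n items; this kind of time-and-resource analysis is exactly what the field studies.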

Why Computer Science is Important:

Computer Science is central to nearly every industry today, driving innovation in fields such as
healthcare, finance, education, entertainment, and even social sciences. The skills acquired
through studying Computer Science enable students to solve complex problems, develop new
technologies, and contribute to advancements in almost every aspect of modern life.
1.4 HISTORICAL BACKGROUND OF COMPUTER.
The history of the computer dates back to the period of the scientific revolution (i.e. 1543 – 1678). The
calculating machines invented by Blaise Pascal in 1642 and by Gottfried Leibniz marked the
genesis of the application of machines in industry.
This progressed up to the period 1760 – 1830, the period of the industrial revolution in
Great Britain, when the use of machines for production altered British society and the Western
world. During this period Joseph Jacquard invented the weaving loom (a machine used in the textile
industry). The computer was born not for entertainment or email but out of a need to solve a
serious number-crunching crisis. By 1880, the United States (U.S.) population had grown so large
that it took more than seven years to tabulate the U.S. Census results. The government sought a
faster way to get the job done, giving rise to punch-card-based computers that took up entire
rooms. Today, we carry more computing power on our smartphones than was available in these
early models. The following brief history of computing is a timeline of how computers evolved
from their humble beginnings to the machines of today that surf the Internet, play games and
stream multimedia in addition to crunching numbers.
The following are historical events in the development of the computer:
1623: Wilhelm Schickard designed and constructed the first working mechanical calculator.
1673: Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped
Reckoner. He may be considered the first computer scientist and information theorist, for, among
other reasons, documenting the binary number system.
19th Century
1801: Joseph Marie Jacquard, a weaver and businessman from France, devised a loom that
employed punched wooden cards to automatically weave cloth designs.
1820: Thomas de Colmar launched the mechanical calculator industry when he released his
simplified arithmometer, which was the first calculating machine strong enough and reliable
enough to be used daily in an office environment.
1822: Charles Babbage, a mathematician, invented a steam-powered calculating machine
capable of computing tables of numbers. The “Difference Engine” idea failed owing to a lack of
technology at the time.
1843: The world’s first computer program was written by Ada Lovelace, an English
mathematician. Her notes include a step-by-step method for computing Bernoulli
numbers using Babbage’s machine.
1885: Herman Hollerith invented the tabulator, which used punched cards to process statistical
information; eventually his company became part of IBM.
1890: Herman Hollerith designed a punch card system to tabulate the 1880 U.S. Census,
accomplishing the task in just three years and saving the government $5 million. He established a
company that would ultimately become IBM.
Early 20th Century
1930: The Differential Analyzer, the first large-scale automatic general-purpose mechanical
analog computer, was invented and built by Vannevar Bush.
1936: Alan Turing proposed a universal machine, later called the Turing machine,
which could compute anything that is computable. The central concept of the modern
computer was based on his ideas.
1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts
to build the first computer without gears, cams, belts or shafts.
1937: One hundred years after Babbage's impossible dream, Howard Aiken convinced IBM,
which was making all kinds of punched card equipment and was also in the calculator business,
to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's
Analytical Engine, which itself used cards and a central computing unit. When the machine was
finished, some hailed it as "Babbage's dream come true".
1939: Hewlett-Packard was founded in a garage in Palo Alto, California, by Bill Hewlett and
David Packard.
1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29
equations simultaneously. This marks the first time a computer is able to store information on its
main memory.
1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert,
built the Electronic Numerical Integrator and Calculator (ENIAC). Considered the grandfather of
digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.
1945: University of Pennsylvania academics John Mauchly and J. Presper Eckert create an
Electronic Numerical Integrator and Calculator (ENIAC). It was Turing-complete and capable of
solving “a vast class of numerical problems” by reprogramming, earning it the title of
“Grandfather of computers.” Mauchly and Eckert leave the University of Pennsylvania and
receive funding from the Census Bureau to build the UNIVAC, the first commercial computer
for business and government applications.
1946: Mauchly and Eckert began designing the UNIVAC I (Universal Automatic Computer), the first
general-purpose electronic digital computer built in the United States for commercial applications;
it was completed in 1951.
1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invented the
transistor. They discovered how to make an electric switch with solid materials and no need for a
vacuum.
1949: The Electronic Delay Storage Automatic Calculator (EDSAC), developed by a team at the
University of Cambridge, is the “first practical stored-program computer”.
1950: The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC, and it
was the first stored-program computer completed in the United States.
Late 20th Century
1953: Grace Hopper, a computer scientist, developed the first compiler-based computer language;
her work later formed the basis of COBOL (COmmon Business-Oriented Language). It allowed a
computer user to give the computer instructions in English-like words rather than numbers.
1954: John Backus and a team of IBM programmers created the FORTRAN programming
language, an acronym for FORmula TRANslation. In addition, IBM developed the 650.
1958: The integrated circuit, sometimes known as the computer chip, was created by Jack Kilby
and Robert Noyce. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.
1962: Atlas, the computer, makes its appearance. It was the fastest computer in the world at the
time, and it pioneered the concept of “virtual memory.”
1964: Douglas Engelbart proposes a modern computer prototype that combines a mouse and a
graphical user interface (GUI). This marks the evolution of the computer from a specialized
machine for scientists and mathematicians to technology that is more accessible to the general
public.
1969: Bell Labs developers, led by Ken Thompson and Dennis Ritchie, revealed UNIX, an
operating system developed in the C programming language that addressed program
compatibility difficulties. UNIX was portable across multiple platforms and became the
operating system of choice among mainframes at large companies and government entities. Due
to the slow nature of the system, it never quite gained traction among home PC users.
1970: The Intel 1103, the first Dynamic Random Access Memory (DRAM) chip, is unveiled by Intel.
1971: The floppy disc was invented by Alan Shugart and a team of IBM engineers. In the same
year, Xerox developed the first laser printer, which would go on to generate billions of dollars and
heralded the beginning of a new age in computer printing.
1973: Robert Metcalfe, a member of Xerox’s research department, created Ethernet, which is
used to connect many computers and other gear.
1974: Personal computers were introduced to the market. The first were the Altair, the Scelbi,
the Mark-8, the IBM 5100, and Radio Shack’s TRS-80.
1975: Popular Electronics magazine touted the Altair 8800 as the world’s first minicomputer kit
in January. Paul Allen and Bill Gates offer to build software in the BASIC language for the
Altair.
1976: Apple Computers is founded by Steve Jobs and Steve Wozniak, who introduce the world to
the Apple I, the first computer with a single-circuit board.
1977: At the first West Coast Computer Faire, Jobs and Wozniak announce the Apple II. It has
colour graphics and a cassette drive for storage.
1978: The first computerized spreadsheet program, VisiCalc, is introduced.

1979: WordStar, a word processing tool from MicroPro International, is released.


1981: IBM unveils the Acorn, its first personal computer, which has an Intel CPU, two floppy
drives, and a colour display. The Acorn uses Microsoft's MS-DOS operating system.
1983: The CD-ROM, which could carry 550 megabytes of pre-recorded data, hit the market.
This year also saw the release of the Gavilan SC, the first portable computer with a flip-form
design and the first to be offered as a “laptop.”
1984: Apple launched the Macintosh, announced with a commercial during Super Bowl XVIII. It was
priced at $2,500.
1985: Microsoft introduces Windows, which enables multitasking via a graphical user interface.
In addition, the programming language C++ was released.
1990: Tim Berners-Lee, an English programmer and scientist, creates HyperText Markup
Language, widely known as HTML. He also coined the term “WorldWideWeb”; his project
included the first browser, a server, HTML, and URLs.
1993: The Pentium CPU improves the usage of graphics and music on personal computers.
1995: Microsoft’s Windows 95 operating system was released. A $300 million promotional
campaign was launched to get the news out. Sun Microsystems introduces Java 1.0, followed by
Netscape Communications’ JavaScript.
1996: At Stanford University, Sergey Brin and Larry Page created the Google search engine.
1998: Apple introduces the iMac, an all-in-one Macintosh desktop computer. These PCs cost
$1,300 and came with a 4GB hard drive, 32MB RAM, a CD-ROM, and a 15-inch monitor.
1999: Wi-Fi, an abbreviation for “wireless fidelity,” is created, originally covering a range of up
to 300 feet.
21st Century
2000: USB flash drives are first introduced. They were speedier and had more storage
space than other data storage options of the time.
2001: Apple releases Mac OS X, later renamed OS X and eventually simply macOS, as the
successor to its conventional Mac Operating System.
2003: Customers could purchase AMD’s Athlon 64, the first 64-bit CPU for consumer
computers.
2004: Facebook began as a social networking website.
2005: Google acquires Android, a mobile phone OS based on Linux.
2006: Apple’s MacBook Pro became available. The Pro was the company’s first dual-core, Intel-
based mobile computer. Amazon Web Services, including Amazon Elastic Compute Cloud (EC2) and
Amazon Simple Storage Service (S3), also launched.

2007: The first iPhone was produced by Apple, bringing many computer operations into the
palm of our hands. Amazon also released the Kindle, one of the first electronic reading systems,
in 2007.
2009: Microsoft released Windows 7.
2011: Google introduces the Chromebook, which runs Google Chrome OS.
2014: The University of Michigan Micro Mote (M3), the world’s smallest computer, was
constructed.
2015: Apple introduces the Apple Watch. Windows 10 was also released by Microsoft.
2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been
any quantum-computing platform that had the capability to program new algorithms into their
system. They're usually each tailored to attack a particular algorithm," said study lead author
Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland,
College Park.
2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new
"Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set
of properties that we may be able to harness for rapid, scalable information storage and
processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a
statement.
"Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure
as well as variables such as shape, size, or even color. This richness provides a vast design space
for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of
current logic-based, digital architectures.
1.5 GENERATIONS OF COMPUTER
The history of the computer is often described in terms of generations, from the first
generation to the fifth generation.
In the 19th century the English mathematics professor Charles Babbage, often referred to as the
“Father of the Computer”, designed the Analytical Engine, and it is on this design that the basic
framework of today's computers is based. Generally speaking, computers can be classified into five
generations. Each generation lasted for a certain period of time, and each gave us either a new
and improved computer or an improvement to the existing computer.
The generations of computers are as follows:
1.5.1 First Generation of Computer (1937 – 1946)
In 1937 the first electronic digital computer was built by Dr. John V. Atanasoff and Clifford
Berry. It was called the Atanasoff-Berry Computer (ABC). In 1943 an electronic computer named
the Colossus was built for the military. Other developments continued until, in 1946, the first
general-purpose digital computer, the Electronic Numerical Integrator and Calculator (ENIAC),
was built. It is said that this computer weighed 30 tons and had 18,000 vacuum tubes, which were
used for processing. When this computer was turned on for the first time, lights dimmed in sections of
Philadelphia. Computers of this generation could only perform a single task, and they had no
operating system.

Characteristics of the First Generation Computers:


i. They were as large as the size of a room.
ii. They used vacuum tubes to perform calculations.
iii. They used internally stored instructions called a program.
iv. They used capacitors to store binary data and information.
v. They used punched cards for input and output of data and information.
vi. They generated a lot of heat.
vii. They had about one thousand (1,000) circuits per cubic foot.
Examples:
i. Mark I, developed by Aiken in 1944.
ii. Electronic Numerical Integrator and Calculator (ENIAC), built at the
Moore School of Engineering of the University of Pennsylvania in 1946
by J. Presper Eckert and John Mauchly.
iii. Electronic Discrete Variable Automatic Computer (EDVAC), also
developed in 1947 by Eckert and Mauchly.
1.5.2 Second Generation of Computer (1947 – 1962):
Second-generation computers used transistors instead of vacuum tubes, which made them more
reliable. In 1951 the first computer for commercial use was introduced to the public: the
Universal Automatic Computer (UNIVAC I). In 1953 the International Business Machine (IBM)
650 and 700 series computers made their mark in the computer world. During this generation
over 100 computer programming languages were developed, and computers had memory
and operating systems. Storage media such as tape and disk were in use, as were printers for
output.
Characteristics of Second Generation Computers:
i. The computers were still large, but smaller than the first generation of computers.
ii. They used transistors in place of vacuum tubes to perform calculations.
iii. They were produced at a reduced cost compared to the first generation of computers.
iv. They used magnetic tape for data storage.
v. Punched cards were still used for input and output of data and information, and the keyboard
was introduced as an input device.
vi. These computers still generated a lot of heat, so air conditioning was needed to
maintain a cool temperature.
vii. They had about one thousand circuits per cubic foot.
Examples:
i. Leprechaun, a transistorized computer built by Bell Laboratories.
ii. Transistorized computers produced by Philco, GE and RCA.
iii. UNIVAC 1107, UNIVAC III.
iv. RCA 501.
v. IBM 7030 Stretch.
1.5.3 Third Generation of Computer (1963 – 1975):
The invention of the integrated circuit brought us the third generation of computers. With this
invention computers became smaller, more powerful and more reliable, and they were able to run many
different programs at the same time.
Characteristics of Third Generation Computers:
i. They used large-scale integrated circuits, which were used for both data processing and
storage.
ii. Computers were miniaturized, that is, reduced in size compared to the previous
generation.
iii. The keyboard and mouse were used for input, while the monitor was used as the output device.
iv. Programming languages like COBOL and FORTRAN were in use.
v. They had about one hundred thousand circuits per cubic foot.
Examples:
i. Burroughs 6700; minicomputers
ii. Honeywell 200
iii. IBM system 360
iv. UNIVAC 9000 series.
1.5.4 Fourth Generation of Computer (PC 1975 – Current)
At this point in technological development, the size of the computer was reduced, giving rise to
what we call the Personal Computer (PC). This was when the first microprocessor was created by Intel. The
microprocessor was a very-large-scale integration (VLSI) circuit, which contained thousands
of transistors. The transistors on one chip were capable of performing all the functions of a computer’s
central processing unit.

Characteristics of Fourth Generation Computers:


i. They possess a microprocessor, which performs all the tasks of the computer systems in use today.
ii. The size and cost of computers were reduced.
iii. The speed of computers increased.
iv. Very-large-scale integration (VLSI) circuits were used.
v. They have millions of circuits per cubic foot.
Examples:
i. IBM system 3090, IBM RISC6000, IBM RT.
ii. ILLIAC IV.
iii. Cray-2, Cray X-MP.
iv. HP 9000.
v. Apple Computers.

1.5.5 Fifth Generation of Computers (Present and Beyond)


Fifth-generation computing devices, based on artificial intelligence (AI), are still in
development, although some applications, such as voice recognition, facial detection
and thumbprint recognition, are in use today.

Characteristics of Fifth Generation Computers:


i. They consist of extremely large-scale integration.
ii. Parallel processing.
iii. Possession of high-speed logic and memory chips.
iv. High performance and micro-miniaturization.
v. The ability of computers to mimic human intelligence, e.g. voice recognition, facial detection,
thumbprint recognition.
vi. Satellite links, virtual reality.
vii. They have billions of circuits per cubic foot.
Examples:
i. Supercomputers
ii. Robots
iii. Facial detection systems
iv. Thumbprint recognition systems.
SUMMARY.
The earliest foundations of what would become computer science predate the invention of the
modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have
existed since antiquity, aiding in computations such as multiplication and division. Charles Babbage
is sometimes referred to as the "father of computing", and Ada Lovelace is often credited with
publishing the first algorithm intended for processing on a computer. Indeed, algorithms for
performing computations have existed since antiquity, even before the development of
sophisticated computing equipment. In 1980 the Microsoft Disk Operating System
(MS-DOS) was born, and in 1981 IBM introduced the personal computer (PC) for home and
office use. Three years later Apple gave us the Macintosh computer with its icon-driven interface,
and the 1990s gave us the Windows operating system. As a result of the various improvements in the
development of the computer, we have seen the computer being used in all areas of life. It is a
very useful tool that will continue to experience new development as time passes.
1.6 TYPES OF COMPUTERS
The types of computers are as follows:
i) Analog Computers – Analog computers are built from components such as gears
and levers, with no electrical components. One advantage of analog computation is that
designing and building an analog computer to tackle a specific problem can be quite
straightforward.
ii) Digital Computers – Information in digital computers is represented in discrete form,
typically as sequences of 0s and 1s (binary digits, or bits); a short sketch after this list
illustrates this encoding. A digital computer is a system or device that can process any type of
information in a matter of seconds. Digital computers are categorized into many different types,
as follows:
iii) Mainframe computers – A mainframe is a computer generally utilized by large enterprises for
mission-critical activities such as massive data processing. Mainframe computers were
distinguished by massive storage capacities, quick components, and powerful computational
capabilities. Because they were complicated systems, they were managed by a team of systems
programmers who had sole access to the computer. These machines are now often referred to as
servers rather than mainframes.
iv) Supercomputers – The most powerful computers to date are commonly referred to as
supercomputers. Supercomputers are enormous systems that are purpose-built to solve
complicated scientific and industrial problems. Quantum mechanics, weather forecasting, oil and
gas exploration, molecular modelling, physical simulations, aerodynamics, nuclear fusion
research, and cryptanalysis are all done on supercomputers.
v) Minicomputers – A minicomputer is a type of computer that has many of the same
features and capabilities as a larger computer but is smaller in size. Minicomputers, which were
relatively small and affordable, were often employed in a single department of an organization
and were often dedicated to a specific task or shared by a small group.
vi) Microcomputers – A microcomputer is a small computer that is based on a
microprocessor integrated circuit, often known as a chip. A microcomputer is a system that
incorporates at a minimum a microprocessor, program memory, data memory, and input-output
system (I/O). A microcomputer is now commonly referred to as a personal computer (PC).
vii) Embedded processors – These are miniature computers that control electrical and
mechanical processes with basic microprocessors. Embedded processors are often simple in
design, have limited processing capability and I/O capabilities, and need little power. Ordinary
microprocessors and microcontrollers are the two primary types of embedded processors.
Embedded processors are employed in systems that do not require the computing capability of
traditional devices such as desktop computers, laptop computers, or workstations.
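To make item (ii) above concrete, the short Python sketch below converts the decimal number 13 to its binary (base-2) form and back; everything a digital computer stores is ultimately encoded in such sequences of bits. This is a minimal, hypothetical example.

    # Illustrative sketch of binary (base-2) representation.
    n = 13
    binary = bin(n)          # '0b1101', since 13 = 1*8 + 1*4 + 0*2 + 1*1
    back = int("1101", 2)    # 13, interpreting the digit string in base 2
    print(n, binary, back)   # prints 13 0b1101 13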
2.1 HARDWARE AND SOFTWARE
Hardware and software form the main parts of any computing device. The physical parts of the
computer make up the hardware, while the apps or programs are the software. Hardware is the
'body' of the computer: it is the physical entity. Moreover, not every part
of the computer hardware is visible to us; in fact, many hardware parts are internal. For example,
hardware components such as the motherboard, RAM and CPU are internal. Other examples of
hardware include output devices such as the printer and monitor, input devices such as the keyboard
and mouse, and secondary storage devices such as CDs, DVDs, hard disks, etc., which are all hardware
components of a computer.
Software is the applications and programs of a computer: a set of instructions, written by
software developers, for performing tasks. These instructions are written in a way that a
computer can understand. Software designed for one platform is not necessarily compatible
with another: a program designed for a Linux operating system will only work on Linux systems,
and software designed for Windows 7 may have compatibility issues on Windows 10. A software
developer creates software using a high-level programming language, which is easy for programmers
to read and understand. But a computer cannot understand these instructions directly,
so they must be translated into 'machine language', which the computer understands.
Machine language consists of instructions in binary code, so by the time you are installing software,
its instructions are already in binary form.
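As a rough illustration of this translation (a sketch using Python, whose interpreter compiles source code into lower-level bytecode rather than directly into machine language), the standard dis module can display the low-level instructions behind a single high-level line:

    # Illustrative sketch: viewing the low-level instructions behind high-level code.
    import dis

    def add(a, b):
        return a + b          # one readable, high-level instruction

    dis.dis(add)              # prints the lower-level instructions (e.g. LOAD_FAST)
                              # that the interpreter actually executes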
The hardware and software of a computer are interconnected with each other. Hardware cannot
function without software, as it requires supporting software to run; hence, it is important to
have the relevant software installed on the hardware to get the job done. Likewise, without the
hardware, the software cannot perform the required tasks. Software development is considered
expensive because it needs regular updates and is a continuing process, while hardware
development is initially expensive but is afterwards largely a one-time expense.
Computer hardware on its own can only perform mechanical tasks, while software can perform several
complex tasks, because the software provides the means required to perform them.
Furthermore, there are two types of software:
a) System software
System software helps to run the computer system and its hardware. Examples of system
software include device drivers, operating systems, diagnostic tools and many more. Unlike
application software, system software is usually pre-installed on your PC. System
software is a collection of programs designed to operate, control, and extend the processing
capabilities of the computer itself. System software is generally prepared by the computer
manufacturers. These software products comprise programs written in low-level languages,
which interact with the hardware at a very basic level. System software serves as the interface
between the hardware and the end users. Some examples of system software are operating
systems, compilers, interpreters and assemblers.
b) Application software
Application software performs specific tasks for users. Examples of application software
include web browsers, word processors, and other software that we install on our computers.
Application software products are designed to satisfy a particular need in a particular
environment. All software applications prepared in the computer lab fall under the
category of application software. Application software may consist of a single program, such as
Microsoft's Notepad for writing and editing simple text. It may also consist of a collection of
programs, often called a software package, which work together to accomplish a task, such as a
spreadsheet package.
Examples of application software are the following:
Payroll Software
Student Record Software
Inventory Management Software
Income Tax Software
Railways Reservation Software
Microsoft Office Suite Software
Microsoft Word
Microsoft Excel
Microsoft PowerPoint
