
FACULTY OF SCIENCE

DEPARTMENT OF COMPUTING
2024/2025 ACADEMIC SESSION
FIRST SEMESTER LECTURE MATERIAL
COURSE CODE: CSC 112 CREDIT UNIT: 2 UNITS
COURSE TITLE: INTRODUCTION TO INFORMATION AND COMMUNICATION TECHNOLOGY
DURATION: 2 HOURS CLASS CODE: gkjpo6z LECTURE MATERIAL – WEEK 1 & 2

What is a computer?
A computer is an electronic machine that collects information, stores it, processes it according to
user instructions, and then returns the result.

A computer is a programmable electronic device that performs arithmetic and logical operations
automatically using a set of instructions provided by the user.
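
The definition above amounts to an input-process-output cycle. The short Python sketch below is an added illustration of that cycle (the program and values are invented for this example, not part of any standard):

```python
# Minimal sketch of the cycle in the definition: collect input, store it,
# process it according to instructions, and return the result.

def process(numbers):
    """The stored set of instructions: arithmetic on the collected data."""
    total = sum(numbers)                # arithmetic operation
    average = total / len(numbers)
    return total, average

raw = input("Enter numbers separated by spaces: ")   # collect information
numbers = [float(x) for x in raw.split()]            # store it in memory

if numbers:                                          # logical operation
    total, average = process(numbers)                # process per instructions
    print(f"Sum = {total}, Average = {average}")     # return the result
else:
    print("No numbers entered.")
```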

Early Computing Devices


People used sticks, stones, and bones as counting tools before computers were invented. As technology advanced and human understanding grew, more capable computing devices were produced. Let us look at a few of the early computing devices used by mankind.

1. Abacus

The abacus was invented in China around 4,000 years ago. It is a wooden rack with metal rods on which beads slide. The operator moves the beads according to certain rules to perform arithmetic computations.

2. Napier’s Bones

John Napier devised Napier’s Bones, a manually operated calculating device. It used nine separate ivory strips (the “bones”) marked with numerals to multiply and divide. It was also the first machine to calculate using the decimal point.

3. Pascaline

Pascaline was invented in 1642 by Blaise Pascal, a French mathematician and philosopher. It is thought to be the first mechanical and automatic calculator. It was a wooden box with gears and wheels inside.
4. Stepped Reckoner or Leibniz wheel

In 1673, a German mathematician-philosopher named Gottfried Wilhelm Leibniz improved on Pascal’s invention to create this apparatus. It was a digital mechanical calculator known as the stepped reckoner because it used fluted drums instead of gears.

5. Difference Engine

In the early 1820s, Charles Babbage created the Difference Engine. It was a mechanical computer that could perform basic computations. It was a steam-powered calculating machine used to compute numerical tables, such as logarithmic tables.

6. Analytical Engine

Charles Babbage created another calculating machine, the Analytical Engine, in the 1830s. It was a mechanical computer that took input from punched cards. It was designed to solve any mathematical problem and to store data in its own memory.

7. Tabulating machine

Herman Hollerith, an American statistician, invented this machine in 1890. The Tabulating Machine was a punch-card-based mechanical tabulator that could compute statistics and record and sort data. Hollerith began manufacturing these machines in his own company, which ultimately became International Business Machines (IBM) in 1924.

8. Differential Analyzer

Vannevar Bush introduced the first electrical computer, the Differential Analyzer, in 1930. The machine used vacuum tubes to switch electrical signals in order to perform calculations, and it could carry out 25 calculations in a matter of minutes.

9. Mark I

In 1937, Howard Aiken set out to build a machine that could perform calculations involving enormous numbers. The Mark I computer was completed in 1944 as a collaboration between IBM and Harvard.

History of Computers


The word ‘computer’ has a very interesting origin. It was first used in the 16th century for a
person who used to compute, i.e. do calculations. The word was used in the same sense as a noun
until the 20th century. Women were hired as human computers to carry out all forms of
calculations and computations.
By the last part of the 19th century, the word was also used to describe machines that did
calculations. The modern-day use of the word is generally to describe programmable digital
devices that run on electricity.

Early History of Computer

Humans have used devices to aid calculation for thousands of years. One of the earliest and most well-known was the abacus. Then, in 1822, Charles Babbage, the father of computers, began developing what would be the first mechanical computer. In 1833 he went on to design the Analytical Engine, a general-purpose computer that contained an ALU (arithmetic logic unit), basic flow-control principles, and the concept of integrated memory.

Then more than a century later in the history of computers, we got our first electronic computer
for general purpose. It was the ENIAC, which stands for Electronic Numerical Integrator and
Computer. The inventors of this computer were John W. Mauchly and J. Presper Eckert.

As time went on, the technology developed: computers got smaller and processing got faster. The first laptop was introduced in 1981 by Adam Osborne and EPSON.


Generations of Computers
In the history of computers, we often refer to the advancements of modern computers as the
generation of computers. We are currently on the fifth generation of computers. So let us look at
the important features of these five generations of computers.

 1st Generation: This covers the period from 1940 to 1955, when machine language was developed for use in computers. These machines used vacuum tubes for the circuitry and magnetic drums for memory. They were complicated, large, and expensive, relied mostly on batch operating systems and punch cards, and used magnetic tape and paper tape as input and output devices. Examples: ENIAC, UNIVAC-1, EDVAC, and so on.
 2nd Generation: The years 1957-1963 are referred to as the second generation of computers. Computers advanced from vacuum tubes to transistors, which made them smaller, faster, and more energy-efficient, and they advanced from binary machine language to assembly languages. High-level programming languages such as COBOL and FORTRAN were also introduced. Examples: IBM 1620, IBM 7094, CDC 1604, CDC 3600, and so forth.
 3rd Generation: The hallmark of this period (1964-1971) was the development of the integrated circuit. A single integrated circuit (IC) contains many transistors, which increases the power of a computer while simultaneously lowering its cost. These computers were quicker, smaller, more reliable, and less expensive than their predecessors. High-level programming languages such as FORTRAN II to IV, COBOL, PASCAL, and PL/1 were used. Examples: the IBM 360 series, the Honeywell 6000 series, and the IBM 370/168.
 4th Generation: The invention of the microprocessor brought about the fourth generation of computers, which dominated the years 1971-1980. C, C++, and Java were among the programming languages used in this generation. Examples: the STAR 1000, PDP-11, CRAY-1, CRAY X-MP, and Apple II. This was when computers for home use began to be produced.
 5th Generation: These computers have been in use since 1980 and represent the present and future of the computer world. The defining aspect of this generation is artificial intelligence; the use of parallel processing and superconductors is helping to make this a reality and provides a lot of scope for the future. Fifth-generation computers use ULSI (Ultra Large Scale Integration) technology and are the most recent and sophisticated computers. Programming languages such as C, C++, Java, and .NET are used. Examples: modern desktops, laptops, notebooks, and ultrabooks.

Brief History of Computers


The naive understanding of computation had to be overcome before the true power of computing
could be realized. The inventors who worked tirelessly to bring the computer into the world had
to realize that what they were creating was more than just a number cruncher or a calculator.
They had to address all of the difficulties associated with inventing such a machine,
implementing the design, and actually building the thing. The history of the computer is the
history of these difficulties being solved.

19th Century

1801 – Joseph Marie Jacquard, a weaver and businessman from France, devised a loom that
employed punched wooden cards to automatically weave cloth designs.

1822 – Charles Babbage, a mathematician, invented a steam-powered calculating machine capable of computing number tables. The “Difference Engine” project failed owing to a lack of technology at the time.

1843 – The world’s first computer program was written by Ada Lovelace, an English mathematician. Lovelace also included a step-by-step description of how to compute Bernoulli numbers using Babbage’s machine.

1890 – Herman Hollerith, an inventor, creates the punch-card technique used to tabulate the 1890 U.S. census. He would go on to start the corporation that would become IBM.

Early 20th Century

1930 – Differential Analyzer was the first large-scale automatic general-purpose mechanical
analogue computer invented and built by Vannevar Bush.

1936 – Alan Turing had an idea for a universal machine, which he called the Turing machine,
that could compute anything that could be computed.

1939 – Hewlett-Packard was founded in a garage in Palo Alto, California by Bill Hewlett and David Packard.

1941 – Konrad Zuse, a German inventor and engineer, completed his Z3 machine, the world’s first programmable, fully automatic digital computer. The machine was destroyed during a World War II bombing raid on Berlin.

1941 – J.V. Atanasoff and graduate student Clifford Berry devise a computer capable of solving 29 equations simultaneously. It was the first time a computer could store data in its main memory.

1945 – University of Pennsylvania academics John Mauchly and J. Presper Eckert create the Electronic Numerical Integrator and Computer (ENIAC). It was Turing-complete and capable of solving “a vast class of numerical problems” by reprogramming, earning it the title of “Grandfather of computers.”

1946 – Design work begins on the UNIVAC I (Universal Automatic Computer), the first general-purpose electronic digital computer built in the United States for business applications; it was delivered in 1951.
1949 – The Electronic Delay Storage Automatic Calculator (EDSAC), developed by a team at
the University of Cambridge, is the “first practical stored-program computer.”

1950 – The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC, and it
was the first stored-program computer completed in the United States.

Late 20th Century

1953 – Grace Hopper, a computer scientist, develops the first English-like data-processing language; her work later leads to COBOL (COmmon Business-Oriented Language). It allowed a computer user to give the computer instructions in English-like words rather than numbers.

1954 – John Backus and a team of IBM programmers created the FORTRAN programming
language, an acronym for FORmula TRANslation. In addition, IBM developed the 650.

1958 – The integrated circuit, sometimes known as the computer chip, was created by Jack Kilby and Robert Noyce.

1962 – Atlas, the computer, makes its appearance. It was the fastest computer in the world at the
time, and it pioneered the concept of “virtual memory.”

1964 – Douglas Engelbart proposes a modern computer prototype that combines a mouse and a
graphical user interface (GUI).

1969 – Bell Labs developers, led by Ken Thompson and Dennis Ritchie, unveiled UNIX, an operating system (later rewritten in the C programming language) that addressed program compatibility difficulties.

1970 – The Intel 1103, the first Dynamic Random Access Memory (DRAM) chip, is unveiled by Intel.

1971 – The floppy disc was invented by Alan Shugart and a team of IBM engineers. In the same
year, Xerox developed the first laser printer, which not only produced billions of dollars but also
heralded the beginning of a new age in computer printing.

1973 – Robert Metcalfe, a member of Xerox’s research staff, created Ethernet, which is used to connect multiple computers and other hardware.

1974 – Personal computers arrived on the market. Among the first were the Scelbi, the Mark-8, the Altair, the IBM 5100, and Radio Shack’s TRS-80.

1975 – Popular Electronics magazine touted the Altair 8800 as the world’s first minicomputer kit
in January. Paul Allen and Bill Gates offer to build software in the BASIC language for the
Altair.

1976 – Apple Computers is founded by Steve Jobs and Steve Wozniak, who introduce the world to the Apple I, the first computer with a single circuit board.
1977 – At the first West Coast Computer Faire, Jobs and Wozniak announce the Apple II. It has colour graphics and a cassette interface for storage.

1978 – The first computerized spreadsheet program, VisiCalc, is introduced.

1979 – WordStar, a word processing tool from MicroPro International, is released.

1981 – IBM unveils its first personal computer, code-named Acorn, which has an Intel CPU, two floppy drives, and a colour display. It runs Microsoft’s MS-DOS operating system.

1983 – The CD-ROM, which could carry 550 megabytes of pre-recorded data, hit the market.
This year also saw the release of the Gavilan SC, the first portable computer with a flip-form
design and the first to be offered as a “laptop.”

1984 – Apple launched the Macintosh, announced in a Super Bowl XVIII commercial. It was priced at $2,500.

1985 – Microsoft introduces Windows, which enables multitasking via a graphical user interface. In the same period, the programming language C++ was released.

1990 – Tim Berners-Lee, an English programmer and scientist, creates HyperText Markup Language, widely known as HTML. He also coined the term “WorldWideWeb”; his project included the first browser, a server, HTML, and URLs.

1993 – The Pentium CPU improves the usage of graphics and music on personal computers.

1995 – Microsoft’s Windows 95 operating system was released. A $300 million promotional
campaign was launched to get the news out. Sun Microsystems introduces Java 1.0, followed by
Netscape Communications’ JavaScript.

1996 – At Stanford University, Sergey Brin and Larry Page created the Google search engine.

1998 – Apple introduces the iMac, an all-in-one Macintosh desktop computer. These PCs cost
$1,300 and came with a 4GB hard drive, 32MB RAM, a CD-ROM, and a 15-inch monitor.

1999 – Wi-Fi, an abbreviation for “wireless fidelity,” is created, originally covering a range of up
to 300 feet.

21st Century

2000 – The USB flash drive is introduced. Flash drives were faster and offered more storage space than other portable storage options of the time.

2001 – Apple releases Mac OS X, later renamed OS X and eventually simply macOS, as the
successor to its conventional Mac Operating System.
2003 – Customers could purchase AMD’s Athlon 64, the first 64-bit CPU for consumer
computers.

2004 – Facebook began as a social networking website.

2005 – Google acquires Android, a mobile phone OS based on Linux.

2006 – Apple’s MacBook Pro became available. The Pro was the company’s first dual-core, Intel-based mobile computer.

Amazon Web Services, including Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3), was also launched.

2007 – The first iPhone was produced by Apple, bringing many computer operations into the
palm of our hands. Amazon also released the Kindle, one of the first electronic reading systems,
in 2007.

2009 – Microsoft released Windows 7.

2011 – Google introduces the Chromebook, which runs Google Chrome OS.

2014 – The University of Michigan Micro Mote (M3), the world’s smallest computer, was
constructed.

2015 – Apple introduces the Apple Watch. Windows 10 was also released by Microsoft.

2016 – The world’s first reprogrammable quantum computer is built.

Types of Computers
1. Analog Computers – Analog computers are built with components such as gears and levers, with no electrical components. One advantage of analog computation is that designing and building an analog computer to tackle a specific problem can be quite straightforward.
2. Digital Computers – Information in digital computers is represented in discrete form, typically as sequences of 0s and 1s (binary digits, or bits). A digital computer is a system or device that can process any type of information in a matter of seconds. Digital computers are categorized into many different types, as follows (a short example of binary representation appears after this list):
a. Mainframe computers – It is a computer that is generally utilized by large
enterprises for mission-critical activities such as massive data processing.
Mainframe computers were distinguished by massive storage capacities, quick
components, and powerful computational capabilities. Because they were
complicated systems, they were managed by a team of systems programmers who
had sole access to the computer. These machines are now referred to as servers
rather than mainframes.
b. Supercomputers – The most powerful computers to date are commonly referred
to as supercomputers. Supercomputers are enormous systems that are purpose-
built to solve complicated scientific and industrial problems. Quantum mechanics,
weather forecasting, oil and gas exploration, molecular modelling, physical
simulations, aerodynamics, nuclear fusion research, and cryptanalysis are all
done on supercomputers.
c. Minicomputers – A minicomputer is a type of computer that has many of the
same features and capabilities as a larger computer but is smaller in size.
Minicomputers, which were relatively small and affordable, were often employed
in a single department of an organization and were often dedicated to a specific
task or shared by a small group.
d. Microcomputers – A microcomputer is a small computer that is based on a
microprocessor integrated circuit, often known as a chip. A microcomputer is a
system that incorporates at a minimum a microprocessor, program memory, data
memory, and input-output system (I/O). A microcomputer is now commonly
referred to as a personal computer (PC).
e. Embedded processors – These are miniature computers that control electrical
and mechanical processes with basic microprocessors. Embedded processors are
often simple in design, have limited processing capability and I/O capabilities,
and need little power. Ordinary microprocessors and microcontrollers are the two
primary types of embedded processors. Embedded processors are employed in
systems that do not require the computing capability of traditional devices such as
desktop computers, laptop computers, or workstations.
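
To make the idea of binary representation concrete, here is a small illustrative Python snippet added for this material (it uses only built-in functions):

```python
# Digital computers represent information as binary digits (bits).
# This snippet shows the same value in decimal and binary form.

value = 42
print(bin(value))        # '0b101010' -- the bit pattern for decimal 42
print(int("101010", 2))  # 42 -- converting the bit string back to decimal

# Text is stored as bits too: each character maps to a numeric code point.
for ch in "Hi":
    print(ch, "->", ord(ch), "->", bin(ord(ch)))
```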

Elements of a Computer System


There are six main elements that make up a computer system. They all interact with each other
and perform the task at hand. Let us take a look at all of them.

1] Hardware

These are all the physical aspects of a computer system. They are tangible, i.e. you can see and touch them. Hardware components are electronic or mechanical instruments, such as the keyboard, monitor, and printer. They help users interface with the software and also display the result of the tasks being performed.

Hardware can be of four types, depending on the function performed. The four types of hardware are:

 Input Hardware: For users to input data into the computer system. Examples: keyboard, mouse, scanner.
 Output Hardware: To translate and display the result of the data processing. Examples: monitor screen, printer, etc.
 Processing and Memory Hardware: Where data and information are processed and manipulated to perform the task at hand. It is also the workspace of the computer, where it temporarily stores data. Examples: Central Processing Unit (CPU), Random Access Memory (RAM).
 Secondary Storage Hardware: Where the computer system stores data permanently. Examples: hard disk, pen drive, etc.

2] Software

Software is nothing but a set of programmes (computer instructions), which helps the user to do
a set of specific tasks. It helps the user interact with the computer system with the help of
hardware. Software, as you can imagine, is the intangible aspect of the computer system.

Basically, there are six main types of software, which are as follows,

 Operating System: These specialized programmes allow communication between software and hardware. The operating system runs all the other computer programmes and even regulates the startup process of the computer. Examples: Windows XP, macOS, etc.
 Application Software: These are designed to perform a specific task or a bunch of tasks.
They can be user-designed (specific to the user’s needs) or readymade application
software. Example: PowerPoint, Tally etc.
 Utility Software: Like the operating system, utility software is system software. It helps maintain and protect the computer system. For example, anti-virus software is utility software.
 Language Processors: Software that interprets computer language and translates it into machine language. It also checks for errors in language syntax and reports them (see the sketch after this list).
 System Software: This type of software controls the hardware, the reading of data, and other such internal functions.
 Connectivity Software: The special software that facilitates the connection between the computer system and the server. This allows computers to share information and communicate with each other.
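
To illustrate the “Language Processors” entry above: CPython, the standard Python implementation, first translates source code into bytecode, and the standard-library dis module can display that translation. This is an added sketch, not part of the original notes:

```python
# A language processor at work: CPython compiles Python source to bytecode.
import dis

def add(a, b):
    return a + b

dis.dis(add)   # prints the bytecode instructions the interpreter executes

# A language processor also checks syntax and reports errors:
try:
    compile("x = = 1", "<example>", "exec")
except SyntaxError as err:
    print("Syntax error detected:", err)
```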

3] People

The people interacting with the computer system are also an element of it. We call this element
the Liveware. They are the ultimate “users” of the computer systems. There are three types of
people that interact with the system, namely

 Programmers: Professionals who write the computer programs that allow users to interact
with the computer. They must have technical knowledge of computers and computer
languages.
 System Analysts: They mainly design data processing systems and solve problems that arise in data processing.
 End-Users: Also known as operators, they are the people who interact with the computer
system.

4] Procedures

These are sets of instructions, written in code, that tell a computer how to perform a task, run software, do calculations, etc. There are three types of procedures in a computer system. They are:

 Hardware-Oriented Procedure: Instructs the hardware components of the system and ensures they work smoothly.
 Software Oriented Procedure: Provides instructions to launch and run software programs.
 Internal Procedures: Directs the flow of information and sequences the data.

5] Data

Data is essentially the raw facts and figures that we input in the computer. The data gets
processed via the computer system and becomes information, which is processed and organized
data. Information can then be used for decision-making purposes.

The measurement of data is done in terms of “bytes”. One kilobyte (KB) is approximately 1,000 bytes, one megabyte (MB) is approximately one million bytes, and one gigabyte (GB) is approximately one billion bytes.
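
A short worked example of this unit arithmetic, added as an illustration (note that in practice operating systems often report sizes using powers of two, e.g. 1 KiB = 1,024 bytes):

```python
# Worked example of the approximate decimal units given in the text.

KB = 1_000          # ~one thousand bytes
MB = 1_000 * KB     # ~one million bytes
GB = 1_000 * MB     # ~one billion bytes

document = 250 * KB   # a 250 KB document (illustrative size)
photo = 3 * MB        # a 3 MB photo
drive = 500 * GB      # a 500 GB drive

print(drive // photo, "photos of this size fit on the drive")  # 166666
print(document / MB, "MB per document")                        # 0.25
```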

6] Connectivity

This is when the computers are linked to a network. It facilitates sharing of information, files,
and other facilities. Computers can connect to a network via LAN cables, Bluetooth, Wi-Fi,
satellites etc. The internet is the most obvious example of connectivity in a computer system.

What is software?
Software is a set of instructions, data or programs used to operate computers and execute specific
tasks. It is the opposite of hardware, which describes the physical aspects of a computer.
Software is a generic term used to refer to applications, scripts and programs that run on a
device. It can be thought of as the variable part of a computer, while hardware is the invariable
part.

The two main categories of software are application software and system software. An
application is software that fulfills a specific need or performs tasks. System software is designed
to run a computer's hardware and provides a platform for applications to run on top of.

Other types of software include the following:

 Programming software, which provides the programming tools software developers need.
 Middleware, which sits between system software and applications.
 Driver software, which operates computer devices and peripherals.

Early software was written for specific computers and sold with the hardware it ran on. In the
1980s, software began to be sold on floppy disks and, later, CDs and DVDs. Today, most
software is purchased and directly downloaded over the internet. Software can be found on
vendor and application service provider websites.
Examples and types of software
Among the various categories of software, the most common types include the following:

Application software. The most frequently used software is application software, which is a computer software package that performs a specific function for a user or, in some cases, for another application.
 An application can be self-contained, or it can be a group of programs that run the
application for the user.
 Examples of modern applications include office suites, graphics software, databases,
database management programs, web browsers, word processors, software development
tools, image editors and communication platforms.

System software. These software programs are designed to run a computer's application
programs and hardware.

 System software coordinates the activities and functions of the hardware and software.
 In addition, it controls the operations of the computer hardware and provides an
environment or platform for all the other types of software to work in.
 An operating system (OS) is the best example of system software; it manages all the
other computer programs.
 Other examples of system software include firmware, computer language translators and
system utilities.
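
As an added illustration of system software providing services to applications, the following Python snippet asks the operating system (through the standard library) for information a program could not know on its own:

```python
# An application relying on system software: the OS answers these queries.
import os
import platform

print("Operating system :", platform.system(), platform.release())
print("Processor        :", platform.machine())
print("Working directory:", os.getcwd())  # tracked by the OS per process
```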

Driver software. Also known as device drivers, this software is often considered a type of
system software.

 Device drivers control the devices and peripherals connected to a computer, helping them
perform their specific tasks.
 Every device that's connected to a computer needs at least one device driver to function.
 Examples include software that comes with any nonstandard hardware, including special
game controllers, as well as the software that enables standard hardware, such as USB
storage devices, keyboards, headphones and printers.

Middleware. The term middleware describes software that mediates between application and
system software or between two different kinds of application software.

 For example, middleware lets Microsoft Windows talk to Excel and Word.
 It's used to send a remote work request from an application in a computer that has one
kind of OS to an application in a computer with a different OS.
 It also lets newer applications work with legacy ones.

Programming software. Computer programmers use programming software to write code.


 Programming software and programming languages, such as Java or Python, let
developers develop, write, test and debug other software programs.
 Examples of programming software include assemblers, compilers, debuggers and
interpreters.

A full software stack includes many components, ranging from hardware to operating system
services to the application.

How does software work?


All software provides the directions and data computers need to work and meet users' needs.
However, the two different types -- application software and system software -- work in
distinctly different ways.

Application software

 Application software consists of many programs that perform specific functions for end
users, such as writing reports and navigating websites.
 Applications also perform tasks for other applications.
 Applications on a computer can't run on their own; they require a computer's OS along
with other supporting system software programs to work.
 These desktop applications are installed on a user's computer and use the computer
memory to carry out tasks. They take up space on the computer's hard drive and don't
need an internet connection to work.

 However, desktop applications must adhere to the requirements of the hardware devices
they run on.

 Web applications, on the other hand, do require internet access to work, but they don't
rely on the hardware and system software to run.
 Consequently, users can launch web applications from devices that have a web browser.
Since the components responsible for the application functionality are on the server, users
can launch the app from Windows, Mac, Linux or any other OS.
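
As an added sketch of the client/server split described above, this Python program uses only the standard library to serve a page that any device with a web browser can open; the port number is an arbitrary choice for the example:

```python
# Minimal web application: the functionality lives on the server, and any
# OS with a browser can use it by visiting http://localhost:8000
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<h1>Hello from the server</h1>"   # the app logic runs here
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), HelloHandler).serve_forever()
```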

System software

 System software sits between the computer hardware and the application software.
 Users don't interact directly with system software as it runs in the background, handling
the basic functions of the computer.
 This software coordinates a system's hardware and software so users can run high-level
application software to perform specific actions.
 System software executes when a computer system boots up and continues running as
long as the system is on.

System software and application software differ in some key ways.


Design and implementation of software
The software development lifecycle is a framework that project managers use to describe the
stages and tasks associated with designing software. The first steps in the design lifecycle are
planning the effort, then analyzing the needs of the individuals who will use the software and
creating detailed requirements. After the initial requirements analysis, the design phase aims to
specify how to fulfill those user requirements.

The next step is implementation, where development work is completed, and then software testing happens. The maintenance phase involves any tasks required to keep the system running.

Software design includes a description of the structure of the software that will be implemented,
data models, interfaces between system components and potentially the algorithms the software
engineer will use.

The software design process transforms user requirements into a form that computer
programmers can use to do the software coding and implementation. Software engineers develop
the software design iteratively, adding detail and correcting the design as they develop it.

The different types of software design include the following:

 Architectural design. This is the foundational design, which identifies the overall
structure of the system, its main components and their relationships with one another
using architectural design tools.
 High-level design. This is the second layer of design that focuses on how the system, along with all its components, can be implemented in the form of modules supported by a software stack. A high-level design describes the relationships between data flow and the various modules and functions of the system.
 Detailed design. This third layer of design focuses on all the implementation details
necessary for the specified architecture.
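
As an added, hypothetical sketch of what a detailed design might pin down before implementation, the following Python fragment fixes a data model and a component interface; all names here are invented for the example:

```python
# A detailed design expressed as code: a data model plus the interface a
# component must satisfy, written before any implementation exists.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Optional

@dataclass
class Student:                     # data model from the design document
    student_id: str
    name: str

class StudentRepository(ABC):      # interface between system components
    @abstractmethod
    def save(self, student: Student) -> None:
        """Persist a student record."""

    @abstractmethod
    def find(self, student_id: str) -> Optional[Student]:
        """Return the matching student, or None if absent."""
```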

How to maintain software quality


Software quality measures if the software meets both its functional and nonfunctional
requirements.

Functional requirements identify what the software should do. They include technical details,
data manipulation and processing, calculations and any other function that specifies what an
application aims to accomplish.

Nonfunctional requirements, also known as quality attributes, determine how the system should
work. Nonfunctional requirements include portability, disaster recovery, security, privacy and
usability.

Software testing detects and solves technical issues in the software source code and assesses the
overall usability, performance, security and compatibility of the product to ensure it meets its
requirements.
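
As an added illustration of testing against a functional requirement, the following sketch uses Python's built-in unittest module; the requirement and function are hypothetical:

```python
# Testing a functional requirement: "return the price reduced by percent %".
import unittest

def apply_discount(price, percent):
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

class TestApplyDiscount(unittest.TestCase):
    def test_normal_discount(self):
        self.assertAlmostEqual(apply_discount(200.0, 10), 180.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(200.0, 150)

if __name__ == "__main__":
    unittest.main()
```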

The dimensions of software quality include the following characteristics:

 Accessibility. This is the degree to which a diverse group of people, including individuals
who require adaptive technologies such as voice recognition and screen magnifiers, can
comfortably use the software.
 Compatibility. This is the suitability of the software for use in a variety of environments.
Software compatibility is important for different OSes, devices and browsers.
 Efficiency. This is the ability of the software to perform well without wasting energy,
resources, effort, time or money.
 Functionality. This is software's ability to carry out its specified functions.
 Installation. This is the ability of the software to be installed in a specified environment.
 Localization. For software to function correctly, it often needs localization, which entails supporting the various languages, time zones, and other features a software program must work in.
 Maintainability. This is how easily the software can be modified to add and improve
features and fix bugs.
 Performance. This is how fast the software performs under a specific load.
 Portability. This is the ease with which the software can be transferred from one location
to another.
 Reliability. This is the software's ability to perform a required function under specific
conditions and for a defined period without any errors.
 Scalability. A software's ability to increase or decrease performance in response to
changes in its processing demands is its scalability.
 Security. This is the software's ability to protect against unauthorized access, invasion of
privacy, theft, data loss and malicious software.
 Testability. This is how easy it is to test the software.
 Usability. This is how easy it is to use the software.

To maintain software quality once it's deployed, developers must constantly adapt it to meet new
customer requirements and handle problems customers identify. This includes improving
functionality, fixing bugs and adjusting software code to prevent issues. How long a product lasts
on the market depends on developers' ability to keep up with these maintenance requirements.

When it comes to maintenance approaches, there are four types of changes developers can make,
including the following:

 Corrective. Users often identify and report bugs that developers must fix, including
coding errors and other problems that keep the software from meeting its requirements.
 Adaptive. Developers must regularly make changes to their software to ensure it's
compatible with changing hardware and software environments, such as when a new
version of the OS comes out.
 Perfective. These are changes that improve system functionality, such as improving the
user interface or adjusting software code to enhance performance.
 Preventive. These changes keep software from failing and include tasks such as
restructuring and optimizing code.

Modern software development

DevOps is an organizational approach that brings together software development and IT operations teams. It promotes communication and collaboration between these two groups. The term also describes the use of iterative software development practices that rely on automation and programmable infrastructure.

Software licensing and patents


A software license is a legally binding document that restricts the use and distribution of
software.

Typically, software licenses provide users with the right to one or more copies of the software
without violating copyright. The license outlines the responsibilities of the parties that enter into
the agreement and might place restrictions on how the software is used.

Software licensing terms and conditions generally include fair use of the software, the limitations
of liability, warranties, disclaimers and protections if the software or its use infringes on the
intellectual property rights of others.

Licenses typically are for proprietary software, which remains the property of the organization,
group or individual that created it. They are also used for free software, where users can run,
study, change and distribute the software. Open source is a type of software that's developed
collaboratively, and the source code is freely available. With open source software licenses, users
can run, copy, share and change the software similar to free software.

Over the last two decades, software vendors have moved away from selling software licenses on
a one-time basis. Instead, they offer a software as a service (SaaS) subscription model. Software
vendors host the software in the cloud and make it available to customers, who pay a
subscription fee and access the software over the internet.

Although copyright can prevent others from copying a developer's code, a copyright can't stop
them from developing the same software independently without copying. However, a patent
stops another person from using the functional aspects of the software a developer claims in a
patent, even if that second person developed the software independently.

In general, the more technical software is, the more likely it can be patented. For example, a
software product could be granted a patent if it creates a new kind of database structure or
enhances the overall performance and function of a computer.

The history of software


The term software wasn't used until the late 1950s. During this time, although different types of
programming software were being created, they weren't typically commercially available.
Consequently, users -- mostly computer science experts and large enterprises -- often had to write
their own software.

The following is a brief timeline of the history of software:

 June 21, 1948. Tom Kilburn, a computer scientist, writes the world's first piece of
software for the Manchester Baby computer at the University of Manchester in England.
 Early 1950s. General Motors creates the first OS, for the IBM 701 Electronic Data
Processing Machine. It is called General Motors Operating System, or GM OS.
 1958. Statistician John Tukey coins the word software in an article about computer
programming.
 Late 1960s. Floppy disks are introduced and used through the 1990s to distribute
software.
 Nov. 3, 1971. AT&T releases the first edition of the Unix OS.
 1977. Apple releases the Apple II and consumer software takes off.
 1979. VisiCorp releases VisiCalc for the Apple II, the first spreadsheet software for
personal computers.
 1981. Microsoft releases MS-DOS, the OS on which many of the early IBM computers
ran. IBM begins selling software, and commercial software becomes available to the
average consumer.
 1980s. Hard drives become standard on PCs, and manufacturers start bundling software
in computers.
 1983. The free software movement is launched with Richard Stallman’s GNU (GNU’s Not Unix) project to create a Unix-like OS with source code that can be freely copied, modified and distributed.
 1984. Mac OS is released to run Apple's Macintosh line.
 Mid-1980s. Key software applications, including AutoDesk AutoCAD, Microsoft Word
and Microsoft Excel, are released.
 1985. Microsoft Windows 1.0 is released.
 1989. CD-ROMs become standard and hold much more data than floppy disks. Large
software programs can be distributed quickly, easily and relatively inexpensively.
 1991. The Linux kernel, the basis for the open source Linux OS, is released.
 1997. DVDs are introduced and can hold more data than CDs, making it possible to put
bundles of programs, such as Microsoft Office Suite, onto one disk.
 1999. Salesforce.com uses cloud computing to pioneer software delivery over the
internet.
 2000. SaaS technology starts to be used.
 2007. The iPhone is launched and mobile applications emerge.
 2010s. DVDs become obsolete as users buy and download software from the internet and
the cloud. Vendors move to subscription-based SaaS models.
 2020s. Generative artificial intelligence as well as other AI and machine learning
capabilities are increasingly added to software platforms.

Future of software
The future of software development and applications will be a continuation of current trends.
The focus will be on tools to simplify application development and make software user-friendly
for nontechnical consumers, accessible from any device and able to process large data volumes.
Some of the technologies involved include the following:

 AI and machine learning. This will provide software users with new capabilities, like
generating original text and images through generative AI, analyzing and visualizing data
spreadsheets, and automating workflows.
 Sustainable development. This will increase in importance as more attention is brought to the environmental impact of compute-intensive resources. Software developers will look for ways to reduce electricity use, for instance.
 Quantum computing. Though not yet generally available, quantum computing promises to process vast quantities of data faster than traditional computers and will expand into software.
 Low- and no-code. This technology will expand to help users with little or no technical
experience to customize their own apps and software functionalities.
 Cybersecurity. As hackers become more sophisticated in their techniques, developers will need deep cybersecurity knowledge to counter them.
 Microservices. Developers increasingly break software into small, independent microservices to make it more efficient. Microservices are typically packaged in containers during development but are presented to users as parts of a single application.
