COS101 Material

The document outlines a comprehensive course on computing, covering its history, basic components, input/output devices, and the evolution of technology. It details the progression from early counting tools to modern computing devices, highlighting key inventions and the development of hardware and software. Additionally, it explains the fundamental components of a computer, including the CPU, memory, and peripherals, while discussing the significance of the Internet and future trends in computing.

COURSE OUTLINE

1. Brief history of computing.
2. Description of the basic components of a computer/computing device.
3. Input/Output devices and peripherals.
4. Hardware, software, and humanware.
5. Diverse and growing computer/digital applications.
6. Information processing and its roles in society.
7. The Internet, its applications, and its impact on the world today.
8. The different areas/programs of the computing discipline.
9. The job specializations for computing professionals.
10. The future of computing.
CHAPTER ONE
Brief history of computing.
The history of computing is a vast and evolving narrative spanning centuries of innovation: a
continuous progression from basic counting tools to complex digital ecosystems that has
profoundly shaped every aspect of modern life.

Early Beginnings:
The journey began with the abacus, a simple counting device dating back to around 3000 BC and
widely considered the earliest computing device, used for basic arithmetic by ancient
civilizations. Over time, tools such as the counting board, Napier’s bones, and the slide rule
were developed to aid more complex calculations.

Early Computer Inventions

❖ 1623: Wilhelm Schickard invents the "Calculating Clock," a mechanical calculator that could
add and subtract directly and assist with multiplication and division.
❖​ 1642: Blaise Pascal develops the Pascaline, a mechanical calculator that could perform
addition and subtraction directly.
❖​ 1673: Gottfried Wilhelm Leibniz creates the Stepped Reckoner, a more advanced
mechanical calculator that could also multiply and divide.

The Industrial and Pre-Digital Era


Charles Babbage and Ada Lovelace: In the 19th century, Charles Babbage conceptualized the
Difference Engine and the more ambitious Analytical Engine, considered the first designs for a
programmable computer. Ada Lovelace, often regarded as the first computer programmer,
recognized the potential of these machines beyond mere calculation.
The Industrial and Pre-Digital Era Inventions:

❖​ 1801: Joseph Marie Jacquard invents a loom that uses punched cards to control patterns,
demonstrating the concept of programmable machines.
❖​ 1822: Charles Babbage designs the Difference Engine, a mechanical computer intended
to calculate tables of numbers.
❖​ 1837: Charles Babbage conceives the Analytical Engine, a general-purpose
programmable mechanical computer, considered by many to be the first concept for a
computer as we know it today. Ada Lovelace, considered the first computer programmer,
writes an algorithm for the Analytical Engine.
❖​ 1890: Herman Hollerith develops a punch card-based tabulating machine for the 1890 US
Census, significantly speeding up data processing. His company would eventually
become IBM.

The Birth of Modern Computing


The 20th century brought rapid advancements. During World War II, machines like the British
Colossus and the American ENIAC (Electronic Numerical Integrator and Computer) emerged.
These early electronic computers, using vacuum tubes, were massive machines designed for
specific, often military-related tasks. Pioneers such as Alan Turing laid the groundwork for
modern computer science by formalizing the concept of algorithms and computation with the
Turing machine.

20th-Century Innovations

❖​ 1930s: The first analog computers are developed, such as the Differential Analyzer by
Vannevar Bush.
❖​ 1936: Alan Turing introduces the concept of a Turing machine, a theoretical model of
computation that lays the foundation for computer science.
❖​ 1937: John Vincent Atanasoff and Clifford Berry begin developing the Atanasoff-Berry
Computer (ABC), considered by some to be the first electronic digital computer.
❖​ 1941: Konrad Zuse completes the Z3, the first fully automatic, programmable digital
computer.
❖​ 1944: The Harvard Mark I, an electromechanical computer, is completed.
❖​ 1946: ENIAC (Electronic Numerical Integrator and Computer), the first general-purpose
electronic digital computer, is unveiled. It is massive and requires significant power.
❖ 1948: The Manchester Baby, the first stored-program computer, runs its first program.

The Evolution of Hardware and Software
Transistors and Integrated Circuits: The invention of the transistor in the late 1940s and its
subsequent miniaturization into integrated circuits during the 1960s revolutionized computing,
making machines smaller, faster, and more reliable. The development of the microprocessor in
the early 1970s set the stage for personal computing. The late 1970s and early 1980s saw the rise
of home computers, with companies like Apple, IBM, and Commodore leading the way.​
Some inventions in this era include:
❖​ 1950s:
●​ The invention of the transistor revolutionizes computing, leading to smaller,
faster, and more reliable computers.
●​ Early programming languages like FORTRAN are developed, making it easier to
write software.
❖​ 1960s:
●​ The integrated circuit (microchip) is invented, further miniaturizing computers
and increasing their power.
●​ The development of operating systems like UNIX enables multitasking and more
efficient use of computer resources.
❖​ 1970s:
●​ The microprocessor is invented, leading to the development of the first personal
computers (PCs).
●​ Companies like Apple, Commodore, and IBM release popular home computers,
making computing accessible to individuals.
❖​ 1980s:
●​ The IBM PC becomes the standard for business computing, and the rise of
software companies like Microsoft further fuels the PC revolution.
●​ The internet begins to grow, connecting computers worldwide.
❖​ 1990s:
●​ The World Wide Web is created, making the internet accessible to the general
public and transforming communication and information sharing.
●​ Personal computers become ubiquitous in homes and offices.
The Digital Age

The Internet and Networking: The latter part of the 20th century was defined by the advent of
networking and the internet, transforming computers from isolated machines into globally
connected devices. Advances in operating systems and software design made computers more
accessible to a broader audience, while the development of graphical user interfaces (GUIs)
improved user interaction.

GENERATIONS OF COMPUTERS

Computers have gone through many changes over time. Computing devices evolved over a long
period, from early mechanical calculators onwards, continuously improving in speed, accuracy,
size, and price to become the modern-day computer. The distinct phases of this long evolution
are known as computer generations, and five generations are commonly recognised. The first
generation of computers was developed from 1940 to 1956, followed by the second generation
from 1956 to 1963, the third generation from 1964 to 1971, and the fourth generation from 1971
to the present, while the fifth generation is still being developed.

1st Generation (1940s - 1950s)

❖​ Key Technology: Vacuum tubes


❖​ Characteristics:
●​ Large, bulky, and generated a lot of heat
●​ Consumed a lot of power
●​ Slow processing speeds
●​ Used machine language for programming
●​ Examples: ENIAC, UNIVAC
2nd Generation (1950s - 1960s)

❖​ Key Technology: Transistors


❖​ Characteristics:
●​ Smaller, faster, and more reliable than 1st-generation computers
●​ Less heat generation and power consumption
●​ Used assembly language for programming
●​ Examples: IBM 1401, IBM 7090

3rd Generation (1960s - 1970s)

❖​ Key Technology: Integrated circuits (ICs)


❖​ Characteristics:
●​ Even smaller and faster than 2nd-generation computers
●​ Lower power consumption and increased reliability
●​ Introduction of operating systems and high-level programming languages
●​ Examples: IBM System/360, DEC PDP-11

4th Generation (1970s - Present)

❖​ Key Technology: Microprocessors


❖​ Characteristics:
●​ Very large-scale integration (VLSI) allowed for complex circuits on a single chip
●​ Led to the development of personal computers (PCs)
●​ Increased processing power and affordability
●​ Examples: IBM PC, Apple Macintosh

5th Generation (Present and Beyond)

❖​ Key Focus: Artificial intelligence (AI) and advanced technologies


❖​ Characteristics:
●​ Emphasis on parallel processing, AI, and machine learning
●​ Development of natural language processing and computer vision
●​ Use of ULSI (Ultra Large Scale Integration)
●​ Examples: AI-powered systems, robotics, quantum computers (emerging)

CHAPTER TWO
Basic Computer Components

A computer is an electronic device that accepts data, performs operations, displays results, and
stores the data or results as needed. It is a combination of hardware and software resources that
integrate and provide various functionalities to the user. Hardware is the physical component of a
computer, such as a processor, memory devices, monitor, keyboard, etc., while software is a set
of programs or instructions that are required by the hardware resources to function properly.


Basic Components of a Computer

There are three main components of a computer:

1. Input Unit
2. Central Processing Unit (CPU)
3. Output Unit

1. Input Unit:​
The input unit consists of input devices that are attached to the computer. These devices take
input and convert it into binary language that the computer understands. Some of the common
input devices are keyboard, mouse, joystick, scanner, etc.

●​ The input unit is formed by attaching one or more input devices to a computer.
●​ A user inputs data and instructions through input devices such as a keyboard, mouse,
etc.
●​ The input unit is used to provide data to the processor for further processing.
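To make the idea of "converting input into binary" concrete, here is a minimal Python sketch; the `text_to_binary` helper and the sample string are invented for illustration. Each typed character is mapped to its numeric code point and then written out as an 8-bit binary string, which is roughly what happens when keyboard input is encoded for the processor.

```python
# Sketch: how typed characters can be represented as binary.
# Each character is mapped to a numeric code (here, its Unicode/ASCII
# code point) and then formatted as an 8-bit binary string.

def text_to_binary(text):
    """Return the 8-bit binary representation of each character."""
    return [format(ord(ch), "08b") for ch in text]

print(text_to_binary("Hi"))  # 'H' is code point 72, 'i' is 105
```

Pressing the "H" key, for instance, ultimately reaches the processor as the bit pattern 01001000.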

2. Central Processing Unit

Once the information is entered into the computer by the input device, the processor processes it.
The CPU is called the brain of the computer because it is the control center of the computer. It
first fetches instructions from memory and then interprets them so as to know what is to be done.
If required, data is fetched from memory or the input device. Thereafter, the CPU executes or
performs the required computation and then either stores the output or displays it on the output
device. The CPU has three main components, each responsible for different functions: the
Arithmetic Logic Unit (ALU), the Control Unit (CU), and memory registers.

A. Arithmetic and Logic Unit (ALU): The ALU, as its name suggests, performs mathematical
calculations and takes logical decisions. Arithmetic calculations include addition, subtraction,
multiplication, and division. Logical decisions involve the comparison of two data items to see
which one is larger, smaller, or equal.

●	The ALU is a fundamental building block of the CPU.
●	It is a digital circuit used to perform arithmetic and logical operations.

B. Control Unit: The control unit coordinates and controls the data flow in and out of the CPU
and also controls all the operations of the ALU, memory registers, and input/output units. It is
also responsible for carrying out all the instructions stored in the program. It decodes the fetched
instruction, interprets it, and sends control signals to input/output devices until the required
operation is done properly by the ALU and memory.

●​ The control unit is a component of the central processing unit of a computer that
directs the operation of the processor.
●​ It instructs the computer’s memory, arithmetic and logic unit, and input and output
devices on how to respond to the processor’s instructions.
●​ In order to execute the instructions, the components of a computer receive signals
from the control unit.
●	It is often described as the central nervous system of the computer.

C. Memory Registers: A register is a temporary unit of memory in the CPU. These are used to
store the data, which is directly used by the processor. Registers can be of different sizes (16-bit,
32-bit, 64-bit, and so on), and each register inside the CPU has a specific function, like storing
data, storing an instruction, storing the address of a location in memory, etc. The user registers
can be used by an assembly language programmer for storing operands, intermediate results, etc.
Accumulator (ACC) is the main register in the ALU and contains one of the operands of an
operation to be performed in the ALU.
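The fetch-decode-execute cycle described above, together with the accumulator, can be sketched as a toy machine in Python. This is purely illustrative: the instruction names (LOAD, ADD, STORE, HALT) and the memory layout are invented for the example and are not taken from any real CPU.

```python
# Toy illustration of the fetch-decode-execute cycle.
# `data` plays the role of memory cells, `acc` the accumulator register,
# and `pc` the program counter holding the address of the next instruction.

def run(program, data):
    acc = 0          # accumulator (ACC)
    pc = 0           # program counter
    while True:
        op, arg = program[pc]          # fetch the next instruction
        pc += 1
        if op == "LOAD":               # decode + execute
            acc = data[arg]            # copy a memory cell into ACC
        elif op == "ADD":
            acc += data[arg]           # ALU operation: addition
        elif op == "STORE":
            data[arg] = acc            # write ACC back to memory
        elif op == "HALT":
            return data

# Compute data[2] = data[0] + data[1]
result = run([("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)],
             [7, 5, 0])
print(result)  # [7, 5, 12]
```

Each loop iteration mirrors the cycle in the text: fetch an instruction, decode it, fetch any needed data, perform the computation in the ALU, and store the result.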

Memory attached to the CPU, used for the storage of data and instructions, is called internal
memory. The internal memory is divided into many storage locations, each of which can store
data or instructions. Each memory location is of the same size and has an address; with the help
of the address, the computer can read any memory location directly, without having to search the
entire memory. When a program is executed, its data is copied to the internal memory and kept
there until the end of execution. Internal memory is also called primary memory or main
memory. Because the time taken to access data is independent of its location in memory, it is
also known as Random Access Memory (RAM).

●	The memory unit is the primary storage of the computer.
●	It stores both data and instructions.
●	Data and instructions are held in this unit while a program runs, so that they are
available whenever required; RAM is volatile, so its contents are lost when power is removed.

3. Output Unit ​
The output unit consists of output devices that are attached to the computer. It converts the
binary data coming from the CPU to a human-understandable form. Common output devices
are monitors, printers, plotters, etc.

●​ The output unit displays or prints the processed data in a user-friendly format.
●​ The output unit is formed by attaching the output devices of a computer.
●​ The output unit accepts the information from the CPU and displays it in a
user-readable form.
CHAPTER THREE
Input/Output Devices and Peripherals

A computer peripheral is essentially any device that connects to a computer to expand its
capabilities or allow for interaction with the system. These devices aren't essential for the
computer to function at its most basic level, but they greatly enhance its usability and the tasks it
can perform. It can be either internal or external, though we more commonly think of external
devices when talking about peripherals. Most commonly, when we talk about peripheral devices,
we are talking about input and output devices.

Input Devices

Input devices are used to allow us to enter information into a computer system. This might be,
for example, to control a character in a game, click on a shortcut icon on your desktop, or type
data into a spreadsheet.

Some example input devices include:

❖​ Keyboard: The most common input device, used for typing text, numbers, and symbols.

❖​ Mouse: A pointing device used to navigate and interact with graphical elements on the
screen.
❖​ Touchpad: A flat surface, often found on laptops, used as a pointing device.

❖​ Scanner: Converts physical documents or images into digital formats.

❖ Webcam: A camera connected to a computer, used for video conferencing, recording,
and streaming.
❖​ Digital Camera: Captures still images and videos, which can then be transferred to a
computer.
❖​ Microphone: Records sound, allowing you to input voice, music, or other audio into the
computer.
❖​ Joystick: A stick-like device used for controlling movement in games or simulations.

❖​ Gamepad (or Joypad): A handheld controller with buttons and directional pads,
commonly used for gaming.
❖​ Racing Wheel: A wheel-shaped controller used for racing games.

❖​ Barcode Reader: Scans barcodes to input product information or other data.

❖​ Graphics Tablet: A flat surface used with a stylus for drawing or creating digital art.

❖​ MIDI Keyboard: An electronic musical instrument used to input musical notes and other
data into music software.
❖​ Fingerprint Scanner: Captures fingerprint data for biometric authentication.

❖​ Touchscreen: A display that allows input through touching the screen with a finger or
stylus.
❖​ Light Pen: A light-sensitive pen used to draw or select objects on a screen (mostly older
CRT monitors).
❖​ Trackball: A ball held in a socket that you roll to move the cursor.

❖​ Pointing Stick: A small, pressure-sensitive stick used for cursor control, often found on
laptops.

Output Devices
Output devices are used to send data from a digital device to a user or another device. This could
be so you can see the photo you’ve just taken on your digital camera, print out a hard copy of the
report you’ve just written, or hear the voice of someone you’re having a VoIP conversation with.

❖​ Monitor: The most common output device, displaying text, images, and video. There are
various types like LCD, LED, OLED, etc.
❖​ Projector: Projects images onto a large surface, like a screen or wall, often used for
presentations or home theaters.
❖​ Television (TV): Can be used as a display for a computer, especially for multimedia or
gaming.
❖​ Head-Mounted Display (HMD): Worn on the head, providing immersive virtual or
augmented reality experiences.
❖​ Speakers: Produce sound, allowing you to hear music, dialogue, and other audio.
❖​ Headphones: Worn over the ears, providing private audio output.
❖​ Earbuds: Small speakers that fit inside the ear canal.
❖​ Printer: Creates physical copies of documents and images on paper. Types include
inkjet, laser, and 3D printers.
❖​ Plotter: A specialized printer used for creating large-format drawings, maps, and
diagrams.
❖​ Haptic Devices: Provide tactile feedback, such as vibrations or forces, often used in
gaming or simulations.
❖​ Actuators: Devices that produce physical movement, used in robotics or automation. For
example, a motor that moves a robotic arm is an output device.
❖​ Braille Display: A device that creates tactile output readable by blind or visually
impaired users.
❖​ GPS (Global Positioning System) Device: While it receives signals, it also outputs
location information, directions, etc.
❖​ VR (Virtual Reality) Headsets: These often have both input and output functionalities,
but the display and audio they provide are key output aspects.
CHAPTER FOUR
Hardware, Software and Humanware

A Breakdown of Computers​
There are three major components of a computer system:​

❖​ Hardware
❖​ Software
❖​ Humanware

1. Computer Hardware
These are computer system components that can be touched by the human hand. Examples
include:

❖​ Display monitor
❖​ Keyboard
❖​ Mouse
❖​ Motherboard
❖​ Memory modules
❖​ Disk drive

These parts are housed within the laptop or the desktop system unit. For the desktop, however,
the keyboard and mouse are attached and used externally.
A typical installation of computer hardware components includes a monitor, a computer, a
keyboard, and a mouse.

The most important piece of hardware is the microprocessor chip, which is commonly known as
the central processing unit (CPU).

New and slim laptop computers merge the traditional CPU and the graphics processing chip
(GPU) into what is called an accelerated processing unit (APU). These chips are usually attached
to the motherboard in these laptops.

The CPU and APU are responsible for all arithmetic and graphics manipulation.
The AMD CPU is an example of a microprocessor that attaches to the motherboard.

A component just as important is the disk drive. This is where computer data is stored. It is
classified as secondary memory.

There are two popular types of disk drives to choose from.

Hard disk: This drive is mechanical by design and stores data on magnetic, metallic platters.
Because its data is read by moving read/write heads, it relies on an uninterrupted supply of
power; a sudden power outage can lead to data loss or drive failure. It must be handled
properly for the sake of data integrity and a long lifespan.
Solid-state disk: This newer type of drive stores data on flash memory chips and has no moving
parts. It is faster and more reliable, even in the event of sudden power outages.

An illustration of various hardware components inside the system unit

Another vital part within a system is the motherboard. It provides communication and direct
connectivity to devices throughout the computer.
Sample Acer motherboard

Connectivity to a motherboard can be internal or external.

Internal devices that connect to the motherboard include:

❖​ Microprocessor (CPU)

❖​ Disk drive

❖​ Random access memory (memory modules)

❖​ Power supply unit (PSU)

External peripherals that connect to the motherboard include:


●​ Monitor

●​ Keyboard

●​ Mouse

●​ Printer

2. Computer Software

The software component refers to the instructions, programs, data, and protocols that run on top
of the hardware. Software is held temporarily in primary storage and persistently in secondary
storage: the random access memory (RAM) chip is an example of primary storage, while the
hard disk drive is an example of secondary storage.

An illustration of a software component running on a computer


Software can be divided into:
❖ System software
❖ Application software
❖ Malicious software
❖ Programming software

System Software​
System software manages the other software and devices inside the computer. The foremost example of
system software is the operating system (OS).
In a typical setup, the operating system is like the motherboard for software. It is the first thing
that is installed, followed by applications and other software. Three popular operating systems
for traditional computers include Windows, Mac OS X, and Linux. Popular mobile operating
systems include Android OS, iPhone OS, Windows Phone OS, and Firefox OS.

Application Software​
This is designed for end-users to perform a specialized assignment in order to output useful
information. An example would be a word processing application used to compose letters or
brochures, such as Microsoft Word. Other popular examples include Adobe Photoshop, Corel
Draw, and AutoCAD.
A collection of application software is bundled in a package that is commonly known as a
software suite. A typical suite includes software for word processing, presentation, graphic
design, and spreadsheet. Examples include Microsoft Office, OpenOffice, and iWork.​
NB: Software is written in computer languages such as Visual Basic, C, and Java.
The software component is stored on optical media, disk drives, and cloud storage spaces.

Malicious Software​
Malware is short for malicious software, which is a generic term that refers to exploitative code
designed by criminals and black hat hackers to maim normal operations of a computer. Malware
attacks can result in data loss and hacker access to private information. Affected computers can
also be converted into zombies and used in a bigger mission of criminal activities like launching
denial of service attacks (DOS) and spreading spam.​
Malware scripts are delivered to the computer as viruses, trojans, rootkits, keyloggers, worms, or
through email and websites as adware, spyware, ransomware, and scareware.

Programming Software​
These are tools used by developers to create all kinds of software, from operating systems like
Windows to word processors. They comprise programming languages together with the editors,
compilers, and debuggers used to write source code, debug errors, and maintain software (and,
in the wrong hands, to write malicious scripts like viruses and trojans). Popular examples of
high-level languages are Java, JavaScript, BASIC, PHP, Visual Basic, Visual C++, Python,
Ruby, and Perl.

3. Humanware​
The humanware component refers to the person who uses the computer; more specifically, the
individual who makes the hardware and software components productive.

Typically, a great deal of testing is done on software packages and hardware parts to ensure they
enhance the end-user experience, helping people create documents, music and video recordings,
and all forms of raw and finished data.

A baby using a computer is one cute example of humanware.


CHAPTER FIVE
Diverse and Growing Computer/Digital Applications.

Some diverse and rapidly growing computer/digital applications include data analysis, cloud
computing, robotics, coding, graphics, social media, video editing, communication and
telemedicine, computer-assisted design, data mining, electronic health records (EHRs), patient
monitoring, quantum computing, video streaming, and word processing.

Data Analysis:​
Utilizing machine learning algorithms to extract valuable insights from large datasets, crucial
for various industries.

Cloud Computing:​
Providing scalable and flexible computing services through online platforms like Google Drive,
Dropbox, and Office 365.

Robotics:​
Development of automated systems with intelligent capabilities, expanding into manufacturing
and healthcare sectors.

Coding:​
The foundation for building software applications, websites, and games, with high demand for
skilled coders.

Graphics:​
Utilizing computer technology to create visual content for entertainment, design, and education.

Social Media:​
Platforms enabling users to connect, share content, and interact online.

Video Editing:​
Editing and manipulating video footage for professional and personal use.
Communication and Telemedicine:​
Utilizing digital platforms for remote healthcare consultations and patient monitoring.

Computer-Assisted Design (CAD):​


Creating 3D models and designs using computer software across various industries.

Data Mining:​
Extracting patterns and insights from large datasets using advanced algorithms.

Electronic Health Records (EHRs):​


Digital storage of patient medical information for efficient access and management.

Patient Monitoring:​
Real-time tracking of vital signs and health metrics using wearable devices and remote
monitoring systems.

Quantum Computing:​
Emerging technology with potential to solve complex problems beyond the capabilities of
traditional computers.

Video Streaming:​
Delivering video content over the internet for live or on-demand viewing.
CHAPTER SIX
Information Processing and Its Roles in Society.

Information processing is a fundamental part of modern society that helps people make decisions
and take actions. It's a cognitive process that involves analyzing and interpreting information to
understand the world around us.
How information processing impacts society:

❖​ Decision-making​
Information processing helps people make informed decisions about their personal lives and
the world around them.
❖​ Business​
Businesses use information processing to improve customer experiences, identify market
trends, and develop new products.
❖​ Governance​
Data-driven decision-making helps governments be more transparent and responsive to
citizens.
❖​ Communication​
Information processing has made it possible for people to communicate with each other
globally.
CHAPTER SEVEN
The Internet, Its Applications, and Its Impact on the World Today.

The internet is a vast, global network that connects computers and devices all over the world. It's
like a giant spider web, with billions of interconnected points that allow people to communicate,
share information, and access resources from anywhere with an internet connection.

Here's a breakdown of what the internet is:

●​ A Network of Networks: The internet is not a single network but rather a network of
networks. It connects countless smaller networks, from home networks to large corporate
networks, allowing them to communicate with each other.
●​ A Global System: The internet spans the entire globe, connecting people and devices
across continents and oceans.
●​ Uses Standardized Protocols: The internet relies on a set of rules and standards called
protocols, which ensure that devices can communicate with each other seamlessly,
regardless of their location or type. The most important of these is the TCP/IP protocol
suite.
●​ Enables Communication and Information Sharing: The internet enables various forms of
communication, including email, instant messaging, video conferencing, and social
media. It also provides access to a vast amount of information through the World Wide
Web and other online resources.
●​ Constantly Evolving: The internet is constantly evolving, with new technologies and
applications being developed all the time.

Key Concepts:

●​ World Wide Web (WWW): Often confused with the internet itself, the web is actually a
part of the internet. It's a collection of interconnected documents and resources that can
be accessed using web browsers.
●​ IP Address: Every device connected to the internet has a unique identifier called an IP
address, which allows it to be located and communicate with other devices.
●​ Domain Name System (DNS): The DNS translates human-readable domain names (like
google.com) into IP addresses, making it easier for people to find websites and other
online resources.
●​ Packet Switching: Data sent over the internet is broken down into small packets, which
travel independently to their destination and are then reassembled. This makes the
network more efficient and resilient.

Impact of the Internet on Modern Society

Today, the internet connects people from all over the world and allows for a global conversation.
It has altered society in many ways, from cultural exchange to social and economic development.
It has rewritten many rules of engagement, and the internet has enabled many new ways of
thinking and connecting. The internet has given us access to an almost infinite source of
information, from international news sites to local sources; we can now reach almost any news
outlet from anywhere, although the speed of access varies by location, cost, and bandwidth
availability. Internet use has improved the quality of life for many people: those who want to
read the news can choose the sources they use and the information they receive. It has been
postulated that about 95% of all available information has been digitised and made accessible
via the internet. This has led to a complete transformation in communication, the availability of
knowledge, and social interaction.

However, as with all major technological changes, there are positive and negative effects of the
internet in society too.

The internet’s positive effects on society include the following:

❖​ The internet provides effective communication using email and instant messaging
services, no matter where you are.
❖​ It saves time, which improves business relationships and transactions.
❖​ Shopping and banking online have made everyday life less complex.
❖​ You can get global news without relying on television or newspapers.
❖​ The availability of millions of books and journals online has provided a huge boost to
education. Students can now take online courses using the internet. Research has become
easier as a result.
❖​ Applying for jobs has become simpler, as most jobs are posted online and online
applications are now the norm.
❖​ Professionals can now enhance their research by exchanging information and materials
online.

The internet’s negative effects on society include the following:

❖​ Illegal or age-inappropriate material is easy to access online.
❖​ Long periods of screen time can harm our health and communication skills, contributing
to insomnia, eye strain, anxiety, and depression.
❖​ A person’s personal and professional life can be disrupted by an addiction to social
networking.
❖​ Some criminals use the internet to hack into people's accounts for nefarious purposes,
such as stealing data or financial information. ​

Application of the Internet


Communication:
❖​ Global Connectivity: The internet has connected people worldwide, breaking down
geographical barriers. Communication has become instantaneous, allowing for real-time
interaction with individuals and communities across the globe.
❖​ Social Media: Platforms like Facebook, Twitter, Instagram, and others have emerged,
enabling people to share their lives, thoughts, and opinions with a wide audience. This
has altered the way we form and maintain relationships.

Information Access:

❖​ Information Overload: The internet provides access to an immense amount of
information on virtually any topic. This has led to both a democratization of knowledge
and the challenge of navigating through vast amounts of data.
❖​ Education: Online learning has become more prevalent, offering flexible and accessible
educational opportunities. It has also facilitated lifelong learning and skill development.

Commerce:

❖​ E-Commerce: Online shopping has become a significant part of the economy. Consumers
can purchase goods and services from the comfort of their homes, leading to changes in
retail and business models.
❖​ Digital Payments: The internet has facilitated the transition from traditional cash
transactions to digital payments, making financial transactions more efficient.

Work and Business:

❖​ Remote Work: The internet has enabled remote work, allowing individuals to work from
any location with an internet connection. This has implications for work-life balance, job
flexibility, and the nature of office space.
❖​ Startups and Entrepreneurship: The internet has lowered barriers to entry for
entrepreneurs, allowing for the creation of online businesses and startups with global
reach.

Entertainment:

❖​ Streaming Services: The rise of platforms like Netflix, Hulu, and others has changed the
way we consume entertainment, providing on-demand access to a vast array of content.
❖​ Gaming: Online gaming has become a significant industry, connecting players worldwide
and creating virtual communities.

Political and Social Activism:

❖​ Online Activism: Social media and online platforms have played a role in political
movements and social activism, providing a platform for individuals to voice their
opinions and mobilize for social change.

Privacy and Security:

❖​ Privacy Concerns: The internet has raised concerns about the privacy of personal
information, as individuals share a significant amount of data online. This has led to
debates and discussions about digital privacy and data protection.

❖​ Cybersecurity: The interconnected nature of the internet has brought about new
challenges in terms of cybersecurity, with increased risks of hacking, identity theft, and
other cybercrimes.

Cultural Impact:

❖​ Cultural Exchange: The internet has facilitated cultural exchange on a global scale,
allowing people from different cultures to share and appreciate diverse perspectives.
❖​ Online Communities: The formation of online communities around shared interests has
created new avenues for social interaction and cultural expression.
CHAPTER EIGHT
The different areas/programs of the computing discipline.

Fields of study in computer science

Here are some of the computer science disciplines you can explore:

Artificial intelligence

Artificial intelligence, or AI, is the study and design of systems that can function autonomously,
without human input. Examples of AI are programs that offer music recommendations based on your
previous listening habits or programs that can play complex games like chess against a human
competitor. Some AI studies focus on creating machines that can perform human tasks like visual
perception or speech recognition. Machine learning is a subset of AI that focuses specifically on
creating machines that can use algorithms and programming to mirror the processes of the
human mind.

Programming languages and logic​


The study of programming languages is an integral part of computer science; much of this subfield
focuses on designing and optimizing languages so that programmers can write complex programs
using the smallest amount of code that the computer can understand.

Scientific computing applications​


Scientific computing is a branch of computer science that uses algorithms and modeling
to predict the outcome of scientific experiments that scientists can't conduct physically.
Some situations where experiments need to be conducted through models are:

❖​ Big in scale: Some experiments or research questions are simply too big to study
accurately outside of a digital model, such as predicting the progress of climate
change and its effects.
❖​ Dangerous: Some materials or chemical reactions may be too dangerous or unethical to
conduct in person, like experimenting with toxic or radioactive chemicals.
❖​ Expensive: Some experiments or research are too expensive or time-consuming. Using
scientific computing can help to speed up these processes for a fraction of the cost, such as
repeatedly crash testing aircraft for safety optimization.

Scientific computing is multidisciplinary because it brings together experts in the field
that requires the model and the computer scientists who build the algorithms.
Theory of computation​
The theory of computation is a discipline that focuses on determining which problems computational
algorithms can solve, and whether they can solve them completely or only partially. The ultimate
purpose of this subject is to determine the fundamental capabilities and limitations of computers.
There are three major branches of this subject:

❖​ Automata theory and formal languages: Automata theory is the study of abstract machines called
automata, which computer scientists use to describe and analyze the behavior of computer
systems.
❖​ Computability theory: Computability theory, or recursion theory, is the study of which decision
problems a computer program can and cannot solve. A decision problem is a yes-or-no question
that can be asked of infinitely many possible inputs. For example, if a computer can determine
whether the numbers in a set are even or odd, no matter what the numbers are, that is a decision
problem the computer can solve.
❖​ Computational complexity: Computational complexity focuses on how much time and memory
different algorithms require. The more resources the algorithm requires, the more complex it is.

Data structures and algorithms​


This discipline focuses on the way data structures and algorithms can interact and how computer
scientists can improve them to create better computer programs. A data structure is a way of
organizing and storing data. An algorithm is a sequence of steps a computer follows to complete a
task. You can use an algorithm to retrieve and perform computations on the data, which together
form a computer program.
The focus of this discipline is to study the overlap of these two functions and optimize them.
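As a concrete sketch of this overlap, consider pairing a sorted list (the data structure) with binary search (the algorithm). Keeping the data sorted lets each probe discard half of the remaining items, so a lookup takes O(log n) comparisons instead of the O(n) of a linear scan. The data below is made up for the demo.

```python
# A sorted list (data structure) paired with binary search (algorithm):
# each comparison halves the search range, giving O(log n) lookups.
def binary_search(sorted_items: list[int], target: int) -> int:
    """Return the index of target in sorted_items, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1          # target is in the upper half
        else:
            hi = mid - 1          # target is in the lower half
    return -1

data = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
assert binary_search(data, 23) == 5
assert binary_search(data, 4) == -1
```

Choosing a different structure (say, an unsorted list or a hash table) would force or enable different algorithms, which is exactly the interplay this discipline studies.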

Computer architecture and organization​


This discipline focuses on the study, design, implementation, and operation of computer systems.
Architecture concerns how hardware such as processors, storage devices, and network components
is designed to store programs, transmit data, and facilitate connections with other devices.
Organization is how those components connect and how to optimize those connections.

Computer networks​
The study of computer networks focuses on the analysis, design, and implementation of networks that
link computers together. For example, the internet is a type of network that links computers together.
Computer scientists study how to develop these links using different connections, like light signals or
radio waves. They also work to develop protocols that establish limitations and protections for these
networks.
Computer security and cryptography​
This includes creating software that resists theft, destruction, fraud, or access by unauthorized
users. Cryptography is a part of computer security developed to protect data. It is the practice of
using algorithms to encrypt information, translating it from its natural state into a
hard-to-decipher pattern using a set of rule-based computations, as well as using algorithms to
decrypt data.
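A classic classroom illustration of this encrypt/decrypt round trip is the Caesar cipher, a rule-based computation that shifts each letter by a fixed key. This is a toy for illustration only; real cryptography uses far stronger algorithms such as AES.

```python
# Toy cipher: shift every letter by a fixed key. Encrypting with shift k
# and decrypting with shift -k recovers the original text.
def caesar(text: str, shift: int) -> str:
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation unchanged
    return "".join(result)

ciphertext = caesar("attack at dawn", 3)
assert ciphertext == "dwwdfn dw gdzq"
assert caesar(ciphertext, -3) == "attack at dawn"
```

The Caesar cipher is trivially breakable (there are only 25 possible keys), which is precisely why modern cryptography relies on mathematically hard problems rather than simple substitution rules.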

Databases and data mining​


The study of databases and data mining focuses on how computer scientists organize and store data. Big
data is a term for large sets of data that are collected from a specific source. An example of big data
would be the location data, browsing habits, and app usage that cell phones collect to aid their users.
Data mining is combing through that data to identify patterns. One important emphasis of this discipline
is to create database structures that allow for the efficient organization and recall of data from a big data
set, as well as facilitating easy and quick data mining.
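A minimal sketch of "combing through data to identify patterns" is simply aggregating records until a pattern emerges. The app-usage log below is hypothetical; real data mining operates on vastly larger datasets, but the idea of summarizing raw records to surface a trend is the same.

```python
# Counting which app a (hypothetical) user opens most often: a tiny
# example of mining a pattern out of raw usage records.
from collections import Counter

usage_log = [  # made-up app-usage records collected by a phone
    "maps", "browser", "maps", "messages", "maps", "browser", "messages",
]
counts = Counter(usage_log)
assert counts.most_common(1) == [("maps", 3)]
```

Scaling this idea up, and designing database structures that make such aggregation fast over billions of records, is the core concern of the discipline.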

Computer graphics and visualization​


This discipline of computer science focuses on the display of computer systems and the control of
images on the computer screen. This includes studying and improving the hardware capabilities of a
computer. It also handles:

❖​ Rendering: Generating a realistic image from a two- or three-dimensional model using a
computer program.
❖​ Modeling: Creating a digital representation of an object or scene that a program can render.
❖​ Animation: Creating the effect of movement through a sequence of still images on a display
screen.
❖​ Visualization: Interpreting data into a graphic form and interacting with the data to manipulate
the graphic.

Image and sound processing​


Image and sound processing focuses on studying the forms that information can take and how to
interpret and process that information. Image processing is when you use a digital computer to interpret
an image as a set of data that you can manipulate. Manipulating the set of data can create more exact
changes to the image than altering it manually.
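To make "an image as a set of data" concrete, the sketch below treats a tiny grayscale image as a grid of brightness values (0 = black, 255 = white) and inverts it. The pixel values are made up; real image processing works the same way, just on millions of pixels, often with libraries such as NumPy.

```python
# A grayscale image is just a grid of numbers; inverting it is an exact
# computation on that data (new_pixel = 255 - old_pixel).
image = [
    [0, 128, 255],
    [64, 192, 32],
]
inverted = [[255 - pixel for pixel in row] for row in image]
assert inverted == [[255, 127, 0], [191, 63, 223]]
```

Because every pixel is manipulated by the same precise rule, the change is exactly reproducible, which is what gives digital processing its edge over manual alteration.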

Concurrent, parallel and distributed computing​


This discipline is the study of computers and networks that have multiple computations happening at
once. The central question of this topic is how to design machines or strategies that can improve the
speed and correctness with which these simultaneous tasks occur. Concurrent computing is when
multiple computations happen at once. Computer scientists can improve concurrent computing through a
distributed system, which is when multiple computers connect to a network and process individual
computations at one time.
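The idea of several computations happening at once can be sketched with Python's standard thread pool. The work function here is made up for the demo; the point is that the tasks are submitted together and may run concurrently rather than strictly one after another.

```python
# Run independent computations in a thread pool instead of sequentially.
from concurrent.futures import ThreadPoolExecutor

def square(n: int) -> int:
    return n * n

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() submits all tasks at once and returns results in input order,
    # even though the workers may finish in a different order.
    results = list(pool.map(square, range(8)))

assert results == [0, 1, 4, 9, 16, 25, 36, 49]
```

Distributed computing extends the same picture across machines: instead of four threads in one process, the tasks are farmed out to many computers on a network.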

Human-computer interaction​
This topic within computer science focuses on how users interact with computers and the user interface
that facilitates this interaction. The central aim of this discipline is to construct hardware and software
that make using the computer easy and manageable for its user, without requiring the user to know
computer science. This discipline incorporates user psychology, anthropology, and engineering because
it focuses on interpreting users' instincts and expectations. Then, computer scientists construct hardware
and software that meet those expectations.

Software engineering​
Software engineering focuses on applying engineering approaches to the theory and practice of
building software systems. Because of the cost and time involved, developing complex software
usually requires teams of computer scientists. The process of software engineering consists of:

❖​ Development of requirements
❖​ Analysis of possibilities
❖​ Design
❖​ Construction
❖​ Validation, or checking that the software meets its requirements
❖​ Deployment of the software
❖​ Operation
CHAPTER NINE
The Job Specializations for Computing Professionals. ​

Careers That Involve Computer Science Specializations


Professionals who specialize in computer science are often knowledgeable about internet security,
programming languages, and application development. They may work in roles that require them to
improve and maintain the technological capabilities of a company and build products to satisfy clients and
stakeholders. Exploring career opportunities that draw on expertise in computer science can help
you find a position that's right for you. Below, we define computer science specializations and
outline the primary duties of a range of occupations in the technology industry.

Here's a list of jobs in the technology industry that specialize in computer science:

Cybersecurity analyst​
Primary duties: A cybersecurity analyst is a professional who protects an organization's network from
threats in the digital landscape. Working closely with upper management, the cybersecurity analyst
recommends safeguards to keep information confidential and authorize internal employees to use parts of
the network. They also develop procedures to respond to emergencies and restore or back up digital items.

User experience researcher​


Primary duties: A user experience researcher is a data expert who analyzes members of target audiences to
understand what they look for when using a digital program. To help develop a product that satisfies users,
the UX researcher determines what problems the product can address and what functions are most
appealing to the demographics that are likely to use the product. The professional often uses quantitative
research, such as surveys, and qualitative research, which includes interviews.

Video game designer​


Primary duties: A video game designer creates visual elements for video games for mobile devices,
computers, and gaming systems. Using programming languages and graphic design, the video game
designer builds characters and settings that coincide with the games' storylines, and they test the game for
functionality, easy navigation, and visual appeal. The professional works closely with animators
and programmers to build the game.
Business intelligence analyst​
Primary duties: A business intelligence analyst is a professional who evaluates the operations of a
company to identify ways to make it more successful. With expertise in data science, the business
intelligence analyst determines if the company is making progress toward its goals by assessing the
resources it uses and the challenges it's faced. Another responsibility is performing a competitor analysis,
which helps the professional stay informed about the industry and develop strategies to exceed competing
businesses.

Database Administrator​
Primary duties: A database administrator​ oversees activities in software databases that a company uses
to store and organize information, such as user login credentials, client interactions, and survey results. To
maintain the confidentiality of the records, the database administrator ensures the structures are working
effectively, and they install security procedures to identify threats, remove viruses, and restore lost data.
The administrator may also install updates on the databases to boost their performance and expand their
capabilities.

UX designer​ ​
Primary duties: A UX designer is a professional who develops computer programs by prioritizing the
needs and desires of end-users. Working by themselves or with a group of developers, the UX designer
creates technology that's accessible and appealing to users and is functional enough to help them reach
their goals. The work of a UX designer may appear on a website or a mobile or computer application.

System engineer​
Primary duties: A systems engineer is an industry specialist who creates a plan for conceptualizing,
developing, and implementing a system, such as a new software application or piece of computer
hardware. To maximize efficiency for the process, the systems engineer compiles a list of necessary
resources, collaborating with professionals and establishing parameters to evaluate the success of the
project. They also prioritize the safety and security of their products and lend technical expertise to assist
other technology specialists on their team.

Algorithm engineer​
Primary duties: An algorithm engineer is a professional who specializes in the development and
implementation of algorithms, which are sets of rules that can solve issues within technology. After the
design process is complete, the algorithm engineer conducts experiments to determine if the algorithm can
address a particular problem that a client or organization has with their software applications. They also
use the findings of their experiments to optimize the performance of the algorithm before the
implementation process begins.

Front-end developer​
Primary duties: A front-end developer is an industry expert who builds the front end of a website, which is
the part that users can see when they use the product. To create the interface, the front-end developer uses
programming languages, such as HTML, and they control how information and visual elements display on
the screen so users can navigate the website. The professional also ensures the interface performs
optimally and maintains its layout regardless of the browser and type of device the user chooses to access
the website.

Network Security Engineer​


Primary duties: A network security engineer is an IT professional who installs safeguards to protect a
computer network from harm, which can include viruses and malware. Network security engineers
analyze the performance of the computer to identify malfunctions and prevent them from recurring, and
they conduct tests to see how vulnerable the network is to external threats. Examples of defense
mechanisms for the network include encrypting the data in important files and implementing firewalls
to block entry by unauthorized users.

Full-stack developer​
Primary duties: A full-stack developer is a versatile technology specialist who manages all elements of a
computer system, including the front end and the back end. To build quality products, full-stack
developers examine the parts of the programs that users can see and manipulate, such as the navigation
menu on a mobile application, and they develop the internal architecture that allows the programs to work
properly. They also look for connections between the front and back ends to make sure the hidden
components execute the commands triggered by the visible components as intended.

Software engineer​
Primary duties: A software engineer is a technology specialist who participates in the development of new
software products, such as video games, operating systems, and computer programs. After evaluating the
needs of users, the software engineer works with developers to write the code that allows the applications
to perform specific functions, and they test the performance of the software to fix bugs. Other
responsibilities include identifying needs for updates in current programs and recording product
development for future reference on upcoming projects.
CHAPTER TEN
The future of computing.

Future of Computers

What does the future hold for computers? We may be surprised by how much changes from what we know
today! There may be no physical machines as we know them; instead, advanced computing systems might
fit within our bodies or on a piece of jewelry.
Humans will continue to control these technologies, though; this is something that isn’t likely to
change in what lies ahead!

The future is here! Robotic systems are rapidly evolving, and computers have become an integral part of
our lives, providing us with a more luxurious lifestyle. Breakthroughs in many domains, such as science,
technology, and biotechnology, continue to revolutionize the capabilities of computer-based
technologies, making futuristic dreams a reality.

Imagine a future where your home is brimming with possibilities thanks to the Fifth Generation of
Computers. We’re already seeing glimpses of these devices in action today, providing us with
sleek and integrated systems that allow for unprecedented
control over our electronic equipment. As technology continues to evolve, humanity can unlock countless
potential opportunities through this remarkable generation of
Computers!

The Internet of Things (IoT)

Imagine a future where you can command your appliances from anywhere in the world, with no need to
be at home! This radical notion is made possible by Internet of Things (IoT) technology, which makes
devices smarter and more connected than ever before. With IoT, computers are able to communicate
autonomously with each other like never before; it’s truly revolutionary!

Imagine living in a world where cars start on their own, ovens know when to turn off, and bicycles
greet you with an automated “hello” when they sense your watch. The future of computers has far more
potential than we ever imagined: enter the Internet of Things!

This technology will make homes smarter, cities better managed, schools safer, and hospitals run like
well-oiled machines. With this mind-boggling connectivity between devices powered by computer
software, it’s easy to see just how efficient our day-to-day lives can become due to the power of ‘smart’
communication.

Quantum Computers

Quantum computing will revolutionize computers in a truly remarkable way. It applies principles
of quantum mechanics and stores data not in bits but in qubits.
Quantum computing will redefine computers and offer them an enormous scope. Imagine a scientist
experimenting virtually instead of using any solid physical materials, or an engineer designing a
car model without actually using any tools. A computer powered by quantum computing technology uses
physical matter, such as subatomic particles like photons and electrons, to carry out complex
processing. Because qubits can exist in multiple states at once, a quantum computer can, in effect,
work through many versions of a computation simultaneously. Quantum computing is the
future of computers and will offer vast possibilities for all.
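The "multiple states at once" idea can be illustrated with a classical simulation of a single qubit. This sketch is only an analogy, not a quantum computer: a qubit's state is a pair of amplitudes, and the probability of measuring 0 or 1 is the squared magnitude of each amplitude.

```python
# Classical simulation of one qubit in an equal superposition of |0> and
# |1> (amplitudes 1/sqrt(2) each). Measurement probabilities are the
# squared magnitudes of the amplitudes, and they always sum to 1.
import math

amp0 = amp1 = 1 / math.sqrt(2)

p0 = abs(amp0) ** 2  # probability of measuring 0
p1 = abs(amp1) ** 2  # probability of measuring 1

assert math.isclose(p0, 0.5) and math.isclose(p1, 0.5)
assert math.isclose(p0 + p1, 1.0)
```

Simulating n qubits classically requires tracking 2**n amplitudes, which grows astronomically fast; that explosion is exactly why real quantum hardware is expected to outpace classical machines on certain problems.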

Desktop Virtualization​
Computers will not only get more powerful; they will also become virtual. Offices and workspaces
will go digital so that users can work in them virtually.
Desktop virtualization is a technology that will broaden this scope and let organizations connect
their resources with others around the globe.
Desktop virtualization, in the simplest sense, is a way to make a computer accessible from anywhere
and by anybody. Remote users can log on to the computer from any corner of the earth and operate it
virtually. Desktops become virtual by being connected to a network: the operating system and
applications are located on a cloud server, and users can access them from anywhere. Any computer,
laptop, smartphone, or tablet can act as a virtual desktop. Desktop virtualization pools resources
so they connect easily across many platforms, and it can make data more secure and safe.

Bio-Computers

Make way for the newest medical marvels: bio-computers! Imagine taking a computer not just as
small as an aspirin, but actually swallowing it like one. Or getting a chip implanted in your hand
to constantly monitor any unexpected changes in your cells' DNA? Believe it or not, this is no
longer science fiction. These amazing new technologies are closer than ever and will revolutionize
healthcare by providing cutting-edge solutions for the biotechnology field.

Imagine a computer that is much, much smaller than any that exists today and that doesn’t just offer
tremendous processing power but can even learn by itself! This will be possible with the
bio-computers of the future, which use biological and organic elements to run processes and store
data. With such technology available in the near future, the possibilities of this new form of
computing are endless, from detecting abnormalities or badly structured DNA to providing large
benefits both economically and socially.
Artificial Intelligence

Artificial Intelligence, or AI for short, is already transforming the way computers think and act.
But its potential goes far beyond what we’ve seen so far; this rapidly evolving technology promises
to take computer intelligence to new heights in the years ahead! From hospitals relying on automated
diagnoses, to understanding customer preferences more quickly than ever before, to faster
manufacturing processes that boost productivity, Artificial Intelligence has transformative
possibilities across a range of industries, including healthcare, education, and even farming. We
are on the brink of a new era, one where robots will not only clean our cars, serve us food, and
make our homes more secure but also completely revolutionize society through AI-enabled computers.
Automation is just scratching the surface: soon we’ll be able to shop with ease, pay quickly without
tedious manual entries, and manufacture products faster than ever before. AI technology stands at
the heart of it all! With its help, we can unlock unprecedented levels of efficiency in almost every
aspect of life.
Optical Computers​
Our computing technology is changing rapidly in this digital age, and there’s potential for it to
get even faster. Researchers are examining how light can be applied to computing, with photons
controlled by engineered particles; such machines would be able to reach speeds far beyond what
today’s computers offer! With light being the fastest thing known in our universe, imagine just how
powerful optical computers could become. Scientists around the world are determined to make them a
reality soon enough.
