The document provides an introduction to informatics, detailing the evolution of computers, key concepts in information technology, and the various applications of computers across different fields. It covers essential terminology such as data, information, algorithms, and types of computers, while also highlighting the advantages and historical development of computing technology. The content is aimed at first-year students in the Faculty of Islamic Sciences at the University of Algiers, preparing them for further studies in informatics.


Introduction to informatics

Ministry of Higher Education and Scientific Research


University of Algiers 1 Benyoucef Benkhedda
Faculty of Islamic Sciences
First Year - Common Trunk

Introduction to Informatics
Concepts and Principles
Prepared by: Noureddine ZERROUKI
[email protected]

2024/2025

Introduction
The present era is marked by significant advancements in information technology,
the result of years of effort and innovation aimed at achieving optimal outcomes.
The computer has undergone numerous complex stages of development to
become what it is today—user-friendly, compact, diverse, and accessible to all
segments of society. Computers are now integral in fields such as computation,
programming, medicine, military, commerce, and more. Informatics provides an
understanding of fundamental concepts in information technology, covering
computer usage, system management, popular applications, and the essentials of
the Internet, the web, and distance learning tools.

General terminology
Technology
Application of scientific knowledge for practical purposes, especially in industry.
It encompasses tools, machines, systems, and processes used to solve problems or
perform specific functions. Technology can include physical devices (like
computers, phones) and intangible systems (like software or data networks).

Example: The invention of the smartphone is a significant technological development that revolutionized communication.

Information
Data that has been processed, organized, or structured in a meaningful way,
making it useful and actionable. It is raw data transformed into a format that can
be interpreted by users for decision-making, communication, or analysis.

Example: A monthly sales report produced from raw customer records (names, addresses, purchase history) is information.

Information Technology (IT)


Information Technology involves the use of computers, networking, storage, and other physical devices, infrastructure, and processes to create, process, store, secure, and exchange all forms of electronic data. IT is often used in business contexts to manage and support information systems.

Informatics
The study and practice of information processing and the engineering of systems
that handle data. It encompasses the intersection of people, information, and
technology, focusing on how data is managed and used. Informatics is often used
in areas like healthcare (bioinformatics), library science, and social sciences.

Example: Bioinformatics applies informatics to analyze biological data, such as DNA sequences, using computational tools.

Computer Science
The study of computers and computational systems, focusing on algorithms, data
structures, programming, theory of computation, and software and hardware
development. It is a discipline that encompasses both theoretical and practical
aspects of computing.

Example: Learning programming languages like Python or studying algorithms for data sorting are parts of a computer science curriculum.
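As a small illustration of the algorithm side of such a curriculum, here is a selection-sort sketch in Python; the function name is illustrative, not taken from any course material:

```python
def selection_sort(items):
    """Return a sorted copy of items using selection sort."""
    data = list(items)  # work on a copy so the caller's list is untouched
    for i in range(len(data)):
        # Find the position of the smallest element in the unsorted tail.
        smallest = min(range(i, len(data)), key=data.__getitem__)
        # Swap it into place at the front of the tail.
        data[i], data[smallest] = data[smallest], data[i]
    return data

print(selection_sort([34, 7, 23, 32, 5]))  # → [5, 7, 23, 32, 34]
```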

Computer
A computer is an electronic device that can process, store, and retrieve data. It
operates by executing a set of instructions (known as programs) to perform a
variety of tasks, such as calculations, data management, and communication.
Computers can process both numerical and non-numerical data and are capable
of automating complex processes.

A computer generally consists of hardware (physical components like the CPU, memory, and storage devices) and software (programs and applications that control the hardware and allow the user to perform tasks).

Example: A personal laptop, which allows users to browse the internet, write
documents, and run software, is a typical example of a computer.

Data
Data refers to raw, unprocessed facts, figures, or symbols that have no inherent
meaning until they are organized or processed. Data can be numbers, text, images,
audio, or video. Once processed, data can become information.

Example: A list of numbers like [12, 45, 67] is raw data that has no context by
itself.

Processing
Actions a computer or system performs to manipulate, transform, or operate on
data based on a set of instructions. This includes analyzing, calculating, sorting,
organizing, or modifying data to produce meaningful information or results.

Processing is a key function of computing systems, allowing raw data to be converted into useful outputs.

In Computing:
Processing happens in the central processing unit (CPU), where instructions are executed and data is manipulated according to a program's logic.

Example: When you use a calculator to add two numbers, the input numbers go
through a series of processing steps in the calculator's system to produce the sum.
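The same idea can be mirrored in a few lines of Python, with invented scores standing in for the raw data:

```python
# Raw data: unprocessed figures with no context by themselves.
raw_scores = [12, 45, 67]

# Processing: operations that transform the raw data.
total = sum(raw_scores)
average = total / len(raw_scores)

# Information: the processed result, now meaningful to a user.
print(f"Total: {total}, Average: {average:.2f}")  # → Total: 124, Average: 41.33
```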

Program
Set of instructions written in a programming language that a computer follows to
perform specific tasks. These instructions tell the computer how to process data,
execute operations, and produce desired outcomes. Programs can range from
simple scripts that perform a single function to complex software systems with
multiple features and capabilities.

Programs are written by programmers and can be designed for various purposes,
such as word processing, games, data analysis, or controlling hardware.


Example: Microsoft Word is a program that allows users to create and edit text
documents, while a calculator app is a program that performs mathematical
operations.

Algorithm
Step-by-step set of instructions or rules designed to solve a specific problem or
perform a task. It is a conceptual framework, not tied to any specific programming
language.

Example:

1. Start, 2. Take two numbers, 3. Add them, 4. Output the result, 5. End.
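Those five steps map almost line for line onto a short Python program (the function name is illustrative):

```python
def add_two_numbers(a, b):
    # Steps 2-4 of the algorithm: take two numbers, add them, output the result.
    result = a + b
    print(result)
    return result

add_two_numbers(3, 4)  # → 7
```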

Database
Organized collection of structured data typically stored and accessed
electronically from a computer system. Databases are designed to efficiently store,
retrieve, and manage data, often using tables with rows and columns.

Example: A company may have a customer database that stores names, addresses,
purchase history, and contact details in an organized format.

Database Management System (DBMS)


Software that provides tools and functions to create, manage, and manipulate
databases. It allows users to store, retrieve, and modify data in an organized
manner. It handles tasks like data security, consistency, and recovery.

Example: MySQL, PostgreSQL, Oracle, and Microsoft SQL Server.

SQL (Structured Query Language)


Standardized programming language used to manage and manipulate relational
databases. SQL enables users to query, insert, update, and delete data from
databases, as well as manage database structure.

Example: SELECT * FROM customers WHERE city = 'London';
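The query above can be tried end to end with Python's built-in sqlite3 module; the customers table and its rows are hypothetical, invented for illustration:

```python
import sqlite3

# In-memory database so nothing is written to disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [("Amina", "London"), ("Karim", "Algiers")],
)

# The SELECT from the example: all customers whose city is London.
rows = conn.execute("SELECT * FROM customers WHERE city = 'London'").fetchall()
print(rows)  # → [('Amina', 'London')]
conn.close()
```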


Big Data
Extremely large datasets that are too complex or voluminous to be processed by
traditional data management tools. Big data often includes diverse and high-speed
data from various sources, such as social media, IoT sensors, and transaction logs.
Big data is characterized by the "3 Vs": Volume, Velocity, and Variety.

Example: Analyzing all tweets on Twitter over a year is a Big Data task because
of the massive volume, variety of content, and speed of new data generation.

Computer working principle


The computer operates on the input-process-output (IPO) cycle, which describes how data is received, processed, and transformed into useful information.

Input
The computer receives data and instructions from external sources (users, devices,
sensors) through input devices like a keyboard, mouse, or scanner.

Example: Typing numbers into a calculator program or scanning a barcode into a computer system.

Processing
The central processing unit (CPU) is the brain of the computer, where the actual
processing takes place. The CPU executes instructions and manipulates data
according to the program’s logic. The operations include arithmetic calculations,
logical comparisons, and data manipulation.


Example: A CPU adds two numbers as instructed by a program and produces the result.

Output
After processing, the results are sent to an output device like a monitor, printer, or
speaker. This is the stage where data is presented as useful information.

Example: A computer displays the sum of two numbers on the screen after it
performs a calculation.

Storage
Computers store data for later use, either temporarily (in RAM, or Random
Access Memory) or permanently (in hard drives or SSD, Solid State Drives).
Data can be retrieved, modified, or reprocessed as needed.

Example: Saving a Word document to the hard drive allows you to retrieve it
later.
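The four stages (input, processing, output, and storage) can be sketched in a few lines of Python; the file name and values are invented for illustration:

```python
import json

# Input: data received from some external source (hard-coded here).
numbers = [3, 4]

# Processing: the program's logic manipulates the data.
total = sum(numbers)

# Output: the result is presented as useful information.
print(f"Sum: {total}")  # → Sum: 7

# Storage: the result is saved so it can be retrieved and reused later.
with open("result.json", "w") as f:
    json.dump({"numbers": numbers, "sum": total}, f)

with open("result.json") as f:
    restored = json.load(f)["sum"]  # retrieved from storage
```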

Computer key advantages

Speed: Computers can process data and perform calculations at extremely high speeds. Tasks that would take humans hours or days can be done in seconds or minutes by computers.

Accuracy: Computers perform operations with a high degree of accuracy, minimizing human error. As long as the input data and program logic are correct, the results will be error-free.

Automation: Computers can automate repetitive or complex tasks without the need for constant human intervention. Automation saves time and labor and increases efficiency.

Storage and Retrieval: Computers can store vast amounts of data in relatively small spaces and retrieve that data quickly. Data can be stored permanently on hard drives, cloud services, or other storage media.

Multitasking: Computers can perform multiple tasks simultaneously without a loss in performance. Modern operating systems allow running several applications at once.

Communication: Computers enable instant communication across the world through networks (e.g., the internet). They allow the exchange of information via email, video calls, and messaging.

Data Management and Analysis: Computers excel at organizing, managing, and analyzing large datasets, providing valuable insights and enabling decision-making based on data trends and patterns.

Versatility: Computers are used for a wide variety of tasks, from entertainment (watching movies, playing games) to professional work (software development, graphic design, accounting).

Connectivity and Networking: Computers are connected to each other via networks (LAN, WAN, or the internet), enabling resource sharing, remote access, and distributed computing.

Cost Efficiency: Computers help reduce operational costs by automating tasks, reducing human labor, and minimizing errors. They improve productivity and efficiency in businesses and industries.

Access to Information and Education: Computers allow users to access vast amounts of information online, ranging from educational resources to entertainment and professional development tools. E-learning platforms have expanded access to education globally.

Consistency and Reliability: Computers perform tasks with consistent outcomes. Unlike humans, they do not experience fatigue, making them ideal for tasks that require continuous performance and reliability.

Application domains
Business and Finance:
Accounting and Financial Management, E-commerce, Customer Relationship Management (CRM)

Education and E-learning:
E-learning Platforms, Research and Simulations, Virtual Labs and Educational Software

Healthcare:
Medical Diagnosis and Imaging, Electronic Health Records, Telemedicine

Engineering and Manufacturing:
Computer-Aided Design, Automation and Robotics, 3D Printing

Science and Research:
Simulations and Modeling, Data Analysis and Visualization, Artificial Intelligence (AI)

Entertainment and Media:
Video Games and Graphics, Streaming Services, Audio and Video Production

Communication and Social Media:
Email and Messaging Services, Social Media, Collaborative Tools

Transportation and Navigation:
Air Traffic Control, Autonomous Vehicles, Logistics and Supply Chain Management

Defense and Security:
Cybersecurity, Simulation and Training, Surveillance Systems

Banking and Finance:
Online Banking, Stock Trading, Cryptocurrency

Government and Public Services:
E-Government, Smart Cities, Public Health

Agriculture:
Precision Agriculture, Automated Machinery, Drones

Types of Computer
There are five main categories according to size and capabilities: Supercomputer,
Mainframe, Server, Personal Computer (PC), and Embedded Systems.

Personal Computer (PC), also called microcomputer


• The smallest, least expensive, most widely used.
• Has small memory and low processing power.
• Usually used by one person at a time (single-user).
• It includes three categories: Desktop, Mobile, and Wearable.

Desktop Computer:
It operates continuously in a fixed location.

Mobile computer
• Long-lasting battery
• Wireless capabilities such as Wi-Fi
• Portability.
• Most common: Laptop, Tablet, Smartphone


Wearable Gadget
A small device designed to be worn or mounted on the body. It serves specific functions such as health monitoring, as well as general functions such as reading emails.

Supercomputer
• The largest in size
• The most powerful computer in terms of speed and accuracy.
• Solve complex problems
• Use multiple processors in one system.

Mainframe
• A large and powerful computer.
• Users access its resources simultaneously using terminals or PCs
• Used for central storage, central processing and managing large amounts of
data.
• It can handle huge amounts of incoming and outgoing data at the same time.
• It is flexible.

Server computer
• Powerful
• Servers are the backbone of the Internet.
• They are used to provide resources, services and functions to client
computers.
• They are of the following types: file servers, database servers, print servers, FTP servers, application servers, and web servers.
• They operate without interruption
• They allow quick replacement (hot-swapping) of storage units and other components

Embedded Systems
• Autonomous electronic devices designed to perform specific computing tasks
• They consist of a set of peripheral devices, a microprocessor chip, and software


• The core of these systems is the microprocessor that performs the specified
task in a repetitive manner.
• They are used in control and monitoring of industrial and medical machines,
transportation, and communication.

Well-known examples include drones, ATMs, anti-lock braking systems (ABS), digital cameras, and digital watches.

Historical overview of informatics and computers


There are five generations since 1946

Before the first generation:

• Primitive mechanical calculating machines, then electronic


• The first programmable computer, the Z1, was built in Germany in 1938 by Konrad Zuse, but it was mechanical rather than electronic.

First generation 1946-1959:


Most important events:
• The first electronic computer, ENIAC, appeared in 1946. Its early calculations supported the design of the hydrogen bomb; it weighed 30 tons, occupied about 167 square meters, and contained 18,000 vacuum tubes.
• The invention of the transistor in 1947, followed by the Fortran programming language in 1957.
• Invention of the integrated circuit (chip) in 1958

Features:
• Computers were very large and heavy
• Built from vacuum tubes, which consumed a great deal of electrical power and generated high heat
• The speed of executing operations was somewhat slow
• Machine language (the binary system) was used to write programs (very complex)
• Use of the magnetic drum as a medium for entering and storing data


• Primitive printing machines to extract results

Second generation 1959-1965:


Most important events:
• Manufacturing and marketing integrated circuits in 1961.

Features:
• Replacing vacuum tubes with transistors (Reducing the size)
• The speed of executing operations is estimated at thousands per second
• Using hard disks for storage
• Easy and more sophisticated programming languages instead of machine language: ALGOL, FORTRAN, COBOL

Third generation 1965-1970:


Most important events:
• Appearance of the first minicomputer in 1965, the PDP-8. Integrated circuits were used in its manufacture.
• Invention of the Unix text-based operating system in 1969.

Features:
• Use of integrated circuits made of silicon chips in computer manufacturing
• Small size and low cost in manufacturing: the emergence of mini computers
• Speed is now measured in nanoseconds
• Emergence of fast input and output devices and color screens
• Emergence of new programming languages: Logo, Pascal
• Birth of the computer network

Fourth generation 1970-1980:


Most important events:
• In 1971: The American company Intel was able to assemble several
integrated circuits into one integrated circuit, announcing the birth of the first
processor (microprocessor) called "4004", which was made up of 2300
transistors and could perform ninety thousand operations per second.

• In 1973, the first French-made microcomputer (Micral-N) was invented using an Intel processor.
• In 1976, Apple launched its first computer, the Apple I, followed by the Apple II in 1977, which is considered the first commercially successful computer among ordinary users, popularizing the name personal computer (PC).

Features:
• A major revolution in computer hardware and software at the same time.
• Small size, accuracy, greater memory capacity, and low cost
• Speed is measured in millions of operations per second.
• Emergence of memories (RAM) and (ROM)
• Input and output devices are more advanced and easier to use
• Floppy disks
• High-level programming languages: BASIC, C
• Emergence of hacking in computer networks

Fifth generation since 1981:


Most important events:
• IBM PC release supported by MS DOS operating system
• Its ability to work with other non-IBM electronic devices
• Computers compatible with the IBM standard from international companies
• Standard unification

Features:
• Huge speed and very large storage capacity
• Very advanced and object-oriented languages such as: C++, Java, Python
• Graphical operating systems such as Windows, which rely on the mouse.
• Spread and expansion of networks and Internet and its various services.
• Artificial intelligence, which may exceed human intelligence.
• Increased productivity: sound and image interaction with the computer
• Huge diversity in the shapes, sizes, tasks, OS and hardware


• Race to make supercomputers.


• Explosion of computer software industry
• Branching and ramifications of computer science

Emerging technologies and future aspirations:


Artificial intelligence
Artificial intelligence is a branch of computer science concerned with giving computer programs behaviors and characteristics that simulate human mental abilities and patterns of work. The most important of these characteristics are the ability to learn, infer, and react to particular situations.

Nanocomputer
A nanocomputer is technically a computer whose basic parts do not exceed a few
nanometers. Nanocomputers are not yet commercially available, but the term has
been used in science and science fiction.

Quantum computer
A quantum computer is a computer that relies on quantum physics. It emerged as transistor miniaturization approached its physical limits, and it is based on the principle of storing data at the atomic scale. Its most important characteristics are the possibility of superposition in its basic unit, the qubit, and its very high speed in processing and executing certain complex operations.

Cloud computing
Cloud computing is a term that refers to computer resources and systems available
on demand over the network that can provide a number of integrated computer
services without being restricted to local resources in order to facilitate the user.
These resources include space for data storage, backup, and self-synchronization,
as well as software processing capabilities, task scheduling, email push, and
remote printing. When connected to the network, the user can control these resources through a simple programming interface that hides many internal details and operations.

Internet of Things (IoT)


Internet of Things (IoT) is a network of physical devices, vehicles, home appliances, and other objects embedded with electronics, software, sensors, actuators, and network connectivity that enable them to communicate and exchange data. Each thing is uniquely identifiable through its embedded computing system, yet is able to interoperate within the existing Internet infrastructure.

The Internet of Things enables humans to control things effectively and easily, both nearby and from afar. For example, a user can start a car engine and control it from a computer, run a washing machine remotely, or check the contents of a refrigerator over an internet connection. However, these are examples of the primitive form of the Internet of Things; the more mature form is for different "things" to communicate with each other over the internet.

3D printing
3D printing is an additive manufacturing technology in which a part's 3D design is sliced into very thin layers by software; a 3D printer then builds the part by depositing one layer on top of another until its final shape is formed.

Augmented Reality
Augmented Reality is based on projecting virtual objects and information into the
real environment to provide additional information or serve as a guide for it,
unlike virtual reality.

Virtual reality
Virtual reality, by contrast, immerses the user in a fully computer-generated environment that replaces the real one.

