Lecture 2.1.1
Course Objectives
Course Outcomes
Scheme of Evaluation
CONTENTS
• Data and information are interrelated. Data usually refers to raw or unprocessed facts, while information is data that has been processed into a meaningful form.
http://www.differencebetween.info/difference-between-data-and-information
Data Type & Data Structure
Data type
• Set of possible values for variables
• Operations on those values
Ex: int, float, char, ...
Data Structure
A data structure is an arrangement of data in a computer's memory or even disk storage.
The logical and mathematical model of a particular organization of data is called a data structure.
A data structure is a particular way of storing and organizing data in a computer so that it can be used efficiently.
https://www.kullabs.com/classes/subjects/units/lessons/notes/note-detail/4094
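To make the distinction concrete, here is a minimal C sketch (an assumed illustration, not taken from the slides; the Student record is hypothetical): int is a data type supplying values and operations, while the struct and the array of structs form a simple data structure organizing related data in memory.

#include <stdio.h>

/* Data type: int defines a set of values and operations (+, -, *, ...).
   Data structure: a struct, and an array of structs, arranges related
   data in memory so that it can be stored and accessed efficiently.   */

struct Student {              /* hypothetical record, for illustration only */
    int   roll_no;
    float marks;
};

int main(void) {
    int count = 2;                          /* data type: int */
    struct Student class_list[2] = {        /* data structure: array of structs */
        {1, 78.5f},
        {2, 91.0f}
    };

    for (int i = 0; i < count; i++)         /* traversing the structure */
        printf("Roll %d: %.1f\n", class_list[i].roll_no, class_list[i].marks);

    return 0;
}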
Operations on Data Structure
Traversing: Accessing each record exactly once so that certain items in the record may be processed.
Searching: Finding the location of a particular record with a given key value, or finding the locations of all records which satisfy one or more conditions.
Inserting: Adding a new record to the structure.
Deleting: Removing a record from the structure.
Sorting: Arranging the data or records in some logical order (ascending or descending).
Merging: Combining the records of two sorted files into a single sorted file.
https://www.slideshare.net/rameeshasadaqat/data-structure-its-types/
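A rough sketch of how several of these operations look on the simplest structure, an array, is shown below in C (an illustrative example, not from the slides; the function names insert and delete_at are made up for the illustration).

#include <stdio.h>

#define CAPACITY 10

/* Traversing: visit each element exactly once. */
void traverse(const int a[], int n) {
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");
}

/* Searching: return the index of key, or -1 if it is not present. */
int search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;
    return -1;
}

/* Inserting: place value at position pos, shifting later elements right. */
int insert(int a[], int n, int pos, int value) {
    for (int i = n; i > pos; i--)
        a[i] = a[i - 1];
    a[pos] = value;
    return n + 1;                    /* new number of elements */
}

/* Deleting: remove the element at position pos, shifting later elements left. */
int delete_at(int a[], int n, int pos) {
    for (int i = pos; i < n - 1; i++)
        a[i] = a[i + 1];
    return n - 1;                    /* new number of elements */
}

int main(void) {
    int a[CAPACITY] = {10, 20, 40, 50};
    int n = 4;

    traverse(a, n);                  /* 10 20 40 50    */
    n = insert(a, n, 2, 30);
    traverse(a, n);                  /* 10 20 30 40 50 */
    printf("index of 40: %d\n", search(a, n, 40));
    n = delete_at(a, n, 0);
    traverse(a, n);                  /* 20 30 40 50    */
    return 0;
}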
Algorithm Complexity
• Algorithm complexity is a measure that evaluates the order of the count of operations performed by a given algorithm, as a function of the size of the input data.
• To put it more simply, complexity is a rough approximation of the number of steps necessary to execute an algorithm.
• When we evaluate complexity we speak of the order of the operation count, not of its exact count.
• For example, if we have an order of N² operations to process N elements, then N²/2 and 3*N² are of one and the same quadratic order.
• Algorithm complexity is commonly represented with the O(f) notation, also known as asymptotic notation or "Big O notation", where f is a function of the size of the input data.
• The asymptotic computational complexity O(f) measures the order of the resources (CPU time, memory, etc.) consumed by a certain algorithm, expressed as a function of the input data size.
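To make the idea of counting only the order of operations concrete, the small C sketch below (an assumed example, not from the slides) performs roughly N²/2 comparisons on N elements; the constant factor 1/2 is dropped and the routine is still classified as O(N²).

#include <stdio.h>

/* Counts pairs (i, j) with i < j whose elements are equal.
   The inner loop body runs (N-1) + (N-2) + ... + 1, i.e. about N²/2 times,
   so the operation count is of quadratic order: O(N²).                    */
long count_equal_pairs(const int a[], int n) {
    long comparisons = 0;
    long equal = 0;
    for (int i = 0; i < n; i++) {
        for (int j = i + 1; j < n; j++) {
            comparisons++;
            if (a[i] == a[j])
                equal++;
        }
    }
    printf("n = %d, comparisons = %ld (about n*n/2)\n", n, comparisons);
    return equal;
}

int main(void) {
    int data[] = {3, 1, 4, 1, 5, 9, 2, 6, 5, 3};
    count_equal_pairs(data, 10);     /* prints 45 comparisons, i.e. 10*9/2 */
    return 0;
}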
Algorithm Complexity
• 1. Space Complexity – The space complexity of an algorithm, and hence of a program, is the amount of memory it needs to run to completion. Some of the reasons for studying space complexity are:
• If the program is to run on a multi-user system, it may be required to specify the amount of memory to be allocated to the program.
• We may be interested to know in advance whether sufficient memory is available to run the program.
• It can be used to estimate the size of the largest problem that a program can solve.
• 2. Time Complexity – The time complexity of an algorithm is the amount of time it needs to run to completion. Some of the reasons for studying time complexity are:
• We may be interested to know in advance whether the program will provide a satisfactory real-time response. For example, an interactive program such as an editor must provide such a response. If it takes even a few seconds to move the cursor one page up or down, it will not be acceptable to the user.
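A brief illustrative contrast in C (an assumed example, not from the slides): both functions below reverse an array in O(n) time, but the first needs O(n) auxiliary memory while the second works in place with only O(1) extra space.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* O(n) auxiliary space: copies the elements into a temporary array. */
void reverse_with_copy(int a[], int n) {
    int *tmp = malloc(n * sizeof(int));
    if (tmp == NULL)
        return;                              /* allocation failed */
    for (int i = 0; i < n; i++)
        tmp[i] = a[n - 1 - i];
    memcpy(a, tmp, n * sizeof(int));
    free(tmp);
}

/* O(1) auxiliary space: swaps elements in place. */
void reverse_in_place(int a[], int n) {
    for (int i = 0, j = n - 1; i < j; i++, j--) {
        int t = a[i];
        a[i] = a[j];
        a[j] = t;
    }
}

int main(void) {
    int a[] = {1, 2, 3, 4, 5};
    reverse_in_place(a, 5);
    for (int i = 0; i < 5; i++)
        printf("%d ", a[i]);                 /* 5 4 3 2 1 */
    printf("\n");
    reverse_with_copy(a, 5);
    for (int i = 0; i < 5; i++)
        printf("%d ", a[i]);                 /* 1 2 3 4 5 */
    printf("\n");
    return 0;
}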
Algorithm Complexity
• Time-space tradeoff – The best algorithm, and hence the best program, to solve a given problem is one that requires less space in memory and takes less time to complete its execution. But in practice, it is not always possible to achieve both of these objectives.
• Best Case Complexity – The term best-case performance is used in computer science to describe an algorithm's behaviour under optimal conditions. For example, the best case for a simple linear search on a list occurs when the desired element is the first element of the list.
• Average Case Complexity – The average-case complexity of an algorithm is the amount of some computational resource used by the algorithm, averaged over all possible inputs.
• Worst Case Complexity – The worst-case complexity measures the resources (e.g. running time, memory) an algorithm requires in the worst case. It gives an upper bound on the resources required by the algorithm.
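The linear-search example mentioned above can be sketched in C as follows (an illustrative sketch, not from the slides); the comments mark where the best, average, and worst cases arise.

#include <stdio.h>

/* Linear search: returns the index of key in a[0..n-1], or -1 if it is absent.
   Best case:    key is the first element     -> 1 comparison,          O(1).
   Average case: key is somewhere in the list -> about n/2 comparisons, O(n).
   Worst case:   key is last or not present   -> n comparisons,         O(n).  */
int linear_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;
    return -1;
}

int main(void) {
    int a[] = {7, 3, 9, 4, 1};
    printf("%d\n", linear_search(a, 5, 7));  /* best case: found at index 0 */
    printf("%d\n", linear_search(a, 5, 8));  /* worst case: not present, -1 */
    return 0;
}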
Various complexities of algorithms
• Constant: O(1) – a constant number of operations, not depending on the input data size,
e.g. n = 10000 → 1 operation.
• Logarithmic: O(log n) – number of operations proportional to log n, where n is the size of the input data,
e.g. n = 1000 → 10 operations.
• Linear: O(n) – number of operations proportional to the input data size,
e.g. n = 10000 → 10000 operations.
• Quadratic: O(n²) – number of operations proportional to the square of the size of the input data,
e.g. n = 500 → 250000 operations.
• Cubic: O(n³) – number of operations proportional to the cube of the size of the input data,
e.g. n = 10 → 1000 operations.
• Exponential: O(2ⁿ) – an exponential number of operations, fast growing,
e.g. n = 20 → 1048576 operations.
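As a rough C illustration of the first few growth rates (an assumed example, not from the slides), each function below performs a number of basic steps of constant, logarithmic, linear, or quadratic order.

#include <stdio.h>

/* O(1): constant - a single array access, independent of n. */
int first_element(const int a[], int n) {
    (void)n;                         /* n is not needed here */
    return a[0];
}

/* O(log n): logarithmic - binary search on a sorted array halves the range each step. */
int binary_search(const int a[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == key) return mid;
        if (a[mid] < key)  lo = mid + 1;
        else               hi = mid - 1;
    }
    return -1;
}

/* O(n): linear - touches every element once. */
long sum(const int a[], int n) {
    long s = 0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* O(n²): quadratic - compares every pair of elements. */
int count_duplicate_pairs(const int a[], int n) {
    int count = 0;
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (a[i] == a[j])
                count++;
    return count;
}

int main(void) {
    int sorted[] = {1, 2, 3, 5, 8, 13, 21, 34};
    printf("first: %d\n", first_element(sorted, 8));                   /* O(1)     */
    printf("position of 13: %d\n", binary_search(sorted, 8, 13));      /* O(log n) */
    printf("sum: %ld\n", sum(sorted, 8));                              /* O(n)     */
    printf("duplicate pairs: %d\n", count_duplicate_pairs(sorted, 8)); /* O(n²)    */
    return 0;
}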
Applications
• To improve the agility of software.
• To decrease maintenance cost.
• To meet architectural standards and improve code quality.
REFERENCES
• Lipschutz, Seymour, “Data Structures”, Schaum's Outline Series, Tata McGraw Hill.
• Goodrich, Michael T., Tamassia, Roberto, and Mount, David M., “Data Structures and Algorithms in C++”, Wiley Student
Edition.
• https://www.tutorialspoint.com/data_structures_algorithms/algorithms_basics.htm
• https://www.cs.utexas.edu/users/djimenez/utsa/cs1723/lecture2.html
• Gilberg/Forouzan, "Data Structure with C", Cengage Learning.
• Augenstein, Moshe J., Tanenbaum, Aaron M., "Data Structures using C and C++", Prentice Hall of India.
THANK YOU