
Time Complexity

What Is Time Complexity?

1. Time complexity describes how the number of basic operations an
algorithm performs grows as a function of the length of its input.
2. Time complexity is not a measurement of the wall-clock time it
takes to execute a particular algorithm, because that also depends on
factors such as the programming language, operating system, and
processing power.
3. Time complexity is a type of computational complexity that
describes the time required to execute an algorithm, obtained by
counting how many times each statement runs.
Types Of Time Complexity :

1. Best Time Complexity: The input for which the algorithm takes the
least (minimum) time. The best case gives a lower bound on the
algorithm's running time. Example: In linear search, the best case
occurs when the search key is present at the first location of the
data.

2. Average Time Complexity: In the average case, we take all possible
(random) inputs, calculate the computation time for each input, and
then divide the total by the number of inputs.

3. Worst Time Complexity: The input for which the algorithm takes the
longest (maximum) time. The worst case gives an upper bound on the
algorithm's running time. Example: In linear search, the worst case
occurs when the search key is present at the last location of the
data, or is absent.
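The best and worst cases above can be sketched with a small linear-search function (a minimal illustration, not part of the original slides; the name linear_search is ours):

```c
#include <stddef.h>

/* Linear search: returns the index of key in arr, or -1 if absent.
   Best case: key at index 0 (one comparison).
   Worst case: key at the last index, or not present (n comparisons). */
int linear_search(const int *arr, size_t n, int key) {
    for (size_t i = 0; i < n; i++) {
        if (arr[i] == key)
            return (int)i;
    }
    return -1;
}
```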
Understanding Various Time Complexity Notations

The following are the most common asymptotic notations for
expressing an algorithm's running-time complexity.

1. O Notation
2. Ω Notation
3. θ Notation
Big Oh Notation, O
The systematic way to express the upper bound of an algorithm's
running time is the Big-O notation, O(n). It captures the worst-case
time complexity, or the maximum time an algorithm will take to
complete execution.

Do remember that this is the most commonly used notation for
expressing the time complexity of different algorithms, unless
otherwise specified.
Omega Notation, Ω

The systematic way to express the lower bound of an algorithm's
running time is the notation Ω(n). It captures the best-case time
complexity, or the shortest time an algorithm will take to complete.
Theta Notation, θ
The systematic way to express both the lower bound and upper bound of
an algorithm's running time is the notation θ(n); it applies when the
two bounds match.
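For reference, the three notations can be stated formally as follows (standard textbook definitions, added here for precision):

```latex
f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\ f(n) \le c\, g(n)
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\ f(n) \ge c\, g(n)
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))
```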
Common Time Complexities
O(1) -> Constant
O(log n) -> Logarithmic
O(n) -> Linear
O(n^2) -> Quadratic
O(n^3) -> Cubic
O(2^n) -> Exponential
O(n!) -> Factorial
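Binary search is the classic example of the O(log n) row above: every iteration halves the range still to be searched. A minimal sketch (the name binary_search is ours; the array must already be sorted):

```c
#include <stddef.h>

/* Binary search on a sorted array: each step halves the search range,
   so at most about log2(n) iterations run -> O(log n) time. */
int binary_search(const int *arr, size_t n, int key) {
    size_t lo = 0, hi = n;              /* search range is [lo, hi) */
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;
        if (arr[mid] == key)
            return (int)mid;
        else if (arr[mid] < key)
            lo = mid + 1;               /* discard the left half */
        else
            hi = mid;                   /* discard the right half */
    }
    return -1;                          /* key not present */
}
```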


How to Calculate Time Complexity
1. Identify Basic Operations: Look at the number of basic
operations (like comparisons, assignments, arithmetic operations)
in the algorithm.

2. Count the Most Significant Operations:
-> Focus on loops, recursive calls, and statements
executed multiple times.
-> Ignore constant factors and lower-order terms.

3. Analyze Loops:
-> Single Loop: A loop running from 1 to n has a time
complexity of O(n).
-> Nested Loops: If one loop runs n times, and inside it,
another loop runs n times, the total time complexity is O(n^2).

4. Analyze Recursion:
-> Recurrence relations can help in determining
time complexity for recursive algorithms.
-> Example: A function that calls itself twice
for each value of n has a complexity of O(2^n).

5. Combine Time Complexities:
-> If the algorithm has multiple parts, find the
time complexity of each part and take the maximum.
-> For example, if one part takes O(n) and
another part takes O(n^2), the overall time complexity
is O(n^2).
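Step 5 can be made concrete by counting operations in an algorithm with two sequential parts (a minimal sketch; the name count_operations is ours):

```c
#include <stddef.h>

/* Two sequential parts: the single loop does n work (O(n)), the
   nested loops do n*n work (O(n^2)). Total work = n + n^2; the
   dominant term gives the overall complexity O(n^2).
   Returns the total operation count so the sum can be inspected. */
size_t count_operations(size_t n) {
    size_t ops = 0;
    for (size_t i = 0; i < n; i++)      /* part 1: O(n) */
        ops++;
    for (size_t i = 0; i < n; i++)      /* part 2: O(n^2) */
        for (size_t j = 0; j < n; j++)
            ops++;
    return ops;                         /* n + n*n */
}
```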
Example 1: Single Loop

for (int i = 0; i < n; i++)
{
    // Constant time operation: O(1)
}

Time Complexity: O(n) because the loop runs n times.

Example 2: Nested Loop

for (int i = 0; i < n; i++)
{
    for (int j = 0; j < n; j++)
    {
        // Constant time operation: O(1)
    }
}

Time Complexity: O(n^2) because the outer loop runs n times, and
for each iteration of the outer loop, the inner loop runs n times.
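Example 3: Recursion (a small sketch added to illustrate step 4 above; the naive Fibonacci function shown is a standard example, not from the original slides)

```c
/* Naive Fibonacci: each call with n > 1 makes two recursive calls,
   so the call tree roughly doubles at every level -> O(2^n) time. */
long fib(int n) {
    if (n <= 1)
        return n;               /* base cases: fib(0)=0, fib(1)=1 */
    return fib(n - 1) + fib(n - 2);
}
```

Time Complexity: O(2^n) because each call for n > 1 spawns two further recursive calls.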
Space Complexity

Space complexity in data structures is the amount of memory an
algorithm needs to solve a problem, expressed as a function of the
input's characteristics. It is a key factor in determining how
scalable a solution is and how well a program can handle large
amounts of data.
Components of Space Complexity
1. Fixed Part
This includes the space required for constants, simple variables,
fixed-size data structures, and the code itself. It is the memory that
does not change regardless of the input size.

Examples:
-> Space needed for storing constants like numbers or fixed-size
arrays.
-> Memory used by the program code and instructions.


2. Variable Part
This includes the space required for dynamic memory allocation,
recursive stack space, and temporary variables whose size
depends on the input size.

Examples:
-> Space needed for dynamic data structures like linked lists,
trees, or graphs, which grow with the input size.
-> Memory used for the recursion stack during recursive function
calls.
-> Temporary variables and data structures used during the
execution of the algorithm.
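The recursion-stack point above is easy to miss: a function can use O(n) space without allocating any data structure at all. A minimal sketch (the name recursive_sum is ours):

```c
/* Recursive sum of 1..n: each pending call stays on the call stack
   until its recursive call returns, so the stack grows to depth n.
   Space complexity: O(n), even though no array is ever allocated. */
long recursive_sum(int n) {
    if (n == 0)
        return 0;               /* base case: empty sum */
    return n + recursive_sum(n - 1);
}
```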
Common Space Complexities

Space Complexity | Description        | Examples
O(1)             | Constant space     | Finding the max in an array
O(n)             | Linear space       | Storing a list of n elements
O(n^2)           | Quadratic space    | 2D matrix operations
O(log n)         | Logarithmic space  | Binary search recursion depth
O(n log n)       | Linearithmic space | Merge Sort
O(2^n)           | Exponential space  | Subset sum problem using recursion
O(n!)            | Factorial space    | Generating all permutations of a string
Data Structure                             | Space Complexity | Explanation
Array                                      | O(n) | Space is proportional to the number of elements stored in the array.
Linked List                                | O(n) | Space is proportional to the number of nodes, each node holding data and a pointer.
Stack                                      | O(n) | Space is proportional to the number of elements in the stack.
Queue                                      | O(n) | Space is proportional to the number of elements in the queue.
Hash Table                                 | O(n) | Space is proportional to the number of elements stored, including the array and the linked lists (or other collision-handling structures).
Binary Tree                                | O(n) | Space is proportional to the number of nodes in the tree.
Binary Search Tree                         | O(n) | Space is proportional to the number of nodes in the tree.
Balanced Trees (e.g., AVL, Red-Black Tree) | O(n) | Space is proportional to the number of nodes in the tree.
Heap (Binary Heap)                         | O(n) | Space is proportional to the number of elements in the heap.
Thank You
