Time and Space Complexity
Time Complexity
Applying asymptotic analysis to measure the time requirement of an algorithm as a function of input size is known as time complexity. For this analysis, we assume each instruction takes a constant amount of time.
The total time complexity of a program is equal to the sum of the running times of its independent fragments.
Need for Time Complexity: When analyzing any algorithm, we need to evaluate its effectiveness, and accordingly we prefer the most optimized algorithm so as to save execution time. An example is linear search versus binary search. Suppose we need to search for a given value in a sorted array of size 10^9. Linear search would take up to 10^9 iterations, whereas binary search would take only log2(10^9) ≈ 30 iterations to find the same element, saving a lot of time. The time complexity of both binary search and linear search will be discussed in further lectures.
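To make the comparison concrete, here is a small sketch (an assumed helper class, not part of the lecture) that counts the iterations each search actually performs on the same sorted array:

```java
public class SearchSteps {
    // Returns the number of loop iterations linear search performs.
    static int linearSteps(int[] a, int target) {
        int steps = 0;
        for (int i = 0; i < a.length; i++) {
            steps++;
            if (a[i] == target) return steps;
        }
        return steps;
    }

    // Returns the number of loop iterations binary search performs.
    static int binarySteps(int[] a, int target) {
        int lo = 0, hi = a.length - 1, steps = 0;
        while (lo <= hi) {
            steps++;
            int mid = lo + (hi - lo) / 2;
            if (a[mid] == target) return steps;
            if (a[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return steps;
    }

    public static void main(String[] args) {
        int n = 1 << 20; // about 10^6 sorted elements
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = i;
        // Searching the last element: linear search touches all n elements,
        // binary search only about log2(n) = 20 of them.
        System.out.println(linearSteps(a, n - 1));
        System.out.println(binarySteps(a, n - 1));
    }
}
```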
Types of Time Complexity Analysis:
a. Worst Case Time Complexity: It is the case in which the algorithm takes the longest time to complete its execution. It is represented by Big-O notation. By default, when we refer to the time complexity of a piece of code, we mean its worst-case time complexity.
b. Best Case Time Complexity: It is the case in which the algorithm takes the least time to complete its execution. It is represented by Omega notation (Ω-notation).
c. Average Case Time Complexity: As the name suggests, it gives the average time a program takes to complete its execution. It is represented by Theta notation (Θ-notation).
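Linear search illustrates all three cases on the same code; the sketch below (an assumed example, not from the lecture) shows the best case (target at index 0, one comparison) and the worst case (target absent, n comparisons):

```java
public class CaseDemo {
    // Returns the index of target in a, or -1 if it is absent.
    static int linearSearch(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {7, 3, 9, 1};
        // Best case Ω(1): target is the first element, one comparison.
        System.out.println(linearSearch(a, 7)); // prints 0
        // Worst case O(n): target absent, all n elements are checked.
        System.out.println(linearSearch(a, 5)); // prints -1
    }
}
```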
Q1. Calculate the time complexity for the following code snippet.
int c = 0;
for (int i = 1; i < n; i += i) {
    c++;
}
Answer: O(logn)
Explanation: The loop starts at i = 1 and doubles i on every iteration, so i takes the values 1, 2, 4, ..., 2^k. The loop keeps running while 2^k < n and exits as soon as i reaches 2^(k+1) ≥ n. From 2^k < n we get k < log2(n), so the loop runs about log(n) times, giving O(logn).
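The iteration count can be checked empirically with a small helper (an assumed sketch, not from the lecture) that runs the Q1 loop and returns c:

```java
public class DoublingLoop {
    // Counts the iterations of the Q1 loop for a given n.
    static int count(int n) {
        int c = 0;
        for (int i = 1; i < n; i += i) {
            c++;
        }
        return c;
    }

    public static void main(String[] args) {
        // i takes the values 2^0, 2^1, ..., 2^19 before 2^20 >= 10^6,
        // so the count is 20, i.e. about log2(10^6).
        System.out.println(count(1_000_000)); // prints 20
    }
}
```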
Q2. Calculate the time complexity for the following code snippet.
Java + DSA
int c = 0;
for (int i = 1; i < n; i += i) {
    for (int j = 0; j < i; j++) {
        c++;
    }
}
Answer: O(n)
Explanation: The outer loop variable takes the values 1, 2, 4, 8, ..., 2^k, as in the previous question, and the inner loop runs i times for each i. So the total number of operations is 1 + 2 + 4 + 8 + ... + 2^k, where 2^k < n. This geometric series has about log(n) terms and sums to 2^(k+1) - 1, and since 2^k < n the total is less than 2n. Hence the time complexity is O(n).
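The geometric-series bound can be verified directly. A sketch (an assumed helper, not from the lecture) that reproduces the nested loop and returns the total operation count:

```java
public class GeometricSum {
    // Counts total inner-loop operations of the Q2 pattern for a given n.
    static int count(int n) {
        int c = 0;
        for (int i = 1; i < n; i += i) {
            for (int j = 0; j < i; j++) {
                c++;
            }
        }
        return c;
    }

    public static void main(String[] args) {
        // Total is 1 + 2 + ... + 2^19 = 2^20 - 1 = 1048575, i.e. < 2n.
        System.out.println(count(1_000_000)); // prints 1048575
    }
}
```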
Q3. Calculate the time complexity for the following code snippet.
int c = 0;
for (int i = 1; i < n; i += i) {
    for (int j = n; j >= 1; j--) {
        c++;
    }
}
Answer: O(nlogn)
Explanation: The outer loop runs log(n) times, as shown in Q1, and the inner loop runs n times for each i (going from n down to 1 in steps of 1). So the overall time complexity is O(nlogn).
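Counting the operations confirms the log(n) × n product. A sketch (an assumed helper, not from the lecture):

```java
public class NLogN {
    // Counts operations for the Q3 pattern: outer loop doubles i,
    // inner loop runs exactly n times on every pass.
    static long count(int n) {
        long c = 0;
        for (int i = 1; i < n; i += i) {
            for (int j = n; j >= 1; j--) {
                c++;
            }
        }
        return c;
    }

    public static void main(String[] args) {
        // 20 outer passes, each doing 10^6 inner steps: 20 * 10^6.
        System.out.println(count(1_000_000)); // prints 20000000
    }
}
```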
Q4. Calculate the time complexity for the following code snippet.
int c = 0;
for (int i = 1; i < n; i *= 2) {
    for (int j = 0; j < i; j++) {
        c++;
    }
}
Answer: O(n)
Explanation: The outer loop variable goes i = 1, 2, 4, 8, ..., 2^k such that 2^k < n, so k is of order log(n), and the inner loop runs i times for each i. The total is 1 + 2 + 4 + 8 + ... + 2^k = 2^(k+1) - 1, which is of order n, just as in Q2. Thus the time complexity is O(n).
Q5. Calculate the time complexity for the following code snippet.
int c = 0;
for (int i = 1; i * i < n; i *= 2) {
    for (int j = 0; j < i; j++) {
        c++;
    }
}
Answer: O(sqrt(n))
Explanation: The inner loop runs i times for each pass, and the outer loop variable goes i = 1, 2, 4, 8, ..., 2^k while (2^k)^2 < n, i.e. 2^(2k) < n, so 2k < log(n) and k < log(n)/2. Note that log(n)/2 can be written as log(sqrt(n)). The total number of operations is the geometric series 1 + 2 + 4 + ... + 2^k = 2^(k+1) - 1, and since 2^k < sqrt(n), the total is less than 2*sqrt(n). Hence the time complexity is O(sqrt(n)).
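The 2*sqrt(n) bound on the operation count can be checked with a sketch (an assumed helper, not from the lecture):

```java
public class SqrtLoop {
    // Counts operations for the Q5 pattern: outer runs while i*i < n,
    // inner runs i times per pass.
    static int count(int n) {
        int c = 0;
        for (int i = 1; i * i < n; i *= 2) {
            for (int j = 0; j < i; j++) {
                c++;
            }
        }
        return c;
    }

    public static void main(String[] args) {
        // i goes 1, 2, ..., 512 (since 1024^2 >= 10^6); the total
        // 1 + 2 + ... + 512 = 1023 is close to sqrt(10^6) = 1000.
        System.out.println(count(1_000_000)); // prints 1023
    }
}
```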
Q6. Calculate the time complexity for the following code snippet.
int c = 0;
for (int i = 1; i * i < n; i += i) {
    for (int j = i; j < n; j++) {
        c++;
    }
}
Answer: O(nlogn)
Explanation: As explained in Q5, the outer loop runs about log(sqrt(n)) times, and the inner loop runs n - i times for each i. So the total number of operations is (n-1) + (n-2) + (n-4) + ... over log(sqrt(n)) terms, which equals n*log(sqrt(n)) - (1 + 2 + 4 + ...) ≈ n*log(sqrt(n)) - sqrt(n) = (n/2)*log(n) - sqrt(n). In Big-O notation the (n/2)*log(n) term dominates, so the time complexity is O((n/2)*logn), or simply O(nlogn).
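The operation count for this pattern can also be verified with a sketch (an assumed helper, not from the lecture); the cast to long guards the i*i test against overflow for large n:

```java
public class HalfLogLoop {
    // Counts operations for the Q6 pattern: outer doubles i while i*i < n,
    // inner runs from i up to n - 1 (n - i operations per pass).
    static long count(int n) {
        long c = 0;
        for (long i = 1; i * i < n; i += i) {
            for (long j = i; j < n; j++) {
                c++;
            }
        }
        return c;
    }

    public static void main(String[] args) {
        // For n = 100 the outer values are 1, 2, 4, 8, so the total is
        // 99 + 98 + 96 + 92 = 385.
        System.out.println(count(100)); // prints 385
    }
}
```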
Q7. Calculate the time complexity for the following code snippet.
int c = 0;
for (int i = 2; i < n; i *= i) {
    c++;
}
Answer: O(loglogn)
Explanation: i is squared each time, so i takes the values 2, 2^2, 2^4, 2^8, ..., 2^(2^k). The loop exits when 2^(2^k) ≥ n, i.e. when 2^k ≥ log(n), which gives k ≈ log(log(n)). Hence the loop runs about loglog(n) times and the time complexity is O(loglogn). (With i *= 2 instead, the answer would be O(logn).)
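The very slow growth of a squaring loop can be seen empirically. A sketch (an assumed helper, not from the lecture; i is a long because squaring overflows int quickly):

```java
public class SquaringLoop {
    // Counts iterations of the Q7 loop (i squared each pass) for a given n.
    static int count(int n) {
        int c = 0;
        for (long i = 2; i < n; i *= i) {
            c++;
        }
        return c;
    }

    public static void main(String[] args) {
        // i takes the values 2, 4, 16, 256, 65536 before 65536^2 >= 10^6,
        // so only 5 iterations even for n = 10^6 -- loglog growth.
        System.out.println(count(1_000_000)); // prints 5
    }
}
```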
Q8. Calculate the time complexity for the following code snippet.
int c = 0;
for (int i = 2; i * i < n; i *= i) {
    c++;
}
Answer: O(loglogn)
Explanation: As in Q7, i takes the values 2, 2^2, 2^4, ..., 2^(2^k), but now the loop exits when i*i ≥ n, i.e. when 2^(2^k) ≥ sqrt(n). This gives 2^k ≥ log(n)/2, so k is of order log(log(n)) and the time complexity is again O(loglogn).
Q9. Given an array of size n+1 consisting of integers from 1 to n, one of the elements is duplicated in the array. Find that duplicate element.
Output: 3
// method 1: brute force, compare every pair
for (int i = 0; i < arr.length; i++) {
    for (int j = i + 1; j < arr.length; j++) {
        if (arr[i] == arr[j])
            return arr[i];
    }
}
return -1;
Time Complexity: O(n^2), as comparing every pair traverses 1 + 2 + 3 + ... + n elements, which is of order n^2.
// method 2: sorting
Arrays.sort(arr);
// compare each array element with its index
for (int i = 0; i < arr.length; i++) {
    if (arr[i] != i + 1) {
        return arr[i];
    }
}
return -1;
Sort the array, then traverse it and compare each element with its index: the first position where arr[i] != i + 1 holds the duplicate. Otherwise the array contains no duplicate from 1 to n, and in this case we return -1.
Time Complexity: O(nlogn), dominated by the sort.
// method 3: summation
int sum = 0;
for (int i = 0; i < arr.length; i++) {
    sum += arr[i];
}
return sum - n * (n + 1) / 2;
Explanation: The sum of the first n natural numbers is n*(n+1)/2, so we can simply subtract this from the sum of the array to get the repeating element.
Time Complexity: O(n), traversing the whole array once to calculate the sum.
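The sorting and summation methods above can be sketched as a self-contained class (an assumed example, not verbatim from the lecture; the sample array is hypothetical):

```java
import java.util.Arrays;

public class FindDuplicate {
    // Method 3 (summation): arr has size n+1 and holds 1..n plus one repeat.
    static int bySum(int[] arr) {
        int n = arr.length - 1;
        long sum = 0;
        for (int x : arr) sum += x;
        // Subtract 1 + 2 + ... + n; what remains is the duplicated value.
        return (int) (sum - (long) n * (n + 1) / 2);
    }

    // Method 2 (sorting): after sorting, the first index where
    // arr[i] != i + 1 marks the duplicate. O(n log n) time.
    static int bySort(int[] arr) {
        int[] a = arr.clone(); // keep the caller's array intact
        Arrays.sort(a);
        for (int i = 0; i < a.length; i++) {
            if (a[i] != i + 1) return a[i];
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] arr = {1, 3, 4, 2, 3}; // hypothetical sample: 3 is duplicated
        System.out.println(bySum(arr));  // prints 3
        System.out.println(bySort(arr)); // prints 3
    }
}
```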
Asymptotic Notations: Asymptotic notations are mathematical tools that allow you to analyze an algorithm's running time by identifying its behavior as the input size grows.
There are mainly three asymptotic notations:
a. Big-O Notation (O): Describes an upper bound on an algorithm's time complexity; the running time grows no faster than the stated function.
b. Omega Notation (Ω): Describes a lower bound; the running time grows at least as fast as the stated function.
c. Theta Notation (Θ): Describes both the upper and lower bounds of an algorithm's time complexity, representing a tight bound on how the algorithm's performance scales with input size. Example: Θ(n) means the algorithm's time grows linearly and is neither faster nor slower than a linear function for input size n.
Q11. What if this time we increment the loop variable by 2?
Time Complexity: O(n)
Explanation: Here i increases by 2 each time, so the loop runs n/2 times and the text is printed n/2 times. Dropping the constant factor, n/2 is still of order n, which verifies the time complexity we calculated.
Q12. Calculate the time complexity for traversing 2 arrays of size n and m.
for (int i = 0; i < n; i++) {
    a[i] = i;
}
for (int i = 0; i < m; i++) {
    b[i] = m - i;
}
Time Complexity: O(n+m), as the first loop runs n times and the second runs m times, for n + m operations in total.
Explanation: The first loop performs n operations as i goes from 0 to n-1; similarly, the second loop performs m operations as i goes from 0 to m-1. In total n + m operations are performed, thus O(n+m).
Q13. Calculate the time complexity for the below nested loop code snippet.
for (int i = 0; i < n; i++) {
    for (int j = 0; j < m; j++) {
        System.out.print("okay");
    }
}
Time Complexity: O(n*m)
Explanation: The outer loop runs n times and the inner loop runs m times, restarting for each iteration of the outer loop. So for i = 1 it performs m operations, and likewise for i = 2, ..., n. In total n*m operations are performed, so the time complexity is O(n*m).
Q14. Calculate the time complexity for the below code snippet.
int c = 0;
c++;
Q15. Calculate the time complexity for the below code snippet.
int c = 0;
for (int i = 0; i < n; i++) {
    for (int j = i + 1; j < m; j++) {
        c++;
    }
}
Case 1: if m > n, the number of operations is (m-1) + (m-2) + ... + (m-n) = (n/2)(2m-n-1), which is of order O(m*n).
Case 2: if n > m, the number of operations is (m-1) + (m-2) + ... + 1 + 0 = m(m-1)/2, which is of order O(m*m), i.e. O(m^2).
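The two cases can be checked against a dependent nested loop of this shape (a sketch under the assumption that the inner loop starts at i + 1 and is bounded by m, since the full snippet is truncated in the source):

```java
public class DependentLoops {
    // Counts operations for the Q15 pattern.
    static int count(int n, int m) {
        int c = 0;
        for (int i = 0; i < n; i++) {
            for (int j = i + 1; j < m; j++) {
                c++;
            }
        }
        return c;
    }

    public static void main(String[] args) {
        // Case 1, m > n: n = 3, m = 10 gives 9 + 8 + 7 = (3/2)(20-3-1) = 24.
        System.out.println(count(3, 10)); // prints 24
        // Case 2, n > m: n = 10, m = 4 gives 3 + 2 + 1 = 4*3/2 = 6.
        System.out.println(count(10, 4)); // prints 6
    }
}
```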
Space Complexity
Space Complexity is the extra memory space requirement of an algorithm, measured by applying asymptotic analysis.
We usually care about what is known as auxiliary space complexity, which does not include the space occupied by the input itself.
Need for Space Complexity: The amount of memory a system has can be limited, so we need to optimize the memory/space an algorithm takes in order to execute it on a system with bounded space limits.
Q1. Calculate the space complexity for the below code snippet.
int[] a = new int[n];
for (int i = 0; i < n; i++) {
    a[i]++;
}
Space Complexity: O(n), because an array of size n is created.
Q2. What will be the space complexity if we just traverse without creating any array?
int c = 0;
for (int i = 0; i < n; i++) {
    c++;
}
Space Complexity: O(1), because here we declare just a single int and, apart from that, use no extra space.
Q3. Calculate the space complexity for the below nested loop code snippet.
Q4. Space Complexity of creating a 2d matrix
Q5. What will be the space complexity if we create 3 arrays of the same size?
int[] a = new int[n], b = new int[n], c = new int[n];
for (int i = 0; i < n; i++) {
    c[i]++;
}
Space Complexity: O(n), since 3 arrays of size n take 3n space and O(3n) = O(n).
Q6. Calculate the time and space complexity for the below nested loop code snippet.
int[][] a = new int[n][n / 2];
for (int i = 1; i < n; i *= 2) {
    for (int j = 0; j < n / 2; j++) {
        a[i][j]++;
    }
}
Space Complexity: O(n^2) (or O(n^2/2) to be precise)
Explanation: A 2D matrix is declared with n rows and n/2 columns, so a total of n * n/2 cells of space will be used.
Time Complexity: O(nlogn) (or O((n/2)log(n)) to be precise)
Explanation: The outer loop runs log(n) times and the inner loop runs n/2 times for each i.
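Both figures can be checked with a sketch (an assumed helper, not from the lecture) that allocates the matrix and counts the update operations:

```java
public class MatrixLoop {
    // Returns {operation count, allocated cells} for the Q6 pattern.
    static long[] run(int n) {
        int[][] a = new int[n][n / 2]; // n * (n/2) cells: O(n^2) space
        long ops = 0;
        for (int i = 1; i < n; i *= 2) {      // about log2(n) passes
            for (int j = 0; j < n / 2; j++) { // n/2 updates per pass
                a[i][j]++;
                ops++;
            }
        }
        return new long[]{ops, (long) n * (n / 2)};
    }

    public static void main(String[] args) {
        // n = 16: 4 outer passes (i = 1, 2, 4, 8) * 8 inner steps = 32 ops,
        // and 16 * 8 = 128 allocated cells.
        long[] r = run(16);
        System.out.println(r[0]); // prints 32
        System.out.println(r[1]); // prints 128
    }
}
```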
THANK YOU!