Lesson Plan

Time and Space Complexity
Time Complexity
Applying asymptotic analysis to measure the time requirement of an algorithm as a function of its input size is known as time complexity. For time complexity analysis we assume each instruction takes a constant amount of time.
The total time complexity of a program is the sum of the running times of its independent (disconnected) fragments.
Need for Time Complexity: When analyzing any algorithm, we need to evaluate its effectiveness and prefer the most optimized algorithm so as to save execution time. An example is linear search versus binary search. Suppose we need to search for a given value in a sorted array of size 10^9. Linear search would take up to 10^9 iterations, whereas binary search would take only about log2(10^9) ≈ 30 iterations to find the same element, thereby saving a lot of time. The time complexity of both binary search and linear search will be discussed in further lectures.
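As a concrete illustration (a minimal sketch, not taken from the lesson; the method name binarySearch is ours), a binary search on a sorted array makes only about log2(n) comparisons, because each iteration halves the remaining search range:

    // Iterative binary search on a sorted int array.
    // Returns the index of target, or -1 if it is not present.
    static int binarySearch(int[] sorted, int target) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;           // midpoint, written to avoid int overflow
            if (sorted[mid] == target) return mid;  // found
            if (sorted[mid] < target) lo = mid + 1; // discard the left half
            else hi = mid - 1;                      // discard the right half
        }
        return -1;
    }

For an array of size 10^9 the while loop runs roughly 30 times, matching the estimate above.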
Types of Time Complexity Analysis:
a. Worst Case Time Complexity: the case in which the algorithm takes the longest time to complete its
execution. It is represented by Big-O notation. By default, when we refer to the time complexity of a piece of
code, we mean its worst-case time complexity.
b. Best Case Time Complexity: the case in which the algorithm takes the least time to complete its
execution. It is represented by Omega notation (Ω-notation).
c. Average Case Time Complexity: as the name suggests, it gives the average time a program takes to complete its
execution. It is represented by Theta notation (Θ-notation).
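For example (an illustrative sketch assuming a plain linear search; not part of the original lesson), the same function exhibits all three cases depending on where the target sits:

    // Linear search: best case Ω(1) when the target is the first element,
    // worst case O(n) when it is the last element or absent,
    // average case Θ(n) when it is equally likely to be anywhere.
    static int linearSearch(int[] arr, int target) {
        for (int i = 0; i < arr.length; i++) {
            if (arr[i] == target) return i; // stop as soon as the target is found
        }
        return -1; // scanned all n elements without finding it
    }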

Q1. Calculate the time complexity for the following code snippet.

int c = 0;
for (int i = 1; i < n; i += i) {
    c++;
}
Answer: O(log n)
Explanation: i starts at 1 and doubles on every iteration, so it takes the values 1, 2, 4, ..., 2^k. The loop keeps running while i < n and exits as soon as i reaches or exceeds n, so the last value taken satisfies 2^k < n, i.e. k < log2(n).

The number of operations is therefore of the order of log(n).
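To check this empirically (an illustrative harness; n = 1_000_000 is just an example value), we can count the iterations and compare the count with log2(n):

    int n = 1_000_000;
    int c = 0;
    for (int i = 1; i < n; i += i) {
        c++;
    }
    // prints "20 iterations, log2(n) = 19.93...": the loop ran about log2(n) times
    System.out.println(c + " iterations, log2(n) = " + (Math.log(n) / Math.log(2)));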

Q2. Calculate the time complexity for the following code snippet.

int c = 0;
for (int i = 1; i < n; i += i) {
    for (int j = 0; j < i; j++) {
        c++;
    }
}
Answer: O(n)
Explanation: The outer loop variable takes the values 1, 2, 4, 8, ..., 2^k, as in the previous question, with 2^k < n.

For each value of i, the inner loop runs i times, so the total number of operations is 1 + 2 + 4 + 8 + ... + 2^k.

There are about log(n) terms, and the sum of the series 1 + 2 + 4 + ... + 2^k is 2^(k+1) - 1. Since 2^k < n, this sum is less than 2n, i.e. of the order of n, so the time complexity is O(n).
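For reference, the geometric-series bound used above can be written out explicitly (a standard identity):

\[
\sum_{j=0}^{k} 2^{j} \;=\; 2^{k+1} - 1 \;<\; 2 \cdot 2^{k} \;<\; 2n \quad\Longrightarrow\quad O(n).
\]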

Q3. Calculate the time complexity for the following code snippet.

int c = 0;
for (int i = 1; i < n; i += i) {
    for (int j = n; j >= 0; j--) {
        c++;
    }
}

Answer: O(n log n)
Explanation: The outer loop runs about log(n) times, as shown in Q1, and the inner loop runs about n times for each i (going from n down to 0 in steps of 1). The overall number of operations is therefore log(n) * n, i.e. the time complexity is O(n log n).

Q4. Calculate the time complexity for the following code snippet.

int c = 0;
for (int i = 1; i < n; i *= 2) {
    for (int j = 0; j < i; j++) {
        c++;
    }
}

Answer: O(n)
Explanation: The outer loop variable goes 1, 2, 4, 8, ..., 2^k such that 2^k < n, so k is of the order of log(n), and the inner
loop runs i times for each i. That gives

1 + 2 + 4 + 8 + 16 + ... + 2^k ≈ 2^(k+1) - 1 operations, which is the same series we got in Q2 and is of the order of n, so the time
complexity is O(n).

Java + DSA
Q5. Calculate the time complexity for the following code snippet.

int c = 0;
for (int i = 1; i * i < n; i *= 2) {
    for (int j = 0; j < i; j++) {
        c++;
    }
}

Answer: O(sqrt(n))
Explanation: The inner loop runs i times for each value of i, and the outer loop variable goes i = 1, 2, 4, 8, ..., 2^k such that
(2^k)^2 < n, i.e. 2^(2k) < n, so 2k < log(n) and k < log(n)/2.

The outer loop therefore runs about log(n)/2 times, and log(n)/2 can be written as log(sqrt(n)).

The total work is the series 1 + 2 + 4 + ... over these iterations, whose sum is 2^(k+1) - 1. Since 2^k is of the order of
2^(log(sqrt(n))) = sqrt(n), the time complexity is O(sqrt(n)).

Q6. Calculate the time complexity for the following code snippet.

int c = 0;
for (int i = 1; i * i < n; i += i) {
    for (int j = n; j > i; j--) {
        c++;
    }
}

Answer: O(n log n)
Explanation: As explained previously, the outer loop runs about log(sqrt(n)) times, and for each i the inner loop runs n - i
times, so the number of operations is

(n - 1) + (n - 2) + (n - 4) + ... over log(sqrt(n)) terms, i.e. n * log(sqrt(n)) - (1 + 2 + 4 + ...) ≈ (n/2) * log(n) -
O(sqrt(n)). In Big-O notation the (n/2) * log(n) term dominates, so the time complexity is O((n/2) log n), or simply
O(n log n).

Q7. Calculate the time complexity for the following code snippet.

int c = 0;
for (int i = 2; i < n; i *= i) {
    c++;
}

Answer: O(log log n)
Explanation: i is squared on every iteration (i *= i), so it takes the values 2, 4, 16, 256, ..., i.e. 2^(2^k), and the loop runs while 2^(2^k) < n.

That gives 2^k < log(n), i.e. k < log(log(n)), so there are about log(log(n)) operations and the time complexity is O(log log n).

Q8. Calculate the time complexity for the following code snippet.

int c = 0;
for (int i = 2; i * i < n; i *= i) {
    c++;
}

Answer: O(log log n)

Explanation: As in Q7, i is squared on each iteration, so after k iterations i = 2^(2^k). The loop runs while i * i < n, i.e. while 2^(2^k) < sqrt(n), which gives

2^k < log(n)/2 and k < log(log(n)/2), which is of the order of log(log(n)), so the time complexity is O(log log n).

Q9. Given an array of size n+1 consisting of integers from 1 to n, where one of the elements appears twice in the array,
find that duplicate element.

Input: a[] = {1, 3, 2, 3, 4}

Output: 3

Explanation: The number 3 is the only repeating element.

// Method 1: Use two nested loops. The outer loop traverses
// all elements and the inner loop checks whether the element
// picked by the outer loop appears anywhere else.
static int findRepeating(int[] arr) {
    for (int i = 0; i < arr.length; i++) {
        for (int j = i + 1; j < arr.length; j++) {
            if (arr[i] == arr[j])
                return arr[i];
        }
    }
    return -1;
}

Time Complexity: O(n^2); the number of comparisons is (n - 1) + (n - 2) + ... + 1, which is of the order of n^2.

Space Complexity: O(1); no extra space is used.

// Method 2: Sorting (requires import java.util.Arrays;)
static int findRepeating(int[] arr, int N) {
    Arrays.sort(arr); // sort the array
    for (int i = 0; i <= N; i++) {
        // compare each array element with its index
        if (arr[i] != i + 1) {
            return arr[i];
        }
    }
    return -1;
}

Explanation: Sort the given array.

Traverse the array and compare each element with its index:
if arr[i] != i + 1, it means that arr[i] is the repeated element, so just return arr[i].
Otherwise the array contains no duplicate from 1 to n, and in that case we return -1.

Time Complexity: O(N log N)

Auxiliary Space: O(1)

// Method 3: Using math
static int findRepeating(int[] arr, int N) {
    int sum = 0;
    for (int i = 0; i <= N; i++) {
        sum += arr[i];
    }
    // the sum of 1..N is N * (N + 1) / 2; the surplus is the duplicate
    return sum - ((N + 1) * N) / 2;
}

Explanation: The sum of the first n natural numbers is n * (n + 1) / 2, so we can simply subtract this from the sum of the
array to get the repeating element.

Time Complexity: O(n), traversing the whole array to calculate the sum

Space Complexity: O(1), not using any extra space
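A minimal driver to try the three approaches on the sample input (an illustrative sketch: the class name FindRepeatingDemo and the distinct method names are ours, chosen because method 2 and method 3 above share the same signature and could not otherwise live in one class):

    import java.util.Arrays;

    public class FindRepeatingDemo {

        // Method 1: nested loops, O(n^2) time, O(1) space
        static int findRepeatingBruteForce(int[] arr) {
            for (int i = 0; i < arr.length; i++)
                for (int j = i + 1; j < arr.length; j++)
                    if (arr[i] == arr[j]) return arr[i];
            return -1;
        }

        // Method 2: sort and compare with the index, O(N log N) time
        static int findRepeatingSort(int[] arr, int N) {
            int[] copy = Arrays.copyOf(arr, arr.length); // sort a copy so the input stays intact
            Arrays.sort(copy);
            for (int i = 0; i <= N; i++)
                if (copy[i] != i + 1) return copy[i];
            return -1;
        }

        // Method 3: sum difference, O(N) time, O(1) space
        static int findRepeatingMath(int[] arr, int N) {
            int sum = 0;
            for (int i = 0; i <= N; i++) sum += arr[i];
            return sum - (N * (N + 1)) / 2;
        }

        public static void main(String[] args) {
            int[] a = {1, 3, 2, 3, 4};
            int n = a.length - 1; // values are 1..n with one of them repeated
            System.out.println(findRepeatingBruteForce(a)); // 3
            System.out.println(findRepeatingSort(a, n));    // 3
            System.out.println(findRepeatingMath(a, n));    // 3
        }
    }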

Notations for Different Types of Time Complexity

Asymptotic Notations: Asymptotic Notations are mathematical tools that allow you to analyze an algorithm’s
running time by identifying its behavior as its input size grows.

There are mainly three asymptotic notations:

Big O Notation (O-notation):

Describes the worst-case scenario of an algorithm's time complexity


Represents the upper bound on how the algorithm's performance scales with input size
Example: O(n^2) means the algorithm's time grows no faster than n^2 for input size n.

Big Omega Notation (Ω-notation):

Describes the best-case scenario of an algorithm's time complexity


Represents the lower bound on how the algorithm's performance scales with input size
Example: Ω(n) means the algorithm's time is at least linear for input size n.

Big Theta Notation (Θ-notation):

Describes both the upper and lower bounds of an algorithm's time complexity
Represents a tight bound on how the algorithm's performance scales with input size
Example: Θ(n) means the algorithm's time grows linearly and is neither faster nor slower than a linear
function for input size n.
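Formally (standard textbook definitions, added here only for reference), these notations can be written as:

\[
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 0 : f(n) \le c\,g(n)\ \text{for all } n \ge n_0
\]
\[
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 \ge 0 : f(n) \ge c\,g(n)\ \text{for all } n \ge n_0
\]
\[
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))
\]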

Q10. Calculate the time complexity for iterating in a loop.

for (int i = 0; i < n; i++) {
    System.out.println("I Love Coding");
}

Time Complexity: O(n)

Explanation: We said earlier that if a loop runs n times, the time complexity is O(n). Here we can verify that: when we run
this code, "I Love Coding" is printed n times, which tells us n operations have been performed, so the time complexity is O(n).

Q11. What if this time we increment the pointer by 2?

for (int i = 0; i < n; i += 2) {
    System.out.println("I Love Coding");
}

Time Complexity: O(n/2) (which can be approximated to the order of n, so O(n))

Explanation: Here i increases by 2 each time, so the loop runs n/2 times and the text is printed n/2 times, which verifies
that the time complexity we calculated is right.

Q12. Calculate the time complexity for traversing 2 arrays of size n and m.

int[] a = new int[n];
int[] b = new int[m];

for (int i = 0; i < n; i++) {
    a[i] = i;
}

for (int i = 0; i < m; i++) {
    b[i] = m - i;
}

Time Complexity: O(n + m), as the first loop runs n times and the second runs m times, so there are n + m operations in total.

Explanation: First, n operations are performed by the first loop as i goes from 0 to n - 1; similarly, m operations are
performed in the second loop as i goes from 0 to m - 1. In total that is n + m operations, thus O(n + m).

Q13. What if this time we traverse them in a nested manner?

for (int i = 0; i < n; i++) {
    for (int j = 0; j < m; j++) {
        System.out.print("okay");
    }
}

Time Complexity: O(n*m)

Explanation: If you look at the output, "okay" is printed n*m times. That is because the outer loop runs n times and the
inner loop runs m times, and the inner loop is executed again for each iteration of the outer loop: for i = 1 it does m
operations, and the same for i = 2, ..., n. In total n*m operations are performed, so the time complexity is O(n*m).

Q14. Calculate the time complexity for the below code snippet.

int c = 0;
for (int i = 1; i <= n; i *= k) {
    c++;
}

Time Complexity: O(log n)

Explanation: Each time, i is multiplied by k, so i takes the values 1, k, k^2, ..., k^m such that k^m <= n.
So m = log_k(n), i.e. log base k of n, which is of the order of log(n), so the time complexity is O(log n).

Note: k > 1 is assumed here; for k = 1 (or k = 0) the value of i never grows past n, so the loop runs forever.
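For reference, the change-of-base identity used above (a standard identity): for any fixed k > 1,

\[
\log_k n = \frac{\log_2 n}{\log_2 k},
\]

which differs from log2(n) only by a constant factor, hence O(log n).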

Q15. Calculate the time complexity for the below code snippet.

int c = 0;
for (int i = 0; i < n; i++) {
    for (int j = i + 1; j < m; j++) {
        c++;
    }
}

Time Complexity: O(m * min(n, m))

Explanation: The outer loop runs n times, from i = 0 to i = n - 1, and for each i the inner loop runs max(0, m - i - 1) times.
The total number of operations for i = 0, 1, ..., n - 1 is therefore (m - 1) + (m - 2) + (m - 3) + ..., continuing for n terms
or until m - i - 1 reaches zero.

Case 1: if m > n, the number of operations is (m - 1) + (m - 2) + ... + (m - n) = n(2m - n - 1)/2, which is of the order O(m*n).

Case 2: if n >= m, the number of operations is (m - 1) + (m - 2) + ... + 1 + 0 = m(m - 1)/2, which is of the order O(m*m), i.e. O(m^2).

The two cases can be combined and written as O(m * min(n, m)).

Space Complexity
Space complexity is the memory requirement of an algorithm as a function of input size, analysed with the same asymptotic tools.

We usually care only about what is known as auxiliary space complexity, which does not include the space taken by the input itself.

Need for Space Complexity: The amount of memory a system has can be limited, and therefore we need to optimize the
memory/space taken by an algorithm so that it can execute on a system with bounded space limits.
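As a quick contrast (an illustrative sketch; the method names are ours), the first method below uses O(1) auxiliary space while the second uses O(n), even though both read an input array of size n:

    // O(1) auxiliary space: only a few scalar variables besides the input
    static long sum(int[] a) {
        long total = 0;
        for (int x : a) total += x;
        return total;
    }

    // O(n) auxiliary space: allocates a second array of the same size as the input
    static int[] reversedCopy(int[] a) {
        int[] out = new int[a.length];
        for (int i = 0; i < a.length; i++) out[i] = a[a.length - 1 - i];
        return out;
    }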

Q1. Calculate the space complexity for the below code snippet.

int[] a = new int[n];

for (int i = 0; i < n; i++) {
    a[i]++;
}

Space Complexity: O(n)

Explanation: Here we declare an array of size n, so the space complexity is O(n).

Q2. What will be the space complexity if we just traverse without creating any array?

int c = 0;
for (int i = 0; i < n; i++) {
    c++;
}

Space Complexity: O(1), because we only declare a single int; apart from that, no extra space is used.

Q3. Calculate the space complexity for the below nested loop code snippet.

// lists pre-filled with n (and m) zeros so that the get/set calls below are valid
ArrayList<Integer> a = new ArrayList<>(Collections.nCopies(n, 0));
ArrayList<Integer> b = new ArrayList<>(Collections.nCopies(m, 0));

for (int i = 0; i < n; i++) {
    for (int j = 0; j < m; j++) {
        a.set(i, a.get(i) + 1);
        b.set(j, b.get(j) + 1);
    }
}

Time Complexity: O(n*m), two nested loops

Space Complexity: O(n+m)
Explanation: We declare two lists, one of size n and the other of size m. It doesn't matter how many times we traverse
them; no extra space is used during the traversal, the only space used was allocated when the lists were declared.

Q4. Space Complexity of creating a 2d matrix

int[][] arr = new int[n][m];

Space Complexity: O(n*m)

Explanation: Consider it as a 2-D table with n rows and m columns: each row takes up m space and there are n such rows,
so the space taken is n*m.

Q5. What will be the space complexity if we create 3 arrays of the same size?

int[] a = new int[n];
int[] b = new int[n];
int[] c = new int[n];

for (int i = 0; i < n; i++) {
    c[i]++;
}

Space Complexity: O(n) (or O(3n) to be precise)

Explanation: Each array takes up n space, so a total of 3n space is allocated. The space complexity is therefore O(3n),
which is of the order of n and can be written as O(n).

Q6. Calculate the time and space complexity for the below nested loop code snippet.

int[][] a = new int[n][n/2];

for (int i = 1; i < n; i *= 2) {
    for (int j = 0; j < n/2; j++) {
        a[i][j]++;
    }
}

Space Complexity: O(n^2) (or O(n^2 / 2) to be precise)
Explanation: A 2-D matrix with n rows and n/2 columns is declared, so a total of n * n/2 space will be used.
Time Complexity: O(n log n) (or (n/2) * log(n) to be precise)
Explanation: The outer loop runs about log(n) times and the inner loop runs n/2 times for each i,

so a total of log(n) * n/2 operations are performed.

THANK YOU!
