Slide 2
Need of Algorithm
► 10. To measure the behavior (or performance) of the methods in all cases
(best cases, worst cases, average cases)
► 11. With the help of an algorithm, we can also identify the resources
(memory, input-output cycles) required by the algorithm.
► 12. With the help of an algorithm, we convert an art into a science.
► 13. To understand the principles of algorithm design.
► 14. We can measure and analyze the time and space complexity of a
problem with respect to the input size without implementing and running the
program; this reduces the cost of design.
Complexity of Algorithm
► The term algorithm complexity measures how many steps are required by the
algorithm to solve the given problem. It evaluates the order of count of
operations executed by an algorithm as a function of input data size.
► To assess the complexity, the order (approximation) of the count of operations
is always considered instead of counting the exact steps.
► O(f) notation represents the complexity of an algorithm, which is also termed
Asymptotic notation or "Big O" notation. Here f is a function of the size of
the input data. The asymptotic complexity O(f) determines the order in which
the resources such as CPU time, memory, etc. are consumed by the algorithm,
expressed as a function of the size of the input data.
Continued…
► Constant Complexity:
► It imposes a complexity of O(1). It undergoes an execution of a constant
number of steps like 1, 5, 10, etc. for solving a given problem. The count of
operations is independent of the input data size.
► Logarithmic Complexity:
► It imposes a complexity of O(log(N)). It undergoes the execution of the order
of log(N) steps. To perform operations on N elements, it often takes the
logarithmic base as 2.
► For N = 1,000,000, an algorithm that has a complexity of O(log(N)) would
undergo about 20 steps, since log2(1,000,000) ≈ 20. Because changing the
logarithmic base only changes the step count by a constant factor, the base
does not affect the order of the operation count, so it is usually omitted.
► Linear Complexity:
► It imposes a complexity of O(N): the count of operations grows in direct
proportion to the input size. Since constants do not have a significant effect
on the order of the count of operations, they are ignored. Thus, algorithms
that undergo N, N/2, or 3N operations on the same number of elements to solve
a particular problem are all considered linear and, in this sense, equally
efficient.
How to approximate the time taken by
the Algorithm?
► So, to find it out, we shall first understand the types of algorithms we
have. There are two types of algorithms:
► Iterative Algorithm: In the iterative approach, the function repeatedly runs until
the condition is met or it fails. It involves the looping construct.
► Recursive Algorithm: In the recursive approach, the function calls itself until the
base condition is met. It involves a branching (selection) construct for the
stopping condition.
► However, it is worth noting that any program written with iteration can be
rewritten with recursion. Likewise, a recursive program can be converted to
iteration, making the two approaches equivalent to each other.
Continued…
► But to analyze the iterative program, we have to count the number of times
the loop is going to execute, whereas in the recursive program, we use
recurrence equations, i.e., we write a function F(n) in terms of smaller
instances such as F(n-1) or F(n/2).
► Suppose the program is neither iterative nor recursive. In that case, it can be
concluded that there is no dependency of the running time on the input data
size, i.e., whatever is the input size, the running time is going to be a
constant value. Thus, for such programs, the complexity will be O(1).
For Iterative Programs
► Consider the following programs, which are written in simple English and
do not correspond to any particular syntax.
► In the first example, we have an integer i and a for loop running from i
equals 1 to n. Now the question arises, how many times does the name get
printed?
A
{
    int i;
    for (i = 1 to n)
        printf("Edward");
}
► The loop runs from 1 to n, so the name gets printed n times; thus, the
time complexity is O(n).
Continued…
A
{
    int i, j;
    for (i = 1 to n)
        for (j = 1 to n)
            printf("Edward");
}
► In this case, firstly, the outer loop will run n times, such that for each time,
the inner loop will also run n times. Thus, the time complexity will be O(n²).
Example3:
A
{
    i = 1; S = 1;
    while (S <= n)
    {
        i++;
        S = S + i;
        printf("Edward");
    }
}
Continued…
► As we can see from the above example, we have two variables, i and S, and the
condition while (S <= n), which means S will start at 1, and the loop will stop
as soon as the value of S becomes greater than n.
► Here i is incremented in steps of one, and S is incremented by the value of i,
i.e., the increment in i is linear, whereas the increment in S depends on i.
► Initially: i = 1, S = 1
► After 1st iteration: i = 2, S = 3
► After 2nd iteration: i = 3, S = 6
► After 3rd iteration: i = 4, S = 10 … and so on.
Continued…
► Since S takes the values 1, 3, 6, 10, …, after k iterations S = k(k+1)/2.
The loop stops once this sum exceeds n, which happens after roughly √(2n)
iterations, so the time complexity of this program is O(√n).
For Recursive Program
A(n)
{
    if (n > 1)
        return A(n-1);
}
► Solution;
► Here we will see the simple Back Substitution method to solve the above
problem.
► T(n) = 1 + T(n-1) …Eqn. 1
Continued…
► Now, according to Eqn. 1, i.e., T(n) = 1 + T(n-1), the algorithm will run until
n > 1. Basically, n will start from a very large number, and it will decrease
gradually. So, when n = 1, the algorithm eventually stops, and such a
terminating condition is called the anchor condition, base condition or
stopping condition; here T(1) = 1.
► Expanding Eqn. 1 by back substitution:
► T(n) = 1 + T(n-1) = 2 + T(n-2) = 3 + T(n-3) = … = k + T(n-k)
► Substituting k = n-1, T(n) becomes:
► T(n) = (n-1) + T(n-(n-1)) = (n-1) + T(1) = (n-1) + 1 = n
► Hence, T(n) = n, i.e., O(n).