INTERNAL_1-DAA-KEY
Reg. No:          Yr/Sem/Dept: II/III/AI&DS          Date: 27.9.24
A finite set of instructions that specifies a sequence of operations to be carried out in order to solve a
problem.
def gcd(a, b):
    # Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b).
    while b != 0:
        temp = b
        b = a % b
        a = temp
    return a

# Example usage:
num1 = 48
num2 = 18
print(gcd(num1, num2))  # 6
4. Compare the worst-case, best-case, and average-case efficiencies of an algorithm. (AU,
APR/MAY 2023)
The best-case time complexity represents the minimum amount of time an algorithm takes to
complete when given an input that leads to the most favorable conditions.
The average-case time complexity represents the average amount of time an algorithm takes to
complete over all possible inputs of a given size. It considers the expected performance over a
distribution of inputs.
The worst-case time complexity represents the maximum amount of time an algorithm takes to
complete when given an input that leads to the least favorable conditions.
Time complexity measures the amount of time an algorithm takes to complete as a function of the size
of the input (denoted as n).
Space complexity measures the amount of memory space an algorithm uses as a function of the size of
the input.
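As a brief illustration (not part of the original key), linear search shows how best- and worst-case costs differ for the same algorithm while space stays constant; the function name `linear_search` is ours:

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent.
    Best case O(1) (target is first), worst case O(n); space O(1)."""
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1

# Best case: target is the first element -> one comparison.
print(linear_search([7, 3, 9, 5], 7))   # 0
# Worst case: target absent -> n comparisons.
print(linear_search([7, 3, 9, 5], 4))   # -1
```

The average case, taken over all positions the target could occupy, also grows linearly in n.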
i) Dynamic programming
ii) Backtracking
i) Input
ii) Output
iii) Well-definedness
iv) Definiteness
8. Describe the following important problem types: i) Sorting ii) Searching iii) Combinatorial iv)
Graph problems
i) Sorting
Description: Sorting involves arranging data in a particular order, typically ascending or descending.
It's a fundamental operation that often serves as a precursor to other algorithms, making the data easier
to work with.
Examples: Bubble Sort, Merge Sort, Quick Sort, Heap Sort.
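One of the listed algorithms, Merge Sort, can be sketched in Python as follows (the function name `merge_sort` is ours; this is an illustrative O(n log n) divide-and-conquer version, not part of the original key):

```python
def merge_sort(arr):
    """Sort a list in ascending order by splitting, sorting halves, and merging."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```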
ii) Searching
Description: Searching refers to finding specific data within a collection of data. This can be done in
an unordered or ordered manner, and the efficiency of the search often depends on the structure of the
data.
Examples: Linear Search, Binary Search, Depth-First Search (DFS), Breadth-First Search (BFS).
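For instance, Binary Search from the list above can be sketched like this (a minimal version of our own, assuming the input list is already sorted in ascending order):

```python
def binary_search(sorted_arr, target):
    """Return the index of target in a sorted list, or -1; O(log n) comparisons."""
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            lo = mid + 1  # discard the left half
        else:
            hi = mid - 1  # discard the right half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```

The efficiency gain over Linear Search comes precisely from the data being structured (sorted), as noted above.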
iii) Combinatorial
Description: Combinatorial problems deal with the arrangement and combination of objects according
to specific rules. These problems are often associated with optimization and counting.
Examples: Permutations, Combinations, Traveling Salesman Problem, Knapsack Problem.
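Python's standard `itertools` module can enumerate the basic combinatorial objects named here; a minimal sketch (our example, not from the original key):

```python
from itertools import permutations, combinations

items = ['A', 'B', 'C']
# Ordered arrangements of 2 out of 3 items: 3 * 2 = 6 of them.
print(list(permutations(items, 2)))
# Unordered selections of 2 out of 3 items: 3 of them.
print(list(combinations(items, 2)))
```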
iv) Graph Problems
Description: Graph problems involve the study of graphs, which are mathematical structures used to
model pairwise relations between objects. They can represent networks like social networks, computer
networks, or biological networks.
Examples: Shortest Path (Dijkstra's Algorithm), Minimum Spanning Tree (Prim's and Kruskal's
Algorithms), Graph Coloring, Network Flow.
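A compact sketch of Dijkstra's Algorithm using Python's standard `heapq`; the adjacency-dict representation and the sample graph are our own illustrative assumptions:

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source in a weighted graph.
    graph: dict mapping node -> list of (neighbor, weight) pairs."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue  # stale heap entry; a shorter path was already found
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

g = {'A': [('B', 1), ('C', 4)], 'B': [('C', 2)], 'C': []}
print(dijkstra(g, 'A'))  # {'A': 0, 'B': 1, 'C': 3}
```

Note that the direct edge A→C costs 4, but the path A→B→C costs only 3, which the algorithm correctly prefers.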
9. Explain the steps for analyzing the efficiency of an algorithm with an example. (AU, NOV/DEC 2023)
10. Explain briefly asymptotic notations with suitable examples. (AU, NOV/DEC 2023 / APR/MAY 2023)
Asymptotic Notations are mathematical tools used to analyze the performance of algorithms
by understanding how their efficiency changes as the input size grows.
There are mainly three asymptotic notations:
1. Big-O Notation (O-notation)
2. Omega Notation (Ω-notation)
3. Theta Notation (Θ-notation)
1. Theta Notation (Θ-notation):
Theta notation bounds a function from both above and below. Since it captures both the upper and the
lower bound of the running time of an algorithm, it is commonly used to describe the average-case
complexity of an algorithm.
Theta (average case): add the running times over each possible input and take the average.
Let f and g be functions from the set of natural numbers to itself. The function f is said to be Θ(g) if
there are constants c1, c2 > 0 and a natural number n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
(Figure: Theta notation)
2. Big-O Notation (O-notation):
If f(n) describes the running time of an algorithm, then f(n) is O(g(n)) if there exist a positive constant c
and a natural number n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0.
Big-O gives the highest possible growth rate of the running time for a given input, so the bound c·g(n)
serves as an upper bound on the algorithm's time complexity.
3. Omega Notation (Ω-notation):
Symmetrically, f(n) is Ω(g(n)) if there exist a constant c > 0 and a natural number n0 such that
0 ≤ c·g(n) ≤ f(n) for all n ≥ n0; it gives a lower bound on the running time.
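The Big-O definition can be checked numerically for a concrete function; the choice f(n) = 3n + 5 with witnesses c = 4 and n0 = 5 is our illustrative assumption, not from the original key:

```python
# Claim: f(n) = 3n + 5 is O(n).
# Witnesses: c = 4, n0 = 5, since 3n + 5 <= 4n whenever n >= 5.
def f(n):
    return 3 * n + 5

c, n0 = 4, 5
# Spot-check the definition 0 <= f(n) <= c*g(n) over a range of n >= n0.
assert all(0 <= f(n) <= c * n for n in range(n0, 1000))
print("3n + 5 is O(n) with c = 4, n0 = 5")
```

Any larger c (or larger n0) would also work; the definition only requires that some pair of constants exists.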
11. Explain the fundamentals of Algorithmic Problem Solving with a neat diagram.
Algorithmic problem solving is a systematic approach to solving computational problems. The process
involves several key steps, outlined below, to develop an efficient and effective algorithm.
Problem Definition:
o Clearly define the problem that needs to be solved, including the input and the expected output.
Understanding the Problem:
o Analyze the problem thoroughly to understand its requirements and constraints.
Designing an Algorithm:
o Develop a step-by-step procedure to solve the problem. This includes choosing appropriate
data structures and techniques.
Proving the Correctness:
o Ensure that the algorithm produces the correct output for all possible inputs.
Analyzing the Algorithm:
o Determine the time complexity and space complexity to evaluate the efficiency of the
algorithm.
Implementing the Algorithm:
o Write the code to implement the algorithm in a programming language.
Testing and Debugging:
o Test the algorithm with various inputs to ensure it works as expected and debug any issues that
arise.
Optimization (if necessary):
o Optimize the algorithm to improve its performance if needed.
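The steps above can be walked through on a small hypothetical problem of our own choosing (finding the maximum of a list; the function name `find_max` is ours):

```python
# 1. Problem definition: input - a non-empty list of numbers; output - its maximum.
# 2. Design: scan once, keeping the largest value seen so far (O(n) time, O(1) space).
# 3. Implementation:
def find_max(values):
    largest = values[0]
    for v in values[1:]:
        if v > largest:
            largest = v
    return largest

# 4. Testing and debugging: check ordinary, single-element, and all-negative inputs.
assert find_max([3, 1, 4, 1, 5]) == 5
assert find_max([9]) == 9
assert find_max([-2, -7, -1]) == -1
print("all tests passed")
```

Proving correctness here amounts to the loop invariant that `largest` always holds the maximum of the elements examined so far; analysis confirms one pass over n elements.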