DAA Solutions

The document provides an overview of various algorithms and data structures, including definitions, characteristics, and complexities. It covers topics such as algorithms, Fibonacci heaps, binary search trees, the fractional knapsack problem, graph coloring, and more. Additionally, it discusses specific algorithms like Dijkstra's and the Boyer-Moore algorithm, along with their applications and time complexities.

1. What do you mean by an algorithm? Write the characteristics of an algorithm.

An algorithm is a finite sequence of well-defined instructions designed to perform a specific task or solve a particular problem. It takes input, processes it, and produces output.

Characteristics of an Algorithm:

• Finiteness: An algorithm must always terminate after a finite number of steps.

• Definiteness: Each step in an algorithm must be clear and unambiguous.

• Input: An algorithm should have zero or more inputs.

• Output: It should produce at least one output as the result.

• Effectiveness: Each step must be basic enough to be performed manually or by a computer in a reasonable amount of time.

• Generality: It should solve a class of problems, not just one instance.

2. Show that 10n² + 9 = O(n²):

To prove 10n² + 9 ∈ O(n²), we use the definition of Big-O notation:

f(n) ∈ O(g(n)) if there exist constants c > 0 and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀.

Let f(n) = 10n² + 9 and g(n) = n². For large n, the dominant term is 10n².

10n² + 9 ≤ 11n²  (for n ≥ 1).

Thus, we can choose c = 11 and n₀ = 1. Therefore, 10n² + 9 = O(n²).

3. Write a short note on Fibonacci Heap:

A Fibonacci Heap is a priority-queue data structure organized as a collection of heap-ordered trees. It is particularly efficient for operations like decrease-key and merge.

Key Features:

• Composed of a collection of heap-ordered trees (min-heap order in the standard version).

• Supports operations like insert, find-min, delete-min, decrease-key, and merge.

• The amortized time complexity for operations is:

o Insert: O(1)

o Find-Min: O(1)

o Delete-Min: O(log n)

o Decrease-Key: O(1).

Fibonacci Heaps are used in algorithms like Dijkstra's shortest path and Prim's minimum spanning
tree.

4. Explain Binary Search Tree (BST):

A Binary Search Tree (BST) is a binary tree where each node has at most two children, and the tree
satisfies the following properties:

• The left subtree of a node contains nodes with values less than the node's value.

• The right subtree of a node contains nodes with values greater than the node's value.

• Both left and right subtrees are also binary search trees.

Applications:

• Searching, insertion, and deletion take O(log n) time in balanced BSTs (O(n) in the worst case for a skewed tree).

• Used in database indexing and memory storage.

5. Define the fractional Knapsack problem:

The Fractional Knapsack Problem is a greedy algorithm problem where a thief can take fractions of
items rather than whole items to maximize the total value in the knapsack.

Problem Statement:

• Given n items with weights wᵢ and values vᵢ, and a knapsack with capacity W, the goal is to maximize the total value by selecting items or fractions of items such that their total weight does not exceed W.

Approach:

• Calculate the value-to-weight ratio for each item: vᵢ/wᵢ.

• Sort items by this ratio in descending order.

• Select items or fractions of items based on the sorted order until the knapsack is full.
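A minimal Python sketch of this greedy procedure (function and variable names are illustrative; the data reuses the instance from question 42 below):

```python
def fractional_knapsack(values, weights, capacity):
    """Greedy fractional knapsack: take items in order of value/weight ratio."""
    items = sorted(zip(values, weights), key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)       # whole item, or the fraction that fits
        total += value * (take / weight)
        capacity -= take
    return total

# W = {10, 20, 30, 40, 50}, V = {60, 100, 120, 140, 200}, C = 100
print(fractional_knapsack([60, 100, 120, 140, 200], [10, 20, 30, 40, 50], 100))  # 440.0
```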

6. Name of Spanning Tree Algorithms with Complexity:

• Kruskal's Algorithm: O(E log E), where E is the number of edges.

• Prim's Algorithm: O(E + V log V) using a Fibonacci heap, where V is the number of vertices.

7. Define the term “Graph Coloring”:


Graph Coloring is the assignment of colors to the vertices of a graph such that no two adjacent
vertices share the same color.

Applications:

• Scheduling Problems: Assigning time slots or resources.

• Map Coloring: Ensuring no adjacent regions share the same color.

• Register Allocation: Assigning registers to variables in a program.

The minimum number of colors required for a graph is called its chromatic number.

8. What do you mean by the Activity Selection Problem?

The Activity Selection Problem is a greedy algorithm problem that involves selecting the maximum
number of activities that can be performed by a single person or resource, given that no two
selected activities overlap in time.

Approach:

• Sort activities by their finish time.

• Select the first activity and iteratively choose the next activity that starts after the current
one finishes.

Time Complexity: O(n log n) for sorting; the greedy selection pass itself is O(n).
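The two steps above can be sketched in Python (the activity list here is a made-up example):

```python
def select_activities(activities):
    """Greedy activity selection: sort by finish time, keep compatible activities."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:           # starts after the previous one finishes
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# Hypothetical activities as (start, finish) pairs
acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(select_activities(acts))  # [(1, 4), (5, 7), (8, 11)]
```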

9. What do you mean by the Boyer-Moore Algorithm?

The Boyer-Moore Algorithm is a string-searching algorithm known for its efficiency in finding
substrings within a text.

Key Features:

• Preprocesses the pattern to determine the bad character rule and good suffix rule, which
help in skipping unnecessary comparisons.

• Performs comparisons from right to left in the pattern.

Time Complexity:

• Best Case: O(n/m), where n is the text length and m is the pattern length.

• Worst Case: O(n·m), but it often performs much better in practice.

10. Fast Fourier Transform (FFT):

The Fast Fourier Transform (FFT) is an efficient algorithm to compute the Discrete Fourier Transform
(DFT) and its inverse. The DFT converts a signal from its time domain to its frequency domain, widely
used in signal processing, image analysis, and solving differential equations.

Key Features of FFT:


• Reduces the time complexity of the DFT from O(n²) to O(n log n).

• Uses a divide-and-conquer approach by breaking the DFT into smaller DFTs.

• Often implemented using the Cooley-Tukey algorithm.

11. Counting Sort for Array A = {2, 5, 3, 0, 2, 3, 0, 3}:

Steps of Counting Sort:

1. Find the range of the elements (minimum = 0, maximum = 5).

2. Create a count array to store the frequency of each element in A.

3. Modify the count array to hold the cumulative frequency.

4. Place elements from A into their correct positions in a sorted array.

5. Result: [0, 0, 2, 2, 3, 3, 3, 5].

Implementation Example:

• Count Array: [2, 0, 2, 3, 0, 1].

• Cumulative Count: [2, 2, 4, 7, 7, 8].

• Sorted Array: [0, 0, 2, 2, 3, 3, 3, 5].
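A runnable Python version of these steps (using the array from the question):

```python
def counting_sort(a):
    """Stable counting sort for non-negative integers."""
    count = [0] * (max(a) + 1)
    for x in a:                              # frequency of each value
        count[x] += 1
    for i in range(1, len(count)):           # cumulative counts give final positions
        count[i] += count[i - 1]
    out = [0] * len(a)
    for x in reversed(a):                    # reversed pass keeps the sort stable
        count[x] -= 1
        out[count[x]] = x
    return out

print(counting_sort([2, 5, 3, 0, 2, 3, 0, 3]))  # [0, 0, 2, 2, 3, 3, 3, 5]
```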

12. Properties of Binomial Tree (Prove):

A Binomial Tree B_k is defined recursively:

1. B₀ is a single node.

2. B_k is formed by linking two B_{k−1} trees.

Properties of Binomial Tree:

1. Number of Nodes: B_k has 2^k nodes.

2. Height: The height of B_k is k.

3. Child Distribution: The root of B_k has k children, which are roots of B_{k−1}, B_{k−2}, …, B₀.

4. Recursive Structure: B_k can be split into two B_{k−1} trees.

Proof sketch: Each property follows by induction on k. For example, since B_k is two linked copies of B_{k−1}, its node count is 2·2^(k−1) = 2^k.

14. Short Notes:

(i) Randomized Algorithm:


An algorithm that makes random choices during execution to reduce time complexity or simplify
implementation. Example: QuickSort (random pivot selection).

(ii) Approximation Algorithm:

Algorithms used for optimization problems where finding the exact solution is computationally
expensive. Example: Greedy algorithms for TSP.

16. Stable Sorting Algorithms:

A sorting algorithm is stable if it preserves the relative order of equal elements.

Stable Algorithms:

• Bubble Sort

• Merge Sort

• Insertion Sort

Unstable Algorithms:

• Quick Sort

• Heap Sort

Example: Sorting [(4, A), (4, B)] by key: stable sorts keep A before B.

17. Algorithm for Merge Sort:

Algorithm:

1. Divide the array into two halves.

2. Recursively sort each half.

3. Merge the two sorted halves into a single sorted array.

Time Complexity:

• Worst Case: O(n log n) (also the average and best case).
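A compact Python sketch of the three steps (helper structure is illustrative):

```python
def merge_sort(a):
    """Divide the array, recursively sort each half, then merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```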

18. Inserting Elements in an RB Tree

Input Sequence: 61, 58, 51, 32, 39, 29

Rules of Red-Black Tree (RB Tree):

1. Every node is either red or black.

2. The root is always black.

3. Red nodes cannot have red children (no consecutive reds).

4. Every path from a node to its descendant leaves contains the same number of black nodes.

Steps to Insert:
1. Insert the element using the BST rules.

2. Fix any violations of the RB tree properties by rotations or recoloring.

19. B-Tree and Properties:

A B-Tree is a self-balancing search tree that maintains sorted data for efficient insertions, deletions,
and searches.

Properties:

1. All leaves are at the same level.

2. Each node has at most 2t − 1 keys and 2t children (for minimum degree t).

3. Each internal node other than the root has at least t − 1 keys.

Deletion Cases:

1. Key is in a leaf node.

2. Key is in an internal node.

20. Longest Common Subsequence (LCS):

Given X = {A, B, C, B, D, A, B} and Y = {B, D, C, A, B, A},
LCS = {B, C, B, A} (length 4).
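A standard bottom-up DP sketch in Python confirms the length (it returns the length only; one LCS can be reconstructed by walking the table back):

```python
def lcs_length(x, y):
    """Bottom-up DP table for the longest common subsequence length."""
    m, n = len(x), len(y)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:             # characters match: extend the LCS
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:                                 # otherwise take the better subproblem
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

print(lcs_length("ABCBDAB", "BDCABA"))  # 4  (one LCS is "BCBA")
```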

21. Backtracking for Subset Sum Problem

Input: Set S = {1, 3, 4, 5}, Target X = 8

Definition:

Backtracking systematically explores all subsets to find the ones that sum to X.

Steps:

1. Start with an empty subset.

2. Add elements to the subset if the total sum does not exceed X.

3. Backtrack (remove the last added element) if the sum exceeds X.

Example:

• Start: [], Add 1, Add 3, Add 4 → Success (1 + 3 + 4 = 8).

• Alternate path: [], Add 3, Add 5 → Success (3 + 5 = 8).
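The include/exclude exploration can be sketched in Python (names are illustrative):

```python
def subset_sums(nums, target):
    """Backtracking: extend the current subset, prune when the sum exceeds target."""
    results = []

    def backtrack(i, current, total):
        if total == target:
            results.append(list(current))
            return
        if total > target or i == len(nums):    # prune this branch
            return
        current.append(nums[i])                  # include nums[i]
        backtrack(i + 1, current, total + nums[i])
        current.pop()                            # backtrack: exclude nums[i]
        backtrack(i + 1, current, total)

    backtrack(0, [], 0)
    return results

print(subset_sums([1, 3, 4, 5], 8))  # [[1, 3, 4], [3, 5]]
```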

22. Minimum Cost Spanning Tree (MST):

An MST connects all vertices in a graph with the minimum total edge weight.

Kruskal’s Algorithm:

1. Sort edges by weight.


2. Add edges to the MST, ensuring no cycles.

3. Stop when V − 1 edges are added.

Time Complexity: O(E log E).

23. Dijkstra’s Algorithm:

Steps:

1. Initialize distances from the source to all vertices (0 for the source, ∞ for the others).

2. Use a priority queue to select the vertex with the smallest distance.

3. Update distances for its neighbors.

4. Repeat until all vertices are processed.
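A minimal Python sketch of these steps using the standard library's heapq as the priority queue (the example graph is made up):

```python
import heapq

def dijkstra(graph, source):
    """graph: {u: [(v, weight), ...]} with non-negative edge weights."""
    dist = {u: float("inf") for u in graph}
    dist[source] = 0
    pq = [(0, source)]                       # priority queue of (distance, vertex)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:                      # stale queue entry, skip
            continue
        for v, w in graph[u]:
            if d + w < dist[v]:              # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

g = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 6)], "C": [("D", 3)], "D": []}
print(dijkstra(g, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 6}
```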

24. Apply Branch and Bound Technique to Solve TSP

Branch and Bound (BnB) for TSP:

The goal of the Traveling Salesman Problem (TSP) is to find the shortest possible route that visits
each city exactly once and returns to the starting city.

Steps in Branch and Bound:

1. Initialize the Problem:

o Represent the cities as a cost matrix. The cost at (i, j) represents the distance between city i and city j.

2. Bounding Function:

o Perform row and column reductions to calculate a lower bound (minimum cost of
visiting all cities from the current node).

3. Branching:

o Divide the problem into smaller subproblems by considering all possible paths from
the current city.

4. Priority Queue:

o Maintain a priority queue of subproblems sorted by their lower bounds.

o Expand the node with the smallest lower bound first.

5. Solution:

o Terminate when all cities are visited and return to the starting city.

Example:
Cost matrix (∞ on the diagonal):

    ∞   20   30   10
   15    ∞   35   25
   10   30    ∞   15
   20   25   30    ∞

Follow the above steps to compute the minimum tour cost.

25. Explain P, NP, NP-Hard, and NP-Complete with Examples

1. P (Polynomial Time):

o Problems solvable in polynomial time on a deterministic machine.

o Example: Sorting an array (O(n log n)).

2. NP (Nondeterministic Polynomial):

o Problems where solutions can be verified in polynomial time.

o Example: Checking if a given subset sums to a target value.

3. NP-Hard:

o Problems as hard as any NP problem but may not belong to NP.

o Example: Halting Problem.

4. NP-Complete:

o Problems that are in NP and as hard as the hardest problems in NP.

o Example: Traveling Salesman Problem, 3-SAT Problem.

Relationship:

P ⊆ NP,  NP-Hard ⊇ NP-Complete.

26. Explain KMP Matcher and Implement It

KMP Algorithm Steps:

1. Preprocessing:
Compute the prefix function (π) for the pattern P. For each position q, π[q] is the length of the longest proper prefix of P[0..q] that is also a suffix of it.

2. Pattern Matching:
Use π to skip unnecessary comparisons while searching for P in T.

Example:

• Pattern P: aababb

• Text T: baabaabbaba

Prefix Function:
π = [0, 1, 0, 1, 0, 0].

Match P against T, using π to shift the pattern without re-examining already matched characters.

27. Difference Between Greedy Technique and Dynamic Programming

Feature       Greedy Technique                           Dynamic Programming
Key Idea      Make the best local choice at each step.   Solve overlapping subproblems.
Structure     No backtracking or recomputation.          Uses memoization or tabulation.
Optimality    May not produce the global optimum.        Guarantees the global optimum.
Examples      Kruskal's Algorithm, Prim's Algorithm.     Longest Common Subsequence, Matrix Chain Multiplication.

28. Solve T(n) = 4T(n/3) + n² Using the Master Method

The recurrence is of the form:

T(n) = a·T(n/b) + f(n).

Here:

• a = 4, b = 3, f(n) = n².

• Compare f(n) with n^(log_b a) = n^(log₃ 4) ≈ n^1.26.

Since f(n) = n² = Ω(n^(1.26 + ε)) and the regularity condition holds (4·(n/3)² = (4/9)n² ≤ c·n² with c = 4/9 < 1), the recurrence falls under Case 3 of the Master Theorem:

T(n) = Θ(f(n)) = Θ(n²).

29. Explain Single-Source Shortest Path

Definition:

Find the shortest paths from a source vertex to all other vertices in a weighted graph.

Algorithms:

1. Dijkstra's Algorithm:

o Works for non-negative weights.

o Complexity: O(V log V + E) with a priority queue.

2. Bellman-Ford Algorithm:

o Handles graphs with negative weights.

o Complexity: O(V·E).


30. Compare Time Complexity with Space Complexity

Time Complexity:

• Measures the number of steps or operations relative to input size.

• Example: O(n log n) for merge sort.

Space Complexity:

• Measures the memory usage of an algorithm.

• Example: Merge sort uses O(n) extra space.

Trade-Offs:

Algorithms often trade one resource for the other: quicksort is fast and sorts in place with only O(log n) auxiliary stack space, while dynamic programming spends extra memory on tables to reach optimal solutions quickly.

31. Characteristics of an Algorithm

An algorithm is a well-defined set of instructions designed to solve a problem or perform a computation. Its key characteristics include:

1. Finiteness:

o The algorithm must terminate after a finite number of steps.

2. Definiteness:

o Each step of the algorithm must be precisely defined.

3. Input:

o Algorithms take zero or more inputs.

4. Output:

o They produce at least one output.

5. Effectiveness:

o All steps must be basic enough to be carried out in a finite amount of time using
resources like computation or memory.

6. Generality:

o It should be applicable to a broad class of problems.

32. Differences Between Backtracking and Branch and Bound

Aspect         Backtracking                               Branch and Bound
Approach       Depth-first search (DFS).                  Breadth-first or best-first search.
Pruning        Discards paths that violate constraints.   Discards paths whose bound is worse than the best solution found.
Focus          Feasibility of solutions.                  Optimality of solutions.
Applications   N-Queens, Sudoku, Sum of Subsets.          Traveling Salesman Problem (TSP), Knapsack Problem.
Efficiency     May explore unnecessary nodes.             Uses bounds to limit exploration.

33. Insert Elements into a B-Tree (Degree t = 3)

B-Tree Properties:

1. Each node has a maximum of 2t − 1 keys and a minimum of t − 1 keys (except the root).

2. The tree grows in height or splits nodes when full.

Process:

Given t = 3, start with an empty B-Tree and sequentially insert:

F, S, Q, K, C, L, H, T, V, W, M, R, N, P, A, B, X, Y, D, Z, E, G, I.

• Initially, the root can hold up to 5 keys.

• When a node exceeds 5 keys, it splits into two nodes and promotes the middle key to the
parent.

This iterative insertion will result in a balanced B-Tree. (You can simulate this step-by-step if needed.)

34. Minimum Cost Spanning Tree and Kruskal’s Algorithm

Minimum Cost Spanning Tree (MST):

A subset of the edges of a graph that connects all vertices without cycles and has the minimum
possible total edge weight.

Kruskal’s Algorithm:

1. Sort all edges in ascending order of weight.

2. Initialize an empty spanning tree.

3. Add edges to the tree one by one, ensuring no cycles form.

4. Stop when the spanning tree has V − 1 edges.

Example:
Graph:

Edges: (A, B, 1), (B, C, 2), (A, C, 3).

Steps:

1. Sort edges: (A, B, 1), (B, C, 2), (A, C, 3).

2. Add (A, B, 1), then (B, C, 2). Adding (A, C, 3) would form a cycle, so skip it.

3. Stop, as the tree spans all vertices.

Time Complexity: O(E log E + V).

35. Red-Black Tree and Node Insertion

Definition:

A self-balancing binary search tree with properties:

1. Every node is either red or black.

2. Root and leaves (NIL) are black.

3. Red nodes cannot have red children.

4. All paths from a node to its descendants have the same number of black nodes.

Insertion Algorithm:

1. Insert the node as in a BST.

2. Color the node red.

3. Fix violations of red-black properties using rotations and recoloring.

Example: Insert 10, 20, 30, 15. The tree rebalances after each insertion.

36. Heap Sort

Algorithm:

1. Build a max heap from the array.

2. Swap the root (maximum element) with the last element.

3. Reduce heap size and heapify the root.

4. Repeat until the heap is empty.

Example Array: A = {6, 14, 3, 25, 2, 10, 20, 7, 6}.

Step-by-step, build the max heap and extract the maximum each round; the final sorted array is {2, 3, 6, 6, 7, 10, 14, 20, 25}.
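The four steps above can be sketched in Python on the example array (0-indexed heap):

```python
def heap_sort(a):
    """In-place heap sort: build a max heap, then repeatedly extract the root."""
    def heapify(n, i):
        largest, l, r = i, 2 * i + 1, 2 * i + 2
        if l < n and a[l] > a[largest]:
            largest = l
        if r < n and a[r] > a[largest]:
            largest = r
        if largest != i:
            a[i], a[largest] = a[largest], a[i]
            heapify(n, largest)              # push the swapped value down

    n = len(a)
    for i in range(n // 2 - 1, -1, -1):      # build the max heap bottom-up
        heapify(n, i)
    for end in range(n - 1, 0, -1):          # move the max to the end, shrink the heap
        a[0], a[end] = a[end], a[0]
        heapify(end, 0)
    return a

print(heap_sort([6, 14, 3, 25, 2, 10, 20, 7, 6]))  # [2, 3, 6, 6, 7, 10, 14, 20, 25]
```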

37. Convex Hull Problem


Definition:

Given a set of points, the convex hull is the smallest convex polygon containing all points.

Algorithms:

1. Graham’s Scan:

o Sort points by polar angle.

o Use a stack to construct the hull.

2. Jarvis March (Gift Wrapping):

o Start at the leftmost point and wrap around.

38. Dijkstra’s Algorithm for Shortest Path

Steps:

1. Initialize distances to infinity, except the source (d[s] = 0).

2. Use a priority queue to select the vertex with the smallest tentative distance.

3. Update distances to adjacent vertices.

4. Repeat until all vertices are processed.

Example Graph: Solve for vertex 1 as the source.

39. Backtracking and Subset Sum Problem

Definition:

Backtracking explores all possible solutions by constructing a solution incrementally.

Example:

Subset sum for w = {5, 7, 10, 12, 15, 18, 20}, m = 35.
Use a state-space tree to explore all subsets.

40. Problem Classes P, NP, and NP-Complete

• P: Solvable in polynomial time. (e.g., Matrix Multiplication)

• NP: Verifiable in polynomial time. (e.g., Hamiltonian Cycle)

• NP-Complete: Problems in NP as hard as any problem in NP. (e.g., 3-SAT)

Relationship:
If P = NP, every NP problem has a polynomial-time solution.

40. Problem Classes: P, NP, and NP-Complete


P (Polynomial Time):

• These are problems solvable in polynomial time by a deterministic algorithm.

• Example: Sorting algorithms like Merge Sort, finding the greatest common divisor (GCD).

NP (Nondeterministic Polynomial Time):

• These problems may not have a known polynomial-time solution but can be verified in
polynomial time.

• Example: Subset Sum Problem, Hamiltonian Path Problem.

NP-Complete (NPC):

• These are the hardest problems in NP. If any NP-Complete problem can be solved in
polynomial time, all NP problems can be solved in polynomial time.

• Example: Traveling Salesman Problem (TSP), 3-SAT Problem.

Relationship Between Classes:

1. P ⊆ NP: Every problem in P is also in NP.

2. NP-Complete ⊆ NP: NP-Complete problems are part of NP.

3. P = NP (hypothetical): If this equality holds, all NP problems are solvable in polynomial time.

41. Properties of Binomial Heap

Properties:

1. A binomial heap is a collection of binomial trees.

2. Each binomial tree follows the min-heap property (the key at the root is the smallest).

3. A binomial heap of n nodes contains at most ⌊log₂ n⌋ + 1 binomial trees.

4. No two binomial trees in a heap have the same degree.

Algorithm to Unite Two Binomial Heaps:

BinomialHeapUnion(H1, H2):
    H = Merge(H1, H2)                      // merge root lists, sorted by degree
    if H is empty:
        return H
    prev_x = NULL
    x = H.head
    next_x = x.sibling
    while next_x ≠ NULL:
        if (x.degree ≠ next_x.degree) or
           (next_x.sibling ≠ NULL and next_x.sibling.degree == x.degree):
            prev_x = x                     // degrees differ (or three equal): move ahead
            x = next_x
        else if x.key ≤ next_x.key:
            x.sibling = next_x.sibling     // next_x becomes a child of x
            BinomialLink(next_x, x)
        else:
            if prev_x == NULL:             // x becomes a child of next_x
                H.head = next_x
            else:
                prev_x.sibling = next_x
            BinomialLink(x, next_x)
            x = next_x
        next_x = x.sibling
    return H

Finding the Minimum Key:

BinomialHeapMinimum(H):
    min_key = ∞
    x = H.head
    while x ≠ NULL:
        if x.key < min_key:
            min_key = x.key
        x = x.sibling                      // only root keys need to be examined
    return min_key
42. Knapsack Problem Using Greedy Approach

Given Items:

• Weight (W) = {10, 20, 30, 40, 50}

• Value (V) = {60, 100, 120, 140, 200}

• Knapsack Capacity (C) = 100.

Steps:

1. Compute Value-to-Weight Ratios:

Ratios = {6, 5, 4, 3.5, 4}

2. Sort Items by Ratio: Sorted order = {Item 1, Item 2, Item 3, Item 5, Item 4}.

3. Greedy Approach (Fractional Knapsack):

o Add Item 1 (Weight = 10, Value = 60). Remaining capacity: 90.

o Add Item 2 (Weight = 20, Value = 100). Remaining capacity: 70.

o Add Item 3 (Weight = 30, Value = 120). Remaining capacity: 40.

o Add a fraction (40/50 = 4/5) of Item 5 (Value = 4/5 × 200 = 160).

Total Value: 60 + 100 + 120 + 160 = 440.

43. Prefix Function for P = abacab Using KMP Algorithm

Prefix Function (π[i]):

π = [0, 0, 1, 0, 1, 2]

Steps to Compute π[i]:

1. π[0] = 0: a single character has no proper prefix that is also a suffix.

2. Iterate through the pattern, updating the longest proper prefix-suffix match for each position.

Naïve String Matching Algorithm:

• Slide the pattern over the text one character at a time.

• Compare pattern with substring at the current position.

44. Traveling Salesman Problem (TSP) Using Branch & Bound

Branch & Bound Approach:

1. Represent the graph using a cost matrix.

2. Start with a reduced cost matrix.

3. Compute bounds for each unvisited node.


4. Explore the node with the least cost bound recursively.

5. Backtrack when bounds exceed the current best solution.

44. Vertex Cover Problem Using Approximation Algorithm

Approximation Algorithm:

1. Start with an empty vertex cover.

2. Pick an edge (u, v) from the graph.

3. Add both u and v to the vertex cover.

4. Remove all edges incident to u or v.

5. Repeat until no edges remain.

Time Complexity: O(V + E).

Approximation Ratio: 2 (solution is at most twice the optimal size).
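A minimal Python sketch of this 2-approximation (the edge list is a made-up example):

```python
def approx_vertex_cover(edges):
    """2-approximation: pick an uncovered edge, add both of its endpoints."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:    # edge not yet covered
            cover.add(u)
            cover.add(v)
    return cover

# Hypothetical graph given as an edge list
edges = [(1, 2), (1, 3), (2, 4), (3, 4), (4, 5)]
c = approx_vertex_cover(edges)
print(all(u in c or v in c for u, v in edges))  # True: every edge is covered
```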

45. Knuth-Morris-Pratt (KMP) Algorithm for Pattern Matching

Explanation:

• The KMP algorithm efficiently searches for occurrences of a pattern P in a text T.

• It preprocesses the pattern to create a prefix table (π[]) that allows the algorithm to avoid redundant comparisons.

Algorithm:

Prefix Table Construction:

ComputePrefixFunction(P):
    m = length(P)
    π[0] = 0
    k = 0
    for q = 1 to m - 1:
        while k > 0 and P[k] ≠ P[q]:
            k = π[k - 1]
        if P[k] == P[q]:
            k = k + 1
        π[q] = k
    return π

Pattern Matching:

KMPMatcher(T, P):
    n = length(T)
    m = length(P)
    π = ComputePrefixFunction(P)
    q = 0                          // number of characters matched so far
    for i = 0 to n - 1:
        while q > 0 and P[q] ≠ T[i]:
            q = π[q - 1]           // fall back in the pattern
        if P[q] == T[i]:
            q = q + 1
        if q == m:                 // full match found
            print("Pattern found at index", i - m + 1)
            q = π[q - 1]

Time Complexity:

• Preprocessing (π[]): O(m)

• Matching: O(n)

• Total: O(n + m)

46. Bellman-Ford Algorithm

Purpose:

• Finds the shortest paths from a single source to all other vertices in a weighted graph.

• Works with graphs containing negative weight edges.

Algorithm:

BellmanFord(G, src):
    initialize distance of all vertices as ∞
    distance[src] = 0
    for i = 1 to |V| - 1:                        // relax every edge |V|-1 times
        for each edge (u, v) with weight w in G:
            if distance[u] + w < distance[v]:
                distance[v] = distance[u] + w
    for each edge (u, v) with weight w in G:     // extra pass detects negative cycles
        if distance[u] + w < distance[v]:
            print("Graph contains negative weight cycle")
    return distance

Time Complexity:

• Each of the |V| − 1 relaxation passes scans all E edges.

• Overall: O(V·E)
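A runnable Python version of the pseudocode above (the graph data is illustrative):

```python
def bellman_ford(vertices, edges, src):
    """edges: list of (u, v, w); handles negative weights, detects negative cycles."""
    dist = {v: float("inf") for v in vertices}
    dist[src] = 0
    for _ in range(len(vertices) - 1):           # relax all edges |V|-1 times
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in edges:                        # one more improving pass means
        if dist[u] + w < dist[v]:                # a negative-weight cycle exists
            raise ValueError("Graph contains a negative weight cycle")
    return dist

verts = ["s", "a", "b", "c"]
es = [("s", "a", 4), ("s", "b", 5), ("a", "c", -3), ("b", "c", 2)]
print(bellman_ford(verts, es, "s"))  # {'s': 0, 'a': 4, 'b': 5, 'c': 1}
```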

47. Fractional Knapsack Problem (Greedy Approach)

Problem:

• W = {3, 5, 9, 5}, P = {45, 30, 45, 10}, Capacity C = 16.

Steps:

1. Compute Value-to-Weight Ratios:

Ratios = {15, 6, 5, 2}

2. Sort Items by Ratio: Order = {1, 2, 3, 4}.

3. Greedy Approach:

o Take all of Item 1 (W = 3, P = 45). Remaining capacity: 13.

o Take all of Item 2 (W = 5, P = 30). Remaining capacity: 8.

o Take part of Item 3 (8/9 × 45 = 40).

Total Profit: 45 + 30 + 40 = 115.

48. Stable vs. Unstable Sorting

Stable Sorting:

• Maintains relative order of equal elements.

• Examples: Merge Sort, Bubble Sort.


Unstable Sorting:

• Does not maintain relative order of equal elements.

• Examples: Quick Sort, Heap Sort.

Heap Sort Implementation:

HeapSort(A):
    BuildMaxHeap(A)
    for i = length(A) downto 2:
        swap(A[1], A[i])                   // move the current maximum to its final position
        heap_size = heap_size - 1
        MaxHeapify(A, 1)

Heap sort on input [25, 57, 48, 36, 12, 91, 86, 32]:
Sorted Array: [12, 25, 32, 36, 48, 57, 86, 91].

49. Rabin-Karp String Matching Algorithm

Algorithm:

RabinKarp(T, P, d, q):              // d = radix (alphabet size), q = a prime modulus
    m = length(P)
    n = length(T)
    h = pow(d, m-1) % q             // weight of the leading digit
    p = 0
    t = 0
    for i = 0 to m-1:               // initial hash values
        p = (d * p + P[i]) % q
        t = (d * t + T[i]) % q
    for s = 0 to n-m:
        if p == t:                  // hash hit: verify character by character
            if P == T[s:s+m]:
                print("Pattern found at index", s)
        if s < n-m:                 // rolling hash: drop T[s], add T[s+m]
            t = (d * (t - T[s] * h) + T[s+m]) % q

Spurious Hits Example:

• Pattern P = 26, Text T = 3141592653589793, modulus q = 11.

• Spurious hits occur when hash values match but the actual substrings do not (here, the windows 15, 59, and 92 all hash to 26 mod 11 = 4).
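This example can be reproduced with a short Python sketch (assuming decimal digits, radix d = 10):

```python
def rabin_karp_hits(text, pattern, d=10, q=11):
    """Return (genuine match shifts, spurious hit count) for digit strings."""
    n, m = len(text), len(pattern)
    h = pow(d, m - 1, q)                     # weight of the leading digit mod q
    p = t = 0
    for i in range(m):                       # initial hashes of pattern and first window
        p = (d * p + int(pattern[i])) % q
        t = (d * t + int(text[i])) % q
    matches, spurious = [], 0
    for s in range(n - m + 1):
        if p == t:
            if text[s:s + m] == pattern:
                matches.append(s)
            else:
                spurious += 1                # hashes collided, strings differ
        if s < n - m:                        # rolling hash update
            t = (d * (t - int(text[s]) * h) + int(text[s + m])) % q
    return matches, spurious

print(rabin_karp_hits("3141592653589793", "26"))  # ([6], 3)
```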

50. N-Queens Problem (State Space Tree for n = 4)

• Place 4 queens on a 4x4 chessboard such that no two queens attack each other.

• Use backtracking to explore and eliminate invalid placements.
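A backtracking sketch in Python that enumerates the valid placements (one column index per row):

```python
def n_queens_solutions(n):
    """Find placements where no two queens share a row, column, or diagonal."""
    solutions = []

    def place(row, cols, diag1, diag2, board):
        if row == n:
            solutions.append(list(board))
            return
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue                     # attacked square: prune this branch
            place(row + 1, cols | {col}, diag1 | {row - col},
                  diag2 | {row + col}, board + [col])

    place(0, set(), set(), set(), [])
    return solutions

print(len(n_queens_solutions(4)))  # 2
```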

51. Kruskal’s Algorithm for MST

Algorithm:

1. Sort edges by weight.

2. Initialize disjoint sets for vertices.

3. For each edge, check if it forms a cycle using union-find.

4. If no cycle, add the edge to MST.

Time Complexity: O(E log E + V).

52. Binomial Heap (Decrease Key Operation)

Algorithm:

BinomialHeapDecreaseKey(H, x, k):
    if k > x.key:
        error("New key is greater than current key")
        return
    x.key = k
    y = x
    z = y.parent
    while z ≠ NULL and y.key < z.key:      // bubble the key up toward the root
        swap(y.key, z.key)
        y = z
        z = y.parent

Time Complexity: O(log n).

53. Skip List

• A probabilistic data structure supporting search, insertion, and deletion in O(log n) expected time.

• Consists of multiple layers of sorted linked lists; each element appears in higher layers with decreasing probability.

54. Naïve String Matching

Pattern P = aab, Text T = acaabc:

• Compare P with substrings of T at every position.

• Matches: 1 (at shift 2, counting from 0).

55. Sum of Subsets Problem

Definition:

The problem involves finding all subsets of a given set w = {w₁, w₂, …, wₙ} that sum up to a target value m.

Input:

• w = {5, 7, 10, 12, 15, 18, 20}, m = 35.

Approach Using Recursive Backtracking:

1. Start with the first element.

2. Include the element in the subset and proceed if the current sum does not exceed m.

3. Exclude the element and move to the next.

4. Report the subset when the current sum equals m.

Algorithm:

SumOfSubsets(w, n, currentSum, targetSum, subset):
    if currentSum == targetSum:
        print(subset)
        return
    if currentSum > targetSum or n == 0:
        return
    // Include the current element
    SumOfSubsets(w[1:], n - 1, currentSum + w[0], targetSum, subset + [w[0]])
    // Exclude the current element
    SumOfSubsets(w[1:], n - 1, currentSum, targetSum, subset)

State-Space Tree (Partial):

• Root: currentSum = 0, subset = {}.

• Level 1: Include/Exclude 5.

• Level 2: Include/Exclude 7, and so on.

Output Subsets:

{5, 10, 20}, {5, 12, 18}, {7, 10, 18}, {15, 20}.

56. String Matching Algorithm - Rabin-Karp

Definition:

Rabin-Karp finds all occurrences of a pattern P in a text T using a hash function for efficient comparison.

Algorithm:

1. Compute the hash value of P and of the first window of T.

2. Slide the pattern over the text:

o Compare hash values.

o If hash values match, compare character by character.

3. Update hash values using a rolling hash.

RabinKarp(T, P, d, q):              // d = radix, q = a prime modulus
    n = length(T)
    m = length(P)
    h = pow(d, m-1) % q
    p_hash = 0
    t_hash = 0
    for i = 0 to m-1:
        p_hash = (d * p_hash + P[i]) % q
        t_hash = (d * t_hash + T[i]) % q
    for s = 0 to n-m:
        if p_hash == t_hash:
            if P == T[s:s+m]:
                print("Pattern found at index", s)
        if s < n-m:
            t_hash = (d * (t_hash - T[s] * h) + T[s+m]) % q

Example:

• P = 26, T = 3141592653589793, q = 11.

• Matches are detected by hash value, with possible spurious hits resolved by direct character comparison.

57. Counting Sort Algorithm

Algorithm:

1. Count occurrences of each element.

2. Use the counts to determine positions.

3. Place elements in a sorted array.

Algorithm:

CountingSort(A):
    maxVal = max(A)
    count = [0] * (maxVal + 1)
    for x in A:                        // count occurrences of each value
        count[x] += 1
    for i in range(1, len(count)):     // prefix sums give final positions
        count[i] += count[i-1]
    sortedArray = [0] * len(A)
    for x in reversed(A):              // reverse pass preserves stability
        sortedArray[count[x] - 1] = x
        count[x] -= 1
    return sortedArray

Example (Array: A = {0, 1, 3, 0, 3, 2, 4, 5, 2, 4, 6, 2, 2, 3}):

Sorted Array: {0, 0, 1, 2, 2, 2, 2, 3, 3, 3, 4, 4, 5, 6}.

58. Recurrence Relations

1. (i) T(n) = T(n−1) + n⁴:

o Expansion: T(n) = n⁴ + (n−1)⁴ + (n−2)⁴ + ⋯ + 1⁴.

o Complexity: Θ(n⁵) (the sum of the first n fourth powers).

2. (ii) T(n) = T(n/4) + T(n/2) + n²:

o The recursive work shrinks geometrically: each level contributes (1/4)² + (1/2)² = 5/16 of the previous level's n² term, so the total is a convergent geometric series dominated by the root.

o Complexity: Θ(n²).

59. Naïve String Matching Algorithm

Algorithm:

NaiveStringMatcher(T, P):
    n = length(T)
    m = length(P)
    for s = 0 to n - m:
        if T[s:s+m] == P:
            print("Pattern occurs at shift", s)

Example (P = aab, T = acaabc):

• Compare P with substrings of T at each shift:

o Match at shift 2 (0-indexed).


60. Bellman-Ford Algorithm

See the answer to Question 46 above.

61. P and NP Problems

Definitions:

• P: Problems solvable in polynomial time.

• NP: Problems verifiable in polynomial time.

• NP-Complete: Problems in NP to which every problem in NP can be reduced in polynomial time.

Key Relationships:

• P ⊆ NP.

• If any NP-complete problem is in P, then P = NP.
