This document outlines the examination paper for the CIT 203: Data Structures and Algorithms course at Tom Mboya University for the 2024/2025 academic year. It includes instructions for candidates; a series of questions covering key concepts such as abstraction, data structures, algorithms, and their applications; and specific tasks such as writing pseudocode and implementing algorithms. The exam is structured into sections with varying marks allocated to each question.
TOM MBOYA UNIVERSITY

KNOWLEDGE FOR SUSTAINABLE INNOVATION ENTERPRISE

2024/2025 ACADEMIC YEAR UNIVERSITY EXAMINATIONS

SECOND YEAR FIRST SEMESTER UNIVERSITY EXAMINATIONS


FOR BACHELOR OF SCIENCE IN MATHEMATICS WITH IT

CIT 203: DATA STRUCTURES AND ALGORITHMS

DATE: 18/12/2024 TIME: 3 HOURS

INSTRUCTIONS TO CANDIDATES

1. Answer ALL questions in Section A and ANY TWO questions in Section B.


2. DURATION OF EXAMINATION: THREE (3) HOURS

EXAMINATION CHEATING IS A CRIME


QUESTION ONE (30 MARKS)

a) Define abstraction in the context of algorithms and explain its relevance to problem-solving. (5 marks)
b) By giving examples, explain what Abstract Data Types (ADTs) are and their importance in
designing algorithms. (6 marks)
c) Describe the operations of a stack. Provide an example of where a stack can be used in a
real-world problem. (5 marks)
d) Explain the difference between: (4 marks)
i. A stack and a queue.
ii. A data structure and an algorithm.
iii. An array and a linked list.
e) Define recursion and explain how recursive algorithms differ from iterative algorithms.
(5 marks)
f) Write a recursive algorithm for calculating the factorial of a number. (5 marks)

QUESTION TWO (20 MARKS)

a) Explain the difference between time complexity and space complexity. (4 marks)
b) Compare and contrast arrays and linked lists, focusing on their memory usage and
access times. (4 marks)
c)
i. Explain how a stack differs from a queue, providing one use case where each
might be preferable. (6 marks)
ii. Write pseudocode to implement a basic queue using an array. (6 marks)
QUESTION THREE (20 MARKS)

a) Write the pseudocode for the Quick Sort algorithm. (6 marks)


b) You are given the following unsorted list: [45, 23, 76, 12, 78, 34, 56]. Perform one full pass of the Bubble Sort algorithm on this list and show the resulting list. (4 marks)
c) Explain the divide-and-conquer technique and illustrate with an example how it is
applied in binary search. (10 marks)

QUESTION FOUR (20 MARKS)

a) Implement a stack using an array in either C++ or Java. Write the code for the following
stack operations: push, pop, and peek. (10 marks)
b) Explain the binary search algorithm and compare its efficiency with linear search. (10 marks)

QUESTION FIVE (20 MARKS)

a) Implement a C++/Java function to insert an element into a linked list at the beginning.
Provide the code and explain the main steps. (10 marks)
b) You are tasked with developing an algorithm to solve the knapsack problem. Describe
how dynamic programming can be used to find an optimal solution. (10 marks)

Common questions

Time complexity refers to the amount of computational time an algorithm takes to complete as a function of the length of the input, often expressed in big O notation. For example, O(n) describes linear time complexity, where the time taken increases linearly with input size. Space complexity, on the other hand, refers to the total memory space required by an algorithm to run, also expressed in big O notation. An example is an algorithm that requires an array of size n, having a space complexity of O(n). Understanding both complexities helps in evaluating algorithm efficiency.

Abstraction in algorithms involves simplifying a complex problem by modelling only the essential properties relevant to the task, while ignoring less pertinent details. It helps in managing complexity by breaking a problem down into more manageable parts and focusing on interactions at a higher level of detail. This is crucial in software development as it allows developers to think conceptually, enabling improved problem-solving, reducing complexity, and increasing the ability to manage large software systems over time.

A binary search is more efficient than linear search, offering O(log n) time complexity by dividing the search interval in half, whereas linear search scans each element, resulting in O(n) complexity. The prerequisite for binary search implementation is a sorted data set, a condition not required for linear search, which can operate on unsorted data. This efficiency and prerequisite make binary search highly effective in large datasets, where sorting time is justified by significantly reduced search times.

Abstract Data Types (ADTs) are fundamental to designing algorithms as they provide a clear specification of data and operations without detailing implementation, allowing for modular programming and ease of understanding. Examples include stacks, queues, and lists. Using ADTs, developers can implement algorithms that are more robust and can be easily modified, as the implementation details are separated from the interface. This abstraction promotes code reuse and maintains the efficiency of algorithms across different applications.

Recursion simplifies code and problem-solving by employing self-referential function calls, making algorithms easier to visualize and implement for problems like tree traversals and complex mathematical computations. However, recursion can lead to increased memory usage due to stack overheads and can result in stack overflow for deep recursions. Iterative algorithms, in contrast, use loops that are memory efficient and avoid issues like stack overflow, but can be complex to design and less intuitive for certain problems. Balancing these trade-offs is key in choosing appropriate algorithm design techniques.

Dynamic programming solves the knapsack problem by breaking it into overlapping subproblems and storing the results of subproblems to avoid redundant calculations, leading to a pseudo-polynomial solution (O(nW) for n items and capacity W). Unlike naive recursive approaches that may exhibit exponential time complexity due to repeated calculation of the same subproblems, dynamic programming optimizes the solution with a clear bottom-up approach, providing both time efficiency and scalability in solving complex problems like the knapsack problem.

Divide-and-conquer in binary search involves dividing the sorted data into halves and recursively searching the relevant half. This results in O(log n) time complexity, making it more efficient than linear search, which scans each element sequentially with O(n) time complexity. However, binary search requires sorted input for effective application, limiting its use in unsorted data scenarios where linear search might be preferable despite its inefficiency.

The pseudocode for implementing a queue using an array includes functions for enqueue (adding an element at the end), dequeue (removing an element from the front), and checking if the queue is full or empty. Using a queue is beneficial in scheduling tasks where resources are allocated in the order jobs arrive, ensuring fairness and order consistency, such as in printer spoolers or CPU task scheduling.

Stacks operate on a Last-In-First-Out (LIFO) principle whereas queues function on a First-In-First-Out (FIFO) principle. This fundamental operational difference means stacks are more suited to tasks like parsing and depth-first search, where the last element needs to be accessed immediately. In contrast, queues are more applicable in scenarios such as scheduling and breadth-first search, where processing must occur in the order of arrival. Understanding these differences can influence application design, ensuring that the right structure is chosen for task efficiency and performance.

Linked lists provide dynamic memory allocation leading to efficient memory usage since they do not require a contiguous block of memory as arrays do, allowing for resizing during execution. However, arrays allow faster access times due to direct indexing, while accessing an element in a linked list requires sequential traversal until the desired node is found, leading to increased access times. The trade-off between the memory efficiency of linked lists and the access speed of arrays influences data structure choice.
