Queue Notes

Queue data structure: detailed notes, from basic to advanced.

Subject: data structure

Topic: queue

Introduction to Queues
Definition, FIFO (First-In, First-Out) principle, real-world analogies, and basic
characteristics of a queue data structure.

1. Introduction
Definition A Queue is a linear data structure that follows a particular
order in which operations are performed. The order is First-In, First-
Out (FIFO). This means the element that was added first will be the
first one to be removed. Think of it as a line of people waiting for a
service; the person who arrived first is served first.

Why it is needed Queues are essential when there is a need to


process items or tasks in a strict, sequential order, ensuring fairness
or natural ordering of events.

Real-life analogy: Imagine a supermarket checkout line. The


first customer to join the line is the first one to be served by the
cashier. New customers join the end of the line, and customers are
served from the front. This is a perfect representation of a queue:
items enter at one end (the "rear" or "tail") and leave from the
other end (the "front" or "head").

Applications



Operating Systems: CPU scheduling (jobs processed in the order
they arrive), disk scheduling, handling interrupts.

Computer Networks: Packet buffering at routers, network


congestion management.

Printers: Print spoolers, where multiple documents wait in a


queue to be printed one after another.

Web Servers: Handling client requests in the order they arrive.

Simulation: Event-driven simulations often use queues to manage


events in chronological order.

Breadth-First Search (BFS) algorithm: Used to traverse graphs


and trees level by level.

2. Theory Explanation
In-depth explanation A queue maintains two primary pointers (or
indices) to manage its elements:

1. Front (or Head): This pointer keeps track of the element that is at
the very beginning of the queue and is ready to be removed.

2. Rear (or Tail): This pointer keeps track of the element that was
most recently added to the queue, and new elements are always
added after it.

The fundamental principle, FIFO, means:

Enqueue operation: When an element is added, it is always added


at the rear of the queue.

Dequeue operation: When an element is removed, it is always


removed from the front of the queue.

If the queue is empty, both front and rear typically point to an


invalid state (e.g., -1 or NULL ). When the first element is added, both
front and rear point to that element. As elements are dequeued,
front advances. As elements are enqueued, rear advances.
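
Before any implementation details, the FIFO behaviour itself can be observed with Python's built-in collections.deque; this is a minimal illustrative sketch only, separate from the array-based implementation developed later in these notes:

from collections import deque

q = deque()          # empty queue
q.append(10)         # enqueue at the rear
q.append(20)
q.append(30)

print(q.popleft())   # dequeue from the front -> 10 (added first, removed first)
print(q.popleft())   # -> 20
print(q)             # deque([30]) -> 30 is still waiting at the front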



Diagrams/flowcharts (described in text) Imagine a queue as a
horizontal row of cells, where data can only be added to the rightmost
cell and removed from the leftmost cell.

Empty Queue:

[ ] [ ] [ ] [ ] [ ]
front = -1, rear = -1 (both may be -1, or NULL)

After Enqueue(10):

[10] [ ] [ ] [ ] [ ]
front = 0, rear = 0
(Initially front and rear were -1. After adding 10, both become 0, the index of 10.)

After Enqueue(20):

[10] [20] [ ] [ ] [ ]
front = 0, rear = 1
(Rear moves to index 1, and 20 is placed there.)

After Enqueue(30):

[10] [20] [30] [ ] [ ]
front = 0, rear = 2
(Rear moves to index 2, and 30 is placed there.)

After Dequeue(): (10 is removed)

[ ] [20] [30] [ ] [ ]
front = 1, rear = 2
(Front moves to index 1. The value 10 is considered removed and no longer accessible from the queue, though it might still be in memory until overwritten.)

Step-by-step breakdown

1. Initialization: When a queue is created, it is empty. front and


rear are typically initialized to -1 (for array-based implementation)
or NULL (for linked list implementation).

2. Enqueue Operation:
Check if the queue is full (if array-based). If full, report an error.

If the queue was initially empty ( front == -1 ), set front = 0 .

Increment rear .

Add the new element at the position indicated by rear .

3. Dequeue Operation:
Check if the queue is empty. If empty, report an error.

Retrieve the element at the position indicated by front .

Increment front .

If front becomes greater than rear after dequeuing (meaning


the last element was removed), reset both front and rear to
-1 to indicate an empty queue.

4. Peek/Front Operation:
Check if the queue is empty. If empty, report an error.

Return the element at the position indicated by front without


modifying the queue.

5. isEmpty Operation:



Returns true if front is -1 (or front > rear ), false otherwise.

6. isFull Operation (Array-based):


Returns true if rear has reached the maximum capacity of the
underlying array, false otherwise.

3. Properties & Characteristics


Key properties

FIFO (First-In, First-Out): The most fundamental property,


ensuring elements are processed in the order they arrive.

Linear Data Structure: Elements are arranged sequentially.

Dynamic Size: If implemented using a linked list, a queue can grow


or shrink dynamically based on the number of elements.

Fixed Size: If implemented using an array, the queue has a


maximum capacity and can become full.

Restricted Access: Elements can only be added at the rear and


removed from the front.

Advantages & disadvantages

Advantages:
Maintains Order: Guarantees that elements are processed in
the sequence they were added.

Fairness: Useful in scenarios where fair processing is required


(e.g., CPU scheduling).

Simple Implementation: The basic operations are


straightforward to implement.

Buffering: Excellent for buffering data between processes or


systems operating at different speeds.

Disadvantages:
Limited Access: Only the front and rear elements are directly
accessible.



Fixed Size Limitation (Array-based): An array-based queue
can suffer from "overflow" (cannot add more elements) or
"underflow" (cannot remove elements from an empty queue). It
can also lead to inefficient use of space if elements are
frequently dequeued, creating unused space at the front of the
array (unless a circular queue is used).

4. Mathematical / Logical Representation


Important formulas (for array-based implementation)

isEmpty() :
front == -1 (initial state)

front > rear (after all elements have been dequeued from a
non-empty queue)

isFull() : (Assuming MAX_SIZE is the capacity of the array)


rear == MAX_SIZE - 1

size() :
If isEmpty() , size is 0.

Otherwise, rear - front + 1

Derivations: Not typically applicable for the introductory concept of a


queue's state. The "formulas" are more logical conditions rather than
complex mathematical derivations.
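
As a small illustration (not a derivation), the logical conditions above can be written as Python helpers, assuming front and rear are integer indices managed exactly as described and MAX_SIZE is the array capacity:

MAX_SIZE = 5  # example capacity, an assumption for this sketch

def is_empty(front, rear):
    # empty initially (front == -1) or after the last element has been dequeued
    return front == -1 or front > rear

def is_full(rear):
    # linear array queue: full once rear reaches the last index
    return rear == MAX_SIZE - 1

def size(front, rear):
    return 0 if is_empty(front, rear) else rear - front + 1

print(is_empty(-1, -1))       # True (initial state)
print(is_full(MAX_SIZE - 1))  # True
print(size(1, 3))             # 3 (elements at indices 1, 2 and 3)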

5. Operations / Algorithms
We will use an array-based implementation for demonstration, assuming
a fixed maximum size MAX_SIZE .

Queue Class Structure (Conceptual)

class Queue:
int queueArray[MAX_SIZE]
int front



int rear

// Constructor
Queue():
front = -1
rear = -1

5.1 Enqueue (Insert an element)

Problem statement: Add a new element to the end (rear) of the


queue.

Step-by-step algorithm:

1. Check if the queue is full. If rear is equal to MAX_SIZE - 1 , the


queue is full. Return an error or indicate that the operation cannot
be performed.

2. If the queue is empty ( front == -1 ), set front = 0 . This is for the


first element being added.

3. Increment rear by 1.

4. Place the new element at queueArray[rear] .

Dry run example: Let MAX_SIZE = 3 . Initial: queueArray = [_, _, _] ,


front = -1 , rear = -1

1. Enqueue(10) :

front == -1 is true, so front = 0 .

rear becomes 0 .

queueArray[0] = 10 .

State: queueArray = [10, _, _] , front = 0 , rear = 0

2. Enqueue(20) :

front == -1 is false.

rear becomes 1 .



queueArray[1] = 20 .

State: queueArray = [10, 20, _] , front = 0 , rear = 1

3. Enqueue(30) :

front == -1 is false.

rear becomes 2 .

queueArray[2] = 30 .

State: queueArray = [10, 20, 30] , front = 0 , rear = 2

4. Enqueue(40) :

rear == MAX_SIZE - 1 ( 2 == 3 - 1 ) is true. Queue is full.

Output: "Queue is Full"

State remains: queueArray = [10, 20, 30] , front = 0 , rear = 2

Pseudocode:

function Enqueue(item):
if rear == MAX_SIZE - 1:
print "Queue is Full. Cannot enqueue " + item
return
if front == -1: // First element being added
front = 0
rear = rear + 1
queueArray[rear] = item
print item + " enqueued to queue"

Code (C++):

#include <iostream>
#define MAX_SIZE 5 // Example maximum size

class Queue {
private:
int arr[MAX_SIZE];



int front;
int rear;

public:
Queue() {
front = -1;
rear = -1;
}

bool isEmpty() {
return front == -1;
}

bool isFull() {
return rear == MAX_SIZE - 1;
}

void enqueue(int value) {


if (isFull()) {
std::cout << "Queue Overflow: Cannot enqueue " << value << "." << std::endl;
return;
}
if (isEmpty()) {
front = 0;
}
rear++;
arr[rear] = value;
std::cout << value << " enqueued to queue." << std::endl;
}

// Other operations will be defined below


};

5.2 Dequeue (Remove an element)



Problem statement: Remove and return the element from the front
of the queue.

Step-by-step algorithm:

1. Check if the queue is empty. If front == -1 or front > rear , the


queue is empty. Return an error or a special value (e.g., -1).

2. Store the element at queueArray[front] in a temporary variable.

3. Increment front by 1.

4. If after incrementing front , front becomes greater than rear


(meaning the queue is now empty), reset both front and rear to
-1.

5. Return the stored element.

Dry run example: Let MAX_SIZE = 3 . Initial: queueArray = [10, 20,


30] , front = 0 , rear = 2

1. Dequeue() :

front == -1 is false.

item = queueArray[front] (item = queueArray[0] which is


10 ).

front becomes 1 .

front > rear ( 1 > 2 ) is false.

Return 10 .

State: queueArray = [10, 20, 30] , front = 1 , rear = 2 (Note:


10 is logically removed, but physically still in arr[0] )

2. Dequeue() :

front == -1 is false.

item = queueArray[front] (item = queueArray[1] which is


20 ).

front becomes 2 .

front > rear ( 2 > 2 ) is false.



Return 20 .

State: queueArray = [10, 20, 30] , front = 2 , rear = 2

3. Dequeue() :

front == -1 is false.

item = queueArray[front] (item = queueArray[2] which is


30 ).

front becomes 3 .

front > rear ( 3 > 2 ) is true. Reset front = -1 , rear = -1 .

Return 30 .

State: queueArray = [10, 20, 30] , front = -1 , rear = -1

4. Dequeue() :

front == -1 is true. Queue is empty.

Output: "Queue is Empty"

Return -1 (or throw error).

State remains: queueArray = [10, 20, 30] , front = -1 , rear =


-1

Pseudocode:

function Dequeue():
if front == -1 or front > rear:
print "Queue is Empty. Cannot dequeue."
return -1 // Indicate error
item = queueArray[front]
front = front + 1
if front > rear: // Queue became empty after this dequeue
front = -1
rear = -1
print item + " dequeued from queue"
return item



Code (C++):

// Inside the Queue class definition


// ... (previous code) ...

int dequeue() {
if (isEmpty()) {
std::cout << "Queue Underflow: Cannot dequeue. Queue is empty." << std::endl;
return -1; // Or throw an exception
}
int dequeuedValue = arr[front];
front++;
if (front > rear) { // If queue becomes empty
front = -1;
rear = -1;
}
std::cout << dequeuedValue << " dequeued from queue." << std::endl;
return dequeuedValue;
}

int peek() { // Or front()


if (isEmpty()) {
std::cout << "Queue is empty. No front element." << std::endl;
return -1;
}
return arr[front];
}
};

int main() {
    Queue myQueue;
    myQueue.enqueue(10); // Enqueued 10
    myQueue.enqueue(20); // Enqueued 20
    std::cout << "Front element is: " << myQueue.peek() << std::endl; // Front element is: 10
    myQueue.dequeue(); // Dequeued 10
    myQueue.enqueue(30); // Enqueued 30
    myQueue.dequeue(); // Dequeued 20
    myQueue.dequeue(); // Dequeued 30
    myQueue.dequeue(); // Queue Underflow
    std::cout << "Is queue empty? " << (myQueue.isEmpty() ? "Yes" : "No") << std::endl;
    return 0;
}

6. Complexity Analysis
For standard queue operations (enqueue, dequeue, peek, isEmpty, isFull)
using either an array or a linked list:

Time complexity

Enqueue: O(1) - Constant time. Adding an element involves a few


pointer/index manipulations and an assignment, irrespective of the
queue's size.

Dequeue: O(1) - Constant time. Removing an element also involves


a few pointer/index manipulations and retrieval.

Peek/Front: O(1) - Constant time. Accessing the element at the


front takes constant time.

isEmpty/isFull: O(1) - Constant time. Checking conditions involves


comparing pointers/indices.

Space complexity

O(N) - Where N is the number of elements currently stored in the


queue.

For an array-based implementation, it's O(MAX_SIZE) because the


array always occupies its maximum declared size, even if not all
slots are filled.

For a linked list-based implementation, it's O(N) as each element


uses a fixed amount of memory for data and a pointer.

Comparison with alternatives



vs. Stacks: Stacks follow LIFO (Last-In, First-Out), while queues
follow FIFO. Their primary use cases differ based on the required
order of element processing. Both offer O(1) for their core
operations.

vs. Arrays/Linked Lists (general purpose): While queues can be


implemented using arrays or linked lists, the key difference is the
restricted interface. Raw arrays/linked lists allow for random access
or insertion/deletion anywhere, which would violate the queue's
FIFO principle if not carefully managed. Queues provide a higher-
level abstraction with specific rules. The O(1) complexity for queue
operations is achieved by only allowing access at the front and
rear .

7. Examples
Solved Problem 1 (Trace Queue Operations) Consider an empty
queue with a MAX_SIZE of 4. Trace the state of the queue ( front ,
rear , elements) after each operation.

1. Initial State: front = -1 , rear = -1 , Queue = [_, _, _, _]

2. enqueue(A) :
front becomes 0 . rear becomes 0 . Queue = [A, _, _, _]

3. enqueue(B) :
rear becomes 1 . Queue = [A, B, _, _]

4. enqueue(C) :
rear becomes 2 . Queue = [A, B, C, _]

5. peek() :
Returns A . Queue state unchanged.

6. dequeue() :
Returns A . front becomes 1 . Queue = [A, B, C, _] (logically
B, C remain)

7. enqueue(D) :



rear becomes 3 . Queue = [A, B, C, D] (logically B, C, D )

8. enqueue(E) :
isFull() is true ( rear == MAX_SIZE - 1 , i.e., 3 == 4-1 ). Output:
"Queue is Full". Queue state unchanged.

9. dequeue() :
Returns B . front becomes 2 . Queue = [A, B, C, D] (logically
C, D remain)

10. dequeue() :
Returns C . front becomes 3 . Queue = [A, B, C, D] (logically
D remains)

11. peek() :
Returns D . Queue state unchanged.

12. dequeue() :
Returns D . front becomes 4 . Since front > rear ( 4 > 3 ),
front = -1 , rear = -1 .

Queue = [A, B, C, D] (logically empty)

13. dequeue() :
isEmpty() is true. Output: "Queue is Empty". Returns -1.
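
The same trace can be replayed in code. The sketch below uses a throwaway Python class that mirrors the array-based logic from Section 5 (the class name and MAX_SIZE of 4 are illustrative; the two peek() steps are omitted since they do not change the state):

class ArrayQueue:
    def __init__(self, max_size):
        self.arr = [None] * max_size
        self.max_size = max_size
        self.front = -1
        self.rear = -1

    def is_full(self):
        return self.rear == self.max_size - 1

    def is_empty(self):
        return self.front == -1

    def enqueue(self, item):
        if self.is_full():
            print("Queue is Full")
            return
        if self.is_empty():
            self.front = 0
        self.rear += 1
        self.arr[self.rear] = item

    def dequeue(self):
        if self.is_empty():
            print("Queue is Empty")
            return None
        item = self.arr[self.front]
        self.front += 1
        if self.front > self.rear:        # last element removed
            self.front, self.rear = -1, -1
        return item

q = ArrayQueue(4)
for x in "ABC":
    q.enqueue(x)                              # steps 2-4
print(q.dequeue())                            # step 6 -> A
q.enqueue("D")                                # step 7
q.enqueue("E")                                # step 8 -> prints "Queue is Full"
print(q.dequeue(), q.dequeue(), q.dequeue())  # steps 9, 10, 12 -> B C D
print(q.is_empty())                           # step 13 precondition -> True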

8. Common Mistakes & Edge Cases


Where students go wrong

Confusing front and rear : Incorrectly adding at front or


removing from rear .

Off-by-one errors: Forgetting to handle 0 -based indexing


correctly, especially with MAX_SIZE - 1 .

Not resetting front and rear for empty queue: After the last
element is dequeued, front might become MAX_SIZE and rear
might stay at MAX_SIZE-1 . Failing to reset them to -1 leads to an
incorrect isEmpty() check and potential issues if elements are
enqueued again.



Incorrect isFull() logic: In a basic array queue, rear == MAX_SIZE
- 1 signifies full. Students might forget this.

Ignoring isEmpty() and isFull() checks: Attempting to


dequeue from an empty queue or enqueue to a full queue
without proper checks leads to runtime errors or incorrect
behavior.

Special cases

Empty Queue: Both front and rear are -1. isEmpty() should
return true.

Full Queue (Array-based): rear is at MAX_SIZE - 1 . isFull()


should return true.

Queue with One Element: front and rear point to the same
index.
When this element is dequeued, front will become rear + 1 ,
which is the condition to reset front and rear to -1.

Queue after multiple enqueues and dequeues: In an array-


based implementation, the "front" of the queue might not be at
index 0 anymore, even if there's space at the beginning of the
array. This issue is addressed by Circular Queues, but for an
introduction, it's important to understand the basic linear queue
limitation.

9. Real-Life Applications
Operating Systems:

CPU Scheduling: Processes waiting for CPU time are often kept in
a ready queue. The operating system picks the next process from
the front of the queue using scheduling algorithms (e.g., Round
Robin, which uses a queue directly).

Printer Spooler: Documents sent to a printer are placed in a


queue. The printer processes them one by one in the order they
were received.



Keyboard Buffer: Characters typed on a keyboard are stored in a
buffer (queue) until the operating system or application can
process them.

Computer Networks:

Packet Buffering: Routers and network switches use queues to


temporarily store data packets when network traffic is high,
processing them in the order they arrived to avoid loss.

Web Servers:

Request Handling: Web servers often put incoming client requests


into a queue to process them sequentially, ensuring that no
request is dropped and resources are allocated fairly.

Data Processing:

Message Queues: Used in distributed systems to facilitate


communication between different parts of an application.
Messages are sent to a queue and processed asynchronously by
recipient services.

Algorithms:

Breadth-First Search (BFS): A graph traversal algorithm that


explores all the neighbor nodes at the present depth level before
moving on to nodes at the next depth level. It uses a queue to
manage the nodes to visit.
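
A short BFS sketch in Python shows the queue at work (the adjacency-list graph here is made up purely for illustration):

from collections import deque

def bfs(graph, start):
    visited = {start}
    order = []
    q = deque([start])           # the queue drives the level-by-level traversal
    while q:
        node = q.popleft()       # dequeue the oldest discovered node
        order.append(node)
        for nbr in graph[node]:
            if nbr not in visited:
                visited.add(nbr)
                q.append(nbr)    # enqueue newly discovered neighbours
    return order

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs(graph, "A"))           # ['A', 'B', 'C', 'D'] - visited level by level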

10. Practice Problems


Easy:

1. Implement a basic queue data structure using an array. Include


enqueue , dequeue , peek , isEmpty , and isFull operations.

2. Write a program that takes a sequence of integers as input,


enqueues them, and then dequeues and prints them,
demonstrating the FIFO principle.



Medium:

1. Implement a queue using a linked list instead of an array. This will


make the queue dynamic (no fixed MAX_SIZE ).

2. Consider the array-based queue implementation provided. If


front moves past the elements but rear is still at the end of the
array (e.g., front=3, rear=MAX_SIZE-1 ), and new space exists at
index 0 , the queue is logically not full, but isFull() would say it is.
How can you modify the array-based queue to efficiently use this
space? (Hint: Think about Circular Queues).

Hard:

1. Implement a queue using two stacks. Your enqueue and


dequeue operations should meet the FIFO requirements using
only stack operations (push, pop, peek, isEmpty). Analyze the time
complexity of your enqueue and dequeue operations.

11. Summary / Key Points


Quick revision of points covered

A Queue is a FIFO (First-In, First-Out) linear data structure.

Elements are added at the rear (enqueue) and removed from the
front (dequeue).

Key operations are enqueue() , dequeue() , peek()/front() ,


isEmpty() , and isFull() (for array-based).

Implemented using either arrays (fixed size, potential for wasted


space or need for circular implementation) or linked lists (dynamic
size).

Queue operations ( enqueue , dequeue , peek , isEmpty ,


isFull ) generally have a time complexity of O(1).

Space complexity is O(N) for N elements, or O(MAX_SIZE) for array-


based.



Used widely in operating systems, networks, simulations, and
algorithms like BFS.

Watch out for isEmpty() and isFull() conditions, especially when


handling the first or last element, and resetting front / rear
appropriately.

Highlighted formulas and shortcuts (Array-based)

Empty: front == -1 OR front > rear

Full: rear == MAX_SIZE - 1

Size: rear - front + 1 (if not empty)

Enqueue: Increment rear , then add element. If first element, set


front = 0 .

Dequeue: Increment front . If front > rear after dequeue, reset


front = -1 , rear = -1 .

Queue Abstract Data Type (ADT)


Understanding the core operations: enqueue (add), dequeue (remove),
front/peek (access front element), isEmpty, isFull, and size.

1. Introduction
Definition A Queue is a linear Abstract Data Type (ADT) that follows a
specific order for operations: First-In, First-Out (FIFO). This means the
element that was added first will be the first one to be removed. It's
often compared to a line of people waiting for a service. Queues are
characterized by two primary operations:

Enqueue: Adding an element to the rear (back) of the queue.

Dequeue: Removing an element from the front of the queue.



Why it is needed (real-life analogy if possible) Queues are
fundamental when the order of processing is critical, specifically when
items need to be handled sequentially based on their arrival time.
Imagine a ticket counter at a railway station:

People arrive and join the end of the line (Enqueue).

The person at the front of the line is served first and leaves
(Dequeue).

No one can cut in line (elements can only be added at the rear),
and no one can be served out of order (elements can only be
removed from the front). This analogy perfectly illustrates the FIFO
principle and the necessity of queues to manage ordered
processing.

Applications Queues are widely used in various computing scenarios:

Operating Systems: CPU scheduling (tasks are processed in the


order they arrive), disk scheduling, handling interrupts.

Network Systems: Packet buffering (packets are transmitted in


the order they are received).

Printer Spooling: Documents sent to a printer are placed in a


queue and printed one by one.

Call Center Systems: Customer calls are queued and answered by


agents in the order they came in.

Algorithms: Breadth-First Search (BFS) in graph traversal, level-


order traversal of a tree.

Data Buffers: Temporary storage for data flowing between


different parts of a system that operate at different speeds.

2. Theory Explanation
In-depth explanation A Queue ADT typically maintains two pointers
or indices:



Front (or Head): Points to the first element in the queue. This is
where elements are removed (dequeued).

Rear (or Tail): Points to the last element in the queue. This is
where new elements are added (enqueued). When a queue is
empty, both front and rear might point to a special value (e.g.,
-1 or null). When the first element is added, both front and rear
point to it. When elements are added, rear is incremented. When
elements are removed, front is incremented. A crucial condition
for an empty queue is when front becomes greater than rear
after a dequeue operation. A full queue (for array-based
implementations) is when rear reaches the maximum capacity.

Diagrams/flowcharts (describe in text if not possible to draw)


Let's describe a simple array-based queue's state changes:

Initial State (Empty Queue): [ ] [ ] [ ] [ ] [ ]  (Front = -1, Rear = -1)

After Enqueue(10): [10] [ ] [ ] [ ] [ ]  (Front = 0, Rear = 0)

After Enqueue(20): [10] [20] [ ] [ ] [ ]  (Front = 0, Rear = 1)

After Enqueue(30): [10] [20] [30] [ ] [ ]  (Front = 0, Rear = 2)

After Dequeue() (removes 10): [ ] [20] [30] [ ] [ ]  (Front = 1, Rear = 2)

After Dequeue() (removes 20): [ ] [ ] [30] [ ] [ ]  (Front = 2, Rear = 2)

After Dequeue() (removes 30): [ ] [ ] [ ] [ ] [ ]  (Front = 3, Rear = 2. Here Front > Rear, indicating empty)

Note: In a typical implementation, when the last element is dequeued, both Front and Rear are reset to -1 or a similar indicator of an empty queue.

Step-by-step breakdown

1. Queue Initialization: Create an empty queue. For an array, this


means allocating space and setting front = -1 and rear = -1 . For



a linked list, front = null and rear = null .

2. Enqueue Operation:
Check if the queue is full (for array-based). If full, report "Queue
Overflow".

If the queue is initially empty ( front == -1 and rear == -1 ), set


front = 0 .

Increment rear .

Add the new element at the position indicated by rear .

3. Dequeue Operation:
Check if the queue is empty ( front == -1 or front > rear for
array-based). If empty, report "Queue Underflow".

Retrieve the element at the position indicated by front .

Increment front .

If, after incrementing front , front becomes greater than


rear , it means the queue has become empty. Reset both
front = -1 and rear = -1 to properly reflect an empty state.

4. Peek/Front Operation:
Check if the queue is empty. If empty, report an error.

Otherwise, return the element at the front position without


removing it.

5. IsEmpty Operation:
Return true if front == -1 or ( front > rear in array-based
queue where front and rear are managed by indices),
otherwise false.

6. IsFull Operation:
(For array-based queues only) Return true if rear == MAX_SIZE -
1 , otherwise false.

3. Properties & Characteristics


Key properties



FIFO (First-In, First-Out): Elements are processed in the strict
order of their arrival.

Linear Data Structure: Elements are arranged sequentially.

Two Pointers: Operations are controlled by front and rear


pointers/indices.

Dynamic vs. Static Size: Can be implemented with a fixed-size


array (static) or a dynamic linked list (dynamic).

Limited Access: Elements can only be added at the rear and


removed from the front. Access to elements in the middle is not
allowed.

Advantages & disadvantages

Advantages:
Order Preservation: Guarantees that elements are processed
in the exact sequence they were added.

Simplicity: Conceptually straightforward and easy to


implement.

Resource Management: Excellent for managing shared


resources or tasks that need sequential execution.

Concurrency Control: Useful in multi-threaded environments


for inter-process communication and task scheduling.

Disadvantages:
Fixed Size (Array-based): An array implementation can suffer
from overflow if the queue reaches its maximum capacity,
leading to wasted space if the capacity is too large or failures if
too small. This can be mitigated by using a circular queue or
dynamic arrays.

Memory Overhead (Linked List-based): Each element in a


linked list queue requires extra memory for pointers, which can
be an overhead for very small data elements.

Inefficient Space Utilization (Linear Array-based): In a simple


array-based queue, after several dequeue operations, the



front pointer moves forward, leaving empty space at the
beginning of the array that cannot be reused until the queue
becomes completely empty (and often reset), or a circular array
approach is used.

4. Mathematical / Logical Representation


Important formulas The "formulas" for a Queue ADT are primarily
logical conditions that define its state and govern its operations.

Let Q be an array of size MAX_SIZE . Let front be the index of the


front element, and rear be the index of the rear element.

1. Initialization: front = -1 rear = -1

2. isEmpty(): return (front == -1) OR return (front > rear) after


some operations. The front == -1 check is often sufficient if
front is reset to -1 when the queue becomes empty.

3. isFull() (for array-based queue): return (rear == MAX_SIZE - 1)


Note: This only applies to a linear array queue. For a circular queue,
it's (rear + 1) % MAX_SIZE == front .

4. enqueue(element):

Pre-condition: !isFull()

If isEmpty() : front = 0 rear = 0

Else: rear = rear + 1

Q[rear] = element

5. dequeue():

Pre-condition: !isEmpty()

element = Q[front]

If front == rear (queue had only one element): front = -1


rear = -1

Else: front = front + 1



return element

Derivations (if any) There are no complex mathematical derivations


for the basic Queue ADT itself. The "derivations" mainly involve the
logical progression of front and rear pointers to ensure FIFO
behavior and correct handling of empty/full states, as shown in the
logical representations above. For example, the condition front ==
rear after dequeue() means the queue becomes empty, which leads
to the derivation to reset both pointers.
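
A minimal Python sketch of this variant, where the pointers are reset the moment the last element is removed (front == rear before the removal); the dictionary-based queue used here is purely illustrative:

def dequeue(q):
    # q is an illustrative dict: {"data": [...], "front": int, "rear": int}
    if q["front"] == -1:
        raise IndexError("Queue Underflow")   # pre-condition: !isEmpty()
    element = q["data"][q["front"]]
    if q["front"] == q["rear"]:               # only one element: queue becomes empty
        q["front"] = q["rear"] = -1
    else:
        q["front"] += 1
    return element

q = {"data": [10, 20, None], "front": 0, "rear": 1}
print(dequeue(q))              # 10, front advances to 1
print(dequeue(q))              # 20, front == rear so both reset to -1
print(q["front"], q["rear"])   # -1 -1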

5. Operations / Algorithms
We will demonstrate operations using an array-based queue for
simplicity.

5.1 Enqueue Operation

Problem statement: Add a new element to the rear of the queue.

Step-by-step algorithm:

1. Check if the queue is full. If rear has reached the maximum


capacity, report "Queue Overflow" and exit.

2. If the queue is empty (i.e., front is -1), set front to 0.

3. Increment rear .

4. Place the new element at the index rear in the queue's


underlying array.

Dry run example: Assume MAX_SIZE = 3 , initial queue Q = [_, _, _] ,


front = -1 , rear = -1 .

1. Enqueue(10) :
Queue not full. front is -1, so set front = 0 .

Increment rear to 0.

Q[0] = 10 .

State: Q = [10, _, _] , front = 0 , rear = 0 .



2. Enqueue(20) :
Queue not full.

Increment rear to 1.

Q[1] = 20 .

State: Q = [10, 20, _] , front = 0 , rear = 1 .

Pseudocode:

function Enqueue(queue, element, MAX_SIZE)
    if queue.rear == MAX_SIZE - 1
        print "Queue Overflow"
        return
    if queue.front == -1
        queue.front = 0
    queue.rear = queue.rear + 1
    queue.array[queue.rear] = element
end function

Code (C++/Java/Python):

C++:

void enqueue(int queue[], int& front, int& rear, int MAX_SIZE, int element) {
    if (rear == MAX_SIZE - 1) { std::cout << "Queue Overflow\n"; return; }
    if (front == -1) { front = 0; }
    rear++;
    queue[rear] = element;
    std::cout << element << " enqueued to queue\n";
}

Java:

public void enqueue(int element) {
    if (rear == MAX_SIZE - 1) { System.out.println("Queue Overflow"); return; }
    if (front == -1) { front = 0; }
    rear++;
    queueArray[rear] = element;
    System.out.println(element + " enqueued to queue");
}

Python:

def enqueue(queue_list, element, MAX_SIZE, front_ref, rear_ref):
    if rear_ref[0] == MAX_SIZE - 1:
        print("Queue Overflow")
        return front_ref[0], rear_ref[0]
    if front_ref[0] == -1:
        front_ref[0] = 0
    rear_ref[0] += 1
    queue_list[rear_ref[0]] = element
    print(f"{element} enqueued to queue")
    return front_ref[0], rear_ref[0]

5.2 Dequeue Operation

Problem statement: Remove and return the element from the front
of the queue.

Step-by-step algorithm:

1. Check if the queue is empty. If front is -1 or front > rear , report


"Queue Underflow" and exit.



2. Store the element at Q[front] in a temporary variable.

3. If front is equal to rear (meaning it's the last element in the


queue), reset both front and rear to -1 to signify an empty
queue.

4. Else, increment front .

5. Return the stored element.

Dry run example: Assume MAX_SIZE = 3 , current queue Q = [10, 20,


_] , front = 0 , rear = 1 .

1. Dequeue() :
Queue not empty.

temp = Q[0] (which is 10).

front (0) is not equal to rear (1).

Increment front to 1.

State: Q = [10, 20, _] , front = 1 , rear = 1 . (Note: The value


10 is logically removed; physically it might still be there but
unreachable).

Returns 10.

2. Dequeue() :
Queue not empty.

temp = Q[1] (which is 20).

front (1) is equal to rear (1).

Set front = -1 , rear = -1 .

State: Q = [10, 20, _] , front = -1 , rear = -1 .

Returns 20.

Pseudocode:

function Dequeue(queue)
    if queue.front == -1 or queue.front > queue.rear
        print "Queue Underflow"
        return ERROR_VALUE
    element = queue.array[queue.front]
    if queue.front == queue.rear
        queue.front = -1, queue.rear = -1
    else
        queue.front = queue.front + 1
    return element
end function



Code (C++/Java/Python):

C++:

int dequeue(int queue[], int& front, int& rear) {
    if (front == -1 || front > rear) { std::cout << "Queue Underflow\n"; return -1; /* Indicate error */ }
    int element = queue[front];
    if (front == rear) { front = -1; rear = -1; } // Last element in queue
    else { front++; }
    std::cout << element << " dequeued from queue\n";
    return element;
}

Java:

public int dequeue() {
    if (front == -1 || front > rear) { System.out.println("Queue Underflow"); return -1; /* Indicate error */ }
    int element = queueArray[front];
    if (front == rear) { front = -1; rear = -1; } // Last element in queue
    else { front++; }
    System.out.println(element + " dequeued from queue");
    return element;
}

Python:

def dequeue(queue_list, front_ref, rear_ref):
    if front_ref[0] == -1 or front_ref[0] > rear_ref[0]:
        print("Queue Underflow")
        return -1, front_ref[0], rear_ref[0]  # Indicate error
    element = queue_list[front_ref[0]]
    if front_ref[0] == rear_ref[0]:  # Last element in queue
        front_ref[0] = -1
        rear_ref[0] = -1
    else:
        front_ref[0] += 1
    print(f"{element} dequeued from queue")
    return element, front_ref[0], rear_ref[0]

5.3 Peek/Front Operation

Problem statement: Retrieve the element at the front of the queue


without removing it.

Step-by-step algorithm:

1. Check if the queue is empty. If front is -1 or front > rear , report


"Queue is Empty" and exit.

2. Return the element at the index front in the queue's array.

Pseudocode:

function Peek(queue)
    if queue.front == -1 or queue.front > queue.rear
        print "Queue is Empty"
        return ERROR_VALUE
    return queue.array[queue.front]
end function

Code (C++/Java/Python):

C++:

int peek(int queue[], int front, int rear) {
    if (front == -1 || front > rear) { std::cout << "Queue is Empty\n"; return -1; /* Indicate error */ }
    return queue[front];
}

Java:

public int peek() {
    if (front == -1 || front > rear) { System.out.println("Queue is Empty"); return -1; /* Indicate error */ }
    return queueArray[front];
}

Python:

def peek(queue_list, front_ref, rear_ref):
    if front_ref[0] == -1 or front_ref[0] > rear_ref[0]:
        print("Queue is Empty")
        return -1  # Indicate error
    return queue_list[front_ref[0]]

5.4 IsEmpty Operation

Problem statement: Check if the queue contains any elements.

Step-by-step algorithm:

1. Return true if front is -1 or front > rear , indicating an empty


queue. Otherwise, return false.

Pseudocode:

function IsEmpty(queue)
    return (queue.front == -1 or queue.front > queue.rear)
end function

Code (C++/Java/Python):

C++: bool isEmpty(int front, int rear) { return (front == -1 || front >
rear); }

Java: public boolean isEmpty() { return (front == -1 || front > rear); }

Python:

def is_empty(front_ref, rear_ref):
    return front_ref[0] == -1 or front_ref[0] > rear_ref[0]

5.5 IsFull Operation (for array-based queue)

Problem statement: Check if the queue has reached its maximum


capacity.

Step-by-step algorithm:



1. Return true if rear has reached MAX_SIZE - 1 . Otherwise, return
false.

Pseudocode:

function IsFull(queue, MAX_SIZE)
    return (queue.rear == MAX_SIZE - 1)
end function

Code (C++/Java/Python):

C++: bool isFull(int rear, int MAX_SIZE) { return (rear == MAX_SIZE - 1); }

Java: public boolean isFull() { return (rear == MAX_SIZE - 1); }

Python:

def is_full(rear_ref, MAX_SIZE):
    return rear_ref[0] == MAX_SIZE - 1

6. Complexity Analysis
Time complexity (Best, Worst, Average) For an array-based or
linked-list based Queue ADT, the fundamental operations are highly
efficient:

Enqueue: O(1)
Adding an element at the rear simply involves updating the
rear pointer/index and assigning the value.

Dequeue: O(1)
Removing an element from the front simply involves updating
the front pointer/index and returning the value.

Peek/Front: O(1)
Accessing the element at the front is a direct lookup.

IsEmpty: O(1)

IsFull: O(1) (for array-based)

Note: For dynamic array implementations, enqueue might


occasionally trigger a resize operation, which could take O(N) time in
the worst case (where N is the current number of elements). However,
this is amortized to O(1) over a sequence of operations.
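
A minimal sketch of that resizing behaviour, assuming a doubling growth strategy (real dynamic-array containers choose their own growth factors):

class GrowingQueue:
    def __init__(self):
        self.arr = [None] * 2                    # small initial capacity (assumption)
        self.front = 0
        self.size = 0

    def enqueue(self, item):
        if self.size == len(self.arr):           # full: O(n) copy, but it happens rarely
            bigger = [None] * (2 * len(self.arr))
            for i in range(self.size):           # copy the elements in queue order
                bigger[i] = self.arr[(self.front + i) % len(self.arr)]
            self.arr, self.front = bigger, 0
        self.arr[(self.front + self.size) % len(self.arr)] = item
        self.size += 1                           # usual case: O(1)

    def dequeue(self):
        if self.size == 0:
            raise IndexError("Queue Underflow")
        item = self.arr[self.front]
        self.front = (self.front + 1) % len(self.arr)
        self.size -= 1
        return item

q = GrowingQueue()
for x in range(5):
    q.enqueue(x)                                 # triggers two resizes along the way
print([q.dequeue() for _ in range(5)])           # [0, 1, 2, 3, 4]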



Space complexity

O(N): Where N is the number of elements currently stored in the


queue. Each element requires a constant amount of space.

For an array-based queue with a fixed MAX_SIZE , the space


complexity is O(MAX_SIZE) as it allocates memory for the maximum
possible elements upfront.

Comparison with alternatives

Queue vs. Stack:


Queue: FIFO (First-In, First-Out). Elements added first are
removed first. Analogy: a line of people. Operations: enqueue ,
dequeue .

Stack: LIFO (Last-In, First-Out). Elements added last are


removed first. Analogy: a stack of plates. Operations: push ,
pop .

Both are linear ADTs, but their access patterns are inverted.
Stacks are used for function call management, expression
evaluation, etc., where the most recent item needs to be
processed first. Queues are used when maintaining the order of
arrival is paramount.

Queue vs. Deque (Double-Ended Queue):


A Deque allows insertion and deletion from both ends (front
and rear), making it more flexible than a traditional queue or
stack. A queue is essentially a restricted Deque where
operations are limited to enqueue at rear and dequeue from
front.
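
The difference is easy to see with Python's collections.deque, which already supports both ends; a FIFO queue simply restricts itself to append (rear) and popleft (front). A small illustrative sketch:

from collections import deque

d = deque([1, 2, 3])
d.append(4)          # insert at the rear  (allowed for both queue and deque)
d.appendleft(0)      # insert at the front (deque only)
print(d.popleft())   # remove from the front -> 0 (queue-style dequeue)
print(d.pop())       # remove from the rear  -> 4 (deque only)
print(d)             # deque([1, 2, 3])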

7. Examples
Solved problems

Let's use an array-based queue with MAX_SIZE = 5 . Initial state: Q =


[_, _, _, _, _] , front = -1 , rear = -1 .



1. enqueue(10)
front becomes 0, rear becomes 0. Q = [10, _, _, _, _]

2. enqueue(20)
rear becomes 1. Q = [10, 20, _, _, _]

3. enqueue(30)
rear becomes 2. Q = [10, 20, 30, _, _]

4. peek()
Returns 10 (element at front = 0 ). Q remains unchanged.

5. dequeue()
Removes 10. front becomes 1. Q = [10, 20, 30, _, _] (logically
[20, 30]). Returned: 10.

6. enqueue(40)
rear becomes 3. Q = [10, 20, 30, 40, _] (logically [20, 30, 40]).

7. dequeue()
Removes 20. front becomes 2. Q = [10, 20, 30, 40, _] (logically
[30, 40]). Returned: 20.

8. enqueue(50)
rear becomes 4. Q = [10, 20, 30, 40, 50] (logically [30, 40, 50]).

9. isFull()
rear (4) is MAX_SIZE - 1 (4). Returns true .

10. enqueue(60)
Queue is full. Prints "Queue Overflow". Q remains unchanged.

11. dequeue()
Removes 30. front becomes 3. Q = [10, 20, 30, 40, 50]
(logically [40, 50]). Returned: 30.

12. dequeue()
Removes 40. front becomes 4. Q = [10, 20, 30, 40, 50]
(logically [50]). Returned: 40.

13. dequeue()



Removes 50. front becomes -1, rear becomes -1. Q = [10,
20, 30, 40, 50] (logically empty). Returned: 50.

14. isEmpty()
front is -1. Returns true .

15. dequeue()
Queue is empty. Prints "Queue Underflow". Returns -1.

8. Common Mistakes & Edge Cases


Where students go wrong

Off-by-one errors: Incorrectly managing array indices ( front ,


rear , MAX_SIZE - 1 ), especially for isFull and isEmpty
conditions.

Incorrect isEmpty check: Forgetting to handle the state where


front moves past rear in an array-based queue after all
elements have been dequeued (e.g., front > rear ).

Not resetting pointers: After the last element is dequeued, front


and rear should typically be reset to -1 (or null) to correctly
represent an empty queue state. If not, front might keep
incrementing and lead to incorrect isEmpty checks later.

Linear Queue limitations: Not understanding that a basic array-


based queue suffers from "wasted space" at the beginning of the
array after many dequeues. This can be solved by a Circular Queue.

Mixing up front and rear roles: Accidentally enqueuing at


front or dequeuing from rear .

Null pointer exceptions: In linked list implementations, forgetting


to check for null when traversing or accessing next pointers.

Special cases

Empty Queue: What happens when you try to dequeue or peek an


empty queue (underflow condition).



Full Queue: What happens when you try to enqueue into a full
array-based queue (overflow condition).

Single Element Queue: The logic for dequeuing the last remaining
element (where front == rear ) must correctly reset the pointers to
an empty state.

Queue after multiple enqueues and dequeues: The front


pointer might be far from index 0. isFull for a linear array queue
might become true even if there's logical space at the beginning.
This is why circular queues are often preferred.

9. Real-Life Applications
How this concept is used in real world
Operating Systems - Process Scheduling: When multiple
processes are ready to run, they are put into a queue. The CPU
scheduler picks processes from the front of this queue (e.g., First-
Come, First-Served scheduling).

Printer Spooling/Task Management: When multiple users send


print jobs to a single printer, or multiple tasks are initiated on a
server, these jobs are stored in a queue and processed one by one
to avoid conflicts and ensure fair access.

Web Servers - Request Handling: A web server receives


numerous client requests. It often places these requests into a
queue to process them in the order they arrive, preventing
overloading and ensuring a sequential response.

BFS (Breadth-First Search) Algorithm: Used in pathfinding (e.g.,


finding the shortest path in an unweighted graph), network
broadcasting, and social network analysis. BFS explores all
neighbors at the current "level" before moving to the next level,
inherently using a queue to manage the order of nodes to visit.

Message Queues in Distributed Systems: Systems like RabbitMQ,


Apache Kafka, or AWS SQS use message queues to enable
asynchronous communication between different services.



Messages are sent to a queue and consumed by services at their
own pace, providing decoupling and fault tolerance.

Simulations: Many simulations of real-world systems (e.g.,


customer service lines, traffic flow) use queues to model waiting
lines and resource allocation.

Buffer Management: Used in streaming data, audio/video players,


or network devices to temporarily store data packets that arrive
faster than they can be processed or to smooth out data flow.

10. Practice Problems


Easy

1. Implement a basic array-based queue with enqueue , dequeue ,


peek , isEmpty , isFull operations. Test with a sequence of
operations and print the queue state.

2. Write a program to reverse the elements of a queue using a stack.

3. Count the number of elements in a queue without modifying its


structure (e.g., by enqueuing them back after dequeueing).

Medium

1. Implement a Circular Queue using an array. A circular queue


addresses the space wastage issue of a linear array queue.

2. Implement a queue using two stacks. (Hint: One stack for


enqueuing, another for dequeuing, transfer elements when
needed).

3. Given an array representing a stream of integers, find the first non-


repeating character in the stream for each character. (Hint: Use a
queue and a frequency map).

Hard

1. Implement a Deque (Double-Ended Queue) using an array or a


doubly linked list.



2. Implement a Priority Queue (where elements are dequeued
based on priority, not just arrival order). Discuss how it differs from
a standard queue. (Hint: typically uses a heap).

3. Solve the "Rotten Oranges" problem (LeetCode Medium) which


involves finding the minimum time for all fresh oranges to rot,
essentially a multi-source BFS problem that heavily relies on a
queue.

11. Summary / Key Points


Quick revision of points covered

Queue ADT: A linear data structure following the FIFO (First-In,


First-Out) principle.

Key Operations:
enqueue(element) : Adds an element to the rear.

dequeue() : Removes an element from the front.

peek() / front() : Returns the element at the front without


removing it.

isEmpty() : Checks if the queue is empty.

isFull() : Checks if the queue is full (for array-based


implementations).

Pointers/Indices: Managed by front (or head ) and rear (or


tail ).

Implementations: Can be implemented using arrays (linear or


circular) or linked lists.

Efficiency: All basic operations ( enqueue , dequeue , peek ,


isEmpty , isFull ) typically have a time complexity of O(1).

Space Complexity: O(N) where N is the number of elements.

Real-world Uses: CPU scheduling, printer queues, BFS algorithm,


message queues, call center systems.



Common Pitfalls: Handling empty/full conditions, resetting
pointers after last dequeue, off-by-one errors.

Highlighted formulas and shortcuts

FIFO: The fundamental rule for queues.

O(1): The characteristic time complexity for queue's core


operations.

front == -1 or front > rear : Key condition for isEmpty() in


array-based queues.

rear == MAX_SIZE - 1 : Key condition for isFull() in linear array-


based queues.

Circular Queue isFull() : (rear + 1) % MAX_SIZE == front .

Queue Initialization: Set front = -1 , rear = -1 .

After last dequeue : Reset front = -1 , rear = -1 .

Queue Implementation using


Arrays
Implementing queues using fixed-size arrays, managing front and rear
pointers, and addressing the issue of array capacity and space wastage.

1. Introduction
Definition A queue is a linear data structure that follows the First-In,
First-Out (FIFO) principle. This means the element inserted first will be
the first one to be removed. It's like a line of people waiting for a
service.

Why it is needed (real-life analogy if possible) Queues are essential


when the order of processing matters, specifically when items need to
be handled sequentially in the order they arrived. Real-life Analogy:
Imagine a line of customers at a coffee shop. The first customer to join



the line is the first one served. New customers join at the back of the
line, and the customer at the front of the line is served and leaves.
This perfectly models a queue's FIFO behavior.

Applications
CPU scheduling (managing processes in an operating system)

Printer spooling (managing print jobs)

Call center systems (handling customer calls)

Data buffering (e.g., streaming video, network packets)

Breadth-First Search (BFS) algorithm in graphs

2. Theory Explanation
In-depth explanation When implementing a queue using an array,
we typically use a fixed-size array and two pointers (or indices): front
and rear .
front : Points to the first element (the one to be dequeued next).

rear : Points to the last element (the one that was most recently
enqueued). Initially, both front and rear are often set to -1 to
indicate an empty queue.

Enqueue Operation: When an element is added, rear is


incremented, and the element is placed at array[rear] .

Dequeue Operation: When an element is removed, the element at


array[front] is taken, and front is incremented. A crucial
challenge with a simple linear array implementation is that once
front and rear move towards the end of the array, even if
elements have been dequeued from the beginning, the space at
the start of the array remains unused. This can lead to the queue
becoming "full" logically, even if there's physical space available.
This issue is often resolved using a "Circular Queue"
implementation, where the array is treated as a ring.
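
A hedged sketch of that circular idea: indices wrap around with the modulo operator, and the isFull() condition (rear + 1) % MAX_SIZE == front noted earlier is used (class and variable names are illustrative):

class CircularQueue:
    def __init__(self, capacity):
        self.arr = [None] * capacity
        self.cap = capacity
        self.front = -1
        self.rear = -1

    def is_empty(self):
        return self.front == -1

    def is_full(self):
        return (self.rear + 1) % self.cap == self.front

    def enqueue(self, item):
        if self.is_full():
            print("Queue Overflow")
            return
        if self.is_empty():
            self.front = self.rear = 0
        else:
            self.rear = (self.rear + 1) % self.cap    # wrap around
        self.arr[self.rear] = item

    def dequeue(self):
        if self.is_empty():
            print("Queue Underflow")
            return None
        item = self.arr[self.front]
        if self.front == self.rear:                   # last element removed
            self.front = self.rear = -1
        else:
            self.front = (self.front + 1) % self.cap  # wrap around
        return item

cq = CircularQueue(3)
for x in (10, 20, 30):
    cq.enqueue(x)
cq.dequeue()           # frees the slot at index 0
cq.enqueue(40)         # reuses index 0 thanks to the wrap-around
print(cq.arr)          # [40, 20, 30]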

Diagrams/flowcharts (describe in text if not possible to draw)


Empty Queue: [ ] [ ] [ ] [ ] [ ] front = -1 , rear = -1



After Enqueue(10): [10] [ ] [ ] [ ] [ ] front = 0 , rear = 0

After Enqueue(20): [10] [20] [ ] [ ] [ ] front = 0 , rear = 1

After Enqueue(30): [10] [20] [30] [ ] [ ] front = 0 , rear = 2

After Dequeue(): (10 is removed) [ ] [20] [30] [ ] [ ] front = 1 , rear = 2

Queue Full (for an array of size 5): [E1] [E2] [E3] [E4] [E5] front = 0 , rear = 4 (assuming E1 to E5 are elements)

Step-by-step breakdown
1. Initialization: Create an array of a fixed maximum size. Set front
= -1 and rear = -1 .

2. Enqueue:
Check if the queue is full (if rear has reached the maximum
array index). If full, report "Queue Overflow".

If it's the first element being added, set front = 0 .

Increment rear .

Add the new element at array[rear] .

3. Dequeue:
Check if the queue is empty (if front is -1 or front > rear ). If
empty, report "Queue Underflow".

Retrieve the element at array[front] .

Increment front .

If front becomes greater than rear after dequeuing (meaning


the last element was removed), reset both front = -1 and rear
= -1 to signify an empty queue.

4. Peek/Front:
Check if the queue is empty.

If not empty, return the element at array[front] without


modifying the queue.

5. IsEmpty: Return True if front == -1 (or front > rear ), False


otherwise.



6. IsFull: Return True if rear == max_size - 1 , False otherwise.
(This applies to a linear queue; circular queues have a different
isFull condition).

3. Properties & Characteristics


Key properties
FIFO (First-In, First-Out): Elements are processed in the order
they arrive.

Fixed Size: When implemented with a static array, the queue has a
predefined maximum capacity.

Pointer-based access: Uses front and rear pointers to manage


insertions and deletions.

Contiguous memory allocation: Elements are stored in adjacent


memory locations.

Advantages & disadvantages


Advantages:
Simple to implement: The logic for enqueue and dequeue
is straightforward.

Efficient access: Direct array indexing provides O(1) time


complexity for enqueue , dequeue , peek , isEmpty , and
isFull operations.

Memory locality: Elements are stored contiguously, which can


sometimes lead to better cache performance.

Disadvantages:
Fixed Size: The most significant drawback is the fixed capacity.
If the queue becomes full, no more elements can be added,
leading to "Queue Overflow."

Space Wastage (Linear Queue): In a linear array


implementation, once elements are dequeued from the front,
the space they occupied cannot be reused unless the remaining
elements are shifted (which is inefficient) or a circular array is



used. This can lead to the queue being "full" even when there's
empty space at the beginning of the array.

Rigid Structure: Less flexible compared to dynamic


implementations like linked lists.

4. Mathematical / Logical Representation


Important formulas Let queue_array be an array of size
MAX_SIZE . Let front be the index of the first element. Let rear be
the index of the last element.

Initial State (Empty Queue): front = -1 rear = -1

Condition for Empty Queue: front == -1 (or front > rear if


front can advance past rear after dequeuing the last element)

Condition for Full Queue (Linear Array Implementation): rear


== MAX_SIZE - 1

Enqueue Operation:

1. If isFull() is true, report Overflow.

2. If isEmpty() is true, set front = 0 .

3. rear = rear + 1

4. queue_array[rear] = new_element

Dequeue Operation:

1. If isEmpty() is true, report Underflow.

2. element = queue_array[front]

3. front = front + 1

4. If front > rear , then the queue is now empty, so reset: front =
-1 , rear = -1 .

5. Return element .

Peek Operation:



1. If isEmpty() is true, report Underflow.

2. Return queue_array[front]

5. Operations / Algorithms
Let's assume a fixed-size array arr and MAX_SIZE .

5.1. enqueue(element)

Problem statement: Add an element to the rear of the queue.

Step-by-step algorithm:
1. Check if the queue is full ( rear == MAX_SIZE - 1 ). If true, print
"Queue Overflow" and exit.

2. If the queue is initially empty ( front == -1 ), set front = 0 .

3. Increment rear by 1.

4. Place the element at arr[rear] .

Dry run example: arr = [_, _, _, _] (MAX_SIZE=4), front = -1 , rear = -1
1. enqueue(10) :
front is -1, so set front = 0 .

rear becomes 0.

arr[0] = 10 .

Queue state: arr = [10, _, _, _] , front = 0 , rear = 0

2. enqueue(20) :
front is not -1.

rear becomes 1.

arr[1] = 20 .

Queue state: arr = [10, 20, _, _] , front = 0 , rear = 1

Pseudocode:



FUNCTION enqueue(element):
IF rear == MAX_SIZE - 1:
PRINT "Queue Overflow"
RETURN
IF front == -1:
front = 0
rear = rear + 1
arr[rear] = element

Code (Python):

class Queue:
    def __init__(self, capacity):
        self.capacity = capacity
        self.arr = [None] * capacity
        self.front = -1
        self.rear = -1

    def enqueue(self, item):
        if self.isFull():
            print("Queue Overflow: Cannot enqueue item", item)
            return
        if self.isEmpty():
            self.front = 0
        self.rear += 1
        self.arr[self.rear] = item
        print(f"Enqueued: {item}")

    def isEmpty(self):
        return self.front == -1 or self.front > self.rear

    def isFull(self):
        return self.rear == self.capacity - 1

5.2. dequeue()



Problem statement: Remove and return the element from the front
of the queue.

Step-by-step algorithm:
1. Check if the queue is empty ( front == -1 or front > rear ). If true,
print "Queue Underflow" and return an error/None.

2. Store the element at arr[front] in a temporary variable.

3. Increment front by 1.

4. If after incrementing, front becomes greater than rear (meaning


the queue is now empty), reset front = -1 and rear = -1 .

5. Return the stored element.

Dry run example: arr = [10, 20, 30, _] (MAX_SIZE=4), front = 0 , rear = 2
1. dequeue() :
Queue is not empty.

item = arr[0] (which is 10).

front becomes 1.

front (1) is not greater than rear (2).

Return 10.

Queue state: arr = [10, 20, 30, _] , front = 1 , rear = 2 (note:


10 is conceptually removed, but array value technically remains
until overwritten).

2. dequeue() :
Queue is not empty.

item = arr[1] (which is 20).

front becomes 2.

front (2) is not greater than rear (2).

Return 20.

Queue state: arr = [10, 20, 30, _] , front = 2 , rear = 2

3. dequeue() :



Queue is not empty.

item = arr[2] (which is 30).

front becomes 3.

Now front (3) is greater than rear (2). Reset front = -1 , rear
= -1 .

Return 30.

Queue state: arr = [10, 20, 30, _] , front = -1 , rear = -1

Pseudocode:

FUNCTION dequeue():
IF isEmpty():
PRINT "Queue Underflow"
RETURN ERROR/NONE
element = arr[front]
front = front + 1
IF front > rear:
front = -1
rear = -1
RETURN element

Code (Python):

# ... (Queue class definition from above) ...

def dequeue(self):
    if self.isEmpty():
        print("Queue Underflow: Cannot dequeue from empty queue")
        return None

    item = self.arr[self.front]
    self.front += 1

    if self.front > self.rear:  # Last element was dequeued
        self.front = -1
        self.rear = -1
    print(f"Dequeued: {item}")
    return item

5.3. peek() / front()

Problem statement: Get the element at the front of the queue


without removing it.

Step-by-step algorithm:
1. Check if the queue is empty. If true, print "Queue is empty" and
return an error/None.

2. Return the element at arr[front] .

Pseudocode:

FUNCTION peek():
IF isEmpty():
PRINT "Queue is empty"
RETURN ERROR/NONE
RETURN arr[front]

Code (Python):

# ... (Queue class definition from above) ...

def peek(self):
    if self.isEmpty():
        print("Queue is empty")
        return None
    return self.arr[self.front]

5.4. isEmpty()

Problem statement: Check if the queue contains no elements.

Step-by-step algorithm:



1. Return True if front is -1 (indicating an empty queue), otherwise
return False . (Alternatively, front > rear also indicates empty if
front has advanced past rear ).

Pseudocode:

FUNCTION isEmpty():
RETURN front == -1 OR front > rear

Code (Python):

# ... (Queue class definition from above) ...

# Already defined in enqueue/dequeue, but for completeness:
def isEmpty(self):
    return self.front == -1 or self.front > self.rear

5.5. isFull()

Problem statement: Check if the queue has reached its maximum


capacity.

Step-by-step algorithm:
1. Return True if rear is equal to MAX_SIZE - 1 , otherwise return
False .

Pseudocode:

FUNCTION isFull():
RETURN rear == MAX_SIZE - 1

Code (Python):

# ... (Queue class definition from above) ...

# Already defined in enqueue, but for completeness:
def isFull(self):
    return self.rear == self.capacity - 1

6. Complexity Analysis
Time complexity
enqueue(): O(1) (Constant time) - Adding an element involves a few
pointer manipulations and an array assignment, regardless of
queue size.

dequeue(): O(1) (Constant time) - Removing an element also


involves a few pointer manipulations and an array access.

peek(): O(1) (Constant time) - Directly accessing an element by


index.

isEmpty(): O(1) (Constant time) - Simple comparison of pointers.

isFull(): O(1) (Constant time) - Simple comparison of pointers.

Best, Worst, Average: For array implementations, all these


operations consistently perform in O(1) time. There's no significant
difference between best, worst, and average cases for these basic
operations.

Space complexity
O(N) where N is the maximum capacity of the queue. The array
occupies memory proportional to its declared size, irrespective of
how many elements are currently stored.

Comparison with alternatives


Linked List Implementation (a sketch follows this comparison):
Time Complexity: Also O(1) for enqueue, dequeue, peek,
isEmpty.

Space Complexity: O(N) where N is the number of elements


actually stored. This is more flexible as it only uses memory for
the elements present, growing or shrinking dynamically.

Advantages: Dynamic size, no risk of "full" unless memory runs


out. No space wastage like linear array queues.

Disadvantages: Requires more memory per element (due to
storing pointers), slightly more complex implementation due to
pointer management, potentially worse cache performance due
to non-contiguous memory.

Dynamic Array / ArrayList Implementation:


Similar to array, but can resize. enqueue typically O(1)
amortized (O(N) in worst case for resizing), dequeue O(1)
amortized (O(N) if shifting is involved, but usually just pointer
movement). This offers a balance between fixed size and
dynamic growth.
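
In practice, Python programs often reach for the standard library instead of a hand-rolled dynamic array: collections.deque gives O(1) appends at the rear and O(1) pops at the front with no fixed capacity. A brief illustrative sketch (not part of the implementation above):

from collections import deque

q = deque()            # dynamically sized FIFO queue
q.append(10)           # enqueue: O(1)
q.append(20)
q.append(30)
print(q[0])            # peek at the front -> 10
print(q.popleft())     # dequeue: O(1) -> 10
print(len(q) == 0)     # isEmpty check -> False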

7. Examples
Let's use a Queue object with MAX_SIZE = 3 .

# Assuming the Queue class defined above

my_queue = Queue(3)  # Capacity is 3

print("--- Initial State ---")
print(f"Is empty? {my_queue.isEmpty()}")   # Output: Is empty? True
print(f"Is full? {my_queue.isFull()}")     # Output: Is full? False
print(f"Front: {my_queue.front}, Rear: {my_queue.rear}")  # Output: Front: -1, Rear: -1

print("\n--- Enqueue Operations ---")
my_queue.enqueue(10)  # Output: Enqueued: 10
print(f"Queue after enqueue(10): arr={my_queue.arr}, front={my_queue.front}, rear={my_queue.rear}")
# Output: Queue after enqueue(10): arr=[10, None, None], front=0, rear=0

my_queue.enqueue(20)  # Output: Enqueued: 20
print(f"Queue after enqueue(20): arr={my_queue.arr}, front={my_queue.front}, rear={my_queue.rear}")
# Output: Queue after enqueue(20): arr=[10, 20, None], front=0, rear=1

my_queue.enqueue(30)  # Output: Enqueued: 30
print(f"Queue after enqueue(30): arr={my_queue.arr}, front={my_queue.front}, rear={my_queue.rear}")
# Output: Queue after enqueue(30): arr=[10, 20, 30], front=0, rear=2

print("\n--- Check Full and Overflow ---")
print(f"Is full? {my_queue.isFull()}")  # Output: Is full? True
my_queue.enqueue(40)  # Output: Queue Overflow: Cannot enqueue item 40
print(f"Queue after enqueue(40): arr={my_queue.arr}, front={my_queue.front}, rear={my_queue.rear}")
# Output: Queue after enqueue(40): arr=[10, 20, 30], front=0, rear=2

print("\n--- Dequeue Operations ---")
print(f"Peek: {my_queue.peek()}")        # Output: Peek: 10
dequeued_item = my_queue.dequeue()       # Output: Dequeued: 10
print(f"Queue after dequeue(): arr={my_queue.arr}, front={my_queue.front}, rear={my_queue.rear}")
# Output: Queue after dequeue(): arr=[10, 20, 30], front=1, rear=2 (note: 10 is still in the array but no longer part of the queue)

print(f"Peek: {my_queue.peek()}")        # Output: Peek: 20
dequeued_item = my_queue.dequeue()       # Output: Dequeued: 20
print(f"Queue after dequeue(): arr={my_queue.arr}, front={my_queue.front}, rear={my_queue.rear}")
# Output: Queue after dequeue(): arr=[10, 20, 30], front=2, rear=2

print(f"Peek: {my_queue.peek()}")        # Output: Peek: 30
dequeued_item = my_queue.dequeue()       # Output: Dequeued: 30
print(f"Queue after dequeue(): arr={my_queue.arr}, front={my_queue.front}, rear={my_queue.rear}")
# Output: Queue after dequeue(): arr=[10, 20, 30], front=-1, rear=-1 (queue is now empty)

print("\n--- Check Empty and Underflow ---")
print(f"Is empty? {my_queue.isEmpty()}")  # Output: Is empty? True
dequeued_item = my_queue.dequeue()        # Output: Queue Underflow: Cannot dequeue from empty queue
print(f"Queue after dequeue(): arr={my_queue.arr}, front={my_queue.front}, rear={my_queue.rear}")
# Output: Queue after dequeue(): arr=[10, 20, 30], front=-1, rear=-1

8. Common Mistakes & Edge Cases


Where students go wrong
Off-by-one errors: Incorrectly calculating MAX_SIZE - 1 for rear
or capacity leading to IndexError .

Incorrect isEmpty() logic: Forgetting to handle the front > rear
case, which happens after all elements have been dequeued in a
linear queue, but front and rear haven't been reset to -1.

Incorrect isFull() logic: In linear queues, the queue can be


"logically full" even if arr[0] is empty, if rear has reached
MAX_SIZE - 1 . Students might incorrectly assume available space
at the beginning means it's not full.

Not resetting front and rear : When the last element is


dequeued, both front and rear should ideally be reset to -1 to
properly signify an empty queue, otherwise subsequent enqueue
operations might behave unexpectedly (e.g., setting front to 0 but
rear still at its previous value).

Not considering front == -1 for first enqueue: The first


enqueue needs to initialize front to 0, not just increment rear .

Special cases
Empty Queue: What happens when dequeue() or peek() is
called on an empty queue? Should raise an error or return a special
value.

Full Queue: What happens when enqueue() is called on a full


queue? Should raise an error or print a message.

Queue with a single element: When adding the first element,


both front and rear become 0. When removing the last element,
front and rear should reset to -1.

Linear Queue's "False Full": This is the most common edge case
for simple array implementations. The queue reaches rear ==
MAX_SIZE - 1 , but front might be at an index like 3, meaning
arr[0], arr[1], arr[2] are empty. The queue is technically full, but
there's unused space. This leads to the concept of a Circular
Queue.
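
As a quick illustration of this "false full" state, here is a hypothetical run against the array-based Queue sketched earlier (MAX_SIZE = 3):

q = Queue(3)
q.enqueue(10); q.enqueue(20); q.enqueue(30)
q.dequeue()            # front moves to 1; arr[0] is now unused
print(q.isFull())      # True, because rear == MAX_SIZE - 1 even though arr[0] is free
q.enqueue(40)          # rejected with "Queue Overflow" despite the free slot at index 0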

9. Real-Life Applications
Operating Systems:

CPU Scheduling: Processes waiting to be executed by the CPU are
often kept in a ready queue, where they are scheduled based on
various policies (e.g., First-Come, First-Served).

I/O Buffers: Data read from or written to peripheral devices (like


disk drives, keyboards) is buffered using queues to handle speed
differences.

Networking:
Packet Handling: Routers use queues to manage incoming and
outgoing network packets, ensuring they are processed in order.

Message Queues: Used in distributed systems to facilitate


asynchronous communication between different services.

Printers:
Printer Spooling: Documents sent to a printer are placed in a
queue, and the printer processes them one by one.

Simulation:
Queues are fundamental in simulating real-world scenarios like
customer service lines, traffic flow, or manufacturing assembly
lines to analyze waiting times and system performance.

Web Servers:
Handling client requests: A web server might put incoming
requests into a queue to process them sequentially, preventing
overload.

10. Practice Problems


Easy
1. Implement a basic linear queue using an array. Include enqueue ,
dequeue , peek , isEmpty , isFull methods.

2. Write a function display_queue() that prints all elements currently


in the queue from front to rear .

Medium

1. Modify the array-based queue to be a circular queue. Explain how
front and rear wrap around, and how isFull() and isEmpty()
conditions change.

2. Implement a queue that stores (element, priority) pairs. When


dequeuing, the element with the highest priority is removed (this is
a Priority Queue, but an array-based implementation will be O(N)
for dequeue unless sorted).

Hard
1. Implement a "queue with min/max" functionality, where getMin()
and getMax() operations (to find the minimum/maximum
element in the queue) should also be O(1). (Hint: This typically
involves using auxiliary data structures alongside the main queue).

2. Given a stream of integers, implement a FixedSizedQueue and


write a function that finds the average of the last K elements
using this queue efficiently.

11. Summary / Key Points


Definition: A queue is a FIFO (First-In, First-Out) data structure.

Array Implementation: Uses a fixed-size array and two pointers,


front (for dequeue) and rear (for enqueue).

Initialization: front = -1 , rear = -1 .

Enqueue: Increments rear , adds element at arr[rear] . Handles


front initialization if queue was empty. Checks for isFull .

Dequeue: Reads arr[front] , then increments front . Resets front and

rear to -1 if the last element is removed. Checks for isEmpty .

peek() : Returns arr[front] without modifying the queue.

isEmpty() : front == -1 or front > rear .

isFull() (Linear): rear == MAX_SIZE - 1 .

Advantages: Simple, O(1) time complexity for all basic operations,
good cache performance.

Disadvantages: Fixed size, potential for space wastage in linear arrays


(solved by Circular Queues).

Applications: CPU scheduling, printer queues, buffering, BFS.

Highlighted Formulas and Shortcuts:

front , rear pointers for managing queue.

front == -1 (or front > rear ) for isEmpty() .

rear == MAX_SIZE - 1 for isFull() in a linear queue.

All basic operations ( enqueue , dequeue , peek , isEmpty ,


isFull ) are O(1) time complexity.

Space complexity is O(N) where N is the MAX_SIZE .

Remember the "false full" problem of linear array queues, leading


to Circular Queue concept.

Circular Queue
Overcoming limitations of linear array queues by using a circular array,
efficient use of space, and implementation details for enqueue and dequeue.

1. Introduction
Definition A Circular Queue is a linear data structure in which the
operations are performed based on the FIFO (First In, First Out)
principle, and the last position is connected back to the first position,
forming a circle. It is also known as a Ring Buffer. In a circular queue,
once an element is dequeued, the space it occupied becomes
available for reuse, and new elements can be enqueued there by
"wrapping around" the array.

Why it is needed In a simple (linear) queue implemented using an
array, once elements are dequeued, the front pointer moves
forward, but the space at the beginning of the array remains empty
and cannot be reused until the queue is completely emptied or
elements are shifted. This leads to inefficient memory utilization and a
potential "queue full" error even when there are empty spaces at the
front.

Real-life Analogy: Imagine a circular bus route with a fixed number of stops. When the bus reaches the last stop on the route, it does not terminate; it simply loops back to the first stop. A circular queue behaves the same way: once the rear pointer reaches the end of the array, it wraps around to the start, so slots freed at the front by earlier dequeues are continuously reused. New elements can keep arriving as long as any slot is available, regardless of whether that slot sits at the "beginning" or the "end" of the physical array.

Applications

CPU Scheduling: Used in operating systems for managing


processes in a Round Robin scheduling algorithm.

Traffic Light Systems: Helps in managing the sequence of traffic


lights.

Buffer Management: Used in multimedia applications and data


streaming for buffering data.

Resource Sharing: Allocating resources among multiple users or


processes.

Keyboard Buffers: Storing keystrokes until the CPU can process


them.

2. Theory Explanation

In-depth explanation A circular queue is typically implemented using
a fixed-size array and two pointers: front and rear .

front : Points to the index of the first element in the queue.

rear : Points to the index of the most recently inserted element; the next element is placed in the slot after it.

The key to circular queues is the use of modular arithmetic. When


front or rear reaches the end of the array, it "wraps around" to the
beginning (index 0). This is achieved by using the modulo operator ( %
MAX_SIZE ).

Initially, both front and rear are often set to -1 or 0 depending


on the convention. A common convention is:

front = -1

rear = -1

When an element is enqueued for the first time:

front becomes 0 .

rear becomes 0 .

When subsequent elements are enqueued:

rear is incremented and updated using rear = (rear + 1) %


MAX_SIZE .

When elements are dequeued:

front is incremented and updated using front = (front + 1) %


MAX_SIZE .

Conditions for Empty and Full Queue:

Empty Queue: The queue is empty when front == -1 . (In this implementation, the moment the last remaining element is dequeued, i.e., when front == rear just before removal, both pointers are reset to -1 , so front == -1 remains a robust check for emptiness.)

Full Queue: The queue is full when the next position for rear is
front . This is checked as (rear + 1) % MAX_SIZE == front . With the
front == -1 convention for emptiness used here, this test only becomes
true once every slot of the array is occupied, so the full capacity of
MAX_SIZE elements is usable. (In the alternative convention where rear
points to the next free slot and emptiness is tested with front == rear ,
the same full test forces one slot to stay permanently empty; otherwise
front == rear would be ambiguous between "empty" and "full".)

Let's refine the empty/full conditions for clarity:

Initialization: front = -1 , rear = -1

IsEmpty: front == -1

IsFull: (rear + 1) % MAX_SIZE == front

Only one element: front == rear and front != -1

Diagrams/flowcharts (described in text)

Let's consider a circular queue of MAX_SIZE = 5 . The indices are 0, 1,


2, 3, 4.

1. Initial State: Queue: [ _ , _ , _ , _ , _ ] front = -1 , rear = -1


(IsEmpty = true, IsFull = false)

2. Enqueue(10): front becomes 0 , rear becomes 0 . Queue:


[10, _ , _ , _ , _ ] front = 0 , rear = 0 (IsEmpty = false, IsFull =
false)

3. Enqueue(20): rear = (0 + 1) % 5 = 1 . Queue: [10, 20, _ , _ , _ ]


front = 0 , rear = 1

4. Enqueue(30): rear = (1 + 1) % 5 = 2 . Queue: [10, 20, 30, _ , _ ]


front = 0 , rear = 2

5. Dequeue(): (Removes 10) front = (0 + 1) % 5 = 1 . Queue: [_ , 20,


30, _ , _ ] front = 1 , rear = 2

6. Enqueue(40): rear = (2 + 1) % 5 = 3 . Queue: [_ , 20, 30, 40, _ ]


front = 1 , rear = 3

7. Enqueue(50): rear = (3 + 1) % 5 = 4 . Queue: [_ , 20, 30, 40, 50]
front = 1 , rear = 4

8. Enqueue(60): isFull check: (rear + 1) % 5 == (4 + 1) % 5 == 0 . Current
front = 1 , and 0 != 1 , so the queue is not full. rear = (4 + 1) % 5 = 0
and Q[0] = 60 . Queue: [60, 20, 30, 40, 50] front = 1 , rear = 0 . All
five slots are now occupied, and a further isFull check gives (0 + 1) % 5
== 1 , which equals front , so the queue is now full.

This demonstrates the wrap-around behaviour and the full condition. Note
that with this convention ( front = -1 for emptiness, rear pointing to the
last inserted element) the array really can hold all MAX_SIZE elements;
the frequently quoted rule that "one slot must stay empty" applies to the
other convention, in which rear points to the next free slot and emptiness
is tested with front == rear .

Step-by-step breakdown (Covered in operations section below)

3. Properties & Characteristics


Key properties

Fixed Size: Typically implemented with a fixed-size array, meaning


the maximum number of elements it can hold is predetermined.

FIFO Principle: Elements are processed in the order they were


added (First In, First Out).

Efficient Memory Utilization: Overcomes the problem of unused


space at the front of the array that occurs in a linear queue, by
allowing elements to wrap around.

Modular Arithmetic: Relies on the modulo operator for pointer


(front and rear) movement and wrap-around logic.

Advantages & disadvantages

Advantages:
Memory Efficiency: Prevents space wastage by reusing empty
slots created by dequeued elements.

Constant Time Operations: Enqueue, Dequeue, IsFull, IsEmpty
operations generally take O(1) time complexity.

Simple Implementation: Relatively straightforward to


implement using an array and two pointers.

Disadvantages:
Fixed Size: The maximum capacity must be defined at the time
of creation, making it inflexible for dynamic growth. If the queue
frequently fills up, operations will be rejected until space
becomes available.

Complexity in Logic: The logic for checking full/empty


conditions and handling wrap-around is slightly more complex
than a simple linear queue.

Sentinel Slot (convention-dependent): Implementations that test emptiness with front == rear must leave one slot permanently empty to distinguish a full queue from an empty one, reducing usable capacity by one. The front = -1 convention used in these notes avoids this, at the cost of extra special-case handling for the first enqueue and the last dequeue.

4. Mathematical / Logical Representation


Important formulas Let MAX_SIZE be the declared size of the
circular queue array.

Next rear position: (rear + 1) % MAX_SIZE

Next front position: (front + 1) % MAX_SIZE

Condition for Empty Queue (initial/after all dequeued): front


== -1

Condition for Full Queue: (rear + 1) % MAX_SIZE == front

Number of elements in queue (if not empty): (rear - front +


MAX_SIZE) % MAX_SIZE + 1 (Note: This formula works if front and
rear point to valid elements. If front = -1 (empty), count is 0. If
front == rear and not -1 , it implies 1 element.) A simpler way to
count elements for this specific front = -1 initialization: If front ==
-1 , size is 0. Else if front <= rear , size is rear - front + 1 . Else
( front > rear ), size is MAX_SIZE - front + rear + 1 .

Derivations (if any) The modulo operator X % N ensures that the
index X wraps around to 0 when it reaches N . For example, if
N=5 , then 4+1 = 5 , and 5 % 5 = 0 . This is fundamental to making
the array "circular."
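
A small illustrative sketch (Python) of how the modulo arithmetic and the size formula above behave, assuming MAX_SIZE = 5 and the front = -1 emptiness convention:

MAX_SIZE = 5

def next_index(i):
    # Wrap-around: index 4 is followed by index 0 when MAX_SIZE is 5
    return (i + 1) % MAX_SIZE

def size(front, rear):
    # Number of stored elements, using the formula from this section
    if front == -1:
        return 0
    return (rear - front + MAX_SIZE) % MAX_SIZE + 1

print(next_index(3))   # 4
print(next_index(4))   # 0  (wraps around)
print(size(-1, -1))    # 0  (empty queue)
print(size(0, 4))      # 5  (indices 0..4 all occupied)
print(size(1, 0))      # 5  (wrapped: indices 1, 2, 3, 4, 0 occupied)
print(size(2, 2))      # 1  (single element)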

5. Operations / Algorithms
Let's assume a circular queue of fixed MAX_SIZE (e.g., 5). The array is
Q of size MAX_SIZE . front and rear are integer variables.

5.1 isEmpty()

Problem statement To check if the circular queue currently contains


no elements.

Step-by-step algorithm

1. If front is equal to -1 , the queue is empty.

2. Otherwise, the queue is not empty.

Dry run example Initial state: front = -1 , rear = -1

1. isEmpty() check: front == -1 is true. Returns true .

After Enqueue(10): front = 0 , rear = 0

1. isEmpty() check: front == -1 is false. Returns false .

Pseudocode

function isEmpty(queue)
    if queue.front == -1
        return true
    else
        return false

Code (C++)

bool isEmpty() {
return front == -1;
}

5.2 isFull()

Problem statement To check if the circular queue has reached its


maximum capacity and cannot accept any more elements.

Step-by-step algorithm

1. Calculate the next potential rear position: (rear + 1) %


MAX_SIZE .

2. If this next rear position is equal to front , then the queue is full.

Dry run example MAX_SIZE = 5 Queue: [10, 20, 30, 40, 50] front =
0 , rear = 4

1. isFull() check: next_rear = (rear + 1) % MAX_SIZE = (4 + 1) % 5 = 5


% 5 = 0 . Is next_rear == front ? 0 == 0 is true. Returns true .

Queue: [_ , 20, 30, 40, 50] front = 1 , rear = 4

1. isFull() check: next_rear = (rear + 1) % MAX_SIZE = (4 + 1) % 5 = 0 .


Is next_rear == front ? 0 == 1 is false. Returns false .

Pseudocode

function isFull(queue)
    if ( (queue.rear + 1) % queue.MAX_SIZE ) == queue.front
        return true
    else
        return false

Code (C++)

bool isFull() {
    return ((rear + 1) % MAX_SIZE == front);
}

5.3 enqueue(int element)

Problem statement To add an element to the rear of the circular


queue.

Step-by-step algorithm

1. Check if the queue is full using isFull() . If true, print an error and
return.

2. If the queue is initially empty ( front == -1 ): a. Set front = 0 .

3. Update rear : rear = (rear + 1) % MAX_SIZE .

4. Insert the element at Q[rear] .

Dry run example MAX_SIZE = 5 Initial state: front = -1 , rear = -1 ,


Q = [_,_,_,_,_]

1. Enqueue(10): isFull() is false. front == -1 is true. Set front = 0 .


rear = (-1 + 1) % 5 = 0 . Q[0] = 10 . State: front = 0 , rear = 0 , Q
= [10,_,_,_,_]

2. Enqueue(20): isFull() is false. front == -1 is false. rear = (0 + 1)


% 5 = 1 . Q[1] = 20 . State: front = 0 , rear = 1 , Q = [10,20,_,_,_]

3. Enqueue(30): isFull() is false. front == -1 is false. rear = (1 + 1)


% 5 = 2 . Q[2] = 30 . State: front = 0 , rear = 2 , Q =
[10,20,30,_,_]

Pseudocode

function enqueue(queue, element)
    if isFull(queue)
        print "Queue is full. Cannot enqueue."
        return
    if isEmpty(queue)                // If adding first element
        queue.front = 0
    queue.rear = (queue.rear + 1) % queue.MAX_SIZE
    queue.arr[queue.rear] = element
    print element, " enqueued."

Code (C++)

void enqueue(int element) {
    if (isFull()) {
        std::cout << "Queue is full. Cannot enqueue " << element << std::endl;
        return;
    }
    if (isEmpty()) { // First element
        front = 0;
    }
    rear = (rear + 1) % MAX_SIZE;
    arr[rear] = element;
    std::cout << element << " enqueued." << std::endl;
}

5.4 dequeue()

Problem statement To remove and return the element from the


front of the circular queue.

Step-by-step algorithm

1. Check if the queue is empty using isEmpty() . If true, print an error


and return a sentinel value (e.g., -1 or throw exception).

2. Retrieve the element at Q[front] .

3. If front and rear are pointing to the same element (meaning it's
the last element in the queue): a. Set front = -1 . b. Set rear = -1 .

4. Else (if there are more elements after the current one): a. Update
front : front = (front + 1) % MAX_SIZE .

5. Return the retrieved element.

Dry run example MAX_SIZE = 5 Start from previous enqueue
operations: front = 0 , rear = 2 , Q = [10,20,30,_,_]

1. Dequeue(): isEmpty() is false. element = Q[front] = Q[0] = 10 .


front == rear (0 == 2) is false. front = (0 + 1) % 5 = 1 . State: front
= 1 , rear = 2 , Q = [_,20,30,_,_] . Returns 10 .

2. Dequeue(): isEmpty() is false. element = Q[front] = Q[1] = 20 .


front == rear (1 == 2) is false. front = (1 + 1) % 5 = 2 . State: front
= 2 , rear = 2 , Q = [_,_,30,_,_] . Returns 20 .

3. Dequeue(): (Last element) isEmpty() is false. element = Q[front]


= Q[2] = 30 . front == rear (2 == 2) is true. front = -1 , rear = -1 .
State: front = -1 , rear = -1 , Q = [_,_,_,_,_] . Returns 30 .

4. Dequeue(): (Attempt on empty queue) isEmpty() is true. Prints


error. Returns sentinel.

Pseudocode

function dequeue(queue)
    if isEmpty(queue)
        print "Queue is empty. Cannot dequeue."
        return ERROR_VALUE           // Or throw exception

    element = queue.arr[queue.front]

    if queue.front == queue.rear     // Last element in queue
        queue.front = -1
        queue.rear = -1
    else
        queue.front = (queue.front + 1) % queue.MAX_SIZE

    print element, " dequeued."
    return element

Code (C++)

int dequeue() {
if (isEmpty()) {
std::cout << "Queue is empty. Cannot dequeue." << std::endl;
return -1; // Or throw an exception
}
int element = arr[front];
if (front == rear) { // Last element
front = -1;
rear = -1;
} else {
front = (front + 1) % MAX_SIZE;
}
std::cout << element << " dequeued." << std::endl;
return element;
}

6. Complexity Analysis
Time complexity

Enqueue: O(1) - Involves a few arithmetic operations and


assignment, irrespective of queue size.

Dequeue: O(1) - Similar to enqueue, constant time operations.

isEmpty: O(1) - Simple comparison.

isFull: O(1) - Simple comparison and modulo operation.

Peek (access front element): O(1) - Directly access arr[front] .

Space complexity

O(N): Where N is the MAX_SIZE of the circular queue. It requires a


fixed-size array to store the elements.

Comparison with alternatives

Linear Queue (Array-based):

Time: Enqueue, Dequeue, IsEmpty, IsFull are O(1) for typical
implementations. However, if 'shifting' elements is used to
resolve space wastage, dequeue can be O(N).

Space: O(N).

Disadvantage: Suffers from "space wastage" where empty slots


at the front of the array cannot be reused until the entire queue
is reset, potentially leading to premature "queue full" errors.

Queue using Linked List:


Time: Enqueue (add at rear) and Dequeue (remove from front)
are O(1).

Space: O(N) for N elements, but with additional overhead for


storing pointers in each node.

Advantage: Dynamically resizable; no fixed size limit.

Disadvantage: Higher memory overhead due to pointers;


slightly slower access due to non-contiguous memory allocation
compared to array-based.

Circular queues effectively combine the O(1) time complexity and


contiguous memory benefits of array-based queues while overcoming
the space wastage problem of linear array queues.

7. Examples
Solved Problem 1: Basic Operations

Let MAX_SIZE = 4 for our circular queue, using the conventions from Section 5 ( front = -1 while empty, rear pointing to the most recently inserted element). Initial state: front = -1 , rear = -1 , Q = [_,_,_,_]

1. enqueue(1) : queue was empty, so front = 0 , rear = 0 . Q = [1,_,_,_]

2. enqueue(2) : rear = (0+1)%4 = 1 . Q = [1,2,_,_]

3. enqueue(3) : rear = (1+1)%4 = 2 . Q = [1,2,3,_]

4. dequeue() : returns 1 . front = (0+1)%4 = 1 . Q = [_,2,3,_]

5. enqueue(4) : isFull check: (2+1)%4 = 3 , front = 1 , 3 != 1 , not full. rear = 3 . Q = [_,2,3,4]

6. enqueue(5) : isFull check: (3+1)%4 = 0 , front = 1 , 0 != 1 , not full. rear = (3+1)%4 = 0 , and 5 wraps around into the slot freed by the earlier dequeue. Q = [5,2,3,4] , front = 1 , rear = 0 . All four slots are occupied, and isFull is now true: (0+1)%4 = 1 == front .

7. enqueue(6) : isFull() is true, so the operation is rejected with "Queue is full."

A note on conventions. Different textbooks define the empty/full tests differently, and mixing them up is a classic source of confusion:

Condition 1 ( front = -1 convention, used in Section 5): isEmpty: front == -1 , isFull: (rear + 1) % MAX_SIZE == front . Because emptiness has its own sentinel value, the full test only triggers once every one of the MAX_SIZE slots is occupied, so no slot is wasted.

Condition 2 ( count variable): isEmpty: count == 0 , isFull: count == MAX_SIZE . Adds one extra field but is the easiest to reason about, and also uses all MAX_SIZE slots.

Condition 3 (distinct front / rear , both initialised to 0, with rear pointing to the next free slot): isEmpty: front == rear , isFull: (rear + 1) % MAX_SIZE == front . Here one slot must stay permanently empty so that "full" and "empty" can be distinguished, giving a usable capacity of MAX_SIZE - 1 .

Solved Problem 2: Using an explicit count

Because the count -based logic (Condition 2) is the least ambiguous for dry runs, let us trace it as well; a small code sketch of this variant follows the trace. Let MAX_SIZE = 4 (the queue can hold 4 elements). Initial state: front = 0 , rear = -1 , count = 0 , Q = [_,_,_,_]

1. enqueue(10) : isFull() is false ( count = 0 ). rear = (-1 + 1) % 4 = 0 , Q[0] = 10 , count = 1 . State: front = 0 , rear = 0 , Q = [10,_,_,_]

2. enqueue(20) : rear = (0 + 1) % 4 = 1 , Q[1] = 20 , count = 2 . Q = [10,20,_,_]

3. enqueue(30) : rear = (1 + 1) % 4 = 2 , Q[2] = 30 , count = 3 . Q = [10,20,30,_]

4. enqueue(40) : rear = (2 + 1) % 4 = 3 , Q[3] = 40 , count = 4 . Q = [10,20,30,40]

5. enqueue(50) : isFull() is true ( count == 4 ). Print "Queue is full."

6. dequeue() : isEmpty() is false. element = Q[front] = Q[0] = 10 . front = (0 + 1) % 4 = 1 , count = 3 . Q = [_,20,30,40] . Returns 10 .

7. enqueue(50) : isFull() is false ( count = 3 ). rear = (3 + 1) % 4 = 0 , Q[0] = 50 , count = 4 . Q = [50,20,30,40] ; 50 reuses the slot freed by the dequeue.
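
A minimal sketch of this count -based variant (Python, illustrative; the class name and driver values are chosen just for this example):

class CountingCircularQueue:
    def __init__(self, max_size):
        self.MAX_SIZE = max_size
        self.arr = [None] * max_size
        self.front = 0
        self.rear = -1
        self.count = 0              # number of elements currently stored

    def isEmpty(self):
        return self.count == 0

    def isFull(self):
        return self.count == self.MAX_SIZE

    def enqueue(self, item):
        if self.isFull():
            print("Queue is full.")
            return
        self.rear = (self.rear + 1) % self.MAX_SIZE
        self.arr[self.rear] = item
        self.count += 1

    def dequeue(self):
        if self.isEmpty():
            print("Queue is empty.")
            return None
        item = self.arr[self.front]
        self.front = (self.front + 1) % self.MAX_SIZE
        self.count -= 1
        return item

# Reproduces the trace above
q = CountingCircularQueue(4)
for x in (10, 20, 30, 40):
    q.enqueue(x)
q.enqueue(50)          # rejected: Queue is full.
print(q.dequeue())     # 10
q.enqueue(50)          # wraps around into index 0
print(q.arr)           # [50, 20, 30, 40]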

8. Common Mistakes & Edge Cases


Where students go wrong

Incorrect Full/Empty Conditions: This is the most frequent


mistake. Forgetting the (rear + 1) % MAX_SIZE == front condition
or getting it wrong, or not initializing front and rear correctly.
Using a count variable can simplify this logic.

Off-by-one Errors: Incorrectly calculating MAX_SIZE , front , or
rear indices leading to buffer overflows or incorrect wrap-around
behavior.

Forgetting to Reset Pointers: When the last element is dequeued,


front and rear should typically be reset to -1 (or initial empty
state) to correctly signify an empty queue. If not reset, front ==
rear might incorrectly be interpreted as full after subsequent
operations.

Not Using Modulo Operator: Incrementing front or rear


without applying % MAX_SIZE will result in linear behavior and
index out-of-bounds errors.

Special cases

Queue with MAX_SIZE = 1 : A queue with a single slot is tricky.


front = -1 , rear = -1 . Enqueue(X) -> front=0, rear=0 .

isFull becomes (0+1)%1 == 0 , which is 0 == 0 . So it is


immediately full. Correct.

Dequeue -> element=Q[0] . front==rear is true. front=-1,


rear=-1 .

Empty queue to full queue and back: Repeated enqueue until


full, then dequeue all elements. The pointers should correctly cycle
and reset.

Enqueue/Dequeue across wrap-around: Operations that cause


rear or front to move from MAX_SIZE-1 to 0 are critical test
cases.

9. Real-Life Applications
Operating Systems - CPU Scheduling (Round Robin): Processes are
put into a circular queue. The CPU processes a task from the front for
a fixed time slice. If the task is not completed, it is moved to the rear of
the queue to wait for its next turn. This ensures fair allocation of CPU
time.
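
As a rough illustration (Python; the process names and times are hypothetical), Round Robin can be modelled by repeatedly taking a process from the front of the ready queue, running it for one time slice, and re-enqueuing it at the rear if it still has work left:

from collections import deque

# Each entry is (process_name, remaining_time); time_slice is the quantum.
ready_queue = deque([("P1", 5), ("P2", 2), ("P3", 4)])
time_slice = 2

while ready_queue:
    name, remaining = ready_queue.popleft()       # process at the front gets the CPU
    run = min(time_slice, remaining)
    remaining -= run
    print(f"{name} ran for {run} unit(s), {remaining} remaining")
    if remaining > 0:
        ready_queue.append((name, remaining))     # back to the rear for its next turn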

Traffic Management Systems: Coordinating traffic light changes at
intersections to optimize traffic flow. The sequence of lights (e.g., Red,
Green, Yellow) can be thought of as a circular queue.

Printer Spoolers: When multiple print jobs are sent to a single


printer, they are placed in a circular queue. The printer processes
them one by one, and new jobs are added to the end of the queue.

Data Buffering: In streaming media (e.g., video or audio playback), a


circular buffer is used to temporarily store data packets received from
the network. As packets are consumed for playback, new packets are
loaded into the buffer, reusing the space.

Multiplayer Gaming: Managing player turns in a game where players


take turns in a fixed sequence.

10. Practice Problems


Easy

1. Implement a peek() function for the circular queue that returns


the front element without removing it.

2. Trace the front and rear pointers for a circular queue of
MAX_SIZE = 5 (using the front = -1 / (rear+1)%MAX_SIZE == front
conventions from Section 5) for the following sequence of operations:
enqueue(A), enqueue(B), dequeue(), enqueue(C), enqueue(D),
enqueue(E), dequeue(), enqueue(F) . What is the state of the queue
after these operations?

Medium

1. Modify the circular queue implementation to return the current


number of elements ( size() ) in the queue in O(1) time. (Hint: Use a
count variable or implement size() using front and rear
arithmetic based on your chosen full/empty logic).

2. Write a program to reverse the elements of a circular queue. You


may use a temporary linear queue or a stack.

Hard

1. Implement a doubly ended circular queue (Circular Deque) where
elements can be inserted and deleted from both front and rear .
Pay close attention to all edge cases and full/empty conditions.

2. Consider a scenario where a circular queue stores requests. Some


requests might be "high priority" and need to be processed quickly.
Design an algorithm to process high-priority requests before
regular ones, while maintaining the circular nature for regular
requests. (This might involve two circular queues or a more
complex logic).

11. Summary / Key Points


A Circular Queue is an array-based FIFO data structure that efficiently
reuses memory by connecting the end of the array to its beginning,
forming a logical circle.

It solves the space wastage problem of linear queues by allowing


rear to wrap around and occupy previously dequeued slots at the
front.

Modular arithmetic ( % MAX_SIZE ) is fundamental for pointer


movement and wrap-around logic.

Key pointers are front (points to the first element) and rear (points to the most recently inserted element).

Common isEmpty and isFull conditions:


isEmpty() : front == -1 (assuming initial state front = -1, rear =
-1 ).

isFull() : (rear + 1) % MAX_SIZE == front . (With front = -1 marking
emptiness, this test only becomes true once all MAX_SIZE slots are
occupied; it is the front == rear emptiness convention that sacrifices
one slot and holds only MAX_SIZE - 1 elements.)

Alternative (using a count variable): isEmpty: count == 0 ,
isFull: count == MAX_SIZE . This is the simplest logic to get right and
also utilizes all MAX_SIZE slots.

All primary operations ( enqueue , dequeue , isFull , isEmpty ) have


a time complexity of O(1).

The space complexity is O(N), where N is the fixed MAX_SIZE of the
array.

Real-life applications include CPU scheduling (Round Robin), buffer


management, and traffic control systems.

Common pitfalls include incorrect isFull / isEmpty conditions, off-


by-one errors, and not resetting pointers when the queue becomes
empty.

Note: The code snippets in Section 5 are conceptual and assume a
class structure with front , rear , MAX_SIZE , and arr as member
variables. For a complete runnable example, they need to be
encapsulated within a class, as outlined below.

// Example C++ Circular Queue Class Structure (conceptual, not a complete runnable program)

class CircularQueue {
private:
int* arr;
int front;
int rear;
int MAX_SIZE;
// int count; // Could be used for alternative empty/full conditions

public:
CircularQueue(int size) {
MAX_SIZE = size;
arr = new int[MAX_SIZE];
front = -1;
rear = -1;
// count = 0;
}

~CircularQueue() {

delete[] arr;
}

bool isEmpty() {
return front == -1;
// return count == 0; // Alternative
}

bool isFull() {
return ((rear + 1) % MAX_SIZE == front);
// return count == MAX_SIZE; // Alternative
}

void enqueue(int element) {


if (isFull()) {
std::cout << "Queue is full. Cannot enqueue " << element << std::e
return;
}
if (isEmpty()) {
front = 0;
}
rear = (rear + 1) % MAX_SIZE;
arr[rear] = element;
// count++; // For count-based logic
std::cout << element << " enqueued." << std::endl;
}

int dequeue() {
if (isEmpty()) {
std::cout << "Queue is empty. Cannot dequeue." << std::endl;
return -1; // Sentinel value
}
int element = arr[front];
if (front == rear) { // Last element
front = -1;
rear = -1;

} else {
front = (front + 1) % MAX_SIZE;
}
// count--; // For count-based logic
std::cout << element << " dequeued." << std::endl;
return element;
}

int peek() {
if (isEmpty()) {
std::cout << "Queue is empty. No element to peek." << std::endl;
return -1;
}
return arr[front];
}
};

Queue Implementation using


Linked Lists
Implementing queues dynamically using linked lists, advantages over array-
based implementations (no fixed size), and pointer management.

1. Introduction
Definition A Queue is a linear data structure that follows the First-In,
First-Out (FIFO) principle. This means the element that was added first
will be the first one to be removed. When implementing a queue using
a Linked List, each element of the queue is represented as a node in
the linked list. The front (or head ) pointer of the queue points to
the first node (the element to be dequeued), and the rear (or tail )
pointer points to the last node (where new elements are enqueued).

Why it is needed (real-life analogy) Array-based queue
implementations often suffer from limitations such as fixed size
(potential for overflow) or the need to shift elements after dequeueing
(leading to inefficiency). Linked List implementations overcome these
challenges by offering dynamic size and efficient insertion/deletion
operations. Real-life Analogy: Imagine a line of customers at a coffee
shop. The first person to join the line is the first person to be served.
New customers join at the back of the line. If a customer leaves, they
always leave from the front. This perfectly illustrates the FIFO
principle. A linked list naturally models this line, where each customer
is a node, and the front is the customer being served, while the
rear is where new customers join.

Applications Queues are fundamental in various computing


scenarios:

Task Scheduling: Managing tasks in an operating system (e.g., CPU


scheduling, I/O requests).

Print Spooling: Documents waiting to be printed are stored in a


queue.

Message Queues: Used in inter-process communication or


distributed systems to handle messages asynchronously.

Web Servers: Handling client requests in the order they arrive.

Simulations: Modeling real-world queues like customer service


lines.

Breadth-First Search (BFS): A graph traversal algorithm that uses


a queue.

2. Theory Explanation
In-depth explanation A queue implemented using a linked list
maintains two pointers: front and rear .

The front pointer points to the first node of the linked list, which
represents the front of the queue (the element to be removed next).

The rear pointer points to the last node of the linked list, which
represents the rear of the queue (where new elements are added).
Initially, when the queue is empty, both front and rear pointers
are NULL . Each node in the linked list typically contains two parts:
data (the actual element stored) and next (a pointer to the
subsequent node in the list).

Diagrams/flowcharts (described in text) Let's visualize the states


and operations:

1. Empty Queue: [NULL] <-- front [NULL] <-- rear (Both front and
rear point to nothing, indicating an empty queue.)

2. Enqueue Operation (adding element 'A'):

A new node is created with data 'A'.

If the queue was empty ( front was NULL ): The new node
becomes both the front and the rear . [A|NULL] <-- front,
rear

If the queue was not empty: The next pointer of the current
rear node is updated to point to the new node, and the rear
pointer is then updated to point to the new node. Example: the
queue holds [X] -> [Y]; after Enqueue('A') it is [X] -> [Y] -> [A|NULL],
with front still at [X] and rear now at [A].

3. Dequeue Operation (removing an element):

If the queue is empty: An "Underflow" condition occurs (no


element to remove).

If the queue is not empty: The data of the front node is


retrieved. A temporary pointer is used to hold the front node.
The front pointer is moved to the next node ( front = front-
>next ). The node pointed to by the temporary pointer is then
deallocated (freed from memory). If front becomes NULL
after dequeueing (meaning the last element was removed),
then rear must also be set to NULL . Example: the queue holds
[X] -> [Y] -> [Z], with front at [X] and rear at [Z]. Dequeue retrieves X,
and a temporary pointer temp holds the old front node [X].

After front = front->next and deleting temp , the queue is [Y] -> [Z],
with front at [Y] and rear at [Z].

Step-by-step breakdown

Node Structure: Define a structure or class for a node that holds


data and a next pointer. struct Node { int data; Node* next;
Node(int val) : data(val), next(nullptr) {} };

Queue Class/Structure: Define a class for the queue that holds


front and rear pointers. class Queue { Node* front; Node* rear;
// ... constructor, methods ... };

Initialization: In the constructor, set front = nullptr and rear =


nullptr .

3. Properties & Characteristics


Key properties

FIFO (First-In, First-Out): Elements are processed in the order


they are added.

Dynamic Size: The queue can grow or shrink as needed, limited


only by available memory. Unlike array-based queues, there's no
fixed capacity.

Pointer-based: Operations primarily involve manipulating pointers


( front , rear , and next pointers within nodes).

Non-Contiguous Memory: Elements of the queue are not


necessarily stored in adjacent memory locations.

Advantages & disadvantages

Advantages:

Dynamic Size: Automatically adjusts to the number of
elements, avoiding overflow issues of fixed-size arrays.

Efficient Operations: Enqueue and Dequeue operations


typically take constant time (O(1)) because they only involve
updating a few pointers.

No Shifting of Elements: Unlike some array implementations,


elements do not need to be shifted upon dequeueing.

Disadvantages:

Memory Overhead: Each node requires extra memory for the


next pointer in addition to the data itself.

Not Cache-Friendly: Due to non-contiguous memory


allocation, linked lists can lead to more cache misses compared
to arrays, which can impact performance in some scenarios.

Random Access Not Supported: To access an element in the


middle of the queue, one must traverse from the front
pointer, which is an O(N) operation. However, queues
fundamentally don't support random access.

4. Mathematical / Logical Representation


Important logical conditions

Empty Queue: A queue is considered empty if and only if its


front pointer is NULL . When front is NULL , rear should also
be NULL . isEmpty() = (front == NULL)

Non-empty Queue: If front is not NULL , the queue contains at


least one element.

Single Element Queue: If front == rear (and front is not


NULL ), the queue contains exactly one element.

Derivations There are no complex mathematical formulas or


derivations typically associated with the basic implementation of a
queue using linked lists, beyond the logical conditions described

above. The efficiency comes from pointer manipulation rather than
mathematical computation.

5. Operations / Algorithms
Here, we define the core operations for a queue implemented with a
linked list. We'll use C++-like pseudo-code and concepts.

Node Structure (Conceptual C++)

struct Node {
int data;
Node* next;
// Constructor
Node(int val) {
data = val;
next = nullptr;
}
};

class Queue {
private:
Node* front;
Node* rear;
public:
// Constructor
Queue() {
front = nullptr;
rear = nullptr;
}
// Destructor (to release memory)
~Queue() {
while (!isEmpty()) {
dequeue(); // Dequeue all elements to delete nodes
}
}

// Operations
bool isEmpty();
void enqueue(int val);
int dequeue(); // Returns the dequeued value
int peek(); // Returns the front value
};

5.1 IsEmpty Operation

Problem statement Determine if the queue currently contains any


elements.

Step-by-step algorithm

1. Check if the front pointer is NULL .

2. If front is NULL , the queue is empty, return true .

3. Otherwise, the queue is not empty, return false .

Dry run example

Initial Queue: front = NULL , rear = NULL


isEmpty() -> front is NULL -> Returns true .

Queue after enqueue(10) : front points to Node(10), rear


points to Node(10)
isEmpty() -> front is not NULL -> Returns false .

Pseudocode

FUNCTION isEmpty():
    IF front IS NULL THEN
        RETURN TRUE
    ELSE
        RETURN FALSE

Code (C++-like logic)

bool Queue::isEmpty() {
    return front == nullptr;
}

5.2 Enqueue Operation

Problem statement Add a new element to the rear of the queue.

Step-by-step algorithm

1. Create a new Node with the given value .

2. If the queue is isEmpty() (i.e., front is NULL ): a. Set both front
and rear pointers to point to the newNode .

3. Else (the queue is not empty): a. Set the next pointer of the
current rear node to point to the newNode . b. Update the
rear pointer to point to the newNode .

Dry run example

Initial Queue: front = NULL , rear = NULL


enqueue(10) :
Create Node(10). Queue is empty.

front points to Node(10).

rear points to Node(10).

Queue state: Node(10) <-- front, rear

Queue after enqueue(10) : Node(10) <-- front, rear


enqueue(20) :
Create Node(20). Queue is not empty.

rear->next (Node(10)->next) points to Node(20).

rear now points to Node(20).

Queue state: Node(10) -> Node(20) <-- rear. Node(10) <--


front.

Queue after enqueue(20) : Node(10) -> Node(20) <-- rear.


Node(10) <-- front.
enqueue(30) :
Create Node(30). Queue is not empty.

rear->next (Node(20)->next) points to Node(30).

rear now points to Node(30).

Queue state: Node(10) -> Node(20) -> Node(30) <-- rear.


Node(10) <-- front.

Pseudocode

FUNCTION enqueue(value):
    CREATE newNode WITH value
    SET newNode.next TO NULL
    IF rear IS NULL THEN            // Queue is empty
        SET front TO newNode
        SET rear TO newNode
    ELSE
        SET rear.next TO newNode
        SET rear TO newNode

Code (C++-like logic)

void Queue::enqueue(int val) {
    Node* newNode = new Node(val);
    if (isEmpty()) {
        front = newNode;
        rear = newNode;
    } else {
        rear->next = newNode;
        rear = newNode;
    }
}

5.3 Dequeue Operation

Problem statement Remove and return the element from the front
of the queue.

Step-by-step algorithm

1. Check if the queue is isEmpty() . If it is, report "Underflow" or


throw an exception, as there's nothing to remove.

2. Store the data of the front node in a temporary variable


( dequeuedValue ).

3. Create a temporary pointer ( temp ) to store the current front


node (the one to be deleted).

4. Move the front pointer to the next node ( front = front->next ).

5. Deallocate the node pointed to by temp (free its memory).

6. If, after moving front , front becomes NULL (meaning the


queue is now empty), set rear to NULL as well.

7. Return dequeuedValue .

Dry run example

Queue: Node(10) -> Node(20) -> Node(30) <-- rear. Node(10) <--
front.
dequeue() :
Queue is not empty.

dequeuedValue = 10 .

temp points to Node(10).

front moves to Node(20).

Delete Node(10).

front is not NULL .

Return 10.

Queue state: Node(20) -> Node(30) <-- rear. Node(20) <--


front.

Queue: Node(20) -> Node(30) <-- rear. Node(20) <-- front.


dequeue() :
Queue is not empty.

dequeuedValue = 20 .

temp points to Node(20).

front moves to Node(30).

Delete Node(20).

front is not NULL .

Return 20.

Queue state: Node(30) <-- rear. Node(30) <-- front.

Queue: Node(30) <-- rear. Node(30) <-- front.


dequeue() :
Queue is not empty.

dequeuedValue = 30 .

temp points to Node(30).

front moves to NULL .

Delete Node(30).

front is NULL , so set rear = NULL .

Return 30.

Queue state: front = NULL , rear = NULL (Empty).

Pseudocode

FUNCTION dequeue():
    IF isEmpty() THEN
        PRINT "Queue Underflow"
        RETURN ERROR_VALUE (or throw exception)
    DECLARE dequeuedValue AS INTEGER = front.data
    DECLARE tempNode AS POINTER TO Node = front
    SET front TO front.next
    DELETE tempNode                  // Free memory
    IF front IS NULL THEN            // If queue became empty
        SET rear TO NULL
    RETURN dequeuedValue

Code (C++-like logic)

int Queue::dequeue() {
    if (isEmpty()) {
        // Handle error: Queue Underflow.
        // For simplicity we print an error and return a sentinel value;
        // in real applications, you might throw an exception.
        std::cout << "Queue Underflow!" << std::endl;
        return -1;  // Or some other error indicator
    }
    int dequeuedValue = front->data;
    Node* temp = front;
    front = front->next;
    delete temp;                     // Free memory
    if (front == nullptr) {          // If the queue became empty after dequeue
        rear = nullptr;
    }
    return dequeuedValue;
}

5.4 Peek/Front Operation

Problem statement Return the element at the front of the queue


without removing it.

Step-by-step algorithm

1. Check if the queue is isEmpty() . If it is, report "Queue Empty" or


throw an exception.

2. Return the data of the front node.

Dry run example

Queue: Node(10) -> Node(20) <-- rear. Node(10) <-- front.


peek() :
Queue is not empty.

Returns front->data which is 10.

Queue state remains unchanged.

Pseudocode

FUNCTION peek():
    IF isEmpty() THEN
        PRINT "Queue is Empty"
        RETURN ERROR_VALUE (or throw exception)
    RETURN front.data

Code (C++-like logic)

int Queue::peek() {
    if (isEmpty()) {
        std::cout << "Queue is Empty!" << std::endl;
        return -1;  // Error indicator
    }
    return front->data;
}

6. Complexity Analysis
Time complexity

Enqueue: O(1) Creating a new node, assigning its next pointer,


and updating rear and potentially front are all constant time
operations regardless of the queue size.

Dequeue: O(1) Retrieving the front element, updating front to


the next node, and deallocating the old front node are all
constant time operations.

Peek/Front: O(1) Accessing the data of the front node is a


direct, constant-time operation.

IsEmpty: O(1) Checking if front is NULL is a constant-time


operation.

Space complexity

O(N) The space required is proportional to the number of


elements (N) currently stored in the queue, as each element
occupies one node. Each node requires memory for its data and its
next pointer.

Comparison with alternatives

Array-based Queue (Fixed-size):


Pros: Better cache performance due to contiguous memory. No
overhead for pointers per element.

Cons: Fixed maximum size (prone to overflow). Dequeueing can


be O(N) if elements are shifted, or requires modulo arithmetic
for a circular array, which can be less intuitive.

Array-based Queue (Dynamic/Resizable Array):


Pros: Can grow dynamically. Better cache performance than
linked lists.

Cons: Resizing (copying all elements to a new, larger array) is an


O(N) operation, which can occur occasionally. Dequeue still
might involve shifting or circular array logic.

Linked list implementation of a queue offers the most straightforward
way to achieve O(1) time complexity for both enqueue and dequeue
operations without size limitations, at the cost of slight memory
overhead and potentially worse cache performance compared to
arrays.

7. Examples
Solved Problem: A sequence of Queue Operations

Let's consider an empty queue implemented using a linked list. We'll track
the front and rear pointers and the queue's contents.

Initial State: front = NULL , rear = NULL . Queue: [ ]

1. Operation: enqueue(5)

Create Node(5).

Since queue is empty, front = Node(5) , rear = Node(5) .

Queue: [ 5 ]  ( front and rear both point to Node(5))

2. Operation: enqueue(10)

Create Node(10).

rear->next (Node(5)->next) becomes Node(10).

rear becomes Node(10).

Queue: [ 5, 10 ]  ( front = Node(5), rear = Node(10))

3. Operation: peek()

Queue is not empty.

Return front->data , which is 5 .

Queue: [ 5, 10 ] (unchanged)

4. Operation: enqueue(15)

Create Node(15).

rear->next (Node(10)->next) becomes Node(15).

rear becomes Node(15).

Queue: [ 5, 10, 15 ]  ( front = Node(5), rear = Node(15))

5. Operation: dequeue()

Queue is not empty.

Store front->data (which is 5 ).

temp points to Node(5).

front moves to Node(10).

Delete Node(5).

front is not NULL .

Return 5 .

Queue: [ 10, 15 ]  ( front = Node(10), rear = Node(15))

6. Operation: dequeue()

Queue is not empty.

Store front->data (which is 10 ).

temp points to Node(10).

front moves to Node(15).

Delete Node(10).

front is not NULL .

Return 10 .

Queue: [ 15 ]  ( front and rear both point to Node(15))

7. Operation: dequeue()

Queue is not empty.

Store front->data (which is 15 ).

temp points to Node(15).

front moves to NULL .

Delete Node(15).

front is NULL , so set rear = NULL .

Return 15 .

Queue: [ ]  ( front = NULL , rear = NULL )

8. Operation: isEmpty()

front is NULL .

Return true .
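
The same sequence of operations can be reproduced with a small runnable sketch (Python here for brevity, mirroring the C++-like logic from Section 5; treat it as illustrative):

class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class LinkedListQueue:
    def __init__(self):
        self.front = None
        self.rear = None

    def isEmpty(self):
        return self.front is None

    def enqueue(self, value):
        node = Node(value)
        if self.isEmpty():
            self.front = node        # first node: both pointers refer to it
            self.rear = node
        else:
            self.rear.next = node    # link after the current rear
            self.rear = node

    def dequeue(self):
        if self.isEmpty():
            print("Queue Underflow!")
            return None
        value = self.front.data
        self.front = self.front.next
        if self.front is None:       # last node removed, queue is empty again
            self.rear = None
        return value

    def peek(self):
        return None if self.isEmpty() else self.front.data

q = LinkedListQueue()
q.enqueue(5); q.enqueue(10)
print(q.peek())        # 5
q.enqueue(15)
print(q.dequeue())     # 5
print(q.dequeue())     # 10
print(q.dequeue())     # 15
print(q.isEmpty())     # True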

8. Common Mistakes & Edge Cases


Where students go wrong

Null Pointer Dereferencing: Trying to access front->data or


rear->next when front or rear is NULL (e.g., calling peek()
or dequeue() on an empty queue). Always check isEmpty() first.

Forgetting to Update rear for First Element: When the queue


is empty and the first element is enqueued, front and rear must
both point to this new node. A common mistake is only updating
front .

Forgetting to Set rear = NULL on Last Dequeue: When the last


element is dequeued, front becomes NULL . If rear is not also
set to NULL , rear will still point to a deallocated node, leading to
a dangling pointer and incorrect isEmpty() checks later.

Memory Leaks (in C/C++): Failing to delete (or free ) nodes


after they are dequeued results in memory leaks. The memory
occupied by removed nodes is not returned to the system.
Similarly, a destructor for the queue class should properly
dequeue all elements to free memory.

Incorrect Pointer Linkage: Confusing front with rear during


enqueue or dequeue. New elements are always added at the
rear , and removed always from the front .

Special cases

Empty Queue:
isEmpty() returns true .

enqueue() correctly initializes both front and rear .

dequeue() or peek() on an empty queue should gracefully


handle the "underflow" condition (e.g., return a sentinel value,
print an error, or throw an exception).

Single-Element Queue:
front and rear point to the same node.

enqueue() correctly adds a new node after the current rear


and updates rear .

dequeue() removes the single node, setting both front and


rear to NULL to signify an empty queue.

9. Real-Life Applications
Operating Systems (CPU Scheduling): Processes waiting for CPU
time are often kept in a ready queue. The scheduler selects the next
process from the front of this queue (e.g., Round Robin scheduling).

Networking (Packet Buffers): Routers and network interfaces use


queues to buffer incoming and outgoing data packets, processing
them in the order they arrive to ensure reliable data flow.

Printer Spoolers: When multiple print jobs are sent to a printer, they
are placed in a queue. The printer processes them one by one, FIFO.

Asynchronous Message Handling: In web services or distributed


systems, message queues (e.g., Apache Kafka, RabbitMQ) are used to
temporarily store messages between services, allowing them to
communicate asynchronously and process messages reliably even
under high load.

Traffic Simulation: In traffic modeling, vehicles waiting at a traffic


light or toll booth can be simulated using queues.

Keyboard Buffering: When you type characters faster than the


computer can process them, they are stored in a keyboard buffer (a
queue) to be processed in order.

10. Practice Problems


Easy

1. Implement a basic Queue class using a linked list, providing


enqueue , dequeue , peek , and isEmpty methods. Ensure
proper memory management (e.g., delete nodes in C++).

2. Write a function that counts the number of elements in a queue


without modifying it. (Hint: You might need to temporarily dequeue
and enqueue elements or traverse the list if you want to avoid
adding a size field).

Medium

1. Modify your queue implementation to store string data instead of


int .

2. Implement a display() method for your queue that prints all


elements from front to rear without modifying the queue.

3. Create a function that reverses the elements of a queue using only


the queue's standard operations ( enqueue , dequeue ,
isEmpty ). (Hint: You might need auxiliary data structures like a
stack).

Hard

1. Implement a "Double-Ended Queue" (Deque) using a doubly linked


list, supporting insertion/deletion from both front and rear in O(1)
time.

2. Given a queue of integers and an integer k , reverse the order of


the first k elements of the queue, leaving the other elements in
the same relative order. You can use an auxiliary stack.

11. Summary / Key Points

FIFO Principle: Linked list queues strictly adhere to First-In, First-Out,
meaning the first element added is the first one removed.

Pointers front and rear : The front pointer indicates the next
element to be dequeued, and the rear pointer indicates where the
next element will be enqueued.

Dynamic Size: Linked lists provide an automatically resizable queue,


overcoming the fixed-size limitations of array implementations.

Efficiency: Both enqueue (addition to the rear) and dequeue


(removal from the front) operations have a time complexity of O(1).
peek and isEmpty are also O(1).

Memory Overhead: Each element (node) in a linked list queue


requires additional memory for a next pointer.

Edge Cases: Special attention must be paid to handling an empty


queue and a queue with a single element during enqueue and
dequeue operations, especially ensuring front and rear are both
NULL when empty and correctly updated otherwise.

Memory Management: In languages like C++, explicit memory


deallocation ( delete ) is crucial to prevent memory leaks when nodes
are dequeued.

Highlighted Concepts & Operations:

Enqueue: Add at the rear. Set the old rear->next to the new node,
then update rear. If the queue was empty, set front as well.

Dequeue: Remove from front . Update front = front->next . If front


becomes NULL , set rear = NULL . Deallocate old front node.

Peek: Return front->data .

isEmpty: Check front == NULL .

Complexity Analysis of Queues

Time and space complexity analysis for all queue operations (enqueue,
dequeue, front, isEmpty) across array, circular array, and linked list
implementations.

1. Introduction
Definition Complexity analysis, in the context of data structures like
queues, is the process of evaluating the resources (time and space) an
algorithm or data structure operation consumes as a function of the
input size. For queues, this means assessing how quickly operations
like adding an element (enqueue), removing an element (dequeue), or
checking the front element (peek) execute, and how much memory
they require, as the number of elements in the queue grows.

Why it is needed (real-life analogy if possible) Imagine a busy


supermarket checkout line (a queue).

If a new customer joins the end of the line (enqueue), how quickly
can they be added without slowing down the entire system?

When a customer is served and leaves the front of the line


(dequeue), how efficiently can the line move forward?

If the manager wants to see who is next to be served (peek), how


fast can they get that information? If these operations are slow,
customers get frustrated, and the supermarket loses efficiency.
Similarly, in software, slow operations can lead to unresponsive
applications, wasted resources, and poor user experience.
Complexity analysis helps us predict and optimize this
performance before we write the full code, ensuring our systems
scale effectively with larger amounts of data.

Applications Understanding the complexity of queue operations is


crucial in various applications:

Operating Systems: Managing process scheduling, where tasks


wait in a queue to be executed.

Network Packet Buffering: Routers use queues to hold data
packets waiting to be transmitted.

Printer Queues: Documents waiting to be printed are stored in a


queue.

Breadth-First Search (BFS): A graph traversal algorithm that


heavily relies on queue operations.

Simulation Systems: Modeling real-world waiting lines.

Event-Driven Systems: Processing events in the order they arrive.

2. Theory Explanation
In-depth explanation Complexity analysis primarily focuses on how
an algorithm's performance degrades as the input size (often denoted
by n ) increases. For queues, n typically refers to the number of
elements currently in the queue. We generally analyze operations in
terms of their "worst-case" performance, as this guarantees a certain
level of performance.

When we talk about the complexity of queue operations, we're


interested in:

1. Enqueue (Adding an element): How many steps does it take to


add a new item to the rear of the queue? Does it matter if the
queue is empty or almost full?

2. Dequeue (Removing an element): How many steps does it take


to remove an item from the front of the queue? What happens to
the remaining elements?

3. Peek/Front (Viewing the front element): How many steps to


simply look at the element at the front without removing it?

4. isEmpty (Checking if empty): How many steps to determine if the


queue contains any elements?

5. isFull (Checking if full - for array-based): How many steps to


determine if the queue has reached its maximum capacity?

The goal is to design queue implementations where these
fundamental operations can be performed as efficiently as possible,
ideally in constant time, meaning the time taken doesn't grow with the
number of elements in the queue.

Diagrams/flowcharts (describe in text if not possible to draw)


Let's describe the conceptual flow for two common queue
implementations and how operations affect them:

Array-based Queue (Non-circular):

Structure: A fixed-size array with two pointers: front (index of


the first element) and rear (index of the last element).

Enqueue: Increment rear , then place the new element at


array[rear] . (Constant time)

Dequeue: Remove array[front] . To maintain the "front" at


index 0 and avoid "holes", all subsequent elements (from
front+1 to rear ) must be shifted one position to the left.
Then increment front (logically, or reset front to 0 and rear
to rear-1 after shifting). This shifting is the costly part. (Linear
time)

Visualizing Dequeue (Non-circular array): Initial: [A, B, C, D, _


, _] (front=0, rear=3) Dequeue A: [B, C, D, _ , _ , _] (shift B, C, D
one position left; front=0, rear=2) The shift operation makes this
inefficient.

Array-based Queue (Circular):

Structure: A fixed-size array where the last element is


considered adjacent to the first (wraps around). Uses front
and rear pointers, often with a count or a specific condition
to distinguish empty/full.

Enqueue: Increment rear (with modulo array size to wrap


around), then place the new element at array[rear] . (Constant
time)

Dequeue: Increment front (with modulo array size to wrap
around). The element at the old front is considered removed.
No shifting occurs. (Constant time)

Visualizing Dequeue (Circular array): Initial: [A, B, C, D]


(front=0, rear=3, size=4) Dequeue A: [_, B, C, D] (front=1,
rear=3) - 'A' is logically removed, no shift needed. Now if we
enqueue E: [E, B, C, D] (front=1, rear=0) - 'E' wraps around.

Linked List-based Queue:

Structure: A sequence of nodes, each containing data and a


pointer to the next node. Two pointers: front (to the first
node) and rear (to the last node).

Enqueue: Create a new node. Make the current rear node


point to the new node. Update rear to the new node. If the
queue was empty, front also points to the new node.
(Constant time)

Dequeue: Make front point to the next node in the sequence.


The old front node is removed. If the queue becomes empty,
rear also becomes null. (Constant time)

Visualizing Enqueue/Dequeue (Linked List): Enqueue (A) ->


(B) -> (C) (front points to A, rear points to C) Add D: (A) -> (B) ->
(C) -> (D) (front points to A, rear points to D) Remove A: (B) -> (C)
-> (D) (front points to B, rear points to D)

Step-by-step breakdown The core idea is to break down each queue


operation into its elementary steps (e.g., pointer assignment, array
access, comparison). Then, count these steps. For example, enqueue
in a linked list:

1. Create a new node. (1 step)

2. Assign data to the new node. (1 step)

3. Check if the queue is empty. (1 step)

4. If empty, front points to new node. (1 step)

5. Else, rear 's next pointer points to new node. (1 step)

6. rear points to new node. (1 step) All these steps take a fixed
amount of time, regardless of the queue's size. Hence, it's constant
time.

3. Properties & Characteristics


Key properties (from a complexity perspective)

FIFO (First-In, First-Out): This fundamental property dictates that


elements are always added at one end (rear) and removed from
the other (front). Implementations must preserve this.

Dynamic vs. Fixed Size:


Fixed Size (e.g., Array-based): Has a maximum capacity.
isFull operation becomes relevant. Operations might fail if the
queue is full.

Dynamic Size (e.g., Linked List-based, dynamic


array/ArrayList): Can grow or shrink as needed, limited only by
available memory. isFull is generally not applicable (or implies
memory exhaustion). Reallocation might cause temporary
performance spikes in dynamic arrays.

Pointer Management: The efficient management of front and


rear pointers (or indices) is critical for achieving optimal O(1)
complexity for core operations.

Advantages & disadvantages (of complexity analysis of queues)

Advantages:
Predictive Power: Allows developers to predict how a queue-
based system will perform under heavy load or with large
datasets.

Optimization Guidance: Highlights bottlenecks in queue


implementations, guiding decisions on which implementation to
choose or how to optimize existing code.

Resource Management: Helps in estimating memory usage


and processing time, crucial for resource-constrained
environments.

Scalability Assessment: Ensures that the system can handle


increasing data sizes without a dramatic drop in performance.

Informed Design Choices: Enables choosing the right queue


implementation (e.g., array vs. linked list) based on
performance requirements.

Disadvantages:
Worst-Case Focus: Often focuses on worst-case complexity,
which might not always reflect average real-world performance.

Hardware Abstraction: Big O notation abstracts away constant


factors and lower-order terms, which can sometimes matter for
small input sizes or specific hardware.

Implementation Details: The analysis might assume ideal


conditions, while real-world implementation details (e.g., cache
misses, garbage collection, language overhead) can influence
actual runtime.

4. Mathematical / Logical Representation


Important formulas The primary mathematical tool for complexity
analysis is Big O notation. It describes the upper bound of an
algorithm's growth rate in terms of time or space requirements.

O(1) - Constant Time: The time/space required does not change


with the input size n .
Example: Accessing an array element by index, adding a node to
a linked list queue.

O(log n) - Logarithmic Time: The time/space increases slowly as


n grows (e.g., halving the problem size with each step).
Example: Binary search (not typical for basic queue operations).

O(n) - Linear Time: The time/space increases proportionally to the


input size n .

Example: Traversing a linked list, searching for an element in an
unsorted array, dequeue in a non-circular array queue.

O(n log n) - Linearithmic Time: Common in efficient sorting


algorithms.

O(n²) - Quadratic Time: The time/space increases quadratically


with n .
Example: Nested loops, some simple sorting algorithms (Bubble
Sort).

O(2ⁿ) - Exponential Time: Very slow, typically found in brute-force


solutions to complex problems.

Derivations (logical reasoning) The "derivation" of a queue


operation's complexity involves:

1. Identify Primitive Operations: Break down the algorithm into


fundamental steps (assignments, comparisons, arithmetic
operations, memory access).

2. Count Steps: Count how many times each primitive operation is


executed in the worst case.

3. Express as Function of n : Represent the total count as a


function of the input size n (number of elements in the queue).

4. Determine Big O: Drop constant factors and lower-order terms to


find the dominant term, which gives the Big O notation.

Example (Dequeue in a non-circular array queue):

1. data = array[front] (1 step)

2. Loop from front to rear-1 : array[i] = array[i+1] (n-1 steps for


shifting, if n is rear-front+1 )

3. rear = rear - 1 (1 step)

4. Total steps: 1 + (n-1) + 1 = n + 1

5. As n grows, the +1 becomes insignificant. Thus, the complexity


is O(n).
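
The shifting cost is easy to see in code. Below is a small illustrative sketch
(not one of this chapter's reference implementations) of a non-circular array
dequeue; the loop over the remaining elements is exactly the O(n) cost
counted above.

def dequeue_noncircular(queue_array, size):
    # Remove and return the element at index 0 of a non-circular array queue.
    if size == 0:
        return None, 0
    item = queue_array[0]
    for i in range(size - 1):               # n-1 shifts: this loop is the O(n) cost
        queue_array[i] = queue_array[i + 1]
    queue_array[size - 1] = None            # clear the vacated slot
    return item, size - 1                   # the removed item and the new size

arr = [10, 20, 30, None, None]
item, new_size = dequeue_noncircular(arr, 3)
print(item, arr, new_size)   # 10 [20, 30, None, None, None] 2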

Example (Dequeue in a linked list queue):

1. data = front.data (1 step)

2. front = front.next (1 step)

3. Check if front is null, then rear = null (1 step)

4. Total steps: 3 (a constant).

5. This is O(1).

5. Operations / Algorithms
For demonstrating the logic affecting complexity, we will focus on two
primary implementations: Array-based (Circular) and Linked List-based.

5.1 Array-based Circular Queue

Problem Statement: Implement a queue using a fixed-size array,


where operations wrap around to the beginning of the array to
maximize space utilization.

Data Structure: An array queue_array , front index, rear index,


size (current elements), capacity (max elements).
front points to the first element.

rear points to the last element.

The queue is empty if size == 0 .

The queue is full if size == capacity .

5.1.1 Enqueue Operation ( add / offer )

Problem Statement: Add a new element to the rear of the queue.


Handle the full queue condition.

Step-by-step algorithm:

1. Check if the queue is full. If size == capacity , return an error or


indicate failure.

2. Increment rear . If rear reaches capacity , wrap it around to 0


( rear = (rear + 1) % capacity ).

3. Place the new item at queue_array[rear] .

4. Increment size .

5. If this was the first element ( size == 1 ), front should also be set
to 0 .

Dry run example: capacity = 5 , queue_array = [_, _, _, _, _]

Convention: front points to the first element and rear points to the last
(most recently added) element. For an empty queue, front = -1 , rear = -1 ,
size = 0 .

Initial: queue_array = [_, _, _, _, _] , front = -1 , rear = -1 , size = 0

enqueue(10) :

1. size (0) < capacity (5). Not full.

2. If size == 0 : front = 0 .

3. rear = (rear + 1) % capacity = (-1 + 1) % 5 = 0 .

4. queue_array[0] = 10 .

5. size = 1 . queue_array = [10, _, _, _, _] , front = 0 , rear =


0 , size = 1

enqueue(20) :

1. Not full.

2. rear = (0 + 1) % 5 = 1 .

3. queue_array[1] = 20 .

4. size = 2 . queue_array = [10, 20, _, _, _] , front = 0 , rear =


1 , size = 2

enqueue(30) :

1. Not full.

2. rear = (1 + 1) % 5 = 2 .

3. queue_array[2] = 30 .

4. size = 3 . queue_array = [10, 20, 30, _, _] , front = 0 , rear


= 2 , size = 3

Pseudocode:

function enqueue(item):
if size == capacity:
print "Queue is full"
return

if size == 0: // First element being added


front = 0
rear = 0
else:
rear = (rear + 1) % capacity

queue_array[rear] = item
size = size + 1

Code (Python):

class CircularQueue:
    def __init__(self, capacity):
        self.capacity = capacity
        self.queue_array = [None] * capacity
        self.front = 0
        self.rear = -1  # rear points to the last element
        self.size = 0

    def enqueue(self, item):
        if self.isFull():
            print("Queue is full. Cannot enqueue", item)
            return False

        self.rear = (self.rear + 1) % self.capacity
        self.queue_array[self.rear] = item
        self.size += 1
        # With front = 0 and rear = -1 initially, the first enqueue sets
        # rear to 0, so front and rear both refer to the single element.
        # This is a consistent state; front is only advanced by dequeue.
        return True

    def isFull(self):
        return self.size == self.capacity

    # Other methods like isEmpty, peek, dequeue would also be here

5.1.2 Dequeue Operation ( remove / poll )

Problem Statement: Remove and return the element from the front
of the queue. Handle the empty queue condition.

Step-by-step algorithm:

1. Check if the queue is empty. If size == 0 , return an error or null.

2. Store the element at queue_array[front] in a temporary variable.

3. Decrement size .

4. If the queue is now empty ( size == 0 ), reset front and rear


(e.g., to -1 or 0 depending on implementation's initial state).

5. Else, increment front (with modulo capacity to wrap around:


front = (front + 1) % capacity ).

6. Return the stored element.

Dry run example: capacity = 5 , queue_array = [10, 20, 30, _, _] ,
front = 0 , rear = 2 , size = 3

dequeue() :

1. size (3) > 0. Not empty.

2. item = queue_array[0] (which is 10).

3. size = 2 .

4. Queue not empty ( size != 0 ).

5. front = (0 + 1) % 5 = 1 .

6. Return 10 . queue_array = [10, 20, 30, _, _] , front = 1 , rear =


2 , size = 2 (Note: queue_array[0] still holds 10, but front
moved past it, making it logically removed).

dequeue() :

1. size (2) > 0. Not empty.

2. item = queue_array[1] (which is 20).

3. size = 1 .

4. Queue not empty ( size != 0 ).

5. front = (1 + 1) % 5 = 2 .

6. Return 20 . queue_array = [10, 20, 30, _, _] , front = 2 , rear =


2 , size = 1

dequeue() :

1. size (1) > 0. Not empty.

2. item = queue_array[2] (which is 30).

3. size = 0 .

4. Queue is now empty ( size == 0 ). Set front = 0 , rear = -1 .

5. Return 30 . queue_array = [10, 20, 30, _, _] , front = 0 , rear =


-1 , size = 0 (back to initial state).

Pseudocode:

function dequeue():
if size == 0:
print "Queue is empty"
return null // or raise an error

item = queue_array[front]
// Optionally, for garbage collection or debugging, set array[front] to null:
// queue_array[front] = null

size = size - 1

if size == 0: // Queue became empty


front = 0 // or -1, depending on initial state convention
rear = -1 // or 0, depending on initial state convention
else:
front = (front + 1) % capacity

return item

Code (Python):

class CircularQueue:
    # ... (init and enqueue from above) ...

    def dequeue(self):
        if self.isEmpty():
            print("Queue is empty. Cannot dequeue.")
            return None

        item = self.queue_array[self.front]
        self.queue_array[self.front] = None  # Optional: clear the reference
        self.size -= 1

        if self.size == 0:  # Queue is now empty
            self.front = 0
            self.rear = -1
        else:
            self.front = (self.front + 1) % self.capacity

        return item

    def isEmpty(self):
        return self.size == 0

    def peek(self):
        if self.isEmpty():
            print("Queue is empty.")
            return None
        return self.queue_array[self.front]

5.2 Linked List-based Queue

Problem Statement: Implement a queue using a linked list, allowing


dynamic resizing.

Data Structure: A Node class (data, next_node_pointer), front


pointer (to first node), rear pointer (to last node).

5.2.1 Enqueue Operation ( add / offer )

Problem Statement: Add a new element to the rear of the queue.

Step-by-step algorithm:
1. Create a new Node with the given item .

2. If the queue is empty ( rear is null):


Both front and rear point to the new node.

3. Else (queue is not empty):


The next pointer of the current rear node points to the new
node.

Update rear to point to the new node.

4. Increment size .

Pseudocode:

class Node:
data
next_node

function enqueue(item):
new_node = new Node(item)
if rear is null: // Queue is empty
front = new_node
rear = new_node
else:
rear.next_node = new_node
rear = new_node
size = size + 1

Code (Python):

class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class LinkedListQueue:
    def __init__(self):
        self.front = None
        self.rear = None
        self.size = 0

    def enqueue(self, item):
        new_node = Node(item)
        if self.rear is None:  # Queue is empty
            self.front = new_node
            self.rear = new_node
        else:
            self.rear.next = new_node
            self.rear = new_node
        self.size += 1

5.2.2 Dequeue Operation ( remove / poll )

Problem Statement: Remove and return the element from the front
of the queue. Handle the empty queue condition.

Step-by-step algorithm:
1. Check if the queue is empty ( front is null). If so, return error or
null.

2. Store the data from the front node in a temporary variable.

3. Move front to the next node ( front = front.next ).

4. Decrement size .

5. If front becomes null (meaning the queue is now empty), set


rear to null as well.

6. Return the stored data.

Pseudocode:

function dequeue():
if front is null: // Queue is empty
print "Queue is empty"
return null // or raise an error

item = front.data
front = front.next // Move front pointer

size = size - 1

if front is null: // If queue became empty after removal


rear = null // Both front and rear should be null

return item

Code (Python):

class LinkedListQueue:
    # ... (init and enqueue from above) ...

    def dequeue(self):
        if self.isEmpty():
            print("Queue is empty. Cannot dequeue.")
            return None

        item = self.front.data
        self.front = self.front.next
        self.size -= 1

        if self.front is None:  # Queue became empty
            self.rear = None

        return item

    def isEmpty(self):
        return self.front is None  # Or self.size == 0

    def peek(self):
        if self.isEmpty():
            print("Queue is empty.")
            return None
        return self.front.data

6. Complexity Analysis
Time complexity (Best, Worst, Average) This section analyzes the
time complexity of core queue operations for different
implementations.

Operation | Array-based (Non-Circular)   | Array-based (Circular) | Linked List-based
enqueue   | O(1) (Worst: O(n) if resize) | O(1)                   | O(1)
dequeue   | O(n)                         | O(1)                   | O(1)
peek      | O(1)                         | O(1)                   | O(1)
isEmpty   | O(1)                         | O(1)                   | O(1)
isFull    | O(1)                         | O(1)                   | N/A (grows dynamically)

Explanation:

Array-based Queue (Non-Circular):

enqueue : O(1). Simply add to the rear index. If the array


needs to be resized (copied to a larger array), it becomes O(n) in
the worst case, but this is amortized O(1) over many operations
in dynamic arrays.

dequeue : O(n). When an element is removed from the front


(index 0), all remaining n-1 elements must be shifted one
position to the left to fill the gap. This shifting operation takes
time proportional to the number of elements.

peek , isEmpty , isFull : O(1). These involve direct array


access or pointer/size checks.

Array-based Queue (Circular):

enqueue : O(1). The rear pointer is incremented with a


modulo operator, and the item is placed. No shifting.

dequeue : O(1). The front pointer is incremented with a


modulo operator. No shifting of elements. The space is reused
efficiently.

peek , isEmpty , isFull : O(1). Direct array access or


pointer/size checks.

Linked List-based Queue:

enqueue : O(1). A new node is created and attached to the
current rear node. rear is updated. No traversal or shifting
required.

dequeue : O(1). The front pointer is simply moved to the next


node. No traversal or shifting required.

peek , isEmpty : O(1). Direct pointer access ( front.data or


front == null ).

isFull : Not typically applicable, as a linked list queue grows


dynamically until memory runs out.

Space complexity

Array-based Queue (Circular or Non-Circular):


O(Capacity) or O(N) where N is the maximum number of
elements the array can hold. Even if the queue is empty, the
underlying array of Capacity size is allocated. This is often
considered O(1) extra space if the array itself is considered
part of the problem input/fixed structure, but O(N) if we count
the space taken by the elements. If it's a dynamic array that
resizes, the space can grow up to O(n) where n is the current
number of elements.

Linked List-based Queue:


O(n), where n is the current number of elements in the
queue. Each element requires a node, and each node takes
constant space for its data and pointer. The space scales
directly with the number of elements.

Comparison with alternatives

Array (Non-circular) vs. Circular Array vs. Linked List:

Non-circular Array: Bad dequeue performance (O(n)) makes


it generally undesirable for queues where frequent dequeues
are expected. Simple to implement, but wasteful of space if
front advances far into the array without being reset.

Circular Array: Excellent all-around performance (O(1) for all
main operations) and good cache locality. Fixed size is its main
limitation; it cannot grow beyond its initial capacity without
resizing the underlying array (which would be an O(n)
operation).

Linked List: Excellent all-around performance (O(1) for all main


operations) and dynamically resizable. No fixed size limit.
However, it incurs higher memory overhead per element (due
to storing pointers) and potentially poorer cache performance
due to non-contiguous memory allocation compared to arrays.

When to choose which:

Choose Circular Array when:


The maximum number of elements is known or can be
reasonably estimated.

Memory usage needs to be predictable and tight.

Cache performance is critical.

Choose Linked List when:


The number of elements is highly variable and
unpredictable.

Dynamic resizing is a must.

The overhead of pointers per element is acceptable.

7. Examples
Example 1: Analyzing a Simple Queue Operation Sequence

Problem: Trace the state and complexity of operations on a circular array


queue of capacity 3. Initial state: queue = [] , front = 0 , rear = -1 , size
= 0 , capacity = 3

Operations:

1. enqueue(A)

2. enqueue(B)

3. dequeue()

4. enqueue(C)

5. enqueue(D) (Attempt)

Analysis:

1. enqueue(A) :

rear = ( -1 + 1 ) % 3 = 0 .

queue_array[0] = A .

size = 1 .

State: [A, _, _] , front = 0 , rear = 0 , size = 1

Complexity: O(1)

2. enqueue(B) :

rear = ( 0 + 1 ) % 3 = 1 .

queue_array[1] = B .

size = 2 .

State: [A, B, _] , front = 0 , rear = 1 , size = 2

Complexity: O(1)

3. dequeue() :

item = queue_array[0] (which is A ).

front = ( 0 + 1 ) % 3 = 1 .

size = 1 .

State: [A, B, _] (logically [B, _ , _] ), front = 1 , rear = 1 , size = 1

Returned: A

Complexity: O(1)

4. enqueue(C) :

rear = ( 1 + 1 ) % 3 = 2 .

queue_array[2] = C .

size = 2 .

State: [A, B, C] (logically [B, C , _] ), front = 1 , rear = 2 , size =


2

Complexity: O(1)

5. enqueue(D) :

size (2) < capacity (3), so the queue is not full.

rear = ( 2 + 1 ) % 3 = 0 ( rear wraps around to index 0).

queue_array[0] = D .

size = 3 .

State: [D, B, C] (logically [B, C, D] ), front = 1 , rear = 0 , size = 3

Complexity: O(1)

The queue is now full ( size == capacity ). The elements, in queue order,
are B (index 1), C (index 2), D (index 0).

Note on distinguishing full from empty: once rear wraps around, the front
and rear indices alone cannot tell a full queue from an empty one; the size
counter resolves the ambiguity ( size == 0 means empty, size == capacity
means full). Implementations without a counter typically leave one array
slot permanently unused or keep an extra flag for this purpose.

Example 2: Complexity of Reversing a Queue

Problem: Design an algorithm to reverse a queue (e.g., [1, 2, 3]


becomes [3, 2, 1] ) using an auxiliary data structure, and analyze its time
and space complexity.

Algorithm (using a Stack):

1. Dequeue all elements from the original queue and push them onto a
stack.

2. Pop all elements from the stack and enqueue them back into the
original queue.

Analysis:

1. Dequeue all elements to Stack:

Assume the queue has n elements.

This involves n dequeue operations and n push operations.

If the queue and stack operations are O(1) (which they typically are
for standard implementations), this step takes n * O(1) + n * O(1) =
O(n) time.

2. Pop from Stack to Enqueue:

This involves n pop operations and n enqueue operations.

Again, assuming O(1) stack/queue operations, this step takes n *
O(1) + n * O(1) = O(n) time.

Total Time Complexity: O(n) + O(n) = O(n) .

Space Complexity:

The auxiliary stack will hold all n elements from the queue at one
point.

Thus, the space complexity is O(n).
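
A short sketch of this algorithm, assuming a queue object with the enqueue ,
dequeue , and isEmpty methods of the LinkedListQueue class shown earlier;
a plain Python list serves as the auxiliary stack.

def reverse_queue(q):
    stack = []
    while not q.isEmpty():        # n dequeues + n pushes -> O(n)
        stack.append(q.dequeue())
    while stack:                  # n pops + n enqueues   -> O(n)
        q.enqueue(stack.pop())
    return q

q = LinkedListQueue()
for x in [1, 2, 3]:
    q.enqueue(x)
reverse_queue(q)
print(q.dequeue(), q.dequeue(), q.dequeue())   # 3 2 1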

8. Common Mistakes & Edge Cases


Circular Array Queue Pointer Logic:

Distinguishing Empty from Full: If front == rear , is the queue


empty or full?
Common solutions: Use a size counter, or leave one array slot
perpetually empty.

Mistake: Not handling this ambiguity can lead to incorrect


isEmpty or isFull checks.

Off-by-one errors: Incorrectly calculating (index + 1) % capacity


or using rear to point to the next available slot vs. the last
element can lead to incorrect logic.

Initial State: Improperly initializing front and rear (e.g.,


front=0, rear=0 for an empty queue can lead to issues if size is
not used).

Empty Queue Operations:

Attempting to dequeue or peek from an empty queue.

Mistake: Not checking isEmpty() before these operations can lead


to NullPointerException or IndexOutOfBoundsException .

Solution: Always check for emptiness.

Full Queue Operations (Array-based):

Attempting to enqueue into a full array-based queue.

Mistake: Not checking isFull() can lead to


IndexOutOfBoundsException or overwriting existing data.

Solution: Always check for fullness.

Memory Leaks (Linked List):

Not properly dereferencing or setting removed nodes to null (in


languages with manual memory management) can lead to memory
leaks. Even in garbage-collected languages, holding onto
unnecessary references can delay garbage collection.

Mistake: If dequeue just moves front but the old front node is
still referenced elsewhere, it won't be collected.

Amortized vs. Worst-Case:

Mistake: Confusing amortized O(1) (like enqueue in a dynamic


array that occasionally resizes) with strict O(1) (like enqueue in a
linked list or circular array).

While average performance might be O(1), a single worst-case


resize can be O(n). This distinction is important for real-time
systems.
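
To make the distinction concrete, here is a small illustrative sketch (not one
of this chapter's reference implementations) of a dynamic circular-array
enqueue that doubles its capacity when full: most calls do constant work, but
the occasional _resize copies all n elements, giving amortized O(1) but
worst-case O(n).

class ResizingArrayQueue:
    def __init__(self):
        self.capacity = 2
        self.array = [None] * self.capacity
        self.front = 0
        self.size = 0

    def enqueue(self, item):
        if self.size == self.capacity:          # rare worst case: O(n) copy
            self._resize(2 * self.capacity)
        rear = (self.front + self.size) % self.capacity
        self.array[rear] = item
        self.size += 1                          # typical case: O(1)

    def _resize(self, new_capacity):
        new_array = [None] * new_capacity
        for i in range(self.size):              # copy all n elements once
            new_array[i] = self.array[(self.front + i) % self.capacity]
        self.array = new_array
        self.capacity = new_capacity
        self.front = 0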

9. Real-Life Applications
Operating System Task Scheduling: When multiple programs or
processes need to use the CPU, they are often placed in a queue.
Processes are executed in a FIFO manner, ensuring fair access. Priority
queues are often used here, but basic queue principles apply. Efficient
enqueue/dequeue is vital for system responsiveness.

Printer Queues: Documents sent to a printer are added to a queue


and printed in the order they were received.

Network Packet Buffering: Routers and network devices use queues


to temporarily store data packets that arrive faster than they can be
processed or transmitted. Fast enqueue and dequeue ensure smooth
data flow and prevent packet loss.

Web Server Request Handling: Web servers often put incoming


client requests into a queue. Workers or threads then pick requests
from the front of the queue to process them. This ensures requests
are handled systematically and prevents the server from being
overwhelmed.

Asynchronous Operations: In many programming paradigms (e.g.,


event loops in Node.js, message queues like RabbitMQ or Kafka), tasks
or messages are put into a queue to be processed later by a different
thread or service. O(1) queue operations are critical for maintaining
high throughput.

Breadth-First Search (BFS): This fundamental graph traversal


algorithm uses a queue to explore all neighbors at the current depth
level before moving to the next depth level. The efficiency of BFS
directly depends on the efficiency of the underlying queue.

Call Centers/Customer Service: Incoming calls or customer queries


are often placed in a queue, handled by available agents in the order
they arrived.

10. Practice Problems


Easy:

1. Implement a queue using two stacks and analyze its enqueue


and dequeue time complexity.

2. Write a function isPalindrome(queue) that checks if a given queue


of characters forms a palindrome, without modifying the original
queue. What's its complexity?

Medium:

1. Given a queue of integers and an integer k , reverse the order of


the first k elements in the queue, leaving the other elements in
the same relative order. Analyze the complexity. Example: Input: Q
= [1, 2, 3, 4, 5] , k = 3 . Output: Q = [3, 2, 1, 4, 5]

2. Implement a "Deque" (Double-ended queue) using a doubly linked


list, ensuring all addFront , addRear , removeFront ,
removeRear , peekFront , peekRear operations are O(1).
Analyze space complexity.

Hard:

1. You have n people in a line, and each person has a ticket number.
A person wants to buy k tickets, and they are at position p (0-
indexed) in the queue. Each person at the front of the queue buys
one ticket and then goes to the end of the queue if they still need
more tickets. If they have bought all their tickets, they leave the
queue. Find the total time it takes for the person at position p to
buy all their tickets. Analyze the time complexity of your solution.

2. Implement a queue using a fixed-size array such that enqueue


and dequeue are amortized O(1), even when resizing is required.
When resizing, double the array capacity. Analyze the amortized
time complexity.

11. Summary / Key Points


Purpose: Complexity analysis evaluates time and space resources
consumed by queue operations.

Why it matters: Ensures scalable, efficient, and responsive


applications, especially with growing data sizes.

Core Operations: enqueue (add), dequeue (remove), peek (view


front), isEmpty , isFull .

Primary Tool: Big O notation describes the growth rate of resource


usage (e.g., O(1) for constant, O(n) for linear).

Key Implementations & Their Complexity:

Array-based (Non-Circular):

enqueue : O(1)

dequeue : O(n) (due to element shifting)

space : O(Capacity)

Array-based (Circular):
enqueue : O(1)

dequeue : O(1)

space : O(Capacity)

Linked List-based:
enqueue : O(1)

dequeue : O(1)

space : O(n) (scales with elements, higher per-element


overhead)

Choosing Implementation:

Circular Array: Preferred when fixed capacity is acceptable and


maximum O(1) performance is needed.

Linked List: Preferred when dynamic size is crucial, and variable


memory overhead is acceptable.

Common Pitfalls: Ambiguity between empty/full in circular arrays,


neglecting isEmpty / isFull checks, amortized vs. worst-case
distinctions.

Real-world Impact: Critical for operating systems, network devices,


web servers, and algorithms like BFS.

Highlighted Formulas and Shortcuts:

Big O Notation: Upper bound on growth rate.

Modular Arithmetic for Circular Array: (index + 1) % capacity

General Rule for Efficient Queues: Aim for O(1) for enqueue ,
dequeue , peek .

Deque (Double-Ended Queue)
Introduction to Deque, a variation of a queue that allows insertion and
deletion from both front and rear ends.

1. Introduction
- Definition

A Deque (pronounced "deck" or "dequeue") stands for Double-Ended


Queue. It is a linear data structure that generalizes a queue, allowing
elements to be added or removed from both the front and the rear (or
back) ends. Unlike a traditional queue (First-In, First-Out or FIFO) or a
stack (Last-In, First-Out or LIFO), a deque offers more flexibility by
providing four fundamental operations: inserting at the front, inserting at
the rear, deleting from the front, and deleting from the rear.

- Why it is needed

Traditional queues allow insertions only at the rear and deletions only at
the front. Stacks allow insertions and deletions only at one end (top).
There are many scenarios where operations are needed at both ends of a
linear collection of items.

Real-life Analogy: Imagine a line of people waiting to get on a bus.

Queue: New people can only join at the back of the line, and people
can only get on the bus from the front.

Stack: (Less intuitive for a line) Imagine a pile of plates; you can only
add or remove from the top.

Deque: Now, imagine a special scenario where people can join the line
from either the front (maybe VIPs) or the back. Also, people can leave
the line from either the front (to get on the bus) or the back (if they
change their mind). This flexibility is what a Deque provides. It's like a
two-way street for data.

- Applications

Deques are versatile and used in various applications:

Implementing other data structures: A deque can be used to


efficiently implement both a stack and a queue.

Undo/Redo functionality: In text editors or graphic design software,


deques can manage the sequence of actions, allowing users to undo
or redo operations from either end.

Browser History: Keeping track of visited pages, where you can go


back and forth.

Job Scheduling: In systems where tasks can be added or removed


from a queue based on their priority, or if urgent tasks need to be
processed first (front of deque) while regular tasks are added to the
back.

Work stealing: In multi-processor scheduling, one processor might


"steal" tasks from the deque of another processor if its own deque is
empty, often taking from the front or back.

Palindrome Checker: Efficiently checking if a string is a palindrome.

Sliding Window Maximum/Minimum: Algorithms that need to find


the maximum or minimum element within a sliding window of fixed
size in an array often use a deque to store potential candidates.
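
As a preview of the last application above, here is a short sketch of the
sliding-window maximum using Python's built-in collections.deque (the doubly
linked Deque class developed later in this section would work the same way):
indices of candidate maxima are kept so their values are in decreasing order,
and the front of the deque always holds the current window's maximum.

from collections import deque

def sliding_window_max(nums, k):
    dq = deque()        # holds indices; the values at these indices are decreasing
    result = []
    for i, x in enumerate(nums):
        if dq and dq[0] <= i - k:        # front index fell out of the window
            dq.popleft()
        while dq and nums[dq[-1]] <= x:  # smaller rear elements can never be the max
            dq.pop()
        dq.append(i)
        if i >= k - 1:
            result.append(nums[dq[0]])
    return result

print(sliding_window_max([1, 3, -1, -3, 5, 3, 6, 7], 3))   # [3, 3, 5, 5, 6, 7]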

2. Theory Explanation
- In-depth explanation

A Deque maintains a sequence of elements where items can be added or


removed from either of its two ends: the "front" or the "rear". It combines
the functionalities of a stack (LIFO) and a queue (FIFO) into a single data
structure, providing greater flexibility.

Front End: This is where insertFront adds elements and


deleteFront removes elements.

Rear End: This is where insertRear adds elements and deleteRear


removes elements.

The internal representation of a deque can be either:

1. Array-based: Uses a fixed-size array (or a dynamically-resizing array).


To handle insertions/deletions at both ends efficiently, a circular array
implementation is often preferred. This avoids shifting all elements
upon insertion/deletion at the front.

2. Linked List-based: Uses a doubly linked list. Each node stores data
and pointers to the next and previous nodes. This allows for constant-
time insertions and deletions at both ends without the size limitations
of an array.

- Diagrams/flowcharts (describe in text)

Imagine a horizontal line of boxes (representing elements in the deque).

<-- Delete Front / Insert Front                 Insert Rear / Delete Rear -->

+---+---+---+---+---+---+---+
| A | B | C | D | E |   |   |
+---+---+---+---+---+---+---+
  ^               ^
Front           Rear

Initial State: [A, B, C, D, E]

Front points to 'A'.

Rear points to 'E'.

insertFront(X) : X is added before 'A'.

+---+---+---+---+---+---+---+
| X | A | B | C | D | E |   |
+---+---+---+---+---+---+---+
  ^                   ^
Front               Rear

insertRear(Y) : Y is added after 'E'.

+---+---+---+---+---+---+---+
| X | A | B | C | D | E | Y |
+---+---+---+---+---+---+---+
  ^                       ^
Front                   Rear

deleteFront() : 'X' is removed. 'A' becomes the new front.

+---+---+---+---+---+---+---+
|   | A | B | C | D | E | Y |
+---+---+---+---+---+---+---+
      ^                   ^
    Front               Rear

deleteRear() : 'Y' is removed. 'E' becomes the new rear.

+---+---+---+---+---+---+---+
|   | A | B | C | D | E |   |
+---+---+---+---+---+---+---+
      ^               ^
    Front           Rear

- Step-by-step breakdown

The core idea is to maintain pointers/indices to both the 'front' and 'rear'
of the deque.

Linked List Implementation:

Maintain two pointers: head (or front ) and tail (or rear ).

For insertFront , create a new node, link it to the current head ,


and update head .

For insertRear , create a new node, link it to the current tail , and
update tail .

For deleteFront , remove the node pointed to by head and
update head .

For deleteRear , remove the node pointed to by tail and update


tail .

Handle null (empty list) cases carefully.

Circular Array Implementation:

Maintain two indices: front and rear .

front points to the first element.

rear points to the last element.

When inserting/deleting, these indices are


incremented/decremented modulo the array size to wrap around.

Special conditions are needed to distinguish between a full and an


empty deque (e.g., leaving one empty slot, or using a count
variable).

3. Properties & Characteristics


- Key properties

Double-Ended Access: Elements can be added or removed from both


the front and rear.

Ordered Collection: Elements are stored in a specific sequence, and


their relative order is maintained unless explicitly removed.

Dynamic Size (typically): When implemented with a linked list or a


dynamic array, a deque can grow or shrink as needed, not limited by a
fixed capacity.

Versatility: Can simulate the behavior of both a queue (FIFO) and a


stack (LIFO).
As a Stack: Use insertFront (or insertRear ) as push and
deleteFront (or deleteRear ) as pop .

As a Queue: Use insertRear as enqueue and deleteFront as


dequeue .
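
For example, Python's built-in collections.deque can play both roles directly
(a quick illustration; Example 2 later builds a stack on top of the Deque
class developed in this section):

from collections import deque

dq = deque()

# As a queue (FIFO): insertRear ~ append, deleteFront ~ popleft
dq.append(1); dq.append(2); dq.append(3)
print(dq.popleft(), dq.popleft())   # 1 2  -> first in, first out

dq.clear()

# As a stack (LIFO): insertRear ~ append, deleteRear ~ pop
dq.append(1); dq.append(2); dq.append(3)
print(dq.pop(), dq.pop())           # 3 2  -> last in, first out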

- Advantages & disadvantages

Advantages:

Flexibility: Provides more operational flexibility than a simple queue


or stack due to dual-ended access.

Efficiency: Basic operations (inserting/deleting at ends, peeking at


ends) are typically O(1) in both linked list and circular array
implementations.

Code Reusability: Can be used as a building block for other data


structures or to implement specific algorithms (e.g., sliding window).

Disadvantages:

Increased Complexity: The implementation logic (especially for


circular arrays) is more complex than a basic queue or stack due to
managing two pointers/indices and wrap-around conditions.

No Random Access: Similar to queues and stacks, direct access to an


element by its index (e.g., deque[i] ) is not an O(1) operation. It would
require traversing from an end, making it O(N) in the worst case.

Overhead: Linked list implementations incur memory overhead for


pointers in each node. Array-based implementations might waste
space if fixed-size and not fully utilized, or incur overhead for resizing
if dynamic.

4. Mathematical / Logical Representation


- Important formulas

A Deque primarily relies on logical indexing and pointer manipulation


rather than mathematical formulas in the traditional sense. The
"formulas" relate to how front and rear indices are managed in an
array-based circular implementation:

Let N be the maximum capacity of the array.

Increment rear : rear = (rear + 1) % N

Decrement front : front = (front - 1 + N) % N (Adding N before
modulo ensures a positive result in some languages if front-1 is
negative)

Increment front : front = (front + 1) % N

Decrement rear : rear = (rear - 1 + N) % N

Conditions:

Empty Deque: a count variable is 0, or the indices are at their initial
sentinel values.
If front == -1 (initial state)
If count == 0
For a circular array: front == rear together with count == 0 .

Full Deque (Array-based): count == N . Without a count variable, the
usual trick is to leave one slot permanently unused and treat rear as the
next free position; the deque is then full when (rear + 1) % N == front
and empty when front == rear .

- Derivations (if any)

No formal mathematical derivations are typically associated with the


basic concept of a Deque itself. The "derivations" are more about the
logical steps and considerations for implementing its operations
efficiently, especially handling circular indices correctly.
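
A small sketch of the wrap-around index arithmetic from the formulas above,
assuming a hypothetical capacity N = 5:

N = 5                          # hypothetical capacity

rear = 4
rear = (rear + 1) % N          # 0 -> rear wraps from the last slot to the first
print(rear)

front = 0
front = (front - 1 + N) % N    # 4 -> adding N keeps the result non-negative
print(front)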

5. Operations / Algorithms
Here are the primary operations for a Deque. We'll use a doubly linked
list based implementation for conceptual simplicity and efficiency in the
general case, then provide a Python class.

Assume a Node class with data , prev , and next attributes.

Deque() - Constructor

Problem statement: Initialize an empty deque.

Step-by-step algorithm:

1. Set front pointer to None .

2. Set rear pointer to None .

3. Set size to 0 .

Pseudocode:

METHOD Deque()
    this.front = NULL
    this.rear = NULL
    this.size = 0
END METHOD

isEmpty()

Problem statement: Check if the deque is empty.

Step-by-step algorithm:
1. Return true if front is None (or size is 0 ), false otherwise.

Pseudocode:

METHOD isEmpty()
    RETURN this.front == NULL
END METHOD

insertFront(value)

Problem statement: Add an element to the front of the deque.

Step-by-step algorithm:
1. Create a new Node with the given value .

2. If the deque is empty: a. Set front and rear to the new Node .

3. Else (deque is not empty): a. Set newNode.next to the current
front . b. Set front.prev to newNode . c. Update front to
newNode .

4. Increment size .

Dry run example:

Deque: [] (empty)

insertFront(10) :
New node (10) created.

Deque is empty, so front = (10) , rear = (10) .

Deque: [10]

insertFront(20) :
New node (20) created.

newNode.next = Node(10)

Node(10).prev = newNode

front = newNode (20)

Deque: [20, 10]

Pseudocode:

METHOD insertFront(value)
    newNode = new Node(value)
    IF this.isEmpty() THEN
        this.front = newNode
        this.rear = newNode
    ELSE
        newNode.next = this.front
        this.front.prev = newNode
        this.front = newNode
    END IF
    this.size = this.size + 1
END METHOD

insertRear(value)

Problem statement: Add an element to the rear of the deque.

Step-by-step algorithm:
1. Create a new Node with the given value .

2. If the deque is empty: a. Set front and rear to the new Node .

3. Else (deque is not empty): a. Set newNode.prev to the current
rear . b. Set rear.next to newNode . c. Update rear to
newNode .

4. Increment size .

Pseudocode:

METHOD insertRear(value)
    newNode = new Node(value)
    IF this.isEmpty() THEN
        this.front = newNode
        this.rear = newNode
    ELSE
        newNode.prev = this.rear
        this.rear.next = newNode
        this.rear = newNode
    END IF
    this.size = this.size + 1
END METHOD

deleteFront()

Problem statement: Remove and return the element from the front
of the deque.

Step-by-step algorithm:
1. If the deque is empty, raise an error or return a special value (e.g.,
None ).

2. Store the data of front node to return later.

3. If front and rear are the same (only one element): a. Set front
and rear to None .

4. Else: a. Update front to front.next . b. Set the new front.prev to None .

5. Decrement size .

6. Return the stored data .

Pseudocode:

METHOD deleteFront()
    IF this.isEmpty() THEN
        PRINT "Deque is empty, cannot delete from front."
        RETURN NULL // Or raise an error
    END IF

    data = this.front.data
    IF this.front == this.rear THEN // Only one element
        this.front = NULL
        this.rear = NULL
    ELSE
        this.front = this.front.next
        this.front.prev = NULL
    END IF
    this.size = this.size - 1
    RETURN data
END METHOD

deleteRear()

Problem statement: Remove and return the element from the rear of
the deque.

Step-by-step algorithm:
1. If the deque is empty, raise an error or return a special value (e.g.,
None ).

2. Store the data of rear node to return later.

3. If front and rear are the same (only one element): a. Set front
and rear to None .

4. Else: a. Update rear to rear.prev . b. Set the new rear.next to None .

5. Decrement size .

6. Return the stored data .

Pseudocode:

METHOD deleteRear()
    IF this.isEmpty() THEN
        PRINT "Deque is empty, cannot delete from rear."
        RETURN NULL // Or raise an error
    END IF

    data = this.rear.data
    IF this.front == this.rear THEN // Only one element
        this.front = NULL
        this.rear = NULL
    ELSE
        this.rear = this.rear.prev
        this.rear.next = NULL
    END IF
    this.size = this.size - 1
    RETURN data
END METHOD

getFront()

Problem statement: Return the element at the front of the deque


without removing it.

Step-by-step algorithm:
1. If the deque is empty, raise an error or return None .

2. Return front.data .

Pseudocode:

METHOD getFront()
    IF this.isEmpty() THEN
        PRINT "Deque is empty."
        RETURN NULL
    END IF
    RETURN this.front.data
END METHOD

getRear()

Problem statement: Return the element at the rear of the deque


without removing it.

Step-by-step algorithm:
1. If the deque is empty, raise an error or return None .

2. Return rear.data .

Pseudocode:

METHOD getRear()
    IF this.isEmpty() THEN
        PRINT "Deque is empty."
        RETURN NULL
    END IF
    RETURN this.rear.data
END METHOD

Python Code Example (Doubly Linked List Implementation)

class Node:
    def __init__(self, data):
        self.data = data
        self.next = None
        self.prev = None

class Deque:
    def __init__(self):
        self.front = None
        self.rear = None
        self.size = 0

    def isEmpty(self):
        return self.front is None

    def insertFront(self, value):
        new_node = Node(value)
        if self.isEmpty():
            self.front = new_node
            self.rear = new_node
        else:
            new_node.next = self.front
            self.front.prev = new_node
            self.front = new_node
        self.size += 1
        print(f"Inserted {value} at front. Deque: {self._to_list()}")

    def insertRear(self, value):
        new_node = Node(value)
        if self.isEmpty():
            self.front = new_node
            self.rear = new_node
        else:
            new_node.prev = self.rear
            self.rear.next = new_node
            self.rear = new_node
        self.size += 1
        print(f"Inserted {value} at rear. Deque: {self._to_list()}")

    def deleteFront(self):
        if self.isEmpty():
            print("Deque is empty, cannot delete from front.")
            return None

        deleted_data = self.front.data
        if self.front == self.rear:  # Only one element
            self.front = None
            self.rear = None
        else:
            self.front = self.front.next
            self.front.prev = None
        self.size -= 1
        print(f"Deleted {deleted_data} from front. Deque: {self._to_list()}")
        return deleted_data

    def deleteRear(self):
        if self.isEmpty():
            print("Deque is empty, cannot delete from rear.")
            return None

        deleted_data = self.rear.data
        if self.front == self.rear:  # Only one element
            self.front = None
            self.rear = None
        else:
            self.rear = self.rear.prev
            self.rear.next = None
        self.size -= 1
        print(f"Deleted {deleted_data} from rear. Deque: {self._to_list()}")
        return deleted_data

    def getFront(self):
        if self.isEmpty():
            print("Deque is empty.")
            return None
        return self.front.data

    def getRear(self):
        if self.isEmpty():
            print("Deque is empty.")
            return None
        return self.rear.data

    def getSize(self):
        return self.size

    # Helper for printing deque content
    def _to_list(self):
        elements = []
        current = self.front
        while current:
            elements.append(current.data)
            current = current.next
        return elements

# Example Usage:
my_deque = Deque()
my_deque.insertRear(5) # Inserted 5 at rear. Deque: [5]
my_deque.insertFront(10) # Inserted 10 at front. Deque: [10, 5]
my_deque.insertRear(15) # Inserted 15 at rear. Deque: [10, 5, 15]
my_deque.insertFront(20) # Inserted 20 at front. Deque: [20, 10, 5, 15]

print(f"Front element: {my_deque.getFront()}") # Front element: 20


print(f"Rear element: {my_deque.getRear()}") # Rear element: 15
print(f"Deque size: {my_deque.getSize()}") # Deque size: 4

my_deque.deleteFront() # Deleted 20 from front. Deque: [10, 5, 15]


my_deque.deleteRear() # Deleted 15 from rear. Deque: [10, 5]

print(f"Front element after deletions: {my_deque.getFront()}") # Front ele


print(f"Rear element after deletions: {my_deque.getRear()}") # Rear elem

my_deque.deleteFront() # Deleted 10 from front. Deque: [5]


my_deque.deleteRear() # Deleted 5 from rear. Deque: []
my_deque.deleteRear() # Deque is empty, cannot delete from rear.

6. Complexity Analysis
The complexity analysis for Deque operations depends on its underlying
implementation (doubly linked list vs. circular array). In general, both
provide efficient operations at the ends.

- Time complexity (Best, Worst, Average)

For both doubly linked list and circular array implementations:

insertFront(value) : O(1)
Best, Worst, Average: Creating a new node/assigning a value and
updating pointers/indices takes constant time.

insertRear(value) : O(1)
Best, Worst, Average: Same as insertFront .

deleteFront() : O(1)
Best, Worst, Average: Updating pointers/indices takes constant
time.

deleteRear() : O(1)
Best, Worst, Average: Same as deleteFront .

getFront() : O(1)
Best, Worst, Average: Accessing the front pointer/index is
constant time.

getRear() : O(1)
Best, Worst, Average: Accessing the rear pointer/index is constant
time.

isEmpty() : O(1)
Best, Worst, Average: Checking a pointer/index or a size variable is
constant time.

isFull() (for array-based): O(1)


Best, Worst, Average: Checking front , rear indices or count is
constant time.

Note: Operations like searching for an element or accessing an element


by index would be O(N) because they require traversal from one of the
ends.

- Space complexity

O(N): Where N is the number of elements currently stored in the


deque.

Linked List: Each node stores the data and two pointers ( prev ,
next ). So, it's roughly N * (size_of_data + 2 * size_of_pointer) .

Array: Stores N elements in an array. If it's a fixed-size array, the


space is capacity * size_of_data , regardless of how many
elements are present. If it's a dynamic array, it's N * size_of_data
(amortized, with occasional resizing overhead).

- Comparison with alternatives

Queue (FIFO): Deque is a superset of a queue. A queue only allows


enqueue (insertRear) and dequeue (deleteFront). Deque provides
insertFront and deleteRear in addition, offering more flexibility.
Time complexity for common operations is similar (O(1)).

Stack (LIFO): Deque is also a superset of a stack. A stack only allows


push (insertFront or insertRear) and pop (deleteFront or
deleteRear, depending on which end push was). Deque offers
operations on both ends. Time complexity for common operations is
similar (O(1)).

Array/List: While basic arrays or lists can simulate deque behavior,


doing insertFront or deleteFront on a simple array (not circular)
would involve shifting all subsequent elements, leading to O(N) time
complexity. Deque implementations ensure O(1) for these operations.
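
A quick, hypothetical micro-benchmark of this difference using Python's list
and built-in collections.deque (absolute timings depend on the machine; the
point is that list.insert(0, x) shifts every element while appendleft does not):

import timeit

n = 50_000
list_time = timeit.timeit("lst.insert(0, 1)", setup="lst = []", number=n)
deque_time = timeit.timeit(
    "dq.appendleft(1)",
    setup="from collections import deque; dq = deque()",
    number=n,
)
# Prepending to a list is O(N) per call; appendleft on a deque is O(1)
print(f"list.insert(0, x): {list_time:.3f}s   deque.appendleft(x): {deque_time:.3f}s")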

7. Examples
Example 1: Palindrome Checker

Problem: Check if a given string is a palindrome using a Deque. A


palindrome is a word, phrase, number, or other sequence of characters
which reads the same backward as forward.

Solution:

1. Create an empty deque.

2. Add each character of the string to the rear of the deque.



3. While the deque has more than one character: a. Remove a character
from the front. b. Remove a character from the rear. c. If the
characters do not match, the string is not a palindrome.

4. If the loop completes, the string is a palindrome.

Dry Run Example: madam

Initial string: madam

Deque d = Deque()

insertRear('m') -> d: [m]

insertRear('a') -> d: [m, a]

insertRear('d') -> d: [m, a, d]

insertRear('a') -> d: [m, a, d, a]

insertRear('m') -> d: [m, a, d, a, m]

Iteration 1:

getFront() -> 'm'

getRear() -> 'm'

deleteFront() -> 'm', d: [a, d, a, m]

deleteRear() -> 'm', d: [a, d, a]

'm' == 'm' -> continue.

Iteration 2:

getFront() -> 'a'

getRear() -> 'a'

deleteFront() -> 'a', d: [d, a]

deleteRear() -> 'a', d: [d]

'a' == 'a' -> continue.

Iteration 3:



Deque size is 1 ( [d] ). Loop terminates.

Result: The string "madam" is a palindrome.

# Assuming the Deque class from above is available

def is_palindrome(s):
    char_deque = Deque()

    # Add characters to the rear of the deque
    for char in s:
        char_deque.insertRear(char.lower())  # Convert to lower case for case-insensitivity

    is_pal = True
    while char_deque.getSize() > 1:
        first_char = char_deque.deleteFront()
        last_char = char_deque.deleteRear()
        if first_char != last_char:
            is_pal = False
            break

    # To restore the deque for re-use or other checks, you would need to rebuild it.
    # For this example, we just return the result.
    return is_pal

# Test cases
print(f"'madam' is a palindrome: {is_palindrome('madam')}") # True
print(f"'racecar' is a palindrome: {is_palindrome('racecar')}") # True
print(f"'hello' is a palindrome: {is_palindrome('hello')}") # False
print(f"'Aba' is a palindrome: {is_palindrome('Aba')}") # True (due to .lowe

Example 2: Implementing a Stack using a Deque

Problem: Show how to use a Deque to implement a Stack (LIFO).

Solution: A stack can be implemented using a deque by consistently


using one end for both insertions (push) and deletions (pop).



push(element) : Use insertFront(element) (or insertRear(element) )

pop() : Use deleteFront() (if push was insertFront ) or deleteRear()
(if push was insertRear )

peek() : Use getFront() (if push was insertFront ) or getRear() (if
push was insertRear )

# Assuming the Deque class from above is available

class StackFromDeque:
    def __init__(self):
        self.dq = Deque()  # underlying deque used as storage

    def push(self, value):
        # Push elements to the front (acting like the top of the stack)
        self.dq.insertFront(value)

    def pop(self):
        # Pop elements from the front (acting like the top of the stack)
        return self.dq.deleteFront()

    def peek(self):
        # Peek at the front element
        return self.dq.getFront()

    def isEmpty(self):
        return self.dq.isEmpty()

    def size(self):
        return self.dq.getSize()

# Test cases
my_stack = StackFromDeque()
my_stack.push(10) # Inserted 10 at front. Deque: [10]
my_stack.push(20) # Inserted 20 at front. Deque: [20, 10]
my_stack.push(30) # Inserted 30 at front. Deque: [30, 20, 10]

print(f"Stack top: {my_stack.peek()}") # Stack top: 30
print(f"Stack size: {my_stack.size()}") # Stack size: 3

print(f"Popped: {my_stack.pop()}") # Deleted 30 from front. Deque: [20, 10]

print(f"Stack top: {my_stack.peek()}") # Stack top: 20
print(f"Popped: {my_stack.pop()}") # Deleted 20 from front. Deque: [10]
print(f"Popped: {my_stack.pop()}") # Deleted 10 from front. Deque: []
print(f"Is stack empty? {my_stack.isEmpty()}") # Is stack empty? True
print(f"Popped from empty stack: {my_stack.pop()}") # Deque is empty, cannot delete from front.

8. Common Mistakes & Edge Cases


- Where students go wrong

Off-by-one errors in circular array implementation: Incorrectly
calculating front and rear indices when wrapping around, especially
when dealing with (index - 1 + N) % N (see the index-arithmetic
sketch after this list).

Confusing front and rear : Forgetting which end is which for specific
operations (e.g., calling insertRear when insertFront was intended).

Not handling empty/full conditions: Attempting to deleteFront or
deleteRear on an empty deque without checking isEmpty() , or
insertFront / insertRear on a full array-based deque without checking
isFull() .

Incorrect single-element logic: When only one element is present
( front == rear ), deleting it should set both front and rear to
None . Failing to do so can lead to front or rear pointing to a
deleted node.

Memory leaks (in C++/Java with manual memory management): Forgetting
to deallocate nodes when using a linked list implementation, leading
to unused memory. Python's garbage collection handles this
automatically.
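
A minimal sketch of the wrap-around index arithmetic mentioned above (the
capacity N and the helper names are illustrative): with a fixed-size array,
moving an index left or right must use modulo arithmetic so it wraps around
instead of running out of bounds.

N = 8                          # capacity of the underlying array

def move_right(index):
    # Used when advancing rear on insertRear or front on deleteFront
    return (index + 1) % N

def move_left(index):
    # Used when moving front back on insertFront or rear back on deleteRear
    return (index - 1 + N) % N   # '+ N' keeps the result non-negative

print(move_right(7))   # 0 -> wraps from the last slot back to the first
print(move_left(0))    # 7 -> wraps from the first slot back to the last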

- Special cases



Empty Deque:
deleteFront() or deleteRear() on an empty deque should raise
an error or return None .

getFront() or getRear() on an empty deque should raise an error


or return None .

insertFront() or insertRear() into an empty deque should


correctly set both front and rear pointers to the new element.

Single Element Deque:


After an insertion into an empty deque, front and rear should
point to the same element.

Deleting the single element (from either end) should result in an


empty deque, with front and rear becoming None .

Full Deque (Array-based):


insertFront() or insertRear() on a full deque should raise an
error (Queue Overflow).

Correctly distinguishing between an empty and a full deque when


front and rear might be at the same logical position (e.g., using
a count variable or leaving one slot empty).

9. Real-Life Applications
Web Browser History: Browsers typically use a deque-like structure
to store the history of visited web pages. You can navigate back
(delete from front/get previous) and forward (delete from rear/get
next) through your browsing history.

Undo/Redo Functionality: In many applications (text editors, drawing


tools), changes are recorded in a deque. Undo operations retrieve
previous states (from the front), and Redo operations reapply
undone changes (from the rear).

Process Scheduling: In operating systems, deques can be used to


manage processes or tasks. High-priority tasks might be inserted at
the front, while regular tasks are added to the rear. Processes can be
picked from either end depending on the scheduling algorithm.



Graph Algorithms: Breadth-First Search (BFS) for finding shortest
paths often uses a queue. However, algorithms like the "0-1 BFS"
(where edge weights are 0 or 1) can use a deque: edges with weight 0
are pushed to the front, and edges with weight 1 are pushed to the
rear, ensuring that 0-weight paths are explored first (see the sketch
at the end of this list).

Implementing std::deque in C++ or collections.deque in Python:
These are highly optimized built-in deque implementations used
extensively in system-level programming and libraries where efficient
insertions/deletions at both ends are crucial.

Work-Stealing Scheduling: In parallel programming, work-stealing
schedulers often use deques. Each processor has its own deque of
tasks. When a processor runs out of its own tasks, it "steals" tasks
from the deque of another processor, usually from the end opposite to
the one the owner works on, to reduce contention.
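
A minimal sketch of the 0-1 BFS idea from the Graph Algorithms point above,
assuming the graph is given as an adjacency list of (neighbor, weight) pairs
with weights restricted to 0 or 1: neighbors reached over a 0-weight edge go
to the front of the deque, 1-weight edges go to the rear, so nodes come out
in non-decreasing distance order.

from collections import deque

def zero_one_bfs(graph, source):
    # graph: {node: [(neighbor, weight), ...]} with weight 0 or 1
    dist = {node: float('inf') for node in graph}
    dist[source] = 0
    dq = deque([source])

    while dq:
        u = dq.popleft()
        for v, w in graph[u]:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                if w == 0:
                    dq.appendleft(v)   # 0-weight edge: explore as early as possible
                else:
                    dq.append(v)       # 1-weight edge: explore later
    return dist

g = {'A': [('B', 0), ('C', 1)], 'B': [('C', 0)], 'C': []}
print(zero_one_bfs(g, 'A'))   # {'A': 0, 'B': 0, 'C': 0}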

10. Practice Problems


- Easy

1. Basic Deque Implementation: Implement a Deque using Python's
built-in list as the underlying storage, without using
collections.deque . Pay attention to efficiency for insertFront (which
might be list.insert(0, item) and thus O(N)).

2. Deque to Stack/Queue: Write functions deque_to_stack(deque) and


deque_to_queue(deque) that demonstrate how to use a deque to
simulate basic stack (push/pop) and queue (enqueue/dequeue)
operations.

- Medium

1. Palindrome Checker (String): Given a string, determine if it is a


palindrome using a deque. Handle spaces, punctuation, and case-
insensitivity (e.g., "Madam, I'm Adam" is a palindrome).

2. Sliding Window Maximum (Conceptual): Given an array nums and
a window size k , find the maximum element in each sliding window.
(This problem is often solved using a deque to store indices of
potentially maximum elements in decreasing order.) You don't need to
write a full solution, but outline the algorithm using a deque.

- Hard

1. Sliding Window Maximum (Implementation): Implement the Sliding


Window Maximum problem efficiently using a deque.
Example: nums = [1,3,-1,-3,5,3,6,7] , k = 3

Output: [3,3,5,5,6,7]

2. LRU Cache (Least Recently Used Cache): Design and implement an


LRU cache. The cache should support get and put operations.
When the cache reaches its capacity, it should invalidate the least
recently used item. A deque (or a doubly linked list) combined with a
hash map is a common and efficient way to implement this.

11. Summary / Key Points


- Quick revision of points covered

Definition: A Deque (Double-Ended Queue) is a linear data structure


that allows insertion and deletion of elements from both its front and
rear ends.

Flexibility: It's a generalization of both queues (FIFO) and stacks


(LIFO), offering more versatile operations.

Implementation: Can be implemented using a doubly linked list


(most common for dynamic size and O(1) operations without resizing)
or a circular array (for fixed-size memory efficiency).

Core Operations: insertFront() , insertRear() , deleteFront() ,


deleteRear() , getFront() , getRear() , isEmpty() , and (for array)
isFull() .

Efficiency: All core operations (insertions, deletions, peeking at ends)


typically have a time complexity of O(1).

Space Complexity: O(N) for N elements stored.



Applications: Web browser history, undo/redo features, job
scheduling, palindrome checking, sliding window problems,
implementing other data structures.

Common Pitfalls: Off-by-one errors in array implementation,


neglecting empty/single-element/full conditions, and managing
pointers correctly in linked lists.

- Highlighted concepts and takeaways

"Two-way street" for data: Think of it as a flexible container where


items can flow in and out from either end.

O(1) at the ends: The primary advantage of a deque is its constant-


time performance for operations at both ends, making it highly
efficient for scenarios requiring this specific pattern of access.

Building Block: Deques are powerful enough to simulate both stacks


and queues, making them a fundamental data structure in many
algorithms and system designs.

Applications of Queues
Exploring common uses of queues in computer science, such as breadth-first
search (BFS), CPU scheduling, printer queues, and buffer management.

1. Introduction
Definition: Queues are linear data structures that follow the First-In,
First-Out (FIFO) principle. This means the element that was added first
will be the first one to be removed. Think of it like a line of people
waiting for a service.

Why it is needed: Queues are essential for managing sequential


processing where the order of arrival dictates the order of service.
They ensure fairness and proper sequencing of tasks, requests, or
data.



Real-life Analogy: Imagine a ticket counter at a movie theater.
People arrive and form a line. The person who arrived first (front of
the line) gets their ticket first, and then leaves. New people join the
back of the line. This orderly process is exactly what a queue
facilitates in computing.

Applications: Queues are widely used in:


Operating Systems (CPU scheduling, I/O buffers)

Networking (packet buffering, routing)

Graph Algorithms (Breadth-First Search - BFS)

Simulations

Spooling (printer, mail)

Call center systems

Task management and message queuing in distributed systems

2. Theory Explanation
Queues are fundamental in scenarios requiring ordered processing based
on arrival time. Their utility stems from strictly enforcing the FIFO policy,
which is crucial for managing shared resources, processing asynchronous
tasks, and exploring connected structures systematically.

In-depth explanation: When a system has more tasks or data than it


can process simultaneously, a queue acts as a buffer. New tasks are
enqueued (added to the rear), and the processing unit dequeues
(removes from the front) tasks one by one. This ensures that no task is
unfairly delayed while newer tasks are processed, maintaining fairness
and preventing starvation (where some tasks might never get
processed). In graph traversal, for instance, a queue helps explore
nodes level by level, ensuring that all nodes at a particular "distance"
from the start are visited before moving to the next level. This
systematic exploration is a direct consequence of the FIFO property.

Diagrams/Flowcharts (described in text): Imagine a simple
processing system:


1. Incoming Tasks: Tasks (A, B, C, D) arrive in that order.

2. Queue: Tasks are added to the rear of the queue.


Initially empty.

A arrives: [A]

B arrives: [A, B]

C arrives: [A, B, C]

D arrives: [A, B, C, D]

3. Processor: The processor takes tasks from the front of the


queue.
Processor takes A. Queue: [B, C, D]

Processor takes B. Queue: [C, D]

Processor takes C. Queue: [D]

Processor takes D. Queue: []

This flow ensures A is processed before B, B before C, and so on,
preserving the order of arrival.

Step-by-step breakdown:

1. Arrival: A new request, task, or data item becomes available.

2. Enqueuing: The item is added to the rear of the queue. If the


queue has a capacity limit, it might wait or be rejected.

3. Waiting: The item remains in the queue, waiting for its turn,
respecting the FIFO order.

4. Dequeuing: When the processing resource becomes available, the


item at the front of the queue is removed and sent for processing.

5. Processing: The item is handled by the system.

6. Repeat: Steps 1-5 continue as long as items arrive and processing


resources are available.
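
A minimal sketch of this arrival/enqueue/dequeue/process cycle using Python's
collections.deque (the task names are illustrative):

from collections import deque

task_queue = deque()

# Arrival + enqueuing: tasks join the rear in the order they arrive
for task in ["A", "B", "C", "D"]:
    task_queue.append(task)
    print(f"Enqueued {task}, queue is now {list(task_queue)}")

# Dequeuing + processing: the processor always takes the front task
while task_queue:
    current = task_queue.popleft()
    print(f"Processing {current}, remaining queue {list(task_queue)}")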

3. Properties & Characteristics


Key properties:



First-In, First-Out (FIFO): The most defining characteristic.
Elements are processed in the exact order they were added.

Linear Structure: Elements are arranged sequentially.

Dynamic Size (typically): Can grow or shrink as elements are


added or removed (though fixed-size arrays can also implement
queues).

Two Pointers (Front and Rear): Managed by two pointers, front


(or head ) for dequeuing and rear (or tail ) for enqueuing.

No Random Access: Elements can only be accessed from the


front.

Advantages & disadvantages:

Advantages:
Order Preservation: Guarantees that items are processed in
their arrival order, crucial for many real-world scenarios.

Resource Management: Helps manage shared resources


efficiently by providing a fair waiting mechanism.

Decoupling: Allows different parts of a system (producers and


consumers) to operate at different speeds without
overwhelming each other.

Simplicity: Conceptually straightforward and easy to


implement.

Disadvantages:
Limited Access: Only the front element is accessible for
processing.

Potential for Bottlenecks: If the processing speed is


consistently slower than the arrival rate, the queue can grow
indefinitely, consuming memory and increasing waiting times.

No Priority Handling (by default): A basic queue doesn't


inherently support processing higher-priority items first.
Variations like Priority Queues are needed for such cases.



4. Mathematical / Logical Representation
For applications of queues, the "mathematical/logical representation"
often pertains to queueing theory, which analyzes the flow of items
through a waiting line. While full derivations are complex, we can
highlight key logical concepts and simple metrics.

Important concepts:

Arrival Rate ($\lambda$): The average number of items arriving


per unit of time.

Service Rate ($\mu$): The average number of items a server can


process per unit of time.

Utilization ($\rho$): The fraction of time the server is busy,
typically $\rho = \lambda / \mu$. For a stable queue, $\lambda < \mu$.

Queue Length (Lq): Average number of items waiting in the


queue.

System Length (Ls): Average number of items in the system


(waiting + being served).

Waiting Time (Wq): Average time an item spends waiting in the


queue.

System Time (Ws): Average time an item spends in the system
(waiting + service). (A small numeric sketch of these metrics appears
at the end of this section.)

Logical Flow: Items arrive $\xrightarrow{\text{enqueue}}$ Queue
$\xrightarrow{\text{dequeue}}$ Server $\xrightarrow{\text{processed}}$ Exit.

The logical representation emphasizes:

1. FIFO Discipline: The next item to be served is always the one that
has been waiting the longest.

2. State Management: The queue maintains the current state of


pending tasks/items, allowing an application to know what needs
to be done next.
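
As a numeric illustration of the metrics above, the sketch below uses the
standard single-server M/M/1 formulas from queueing theory (these assume
Poisson arrivals and exponential service times, and are not derived in these
notes):

# M/M/1 queue metrics (single server; lambda < mu is required for stability)
lam = 8.0    # arrival rate: 8 items per minute
mu = 10.0    # service rate: 10 items per minute

rho = lam / mu               # utilization (0.8)
Ls = rho / (1 - rho)         # average number in the system (about 4)
Lq = rho ** 2 / (1 - rho)    # average number waiting (about 3.2)
Ws = Ls / lam                # average time in the system, by Little's Law (about 0.5 min)
Wq = Lq / lam                # average waiting time (about 0.4 min)

print(f"rho={rho}, Ls={Ls}, Lq={Lq}, Ws={Ws}, Wq={Wq}")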



5. Operations / Algorithms
This section demonstrates how queues are used within algorithms, rather
than detailing the basic enqueue / dequeue operations themselves.
We'll use Breadth-First Search (BFS) as a prime example.

Algorithm Example: Breadth-First Search (BFS)

Problem statement: Given a graph (or a tree) and a starting node,


traverse the graph level by level, visiting all reachable nodes in
increasing order of their distance from the starting node. This is used
to find the shortest path in an unweighted graph, explore all
connected components, etc.

Step-by-step algorithm:

1. Create an empty queue and add the starting node to it.

2. Mark the starting node as visited.

3. While the queue is not empty: a. Dequeue a node u from the


front of the queue. b. Process node u (e.g., print it, check for a
target). c. For each unvisited neighbor v of node u : i. Mark v as
visited. ii. Enqueue v into the queue.

Dry run example: Consider a graph: A -- B, A -- C, B -- D, C -- E. Start


node: A.

1. Queue: [] , Visited: {}

2. Enqueue A. Queue: [A] , Visited: {A}

3. Loop 1: Queue not empty. a. Dequeue A. Current node u = A.


Queue: [] b. Process A (e.g., print "A"). c. Neighbors of A: B, C. i. B
is unvisited. Mark B visited. Enqueue B. Queue: [B] , Visited: {A,
B} ii. C is unvisited. Mark C visited. Enqueue C. Queue: [B, C] ,
Visited: {A, B, C}

4. Loop 2: Queue not empty. a. Dequeue B. Current node u = B.


Queue: [C] b. Process B (e.g., print "B"). c. Neighbors of B: A, D. i. A



is visited. Skip. ii. D is unvisited. Mark D visited. Enqueue D. Queue:
[C, D] , Visited: {A, B, C, D}

5. Loop 3: Queue not empty. a. Dequeue C. Current node u = C.


Queue: [D] b. Process C (e.g., print "C"). c. Neighbors of C: A, E. i. A
is visited. Skip. ii. E is unvisited. Mark E visited. Enqueue E. Queue:
[D, E] , Visited: {A, B, C, D, E}

6. Loop 4: Queue not empty. a. Dequeue D. Current node u = D.


Queue: [E] b. Process D (e.g., print "D"). c. Neighbors of D: B. i. B is
visited. Skip.

7. Loop 5: Queue not empty. a. Dequeue E. Current node u = E.


Queue: [] b. Process E (e.g., print "E"). c. Neighbors of E: C. i. C is
visited. Skip.

8. Loop 6: Queue is empty. Terminate.

Output (example): A B C D E. This shows a level-by-level traversal.

Pseudocode:

BFS(graph, start_node):
    create an empty queue Q
    create a set 'visited' to store visited nodes

    enqueue start_node into Q
    add start_node to 'visited'

    while Q is not empty:
        u = dequeue from Q
        process u   // e.g., print u

        for each neighbor v of u:
            if v is not in 'visited':
                add v to 'visited'
                enqueue v into Q

Code (Python):



from collections import deque

def bfs(graph, start_node):
    visited = set()
    queue = deque()  # Python's deque is optimized for queue operations

    queue.append(start_node)
    visited.add(start_node)

    while queue:
        current_node = queue.popleft()  # Dequeue from front
        print(current_node, end=" ")

        for neighbor in graph[current_node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)  # Enqueue to rear

# Example graph represented as an adjacency list
graph_example = {
    'A': ['B', 'C'],
    'B': ['A', 'D'],
    'C': ['A', 'E'],
    'D': ['B'],
    'E': ['C']
}

print("BFS Traversal starting from A:")
bfs(graph_example, 'A')  # Output: A B C D E

6. Complexity Analysis
When analyzing the complexity of algorithms that use queues, we focus
on the overall algorithm, taking into account that the individual queue
operations (enqueue/dequeue) typically take O(1) time.

Time complexity (for BFS):

O(V + E), where V is the number of vertices (nodes) and E is the


number of edges in the graph.
Each vertex is enqueued and dequeued exactly once (O(V)).

Each edge is examined exactly once (in a directed graph) or


twice (in an undirected graph, from both directions) when
iterating through neighbors (O(E)).

Best Case: O(V + E)

Worst Case: O(V + E) (since it visits all reachable nodes and edges)

Average Case: O(V + E)

Space complexity (for BFS):

O(V), primarily for storing the visited set and the queue itself.
In the worst case (e.g., a "star" graph where one node is
connected to all others, or a dense graph), the queue might
hold up to O(V) nodes at a time (e.g., all neighbors of a central
node).

The visited set also stores up to O(V) nodes.

Comparison with alternatives (e.g., DFS for graph traversal):

BFS vs. DFS (Depth-First Search):


BFS: Uses a Queue. Explores level by level. Good for finding
shortest paths in unweighted graphs and exploring all
reachable nodes. Space complexity can be higher for wide
graphs.

DFS: Uses a Stack (or recursion, which uses the call stack).
Explores as deeply as possible along each branch before
backtracking. Good for topological sorting, finding connected
components, detecting cycles. Space complexity can be higher
for deep graphs (due to recursion depth).



Queue's Role: The queue in BFS enforces the "breadth-first"
exploration, ensuring that nodes closer to the source are
processed before nodes further away. A stack (LIFO) would lead
to a "depth-first" exploration.

7. Examples
Here we look at a solved problem that leverages a queue.

Solved Problem: Level Order Traversal of a Binary Tree

Problem: Given the root of a binary tree, return the level order
traversal of its nodes' values. (i.e., from left to right, level by level).

Solution Approach: This is a classic application of BFS. We can use a


queue to keep track of nodes to visit at the current level and then
process their children for the next level.

Steps:

1. If the root is null, return an empty list.

2. Initialize an empty list result to store the level order traversal.

3. Create an empty queue and add the root node to it.

4. While the queue is not empty: a. Get the number of nodes
currently in the queue (this represents the count of nodes at the
current level). Let this be level_size . b. Initialize an empty list
current_level_nodes to store values of nodes at the current level.
c. Loop level_size times: i. Dequeue a node temp from the front
of the queue. ii. Add temp.val to current_level_nodes . iii. If
temp.left is not null, enqueue temp.left . iv. If temp.right is not
null, enqueue temp.right . d. Add current_level_nodes to
result .

5. Return result .

Dry Run Example: Tree:

      3
     / \
    9   20
        / \
      15   7


1. result = [] , queue = []

2. Enqueue 3 . queue = [Node(3)]

3. Loop 1: queue not empty. a. level_size = len(queue) = 1 . b.


current_level_nodes = [] . c. i. Dequeue Node(3) .
current_level_nodes.append(3) . ii. Enqueue Node(9) , Node(20) .
queue = [Node(9), Node(20)] . d. result = [[3]] .

4. Loop 2: queue not empty. a. level_size = len(queue) = 2 . b.


current_level_nodes = [] . c. i. Dequeue Node(9) .
current_level_nodes.append(9) . Node(9) has no children. ii.
Dequeue Node(20) . current_level_nodes.append(20) . iii.
Enqueue Node(15) , Node(7) . queue = [Node(15), Node(7)] . d.
result = [[3], [9, 20]] .

5. Loop 3: queue not empty. a. level_size = len(queue) = 2 . b.


current_level_nodes = [] . c. i. Dequeue Node(15) .
current_level_nodes.append(15) . Node(15) has no children. ii.
Dequeue Node(7) . current_level_nodes.append(7) . Node(7)
has no children. d. result = [[3], [9, 20], [15, 7]] .



6. Loop 4: queue is empty. Terminate.

Final result : [[3], [9, 20], [15, 7]] .
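
A minimal Python sketch of this level order traversal, assuming a simple
TreeNode class with val, left, and right fields (matching the names used in
the steps above):

from collections import deque

class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def level_order(root):
    if root is None:
        return []
    result = []
    queue = deque([root])
    while queue:
        level_size = len(queue)            # number of nodes on the current level
        current_level_nodes = []
        for _ in range(level_size):
            temp = queue.popleft()         # dequeue from the front
            current_level_nodes.append(temp.val)
            if temp.left:
                queue.append(temp.left)    # enqueue children for the next level
            if temp.right:
                queue.append(temp.right)
        result.append(current_level_nodes)
    return result

# The tree from the dry run: 3 with children 9 and 20; 20 has children 15 and 7
root = TreeNode(3, TreeNode(9), TreeNode(20, TreeNode(15), TreeNode(7)))
print(level_order(root))   # [[3], [9, 20], [15, 7]]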

8. Common Mistakes & Edge Cases


Where students go wrong:

Confusing with Stacks: Accidentally applying LIFO logic (using


push / pop from the same end) instead of FIFO for queue-based
problems.

Infinite Loops in BFS: Not marking nodes as visited before


enqueuing them, especially in cyclic graphs. This can lead to re-
adding the same node repeatedly and infinite loops.

Off-by-one errors with array-based queues: Mismanaging front


and rear pointers, leading to incorrect isEmpty , isFull , or data
corruption.

Forgetting to handle empty queue: Attempting to dequeue


from an empty queue without checking, resulting in errors.

Incorrectly handling specific application requirements: E.g., for


CPU scheduling, forgetting to re-enqueue a partially executed
process in Round Robin.

Special cases:

Empty Queue: An application must gracefully handle scenarios


where the queue is empty (e.g., no tasks to process, no more
nodes to visit).

Full Queue (Bounded Queues): If an application uses a fixed-size


queue, it must handle cases where the queue is full (e.g., rejecting
new requests, blocking the producer).

Single Element: Correctly enqueue ing and dequeue ing a single


element.

Circular Queues: Used to efficiently utilize space in array-based
implementations by wrapping around front and rear pointers.
Applications need to correctly implement modulo arithmetic for
pointer movement.

Priority Queues: While not a simple queue, many "queueing"


applications require items to be processed based on priority, not
just arrival time. This calls for a Priority Queue (often implemented
with a heap) instead of a standard queue.

9. Real-Life Applications
Queues are ubiquitous in modern computing systems:

Operating Systems:
CPU Scheduling: Processes waiting for the CPU are often placed in
a queue (e.g., First-Come, First-Served scheduling, Round Robin
scheduling; see the Round Robin sketch at the end of this section).

I/O Buffers: Data waiting to be written to or read from a disk or


network device is buffered in queues.

Interrupt Handling: Interrupts are often queued and processed in


the order they arrive.

Computer Networks:
Packet Buffering: Routers and switches use queues to buffer
network packets when network traffic temporarily exceeds
capacity. Packets are then forwarded in FIFO order.

Traffic Management: Managing network traffic flow to prevent


congestion.

Web Servers:
Request Queues: Web servers use queues to manage incoming
client requests, ensuring fair processing of requests even during
high load.

Print Spooling:
When multiple print jobs are sent to a single printer, they are
placed in a print queue and processed one by one.



Message Queuing Systems (e.g., Apache Kafka, RabbitMQ):
Used in distributed systems to facilitate asynchronous
communication between different services. Messages are sent to a
queue, and consumer services process them in order. This
provides resilience and scalability.

Asynchronous Task Processing:
Tasks that don't need immediate results (e.g., sending emails,
generating reports) can be added to a queue and processed by
worker services in the background.
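
A minimal sketch of Round Robin scheduling with a queue (the process names and
burst times are illustrative): each process runs for at most one time quantum
and, if unfinished, is re-enqueued at the rear.

from collections import deque

def round_robin(processes, quantum):
    # processes: list of (name, burst_time) pairs
    ready = deque(processes)
    order = []
    while ready:
        name, remaining = ready.popleft()     # take the process at the front
        run = min(quantum, remaining)
        order.append(f"{name} runs {run}")
        remaining -= run
        if remaining > 0:
            ready.append((name, remaining))   # unfinished: re-enqueue at the rear
    return order

print(round_robin([("P1", 5), ("P2", 3), ("P3", 1)], quantum=2))
# ['P1 runs 2', 'P2 runs 2', 'P3 runs 1', 'P1 runs 2', 'P2 runs 1', 'P1 runs 1']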

10. Practice Problems


Easy:

1. Implement a simple print queue: Simulate adding print jobs and


processing them.

2. Queue for a supermarket checkout: Simulate customers arriving


and being served.

3. BFS on a simple adjacency list graph: Traverse a small,


unweighted graph.

Medium:

1. Level Order Traversal of a Binary Tree: Given a binary tree,


return the level order traversal of its nodes' values (as
demonstrated in Section 7).

2. Shortest Path in an Unweighted Graph: Implement BFS to find


the shortest path between two nodes in an unweighted graph.

3. Hot Potato Game: Simulate the "Hot Potato" game using a queue,
where children pass a potato and the one holding it when the timer
runs out is removed.

Hard:

1. Matrix 01 (LeetCode 542): Given a matrix consisting of 0s and 1s,
find the distance of the nearest 0 for each cell. (This is a
multi-source BFS problem.)

2. Snake and Ladders (LeetCode 909): Implement a BFS to find the


minimum number of moves to reach the final square on a game
board.

3. Task Scheduler: Given a char array representing tasks and a
non-negative integer n representing the cooldown period, find the
minimum time needed to finish all tasks. This involves priority
queue concepts but also queue-like scheduling logic.

11. Summary / Key Points


Quick revision of points covered:

Queues are FIFO data structures, essential for ordered processing.

They manage resources, handle asynchronous tasks, and are key


to graph traversal algorithms like BFS.

Main operations: enqueue (add to rear), dequeue (remove from


front).

Properties: FIFO, linear, dynamic size, managed by front and


rear pointers.

BFS uses queues to explore graphs level by level, finding shortest


paths in unweighted graphs.

Queueing theory provides mathematical models for analyzing


waiting lines.

Common applications include OS scheduling, network buffering,


web server requests, and message queuing.

Mistakes often involve confusing FIFO with LIFO or improper


handling of graph cycles in BFS.

Highlighted formulas and shortcuts:

FIFO: First-In, First-Out (the core principle).

BFS Complexity: Time O(V + E), Space O(V).



Queueing Theory Basics: Arrival Rate ($\lambda$), Service Rate
($\mu$), Utilization ($\rho = \lambda / \mu$). (Remember $\lambda < \mu$
for a stable system.)

Key Concept: Queues introduce a buffer and ordering mechanism


that decouples producers and consumers, making systems more
robust and fair.

