1/2/25, 12:15 PM LRU Cache Implementation

AfterAcademy

Admin AfterAcademy
13 Oct 2020

LRU Cache implementation

Difficulty: Hard

Asked in: Amazon, Microsoft, Adobe, Google

Understanding The Problem


Problem Description

https://afteracademy.com/blog/lru-cache-implementation/

Design and implement a data structure for a Least Recently Used (LRU) cache.

Problem Note:

The LRU cache is initialized with a positive capacity.

Your data structure must support two operations: get() and put().

get(key) : Finds and returns the value if the key exists in the cache.
If the key is not present in the cache, get(key) returns -1.

put(key, value) : Inserts the key-value pair if the key is not present in
the cache; if the key is already present, it updates the value.

If the cache is filled to capacity, it must remove the least recently used
entry before inserting a new key.

Try implementing both operations in O(1) time complexity.

The input in this problem is a series of function calls to get() and
put().

Example

cache = LRUCache(3)
cache.put(1,1)
cache.put(2,2)
cache.put(1,3) // updates key 1 to value 3
cache.get(1) ---> returns 3
cache.put(3,4)
cache.put(4,3) // removes key 2
cache.get(2) ---> returns -1

Input Format:

The first line contains N and C , the total number of queries and the
cache capacity.

Each of the following N lines has a query of either type 1 (put) or type
2 (get).

A query of type 1 has the format: 1 k v , where k is the key and v is the
value.

A query of type 2 has the format: 2 k , where k is the key whose value is
to be fetched.

For example, the input for the above example will be:

7 3
1 1 1
1 2 2
1 1 3
2 1
1 3 4
1 4 3
2 2

Solution
A Least Recently Used (LRU) Cache organizes items in order of use,
allowing you to quickly identify which item hasn’t been used for the longest
amount of time.

An LRU cache is often implemented using a doubly linked list (DLL)
together with a hash map.

The node structure used for the doubly linked list will be as follows:

class Node {
    int key      // storing the key lets us remove the map entry on eviction
    int value
    Node pre     // previous node in the list
    Node next    // next node in the list
    public Node(int key, int value) {
        this.key = key
        this.value = value
    }
}

To implement an LRU cache, we will move with the straightforward idea of
using a doubly linked list that itself represents the usage order of the
cache. Corresponding to each item of the linked list, there will be a key
in the hashmap.

We maintain the doubly linked list so that the front node is always the
most recently used item and the rear node is always the least recently
used item. Now, we will be using two functions:

1. Get: The get operation executes in constant time because we maintain a
hash map, so the cached item can be looked up directly using the input
key. If the item is not found in the hashmap, we simply return -1. If it
is found, we move its node to the front of the list, since it is now the
most recently used item.

2. Put: As the problem note explains how the LRU cache works, we manage a
hashmap whose maximum size is the capacity specified by the user.

If a new item is pushed into the cache and is not already present in the
hashmap, we first check whether the cache is full. If it is, we remove
the least recently used item, which is found at the rear of the doubly
linked list, and also remove its key from the hashmap to make room for
the new item. We then insert the new item at the front of the linked list
and add it to the hashmap against the input key.

If the pushed item is already present in the hashmap, we update its value
and bring its node to the front of the doubly linked list, as it has
become the most recently used item.

Using a doubly linked list lets us push and pop elements at the front and
rear of the list in constant time, which is why a DLL is preferred.
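To illustrate why the DLL gives O(1) splicing, here is a small sketch (the names `Dll`, `unlink`, and `pushFront` are our own for this illustration, not from the article) showing node removal and front insertion with dummy head/tail sentinels:

```java
// Minimal doubly linked list with dummy head/tail sentinels.
// unlink() and pushFront() are O(1): a fixed number of pointer
// updates, no traversal, regardless of list length.
class Dll {
    static class Node {
        int key, value;
        Node pre, next;
        Node(int key, int value) { this.key = key; this.value = value; }
    }

    final Node head = new Node(0, 0); // dummy: head.next is the front
    final Node tail = new Node(0, 0); // dummy: tail.pre is the rear

    Dll() { head.next = tail; tail.pre = head; }

    // Detach a node from wherever it sits; the sentinels guarantee
    // node.pre and node.next are never null, so there are no edge cases.
    void unlink(Node node) {
        node.pre.next = node.next;
        node.next.pre = node.pre;
    }

    // Insert right after the dummy head (the "most recently used" end).
    void pushFront(Node node) {
        node.next = head.next;
        node.next.pre = node;
        node.pre = head;
        head.next = node;
    }
}
```

Because both operations touch a fixed number of pointers, moving a node to the front on every access stays constant-time no matter how large the cache grows.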

Solution Steps


1. Create a hashmap that will store the input item values corresponding
to the keys and a doubly-linked list to express LRU.

2. Look up the item in our hash map for every get operation.

If the item is in the hash table, then it's already in our cache; this is
called a "cache hit".

Move the item’s linked list node to the head of the linked list as the
item is the most recently used item.

3. If the item is not in the hash table on a "get" operation, we have a
cache miss, so we return -1.

4. For every "put" operation, load the item into the LRU cache.

Check whether the cache is full. If so, we need to evict an item to make
room; in our case, that item is the least recently used one, which is
found at the tail of the linked list.

Evict that item from the cache by removing it from the linked list and
the hash map.

Whether or not an eviction happened, create a new linked list node for
the item and insert it at the head of the linked list.

Add the item to our hash map, storing the newly-created linked list
node as the value.

Pseudo Code

class LRUCache {
    HashMap(Integer, Node) map
    int capacity, count
    Node head, tail
    public LRUCache(int capacity) {
        this.capacity = capacity
        map = new HashMap()
        // create dummy nodes to act as head and tail sentinels
        head = new Node(0, 0)
        tail = new Node(0, 0)
        head.next = tail
        tail.pre = head
        head.pre = null
        tail.next = null
        count = 0
    }
    void deleteNode(Node node) {
        node.pre.next = node.next
        node.next.pre = node.pre
    }
    void addNodeToHead(Node node) {
        node.next = head.next
        node.next.pre = node
        node.pre = head
        head.next = node
    }
    int get(int key) {
        if (key exists in map) {
            Node node = map.get(key)
            int result = node.value
            deleteNode(node)
            addNodeToHead(node)   // refresh: node becomes most recently used
            return result
        }
        return -1
    }
    void put(int key, int value) {
        if (key exists in map) {
            // key already cached: update the value and refresh its position
            Node node = map.get(key)
            node.value = value
            deleteNode(node)
            addNodeToHead(node)
        }
        else {
            Node node = new Node(key, value)
            map.put(key, node)
            if (count < capacity) {
                count = count + 1
                addNodeToHead(node)
            }
            else {
                // cache full: evict the least recently used item at the rear
                map.remove(tail.pre.key)
                deleteNode(tail.pre)
                addNodeToHead(node)
            }
        }
    }
}
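The pseudo code above translates almost line for line into runnable Java. This sketch keeps the article's field and method names (including `pre` for the previous pointer) and only adds the syntax Java requires:

```java
import java.util.HashMap;

class LRUCache {
    static class Node {
        int key, value;
        Node pre, next;
        Node(int key, int value) { this.key = key; this.value = value; }
    }

    private final HashMap<Integer, Node> map = new HashMap<>();
    private final int capacity;
    private int count = 0;
    private final Node head = new Node(0, 0); // dummy front sentinel
    private final Node tail = new Node(0, 0); // dummy rear sentinel

    public LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.pre = head;
    }

    private void deleteNode(Node node) {
        node.pre.next = node.next;
        node.next.pre = node.pre;
    }

    private void addNodeToHead(Node node) {
        node.next = head.next;
        node.next.pre = node;
        node.pre = head;
        head.next = node;
    }

    public int get(int key) {
        Node node = map.get(key);
        if (node == null) return -1;   // cache miss
        deleteNode(node);              // move to front: most recently used
        addNodeToHead(node);
        return node.value;
    }

    public void put(int key, int value) {
        Node node = map.get(key);
        if (node != null) {            // key already cached: update and refresh
            node.value = value;
            deleteNode(node);
            addNodeToHead(node);
            return;
        }
        node = new Node(key, value);
        map.put(key, node);
        if (count < capacity) {
            count++;
        } else {                       // full: evict the LRU item at the rear
            map.remove(tail.pre.key);
            deleteNode(tail.pre);
        }
        addNodeToHead(node);
    }
}
```

Running the article's example through this class (capacity 3; put 1→1, 2→2, 1→3; get(1); put 3→4, 4→3; get(2)) reproduces the expected outputs: get(1) returns 3 and get(2) returns -1 after key 2 is evicted.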

Complexity Analysis

Time Complexity: O(1)

Put method: O(1)

Get method: O(1)

Space Complexity: O(C), where C is the cache capacity: the hashmap and
the linked list each hold at most C entries.
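As a side note, not part of the article's solution: Java's standard library can express the same behavior directly. A `LinkedHashMap` constructed with `accessOrder = true` keeps entries in access order, and overriding `removeEldestEntry` evicts the least recently used entry automatically. A sketch (the class name `LruViaLinkedHashMap` and helper `getOrMiss` are our own):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache built on java.util.LinkedHashMap.
// The third constructor argument (true) orders entries by access,
// and removeEldestEntry evicts once size exceeds capacity.
class LruViaLinkedHashMap extends LinkedHashMap<Integer, Integer> {
    private final int capacity;

    LruViaLinkedHashMap(int capacity) {
        super(16, 0.75f, true); // access order, not insertion order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
        return size() > capacity; // evict the least recently used entry
    }

    int getOrMiss(int key) {
        return getOrDefault(key, -1); // match the problem's "-1 on miss" contract
    }
}
```

This is useful to mention in an interview after hand-rolling the DLL-plus-hashmap version, since `LinkedHashMap` internally uses the same doubly-linked-list-over-hash-buckets idea.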

Critical Ideas To Think

Why did we use the Doubly linked list?

How did we add a new node to the DLL in the put function, while checking
in constant time whether that node already existed?

How did we manage to remove the least recently used item while making
room for a new item?

Explain the time complexities of the two methods.

What are the main applications of using an LRU cache mechanism?

Suggested Problems To Solve


Multi-level cache organization

Clockwise rotation of Doubly Linked List by N places

Large number arithmetic using doubly linked list

https://afteracademy.com/blog/lru-cache-implementation/ 7/10
1/2/25, 12:15 PM LRU Cache Implementation

Convert a Binary Tree to a Circular Doubly Link List

Count triplets in a sorted doubly linked list whose product is equal to a
given value x

If you have any more approaches or you find an error/bug in the above
solutions, please comment down below.

Happy Coding!

Enjoy Algorithms!

Copyright 2022, MindOrks Nextgen Private Limited