LRU Cache Implementation
Admin, AfterAcademy
13 Oct 2020
Difficulty: Hard
https://afteracademy.com/blog/lru-cache-implementation/
Problem Note:
Your data structure must support two operations: get() and put().
get(key): Finds and returns the value if the key exists in the cache. If the key is not present in the cache, get(key) returns -1.
put(key, value): Inserts the key-value pair into the cache, or updates the value if the key is already present. When the cache is at capacity, it evicts the least recently used item before inserting a new key.
Example
cache = LRUCache(3)
cache.put(1,1)
cache.put(2,2)
cache.put(1,3)
cache.get(1) ---> returns 3
cache.put(3,4)
cache.put(4,3) // removes key 2
cache.get(2) ---> returns -1
Input Format:
The first line contains two integers: N, the number of queries, and the capacity of the cache. Each of the following N lines has a query of either type 1 (put) or type 2 (get).
For example, the input for the above example will be:
7 3
1 1 1
1 2 2
1 1 3
2 1
1 3 4
1 4 3
2 2
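To make the format concrete, here is one way a driver for these queries might look in Java. This is an illustrative sketch, not from the original article: it uses an access-ordered LinkedHashMap as a stand-in cache, since the article builds its own implementation later.

```java
import java.util.*;

class QueryDriver {
    // Executes queries in the input format above:
    // {1, key, value} is a put; {2, key} is a get whose result is collected.
    static List<Integer> run(int capacity, int[][] queries) {
        // Stand-in cache: access-ordered LinkedHashMap that evicts the
        // eldest (least recently used) entry once it exceeds capacity.
        LinkedHashMap<Integer, Integer> cache =
            new LinkedHashMap<Integer, Integer>(16, 0.75f, true) {
                @Override
                protected boolean removeEldestEntry(Map.Entry<Integer, Integer> e) {
                    return size() > capacity;
                }
            };
        List<Integer> out = new ArrayList<>();
        for (int[] q : queries) {
            if (q[0] == 1) {
                cache.put(q[1], q[2]);          // type 1: put(key, value)
            } else {
                Integer v = cache.get(q[1]);    // type 2: get(key)
                out.add(v == null ? -1 : v);
            }
        }
        return out;
    }
}
```

Feeding it the example queries yields the two get results 3 and -1, matching the expected output.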
Solution
A Least Recently Used (LRU) Cache organizes items in order of use,
allowing you to quickly identify which item hasn’t been used for the longest
amount of time.
The node structure used for the doubly linked list will be as follows:
class Node {
    int key
    int value
    Node pre
    Node next

    public Node(int key, int value) {
        this.key = key
        this.value = value
    }
}
We maintain the doubly linked list so that the node at the head is always the most recently used item and the node at the tail is always the least recently used item. Now, we will be using two functions:
1. Get: Looks up the key in the cache and, on a hit, moves the corresponding node to the head of the list before returning its value.
2. Put: As the problem note explains how the LRU cache will work, we will manage a hashmap whose maximum size will be as specified by the user.
Using a doubly linked list helps us push and pop elements at the front and rear of the list in constant time, which is why a DLL is preferred over a singly linked list.
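The two constant-time list operations can be sketched as follows. This is an illustrative sketch: the Node class is repeated so the snippet compiles on its own, and the helper names deleteNode and addNodeToHead match the pseudo code later in the article. Dummy head and tail sentinel nodes are used so that neither operation needs null checks at the ends.

```java
// Node of the doubly linked list (same shape as the class shown above).
class Node {
    int key, value;
    Node pre, next;
    Node(int key, int value) { this.key = key; this.value = value; }
}

class Dll {
    // Dummy sentinels: real nodes always live strictly between them.
    final Node head = new Node(0, 0);
    final Node tail = new Node(0, 0);

    Dll() {
        head.next = tail;
        tail.pre = head;
    }

    // Unlink a node in O(1) by rewiring its two neighbors around it.
    void deleteNode(Node node) {
        node.pre.next = node.next;
        node.next.pre = node.pre;
    }

    // Insert a node right after the dummy head in O(1).
    void addNodeToHead(Node node) {
        node.next = head.next;
        node.next.pre = node;
        node.pre = head;
        head.next = node;
    }
}
```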
Solution Steps
1. Create a hashmap that stores the input item values corresponding to the keys, and a doubly linked list to track how recently each item was used.
2. For every get operation, look up the item in our hashmap.
If the item is in the hashmap, then it's already in our cache — this is called a "cache hit". Move the item's linked list node to the head of the list, as the item is now the most recently used item.
If the item is not in the hashmap, return -1 — this is a "cache miss".
3. For every put operation, load the item into the LRU cache.
If the key is already present, update its value and move its node to the head of the list.
Otherwise, check whether the cache is full. If so, we need to evict an item to make room. In our case, that item is the least recently used item, which is available at the tail of the linked list. Evict it from the cache by removing it from both the linked list and the hashmap.
Then create a new linked list node for the item and insert it at the head of the linked list. Add the item to our hashmap, storing the newly created linked list node as the value.
Pseudo Code
class LRUCache {
    HashMap<Integer, Node> map
    int capacity, count
    Node head, tail

    public LRUCache(int capacity) {
        this.capacity = capacity
        map = new HashMap()
        // create dummy nodes to point head and tail
        head = new Node(0, 0)
        tail = new Node(0, 0)
        head.next = tail
        tail.pre = head
        count = 0
    }

    public int get(int key) {
        if (map.contains(key)) {
            Node node = map.get(key)
            // mark the node as most recently used
            deleteNode(node)
            addNodeToHead(node)
            return node.value
        }
        return -1
    }

    public void put(int key, int value) {
        if (map.contains(key)) {
            // key already cached: update its value and refresh its position
            Node node = map.get(key)
            node.value = value
            deleteNode(node)
            addNodeToHead(node)
        } else {
            Node node = new Node(key, value)
            map.put(key, node)
            if (count < capacity) {
                count = count + 1
            } else {
                // cache full: evict the least recently used item at the tail
                map.remove(tail.pre.key)
                deleteNode(tail.pre)
            }
            addNodeToHead(node)
        }
    }

    // unlink node from the doubly linked list
    void deleteNode(Node node) {
        node.pre.next = node.next
        node.next.pre = node.pre
    }

    // insert node right after the dummy head
    void addNodeToHead(Node node) {
        node.next = head.next
        node.next.pre = node
        node.pre = head
        head.next = node
    }
}
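For reference, the pseudo code can be fleshed out into a compilable Java sketch and checked against the example from the problem statement. Field and helper names follow the pseudo code; the dummy head and tail nodes are an implementation detail that avoids null checks at the ends of the list.

```java
import java.util.HashMap;

class Node {
    int key, value;
    Node pre, next;
    Node(int key, int value) { this.key = key; this.value = value; }
}

class LRUCache {
    private final HashMap<Integer, Node> map = new HashMap<>();
    private final int capacity;
    private int count;
    private final Node head = new Node(0, 0); // dummy head: most recent side
    private final Node tail = new Node(0, 0); // dummy tail: least recent side

    public LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.pre = head;
    }

    public int get(int key) {
        Node node = map.get(key);
        if (node == null) return -1;   // cache miss
        deleteNode(node);              // cache hit: move to head,
        addNodeToHead(node);           // now the most recently used item
        return node.value;
    }

    public void put(int key, int value) {
        Node node = map.get(key);
        if (node != null) {            // existing key: update and refresh
            node.value = value;
            deleteNode(node);
        } else {
            node = new Node(key, value);
            map.put(key, node);
            if (count < capacity) {
                count++;
            } else {                   // full: evict the LRU item at the tail
                map.remove(tail.pre.key);
                deleteNode(tail.pre);
            }
        }
        addNodeToHead(node);
    }

    private void deleteNode(Node node) {
        node.pre.next = node.next;
        node.next.pre = node.pre;
    }

    private void addNodeToHead(Node node) {
        node.next = head.next;
        node.next.pre = node;
        node.pre = head;
        head.next = node;
    }
}
```

Running the example sequence against this class reproduces the expected results: get(1) returns 3 and get(2) returns -1 after key 2 is evicted.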
Complexity Analysis
Both get() and put() run in O(1) time:
We can add a new node to the DLL while checking whether it already exists in constant time, because the hashmap answers the lookup in O(1) and insertion at the head of the doubly linked list is also O(1).
We can remove the least recently used item while making room for a new item in constant time, because that item always sits just before the dummy tail (tail.pre), so it can be unlinked from the list and deleted from the hashmap in O(1).
Space complexity is O(capacity), since the hashmap and the linked list each hold at most capacity entries.
If you have any other approaches, or you find an error/bug in the above solution, please leave a comment below.
Happy Coding!
Enjoy Algorithms!