An LRU (Least Recently Used) cache is a fixed-size cache commonly used in memory organization: it tracks how recently each entry was used and evicts the entry that has gone unused the longest. We are given the total number of possible page numbers that can be referenced, and we are also given the cache (or memory) size, i.e. the number of page frames the cache can hold at a time. The LRU caching scheme removes the least recently used frame when the cache is full and a page is referenced that is not already in the cache. Two terms come up frequently when discussing an LRU cache:
- Page hit: if the required page is found in memory (the cache), it is a page hit.
- Page fault: if the required page is not found in memory, a page fault occurs.
When a page is referenced, it may already be in memory. If it is, we detach its node from the list and move it to the front of the queue.
If the required page is not in memory, we bring it into memory: we add a new node at the front of the queue and record the corresponding node address in the hash. If the queue is full, i.e. all the frames are occupied, we remove the node at the rear of the queue before adding the new node at the front. A minimal sketch of this hit/miss/evict flow is shown below.
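The following is a small, self-contained sketch of that flow, not part of the original article; it uses collections.OrderedDict (which remembers insertion order) in place of an explicit queue and hash, and the class name LRUCacheSketch and the capacity parameter are illustrative.

from collections import OrderedDict

class LRUCacheSketch:
    # Illustrative sketch: the OrderedDict plays the role of both queue and hash.
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()  # keys ordered from least to most recently used

    def refer(self, page):
        if page in self.pages:
            # Page hit: move the page to the most recently used end.
            self.pages.move_to_end(page)
            return 'hit'
        # Page fault: evict the least recently used page if the cache is full.
        if len(self.pages) == self.capacity:
            self.pages.popitem(last=False)
        self.pages[page] = True
        return 'fault'

The full implementation later in this article builds the same structure explicitly from a doubly linked list and a dictionary.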
Example – Consider the following reference string:
1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5
Find the number of page faults using the least recently used (LRU) page replacement algorithm with 3 page frames.
Explanation – With 3 page frames, the first seven references (1, 2, 3, 4, 1, 2, 5) are all page faults; each of 4, 1, 2 and 5 evicts the least recently used frame. The next references to 1 and 2 are page hits, since both pages are still in memory. The final references to 3, 4 and 5 are page faults again, each evicting the least recently used frame. In total there are 10 page faults and 2 page hits. A short simulation that verifies this count follows.
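As a quick check, here is a small sketch, not part of the original article, that simulates LRU replacement over the reference string and counts the page faults; the function name count_lru_page_faults is illustrative.

def count_lru_page_faults(reference_string, frames):
    # Simulate LRU page replacement and count page faults.
    memory = []          # pages currently in memory, least recently used first
    faults = 0
    for page in reference_string:
        if page in memory:
            # Page hit: mark the page as most recently used.
            memory.remove(page)
            memory.append(page)
        else:
            # Page fault: evict the least recently used page if memory is full.
            faults += 1
            if len(memory) == frames:
                memory.pop(0)
            memory.append(page)
    return faults

print(count_lru_page_faults([1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5], 3))  # prints 10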
LRU Cache Using Python
You can implement this with the help of a queue. Here the queue is built on a doubly linked list, with a dictionary serving as the hash. You can run the given code in PyCharm or any other Python environment.
import time


class Node:
    # Node of the doubly linked list that backs the cache.
    def __init__(self, key, val):
        self.key = key
        self.val = val
        self.next = None
        self.prev = None


class LRUCache:
    # Maximum number of cached results; None means no limit.
    cache_limit = None
    # If DEBUG is True, calls return a string describing the cache state.
    DEBUG = False

    def __init__(self, func):
        self.func = func
        self.cache = {}
        # Dummy head and tail nodes simplify insertion and removal.
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

    def __call__(self, *args, **kwargs):
        # Cache hit: move the node to the most recently used position.
        if args in self.cache:
            self.llist(args)
            if self.DEBUG == True:
                return f'Cached...{args}\n{self.cache[args]}\nCache: {self.cache}'
            return self.cache[args]

        # Cache miss: evict the least recently used node if the limit has
        # already been exceeded. The check runs before the new entry is
        # inserted, so the cache can hold cache_limit + 1 entries, which is
        # why the output below shows four items with cache_limit = 3.
        if self.cache_limit is not None:
            if len(self.cache) > self.cache_limit:
                n = self.head.next
                self._remove(n)
                del self.cache[n.key]

        # Compute the result, cache it and add its node at the
        # most recently used end of the list.
        result = self.func(*args, **kwargs)
        self.cache[args] = result
        node = Node(args, result)
        self._add(node)
        if self.DEBUG == True:
            return f'{result}\nCache: {self.cache}'
        return result

    # Remove a node from the doubly linked list.
    def _remove(self, node):
        p = node.prev
        n = node.next
        p.next = n
        n.prev = p

    # Add a node just before the tail (the most recently used end).
    def _add(self, node):
        p = self.tail.prev
        p.next = node
        self.tail.prev = node
        node.prev = p
        node.next = self.tail

    # Find the node for a cached key and move it to the
    # most recently used end of the list.
    def llist(self, args):
        current = self.head
        while True:
            if current.key == args:
                node = current
                self._remove(node)
                self._add(node)
                if self.DEBUG == True:
                    # Re-insert the key so the printed dict also
                    # reflects the recency order.
                    del self.cache[node.key]
                    self.cache[node.key] = node.val
                break
            else:
                current = current.next


# Debugging is False by default; set it to True here so each call
# reports the cache state.
LRUCache.DEBUG = True
# The default cache_limit is None; here it is set to 3.
LRUCache.cache_limit = 3


@LRUCache
def ex_func_01(n):
    print(f'Computing...{n}')
    time.sleep(1)
    return n


print(f'\nFunction: ex_func_01')
print(ex_func_01(1))
print(ex_func_01(2))
print(ex_func_01(3))
print(ex_func_01(4))
print(ex_func_01(1))
print(ex_func_01(2))
print(ex_func_01(5))
print(ex_func_01(1))
print(ex_func_01(2))
print(ex_func_01(3))
print(ex_func_01(4))
print(ex_func_01(5))
Output:
Function: ex_func_01
Computing...1
1
Cache: {(1,): 1}
Computing...2
2
Cache: {(1,): 1, (2,): 2}
Computing...3
3
Cache: {(1,): 1, (2,): 2, (3,): 3}
Computing...4
4
Cache: {(1,): 1, (2,): 2, (3,): 3, (4,): 4}
Cached...(1,)
1
Cache: {(2,): 2, (3,): 3, (4,): 4, (1,): 1}
Cached...(2,)
2
Cache: {(3,): 3, (4,): 4, (1,): 1, (2,): 2}
Computing...5
5
Cache: {(4,): 4, (1,): 1, (2,): 2, (5,): 5}
Cached...(1,)
1
Cache: {(4,): 4, (2,): 2, (5,): 5, (1,): 1}
Cached...(2,)
2
Cache: {(4,): 4, (5,): 5, (1,): 1, (2,): 2}
Computing...3
3
Cache: {(5,): 5, (1,): 1, (2,): 2, (3,): 3}
Computing...4
4
Cache: {(1,): 1, (2,): 2, (3,): 3, (4,): 4}
Computing...5
5
Cache: {(2,): 2, (3,): 3, (4,): 4, (5,): 5}
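For comparison, Python's standard library offers functools.lru_cache, a ready-made LRU memoization decorator. The following is a minimal sketch, not part of the original example, applying it to the same kind of function; the function name ex_func_02 is illustrative, and maxsize mirrors the cache_limit of 3 used above.

from functools import lru_cache
import time

@lru_cache(maxsize=3)
def ex_func_02(n):
    # Same expensive function as above, memoized by the standard library.
    print(f'Computing...{n}')
    time.sleep(1)
    return n

for n in (1, 2, 3, 4, 1, 2, 5):
    print(ex_func_02(n))
# cache_info() reports hits, misses, maxsize and the current cache size.
print(ex_func_02.cache_info())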