An LRU Cache plays an important role in storing data efficiently. A cache is a portion of computer memory that temporarily holds frequently accessed data. However, the cache's capacity is limited, so old data must be evicted to make room for new data.
An LRU Cache handles this by evicting the least recently used data and making space available for new data. This eviction policy is the LRU replacement mechanism.
In this tutorial, we will look at how an LRU Cache works, the data structures it relies on, and its implementation in Python and Java, in order to provide a comprehensive understanding of this crucial concept in computer systems.
LRU stands for Least Recently Used. An LRU Cache arranges the items stored temporarily in cache memory in order of how recently they were used. This ordering makes it easy to identify which item has gone unused the longest and is therefore the best candidate for eviction.
LRU is one of the most popular caching techniques because it automatically removes the least recently used data to make space for new data once the cache reaches its maximum capacity.
The structure of an LRU Cache can be pictured as a series of connected memory blocks, each holding an item of data. Whenever a user requests data that is not present in the cache memory, the LRU mechanism fetches the data from the disk and returns it to the user.
The data fetched this way becomes the most recently used data, so it is placed at the front of the cache list.
On subsequent requests for data that is already in the cache list, the cache returns it to the user immediately, rather than following the time-consuming process of fetching it from the disk first.
The objective of implementing an LRU caching scheme is to create a data structure that behaves as a Least Recently Used (LRU) Cache. We can implement an LRU Cache class with the help of the following operations:
get(key): Return the value associated with the key if it is present in the cache, marking the item as the most recently used; otherwise, return a sentinel value such as -1.
put(key, value): Insert or update the key-value pair, marking it as the most recently used. If the cache is already at capacity, evict the least recently used item first.
We would want to derive the following specification from the LRU mechanism: both operations run in O(1) time, and the cache never holds more items than its fixed capacity.
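As a sketch, this contract could be written as a small Java interface (the interface name and method shapes here are illustrative, not from any standard library):
interface Cache<K, V> {
  // Returns the value mapped to key, or null if the key is absent.
  // A successful lookup marks the key as the most recently used.
  V get(K key);
  // Inserts or updates a mapping. When the cache is at capacity,
  // the least recently used entry is evicted first.
  void put(K key, V value);
}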
Let's understand it with the help of an LRU Cache implementation example:
Let's say we have five items that are named A1, A2, A3, A4, and A5 in the main memory, and suppose the size of our cache memory is 3.
At the start of the process, the cache memory is empty and all the items are stored in the main memory. So if we want to retrieve A1, we would get the value of A1 from the main memory and then store it in the cache memory.
In the next step, we would want to get the value of A2 so we retrieve the value of A2 from the main memory. Now, A2 is the most recently used item so it will be placed at the top of the cache memory list. Automatically, A1 will move down in the list and will no longer be the most recently used item.
Next, we would want to get the value of A3 so the same process will continue and A3 will become the most recently used item in the list. Let's say we would want to get the value of A2 again. Now we can easily get the value of A2 from our cache list rather than retrieving it from the main memory again. So, A2 will be placed at the top of the list again as it is the most recently used item now.
Now suppose we want the value of A4, so we have to fetch it from the main memory. But where will it be stored? Our cache memory is already full, so to store A4 we have to evict an item from the list. In this case, we remove A1, as it is the least recently used item, sitting at the bottom of the list.
Since the maximum capacity of the cache is 3, we eliminate the least recently used element from the list whenever we need to make space for a new item.
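To make the walkthrough concrete, here is a minimal Java sketch of the same A1-A4 sequence built on LinkedHashMap (the same class used for the full Java implementation later in this tutorial); the stored values and the class name LruTrace are placeholders:
import java.util.LinkedHashMap;
import java.util.Map;
public class LruTrace {
  public static void main(String[] args) {
    final int capacity = 3;
    // accessOrder = true keeps entries ordered from least to most recently used
    Map<String, String> cache = new LinkedHashMap<String, String>(capacity, 0.75f, true) {
      @Override
      protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
        return size() > capacity;
      }
    };
    cache.put("A1", "data1"); // cache: A1
    cache.put("A2", "data2"); // cache: A1, A2
    cache.put("A3", "data3"); // cache: A1, A2, A3
    cache.get("A2");          // A2 becomes the most recently used: A1, A3, A2
    cache.put("A4", "data4"); // cache full: least recently used A1 is evicted
    System.out.println(cache.keySet()); // Output: [A3, A2, A4]
  }
}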
The size of our cache memory is usually much smaller than that of our main memory, so fitting everything from the main memory into the cache is impossible. The LRU cache is one of the most convenient ways of handling this: its goal is to keep only the n most recently accessed items, where n is the capacity of the cache.
An LRU cache cannot be implemented without proper data structures. The LRU Cache implementation scheme uses the following two data structures:
1. A doubly linked list, which keeps items ordered from most recently used (at the front) to least recently used (at the back) and supports O(1) insertion and removal.
2. A hash map, which maps each key to its node in the linked list, giving O(1) lookup.
Code:
class Node:
  def __init__(self, key, value):
    self.key = key
    self.value = value
    self.prev = None
    self.next = None
class LRUCache:
  def __init__(self, capacity):
    self.capacity = capacity
    self.cache = {}  # Hash map for quick lookup
    self.head = Node(None, None)  # Dummy head node
    self.tail = Node(None, None)  # Dummy tail node
    self.head.next = self.tail
    self.tail.prev = self.head
  def _add_node(self, node):
    # Add a node after the dummy head
    node.prev = self.head
    node.next = self.head.next
    self.head.next.prev = node
    self.head.next = node
  def _remove_node(self, node):
    # Remove a node from the linked list
    prev_node = node.prev
    next_node = node.next
    prev_node.next = next_node
    next_node.prev = prev_node
  def _move_to_front(self, node):
    # Move a node to the front of the linked list
    self._remove_node(node)
    self._add_node(node)
  def get(self, key):
    if key in self.cache:
      node = self.cache[key]
      self._move_to_front(node)
      return node.value
    return -1
  def put(self, key, value):
    if key in self.cache:
      node = self.cache[key]
      node.value = value
      self._move_to_front(node)
    else:
      if len(self.cache) >= self.capacity:
        # Remove the least recently used node (tail.prev)
        del self.cache[self.tail.prev.key]
        self._remove_node(self.tail.prev)
      new_node = Node(key, value)
      self.cache[key] = new_node
      self._add_node(new_node)
# Create an LRU cache with a capacity of 3
lru_cache = LRUCache(3)
# Add items to the cache
lru_cache.put(1, 'one')
lru_cache.put(2, 'two')
lru_cache.put(3, 'three')
# Access items from the cache
print(lru_cache.get(2))  # Output: 'two'
# Adding a new item will remove the least recently used item ('one')
lru_cache.put(4, 'four')
# Trying to access the removed item will return -1
print(lru_cache.get(1))  # Output: -1
In this implementation, the LRUCache class uses a doubly linked list to maintain the order of item usage. The cache dictionary acts as a hash map to provide quick access to the nodes. When an item is accessed or added, the relevant node is moved to the front of the linked list. When the cache becomes full, the least recently used item is removed from the linked list and the hash map.
We can also implement an LRU cache in Java using LinkedHashMap, which provides a built-in way to create a map with a specified order. Here's an example:
import java.util.LinkedHashMap;
import java.util.Map;
class LRUCache<K, V> extends LinkedHashMap<K, V> {
  private final int capacity;
  public LRUCache(int capacity) {
    super(capacity, 0.75f, true);
    this.capacity = capacity;
  }
  @Override
  protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
    return size() > capacity;
  }
}
public class Main {
  public static void main(String[] args) {
    LRUCache<Integer, String> lruCache = new LRUCache<>(3);
    lruCache.put(1, "one");
    lruCache.put(2, "two");
    lruCache.put(3, "three");
    System.out.println(lruCache.get(2)); // Output: "two"
    lruCache.put(4, "four");
    System.out.println(lruCache.get(1)); // Output: null (removed due to capacity)
  }
}
In this example, the LRUCache class extends LinkedHashMap, where the ordering mode is set to "access-order" (by passing true as the third parameter to the LinkedHashMap constructor). The removeEldestEntry method is overridden to control when the oldest entry should be removed from the cache based on the capacity.
LinkedHashMap automatically maintains the order of insertion and access. When an element is accessed (through get or put), it's moved to the end of the list, making it the most recently accessed item.
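A quick sketch of this behavior (the class name AccessOrderDemo is only for illustration):
import java.util.LinkedHashMap;
import java.util.Map;
public class AccessOrderDemo {
  public static void main(String[] args) {
    // Passing true as the third argument enables access order:
    // iteration runs from least to most recently accessed
    Map<Integer, String> map = new LinkedHashMap<>(16, 0.75f, true);
    map.put(1, "one");
    map.put(2, "two");
    map.put(3, "three");
    map.get(1); // touching key 1 moves it to the end of the order
    System.out.println(map.keySet()); // Output: [2, 3, 1]
  }
}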
Remember that LinkedHashMap isn't thread-safe by default. If you need thread safety, you might need to consider synchronization mechanisms or use a thread-safe map like ConcurrentHashMap.
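For instance, here is a minimal sketch that wraps the LRUCache class from the example above in a synchronized view. Note that with access order enabled, even get() restructures the map, so reads must be synchronized as well:
import java.util.Collections;
import java.util.Map;
public class ThreadSafeLruDemo {
  public static void main(String[] args) {
    // Every call through the wrapper is synchronized on the wrapper itself
    Map<Integer, String> safeCache =
        Collections.synchronizedMap(new LRUCache<Integer, String>(3));
    safeCache.put(1, "one");
    System.out.println(safeCache.get(1)); // Output: "one"
  }
}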
In this tutorial, we learned what an LRU Cache is and how it works, including its code implementation, data structures, and major components. LRU Cache implementation is an important topic in the tech industry and remains indispensable across many roles and applications.
Consider earning a Computer Science certification from upGrad if you want to become an expert in this area. upGrad courses can help you master coding online and assist you in securing executive positions in the tech sector.
The put and get operations of an LRU Cache both run in O(1) time. The LRU mechanism can easily be implemented using a doubly linked list and a hash map.
LRU Cache works by organizing the items in the cache list in order of their usage, from most recently used to least recently used. Each time you access an item in the list, the LRU mechanism moves it to the top of the list, as it is now the most recently used item.
LRU Cache implementation in Java allows the user to remove the least recently used item from the cache list to make space for new items. It works with the help of the put and get operations.