LRU Cache Implementation

Introduction

An LRU Cache plays an important role in storing data efficiently. A cache is a portion of computer memory that temporarily holds frequently accessed data. However, the cache's storage capacity is limited, and it is the responsibility of the cache's management policy to evict old data so that new data can be stored.

Thus, an LRU Cache works by eliminating the least recently used data to make memory space available for new data. This is the essence of the LRU replacement mechanism.

Overview

In this tutorial, we will investigate the function of an LRU Cache, its working model, its data structures, and the other elements it contains. Additionally, we will walk through implementations of the LRU mechanism in Python and Java to provide a comprehensive understanding of this crucial concept in computer systems.

What is LRU Cache?

LRU stands for Least Recently Used. An LRU Cache arranges the items stored temporarily in cache memory in order of how recently they were used. The LRU mechanism lets you quickly identify which data was used most recently and which data has gone unused the longest, making it a candidate for eviction.

LRU Cache is one of the most popular caching techniques because, once the cache reaches its maximum storage capacity, it automatically removes the least recently used data to make space for new data.

Structure of LRU Cache

The structure of an LRU Cache resembles a series of memory blocks connected in a string-like manner, each chunk holding data. Whenever a user requests data that is not present in the cache memory, the LRU mechanism fetches the data from the disk and then returns it to the user.

As this process continues, the data that was just fetched and returned to the user becomes the most recently used data, so it is positioned at the front of the cache list.

Once data is in the cache list, any subsequent request for it is served immediately by the cache rather than through the time-consuming process of fetching the data from the disk and then providing it to the user.

How to Implement an LRU Caching Scheme?

The objective of implementing an LRU caching scheme is to create a data structure that behaves like a Least Recently Used (LRU) Cache. We can implement an LRU Cache class with the help of the following operations (a short usage sketch follows the list):

  • LRUCache (int capacity): This operation initializes the LRU Cache with a positive capacity.

  • int get (int key): This operation returns the value of the given key if it exists in the cache; otherwise, it returns -1.

  • void put (int key, int value): This operation updates the value of the key if it already exists. Otherwise, it adds the key-value pair to the cache. If the number of keys exceeds the capacity of the LRU Cache, it evicts the least recently used key.
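
As a quick sketch of the intended behavior, here is a hypothetical session assuming the Python LRUCache class implemented later in this tutorial and a capacity of 2:

cache = LRUCache(2)

cache.put(1, 1)        # cache holds {1}
cache.put(2, 2)        # cache holds {1, 2}
print(cache.get(1))    # 1; key 1 becomes the most recently used
cache.put(3, 3)        # capacity exceeded; key 2 (least recently used) is evicted
print(cache.get(2))    # -1; key 2 is no longer in the cache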

We want the LRU mechanism to satisfy the following specification:

  • Access time for any element should be O(1).

  • The time required to retrieve (get) the least recently used element should be O(1).

  • The time required to place (put) any item should be O(1).

  • The space required should be O(n).

Let's understand it with the help of an LRU Cache implementation example:

Let's say we have five items that are named A1, A2, A3, A4, and A5 in the main memory, and suppose the size of our cache memory is 3.


At the start of the process, the cache memory is empty and all the items are stored in the main memory. So if we want to retrieve A1, we would get the value of A1 from the main memory and then store it in the cache memory.


In the next step, we would want to get the value of A2 so we retrieve the value of A2 from the main memory. Now, A2 is the most recently used item so it will be placed at the top of the cache memory list. Automatically, A1 will move down in the list and will no longer be the most recently used item.


Next, we would want to get the value of A3 so the same process will continue and A3 will become the most recently used item in the list. Let's say we would want to get the value of A2 again. Now we can easily get the value of A2 from our cache list rather than retrieving it from the main memory again. So, A2 will be placed at the top of the list again as it is the most recently used item now.


Now let's say we want to get the value of A4, so we have to fetch it from the main memory. The question is: where will it be stored in our cache? Our cache memory is already full, so to store A4 we have to get rid of an item in the list. In this case, we remove A1, as it is the least recently used item in the list, sitting at the bottom.


Since the maximum limit of cache memory is 3, we have to eliminate the least recently used element from the list so that we can make space for the new items.

The size of our cache memory is usually much smaller than that of our main memory, so fitting everything from the main memory into the cache is impossible. The LRU Cache implementation is one of the most convenient ways of handling this: its goal is to keep only the n most recently accessed items, where n is the size of the cache memory.
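
The walkthrough above can be replayed in code. This is a minimal sketch, assuming the Python LRUCache class implemented in the next section and using the item names as keys:

cache = LRUCache(3)       # cache memory of size 3

cache.put('A1', 'A1')     # most -> least recent: A1
cache.put('A2', 'A2')     # A2, A1
cache.put('A3', 'A3')     # A3, A2, A1
cache.get('A2')           # cache hit: A2, A3, A1
cache.put('A4', 'A4')     # cache full, A1 is evicted: A4, A2, A3

print(cache.get('A1'))    # -1; A1 was the least recently used item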

What Data Structures Should be Used?

An LRU cache cannot be implemented without proper data structures. The LRU Cache implementation scheme uses the following two data structures:

  • Queue: You can build a queue using a doubly linked list, with a maximum size equal to the cache size (the maximum number of frames). The list keeps the data in order of use: the most recently used data sits at the front, and the least recently used data is pushed toward the back. Hence, the front of the doubly linked list holds the recently used pages, while the old, infrequently used ones sit at the rear.

  • Hash: A hash map whose key is the page number and whose value is the address of the corresponding queue node. The hash map lets us locate data in the cache in constant time, which is crucial because the least recently used data must be found and evicted quickly to let new data in.

Implementation of LRU Cache With Queue and Hash in Python

Code:

class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}  # Hash map for quick lookup
        self.head = Node(None, None)  # Dummy head node
        self.tail = Node(None, None)  # Dummy tail node
        self.head.next = self.tail
        self.tail.prev = self.head

    def _add_node(self, node):
        # Add a node after the dummy head
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def _remove_node(self, node):
        # Remove a node from the linked list
        prev_node = node.prev
        next_node = node.next
        prev_node.next = next_node
        next_node.prev = prev_node

    def _move_to_front(self, node):
        # Move a node to the front of the linked list
        self._remove_node(node)
        self._add_node(node)

    def get(self, key):
        if key in self.cache:
            node = self.cache[key]
            self._move_to_front(node)
            return node.value
        return -1

    def put(self, key, value):
        if key in self.cache:
            node = self.cache[key]
            node.value = value
            self._move_to_front(node)
        else:
            if len(self.cache) >= self.capacity:
                # Remove the least recently used node (the one before the dummy tail)
                lru_node = self.tail.prev
                del self.cache[lru_node.key]
                self._remove_node(lru_node)
            new_node = Node(key, value)
            self.cache[key] = new_node
            self._add_node(new_node)

# Create an LRU cache with a capacity of 3
lru_cache = LRUCache(3)

# Add items to the cache
lru_cache.put(1, 'one')
lru_cache.put(2, 'two')
lru_cache.put(3, 'three')

# Access items from the cache
print(lru_cache.get(2))  # Output: 'two'

# Adding a new item will remove the least recently used item ('one')
lru_cache.put(4, 'four')

# Trying to access the removed item will return -1
print(lru_cache.get(1))  # Output: -1

In this implementation, the LRUCache class uses a doubly linked list to maintain the order of item usage. The cache dictionary acts as a hash map to provide quick access to the nodes. When an item is accessed or added, the relevant node is moved to the front of the linked list. When the cache becomes full, the least recently used item is removed from the linked list and the hash map.
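
For comparison, Python's standard library offers collections.OrderedDict, which combines the hash map and the usage-ordered list in a single structure. Below is a minimal alternative sketch (not the implementation above) that relies on OrderedDict's move_to_end and popitem methods:

from collections import OrderedDict

class OrderedDictLRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()  # keys kept in order of use

    def get(self, key):
        if key not in self.cache:
            return -1
        # Mark the key as most recently used.
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            # last=False pops the oldest (least recently used) entry.
            self.cache.popitem(last=False)

Python also ships functools.lru_cache, a decorator that memoizes function calls using the same eviction policy, which is often the most convenient option when you are caching the results of a pure function.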

Java Implementation using LinkedHashMap

We can also implement an LRU cache in Java using LinkedHashMap, which provides a built-in way to create a map with a specified order. Here's an example:

import java.util.LinkedHashMap;
import java.util.Map;

class LRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LRUCache(int capacity) {
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}

public class Main {
    public static void main(String[] args) {
        LRUCache<Integer, String> lruCache = new LRUCache<>(3);

        lruCache.put(1, "one");
        lruCache.put(2, "two");
        lruCache.put(3, "three");

        System.out.println(lruCache.get(2)); // Output: "two"

        lruCache.put(4, "four");

        System.out.println(lruCache.get(1)); // Output: null (removed due to capacity)
    }
}

In this example, the LRUCache class extends LinkedHashMap, where the ordering mode is set to "access-order" (by passing true as the third parameter to the LinkedHashMap constructor). The removeEldestEntry method is overridden to control when the oldest entry should be removed from the cache based on the capacity.

LinkedHashMap automatically maintains the order of insertion and access. When an element is accessed (through get or put), it's moved to the end of the list, making it the most recently accessed item.

Remember that LinkedHashMap isn't thread-safe by default. If you need thread safety, consider wrapping the map with Collections.synchronizedMap or adding your own synchronization; note that ConcurrentHashMap on its own does not track access order, so it cannot directly replace LinkedHashMap here.

Conclusion

In this tutorial, we learned what an LRU Cache is and how it works, including its code implementation, its data structures, and the major components of an LRU Cache. LRU Cache implementation is a crucial topic in the tech industry, and it will remain indispensable across many roles and applications.

Consider earning a Computer Science certification from upGrad if you want to become an expert in this area. upGrad courses can help you master coding and can assist you in securing executive positions in the tech sector.

Frequently Asked Questions

  1. What is the time complexity of LRU Cache implementation?

The Put and Get operations of an LRU Cache both run in O(1) time, even in the worst case. The LRU mechanism can easily be implemented using a doubly linked list and a hash map.

  2. How does an LRU Cache work?

LRU Cache works by dividing the items in the cache list into least recently used and most recently used ones. This technique organizes the items in the list in order of their usage. Each time you access an item in the list, the LRU Cache mechanism will move it to the top of the list as it is the most recently used item.

  3. What is the significance of LRU cache implementation in Java?

LRU Cache implementation in Java allows the user to remove the least recently used item from the cache list to make space for new items. It works with the help of the Put and Get operations.
