
Understanding LRU Cache 🌈

LRU (Least Recently Used) cache is a caching strategy that evicts the least recently used items when the cache reaches its capacity limit. It is widely used in various applications, such as web browsers, databases, and operating systems, to improve performance by keeping frequently accessed data in memory and avoiding expensive operations like disk reads or network requests.

Working Principle: 🔍

The LRU cache maintains a data structure (usually a combination of a hash table and a doubly-linked list) that keeps track of the cached items and their usage order. When a new item needs to be added to the cache, and the cache is already at its capacity, the least recently used item is evicted to make room for the new item.
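
To make the idea of "usage order" concrete before diving into implementations, here is a tiny illustrative sketch (the keys and the capacity of 3 are made up, and a real cache would also store values alongside the keys, which is what the hash table is for):

Python
# Illustrative trace of the usage order for a cache with capacity 3.
# The list runs from least recently used (front) to most recently used (back).
usage_order = []

def touch(key, capacity=3):
    if key in usage_order:
        usage_order.remove(key)           # key was accessed again: pull it out of its old slot
    elif len(usage_order) == capacity:
        evicted = usage_order.pop(0)      # cache full: drop the least recently used key
        print(f"evict {evicted}")
    usage_order.append(key)               # mark the key as most recently used

for key in ["a", "b", "c", "a", "d"]:
    touch(key)

print(usage_order)  # ['c', 'a', 'd']; "b" was evicted when "d" arrived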

Usage: 💻

LRU caches are particularly useful in scenarios where you have a limited amount of memory or storage and need to manage a large amount of data efficiently. Some common use cases include:

  1. Web Browsers: LRU caches are used to store recently accessed web pages, reducing the need to fetch the same pages from the server repeatedly.
  2. Database Caching: LRU caches can be used to cache frequently accessed database queries, reducing the load on the database server.
  3. Application Caching: LRU caches can be used to cache application data, such as user sessions, configuration settings, or frequently accessed objects, improving application performance.
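
As a concrete illustration of the application-caching use case above, Python's standard library ships an LRU cache decorator, functools.lru_cache, which can wrap an expensive call directly (the fetch_profile function and its body below are made up for illustration):

Python
from functools import lru_cache

@lru_cache(maxsize=128)  # keep at most the 128 most recently used results
def fetch_profile(user_id):
    # Stand-in for an expensive operation such as a database query or
    # a network request; a real application would hit a backend here.
    return {"id": user_id, "name": f"user-{user_id}"}

fetch_profile(1)                    # computed and cached (miss)
fetch_profile(1)                    # served from the cache (hit)
print(fetch_profile.cache_info())   # hits=1, misses=1, maxsize=128, currsize=1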

Implementations: 🛠️

There are several ways to implement an LRU cache, and each implementation may have its own trade-offs in terms of time and space complexity. Here are three common implementations in Java and Python:

  1. Hashmap and Doubly-Linked List Implementation:

Java:

Java
import java.util.HashMap;
import java.util.Map;

class Node {
    int key;
    int value;
    Node prev;
    Node next;

    Node(int key, int value) {
        this.key = key;
        this.value = value;
    }
}

class LRUCache {
    private Map<Integer, Node> map;
    private int capacity;
    private Node head;
    private Node tail;

    LRUCache(int capacity) {
        this.capacity = capacity;
        map = new HashMap<>();
        // Dummy head and tail sentinel nodes simplify insertions and removals
        head = new Node(0, 0);
        tail = new Node(0, 0);
        head.next = tail;
        tail.prev = head;
    }

    // Other methods: get, put, removeNode, addNode
}

Python:

Python
   class Node:
       def __init__(self, key, value):
           self.key = key
           self.value = value
           self.prev = None
           self.next = None

   class LRUCache:
       def __init__(self, capacity):
           self.capacity = capacity
           self.cache = {}
           self.head = Node(0, 0)
           self.tail = Node(0, 0)
           self.head.next = self.tail
           self.tail.prev = self.head

       # Other methods: get, put, remove_node, add_node
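
Both skeletons above leave out the actual cache operations. Below is one possible way to fill in the Python version (a minimal sketch rather than the only correct layout: the Node class is repeated so the snippet stands alone, and add_node inserts just before the tail, so the node right after head is always the least recently used one):

Python
class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}             # key -> Node, for O(1) lookup
        self.head = Node(0, 0)      # dummy sentinel on the least recently used side
        self.tail = Node(0, 0)      # dummy sentinel on the most recently used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def remove_node(self, node):
        # Unlink a node from the doubly-linked list
        node.prev.next = node.next
        node.next.prev = node.prev

    def add_node(self, node):
        # Insert a node just before the tail (most recently used position)
        node.prev = self.tail.prev
        node.next = self.tail
        self.tail.prev.next = node
        self.tail.prev = node

    def get(self, key):
        if key not in self.cache:
            return -1
        node = self.cache[key]
        # Accessing the key makes it the most recently used entry
        self.remove_node(node)
        self.add_node(node)
        return node.value

    def put(self, key, value):
        if key in self.cache:
            self.remove_node(self.cache[key])
        node = Node(key, value)
        self.cache[key] = node
        self.add_node(node)
        if len(self.cache) > self.capacity:
            # Evict the least recently used node, which sits right after head
            lru = self.head.next
            self.remove_node(lru)
            del self.cache[lru.key]
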
  2. Ordered Dictionary Implementation (Python only): 🐍
Python
   from collections import OrderedDict

   class LRUCache:
       def __init__(self, capacity):
           self.capacity = capacity
           self.cache = OrderedDict()

       def get(self, key):
           if key not in self.cache:
               return -1
           # Mark the key as most recently used
           self.cache.move_to_end(key)
           return self.cache[key]

       def put(self, key, value):
           if key in self.cache:
               self.cache.move_to_end(key)
           self.cache[key] = value
           if len(self.cache) > self.capacity:
               # popitem(last=False) removes the least recently used entry
               self.cache.popitem(last=False)
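
A quick usage sketch of the OrderedDict-based class above (the capacity of 2 and the sample keys are arbitrary) shows the eviction behavior in action:

Python
cache = LRUCache(2)
cache.put(1, "one")
cache.put(2, "two")
print(cache.get(1))    # "one"; key 1 is now the most recently used
cache.put(3, "three")  # capacity exceeded, so key 2 (least recently used) is evicted
print(cache.get(2))    # -1; key 2 is gone
print(cache.get(3))    # "three"
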
  3. Linked HashMap Implementation (Java only):
Java
   import java.util.LinkedHashMap;
   import java.util.Map;

   class LRUCache extends LinkedHashMap<Integer, Integer> {
       private final int capacity;

       LRUCache(int capacity) {
           // accessOrder = true keeps entries ordered by access, so the
           // eldest entry is always the least recently used one
           super(capacity, 0.75f, true);
           this.capacity = capacity;
       }

       public int get(int key) {
           if (!containsKey(key)) {
               return -1;
           }
           // super.get() also moves the entry to the most recently used
           // position, because accessOrder is enabled
           return super.get(key);
       }

       public void put(int key, int value) {
           // removeEldestEntry() is invoked automatically after the insert
           super.put(key, value);
       }

       @Override
       protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
           // Returning true tells LinkedHashMap to evict the eldest
           // (least recently used) entry once the capacity is exceeded
           return size() > capacity;
       }
   }

LRU Cache Implementation Using Stack

As a quick recap, an LRU (Least Recently Used) cache evicts the least recently used items once it reaches its capacity limit, keeping frequently accessed data in memory and avoiding expensive operations like disk reads or network requests. 💾

In addition to the implementations we discussed earlier (hashmap and doubly-linked list, ordered dictionary, and linked hash map), we can also implement an LRU cache using a stack data structure along with a hash map or dictionary. 📚

Working Principle: 🔍

The stack is used to keep track of the order of the most recently used items, and the hash map/dictionary is used to store the key-value pairs for fast lookup. When a new item needs to be added to the cache, and the cache is already at its capacity, the least recently used item (the one at the bottom of the stack) is evicted to make room for the new item.

Implementation in Python: 🐍

Python
class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}  # Dictionary to store key-value pairs
        self.stack = []  # Stack to track the order of recently used items

    def get(self, key):
        if key not in self.cache:
            return -1

        # Move the key to the top of the stack
        self.stack.remove(key)
        self.stack.append(key)
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            # Update the value and move the key to the top of the stack
            self.cache[key] = value
            self.stack.remove(key)
            self.stack.append(key)
        else:
            if len(self.cache) == self.capacity:
                # Evict the least recently used item
                evicted_key = self.stack.pop(0)
                del self.cache[evicted_key]

            # Add the new key-value pair and push it to the top of the stack
            self.cache[key] = value
            self.stack.append(key)

Implementation in Java:

Java
import java.util.HashMap;
import java.util.Stack;

class LRUCache {
    private int capacity;
    private HashMap<Integer, Integer> cache;
    private Stack<Integer> stack;

    LRUCache(int capacity) {
        this.capacity = capacity;
        cache = new HashMap<>();
        stack = new Stack<>();
    }

    public int get(int key) {
        if (!cache.containsKey(key)) {
            return -1;
        }

        // Move the key to the top of the stack
        stack.remove(Integer.valueOf(key));
        stack.push(key);

        return cache.get(key);
    }

    public void put(int key, int value) {
        if (cache.containsKey(key)) {
            // Update the value and move the key to the top of the stack
            cache.put(key, value);
            stack.remove(Integer.valueOf(key));
            stack.push(key);
        } else {
            if (cache.size() == capacity) {
                // Evict the least recently used item
                int evictedKey = stack.remove(0);
                cache.remove(evictedKey);
            }

            // Add the new key-value pair and push it to the top of the stack
            cache.put(key, value);
            stack.push(key);
        }
    }
}

In both implementations, we use a stack (stack) to keep track of the order of recently used items and a hash map/dictionary (cache) to store the key-value pairs. The get operation moves the accessed key to the top of the stack, and the put operation adds or updates the key-value pair and pushes the key to the top of the stack. If the cache is at capacity during the put operation, the least recently used item (the one at the bottom of the stack) is evicted. The trade-off is that removing a key from the middle of the stack is an O(n) operation, so this approach is simpler to write but slower than the hashmap and doubly-linked list design, which supports O(1) get and put.


