Preface

LRU is widely used in real life.

  • For example, when your phone switches to the recent-tasks screen, the apps you used most recently always appear at the top of the list.
  • For example, you keep the books you have been reading recently on your desk, while the books left on the shelf are the ones you rarely open.
  • For another example, we record all kinds of useful information on a computer or cloud drive, but the information we use most often we keep firmly in our heads.

First, LRU Cache

The LRU Cache algorithm, also known as the Least Recently Used algorithm, manages a fixed amount of storage, such as a slice of computer or phone memory. When that storage is full and new data needs to be cached, the least recently used entry is evicted from the cache to make room for the new data.

Second, the design idea

  1. While the box (the cache) is not full, new elements can be saved directly; just make sure the most recently saved element sits at the top of the queue and the least recently used element sits at the bottom.
  2. Step 1: when element C is stored and the queue is already full, the least recently used element A is removed and C is inserted at the top of the queue, making C the most recently used element.
  3. Step 2: when element G is stored, the least recently used element B is removed and G is inserted at the top of the queue.

This is the basic idea of LRU.
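
To make this concrete, here is a minimal sketch of the same trace using Java's built-in LinkedHashMap in access order as a stand-in for the queue. The class name LruIdeaDemo and the capacity of 2 are assumptions made only for this illustration; the keys A, B, C, and G mirror the elements in the steps above.

import java.util.LinkedHashMap;
import java.util.Map;

public class LruIdeaDemo {
    public static void main(String[] args) {
        int capacity = 2;
        // accessOrder = true keeps entries ordered from least to most recently used
        LinkedHashMap<String, Integer> lru =
                new LinkedHashMap<String, Integer>(capacity, 0.75f, true) {
                    @Override
                    protected boolean removeEldestEntry(Map.Entry<String, Integer> eldest) {
                        // Evict the least recently used entry once the capacity is exceeded
                        return size() > capacity;
                    }
                };

        lru.put("A", 1);
        lru.put("B", 2);                  // queue (least -> most recent): A, B
        lru.put("C", 3);                  // full: the least recently used A is evicted -> B, C
        lru.put("G", 4);                  // full: the least recently used B is evicted -> C, G
        System.out.println(lru.keySet()); // prints [C, G]
    }
}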

Third, concrete implementation

The LRU Cache implementation has two main requirements: a fixed cache capacity, and a replacement policy under which querying, modifying, and updating any element takes O(1) time. Here we implement LRU using a hash table and a doubly linked list.

The doubly linked list simulates the least-recently-used-out queue, while the HashMap maps each key to its node object so that any element can be queried, modified, and updated in O(1) time.

import java.util.HashMap;
import java.util.LinkedList;

public class LRUCache {
    static class Node {
        public int key, val;
        public Node(int k, int v) {
            this.key = k;
            this.val = v;
        }
    }

    // The LRU capacity
    int capacity;
    // HashMap: key -> node, for O(1) lookup
    HashMap<Integer, Node> map;
    // Doubly linked list: the most recently used node is at the head
    LinkedList<Node> cache;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        map = new HashMap<>(capacity);
        cache = new LinkedList<>();
    }
    
    public int get(int key) {
        if (!map.containsKey(key)) return -1;
        Node node = map.get(key);
        // After the element is retrieved, the element is placed at the top of the queue
        cache.remove(node);
        cache.addFirst(node);
        return node.val;
    }
    
    public void put(int key, int value) {
        Node node = new Node(key, value);
        if (map.containsKey(key)) {
            // Remove existing values from the list
            cache.remove(map.get(key));
        } else if (capacity == map.size()) {
            // If the list has reached its capacity, remove the last element of the list
            Node last = cache.removeLast();
            map.remove(last.key);
        }
        // Put the latest element at the top of the queue
        cache.addFirst(node);
        map.put(key, node);
    }
}
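
Here is a quick usage sketch of the class above; the capacity of 2 and the key/value pairs are assumptions chosen only for illustration.

public class Main {
    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2);

        cache.put(1, 1);                  // cache: 1
        cache.put(2, 2);                  // cache (most -> least recent): 2, 1
        System.out.println(cache.get(1)); // 1; key 1 becomes the most recently used
        cache.put(3, 3);                  // capacity reached: the least recently used key 2 is evicted
        System.out.println(cache.get(2)); // -1; key 2 is no longer cached
        System.out.println(cache.get(3)); // 3
    }
}

One caveat: java.util.LinkedList.remove(Object) scans the list, so the removal step here is actually O(n) rather than O(1). To strictly meet the O(1) requirement, you would maintain the doubly linked list by hand (unlinking each node directly through its references) or simply build on LinkedHashMap with accessOrder set to true.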