This article introduces a cache eviction mechanism for keeping cache size under control: the LRU (least recently used) cache.

I. Background

Caching is used extensively in day-to-day front-end and back-end development; used properly, it can greatly speed up code execution and avoid unnecessary recomputation.

However, when the data volume is large or a program runs for a long time, careless caching can pile up a huge number of entries whose memory is never released, leading to problems such as sluggish performance.

To deal with this, we introduce a cache eviction mechanism with a fixed maximum capacity: once the cache is full, the least recently used entries are removed first, keeping the space occupied by the cache under control.
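Before turning to the LeetCode problem, here is a minimal sketch of the idea (illustrative only, not the solution below): a JavaScript Map remembers insertion order, so a size-bounded LRU cache can be approximated by re-inserting a key whenever it is read or written and evicting the oldest key once the limit is exceeded. The putLRU/getLRU helpers and the capacity of 3 are just for this demo.

// Minimal sketch of LRU eviction using a Map's insertion order.
const CAPACITY = 3; // arbitrary demo capacity
const demoCache = new Map();

function putLRU(key, value) {
    // Re-inserting moves the key to the "most recently used" end
    demoCache.delete(key);
    demoCache.set(key, value);
    if (demoCache.size > CAPACITY) {
        // The first key in iteration order is the least recently used
        const oldestKey = demoCache.keys().next().value;
        demoCache.delete(oldestKey);
    }
}

function getLRU(key) {
    if (!demoCache.has(key)) return -1;
    const value = demoCache.get(key);
    // Refresh the key's position so it counts as recently used
    demoCache.delete(key);
    demoCache.set(key, value);
    return value;
}

putLRU('a', 1);
putLRU('b', 2);
putLRU('c', 3);
getLRU('a');      // touches 'a', so 'b' is now the least recently used
putLRU('d', 4);   // evicts 'b'
console.log([...demoCache.keys()]); // [ 'c', 'a', 'd' ]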

II. The LeetCode problem

  • LeetCode 146. LRU caching mechanism

Using your knowledge of data structures, design and implement an LRU (least recently used) caching mechanism.

Implement the LRUCache class:

  • LRUCache(int capacity) initializes the LRU cache with a positive integer capacity capacity.
  • int get(int key) returns the value of the keyword if key exists in the cache; otherwise it returns -1.
  • void put(int key, int value) updates the data value if the keyword already exists; if the keyword does not exist, it inserts the keyword-value pair. When the cache reaches its maximum capacity, it should evict the least recently used entry before writing the new data, to make room for the new value.

Follow-up: Can you perform both operations in O(1) time?

Example:

Input

["LRUCache", "put", "put", "get", "put", "get", "put", "get", "get", "get"]
[[2], [1, 1], [2, 2], [1], [3, 3], [2], [4, 4], [1], [3], [4]]

Output

[null, null, null, 1, null, -1, null, -1, 3, 4]

Explanation

const lRUCache = new LRUCache(2);
lRUCache.put(1, 1); // cache is {1=1}
lRUCache.put(2, 2); // cache is {1=1, 2=2}
lRUCache.get(1);    // returns 1
lRUCache.put(3, 3); // evicts key 2, cache is {1=1, 3=3}
lRUCache.get(2);    // returns -1 (not found)
lRUCache.put(4, 4); // evicts key 1, cache is {4=4, 3=3}
lRUCache.get(1);    // returns -1 (not found)
lRUCache.get(3);    // returns 3
lRUCache.get(4);    // returns 4

 

Constraints:

  • 1 <= capacity <= 3000
  • 0 <= key <= 3000
  • 0 <= value <= 10^4
  • At most 3 * 10^4 calls will be made to get and put

III. Approach

  1. Array implementation: cached entries are stored in an array, and every get or put has to traverse the array, so the time complexity is O(n).
  2. Hash map + doubly linked list implementation: get reads the node directly from the map, while both get and put may relink nodes in the list; the time complexity of get and put is O(1). The node shape and the relinking steps are sketched below.
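To make approach 2 concrete, here is a small sketch of the node shape and of the pointer moves behind it (the moveToFront helper and the two-node list are illustrative; the full solution follows in the next section). Finding the node via the hash map and rewiring a fixed number of pointers is what keeps both operations at O(1), unlike the array version, where splice and shift are O(n).

// Each cache entry is a doubly linked node: { key, value, pre, next }.
// Moving a node to the front rewires a constant number of pointers.
function moveToFront(state, node) {
    if (state.head === node) return;                // already the most recently used
    if (state.tail === node) state.tail = node.pre; // keep the tail pointer valid
    // Unlink the node from its current position
    if (node.pre) node.pre.next = node.next;
    if (node.next) node.next.pre = node.pre;
    // Relink it in front of the current head
    node.pre = null;
    node.next = state.head;
    state.head.pre = node;
    state.head = node;
}

// Build a two-node list head(a) -> tail(b), then touch b
const a = { key: 'a', value: 1, pre: null, next: null };
const b = { key: 'b', value: 2, pre: a, next: null };
a.next = b;
const state = { head: a, tail: b };

moveToFront(state, b);
console.log(state.head.key, state.tail.key); // b a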

IV. Solutions

1. Array implementation

  • Time complexity of both get and put is O(n).
/**
 * @class LRU (least recently used) cache
 * @desc Array implementation, time complexity O(n)
 * @param {number} capacity Maximum cache capacity
 */
class LRUCache {
    constructor(capacity) {
        // Maximum capacity of the cache
        this.capacity = capacity;
        // Cache list
        this.list = [];
    }
    /**
     * Returns -1 if the cache is not hit
     * @param {number} key Unique identifier of the cache entry
     * @return {number} The cached value
     */
    get(key) {
        const node = this._moveToHead(key);
        return node ? node.value : -1;
    }
    /**
     * Add a cache entry
     * @param {number} key Unique identifier of the cache entry
     * @param {number} value The cached value
     * @return {void} No return value
     */
    put(key, value) {
        const node = this._moveToHead(key, value);
        if (!node) {
            if (this.list.length === this.capacity) {
                this.list.shift(); // evict the least recently used entry at the front
            }
            this.list.push({ key, value });
        }
    }
    /**
     * Move the matched node to the head (the most recently used end of the array)
     * @param {number} key Unique identifier of the cache entry
     * @param {number} value The cached value
     * @return {Node | null} Matched node
     */
    _moveToHead(key, value) {
        for (let i = 0; i < this.list.length; i++) {
            const item = this.list[i];
            if (item.key === key) {
                const node = this.list.splice(i, 1)[0];
                if (value !== undefined) {
                    node.value = value;
                }
                this.list.push(node);
                return node;
            }
        }
        return null;
    }
}

/******************** Test code ********************/

const cache = new LRUCache(3);

cache.put(1, 1);
console.log(cache.list.map(e => e.value));
cache.put(2, 2);
console.log(cache.list.map(e => e.value));
cache.put(3, 3);
console.log(cache.list.map(e => e.value));
console.log(cache.get(3));
console.log(cache.list.map(e => e.value));
console.log(cache.get(1));
console.log(cache.list.map(e => e.value));
console.log(cache.get(2));
console.log(cache.list.map(e => e.value));
cache.put(1, 1);
console.log(cache.list.map(e => e.value));
cache.put(4, 4);
console.log(cache.list.map(e => e.value));
cache.put(4, 4);
console.log(cache.list.map(e => e.value));
console.log(cache.get(5));
console.log(cache.list.map(e => e.value));
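For reference, running the test above should print roughly the following (assuming Node.js's default array formatting); the list is ordered from least to most recently used, so key 3 is the entry evicted when key 4 is inserted:

[ 1 ]
[ 1, 2 ]
[ 1, 2, 3 ]
3
[ 1, 2, 3 ]
1
[ 2, 3, 1 ]
2
[ 3, 1, 2 ]
[ 3, 2, 1 ]
[ 2, 1, 4 ]
[ 2, 1, 4 ]
-1
[ 2, 1, 4 ]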

2. Hash map + doubly linked list implementation

  • Time complexity of both get and put is O(1).
/**
 * @class LRU (least recently used) cache
 * @desc Hash map + doubly linked list, time complexity O(1)
 * @param {number} capacity Maximum cache capacity
 */
class LRUCache {
    constructor(capacity) {
        // Maximum capacity of the cache
        this.capacity = capacity;
        // Current number of cached entries
        this.count = 0;
        // Head of the linked list (most recently used)
        this.head = null;
        // Tail of the linked list (least recently used)
        this.tail = null;
        // Mapping from key to list node
        this.map = {};
    }
    /**
     * Returns -1 if the cache is not hit
     * @param {number} key Unique identifier of the cache entry
     * @return {number} The cached value
     */
    get(key) {
        const node = this.map[key];
        if (node) {
            this._moveToHead(node);
            return node.value;
        }
        return -1;
    }
    /**
     * Add or update a cache entry
     * @param {number} key Unique identifier of the cache entry
     * @param {number} value The cached value
     * @return {void} No return value
     */
    put(key, value) {
        let node = this.map[key];
        if (node) {
            node.value = value;
            this._moveToHead(node);
        } else {
            // If the cache is already full, delete the tail node before adding a new head
            if (this.count === this.capacity) {
                delete this.map[this.tail.key];
                this._delTail();
            } else {
                this.count++;
            }
            node = { key, value, pre: null, next: this.head };
            // The first node (or the only node once an eviction has emptied the list) is both head and tail
            if (!this.head) {
                this.head = this.tail = node;
            } else {
                this._setHead(node);
            }
        }
        this.map[key] = node; // update the cache mapping
    }
    /**
     * Move a node to the head of the list
     * @param {Node} node Current node
     * @return {void} No return value
     */
    _moveToHead(node) {
        // A hit on the head node needs no move
        if (this.head.key === node.key) {
            return;
        }
        // A hit on the tail node: its previous node becomes the new tail
        if (this.tail.key === node.key && node.pre) {
            this.tail = node.pre;
        }
        // Unlink the node from its current position, then relink it at the head
        node.pre && (node.pre.next = node.next);
        node.next && (node.next.pre = node.pre);
        node.pre = null;
        node.next = this.head;
        this._setHead(node);
    }
    /**
     * Set a node as the new head
     * @param {Node} node Current node
     * @return {void} No return value
     */
    _setHead(node) {
        this.head.pre = node;
        this.head = node;
    }
    /**
     * Delete the tail node
     * @return {void} No return value
     */
    _delTail() {
        if (this.tail.pre) {
            this.tail.pre.next = null;
            this.tail = this.tail.pre;
        } else {
            // The list becomes empty
            this.head = this.tail = null;
        }
    }
}

/******************** Test code ********************/

// Collect the linked list nodes into an array, starting from the given node
function logLinkList(node) {
    let tail = node;
    let list = [tail];
    while (tail.next) {
        list.push(tail.next);
        tail = tail.next;
    }
    return list;
}


const cache = new LRUCache(3);

cache.put(1, 1);
console.log(logLinkList(cache.head).map(e => e.value));
cache.put(2, 2);
console.log(logLinkList(cache.head).map(e => e.value));
cache.put(3, 3);
console.log(logLinkList(cache.head).map(e => e.value));
console.log(cache.get(3));
console.log(logLinkList(cache.head).map(e => e.value));
console.log(cache.get(1));
console.log(logLinkList(cache.head).map(e => e.value));
console.log(cache.get(2));
console.log(logLinkList(cache.head).map(e => e.value));
cache.put(1, 1);
console.log(logLinkList(cache.head).map(e => e.value));
cache.put(4, 4);
console.log(logLinkList(cache.head).map(e => e.value));
cache.put(4, 4);
console.log(logLinkList(cache.head).map(e => e.value));
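For comparison with the array version, this test should print roughly the following (again assuming Node.js's default array formatting). Since logLinkList walks from the head, each line is ordered from most to least recently used, and key 3 is again the entry evicted when key 4 is inserted:

[ 1 ]
[ 2, 1 ]
[ 3, 2, 1 ]
3
[ 3, 2, 1 ]
1
[ 1, 3, 2 ]
2
[ 2, 1, 3 ]
[ 1, 2, 3 ]
[ 4, 1, 2 ]
[ 4, 1, 2 ]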