1. What is a cache? Why cache?
2. Common cache tools
3. Write an LRU algorithm by hand
4. What are the common pitfalls of using cache?

Hello everyone, I am Small Ears, the boldest programmer in the 49 cities.

Today let’s talk about some common interview questions in plain English.

1. What is a cache? Why cache?

A cache is a buffer used for data exchange. It keeps a copy of data that lives on slow read/write media in faster read/write media, which increases read/write speed and cuts access time.

Take CPU caches: reading and writing the cache is much faster than reading and writing main memory. When the CPU reads data, if the required data is already in the cache, it does not need to touch memory at all. When the CPU writes, it can write to the cache first and write back to memory later.

Disk caching works the same way: frequently used disk data is kept in memory, and memory is far faster to read and write than disk. Reads are served from memory; writes can land in memory first and be written back to disk periodically, in batches, or synchronously. This improves disk I/O throughput.
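To make the "write to memory first, flush later" idea concrete, here is a toy write-behind sketch. It is only an illustration under assumptions of my own: WriteBehindBuffer and flushToDisk are hypothetical names, and a real kernel page cache is far more involved.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

class WriteBehindBuffer {
    private final Map<String, String> dirty = new ConcurrentHashMap<>();
    private final ScheduledExecutorService flusher =
            Executors.newSingleThreadScheduledExecutor();

    WriteBehindBuffer() {
        // Periodic write-back: flush dirty entries to disk every 5 seconds.
        flusher.scheduleAtFixedRate(this::flush, 5, 5, TimeUnit.SECONDS);
    }

    // Writes land in fast memory immediately; the slow disk write is deferred.
    void write(String key, String value) {
        dirty.put(key, value);
    }

    private void flush() {
        for (String key : dirty.keySet()) {
            String value = dirty.remove(key);
            if (value != null) {
                flushToDisk(key, value); // hypothetical slow-medium write
            }
        }
    }

    private void flushToDisk(String key, String value) {
        System.out.println("flushing " + key + "=" + value + " to disk");
    }
}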

It can be concluded that the purpose of caching is to improve read and write performance. In real business scenarios, it is mostly read performance we improve, in pursuit of lower latency and higher concurrency.

2. Common cache tools

In daily business, MySQL and Redis are the data stores we use most. We cache MySQL's hot data in Redis to improve read performance and reduce the read pressure on MySQL (see the sketch after the two examples below). For example:

Weibo's trending-topics list is read at a very high frequency, and its read counts must be updated in real time. Keeping the list in Redis improves both performance and concurrency.

Product information is updated infrequently but read very frequently, especially for popular products.
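Putting the two examples together, a minimal cache-aside read path looks roughly like the sketch below. The CacheClient and ProductDao types are hypothetical stand-ins for a Redis client and a MySQL DAO, and the 10-minute TTL is an arbitrary choice:

interface CacheClient {
    String get(String key);
    void setex(String key, int seconds, String value);
}

interface ProductDao {
    String findById(String id);
}

class ProductService {
    private final CacheClient cache;
    private final ProductDao dao;

    ProductService(CacheClient cache, ProductDao dao) {
        this.cache = cache;
        this.dao = dao;
    }

    // Cache-aside read: try Redis first, fall back to MySQL on a miss,
    // then populate the cache with a TTL so hot data stays in Redis.
    String getProduct(String id) {
        String key = "product:" + id;
        String cached = cache.get(key);
        if (cached != null) {
            return cached;                 // hit: MySQL is never touched
        }
        String fromDb = dao.findById(id);  // miss: read from MySQL
        if (fromDb != null) {
            cache.setex(key, 600, fromDb); // keep it hot for 10 minutes
        }
        return fromDb;
    }
}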

Common caching tools and frameworks in Java back-end development are listed as follows:

Local cache: Guava LocalCache, Ehcache. Ehcache is more feature-rich (a minimal Guava sketch follows this list).

Distributed cache: Redis, Memcached. Redis is the most mainstream and commonly used.
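For the local-cache option, a minimal Guava Cache sketch looks like this (the size bound and expiry values are arbitrary choices of mine):

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import java.util.concurrent.TimeUnit;

public class LocalCacheDemo {
    public static void main(String[] args) {
        // A bounded local cache with time-based expiry.
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .maximumSize(1000)                      // cap entries; evicts roughly LRU
                .expireAfterWrite(10, TimeUnit.MINUTES) // drop entries by age
                .build();

        cache.put("product:1", "iPhone");
        System.out.println(cache.getIfPresent("product:1")); // iPhone
        System.out.println(cache.getIfPresent("product:2")); // null on a miss
    }
}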

3. Write an LRU algorithm

There are three common cache eviction algorithms:

LRU (Least Recently Used)

LFU (Least Frequently Used)

FIFO (First In, First Out)

Handwritten LRU can be implemented in several ways. The simplest is based on LinkedHashMap: with access order enabled, the least recently accessed entry sits at the head of the map's iteration order and the most recently accessed at the tail, so when the cache is full the head ("eldest") entry is the one to evict. The implementation is as follows:


import java.util.LinkedHashMap;
import java.util.Map;

class LRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int CACHE_SIZE;

    /**
     * @param cacheSize the maximum number of entries the cache may hold
     */
    public LRUCache(int cacheSize) {
        // true enables access order: the least recently accessed entry is
        // kept at the head of the iteration order, the most recently
        // accessed at the tail.
        super((int) Math.ceil(cacheSize / 0.75) + 1, 0.75f, true);
        CACHE_SIZE = cacheSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the eldest (least recently accessed) entry once the map
        // grows past the cache size.
        return size() > CACHE_SIZE;
    }
}
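A quick usage sketch (keys and values are arbitrary):

public class LRUCacheDemo {
    public static void main(String[] args) {
        LRUCache<String, String> cache = new LRUCache<>(2);
        cache.put("a", "1");
        cache.put("b", "2");
        cache.get("a");                      // "a" becomes the most recently used
        cache.put("c", "3");                 // evicts "b", the least recently used
        System.out.println(cache.keySet());  // prints [a, c]
    }
}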

4. What are the common pitfalls of using cache?

Common problems are as follows (a small mitigation sketch for two of them follows the list):

When should the cache be written, and how do I avoid duplicate writes under concurrency?

How do cache entries expire or become invalid?

How do I ensure consistency between the cache and the DB?

How do I avoid cache penetration (requests for keys that exist nowhere)?

How do I avoid cache breakdown (a hot key expiring under heavy load)?

How do I avoid cache avalanche (many keys expiring at once)?
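Each of these deserves its own article, but as one hedged illustration: cache penetration is often softened by briefly caching an explicit empty marker, and cache avalanche by adding random jitter to TTLs. The KvCache and Db interfaces and all the numbers below are assumptions of mine, not from the original:

import java.util.concurrent.ThreadLocalRandom;

interface KvCache {
    String get(String key);
    void setex(String key, int seconds, String value);
}

interface Db {
    String query(String key);
}

class GuardedReader {
    private static final String EMPTY = "__EMPTY__"; // marker for "not in DB"
    private final KvCache cache;
    private final Db db;

    GuardedReader(KvCache cache, Db db) {
        this.cache = cache;
        this.db = db;
    }

    String read(String key) {
        String v = cache.get(key);
        if (EMPTY.equals(v)) {
            return null;   // known-missing key: skip the DB (anti-penetration)
        }
        if (v != null) {
            return v;      // normal cache hit
        }
        String fromDb = db.query(key);
        // Random jitter spreads expiry times so a burst of keys does not
        // expire at the same moment (anti-avalanche); misses are cached too.
        int ttl = 600 + ThreadLocalRandom.current().nextInt(60);
        cache.setex(key, ttl, fromDb == null ? EMPTY : fromDb);
        return fromDb;
    }
}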

End

About the author: Bold Small Ears, a bold programmer. I want to explore the world of technology with you and look at the world from a technical perspective. Welcome to scan the QR code below and follow along; a big wave of original articles is on the way.

Scan the code and reply "666" to get a free copy of Java senior engineer learning materials.