An overview of caching

When we speak of caching, the HTTP caching mechanism and LruCache probably come to mind. Caching did originate in networking, but that is caching in the narrow sense; in the broad sense it means any reuse of data. The caching discussed here is the broad kind, chiefly the memory cache and the disk cache. To understand the caching system better, though, it helps to first review a little computer architecture.

CPU

The CPU consists of the arithmetic logic unit and the control unit and is one of the main components of a computer. Its job is to interpret computer instructions and process the data used by software. Computer programmability mainly refers to programming the central processing unit. The CPU, internal memory, and input/output devices are the three core components of a modern computer.

Memory

There are many types of memory, which can be divided into main memory and auxiliary memory according to their purposes.

Main memory

Also known simply as memory, main memory is the storage space the CPU can address directly, and it is characterized by fast access. It is generally built from semiconductor storage and includes Random Access Memory (RAM), Read Only Memory (ROM), and the CPU cache.

  • RAM: data can be read and written randomly, but the stored data is lost when the power is turned off;
  • ROM: can only be read, not modified; data is retained even when the machine is powered off;
  • Cache: sits between the CPU and main memory and is usually divided into level-1 (L1), level-2 (L2), and level-3 (L3) caches (common in Intel processors). It reads and writes faster than main memory. When the CPU reads or writes data in memory, a copy is kept in the cache; the next time that data is needed, the CPU reads it directly from the cache instead of the slower main memory.

Auxiliary storage

Auxiliary storage is also called external storage. For a computer it usually means the hard disk or optical disc; for a mobile phone it generally refers to the SD card, although many manufacturers now build the storage into the device.

Cache type

  • Memory cache: data cached in main memory (RAM)
  • Disk cache: data cached in external storage, such as a computer's hard drive or a phone's SD card (a bare-bones sketch follows below)
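
For concreteness, here is a bare-bones sketch of a file-based disk cache: one file per entry, keyed by file name. The class name is just an example, and size tracking and eviction, which a real implementation such as DiskLruCache would provide, are deliberately left out.

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.file.Files;

// Bare-bones disk cache sketch: one file per entry, keyed by file name.
// Size tracking and eviction are deliberately omitted.
public class SimpleDiskCache {
    private final File dir;

    public SimpleDiskCache(File cacheDir) {
        this.dir = cacheDir;
        if (!dir.exists()) {
            dir.mkdirs();
        }
    }

    public void put(String key, byte[] data) throws IOException {
        try (FileOutputStream out = new FileOutputStream(new File(dir, key))) {
            out.write(data);
        }
    }

    public byte[] get(String key) throws IOException {
        File file = new File(dir, key);
        return file.exists() ? Files.readAllBytes(file.toPath()) : null;
    }
}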

Cache capacity

This is simply the size of the cache; once it is full, entries need to be cleaned up.

Caching strategies

Both the memory cache and the disk cache have limited capacity, so, much like the rejection policies used when a thread pool is full, we need a policy for handling a full cache. Common policies are:

  • FIFO (First In First Out): evict in insertion order, similar to a queue.

  • LFU (Least Frequently Used): evict the entries used least often; the RecyclerView cache uses this policy.

  • LRU (Least Recently Used): evict the entries that have gone unused the longest; this is the strategy Picasso uses for its memory cache.

When the cache reaches its specified capacity, elements are evicted according to the chosen policy.
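
As an illustration of the LRU policy, here is a minimal sketch of a memory cache built on Android's LruCache, sized to roughly one eighth of the available heap (a common rule of thumb); the class name and sizing are just examples, not anything a particular framework prescribes.

import android.graphics.Bitmap;
import android.util.LruCache;

// Minimal LRU memory cache sketch: when the total size exceeds the limit,
// the least recently used bitmaps are evicted automatically.
public class BitmapMemoryCache {
    private final LruCache<String, Bitmap> cache;

    public BitmapMemoryCache() {
        // Use about 1/8 of the available heap, measured in KB.
        int maxKb = (int) (Runtime.getRuntime().maxMemory() / 1024 / 8);
        cache = new LruCache<String, Bitmap>(maxKb) {
            @Override
            protected int sizeOf(String key, Bitmap value) {
                // Measure each entry in KB so it matches the cache's unit.
                return value.getByteCount() / 1024;
            }
        };
    }

    public void put(String key, Bitmap bitmap) {
        if (key != null && bitmap != null) {
            cache.put(key, bitmap);
        }
    }

    public Bitmap get(String key) {
        return cache.get(key);
    }
}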

Memory leaks

Memory leaks occur primarily with memory caches: a leak happens when a long-lived object (such as a cache) keeps a reference to an object with a shorter life cycle (such as an Activity or View), preventing it from being collected. There are two common ways to avoid this:

  • Reference nulling: set the cached reference to null when it is no longer needed, so the GC can reclaim the object
  • Weak references: hold the object through a weak reference, so the cache does not extend the object's life cycle and the GC can collect it normally

Both approaches are routinely used to prevent memory leaks, but weak references are the more common choice.

Java actually has four kinds of references: strong, soft, weak, and phantom. The one we use most is the weak reference, i.e. WeakReference.

Weak vs. soft references:

An object that is only weakly referenced has a shorter lifetime: when the garbage collector scans the memory it manages, any object reachable only through weak references is reclaimed, regardless of whether memory is currently tight. However, because the garbage collector runs as a low-priority thread, such objects are not necessarily found and reclaimed immediately.
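
A small sketch to make the difference concrete. The allocation sizes and the explicit System.gc() call are only for demonstration, and GC behavior is not guaranteed, so the output may vary between runs.

import java.lang.ref.SoftReference;
import java.lang.ref.WeakReference;

// After a GC, an object reachable only through a WeakReference is usually gone,
// while one held by a SoftReference usually survives until memory runs low.
public class ReferenceDemo {
    public static void main(String[] args) {
        WeakReference<byte[]> weak = new WeakReference<>(new byte[1024 * 1024]);
        SoftReference<byte[]> soft = new SoftReference<>(new byte[1024 * 1024]);

        System.gc(); // request a collection (only a hint to the JVM)

        System.out.println("weak.get() = " + weak.get()); // typically null
        System.out.println("soft.get() = " + soft.get()); // typically still non-null
    }
}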

The following briefly illustrates the two ways of preventing memory leaks.

Reference nulling

RecyclerView's inner class LayoutManager holds the RecyclerView through a strong reference rather than a weak one, but it provides a method that nulls the reference out:

public static abstract class LayoutManager {
    ChildHelper mChildHelper;
    RecyclerView mRecyclerView;
    @Nullable
    SmoothScroller mSmoothScroller;
    private boolean mRequestedSimpleAnimations = false;
    boolean mIsAttachedToWindow = false;
    private boolean mAutoMeasure = false;
    private boolean mMeasurementCacheEnabled = true;
    private int mWidthMode, mHeightMode;
    private int mWidth, mHeight;

    void setRecyclerView(RecyclerView recyclerView) {
        if (recyclerView == null) {
            // recycling: null out the references so the RecyclerView can be collected
            mRecyclerView = null;
            mChildHelper = null;
            mWidth = 0;
            mHeight = 0;
        } else {
            // initialization: hold strong references to the attached RecyclerView
            mRecyclerView = recyclerView;
            mChildHelper = recyclerView.mChildHelper;
            mWidth = recyclerView.getWidth();
            mHeight = recyclerView.getHeight();
        }
        mWidthMode = MeasureSpec.EXACTLY;
        mHeightMode = MeasureSpec.EXACTLY;
    }
    // ...
}

Using weak references

Take the Action class in Picasso as an example: the parent class holds its target through a WeakReference.

The Action parent class:

abstract class Action<T> {
  final WeakReference<T> target;

  Action(Picasso picasso, T target, Request request, int memoryPolicy, int networkPolicy,
      int errorResId, Drawable errorDrawable, String key, Object tag, boolean noFade) {
    this.picasso = picasso;
    this.request = request;
    // The target is not held directly but wrapped in a weak reference
    // (simplified here; the actual source wraps it in a RequestWeakReference
    // subclass tied to a reference queue).
    this.target = target == null ? null : new WeakReference<>(target);
    this.memoryPolicy = memoryPolicy;
    this.networkPolicy = networkPolicy;
    this.noFade = noFade;
    this.errorResId = errorResId;
    this.errorDrawable = errorDrawable;
    this.key = key;
    this.tag = (tag != null ? tag : this);
  }
  // ...
}

The ImageViewAction subclass:

class ImageViewAction extends Action<ImageView> {
  Callback callback;

  ImageViewAction(Picasso picasso, ImageView imageView, Request data, int memoryPolicy,
      int networkPolicy, int errorResId, Drawable errorDrawable, String key, Object tag,
      Callback callback, boolean noFade) {
    super(picasso, imageView, data, memoryPolicy, networkPolicy, errorResId, errorDrawable, key,
        tag, noFade);
    this.callback = callback;
  }

  @Override public void complete(Bitmap result, Picasso.LoadedFrom from) {
    if (result == null) {
      throw new AssertionError(
          String.format("Attempted to complete action with no result!\n%s", this));
    }

    // The target ImageView is held through a weak reference; if it has already
    // been collected, simply drop the result instead of keeping the view alive.
    ImageView target = this.target.get();
    if (target == null) {
      return;
    }

    Context context = picasso.context;
    boolean indicatorsEnabled = picasso.indicatorsEnabled;
    PicassoDrawable.setBitmap(target, context, result, from, noFade, indicatorsEnabled);
  }
  // ...
}

Since the ImageView holds a reference to its Context, the GC could not reclaim the Activity if the Action kept a strong reference to the ImageView. With a weak reference, once the Activity becomes eligible for collection, the ImageViewAction's reference does not stand in the way of reclaiming it.

Cache time

The cache duration can be set according to business needs. Note, however, that cache expiry should be judged against server time rather than the device clock; the timestamp in the response headers returned by the server can be used as the basis for that judgment.
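
As a sketch of that idea, a cached entry can remember the server's Date header together with its offset from the local clock, and judge expiry against the estimated server time. The class name, field names, and 10-minute lifetime below are illustrative, not from any particular framework.

import java.net.HttpURLConnection;

// Judge cache freshness against server time rather than the device clock.
public class CachedEntry {
    private static final long MAX_AGE_MS = 10 * 60 * 1000; // business-defined lifetime, e.g. 10 minutes

    private final long fetchedAtServerTime; // server time when the response was received
    private final long clockOffsetMs;       // serverTime - localTime at fetch time

    public CachedEntry(HttpURLConnection connection) {
        long localNow = System.currentTimeMillis();
        // Date header of the response; fall back to the local clock if it is missing.
        this.fetchedAtServerTime = connection.getHeaderFieldDate("Date", localNow);
        this.clockOffsetMs = fetchedAtServerTime - localNow;
    }

    public boolean isExpired() {
        long estimatedServerNow = System.currentTimeMillis() + clockOffsetMs;
        return estimatedServerNow - fetchedAtServerTime > MAX_AGE_MS;
    }
}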

Read order

The memory cache is much faster to read than the disk cache. As we know, Picasso uses two caches, a memory cache and a disk cache: when fetching an image it reads from memory first, and images fetched from the network are added to the disk cache. At first I did not do it this way. I created a Runnable for each of the memory, disk, and network reads, and the images always flashed when loading the next page. With Picasso, UIL, and Glide this did not happen, yet once I set them to skip the memory cache (e.g. MemoryPolicy.NO_CACHE in Picasso) they flashed too, as shown below.

In both cases there is a flash, and the common cause is the memory cache, as JakeWharton pointed out in a Picasso issue.

Yes, 200ms: if the Bitmap cannot be delivered within that time, there is a flicker, which explains both situations above (we had set a placeholder drawable). The first flicker happened because I moved the memory cache read onto a worker thread; thread creation and thread switching take time, so the total exceeded 200ms. Likewise, in the second case, with no memory cache the image can only come from disk or the network, which takes more than 200ms and also causes a flash. This is why image loading frameworks read from memory first, and why they flash when the memory cache is disabled.

At the same time, the disk cache needs the HTTP caching mechanism to keep its contents fresh; this will be analyzed in detail later.
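
Putting the read order together, here is a rough sketch of the memory → disk → network lookup described above. DiskCache and Downloader are hypothetical interfaces standing in for a real disk cache and HTTP stack; this is not Picasso's actual implementation.

import android.graphics.Bitmap;
import android.util.LruCache;

// Tiered lookup sketch: memory first, then disk, then network,
// writing network results back into both caches.
public class TieredImageLoader {
    interface DiskCache { Bitmap get(String key); void put(String key, Bitmap bitmap); }
    interface Downloader { Bitmap download(String key); }

    private final LruCache<String, Bitmap> memoryCache = new LruCache<>(32); // 32 entries, for simplicity
    private final DiskCache diskCache;
    private final Downloader downloader;

    public TieredImageLoader(DiskCache diskCache, Downloader downloader) {
        this.diskCache = diskCache;
        this.downloader = downloader;
    }

    public Bitmap load(String key) {
        Bitmap bitmap = memoryCache.get(key);   // 1. fastest: memory
        if (bitmap != null) return bitmap;

        bitmap = diskCache.get(key);            // 2. slower: disk
        if (bitmap != null) {
            memoryCache.put(key, bitmap);       // promote for the next read
            return bitmap;
        }

        bitmap = downloader.download(key);      // 3. slowest: network
        if (bitmap != null) {
            memoryCache.put(key, bitmap);
            diskCache.put(key, bitmap);         // network results also go to disk
        }
        return bitmap;
    }
}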

Conclusion

Caching itself is fairly easy to understand: when using a memory cache you need to guard against memory leaks, and when using a disk cache you need to rely on the HTTP caching mechanism to keep the cached data fresh.