Cabbage Java self study room covers core knowledge

The path to becoming a Java engineer (1) The path to becoming a Java engineer (2) The path to becoming a Java engineer (3) The path to becoming a Java engineer (5)

Java concurrency – Concurrent containers

1. ThreadLocal

ThreadLocal stores thread-private variables. Each thread has its own container for these variables: a thread can read and write its own copy through set() and get(), but it cannot access another thread's copy, which effectively puts a partition between threads. As long as a thread is alive, its copy of each ThreadLocal value remains reachable; when the thread terminates, its copies become eligible for garbage collection. The bottom line: ThreadLocal stores variables that belong to the current thread.

1.1. Simple use of ThreadLocal

Without further ado, let’s take a look at a simple example of ThreadLocal:

import java.util.concurrent.atomic.AtomicInteger;

public class Test implements Runnable {
    private static AtomicInteger counter = new AtomicInteger(100);
    private static ThreadLocal<String> threadInfo = new ThreadLocal<String>() {
        @Override
        protected String initialValue() {
            return "[" + Thread.currentThread().getName() + "," + counter.getAndIncrement() + "]";
        }
    };

    @Override
    public void run() {
        System.out.println("threadInfo value:" + threadInfo.get());
    }

    public static void main(String[] args) throws InterruptedException {
        Thread thread1 = new Thread(new Test());
        Thread thread2 = new Thread(new Test());

        thread1.start();
        thread2.start();

        thread1.join();
        thread2.join();

        System.out.println("threadInfo value in main:" + threadInfo.get());
    }
}

Output result:

threadInfo value:[Thread-0,100]
threadInfo value:[Thread-1,101]
threadInfo value in main:[main,102]

In the code above, I use ThreadLocal to store thread information in the form [thread name, counter value]. Even though the ThreadLocal variable is declared static, each thread sees its own value: the two worker threads and the main thread all print different contents, showing that the stored value is private to each thread.

1.2. ThreadLocal principle

1.2.1. Access procedure of ThreadLocal

Let’s start with the source code and take a look at the ThreadLocal.set() method:

// set method in ThreadLocal
public void set(T value) {
    // Get the current thread object
    Thread t = Thread.currentThread();
    // Get the threadLocals property of the thread (ThreadLocalMap object).
    ThreadLocalMap map = getMap(t);
    if (map != null)
        map.set(this, value);
    else
        createMap(t, value);
}

// The threadLocals definition in the Thread class
ThreadLocal.ThreadLocalMap threadLocals = null;

// getMap method in ThreadLocal
ThreadLocalMap getMap(Thread t) {
    return t.threadLocals;
}

// createMap method in ThreadLocal
// Create a ThreadLocalMap object for the thread and assign it to threadLocals
void createMap(Thread t, T firstValue) {
    t.threadLocals = new ThreadLocalMap(this, firstValue);
}

In the set method, the value is stored in a ThreadLocalMap, an inner class of ThreadLocal that each Thread holds in its threadLocals field. The get method operates on the same ThreadLocalMap, so storage and retrieval essentially live in that class: the ThreadLocal instance is used as the key and the stored data is the value. Because the map is a field of each thread, the stored values are private to that thread, which is exactly what makes a ThreadLocal "thread-private". Every access first fetches the current thread's threadLocals field (the ThreadLocalMap) and then reads or writes the value using the ThreadLocal object as the key.
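For completeness, the get() path mirrors the set() path. Below is the JDK 8 implementation, lightly annotated (details may differ slightly between JDK versions):

// get method in ThreadLocal (JDK 8)
public T get() {
    // Get the current thread object
    Thread t = Thread.currentThread();
    // Get the thread's threadLocals property (ThreadLocalMap object)
    ThreadLocalMap map = getMap(t);
    if (map != null) {
        // Look up the Entry using this ThreadLocal instance as the key
        ThreadLocalMap.Entry e = map.getEntry(this);
        if (e != null) {
            @SuppressWarnings("unchecked")
            T result = (T) e.value;
            return result;
        }
    }
    // No value stored yet: initialize with initialValue() and return it
    return setInitialValue();
}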

1.2.2. Explore ThreadLocalMap objects

The ThreadLocalMap object is a static inner class of ThreadLocal. It is a simplified map: each stored value is wrapped in an Entry.

static class Entry extends WeakReference<ThreadLocal<?>> {
    /** The value associated with this ThreadLocal. */
    Object value;

    Entry(ThreadLocal<?> k, Object v) {
        super(k);
        value = v;
    }
}

Entry is an inner class of ThreadLocalMap, and a close look at its source shows that it extends a weak reference to the ThreadLocal key. Recall the four reference types in Java: strong, soft, weak, and phantom. A strong reference is the ordinary reference produced by new; as long as a strong reference exists, the garbage collector will never reclaim the object. A SoftReference is reclaimed only when memory is about to run out. A WeakReference describes non-essential objects and is weaker than a soft reference: an object reachable only through weak references survives only until the next GC, when it is reclaimed regardless of whether memory is sufficient. A PhantomReference is the weakest relation of all; it does not affect an object's lifetime and cannot be used to obtain the object instance, which may be reclaimed at any time.
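The behavior of a weak reference can be seen in a few lines. The following is a minimal sketch (the class name is made up for illustration, and System.gc() is only a hint to the JVM, so the exact output is not guaranteed):

import java.lang.ref.WeakReference;

public class WeakRefDemo {
    public static void main(String[] args) {
        Object strong = new Object();
        WeakReference<Object> weak = new WeakReference<>(strong);

        System.out.println(weak.get() != null);  // true: the strong reference keeps the object alive

        strong = null;   // drop the only strong reference
        System.gc();     // request a GC; weakly reachable objects are normally collected here

        System.out.println(weak.get());          // typically null after the collection
    }
}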

1.2.3. Collection of ThreadLocal objects

Entry uses a weak reference for the key precisely so that it does not affect the garbage collection of the ThreadLocal. If the key were a strong reference, then even after we stop using a ThreadLocal during thread execution and set our own reference to null, the thread's ThreadLocalMap would still hold a strong reference to it and it could never be collected. Because Entry extends WeakReference, once the ThreadLocal reference is set to null there is no longer any strong reference from the thread's ThreadLocalMap, and the ThreadLocal object can be reclaimed by GC.

// The remove method in ThreadLocal
public void remove() {
    ThreadLocalMap m = getMap(Thread.currentThread());
    if (m != null)
        m.remove(this);
}

// The remove method in ThreadLocalMap
private void remove(ThreadLocal<?> key) {
    Entry[] tab = table;
    int len = tab.length;
    int i = key.threadLocalHashCode & (len - 1);
    for (Entry e = tab[i];
         e != null;
         e = tab[i = nextIndex(i, len)]) {
        if (e.get() == key) {
            e.clear();
            expungeStaleEntry(i);
            return;
        }
    }
}

Possible memory leak problems:

A ThreadLocalMap has the same lifetime as its thread, but a ThreadLocal does not necessarily: a ThreadLocal may no longer be needed and be ready for reclamation while the thread keeps running (for example, a reused thread in a thread pool). If the ThreadLocal object is reachable only through weak references, it will be cleaned up at the next garbage collection.

If a ThreadLocal is no longer strongly referenced, it will be reclaimed during garbage collection, and so will the corresponding key in the ThreadLocalMap. The value, however, is still strongly referenced by its Entry and is not cleaned up, leaving an entry whose key is null.

ThreadLocalMap does clear entries with a null key when set(), get(), or remove() is called, but until that happens the stale value lingers, so a memory leak can still occur after the ThreadLocal reference is set to null. The reliable way to avoid it is to call ThreadLocal.remove() explicitly instead of relying on threadLocal = null: remove() deletes the Entry from the map immediately, whereas setting the reference to null only drops your own reference and leaves the Entry in the ThreadLocalMap to be cleaned up some time later.
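In practice this means cleaning up in a finally block. The following is a minimal sketch (class and method names are made up for illustration):

public class ContextHolder {
    private static final ThreadLocal<String> CONTEXT = new ThreadLocal<>();

    // Typical pattern for pooled threads: set the value, use it, and always remove it.
    public static void handleRequest(String user) {
        CONTEXT.set(user);
        try {
            // ... business logic that reads CONTEXT.get() ...
        } finally {
            CONTEXT.remove();  // deletes the Entry from this thread's ThreadLocalMap immediately
        }
    }
}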

1.3. Application scenarios of ThreadLocal

  1. For more complex business flows, use ThreadLocal instead of passing parameters explicitly.
  2. Store a per-thread database Connection in a ThreadLocal so that the same Connection is reused across multiple calls within a thread (Spring's transaction management binds the Connection from a DataSource to the thread this way); see the sketch after this list.
  3. Manage sessions by storing the Session in a ThreadLocal, so that all processing within a thread always uses the same Session.
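As a rough illustration of scenario 2, here is a minimal sketch of a per-thread Connection holder (the class name and JDBC URL are assumptions for illustration; this is not Spring's actual implementation):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class ConnectionHolder {
    private static final String URL = "jdbc:h2:mem:test";  // assumed JDBC URL for illustration

    // Each thread lazily opens and caches its own Connection.
    private static final ThreadLocal<Connection> HOLDER = ThreadLocal.withInitial(() -> {
        try {
            return DriverManager.getConnection(URL);
        } catch (SQLException e) {
            throw new IllegalStateException(e);
        }
    });

    public static Connection get() {
        return HOLDER.get();  // the same Connection is returned every time within one thread
    }
}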

2. Common concurrent containers

When multithreaded concurrency is not a concern, non-thread-safe container classes such as ArrayList and HashMap tend to be more efficient. In concurrent scenarios, thread-safe containers such as ConcurrentHashMap and ArrayBlockingQueue are used instead, trading some efficiency for safety.

All thread-safe containers mentioned above are in the java.util.concurrent package, which contains a number of concurrent containers.

ConcurrentHashMap: a concurrent HashMap

One of the most common concurrent containers, often used as a cache in concurrent scenarios. The underlying structure is still a hash table, but it changed significantly in Java 8; because Java 7 and Java 8 are both still widely deployed, the two implementations are often compared (for example, in interviews).

One big difference is that Java 7 uses segment locking to reduce lock contention, while Java 8 drops segment locking in favor of CAS (a form of optimistic locking) combined with synchronized on individual buckets. In addition, to avoid severe performance degradation under heavy hash collisions (colliding keys are chained into a linked list at the same bucket), Java 8 converts a bucket's linked list into a red-black tree once its length reaches a threshold (8), because a tree's lookup performance is more stable than a linked list's.
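To illustrate the cache use case mentioned above, here is a minimal sketch (class and method names are made up for illustration):

import java.util.concurrent.ConcurrentHashMap;

public class UserCache {
    private final ConcurrentHashMap<Long, String> cache = new ConcurrentHashMap<>();

    public String getUserName(long id) {
        // computeIfAbsent is atomic per key: the loader runs at most once for a missing key
        return cache.computeIfAbsent(id, this::loadFromDatabase);
    }

    private String loadFromDatabase(long id) {
        return "user-" + id;  // placeholder for a real lookup
    }
}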

CopyOnWriteArrayList: a concurrent ArrayList

A concurrent version of ArrayList, also backed by an array. Unlike ArrayList, every add or remove first copies the underlying array, applies the change to the copy, and finally swaps the copy in as the new backing array.

Application scenario: reads are not locked, while writes (add, delete, modify) are locked, so it suits workloads with many reads and few writes.

Limitation: since reads are not locked (they are as efficient as a plain ArrayList) and read from the current copy of the array, stale data may be observed. If that matters for your use case, this container is not a good fit.
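A typical read-heavy, write-light use case is a listener registry: listeners are registered rarely but notified often. A minimal sketch (the class and interface names are made up for illustration):

import java.util.concurrent.CopyOnWriteArrayList;

public class EventBus {
    interface Listener {
        void onEvent(String event);
    }

    private final CopyOnWriteArrayList<Listener> listeners = new CopyOnWriteArrayList<>();

    public void register(Listener l) {
        listeners.add(l);  // write: copies the underlying array under a lock
    }

    public void publish(String event) {
        for (Listener l : listeners) {  // read: no lock, iterates over a stable snapshot
            l.onEvent(event);
        }
    }
}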

CopyOnWriteArraySet: a concurrent Set (array-based)

The implementation is based on CopyOnWriteArrayList (it holds a CopyOnWriteArrayList member variable), so the underlying structure is also an array. This means every add has to traverse the entire collection to check whether the element already exists, and inserts (under the lock) only if it does not.

Application scenario: the same as CopyOnWriteArrayList, with the extra constraint that the set should not be too large (because every add traverses the whole collection).

ConcurrentLinkedQueue: Concurrent queue (based on linked lists)

A concurrent queue based on a linked list, using optimistic locking (CAS) to ensure thread safety. Because the structure is a linked list, there is in theory no size limit, so adding an element always succeeds.

ConcurrentLinkedDeque: Concurrent deque (based on a doubly linked list)

A concurrent deque based on a doubly linked list. It can operate on both the head and the tail, so besides first-in-first-out (FIFO) it also supports first-in-last-out (FILO), in which case it is effectively being used as a stack.

ConcurrentSkipListMap: a concurrent Map based on a skip list

A skip list (SkipList) trades space for time: it builds redundant index layers over a linked list so that lookups can skip ahead, giving search behavior similar to binary search.

ConcurrentSkipListSet: a concurrent Set based on a skip list

ConcurrentSkipListSet is implemented on top of ConcurrentSkipListMap: it holds a ConcurrentSkipListMap internally and stores its elements as the map's keys, much like HashSet is built on HashMap.

ArrayBlockingQueue: Blocking queue (array-based)

An array-based blocking queue; the capacity must be specified at construction time. When the array is full, insertion blocks until space becomes available (immediate-return and timed-wait variants are also provided). A ReentrantLock is used to ensure thread safety.
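A minimal producer/consumer sketch showing the blocking behavior (the class name is made up for illustration):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ProducerConsumerDemo {
    public static void main(String[] args) {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(2);  // capacity fixed at construction

        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) {
                    queue.put(i);  // blocks while the queue is full
                    System.out.println("produced " + i);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) {
                    System.out.println("consumed " + queue.take());  // blocks while the queue is empty
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
    }
}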

LinkedBlockingQueue: Blocking queue (based on linked lists)

A blocking queue based on a linked list. Compared with the non-blocking ConcurrentLinkedQueue, it adds an optional capacity limit; if none is specified, the capacity defaults to Integer.MAX_VALUE.

LinkedBlockingDeque: Blocking queue (based on two-way linked lists)

Similar to LinkedBlockingQueue, but with the extra operations a doubly linked list allows, such as inserting and removing at both ends.

SynchronousQueue: a queue that pairs every write with a read

A queue in name only, because it has no real space to store elements: every insert must be matched by a corresponding take, and a producer cannot keep putting elements without someone taking them. One usage in the JDK is Executors.newCachedThreadPool(), which builds a cached thread pool on top of a SynchronousQueue.

PriorityBlockingQueue: Thread-safe priority queue

It can be constructed with a comparator; elements put into the queue are kept in priority order and are consumed in that order when read. Note that low-priority elements may never be consumed if higher-priority elements keep arriving.
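A minimal sketch showing consumption in priority order rather than insertion order (the Task class is made up for illustration):

import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

public class PriorityDemo {
    static class Task {
        final String name;
        final int priority;

        Task(String name, int priority) {
            this.name = name;
            this.priority = priority;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        PriorityBlockingQueue<Task> queue =
                new PriorityBlockingQueue<>(16, Comparator.comparingInt((Task t) -> t.priority));

        queue.put(new Task("low", 5));
        queue.put(new Task("high", 1));
        queue.put(new Task("medium", 3));

        // take() returns the smallest element according to the comparator: high, medium, low
        System.out.println(queue.take().name);
        System.out.println(queue.take().name);
        System.out.println(queue.take().name);
    }
}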

LinkedTransferQueue: A data exchange queue based on a linked list

It implements the TransferQueue interface. When an element is handed over via the transfer method, if a consumer thread is already blocked waiting for an element, the element is passed to that thread directly; otherwise the element is placed at the tail of the queue and the method blocks until someone takes it. It is similar to SynchronousQueue, but more flexible.
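A minimal sketch of transfer(): the producer blocks until a consumer actually takes the element (the class name is made up for illustration):

import java.util.concurrent.LinkedTransferQueue;
import java.util.concurrent.TransferQueue;

public class TransferDemo {
    public static void main(String[] args) throws InterruptedException {
        TransferQueue<String> queue = new LinkedTransferQueue<>();

        Thread consumer = new Thread(() -> {
            try {
                Thread.sleep(500);  // simulate a consumer that arrives late
                System.out.println("got " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        queue.transfer("hello");  // blocks here until the consumer takes the element
        System.out.println("transfer returned only after the element was consumed");
        consumer.join();
    }
}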
