Hello, today I want to share Java thread pools with you. Please take out your notebook and write this down!

1. Why use a thread pool

1. Frequently creating and destroying individual threads wastes resources and causes frequent GC

2. Without unified management, threads compete with one another for resources

2. ThreadPoolExecutor

ThreadPoolExecutor has four overloaded constructors. Let’s look at the one with the most arguments, since it covers everything the other constructors take:

public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue,
                          ThreadFactory threadFactory,
                          RejectedExecutionHandler handler)
  • Detailed description of each parameter:

Here are the 7 parameters (in day-to-day development we more often use the 5-parameter constructor). OK, let’s see what each of the 7 parameters means:

corePoolSize: the number of core threads in the thread pool

maximumPoolSize: the maximum number of threads in the thread pool

keepAliveTime: the idle timeout of a non-core thread. A non-core thread that has been idle longer than keepAliveTime is reclaimed. If ThreadPoolExecutor’s allowCoreThreadTimeOut property is set to true, this timeout also applies to core threads

unit: the time unit of keepAliveTime (the third argument); options include TimeUnit.NANOSECONDS, MICROSECONDS, MILLISECONDS, SECONDS, MINUTES, HOURS, DAYS, and so on

workQueue: the task queue of the thread pool, used to store tasks that have been submitted but not yet executed. The tasks stored here are submitted through ThreadPoolExecutor’s execute method.

threadFactory: creates new threads for the thread pool; we usually use the default factory

handler: the rejection policy, applied when no thread can take on a new task (usually because the number of threads has reached maximumPoolSize and the queue is full, or because the pool has been shut down). By default, when the pool cannot handle a new task it throws a RejectedExecutionException.
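To make the parameters concrete, here is a minimal sketch of building a pool with all seven arguments; the sizes, the PoolDemo class name, and the choice of queue and policy are just illustrative:

import java.util.concurrent.*;

public class PoolDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                     // corePoolSize
                4,                                     // maximumPoolSize
                60L,                                   // keepAliveTime
                TimeUnit.SECONDS,                      // unit for keepAliveTime
                new ArrayBlockingQueue<Runnable>(100), // workQueue
                Executors.defaultThreadFactory(),      // threadFactory
                new ThreadPoolExecutor.AbortPolicy()); // handler (the default policy)

        pool.execute(() -> System.out.println("running on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}

The workQueue argument in particular has several common implementations: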

1. ArrayBlockingQueue: a BlockingQueue with a fixed size. Its constructor takes an int that specifies the capacity of the queue. Elements stored in an ArrayBlockingQueue are taken out in FIFO (first in, first out) order.

2. LinkedBlockingQueue: a BlockingQueue backed by a linked list. A capacity can be given via LinkedBlockingQueue(int); if none is given, the capacity defaults to Integer.MAX_VALUE.

3. PriorityBlockingQueue: similar to LinkedBlockingQueue, except that its elements are not ordered FIFO. The retrieval order is determined by the elements’ natural ordering or by a supplied Comparator (which means the elements stored in a PriorityBlockingQueue must implement Comparable, or a Comparator must be provided).

4. SynchronousQueue: a thread-safe BlockingQueue with no internal capacity. An insert on a producer thread must wait for a corresponding remove on a consumer thread, so you cannot peek at or traverse a SynchronousQueue, and an element only exists at the moment you attempt to take it. We can think of it as a producer and a consumer waiting for each other and then leaving together.
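As a quick sketch of how these queues are constructed and behave (the capacities and the QueueDemo name are arbitrary):

import java.util.concurrent.*;

public class QueueDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Runnable> bounded = new ArrayBlockingQueue<>(10);  // fixed capacity, FIFO order
        BlockingQueue<Runnable> unbounded = new LinkedBlockingQueue<>(); // capacity defaults to Integer.MAX_VALUE
        BlockingQueue<Runnable> handoff = new SynchronousQueue<>();      // no capacity: a put() waits for a take()

        // PriorityBlockingQueue hands out elements by their natural ordering (or a supplied Comparator)
        BlockingQueue<Integer> priority = new PriorityBlockingQueue<>();
        priority.put(3);
        priority.put(1);
        priority.put(2);
        System.out.println(priority.take()); // prints 1: the smallest element comes out first
    }
}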

Rejection policies. AbortPolicy: rejects the task directly and throws a RejectedExecutionException; this is the default policy.

CallerRunsPolicy: the thread that called the execute method runs the task itself.

DiscardOldestPolicy: discards the oldest unprocessed task, then tries again to execute the current new task.

DiscardPolicy: discards the current task silently, without throwing an exception.
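Here is a small sketch showing a rejection policy kicking in; the pool is deliberately tiny (1 thread, queue of 1) so the third task has to be rejected, and the RejectionDemo name is just for illustration:

import java.util.concurrent.*;

public class RejectionDemo {
    public static void main(String[] args) {
        // 1 core thread, 1 maximum thread and a queue of size 1, so the third task cannot be accepted
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<Runnable>(1),
                new ThreadPoolExecutor.CallerRunsPolicy());

        for (int i = 0; i < 3; i++) {
            final int id = i;
            pool.execute(() -> {
                System.out.println("task " + id + " on " + Thread.currentThread().getName());
                try { Thread.sleep(500); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            });
        }
        pool.shutdown();
    }
}

With CallerRunsPolicy the third task runs on the main thread; swapping in new ThreadPoolExecutor.AbortPolicy() would make the third submission throw a RejectedExecutionException instead.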

3. Execution process

When the number of threads has not reached corePoolSize, a new core thread is created to execute the task.

When the core thread count is full, the task is placed on the blocking queue.

When the queue is full and the maximum number of threads is not reached, a new non-core thread is created to execute the task (important).

When the queue is full and the maximum number of threads has been reached, the rejection policy is applied.
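To watch this flow happen, here is a rough sketch (the sizes and the FlowDemo name are made up for illustration):

import java.util.concurrent.*;

public class FlowDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 4, 30L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<Runnable>(2));

        // Submit 6 slow tasks: 2 occupy the core threads, 2 wait in the queue,
        // and the last 2 push the pool up to its maximum of 4 threads.
        for (int i = 0; i < 6; i++) {
            pool.execute(() -> {
                try { Thread.sleep(2000); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            });
            System.out.println("pool size = " + pool.getPoolSize() + ", queued = " + pool.getQueue().size());
        }
        pool.shutdown();
    }
}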

4. Other thread pools

1. FixedThreadPool

A thread pool of fixed size: you specify the size, and corePoolSize and maximumPoolSize are equal. The blocking queue is a LinkedBlockingQueue with a capacity of Integer.MAX_VALUE.

The number of threads in the pool stays constant. When a new task is submitted, an idle thread executes it immediately if one is available; otherwise the task is parked in the blocking queue. For a fixed-size pool, the thread count never varies.

Because an unbounded LinkedBlockingQueue stores the pending tasks, if tasks are submitted very frequently the queue grows rapidly and can exhaust system resources.

Also, when the pool is idle, that is, when there are no runnable tasks, it does not release its worker threads; they keep consuming some system resources until the pool is shut down.
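A minimal usage sketch, assuming a pool of 3 threads:

import java.util.concurrent.*;

public class FixedPoolDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(3); // always exactly 3 worker threads

        for (int i = 0; i < 5; i++) {
            final int id = i;
            pool.submit(() -> System.out.println("task " + id + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown(); // without this the idle workers stay alive and hold on to resources
    }
}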

2. SingleThreadExecutor

A thread pool with a single worker thread. Its blocking queue is a LinkedBlockingQueue with a default capacity of Integer.MAX_VALUE, so if a large number of requests come in they pile up in the task queue and may cause an OOM.
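A quick sketch; all tasks run one after another on the single worker thread:

import java.util.concurrent.*;

public class SingleThreadDemo {
    public static void main(String[] args) {
        ExecutorService single = Executors.newSingleThreadExecutor();

        // Tasks run sequentially, in submission order, on the one worker thread.
        for (int i = 0; i < 3; i++) {
            final int id = i;
            single.submit(() -> System.out.println("task " + id + " on " + Thread.currentThread().getName()));
        }
        single.shutdown();
    }
}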

3. Executors.newCachedThreadPool()

A cacheable thread pool: when a task arrives, the pool first checks whether it has a previously created idle thread; if so, that thread is reused directly, and if not, a new thread is created and added to the pool.

It is usually used for short-lived asynchronous tasks. The pool is unbounded, and if a thread that ran a previous task is idle when a new task arrives, that thread is reused instead of a new thread being created every time.

Cached threads live for 60 seconds by default: corePoolSize is 0, maximumPoolSize is Integer.MAX_VALUE, and the blocking queue is a SynchronousQueue.

SynchronousQueue is a direct hand-off queue, so it always forces the pool to add a new thread for a new task whenever no idle thread is available.

When no tasks are running and a worker thread has been idle longer than keepAliveTime (60 seconds), the worker thread terminates and is reclaimed. When a new task is submitted and no idle thread is available, a new thread is created to execute it, which incurs some system overhead.

If a large number of tasks are submitted at once and each task takes a while to execute, the pool creates a matching number of new threads to process them, which can deplete system resources very quickly.
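A brief sketch of the reuse behavior (the sleep is only there to let the first thread go idle):

import java.util.concurrent.*;

public class CachedPoolDemo {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService cached = Executors.newCachedThreadPool();

        cached.submit(() -> System.out.println("first task on " + Thread.currentThread().getName()));
        Thread.sleep(100); // let the first task finish so its thread becomes idle

        // The now-idle thread is typically reused here instead of a new one being created.
        cached.submit(() -> System.out.println("second task on " + Thread.currentThread().getName()));
        cached.shutdown();
    }
}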

4. ScheduledThreadPool

Creates a fixed-size thread pool that supports scheduled and periodic task execution.

A scheduled thread pool, which can run tasks periodically; it is often used for periodically synchronizing data.

scheduleAtFixedRate: executes the task at a fixed rate; the period is measured from the start of one execution to the start of the next.

scheduleWithFixedDelay: executes the task with a fixed delay, measured from the end of one execution to the start of the next.
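A short sketch of both methods; the 2-second intervals and the ScheduledDemo name are just for illustration:

import java.util.concurrent.*;

public class ScheduledDemo {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);

        // Runs every 2 seconds, measured from the start of the previous execution.
        scheduler.scheduleAtFixedRate(
                () -> System.out.println("fixed rate: " + System.currentTimeMillis()),
                0, 2, TimeUnit.SECONDS);

        // Runs 2 seconds after the previous execution finishes.
        scheduler.scheduleWithFixedDelay(
                () -> System.out.println("fixed delay: " + System.currentTimeMillis()),
                0, 2, TimeUnit.SECONDS);

        // The scheduler keeps running until it is shut down.
    }
}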

5. Why does Alibaba recommend a custom thread pool

From the analysis above, we can see that newFixedThreadPool and newSingleThreadExecutor both use a LinkedBlockingQueue as the task queue, whose default capacity is Integer.MAX_VALUE, while newCachedThreadPool defines its maximum pool size as Integer.MAX_VALUE.

So the reason we avoid using Executors to create thread pools is that FixedThreadPool and SingleThreadExecutor have request queues of capacity Integer.MAX_VALUE, which may pile up so many requests that an OOM results.

CachedThreadPool allows up to Integer.MAX_VALUE threads to be created, which may spawn a huge number of threads and likewise result in OOM.

6. Other

1. shutdown() closes the thread pool without affecting tasks that have already been submitted

2. shutdownNow() closes the thread pool and attempts to interrupt the threads that are currently executing tasks

3. allowCoreThreadTimeOut(boolean value) allows core threads to be reclaimed when they time out while idle

4. Creating the thread pool with the singleton pattern

import com.google.common.util.concurrent.ThreadFactoryBuilder;

import java.util.concurrent.*;

/**
 * Asynchronous task processor
 */
public class AsyncTaskExecutor {

    public static final int CORE_POOL_SIZE = 10;
    public static final int MAX_POOL_SIZE = 40;
    public static final int KEEP_ALIVE_TIME = 1;
    public static final int BLOCKING_QUEUE_SIZE = 1000;

    // Eagerly created ("hungry-style") singleton instance
    private static final AsyncTaskExecutor INSTANCE = new AsyncTaskExecutor();

    private final ThreadPoolExecutor processExecutor = new ThreadPoolExecutor(
            CORE_POOL_SIZE,
            MAX_POOL_SIZE,
            KEEP_ALIVE_TIME,
            TimeUnit.MICROSECONDS,
            new LinkedBlockingQueue<Runnable>(BLOCKING_QUEUE_SIZE),
            new ThreadFactoryBuilder().setNameFormat("boomoom-thread-pool-%d").build(),
            new ThreadPoolExecutor.DiscardPolicy());

    private AsyncTaskExecutor() {}

    public static AsyncTaskExecutor getInstance() {
        return INSTANCE;
    }

    /**
     * @param task the task to execute asynchronously
     */
    public void execute(Runnable task) {
        processExecutor.submit(task);
    }
}

The difference between the lazy style and the hungry (eager) style

1. The hungry style is inherently thread-safe: the static instance is created when the class is loaded and never changes afterwards.

2. The lazy style, to be thread-safe, must use double-checked locking, and the instance field must be volatile to prevent instruction reordering.
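As a sketch of the lazy, double-checked-locking style (the LazyTaskExecutor class is hypothetical, not part of the code above):

import java.util.concurrent.*;

public class LazyTaskExecutor {

    // volatile prevents instruction reordering, so no thread can see a half-constructed instance
    private static volatile LazyTaskExecutor instance;

    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    private LazyTaskExecutor() {}

    public static LazyTaskExecutor getInstance() {
        if (instance == null) {                      // first check, without locking
            synchronized (LazyTaskExecutor.class) {
                if (instance == null) {              // second check, under the lock
                    instance = new LazyTaskExecutor();
                }
            }
        }
        return instance;
    }

    public void execute(Runnable task) {
        pool.submit(task);
    }
}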

Well, that’s the end of today’s article. I hope it helps those of you sitting confused in front of the screen.