Why thread pools? What are the advantages of thread pools?

The main job of a thread pool is to control the number of running threads: submitted tasks are placed in a queue and executed by threads once those threads have been created. If the number of tasks exceeds the maximum number of threads, the excess tasks wait in the queue until other threads finish executing, and are then taken from the queue and executed.

Main advantages: thread reuse, maximum concurrency control, thread management

(1) Reduced resource consumption: reusing already-created threads avoids the overhead of repeatedly creating and destroying threads;

(2) Improved response speed: when a task arrives, it can run immediately without waiting for a thread to be created;

(3) Improved thread manageability: threads are a scarce resource, and creating them without limit consumes system resources and reduces system stability; a thread pool allows uniform allocation, tuning, and monitoring.

A thread pool is a pool that manages threads. Because creating and closing threads takes time, creating one thread per task is very resource-intensive. Using a thread pool avoids the extra resource consumption of creating and destroying threads, improves response times, and reuses threads. With a thread pool, creating a thread becomes fetching an idle thread from the pool, and closing a thread becomes returning it to the pool.

What are the important parameters of a thread pool?

public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue,
                          ThreadFactory threadFactory,
                          RejectedExecutionHandler handler) 

corePoolSize: the number of core threads in the pool; by default they are not reclaimed even when idle.

maximumPoolSize: the maximum number of threads in the pool, core threads included.

keepAliveTime: the longest time an idle thread is kept alive before it is terminated. It applies to all threads except core threads, because core threads are not cleared even when they have no work (unless allowCoreThreadTimeOut is enabled).

unit: the TimeUnit of keepAliveTime.

workQueue: the queue that holds tasks waiting to be executed.

threadFactory: the factory used to create threads; the default factory is usually sufficient.

handler: the rejection policy applied when the queue is full and the number of worker threads has reached maximumPoolSize.
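Below is a minimal construction sketch that maps each argument to the parameters above; the concrete values (2 core threads, 4 maximum, a 100-slot queue, 60-second keep-alive) are illustrative, not recommendations.

import java.util.concurrent.*;

ThreadPoolExecutor pool = new ThreadPoolExecutor(
        2,                                    // corePoolSize
        4,                                    // maximumPoolSize
        60L, TimeUnit.SECONDS,                // keepAliveTime + unit for idle non-core threads
        new ArrayBlockingQueue<>(100),        // workQueue: bounded task queue
        Executors.defaultThreadFactory(),     // threadFactory
        new ThreadPoolExecutor.AbortPolicy()  // handler: rejection policy
);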

Can threads be pre-created when initializing a thread pool?

prestartAllCoreThreads

Yes, threads can be pre-created when the thread pool is initialized. After the pool is created, calling the prestartAllCoreThreads() method pre-creates corePoolSize core threads.

public int prestartAllCoreThreads() {
    int n = 0;
    while (addWorker(null, true))
        ++n;
    return n;
}
private boolean addWorker(Runnable firstTask, boolean core) {
  // ..
}

The addWorker method adds a worker thread to the pool and runs its task. If firstTask is null, the worker thread loops calling getTask(), which blocks while fetching tasks from the blocking queue (workQueue), so the thread is not released and stays in the pool. Note that passing core = true means the worker counts against corePoolSize, i.e. only core threads are created this way.

So this method pre-creates corePoolSize threads in the pool, even though there are no tasks for them to execute yet.

prestartCoreThread

prestartCoreThread() can also create a thread in advance, except that this method creates only one thread at a time.

public boolean prestartCoreThread() {
    return workerCountOf(ctl.get()) < corePoolSize &&
        addWorker(null, true);
}

If the number of worker threads is smaller than corePoolSize, addWorker is called to create one idle core thread.
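A short usage sketch of both methods (the pool sizes are illustrative); it assumes the java.util.concurrent imports shown earlier.

ThreadPoolExecutor pool = new ThreadPoolExecutor(
        4, 8, 60L, TimeUnit.SECONDS, new LinkedBlockingQueue<>(200));

int started = pool.prestartAllCoreThreads(); // starts up to corePoolSize (4) idle core threads, returns how many were started
boolean one = pool.prestartCoreThread();     // starts at most one core thread; returns false once corePoolSize is reached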

Can the core threads of a thread pool be reclaimed?

allowCoreThreadTimeOut

ThreadPoolExecutor has a private member variable:

private volatile boolean allowCoreThreadTimeOut;

If allowCoreThreadTimeOut is set to true, core threads that stay idle longer than keepAliveTime are also reclaimed.
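A minimal sketch of enabling it (the 30-second keep-alive is an illustrative value; keepAliveTime must be greater than zero for this flag):

ThreadPoolExecutor pool = new ThreadPoolExecutor(
        4, 8, 30L, TimeUnit.SECONDS, new LinkedBlockingQueue<>());

// After this call, core threads idle for more than 30 seconds are terminated as well,
// so the pool can shrink all the way to zero threads when there is no work.
pool.allowCoreThreadTimeOut(true);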

Thread pool execution flow

(1) After creating the thread pool, wait for the submitted task request;

(2) When the execute() method is called to add a request task, the thread pool will make the following judgment:

If the number of running threads is less than corePoolSize, a core thread is created to run the task immediately;

If the number of running threads is greater than or equal to corePoolSize, the task is placed in the task queue;

If the task queue is full and the number of running threads is less than maximumPoolSize (the maximum number of threads), a non-core thread is created to run the task immediately;

If the task queue is full and the number of running threads is greater than or equal to maximumPoolSize, the thread pool applies its rejection policy.

(3) When a thread completes a task, it will take the next task from the queue to execute it;

(4) When a thread has been idle for longer than keepAliveTime and the number of running threads exceeds corePoolSize, that thread is stopped; once all tasks are done, the pool eventually shrinks back to corePoolSize.
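The sketch below walks through this decision order with deliberately tiny, illustrative sizes (core = 1, max = 2, queue capacity = 1), so the fourth task triggers the rejection policy:

ThreadPoolExecutor pool = new ThreadPoolExecutor(
        1, 2, 60L, TimeUnit.SECONDS, new ArrayBlockingQueue<>(1));

Runnable sleepy = () -> {
    try { Thread.sleep(5_000); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
};

pool.execute(sleepy); // 1st task: running threads < corePoolSize, a core thread is created
pool.execute(sleepy); // 2nd task: core thread busy, the task is queued
pool.execute(sleepy); // 3rd task: queue full, running threads < maximumPoolSize, a non-core thread is created
pool.execute(sleepy); // 4th task: queue full and maximumPoolSize reached, RejectedExecutionException (default AbortPolicy)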

Rejection policies

When the task queue is full and cannot accept new tasks, and the pool has already reached its maximum number of threads so no new non-core thread can be created to handle the task, a rejection policy is needed.

AbortPolicy (the default): throws a RejectedExecutionException, preventing the system from running normally.

CallerRunsPolicy: the task is handed back to the caller and run by the calling thread.

DiscardOldestPolicy: Discards the longest waiting task in the task queue and puts the current task into the task queue.

DiscardPolicy: Discards the task without handling or throwing an exception.

In addition to the four default RejectedExecutionHandler implementations provided by the JDK, you can implement the RejectedExecutionHandler interface yourself to customize the rejection policy for your business needs.
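A minimal custom handler sketch; the class name and the log-and-drop behavior are illustrative assumptions, not a JDK-provided policy.

import java.util.concurrent.RejectedExecutionHandler;
import java.util.concurrent.ThreadPoolExecutor;

public class LogAndDiscardPolicy implements RejectedExecutionHandler {
    @Override
    public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
        // Called by the pool when a task cannot be accepted: log it and drop the task.
        System.err.println("Task rejected, pool state: " + executor);
        // Alternatives: record a metric, persist the task for retry, or run it on the caller thread.
    }
}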

What kinds of work queues do thread pools have?

1, ArrayBlockingQueue

A bounded blocking queue based on an array structure that orders elements FIFO (first in, first out).

2, LinkedBlockingQueue

A blocking queue based on a linked-list structure that orders elements FIFO (first in, first out); its throughput is typically higher than ArrayBlockingQueue. The static factory method Executors.newFixedThreadPool() uses this queue.

3, SynchronousQueue

A blocking queue that does not store elements. Each insert operation must wait until another thread performs a corresponding remove operation, otherwise the insert stays blocked. Its throughput is usually higher than LinkedBlockingQueue. The static factory method Executors.newCachedThreadPool() uses this queue.

4, PriorityBlockingQueue

An unbounded blocking queue that orders elements by priority.
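An illustrative sketch of constructing each of the four queues (the capacities are arbitrary example values):

import java.util.concurrent.*;

BlockingQueue<Runnable> arrayQueue    = new ArrayBlockingQueue<>(100);   // bounded, FIFO
BlockingQueue<Runnable> linkedQueue   = new LinkedBlockingQueue<>();     // unbounded by default (Integer.MAX_VALUE capacity)
BlockingQueue<Runnable> syncQueue     = new SynchronousQueue<>();        // zero capacity, direct hand-off between threads
BlockingQueue<Runnable> priorityQueue = new PriorityBlockingQueue<>();   // unbounded, priority-ordered (elements must be comparable)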

What about unbounded queues and bounded queues?

Bounded queue

1. If poolSize < corePoolSize, a new thread is created and the runnable task is executed on it immediately.

2. When the number of submitted tasks exceeds corePoolSize, the current runnable is put into the blocking queue.

3. If the bounded queue is full and poolSize is smaller than maximumPoolSize, a new thread is created and the runnable task is executed on it immediately.

4. If the queue is full and poolSize has reached maximumPoolSize, the task is rejected.

Unbounded queue

Compared with a bounded queue, an unbounded task queue never causes task rejection unless system resources are exhausted. When a new task arrives and the number of threads is smaller than corePoolSize, a new thread is created to execute it; once corePoolSize is reached, the thread count does not grow further. If new tasks keep arriving and there is no idle thread, the tasks queue up and wait. If tasks are submitted much faster than they can be processed, the unbounded queue keeps growing rapidly until it exhausts system memory.

Common thread pool types?

  1. newSingleThreadExecutor

Create a single threaded thread pool that uses only one worker thread to execute tasks, ensuring that all tasks are executed in the specified order (FIFO, LIFO, priority).

  2. newFixedThreadPool

Create a fixed-length thread pool that controls the maximum number of concurrent threads, and the excess threads wait in the queue.

  3. newCachedThreadPool

Create a cacheable thread pool. If the length of the thread pool exceeds the processing requirement, you can recycle idle threads flexibly, or create a new thread if none is available.

  4. newScheduledThreadPool

Create a fixed-length thread pool that supports scheduled and periodic task execution.

Which of the three ways of creating a thread pool (single, fixed, or cached) have you used in your work?

Creating thread pools in the following ways is not recommended: do not create thread pools with Executors, but with ThreadPoolExecutor, which makes the pool's operating rules explicit and avoids the risk of resource exhaustion.

//ExecutorService threadPool = Executors.newFixedThreadPool(5); // A pool of 5 processing threads
//ExecutorService threadPool = Executors.newFixedThreadPool(1); // 1 thread per pool
ExecutorService threadPool = Executors.newCachedThreadPool();// A pool of N threads

The thread pool objects returned by Executors have the following drawbacks:

  1. FixedThreadPool and SingleThreadPool:

    The allowed request queue length is Integer.MAX_VALUE, which may accumulate a large number of requests and lead to OOM.

  2. CachedThreadPool and ScheduledThreadPool:

    The allowed number of created threads is Integer.MAX_VALUE, which may create a large number of threads and lead to OOM.

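A sketch of the recommended alternative: build the pool explicitly so both the queue and the maximum thread count are bounded. All sizes, the thread-name prefix, and the choice of CallerRunsPolicy are illustrative assumptions.

import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

AtomicInteger counter = new AtomicInteger();
ThreadFactory namedFactory = r -> new Thread(r, "biz-pool-" + counter.incrementAndGet());

ExecutorService threadPool = new ThreadPoolExecutor(
        5, 10,
        60L, TimeUnit.SECONDS,
        new LinkedBlockingQueue<>(1_000),           // bounded queue instead of Integer.MAX_VALUE
        namedFactory,                               // named threads make troubleshooting easier
        new ThreadPoolExecutor.CallerRunsPolicy()); // back-pressure instead of unbounded growth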

What’s the difference between execute and submit?

In the previous explanation, we used the execute method to run tasks. Besides execute, there is also a submit method for running submitted tasks.

execute(Runnable command) belongs to the Executor interface. It suits scenarios where you do not care about the return value: you simply throw the task into the pool to run. ExecutorService extends Executor.

The submit method is declared in ExecutorService (which extends Executor) and suits scenarios where you do need the return value.

submit() returns a Future; the task's exception is surfaced when the get method is called, where it can be caught and handled.

The Future.get method

The Future's get method blocks until the result is available. You can use the Future's isDone method to check whether the task has completed before deciding to call get.
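A short sketch contrasting execute and submit (the pool size and tasks are illustrative):

import java.util.concurrent.*;

public class ExecuteVsSubmit {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);

        // execute: no return value, fire-and-forget
        pool.execute(() -> System.out.println("run via execute"));

        // submit: returns a Future; the result (or the task's exception) surfaces on get()
        Future<Integer> future = pool.submit(() -> 1 + 1);
        System.out.println("done yet? " + future.isDone());
        System.out.println("result: " + future.get()); // blocks until the task completes

        pool.shutdown();
    }
}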

Closing the thread pool

The thread pool can be shut down by calling the shutdown or shutdownNow methods.

shutdownNow: issues interrupt() to all tasks that are executing so that they stop, cancels all tasks that have not yet started, and returns the list of tasks that were never started.

shutdown: after shutdown is called, the pool no longer accepts new tasks, but it does not forcibly terminate tasks that have already been submitted or are executing.
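A typical graceful-shutdown sketch, assuming pool is an ExecutorService created earlier and that java.util.List and java.util.concurrent.TimeUnit are imported; the 30-second wait is an illustrative value.

pool.shutdown();                                          // stop accepting new tasks; queued tasks still run
try {
    if (!pool.awaitTermination(30, TimeUnit.SECONDS)) {   // wait for submitted tasks to finish
        List<Runnable> neverStarted = pool.shutdownNow(); // interrupt running tasks, drain the queue
        System.out.println(neverStarted.size() + " queued tasks were never started");
    }
} catch (InterruptedException e) {
    pool.shutdownNow();
    Thread.currentThread().interrupt();
}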

How to configure thread pools properly?

There are two types of task: CPU-intensive and IO-intensive.

The size of the thread pool depends on what tasks your thread pool performs. It can be CPU intensive or IO intensive, and the size of the pool varies depending on the type of task.

(1) CPU intensive

CPU intensive means that the task requires a lot of computation without blocking and the CPU runs at full speed all the time.

CPU-intensive tasks can only be sped up (through multiple threads) on a truly multi-core CPU; on a single-core CPU, no matter how many threads you simulate, the task cannot be sped up, because the total CPU power is fixed.

CPU-intensive tasks should be configured with as few threads as possible. The general formula is a thread pool of (number of CPU cores + 1) threads.

(2) I/O intensive

IO-intensive means the task performs a lot of IO, i.e. a lot of blocking. Running IO-intensive tasks on a single thread wastes a lot of CPU power on waiting, so using multiple threads for IO-intensive tasks can speed up an application significantly, even on a single-core CPU, mainly by making use of the otherwise wasted blocking time.

Method 1: use a relatively large thread pool, usually 2 × the number of CPU cores.

IO-intensive tasks keep CPU usage low, so the CPU can process other tasks while waiting for IO, making full use of CPU time.

Method 2: the higher the proportion of time a thread spends waiting, the more threads are needed; the higher the proportion of time a thread spends on the CPU, the fewer threads are needed.

Here’s an example:

If the average CPU time per thread is 0.5 seconds, the thread wait time (non-CPU time, such as IO) is 1.5 seconds, and the number of CPU cores is 8, then the formula above gives ((0.5 + 1.5) / 0.5) × 8 = 32. The formula can be restated as:

Optimal number of threads = (ratio of thread wait time to thread CPU time + 1) × number of CPU cores.
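A small sketch of computing both sizes on the current machine; the wait and CPU times are the illustrative figures from the example above.

int cores = Runtime.getRuntime().availableProcessors();

// CPU-intensive: cores + 1
int cpuBoundThreads = cores + 1;

// IO-intensive, using the wait/compute ratio from the example (1.5s waiting, 0.5s CPU per task)
double waitTime = 1.5, cpuTime = 0.5;
int ioBoundThreads = (int) ((waitTime / cpuTime + 1) * cores); // = 4 * cores, i.e. 32 on an 8-core machine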