1. Overview of multithreading

1.1. Concepts of program, process and thread

1) program

A set of instructions written in a language to accomplish a specific task, i.e. a static piece of code, a static object.

2) process

A process is a program in execution and the basic unit the operating system uses to run programs. Each process has its own independent memory space, and an application can run as multiple processes at the same time. Running a program is the life cycle of a process, from creation, through execution, to termination.

3) thread

A separate unit of execution within a process. A process can run multiple threads concurrently; you can think of a process as a single-CPU "operating system", and its threads as the multiple tasks running inside that system.
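As a minimal illustration of these concepts, the sketch below (class and thread names are illustrative) runs several threads inside one process, each representing its own execution path:

```java
// A minimal sketch: one process (the JVM) running multiple threads concurrently.
public class ProcessAndThreads {
    public static void main(String[] args) {
        Runnable task = () -> {
            for (int i = 0; i < 3; i++) {
                System.out.println(Thread.currentThread().getName() + " is running, step " + i);
            }
        };
        new Thread(task, "worker-1").start(); // first execution path
        new Thread(task, "worker-2").start(); // second execution path
        // The main thread is a third execution path within the same process.
        System.out.println(Thread.currentThread().getName() + " is running too");
    }
}
```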

1.2. Parallelism and concurrency

1) parallel

Multiple CPUs execute multiple tasks at the same moment; two or more events occur at exactly the same time.

For example: multiple people doing different things at the same time

2) concurrent

A single CPU (using time slices) handles multiple tasks in turn; two or more events occur within the same period of time.

For example, a flash-sale ("seckill") platform, where many people do the same thing at the same time

When multiple applications are installed and run in an operating system, concurrency means that several programs are running within the same period of time. Macroscopically they appear to run simultaneously; microscopically, on a single-CPU system only one program can execute at any instant, and the programs actually run alternately in time slices. They merely give the impression of running at the same time because each time slice is very short.

In a multi-CPU system, concurrently executable programs can be distributed across multiple processors (CPUs), each processor handling one of them, so that multiple programs truly execute in parallel.

The multi-core CPUs on the market today are multi-core processors: the more cores, the more programs can be processed in parallel, which greatly improves the efficiency of the computer.

Note: a computer with a single single-core processor cannot process multiple tasks in parallel; it can only alternate tasks on that one CPU.

1.3. The role of multithreading

1) To make better use of CPU resources. With only one thread, the second task cannot start until the first one finishes; with multiple threads, other tasks can execute at the same time as the task in the main thread, without waiting;

2) Processes cannot share data directly, but threads in the same process can;

3) Creating a process requires the system to allocate new resources for it, while the cost of creating a thread is much smaller;

4) The Java language has built-in multithreading support, which simplifies multithreaded programming in Java.

1.4. The difference between processes and threads

Process: has its own independent memory space; its data storage space (heap space and stack space) is independent of other processes; it contains at least one thread.

Thread: shares the heap space of its process but has its own stack space; a thread consumes far fewer resources than a process.

Note:

1) Although multiple threads in a process run concurrently, from a microscopic point of view they execute in sequence. Which thread runs at any moment depends entirely on CPU scheduling, and the programmer cannot control it; this is the source of the randomness of multithreading.

2) A Java program's process contains at least two threads: the main thread (which runs main()) and the garbage collector thread (see the sketch after these notes). Each time a class is executed with the java command, a JVM is started, and each JVM corresponds to a process in the operating system. Since Java has a built-in garbage collection mechanism, at least two threads are started whenever a Java program runs.

3) Because creating a thread costs far less than creating a process, we usually prefer multiple threads over multiple processes when developing multitasking applications.
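To make note 2) concrete, here is a small sketch (class name illustrative) that prints the name of the thread executing main():

```java
public class MainThreadDemo {
    public static void main(String[] args) {
        // The thread that executes main() is the "main" thread created by the JVM.
        Thread current = Thread.currentThread();
        System.out.println("Current thread: " + current.getName()); // prints "main"
        // The garbage collector runs in separate daemon threads managed by the JVM.
    }
}
```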

2. The life cycle of threads

2.1. Thread life cycle diagrams

1) Thread state graph

2) Thread state transition diagram

3) Specific operation transformation diagram

2.2. Detailed explanation of the thread life cycle

2.2.1. New state (New)

When a Thread object is created with the new keyword from the Thread class or one of its subclasses, the thread is in the new (newborn) state. A thread in this state already has its own memory space.

**Note:** do not call the start() method again on a thread that has already been started; otherwise a java.lang.IllegalThreadStateException will be thrown.
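A small sketch of the note above (names illustrative): calling start() a second time on the same Thread object throws IllegalThreadStateException because the thread is no longer in the new state.

```java
public class DoubleStartDemo {
    public static void main(String[] args) {
        Thread t = new Thread(() -> System.out.println("run once"));
        t.start();      // legal: moves the thread from new to ready
        try {
            t.start();  // illegal: the thread has already left the new state
        } catch (IllegalThreadStateException e) {
            System.out.println("Cannot start the same thread twice: " + e);
        }
    }
}
```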

2.2.2. Ready state (Runnable)

After a thread object is created, calling its start() method moves it into the ready state. A thread in this state is placed in the pool of runnable threads and waits to be granted use of the CPU.

Explanation:

A thread in the ready state is ready to run but has not yet been assigned a CPU. It sits in the ready queue (although it behaves like a queue, it is usually called a runnable pool rather than a runnable queue, because CPU scheduling does not necessarily follow first-in, first-out order), waiting for the system to allocate a CPU to it. The ready state is a waiting state, not an executing state. When the system selects a ready Thread object, the thread changes from the ready state to the running state; this selection is called CPU scheduling. Once the thread obtains the CPU, it enters the running state and its run() method is invoked automatically.

**Note:** if you want the child thread to run immediately after start() is called, you can call Thread.sleep() in the main thread so that the main thread sleeps and yields execution to the child thread.
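A minimal sketch of this note (timing is not guaranteed, since scheduling is up to the CPU): putting the main thread to sleep gives the child thread a chance to run first.

```java
public class SleepMainDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread child = new Thread(() -> System.out.println("child thread running"));
        child.start();          // child is now ready, waiting for the CPU
        Thread.sleep(100);      // main sleeps, so the scheduler is likely to pick the child
        System.out.println("main thread resumes");
    }
}
```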

2.2.3. Running state (Running)

The thread in the ready state grabs the CPU and executes the program code.

Explanation:

The running state is the most complex: from it, a thread can become blocked, ready, or dead.

A thread in the ready state, once scheduled by the CPU, changes from ready to running and performs the work in its run() method. If the thread loses its CPU resources, it returns from the running state to the ready state and waits for the system to allocate resources again. Calling yield() on a running thread also gives up the CPU and puts the thread back into the ready state.
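A small sketch of yield() (the effect is only a hint to the scheduler, not a guarantee):

```java
public class YieldDemo {
    public static void main(String[] args) {
        Runnable task = () -> {
            for (int i = 0; i < 5; i++) {
                System.out.println(Thread.currentThread().getName() + ": " + i);
                Thread.yield(); // give up the CPU and return to the ready state
            }
        };
        new Thread(task, "yield-1").start();
        new Thread(task, "yield-2").start();
    }
}
```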

2.2.4. Blocked state (Blocked)

A thread is blocked when it gives up the CPU for some reason and temporarily stops running. Until it re-enters the ready state, it has no chance of moving to the running state.

Explanation:

In some cases, a running thread that executes a sleep method, or waits for a resource such as an I/O device, gives up the CPU and temporarily stops running itself, entering a blocked state.

A thread in the blocked state cannot enter the ready queue. Only when the cause of the blocking is removed, for example when the sleep time has expired or the awaited I/O device becomes available, does the thread move back to the ready state, queue up again in the ready queue, and resume from where it stopped once the system selects it.

Classification of blocking conditions:

1) Wait blocking: a running thread executes wait(), and the JVM puts the thread into the wait pool.

2) Synchronization blocking: a running thread tries to acquire a synchronized lock on an object while the lock is held by another thread, so the JVM puts the thread into the lock pool.

3) Other blocking: a running thread executes sleep() or join(), or makes an I/O request, and the JVM puts the thread into the blocked state. When sleep() times out, the thread that join() is waiting for terminates (or the join() timeout expires), or the I/O operation completes, the thread returns to the ready state. (Note that sleep() does not release any locks it holds.) A sketch of cases 2) and 3) follows.
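The sketch below (names illustrative) shows two of these cases: one thread holds a synchronized lock and sleeps, so the second thread is blocked on the lock, and sleep() does not release it.

```java
public class BlockingDemo {
    private static final Object LOCK = new Object();

    public static void main(String[] args) {
        Runnable task = () -> {
            synchronized (LOCK) {            // synchronization blocking: only one thread holds LOCK
                System.out.println(Thread.currentThread().getName() + " acquired the lock");
                try {
                    Thread.sleep(500);       // other blocking: sleep() keeps holding the lock
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        };
        new Thread(task, "blocker-1").start();
        new Thread(task, "blocker-2").start(); // blocked until blocker-1 releases LOCK
    }
}
```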

2.2.5. Dead state (Dead)

A thread terminates its life cycle when it finishes executing or exits the run() method because of an exception.

A thread changes from the running state to the dead state when its run() method finishes executing, or when it is forcibly terminated, for example by an uncaught exception or by calling a method such as stop() or destroy().

Explanation:

A thread is considered dead when its run() method finishes executing or when it is forcibly terminated. The Thread object may still exist, but it no longer represents an executing thread. Once a thread dies, it cannot be restarted: calling start() on a dead thread throws a java.lang.IllegalThreadStateException.
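A short sketch of the dead state: after run() finishes, isAlive() returns false and the thread cannot be restarted.

```java
public class DeadStateDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread t = new Thread(() -> System.out.println("doing work"));
        t.start();
        t.join();                                    // wait until run() has finished
        System.out.println("alive? " + t.isAlive()); // false: the thread is dead
        // t.start(); // would throw java.lang.IllegalThreadStateException
    }
}
```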

3. Threads in the JVM memory structure

1) Java VM memory structure diagram

2) Explanation

Program (a static piece of code) --loaded into memory--> process (the code loaded into memory, now a dynamic, running program). A process can be subdivided into multiple threads, and a thread represents one execution path within the program. Each thread has its own program counter (PC, which points to the next instruction to execute) and its own runtime stack (local variables, method call frames, etc.).

3) Classification of threads

Threads in Java fall into two categories: 1. Daemon threads (garbage collection threads, exception handling threads) and 2. User threads (such as the main thread)

If only daemon threads remain in the JVM, the JVM exits.
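A brief sketch of the two categories (names illustrative): a thread is marked as a daemon with setDaemon(true) before start(); once the main user thread ends and only the daemon remains, the JVM exits.

```java
public class DaemonDemo {
    public static void main(String[] args) {
        Thread daemon = new Thread(() -> {
            while (true) {
                System.out.println("daemon working in the background");
                try {
                    Thread.sleep(200);
                } catch (InterruptedException e) {
                    return;
                }
            }
        });
        daemon.setDaemon(true); // must be set before start()
        daemon.start();
        // main (a user thread) ends here; with only the daemon left, the JVM exits.
        System.out.println("main thread finished");
    }
}
```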

4. Thread overhead

Multithreading has two unavoidable kinds of overhead: thread creation and context switching.

4.1. Thread creation overhead

Creating a thread means requesting resources directly from the operating system, and for the OS this is expensive: it has to allocate memory for the thread and add it to the scheduler. In addition, when a newly scheduled thread runs, it may cause page faults and invalidate the CPU cache, so data has to be re-read from main memory after the switch, which destroys data locality.

By default, the stack size of a thread is 1 MB (it can be adjusted with the -Xss JVM option, but beware of stack overflow). If a new thread is created for every user request, 1024 concurrent users already consume 1 GB of memory just for thread stacks; under heavy load the system quickly runs out of resources and the program crashes.

The same applies to Java programs: do not create new threads arbitrarily, especially for high-frequency services. Use a thread pool instead; otherwise it is easy to run out of memory and crash the program.
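A minimal sketch of that advice, using a fixed-size thread pool from java.util.concurrent instead of creating one thread per request (pool size and task are illustrative):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ThreadPoolDemo {
    public static void main(String[] args) {
        // Reuse a small, fixed number of threads instead of one new thread per request.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 100; i++) {
            final int requestId = i;
            pool.submit(() ->
                System.out.println(Thread.currentThread().getName() + " handles request " + requestId));
        }
        pool.shutdown(); // stop accepting new tasks and let queued ones finish
    }
}
```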

4.2. Context switching overhead

1. The concept

After the current task runs for one time slice, the CPU switches to the next task. Before switching, the state of the current task is saved so that it can be restored the next time the CPU switches back to it. The process from saving a task's state to reloading it is a context switch.

2. Explanation

1) A time slice is the amount of CPU time allocated to each thread, usually a few tens of milliseconds.

2) The CPU implements multi-threading by allocating CPU time slices to each thread and constantly switching threads. Because the time slice is so short, it feels like multiple threads are executing simultaneously.

3. Ways to reduce context switching

1) Lock-free concurrent programming

Context switches occur when multiple threads compete for a lock, so when using multiple threads to process data, prefer strategies that avoid locks.

A common strategy: partition the data by the hash of its ID so that different threads process different segments of the data, as sketched below.
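A rough sketch of that strategy (the partitioning scheme and task are illustrative): each thread owns one segment chosen by the hash of the ID, so no thread touches another thread's data and no lock is needed.

```java
import java.util.ArrayList;
import java.util.List;

public class HashShardingDemo {
    public static void main(String[] args) throws InterruptedException {
        int shards = 4;
        List<Thread> workers = new ArrayList<>();
        for (int shard = 0; shard < shards; shard++) {
            final int myShard = shard;
            Thread worker = new Thread(() -> {
                // Each thread processes only the IDs whose hash maps to its own shard.
                for (int id = 0; id < 100; id++) {
                    if (Math.abs(Integer.hashCode(id)) % shards == myShard) {
                        System.out.println("shard " + myShard + " processes id " + id);
                    }
                }
            });
            workers.add(worker);
            worker.start();
        }
        for (Thread w : workers) {
            w.join();
        }
    }
}
```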

2) Lock separation technology

Example: ConcurrentHashMap

3) CAS algorithm

The classes in Java's java.util.concurrent.atomic package use the CAS algorithm to update data without locking.
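A small sketch with AtomicInteger, which relies on CAS rather than a lock to update a shared counter:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class CasCounterDemo {
    public static void main(String[] args) throws InterruptedException {
        AtomicInteger counter = new AtomicInteger(0);
        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) {
                counter.incrementAndGet(); // CAS loop under the hood, no synchronized block
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println("final count: " + counter.get()); // always 20000
    }
}
```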

4) Use as few threads as necessary

Avoid creating unnecessary threads; for example, if there are only a few tasks but many threads are created to handle them, a large number of threads will sit idle in the waiting state.

For example:

Reduce the number of context switches by reducing the number of threads in the WAITING state.

Dump the stack information:

jstack PID > dumpfile

Count the states of all threads:

grep java.lang.Thread.State dumpfile | awk '{print $2" "$3" "$4" "$5}' | sort | uniq -c

If a large number of waiting threads exist, check dumpfile for analysis:

1) If a large number of worker threads on the server are in the WAITING state, modify the thread pool settings in the server configuration file (for example, reduce the maximum number of threads), restart the server, and check the effect.