1. Concurrent programming is also called multithreaded programming.

The essence of concurrency is that one physical CPU (or several physical CPUs) multiplexes among several programs.

Concurrency forces multiple users (or tasks) to share limited physical resources in order to improve efficiency.

(It refers to the situation where the number of tasks exceeds the number of CPU cores: through the operating system's task-scheduling algorithms, multiple tasks appear to run "together". In fact, at any instant some tasks are not executing, but task switching is so fast that they seem to run simultaneously.)

When multiple threads are running and the system has only one CPU, it cannot truly execute more than one thread at the same time. Instead, it divides CPU time into slices and assigns a slice to each thread in turn: a thread runs its code during its slice while the other threads are suspended. This method is called concurrency.
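The time-slicing described above can be sketched in Python. The example below is a minimal illustration, not a definitive implementation: two threads share one interpreter, so (especially under CPython's global interpreter lock) their instructions are interleaved by the scheduler rather than executed simultaneously.

```python
import threading

counter = 0
lock = threading.Lock()

def worker(n):
    """Increment the shared counter n times; the lock makes each
    increment atomic while the OS interleaves the two threads."""
    global counter
    for _ in range(n):
        with lock:
            counter += 1

# The scheduler alternates between the two threads in time slices;
# while one runs, the other is suspended.
t1 = threading.Thread(target=worker, args=(100_000,))
t2 = threading.Thread(target=worker, args=(100_000,))
t1.start(); t2.start()
t1.join(); t2.join()
print(counter)  # 200000
```

Even though the two workers never run at the same instant on one CPU, both complete, which is exactly the "appears to run together" effect concurrency provides.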

2. "Parallel" means two or more events or activities occurring at the same time. In a multiprogramming environment, parallelism enables multiple programs to run simultaneously on different CPUs. (Hadoop clusters perform parallel computing.)

When the system has more than one CPU, thread execution need not be interleaved: while one CPU executes one thread, another CPU can execute a different thread. The two threads do not compete for the same CPU, so they can truly execute simultaneously. This method is called parallelism.


Concurrency and parallelism are similar but distinct concepts. Parallelism means two or more events occurring at the same instant; concurrency means two or more events occurring within the same time interval. In a multiprogram environment, concurrency means that, viewed at a macro level, multiple programs are running over a period of time; but on a single-processor system only one program can execute at any given moment, so the programs can only take turns executing. If the computer system has multiple processors, the concurrently executable programs can be distributed across them, achieving parallel execution: each processor handles one of the programs, so multiple programs genuinely execute simultaneously.

3. Serial and parallel:

Parallel and serial describe how tasks are executed. Serial means that multiple tasks execute one after another: the next task can begin only after the previous one completes. Parallel means that multiple tasks can execute at the same time; asynchrony is a prerequisite for running multiple tasks in parallel.

4. Synchronous and asynchronous: these describe whether a call can hand work off to a new thread. A synchronous call cannot start a new thread; an asynchronous one can. The two are relative terms. Synchronous means sequential execution: after one step finishes, the next must wait for it and coordinate with it. Asynchronous means the steps are independent: while waiting for some event, the program keeps doing other work instead of blocking until the event completes. Threads are one way to achieve asynchrony.

With an asynchronous call, the main thread that invokes a method does not have to wait for another thread to finish; it is free to do other things in the meantime. Asynchrony and multithreading are not equivalent: asynchrony is the goal, and multithreading is only one means of achieving it. An asynchronous call returns to the caller immediately, and the caller can do something else without waiting for the result. Asynchrony can be implemented with multithreading or by handing the work to another process.
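The thread-based form of asynchrony described above can be sketched as follows (a minimal illustration; `slow_fetch` is a hypothetical long-running call):

```python
import threading
import time

result = {}

def slow_fetch():
    """Simulated long-running operation; stores its result when done."""
    time.sleep(0.5)
    result["value"] = 42

# Asynchronous call: hand the work to another thread and return
# immediately instead of blocking until it finishes.
worker = threading.Thread(target=slow_fetch)
worker.start()

# The main thread is free to do other things in the meantime.
other_work_done = sum(range(10))

# Synchronize (wait) only at the point where the result is needed.
worker.join()
print(result["value"], other_work_done)  # 42 45
```

A synchronous version would simply call `slow_fetch()` directly, forcing the main thread to block for the full half second before doing anything else.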