Before we get into multithreading, let’s look at a few confusing concepts.

Concepts

Threads and processes

The relationship between threads and processes can be compared to a company: processes are departments, and threads are the employees in each department. That is, a process contains one or more threads.

Concurrency and parallelism

Concurrency refers to multiple tasks taking turns using a CPU. Parallelism refers to multiple CPUs executing multiple tasks at the same time. Take buying train tickets as an example: concurrency is many people queuing at one ticket window, while parallelism is many people queuing at multiple windows.

Synchronous and asynchronous

Synchronous means that when a function is called, the next statement cannot execute until that function completes. Asynchronous means that execution continues to the next statement without waiting for the function to complete.

GCD

In Swift 3, the GCD API was significantly redesigned, moving from the original C-style function calls to an object-oriented wrapper that is more comfortable and flexible to use.

Synchronous

let queue = DispatchQueue(label: "com.ffib.blog")

queue.sync {
    for i in 0..<5 {
        print(i)
    }
}

for i in 10..<15 {
    print(i)
}

output: 
0
1
2
3
4
10
11
12
13
14

As the result shows, while the queue task is executing synchronously, the main thread's work does not run. Synchronous dispatch blocks the calling thread, which can only continue once the queue task has completed.
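A related pitfall worth noting (a sketch, not from the original article): calling sync on the serial queue you are already running on deadlocks, because the outer block waits for the inner one, which can never start on the same serial queue. Dispatching the nested work asynchronously avoids this:

```swift
import Dispatch

let queue = DispatchQueue(label: "com.ffib.blog.deadlock")
var log: [String] = []

queue.sync {
    // We are now running on `queue`. A nested `queue.sync { ... }`
    // here would deadlock: this block would wait on a task that
    // cannot start until this block finishes.
    log.append("outer")
    queue.async { log.append("inner") }  // safe: runs after this block
}

queue.sync { }   // drain the queue so the async block has run
print(log)       // ["outer", "inner"]
```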

Asynchronous

let queue = DispatchQueue(label: "com.ffib.blog")

queue.async {
    for i in 0..<5 {
        print(i)
    }
}

for i in 10..<15 {
    print(i)
}

output:
10
0
11
1
12
2
13
3
14
4

As the result shows, with asynchronous dispatch the main thread does not wait for the queue task to finish, so the two interleave. In contrast to synchronous execution, async does not block the calling thread, and the main thread can do other work while the queue task runs.

QoS priority

In actual development we need to classify tasks. UI rendering and interaction are high priority, while non-urgent work, such as caching or collecting usage statistics, is low priority. In GCD we use queues and priorities to divide up tasks and achieve a better user experience; choosing the right priority lets the CPU allocate its resources sensibly. GCD models priority with the DispatchQoS structure. If no QoS is specified, default is used. The following levels go from high to low.

public struct DispatchQoS : Equatable {

    public static let userInteractive: DispatchQoS  // User-interactive: must finish almost immediately, e.g. UI rendering

    public static let userInitiated: DispatchQoS    // User-initiated: should finish very quickly, e.g. user taps and gestures

    public static let `default`: DispatchQoS        // System default priority

    public static let utility: DispatchQoS          // Utility: work that does not need to finish quickly

    public static let background: DispatchQoS       // Background: time-consuming work the user is unaware of

    public static let unspecified: DispatchQoS
}


Here are two examples to look at priority usage in detail.

Same priority

let queue1 = DispatchQueue(label: "com.ffib.blog.queue1", qos: .utility)
let queue2 = DispatchQueue(label: "com.ffib.blog.queue2", qos: .utility)

queue1.async {
    for i in 5..<10 {
        print(i)
    }
}

queue2.async {
    for i in 0..<5 {
        print(i)
    }
}
output:
0
5
1
6
2
7
3
8
4
9

As you can see from the results, two queues with the same priority execute alternately.

Different priorities

let queue1 = DispatchQueue(label: "com.ffib.blog.queue1", qos: .default)
let queue2 = DispatchQueue(label: "com.ffib.blog.queue2", qos: .utility)

queue1.async {
    for i in 0..<5 {
        print(i)
    }
}

queue2.async {
    for i in 5..<10 {
        print(i)
    }
}

output:
0
5
1
2
3
4
6
7
8
9

As the result shows, although the output alternates, the CPU allocates more resources to the higher-priority queue, and the lower-priority queue mostly gets CPU time when the higher-priority work leaves it idle.

By default, the main queue has the highest priority, userInteractive, so use that priority with caution, otherwise the user experience may suffer. For operations the user does not need to notice, such as caching, utility is a good fit.
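In practice you often don't create a queue yourself: the system's global concurrent queues are also selected by QoS. A minimal sketch (the workload is illustrative), where heavy, non-urgent work goes to a utility global queue; in an app you would then hop back to DispatchQueue.main for any UI update:

```swift
import Dispatch

let done = DispatchSemaphore(value: 0)
var sum = 0

// Heavy, non-urgent work belongs on a low-QoS global queue.
DispatchQueue.global(qos: .utility).async {
    sum = (1...1_000).reduce(0, +)  // stand-in for cache or cleanup work
    done.signal()
}

done.wait()  // keep a command-line process alive until the work finishes
print("sum =", sum)  // sum = 500500
```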

Serial queues

When creating a queue, if no queue type is specified, a serial queue is created by default.

let queue = DispatchQueue(label: "com.ffib.blog.initiallyInactive.queue", qos: .utility)

queue.async {
    for i in 0..<5 {
        print(i)
    }
}

queue.async {
    for i in 5..<10 {
        print(i)
    }
}

queue.async {
    for i in 10..<15 {
        print(i)
    }
}
output: 
0
1
2
3
4
5
6
7
8
9
10
11
12
13
14

As the results show, tasks on a serial queue execute in the order in which they were added.

Concurrent queues

let queue = DispatchQueue(label: "com.ffib.blog.concurrent.queue", qos: .utility, attributes: .concurrent)

queue.async {
    for i in 0..<5 {
        print(i)
    }
}

queue.async {
    for i in 5..<10 {
        print(i)
    }
}

queue.async {
    for i in 10..<15 {
        print(i)
    }
}
output:
5
0
10
1
2
3
11
4
6
12
7
13
8
14
9


As you can see from the results, the tasks execute concurrently. When setting the attributes parameter there is one other option, initiallyInactive, which means tasks are not executed automatically; the programmer must trigger execution manually. If this attribute is not set, tasks execute as soon as they are added.


let queue = DispatchQueue(label: "com.ffib.blog.concurrent.queue", qos: .utility,
                          attributes: .initiallyInactive)
queue.async {
    for i in 0..<5 {
        print(i)
    }
}
queue.async {
    for i in 5..<10 {
        print(i)
    }
}
queue.async {
    for i in 10..<15 {
        print(i)
    }
}

// activate() must be called to start the queue.
queue.activate()

output:
0
1
2
3
4
5
6
7
8
9
10
11
12
13
14

As you can see from the results, switching from automatic to manual activation leaves the execution result unchanged, and this attribute gives you more flexibility: the freedom to decide when execution starts. Now let's see how a concurrent queue sets this value.

let queue = DispatchQueue(label: "com.ffib.blog.concurrent.queue", qos: .utility,
                          attributes: [.concurrent, .initiallyInactive])
queue.async {
    for i in 0..<5 {
        print(i)
    }
}
queue.async {
    for i in 5..<10 {
        print(i)
    }
}
queue.async {
    for i in 10..<15 {
        print(i)
    }
}
queue.activate()

output:
10
0
5
11
1
6
12
2
7
13
3
8
14
4
9

Delayed execution

GCD provides asyncAfter for delayed task execution on an already-created queue. Delays are expressed with DispatchTimeInterval; GCD's time-related parameters all use this enum.

public enum DispatchTimeInterval : Equatable {

    case seconds(Int)       // seconds
    case milliseconds(Int)  // milliseconds
    case microseconds(Int)  // microseconds
    case nanoseconds(Int)   // nanoseconds
    case never
}

asyncAfter comes in two almost identical forms that differ only in the parameter name. See Stack Overflow for the explanation.

The difference between wallDeadline and deadline is what happens while the system sleeps: the wall clock keeps advancing, whereas deadline is suspended. For example, with the delay set to 60 minutes, if the system sleeps for 50 minutes, the wallDeadline task fires 10 minutes after the system wakes, while the deadline task fires 60 minutes after it wakes.

let queue = DispatchQueue(label: "com.ffib.blog.after.queue")

let time = DispatchTimeInterval.seconds(5)

queue.asyncAfter(wallDeadline: .now() + time) {
    print("wall dead line done")
}

queue.asyncAfter(deadline: .now() + time) {
    print("dead line done")
}

DispatchGroup

You can use DispatchGroup if you want to wait for all queue tasks to complete.

let group = DispatchGroup()
let queue1 = DispatchQueue(label: "com.ffib.blog.queue1", qos: .utility)
let queue2 = DispatchQueue(label: "com.ffib.blog.queue2", qos: .utility)
queue1.async(group: group) {
    for i in 0..<5 {
        print(i)
    }
}
queue2.async(group: group) {
    for i in 5..<10 {
        print(i)
    }
}
group.notify(queue: DispatchQueue.main) {
    print("done")
}

output: 
5
0
6
1
7
2
8
3
9
4
done

If you want one queue to finish before another starts executing, use wait().

let group = DispatchGroup()
let queue1 = DispatchQueue(label: "com.ffib.blog.queue1", qos: .utility)
let queue2 = DispatchQueue(label: "com.ffib.blog.queue2", qos: .utility)
queue1.async(group: group) {
    for i in 0..<5 {
        print(i)
    }
}
group.wait()
queue2.async(group: group) {
    for i in 5..<10 {
        print(i)
    }
}
group.notify(queue: DispatchQueue.main) {
    print("done")
}
output:
0
1
2
3
4
5
6
7
8
9
done

You can set a timeout to keep a blocked queue from locking the waiting thread forever.

group.wait(timeout: <#T##DispatchTime#>)
group.wait(wallTimeout: <#T##DispatchWallTime#>)
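Both overloads return a DispatchTimeoutResult, so the caller can react to a timeout instead of blocking forever. A small sketch (the 0.2-second workload and 2-second timeout are made up for illustration):

```swift
import Foundation

let group = DispatchGroup()
let queue = DispatchQueue(label: "com.ffib.blog.timeout")

queue.async(group: group) {
    Thread.sleep(forTimeInterval: 0.2)  // simulated work
}

// Wait at most 2 seconds; .success means the group finished in time.
let result = group.wait(timeout: .now() + .seconds(2))
switch result {
case .success:  print("all tasks finished")
case .timedOut: print("gave up waiting")
}
```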

DispatchWorkItem

DispatchWorkItem is a new API in Swift 3 that encapsulates a task to be executed on a queue. Let's start with a simple application: initialize a DispatchWorkItem with a closure.

let workItem = DispatchWorkItem {
    for i in 0..<10 {
        print(i)
    }
}

The first way is to execute the closure by calling perform() on the work item.

DispatchQueue.global().async {
    workItem.perform()
}

The second is to pass it as an argument to the async method.

 DispatchQueue.global().async(execute: workItem)

Let’s take a look at what methods and properties are inside DispatchWorkItem.

init(qos: DispatchQoS = default, flags: DispatchWorkItemFlags = default,
    block: @escaping () -> Void)

From the initializer we can see that a DispatchWorkItem can also carry its own priority, plus a DispatchWorkItemFlags parameter. Let's look at the internal composition of DispatchWorkItemFlags.

public struct DispatchWorkItemFlags : OptionSet, RawRepresentable {

    public static let barrier: DispatchWorkItemFlags 

    public static let detached: DispatchWorkItemFlags

    public static let assignCurrentContext: DispatchWorkItemFlags

    public static let noQoS: DispatchWorkItemFlags

    public static let inheritQoS: DispatchWorkItemFlags

    public static let enforceQoS: DispatchWorkItemFlags
}

DispatchWorkItemFlags falls into two groups:

  • QoS handling
    • noQoS — no QoS is assigned
    • inheritQoS — inherits the queue's QoS
    • enforceQoS — overrides the queue's QoS
  • Execution behavior
    • barrier
    • detached
    • assignCurrentContext
The execution-behavior flags are described later; we leave that hole for now. First, let's look at how setting the priority affects task execution.

let queue1 = DispatchQueue(label: "com.ffib.blog.workItem1", qos: .utility)
let queue2 = DispatchQueue(label: "com.ffib.blog.workItem2", qos: .userInitiated)
let workItem1 = DispatchWorkItem(qos: .userInitiated) {
    for i in 0..<5 {
        print(i)
    }
}
let workItem2 = DispatchWorkItem(qos: .utility) {
    for i in 5..<10 {
        print(i)
    }
}
queue1.async(execute: workItem1)
queue2.async(execute: workItem2)

output:
5
0
6
7
8
9
1
2
3
4

As the result shows, setting only the priority on a DispatchWorkItem has no effect on execution order. Next, let's try setting DispatchWorkItemFlags.

let queue1 = DispatchQueue(label: "com.ffib.blog.workItem1", qos: .utility)
let queue2 = DispatchQueue(label: "com.ffib.blog.workItem2", qos: .userInitiated)

let workItem1 = DispatchWorkItem(qos: .userInitiated, flags: .enforceQoS) {
    for i in 0..<5 {
        print(i)
    }
}

let workItem2 = DispatchWorkItem {
    for i in 5..<10 {
        print(i)
    }
}

queue1.async(execute: workItem1)
queue2.async(execute: workItem2)
output:
5
0
6
1
7
2
8
3
9
4

With enforceQoS set, the work item's priority forcibly overrides the queue's, so the two queues end up at the same effective priority and execute alternately.

DispatchWorkItem also has wait and notify methods, which behave the same as DispatchGroup's.
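For example (a sketch; the semaphore is added only to keep a command-line process alive): notify chains a follow-up closure onto the work item, and wait blocks the current thread until the item has executed.

```swift
import Dispatch

let queue = DispatchQueue(label: "com.ffib.blog.workItem")
let finished = DispatchSemaphore(value: 0)
var events: [String] = []

let workItem = DispatchWorkItem {
    events.append("work")
}

// Runs on the given queue once the work item completes.
workItem.notify(queue: DispatchQueue.global()) {
    events.append("done")
    finished.signal()
}

queue.async(execute: workItem)
workItem.wait()   // blocks until workItem itself has executed
finished.wait()   // also wait for the notify closure
print(events)     // ["work", "done"]
```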

DispatchSemaphore

If you want asynchronous queue tasks to execute one at a time, you can use a semaphore. wait() decrements the semaphore by one; if the resulting value is negative, the caller blocks until another thread calls signal() or the timeout expires, returning .success or .timedOut accordingly.

func wait(wallTimeout: DispatchWallTime) -> DispatchTimeoutResult
func wait(timeout: DispatchTime) -> DispatchTimeoutResult

signal() increments the semaphore, returning non-zero if a waiting thread was woken.

func signal() -> Int

Here's an example of how it works. First, let's see what happens when a file is written asynchronously without a semaphore.

let queue = DispatchQueue(label: "com.ffib.blog.queue", qos: .utility, attributes: .concurrent)
let fileManager = FileManager.default
let path = NSHomeDirectory() + "/test.txt"
print(path)
fileManager.createFile(atPath: path, contents: nil, attributes: nil)
// Write in a loop; the expected final content is test4
for i in 0..<5 {
    queue.async {
        do {
            try "test\(i)".write(toFile: path, atomically: true, encoding: String.Encoding.utf8)
        } catch {
            print(error)
        }
    }
}

Because the writes run concurrently, the final content of the file is not guaranteed to be test4. Adding a semaphore serializes the writes:

// Initialize the semaphore with a value of 1
let semaphore = DispatchSemaphore(value: 1)
let queue = DispatchQueue(label: "com.ffib.blog.queue", qos: .utility, attributes: .concurrent)
let fileManager = FileManager.default
let path = NSHomeDirectory() + "/test.txt"
print(path)
fileManager.createFile(atPath: path, contents: nil, attributes: nil)
for i in 0..<5 {
    // .distantFuture means wait forever
    if semaphore.wait(wallTimeout: .distantFuture) == .success {
        queue.async {
            do {
                print(i)
                try "test\(i)".write(toFile: path, atomically: true, encoding: String.Encoding.utf8)
            } catch {
                print(error)
            }
            semaphore.signal()
        }
    }
}





Another use of DispatchSemaphore is to limit the maximum number of concurrent tasks on a queue, with wait() taking a slot and signal() releasing one, as in the example above. If you have used NSOperationQueue, this is similar in effect to maxConcurrentOperationCount.
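A sketch of that pattern (the limit of 2 and the task count are illustrative): initialize the semaphore with the maximum concurrency you want, wait() before submitting each task, and signal() when a task ends.

```swift
import Dispatch

let maxConcurrent = DispatchSemaphore(value: 2)   // at most 2 tasks in flight
let queue = DispatchQueue(label: "com.ffib.blog.limit", attributes: .concurrent)
let group = DispatchGroup()
let lock = DispatchQueue(label: "com.ffib.blog.limit.lock")  // protects `completed`
var completed = 0

for i in 0..<6 {
    maxConcurrent.wait()            // blocks while 2 tasks are already running
    queue.async(group: group) {
        print("task \(i) running")
        lock.sync { completed += 1 }
        maxConcurrent.signal()      // frees a slot for the next task
    }
}

group.wait()  // let all tasks finish before the process exits
print("completed:", completed)  // completed: 6
```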

DispatchWorkItemFlags

Now let's fill in the hole we left earlier around the execution-related DispatchWorkItemFlags.

barrier

barrier can be understood as isolation. Take file reads and writes as an example: reads can happen concurrently, but when an asynchronous write arrives we want the reads to pause until the write completes, so that every read sees the latest file content. Using the test.txt file from above, the expected behavior is: reads before the write return test4; reads after the write return done (that is, what was written). First, let's see what happens without a barrier.

let queue = DispatchQueue(label: "com.ffib.blog.queue", qos: .utility, attributes: .concurrent)

let path = NSHomeDirectory() + "/test.txt"
print(path)

let readWorkItem = DispatchWorkItem {
    do {
        let str = try String(contentsOfFile: path, encoding: .utf8)
        print(str)
    }catch {
        print(error)
    }
    sleep(1)
}

let writeWorkItem = DispatchWorkItem(flags: []) {
    do {
        try "done".write(toFile: path, atomically: true, encoding: String.Encoding.utf8)
        print("write")
    }catch {
        print(error)
    }
    sleep(1)
}
for _ in 0..<3 {
    queue.async(execute: readWorkItem)
}
queue.async(execute: writeWorkItem)
for _ in 0..<3 {
    queue.async(execute: readWorkItem)
}

output:
test4
test4
test4
test4
test4
test4
write

The result is not what we want. Let’s see what happens when you add a barrier.

let queue = DispatchQueue(label: "com.ffib.blog.queue", qos: .utility, attributes: .concurrent)

let path = NSHomeDirectory() + "/test.txt"
print(path)

let readWorkItem = DispatchWorkItem {
    do {
        let str = try String(contentsOfFile: path, encoding: .utf8)
        print(str)
    }catch {
        print(error)
    }
}

let writeWorkItem = DispatchWorkItem(flags: .barrier) {
    do {
        try "done".write(toFile: path, atomically: true, encoding: String.Encoding.utf8)
        print("write")
    }catch {
        print(error)
    }
}

for _ in 0..<3 {
    queue.async(execute: readWorkItem)
}
queue.async(execute: writeWorkItem)
for _ in 0..<3 {
    queue.async(execute: readWorkItem)
}

output:
test4
test4
test4
write
done
done
done

As expected, the barrier isolates reads from writes: reads queued after the write do not run until the write has completed.
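Besides DispatchWorkItem, the barrier flag can also be passed directly to async. A sketch of the classic thread-safe wrapper built that way (the Cache type is illustrative, not from the original): concurrent synchronous reads, exclusive barrier writes.

```swift
import Dispatch

// Minimal thread-safe store: concurrent reads, exclusive writes.
final class Cache {
    private var storage: [String: Int] = [:]
    private let queue = DispatchQueue(label: "com.ffib.blog.cache",
                                      attributes: .concurrent)

    func value(for key: String) -> Int? {
        queue.sync { storage[key] }    // reads may run concurrently
    }

    func set(_ value: Int, for key: String) {
        // The barrier waits for in-flight reads, runs alone,
        // then lets later reads proceed.
        queue.async(flags: .barrier) {
            self.storage[key] = value
        }
    }
}

let cache = Cache()
cache.set(1, for: "a")
print(cache.value(for: "a") ?? -1)  // 1
```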