This article does not cover the concepts and theory behind GCD; it only documents how GCD is applied in some common development scenarios. Comments and corrections are welcome.

Time-consuming operations

This is the most widely used scenario: to avoid blocking the main thread, time-consuming work is moved to a background thread and the result is then consumed on the main thread, for example reading some data from the sandbox and displaying it in the UI. There are several subdivisions of this scenario:

Call back to the main thread after performing a time-consuming operation

/// The main thread needs the result of the background work
func handle<T>(somethingLong: @escaping () -> T, finshed: @escaping (T) -> ()) {
    globalQueue.async {
        let data = somethingLong()
        self.mainQueue.async {
            finshed(data)
        }
    }
}

/// The main thread does not need the result of the background work
func handle(somethingLong: @escaping () -> (), finshed: @escaping () -> ()) {
    let workItem = DispatchWorkItem {
        somethingLong()
    }
    globalQueue.async(execute: workItem)
    // wait() blocks the calling thread until the work item completes; call this variant
    // off the main thread, or use workItem.notify(queue:) if blocking is not acceptable
    workItem.wait()
    finshed()
}

/////////////////////////////////////////////////////////////////////////////

GCDKit().handle(somethingLong: { [weak self] in
    self?.color = UIColor.red
    sleep(2)
}) { [weak self] in
    self?.view.backgroundColor = self?.color
}

GCDKit().handle(somethingLong: {
    let p = Person()
    p.age = 40
    print(Date(), p.age)
    sleep(2)
    return p
}) { (p: Person) in
    print(Date(), p.age)
}
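
For reference, the same pattern written directly against GCD, without the wrapper, looks like the sketch below (loadDataFromSandbox and imageView are hypothetical names used only for illustration):

// Plain GCD: do the slow work off the main thread, then hop back to the main queue for UI work
DispatchQueue.global().async { [weak self] in
    let data = loadDataFromSandbox()                    // hypothetical time-consuming read
    DispatchQueue.main.async {
        self?.imageView.image = UIImage(data: data)     // hypothetical UI update
    }
}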

Serial time-consuming operation

Each subtask depends on the completion of the previous task. When all subtasks are completed, the main thread is called back:

/// Store a task; the stored tasks are later executed one after another, in the order added, on the global queue
func wait(code: @escaping GCDKitHandleBlock) -> GCDKit {
    handleBlockArr.append(code)
    return self
}

/// Run the stored tasks in order on the global queue, then call back asynchronously on the main thread
func finshed(code: @escaping GCDKitHandleBlock) {
    globalQueue.async {
        for workItem in self.handleBlockArr {
            workItem()
        }
        self.handleBlockArr.removeAll()
        self.mainQueue.async {
            code()
        }
    }
}

/////////////////////////////////////////////////////////////////////////////

GCDKit().wait {
        self.num += 1
    }.wait {
        self.num += 2
    }.wait {
        self.num += 3
    }.wait {
        self.num += 4
    }.wait {
        self.num += 5
    }.finshed {
        print(self.num, Thread.current)
}

Concurrent time-consuming operation

Each subtask is independent, and the main thread is called back after all subtasks are completed:

/// Add a task to a custom concurrent queue. The added tasks are executed concurrently.
func handle(code: @escaping GCDKitHandleBlock) -> GCDKit {
    // concurrentQueue is assumed to be a single concurrent queue stored on GCDKit,
    // e.g. let concurrentQueue = DispatchQueue(label: "gcdkit.concurrent", attributes: .concurrent).
    // Sharing one queue is what lets the .barrier flag below exclude these tasks.
    let workItem = DispatchWorkItem {
        code()
    }
    concurrentQueue.async(group: group, execute: workItem)
    return self
}

/// A barrier task excludes the other concurrent tasks on the same queue; it is typically used
/// for write transactions to keep shared state thread safe.
func barrierHandle(code: @escaping GCDKitHandleBlock) -> GCDKit {
    // Must target the same shared concurrentQueue as handle(code:); a barrier only
    // excludes work submitted to the same queue.
    let workItem = DispatchWorkItem(flags: .barrier) {
        code()
    }
    concurrentQueue.async(group: group, execute: workItem)
    return self
}

// execute asynchronously on the main thread
func allDone(code: @escaping GCDKitHandleBlock) {
    group.notify(queue: .main, execute: {
        code()
    })
}

/////////////////////////////////////////////////////////////////////////////

GCDKit().barrierHandle {
        self.num += 1
    }.barrierHandle {
        self.num += 2
    }.barrierHandle {
        self.num += 3
    }.handle {
        self.num += 4
    }.handle {
        self.num += 5
    }.allDone {
        self.num += 6
        print(self.num, Thread.current)
}

Delayed execution

The code is executed after some delay. A typical example is a rating prompt that pops up a while after the app has been opened.

func run(when: DispatchTime, code: @escaping GCDKitHandleBlock) {
    DispatchQueue.main.asyncAfter(deadline: when) {
        code()
    }
}

/////////////////////////////////////////////////////////////////////////////

GCDKit().run(when: .now() + .seconds(120)) {
  self.doSomething()
}

Timer

Because a Timer strongly references its target, tearing it down needs special care. In addition, a Timer depends on the run loop and fires at most once per run-loop pass, so when the run loop is heavily loaded a fire may be skipped. You can use GCD's timer source instead:

/// Timer
///
/// - Parameters:
///   - start: start time
///   - end: end time
///   - repeating: repeat interval, in seconds
///   - leeway: allowed timing error
///   - eventHandle: handles each timer event
///   - cancelHandle: called when the timer is cancelled
func timer(start: DispatchTime,
           end: DispatchTime,
           repeating: Double,
           leeway: DispatchTimeInterval,
           eventHandle: @escaping GCDKitHandleBlock,
           cancelHandle: GCDKitHandleBlock? = nil)
{
    let timer = DispatchSource.makeTimerSource()
    timer.setEventHandler { eventHandle() }
    timer.setCancelHandler { cancelHandle?() }
    timer.schedule(deadline: start, repeating: repeating, leeway: leeway)
    timer.resume()
    // The end-time closure keeps a strong reference to the timer until it cancels it
    run(when: end) { timer.cancel() }
}

/////////////////////////////////////////////////////////////////////////////

GCDKit().timer(start: .now(),
               end: .now() + .seconds(10),
               repeating: 2,
               leeway: .milliseconds(1),
               eventHandle: {
    self.doSomething()
}) {
    print("timer cancel")}Copy the code

Concurrent traversal

If you need to process data faster, you can use concurrentPerform to execute loops concurrently:

func map<T>(data: [T], code: (T) -> ()) {
    DispatchQueue.concurrentPerform(iterations: data.count) { (i) in
        code(data[i])
    }
}

func run(code: (Int) -> (), repeting: Int) {
    DispatchQueue.concurrentPerform(iterations: repeting) { (i) in
        code(i)
    }
}

/////////////////////////////////////////////////////////////////////////////

let data = [1, 2, 3]
var sum = 0

GCDKit().map(data: data) { (ele: Int) in
    sleep(1)
    sum += ele
}
print(sum)

GCDKit().run(code: { (i) in
    sleep(1)
    sum += data[i]
}, repeting: data.count)
print(sum)
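
Note that sum above is mutated from several threads at once, which is a data race. Below is a minimal sketch of a safer reduction, using an NSLock purely for illustration (any of the thread-safety techniques at the end of this article would also work):

let lock = NSLock()
var safeSum = 0
DispatchQueue.concurrentPerform(iterations: data.count) { (i) in
    sleep(1)
    lock.lock()
    safeSum += data[i]          // only one thread updates the sum at a time
    lock.unlock()
}
print(safeSum)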

Control concurrency

Sometimes we need to process tasks concurrently but do not want too many of them running (and too many threads alive) at the same time. GCD has no equivalent of NSOperationQueue's maxConcurrentOperationCount, but a semaphore achieves the same effect:

func doSomething(label: String, cost: UInt32, complete: @escaping () -> ()) {
    NSLog("Start task%@", label)
    sleep(cost)
    NSLog("End task%@",label)
    complete()
}

/////////////////////////////////////////////////////////////////////////////

let semaphore = DispatchSemaphore(value: 3)
let queue = DispatchQueue(label: "", qos: .default, attributes: .concurrent)

queue.async {
    semaphore.wait()
    self.doSomething(label: "1", cost: 2, complete: {
        print(Thread.current)
        semaphore.signal()
    })
}

queue.async {
    semaphore.wait()
    self.doSomething(label: "2", cost: 2, complete: {
        print(Thread.current)
        semaphore.signal()
    })
}

queue.async {
    semaphore.wait()
    self.doSomething(label: "3", cost: 4, complete: {
        print(Thread.current)
        semaphore.signal()
    })
}

queue.async {
    semaphore.wait()
    self.doSomething(label: "4", cost: 2, complete: {
        print(Thread.current)
        semaphore.signal()
    })
}

queue.async {
    semaphore.wait()
    self.doSomething(label: "5", cost: 3, complete: {
        print(Thread.current)
        semaphore.signal()
    })
}
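
For comparison, the NSOperation approach mentioned above caps concurrency with maxConcurrentOperationCount. A minimal sketch of the equivalent (the labels and sleep times are made up to mirror the example):

let operationQueue = OperationQueue()
operationQueue.maxConcurrentOperationCount = 3   // at most three tasks run at the same time

for label in ["1", "2", "3", "4", "5"] {
    operationQueue.addOperation {
        NSLog("Start task %@", label)
        sleep(2)
        NSLog("End task %@", label)
    }
}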

Timing management

Timing (ordering) management mainly comes down to two questions:

  • whether a subtask spawns its own thread (does asynchronous work internally);
  • whether the subtasks must run in order.

Subtasks run in order and do not spawn threads

See the serial time-consuming operations section above.

Subtasks spawn threads and must run in order

This is common with network requests, where the parameters of one request come from the response of another, so the requests must be ordered. The following code stands in for a wrapped network request:

func networkTask(label: String, cost: UInt32, complete: @escaping () -> ()) {
    NSLog("Start network Task task%@", label)
    DispatchQueue.global().async {
        sleep(cost)
        NSLog("End networkTask task%@",label)
        DispatchQueue.main.async {
            complete()
        }
    }
}

When each subtask spawns its own child thread, sequential execution requires semaphore control:

let semaphore = DispatchSemaphore(value: 1)
let queue = DispatchQueue(label: "", qos: .default, attributes: .concurrent)
queue.async {
    semaphore.wait()
    self.networkTask(label: "1", cost: 2, complete: {
        semaphore.signal()
    })
    semaphore.wait()
    self.networkTask(label: "2", cost: 4, complete: {
        semaphore.signal()
    })
    semaphore.wait()
    self.networkTask(label: "3", cost: 3, complete: {
        semaphore.signal()
    })
    semaphore.wait()
    self.networkTask(label: "4", cost: 1, complete: {
        semaphore.signal()
    })
    semaphore.wait()
    print("all done")
    semaphore.signal()
}

/////////////////////////////////////////////////////////////////////////////
2017-12-19 14:02:33.297613+0800 Demo[11757:4946542] Start network Task task1
2017-12-19 14:02:35.301386+0800 Demo[11757:4946541] End networkTask task1
2017-12-19 14:02:35.301971+0800 Demo[11757:4946542] Start network Task task2
2017-12-19 14:02:39.306592+0800 Demo[11757:4946541] End networkTask task2
2017-12-19 14:02:39.306901+0800 Demo[11757:4946542] Start network Task task3
2017-12-19 14:02:42.307843+0800 Demo[11757:4946541] End networkTask task3
2017-12-19 14:02:42.308268+0800 Demo[11757:4946542] Start network Task task4
2017-12-19 14:02:43.310724+0800 Demo[11757:4946541] End networkTask task4
all done

Subtasks spawn threads and do not need to run in order

This often happens when you need to call several interfaces and then do some work only after all of the requests have completed. A dispatch group handles this:

let group = DispatchGroup()
group.enter()
networkTask(label: "1", cost: 2, complete: {
    group.leave()
})

group.enter()
networkTask(label: "2", cost: 4, complete: {
    group.leave()
})

group.enter()
networkTask(label: "3", cost: 2, complete: {
    group.leave()
})

group.enter()
networkTask(label: "4", cost: 4, complete: {
    group.leave()
})

group.notify(queue: .main, execute: {
    print("All network is done")
})

/////////////////////////////////////////////////////////////////////////////
2017-12-19 14:10:33.876393+0800 Demo[16495:4973791] Start network Task task1
2017-12-19 14:10:33.878869+0800 Demo[16495:4973791] Start network Task task2
2017-12-19 14:10:33.879142+0800 Demo[16495:4973791] Start network Task task3
2017-12-19 14:10:33.879309+0800 Demo[16495:4973791] Start network Task task4
2017-12-19 14:10:35.883851+0800 Demo[16495:4974025] End networkTask task1
2017-12-19 14:10:35.883850+0800 Demo[16495:4974030] End networkTask task3
2017-12-19 14:10:37.883995+0800 Demo[16495:4974026] End networkTask task2
2017-12-19 14:10:37.883995+0800 Demo[16495:4974027] End networkTask task4
All network is done

// You can also abbreviate it this way
let downloadGroup = DispatchGroup()
GCDKit().run(code: { (i) in
    downloadGroup.enter()
    networkTask(label: "\(i)", cost: UInt32(i), complete: {
        downloadGroup.leave()
    })
}, repeting: 10)
downloadGroup.notify(queue: .main) {
    print("All network is done")}/////////////////////////////////////////////////////////////////////////////
2017-12-19 15:07:13.253428+0800 Demo[49319:5169745] Start network Task task3
2017-12-19 15:07:13.253428+0800 Demo[49319:5169743] Start network Task task2
2017-12-19 15:07:13.253428+0800 Demo[49319:5169744] Start network Task task0
2017-12-19 15:07:13.253479+0800 Demo[49319:5169474] Start network Task task1
2017-12-19 15:07:13.253946+0800 Demo[49319:5169744] Start network Task task6
2017-12-19 15:07:13.253947+0800 Demo[49319:5169743] Start network Task task4
2017-12-19 15:07:13.253947+0800 Demo[49319:5169745] Start network Task task5
2017-12-19 15:07:13.254119+0800 Demo[49319:5169763] End networkTask task0
2017-12-19 15:07:13.254193+0800 Demo[49319:5169474] Start network Task task7
2017-12-19 15:07:13.254339+0800 Demo[49319:5169744] Start network Task task8
2017-12-19 15:07:13.254343+0800 Demo[49319:5169743] Start network Task task9
2017-12-19 15:07:14.258061+0800 Demo[49319:5169764] End networkTask task1
2017-12-19 15:07:15.258071+0800 Demo[49319:5169762] End networkTask task2
2017-12-19 15:07:16.258189+0800 Demo[49319:5169742] End networkTask task3
2017-12-19 15:07:17.258100+0800 Demo[49319:5169745] End networkTask task4
2017-12-19 15:07:18.258196+0800 Demo[49319:5169766] End networkTask task5
2017-12-19 15:07:19.258171+0800 Demo[49319:5169765] End networkTask task6
2017-12-19 15:07:20.259119+0800 Demo[49319:5169763] End networkTask task7
2017-12-19 15:07:21.258239+0800 Demo[49319:5169767] End networkTask task8
2017-12-19 15:07:22.258280+0800 Demo[49319:5169744] End networkTask task9
All network is done

Custom data listening

You can use a user-data dispatch source (DispatchSource.makeUserDataAddSource()) to listen for data changes without running the callback for every single change: pending signals are coalesced automatically and the handler is invoked with the merged result when the target queue gets a chance to run it, which saves CPU.

extension GCDKit {

    convenience init(valueChanged: @escaping (T) -> ()) {
        self.init()
        userDataAddSource = DispatchSource.makeUserDataAddSource()
        userDataAddSource?.setEventHandler(handler: { [weak self] in
            guard let `self` = self else { return }
            guard let value = self.value else { return }
            valueChanged(value)
        })
        userDataAddSource?.resume()
    }

    func send(_ value: T) {
        self.value = value
        userDataAddSource?.add(data: 1)
    }
}

/////////////////////////////////////////////////////////////////////////////

GCD = GCDKit<Int> { (value: Int) in
    print(value)
}

let serialQueue = DispatchQueue(label: "com")
serialQueue.async {
    for i in 1...1000 {
        self.GCD.send(i)
    }
    for i in 1000...9999 {
        self.GCD.send(i)
    }
}

/////////////////////////////////////////////////////////////////////////////

64
9999

Monitoring processes

In macOS development, you can listen for events from other processes, such as when they exit:

let apps = NSRunningApplication.runningApplications(withBundleIdentifier: "com.apple.mail")
let pid = apps.first!.processIdentifier
let source = DispatchSource.makeProcessSource(identifier: pid, eventMask: .exit)
source.setEventHandler {
    print("mail quit")
}
// Keep a strong reference to `source` (e.g. store it in a property); monitoring stops once it is deallocated
source.resume()

Listening for directory structure

let folder = try? FileManager.default.url(for: .documentDirectory,
                                          in: .userDomainMask,
                                          appropriateFor: nil,
                                          create: false)
print(folder!.path)
// Open the directory as an event-only descriptor for monitoring
let fd = open(folder!.path, O_EVTONLY)
let queue = DispatchQueue(label: "m")
let source = DispatchSource.makeFileSystemObjectSource(fileDescriptor: fd,
                                                       eventMask: .all,
                                                       queue: queue)
source.setEventHandler {
    print("folder changed")
}
source.resume()

let result = FileManager.default.createFile(atPath: folder!.path + "/abc", contents: nil, attributes: nil)
if result {
    print(0)
} else {
    print(1)
}

Thread safety

Instead of using a lock, you can restrict the queue on which a resource is read and written. For example, with a barrier:

/// The .barrier flag ensures this task excludes the other concurrent tasks on the same queue;
/// it is used for write transactions to keep shared state thread safe.
func barrierHandle(code: @escaping GCDKitHandleBlock) -> GCDKit {
    // concurrentQueue is the shared concurrent queue stored on GCDKit; a barrier only
    // excludes work submitted to the same queue.
    let workItem = DispatchWorkItem(flags: .barrier) {
        code()
    }
    concurrentQueue.async(group: group, execute: workItem)
    return self
}
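
Written directly against a concurrent queue, without the GCDKit chain, this is the classic reader-writer pattern. A minimal sketch (the class name and queue label are made up for illustration):

final class ThreadSafeValue<T> {
    private let queue = DispatchQueue(label: "readwrite", attributes: .concurrent)
    private var storage: T

    init(_ value: T) { storage = value }

    // Reads run concurrently with each other
    var value: T {
        queue.sync { storage }
    }

    // Writes use a barrier, so they wait for in-flight reads and block new ones
    func update(_ newValue: T) {
        queue.async(flags: .barrier) { self.storage = newValue }
    }
}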

Or route reads and writes synchronously through a serial queue:

extension GCDKit {
    
    // readWriteQueue is a serial queue stored on GCDKit; `value` is the backing storage
    var data: T? {
        get {
            return readWriteQueue.sync { value }
        }
        set {
            readWriteQueue.sync { value = newValue }
        }
    }
}
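
A possible usage sketch, assuming GCDKit is generic over T and owns a serial readWriteQueue as above:

let kit = GCDKit<Int>()
DispatchQueue.concurrentPerform(iterations: 100) { (i) in
    kit.data = i      // writes are serialized through readWriteQueue
    _ = kit.data      // reads go through the same queue, so no half-written values
}
print(kit.data ?? -1)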