Overview

Single-threaded programming is simple.

Imagine that, with some incredible state of the art, we could build a single-core super CPU, one so fast that no matter how complex a program is, it always finishes before a human can perceive any pause.

That is also one of the things software people most wish for: to be freed from the shackles of hardware and never have to compromise their code for it!

Back in reality, the vertical growth of CPU performance has gradually slowed down. Since a qualitative leap is not possible in the short term, quantitative change came first: multi-core CPUs appeared, and parallel computing became a reality.

Parallel computing brings a series of problems to software developers, but in essence these problems all revolve around one core issue: coordination.

Take the example of a factory worker:

Suppose you have arranged for a number of workers to hand-assemble a Rolls-Royce, and halfway through there are not enough materials. There are two options:

  • The workers stop what they are doing, do nothing, and stand by. Once enough material has been procured, they are told to get back to work.

  • The workers stop what they are doing, but a supervisor coordinates the situation and assigns them other work until enough materials have been procured.

This example vividly explains blocking (workers stop, do nothing, stand by) versus non-blocking (workers are scheduled to do something else).
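
To make the analogy concrete, here is a minimal GCD sketch (the queue names and timings are purely illustrative) contrasting the two approaches:

import Dispatch
import Foundation

let materialReady = DispatchSemaphore(value: 0)

// Blocking: the "worker" thread parks on the semaphore and does nothing else
// until the material arrives.
DispatchQueue.global().async {
    materialReady.wait()                 // the thread sits idle here
    print("blocking worker: assembly continues")
}

// Non-blocking: the worker only registers what to do next; the thread stays free
// to run other work in the meantime.
func whenMaterialArrives(_ resume: @escaping () -> Void) {
    DispatchQueue.global().asyncAfter(deadline: .now() + 1) { resume() }
}
whenMaterialArrives {
    print("non-blocking worker: assembly continues")
}

DispatchQueue.global().asyncAfter(deadline: .now() + 1) { materialReady.signal() }
Thread.sleep(forTimeInterval: 2)         // keep the demo process alive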

In fact, "coroutine" is a general term for a family of non-blocking techniques in software.

There are various implementations for different programming languages and platforms.

This article focuses on implementing the Coroutine library by combining Swift and assembly language. Thanks to the good compatibility between Swift and C, this library can also be called by C.

Next, I’ll explore some of the technical details of how to implement Coroutine using Swift and assembly language.

Technical detail 1: Boost.Context library

Before diving in, let's take a look at the simplest scenario (a single thread):

There is one thread (thread_01) and three functions (f1, f2, f3).

Suppose these three functions represent three independent (non-interdependent) tasks to be completed, and the thread represents the worker who must complete all three.

We’ll show you how this works in pseudocode:

thread_01.run {
    f1()
    f2()
    f3()
}

The challenge then arises: if f1, running on thread_01, is halfway through its body and some condition is not met, we want thread_01 to jump to f2 so that CPU time is not wasted. Likewise, f2 hits the same kind of condition halfway through, so thread_01 jumps from f2 to f3.

f3 runs to completion and jumps back to f2's suspension point; f2 then runs to completion and jumps back to f1's suspension point, and f1 continues to execute.

To meet this challenge, an obvious solution would be to implement the series of unconditional jumps described above with goto in C, using a label to mark each "suspension point".

Back in reality, you never know how many tasks there will be until the program runs, and you cannot statically assign labels to an unknown number of tasks at compile time.

Note:

In fact, it is possible to implement a single-threaded version of coroutines in pure C. The Swedish developers who came up with this idea are true geniuses with a thorough command of assembly.

Here is a related article (in Chinese):

coolshell.cn/articles/10…

So how does that work in other languages?

The likes of Golang and Kotlin-Coroutines use compilation techniques to implement goto-like jumps; as for Lua, I don't know much about it.

Swift, C, C++, and Rust rely directly or indirectly on assembly language to implement such jumps. Boost.Context is one of the most efficient and widely used C++ libraries for this across multiple platforms, and the jumps at its core are written in assembly.

The Swift_Coroutine library written in the rest of this article is also based on the core part of Boost.Context. Since only that core is needed, the assembly code for the core functions has been copied out and organized into a separate Swift library, following the idea of modularization:

Swift_Boost_Context

Usage:

import Swift_Boost_Context

func f1(_ data: Int) throws -> String {
    defer {
        print("f1 finish")
    }
    print("main ----> f1  data = \(data)")

    let bc2: BoostContext = makeBoostContext(f2)
    print("bc2 = \(bc2)")
    let resultF2ToF1: BoostTransfer<String> = try bc2.jump(data: "I am f1")
    print("f1 <---- f2   resultF2ToF1 = \(resultF2ToF1)")

    return "7654321"
}

func f2(_ data: String) throws -> String {
    defer {
        print("f2 finish")
    }
    print("f1 ----> f2  data = \(data)")

    return "1234567"
}

func main() throws {
    let bc1: BoostContext = makeBoostContext(f1)
    print("bc1 = \(bc1)")
    //let bc2: BoostContext = makeBoostContext(f2)
    //print("bc2 = \(bc2)")

    let resultF1ToMain: BoostTransfer<String> = try bc1.jump(data: 123)
    print("main <---- f1 resultF1ToMain = \(resultF1ToMain.data)")
    //let _: BoostTransfer<String> = try resultF1ToMain.fromContext.jump(data: 123)
}

do {
    try main()
} catch {
    print("main : \(error)")
}

Console output:

bc1 = BoostContextImpl(_spSize: 65536, _sp: 0x00007fec89500000, _fctx: 0x00007fec894fffc0)
main ----> f1  data = 123
bc2 = BoostContextImpl(_spSize: 65536, _sp: 0x00007fec89511000, _fctx: 0x00007fec89510fc0)
f1 ----> f2  data = I am f1
f2 finish
f1 <---- f2   resultF2ToF1 = BoostTransfer(fromContext: BoostContextProxy(_fctx: 0x00007fec89510eb0))
f1 finish
main <---- f1 resultF1ToMain = 7654321

Process finished with exit code 0

Unlocking more uses for Boost.Context

Can Swift_Boost_Context only be used to implement “Coroutine”?

The answer is definitely no.

It can also be used for aspect-oriented programming, such as implementing "interceptors" in some back-end network frameworks.

It can also be used to implement dynamic proxies similar to those in Java.

It can also be used for dynamic plug-ins.

It can also be used to hot-fix bugs without stopping the program, a topic that people who go deeper into cloud services will eventually run into…. 😜

The DSL that Boost.Context brings

In multithreaded programming, code is often filled with "callback hell", hence the research topic of "how to write asynchronous code as if it were synchronous".

This feature is currently built into several languages:

Examples include the Future/Promise classes in JavaScript and Dart, and the async/await syntactic sugar designed around them.

Such as the suspend functions in Kotlin-Coroutines.

Such as the built-in coroutine syntactic sugar in Golang and Lua.

All of the above languages implement this by designing syntactic sugar and ultimately generating the corresponding Future objects through compilation techniques. So let's use

Swift_Boost_Context

to implement a simple single-threaded Future.


import Foundation
import Swift_Boost_Context

public enum CoFutureError: Error {
    case canceled
}

public class CoFuture<R>: CustomDebugStringConvertible, CustomStringConvertible {

    let _name: String

    let _task: () throws -> R

    var _result: Result<R, Error>?

    var _bctx: BoostContext!

    deinit {
        print("\(self) : deinit")
    }

    public init(_ name: String, _ task: @escaping () throws -> R) {
        self._name = name
        self._task = task
        self._result = nil

        self._bctx = makeBoostContext { [unowned self] (fromCtx: BoostContext, data: Void) -> Void in
            let result: Result<R, Error> = Result { [unowned self] in
                try self._task()
            }

            let _: BoostTransfer<Void> = fromCtx.jump(data: result)
        }
    }

    @discardableResult
    public func await() throws -> R {
        let btf: BoostTransfer<Result<R, Error>> = self._bctx.jump(data: ())
        return try btf.data.get()
    }

    public func cancel() -> Void {
        if self._result == nil {
            self._result = .failure(CoFutureError.canceled)
        }
    }

    public var isCanceled: Bool {
        if case .failure(let error as CoFutureError)? = self._result {
            return error == .canceled
        }
        return false
    }

    public var debugDescription: String {
        return "CoFuture(_name: \(_name))"
    }

    public var description: String {
        return "CoFuture(_name: \(_name))"
    }
}

Usage:

CoFuture("01") {
    
    let r11 = CoFuture("01 _01") {
       
        return 11
    }
    .await()
    
    let r12 = CoFuture("01 _02") {
       
        return 12
    }
    .await()
    
    return r11 + r12
}


In effect, we have used Swift to create a DSL for the Future.

Technical detail 2: Implementing Coroutine

Combining Swift GCD and BoostContext, we build a Coroutine class with an API similar to Kotlin-Coroutines.

Like Kotlin-Coroutines, it supports coroutines in multi-threaded environments, not just single-threaded ones!

And let me repeat it three more times: Coroutine is non-blocking, non-blocking, non-blocking.

import Foundation
import Swift_Boost_Context
import SwiftAtomics
import RxSwift
import RxCocoa
import RxBlocking

public enum CoroutineState: Int {
    case INITED = 0
    case STARTED = 1
    case RESTARTED = 2
    case YIELDED = 3
    case EXITED = 4
}

public typealias CoroutineScopeFn<T> = (Coroutine) throws -> T

public typealias CoroutineResumer = () -> Void

public class CoJob {

    var _isCanceled: AtomicBool
    let _co: Coroutine

    public var onStateChanged: Observable<CoroutineState> {
        self._co.onStateChanged
    }

    init(_ co: Coroutine) {
        self._isCanceled = AtomicBool()
        self._isCanceled.initialize(false)
        self._co = co
    }

    @discardableResult
    public func cancel() -> Bool {
        // can not cancel Coroutine
        _isCanceled.CAS(current: false, future: true)
    }

    public func join() throws -> Void {
        try _co.onStateChanged.ignoreElements().toBlocking().first()
    }
}

public protocol Coroutine {

    var currentState: CoroutineState { get }

    var onStateChanged: Observable<CoroutineState> { get }

    func yield() throws -> Void

    func yieldUntil(cond: () throws -> Bool) throws -> Void

    func yieldUntil(_ beforeYield: (@escaping CoroutineResumer) -> Void) throws -> Void

    func delay(_ timeInterval: DispatchTimeInterval) throws -> Void

}

enum CoroutineTransfer<T> {
    case YIELD
    case YIELD_UNTIL(Completable)
    case DELAY(DispatchTimeInterval)
    case EXIT(Result<T, Error>)
}

class CoroutineImpl<T>: Coroutine, CustomDebugStringConvertible, CustomStringConvertible {

    let _name: String

    var _originCtx: BoostContext!

    var _yieldCtx: BoostContext?

    let _dispatchQueue: DispatchQueue

    let _task: CoroutineScopeFn<T>

    var _currentState: AtomicInt

    let _disposeBag: DisposeBag = DisposeBag()

    let _onStateChanged: AsyncSubject<CoroutineState>

    var currentState: CoroutineState {
        CoroutineState(rawValue: _currentState.load()) ?? .EXITED
    }

    var onStateChanged: Observable<CoroutineState> {
        return _onStateChanged.asObserver()
    }

    deinit {
        self._originCtx = nil
        self._yieldCtx = nil
        //print("CoroutineImpl deinit : _name = \(self._name)")
    }

    init(
            _ name: String,
            _ dispatchQueue: DispatchQueue,
            _ task: @escaping CoroutineScopeFn<T>
    ) {
        self._name = name
        self._onStateChanged = AsyncSubject()
        self._yieldCtx = nil
        self._dispatchQueue = dispatchQueue
        self._task = task
        self._currentState = AtomicInt()
        self._currentState.initialize(CoroutineState.INITED.rawValue)

        // issue: memory leak!
        //self.originCtx = makeBoostContext(self.coScopeFn)

        self._originCtx = makeBoostContext { [unowned self] (fromCtx: BoostContext, data: Void) -> Void in
            //print("\(self) coScopeFn : \(fromCtx) ----> \(_bctx!)")
            self._currentState.CAS(current: CoroutineState.INITED.rawValue, future: CoroutineState.STARTED.rawValue)
            self.triggerStateChangedEvent(.STARTED)

            self._yieldCtx = fromCtx
            let result: Result<T, Error> = Result { [unowned self] in
                try self._task(self)
            }

            //print("\(self) coScopeFn : \(self._fromCtx ?? fromCtx) <---- ")
            let _: BoostTransfer<Void> = (self._yieldCtx ?? fromCtx).jump(data: CoroutineTransfer.EXIT(result))
            //print("Never jump back to here !!!")
        }
    }

    func triggerStateChangedEvent(_ state: CoroutineState) {
        self._onStateChanged.on(.next(state))
        if state == CoroutineState.EXITED {
            self._onStateChanged.on(.completed)
        }
    }

    func start() -> Void {
        let bctx: BoostContext = self._originCtx
        self._dispatchQueue.async(execute: self.makeResumer(bctx))
    }

    func resume(_ bctx: BoostContext, ctf: CoroutineTransfer<T>) -> Void {
        switch ctf {
            case .YIELD:
                //print("\(self) -- YIELD")
                triggerStateChangedEvent(.YIELDED)
                //self._dispatchQueue.asyncAfter(deadline: .now() + .milliseconds(5), execute: self.makeResumer(bctx))
                self._dispatchQueue.async(execute: self.makeResumer(bctx))
                //print("\(self) -- YIELD -- finish")
            case .YIELD_UNTIL(let onJumpBack):
                //print("\(self) -- YIELD_UNTIL")
                triggerStateChangedEvent(.YIELDED)
                onJumpBack.subscribe(onCompleted: {
                              //print("\(self) -- YIELD_UNTIL2")
                              self._dispatchQueue.async(execute: self.makeResumer(bctx))
                          })
                          .disposed(by: self._disposeBag)
            case .DELAY(let timeInterval):
                //print("\(self) -- DELAY -- \(timeInterval)")
                triggerStateChangedEvent(.YIELDED)
                self._dispatchQueue.asyncAfter(deadline: .now() + timeInterval, execute: self.makeResumer(bctx))
                //print("\(self) -- DELAY -- finish")
            case .EXIT(let result):
                //print("\(self) -- EXITED -- \(result)")
                self._currentState.store(CoroutineState.EXITED.rawValue)
                triggerStateChangedEvent(.EXITED)
        }
    }

    func makeResumer(_ bctx: BoostContext) -> CoroutineResumer {
        return { [unowned self] in
            let btf: BoostTransfer<CoroutineTransfer<T>> = bctx.jump(data: ())
            let coTransfer: CoroutineTransfer<T> = btf.data
            return self.resume(btf.fromContext, ctf: coTransfer)
        }
    }

    func yield() throws -> Void {
        return try self._yield(CoroutineTransfer.YIELD)
    }

    func yieldUntil(cond: () throws -> Bool) throws -> Void {
        while !(try cond()) {
            try self.yield()
        }
    }

    func yieldUntil(_ beforeYield: (@escaping CoroutineResumer) -> Void) throws -> Void {
        let resumeNotifier: AsyncSubject<Never> = AsyncSubject()
        beforeYield({ resumeNotifier.on(.completed) })
        try self._yield(CoroutineTransfer.YIELD_UNTIL(resumeNotifier.asCompletable()))
    }

    func delay(_ timeInterval: DispatchTimeInterval) throws -> Void {
        try self._yield(CoroutineTransfer.DELAY(timeInterval))
    }

    func _yield(_ ctf: CoroutineTransfer<T>) throws -> Void {
        // not in current coroutine scope
        // equals `func isInsideCoroutine() -> Bool`
        // ---------------
        guard let yieldCtx = self._yieldCtx else {
            throw CoroutineError.calledOutsideCoroutine(reason: "Call `yield()` outside Coroutine")
        }

        // jump back
        // ---------------
        _currentState.store(CoroutineState.YIELDED.rawValue)
        //print("\(self) _yield : \(fromCtx) <---- \(Thread.current)")
        let btf: BoostTransfer<Void> = yieldCtx.jump(data: ctf)
        // update `self._fromCtx` when restart
        self._yieldCtx = btf.fromContext
        _currentState.store(CoroutineState.RESTARTED.rawValue)
        triggerStateChangedEvent(.RESTARTED)
        //print("\(self) _yield : \(btf.fromContext) ----> \(Thread.current)")
    }

    func isInsideCoroutine() -> Bool {
        return self._yieldCtx != nil
    }

    var debugDescription: String {
        return "CoroutineImpl(_name: \(_name))"
    }
    var description: String {
        return "CoroutineImpl(_name: \(_name))"}}public class CoLauncher {
    public static func launch<T>(
            name: String = "",
            dispatchQueue: DispatchQueue,
            _ task: @escaping CoroutineScopeFn<T>
    ) -> CoJob {
        let co: CoroutineImpl = CoroutineImpl<T>(name, dispatchQueue, task)
        co.start()
        return CoJob(co)
    }
}


extension CoroutineImpl: ReactiveCompatible {}

extension Reactive where Base: CoroutineImpl<Any> {}

Usage:

func example_01() throws {
    // Example-01
    // ===================
    print("Example-01 =============================")

    //let queue = DispatchQueue(label: "TestCoroutine")
    let queue = DispatchQueue.global()

    let coJob1 = CoLauncher.launch(name: "co1", dispatchQueue: queue) { (co: Coroutine) throws -> String in
        defer {
            print("co 01 - end \(Thread.current)")
        }
        print("co 01 - start \(Thread.current)")
        try co.yield()
        return "co1 's result"
    }

    let coJob2 = CoLauncher.launch(dispatchQueue: queue) { (co: Coroutine) throws -> String in
        defer {
            print("co 02 - end \(Thread.current)")
        }
        print("co 02 - start \(Thread.current)")
        try co.yield()
        throw TestError.SomeError(reason: "Some error occurred in co2")
        return "co2 's result"
    }

    let coJob3 = CoLauncher.launch(dispatchQueue: queue) { (co: Coroutine) throws -> String in
        defer {
            print("co 03 - end \(Thread.current)")
        }
        print("co 03 - start \(Thread.current)")
        try co.yield()
        return "co3 's result"
    }

    try coJob1.join()
    try coJob2.join()
    try coJob3.join()

    print("Example-01 ============= end ===============")
}

Console output:

Example-01 =============================
co 01 - start <NSThread: 0x7f9e3c004600>{number = 3, name = (null)}
co 02 - start <NSThread: 0x7f9e3a4060f0>{number = 2, name = (null)}
co 03 - start <NSThread: 0x7f9e3c104120>{number = 4, name = (null)}
co 01 - end <NSThread: 0x7f9e3a4060f0>{number = 2, name = (null)}
co 03 - end <NSThread: 0x7f9e3c004600>{number = 3, name = (null)}
co 02 - end <NSThread: 0x7f9e3c104120>{number = 4, name = (null)}
Example-01 ============= end ===============

As the console output above shows:

Three coroutines were created, and each was "suspended" in the middle of its run, giving up its turn on the CPU. As a result, you can see that the "first half" and the "second half" of each coroutine's code were executed on different threads!
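
Because a suspended coroutine is resumed on whichever thread its DispatchQueue picks, the same mechanism can be used to wrap a callback-based API. Here is a minimal sketch using the yieldUntil(_:) method from the Coroutine protocol above (the URL and the callback wiring are my own illustration, not one of the library's examples):

import Foundation

let job = CoLauncher.launch(name: "download", dispatchQueue: DispatchQueue.global()) { (co: Coroutine) throws -> Void in
    var received: Data? = nil

    // Suspend the coroutine, kick off an asynchronous request, and resume once the callback fires.
    try co.yieldUntil { resume in
        URLSession.shared.dataTask(with: URL(string: "https://example.com")!) { data, _, _ in
            received = data
            resume()   // completes the internal AsyncSubject; the coroutine is re-scheduled on its queue
        }.resume()
    }

    print("got \(received?.count ?? 0) bytes on \(Thread.current)")
}
try job.join()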

Technical detail 3: Implementing Coroutine-based futures

Concrete implementation:

CoFuture.swift

Usage:

func loadSpeaker() -> CoFuture<Speaker> {
    return CoFuture(name, DispatchQueue.IO) { (co: Coroutine) in
        // running on "IO" thread
    }
}

func nextTalk() -> CoFuture<Talk> {
    return CoFuture(name, DispatchQueue.Single) { (co: Coroutine) in
        // running on "Single" thread
    }
}

func getConference() -> CoFuture<Conference> {
    return CoFuture(name, DispatchQueue.Net) { (co: Coroutine) in
        // running on "Net" thread
    }
}

func getCity() -> CoFuture<City> {
    return CoFuture(name, DispatchQueue.main) { (co: Coroutine) in
        // running on "Main" thread
    }
}
func workflow() throws -> Void {
    let speaker = try repository.loadSpeaker().await()
    let talk = try speaker.nextTalk().await()
    let conference = try talk.getConference().await()
    let city = try conference.getCity().await()

    // needs both Speaker and City objects
    try reservations.bookFlight(speaker, city).await()
}

It is important to note that you can specify the thread each CoFuture runs on. Here we rely directly on DispatchQueue, the thread-pool class provided by Swift GCD, which is roughly equivalent to ThreadPoolExecutor in the Java ecosystem.

Being able to specify a DispatchQueue for each CoFuture means that each subtask can run on a different thread, while the disparate tasks are still organized as straight-line, synchronous-looking code, as in the workflow method above.
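
One caveat: DispatchQueue.IO, DispatchQueue.Single, and DispatchQueue.Net in the snippet above are not GCD built-ins (GCD only provides main and the global queues), so the demo presumably defines them itself. A minimal sketch of how such named queues could be declared (an assumption, not the repo's actual code):

import Dispatch

// Hypothetical helpers: named queues used by the CoFuture examples above.
extension DispatchQueue {
    static let IO = DispatchQueue(label: "IO", attributes: .concurrent)
    static let Single = DispatchQueue(label: "Single")                     // serial queue
    static let Net = DispatchQueue(label: "Net", attributes: .concurrent)
}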

This is somewhat similar to the observeOn operator in RxJava/RxSwift, which simply and quickly specifies the thread to use for each subtask.

// Java code
// ==================

Observable<Integer> observable = Observable.create(new ObservableOnSubscribe<Integer>() {
    @Override
    public void subscribe(ObservableEmitter<Integer> observableEmitter) throws Exception {
        // ...
    }
});

observable.subscribeOn(Schedulers.newThread())
        .observeOn(Schedulers.io())
        .map(new Function<Integer, Integer>() {
            // ...
        })
        .observeOn(AndroidSchedulers.mainThread())
        .doOnSubscribe(new Consumer<Disposable>() {
            // ...
        })
        .subscribeOn(Schedulers.single())
        .subscribe(new Consumer<Integer>() {
            // ...
        });

The same thing can be done with Kotlin Coroutines' suspend functions and withContext.

// Kotlin code
// ===================

suspend fun loadSpeaker(): Result<Speaker> {
    withContext(Dispatchers.IO) {
        // running on "IO" thread
    }
}

suspend fun nextTalk(): Result<Talk> {
    withContext(Dispatchers.Default) {
        // running on "Default" thread
    }
}

suspend fun getConference(): Result<Conference> {
    withContext(Dispatchers.IO) {
        // running on "IO" thread
    }
}

suspend fun getCity(): Result<City> {
    withContext(Dispatchers.Main) {
        // running on "Main" thread
    }
}

Technical detail 4: Implementing coroutine-based channels

In the familiar producer-consumer model, one thread is responsible for producing items and putting them into a queue, while another is responsible for taking items off the queue and consuming them.

Implementing this with plain threads still feels a bit "heavyweight". Lacking yield semantics, threads have to use synchronization to protect shared global state, which inevitably incurs the overhead of sleeping, scheduling, and context switching, and thread scheduling introduces timing uncertainty. With coroutines, "suspending" simply means handing execution over to another coroutine; when that coroutine finishes, control returns to the suspension point and the suspended coroutine "wakes up". Such transfers between coroutines are logically controlled and deterministic in their ordering; everything stays under control.

Concrete implementation:

CoChannel.swift

Usage:

func example_05() throws {
    // Example-05
    // ===================
    print("Example-05 =============================")

    let producerQueue = DispatchQueue(label: "producerQueue", attributes: .concurrent)
    let consumerQueue = DispatchQueue(label: "consumerQueue", attributes: .concurrent)
    let closeQueue = DispatchQueue(label: "closeQueue", attributes: .concurrent)
    let channel = CoChannel<Int>(capacity: 7)

    let coClose = CoLauncher.launch(name: "coClose", dispatchQueue: closeQueue) { (co: Coroutine) throws -> Void in
        print("coClose before -- delay")
        try co.delay(.milliseconds(10))
        //try co.yield()
        print("coClose after -- delay")
        channel.close()
        print("coClose -- end")
    }

    let coConsumer = CoLauncher.launch(name: "coConsumer", dispatchQueue: consumerQueue) { (co: Coroutine) throws -> Void in
        var time: Int = 1
        for item in try channel.receive(co) {
            print("consumed : \(item)  --  \(time)  --  \(Thread.current)")
            time += 1
        }
    }

    let coProducer01 = CoLauncher.launch(name: "coProducer01", dispatchQueue: producerQueue) { (co: Coroutine) throws -> Void in
        for time in (1...32).reversed() {
            //print("coProducer01 -- before produce : \(time)")
            try channel.send(co, time)
            print("coProducer01  --  after produce : \(time)")
            try co.delay(.milliseconds(1))
        }
        print("coProducer01 -- end")
    }

    /*let coProducer02 = CoLauncher.launch(name: "coProducer02", dispatchQueue: producerQueue) { (co: Coroutine) throws -> Void in
        for time in (33...50).reversed() {
            //print("coProducer02 -- before produce : \(time)")
            try channel.send(co, time)
            print("coProducer02 -- after produce : \(time)")
        }
        print("coProducer02 -- end")
    }*/

    try coClose.join()
    try coConsumer.join()
    try coProducer01.join()
    //try coProducer02.join()

    print("channel = \(channel)")
}

This CoChannel implementation supports multiple producers and a single consumer, which you can verify by uncommenting the coProducer02 block in the example above and running it.

Technical detail 5: Implementing Semaphore based on Coroutine

Concrete implementation:

CoSemaphore.swift

Semaphores can also have non-blocking versions… 😂 😭 😳

Usage:

func example_02() throws {
    // Example-02
    // ===================
    print("Example-02 =============================")

    let producerQueue = DispatchQueue(label: "producerQueue", attributes: .concurrent)
    let consumerQueue = DispatchQueue(label: "consumerQueue", attributes: .concurrent)
    let semFull = CoSemaphore(value: 8, "full")
    let semEmpty = CoSemaphore(value: 0, "empty")
    let semMutex = DispatchSemaphore(value: 1)
    var buffer: [Int] = []

    let coConsumer = CoLauncher.launch(dispatchQueue: consumerQueue) { (co: Coroutine) throws -> Void in
        for time in (1...32) {
            try semEmpty.wait(co)
            semMutex.wait()
            if buffer.isEmpty {
                fatalError()
            }
            let consumedItem = buffer.removeFirst()
            print("consume : \(consumedItem)  -- at \(time)   \(Thread.current)")
            semMutex.signal()
            semFull.signal()
        }
    }

    let coProducer = CoLauncher.launch(dispatchQueue: producerQueue) { (co: Coroutine) throws -> Void in
        for time in (1...32).reversed() {
            try semFull.wait(co)
            semMutex.wait()
            buffer.append(time)
            print("produced : \(time)   \(Thread.current)")
            semMutex.signal()
            semEmpty.signal()
        }
    }

    try coConsumer.join()
    try coProducer.join()

    print("finally, buffer = \(buffer)")
    print("semFull = \(semFull)")
    print("semEmpty = \(semEmpty)")
}

By the way, the CoChannel shown in the earlier example could not be implemented without CoSemaphore. 😜
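
CoChannel.swift itself is not reproduced here, but as a rough sketch of the idea (my own illustration, not the library's actual implementation), a bounded channel can be assembled from two CoSemaphores plus a mutex, mirroring Example-02 above:

import Dispatch

// Rough sketch only, NOT the library's actual CoChannel implementation:
// a bounded channel built from two CoSemaphores and a mutex.
class SketchChannel<E> {
    private let _semFull: CoSemaphore      // counts free slots in the buffer
    private let _semEmpty: CoSemaphore     // counts items available to consume
    private let _mutex = DispatchSemaphore(value: 1)
    private var _buffer: [E] = []

    init(capacity: Int) {
        self._semFull = CoSemaphore(value: capacity, "full")
        self._semEmpty = CoSemaphore(value: 0, "empty")
    }

    func send(_ co: Coroutine, _ item: E) throws {
        try _semFull.wait(co)       // suspends the calling coroutine instead of blocking its thread
        _mutex.wait()
        _buffer.append(item)
        _mutex.signal()
        _semEmpty.signal()
    }

    func receive(_ co: Coroutine) throws -> E {
        try _semEmpty.wait(co)
        _mutex.wait()
        let item = _buffer.removeFirst()
        _mutex.signal()
        _semFull.signal()
        return item
    }
}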

Conclusion

Swift can also be used to build web apps, and there are already web frameworks such as Vapor (😂 a full-stack suite much like the Spring family).

Vapor is built on top of SwiftNIO, and there are plans to add Coroutine features to it, producing a web framework similar to Kotlin's Ktor.

Coroutines can also be used in mobile apps and desktop apps.

As a "do-everything" engineer, you should re-implement some of the core wheels yourself; the process of building them is the process of mastering their principles and applications.

Even if a new coroutine library comes along, you will probably be able to use it with ease just by skimming its documentation.

As the saying goes: master one method, and you master them all.