Introduction

iOS provides technologies that let you perform any task asynchronously without managing threads yourself. One of these technologies for starting tasks asynchronously is Grand Central Dispatch (GCD). GCD takes the thread-management code you would normally write yourself and moves it down to the system level. All you have to do is define the tasks you want to perform and add them to the appropriate dispatch queue. GCD handles creating the required threads and scheduling your tasks to run on them. Because thread management is now part of the system, GCD provides a holistic approach to task management and execution, delivering better efficiency than traditional threads.

An operation queue (the NSOperationQueue class, OperationQueue in Swift) is an Objective-C object built on top of GCD that acts much like a dispatch queue. You define the tasks you want to execute and then add them to the operation queue, which handles the scheduling and execution of those tasks. Like GCD, operation queues handle all of the thread management for you, ensuring that tasks are executed as quickly and efficiently as possible on the system.

Reference: www.cocoachina.com/articles/90…

Comparison between GCD and OperationQueue

Core ideas

  • Core concept of GCD:
    • Add a task (block) to a queue and specify the function used to execute it.
  • Core concept of NSOperation:
    • Add an operation to a queue (executed asynchronously by default).

The differences

  • GCD:

    • Adds a task (block) to a queue (serial / concurrent / main) and specifies the function (synchronous / asynchronous) used to execute it
    • GCD is a low-level C API
    • Introduced in iOS 4.0 as a concurrency technology for multi-core processors
    • The tasks executed on a queue are blocks, which are lightweight data structures
    • Stopping a block that is already in a queue requires complex code
    • Dependencies between tasks have to be expressed with a barrier (dispatch_barrier_async) or a synchronous task
    • Only queue priorities can be set, not per-task priorities
    • Advanced features:
      • dispatch_once (one-time, thread-safe execution);
      • dispatch_after (delayed execution); dispatch_group (dispatch groups); dispatch_semaphore (semaphores);
      • dispatch_apply (optimizing large, order-insensitive for loops)
  • OperationQueue:

    • An Objective-C framework; more object-oriented; an encapsulation of GCD.

    • NSOperation shipped with iOS 2.0; after Apple introduced GCD, its internals were completely rewritten on top of it.

    • You can set a qualityOfService for the whole queue, which applies to every operation in it.

    • Because an operation is an object, it gives us more options:

      • addDependency, which can set dependencies between operations even across queues;
      • qualityOfService (iOS 8+);
      • a completion callback (void (^completionBlock)(void)).

    • A maximum concurrent operation count (maxConcurrentOperationCount), which is not easy to achieve with GCD; pausing and resuming (isSuspended);

    • cancelling all operations (cancelAllOperations);

    • KVO observation of queue progress (iOS 13+).

 

GCD

From common GCD interview questions to source-level analysis, the following take you deeper into GCD: 1. analysis of common GCD interview questions; 2. GCD notes (deadlocks); 3. GCD underlying principles (queue analysis).

Click here:

iOS Multithreading – “GCD”

Queues

Serial Queues

Tasks in a serial queue are executed sequentially, but there are no constraints between different serial queues: when multiple serial queues run at the same time, tasks in different queues execute concurrently.

For example: a train station can have several ticket windows, and the line at each window is a serial queue. The station is concurrent overall, but each single line is serial.

Pitfall: watch where the serial queue is created.

For example, in the code sample below:

  • If the queue is created inside the for loop, each iteration creates a new serial queue containing only one task. Multiple serial queues result in an overall concurrent effect.

To get a serial effect, the serial queue must be created outside the for loop.

Serial queues are well suited to managing a shared resource: they guarantee sequential access and eliminate resource contention.

Example code:

private func serialExcuteByGCD() {
    let lArr: [UIImageView] = [imageView1, imageView2, imageView3, imageView4]
    // Only one child thread is opened.
    let serialQ = DispatchQueue(label: "com.companyName.serial.downImage")
    for i in 0..<lArr.count {
        let lImgV = lArr[i]
        // Clear the old image.
        lImgV.image = nil
        // Watch where the serial queue is created. If it were created here,
        // each iteration would get a new serial queue with only one task in it,
        // and multiple serial queues produce an overall concurrent effect.
        // let serialQ = DispatchQueue(label: "com.companyName.serial.downImage")
        serialQ.async {
            print("Task \(i) started, \(Thread.current)")
            Downloader.downloadImageWithURLStr(urlStr: imageURLs[i]) { (img) in
                print("Task \(i) finished downloading")
                DispatchQueue.main.async {
                    lImgV.image = img
                }
                if nil == img {
                    print("Image \(i + 1) is nil")
                }
            }
        }
    }
}

Concurrent Queues

A concurrent queue still starts tasks in the order they were added (FIFO), but there is no way to know the order in which they finish, how long each runs, or how many tasks are running at a given time. Once started in FIFO order, they do not wait for one another.

For example: tasks #1, #2, and #3 are submitted to a concurrent queue and start in the order #1, #2, #3. Although #2 and #3 start later than #1, they may finish earlier. Execution is scheduled by the system, so duration and completion time cannot be determined.

When you need a concurrent queue, it is strongly recommended to use one of the four global queues provided by the system. However, when you need a barrier to fence off tasks in a queue, you must use a custom concurrent queue (barriers have no special effect on the global queues).

Contrast: barrier vs. lock

  • A barrier works on a custom concurrent queue, while a lock operates on threads.
  • A barrier fences off tasks within a custom concurrent queue; a lock prevents resource contention when multiple threads touch shared data.
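To make the contrast concrete, here is a minimal sketch of the barrier pattern in Swift (the queue label and the shared array are illustrative, not from the original demo):

```swift
import Dispatch

// Barriers only take effect on a *custom* concurrent queue.
let rwQueue = DispatchQueue(label: "com.example.rwQueue", attributes: .concurrent)
var sharedList: [Int] = []

for i in 0..<100 {
    // A barrier block waits for every block submitted before it to finish,
    // runs alone, and only then lets later blocks proceed - so these
    // appends never race even though the queue is concurrent.
    rwQueue.async(flags: .barrier) {
        sharedList.append(i)
    }
}

// A synchronous barrier drains everything above before reading the final state.
let finalCount = rwQueue.sync(flags: .barrier) { sharedList.count }
```

No lock is taken anywhere: mutual exclusion comes entirely from the queue's scheduling.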

Example code:

private func concurrentExcuteByGCD() {
    let lArr: [UIImageView] = [imageView1, imageView2, imageView3, imageView4]
    for i in 0..<lArr.count {
        let lImgV = lArr[i]
        // Clear the old image.
        lImgV.image = nil
        // A custom concurrent queue; the global queues usually perform better.
        let lConQ = DispatchQueue(label: "cusQueue", qos: .background, attributes: .concurrent)
        lConQ.async {
            print("Task \(i) started, \(Thread.current)")
            Downloader.downloadImageWithURLStr(urlStr: imageURLs[i]) { (img) in
                print("Task \(i) finished downloading")
                DispatchQueue.main.async {
                    lImgV.image = img
                }
                if nil == img {
                    print("Image \(i + 1) is nil")
                }
            }
        }
    }
}

Comparison of serial and concurrent queues

Notes

  • Whether serial or concurrent, every queue is FIFO;

Creating tasks (blocks) and adding them to queues generally happens on the main thread, but task execution usually happens on other threads (async). When you need to refresh the UI and are no longer on the main thread, you must switch back to it. When you are not sure whether you are currently on the main thread, you can use code like the following:

/** Submits a block for asynchronous execution on the main queue and returns immediately. */
static inline void dispatch_async_on_main_queue(void (^block)(void)) {
    if (NSThread.isMainThread) {
        block();
    } else {
        dispatch_async(dispatch_get_main_queue(), block);
    }
}
  • The main queue is a serial queue; only one task can execute on it at any point in time. Putting time-consuming operations on the main queue therefore makes the interface lag.

  • The system provides a serial main queue and four global queues with different priorities. When you obtain a global queue with dispatch_get_global_queue, the first parameter takes one of four values:

    • DISPATCH_QUEUE_PRIORITY_HIGH
    • DISPATCH_QUEUE_PRIORITY_DEFAULT
    • DISPATCH_QUEUE_PRIORITY_LOW
    • DISPATCH_QUEUE_PRIORITY_BACKGROUND
  • When a serial queue executes asynchronously, switching back to the main thread to refresh the UI also takes time, and execution may already be in the next loop iteration before the switch completes. The images nevertheless appear to download and display sequentially, because displaying each image on the main thread takes time. See the demo for an example.

  • Since iOS 8, if you want to enqueue tasks that can be cancelled, use the DispatchWorkItem class, which has a cancel method.

  • Avoid creating a large number of serial queues; if you want to execute many tasks concurrently, submit them to one of the global concurrent queues. When you do create a serial queue, try to identify a purpose for it, such as protecting a resource or synchronizing some key behavior of the application (for example, logic that must process Bluetooth scan results in order).
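A minimal sketch of the cancellable-task idea mentioned above (the queue label and task contents are illustrative): a DispatchWorkItem that is cancelled before the queue gets to it is simply skipped at dequeue time.

```swift
import Dispatch

let queue = DispatchQueue(label: "com.example.workitems")
var ran: [String] = []

let first = DispatchWorkItem { ran.append("first") }
let second = DispatchWorkItem { ran.append("second") }

// Cancel before the queue runs it: the item's body is never invoked.
first.cancel()

queue.sync(execute: first)   // skipped - the item was cancelled
queue.sync(execute: second)  // runs normally
```

Note that cancel only prevents an item that has not started; it does not interrupt a body that is already executing.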

Block

What is the difference between stack blocks and heap blocks? Why is this asked so often in interviews?

  • Can a heap block capture variables? Can a stack block?
  • What is the difference between using a block on the stack and on the heap?
  • Does __weak really solve retain cycles?
  • How do you understand the underlying structure of a block?

Click here:

Block underlying principles and LLDB Plugin

Dispatch queues copy the blocks added to them and release them when execution finishes. Although queues are more efficient than raw threads for running small tasks, there is still overhead in creating blocks and executing them on a queue. If a block does too little work, it may be far cheaper to execute it inline than to dispatch it to a queue. The way to decide whether a block does too little work is to gather metrics for each path with a performance tool and compare them.

You may also want to wrap part of a block's code in an @autoreleasepool to handle memory management for its objects. Although GCD dispatch queues have their own autorelease pools, they make no guarantee about when those pools are drained. If your application is memory-constrained, creating your own autorelease pool lets you free autoreleased objects at more regular intervals.
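As a sketch, the explicit-pool pattern looks like this in Swift (the queue label is illustrative, and the string work stands in for real per-chunk processing that creates many temporary objects):

```swift
import Foundation
import Dispatch

let workQueue = DispatchQueue(label: "com.example.parsing")
var processed = 0

workQueue.sync {
    for _ in 0..<4 {
        // Each chunk gets its own pool, so temporaries created while
        // processing it are released here, instead of at some unspecified
        // later point chosen by the queue's own pool.
        autoreleasepool {
            let chunk = String(repeating: "x", count: 1_000)  // stand-in for parsed data
            if chunk.uppercased().count == 1_000 {
                processed += 1
            }
        }
    }
}
```

The peak-memory benefit only shows up with real Objective-C temporaries; the structure, one pool per unit of work, is the point here.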

dispatch_after

The dispatch_after function does not start executing the task after the specified time; rather, it appends the task to the queue after the specified time. The timing is therefore not exact.

Example code:

dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(2 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    NSLog(@"Executed after 2 s");
});
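In Swift the same pattern is DispatchQueue's asyncAfter. A rough sketch (the delay and label are chosen arbitrarily) that also demonstrates the point above: the delay is only a lower bound, because the block is merely enqueued at the deadline and must still wait its turn.

```swift
import Foundation
import Dispatch

let queue = DispatchQueue(label: "com.example.after")
let sem = DispatchSemaphore(value: 0)
let start = Date()
var elapsed: TimeInterval = 0

// The block is appended to the queue roughly 0.2 s from now; it then
// still has to wait its turn, so 0.2 s is a floor, not an exact time.
queue.asyncAfter(deadline: .now() + 0.2) {
    elapsed = Date().timeIntervalSince(start)
    sem.signal()
}

sem.wait()
```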

dispatch_semaphore

Access to a mutable variable from multiple threads is not thread-safe; the program may even crash. In this case you can use a semaphore to ensure that when multiple threads reach a piece of code, later threads wait for earlier ones to finish, keeping the code thread-safe. Two calls are used: one to wait (dispatch_semaphore_wait) and one to signal (dispatch_semaphore_signal).
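A minimal sketch of the pattern: a semaphore created with an initial value of 1 acts as a mutex around a shared counter (the iteration count is arbitrary).

```swift
import Dispatch

// value: 1 means at most one thread may be inside the critical section.
let semaphore = DispatchSemaphore(value: 1)
var counter = 0

// 1 000 concurrent increments; the read-modify-write on `counter`
// would race without the wait/signal pair around it.
DispatchQueue.concurrentPerform(iterations: 1_000) { _ in
    semaphore.wait()    // dispatch_semaphore_wait
    counter += 1
    semaphore.signal()  // dispatch_semaphore_signal
}
```

concurrentPerform returns only after all iterations finish, so the counter can be read safely afterwards.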

dispatch_apply

A for loop can be replaced with the dispatch_apply function when the work performed in each iteration is independent of the work performed in all other iterations, and the order in which the iterations complete does not matter.

Note: after the replacement, the dispatch_apply call as a whole executes synchronously. Whether the internal blocks run serially or concurrently is determined by the queue type; a serial queue risks deadlock (and defeats the purpose), so a concurrent queue is recommended.

The original cycle:

for (i = 0; i < count; i++) {
   printf("%u\n",i);
}
printf("done");

After optimization:

dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
// count is the total number of iterations.
dispatch_apply(count, queue, ^(size_t i) {
    printf("%u\n", (unsigned int)i);
});
// Runs only after all of the iterations above have finished.
printf("done\n");

You should ensure that your task code does a reasonable amount of work in each iteration.

As with any block or function you dispatch to a queue, there is overhead in scheduling the code for execution. If each loop iteration performs only a small amount of work, the scheduling overhead may outweigh the performance benefit of dispatching it to a queue. If you find this to be true during testing, you can use striding to increase the amount of work performed per iteration. With striding, you consolidate multiple iterations of the original loop into a single block and reduce the iteration count proportionally. For example, if you originally performed 100 iterations but switch to a stride of 4, each block now performs 4 loop iterations, for a total of 25 dispatched iterations.
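A sketch of the striding idea in Swift, using DispatchQueue.concurrentPerform (the Swift counterpart of dispatch_apply). The numbers mirror the example above: 100 logical iterations, a stride of 4, so only 25 blocks are dispatched; summing the indices stands in for the real per-iteration work.

```swift
import Dispatch

let count = 100
let strideWidth = 4
var partialSums = [Int](repeating: 0, count: count / strideWidth)

partialSums.withUnsafeMutableBufferPointer { buffer in
    // 25 dispatched blocks instead of 100: each block handles 4
    // consecutive iterations, so scheduling overhead is paid 4x less often.
    DispatchQueue.concurrentPerform(iterations: count / strideWidth) { block in
        var sum = 0
        for i in (block * strideWidth)..<((block + 1) * strideWidth) {
            sum += i   // stand-in for the real per-iteration work
        }
        buffer[block] = sum  // each block writes a distinct slot - no race
    }
}

let total = partialSums.reduce(0, +)
```

Writing through the buffer pointer lets each block store its result without locking, because the slots never overlap.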

Self-test Q&A

  • Can different tasks in one queue be executed on multiple threads?

    • Serial queue, executed asynchronously: only one child thread is opened;
    • It does not matter which particular threads execute the tasks;
    • Concurrent queue, executed asynchronously: multiple threads are opened automatically, and different tasks can run concurrently on them.
  • Can one thread execute tasks from multiple queues?

    • A thread executes only one task at a time, and may execute tasks (if any) from other queues.

For example, the main thread may execute tasks that are not on the main queue in addition to main-queue tasks.

Sample diagram of the queue and thread relationship (image omitted):

  • What is the difference between qualityOfService and queuePriority?
    • qualityOfService:
      • Indicates the operation's priority when competing for system resources. The default value is NSQualityOfServiceBackground; you can raise it as needed, up to the highest level, NSQualityOfServiceUserInteractive.
    • queuePriority:
      • Sets the relative priority of operations within the same operation queue. Among operations in the same queue that are ready (isReady is YES), higher-priority ones execute first.

It is important to distinguish qualityOfService (the system-level priority with which an operation obtains resources relative to other threads) from queuePriority (the relative priority among operations in the same queue). Also note the difference between dependencies (which strictly control execution order) and queuePriority (which is only a relative ordering within a queue).

  • When network-request tasks in a queue depend on each other, does a task count as finished when the request is initiated or when the data returns?
    • When the request is initiated: the operation finishes as soon as its block returns, i.e. right after firing the asynchronous request, not when the response arrives.

OperationQueue

NSOperation is an “abstract class” that you cannot use directly.

The purpose of an abstract class is to define the properties and methods shared by its subclasses. NSOperation is an object-oriented encapsulation built on GCD: it is simpler to use and provides some features that are hard to implement well with raw GCD. It is a concurrency technology recommended by Apple.

It has two subclasses:

  • NSInvocationOperation
  • NSBlockOperation — the more commonly used one. The code is simple, and because it uses closures, no parameter passing is needed. Tasks are encapsulated in instances of the NSOperation subclass. A block-operation object can hold multiple task blocks plus a completion block; the operation is considered finished only when all of its associated blocks have finished executing.

NSOperationQueue itself is also a high-level encapsulation of GCD: more object-oriented, and able to achieve effects that are inconvenient with GCD. Operations added to a queue are executed asynchronously by default.
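A minimal sketch of a BlockOperation carrying several execution blocks plus a completion block (the recorded strings are illustrative). The completion block fires only after every execution block has finished:

```swift
import Foundation
import Dispatch

let queue = OperationQueue()
let lock = NSLock()
var results: [String] = []
func record(_ s: String) { lock.lock(); results.append(s); lock.unlock() }

let op = BlockOperation { record("main block") }
// Extra blocks may run concurrently with each other and with the first block.
op.addExecutionBlock { record("extra 1") }
op.addExecutionBlock { record("extra 2") }

let done = DispatchSemaphore(value: 0)
op.completionBlock = {      // runs only after ALL execution blocks finish
    record("completion")
    done.signal()
}

queue.addOperation(op)
done.wait()
```

The lock is needed because the three execution blocks may append concurrently; "completion" is always recorded last.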

PS: Common abstract classes are:

  • UIGestureRecognizer

  • CAAnimation

  • CAPropertyAnimation

Non-FIFO effects

Non-FIFO ordering can be achieved by setting dependencies, or priorities, on different operations.

Example code:

func testDepedence() {
    let op0 = BlockOperation { print("op0") }
    let op1 = BlockOperation { print("op1") }
    let op2 = BlockOperation { print("op2") }
    let op3 = BlockOperation { print("op3") }
    let op4 = BlockOperation { print("op4") }

    op0.addDependency(op1)
    op1.addDependency(op2)

    op0.queuePriority = .veryHigh
    op1.queuePriority = .normal
    op2.queuePriority = .veryLow
    op3.queuePriority = .low
    op4.queuePriority = .veryHigh

    gOpeQueue.addOperations([op0, op1, op2, op3, op4], waitUntilFinished: false)
}

Note: when there are no dependencies between operations, they execute according to priority; when there are dependencies, they execute in dependency order (the position of a dependent group relative to other, independent tasks remains undetermined).

Queue pause/resume

By assigning to the queue's isSuspended property, you can pause and resume the tasks in the queue that have not started yet; tasks that are already executing are not affected.

// Pausing the queue only affects tasks that are not yet executing.
// The effect is obvious with a serial queue in this demo; a concurrent
// queue easily starts everything together, and even suspending cannot
// affect tasks that are already in the executing state.
@IBAction func pauseQueueItemDC(_ sender: Any) {
    gOpeQueue.isSuspended = true
}

@IBAction func resumeQueueItemDC(_ sender: Any) {
    gOpeQueue.isSuspended = false
}

Cancelling operations

  • Once added to an operation queue, an operation object effectively belongs to the queue and cannot be removed. The only way to dequeue it is to cancel it. You can cancel a single operation object by calling its cancel method, or cancel all operation objects in a queue by calling the queue's cancelAllOperations method.
  • It is more common to cancel all operations in response to some significant event, such as the application exiting or the user explicitly requesting cancellation, than to cancel operations selectively.

Cancelling a single operation object

When cancel is called, there are three cases:

1. The operation is still waiting in the queue. In this case it will never execute.

2. The operation is executing. The system does not forcibly stop it, but its isCancelled property becomes true (your task code should check it and return early).

3. The operation has finished. cancel has no effect.

Cancelling all operation objects in a queue

Method: cancelAllOperations. Again, this fully applies only to tasks that have not started executing.

Demo code:

    deinit {
        gOpeQueue.cancelAllOperations()
        print("die: \(self)")
    }

Self-test Q&A

  • By setting operation dependencies, you can achieve a non-FIFO, specified execution order. Can you achieve the same by setting the maximum concurrency to 1?

No! After maxConcurrentOperationCount is set to 1, only one operation executes at any point in time, but the execution order is still determined by other factors, such as dependencies and operation priorities (dependencies outrank priority: order is determined by dependencies first, and by priority only among operations with no dependency constraints). A serialized operation queue therefore does not provide exactly the same behavior as a serial dispatch queue in GCD. If the execution order of your operation objects matters, either use dependencies to establish the order before adding the operations to the queue, or use a GCD serial queue instead.
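A sketch with illustrative names showing that dependencies, not insertion order, pin the execution order, even on a queue limited to one concurrent operation:

```swift
import Foundation

let queue = OperationQueue()
queue.maxConcurrentOperationCount = 1  // on its own, still no strict FIFO guarantee

let lock = NSLock()
var order: [String] = []
func trace(_ s: String) { lock.lock(); order.append(s); lock.unlock() }

let download = BlockOperation { trace("download") }
let parse    = BlockOperation { trace("parse") }
let render   = BlockOperation { trace("render") }

// Dependencies strictly control the order, regardless of priorities
// or the order in which the operations are added below.
parse.addDependency(download)
render.addDependency(parse)

// Deliberately added in reverse; the dependency chain still wins.
queue.addOperations([render, parse, download], waitUntilFinished: true)
```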

  • Why is it unnecessary to use [weak self] or [unowned self] in the blocks of an operation queue?

Even if the queue object is global, self -> queue -> operation block -> self does form a retain cycle.

However, when an operation in the queue finishes, the queue automatically releases the operation, which automatically breaks the cycle. So you do not have to use [weak self] or [unowned self]. Moreover, this kind of temporary retain cycle can be genuinely useful when you want an operation to finish its job on its own without anything else holding the objects alive. For example, if the user leaves a screen during a download and cancelAllOperations is not called, the retain cycle lets the remaining download tasks in the queue keep running to completion.

From Apple's documentation:

func addOperation(_ op: Operation)
Discussion: Once added, the specified operation remains in the queue until it finishes executing.

func addOperation(_ block: @escaping () -> Void)
Parameters: block — the block to execute from the operation. The block takes no parameters and has no return value.
Discussion: This method adds a single block to the receiver by first wrapping it in an operation object. You should not attempt to get a reference to the newly created operation object or determine its type information.

  • What is the relationship between the QoS of an operation and the QoS of its queue?

After a QoS is set on a queue, operations with a lower QoS are automatically promoted to the queue's level (operations with a higher QoS keep their own). If operations added to the queue later have a lower QoS than the queue, they too are promoted. Note Apple's documentation: “This property specifies the service level applied to operation objects added to the queue. If the operation object has an explicit service level set, that value is used instead.”

Q&A

How to solve resource contention

Resource contention can lead to data corruption, deadlocks, and even crashes from wild-pointer access.

  • For tasks with clear ordering dependencies, the best solution is a GCD serial queue, which guarantees mutually exclusive access to the resource without using thread locks.
  • In other cases, guard the contended code with a lock, or use a semaphore (created with an initial value of 1, meaning only one thread may access the resource at a time).
  • When tasks on a serial queue are executed synchronously, a deadlock occurs if they end up waiting on each other.

For example, when a task is submitted synchronously to the main queue from the main thread, the new task and the not-yet-executed remainder of the main queue wait on each other, producing a deadlock.

func testDeadLock() {
    // Synchronous dispatch to the main queue from the main thread deadlocks:
    // the block must wait for testDeadLock to return, while the synchronous
    // call makes testDeadLock (and everything after it on the main queue)
    // wait for the block. They wait on each other - deadlock.
    DispatchQueue.main.sync {
        print("main block")
    }
    print("2")
}

However, the following code does not deadlock, so dispatching synchronously to a serial queue does not necessarily deadlock:

- (void)testSynSerialQueue{
    dispatch_queue_t myCustomQueue;
    myCustomQueue = dispatch_queue_create("com.example.MyCustomQueue", NULL);

    dispatch_async(myCustomQueue, ^{
        printf("Do some work here.\n");
    });

    printf("The first block may or may not have run.\n");

    dispatch_sync(myCustomQueue, ^{
        printf("Do some more work here.\n");
    });
    printf("Both blocks have completed.\n");
}

How can I make my code more efficient

“Legend of West Cake”

Code design priority: system methods > concurrent > serial > locks. (In Chinese, the four initial characters form the mnemonic 西饼传说, “Legend of West Cake”.)

  • Rely on system frameworks as much as possible. The best way to achieve concurrency is to take advantage of the built-in concurrency that system frameworks provide.
  • Identify serial tasks early and make them as concurrent as possible. If a task must run serially because it depends on a shared resource, consider changing the architecture to remove that resource, for example by giving each client that needs it its own copy, or by eliminating the resource entirely.
  • Instead of using locks to protect a shared resource, designate a serial queue (or use operation-object dependencies) to execute tasks in the correct order.
  • Avoid locks where possible. Dispatch queues and operation queues make locking unnecessary in most situations.

Glossary of terms

  • Asynchronous Tasks:
    • Started by one thread, but actually running on another, taking advantage of the extra processor resources to get the work done faster.
  • Mutex:
    • A lock that provides mutually exclusive access to a shared resource. A mutex can only be held by one thread at a time. Attempting to acquire a mutex held by a different thread puts the current thread to sleep until it finally acquires the lock.
  • Process:
    • A runtime instance of an application or program. Processes have their own virtual memory space and system resources (including port permissions) that are independent of those allocated to other programs. A process always contains at least one thread (the main thread) and may contain any number of other threads.
  • Semaphore:
    • A protected variable that restricts access to shared resources. Mutexes and conditions are different types of semaphores.
  • Task: a unit of work to be performed.
  • Thread:
    • The flow of execution in a process. Each thread has its own stack space, but otherwise shares memory with other threads in the same process.
  • Run loop:
    • An event handling loop that receives events and dispatches them to the appropriate handler.
