The topic of this article is “How to ensure UI fluency in a single-threaded model”. It is framed around the principles of Flutter performance, but Dart and JS share many concepts and mechanisms: both are single-threaded, and both handle interface rendering and IO in similar ways. I won’t go into every detail of JS; instead, I will use comparison and analogy with the front end to make the topic easier to grasp and to broaden the surrounding knowledge.

First, we analyze the event loop and event queue model from the front-end perspective. Then we move to the Flutter layer and discuss the relationship between the Dart event queue and synchronous/asynchronous tasks.

1. Design of single thread model

1. The most basic single thread handles simple tasks

Suppose there are several tasks:

  • Task 1: “Name:” + “Hangzhou Xiao Liu”
  • Task 2: “Birthday:” + “1995” + “.” + “02” + “.” + “20”
  • Task 3: “Age:” + (2021 - 1995 + 1)
  • Task 4: Print the results of tasks 1, 2, and 3

Executed in a single thread, the code might look like this:

// C++
#include <cstdio>
#include <string>

void mainThread() {
  std::string name = std::string("Name: ") + "Hangzhou Xiao Liu";
  std::string birthday = std::string("Birthday: ") + "1995" + "." + "02" + "." + "20";
  int age = 2021 - 1995 + 1;
  printf("Personal information: %s, %s, age: %d\n", name.c_str(), birthday.c_str(), age);
}

The thread starts, executes each task in turn as required, and exits immediately after the last task completes.

2. How to deal with new tasks when the thread is running?

The threading model in Section 1 is too simple and idealized: it starts with n known tasks, but in most real scenarios another m new tasks arrive while the thread is running. The design in Section 1 cannot meet that requirement.

**To be able to accept and execute new tasks while the thread is running, an event loop is required.** The most basic event loop is simply a loop.

// C++
#include <cstdio>
#include <iostream>
using namespace std;

int getInput() {
  int input = 0;
  cout << "Please enter a number: ";
  cin >> input;
  return input;
}

void mainThread() {
  while (true) {
    int input1 = getInput();
    int input2 = getInput();
    int sum = input1 + input2;
    printf("The sum of the two numbers is: %d\n", sum);
  }
}

Compared to the first version of the thread design, this version has the following improvements:

  • Introduced a looping mechanism where the thread does not exit immediately after finishing.
  • Events are introduced. The thread suspends while it waits for user input and is woken up when the user finishes typing; it then performs the addition and prints the result, after which it goes back to waiting for the next input.

3. Process tasks from other threads

Real-world threading models are far from this simple. In the browser environment, for example, the rendering thread may be drawing when it receives a mouse-click event from the user, a notification that a CSS resource has finished loading from the network, and so on. The second version of the threading model introduces the event loop and can accept new event tasks, but notice that all of these tasks come from within the thread itself; the design cannot accept tasks posted by other threads.

As can be seen from the figure above, the main render thread frequently receives event tasks from the IO thread. When it receives the message that a resource has finished loading, the render thread starts DOM parsing; when it receives a mouse-click message, it executes the bound mouse-click event script (JS) to handle the event.

So we need a reasonable data structure to hold and retrieve the messages sent by other threads.

Message queues are a term you’ve all heard, and event queues are a common solution in GUI systems.

A message queue (event queue) is exactly such a data structure: new tasks are appended to the tail of the queue, and the next task to execute is taken from the head.

With message queues, the threading model is upgraded. As follows:

It can be seen that the transformation is divided into three steps:

  • Build a message queue
  • New tasks generated by IO threads are added to the end of the message queue
  • The render main thread iteratively reads the task from the header of the message queue and executes the task

Pseudo code for the queue interface:

// C++
class TaskQueue {
 public:
  Task fetchTask();          // Fetch a task from the head of the queue
  void addTask(Task task);   // Insert a task at the tail of the queue
};

Transformation of main thread

TaskQueue taskQueue;
void processTask(Task task);

void mainThread() {
  while (true) {
    Task task = taskQueue.fetchTask();
    processTask(task);
  }
}

IO thread

void handleIOTask() {
  Task clickTask;
  taskQueue.addTask(clickTask);
}

Tips: Event queues are accessed by multiple threads, so they need to be locked.

4. Process tasks from other processes

In a browser environment, the rendering process often receives tasks from other processes; the IO thread is dedicated to receiving their messages, and IPC handles the communication across processes.

5. Task types in the message queue

There are many types of message in the message queue. Internal messages include mouse scrolls, clicks and moves, macro tasks, microtasks, file reads and writes, timers, and so on.

There are also a large number of page-related events in the queue, such as JS execution, DOM parsing, style calculation, layout calculation, and CSS animation.

All of the above are executed on the main render thread, so take care when coding to minimize how long each of these tasks occupies it.

6. How to exit safely

In Chrome’s design, when a page is about to exit, the page’s main thread sets an exit flag. After each task completes, the thread checks the flag; if it is set, the remaining work is interrupted and the thread exits.
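A minimal sketch of this idea, written in Dart for consistency with the rest of the article; the names (exitFlag, mainLoop) are illustrative, not from Chrome’s actual code:

bool exitFlag = false;

void mainLoop(List<void Function()> taskQueue) {
  while (taskQueue.isNotEmpty) {
    final task = taskQueue.removeAt(0); // fetch from the head of the queue
    task();                             // run the task to completion
    if (exitFlag) return;               // check the flag only at task boundaries
  }
}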

7. Disadvantages of single threading

Event queues are first-in, first-out: a task added later may be blocked because an earlier task takes too long, and it has to wait for that task to finish before it can run. This raises two problems.

  • How do you handle high-priority tasks?

    Suppose you want to monitor DOM node changes (insertions, deletions, modifications of innerHTML) and trigger corresponding logic. The most basic approach is to design a set of listener interfaces that the rendering engine calls synchronously whenever the DOM changes. The big problem with this is that the DOM changes very frequently; if every change synchronously triggers the corresponding JS interface, individual tasks become very long and execution efficiency drops.

    If instead you treat these DOM changes as asynchronous messages placed in the message queue, the notification may not be handled promptly because earlier tasks are still executing, which hurts the real-time quality of the monitoring.

    How do you balance efficiency and real-time behaviour? Microtasks solve exactly this kind of problem.

    Generally, the tasks in the message queue are called macro tasks, and each macro task has its own microtask queue. If the DOM changes while a macro task is executing, the change is added to that macro task’s microtask queue, which solves the efficiency problem.

    When the main body of the macro task finishes, the rendering engine executes the microtasks in its microtask queue, which solves the real-time problem.

  • How do you solve the problem of a single task taking too long to execute?

    As you can see, if a JS computation runs too long, the animation paint misses its deadline and the page stutters. Browsers avoid this with a callback design that defers the JS task so it executes later. (A Dart-flavoured sketch of the same idea, splitting long work into chunks, follows after this list.)
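Since the rest of this article is about Dart, here is a minimal Dart-flavoured sketch of that idea: split one long computation into small chunks and yield back to the event loop between chunks so that input and painting are not starved. The function names are illustrative, not from any framework.

void doChunkOfWork(int i) {
  // Stand-in for one slice of a long computation.
}

Future<void> longTask() async {
  for (var i = 0; i < 1000; i++) {
    doChunkOfWork(i);
    // Yield to the event loop so other queued events (input, paint) can run.
    await Future<void>.delayed(Duration.zero);
  }
}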

The single-threaded model in Flutter

1. Event Loop mechanism

Dart is single-threaded, meaning code executes in order. At the same time, Dart, as the development language of the GUI framework Flutter, necessarily supports asynchrony.

A Flutter application contains one or more isolates; by default, code runs in the main isolate. An isolate contains one event loop and one task queue, where the task queue consists of an event queue and a microtask queue, as follows:

Why asynchrony? Because in most cases an application is not doing computation all the time. It may, for example, be waiting for user input, and only compute once the input arrives. That is an IO scenario. So a single thread can do other things while it waits and only switch to the computation when it is actually needed. Although it is single-threaded, it gives the impression of doing many things at once (doing other work in its idle time).

If a task involves IO or asynchrony, the main thread first moves on to other work that needs computation. This behaviour is driven by the event loop, and, just like JS, Dart stores the event tasks in an event queue.

The Event Queue is responsible for storing events for tasks that need to be performed, such as reading DB.

Dart has two queues, a Microtask Queue and an Event Queue.

The event loop polls continuously. It first checks whether the microtask queue is empty and, if not, fetches the task at its head and executes it. Only when the microtask queue is empty does it check the event queue; if the event queue is not empty, it takes the event at its head (keyboard, IO, network events, and so on) and executes its callback on the main thread, as follows:

2. Asynchronous tasks

Microtasks are asynchronous tasks that complete in a short time. They have the highest priority in the event loop: as long as the microtask queue is not empty, the event loop keeps executing microtasks, and the tasks waiting in the event queue have to keep waiting. Microtasks can be created with scheduleMicrotask.
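A minimal sketch of this priority difference (scheduleMicrotask comes from dart:async):

import 'dart:async';

void main() {
  Future(() => print('event task'));           // goes to the event queue
  scheduleMicrotask(() => print('microtask'));  // goes to the microtask queue
  print('sync code');                           // part of the current task
}
// Output:
// sync code
// microtask
// event task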

In general, microtasks have few usage scenarios. Flutter itself uses microtasks internally where high priority is required, such as gesture recognition, text input, scrolling views and page effects.

Therefore, in general, we use the lower-priority event queue for our asynchronous tasks: IO, drawing, timers and so on are all driven on the main thread through the event queue.
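For example, a timer callback is just another task delivered through the event queue, so it can only run after the current synchronous code has finished. A small sketch (the delay value is arbitrary):

import 'dart:async';

void main() {
  Timer(Duration(milliseconds: 100), () => print('timer fired'));
  print('end of main'); // printed first: the callback waits in the event queue
}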

Dart provides a layer of encapsulation for event-queue tasks called Future. By putting a function body into a Future, you wrap a synchronous task into an asynchronous one (similar to submitting a task to a queue synchronously or asynchronously with GCD on iOS). A Future can also chain calls, executing further tasks (functions) after the asynchronous work completes.

Look at some concrete code:

void main() {
  print('normal task 1');
  Future(() => print('Task1 Future 1'));
  print('normal task 2');
  Future(() => print('Task1 Future 2'))
      .then((value) => print("subTask 1"))
      .then((value) => print("subTask 2"));
}
//
lbp@MBP  ~/Desktop  dart index.dart
normal task 1
normal task 2
Task1 Future 1
Task1 Future 2
subTask 1
subTask 2

In the main method, a common synchronous task is added first, then an asynchronous task is added via Future; Dart puts the asynchronous task into the event queue and returns immediately, and the subsequent code keeps running as synchronous code. Then another common synchronous task runs, and another asynchronous task is added via Future, so there are now two asynchronous tasks in the event queue. Dart takes tasks from the head of the event queue and executes them in order (first in, first out); once a Future’s task finishes, its subsequent thens are executed.

A Future and its then callbacks run within the same turn of the event loop; if there are multiple thens, they are executed in order.

Example 2:

void main() {
  Future(() => print('Task1 Future 1'));
  Future(() => print('Task1 Future 2'));

  Future(() => print('Task1 Future 3'))
      .then((_) => print('subTask 1 in Future 3'));

  Future(() => null).then((_) => print('subTask 1 in empty Future'));
}
lbp@MBP ~/Desktop  dart index. Dart Task1 Future1
Task1 Future 2
Task1 Future 3
subTask 1 in Future 3
subTask 1 in empty Future

Within main, the task in Future 1 is added to the event queue, then the task in Future 2, then the task in Future 3. subTask 1 runs in the same event-loop turn as Future 3’s task, right after it completes. The body of the fourth Future is empty, so the code in its then is added to the microtask queue and executed in the next round of the event loop.

Comprehensive example

import 'dart:async';

void main() {
  Future(() => print('Task1 Future 1'));
  Future fx = Future(() => null);
  Future(() => print("Task1 Future 3")).then((value) {
    print("subTask 1 Future 3");
    scheduleMicrotask(() => print("Microtask 1"));
  }).then((value) => print("subTask 3 Future 3"));

  Future(() => print("Task1 Future 4"))
      .then((value) => Future(() => print("sub subTask 1 Future 4")))
      .then((value) => print("sub subTask 2 Future 4"));

  Future(() => print("Task1 Future 5"));

  fx.then((value) => print("Task1 Future 2"));

  scheduleMicrotask(() => print("Microtask 2"));

  print("normal Task");
}
lbp@MBP ~/Desktop  Dart index. Dart normal Task Microtask2
Task1 Future 1
Task1 Future 2
Task1 Future 3
subTask 1 Future 3
subTask 3 Future 3
Microtask 1
Task1 Future 4
Task1 Future 5
sub subTask 1 Future 4
sub subTask 2 Future 4

Explanation:

  • The event loop first executes the synchronous code in main, then the microtasks, and finally the asynchronous tasks in the event queue. Therefore normal Task is executed first
  • For the same reason, Microtask 2 is executed next
  • Next, since the event queue is FIFO, Task1 Future 1 is executed
  • The body of fx is empty, so the content of its then is added to the microtask queue, which has the highest priority, so Task1 Future 2 is executed next
  • Next, Task1 Future 3 is executed. It has two thens: subTask 1 Future 3 in the first then runs first; it schedules a microtask, so Microtask 1 is added to the microtask queue to wait for the next turn of the event loop; then subTask 3 Future 3 in the second then runs. When the next turn of the event loop arrives, Microtask 1 is executed
  • Next, Task1 Future 4 is executed. Its first then wraps its work in a Future, turning it into an asynchronous task that is added to the event queue; the second then is also added to the event queue
  • Next, Task1 Future 5 is executed, and this turn of the event loop ends
  • In the following turns of the event loop, sub subTask 1 Future 4 and sub subTask 2 Future 4 are taken from the queue and printed

3. Asynchronous functions

The result of an asynchronous function is returned at some point in the future, so the function returns a Future object to its caller. Depending on the need, the caller either registers a then on that Future and handles the result asynchronously, or waits synchronously until the Future completes. To wait synchronously, add await at the call site, and mark the function containing it with the async keyword.
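A minimal sketch of the two calling styles described above; loadConfig is a made-up name standing in for any asynchronous function:

Future<String> loadConfig() async {
  // Stand-in for some IO work that completes later.
  return Future.delayed(Duration(milliseconds: 10), () => 'config loaded');
}

Future<void> main() async {
  // Style 1: register a then callback and keep going.
  loadConfig().then((value) => print('then: $value'));

  // Style 2: await the result; this suspends only the current context.
  final value = await loadConfig();
  print('await: $value');
}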

await is not a synchronous wait; it is an asynchronous wait. The event loop also treats the function containing the await as an asynchronous function: the whole context waiting on the await is added to the event queue. Once the awaited Future returns, the event loop takes that context back out of the event queue and the code after the await continues to execute.

await only blocks the code that follows it within the current context; it cannot block the code that follows in the outer call stack.

void main() {
  Future(() => print('Task1 Future 1'))
      .then((_) async => await Future(() => print("subTask 1 Future 2")))
      .then((_) => print("subTask 2 Future 2"));

  Future(() => print('Task1 Future 2'));
}
lbp@MBP ~/Desktop  dart index. Dart Task1 Future1
Task1 Future 2
subTask 1 Future 2
subTask 2 Future 2

Explanation:

  • Task1 Future 1 is added to the event queue. The first then is an asynchronous task wrapped in a Future, so Future(() => print("subTask 1 Future 2")) is added to the event queue, and the awaiting context is added to the event queue as well. The second then is also added to the event queue
  • Task1 Future 2 in the second outer Future is not blocked by the await, because await is an asynchronous wait (it merely puts the waiting context in the event queue). So Task1 Future 2 prints next, then subTask 1 Future 2 executes, and finally the awaited result is picked up and subTask 2 Future 2 executes

4. Isolate

To take advantage of multi-core CPUs for CPU-intensive computation, Dart provides a multi-threading mechanism called Isolate. Each isolate has its own event loop, event queue and microtask queue. Isolates do not share resources; they communicate only through a message mechanism (just like processes).

It is easy to use: you pass an entry function and a single parameter when creating an isolate.

import 'dart:isolate';

void coding(String language) {
  print("hello " + language);
}

void main() {
  Isolate.spawn(coding, "Dart");
}
lbp@MBP ~/Desktop  Dart index. Dart Hello dartCopy the code

In most cases you need more than concurrent execution: you usually also want the result reported back to the main isolate after the work completes. That is done with message passing via SendPort. The main isolate passes one end of a pipe (a SendPort) as a parameter to the child isolate, and when the child isolate finishes its work it sends the result back to the main isolate through that pipe.

import 'dart:isolate';

void coding(SendPort port) {
  const sum = 1 + 2;
  // Send the result to the caller
  port.send(sum);
}

void main() {
  testIsolate();
}

Future<void> testIsolate() async {
  ReceivePort receivePort = ReceivePort(); // Create a pipe
  // Create an isolate and pass the send port in as its parameter
  Isolate isolate = await Isolate.spawn(coding, receivePort.sendPort);
  // Listen for messages
  receivePort.listen((message) {
    print("data: $message");
    receivePort.close();
    isolate.kill(priority: Isolate.immediate);
  });
}
lbp@MBP ~/Desktop  Dart index. Dart data:3

In addition, the compute function provides a shortcut to perform concurrent computing tasks. It encapsulates the creation of the Isolate and two-way communication.

In fact, there are only a few scenarios in business development where compute is needed, JSON encoding/decoding being a typical one.

Calculate the factorial:

// compute comes from the Flutter framework
import 'package:flutter/foundation.dart';

Future<int> testCompute() async {
  return await compute(syncCalcuateFactorial, 100);
}

int syncCalcuateFactorial(int upperBounds) => upperBounds < 2
    ? upperBounds
    : upperBounds * syncCalcuateFactorial(upperBounds - 1);
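And here is a hedged sketch of the JSON use case mentioned above. It assumes a Flutter project (compute lives in package:flutter/foundation.dart); the function names are illustrative:

import 'dart:convert';

import 'package:flutter/foundation.dart';

// Must be a top-level (or static) function so compute can run it in another isolate.
List<dynamic> parseList(String raw) => jsonDecode(raw) as List<dynamic>;

Future<List<dynamic>> parseListInBackground(String raw) {
  // Decodes the JSON string in a background isolate and returns the result.
  return compute(parseList, raw);
}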

Conclusion:

  • Dart is single-threaded, but it achieves asynchrony through the event loop
  • A Future is an encapsulation of an asynchronous task; with async and await, the event loop lets us wait without blocking
  • An Isolate is Dart’s multi-threading entity and enables real concurrency. Each isolate has its own event loop and queues and does not share resources; isolates communicate one-way through messages, which are processed asynchronously by the receiving isolate’s event loop
  • Flutter provides the compute method for CPU-intensive work; it encapsulates the isolate creation and communication internally
  • The concepts of event queue and event loop are fundamental in GUI systems; they appear almost everywhere: in the front end, Flutter, iOS, Android, and even NodeJS