The JavaScript event loop

It took me a week to put together my own understanding of the event loop.

Let’s start with three important concepts

The main thread

All synchronous tasks are executed on the main thread, while asynchronous tasks are queued as macrotasks or microtasks.

  • Synchronous task: a task queued on the main thread; it can run only after the previous task has finished.
  • Asynchronous task: a task that does not enter the main thread right away; it is queued and only runs once it is ready and the main thread is free.

Micro Task

  • Promise callbacks (.then / .catch / .finally)
  • async / await (the code after an await)
  • process.nextTick (Node)
  • MutationObserver (an HTML5 API)

Macro Task

  • Script (the whole script — it runs as the first macrotask)
  • setTimeout
  • setInterval
  • setImmediate (Node)
  • I/O
  • UI rendering

A flowchart

To put it simply, the event loop is how single-threaded JavaScript handles asynchronous events: when an asynchronous event occurs, its callback is added to the event queue, where it waits until the main thread is idle.

Main thread tasks → microtasks → macrotasks. If a macrotask produces microtasks, those microtasks run before the next macrotask, and the pattern continues in turn:

Main tasks → microtasks → macrotasks → microtasks generated by those macrotasks → macrotasks generated by those macrotasks → … until all tasks are complete. My understanding is that, at the same level, microtasks take precedence over macrotasks.

Within the same round of the task queue, microtasks generated by a microtask are placed after the microtasks of the current round, and macrotasks generated by a microtask are placed after the macrotasks of the current round.

Within the same round, microtasks generated by a macrotask are executed immediately after that macrotask, while the macrotasks it generates are placed after the macrotasks of the current round.
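
A minimal sketch of these two rules (the ordering assumes a standard browser environment):

setTimeout(() => {                      // macrotask A
  console.log('macrotask A')
  // a microtask created inside a macrotask runs right after A, before macrotask B
  Promise.resolve().then(() => console.log('microtask inside A'))
}, 0)

Promise.resolve().then(() => {          // a microtask of the current round
  console.log('microtask')
  // a macrotask created inside a microtask goes to the end of the macrotask queue
  setTimeout(() => console.log('macrotask B'), 0)
})

console.log('sync code')

// Output: sync code, microtask, macrotask A, microtask inside A, macrotask B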

The event loop constantly checks the Call Stack to see if there are any tasks (stack frames) that need to be executed; if not, it checks the Event Queue, takes a task from it, pushes it onto the Call Stack, and so on.

Simple flow chart

  1. Synchronous and asynchronous tasks go to different "places": synchronous tasks enter the main thread, while asynchronous tasks register their callbacks in the Event Table.
  2. When the specified event completes, the Event Table moves the callback into the Event Queue.
  3. Once the tasks on the main thread have finished and the stack is empty, the main thread reads the next callback from the Event Queue and executes it.
  4. This process repeats over and over, and is known as the Event Loop.

Event loop flow (function version)
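
Roughly, one turn of the loop can be sketched in pseudocode like this (macroTaskQueue, microTaskQueue, execute, isTimeToRender, and render are illustrative names, not real APIs):

// Illustrative pseudocode only
while (true) {
  const task = macroTaskQueue.shift()       // oldest macrotask; the script itself is the first one
  if (task) execute(task)

  while (microTaskQueue.length > 0) {       // drain ALL microtasks before anything else
    execute(microTaskQueue.shift())
  }

  if (isTimeToRender()) render()            // rendering may happen between macrotasks
}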

Stack and queue schematic diagram


Why microtasks run before macrotasks

Microtasks are completed before any other event handling, rendering, or any other macrotask takes place.

This is important because it ensures that the application environment is essentially the same between microtasks (no mouse coordinate changes, no new network data, and so on).

If we want to execute a function asynchronously (after the current code), but before changes are rendered or new events are processed, we can schedule it using queueMicrotask.

Other concepts

Heap

Stores reference values — variables hold addresses that point into the heap.

Objects are allocated in a heap, a computer term used to denote a large (usually unstructured) area of memory.

Stack

Last in, first out (LIFO). (Think of a crowded elevator 🌰: the first person in comes out last, and the last person in comes out first!)

Function calls form a stack of frames.

function foo(b) {
  let a = 10;
  return a + b + 11;
}

function bar(x) {
  let y = 3;
  return foo(x * y);
}

console.log(bar(7)); // returns 42

When bar is called, the first frame is created and pushed, containing bar's parameters and local variables. When bar calls foo, a second frame is created and pushed on top of the first, containing foo's arguments and local variables. When foo completes and returns, the second frame is popped off the stack (leaving only bar's call frame). When bar completes and returns as well, the first frame is popped and the stack is empty.

Queue

First in, first out (FIFO).

When an Event in the Event Table is triggered, the Event’s callback function is pushed into the Event Queue and waits to be executed

A JavaScript runtime contains a message queue of messages to be processed. Each message is associated with a callback function that processes the message.

At some point during the event loop, the runtime processes the messages in the queue, starting with the oldest one. The message is removed from the queue, and its associated callback is called with the message as an input parameter. As mentioned earlier, calling a function always creates a new stack frame for it.

Function processing continues until the stack is empty again; the event loop then processes the next message in the queue (if any).

Event Table

The Event Table can be interpreted as a Table of Event -> callback functions

Asynchronous work is handed to the Web APIs; when it finishes, the callback is pushed into the Event Queue.

In JS it stores the list of asynchronous events (requests, setTimeout, I/O, etc.) and their corresponding callback functions.

Web APIs

Browsers provide a variety of asynchronous Web APIs, such as the DOM, timers, AJAX, and so on.

When we call a Web API such as setTimeout, the setTimeout function is pushed onto the top of the stack and executed, but its callback is not pushed onto the stack immediately; instead a timer task is started. When the timer ends, the callback is pushed into the task queue, and the event loop mechanism controls when the callbacks in that queue are invoked.

My understanding: when an asynchronous task is invoked, it does not enter the task queue immediately. Instead, the API provided by the Web platform is called first, and its result (the callback) is put into the task queue.
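
A tiny annotated example of that flow (the ordering assumes a standard browser environment):

console.log('start')                 // 1. pushed onto the call stack and executed

setTimeout(() => {
  console.log('callback')            // 4. picked up by the event loop once the stack is empty
}, 1000)                             // 2. the timer itself is handled by the Web API, off the call stack

console.log('end')                   // 3. runs immediately; the callback waits in the task queue

// Output: start, end, then (after about 1 second) callback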

Web Workers

For long, heavy computing tasks that should not block event loops, we can use Web Workers.

This is how you run code in another parallel thread.

Web Workers can exchange messages with the main thread, but they have their own variables and event loops.

Web Workers do not have access to the DOM, so they are mainly useful for calculations — for example, ones that can use multiple CPU cores at the same time.
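
A minimal sketch of handing a heavy computation to a worker — the file name worker.js and the message shape are illustrative assumptions:

// main.js
const worker = new Worker('worker.js')
worker.postMessage({ n: 1e9 })                    // hand the heavy job to the worker thread
worker.onmessage = (e) => {
  console.log('result from worker:', e.data)      // the main thread stays responsive meanwhile
}

// worker.js
onmessage = (e) => {
  let sum = 0
  for (let i = 0; i < e.data.n; i++) sum += i     // the heavy loop runs off the main thread
  postMessage(sum)                                 // send the result back to the main thread
}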

Processes and threads

A simple explanation of processes and threads

An example 🌰 (from a test question)

console.log('script start')

async function async1() {
  await async2()
  console.log('async1 end')
}

async function async2() {
  console.log('async2 end')
}

async1()

setTimeout(function() {
  console.log('setTimeout')
}, 0)

new Promise(resolve => {
  console.log('Promise')
  resolve()
})
  .then(function() { console.log('promise1') })
  .then(function() { console.log('promise2') })

console.log('script end')

// Output in current engines:
// script start => async2 end => Promise => script end => async1 end => promise1 => promise2 => setTimeout

The code after await async2() is registered as a microtask only once async2() has finished. In older engines the await added extra promise ticks, so async1 end appeared later:

// script start => async2 end => Promise => script end => promise1 => promise2 => async1 end => setTimeout

Strictly speaking, this behavior was originally against the specification — but of course the specification can be changed: there is a PR from the V8 team, and in newer versions the output has changed. There is also a discussion on Zhihu worth a look: www.zhihu.com/question/26…

How to schedule macrotasks and microtasks

Schedule a new macro task:

  • Use zero-delay setTimeout(f).

It can be used to break a heavy computing task into parts so that the browser can react to user events and display the progress of the task between the parts of the task.

In addition, it is also used in event handlers to schedule an action after the event has been fully processed.

Schedule a new microtask:

  • Use queueMicrotask(f).
  • Promise handlers also go through the microtask queue.

There is no processing of UI or network events between microtasks: they are executed immediately, one after the other.

So, we can use queueMicrotask to execute a function asynchronously while keeping the state of the environment consistent.
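
A small sketch of the scheduling difference (assumes a standard browser environment with queueMicrotask support):

setTimeout(() => console.log('macrotask: setTimeout'), 0)        // runs in a later macrotask

queueMicrotask(() => console.log('microtask: queueMicrotask'))   // runs before rendering or new events
Promise.resolve().then(() => console.log('microtask: promise'))

console.log('sync code')

// Output: sync code, microtask: queueMicrotask, microtask: promise, macrotask: setTimeout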

Analysis of setTimeout

setTimeout(fn, 0) specifies that a task should be executed at the earliest available idle time on the main thread — in other words, as soon as all synchronous tasks on the stack have finished and the stack is empty.

setTimeout(() => {
  task()
}, 3000)

sleep(10000000)

If you run this in the console, task() actually executes much more than 3 seconds later — so what happened to the promised 3-second delay?

At this point we need to revisit the definition of setTimeout. Let's walk through how the code above executes (a runnable sketch follows this list):

  • task() enters the Event Table and is registered; the timer starts.
  • The sleep function executes — very, very slowly — while the timer keeps running.
  • Three seconds pass. The timer event timeout completes and task() enters the Event Queue, but sleep is so slow that it has not finished yet, so task() has to wait.
  • sleep finally finishes, and task() at last moves from the Event Queue into the main thread and executes.
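
A runnable version of the sketch above — task and sleep are placeholders in the original, so here they are filled in (sleep as a hypothetical busy-wait) purely for illustration:

const start = Date.now()

function task() {
  console.log('task ran after', Date.now() - start, 'ms')   // well over the promised 3000 ms
}

function sleep(ms) {
  const end = Date.now() + ms
  while (Date.now() < end) {}                                // busy-wait that blocks the main thread
}

setTimeout(() => {
  task()
}, 3000)

sleep(10000)                                                  // keep the main thread busy for ~10 seconds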

Other applications for event loops

1 Splitting up CPU-heavy tasks

Suppose we have a CPU overload task.

For example, syntax highlighting (used to color the sample code on this page) is a CPU intensive task. To highlight the code, it performs analysis, creates a lot of colored elements, and then adds them to the document — which can take a long time with a large text document.

When the engine is busy with syntax highlighting, it can’t handle other DOM-related work, such as handling user events. It may even cause the browser to “hiccup” or even “hang” for an unacceptable period of time.

We can avoid this problem by breaking up large tasks into smaller ones. Highlight the first 100 lines, then schedule the next 100 lines with setTimeout (delay parameter 0), and so on.

To demonstrate this approach, and for simplicity's sake, instead of text highlighting let's write a function that counts from 1 to 1000000000.

If you run the following code, you will see the engine "hang" for a while. This is clearly noticeable in server-side JS, and if you run it in a browser and try to click other buttons on the page, you'll find that no other events are handled until the counting ends.

let i = 0;
let start = Date.now();

function count() {
  for (let j = 0; j < 1e9; j++) {
    i++;
  }
  alert("Done in " + (Date.now() - start) + 'ms');
}

count();

The browser may even display a “script is taking too long” warning.

Let’s split the task using nested setTimeout calls:

let i = 0;
let start = Date.now();

function count() {
  // do part of the heavy task (*)
  do {
    i++;
  } while (i % 1e6 != 0);

  if (i == 1e9) {
    alert("Done in " + (Date.now() - start) + 'ms');
  } else {
    setTimeout(count); // schedule a new call (**)
  }
}

count();

The browser interface can now be used normally during the count process.

A single execution of count completes part of the work (*) and then reschedules its own execution (**) as needed:

  1. First run: counts i = 1…1000000.
  2. Second run: counts i = 1000001…2000000.
  3. …and so on.

Now, if a new side task (such as an onclick event) appears while the engine is busy executing the first part, the side task is queued and then executed after the first part ends and before the next part begins. Periodically returning to the event loop between two count executions provides enough "air" for the JavaScript engine to perform other operations in response to user actions.

It’s worth noting that the two variants — whether or not a setTimeout is used to split tasks — are comparable in execution speed. There was little difference in the total time it took to perform the count.

Let’s make an improvement to make the two times more similar.

We will move the scheduling to the beginning of count():

let i = 0;
let start = Date.now();

function count() {
  if (i < 1e9 - 1e6) {
    setTimeout(count); // schedule a new call
  }

  do {
    i++;
  } while (i % 1e6 != 0);

  if (i == 1e9) {
    alert("Done in " + (Date.now() - start) + 'ms');
  }
}

count();

Now, when we start calling count() and see that we need to make more calls to count(), we schedule it immediately before work.

If you run it, you’ll easily notice that it takes significantly less time.

Why is that?

This is simple: you’ll recall that multiple nested setTimeout calls have a minimum delay of 4ms in the browser. Even if we set it to 0, it’s still 4ms (or longer). So the earlier we schedule it, the faster it will run.
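A quick way to observe this clamping in the browser console — the exact numbers vary by browser, but roughly 4 ms steps appear after a few nested calls:

let start = Date.now()
let times = []

setTimeout(function run() {
  times.push(Date.now() - start)    // record how long we have waited so far
  if (Date.now() - start > 100) {
    console.log(times)              // the first few entries grow by ~1 ms, later ones by ~4 ms or more
  } else {
    setTimeout(run)                 // reschedule with zero delay
  }
})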

Finally, we split a heavy task into parts that now don’t clog the user interface. And it doesn’t take much longer.

2 Progress Indicator

Another benefit of splitting overloaded tasks in browser scripts is that we can display progress indicators.

As mentioned earlier, changes in the DOM are drawn only after the currently running task has completed, no matter how long the task has taken to run.

On the one hand, this is great, because our function might create many elements, insert them one by one into the document, and change their styles — visitors won’t see any unfinished “in-between” content. It’s important, right?

As an example, changes to i will not be shown until the function completes, so we will only see the last value:

<div id="progress"></div>

<script> 
function count() {
  for (let i = 0; i < 1e6; i++) {
    i++;
    progress.innerHTML = i;
  }
 }
 
 count(); 
 </script>

… But we might also want to show something during a task, such as a progress bar.

If we use setTimeout to break the heavy task into parts, the changes will be drawn between them.

This looks even better:

<div id="progress"></div>

<script>
  let i = 0;

  function count() {
    // do part of the heavy task (*)
    do {
      i++;
      progress.innerHTML = i;
    } while (i % 1e3 != 0);

    if (i < 1e7) {
      setTimeout(count);
    }
  }

  count();
</script>

The div now shows increasing values of i — a kind of progress bar.

Key differences between the Node and browser event loops

The main difference between the two is that in the browser, microtasks are executed after each macrotask, whereas in Node.js microtasks are executed between the different phases of its event loop.
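
A commonly cited sketch of this difference (the Node output shown applies to older Node versions; Node 11+ was changed to match the browser):

setTimeout(() => {
  console.log('timer1')
  Promise.resolve().then(() => console.log('promise1'))
}, 0)

setTimeout(() => {
  console.log('timer2')
  Promise.resolve().then(() => console.log('promise2'))
}, 0)

// Browser (and Node 11+): timer1, promise1, timer2, promise2
// Older Node: timer1, timer2, promise1, promise2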

Conclusion

  1. The microtask queue takes precedence over the macrotask queue.

  2. Macrotasks created by a microtask are added to the end of the current macrotask queue.

  3. Microtasks created by a microtask are added to the end of the current microtask queue.

  4. As long as there are tasks in the microtask queue, the macrotask queue waits until the microtask queue is empty.

  5. The code after an await statement is added to the microtask queue only after the await expression has run.

  6. When await promise is encountered, the code after the await statement can be added to the microtask queue only after the awaited promise has settled (a tiny example follows this list):

    • While waiting for the await promise.then microtask, the remaining synchronous code runs first;
    • Once the synchronous code has finished, the await promise.then microtask runs;
    • After that microtask completes, the code following the await statement is added to the microtask queue.
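
A tiny example of points 5 and 6:

async function main() {
  console.log('before await')     // synchronous, runs immediately
  await Promise.resolve()
  console.log('after await')      // queued as a microtask once the awaited promise settles
}

main()
console.log('sync code')

// Output: before await, sync code, after await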

References

developer.mozilla.org/zh-CN/docs/…

zh.javascript.info/event-loop

www.ruanyifeng.com/blog/2013/1…

juejin.cn/post/684490…