I. Concept of Generator functions

Generator functions are an asynchronous programming solution provided by ES6. The Promise objects discussed earlier are also an asynchronous solution provided by ES6, so why propose a Generator?

Promise objects have clear advantages for handling asynchrony, not least the ability to turn callback hell into a chain of .then() calls. But they also have drawbacks: asynchronous logic wrapped in a Promise is littered with Promise-specific vocabulary (resolve, reject, then...), which hurts readability.

In fact, the ideal way to handle an asynchronous task is to write it as if it were synchronous: the code that follows the asynchronous task sits directly after it, rather than inside a callback or a then method. Generator functions were proposed to solve exactly this problem. How do you make asynchronous operations read like synchronous ones? Imagine giving a function the ability to 'pause' execution: when an asynchronous task is encountered, the current execution context is saved, and once the task finishes, its result is retrieved and execution continues from where it left off. This is the idea behind asynchronous processing with Generator functions.

How can a function be 'paused'? This brings us to the concept of the Iterator interface.

II. The concept of Iterator

Iterator is an interface that provides a unified access mechanism for different data structures. Any data structure that deploys the Iterator interface can be iterated.

An Iterator can be thought of as a pointer object that walks through a data structure via its next method. Each call to next moves the pointer to the next member of the data structure and returns information about the current member: an object containing both value and done properties. The value property is the value of the current member, and the done property is a Boolean indicating whether the traversal has finished.
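
To make the "pointer object" idea concrete, here is a minimal hand-rolled sketch of such an iterator over an array. The helper name makeIterator is made up purely for illustration; it is not part of any standard API.

function makeIterator(array) {
  let index = 0; // the "pointer"
  return {
    next() {
      return index < array.length
        ? { value: array[index++], done: false }
        : { value: undefined, done: true };
    },
  };
}

const pointer = makeIterator(['a', 'b']);
console.log(pointer.next()); // { value: 'a', done: false }
console.log(pointer.next()); // { value: 'b', done: false }
console.log(pointer.next()); // { value: undefined, done: true }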

ES6 specifies that the Iterator interface is deployed on a data structure's Symbol.iterator property. Calling this interface returns an iterator object. Let's use an array as an example.

let arr = [1, 2, 3];
// return the iterator object
let it = arr[Symbol.iterator]();
console.log(it.next()); // { value: 1, done: false }
console.log(it.next()); // { value: 2, done: false }
console.log(it.next()); // { value: 3, done: false }
console.log(it.next()); // { value: undefined, done: true }

Not all data structures have an Iterator interface natively. The data structures that have Iterator interfaces natively in ES6 are as follows.

  • Array
  • Map
  • Set
  • String
  • TypedArray
  • The arguments object for the function
  • The NodeList object

The Iterator interface is consumed by the for...of loop, which means that any data structure with an Iterator interface can be traversed with for...of. Structures without it (such as plain objects) cannot.
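
A quick, minimal sketch of the difference (runnable in any ES6 environment; the exact TypeError message varies by engine):

// Arrays have a native Iterator interface, so for...of works
for (const n of [10, 20]) {
  console.log(n); // 10, then 20
}

// Plain objects do not, so for...of throws
try {
  for (const v of { a: 1 }) {
    console.log(v);
  }
} catch (e) {
  console.log(e instanceof TypeError); // true -- "is not iterable" (wording varies)
}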

However, we can manually deploy the interface on data structures that do not have a native Iterator interface. Specifically, we add a Symbol.iterator property to the structure: a function that, when called, returns an iterator object. Then, when we traverse the structure with for...of, the interface we deployed is invoked. The following demonstrates deploying the Iterator interface on a plain object.

const obj = {
  a: 'a',
  b: 'b',
  c: 'c',
  [Symbol.iterator]: function () {
    let keys = Object.keys(this);
    let index = 0;
    return {
      next: function () {
        return index < keys.length
          ? { value: this[keys[index++]], done: false }
          : { value: undefined, done: true };
      }.bind(this),
    };
  },
};

for (const it of obj) {
  console.log(it);
}
// a
// b
// c

From the discussion above, we know that traversal only moves forward when the next method is called. Generator functions exploit exactly this to make asynchronous operations look synchronous. Calling a Generator function returns an iterator object, which can be used to step through the states encapsulated inside the Generator function. Let's look at this in detail.

III. Form and basic use of Generator functions

1. Form of Generator functions.

Generator functions have two distinct characteristics that distinguish them from normal functions.

  • There is an asterisk between the function keyword and the function name.
  • Inside the function body, yield expressions are used to define the different internal states.
function* gen() {
  yield 1;
  yield 2;
}
let g = gen();
console.log(g.next()); // { value: 1, done: false }
console.log(g.next()); // { value: 2, done: false }
console.log(g.next()); // { value: undefined, done: true }

2. Yield expression

As we can see from the example above, yield expressions separate the states of a Generator function and can be understood as signals for the function to pause. When the next() method is executed and a yield expression is encountered, subsequent operations are paused and the yield expression's value becomes the value property of the object returned by next. The next time next() is called, execution resumes from the operation that follows that yield expression. This behavior is important; it is exactly what we will use to make asynchronous code read like synchronous code.

function* gen() {
  yield 1 + 2;
  yield 2 + 3;
}
let g = gen();
console.log(g.next()); // { value: 3, done: false }
console.log(g.next()); // { value: 5, done: false }
console.log(g.next()); // { value: undefined, done: true }

IV. Asynchronous application of Generator functions

Now that we have seen the basic features of Generator functions, let's return to the original problem: how to make asynchronous operations read like synchronous ones. The requirement is to perform subsequent operations only after an asynchronous operation finishes, and a Generator function moves from its current state to the next one only when the next method is called. So we simply let yield mark each asynchronous operation as a state; this guarantees that the function pauses whenever it hits an asynchronous operation. When each asynchronous operation ends, we call the next method so the function continues, and the asynchronous logic ends up written in a synchronous style.

To achieve this, two problems need to be solved.

1. Deliver asynchronous results

We know that asynchronous processing usually produces a return value, so the first problem is how to pass that asynchronous result back into the Generator function.

To be clear, a yield expression itself evaluates to undefined by default, which means that writing the following directly will not work.

function* gen() {
  const res = yield async1()
  yield async2(res)
}

To pass the results, we use the next method. If the next method has an input parameter, that parameter is treated as the return value of the previous yield expression.

function* gen() {
  const res1 = yield 1;
  const res2 = yield res1 + 1;
  yield res2 + 2;
}
let g = gen();
console.log(g.next());  // { value: 1, done: false }
// next is passed 3, so res1 = 3 and 3 + 1 = 4
console.log(g.next(3)); // { value: 4, done: false }
console.log(g.next(4)); // { value: 6, done: false }

Therefore, we simply pass the asynchronous result as an argument to the next method.

2. Call the next method after the asynchronous operation ends

We normally deal with asynchrony using callbacks and Promises, so there are two ways to approach this problem.

2.1 Generator asynchronous flow control based on callback functions

We simply call the next method inside the asynchronous callback, so that the Generator function resumes after the asynchronous operation completes.

const async1 = () => {
  setTimeout(() => {
    // pass the async result to the next method
    g.next(1);
  });
};
const async2 = (res) => {
  setTimeout(() => {
    console.log(res + " from async1");
    g.next(2);
  });
};
const async3 = (res) => {
  setTimeout(() => {
    console.log(res + " from async2");
  });
};

function* gen() {
  const res1 = yield async1();
  const res2 = yield async2(res1);
  yield async3(res2);
}

let g = gen();
g.next();
// 1 from async1
// 2 from async2

In the code above, if you ignore the yield keywords, the asynchronous logic inside the Generator function reads just like synchronous code.

However, the problem with the code above is obvious: we have to handle each asynchronous callback by hand. This is tedious, especially since every callback does exactly the same thing, namely call the next method and pass in the asynchronous result. If we could extract this process and automate it, the code logic would be greatly simplified. Let's solve these two problems in turn.

  • Extraction of callback function processing

How can I decouple the processing of callback functions?

Take the setTimeout function, which takes two arguments: a callback function and a delay. We want to pass these two arguments in separately and handle them separately. This brings to mind currying, which we discussed earlier. Currying transforms a function that takes multiple arguments into one that takes a single argument and returns a function that accepts the rest.

Using setTimeout as an example: once curried, we can pass in the delay first and then pass the callback to the returned function, which is exactly what we need. It looks like this:

function currying(time) {
  return (cb) => {
    return setTimeout(cb, time);
  };
}
const curryTimeout = currying(500);
curryTimeout(() => {
  console.log("timeOut");
});

The next question is where to handle the asynchronous callbacks. We know that the value property of the object returned by the next method is the result of the yield expression. If the expression after yield is a curried asynchronous call (currying(500) in the example above), then the value property of next's return value is a function that accepts the asynchronous callback. So we simply pass the callback to that value property. The example above is reworked below.

function currying(time) {
  return (cb) => {
    return setTimeout(cb, time);
  };
}

function* gen() {
  const res1 = yield currying(500);
  const res2 = yield currying(res1);
  yield currying(res2);
}

const g = gen();
g.next().value(() => {
  console.log("async1");
  g.next(500).value(() => {
    console.log("async2");
    g.next(500).value(() => {
      console.log("async3");
    });
  });
});
// Prints async1, async2, async3, one every 0.5 seconds

You can see how much clearer the code logic is. It is also worth noting that this curried asynchronous call is in fact a Thunk function. A Thunk function is an intermediate function that replaces a multi-argument function with a single-argument function taking only a callback. In the example above, curryTimeout is such an intermediate function: it takes only a callback as an argument, so it is a Thunk function. In Ruan Yifeng's words: any function whose parameters include a callback can be rewritten in Thunk form. The example above is essentially a hand-rolled, bare-bones Thunk converter. In production, Node.js's thunkify module is commonly used for the conversion. The next step is to change manual execution into automatic execution.
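
Before moving on, here is a minimal sketch of what such a converter might look like. It is illustrative only, not the real thunkify module (which handles more edge cases), and it assumes the wrapped function takes its callback as the last argument. The name simpleThunkify and the file path are made up for the example.

// A bare-bones Thunk converter (illustrative only, not the real thunkify module)
function simpleThunkify(fn) {
  return function (...args) {
    return function (callback) {
      return fn.call(this, ...args, callback);
    };
  };
}

const fs = require("fs");
const readFileThunk = simpleThunkify(fs.readFile);
// readFileThunk("./some-file.txt") is a single-argument function that only takes a callback
readFileThunk("./some-file.txt")((err, data) => {
  if (err) throw err;
  console.log(data.toString());
});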

  • Automatic execution

A close look at the code that drives the Generator function manually shows that we really only do one thing: pass the same callback to the value property returned by next, and all that callback does is call next again with the asynchronous result. Based on this, we can write a program that runs a Generator function automatically according to that logic: it checks the done property of the object returned by next, and as long as done is not true, it passes the callback to the value property.

Below, the readFile API of the Node.js fs module is used to demonstrate converting an asynchronous API into a Thunk function with the thunkify module. Prepare two text files, text1.txt and text2.txt, whose contents are the phrases 'wine song' and 'life geometry'.

const thunkify = require("thunkify"); const fs = require("fs"); const readFileThunk = thunkify(fs.readFile) function* gen() { yield readFileThunk("./text1.txt"); yield readFileThunk("./text2.txt"); } function run(fn) { const gen = fn(); Function next(err, data) {// Error priority callback if (data) console.log(data.tostring ()); const res = gen.next(data); if (res.done) return; res.value(next); } next(); } run(gen); // How is lifeCopy the code

As you can see, with the auto-executor we only need to write the asynchronous logic inside the Generator function; as long as every yield expression is a Thunk function, we can hand the Generator function directly to run.

2.2 Generator asynchronous flow control based on Promise

As the callback-based Generator executor above shows, the key to automatic execution is calling the next method after the asynchronous operation ends so that the Generator function continues. The same thing can be done with Promise's then method.

Reworking the example above, what we need to do is actually very simple:

  • Wrap the readFile function as a Promise
  • Drive execution automatically with the Promise then method
const fs = require("fs"); function promisify_readFile(path) { return new Promise((resolve, reject) => { fs.readFile(path, (err, data) => { if (err) reject(err); else resolve(data); }); }); } function* gen() { yield promisify_readFile("./text1.txt"); yield promisify_readFile("./text2.txt"); } function run(fn) { const gen = fn(); function next(data) { if (data) console.log(data.toString()); const res = gen.next(data); if (res.done) return; Then (next,(r)=>console.log(r))} next(); // Res.value returns a Promise. } run(gen);Copy the code

At this point, we have essentially met the requirement posed at the beginning of this article and achieved automatic execution. This is also the core implementation principle of the well-known co module.

V. The co module and its implementation principle

The co module is a well-known module for automatically executing Generator functions. It is very simple to use: just pass the Generator function to co and it executes automatically.

const co = require("co"); function* gen() { const res1 = yield promisify_readFile("./text1.txt"); console.log(res1.toString()) const res2 = yield promisify_readFile("./text2.txt"); Console. log(res2.toString())} co(genCopy the code
Implementation principle

In fact, after the discussion of automatically executing Generator functions, we can see that the core of the co module is an extension of the run function we implemented above. Specifically:

  • co returns a Promise object, so logic is added to settle that Promise
  • Make sure that each step returns a Promise

The following is a bare-bones implementation of the co module.

function co(gen) {
  return new Promise(function (resolve, reject) {
    gen = gen();
    if (!gen || typeof gen.next !== "function") return resolve(gen);
    function next(data) {
      const res = gen.next(data);
      if (res.done) {
        return resolve(res.value);
      } else {
        // make sure each step returns a Promise
        const value = Promise.resolve(res.value);
        value.then(next, (r) => reject(r));
      }
    }
    next();
  });
}

co(gen).then(() => console.log("end"));
// wine song
// life geometry
// end

The co module is the predecessor of the async/await keywords. async/await is widely regarded as the definitive solution for asynchronous programming, and it will be covered in detail later.
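
As a brief preview only (assuming the promisify_readFile helper defined earlier), the Generator-plus-co example can be sketched with async/await; no executor is needed because the engine drives the function itself:

async function main() {
  // await pauses here until the Promise settles, just as yield paused the Generator
  const res1 = await promisify_readFile("./text1.txt");
  console.log(res1.toString());
  const res2 = await promisify_readFile("./text2.txt");
  console.log(res2.toString());
}

main();
// wine song
// life geometry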

Reference: es6.ruanyifeng.com/#docs/gener…