Simultaneously published on: CSDN Zhihu GitBook Blog

Foreword

For a long time, front-end module management was immature (before webpack appeared); in the early ES5 era, developers even registered modules on global objects like window to achieve dependency management. Now, excellent bundlers such as webpack and rollup make front-end development far more efficient, so I feel it is worth learning how such an epoch-making tool is implemented. This article walks you through building a basic front-end bundler step by step, purely for learning and sharing. Hopefully, after reading it, you will think more deeply about bundlers (how to optimize them, how to customize them on demand) rather than just knowing how to use them (i.e., reading the documentation).

Make a cup of coffee if you’re ready, because this is a long post (and it’s best to run it yourself)

A so-called bundler is, at its core, nothing more than a tool that combines JS, CSS, and other resource files.

Take webpack’s starter config as an example:

module.exports = {
  entry: path.resolve(__dirname, 'index.js'),
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'main.js'
  }
};


The configuration is simple: it only needs an entry and an output (and even those have defaults).

So how does it do that? Let’s analyze a Vanilla demo.

main.js

import app from './application.js';
import config from './config/index.js';

const { appName, version } = config;
app.start(appName, version);

main.js has the following features:

  • 1. It has dependencies

./application.js and ./config/index.js

  • 2. It uses ESM to import its dependencies

import ...

  • 3. Its dependencies have their own exports

The source of ./application.js contains export default (not shown here, to avoid verbosity) and export const xxx;

  • 4. The main file itself exports nothing

Based on this, the following questions arise:

  • 1. How do we handle ESM import/export syntax? Neither Node.js nor the browser understands the import keyword out of the box.

  • 2. Assuming 1 is solved, how do we turn a dependency into a module that can both export itself and be referenced externally?

  • 3. How do we analyze dependencies, and the dependencies of dependencies, and so on?

1 Module import and export

Let’s start with the ESM.

1.1 ESM in the browser

In case you haven’t noticed, ESM is now available in many browsers:

index.html

<script type="module">
  import showVersion from './index.js';
  showVersion();
</script>


index.js

import { version } from './config.js';

export default () => {
	console.log(`====== app version: ${version} ======`);
}


Convenient, isn’t it?

But there are quite a few problems:

  • 1. File paths

All files must carry their extension (otherwise an error is thrown), and paths must be resolved relative to the server’s host, which greatly limits the module path resolution rules (aliases, node_modules, and so on).

  • 2. HTTP requests

If you try the code above, you will see that each file is requested directly, and the requests must complete in order.

This means a large number of HTTP requests, with an implicit ordering requirement on the callbacks: if f1 is requested before f2 but comes back later, and f2 depends on methods from f1, f2’s execution will fail.

  • 3. CJS is not supported

Code written for the Node.js environment cannot be loaded this way, let alone any other module mechanism.

1.2 ESM in Node.js

By default, ESM cannot be used in Node.js, but from version 13.9 onward it can be enabled with the --experimental-modules flag, together with the following in package.json:

{
  "type": "module"
}

1.3 From ESM to CJS

As you can see, ESM compatibility between browsers and Node.js is problematic. To bridge the gap, the prevailing approach is not to use ESM directly in the browser, but to convert ESM to the universal CJS first. For example, running main.js above through Babel produces the following:

"use strict";

//1
var _application = _interopRequireDefault(require("./application.js"));

//2
var _index = _interopRequireDefault(require("./config/index.js"));

//3
function _interopRequireDefault(obj) { 
  return obj && obj.__esModule ? obj : { default: obj }; 
}

//4
var appName = _index.default.appName,
    version = _index.default.version;
//5
_application.default.start(appName, version);

First, at 1 and 2, import has been changed to require.

At 3: if the module is ESM, it carries an internal __esModule property and the exported module is used as-is; otherwise it is wrapped as { default: obj }.

Finally, at 4 and 5, because ./application.js and ./config/index.js use default exports, the default property has to be picked off when using them — exactly as the helper at 3 dictates.

At this point, most of the problem seems solved (the code has been normalized), but one question remains:

main.js currently only imports modules — if it also had exports, how would export/import be implemented?

It is important to clarify that require does not necessarily mean Node.js’s require; Node.js is simply one implementation of the CommonJS standard.
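To make the helper’s behavior concrete, here is a standalone sketch you can run directly — the two module objects are fabricated for illustration:

```javascript
// A standalone demo of the interop helper from the Babel output above.
function _interopRequireDefault(obj) {
  return obj && obj.__esModule ? obj : { default: obj };
}

// An ESM-compiled module carries the __esModule marker and passes through
// untouched; a plain CJS export is wrapped so `.default` always works.
const esModule = { __esModule: true, default: 'esm-export' };
const cjsExport = function legacy() {};

console.log(_interopRequireDefault(esModule).default);                // 'esm-export'
console.log(_interopRequireDefault(cjsExport).default === cjsExport); // true
```

Either way, the consumer can uniformly read `.default`, which is exactly what lines 4 and 5 of the compiled main.js rely on.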

2 A general solution for module exports

As above, we use Babel to test the following ESM export transformations:

ESM:

export default A;

export const name = '';

After Babel:

"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.name = exports.default = void 0;
var _default = A;
exports.default = _default;
var name = '';
exports.name = name;

export default is converted to exports.default, and export const xxx becomes exports.xxx.
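For intuition, the compiled export shape can be reproduced by hand in plain CJS (exportsObj below is a stand-in for a real module’s exports object):

```javascript
// Hand-rolled version of the compiled output above; `exportsObj` stands in
// for a module's real `exports` object.
const exportsObj = {};
Object.defineProperty(exportsObj, "__esModule", { value: true });

class A {}
exportsObj.default = A;   // from `export default A`
exportsObj.name = '';     // from `export const name = ''`

console.log(exportsObj.__esModule);    // true
console.log(exportsObj.default === A); // true
```

The `__esModule` marker is what `_interopRequireDefault` checked for earlier.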

What if the code you need to package uses CJS?

CJS:

module.exports = A;

module.exports = {
  name: ''
}

After Babel:

module.exports = A;
module.exports = {
  name: ''
};

What? It’s identical. Yes — this is the current state of affairs: since ESM support is still immature on both the browser and Node.js side, translation means going from ESM to CJS.

This way, we only need to handle the CommonJS1 and CommonJS2 cases — module code that only uses require, exports.default, exports.xxx, and module.exports — so the code of any file can be wrapped for bundling like this:

index.js

function(require, module, exports) {
  // source code
}

require is there because a module may need to reference other modules — more on that shortly.

exports is equivalent to module.exports, so essentially we use the module object to wrap the original code and eventually harvest the exported values.

At this point we have a general-purpose export scheme, but imports remain unanalyzed; they are hard to explain at this stage and will be covered later.

3 Import path analysis

When importing a module, there are generally two ways to write it:

//1
import _ from 'lodash'; 

//2
import util from './util';


The default search rule for 1 is node_modules in the project root.

The lookup rule for 2 is slightly more complicated:

./util first tries to find a package.json under ./util; if there is none, it falls back to ./util.js or ./util/index.js; otherwise the main property in package.json is used as the entry point.

The rules above assume no bundler is involved. With webpack, for example, path analysis becomes much more complex, involving the priority of the main, module, and browser fields, among others. So for our purposes, let’s simplify the rules:

  • 1. The default external dependency path is ${projectRoot}/node_modules

  • 2. ./util is equivalent to ./util.js or ./util/index.js

  • 3. Any path that does not begin with . is an external dependency (util is equivalent to ${projectRoot}/node_modules/util/index.js)

So first we need to convert the relative path to the absolute path:

function buildPath(relativePath, dirname, config) {
  const { entry } = config;
  const NODE_MODULES_PATH = `${path.dirname(entry)}/node_modules`;

  if (relativePath === entry) {
    return relativePath;
  }

  let absPath = relativePath;
  if (/^\./.test(relativePath)) {
    absPath = path.join(dirname, relativePath);
  } else {
    absPath = path.join(NODE_MODULES_PATH, relativePath);
  }

  return revisePath(absPath);
}


The buildPath function tries to build an absolute path to the file based on the relative path and parent node of the current file, because we’ll need to read the file later.

function revisePath(absPath) {
  const ext = path.extname(absPath);
  if (ext) {
    if (EXTENSIONS.indexOf(ext) === -1) {
      throw new Error(`Only support bundler for (${EXTENSIONS}) file, current ext is ${ext}`);
    }
    if (fs.existsSync(absPath)) {
      return absPath;
    }
  }
  if (ext !== '.js') {
    if (fs.existsSync(`${absPath}.js`)) {
      return `${absPath}.js`;
    }
    if (fs.existsSync(`${absPath}/index.js`)) {
      return `${absPath}/index.js`;
    }
    throw new Error(`Can not revise the path ${absPath}`);
  }
  //at this point absPath is already a complete path
  return absPath;
}

The revisePath function simply fixes up the file suffix, because ./util can stand for ./util.js or ./util/index.js.

With that in mind, we can move on to our topic.

4 Babel

After analyzing all these ideas about module import and export, the first task before us is: how do we syntactically identify a dependency in a file, and what is that dependency’s path?

The steps Babel takes to translate the source code must be mentioned here.

4.1 Parse

The parse phase will parse the source code into an abstract syntax tree (AST). The AST generates nodes of the corresponding type through lexical analysis and describes the specific characteristics of each line of code in detail. For example:

index.js

//1
import app from './application.js';

//2
import config from './config/index.js';

//3
const { appName, version } = config;

//4
app.start(appName, version);


After parse:

You can see that 1 and 2 are import declarations, 3 is a variable declaration, and 4 is an expression.

For example, 1 is an ImportDeclaration, indicating an ESM import (require syntax does not produce a node of this type), and the value in source is the relative path of the dependent module. But the tree can be deep, making it complicated to retrieve (and perhaps modify) specific information (a?.b?.c?.e?...). For this we enter the second stage of Babel’s translation.

4.2 Traverse

With the AST in hand, traversal is easy. For example, we can visit every ImportDeclaration:

traverse(ast, {
    ImportDeclaration({
      node
    }) {
      const relativePath = node.source.value;
      //record the dependency
    }
});

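Before leaning on the AST, it may help to see the goal with a naive, self-contained sketch — a regex scanner that collects the same import paths (unlike the AST traversal, it would be fooled by imports inside strings or comments, which is exactly why real tools parse):

```javascript
// For intuition only: a naive regex-based dependency collector.
function collectImports(source) {
  const deps = [];
  const re = /import\s+[\w{},*\s]+\s+from\s+['"]([^'"]+)['"]/g;
  let match;
  while ((match = re.exec(source)) !== null) {
    deps.push(match[1]); // the captured module path
  }
  return deps;
}

console.log(collectImports(`
import app from './application.js';
import config from './config/index.js';
`)); // [ './application.js', './config/index.js' ]
```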

For CJS require, it lives in the AST on a CallExpression node (the detection below is certainly not the most robust):


traverse(ast, {
  CallExpression({
    node
  }) {
    const {
      callee: {
        name
      },
      arguments
    } = node;

    if (name === 'require') {
      const relativePath = arguments[0].value;
      //record the dependency
    }
  }
});


With traverse we can even support custom syntax. Take asynchronous module imports, import(...).then:

import('./util').then(...)

We can define our own syntactic sugar:

dynamicImport('./api').then(module =>
  console.log('======successfully loaded dynamic module======')
);

Then iterate over the node where the syntax exists as above:


traverse(ast, {
  CallExpression({
    node
  }) {
    const {
      callee: {
        name
      },
      arguments
    } = node;

    if (name === 'dynamicImport') {
      const relativePath = arguments[0].value;
      const revisedPath = buildPath(relativePath, path.dirname(filename), config);
      //record the dependency
    }
  }
});


Similarly, we can even simulate ES7 decorators (@xxx). The API for traverse will not be covered here.

4.3 Transform

The AST has been generated and traversed/modified, but ultimately Babel’s target is still JS code. Transform turns our modified AST back into code.

preset-env covers the ES6→ES5 conversion; if no preset is set, Babel effectively does nothing.

Presets operate at a coarser level — they are macro rules — while very specific code-transformation logic requires plugins. For example, we used the dynamicImport syntax above and let it pass during the traverse phase, but it is not valid JS syntax, so we need to write a small plugin to convert it:

The dynamicImport syntax plugin, plugin/dynamicImport.js:

module.exports = {
  visitor: {
    Identifier(path) {
      if (path.node.name === 'dynamicImport') {
        path.node.name = 'require';
      }
    }
  }
};

The plugin replaces dynamicImport with require.
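What the visitor does can be simulated without running Babel at all, using a faked NodePath object (real Babel passes a much richer path with scope, parent, and so on):

```javascript
// A self-contained sketch of the plugin's visitor, run against a mock path.
const visitor = {
  Identifier(path) {
    if (path.node.name === 'dynamicImport') {
      path.node.name = 'require'; // rename the identifier in place
    }
  }
};

const fakePath = { node: { type: 'Identifier', name: 'dynamicImport' } };
visitor.Identifier(fakePath);
console.log(fakePath.node.name); // 'require'
```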

The translated code is then output using the plugin and presets:

const { code } = babel.transformFromAstSync(
  ast, null, {
    plugins: [
      dynamicImportPlugin
    ],
    presets,
  }
);

This concludes the Babel JS translation process.

The packages referenced:

const parser = require('@babel/parser');

const traverse = require('@babel/traverse').default;

const babel = require('@babel/core');

5 Implementation

5.1 Generating a Resource File (Asset)

We created a very simple Vanilla project with the following structure:

vanilla
│   main.js
│   constant.js
│   application.js
└───utils
│   └───log.js
└───config
    └───index.js

main.js

import app from './application.js';
import config from './config/index.js';

const { appName, version } = config;
app.start(appName, version);

Let’s start by analyzing dependencies.

For main.js, the project entry, the path must be provided by the caller, and we expect to parse the following information from it:

{
  id: String,
  code: String,
  filename: String,
  dependencies: String[]
}

This structure is called an Asset. It consists of a module id, code, filename, and dependencies. An example dependencies value:

dependencies = ['./application.js', './config/index.js']

Likewise, each element of dependencies should itself end up represented as an Asset.

The logic for generating an Asset is as follows:

  • 1. Generate a resource ID

To keep things simple, we use an auto-incrementing ID here (the first resource gets ID 0).

  • 2. Read the source code

Read the file synchronously (ignoring memory and efficiency for now).

  • 3. Generate the AST

Refer to the parse phase above.

  • 4. Traverse the AST to collect dependencies

Collecting dependencies simply pushes them onto a list (AST traversal was covered above):

const dependencies = [];
traverse(ast, {
  ImportDeclaration({ node }) {
    const relativePath = node.source.value;
    dependencies.push(relativePath);
  }
});

Note that the collected dependency paths are relative and need to be corrected later — refer to the correction function above.

  • 5. Convert back to source code

Refer to the transform phase above.

The final code is roughly as follows:

function createAsset(filename) {
  id++;
  const file = fs.readFileSync(filename, 'utf8');
  const dependencies = [];
  const ast = parser.parse(file, {
    sourceType: 'module'
  });
  traverse(ast, {
    ImportDeclaration({ node }) {
      const relativePath = node.source.value;
      dependencies.push(relativePath);
    },
    CallExpression({ node }) {
      const { callee: { name }, arguments } = node;
      if (name === 'require') {
        const relativePath = arguments[0].value;
        if (/^\./.test(relativePath)) {
          dependencies.push(relativePath);
        }
      }
    }
  });
  const { code } = babel.transformFromAstSync(
    ast, null, {
      presets,
    }
  );
  return {
    id,
    filename,
    dependencies,
    code
  };
}

5.2 Generating a Resource Tree (Asset[])

Reviewing our project structure again: it is a typical tree, with the entry file (main.js) as the root resource node, and we traverse it using DFS:

function createAssets(filename) {
  const stack = [], assets = {};
  const push = stack.push;
  stack.push = function(asset) {
    push.call(stack, asset);
    assets[asset.filename] = asset;
  }
  stack.push(createAsset(filename));
  //DFS
  while (stack.length > 0) {
    const asset = stack.pop();
    asset.mapping = {};
    //1
    asset.dependencies.forEach(relativePath => {
      const revisedPath = buildPath(relativePath, path.dirname(asset.filename), config);
      console.log(`Start extracting: ${revisedPath}`);
      const depAsset = createAsset(revisedPath);
      asset.mapping[relativePath] = depAsset.id;
      stack.push(depAsset);
    });
  }
  return assets;
}

The implementation above is simple, but note one special point: at 1 we attach a mapping attribute to each asset.

mapping records the immediate dependencies (the first layer of children) of the current resource. Its key is the relative path of a dependent resource, and its value is that module’s ID, so a path can be used to look up an ID. For example, main.js depends on ./application.js, so when we build the main.js asset, its mapping holds roughly {"./application.js": 1}. This is very useful later when we bundle main.js: through the mapping we can retrieve all the modules it depends on.
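Concretely, the asset built for main.js might look like this (IDs assumed from the extraction order; code abbreviated):

```javascript
// A hypothetical asset for main.js after dependency extraction.
const asset = {
  id: 0,
  filename: '/vanilla/main.js',
  dependencies: ['./application.js', './config/index.js'],
  mapping: { './application.js': 1, './config/index.js': 2 }
};

// mapping gives us path -> module ID, which require() will rely on later.
console.log(asset.mapping['./application.js']); // 1
```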

5.3 Integrating All Resources

  • 1. Implement universal export

With assets, we’ve already done more than half the work (take a deep breath). To bring the assets together, we need to “stitch” the resources into one — after all, the output is a single file.

Splicing resources:

const modules = Object.values(assets)
  .reduce((result, asset) => {
      const {
        id,
        code,
        mapping,
        filename: revisedPath
      } = asset;
      
      return result += `
      ${id} : [
        function(require, module, exports) {
          ${code}
        },
        ${JSON.stringify(mapping)}
      ],
    `
    }, '');


For main.js, we’ll consolidate it into something like this:

({
  0: [
    function (require, module, exports) {
      // main.js here
      "use strict";
      var _application = _interopRequireDefault(require("./application.js"));
      var _index = _interopRequireDefault(require("./config/index.js"));
      function _interopRequireDefault(obj) {
        return obj && obj.__esModule ? obj : { default: obj };
      }
      var appName = _index["default"].appName,
          version = _index["default"].version;
      _application["default"].start(appName, version);
    },
    { "./application.js": 1, "./config/index.js": 2 },
  ],
  1: [
    function (require, module, exports) {
      // code resource here
    },
    { "./utils/log.js": 4 },
  ],
  2: [
    function (require, module, exports) {
      // code resource here
    },
    { "../constant.js": 3 },
  ],
  3: [
    function (require, module, exports) {
      // code resource here
    },
    {},
  ],
  4: [
    function (require, module, exports) {
      // code resource here
    },
    {},
  ],
})

Remember the general solution for module exports mentioned above?

We wrap a factory around all of our source code:

function (require, module, exports) {
  // code resource here
}

I kept the variable names (require, module, exports) unchanged. If you like, you can swap in webpack-style names such as __pack_require__, __pack_module__, and __pack_exports__.

In the case of main.js (require’s implementation comes later), we inject module and exports objects, and as long as main.js’s factory is executed, the main.js code runs. Of course, some files have no export operations at all, but the factory still executes (it runs the source code, after all), and module and exports simply go unused.

  • 2. Implement universal import

Here we are at last. I didn’t mean to keep you in suspense, but the import solution should be clearer now that you’ve read this far.

For now, we’ll call the resources assembled in the previous step modules; each module has its own factory and mapping. modules is a preliminary integration of the resources, and we still have an unimplemented require. Going back to main.js’s factory above, when we execute the factory we hit:

var _application = _interopRequireDefault(require("./application.js"));

At this point, the ./application.js asset has already been created (barring special cases such as code splitting); we just need a global scope to query — and that is modules.

For example, ./application.js corresponds to module ID 1, and module 1 has its own factory; once executed, it follows the general export logic.

However, we need a starting point, and that must be the entry file — module 0. So we have the following implementation:

(function(modules) {
    function load(id) {
      //2
      const [factory, mapping] = modules[id];
      //3
      function require(relativePath) {
        return load(mapping[relativePath]);
      }
      //4
      const module = {
        exports: {}
      }
      //5
      const result = factory(require, module, module.exports);
      if (module.exports && Object.getOwnPropertyNames(module.exports).length === 0) {
        return result;
      }
      //6
      return module.exports;
    }
    //1
    return function() {
      return load(0);
    }
  })({${modules}})

Step 1: Start with the root module.

Step 2: Retrieve the factory and mapping for the module.

Step 3: Define a custom require that recursively calls load, closing over the mapping from step 2.

Step 4: Create a custom module object whose exports the factory can mutate.

Step 5: Actually execute the current module’s factory.

Step 6: Return the module.exports object from step 4.

Considering the amount of packaged code, we used a small project as a test:

main.js

const isArray = require('./util');

const arr = [1, 2, 3];
console.log(`[1, 2, 3] is array: ${isArray(arr)}`);

./util

const isArray = arr => arr instanceof Array;

module.exports = isArray;

After bundling (slightly reduced):

(function (modules) {
  function load(id) {
    const [factory, mapping] = modules[id];
    function require(relativePath) {
      return load(mapping[relativePath]);
    }
    const module = {
      exports: {},
    };
    const result = factory(require, module, module.exports);
    if (
      module.exports &&
      Object.getOwnPropertyNames(module.exports).length === 0
    ) {
      return result;
    }
    return module.exports;
  }
  return function () {
    return load(0);
  };
})({
  0: [
    function (require, module, exports) {
      const isArray = require("./util");

      const arr = [1, 2, 3];
      console.log(`arr is array: ${isArray(arr)}`);
    },
    { "./util": 1 },
  ],

  1: [
    function (require, module, exports) {
      const isArray = (arr) => arr instanceof Array;

      module.exports = isArray;
    },
    {},
  ],
})();


Execute the result in Node.js or a browser:

[1, 2, 3] is array: true

At this point, we can bundle a small pure-JS project, though it still has plenty of problems (@_@).

6 Optimization

A good bundler has to consider many details and performance concerns. I can’t match webpack, but based on the implementation above, there are still some worthwhile optimizations.

  • 1. Caching

When we create assets (the createAsset method), different files may import the same resource, so there is no need to create the same asset multiple times — after all, creation is time-consuming. A cache object makes this much easier:

console.log(`Start extracting: ${revisedPath}`);

if (cache[revisedPath]) {
  asset.mapping[relativePath] = cache[revisedPath].id;
} else {
  const depAsset = createAsset(revisedPath);
  cache[revisedPath] = depAsset;
  asset.mapping[relativePath] = depAsset.id;
}
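The effect of the cache is plain memoization. A stripped-down sketch, with createAsset stubbed out to count how often real extraction would run:

```javascript
// Stubbed createAsset: only counts how often real extraction would happen.
let calls = 0;
function createAsset(filename) {
  calls++;
  return { id: calls - 1, filename };
}

const cache = {};
function getAsset(revisedPath) {
  if (!cache[revisedPath]) {
    cache[revisedPath] = createAsset(revisedPath);
  }
  return cache[revisedPath];
}

getAsset('/proj/a.js');
getAsset('/proj/a.js'); // cache hit: no second extraction
console.log(calls); // 1
```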
  • 2. UMD

The bundle structure we currently package output is as follows:

(function(modules) {
  function load(id) {
    ...
    return module.exports; //1
  }
})(modules)

Obviously the return at 1 is almost meaningless, because there is no direct way to capture the returned result — it’s like executing var result = do() in your browser’s console. This output style is called var wrapping.

In most cases, such as React projects, it’s fine not to care, because the call relationships between chunks are already organized by webpack and you never handle the final bundle’s return value yourself.

But sometimes, when we want compatibility with other environments or actually need the bundled result (code splitting, micro-frontends, etc.), we need a more complete compatibility scheme — UMD.

We just need a simple encapsulation layer:

(function (root, factory) {
  //commonjs2
  if (typeof module === "object" && typeof exports === "object")
    module.exports = factory();
  //commonjs1
  else if (typeof exports === "object")
    exports["dummy"] = factory();
  else
    root["umd-test"] = factory();
})(window, (function(modules) {
  function load(id) {
    ...
    return module.exports; //1
  }
})(modules));

Ps: I’m ignoring AMD here.
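The branch logic above can be exercised with fake environments — a runnable sketch with hypothetical names (umdWrap, the dummy factory):

```javascript
// A testable rewrite of the UMD decision logic, with the environment objects
// passed in explicitly instead of read from globals.
function umdWrap(root, moduleObj, exportsObj, factory) {
  if (typeof moduleObj === "object" && typeof exportsObj === "object") {
    moduleObj.exports = factory();   // commonjs2
  } else if (typeof exportsObj === "object") {
    exportsObj["dummy"] = factory(); // commonjs1
  } else {
    root["umd-test"] = factory();    // plain global (browser)
  }
}

const factory = () => ({ hello: 'world' });

// Simulated Node.js/CommonJS2 environment:
const cjs2 = { module: { exports: {} }, exports: {} };
umdWrap(null, cjs2.module, cjs2.exports, factory);
console.log(cjs2.module.exports); // { hello: 'world' }

// Simulated bare browser environment (no module/exports):
const browser = {};
umdWrap(browser, undefined, undefined, factory);
console.log(browser['umd-test']); // { hello: 'world' }
```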

  • 3. Configuration file

It is necessary to extract the important options into configuration. Following webpack’s lead, I extracted a pack.config.js file. The basic configuration is as follows:

module.exports = {
  entry: './main.js',
  output: {
    filename: 'chunck.js',
    library: 'umd-test',
    libraryTarget: 'umd'  
  },
  presets: [
    '@babel/preset-env'
  ]
}
  • 4. Code splitting

I’ve mentioned dynamicImport several times, and here I take a somewhat lazy approach to code splitting.

First of all, code splitting requires the library name to be configured (no explanation needed), and while walking the AST, I keep track of which resources need to be loaded asynchronously:

const dynamicDeps = {};

traverse(ast, {
  ...
  if (name === 'dynamicImport') {
    const revisedPath = buildPath(relativePath, path.dirname(filename), config);
    dynamicDeps[revisedPath] = '';
  }
  ...
});

A key in dynamicDeps marks the resource at that path as an asynchronous module. Since I need to modify it in the final stage, I leave its value unset for now.

const {
  code
} = babel.transformFromAstSync(
  ast,
  null, {
    plugins: [
      dynamicImportPlugin
    ],
    presets,
  }
);


This way, during the bundling phase, asynchronous modules can be output separately from the normal modules:

function bundle(assets) {
  ...
  if (dynamicDeps.hasOwnProperty(revisedPath)) {
    //code split here:
    //1. assume that the dynamic module doesn't have a mapping
    //2. and it is not allowed to import the same module elsewhere
    asyncModules.push({
      prefix: `${id}.`,
      content: buildDynamicFactory(id, assets[revisedPath].code)
    });
    return result;
  }

  return [
    ...normalModules,
    ...asyncModules,
  ]
}

Copy the code

Here I also use buildDynamicFactory, which implements JSONP-style registration:

buildDynamicFactory: function (id, code) {
    return `(self || this)['jsonpArray']['${id}'] = function(require, module, exports) {
      ${code}  
    }`
  }

Whenever an asynchronous module is loaded, its factory is registered on the global jsonpArray object, with the module ID as key and the module factory as value.

But before that, I need to know how to find the corresponding asynchronous module. Recall the load function above; it needs to be modified as follows:


function load(id) {
  if (!modules[id]) {
    return (function (id) {
      window["jsonpArray"] = window["jsonpArray"] || {};
      const script = document.createElement("script");
      script.src = `/dist/${id}.chunck.js`;
      document.body.appendChild(script);
      return new Promise(function (res, rej) {
        script.onload = function () {
          const factory = window["jsonpArray"][id];
          const module = {
            exports: {},
          };
          factory(null, module, module.exports);
          res(module.exports);
        };
      });
    })(id);
  }
  
  ...

  return module.exports;
}


When a module ID can’t be found in modules, it is an asynchronous module. I wrap the lookup in a promise; once the request for the asynchronous chunk completes, the chunk registers itself, and I simply resolve with the result of executing its factory.

For example, if the source code is:

dynamicImport('./api')
    .then(res => console.log(`dynamic module response: ${JSON.stringify(res.default())}`));

Would be translated as:

require('./api')
    .then(res => console.log(`dynamic module response: ${JSON.stringify(res.default())}`));

Note: at this point require returns a promise (see the modifications to the load function above).

You may be used to writing import('xxx').then and find this weird, but you can shape your own syntax however feels better.

  • 5. Other
  • Implement more complex path resolution

Here I implemented only simple resolution logic (finding index.js and completing file suffixes). A better solution would support custom node_modules locations, custom aliases, and so on.

  • Support JSON imports

Plain-text file types (.yml, .txt, .md, .json, ...) could be supported later.

  • Support CSS/SCSS

Here I only handle JS files by default; CSS, SCSS, and even TS could be supported as well.

7 Release

To make this little wheel more engineering-ready, I also published a simple CLI:


npm i -g @dummmy/pack-cli

cd your-project

touch pack.config.js

pack


8 Source code

Please click here for the source code