Webpack is now a mainstream and powerful modular packaging tool, but if you do not pay attention to performance when using it, problems are likely to appear. These problems fall mainly into three areas: slow builds during development, repetitive work while developing and debugging, and low-quality output files. Performance optimization is therefore analyzed from these angles. This article is my own summary of the book “Webpack in Easy Form”; it covers most of the optimization methods and can serve as a reference and checklist when tuning Webpack performance. It is based on Webpack 3.4, assumes you are familiar with basic Webpack usage, and should take about 30 minutes to read.

by MaryTien from supermaryy.com

1. Optimize build speed

After starting, Webpack recursively resolves dependencies based on the Entry configuration. This process consists of two parts: searching for files, and analyzing and converting the files that match. Configuration can therefore be optimized from these two angles.

1.1 Narrow the Search scope

Search process optimization methods include:

  1. The resolve field tells Webpack how to search for files, so pay attention to its configuration first:

    1. Set resolve.modules: [path.resolve(__dirname, 'node_modules')] to avoid the layer-by-layer upward lookup.

      modules tells Webpack which directories to search for third-party modules. The default value is ['node_modules'], which searches ./node_modules, then ../node_modules, then ../../node_modules, and so on upward.

    2. Set resolve.mainFields: ['main'], with as few values as possible, to reduce the steps needed to find a module's entry file

      mainFields defines which package.json field is used to locate a third-party module's entry file. Since most third-party modules use the main field to describe the entry file's location, setting the single value 'main' reduces searching

    3. Set resolve.alias for large third-party modules so that Webpack uses the library's min file directly, avoiding parsing inside the library

      For example, react:

      resolve: {
          alias: {
              'react': path.resolve(__dirname, './node_modules/react/dist/react.min.js')
          }
      }

      This interferes with Tree Shaking, so it is appropriate for fairly monolithic libraries; avoid it for libraries built from scattered modules, such as Lodash, where Tree Shaking should be kept.

    4. Configure resolve.extensions properly to reduce file searches

      The default is extensions: ['.js', '.json']. When an import statement omits the file suffix, Webpack tries the suffixes in the extensions list in order, so:

      • Keep list values to a minimum
      • The suffix for the file type with high frequency is written first
      • Import statements in the source code should include the file suffix whenever possible, e.g. write require('./data') as require('./data.json')
  2. The module.noParse field tells Webpack which files not to parse; it can be used to exclude non-modular library files from parsing

    If you aliased react.min.js with resolve.alias, you should also exclude it from parsing, because react.min.js is a pre-built, non-modular file that runs directly in the browser. The noParse value can be a RegExp, [RegExp], or function.

    module: { noParse: [/jquery|chartjs/, /react\.min\.js$/] }

  3. When you configure the Loader, use test, exclude, and include to narrow the search scope

1.2 Use DllPlugin to reduce the compilation times of basic modules

DllPlugin is a dynamic-link-library plugin. Its principle is to extract the basic modules that pages depend on and package them into DLL files; when a module to be imported already exists in a DLL, the module is fetched from the DLL instead of being packaged again. **Why does this speed up the build?** Because DLLs mostly contain commonly used third-party modules such as react and react-dom, as long as their versions are not upgraded they need to be compiled only once. I think this has a similar effect to configuring resolve.alias and module.noParse.

Usage:

  1. Use DllPlugin to configure a webpack_dll.config.js to build the DLL file:

    // webpack_dll.config.js
    const path = require('path');
    const DllPlugin = require('webpack/lib/DllPlugin');
    module.exports = {
        entry: {
            react: ['react', 'react-dom'],
            polyfill: ['core-js/fn/promise', 'whatwg-fetch']
        },
        output: {
            filename: '[name].dll.js',
            path: path.resolve(__dirname, 'dist'),
            library: '_dll_[name]', // the name of the DLL's global variable
        },
        plugins: [
            new DllPlugin({
                name: '_dll_[name]', // the name of the DLL's global variable, must match output.library
                path: path.join(__dirname, 'dist', '[name].manifest.json'), // path of the generated manifest file
            })
        ]
    }

    Note that the name value of the DllPlugin argument must be the same as the output.library value, and the output manifest file will reference the output.library value.

    The resulting file is:

    ├── polyfill.dll.js
    ├── polyfill.manifest.json
    ├── react.dll.js
    └── react.manifest.json

    Each xx.dll.js file contains n packaged modules, stored in an array with the array index as module ID and exposed globally through a variable (say _dll_xx), so the modules can be accessed via window._dll_xx. The xx.manifest.json file describes which modules the DLL file contains, along with each module's path and ID. The project's main config file then uses the DllReferencePlugin to reference the xx.manifest.json files.

  2. Use the DllReferencePlugin in the main config file to introduce the xx.manifest.json file:

    // webpack.config.js
    const path = require('path');
    const DllReferencePlugin = require('webpack/lib/DllReferencePlugin');
    module.exports = {
        entry: { main: './main.js' },
        // ... output and loader configuration omitted
        plugins: [
            new DllReferencePlugin({
                manifest: require('./dist/react.manifest.json')
            }),
            new DllReferencePlugin({
                manifest: require('./dist/polyfill.manifest.json')
            })
        ]
    }

    The final build generates main.js

1.3 Use HappyPack to enable multi-process Loader conversion

Webpack runs on Node.js and has a single-threaded model, meaning it can only process one file at a time rather than several in parallel. HappyPack splits the work across multiple child processes and sends their results back to the main process. Since JS is single-threaded, this multi-process approach is the way to gain parallelism.

HappyPack is used as follows:

npm i -D happypack

// webpack.config.js
const path = require('path');
const HappyPack = require('happypack');

module.exports = {
    // ...
    module: {
        rules: [{
            test: /\.js$/,
            use: ['happypack/loader?id=babel'],
            exclude: path.resolve(__dirname, 'node_modules')
        }, {
            test: /\.css$/,
            use: ['happypack/loader?id=css']
        }]
    },
    plugins: [
        new HappyPack({
            id: 'babel',
            loaders: ['babel-loader?cacheDirectory']
        }),
        new HappyPack({
            id: 'css',
            loaders: ['css-loader']
        })
    ]
}

Besides id and loaders, HappyPack supports three more parameters: threads, verbose, and threadPool. threadPool is a shared process pool: multiple HappyPack instances can use the same pool to process tasks, avoiding excessive resource usage.

1.4 Enable multi-process JS file compression using ParallelUglifyPlugin

When the UglifyJS plugin compresses JS code, it first parses the code into an AST (abstract syntax tree) represented as objects, then applies various rules to analyze and transform the AST, so the process is time-consuming and CPU-intensive. ParallelUglifyPlugin starts multiple child processes, each of which compresses code with UglifyJS; they run in parallel, significantly reducing compression time.

It is also easy to use: replace the original UglifyJS plugin with this plugin, like so:

npm i -D webpack-parallel-uglify-plugin

// webpack.config.js
const ParallelUglifyPlugin = require('webpack-parallel-uglify-plugin');
// ...
plugins: [
    new ParallelUglifyPlugin({
        uglifyJS: {
            // ... the options normally passed to UglifyJS go here
        },
        // ... other ParallelUglifyPlugin options; set cacheDir to enable caching and speed up repeat builds
    })
]

2. Optimize development experience

After modifying the source code during development, you need to automatically build and refresh the browser to see the effects. This process can be automated using Webpack, which listens for file changes, and DevServer, which refreshes the browser.

2.1 Using automatic Refresh

2.1.1 Webpack file watching

Webpack can enable watching in two ways: 1. start Webpack with --watch; 2. set watch: true in the configuration file. The following watchOptions are also available and can be set appropriately to optimize the watching experience:

module.exports = {
    watch: true,
    watchOptions: {
        ignored: /node_modules/,
        aggregateTimeout: 300, // how long to wait after a file change before rebuilding; the bigger the better
        poll: 1000, // the number of times per second to poll files for changes; the smaller the better
    }
}

ignored: sets directories not to watch. Excluding node_modules can significantly reduce Webpack's memory consumption.

aggregateTimeout: how long to wait after a file changes before starting a build. The larger, the better.

poll: polls the file system to detect changes, expressed as the number of queries per second. The smaller, the better.

2.1.2 DevServer Refreshing the Browser

There are two ways for DevServer to refresh the browser:

  1. Inject proxy client code into the web page and initiate a refresh through the client
  2. Load an iframe to a web page and refresh the iframe

By default, and with devServer: {inline: true}, the page is refreshed the first way. In this mode DevServer does not know which chunks a page depends on, so it injects client code into every chunk, which slows the build when many chunks are output. Since each page only needs one client, turning off inline mode reduces build time, and the more chunks there are, the bigger the improvement. To turn it off:

  1. Start with webpack-dev-server --inline false
  2. Set devServer: {inline: false} in the configuration

With inline mode off, the entry URL becomes http://localhost:8080/webpack-dev-server/

In addition, the devServer.compress parameter specifies whether to use gzip compression; the default value is false.

2.2 Enable hot module replacement (HMR)

Hot module replacement does not reload the whole page; it recompiles only the changed module and swaps the new module in for the old one, so the preview responds faster and there is less waiting, and since the page does not refresh, its current running state is preserved. The principle is to inject a proxy client into each chunk that connects DevServer and the page. Ways to enable it:

  1. Start with webpack-dev-server --hot
  2. Use the HotModuleReplacementPlugin, which is more involved

With HMR enabled, modifying a submodule triggers a partial refresh, but modifying the root JS file refreshes the whole page. The reason: when a submodule updates, an event bubbles up layer by layer until a file at some level receives the changed module and runs a callback. If the event bubbles all the way out without any file receiving it, the whole page is refreshed.

Use the NamedModulesPlugin to make the console print the name of the replaced module instead of its numeric ID, and, as with Webpack watching, ignore files in node_modules to improve performance.
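A sketch of enabling HMR through the configuration file rather than the --hot flag (assumes Webpack 3, where both plugins live under webpack/lib):

```javascript
// webpack.config.js (sketch): enabling HMR via plugins instead of --hot
const HotModuleReplacementPlugin = require('webpack/lib/HotModuleReplacementPlugin');
const NamedModulesPlugin = require('webpack/lib/NamedModulesPlugin');

module.exports = {
    // ...
    plugins: [
        new HotModuleReplacementPlugin(), // swap changed modules instead of a full reload
        new NamedModulesPlugin() // print module names instead of numeric IDs in the console
    ],
    devServer: {
        hot: true // tell DevServer to use hot replacement
    }
};
```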

3. Optimize output quality – compress file volume

3.1 Separate environments — Reduce the volume of code in the production environment

Code runs in two environments, development and production, and needs to behave differently in each. Many third-party libraries also contain large amounts of if/else code that branches on the environment, and the build must output different code per environment. Distinguishing environments lets you cut the size of the production output. In Webpack, the DefinePlugin is used to define the environment the configuration targets.

const DefinePlugin = require('webpack/lib/DefinePlugin');
// ...
plugins: [
    new DefinePlugin({
        'process.env': {
            NODE_ENV: JSON.stringify('production')
        }
    })
]

Note that JSON.stringify('production') is used because the environment variable's value must be a string containing double quotes; the value after stringify is '"production"'.

You can then use the defined environment in the source code:

if (process.env.NODE_ENV === 'production') {
    console.log('You are in the production environment');
    doSth();
} else {
    console.log('You are in the development environment');
    doSthElse();
}

When process is used in code, Webpack automatically bundles a shim of the process module so the code can run outside Node.js. The module emulates Node.js's process and exists to support statements like process.env.NODE_ENV === 'production'.

3.2 Compress code - JS, ES6, CSS

  1. Compress JS: Webpack's built-in UglifyJS plugin, or ParallelUglifyPlugin

    These plugins can analyze the JS syntax tree and understand the code's meaning, enabling optimizations such as removing dead code, stripping logging statements, and shortening variable names. Common configuration parameters:

    const UglifyJSPlugin = require('webpack/lib/optimize/UglifyJsPlugin');
    // ...
    plugins: [
        new UglifyJSPlugin({
            compress: {
                warnings: false, // delete unused code without warnings
                drop_console: true, // delete all console statements; also compatible with IE
                collapse_vars: true, // inline variables that are defined but used only once
                reduce_vars: true, // extract static values used multiple times into variables
            },
            output: {
                beautify: false, // most compact output, no spaces or line breaks
                comments: false, // delete all comments
            }
        })
    ]

    Starting Webpack with webpack --optimize-minimize injects a UglifyJSPlugin with the default configuration.

  2. Compress ES6: the third-party uglify-webpack-plugin

    As more and more browsers execute ES6 directly, you should run native ES6 whenever possible: there is less code, and ES6 code performs better than the converted ES5. ES6 code still needs to be compressed, and the third-party uglify-webpack-plugin provides the ability to compress it:

    npm i -D uglify-webpack-plugin@beta // the beta version can compress ES6
    // webpack.config.js
    const UglifyESPlugin = require('uglify-webpack-plugin');
    // ...
    plugins: [
        new UglifyESPlugin({
            uglifyOptions: { // one more level of nesting than UglifyJS
                compress: {
                    warnings: false,
                    drop_console: true,
                    collapse_vars: true,
                    reduce_vars: true
                },
                output: {
                    beautify: false,
                    comments: false
                }
            }
        })
    ]

    Also, to keep babel-loader from converting the ES6 code, remove babel-preset-env from .babelrc, since babel-preset-env is what converts ES6 to ES5.

  3. Compress CSS: css-loader?minimize, PurifyCSSPlugin

    cssnano is based on PostCSS. It not only strips whitespace but also understands the code, e.g. converting color:#ff0000 to color:red. css-loader has cssnano built in, so compression is enabled simply by using css-loader?minimize.

    Another way to compress CSS is the PurifyCSSPlugin, used together with extract-text-webpack-plugin. Its main purpose is to remove unused CSS code, similar to JS Tree Shaking.
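A sketch combining the two approaches, assuming the purifycss-webpack and glob packages alongside extract-text-webpack-plugin (paths are illustrative):

```javascript
// webpack.config.js (sketch): cssnano via css-loader?minimize plus PurifyCSSPlugin
const path = require('path');
const glob = require('glob');
const ExtractTextPlugin = require('extract-text-webpack-plugin');
const PurifyCSSPlugin = require('purifycss-webpack');

module.exports = {
    // ...
    module: {
        rules: [{
            test: /\.css$/,
            use: ExtractTextPlugin.extract({
                use: ['css-loader?minimize'] // minimize turns on the built-in cssnano compression
            })
        }]
    },
    plugins: [
        new ExtractTextPlugin('[name]_[contenthash:8].css'),
        new PurifyCSSPlugin({
            paths: glob.sync(path.join(__dirname, 'src/*.html')) // scan templates for selectors actually used
        })
    ]
};
```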

3.3 Using Tree Shaking to remove JS dead code

Tree Shaking removes dead code that is never used. It relies on ES6 import/export module syntax; it first appeared in Rollup and was introduced in Webpack 2.0. It suits projects with scattered utility files such as Lodash or utils.js. It works only when the code uses ES6 module syntax, because that syntax is static: paths in import and export statements must be literal strings and cannot appear inside other code blocks. With ES5-style modularity, e.g. module.exports = {...}, require(x + y), or if (x) { require('./util') }, Webpack cannot determine which code can be dropped.

Enabling Tree Shaking:

  1. Modify .babelrc to preserve the ES6 module statements:

    {
        "presets": [
            ["env", { "modules": false }] // turn off Babel's module conversion to keep ES6 module syntax
        ]
    }
  2. Launching Webpack with --display-used-exports will print hints in the shell about which code was culled

  3. Use the UglifyJSPlugin, or --optimize-minimize at startup

  4. When using third-party libraries, configure resolve.mainFields: ['jsnext:main', 'main'] so that the ES6 module entry is used when resolving third-party code

4. Optimize output quality — speed up network requests

4.1 Using CDN to speed up static resource loading

  1. The principle of CDN acceleration

    A CDN deploys resources around the world so that users can access resources nearby and speed up the access. To access the CDN, you need to upload the static resources of the web page to the CDN service. When accessing these resources, the URL provided by the CDN service is used.

    A CDN caches resources for a long time. If, for example, a user has fetched index.html from the CDN, then even after index.html is replaced the user keeps getting the old version until the cache expires. Industry practice:

    • HTML files: put them on your own server and turn off caching. Do not access CDN
    • Static JS, CSS, images, and other resources: CDN and caching are enabled, and the file name has the Hash value calculated by the content, so as long as the content changes, the Hash will change, the file name will change, and they will be re-downloaded regardless of the cache time.

    In addition, under HTTP/1.x browsers limit concurrent requests to the same domain to about 4 to 8, so putting all static resources on a single CDN service runs into this limit. Spread them across CDN services instead, e.g. JS files under js.cdn.com, CSS files under css.cdn.com, and so on. URLs like //xx.com omit the protocol; the benefit is that the browser chooses HTTP or HTTPS automatically based on the current page's URL scheme.

  2. In summary, the build needs to meet the following requirements:

    • The URL of the static resource import should be the URL pointing to the absolute path of the CDN service
    • The file name of the static resource needs to have a Hash value calculated based on the content
    • Different types of resources are stored in CDN of different domain names
  3. Final configuration:

    const ExtractTextPlugin = require('extract-text-webpack-plugin');
    const {WebPlugin} = require('web-webpack-plugin');
    // ...
    output: {
        filename: '[name]_[chunkhash:8].js',
        path: path.resolve(__dirname, 'dist'),
        publicPath: '//js.cdn.com/id/', // the CDN address where JS files are stored
    },
    module: {
        rules: [{
            test: /\.css/,
            use: ExtractTextPlugin.extract({
                use: ['css-loader?minimize'],
                publicPath: '//img.cdn.com/id/', // the CDN address for images referenced from CSS
            }),
        }, {
            test: /\.png/,
            use: ['file-loader?name=[name]_[hash:8].[ext]'], // add a hash to the output PNG file name
        }]
    },
    plugins: [
        new WebPlugin({
            template: './template.html',
            filename: 'index.html',
            stylePublicPath: '//css.cdn.com/id/', // the CDN address where CSS files are stored
        }),
        new ExtractTextPlugin({
            filename: `[name]_[contenthash:8].css`, // add a hash to the output CSS file name
        })
    ]

4.2 Multi-page applications extract common code across pages to take advantage of caching

  1. The principle

    Large sites usually consist of multiple pages, each an independent single-page application, and the pages inevitably depend on the same style files, technology stacks, and so on. If these common files were not extracted, every page's chunk would contain the common code, duplicating it n times. If the common code is extracted into a single file, then once a user visits one page and loads that file, visiting other pages that depend on it hits the browser cache directly, so the common file is transferred only once.

  2. Application methods

    1. Extract the common code that multiple pages depend on into common.js, which at this point still contains the base libraries' code

      const CommonsChunkPlugin = require('webpack/lib/optimize/CommonsChunkPlugin');
      // ...
      plugins: [
          new CommonsChunkPlugin({
              chunks: ['a', 'b'], // which chunks to extract from
              name: 'common', // the extracted common parts form a new chunk with this name
          })
      ]
    2. Extract the base-library code that common.js depends on into base.js; common.js then has the base code removed, while base.js remains unchanged

      //base.js
      import 'react';
      import 'react-dom';
      import './base.css';

      //webpack.config.js
      entry: {
          base: './base.js'
      },
      plugins: [
          new CommonsChunkPlugin({
              chunks: ['base', 'common'],
              name: 'base',
              //minChunks: 2, // a module must appear in at least this many of the listed chunks to be extracted, in case common.js ends up empty
          })
      ]
    3. We get the base library code base.js, the common code common.js without the base library, and the code file xx.js for each page.

      The page reference order is as follows: base.js–> common.js–> xx.js

4.3 Split code for on-demand loading

  1. The principle

    One problem with single-page applications is that when complex functionality ships as one large file, the first-screen load time becomes long unless something is optimized, hurting the user experience. Loading code on demand solves this. The approach:

    1. Divide web site functions into categories of relevance
    2. Each type is merged into a Chunk and the corresponding Chunk is loaded on demand
    3. For example, putting only the functionality associated with the first screen in the Chunk where the execution entry is, so that a small amount of code is first loaded and the rest of the code is loaded when it is needed. It is best to anticipate what the user is going to do and load the corresponding code in advance so that the user is unaware of the network loading
  2. Practice

    The simplest example: the page initially loads only main.js and shows a button; clicking the button loads the split-out show.js, and once loading succeeds the function in show.js is executed.

    //main.js
    document.getElementById('btn').addEventListener('click', function () {
        import(/* webpackChunkName:"show" */ './show').then((show) => {
            show('Webpack');
        });
    });

    //show.js
    module.exports = function (content) {
        window.alert('Hello ' + content);
    };

    import(/* webpackChunkName:"show" */ './show').then() is the key to on-demand loading. Webpack has built-in support for the import(*) statement: it treats ./show.js as the entry of a new chunk. When the code runs in the browser, show.js is loaded only after the button is clicked. import returns a Promise whose then method receives the loaded module, so the browser must support the Promise API; for browsers that do not, inject a Promise polyfill. /* webpackChunkName:"show" */ names the dynamically generated chunk, which would otherwise default to [id].js. For the configured chunk name to be emitted correctly, Webpack also needs this configuration:

    // ...
    output: {
        filename: '[name].js',
        chunkFilename: '[name].js', // the file name for dynamically generated chunks
    }

    The book also provides a real-world example of asynchronously loading components in the more complex react-router scenario (p. 212).

5. Optimize output quality — Improve the efficiency of your code when it runs

5.1 Pre-evaluate using Prepack

  1. Principle:

    Prepack is a partial evaluator: it computes results at compile time and writes them into the compiled code, instead of computing them when the code runs. It works by executing the source code ahead of time and emitting the execution results directly, improving runtime performance. However, Prepack is not yet mature enough for production use.

  2. Method of use

    const PrepackWebpackPlugin = require('prepack-webpack-plugin').default;
    module.exports = {
        plugins: [
            new PrepackWebpackPlugin()
        ]
    }

5.2 Use Scope Hoisting

  1. The principle

    Scope Hoisting ("scope promotion") is a feature introduced in Webpack 3. It analyzes the dependencies between modules and, where possible, merges scattered modules into a single function without duplicating code, so only modules that are referenced exactly once can be merged. Because it needs to analyze inter-module dependencies, the source must use ES6 modules; otherwise Webpack degrades gracefully and skips Scope Hoisting.

  2. Method of use

    const ModuleConcatenationPlugin = require('webpack/lib/optimize/ModuleConcatenationPlugin');
    // ...
    plugins: [
        new ModuleConcatenationPlugin()
    ],
    resolve: {
        mainFields: ['jsnext:main', 'browser', 'main']
    }

    Starting Webpack with webpack --display-optimization-bailout makes the output log show which file caused the downgrade.

6. Use output analysis tools

Start Webpack with two extra parameters to generate the JSON file that most output analysis tools rely on:

webpack --profile --json > stats.json

--profile records timing information for the build, --json outputs the build result in JSON, and > stats.json pipes that output into the stats.json file.

  1. The official tool, Webpack Analyse

    Open the tool's official site http://webpack.github.io/analyse/ and upload stats.json to get the analysis results.

  2. webpack-bundle-analyzer

    A visual analysis tool, more intuitive than Webpack Analyse. It is also easy to use:

    1. Install it globally: npm i -g webpack-bundle-analyzer
    2. Generate the stats.json file as above
    3. Run webpack-bundle-analyzer in the project root; the browser automatically opens the analysis page.

7. Other Tips

  1. When configuring babel-loader, use: ['babel-loader?cacheDirectory']: cacheDirectory caches Babel's compilation results to speed up recompilation. Also remember to exclude the node_modules folder, since those files are already ES5 and need no Babel conversion.

  2. Configure externals to exclude code that does not need to be bundled because it is already loaded via <script> tags, and noParse to skip files that contain no module statements.

  3. The performance field configures how Webpack checks and reports on the size of the output files.

  4. profile: true captures performance information about the Webpack build, which can be used to analyze what makes a build slow.

  5. cache: true enables caching to improve build speed.

  6. Use url-loader to convert small images to base64 and inline them in JS or CSS, reducing the number of requests.

  7. Compress images with imagemin-webpack-plugin and build sprite sheets with webpack-spritesmith.

  8. Set devtool to cheap-module-eval-source-map in development, since it is the fastest source map to generate and speeds up builds; set devtool to hidden-source-map in production.