Webpack is the most popular front-end module bundler. It packages many loose modules, according to their dependencies and rules, into front-end resources suitable for deployment in production, and it can also split out modules that are loaded on demand so they are fetched asynchronously only when actually needed. If you do not pay attention to tuning when using Webpack, performance problems are very likely to appear. They mainly fall into three areas: slow builds during development, repetitive work while developing and debugging, and large output bundles, so the optimizations below target these aspects.

First, optimize build speed

Webpack starts from the entry configuration and recursively resolves the files it depends on. This process consists of searching for files and then analyzing and converting the matched files, so the configuration can be optimized from these two angles.

1. Narrow your search for files

The optimizations that reduce file searching are configured as follows:

1) Configure resolve to tell Webpack how to search for files

  • Set resolve.modules to [path.resolve(__dirname, 'node_modules')] to avoid searching level by level. resolve.modules tells Webpack which directories to search when resolving modules. The default is ['node_modules'], which looks in ./node_modules, then ../node_modules, then ../../node_modules, and so on up the ancestor paths. If you want to add a directory that is searched before node_modules/, configure it as follows:
modules: [path.resolve(__dirname, "src"), "node_modules"]
  • Set resolve.mainFields to as few values as possible, such as ['main'], to reduce the search and parsing of entry files

    For Node-style applications, Webpack resolves third-party modules starting from the module field by default; resolve.mainFields defaults to:

mainFields: ["module", "main"]

Third-party modules define multiple entry files to suit different usage environments. We can set mainFields to ['main'] so that only the main entry of third-party modules is used, reducing search and parsing. (Most third-party modules use the main field to describe the location of their entry file.)
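A minimal sketch of that setting (it assumes every third-party module you depend on actually defines a main field):

module.exports = {
    resolve: {
        // Only look at the main field when resolving third-party module entries
        mainFields: ['main']
    }
};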

  • Set resolve.alias for large third-party modules so they point directly at the library's minified distribution file, avoiding resolution inside the library
resolve: {
    alias: {
        'react': path.resolve(__dirname, './node_modules/react/dist/react.min.js')
    }
}

This affects Tree Shaking. It works best for libraries that are consumed as a whole; for libraries designed to be imported piecemeal, such as lodash, avoid this approach.

  • Configure resolve.extensions to reduce suffix lookups

The default is resolve.extensions: ['.js', '.json']. When an import statement has no file suffix, Webpack tries each suffix in the extensions list in turn, so keep the list short and include the file suffix in import statements whenever possible, for example rewriting require('./data') as require('./data.json').
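A small sketch of keeping the list minimal:

module.exports = {
    resolve: {
        // Keep the suffix list short, with the most frequently used suffix first
        extensions: ['.js', '.json']
    }
};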

2) module.noParse tells Webpack which files do not need to be parsed; it can be used to skip parsing of library files that contain no modular syntax

For example, if resolve.alias points react at react.min.js, that file should also be excluded from parsing with noParse, since it is already a complete, non-modular build:

module: {
    noParse: [/jquery|lodash/, /react\.min\.js$/]
}

Alternatively, use externals to strip third-party dependencies (such as jQuery, React, ECharts) out of bundle.js entirely.
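A hedged sketch of an externals configuration; it assumes the libraries are loaded separately through <script> tags (for example from a CDN) and exposed as globals:

module.exports = {
    // ...
    externals: {
        // import $ from 'jquery' is resolved to the global variable jQuery at runtime
        jquery: 'jQuery',
        react: 'React',
        echarts: 'echarts'
    }
};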

3) When configuring the Loader, use test, exclude, and include to narrow the search scope

module: {
    rules: [{
        test: /\.js$/,
        loader: 'babel-loader',
        include: [
            path.resolve(__dirname, "app/src"),
            path.resolve(__dirname, "app/test")
        ],
        exclude: /node_modules/
    }]
}

2. Use DllPlugin to avoid recompiling basic modules

With the DllPlugin, a large number of reusable modules are compiled only once. The principle is to extract the basic modules that the page depends on and package them into DLL files; when an imported module already exists in a DLL, it is no longer packaged but taken directly from the DLL. In my view, linking third-party modules with DllPlugin has a similar effect to configuring resolve.alias and module.noParse.

Usage:

1) Use DllPlugin to configure webpack_dll.config.js to build DLL files:

const path = require('path');
const DllPlugin = require('webpack/lib/DllPlugin');

module.exports = {
    entry: {
        // Put the React-related modules into one dynamic link library
        react: ['react', 'react-dom'],
        // Put the polyfill-related modules into another
        polyfill: ['core-js/fn/promise', 'whatwg-fetch']
    },
    output: {
        // Name of the output dynamic link library file
        filename: '[name].dll.js',
        // Output files are placed in dist
        path: path.resolve(__dirname, 'dist'),
        // Global variable name that stores the dynamic link library, e.g. _dll_react
        library: '_dll_[name]'
    },
    plugins: [
        new DllPlugin({
            // Global variable name of the dynamic link library; must match output.library
            name: '_dll_[name]',
            // Path of the manifest.json file that describes the dynamic link library
            path: path.join(__dirname, 'dist', '[name].manifest.json')
        })
    ]
};

After the build, the dist directory contains the generated files:

dist
|-- polyfill.dll.js
|-- polyfill.manifest.json
|-- react.dll.js
|-- react.manifest.json

2) In the main config file, use DllReferencePlugin to import xx.manifest.json dynamic link library file:

const path = require('path');
const DllReferencePlugin = require('webpack/lib/DllReferencePlugin');

module.exports = {
    entry: {
        main: './main.js'
    },
    // ...
    plugins: [
        new DllReferencePlugin({
            // Describes the contents of the react dynamic link library
            manifest: require('./dist/react.manifest.json'),
        }),
        new DllReferencePlugin({
            // Describes the contents of the polyfill dynamic link library
            manifest: require('./dist/polyfill.manifest.json'),
        })
    ]
};

3. Use ParallelUglifyPlugin to compress JS files with multiple processes

ParallelUglifyPlugin starts multiple subprocesses, each of which compresses code with UglifyJsPlugin in parallel, significantly reducing compression time.

Usage:

1) Install webpack-parallel-uglify-plugin

npm install -D webpack-parallel-uglify-plugin

2) Then configure the code in webpack.config.js as follows:

const ParallelUglifyPlugin = require('webpack-parallel-uglify-plugin');

module.exports = {
    plugins: [
        new ParallelUglifyPlugin({
            // Options passed through to UglifyJS
            uglifyJS: {}
        })
    ]
};

Second, optimize the first screen loading

1. Show a loading animation

html-webpack-plugin can inject a loading animation into the HTML file. Usage:

1) Install html-webpack-plugin:

npm install -D html-webpack-plugin

2) webpack.config.js is configured as follows:

const HtmlWebpackPlugin = require('html-webpack-plugin');
const loading = require('./render-loading');

module.exports = {
    plugins: [
        new HtmlWebpackPlugin({
            template: './src/index.html',
            loading: loading
        })
    ]
};

2. Pre-render

prerender-spa-plugin can dramatically speed up first-screen loading. The idea is that the plugin simulates a browser environment locally at build time, pre-executes the bundled files, and outputs the pre-rendered first-screen HTML. Usage:

1) Install prerender-spa-plugin

npm install -D prerender-spa-plugin

2) webpack.config.js is configured as follows:

const path = require('path');
const PrerenderSPAPlugin = require('prerender-spa-plugin');
const Renderer = PrerenderSPAPlugin.PuppeteerRenderer;

module.exports = {
    plugins: [
        new PrerenderSPAPlugin({
            // Absolute path to the compiled output that will be pre-rendered
            staticDir: path.join(__dirname, '../dist'),
            // Routes to pre-render
            routes: ['/', '/team', '/analyst', '/voter', '/sponsor'],
            renderer: new Renderer({
                headless: false,
                // This is very important: rendering waits for this event (dispatched in main.js below)
                renderAfterDocumentEvent: 'render-active',
                // renderAfterTime: 5000
            })
        })
    ]
};

3) The project entry file main.js dispatches the pre-render event:

/* eslint-disable no-new */
new Vue({
    el: '#app',
    router,
    store,
    i18n,
    components: { App },
    template: '<App/>',
    mounted () {
        // Dispatch the event that tells prerender-spa-plugin the page is ready
        document.dispatchEvent(new Event('render-active'))
    }
})

Optimize output quality – reduce file size

1. Compress code – JS, ES6, CSS

1) Compress JS with the Webpack built-in UglifyJsPlugin or ParallelUglifyPlugin

These plugins analyze the JS syntax tree and understand the meaning of the code, so they can remove dead code, strip logging statements, shorten variable names, and apply other optimizations. Using UglifyJsPlugin, configure webpack.config.js as follows:

const UglifyJSPlugin = require('webpack/lib/optimize/UglifyJsPlugin');
// ...
plugins: [
    new UglifyJSPlugin({
        compress: {
            warnings: false,        // do not output warnings when removing unreachable code
            drop_console: true,     // delete all console statements, also compatible with IE
            collapse_vars: true,    // inline variables that are defined but used only once
            reduce_vars: true       // extract static values used many times into variables
        },
        output: {
            beautify: false,        // most compact output, no spaces or tabs
            comments: false         // remove all comments
        }
    })
]

2) Compress ES6 with the third-party uglify-webpack-plugin

More and more browsers now support running ES6 code directly, and ES6 code is smaller and performs better than the transpiled ES5. Code shipped as ES6 still needs to be compressed; the third-party uglify-webpack-plugin provides ES6 compression. Usage:

A. Install uglify-webpack-plugin:

npm install -D uglify-webpack-plugin

B. webpack.config.js is configured as follows:

const UglifyESPlugin = require('uglify-webpack-plugin');
// ...
plugins: [
    new UglifyESPlugin({
        // One more level of nesting than UglifyJsPlugin
        uglifyOptions: {
            compress: {
                warnings: false,
                drop_console: true,
                collapse_vars: true,
                reduce_vars: true
            },
            output: {
                beautify: false,
                comments: false
            }
        }
    })
]

Also, to keep babel-loader from transpiling the ES6 code, remove babel-preset-env from .babelrc, since babel-preset-env is what converts ES6 to ES5.

3) Compress the CSS

Extract the CSS embedded in JS into separate files, merge those into one file, then compress and deduplicate it.

A. Install mini-css-extract-plugin and optimize-css-assets-webpack-plugin, then require them:

const MiniCssExtractPlugin = require('mini-css-extract-plugin')
const OptimizeCSSAssetsPlugin = require('optimize-css-assets-webpack-plugin')

B. Configure the Loader

module: {
    rules: [
        {
            test: /\.css$/,
            use: [
                MiniCssExtractPlugin.loader,
                'css-loader',
                {
                    loader: 'postcss-loader',
                    options: {
                        plugins: [
                            require('postcss-import')(),
                            require('autoprefixer')({
                                browsers: ['last 30 versions', "> 2%", "Firefox >= 10", "ie 6-11"]
                            })
                        ]
                    }
                }
            ]
        }
    ]
}

C. Merge multiple CSS files into a single CSS file

This mainly targets multi-entry builds, which produce multiple style files; merging them into a single style file reduces the number of loads. Configure as follows:

  • Configure splitChunks
optimization:{
    splitChunks: {
        chunks: 'all',
        minSize: 30000,
        minChunks: 1,
        maxAsyncRequests: 5,
        maxInitialRequests: 3,
        name: true,
        cacheGroups: {
            styles: {
                name: 'style',
                test: /\.css$/,
                chunks: 'all',
                enforce: true
            }
        }
    }
}
  • Configure the plugin
  1. filename follows the same naming rule as filename in output
  2. Since multiple CSS files are merged into one here, chunkFilename does not need to be handled
  3. The resulting style file name looks something like style.550f4.css; 'style' comes from splitChunks -> cacheGroups -> name
new MiniCssExtractPlugin({
    filename: 'assets/css/[name].[hash:5].css'
})

D. Optimize the CSS files: deduplicate and compress

  1. This mainly uses optimize-css-assets-webpack-plugin together with the cssnano optimizer
  2. For the specific optimizations cssnano performs, refer to its official site

The following two configuration methods have the same effect.

Method 1:

module.exports = {
    optimization: {
        minimizer: [
            new OptimizeCSSAssetsPlugin({
                assetNameRegExp: /\.css$/g,
                cssProcessor: require('cssnano'),
                // cssProcessorOptions: cssnanoOptions,
                cssProcessorPluginOptions: {
                    preset: ['default', {
                        // Remove all comments
                        discardComments: { removeAll: true },
                        normalizeUnicode: false
                    }]
                },
                // Whether to print logs during processing
                canPrint: true
            })
        ]
    }
}

Method 2:

module.exports = {
    plugins:[
        new OptimizeCSSAssetsPlugin({
            assetNameRegExp: /\.css$/g,
            cssProcessor: require('cssnano'),
            // cssProcessorOptions: cssnanoOptions,
            cssProcessorPluginOptions: {
            	preset: ['default', {
                    discardComments: {
                        removeAll: true,
                    },
                    normalizeUnicode: false
            	}]
            },
            canPrint: true
        })
    ]
}

2. Enable Tree Shaking to remove dead code

Tree Shaking can weed out the dead code you do not need. It relies on ES6's static import and export syntax; it first appeared in Rollup and was introduced in Webpack 2.0. It is well suited to lodash, utils.js, and other scattered utility files. It only works if the code follows ES6 module syntax, which is static: paths in import and export statements must be static strings and cannot be placed inside other code blocks. If ES5-style modules are used, for example module.exports = {...}, require(x + y), or if (x) { require('./util') }, Webpack cannot figure out which code can be removed.
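For illustration, with static ES6 imports Webpack can see exactly which exports are used (a minimal sketch; util.js and main.js are hypothetical files):

// util.js
export function funcA() { return 'a'; }   // kept: imported below
export function funcB() { return 'b'; }   // never imported, so Tree Shaking removes it

// main.js
import { funcA } from './util';
console.log(funcA());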

How to use Tree Shaking:

1) Modify .babelrc to preserve ES6 module statements:

{
    "presets": [
        ["env", {
            // Disable Babel's module conversion and preserve ES6 module syntax
            "modules": false
        }]
    ]
}

2) Starting Webpack with --display-used-exports prints hints in the shell about which exports are unused and can be removed

3) Use UglifyJsPlugin to strip out the unused code, or start Webpack with --optimize-minimize

4) Set resolve.mainFields: ['jsnext:main', 'main'] so that the ES6-module entry of third-party libraries is used when resolving them

Optimize output quality – improve running speed

1. Use Prepack to evaluate ahead of time

prepack-webpack-plugin evaluates what it can ahead of time and puts the results directly into the output, so they do not have to be computed at runtime, which speeds up execution. Prepack is essentially a partial evaluator for JavaScript.
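Roughly, Prepack turns computations like the following into their results at build time (an illustrative sketch, not the plugin's literal output):

// Source code
const hours = 24;
const secondsPerDay = hours * 60 * 60;
console.log(secondsPerDay);

// After pre-evaluation the bundle effectively just contains:
console.log(86400);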

1) Install prepack-webpack-plugin:

npm install -D prepack-webpack-plugin

2) webpack.config.js is configured as follows:

const PrepackWebpackPlugin = require('prepack-webpack-plugin').default;
const configuration = {};

module.exports = {
    // ...
    plugins: [
        new PrepackWebpackPlugin(configuration)
    ]
};

2. Use Scope Hoisting

Scope Hoisting is a built-in feature of Webpack 3.x. It analyzes the dependencies between modules and merges scattered modules into a single function wherever possible, but without introducing duplicated code, so only modules that are referenced exactly once can be merged. Because module dependencies need to be analyzed, the project code should use ES6 modules; otherwise Webpack degrades and does not apply Scope Hoisting. The configuration for Scope Hoisting is as follows:

const ModuleConcatenationPlugin = require('webpack/lib/optimize/ModuleConcatenationPlugin');
module.exports = {
    // ...
    plugins: [
        new ModuleConcatenationPlugin()
    ]
}

Optimize output quality – speed up network requests

1. Use CDN to accelerate the loading of static resources

1) Principle of CDN acceleration

By deploying resources around the world, a CDN lets users fetch resources from nearby servers, which speeds up access. To use a CDN, upload the page's static resources to the CDN service and reference them through the URLs the CDN service provides.

A CDN caches resources for a long time. For example, if a user obtains index.html from the CDN, the user will keep using the old version until the cache expires, even if index.html has since been replaced on the CDN. Industry practice is therefore:

  • HTML files: store them on your own server with caching disabled, and do not put them on the CDN
  • Static JS, CSS, images and other resources: put them on the CDN with caching enabled, and give each file name a hash computed from its content. When the content changes, the hash and therefore the file name change, and the file is re-downloaded regardless of how long the cache lifetime is.

As a concrete example, the build output of a single-page application has the following structure:

dist
|-- app_9d89c964.js
|-- app_a6976b6d.css
|-- arch_ae805d49.png
|-- index.html

Additionally, under the HTTP/1.x protocol, browsers limit concurrent requests to the same domain to roughly 4 to 8, so putting all static resources on a CDN service under a single domain is limiting. Static resources can instead be spread across different CDN services: for example, JS files under the domain js.cdn.com, CSS files under css.cdn.com, and image files under img.cdn.com. Using multiple domains brings a new problem: extra DNS resolution time. Whether to spread resources across multiple domains has to be weighed against your own needs; you can reduce the resolution delay by adding a dns-prefetch hint such as <link rel="dns-prefetch" href="//js.cdn.com"> to the HTML head.

2) Configuring CDN access in Webpack

The main configuration for Webpack access to CDN is as follows:

const path = require('path');
const ExtractTextPlugin = require('extract-text-webpack-plugin');
const {WebPlugin} = require('web-webpack-plugin');

module.exports = {
    // entry configuration omitted...
    output: {
        // Add a hash to the output JavaScript file names
        filename: '[name]_[chunkhash:8].js',
        path: path.resolve(__dirname, './dist'),
        // Specify the CDN directory URL that hosts the JavaScript files
        publicPath: '//js.cdn.com/id/',
    },
    module: {
        rules: [
            {
                // Add support for CSS files
                test: /\.css$/,
                // Extract chunks of CSS code into a separate file
                use: ExtractTextPlugin.extract({
                    // Compress the CSS code
                    use: ['css-loader?minimize'],
                    // Specify the CDN directory URL that hosts resources imported from CSS (such as images)
                    publicPath: '//img.cdn.com/id/'
                }),
            },
            {
                // Add support for PNG files
                test: /\.png$/,
                // Add a hash to the output PNG file names
                use: ['file-loader?name=[name]_[hash:8].[ext]'],
            },
            // other loader configurations omitted...
        ]
    },
    plugins: [
        // Use WebPlugin to generate the HTML automatically
        new WebPlugin({
            // Path of the HTML template file
            template: './template.html',
            // Name of the output HTML file
            filename: 'index.html',
            // Specify the CDN directory URL that hosts the CSS files
            stylePublicPath: '//css.cdn.com/id/',
        }),
        new ExtractTextPlugin({
            // Add a content hash to the output CSS file name
            filename: `[name]_[contenthash:8].css`,
        }),
        // code compression plugin configuration omitted...
    ]
};

2. Extract common code shared between pages for browser caching

1) Principle

Large websites usually consist of multiple pages, and those pages inevitably depend on the same style files, script files, and so on. If these common files were not extracted, every single page's chunk would contain the common code, which is equivalent to transferring n copies of duplicate code. If the common code is extracted into one file, then once a user visits a page and loads that public file, other pages that depend on it will use the file straight from the browser's cache without requesting it again.

2) Method of use

A. Extract the common code that multiple pages depend on into common.js:

const CommonsChunkPlugin = require('webpack/lib/optimize/CommonsChunkPlugin');

module.exports = {
    plugins: [
        new CommonsChunkPlugin({
            // Which chunks to extract common code from
            chunks: ['a', 'b'],
            // The extracted common part forms a new chunk with this name
            name: 'common',
        })
    ]
}

B. Find the base libraries the site depends on and write a base.js file that imports them; then extract the code shared by base and common.js into base. Afterwards common.js no longer contains the base-library code, while base.js stays unchanged.

// base.js
import 'react';
import 'react-dom';
import './base.css';

// webpack.config.js
module.exports = {
    entry: {
        base: './base.js'
    },
    plugins: [
        new CommonsChunkPlugin({
            chunks: ['base', 'common'],
            name: 'base',
            // minChunks means a module must appear in at least that many of the specified
            // chunks to be extracted; set it in case common.js contains no code
            //minChunks: 2,
        })
    ]
}

C. The result is the base-library code in base.js, the common code without the base libraries in common.js, and each page's own code in xx.js.

Each page then references them in the following order: base.js -> common.js -> xx.js

3. Limit the number of chunks to reduce HTTP request overhead

1) Principle

After Webpack compiles, you may notice some very small chunks, which incur a lot of HTTP request overhead. Fortunately, LimitChunkCountPlugin can merge chunks to reduce the number of HTTP requests.

2) Method of use

const LimitChunkCountPlugin = require('webpack/lib/optimize/LimitChunkCountPlugin');

module.exports = {
    // ...
    plugins: [
        new LimitChunkCountPlugin({
            // Maximum number of chunks; must be greater than or equal to 1
            maxChunks: 10,
            // Minimum chunk size
            minChunkSize: 2000
        })
    ]
}

Optimize the development experience

1. Use automatic refresh

1) Webpack listens to files

There are two ways to enable file watching:

Method 1:

Set watch: true in the configuration file webpack.config.js.

module.exports = {
    // watchOptions only make sense when watch mode is enabled
    // watch defaults to false, i.e. watching is not enabled
    watch: true,
    // Parameters for watch mode; only meaningful when watching is enabled
    watchOptions: {
        // Files or folders not to watch; regular expressions are supported; empty by default.
        // Here, files under the node_modules directory are not watched
        ignored: /node_modules/,
        // Webpack detects a change by regularly reading the file's last edit time and saving it;
        // if the current value differs from the saved one, the file is considered changed.
        // poll specifies how many milliseconds pass between checks.
        // When a file changes, Webpack does not tell the listener immediately; it caches the change,
        // collects further changes for a period of time, and then notifies once
        aggregateTimeout: 300,
    }
}

Method 2:

When running the command that starts Webpack, add the --watch argument; the full command is webpack --watch.

2) Control the automatic refresh of the browser

Method 1: webpack-dev-server

When Webpack is started through the webpack-dev-server module, Webpack's watch mode is turned on by default, and Webpack notifies webpack-dev-server whenever a file changes.
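A minimal devServer sketch under assumed values (the port and served directory are placeholders); it is started with the webpack-dev-server command:

module.exports = {
    // ...
    devServer: {
        // Directory whose files are served; assumed to be the build output
        contentBase: './dist',
        // Assumed port
        port: 8080,
        // Inject the client that listens for change notifications and reloads the page
        inline: true
    }
};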

Method 2: Koa + webpack-dev-middleware + webpack-hot-middleware (isomorphic setup)

Segmentfault.com/a/119000000…

2. Enable Hot Module Replacement (HMR)

HMR (Hot Module Replacement) replaces, adds, or removes modules while the application is running, without reloading the entire page, so previews are faster and waiting time is shorter. The principle is to inject a proxy client into each chunk that connects DevServer and the page. Ways to enable it:

1) Start webpack-dev-server with the --hot option

2) Use the HotModuleReplacementPlugin plugin:

const HotModuleReplacementPlugin = require('webpack/lib/HotModuleReplacementPlugin');

module.exports = {
    plugins: [
        new HotModuleReplacementPlugin()
    ]
}

After HMR is enabled, modifying a submodule triggers a partial refresh, but modifying the root JS file refreshes the whole page. The reason is that when a submodule is updated, the event is passed up one layer at a time until a file at some layer accepts the currently changed module and runs a callback. If it is passed all the way to the outermost layer and no file accepts it, the entire page is refreshed.
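A minimal sketch of accepting an update in an entry file so the change stops there instead of bubbling out and forcing a full reload ('./App' and render() are assumptions for illustration):

// main.js
if (module.hot) {
    // When ./App (or its dependencies) changes, run the callback instead of reloading the page
    module.hot.accept(['./App'], () => {
        render();   // assumed function that re-renders the root component
    });
}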

Never enable HMR in production.

Summary:

While using Webpack to build front-end projects, some performance problems are gradually exposed, mainly in the following areas:

  • Full builds are too slow, and even small changes take too long before the updated compilation result is visible. (Introducing HMR hot updates improves this significantly.)
  • As the project's business complexity grows, the size of its modules grows sharply too, and the built bundles are usually measured in megabytes.
  • Multiple projects share basic resources but package them repeatedly, so the reuse rate of base-library code is low.
  • The first screen has too many dependencies and the blank-screen period lasts too long.

The Webpack optimizations above come in handy for these problems. As development engineers, we should keep pursuing high performance in our projects, uphold the principle of using the solution that fits the problem, keep improving and optimizing performance in real projects, and continuously raise development efficiency while reducing resource costs.