First, speed analysis

Install the speed-measure-webpack-plugin

npm install --save-dev speed-measure-webpack-plugin

Import the plugin and create a plugin object

const SpeedMeasurePlugin = require('speed-measure-webpack-plugin'); // Import plug-ins
const smp = new SpeedMeasurePlugin(); // Create a plug-in object

Use the plug-in’s wrap() method to wrap the configuration

module.exports = smp.wrap({
  entry: {
    index: './src/index.js',
    search: './src/search.js',
  },
  output: {
    path: path.join(__dirname, 'dist'), // __dirname (directory of the current module) + dist
    filename: '[name]_[chunkhash:8].js', // Add a chunkhash fingerprint to the output file name
  },
  plugins: [],
  // ...
});

After the build completes, the console prints how long each loader took, which tells you where to focus further speed optimizations.

Second, volume analysis

What problems can volume analysis analyze?

  1. The size of the third-party modules the project depends on
  2. The size of the business component code

After packaging, you can clearly and intuitively see how much each module contributes to the total size.

Install the webpack-bundle-analyzer plugin

npm install --save-dev webpack-bundle-analyzer

Import the plugin

const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

Add it to the plugins configuration

 plugins: [
    new BundleAnalyzerPlugin()
  ],

After the build completes, the browser opens http://127.0.0.1:8888/ and displays the size analysis of the packaged output.
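By default the plugin starts a local HTTP server to show the report. If a static report file is preferred, the plugin also accepts options; a minimal sketch using webpack-bundle-analyzer's analyzerMode, reportFilename, and openAnalyzer options:

 plugins: [
    new BundleAnalyzerPlugin({
      analyzerMode: 'static', // Write a static report.html instead of starting a server
      reportFilename: 'report.html', // Name of the generated report file
      openAnalyzer: false, // Do not open the browser automatically
    }),
  ],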

Third, packaging speed optimization

During a webpack build, one of the things that directly affects build efficiency is file compilation; another is how files are grouped and packaged. File compilation is time-consuming, and in the Node environment files are processed one by one, so this is where optimization is needed. So how do we speed up packaging?

1. Use newer versions of webpack and Node.js

webpack 4 takes advantage of optimizations in the V8 engine, including (a small illustration follows the list):

  • for...of replaces forEach
  • Map and Set replace plain Objects
  • includes() replaces indexOf()
  • The md4 hash algorithm is used by default instead of md5, because md4 is faster
  • The AST can be passed directly from the loader back to webpack, avoiding repeated parsing
  • String methods are used instead of regular expressions
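A small illustrative snippet of these substitutions (illustrative only, not webpack source code):

// Illustrative only; not webpack source code
const deps = ['react', 'react-dom', 'redux'];

// for...of instead of forEach
for (const dep of deps) {
  console.log(dep);
}

// Set instead of a plain Object used as a lookup table
const seen = new Set(deps);

// includes() instead of indexOf() !== -1
const hasReact = deps.includes('react');

// String methods instead of a regular expression
const isJsFile = (file) => file.endsWith('.js'); // rather than /\.js$/.test(file)

console.log(seen.has('redux'), hasReact, isJsFile('index.js'));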

Newer versions of Node.js also improve the performance of the native JS APIs and data structures.

2. Multi-process/multi-instance build (resource parallel parsing)

During a webpack build, loaders transform JS, CSS, image, font, and other files, and the amount of data to transform is large. These transformations cannot run concurrently by default; files are processed one by one. What we want is to split this work across multiple child processes that run in parallel and send their results back to the main process when done, reducing the total build time.

Options:

  • thread-loader (released by the webpack team)
  • parallel-webpack
  • HappyPack
HappyPack

Note: the author of HappyPack has lost interest in JavaScript, so the project will see less maintenance going forward. webpack 4 and later recommend thread-loader instead.

How it works: each time webpack resolves a module, HappyPack assigns the module and its dependencies to a worker process. In more detail: after webpack's compiler run hook fires, control passes to HappyPack, which does some initialization and creates a thread pool. The modules in the build task are distributed across that pool, so a module and its dependencies are handled by one of the HappyPack threads. When a thread finishes processing, it sends the result back to HappyPack's main process, completing one build step.

To make the difference measurable, duplicate the same page a few times in the src directory.

First run a build without HappyPack as a baseline.

Then install HappyPack:

npm install --save-dev happypack

Replace babel-loader with happypack/loader in the rules:

rules: [
  {
    test: /\.js$/, // Compile all files with the .js suffix
    use: [
      // 'babel-loader'
      'happypack/loader',
    ],
  },
],

Then add the HappyPack plugin to plugins and re-declare the loaders that were replaced:

  plugins: [
    new HappyPack({
      // Re-declare the loaders that happypack/loader replaced in the rules above
      loaders: ['babel-loader'],
    }),
]

The build is noticeably faster with HappyPack.

thread-loader

How it works: similar to HappyPack, each time webpack resolves a module, thread-loader assigns the module and its dependencies to a worker process.

Install it:

npm install --save-dev thread-loader

Add thread-loader to the rule. It accepts a workers option that sets the number of worker processes.

rules: [
  {
    test: /\.js$/, // Compile all files with the .js suffix
    include: path.resolve('src'), // Only apply these loaders to .js files under src
    use: [
      // thread-loader must be placed before the loaders it parallelizes
      {
        loader: 'thread-loader',
        options: {
          workers: 3, // Number of worker processes
        },
      },
      'babel-loader',
      // 'happypack/loader',
    ],
  },
],

The build speed also improves significantly with thread-loader.

3. Multi-process/multi-instance code compression (parallel compression)

After the code is built and before it is emitted, there is a code compression phase; compressing in parallel here also improves build speed.

Options:

  • webpack-parallel-uglify-plugin
  • uglifyjs-webpack-plugin
  • terser-webpack-plugin **(recommended for webpack 4; supports compressing ES6 code)**

npm install terser-webpack-plugin --save-dev
const TerserPlugin = require('terser-webpack-plugin');

Add TerserPlugin under optimization and enable the parallel option:

  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        // Code compression plugin
        parallel: 4, // Enable parallel compression with 4 processes
      }),
    ],
  },

4. Improve packaging speed by splitting out base packages

html-webpack-externals-plugin can be used to split out base libraries; once separated, the required resources are loaded from a CDN instead. The drawback is that each base library needs its own CDN entry, and a real project may reference several base libraries plus some business packages, which results in many script tags.

new HtmlWebpackExternalsPlugin({
  externals: [
    {
      module: 'react',
      entry: 'https://unpkg.com/react@16/umd/react.development.js',
      global: 'React',
    },
  ],
}),

A further step in splitting is to use precompiled resource modules.

Use webpack's built-in DllPlugin to do the splitting, and DllReferencePlugin to reference the generated manifest.json. DllPlugin bundles the framework libraries a project depends on, such as React, ReactDOM, and Redux, into a single file and produces a manifest.json that describes the separated bundle. The main build then references manifest.json through DllReferencePlugin, which maps the dependencies back to the pre-built bundle.

  1. First use DllPlugin to split out the base packages

Create a separate build configuration file, webpack.dll.js, that lists the packages to be split out, and add a dll script to package.json:

  "scripts": {
    "dll": "webpack --config webpack.dll.js"
  }

webpack.dll.js

const webpack = require('webpack');
const path = require('path');

module.exports = {
  mode: 'development',
  entry: {
    // The entry name "library" determines the [name] placeholders in output
    library: ['react', 'react-dom'],
  },
  output: {
    filename: '[name]_[chunkhash].dll.js', // Output file name: entry name + hash + .dll.js
    path: path.join(__dirname, 'build/library'), // Emit into build/library under the current directory
    library: '[name]_[hash]', // Global variable the bundle is exposed as; must match DllPlugin's name
  },
  plugins: [
    new webpack.DllPlugin({
      name: '[name]_[hash]', // Name recorded in the generated manifest; keep consistent with output.library
      path: path.join(__dirname, 'build/library/[name].json'), // Where to emit the manifest [name].json
    }),
  ],
};

Running npm run dll then generates two files in the build/library directory: the dll bundle and library.json.

library.json is the manifest mentioned earlier.

Once built, use the DllReferencePlugin in the main webpack config to reference the generated manifest (library.json):

plugins: [
  new webpack.DllReferencePlugin({
    manifest: require('./build/library/library.json'),
  }),
],

5. Speed up subsequent builds with caching

  • Enable caching in babel-loader
  • Enable caching in terser-webpack-plugin
  • Use cache-loader or hard-source-webpack-plugin (a cache-loader sketch appears at the end of this section)

    new HappyPack({
      loaders: ['babel-loader?cacheDirectory=true'],
    }),

Set babel-loader’s cacheDirectory=true to enable caching
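If babel-loader is used directly (without HappyPack), the same option can be set in the rule. A minimal sketch:

  rules: [
    {
      test: /\.js$/,
      use: [
        {
          loader: 'babel-loader',
          options: {
            cacheDirectory: true, // Cache compiled results (by default under node_modules/.cache/babel-loader)
          },
        },
      ],
    },
  ],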

  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        // Code compression plugin
        parallel: 4, // Enable parallel compression
        cache: true, // Enable caching
      }),
    ],
  },

Setting cache: true for terser-webpack-plugin enables caching.

Use hard-source-webpack-plugin

npm install --save-dev hard-source-webpack-plugin

const HardSourceWebpackPlugin = require('hard-source-webpack-plugin');

  plugins: [
    new HardSourceWebpackPlugin(),
  ],

The first run writes the cache; subsequent builds read from it.

Enabling caching significantly improves packaging speed
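The cache-loader mentioned earlier works in a similar way: place it in front of the expensive loaders so their output is cached on disk. A minimal sketch:

  // npm install --save-dev cache-loader
  rules: [
    {
      test: /\.js$/,
      include: path.resolve('src'),
      use: [
        'cache-loader', // Caches the output of the loaders that follow it
        'babel-loader',
      ],
    },
  ],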

6. Narrow the build scope

Build as few modules as possible; for example, have babel-loader skip node_modules. A configuration sketch follows the list below.

  • Optimize the resolve.modules configuration (reduce the number of directories searched when resolving modules)
  • Optimize the resolve.mainFields configuration
  • Optimize the resolve.extensions configuration
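A minimal sketch of what these settings can look like (the paths, fields, and extensions are examples; adjust them to your project):

const path = require('path');

module.exports = {
  resolve: {
    modules: [path.resolve(__dirname, 'node_modules')], // Only search the project's own node_modules
    mainFields: ['main'], // Only read the "main" field of package.json
    extensions: ['.js'], // Only try the .js extension when resolving imports
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/, // Do not run babel-loader on files in node_modules
        use: ['babel-loader'],
      },
    ],
  },
};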

Fourth, packaging volume optimization

This mainly means reducing the size of the packaged image, JS, and CSS assets.

1. Image compression

Use the Node library imagemin, configured through image-webpack-loader, to optimize images: the loader picks up image resources during the build and compresses them. Advantages of imagemin:

  • Imagemin has many customization options
  • Additional third-party optimization plugins can be introduced, for example pngquant
  • It can handle multiple image formats

How imagemin compression works

  • Pngquant is a PNG compressor that significantly reduces file size by converting images to a more efficient 8-bit PNG format with alpha channels (typically 60%-80% smaller than 24/32-bit PNG files).

An alpha channel stores an image's transparency and semi-transparency.

  • Pngcrush: its main purpose is to reduce the size of the PNG IDAT data stream by trying different compression levels and PNG filtering methods;
  • Optipng: inspired by pngcrush, it can recompress image files to a smaller size without losing any information;
  • Tinypng: also converts 24-bit PNG files into smaller, indexed 8-bit images, while stripping all non-essential metadata;
npm install image-webpack-loader --save-dev
rules: [
  {
    test: /\.(png|jpg|gif|jpeg)$/,
    use: [
      {
        loader: 'file-loader',
        options: {
          name: '[name]_[hash:8].[ext]',
        },
      },
      {
        loader: 'image-webpack-loader',
        options: {
          mozjpeg: {
            progressive: true,
            quality: 65,
          },
          // optipng.enabled: false will disable optipng
          optipng: {
            enabled: false,
          },
          pngquant: {
            quality: [0.65, 0.9],
            speed: 4,
          },
          gifsicle: {
            interlaced: false,
          },
          // the webp option will enable WEBP
          webp: {
            quality: 75,
          },
        },
      },
    ],
  },
],

2. Delete unnecessary CSS

Unused CSS can be removed with purgecss-webpack-plugin (npm i purgecss-webpack-plugin -D): the plugin traverses the code, detects which CSS classes are actually used, and drops the rest.

const glob = require('glob');
const PurgecssPlugin = require('purgecss-webpack-plugin');

const PATHS = {
  src: path.join(__dirname, 'src'),
};

plugins: [
  new PurgecssPlugin({
    paths: glob.sync(`${PATHS.src}/**/*`, { nodir: true }),
  }),
],

3. Dynamic polyfill

What is Polyfill?

By default, Babel only converts new JavaScript syntax, such as arrow functions; it does not convert new APIs such as Iterator, Generator, Set, Map, Proxy, Reflect, Symbol, and Promise, nor methods defined on global objects (such as Object.assign). That is why a polyfill is needed. Link: www.jianshu.com/p/482285279…
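A small illustration of the difference, assuming a default @babel/preset-env setup with no polyfills:

// Syntax is transpiled by Babel: this arrow function becomes a regular function in the output.
const double = (n) => n * 2;

// New APIs are not transpiled: Promise and Object.assign stay as-is in the output,
// so older browsers need a polyfill for them.
const merged = Object.assign({}, { a: 1 }, { b: 2 });
Promise.resolve(double(merged.a)).then((v) => console.log(v));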

The official explanation

  • It is a service that accepts requests for a set of browser features and returns only the polyfill required by the requesting browser.
  • There are many different browsers and browser versions in use around the world, each with a somewhat different feature set. This can make browser development a daunting task: the latest versions of the popular browsers can do many things that older browsers cannot, yet you may still have to support the older ones. Polyfill.io makes it easier to support differing browsers by recreating the missing features with polyfills: you can use the latest features whether or not the browser supports them natively.

A query on caniuse shows that Promise is supported by roughly 96.17% of browsers in use.

So most browsers do not need the polyfill at all; only the small share (a few percent) that lack the new ES6 APIs do. It is unnecessary to make every user load polyfills for the sake of a few.

We can use polyfill-service to return only the polyfills a user actually needs. Each time a user opens the page, the browser requests polyfill-service, which identifies the User-Agent and serves the appropriate polyfills (see polyfill.io/v3/url-buil…).

Alternatively, load polyfill-service from the CDN with <script src="https://cdn.polyfill.io/v2/polyfill.min.js"></script>. You can open polyfill.io/v3/polyfill… in different browsers to see what each User-Agent receives.
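For example, the URL builder can produce an address that requests only the features you actually need, such as https://polyfill.io/v3/polyfill.min.js?features=Promise (check the URL builder for the exact parameters); browsers that already support Promise then receive an essentially empty response, while older ones get the polyfill.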

That is the end of this article on webpack 4 build speed and bundle size optimization. It is a practice summary based on Cheng Liufeng's "Play webpack" course. Discussion and corrections are welcome.