Webpack catalog

  1. The core configuration of webpack5
  2. Webpack5 modularity principle
  3. Webpack5 with Babel/ESLint/browser compatibility
  4. Webpack5 performance optimization
  5. Webpack5 Loader and Plugin implementation
  6. Webpack5 core source code analysis

Performance analysis tool

Performance optimization is always a concern. For webpack, we can analyze performance from two angles: the build process and the build output.

Build process analysis

For the build process, if we want to see how much packaging time each loader and each plugin consumes, we can use the speed-measure-webpack-plugin. Install it with npm install speed-measure-webpack-plugin -D and use the following configuration

const SpeedMeasurePlugin = require('speed-measure-webpack-plugin')
const smp = new SpeedMeasurePlugin()
module.exports = (env) => {
  const config = {
    // webpack configuration options
  }
  return smp.wrap(config)
}

This plugin has some compatibility issues with the latest webpack versions (it conflicts with certain other plugins), so while measuring we may need to delete or comment out the incompatible plugins.

Build output analysis

For the build output, we can use the webpack CLI to generate a stats.json analysis file, and we can also use the webpack-bundle-analyzer tool.

stats.json

We add "build:stats": "webpack --config ./config/webpack.common.js --env production --profile --json=stats.json" to the scripts section of package.json. Running npm run build:stats then generates a stats.json file in the current directory. After formatting it, we can see the size of each module in the build output, the import relationships between modules, and so on. We can also upload the generated stats.json file to Github.com/webpack/ana… for analysis.
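
For reference, the script described above sits in the scripts field of package.json:

{
  "scripts": {
    "build:stats": "webpack --config ./config/webpack.common.js --env production --profile --json=stats.json"
  }
}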

webpack-bundle-analyzer

We can also use webpack-bundle-analyzer, an analysis tool for inspecting bundle sizes. Install it with npm install webpack-bundle-analyzer -D and configure the plugin as follows

const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer')
{
  plugins: [
    // ...
    new BundleAnalyzerPlugin()
  ]
}

After the build finishes, the plugin starts a server on port 8888. We can open that page in the browser to see the size of each bundle.

Performance optimization schemes

In real projects there are specific optimization schemes for specific scenarios. Let's look at some commonly used performance optimization schemes.

Code splitting

There are three ways to split code

  • Entry points: configure entry manually to split code
  • Prevent duplication: use entry dependencies or the SplitChunksPlugin to deduplicate and split chunks
  • Dynamic imports: split code through inline import() calls inside modules

Configuring entry for manual code splitting

The entry property accepts strings or objects. The configuration is as follows

{
  entry: {
    main: resolve('src/main.js'),
    index: resolve('src/index.js')
  },
  output: {
    path: resolve('build'),
    filename: 'js/[name].bundle.js',  // use the [name] placeholder to output different bundles
    clean: true
  }
}

After packaging we get index.bundle.js and main.bundle.js, and both bundles are imported in index.html. This approach is typically used to package multi-page projects.

Entry Dependencies

Now suppose we use lodash in both index.js and main.js

// index.js
import _ from 'lodash'
console.log(_.join('hello', 'index.js'))

// main.js
import _ from 'lodash'
console.log(_.join('hello', 'main.js'))

You can see that lodash is packaged into both bundles, which doubles its size. Can we extract dependencies that are imported repeatedly into a shared bundle? See the configuration below

entry: {
  main: { import: resolve('src/main.js'), dependOn: 'lodash' },
  index: { import: resolve('src/index.js'), dependOn: 'lodash' },
  lodash: 'lodash'
}

By configuring dependOn to declare the lodash dependency, lodash is pulled out into its own lodash.bundle.js. If the two files also depend on another package, such as dayjs, the configuration and packaging look like this

entry: {
  main: { import: resolve('src/main.js'), dependOn: 'shared' },
  index: { import: resolve('src/index.js'), dependOn: 'shared' },
  shared: ['lodash', 'dayjs']
}

lodash and dayjs are now packaged together into shared.bundle.js.

SplitChunksPlugin

We can also use the splitChunks chunking strategy. Here are several common options

  • chunks: the default is async, which only processes asynchronous imports; initial processes synchronous imports, and all processes both synchronous and asynchronous code
  • minSize: a split chunk must be at least this size; if a chunk cannot reach minSize, it will not be split out
  • maxSize: chunks larger than maxSize are split into smaller chunks that are still no smaller than minSize
  • minChunks: the minimum number of times a module must be imported before it is split. The default is 1; if we set it to 2 but a module is imported only once, it will not be split out
  • name: the name of the split chunk. It can be a string or false; if false, the name needs to be set in cacheGroups
  • cacheGroups: groups and caches chunks that match its rules. For example, a matched lodash module is not emitted immediately when it is split; it waits until other modules matching the same rules are collected, and they are packaged together
    • test: matches the modules that belong to this group
    • name: a fixed name for the split chunk
    • filename: the output file name; placeholder properties such as [id] can be used
    • priority: the priority used when a module matches several cache groups
optimization: {
  // ...
  splitChunks: {
    // async: asynchronous imports only
    // initial: synchronous imports only
    // all: both asynchronous and synchronous imports
    chunks: 'all',
    minSize: 20000,  // a split chunk is at least minSize; the default is 20kb
    maxSize: 20000,  // chunks larger than maxSize are split into chunks no smaller than minSize; the default is 0
    minChunks: 2,    // minimum number of times a module must be imported; the default is 1
    cacheGroups: {   // cache groups
      vendor: {  // third-party packages go into the vendor chunk
        test: /[\\/]node_modules[\\/]/,  // match node_modules
        filename: 'js/[id]_vendors.js',  // filename can use placeholders; name is a fixed name
        // name: 'js/check_vendors.js',
        priority: -10  // when several groups match, the one with the higher priority is used
      },
      default: {  // default group, used when no other group matches
        minChunks: 2,
        filename: 'js/[id]_common.js',  // common chunk shared by multiple entries
        priority: -20
      }
    }
  }
}

Dynamic imports

Another way to split code is dynamic import; webpack offers two ways to implement it

  • Use the ECMAScript import() syntax, which is the currently recommended approach
  • Use webpack's legacy require.ensure, which is no longer recommended

For example, if a module is not used when the page first opens but only after a button is clicked, we don't want to load it up front; instead we dynamically import it when it is needed.
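
A minimal sketch of what that looks like; the button element here is only illustrative:

// load ./js/foo only when the (hypothetical) button is clicked
const button = document.createElement('button')
button.textContent = 'load foo'
button.onclick = () => {
  import('./js/foo').then(() => {
    console.log('foo.js has been loaded')
  })
}
document.body.appendChild(button)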

We can change the file names of these asynchronous chunks by configuring output.chunkFilename

output: {
  // ...
  chunkFilename: 'js/chunk.[id].[name].js'
}

 

We load foo.js asynchronously with import(). After packaging we see a file like chunk.604.604.js, where the id and the name are the same. We can set a proper name with a magic comment
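
A sketch of the magic comment, assuming we want the chunk to be called "foo":

import(/* webpackChunkName: "foo" */ './js/foo')
// with the chunkFilename above, the output becomes something like js/chunk.604.foo.js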

Now after repackaging you can see that the asynchronous module name has been replaced with the name we configured.

In the browser, the dependency is loaded asynchronously two seconds after the page opens. Route lazy loading in frameworks such as Vue is actually implemented with exactly this kind of dynamic import.
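
For reference, a hedged sketch of lazy route loading in Vue Router; the route path and component file are illustrative:

// the About component is only downloaded when the /about route is visited
const routes = [
  {
    path: '/about',
    component: () => import(/* webpackChunkName: "about" */ './views/About.vue')
  }
]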

When loading modules asynchronously, we can also configure prefetch and preload with magic comments

  • prefetch: the prefetched resource may be needed for a future navigation and starts loading after the parent chunk has finished loading
  • preload: the resource may be needed during the current navigation and is requested in parallel with the parent chunk

Currently prefetch is the recommended option. Configure /*webpackPrefetch: true*/ as follows
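
A minimal sketch of the prefetch comment added to the earlier dynamic import:

import(/* webpackChunkName: "foo" */ /* webpackPrefetch: true */ './js/foo')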

After packaging, you can see that the browser references the chunk through a <link rel="prefetch"> tag. In the Network panel the module is downloaded after all other modules have finished loading, then taken from the prefetch cache and executed when we trigger it. So the resource is loaded ahead of time and read from the cache at the moment we actually use it.

Tree Shaking

Tree shaking removes code that we never use, and it applies to both JavaScript and CSS. The JavaScript approach relies on Terser compression for the actual removal, so let's first look at JavaScript and CSS compression tools.

JavaScript compression

In the early days we used uglify-js to compress and uglify JavaScript code, but it is no longer maintained and does not support ES6+ syntax. Now we mainly use TerserPlugin, which is installed automatically together with webpack 5.

  • Webpack has a minimizer option, which in production mode uses TerserPlugin by default. If we are not happy with the default configuration, we can create our own TerserPlugin instance and override the relevant settings
  • First, we need to enable minimize so our code is compressed (it is enabled by default in production mode)
  • Second, we can create a TerserPlugin instance in minimizer
    • extractComments: defaults to true, which extracts comments into a separate file. We can set it to false if we don't want that extra comments file
    • parallel: defaults to true, with a default concurrency of os.cpus().length - 1. You can set your own number, but the default is usually fine
    • terserOptions: sets the Terser-related configuration
      • compress: sets compression-related options
      • mangle: sets the mangling (uglification) options; can be set to true
      • toplevel: whether variables in the top-level scope are mangled
      • keep_classnames: preserve class names
      • keep_fnames: preserve function names

The TerserPlugin configuration is as follows

const TerserPlugin = require('terser-webpack-plugin')
// ...
optimization: {
  minimize: true,  // switch for the minimizer
  minimizer: [
    new TerserPlugin({   // compresses js by default, even without extra configuration
      parallel: true,           // use multiple CPU cores for the build
      extractComments: false,   // do not emit the separate LICENSE.txt comments file
      terserOptions: {
        compress: {             // compression-related options
          arguments: true,
          dead_code: true,
        },
        mangle: true,           // mangling (uglification) options
        toplevel: true,         // mangle variables in the top-level scope
        keep_classnames: false, // do not preserve class names
        keep_fnames: false,     // do not preserve function names
      }
    })
  ],
}

CSS compression

Besides JS compression, we also compress CSS. CSS compression usually removes useless whitespace and the like, since it is hard to rewrite selectors, property names or values. We can use the css-minimizer-webpack-plugin: install it with npm install css-minimizer-webpack-plugin -D and configure it as follows

const TerserPlugin = require('terser-webpack-plugin')
const CssMinimizerWebpackPlugin = require('css-minimizer-webpack-plugin')
// ...
optimization: {
  minimize: true,  // switch for the minimizer
  minimizer: [
    new TerserPlugin({   // compresses js by default, even without extra configuration
      // ...
    }),
    new CssMinimizerWebpackPlugin({
      parallel: true
    })
  ],
}

Once packaged, you can see that the CSS file has been compressed

JavaScript Tree Shaking

Tree shaking is a computer science term for eliminating dead code. It relies on the static structure of ES Modules (the module dependencies are known exactly without executing any code). Webpack's tree shaking support evolved as follows

  • Webpack 2 added official built-in support for ES2015 modules and the ability to detect unused modules
  • Webpack 4 officially extended this capability, using the sideEffects property in package.json to tell webpack which files can be safely dropped at compile time
  • Webpack 5 also provides partial tree shaking support for CommonJS

Webpack currently implements Tree Shaking in two different ways

  • usedExports: marks which exports are unused, so that Terser can remove them during optimization
  • sideEffects: skips whole modules/files by checking whether a file has side effects

usedExports

To see the effect of usedExports, we need to set the mode to development, because the optimizations webpack applies by default in production mode would obscure it.

Now prepare a priceFormat.js file and the entry file main.js, which only imports the add function from priceFormat.js

// priceFormat.js
const add = (a, b) => {
  return a + b
}

const minus = (a, b) => {
  return a - b
}

// console.log(add(1, 2))

export {
  add,
  minus
}
// main.js
import { add } from './js/priceFormat'
var dom = document.createElement('div')
dom.innerHTML = 'div' + add(1, 2)
document.body.appendChild(dom)

Configure webpack and repackage

{
  mode: 'development',
  devtool: 'source-map',
  optimization: {
    usedExports: true
  }
}

With usedExports set to true, the bundle contains an "unused harmony export minus" comment, which tells Terser that this code can be removed during optimization.

Next, we set minimize to true and repackage

{
  mode: 'development',
  devtool: 'source-map',
  optimization: {
    usedExports: true,
    minimize: true
  }
}

Now looking at the package file, the minus function has been removed, so usedExports implements Tree Shaking in conjunction with Terser.

sideEffects

  • sideEffects tells the webpack compiler which modules have side effects. A side effect means the code performs some extra work whose result cannot be determined from its exports alone
  • The sideEffects field is set in package.json. Its default value is true, meaning every imported file may have side effects and none will be removed (see the sketch after this list)
    • If we set sideEffects to false, we are telling webpack that it is safe to remove unused exports
    • If there are files we want to keep, we can set it to an array
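
A minimal sketch of the field in package.json (the array form reuses the ./src/js/foo.js path from the example below):

{
  "sideEffects": false
}

To keep specific side-effect files instead, set it to an array, e.g. "sideEffects": ["./src/js/foo.js"].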

Now we set sideEffects to false in package.json to indicate that there are no sideEffects, and then import foo.js in main.js and repackage

// foo.js
console.log('foo')

// main.js
import './js/foo'

We can see that the code from foo.js does not appear in the bundled file, because webpack considers foo.js free of side effects and removes it during packaging.

We can also set the sideEffects to an array and store the sideEffects files in it so that they won’t be deleted when packaged.

For example, we set sideEffects to ["./src/js/foo.js"] and repackage

Now the code from foo.js appears in the bundle, so the configuration took effect. When importing CSS modules, we can also configure sideEffects in the rules so that CSS modules are treated as having side effects and their files are not removed, as follows

{
  test: /\.less$/,
  use: [
    isProduction ? MiniCssExtractPlugin.loader : 'style-loader',
    'css-loader',
    'postcss-loader',
    'less-loader'
  ],
  sideEffects: true
},

CSS Tree Shaking

Besides JS tree shaking, CSS can be tree shaken too. In the early days the PurifyCSS plugin was used for this, but that library is no longer maintained. Another library, PurgeCSS, is also a tool for removing unused CSS. Install it with npm install purgecss-webpack-plugin -D. The configuration is as follows:

const PurgecssWebpackPlugin = require('purgecss-webpack-plugin')
const glob = require('glob')  // used to match folders or files
{
  plugins: [
    // ...
    new PurgecssWebpackPlugin({
      paths: glob.sync(`${resolve('./src')}/**/*`, { nodir: true }),  // match all files under src, excluding directories
      safelist: function() {  // safelist: selectors that will not be tree shaken
        return {
          standard: ['html', 'body']
        }
      }
    })
  ]
}

CSS tree shaking is not an option for vue templates.

Dynamic link library (DLL)

A DLL (dynamic link library) lets us extract code that rarely changes into a shared library; that library is then referenced by other projects during their compilation. Using a DLL takes two steps:

1. Package a DLL library

Create a new project, add a webpack configuration file, and use the DllPlugin built into webpack to package Vue into a DLL library

const { DllPlugin } = require('webpack')
const path = require('path')
const resolve = (src) => {
  return path.resolve(__dirname, src)
}
module.exports = {
  mode: 'production',
  entry: {
    vue: ['vue']
  },
  output: {
    path: resolve('./dll'),
    filename: 'dll_[name].js',
    library: 'dll_[name]'
  },
  plugins: [
    new DllPlugin({
      name: 'dll_[name]',
      path: resolve('./dll/[name].manifest.json')
    })
  ]
}

After repackaging, we have a DLL build directory with dll_vue.js and vue.manifest.json files

2. Introduce the DLL library into the project

Copy the dll build directory into the project that wants to use it and apply the configuration below, which relies on the DllReferencePlugin built into webpack. We also need to install the add-asset-html-webpack-plugin package, which brings static assets into the HTML. It actually does two things: it copies the static asset into the build directory, and it injects a script tag into the HTML that references it. Install it with npm install add-asset-html-webpack-plugin -D

const AddAssetHtmlPlugin = require('add-asset-html-webpack-plugin')
const { DllReferencePlugin } = require('webpack')

module.exports = {
  // ...
  plugins: [
    // ...
    new DllReferencePlugin({
      context: resolve('/'),
      manifest: resolve('./dll/vue.manifest.json')
    }),
    new AddAssetHtmlPlugin({
      outputPath: './auto',
      filepath: resolve('./dll/dll_vue.js')
    })
  ]
}

Using a CDN

CDN stands for Content Delivery Network. A CDN is an intelligent virtual network built on top of the existing network. Relying on edge servers deployed in many locations and on a central platform providing load balancing, content distribution and scheduling, it lets users fetch content from a nearby node, which reduces network congestion and improves response time and hit rate. Its key technologies are content storage and distribution. In real projects we mainly use a CDN in two ways

  1. Put all packaged static assets on a CDN server, so users load every resource from the CDN
  2. Put some third-party libraries on a CDN server

For example, suppose we use the lodash and dayjs libraries. In production we can configure the externals property with an object whose keys are the imported package names and whose values are the global variables those packages expose

externals: {
  lodash: '_',
  dayjs: 'dayjs'
}

We then reference the packages' CDN links in the index.html template using EJS syntax. After repackaging, the vendor bundle no longer contains the code of the packages we rely on.

Gzip compression

HTTP compression is a mechanism built between the server and the client to improve transfer speed and bandwidth utilization. The flow of HTTP compression is

  1. The HTTP data is compressed before the server sends it (this step can be done by webpack at build time)
  2. A compatible browser tells the server which compression formats it supports when it sends a request
  3. The server returns the file compressed in a format the browser supports and declares that format in the response header

There are many compression formats today, for example

  • Compress – a method of UNIX’s compress program (not recommended for most applications for historical reasons, gzip or deflate should be used instead)
  • Deflate – Compression based on the Deflate algorithm (defined in RFC 1951), encapsulated in the Zlib data format
  • Gzip – GNU Zip format (defined in RFC 1952) is the most widely used compression algorithm
  • Br – A new open source compression algorithm designed for encoding HTTP content

To generate gzip files at build time we can use the compression-webpack-plugin. Install it with npm install compression-webpack-plugin -D. The configuration is as follows

const CompressionWebpackPlugin = require('compression-webpack-plugin')  // plugin that compresses the build output
{
  plugins: [
    // ...
    new CompressionWebpackPlugin({
      threshold: 0,          // only compress assets bigger than this size
      test: /\.(css|js)$/i,  // which files to compress
      minRatio: 0.8,         // only keep assets that compress at least this well
      algorithm: 'gzip'      // the compression algorithm
    })
  ]
}

Inlining chunks into the HTML

There is also a plugin that can help inline some chunks into HTML to reduce the number of HTTP requests

  • The runtime code, for example, is small but must be loaded
  • So we can inline it directly into the HTML

Such a plugin is implemented in react-dev-utils, but it does not delete the runtime files after inlining. Instead we can install script-ext-html-webpack-plugin with npm install script-ext-html-webpack-plugin -D and configure it as follows

const ScriptExtHtmlWebpackPlugin = require('script-ext-html-webpack-plugin')  // inlines matching chunks into the HTML (and deletes the runtime chunk files)
{
  plugins: [
    new ScriptExtHtmlWebpackPlugin({
      inline: /runtime.*\.js(\.gz)?$/  // match the runtime file names
    })
  ]
}

If gzip compression is enabled, the regex above also matches the generated .gz runtime file, so it is deleted as well.

Configuring hash values properly

When naming output files we can use several placeholder properties: hash, chunkhash and contenthash. The hash itself is produced by the MD4 hash function and yields a 128-bit value (32 hexadecimal characters).

  • The hash value is generated in relation to the entire project. For example, if we have two entry files index.js and main.js, if we use hash and then modify main.js, then our index.js file name will also change.
  • Chunkhash solves the above problem, if we modify one of the entry files, the name of the other entry file will not change.
  • However, if a CSS file is introduced in index.js, for example, the CSS file is extracted separately. When we modify the index.js file, the CSS file name will also change. In this case, we can use contenthash

So we usually use contenthash for Runtime chunk or CSS chunk and chunkhash for entry files.
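
A hedged sketch of how that advice might translate into configuration; the file name patterns are illustrative:

const MiniCssExtractPlugin = require('mini-css-extract-plugin')
// entry chunks use chunkhash; extracted CSS uses contenthash
output: {
  filename: 'js/[name].[chunkhash:8].bundle.js'
},
plugins: [
  new MiniCssExtractPlugin({
    filename: 'css/[name].[contenthash:8].css'
  })
]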

Scope Hoisting

Scope hoisting is a feature available since webpack 3. It hoists module scopes so that the packaged code is smaller and runs faster.

By default, webpack output contains many function scopes, including the outermost IIFE: whether running the original code or loading a module, a series of functions has to be executed. Scope hoisting merges modules into a single scope so they can run with fewer wrapper functions.

Scope hoisting is very simple to use; webpack has a built-in plugin for it:

  • In production mode, this plugin is enabled by default
  • In development mode, we need to enable it ourselves, as follows.
new webpack.optimize.ModuleConcatenationPlugin()
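
For context, a minimal sketch of where the plugin sits in the configuration (development mode only; production enables it automatically):

const webpack = require('webpack')

module.exports = {
  mode: 'development',
  // ...
  plugins: [
    new webpack.optimize.ModuleConcatenationPlugin()  // enable scope hoisting
  ]
}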