Author: Mai Le

Source: Hang Seng LIGHT Cloud Community

Compression and merging

Resource consolidation and compression target two optimization points: reducing the number of HTTP requests and reducing the size of the requested resources. We'll explore HTML compression, CSS compression, JavaScript compression and obfuscation, and file merging in detail.

HTML compression

What is HTML compression?

HTML compression strips the whitespace, line breaks, and comments that the browser does not need. Prepare an HTML file with an uncompressed size of 6KB and compress it with the html-minifier Node tool:

var fs = require('fs');
var minify = require('html-minifier').minify;
fs.readFile('./test.htm', 'utf8', function (err, data) {
    if (err) {
        throw err;
    }
    fs.writeFile('./test_result.html',
      minify(data, {
        removeComments: true,     // delete comments
        collapseWhitespace: true, // collapse whitespace
        minifyJS: true,           // compress the JS in the file
        minifyCSS: true           // compress the CSS in the file
      }),
      function () {
        console.log('success');
      });
});

After compression the content is reduced by roughly half, to 3KB, and the HTML no longer contains whitespace, comments, and so on.

<!DOCTYPE html><html lang="en"><head><meta charset="UTF-8"><meta http-equiv="X-UA-Compatible" content="IE=edge"><meta name="viewport" content="width=device-width,initial-scale=1"><title>Document</title><style>img{width:700px;height:200px;display:block}</style></head><body><ul id="uls"><li><a href="http://www.baidu.com">baidu</a></li></ul><img loading="lazy" src="./image/home-1.png" alt="photo"> <img loading="lazy" src="./image/home-2.png" alt="photo"> <img loading="lazy" src="./image/home-3.png" alt="photo"> <img loading="lazy" src="./image/home-4.png" alt="photo"> <img loading="lazy" src="./image/home-5.png" alt="photo"> <img loading="lazy" src="./image/home-6.png" alt="photo"> <img loading="lazy" src="./image/home-7.png" alt="photo"> <img loading="lazy" src="./image/home-8.png" alt="photo"> <img loading="lazy" src="./image/home-9.png" alt="photo"> <img loading="lazy" src="./image/home-10.png" alt="photo"> <img loading="lazy" src="./image/home-11.png" alt="photo"> <img loading="lazy" src="./image/home-12.png" alt="photo"> <img loading="lazy" src="./image/home-13.png" alt="photo"> <img loading="lazy" src="./image/home-14.png" alt="photo"> <img loading="lazy" src="./image/home-15.png" alt="photo"> <img loading="lazy" src="./image/home-16.png" alt="photo"> <img loading="lazy" src="./image/home-17.png" alt="photo"> <img loading="lazy" src="./image/home-18.png" alt="photo"> <img loading="lazy" src="./image/home-19.png" alt="photo"> <img loading="lazy" src="./image/home-20.png" alt="photo"> <img loading="lazy" src="./image/home-21.png" alt="photo"> <img loading="lazy" src="./image/home-22.png" alt="photo"> <img loading="lazy" src="./image/home-23.png" alt="photo"> <img loading="lazy" src="./image/home-24.png" alt="photo"> <img loading="lazy" src="./image/home-25.png" alt="photo"> <img loading="lazy" src="./image/home-26.png" alt="photo"> <img loading="lazy" src="./image/home-27.png" alt="photo"> <img loading="lazy" src="./image/home-28.png" alt="photo"> <img loading="lazy" src="./image/home-29.png" alt="photo"> <img loading="lazy" src="./image/home-30.png" alt="photo"><script></script><script></script></body></html>

CSS compression

Prepare a CSS file:

.box {
  background: red;
}
Compress it with clean-css using the following script:

// mini.js
var CleanCSS = require('clean-css');
var fs = require('fs');

var options = {};
fs.readFile('./css/index.css', 'utf8', function (err, data) {
    if (err) {
        throw err;
    }
    const a = new CleanCSS(options).minify(data);
    console.log(a);
    fs.writeFile('./css/test.css',
      a.styles,
      function (err) {
        if (err) {
          console.log(err);
        }
        console.log('success');
      });
});

node mini.js

The CSS is compressed into a single line:

.box{background:red}

JavaScript compression

JavaScript processing mainly covers three aspects: removing invalid characters and comments, reducing and optimizing code semantics, and obfuscating code for protection. The principle of removing invalid characters and comments is similar to HTML and CSS compression, so here we focus on semantic reduction and optimization and on code obfuscation.

Code semantic reduction and optimization

JavaScript compression can shorten variable names: a very long variable name can be replaced with a very short one such as a or b, which further reduces the amount of code. It can also optimize redundant code, for example by removing duplicate variable assignments and merging or eliminating dead code.

Code obfuscation protection

Since any user can open a web page and view its JavaScript through the browser's developer tools, code whose semantics are perfectly clear, with no compression or obfuscation and its original formatting intact, lets anyone who visits the site easily read our logic and potentially use it to threaten the security of the system. Compressing and obfuscating JavaScript therefore also serves as a form of protection for front-end code.

How to compress?

Prepare a JS file:

var progress = 0;
for(var i = 0; i < 10000; i ++) {
   progress = i + 9;
   console.log(progress)
}
Compress it with uglify-js:
var fs = require('fs');
var uglifyJs = require('uglify-js');

var options = {
  toplevel: true
};
fs.readFile('./js/file3.js', 'utf8', function (err, data) {
    if (err) {
        throw err;
    }
    const a = uglifyJs.minify(data, options);
    console.log(a);
    fs.writeFile('./js/test.js',
      a.code,
      function (err) {
        if (err) {
          console.log(err);
        }
        console.log('success');
      });
});

After compression, whitespace is removed and variable names are replaced with simple ones:

{ code: 'for(var o=0;o<1e4;o++)console.log(o+9);' }

File merging

Suppose we have three JavaScript files: a.js, b.js and c.js. With keep-alive enabled and no merging, each of the three files is requested separately over the same connection, one after another.

If the files are merged, a single request for a-b-c.js retrieves all of the content at once.

Benefits:

The benefits of merging files are obvious: fewer network requests and less waiting time.

Drawbacks:

That said, file merging is not a cure-all; it has its own shortcomings.

The first is first-screen rendering. After merging, the JavaScript file is inevitably larger than any of the individual files, so if rendering the HTML page depends on that file, rendering has to wait until the whole merged file has been downloaded.

The second is cache invalidation. Most projects now use a caching strategy in which each JavaScript file name carries an MD5 hash that identifies whether the file has changed; when it has, the cached copy is invalidated and the file is requested again. Without merging, a small change in one source file only invalidates that file's cache; once files are merged, the same small change invalidates the cache for the entire merged bundle.
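For reference, this kind of fingerprint is usually added through the output file name; a minimal sketch using webpack's [contenthash], which plays the same role as the MD5 stamp described above (the hash length and paths are illustrative):

// webpack.config.js (sketch)
const path = require('path');

module.exports = {
  output: {
    filename: '[name].[contenthash:8].js', // the hash changes only when the file's content changes
    path: path.resolve(__dirname, 'dist')
  }
};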

Merging suggestions:

Merge common libraries. Front-end code usually contains both our own business logic and the third-party library code it references. Business code changes far more often than library code, so the library code can be packaged into its own file while the business code is handled separately, as in the sketch below.
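A minimal sketch of this split using webpack's splitChunks (webpack 4+); the chunk name vendors is illustrative:

// webpack.config.js (sketch)
module.exports = {
  optimization: {
    splitChunks: {
      cacheGroups: {
        vendors: {
          test: /[\\/]node_modules[\\/]/, // anything imported from node_modules
          name: 'vendors',                // goes into a single vendors chunk
          chunks: 'all'
        }
      }
    }
  }
};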

Merge business code by page. Most front-end applications today are single-page apps, so how do we merge by page? Package the code by route and load each chunk only when the corresponding route is visited, as in the sketch that follows.
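A minimal sketch of route-level splitting, assuming a React Router setup; the page paths and component names are illustrative:

// routes.js (sketch) -- component paths are illustrative
import React, { lazy, Suspense } from 'react';
import { BrowserRouter, Route } from 'react-router-dom';

// each lazy() import becomes its own chunk, fetched only when the route is visited
const Home = lazy(() => import('./pages/Home'));
const Detail = lazy(() => import('./pages/Detail'));

export default function Routes() {
  return (
    <BrowserRouter>
      <Suspense fallback={<div>loading...</div>}>
        <Route exact path="/" component={Home} />
        <Route path="/detail" component={Detail} />
      </Suspense>
    </BrowserRouter>
  );
}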

Optimizing packaging and builds with Webpack

Reduce the scope of loader execution

const path = require('path');

module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,                  // modules to exclude
        include: path.resolve(__dirname, 'src'),  // modules to include
        use: [
          { loader: 'babel-loader' }
        ]
      }
    ]
  }
};

Without an exclude field, Webpack applies babel-loader to every JavaScript file it encounters under the configured paths; babel-loader is powerful but slow to execute.

Third-party library files have already been transpiled before they are published, so running babel-loader over them again only adds unnecessary build time. The exclude field should therefore not be omitted.

For image files, however, include or exclude does not help reduce how often the loader runs, because no matter where an image is imported from, it still has to be processed by url-loader during packaging. In other words, include/exclude is not useful for every loader type; it depends on the specific case. A sketch of a typical image rule is shown below.
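For reference, a typical image rule might look like the following sketch, assuming url-loader is installed; the 8KB limit and output name are illustrative:

// webpack.config.js (sketch)
module.exports = {
  module: {
    rules: [
      {
        test: /\.(png|jpe?g|gif)$/,
        use: [
          {
            loader: 'url-loader',
            options: {
              limit: 8 * 1024,                 // inline images smaller than 8KB as data URLs
              name: 'img/[name].[hash:8].[ext]' // larger images are emitted as files
            }
          }
        ]
      }
    ]
  }
};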

Enable caching

Caching transpilation results to the file system can greatly improve the efficiency of babel-loader on subsequent builds.

{
  loader: 'babel-loader',
  options: {
    cacheDirectory: true // enable caching
  }
}

Keep plugins lean and reliable

In the development environment there is no need to care about how fast the code loads for users, and compression only hurts readability and raises development cost, so there is no reason to introduce a code-compression plugin there.
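One common way to express this is to switch compression on only for production builds; a minimal sketch, where the isProd flag and the choice of TerserPlugin are illustrative assumptions:

// webpack.config.js (sketch)
const TerserPlugin = require('terser-webpack-plugin');
const isProd = process.env.NODE_ENV === 'production';

module.exports = {
  mode: isProd ? 'production' : 'development',
  optimization: {
    minimize: isProd,                              // compress only in production
    minimizer: isProd ? [new TerserPlugin()] : []  // no compression plugin in development
  }
};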

Where a plugin is genuinely needed, prefer the plugins recommended on the official Webpack site, because their performance has generally been verified by the Webpack team. Plugins from unverified third-party companies or individuals may solve a particular packaging problem, but their performance is not guaranteed and they can slow down the overall build.

Configure resolve properly

resolve.extensions configuration optimization

When importing JavaScript modules we generally omit the file extension and write something like this:

import Foo from './components/Foo'


In a large project the code is split into many module files to keep it organized and maintainable, and it would be tedious to spell out the file extension on every import, especially since modules may end in .js, .jsx in React, or .ts in TypeScript.

We can declare these extensions in resolve.extensions and let Webpack find and complete them for us at build time. Component paths can also be simplified through resolve.alias, which is covered below. The configuration looks like this:

resolve: {
  extensions: ['.js', '.json'],
}

When an import statement such as require(‘./data’) is encountered, Webpack will first look for the./data.js file, and if the file does not exist, it will look for the./data.json file, and if it still cannot be found, it will report an error.

The longer the list, or the later the correct extension appears in it, the more attempts Webpack makes, so resolve.extensions also affects build performance. Follow these points when configuring it:

1. Keep the list of extensions as short as possible; do not include extensions that cannot occur in the project.
2. Put the most frequently used extension first, so the search can stop as early as possible.
3. When writing import statements in source code, add the extension whenever you can to skip the search entirely; for example, write require('./data') as require('./data.json') when you know it is JSON.

resolve.modules configuration optimization

resolve.modules defaults to ['node_modules'], which means Webpack looks for a module in ./node_modules first; if it is not there, it moves up to ../node_modules, then ../../node_modules, and so on up the directory tree, much like Node.js module resolution.

If all installed third-party modules live in ./node_modules at the project root, there is no need for this layer-by-layer search; specifying the absolute path of that directory reduces the lookup work.

resolve: {
  // ...
  modules: [path.resolve(__dirname, 'node_modules')]
},

resolve.alias configuration optimization

Real projects often depend on some very large third-party modules; take the React library as an example.

The published React package contains two sets of code.

1. The first set is modular code following the CommonJS specification. These files live in the lib directory, and the file named by the main field of package.json is the module entry.
2. The second set packs all React-related code into a single file that can run directly without a module system. dist/react.js is used in development and contains checks and warning code; dist/react.min.js is the minimized build for production.

By default, Webpack starts from the entry file ./node_modules/react/react.js and recursively parses dozens of dependent files, which is time-consuming. By configuring resolve.alias, Webpack can point the React library at the single, self-contained react.min.js file and skip that recursive parsing.

resolve: {
  extensions: ['.js', '.jsx'],
  alias: {
    '@': path.resolve(__dirname, 'src'),
    react: path.resolve(__dirname, './node_modules/react/dist/react.min.js'),
  },
  modules: [path.resolve(__dirname, 'node_modules')]
},

Or switch the alias by environment:

      react: isDev ? path.resolve(__dirname, './node_modules/react/cjs/react.development.js') : path.resolve(__dirname, './node_modules/react/cjs/react.production.min.js'),


Using DllPlugin

Whenever a change requires repackaging, Webpack by default re-analyzes every referenced third-party library and bundles it together with our project code, even though third-party code is generally stable and rarely changes version.

This is where DllPlugin comes in. It is modeled on the idea of Windows dynamic link libraries (DLLs): third-party libraries are packaged separately into their own file, which acts as a pure dependency bundle. That bundle does not take part in the regular repackaging of our project code and is only rebuilt when the dependency versions themselves change.

DllPlugin: packages third-party libraries separately into a dynamic link library file.

DllReferencePlugin: references the dynamic link library produced by DllPlugin in the main Webpack configuration.

Use as follows:

webpack.dll.js

const webpack = require('webpack');
const TerserPlugin = require('terser-webpack-plugin');
const { CleanWebpackPlugin } = require('clean-webpack-plugin');

module.exports = {
  mode: "development",
  entry: {
    vue: ["vue"],
  },
  output: {
    path: __dirname + "/src/dll/",
    filename: '[name].dll.js',
    library: 'dll_[name]'
  },
  optimization: {
    minimize: true, // compress JS
    minimizer: [
      new TerserPlugin({
        parallel: true, // enable multi-process compression (officially recommended)
      })
    ]
  },
  plugins: [
    new CleanWebpackPlugin(), // clean the output path
    new webpack.DllPlugin({
      path: __dirname + '/src/dll/[name].dll.json',
      name: 'dll_[name]',
    })
  ]
}

The main Webpack configuration then references the generated manifest files:

new webpack.DllReferencePlugin({
  manifest: require('./src/dll/jquery.json')
}),
new webpack.DllReferencePlugin({
  manifest: require('./src/dll/loadsh.json')
})

Inject the DLL file into the HTML

// Copy the packaged DLL file into the output and reference it automatically in the HTML
const AddAssetHtmlPlugin = require('add-asset-html-webpack-plugin');

new AddAssetHtmlPlugin([{
  outputPath: "js/",
  filepath: path.resolve(__dirname, 'src/dll/vue.dll.js') // file path
}])

Turn a single process into multiple processes

Webpack runs in a single process: even when multiple tasks exist at the same time, they are executed one by one from a queue, a limitation inherited from Node.js.

HappyPack lets us take advantage of multi-core CPUs by splitting the packaging build into sub-tasks that run concurrently, which can greatly improve packaging efficiency.

// Recommended when there are many files; with only one or two it can actually slow things down.
const HappyPack = require('happypack');
const os = require('os'); // access operating system info
const happyThreadPool = HappyPack.ThreadPool({ size: os.cpus().length }); // create a thread pool

// in module.rules:
{
  test: /\.js$/,
  loader: 'happypack/loader?id=happyBabel', // hand JS files over to HappyPack
  include: [resolve('src')] // resolve() here is a project helper returning an absolute path
},
// ...
// in plugins:
new HappyPack({
  id: 'happyBabel',
  loaders: [
    {
      loader: 'babel-loader?cacheDirectory=true' // in place of babel-loader
    }
  ],
  threadPool: happyThreadPool,
  verbose: true
})

Reducing the size of the packaged output

Remove redundant code

Webpack has supported tree-shaking for ES6 modules since version 2.0. Suppose a component imports two modules, module1 and module2, but only uses module1. Because imports of ES modules can be analyzed statically, tree-shaking removes module2 for us when the component is packaged, as in the sketch below.
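A minimal sketch of the same effect at the export level, assuming mode is set to production; the file and function names are illustrative:

// utils/math.js (sketch)
export function used() {      // imported and called, so it stays in the bundle
  return 'used';
}
export function unused() {    // never imported, removed by tree-shaking
  return 'unused';
}

// component.js (sketch)
import { used } from './utils/math';
console.log(used()); // only used() ends up in the production bundle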

The rest of the redundant code is mostly removed during the compression phase: comments, whitespace, console statements, and so on.

Webpack 4 supports automatic code compression; simply set mode to production to enable it.

Code splitting and on-demand loading

Use react as an example to create a project with scaffolding:

create-react-app test

cd test

npm install

npm run start

Foo.js

// Foo.js
import React from 'react';
import moment from 'moment';
import { TEXT } from '../utils/index';

class Foo extends React.Component {
  constructor(props) {
    super(props);
    this.state = {};
  }

  componentDidMount() {
    this.setState({
      now: moment().format('YYYY-MM-DD hh:mm:ss')
    });
  }

  render() {
    return (
      <div>
        Foo { this.state.now }
        { TEXT }
      </div>
    );
  }
}

export default Foo;

Bar.js

// Bar.js
import React from 'react';
import moment from 'moment';
import { TEXT } from '../utils/index';

class Bar extends React.Component {
  constructor(props) {
    super(props);
    this.state = {};
  }

  componentDidMount() {
    this.setState({
      now: moment().format('YYYY-MM-DD hh:mm:ss')
    });
  }

  render() {
    const { now } = this.state;
    return (
      <div>
        Bar { now }
        { TEXT }
      </div>
    );
  }
}

export default Bar;

App.js

In App.js the Foo and Bar components are loaded asynchronously; moment is the node_modules library shared by both components, and TEXT is the shared text constant from the utils file.
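The App.js source is not reproduced in this excerpt; a minimal sketch of what it might look like, assuming React.lazy plus two buttons that trigger the loading (component paths and names are illustrative):

// App.js (sketch) -- a possible shape of the demo, not the original source
import React, { Suspense, lazy, useState } from 'react';

// each lazy() import becomes a separate chunk, fetched only on demand
const Foo = lazy(() => import('./components/Foo'));
const Bar = lazy(() => import('./components/Bar'));

function App() {
  const [showFoo, setShowFoo] = useState(false);
  const [showBar, setShowBar] = useState(false);

  return (
    <div>
      <button onClick={() => setShowFoo(true)}>load Foo</button>
      <button onClick={() => setShowBar(true)}>load Bar</button>
      <Suspense fallback={<div>loading...</div>}>
        {showFoo && <Foo />}
        {showBar && <Bar />}
      </Suspense>
    </div>
  );
}

export default App;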

Run npm run start to start the project.

Click load Foo:

As you can see, the code for the Foo component is only loaded after the button is clicked. This is on-demand loading, and combined with routing it can also optimize first-screen rendering.

You can also see that 0.chunk.js contains the third-party code the two components share. When load Bar is clicked, that chunk does not need to be downloaded again; only 1.chunk.js, which contains the Bar component itself, is loaded.
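As a side note, the numeric chunk names (0.chunk.js, 1.chunk.js) can be replaced with readable ones through webpack's magic comment; a small sketch, assuming the lazy imports from the App.js sketch above and a chunkFilename that contains [name]:

// naming the generated chunks with webpack's magic comment (sketch)
const Foo = lazy(() => import(/* webpackChunkName: "foo" */ './components/Foo'));
const Bar = lazy(() => import(/* webpackChunkName: "bar" */ './components/Bar'));
// the chunks are then emitted as foo.chunk.js and bar.chunk.js instead of 0.chunk.js and 1.chunk.js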

Visual analysis

1 Official Version

(1) Mac: webpack --profile --json > stats.json
(2) Windows: webpack --profile --json | Out-File 'stats.json' -Encoding OEM

Then upload the output JSON file to the following website for analysis

webpack.github.io/analyse/

Select the stats.json file to generate a visual report, then open the Modules page to analyze the packaging time and size of each file.

2 Community Version

npm i webpack-bundle-analyzer --save

const wba = require('webpack-bundle-analyzer').BundleAnalyzerPlugin;

plugins: [
    new wba()
]

You can see the size and proportion of each module more intuitively, and then optimize accordingly.

Hopefully, the content of this article will help you optimize your project build.