Preface

Why optimize page load speed? When people build products, they talk about user experience but often forget that load speed is part of that experience. According to Amazon’s internal statistics, every additional 100ms of page load time costs 1% of revenue. The slower your website is, the less willing customers are to pay for your stuff. Page load speed also affects SEO.

Therefore, as a professional front-end engineer, you need to build not only a user-friendly interface but also a site that loads fast.

This article shares tips for optimizing page load speed through a real case study. The subject is Omlet Arcade, a mobile-game live-streaming community I helped build. The framework is React.js and the bundler is Webpack, but the concepts in this article apply to any framework or bundler.

Results: a 43% reduction in JavaScript bundle size and a 30% reduction in load time.

Curious how? Read on with me!

The first step to optimizing load speed: use web speed tools

What do you do when users complain that pages load slowly? If you don’t know what the actual problem is, any performance optimization is likely to be futile.

So, the first step is to make good use of diagnostic tools and identify problems.

There are many web speed diagnostic tools available. I recommend starting with Google’s PageSpeed Insights, the most popular and well-known one.

Diagnostic tool: Google PageSpeed Insights

PageSpeed Insights is a speed-measurement tool: Google’s bots visit your site, simulate how fast it loads, and analyze how its performance can be improved.

For example, when we fed it our own website, PageSpeed Insights spat out a list of suggestions.

As you can see, the first item on the list suggests removing unused JavaScript, saving 3.6 seconds of load time. Following its advice is a good way to figure out where to start.

As for the score itself, I think it is a useful reference. The mobile score in particular is graded very strictly; after all, a phone’s network constraints are much tighter than a desktop computer’s.

How to prioritize web speed optimization?

After reading the PageSpeed Insights report, you may still not know where to start!

I suggest starting from two directions: “the items that will improve the most once fixed” or “the items within your own scope that you can actually change.”

The items that improve the most once fixed show obvious results, which makes them the first choice for performance optimization.

But sometimes things don’t go the way you want: the part you want to change may require AWS permissions, or it may fall within another engineer’s scope and they have no bandwidth to help. The next best thing is to look at the areas you can change yourself. If you make some visible progress, your supervisor will have more confidence in letting you make more complex changes later.

For example, PageSpeed suggested two optimization directions for our product:

1. Image sizes are not optimized

2. The JavaScript bundle is too large.

How do you prioritize? Here’s my thinking:

Image sizes involve additional processing at upload time on every client (Android / iOS / Web) as well as on the server side, making it difficult to convince everyone to go along with the change at first. On the other hand, if the JavaScript bundle is too large, the front-end engineer has the autonomy to change it without anyone else’s cooperation.

So here are my priorities: first, put the JavaScript bundle on a diet; optimize image sizes only if there is extra time and support from other engineers.

What happens if my JavaScript bundle is fat?

Most modern front-end projects follow a modular architecture and use a bundler to combine JavaScript code into a single bundle. However, as the project grows, business logic and third-party packages inevitably accumulate, leading to unintentionally bulky JavaScript bundles that drag down web performance.

Fat JavaScript bundles are one of the main culprits of slow web page loading.

Why does the size of a JavaScript Bundle affect loading speed?

Let me explain how the browser works: it parses the HTML to build the DOM tree, and the first screen is painted once the DOM tree is complete. The sooner the first paint happens, the faster the page feels to the user.

However, when the browser encounters JavaScript while building the DOM tree, it must stop, download it, and execute it. If the JavaScript bundle is very large, the download takes a long time and delays the first paint.

Code Splitting: the saviour of bloated JS bundles

Webpack comes with the powerful Code Splitting feature, which allows you to split a single file into chunks.

These chunks can be loaded in parallel, loaded dynamically on demand, or cached individually, which speeds up the browser’s downloads.

Next, I introduce the first trick of Code Splitting: splitting vendor bundles.

Break up the vendor bundle

In this section I will describe what a Vendor bundle is, what its benefits are, and how I split the vendor bundle for my project.

“What is a vendor bundle?” For efficiency, projects packaged with Webpack typically break the JavaScript output into three parts:

- Application bundle: the product’s UI and business logic.
- Vendor bundle: third-party packages the product relies on, such as react.js or various npm packages.
- Webpack runtime and manifest: responsible for wiring all the modules together; generally negligible in size.

Using the code splitting technique to move third-party packages into a separate bundle is what produces the vendor bundle.

So what’s the advantage of unpacking Vendor Bundle?

The answer is “easier caching”: third-party packages don’t change often. If a user is not visiting our site for the first time, the vendor bundle is very likely already in the browser cache, and only the application bundle containing the changed business logic needs to be downloaded.

Vendor bundle planning: a real example

So how do we plan for our product bundle?

In addition to the classic Webpack setup, we have an extra first-party library, so the output is planned into four pieces altogether:

- manifest.js: Webpack runtime and manifest
- arcade.js: business logic
- vendor.js: third-party packages
- omlib.js: first-party (internal) library that defines the spec for talking to the server API

Let’s examine the sizes before we start tuning!

manifest.js is small enough to ignore, so the pre-optimization bundle sizes are as follows:

- arcade.js: 585KB, gzipped
- vendor.js: 366KB, gzipped
- omlib.js: 205KB, gzipped

Total: 1156KB.

The mobile version:

- arcade.js: 426KB, gzipped
- vendor.js: 366KB, gzipped
- omlib.js: 205KB, gzipped

Total: 997KB.

Then we need to check if this configuration is reasonable and if there is room for improvement.

Use webpack-bundle-analyzer for analysis

Next up is webpack-bundle-analyzer, a Webpack plugin that lays out the bundle contents by file size in a visual treemap for us to analyze.

Here’s the bundle before optimization:

Notice that the application bundle contains a large chunk of node_modules, i.e. third-party packages.

Looking at earlier efforts, we can see that several large third-party packages had already been split into the vendor bundle.

That’s not wrong, but I think it’s more profitable to put the entire node_modules into the vendor bundle. As mentioned earlier, vendor bundles don’t change often, so packing all third-party packages into the vendor bundle makes caching more effective.

So modify webpack.config.js as follows:

```javascript
// webpack.config.js
optimization: {
  splitChunks: {
    cacheGroups: {
      vendor: {
        test: /[\\/]node_modules[\\/]/,
        name: 'vendor',
        chunks: 'all', // split from initial (non-async) chunks as well
      },
      omlib: {
        test: /[\\/]libs[\\/]/,
        name: 'omlib',
        chunks: 'all',
      },
    },
  },
},
```

The bundle size is as follows:

- arcade.js: 310KB, gzipped
- vendor.js: 632KB, gzipped
- omlib.js: 205KB, gzipped

Overall: 1156KB, unchanged.

Application bundle: 585KB -> 310KB

The mobile version:

- arcade.js: 218KB, gzipped
- vendor.js: 632KB, gzipped
- omlib.js: 205KB, gzipped

Overall: 997KB -> 1055KB (the vendor bundle now also contains desktop-only libraries, which made it bulkier; this is fixed later in the article)

Application bundle: 426KB -> 218KB

You can see that the Application Bundle is much smaller after the change!

To summarize the effect of this change: the application bundle is smaller, so returning users with a warm cache enter the site faster.

Looking at this, you might wonder: the overall download size is unchanged, so won’t first-time users still be just as slow?

The dynamic import technique, which I’ll mention next, can remedy this problem.

Webpack’s Code Splitting supports Dynamic imports and allows you to dynamically download modules that you need.

This section shows you how to do dynamic import based on the path so that the page code is not downloaded until you enter the page.

What is Dynamic Import? How to use it?

Dynamic import means that the code is not packaged into the original bundle and is downloaded over the Internet only when the code is actually used.

Using dynamic import is simple: just use the import() syntax in your code.

For example, here we load the third-party package Lodash with the import() syntax:

```javascript
function getComponent() {
  return import(/* webpackChunkName: "lodash" */ 'lodash')
    .then(({ default: _ }) => {
      const element = document.createElement('div');
      element.innerHTML = _.join(['Hello', 'webpack'], ' ');
      return element;
    })
    .catch((error) => 'An error occurred while loading the component');
}

getComponent().then((component) => {
  document.body.appendChild(component);
});
```

When Webpack sees the import() syntax, it packages Lodash into a separate chunk, which is not downloaded until getComponent() is called.

  

```
                   Asset      Size       Chunks                     Chunk Names
         index.bundle.js  7.88 KiB       index           [emitted]  index
vendors~lodash.bundle.js   547 KiB       vendors~lodash  [emitted]  vendors~lodash
Entrypoint index = index.bundle.js
```

Dynamic imports by path in practice

Next, I’ll show you how to do dynamic import by path so that you don’t download the code until you enter a page.

Why try to do dynamic import for all paths?

The reason: according to GA (Google Analytics) data, users tend to stay on a few popular pages and switch pages rarely, so downloading only the code each page needs is more efficient.

The implementation is as simple as modifying routing as follows:

```javascript
<Route path="/" component={AppRoot}>
  <IndexRoute
    getComponent={() => {
      return import('containers/HomeContainer').then(({ default: HomeContainer }) => HomeContainer);
    }}
  />
  <Route
    path="/games"
    getComponent={() => {
      return import('containers/GamesContainer').then(({ default: GamesContainer }) => GamesContainer);
    }}
  />
  {/* Other routes */}
</Route>
```

After code splitting, single-page bundles were all under 20KB.

- arcade.js: 215KB, gzipped
- vendor.js: 630KB, gzipped
- omlib.js: 205KB, gzipped

Application bundle: 310 -> 215 + 20 = 235KB

The mobile version:

- arcade.js: 169KB, gzipped
- vendor.js: 630KB, gzipped
- omlib.js: 205KB, gzipped

Application bundle: 218 -> 169 + 20 = 189KB

Dynamic import by path reduces application bundle downloads by approximately 25% (desktop) / 14% (mobile) and overall downloads by 6% (desktop) / 2% (mobile). To summarize the effect of this change: it reduces downloads for returning users with a warm cache, but not for new users. The reason is that pages share more code than expected (business logic, components, etc.), so the bundle cannot be split cleanly page by page.

Reading this, you might be thinking: what a letdown! Before you leave, let me show you another dynamic import trick that works even better than this change.

Dynamic imports for large third-party packages

If you look closely at the vendor bundle contents, you may notice that some third-party packages are very bulky, so we’ll deal with these bulky packages and load them only when needed.


For example, for an infrequently used component, we want JSZip to load only when the component is actually used:

```javascript
class DropZone extends Component {
  componentDidMount() {
    this.importJSZip = import('jszip/dist/jszip.min.js').then(({ default: JSZip }) => JSZip);
  }

  onDrop() {
    this.importJSZip.then((JSZip) => {
      const newZip = new JSZip();
      // Start compressing
    });
  }
}
```

We then manually exclude large third-party packages from the Vendor bundle:

```javascript
// webpack.config.js
splitChunks: {
  cacheGroups: {
    vendor: {
      test: /[\\/]node_modules[\\/](?!jszip)/, // exclude modules that should be loaded dynamically
      name: 'vendor',
    },
  },
},
```

Here are the third-party packages we now load dynamically:

- hls.js
- moment.js
- JSZip

In addition, here are some third-party packages we found could be removed outright:

- request: 70KB (in the past the project supported both Node and browser environments, so it used request; it now runs only in the browser, so it was replaced with the native fetch API)

After slimming down:

- arcade.js: 215KB, gzipped
- vendor.js: 267KB, gzipped
- omlib.js: 205KB, gzipped

The mobile version:

- arcade.js: 169KB, gzipped
- vendor.js: 267KB, gzipped
- omlib.js: 205KB, gzipped

Vendor bundle: 630KB -> 267KB

To summarize the effect of this change, taking the most visited home page as an example: downloads dropped from 1156KB to 205 (omlib) + 267 (vendor) + 215 (app) + 20 (home) + 77 (hls.js) = 784KB, a 32% reduction. The mobile version also dropped from 997KB to 205 + 267 + 215 + 20 = 707KB, a 29% reduction, which helps web performance considerably.
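For reference, the request-to-fetch swap mentioned above is small. A hedged sketch of what it looks like (getJson is a helper name I made up for illustration):

```javascript
// Before (request, ~70KB of bundle weight):
//   const request = require('request');
//   request({ url, json: true }, (err, res, body) => { /* use body */ });

// After: the browser-native fetch API adds nothing to the bundle.
function getJson(url) {
  return fetch(url).then((res) => {
    if (!res.ok) throw new Error('HTTP ' + res.status);
    return res.json();
  });
}
```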

You should now be eager to see whether your own third-party packages have room for optimization.

The next technique is also very effective, so keep reading!

Remove unused code using Tree Shaking

Tree shaking removes unused code from the JavaScript bundle. It is possible thanks to the static structure of the ES2015 module syntax, import and export.

To use tree shaking with Webpack:

1. Rewrite CommonJS require and module.exports syntax to import and export.
2. Declare side-effect modules (usually CSS files) in package.json:

```json
"sideEffects": [
  "*.css",
  "*.scss"
]
```

A side effect means that importing a module performs some extra action that affects the environment, and such modules must not be removed by tree shaking. For example, the import 'xxx.css'; syntax uses JavaScript to inject styles.

This technique is easy to describe but can be very hard to implement in practice: our project has a long history and uses a lot of CommonJS syntax, and rewriting these artifacts while guaranteeing identical behavior is quite difficult (sigh).
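To make the rewrite concrete, here is a minimal sketch of what one such conversion looks like (the module and function names are invented for illustration, not from our codebase):

```javascript
// Before: CommonJS. Webpack cannot statically tell which of these
// exports are actually used, so none of them can be shaken out.
//
//   const utils = require('./timeUtils');
//   module.exports = { formatTime: utils.formatTime, parseTime: utils.parseTime };

// After: ES modules. import/export are static, so if no module imports
// parseTime, Webpack (in production mode) can drop it from the bundle.
export function formatTime(ms) {
  return (ms / 1000).toFixed(1) + 's';
}

export function parseTime(str) {
  return parseFloat(str) * 1000;
}
```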

Using the Tree Shaking technique, our internal library was greatly reduced in size:

omlib.js: 205KB -> 86.65KB

To summarize the effect of this change: although the rewriting process was painful, the final result is quite good, cutting the total size by about 10%. If your project has a lot of CommonJS syntax, give tree shaking a try!

Finally, here are a few more tips that can help improve performance!

Use a CDN for libraries like jQuery, React, etc.

The advantage of using a CDN is saving traffic and improving first-visit performance, since the user may already have the CDN copy cached from browsing another website.

Serving react-dom, which everyone uses, from a CDN saved us 36KB, for example.

Replace preset-es2015 with @babel/preset-env

Babel loader used preset-es2015 to convert ES2015 syntax into syntax older browsers support; @babel/preset-env is an enhanced version of preset-es2015, intended to replace it.

The advantage of preset-env is that it is smart about packaging: it includes only the plugins and polyfills needed for the browsers you want to support. In other words, if you don’t need to support rarely used older browsers, the resulting bundle will theoretically be much smaller.

For details, see this article: [teaching] @babel/preset-env

Our site used to rely on about 31KB of polyfill.js; after switching to preset-env, the polyfills take only 18KB.

Conclusion

1156KB -> 87 (omlib) + 267 (vendor) + 36 (React) + 202 (app) + 77 (hls.js) + 20 (home) = 689KB (-41%). Note that this is the worst case; all of it can be cached.

The mobile version:

- arcade.js: 151KB, gzipped
- vendor.js: 267KB, gzipped
- omlib.js: 87KB, gzipped

997KB -> 87 + 267 + 36 (React) + 151 + 20 = 561KB (-44%)

Cumulative downloads decreased by 41%