On June 24, 2017, Zhou Mingli, a senior front-end engineer at Tencent, gave a talk on QQ Wallet H5 application development practice at TFC 2017, the Tencent Web Front-end Conference. IT Dakashuo, as the exclusive video partner, released this article with authorization from the organizers and the speaker.

Word count: 3,071 | Reading time: 5 minutes

Video of the talk:
suo.im/5bap99

Abstract

In the era of the mobile Internet, improving web performance is a goal of every front-end team. As front-end engineers on the QQ Wallet team, how did we make H5 pages open in under a second by building a Node.js service and using Service Worker? Let's walk through the development practice of QQ Wallet's H5 applications.

QQ Wallet's many H5 applications

In 2015 we officially established the wallet team. At the beginning, QQ Wallet had only the wallet entrance itself; since then it has grown a series of services such as phone top-up, cards and coupons, loyalty points, Penguin Internet Cafe, city services, and smart campus.

QQ Wallet H5 application development challenge

Heavy load on the access-layer servers

QQ Wallet's H5 applications average more than 10 million PV per day, and during promotions PV can reach hundreds of millions, so server performance optimization had to be addressed.

The mobile network environment is complex

To improve the user experience, the page must be delivered as quickly as possible. On Android the target is generally under 3 seconds; on iOS, under 2 seconds. Some business scenarios require the page to open in under a second.

Complex interaction scenarios

Mobile devices, especially on the Android platform, easily hit web performance bottlenecks: long-list rendering, image memory footprint, and CSS3 animation performance all need to be solved.

A cache management scheme based on Service Worker

Browser cache

Take the cards-and-coupons page as an example: the page sends 35 requests in total, 13 of which are data reporting and the remaining 22 of which are valid requests. Of those valid requests, 9 are JS, 8 are images, and the rest are other resources.

Of the 22 valid requests, 17 (9 JS plus 8 images) are for static resources, roughly 77% of the page. Static resources rarely change, and we don't want the browser to download the same resource repeatedly, so we first optimized the page using the browser cache.

We control the cache policy by configuring HTTP response headers, and update resources by bumping a version number in the URL.
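As a minimal sketch of this pattern (the URL scheme and header values here are assumptions, not QQ Wallet's actual configuration), versioned static assets can be cached for a long time while HTML entry pages always revalidate:

```javascript
// Sketch: long-lived caching for versioned static resources.
// Resource URLs carry a version (e.g. /static/app.v42.js), so a file can
// be cached "forever" and an update ships under a new URL.
function cacheHeadersFor(url) {
  // Versioned static assets: cache for a year, never revalidate.
  if (/\.v\d+\.(js|css|png)$/.test(url)) {
    return { 'Cache-Control': 'public, max-age=31536000, immutable' };
  }
  // HTML entry pages: always revalidate so users pick up new versions.
  return { 'Cache-Control': 'no-cache' };
}
```

Bumping the version in the filename changes the URL, so the browser treats the updated file as a brand-new resource.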

But in my opinion this approach has three problems.

Shortcomings of the browser caching mechanism

Updates are unreliable. CDN updates are not real-time, so much of the time our resources are already published while the CDN still serves the old version.

Poor offline experience. Browser caching does little for offline use: even with resources cached locally, opening the page after losing the network still shows a "not connected to the Internet" error.

Not customizable. Browser caching today is driven almost entirely by configuration; if you need a custom policy such as incremental updates, it simply cannot be done.

Service Worker

Service Worker is a proxy layer between the web application and the server, provided by browsers to address AppCache's earlier inability to manage offline caches properly.

In general, a Service Worker is a script that runs in the background of the browser, intercepting and responding to the web application's requests on its behalf, so as to deliver a better offline experience.

Typical uses include: performance enhancements, such as prefetching and caching resources the user is likely to need (for example, a page's static resource files); background data synchronization; responding to resource requests from other origins; centrally receiving computationally expensive data updates; background service hooks; custom templates for specific URL patterns; and client-side module compilation and dependency management.

Wait state

A Service Worker that has finished installing does not become active immediately. If an old version of the Service Worker is still running in other open pages, the new one waits until those pages close. This is mainly to prevent resources the old Service Worker is using from being released unexpectedly.

Once all other related pages are closed, the old resource files are no longer needed, and we can proceed to the cleanup step.

The Activate event

The Fetch event
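The lifecycle described above can be sketched in a toy worker script (the cache name, precache list, and cache-first fetch strategy are illustrative assumptions, not QQ Wallet's generated code):

```javascript
// Sketch of a Service Worker lifecycle: install precaches resources,
// activate (fired once no page uses the old worker) cleans up old caches,
// and fetch answers requests from the cache first.
const CACHE_NAME = 'moggy-v2';
const PRECACHE = ['/index.html', '/static/app.js'];

// Pure helper: caches left over from previous versions should be deleted.
function isStaleCache(name) {
  return name.startsWith('moggy-') && name !== CACHE_NAME;
}

// Browser-only section, guarded so the file also loads cleanly in Node.
if (typeof self !== 'undefined' && typeof self.addEventListener === 'function') {
  self.addEventListener('install', (event) => {
    // Pre-cache the configured resource list.
    event.waitUntil(
      caches.open(CACHE_NAME).then((cache) => cache.addAll(PRECACHE))
    );
  });

  self.addEventListener('activate', (event) => {
    // Safe to clean up now: no page is using the old worker's resources.
    event.waitUntil(
      caches.keys().then((names) =>
        Promise.all(names.filter(isStaleCache).map((n) => caches.delete(n)))
      )
    );
  });

  self.addEventListener('fetch', (event) => {
    // Cache-first: answer from the cache, fall back to the network.
    event.respondWith(
      caches.match(event.request).then((hit) => hit || fetch(event.request))
    );
  });
}
```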

MoggyCache: managing offline packages

The QQ Wallet team built MoggyCache, an offline-package management system. Through it we configure which static resources the current project needs, enable or disable the offline package for grayscale (canary) users, and configure target platforms; the configuration is stored in an internal DB.

How MoggyCache works

Our Node.js service reads the configuration above and dynamically generates two scripts: an install script and a worker script.

The install script mainly reads the offline package's switch and the current grayscale-user policy to decide whether the current user should install the offline package.

Once it determines that the user needs the offline package, it registers the worker script as the current page's Service Worker. The install script then sends the configured resource list to the worker script, which downloads those resources and caches them locally.
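A minimal sketch of such an install script might look like the following; the config shape, grayscale policy, message format, and `/worker.js` path are all hypothetical stand-ins for what MoggyCache generates:

```javascript
// Sketch of a generated install script: decide from the offline-package
// switch and grayscale policy whether this user gets the package, then
// register the worker and hand it the resource list.
function shouldInstall(config, userId) {
  if (!config.enabled) return false;
  // Toy grayscale policy: roll out to a percentage of users by id.
  return (userId % 100) < config.grayPercent;
}

// Browser-only section, guarded so the file also loads cleanly in Node.
if (typeof navigator !== 'undefined' && 'serviceWorker' in navigator) {
  const config = { enabled: true, grayPercent: 20, resources: ['/static/app.js'] };
  const userId = 12345; // in reality derived from the logged-in account

  if (shouldInstall(config, userId)) {
    navigator.serviceWorker.register('/worker.js').then((reg) => {
      // Hand the configured resource list to the worker for caching.
      if (reg.active) {
        reg.active.postMessage({ cmd: 'cacheResources', resources: config.resources });
      }
    });
  }
}
```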

New MoggyCache features

The process above only simplified some of our work; it did not solve the problems, so we added a few more features on top of it.

Automatic synchronization

Each time we configure a project, we compute the MD5 of every resource and store it in the DB. Our Node.js service then reads the configuration and delivers the MD5 values to the Service Worker.

Now, whenever we update resources and publish them, a post-script runs on the publishing system. Once the resources are live, this post-script triggers the offline-package system to recompute each resource's MD5 and push the new values to the Service Worker.

The Service Worker then holds two sets of MD5s, one for the old version and one for the current version. By comparing them it knows which resources are stale, and for each stale resource it goes back to the server and pulls the latest copy. The whole process is automatic, with no human intervention, which solves the update-reliability problem.
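The stale-resource check amounts to a diff over two manifests; here is a minimal sketch (the manifest shape, URL-to-MD5 maps, is an assumption for illustration):

```javascript
// Sketch: given the old and new MD5 manifests, find the resources the
// Service Worker must re-fetch. New URLs count as stale too, since they
// are absent from the old manifest.
function findStaleResources(oldManifest, newManifest) {
  return Object.keys(newManifest).filter(
    (url) => oldManifest[url] !== newManifest[url]
  );
}
```

For example, if only `/b.js` changed and `/c.js` is new, the diff returns exactly those two URLs and the unchanged `/a.js` is left in the cache untouched.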

Incremental updating

Whenever a new resource is published, a post-script notifies the MoggyCache system. It reads the new resource, computes a delta package against the previous version, and stores the delta in the service.

Because the MD5 has changed, the worker script re-requests the resource from our service. If the service finds an available delta package for that resource, it returns the delta directly to the Service Worker; the service can apply different policies by inspecting the request headers.
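The server-side decision can be sketched as follows; the idea of keying deltas by the client's current MD5 is an illustrative assumption, not MoggyCache's actual protocol:

```javascript
// Sketch: the service returns a delta package when one exists for the
// client's current version (carried in a request header), otherwise the
// full resource. deltaStore maps a client's old MD5 to the delta that
// upgrades it to the latest version.
function pickResponse(clientMd5, deltaStore, fullBody) {
  if (clientMd5 && deltaStore[clientMd5]) {
    return { type: 'delta', body: deltaStore[clientMd5] };
  }
  return { type: 'full', body: fullBody };
}
```

A client too many versions behind (no stored delta for its MD5) simply falls back to downloading the full resource.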

Access layer service architecture

In QQ Wallet's early days we used a PHP + Apache access-layer architecture. The PHP version was very old at the time; we needed 20 servers to handle all requests, and a single server's QPS was only 200.

These numbers show the performance was not good enough. In addition, the PHP stack relied on private modules written inside Tencent, with inconsistent versions between modules, leading to a series of problems: high deployment cost, difficult scaling, missing Apache logs, and a lack of monitoring for the web services.

After a while, we settled on Node.js.

Why we chose Node.js

Excellent performance: Node.js uses asynchronous I/O based on libuv, a significant performance improvement over Apache's multi-process, synchronous blocking model.

Front-end friendly: front-end engineers already know JavaScript, so the transition to Node.js is smooth, and training new team members is easier.

Mature community: a mature community and solid technical documentation, with a large number of mature modules ready for business use.

Why we built our own framework

Application scenario: the main use case for our MoggyServer service is direct server-side page output ("page straight-out").

More control: a self-developed framework gives us more control and more extensibility, both of which we needed to consider.

Stable and fast: most community frameworks bundle extra features such as static file serving and JSON data processing that we do not need; a leaner framework is both more stable and faster.

Smooth Service Restart


We built the logic for a smooth server restart into the framework. A post-script on the publishing system notifies the Node.js master process that new files have been published; the master forwards the message to the old child processes, which then stop accepting new requests.

Because the new code is already live on disk, the master can fork a new child process running the new code. The old child process exits on its own once it finishes serving its in-flight requests, so the whole restart is seamless.

Template engine optimization

We rendered the loyalty-points business page 10,000 times and measured how long it took.

The main change is that we simplified EJS's template-nesting syntax, which yields the performance improvement.

Rendering with traditional EJS took 7,500 ms, while rendering with our TPL took 920 ms. That is the template-engine optimization we did.
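A toy illustration of why this kind of optimization pays off (this is not QQ Wallet's actual TPL engine): compiling the template into a JavaScript function once, then reusing it, avoids re-parsing the template text on every render.

```javascript
// Sketch: compile "Hello {{name}}" into a reusable render function.
// The {{placeholder}} syntax here is a hypothetical simplification.
function compile(template) {
  const body = 'return `' +
    template.replace(/`/g, '\\`').replace(/\{\{(\w+)\}\}/g, '${data.$1}') +
    '`;';
  return new Function('data', body);
}

const render = compile('Hello {{name}}, you have {{points}} points');
// Rendering 10,000 times now reuses the compiled function instead of
// re-parsing the template on every call.
```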

MoggyServer Online data

QQ Wallet now runs on only 7 servers while handling tens of millions of requests, and single-server QPS rose from 200 to 900. At the start of 2017 our total request volume peaked at 169 million, which took 57 machines to support, each using only 30% of its CPU.

Straight-out page loading

Traditional page loading scheme: after the user taps the entrance, the native app launches a WebView, waits for the WebView to finish initializing, sends an HTTP request to the Node service to fetch page data, and finally renders the page.

The Sonic optimization scheme

Serial to parallel

Compared with the traditional loading scheme, Sonic instantiates the WebView in native code while simultaneously sending the request to the Sonic server, turning the previous serial steps into parallel ones. The total time thus changes from sum(WebView, Request) to max(WebView, Request), reducing client-side latency.
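The serial-to-parallel idea can be expressed with `Promise.all`; `initWebview` and `fetchPage` below are hypothetical stand-ins for the native WebView startup and the HTTP request to the Sonic server:

```javascript
// Sketch: start the WebView and the page request at the same time.
// The total wait becomes max(webview, request) instead of their sum.
async function loadPage(initWebview, fetchPage) {
  const [webview, html] = await Promise.all([initWebview(), fetchPage()]);
  return webview.render(html);
}
```

With a 300 ms WebView startup and a 400 ms request, the serial scheme waits about 700 ms while the parallel one waits about 400 ms.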

Page caching

Sonic supports local caching at the H5 page level: it splits the returned page into a data layer and a template layer for caching, generates local cache files for persistent storage, and, if the page is unchanged, displays the cached content directly.

Incremental updating

When the page has been updated, Sonic compares the new page against the client's cached page, computes what changed, and returns the changes as a JSON structure; the client applies them to update both the page and the cache. This greatly reduces the response size, which saves significant traffic for users, especially on mobile networks.
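The data-layer diff can be sketched as follows; the flat key/value data shape is an assumption for illustration, not Sonic's actual wire format:

```javascript
// Sketch: diff the cached data layer against the latest one, sending
// back only the changed fields as a JSON patch.
function diffData(cached, latest) {
  const changes = {};
  for (const key of Object.keys(latest)) {
    if (JSON.stringify(cached[key]) !== JSON.stringify(latest[key])) {
      changes[key] = latest[key];
    }
  }
  return changes;
}

// The client merges the patch into its cached data layer.
function applyChanges(cached, changes) {
  return Object.assign({}, cached, changes);
}
```

Since the template layer rarely changes, most updates reduce to a small JSON patch over the data layer rather than a full page download.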

That’s all for my sharing, thank you!