Here are a few original articles on the React stack:
- Analyze the Twitter front-end architecture to learn about complex scenario data design
- React + ES next = ♥
- React + Redux builds “NEWS EARLY”, a single-page app that gets to the essence of the cutting-edge stack
- React + Redux project example
- .
Today we go further and analyze a practical case: the mobile web version of the Uber app.
If you’re not interested in the React stack, or don’t know much about it, that’s fine: what this article is really about is performance optimization.
The inspiration and main content of this article are translated from Narendra N Shetty’s article “How I Built a Super Fast Uber Clone for the Mobile Web”, with substantial additions and deeper digging.
Starting point and product prototype
For a long time now, I think everyone has agreed on one point: it is an indisputable fact that mobile traffic has surpassed PC traffic. Developing for the mobile web is also interesting and challenging for front-end developers.
As it happens, Uber has just released the latest version of its app, with a fresh style and a great experience. So the author decided to use React to build a new “Uber” of his own from scratch.
During development, I spent a lot of time on basic components and styling. Uber’s officially open-sourced React map library was the main tool here, with SVG overlays and HTML overlays used to draw the route between the “starting point” and the “destination” on the map.
The final basic interaction can be seen in the Gif below:
Setting out on the road to optimization
Now, we have the basic product form. The problem is to improve all aspects of the product performance experience. I used Chrome Lighthouse to check the performance of the product. The final result is:
wow… The First Meaningful Paint is close to 20 seconds, and the times after that are no better. Imagine a user pulling out their phone to hail a ride: waiting more than 19,189.9 ms for the main screen to paint is simply unbearable.
Next, without further ado, let’s roll up our sleeves and figure out how to optimize.
Optimization Method 1 — Code Splitting
The first thing I thought of and used was Code Splitting, which we can do with Webpack. What is Webpack code splitting? Here is a short description:
Code splitting means splitting a file into chunks, and Webpack allows us to define split points based on which files can be split and loaded on demand.
Because I use the React stack with React Router, the code can be split along route boundaries and loaded on demand. This can be done with React Router’s getComponent API:
```js
<Route
  path="home"
  name="home"
  getComponent={(nextState, cb) => {
    require.ensure([], (require) => {
      cb(null, require('../components/Home').default);
    }, 'HomeView');
  }}
/>
```
The component is loaded and rendered only when the route is requested.
At the same time, I used Webpack’s CommonsChunkPlugin to extract third-party code. What is the rationale for this?
Careful readers may have noticed a problem with the code splitting above: resources imported on demand (per route) can contain a lot of duplicated code, especially the third-party libraries we use. This is exactly what CommonsChunkPlugin is for. The plugin can be configured in several ways; here we use selective extraction (passing an object):
```js
{
  'entry': {
    'app': './src/index.js',
    'vendor': ['react', 'react-redux', 'redux', 'react-router', 'redux-thunk']
  },
  'output': {
    'path': path.resolve(__dirname, './dist'),
    'publicPath': '/',
    'filename': 'static/js/[name].[hash].js',
    'chunkFilename': 'static/js/[name].[hash].js'
  },
  'plugins': [
    new webpack.optimize.CommonsChunkPlugin({
      name: 'vendor',                        // name of the common chunk
      minChunks: Infinity,                   // Infinity creates the vendor chunk but moves no further modules into it
      filename: 'static/js/[name].[hash].js' // file name of the common chunk
    })
  ]
}
```
In this way, we extract the common code (react, react-redux, redux, react-router, redux-thunk) into a dedicated vendor chunk.
Through the above methods, the author was delighted to find that the First Meaningful Paint time was shortened from 19189.9ms to 4584.3ms:
This is definitely exciting.
Optimization Method 2 — Server-Side Rendering
You’ve probably been hearing the terms “server-side rendering” or “server-side direct out” for a while, but never practiced them or understood their meaning. Let me describe what server-side rendering is.
Server-side rendering, simply put, means that on the browser’s first request, the server returns a “preliminary but complete” HTML document, with the data already stitched in. This lets users see the first screen as soon as possible, even though this version is stripped down and not final.
This approach contrasts with the traditional “front-end/back-end separation” model. In that model, the server returns an HTML document, the browser parses the document’s tags and pulls the CSS, then pulls the JS files; after the JS is loaded, the JS executes and sends a request to fetch the data; finally, the data is rendered onto the page.
Server-side rendering instead moves the data-fetching step to the server, and even the merging of data with HTML can be done there.
The main goal is to speed up first-screen rendering. Of course, server-side rendering also solves the SEO problems that pure front-end rendering can’t overcome.
The theory is simple enough; the difficulty lies in how front-end scripts behave in the server environment, and in keeping the server and client output aligned.
In this project, I used Express as the Node.js framework, combined with react-router:
```js
import fs from 'fs';
import { renderToString } from 'react-dom/server';
import { match, RouterContext } from 'react-router';
import { Provider } from 'react-redux';

server.use((req, res) => {
  match({
    'routes': routes,
    'location': req.url
  }, (error, redirectLocation, renderProps) => {
    if (error) {
      res.status(500).send(error.message);
    } else if (redirectLocation) {
      res.redirect(302, redirectLocation.pathname + redirectLocation.search);
    } else if (renderProps) {
      // Create a new Redux store instance
      const store = configureStore();
      // Render the component to a string
      const html = renderToString(
        <Provider store={store}>
          <RouterContext {...renderProps} />
        </Provider>
      );
      const preloadedState = store.getState();
      fs.readFile('./dist/index.html', 'utf8', function (err, file) {
        if (err) {
          return console.log(err);
        }
        let document = file.replace(/<div id="app"><\/div>/, `<div id="app">${html}</div>`);
        document = document.replace(/'preloadedState'/, `'${JSON.stringify(preloadedState)}'`);
        res.setHeader('Cache-Control', 'public, max-age=31536000');
        res.setHeader('Expires', new Date(Date.now() + 2592000000).toUTCString());
        res.send(document);
      });
    } else {
      res.status(404).send('Not found');
    }
  });
});
```
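The state handoff between server and client deserves a note. The template replacement above swaps the literal placeholder 'preloadedState' for the serialized store state; the client then parses it back before creating its own store. A minimal sketch of that round trip (the function names are illustrative, not from the original project):

```javascript
// Server side: the HTML template contains the literal placeholder 'preloadedState';
// replace it (quotes included) with the serialized Redux state.
function injectState(htmlTemplate, state) {
  return htmlTemplate.replace(/'preloadedState'/, `'${JSON.stringify(state)}'`);
}

// Client side: parse the embedded JSON string back into the initial store state
// (e.g. before calling configureStore in the browser bundle).
function readState(serialized) {
  return JSON.parse(serialized);
}
```

This way the client store starts from exactly the state that produced the server-rendered markup, so the first client render matches the HTML already on screen.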
With the above approach, we were pleased to see that First Meaningful Paint time has been shortened to 921.5ms:
This is undoubtedly exhilarating.
Optimization Method 3 — Compressing Static Assets
Compressing files is, of course, an easy and effective measure. To do this, I used Webpack’s CompressionPlugin:
```js
{
  'plugins': [
    new CompressionPlugin({
      test: /\.js$|\.css$|\.html$/
    })
  ]
}
```
Also, use express-static-gzip to configure the server:
```js
server.use('/static', expressStaticGzip('./dist/static', {
  'maxAge': 31536000,
  setHeaders: function (res, path, stat) {
    res.setHeader('Expires', new Date(Date.now() + 2592000000).toUTCString());
    return res;
  }
}));
```
express-static-gzip is middleware built on top of express.static. It looks for a compressed version of the requested file at the given path and serves it when available; otherwise it falls back to the uncompressed file.
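Conceptually, the middleware’s decision can be sketched as a tiny function: serve the precompressed variant only when the client advertises gzip support and the variant actually exists on disk. (The names and shapes below are illustrative, not the library’s actual API.)

```javascript
// Pick which file variant to serve for a static asset request.
// requestedPath: URL path of the asset; acceptEncoding: the request's
// Accept-Encoding header; existingFiles: paths available on disk.
function pickVariant(requestedPath, acceptEncoding, existingFiles) {
  const gzPath = requestedPath + '.gz';
  if (/\bgzip\b/.test(acceptEncoding) && existingFiles.includes(gzPath)) {
    // Client accepts gzip and a precompressed file exists: serve it,
    // flagging the encoding so the browser decompresses transparently.
    return { path: gzPath, contentEncoding: 'gzip' };
  }
  // Fall back to the plain file for clients without gzip support.
  return { path: requestedPath, contentEncoding: null };
}
```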
After this step, we shaved off roughly another 400 ms. The First Meaningful Paint time is now 546.6 ms.
Optimization Method 4 — Caching
At this point we have gone from the original 19,189.9 ms down to 546.6 ms, and we can of course go further by caching static files on the client to make load times even shorter.
The author used sw-toolbox, a library for working with service workers.
sw-toolbox: A collection of service worker tools for offlining runtime requests. Service Worker Toolbox provides some simple helpers for use in creating your own service workers. Specifically, it provides common caching strategies for dynamic content, such as API calls, third-party resources, and large or infrequently used local resources that you don’t want precached.
In short, sw-toolbox makes implementing common runtime caching patterns (dynamic content, API calls, third-party resources) extremely simple. But what exactly is a service worker?
In 2014, the W3C published the Service Worker draft. Service workers provide many new capabilities, enabling web apps to offer the same kind of offline experience and message push as native apps. A service worker is a script that, like a web worker, runs in the background; as a separate thread, its runtime environment differs from that of normal page scripts, so it cannot directly take part in page interaction. Native apps can work offline, push messages, and update automatically in the background; the service worker exists precisely to give web apps similar capabilities.
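Before any of this caching takes effect, the page has to register the worker. A minimal sketch of the registration step, written so the unsupported-browser path is explicit (the '/sw.js' path and the function name are illustrative, not from the original project):

```javascript
// Feature-detect first: service workers are a progressive enhancement,
// so browsers without support simply skip the whole caching layer.
function registerServiceWorker(nav) {
  if (nav && 'serviceWorker' in nav) {
    return nav.serviceWorker.register('/sw.js');
  }
  // Unsupported browsers fall back to plain HTTP caching headers.
  return Promise.resolve(null);
}

// In the browser you would call: registerServiceWorker(navigator);
```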
sw-toolbox, as the name implies, is a toolbox for service workers. For the details, let’s look at the code:
```js
toolbox.router.get('(.*).js', toolbox.fastest, {
  'origin': /\.herokuapp\.com|localhost|maps\.googleapis\.com/,
  'mode': 'cors',
  'cache': {
    'name': `js-assets-${VERSION}`,
    'maxEntries': 50,
    'maxAgeSeconds': 2592e3
  }
});
```
The code above applies the toolbox.fastest handler to GET requests for JavaScript assets. toolbox.fastest means: for such a request, we try both the cache and a normal network request, and use whichever returns first. The third parameter of toolbox.router.get is the configuration object.
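The essence of the fastest strategy can be sketched in a few lines. Note this is a simplification: the real toolbox.fastest resolves with the first successful response and also writes the network result back into the cache, while the sketch below merely races the two sources:

```javascript
// Race the cache lookup against the network fetch and take whichever
// settles first. Both arguments are functions returning promises.
function fastest(fromCache, fromNetwork) {
  return Promise.race([fromCache(), fromNetwork()]);
}
```

On a repeat visit the cache usually wins, so the asset appears almost instantly while the network response can still refresh the cache in the background.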
The thoughtful reader might ask: this works for browsers that support service workers, but what about those that don’t? For them, we simply set:
```js
res.setHeader('Expires', new Date(Date.now() + 2592000000).toUTCString());
```
With all this in place, let’s get a feel for the page-loading waterfall:
Optimization Method 5 — Preload and Prefetch
If you haven’t heard of Preload, don’t worry. Let’s take a look:
Preload is a new Web standard designed to improve performance and provide more fine-grained load control for Web developers. Preload enables developers to customize the load logic of resources without suffering the performance penalty of script-based resource loaders.
In plainer terms: the preload spec lets you declare that certain resources will always be needed, and the browser must fetch any resource marked with a preload tag.
What’s the point of this? Consider resources hidden inside CSS and JavaScript, such as fonts referenced from a stylesheet. The browser doesn’t discover it needs them until quite late, so in many cases these resources delay page rendering.
Preload was introduced to optimize exactly this process. For preload browser compatibility, see here.
For browsers that do not support preload, I fall back to prefetch. Unlike preload, prefetch tells the browser which resources might be needed on the next navigation, not the current page, so these resources are fetched at very low priority.
These new standards are genuinely interesting, and there is much more to them than this. Interested readers can dig deeper, and are welcome to discuss with me.
To get back to business, here is what I used in the head tag:
```html
<link rel="preload" href="…" as="script">
```
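For completeness, a sketch of how the two hints might sit together in the head; the href values are placeholders, not the project’s real bundle paths:

```html
<!-- preload: fetch this script for the current page, at high priority -->
<link rel="preload" href="/static/js/app.js" as="script">
<!-- prefetch: hint that this chunk may be needed soon, fetched at low priority -->
<link rel="prefetch" href="/static/js/HomeView.js">
```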
The final optimization results are shown in the figure below:
Conclusion
In fact, building an “Uber” with React + Webpack is no longer the point. What’s really exciting is the optimization process as a whole. We used plenty of technologies, some mature and some still emerging, and hope this inspires readers!
Happy Coding!
PS: see the author’s GitHub repository; exchanges of all kinds through code are welcome.