Written on 2016-10-19

I. Research objectives

The concept of HTTP2 has been around for quite some time, and the web is full of articles about it. Judging from search results, however, most of those articles simply introduce HTTP2; few analyze it with concrete data. The purpose of this article is to fill that gap: through a series of experiments and data analysis, it takes an in-depth look at HTTP2 performance. Of course, given my limited experience, the experimental method will certainly have shortcomings. If you find any problems, please point them out and I will do my best to improve the method!

II. Basic knowledge

For the basics of HTTP2, refer to the following articles, which will not be described here.

  • HTTP, HTTP2.0, SPDY, HTTPS: some things you should know
  • HTTP 2.0 stuff
  • Fantastic daily HTTP2.0
  • One-minute preview of HTTP2 features and packet capture analysis
  • HTTP/2 for web application developers
  • 7 Tips for Faster HTTP/2 Performance

After studying the material above, we have a general understanding of HTTP2. Next, we design a test model and examine HTTP2 performance experimentally.

III. Experimental design

Experimental group: build an HTTP2 (SPDY) server that responds to requests over HTTP2. The size of the response body and the delay before responding can both be configured.

Control group: build an HTTP1.x server that responds to requests over HTTP1.x and can be configured in the same way as the experimental group. To minimize error, the HTTP1.x server also uses the HTTPS protocol.

Test process: the client sends requests to the experimental server and the control server, with the response size, the number of requested resources, the delay time, and the upstream/downstream bandwidth all configurable, and measures the time needed for all responses to complete.

Switching Nginx to HTTP2 requires upgrading the Nginx version and adding an HTTPS certificate, and the various custom settings needed on the server side make that process fairly involved, so the Nginx-based server was abandoned in favour of a NodeJS one. In the early stage of the experiment, the servers were built with plain NodeJS and the node-http2 module; later they were rebuilt with the Express framework and the node-spdy module. The reason is that handling complex requests in plain NodeJS is cumbersome, while Express has a number of optimizations around requests and responses that reduce human error. According to the discussion in "General performance: node-spdy vs http2 (#98)", the node-spdy module is functionally close to the node-http2 module.
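Following the full source in the appendix, the only difference between the two servers is the module used to create them. A minimal sketch of the two entry points (ports and certificate paths as used in this experiment):

const express = require('express')
const fs = require('fs')

const app = express()
const options = {
	key: fs.readFileSync(`${__dirname}/server.key`),
	cert: fs.readFileSync(`${__dirname}/server.crt`)
}

// control group: http1.x over TLS, port 1001
require('https').createServer(options, app).listen(1001)
// experimental group: http2 (SPDY), port 1002; the only change is the module
// require('spdy').createServer(options, app).listen(1002)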

1. Server construction

The server logic of the experimental group and the control group is identical; the key code is as follows:

app.get('/option/?', (req, res) => {
	allow(res)                      // set the CORS headers
	let size = req.query['size']    // response size in MB
	let delay = req.query['delay']  // response delay in ms
	let buf = new Buffer(size * 1024 * 1024)
	setTimeout(() => {
		res.send(buf.toString('utf8'))
	}, delay)
})

The logic is to dynamically set the size and latency of the response resource based on the parameters passed in from the client.
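For example, with values used in the experiments below (a 0.1 MB resource and a 10 ms delay), the client requests a URL of the form (port 1001 is the http1.x server, 1002 the http2 server, as in the appendix):

https://localhost:1001/option?size=0.1&delay=10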

2. Client building

The client can dynamically set the number of requests (i.e. the number of resources), the size of each resource, and the server delay. Combined with the Chrome developer tools, different network environments can be simulated. Once all resource requests have completed, the total elapsed time is calculated automatically. The key code is as follows:

for (let i = 0; i < reqNum; i++) {
	$.get(url, function (data) {
		imageLoadTime(output, pageStart)
	})
}

The client requests the resource repeatedly in a loop, with a configurable number of requests. Each completed response calls imageLoadTime to update the elapsed-time statistics.
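The imageLoadTime helper referenced here simply measures the time elapsed since the first request was sent and writes it into the page; it is taken from the full client source in the appendix:

function imageLoadTime(id, pageStart) {
	let lapsed = Date.now() - pageStart
	document.getElementById(id).innerHTML = (lapsed / 1000).toFixed(2) + 's'
}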

3. Experimental projects

A. Http2 performance study

From the articles listed in Section II, the factors affecting HTTP2 performance can be summarized as "latency" and "number of requests." This experiment adds "resource size" and "network environment" as further factors. Detailed tests are run for each of these four items below. Each experiment was repeated 10 times and the average was recorded.

B. Server-side push research

HTTP2 also has one distinctive feature: server push, which allows the server to proactively send resources to the client. This experiment studies this feature as well, focusing on how server push is used and how it affects performance.
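For reference, a minimal sketch of what a push handler can look like, assuming the Express + node-spdy setup described above. The route name and the pushed file are illustrative only, and the res.push() options follow the node-spdy documentation rather than the experiment's own code:

app.get('/push', (req, res) => {
	// node-spdy exposes res.push(): open a push stream for a hypothetical /main.js
	const stream = res.push('/main.js', {
		status: 200,
		method: 'GET',
		request: { accept: '*/*' },
		response: { 'content-type': 'application/javascript' }
	})
	stream.on('error', () => {})           // ignore errors if the client rejects the push
	stream.end('console.log("pushed")')    // body of the pushed resource
	// the HTML response references the resource that has just been pushed
	res.send('<script src="/main.js"></script>')
})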

IV. HTTP2 performance data statistics

1. The impact of delay factors on performance

Condition / experiment no.   1      2      3      4      5
Delay time (ms)              0      10     20     30     40
Number of resources          100    100    100    100    100
Resource size (MB)           0.1    0.1    0.1    0.1    0.1
Time (s), http1.x            0.38   0.51   0.62   0.78   0.94
Time (s), http2              0.48   0.51   0.49   0.48   0.50

2. Impact of number of requests on performance

The previous experiment shows that with a delay of 10 ms, http1.x and http2 take roughly the same time, so the delay for this experiment is set to 10 ms.

Condition / experiment no.   1      2      3      4      5
Delay time (ms)              10     10     10     10     10
Number of resources          6      30     150    750    3750
Resource size (MB)           0.1    0.1    0.1    0.1    0.1
Time (s), http1.x            0.04   0.16   0.63   3.03   20.72
Time (s), http2              0.04   0.16   0.71   3.28   19.34

Increase the delay time and repeat the experiment:

Condition / experiment no.   6      7      8      9      10
Delay time (ms)              30     30     30     30     30
Number of resources          6      30     150    750    3750
Resource size (MB)           0.1    0.1    0.1    0.1    0.1
Time (s), http1.x            0.07   0.24   1.32   5.63   28.82
Time (s), http2              0.07   0.17   0.78   3.81   18.78

3. Impact of resource volume on performance

The two experiments above show that with a delay of 10 ms and 30 resources, http1.x and http2 take roughly the same time, so this experiment uses a delay of 10 ms and 30 resources.

Condition / experiment no.   1      2      3      4      5
Delay time (ms)              10     10     10     10     10
Number of resources          30     30     30     30     30
Resource size (MB)           0.2    0.4    0.6    0.8    1.0
Time (s), http1.x            0.21   0.37   0.59   0.68   0.68
Time (s), http2              0.25   0.45   0.61   0.83   0.73

Condition / experiment no.   6      7      8      9      10
Delay time (ms)              10     10     10     10     10
Number of resources          30     30     30     30     30
Resource size (MB)           1.2    1.4    1.6    1.8    2.0
Time (s), http1.x            0.78   0.94   1.02   1.07   1.13
Time (s), http2              0.92   0.86   1.08   1.26   1.33

4. Impact of network environment on performance

As in the previous experiment, the delay is set to 10 ms and the number of resources to 30, values at which http1.x and http2 take roughly the same time.

Condition / network     Regular 2G   Good 2G   Regular 3G   Good 3G   Regular 4G   Wifi
Delay time (ms)         10           10        10           10        10           10
Number of resources     30           30        30           30        30           30
Resource size (MB)      0.1          0.1       0.1          0.1       0.1          0.1
Time (s), http1.x       222.66       116.64    67.37        32.82     11.89        0.87
Time (s), http2         138.06       71.02     40.77        20.82     7.70         0.94

V. HTTP2 server push experiment

This experiment mainly studies the influence of the network environment on server push speed. The requested/pushed resource is a 290 KB JS file. Each network environment was tested ten times and the averages are recorded in the table below.

Condition / network                        Regular 2G   Good 2G   Regular 3G   Good 3G   Regular 4G   Wifi
Total client request time (s)              9.59         5.30      3.21         1.57      0.63         0.12
Total server push time (s)                 18.83        10.46     6.31         3.09      1.19         0.20
Resource load time, client request (s)     9.24         5.13      3.08         1.50      0.56         0.08
Resource load time, server push (s)        9.28         5.16      3.09         1.51      0.57         0.08

Condition / network                        No throttling
Total client request time (ms)             56
Total server push time (ms)                18
Resource load time, client request (ms)    15.03
Resource load time, server push (ms)       2.80

The table above reveals a rather strange phenomenon: with network throttling enabled (including the Wifi option), server push is far slower than an ordinary client request, yet with throttling disabled, server push is clearly faster. The Wifi throttling option specifies a download speed of 30M/s and an upload speed of 15M/s, while the actual download speed of the test network was only 542K/s and the upload speed only 142K/s, far below the Wifi option. To understand this, we need to know how server push works and where the pushed resources are stored.

The normal client request process is shown below:

The process of server push is shown as follows:

As the diagrams show, with server push the server can send the resources the client will need along with index.html, sparing the client from requesting them one by one. Since there is no extra request or connection overhead, pushing static resources can greatly improve loading speed. But this raises a question: where are the pushed resources stored? After reading the article Issue 5: HTTP/2 Push, I finally found the reason. Let us look more closely at the schematic of the server push process:

The resources pushed by the server sit in a place between the network and the HTTP cache, which we can loosely call "local". Once the client has parsed index.html, it requests the resource and finds it locally, so the request completes very quickly; this is one source of server push's performance advantage. Note, however, that such a pushed resource still returns a 200 status code, not a 304 or 200 (from cache) as a locally cached resource would. Chrome's network throttling tool throttles anything that looks like a network request, and because pushed static resources also return a 200 status code, it treats them as network requests, which causes the problem seen in the experiment above.

VI. Research conclusions

The series of experiments above shows that HTTP2's performance advantages come mainly from multiplexing and server push. For a small number of requests (roughly fewer than 30), http1.x and http2 perform about the same; HTTP2's advantage shows when the number of requests is large and the latency is above roughly 30 ms. HTTP2 also outperforms http1.x in poor network environments. In addition, serving static resources via server push can greatly improve loading speed.

In practice, thanks to HTTP2 multiplexing, front-end teams no longer need to merge files or build sprite images to reduce the number of network requests. Beyond that, HTTP2 has little impact on front-end development.

To upgrade a server to HTTP2 with NodeJS, it is enough to replace the built-in https module with the node-spdy module and add a certificate. For Nginx, see Nginx 1.9.5 Released with HTTP/2 Support.

To use server push, the response logic on the server side has to be extended; how to do so depends on the specific application.

VII. Afterword

As the saying goes, what you learn on paper is always shallow; to truly understand something you must put it into practice. Without actually designing the experiments, I would never have discovered this pitfall of HTTP2, nor the things to watch out for when debugging with Chrome.

If you find any problems with my experimental design, or can think of a better way to run the experiments, please let me know; I will read your suggestions carefully.


The source code used in the experiments is attached below.

1. Client page

<!-- http1_vs_http2.html -->
<!DOCTYPE html>
<html lang="en">
<head>
   <meta charset="UTF-8">
   <title>http1 vs http2</title>
   <script src="//cdn.bootcss.com/jquery/1.9.1/jquery.min.js"></script>
   <style>
   	.box {
   		float: left;
   		width: 200px;
   		margin-right: 100px;
   		margin-bottom: 50px;
   		padding: 20px;
   		border: 4px solid pink;
   		font-family: Microsoft Yahei;
   	}
   	.box h2 {
   		margin: 5px 0;
   	}
   	.box .done {
   		color: pink;
   		font-weight: bold;
   		font-size: 18px;
   	}
   	.box button {
   		padding: 10px;
   		display: block;
   		margin: 10px 0;
   	}
   </style>
</head>
<body>
   <div class="box">
   	<h2>Http1.x</h2>
   	<p>Time: <span id="output-http1"></span></p>
   	<p class="done done-1"> * Unfinished... </p> <button class="btn-1">Get Response</button>
   </div>

   <div class="box">
   	<h2>Http2</h2>
   	<p>Time: <span id="output-http2"></span></p>
   	<p class="done done-1"> * Unfinished... </p> <button class="btn-2">Get Response</button>
   </div>

   <div class="box">
   	<h2>Options</h2>
   	<p>Request Num: <input type="text" id="req-num"></p>
   	<p>Request Size (Mb): <input type="text" id="req-size"></p>
   	<p>Request Delay (ms): <input type="text" id="req-delay"></p>
   </div>

   <script>
   	function imageLoadTime(id, pageStart) {
   	  let lapsed = Date.now() - pageStart;
   	  document.getElementById(id).innerHTML = ((lapsed) / 1000).toFixed(2) + 's'
   	}
   	
   	let boxes = document.querySelectorAll('.box')
   	let doneTip = document.querySelectorAll('.done')
   	let reqNumInput = document.querySelector('#req-num')
   	let reqSizeInput = document.querySelector('#req-size')
   	let reqDelayInput = document.querySelector('#req-delay')

   	let reqNum = 100
   	let reqSize = 0.1
   	let reqDelay = 300

   	reqNumInput.value = reqNum
   	reqSizeInput.value = reqSize
   	reqDelayInput.value = reqDelay

   	reqNumInput.onblur = function () {
   		reqNum = reqNumInput.value
   	}

   	reqSizeInput.onblur = function () {
   		reqSize = reqSizeInput.value
   	}

   	reqDelayInput.onblur = function () {
   		reqDelay = reqDelayInput.value
   	}

   	function clickEvents(index, url, output, server) {
   		doneTip[index].innerHTML = 'x Unfinished... '
   		doneTip[index].style.color = 'pink'
   		boxes[index].style.borderColor = 'pink'
   		let pageStart = Date.now()
   		for (let i = 0; i < reqNum; i++) {
   			$.get(url, function (data) {
   				console.log(server + ' data')
   				imageLoadTime(output, pageStart)
   				if (i === reqNum - 1) {
   					doneTip[index].innerHTML = '√ Finished!'
   					doneTip[index].style.color = 'lightgreen'
   					boxes[index].style.borderColor = 'lightgreen'
   				}
   			})
   		}
   	}

   	document.querySelector('.btn-1').onclick = function () {
   		clickEvents(0, 'https://localhost:1001/option?size=' + reqSize + '&delay=' + reqDelay, 'output-http1', 'http1.x')
   	}

   	document.querySelector('.btn-2').onclick = function () {
   		clickEvents(1, 'https://localhost:1002/option?size=' + reqSize + '&delay=' + reqDelay, 'output-http2', 'http2')
   	}
   </script>
</body>
</html>

2. Server-side code (the http1.x and http2 servers differ in only one place)

const http = require('https') // for the http2 server, replace the 'https' module with the 'spdy' module
const url = require('url')
const fs = require('fs')
const express = require('express')
const path = require('path')

const app = express()

const options = {
  key: fs.readFileSync(`${__dirname}/server.key`),
  cert: fs.readFileSync(`${__dirname}/server.crt`)
}

const allow = (res) => {
  res.header("Access-Control-Allow-Origin"."*")
  res.header("Access-Control-Allow-Headers"."X-Requested-With")
  res.header("Access-Control-Allow-Methods"."PUT,POST,GET,DELETE,OPTIONS")
}

app.set('views', path.join(__dirname, 'views'))
app.set('view engine', 'ejs')
app.use(express.static(path.join(__dirname, 'static')))

app.get('/option/?', (req, res) => {
	allow(res)
	let size = req.query['size']
	let delay = req.query['delay']
	let buf = new Buffer(size * 1024 * 1024)
	setTimeout(() => {
		res.send(buf.toString('utf8'))
	}, delay)
})

http.createServer(options, app).listen(1001, (err) => { // the http2 server uses port 1002
	if (err) throw new Error(err)
	console.log('Http1.x server listening on port 1001.')
})