I originally wrote this summary for my turn at the company's weekly tech share last year. At the time, the company was gearing up for its live-streaming business, and I thought the director might hand me an important piece of it, so I dug through a lot of forums and put together a simple talk. In the end another colleague took over the live-streaming work, so I never got to put any of it into practice, which is a pity. Enough rambling; on to the theory ~

Technical background

2016 was widely called the first year of live streaming. On the one hand, the major broadband providers raised speeds and cut prices in response to public demand; on the other, a large amount of capital flowed into the live-streaming sector, which accelerated the iteration of the technology. The most commonly used live protocol on the market is Apple's HLS (which H5 can play natively), alongside RTMP, HTTP-FLV, RTP, and others.

Video file formats and live protocols

Video file format

Video file formats are usually called container formats: the formats we talk about in everyday life, such as FLV, MP4, and OGG. A container can be understood as a box that holds a stream of bits arranged in a particular order. So does it matter which container format a video uses?

Not really, as long as you know how to open the box and can find a decoder for what's inside. Seen that way, as long as I have decoders and players for MP4, OGV, WebM, and so on, there's no problem. Note, though, that the video bits are packed into a single box, and if part of that box is broken, the resulting file is unusable, because the box itself is damaged.

The description above, however, treats video only as a static stream. Imagine instead that a video needs to play continuously, as in a live broadcast. Let's use the TS/PS streams to illustrate.

  • PS (Program Stream): a static file stream

  • TS (Transport Stream): a dynamic file stream

For the above two container formats, a video bitstream is actually treated differently.

  • PS: packs the complete video bitstream into one box, producing a single fixed file
  • TS: cuts the incoming video stream into segments and boxes each one, producing a file made of many small boxes

As a result, if one or more boxes are damaged, a PS file becomes unwatchable, while a TS stream merely skips frames or shows mosaic artifacts. That is the practical difference between the two: when higher fault tolerance is needed (during transmission, say), TS is chosen; when corruption is unlikely (a local file), PS is chosen.

Live broadcast protocol HLS

HTTP Live Streaming (HLS) is an HTTP-based video streaming protocol proposed by Apple. HLS is currently supported on iOS and recent versions of Android. So what does HLS consist of? Its two main pieces of content are the .m3u8 index file and the .ts media segment files.

The HLS protocol is based on HTTP, and a server that provides HLS needs to do two things:

Encoding: video is encoded as H.264 and audio as MP3 or HE-AAC, and the result is packaged into an MPEG-2 TS (Transport Stream) container;

Segmenting: the encoded TS stream is split into small files with the .ts suffix, and a .m3u8 plain-text index file is generated.

What the browser consumes is the M3U8 file. M3U8 is similar to the audio playlist format M3U; you can simply think of it as a playlist containing multiple TS files. The player plays them one by one in order, and when they are all played, it requests the M3U8 file again to get a playlist containing the latest TS files, and keeps playing. The whole live session is therefore made up of a constantly updated M3U8 and a stream of small TS files. The M3U8 must be updated dynamically, while the TS files can be served through a CDN.

Here, we focus on the client-side process. First of all, live broadcasting is live because its content is updated in real time. So how does HLS work?

With HLS, we can include the stream directly in a video tag:

<video autoplay controls>
    <source src="xxx.m3u8" type="application/vnd.apple.mpegurl" />
    <p class="warning">Your browser doesn't support video</p>
</video>

As described above, this essentially requests the .m3u8 index file. That file contains descriptions of the .ts files, for example:
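A live media playlist is plain text along the following lines (a sketch; the segment names and durations here are made up for illustration):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#EXT-X-MEDIA-SEQUENCE:1234
#EXTINF:5.000,
segment1234.ts
#EXTINF:5.000,
segment1235.ts
#EXTINF:5.000,
segment1236.ts
```

The player fetches each .ts in order, then re-requests the playlist; for a live stream, EXT-X-MEDIA-SEQUENCE keeps advancing as old segments drop off the front of the list.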

When a master playlist URL is configured, the user downloads the master playlist only once. The player then decides which media playlist (child M3U8 file) to use based on the current environment, and if the user's playback conditions change during playback, it switches to the corresponding media playlist.
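A master playlist, by contrast, just points at several media playlists at different bitrates (again, the URLs and numbers here are illustrative):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8
```

This is what makes the resolution switching mentioned below possible: the player simply moves to a different child playlist.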

Of course, HLS supports more than just segmented playback (which is what makes live streaming possible); it also includes the other features you would expect:

  • Encrypted TS segments (AES-128, with keys that can be fetched over HTTPS)

  • Fast forward and rewind

  • Insert ads

  • Switch between different resolutions

As you can see, the HLS protocol is at heart ordinary HTTP request/response, so it adapts well and is not blocked by firewalls. But it also has an Achilles' heel: significant latency. If each TS segment is 5 seconds long and one M3U8 lists 6 TS indexes, there can be a delay of up to 30 seconds. Shortening each TS and listing fewer indexes in the M3U8 does reduce latency, but it leads to more frequent buffering and multiplies the request pressure on the server, so you have to find a middle ground.
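The trade-off above is just multiplication, but writing it down makes the tuning explicit. A tiny sketch (the function name is mine, not from any spec):

```javascript
// Worst-case startup latency for HLS: the player may buffer a full window of
// segments (segment length x number of segments listed) before playback starts.
function hlsWorstCaseDelaySeconds(segmentSeconds, segmentsPerPlaylist) {
  return segmentSeconds * segmentsPerPlaylist;
}

console.log(hlsWorstCaseDelaySeconds(5, 6)); // the example from the text → 30
console.log(hlsWorstCaseDelaySeconds(2, 3)); // shorter segments, fewer indexes → 6
```

Halving either factor halves the worst case, which is exactly why shrinking segments multiplies the request rate on the server.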

Note: on desktop, only Safari supports HLS natively; browsers like Chrome cannot play the M3U8 format with the HTML5 video tag. You can adopt one of the mature solutions available, such as sewise-player, MediaElement, videojs-contrib-hls, or jwPlayer.

Live broadcast protocol RTMP

Real-Time Messaging Protocol (RTMP) is a live video protocol developed by Macromedia and now owned by Adobe. Like HLS, RTMP can be used for live video playback. The difference is that RTMP relies on Flash and cannot be played in iOS browsers, but its real-time performance is better than HLS's. In practice the protocol is therefore mostly used on the ingest side, to push the video stream up to the server.

Here is a quick comparison of HLS and RTMP:

  • HLS: based on short HTTP connections; high latency (ten seconds or more); playable by the H5 video tag on iOS and much of mobile
  • RTMP: based on a long TCP connection; low latency; needs Flash on the web and cannot play in iOS browsers

Live broadcast protocol HTTP-FLV

Similar to RTMP, HTTP-FLV is a live delivery stream using the FLV video format, but there are notable differences:

  • It opens an HTTP long connection and streams down a continuously growing FLV file
  • Its header information is simple

On the market today, HTTP-FLV is the most commonly used protocol for playback. For H5, though, HTTP-FLV has been a pain point, since mobile browsers don't support it natively; flv.js can now let browsers that implement MediaSource parse it. Using HTTP-FLV with Flash is also simple; as with HLS, you only need one element:

<object type="application/x-shockwave-flash" data="xxx.flv"></object>

Basic Structure of Live broadcasting

At present, relatively mature live-broadcast products are generally implemented by combining a Server, H5, and Native (Android and iOS) clients, basically following this pattern:

  • Video recording end: generally the audio and video input devices of a computer, or the camera and microphone of a mobile phone; at present, mobile-phone video is the main source.

  • Video playing end: can be a player on the computer, a Native player on the phone, or the H5 video tag; at present, the Native player on the phone is the main one.

  • Video server: generally an Nginx server, which accepts the video stream provided by the recording end and serves the video stream to the playing end.

  • For real-time bullet comments (danmaku), WebSocket can be used to send and receive new comments and render them in real time.

  • For browsers that don't support WebSocket, this can degrade to long polling or a front-end timer that requests the latest comments.

  • Barrage rendering also involves animation, collision detection (so comments don't overlap), and so on.
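Collision avoidance for bullet comments can be reduced to a track-allocation problem: keep a "busy until" timestamp per horizontal track and place each new comment on the first free one. A minimal sketch (the names and the fixed occupancy-time assumption are mine, not from any real barrage library):

```javascript
// Each track remembers when it frees up. A new comment goes on the first track
// whose previous comment has fully entered the screen; if every track is busy,
// we return -1 so the caller can queue or drop it instead of overlapping.
function createTrackAllocator(trackCount, occupySeconds) {
  const freeAt = new Array(trackCount).fill(0); // time each track frees up
  return function allocate(now) {
    for (let i = 0; i < trackCount; i++) {
      if (freeAt[i] <= now) {
        freeAt[i] = now + occupySeconds;
        return i; // track index for this comment
      }
    }
    return -1; // all tracks busy
  };
}

const allocate = createTrackAllocator(2, 3);
console.log(allocate(0)); // → 0
console.log(allocate(0)); // → 1
console.log(allocate(1)); // → -1 (both tracks busy until t=3)
console.log(allocate(3)); // → 0 (track 0 free again)
```

A real implementation would derive the occupancy time from comment width and scroll speed, but the allocation logic stays the same.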

H5 live broadcast scheme

Use flv.js for live streaming

  • Introduction

Flv.js is an open-source project from Bilibili. It parses FLV files and feeds the audio and video data to a native HTML5 Video tag, making it possible for browsers to play FLV without Flash.

  • Advantages

Thanks to the browser's hardware acceleration of the native Video tag, performance is good and HD is supported. It also handles both recorded (VOD) and live playback, and removes the reliance on Flash.

  • Browser dependency

Flv.js depends on the following browser features:

1. HTML5 Video

2. Media Source Extensions

3. WebSocket

4. HTTP FLV: fetch or stream loading

  • How it works

All flv.js does is demux the FLV data stream in pure JavaScript, remux it on the fly, and feed it to a native HTML5 Video tag through the Media Source Extensions API. (Natively, HTML5 only supports playback of the MP4/WebM formats, not FLV.)
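Because of this design, what really has to be supported is MSE plus the remuxed codec combination, not FLV itself. A simplified feature check along those lines (a sketch of what a library like flv.js has to establish; the function name and the `win` parameter, standing in for the browser global so the logic is testable, are mine):

```javascript
// flv.js remuxes FLV into fragmented MP4 (H.264 + AAC), so the browser must
// support Media Source Extensions and that fMP4 codec pair, not FLV itself.
function canPlayFlvViaMse(win) {
  const MS = win.MediaSource || win.WebKitMediaSource;
  if (!MS || typeof MS.isTypeSupported !== 'function') return false;
  return MS.isTypeSupported('video/mp4; codecs="avc1.42E01E,mp4a.40.2"');
}

// In a real page you would call canPlayFlvViaMse(window).
console.log(canPlayFlvViaMse({})); // → false (no MediaSource at all)
```

This is also why the compatibility plans below fall back to Flash or HLS: on browsers where this check fails, there is nothing for flv.js to feed.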

Why does flv.js take this roundabout route, fetching FLV from the server, converting it, and feeding it to the Video tag? There are two reasons:

1. Compatibility with current live-broadcast setups: most audio/video services in today's live pipelines use the FLV container format to transmit audio and video data.

2. The FLV container format is simpler than MP4, and faster and easier to parse.

  • Compatibility plan

Desktop

1. Prefer HTTP-FLV: low latency, good performance, and 1080P plays smoothly.

2. If flv.js is not supported, use a Flash player to play the RTMP stream. Flash has good compatibility but poor performance, and is disabled by default in many browsers.

3. If you don't want to fall back to Flash, you can use HLS, but on desktop only Safari supports it.

Mobile

1. Prefer HTTP-FLV: latency is low, and the devices that support HTTP-FLV are powerful enough to run flv.js.

2. Use HLS if flv.js is not supported, but HLS latency is very high.

3. RTMP is not an option, because mobile browsers do not support Flash.
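The fallback order above can be captured as a small decision function. The feature flags here are inputs I made up for illustration; real detection would involve flvjs.isSupported(), video.canPlayType checks, and Flash probing:

```javascript
// Encodes the fallback chain from the text: HTTP-FLV (via flv.js) first,
// then Flash + RTMP (desktop only), then HLS as the last resort.
function choosePlayback({ supportsFlvJs, supportsFlash, supportsHls }) {
  if (supportsFlvJs) return 'http-flv';
  if (supportsFlash) return 'flash-rtmp';
  if (supportsHls) return 'hls';
  return 'none';
}

console.log(choosePlayback({ supportsFlvJs: true, supportsFlash: true, supportsHls: true }));   // → 'http-flv'
console.log(choosePlayback({ supportsFlvJs: false, supportsFlash: false, supportsHls: true })); // → 'hls'
```

On mobile the Flash flag is simply always false, which reproduces the two-step mobile plan above.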

Well, this is only entry-level theory after all; if I get to practice it in a real project, I'll write a follow-up. Thanks for reading this far ~