
Background

Every day I watched pretty streamers on Douyin, knowing only how to happily swipe the screen and tap ❤️. I never imagined that one day I would suddenly be working on live-streaming business, completely bewildered.

Jargon is everywhere, and I can't make sense of any of it. Help me, I just want to watch a pretty girl.

Once the crying was over, reason told me that everything runs on protocols: the Internet is built on a set of protocols, and so is live streaming. A little research shows that there are mainly two video streaming protocols, HLS and RTMP.

HTTP Live Streaming (HLS)

HTTP Live Streaming (HLS) is an HTTP-based video streaming protocol proposed and implemented by Apple in 2009. It is supported by many Apple products, such as QuickTime and Safari on macOS and Safari on iOS, and it is the default (and required) video-streaming standard for iOS devices. Since then, Android and most other platforms have added support as well. See Guide to Mobile Video Streaming with HLS.

HLS is popular for several reasons.

  • HLS can be played almost anywhere. The major platforms (web, mobile, TV) basically all have free HLS player support.
  • Apple requires HLS. If you want to stream to iOS, you can't escape it.
  • HLS is relatively simple. It uses common, widely deployed media formats (MP4 or TS containers, with codecs such as H.264 and AAC), plus an ugly but human-readable text format (M3U8).
  • It works over plain HTTP. No special servers are required (unlike the old-school RTMP protocol or the newer WebRTC protocol), so HLS passes easily through firewalls and proxies, can be distributed through a CDN, and is easy to implement on the client side.

How it works

The HLS protocol is based on HTTP, and a server that provides HLS needs to do two things:

  • Encoding: encode the video in H.264 and the audio in MP3 or HE-AAC, then package them into an MPEG-2 TS (Transport Stream) container;
  • Splitting: divide the encoded TS stream into small TS files of equal duration and generate a .m3u8 plain-text index file that lists them (see the sketch right after this list)
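To make those two steps concrete, here is a minimal sketch that shells out to ffmpeg from Node and does the encoding and splitting in one go. It assumes ffmpeg is installed; the file names and the 5-second / 6-segment settings are illustrative values, not something from the original article.

import { spawn } from "node:child_process";

// Encode the video as H.264 and the audio as AAC, package into MPEG-2 TS,
// split into roughly 5-second segments, and keep the 6 newest segments
// listed in the live.m3u8 index file.
const ffmpeg = spawn("ffmpeg", [
  "-i", "input.mp4",
  "-c:v", "libx264",
  "-c:a", "aac",
  "-f", "hls",
  "-hls_time", "5",
  "-hls_list_size", "6",
  "live.m3u8",
]);

ffmpeg.stderr.pipe(process.stderr); // ffmpeg writes its progress to stderr
ffmpeg.on("close", (code) => console.log(`ffmpeg exited with code ${code}`));

With settings like these the server ends up with the two artifacts described above: a rolling M3U8 index plus a pile of small TS files.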

HLS divides the entire stream into small files that the client downloads over HTTP, a few at a time. The HLS protocol consists of three parts: HTTP, M3U8, and TS. HTTP is the transport protocol, M3U8 is the index file, and TS carries the audio and video media data.

The browser consumes the M3U8 file. Using HLS in an HTML5 page is simple and direct:

<video src="example.m3u8" controls></video>

or

<video controls>
    <source src="example.m3u8" type="application/vnd.apple.mpegurl">
</video>
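One caveat: the bare video tag only plays M3U8 where the browser has native HLS support (Safari on macOS/iOS, for example). On other browsers a small JavaScript player is usually layered on top. A minimal sketch using the open-source hls.js library (the element lookup and URL are illustrative):

import Hls from "hls.js";

const video = document.querySelector("video") as HTMLVideoElement;
const src = "example.m3u8";

if (video.canPlayType("application/vnd.apple.mpegurl")) {
  // Native HLS support: the video element can play the M3U8 directly
  video.src = src;
} else if (Hls.isSupported()) {
  // No native support: hls.js downloads the M3U8 and TS segments itself
  // and feeds them to the video element via Media Source Extensions
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
}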

The HLS index file is an M3U8 file. The client first downloads the first-level index file (e.g. master_playlist.m3u8), which records the addresses of the second-level index files (Alternate-A, Alternate-B, Alternate-C). The client then downloads a second-level index file, which records the download addresses of the TS files, so the client can download the TS video files in sequence and play them continuously.

A typical first-level (master) index M3U8 file looks like this:

#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2000000,CODECS="mp4a.40.2,avc1.4d401f"
skiing-720p.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=375000,CODECS="mp4a.40.2,avc1.4d4015"
skiing-360p.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=750000,CODECS="mp4a.40.2,avc1.4d401e"
skiing-480p.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=3500000,CODECS="mp4a.40.2,avc1.4d401e"
skiing-1080p.m3u8

The fields in detail:

  • BANDWIDTH specifies the bit rate of the video stream
  • PROGRAM-ID can be ignored
  • The line after each #EXT-X-STREAM-INF is the path of the corresponding second-level index file, which can be relative or absolute. This example uses relative paths.

The first-level index file records the paths of second-level index files for video streams at different bit rates. The client can measure its current network bandwidth to decide which stream to play, and can switch smoothly to a stream that matches the bandwidth as the network changes.
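To illustrate that selection, here is a rough sketch of how a client could parse the master playlist and pick a variant by BANDWIDTH. The parsing is deliberately naive and the function names are made up; real players do this (and much more) internally.

interface Variant {
  bandwidth: number; // bits per second, from the BANDWIDTH attribute
  uri: string;       // second-level playlist path (line after #EXT-X-STREAM-INF)
}

function parseMasterPlaylist(m3u8: string): Variant[] {
  const lines = m3u8.split("\n").map((l) => l.trim());
  const variants: Variant[] = [];
  for (let i = 0; i < lines.length; i++) {
    if (lines[i].startsWith("#EXT-X-STREAM-INF")) {
      const match = lines[i].match(/BANDWIDTH=(\d+)/);
      const next = lines[i + 1];
      if (match && next && !next.startsWith("#")) {
        variants.push({ bandwidth: Number(match[1]), uri: next });
      }
    }
  }
  return variants;
}

function pickVariant(variants: Variant[], measuredBps: number): Variant {
  // Prefer the highest bit rate that still fits the measured bandwidth,
  // falling back to the lowest one if nothing fits.
  const sorted = [...variants].sort((a, b) => a.bandwidth - b.bandwidth);
  const fitting = sorted.filter((v) => v.bandwidth <= measuredBps);
  return fitting.length > 0 ? fitting[fitting.length - 1] : sorted[0];
}

With the master playlist above and a measured bandwidth of 1,000,000 bps, pickVariant would choose skiing-480p.m3u8 (BANDWIDTH=750000), the highest variant that still fits.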

A second-level index file (skiing-480p.m3u8) has the following format:

#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:9.97667,
file000.ts
#EXTINF:9.97667,
file001.ts
#EXTINF:9.97667,
file002.ts
#EXTINF:9.97667,
file003.ts
#EXTINF:9.97667,
file004.ts

It is easiest to think of the second-level M3U8 as a playlist containing multiple TS files. The player plays them one by one in order, then requests the M3U8 again to get a playlist containing the latest TS files, and repeats. The whole live broadcast relies on a constantly refreshed M3U8 plus a pile of small TS files: the M3U8 must be updated dynamically, while the TS files can be served from a CDN.
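A bare-bones sketch of that refresh loop is below. It re-fetches the media playlist, uses #EXT-X-MEDIA-SEQUENCE to skip segments it has already handled, and downloads the new ones. The URL handling, the polling interval, and what happens to the downloaded bytes are placeholders, not the article's actual implementation; it assumes a runtime with fetch available.

async function pollLivePlaylist(playlistUrl: string): Promise<void> {
  let nextSequence = 0; // sequence number of the next segment we still need
  for (;;) {
    const text = await (await fetch(playlistUrl)).text();
    const lines = text.split("\n").map((l) => l.trim());

    // #EXT-X-MEDIA-SEQUENCE is the sequence number of the first segment listed
    const seqTag = lines.find((l) => l.startsWith("#EXT-X-MEDIA-SEQUENCE:"));
    let seq = seqTag ? Number(seqTag.split(":")[1]) : 0;

    for (const line of lines) {
      if (line === "" || line.startsWith("#")) continue; // skip tags and blanks
      if (seq >= nextSequence) {
        // Download the segment; a real player would hand the bytes
        // to the demuxer / Media Source buffer here.
        await (await fetch(new URL(line, playlistUrl))).arrayBuffer();
        nextSequence = seq + 1;
      }
      seq++;
    }

    // Poll again after roughly one target duration
    await new Promise((resolve) => setTimeout(resolve, 5_000));
  }
}

Real players also watch for #EXT-X-ENDLIST (end of stream), handle discontinuities, and retry failed downloads; the loop above only shows the basic refresh idea.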

The overall architecture

The architecture of HLS is divided into three parts: Server, CDN, and Client, that is, the server, the distribution component, and the client. The architecture diagram is as follows:

HLS shortcomings

HLS is not a panacea; it has a fatal weakness: latency is very noticeable. If each TS segment is 5 seconds long and each M3U8 lists 6 segments, the delay is at least 30 seconds. Shortening the segments and listing fewer of them in the M3U8 does reduce latency, but it causes more frequent buffering and multiplies the request pressure on the server, so you have to find a middle ground.

On the other hand, HLS runs over short-lived HTTP connections, which in turn run over TCP. That means HLS has to keep establishing connections to the server, and each TCP connection costs a three-way handshake to set up, goes through slow start, and needs a four-way teardown to close.

Real Time Messaging Protocol (RTMP)

Real Time Messaging Protocol (RTMP) is a set of live video protocols developed by Macromedia and now owned by Adobe.

RTMP is based on TCP and is actually a family of protocols, including the basic RTMP protocol and variants such as RTMPT/RTMPS/RTMPE. It is a network protocol designed for real-time data communication, used mainly for audio, video, and data communication between the Flash/AIR platform and streaming/interactive servers that support RTMP.
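As a purely illustrative example of feeding such a server, the sketch below shells out to ffmpeg from Node to publish a local file to an RTMP ingest URL. The server address and stream key are hypothetical and ffmpeg is assumed to be installed; this is not from the original article.

import { spawn } from "node:child_process";

// Publish input.mp4 to an RTMP server in real time.
// -re reads the input at its native frame rate (simulating a live source),
// -c copy keeps the existing encoding, and -f flv is the container RTMP expects.
const push = spawn("ffmpeg", [
  "-re",
  "-i", "input.mp4",
  "-c", "copy",
  "-f", "flv",
  "rtmp://live.example.com/live/streamKey",
]);

push.stderr.pipe(process.stderr);
push.on("close", (code) => console.log(`ffmpeg exited with code ${code}`));

Because the TCP connection stays open and frames are sent as soon as they are ready (rather than waiting for a whole segment to be written, as with HLS), latency stays in the 1-3 s range mentioned below.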

The lack of mobile web playback support is a pain point. RTMP cannot be played in an H5 page on iOS, although a native iOS app can still pull and decode it. On the browser side, the HTML5 video tag cannot play RTMP streams either; a player library such as video.js (with the appropriate plugin) is needed.

Its main advantages:

  • Real-time performance is very good, with low latency, usually 1-3 s
  • It runs over a persistent TCP connection, so the connection does not need to be re-established over and over.

Comparison between HLS and RTMP
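Summarizing the points made above:

  • Latency: RTMP is usually 1-3 s; HLS is typically tens of seconds because of segment length and playlist size.
  • Transport: HLS runs over plain HTTP, passes easily through firewalls, proxies, and CDNs, but keeps re-establishing short connections; RTMP holds a single persistent TCP connection but needs a dedicated streaming server.
  • Compatibility: HLS plays almost anywhere and is mandatory on iOS; RTMP cannot be played in mobile web pages or by the HTML5 video tag without extra player support.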

Overall broadcast scheme

As you can see, each of the two protocols has its own advantages, so which one to use in a real project is up to you. The overall structure of live streaming is as follows.

References

  • mobile-hls-guide
  • Streaming media protocol HLS
  • Live streaming is so hot, but do you know H5 live-streaming technology? An analysis of H5 live-streaming solutions