Preface

Not long ago, I took some time to study and explore the popular field of live video streaming, understand its overall implementation process, and examine the feasibility of HTML5 live streaming on mobile.

It turns out that HLS and RTMP are the mainstream live video protocols on the web. HLS currently dominates on the mobile web (HLS suffers from latency, so RTMP via video.js can also be adopted), while RTMP dominates on PC thanks to its good real-time performance. The rest of this H5 live streaming share is organized around these two streaming protocols.

I. Video streaming protocols HLS and RTMP

1. HTTP Live Streaming

HTTP Live Streaming (HLS) is an HTTP-based video streaming protocol implemented by Apple. It is well supported by QuickTime on macOS and by Safari on iOS, and newer versions of Android also support it. Common clients such as MPlayerX and VLC support HLS as well.

The HLS protocol is based on HTTP, and a server that provides HLS needs to do two things:

  • Encoding: encode the video in H.264 and the audio in MP3 or HE-AAC, and package them into an MPEG-2 TS (Transport Stream) container;
  • Segmenting: split the encoded TS stream into small files with the .ts suffix and generate a .m3u8 plain-text index file.

The browser consumes the M3U8 file. M3U8 is similar to the M3U audio playlist format; you can simply think of it as a playlist containing multiple TS files. The player plays them one by one in order, and after playing them all, it requests the M3U8 file again to get a playlist containing the latest TS files and keeps playing. The whole live session therefore consists of a constantly updated M3U8 file and a stream of small TS files. The M3U8 must be updated dynamically, while the TS files can be served through a CDN. A typical M3U8 file looks like this:

#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=200000
gear1/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=311111
gear2/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=484444
gear3/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=737777
gear4/prog_index.m3u8
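
This is a master playlist: each gearN/prog_index.m3u8 entry is itself a media playlist that lists the actual TS segments described above, roughly like this (a sketch; the segment names and durations are illustrative):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#EXT-X-MEDIA-SEQUENCE:176
#EXTINF:5.000,
176.ts
#EXTINF:5.000,
177.ts
#EXTINF:5.000,
178.ts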

As you can see, the HLS protocol is at heart a series of ordinary HTTP requests and responses, so it adapts well and is not blocked by firewalls. But it also has an Achilles' heel: significant latency. If each TS segment is sliced at 5 seconds and one M3U8 lists 6 TS indexes, the delay is at least 30 seconds. Shortening each TS segment and reducing the number of indexes in the M3U8 does cut the latency, but it leads to more frequent buffering and multiplies the request pressure on the server, so you have to find a middle ground.

For browsers that support HLS, a plain video tag is enough to play the stream:


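For example (the playlist URL here is a placeholder):

<video controls autoplay>
  <source src="http://example.com/live/playlist.m3u8" type="application/vnd.apple.mpegurl" />
</video>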

Note: on PC, only Safari supports HLS; browsers like Chrome cannot play the M3U8 format with the HTML5 video tag alone. Some mature solutions can be adopted directly, such as sewise-player, MediaElement, videojs-contrib-hls, and jwPlayer.
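
For example, with videojs-contrib-hls (a sketch; the script paths, element id, and playlist URL are placeholders):

<video id="player" class="video-js vjs-default-skin" controls>
  <source src="http://example.com/live/playlist.m3u8" type="application/x-mpegURL" />
</video>
<script src="video.js"></script>
<script src="videojs-contrib-hls.js"></script>
<script>
  // videojs-contrib-hls registers an HLS source handler with video.js,
  // so initializing the player is enough to start playback.
  videojs('player');
</script>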

2. Real Time Messaging Protocol

Real-Time Messaging Protocol (RTMP) is a live video protocol developed by Macromedia and now owned by Adobe. This solution requires a dedicated RTMP streaming media server such as Adobe Media Server, and in the browser the player can only be implemented in Flash. Its real-time performance is excellent and its latency very low, but the inability to play on the mobile web is its Achilles' heel.

Although RTMP cannot be played in an H5 page on iOS, a native iOS app can decode and parse the stream itself. RTMP offers low latency and good real-time performance.

On the browser side, the HTML5 video tag cannot play RTMP streams directly, but playback can be implemented through video.js, as sketched below.



	


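A minimal sketch (the video.js CDN paths are assumptions; the stream address matches the Nginx setup later in this article):

<link href="http://vjs.zencdn.net/5.8.8/video-js.css" rel="stylesheet">
<video id="live" class="video-js vjs-default-skin" controls preload="auto"
       width="640" height="360" data-setup='{}'>
  <!-- video.js falls back to its Flash tech to play RTMP sources -->
  <source src="rtmp://localhost:1935/rtmplive/home" type="rtmp/flv" />
</video>
<script src="http://vjs.zencdn.net/5.8.8/video.js"></script>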

3. Comparison of video streaming protocols HLS and RTMP

Protocol | Principle | Latency | Advantages | Usage scenarios
HLS | Short HTTP connections; the server collects data over a period, generates TS segment files, and updates the M3U8 index | 10s–30s | Cross-platform | Mainly mobile web
RTMP | Long TCP connection; each moment's data is forwarded as soon as it is received | ~2s | Low latency, good real-time performance | PC + live streaming + high real-time requirements + strong interactivity

II. Forms of live streaming

At present, the main forms of live streaming pages are exemplified by YY Live and Inke Live. Their pages can be divided into three layers: ① the background video layer; ② the follow and comment module; ③ the like (heart) animation.

H5 pages of this kind are not technically difficult to implement. The layers map to these techniques: ① the bottom video background uses the video tag to play the stream; ② the follow and comment module uses WebSocket to send and receive new messages in real time, rendered with DOM and CSS3; ③ the likes use CSS3 animations. A minimal sketch of ② follows.
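
A sketch of the comment channel (the endpoint URL and message format are assumptions):

// Receive new comments in real time and render them into the DOM.
var socket = new WebSocket('ws://example.com/live/comments');

socket.onmessage = function (event) {
  var msg = JSON.parse(event.data); // e.g. { user: '...', text: '...' }
  var item = document.createElement('li');
  item.textContent = msg.user + ': ' + msg.text;
  document.querySelector('#comment-list').appendChild(item);
};

// Send a comment typed by the viewer.
function sendComment(text) {
  socket.send(JSON.stringify({ type: 'comment', text: text }));
}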

Having covered the forms of live streaming, let's look at the overall process.

III. The overall live streaming process

The overall process of live broadcast can be roughly divided into:

  • Video capture end: an audio/video input device on a computer, or the camera and microphone on a phone; currently phone video is the primary source.

  • Live streaming server: an Nginx server receives the video stream (H.264/AAC encoded) from the capture end, parses and packages it on the server side, and pushes it out in RTMP/HLS format to the player end.

  • Video player: QuickTime Player or VLC on the computer, a native player on the phone, or the H5 video tag; currently native phone players are the most common.

IV. Recording video with H5

For recording video in H5, you can use the powerful Web Real-Time Communication (WebRTC) API, a technology that supports real-time voice and video conversations in web browsers. Its drawback is that support is only good in Chrome on PC; mobile support is poor.

1. Use WebRTC to record video

① Call window.navigator.webkitGetUserMedia() to obtain video data from the user's PC camera. ② Convert the captured video stream data into a window.webkitRTCPeerConnection (a video stream data format). ③ Use WebSocket to transfer the video stream data to the server. A minimal sketch of ① and ③ follows.
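
A sketch of steps ① and ③ (MediaRecorder and the push endpoint are assumptions added for illustration):

// ① capture the PC camera (prefixed API, as described above)
navigator.webkitGetUserMedia(
  { video: true, audio: true },
  function (stream) {
    // ③ ship recorded chunks to the server over a WebSocket
    var socket = new WebSocket('ws://example.com/push');
    var recorder = new MediaRecorder(stream); // where supported
    recorder.ondataavailable = function (e) {
      socket.send(e.data);
    };
    recorder.start(1000); // emit a chunk every second
  },
  function (err) {
    console.error('getUserMedia failed:', err);
  }
);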

A note on WebRTC: although Google has been pushing WebRTC for a long time and a number of products use it, most mobile browsers do not yet support it (the latest iOS 10.0 still does not), so real video recording is better done in a native client (iOS, Android) for better results.

2. A native iOS app invokes the camera to record video

① Audio/video capture: the raw audio and video data streams can be collected with AVCaptureSession and AVCaptureDevice. ② Encode the video as H.264 and the audio as AAC; iOS has well-packaged encoding libraries (x264, FAAC, FFmpeg) for this. ③ Assemble and packetize the encoded audio and video data. ④ Establish an RTMP connection and push the stream to the server.

V. Building the Nginx + RTMP live streaming service

1. Install nginx and nginx-rtmp-module

① Tap the homebrew/nginx repository (this clones it locally):

	
brew tap homebrew/nginx

② Install the nginx-rtmp-module

	
brew install nginx-full --with-rtmp-module

2. Edit the nginx.conf configuration file to configure RTMP and HLS

Find the nginx.conf configuration file (path: /usr/local/etc/nginx/nginx.conf) and configure RTMP and HLS.

① Add the RTMP configuration after the http node (rtmp is a top-level block, a sibling of http):

rtmp {
    server {
        # listening port
        listen 1935;
        application rtmplive {
            live on;
            # maximum number of connections for the RTMP engine; default: off
            max_connections 1024;
        }
        application hls {
            live on;
            hls on;
            hls_path /usr/local/var/www/hls;
            hls_fragment 1s;
        }
    }
}

② Add the HLS configuration inside the http node:

location /hls {
    # Serve HLS fragments
    types {
        application/vnd.apple.mpegurl m3u8;
        video/mp2t ts;
    }
    root /usr/local/var/www;
    #add_header Cache-Control no-cache;
    expires -1;
}

3. Restart the nginx service

Restart the nginx service, then open http://localhost:8080 in the browser and check whether the nginx welcome page is displayed:

nginx -s reload

VI. Format conversion and encoding of the live stream

When the server receives the video stream from the capture end, it needs to parse and encode it and push it to the player end in RTMP/HLS format. Common encoding library options include x264, FAAC, and FFmpeg.

Since the FFmpeg toolset handles encoding for a wide range of audio and video formats, we can use FFmpeg for format conversion, encoding, and streaming.

1. Install FFmpeg

brew install ffmpeg

2. Stream MP4 files

Video file address: /Users/gao/Desktop/video/test.mp4. Push stream address: rtmp://localhost:1935/rtmplive/home; pull stream address: rtmp://localhost:1935/rtmplive/home.

# Push the stream over the RTMP protocol
ffmpeg -re -i /Users/gao/Desktop/video/test.mp4 -vcodec libx264 -acodec aac -f flv rtmp://10.14.221.17:1935/rtmplive/home

# Push the stream over the HLS protocol
ffmpeg -re -i /Users/gao/Desktop/video/test.mp4 -vcodec libx264 -vprofile baseline -acodec aac -ar 44100 -strict -2 -ac 1 -f flv -q 10 rtmp://10.14.221.17:1935/hls/test

Note: after pushing the stream, we can install VLC or ffplay (video players that support RTMP) and pull the stream locally for a demonstration.

3. Run the FFmpeg stream push command

① Live-stream a video file

ffmpeg -re -i /Users/gao/Desktop/video/test.mp4 -vcodec libx264 -vprofile baseline -acodec aac -ar 44100 -strict -2 -ac 1 -f flv -q 10 rtmp://192.168.1.101:1935/hls/test

② Live-stream the camera + desktop + microphone

ffmpeg -f avfoundation -framerate 30 -i "1:0" \
  -f avfoundation -framerate 30 -video_size 640x480 -i "0" \
  -c:v libx264 -preset ultrafast \
  -filter_complex 'overlay=main_w-overlay_w-10:main_h-overlay_h-10' \
  -acodec libmp3lame -ar 44100 -ac 1 \
  -f flv rtmp://192.168.1.101:1935/hls/test

For more commands, see FFmpeg commands for processing RTMP streaming media

VII. Playing live video with H5

Both iOS and Android natively support the HLS protocol on mobile, so after setting up the capture end and the streaming service, you can play the live stream directly with a video tag on the H5 page, as sketched below.

    
      
    

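A minimal sketch, assuming the Nginx server configured above serves the playlist at /hls/test.m3u8:

<video controls autoplay webkit-playsinline>
  <source src="http://localhost:8080/hls/test.m3u8" type="application/vnd.apple.mpegurl" />
  <p class="warning">Your browser does not support HTML5 video.</p>
</video>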

PS: ① Add the webkit-playsinline attribute to the video tag (supported on iOS) to keep the video playing inline in the page. ② For the problem of the video tag having the highest z-order in the WeChat browser, you need to apply to be added to a whitelist; see bbs.mb.qq.com/thread-1242…

VIII. Summary

This article has walked through the entire process of capturing and uploading video, processing and pushing the stream on the server, and playing live video on the H5 page, explaining the implementation principles of live video in detail. Along the way you will run into many performance problems:

① H5 HLS playback requires H.264 + AAC encoding.

② For H5 HLS playback lag, the server should adopt a good segmenting strategy and put the TS files on a CDN, while the front end can try DNS caching and similar measures.

③ To achieve better real-time interaction, an H5 live page can also use the RTMP protocol and play it through video.js.


Last update: 2016-10-13 11:06:15