Live broadcasting has become indispensable on the Internet; Bilibili, Huya, and Douyu are all reportedly preparing to go public.

Futu Securities, as an Internet brokerage, naturally offers live streaming as well. At present, live streaming in the Futu NiuNiu app (hereinafter "Niuniu") is used mainly for investor education and publicity.

Futu NiuNiu is a leading trading app for Hong Kong and US stocks.

Article structure

  1. Live streaming in Niuniu
  2. The live streaming pipeline
  3. Live streaming on the Web
    • The HLS protocol
    • The RTMP protocol
    • HLS vs RTMP
  4. Hands-on: building RTMP and HLS streaming services
    • Install nginx, nginx-rtmp-module, and FFmpeg (all on Mac)
    • Configure RTMP and HLS in nginx.conf
    • Restart nginx
    • Check whether the port is open
    • Run the FFmpeg push commands
    • Code
    • Results
  5. Pitfalls encountered
  6. Conclusion

Live streaming in Niuniu

Niuniu currently supports:

  1. Mobile Niuniu (Android, iOS): the broadcaster side supports camera streaming only; the viewer side supports playback

  2. PC Niuniu: the broadcaster side supports camera streaming, screen recording, and camera + screen recording; the viewer side supports playback

  3. Mac Niuniu and the Web: no broadcaster support; the viewer side supports playback

The live streaming pipeline

A live video pipeline can be divided into capture, pre-processing, encoding, transmission, decoding, and rendering.

Capture

Capture is generally done by the client (iOS, Android, PC, or other tools such as OBS). iOS is relatively straightforward; Android needs some per-device adaptation work; PC is the most troublesome because of all kinds of quirky camera drivers. Fortunately, Tencent Cloud has dealt with these problems for us.

Pre-processing

This stage mainly covers live beautification. Beautification algorithms require GPU programming and people who understand image-processing algorithms; there is no good open-source implementation, so you have to research the papers yourself. The hard part is not the beauty effect itself but striking a balance between GPU usage and that effect.

Encoding

Hardware encoding is a must: software-encoding 720p is hopeless, and even if you can barely manage it, the CPU will overheat and cook the camera. Encoding is about finding the best balance among resolution, frame rate, bitrate, GOP, and the other parameters.

Transmission

It is generally handed over to the CDN service provider.

Decoding

Decoding reverses the encoding step; on the Web, what gets decoded is the HLS stream.

Rendering

Rendering is mainly handled by the player. The most common player on the Web is video.js; at present we use the Tencent Cloud player.

In fact, a complete live broadcast involves far more than the steps above. Below is the full flow chart of Tencent Cloud's live streaming solution:

Live streaming on the Web

On the Web, live streaming is mostly about playback. Mainstream Web playback involves HLS and RTMP, so we will focus on those two protocols.

HLS

HLS (HTTP Live Streaming) is an HTTP-based video streaming protocol implemented by Apple. It is supported by QuickTime on macOS and by Safari on iOS, and newer versions of Android also support it.

Some common clients such as MPlayerX and VLC support the HLS protocol as well. To play HLS in Chrome, you need videojs-contrib-hls.js to parse the stream.

HLS flow chart

Server component

The server component of HLS takes the media input stream, encodes it as MPEG-4 (H.264 video and AAC audio), and packages it into an MPEG-2 transport stream. As shown in the figure, the transport stream then passes through a stream segmenter, whose job is to break the MPEG-2 transport stream into small fragments saved as one or more series of .ts media files. This is done with an encoding tool such as Apple's stream segmenter. (Video segments are .ts files; pure audio is encoded as audio snippets, usually AAC with ADTS headers, MP3, or AC-3.) The server can encode in hardware or in software; either way, its job is to slice the media as described above and manage the slices through index files. Software slicing usually uses tools provided by Apple or third-party integrations.

Distribution component

The segmenter also creates an index file containing the list of these media files plus metadata. It is usually a single .m3u8 playlist. Each list entry carries a URL the client can access, and the client then requests those URLs in order.

Index file structure diagram

Primary index file
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=409037,RESOLUTION=416x234,CODECS="mp4a.40.2, avc1.42001e"
Gear1/prog_index.m3u8


First line: #EXTM3U

The first line of every M3U file must be this tag; it serves as the format identifier.

Second line: #EXT-X-STREAM-INF

This tag's attribute list describes the stream it points to, including whether it carries video or audio.

Its attributes include:

  1. BANDWIDTH: the bitrate
  2. PROGRAM-ID: a unique ID (deprecated in later versions of the protocol)
  3. CODECS: the codecs of the stream
  4. RESOLUTION: the resolution
Subindex file
#EXTM3U
#EXT-X-TARGETDURATION:11
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:10.133333,
fileSequence0.ts
#EXTINF:10.000666,
fileSequence1.ts
#EXTINF:10.667334,
fileSequence2.ts
#EXTINF:9.686001,
fileSequence3.ts
#EXTINF:9.768665,
fileSequence4.ts
#EXTINF:10.000000,
fileSequence5.ts
#EXT-X-ENDLIST


#EXTM3U: the M3U file header; must be the first line.

#EXT-X-TARGETDURATION: the maximum duration of any TS fragment.

#EXT-X-VERSION: the protocol version.

#EXT-X-MEDIA-SEQUENCE: the sequence number of the first TS fragment in the playlist.

#EXT-X-PLAYLIST-TYPE: information about the mutability of the playlist. It applies to the entire playlist file and is optional.

#EXTINF: extra info about a TS fragment, such as its duration.

Differences between the primary index file and subindex files:
  • Both the primary and the subindex files are .m3u8 playlists
  • The primary index file is downloaded only once, but for live programming the subindex file is reloaded periodically
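To make the tags above concrete, here is a minimal sketch of what a parser does with a subindex file like the one shown earlier. This is illustrative only, not the actual videojs-contrib-hls implementation; the function name and return shape are our own.

```javascript
// Minimal sketch: extract segment URIs and durations from a subindex playlist.
// Illustrative only -- not the real videojs-contrib-hls parser.
function parsePlaylist(text) {
  const lines = text.split('\n').map(l => l.trim()).filter(Boolean);
  const segments = [];
  let pendingDuration = null;
  let ended = false;
  for (const line of lines) {
    if (line.startsWith('#EXTINF:')) {
      // "#EXTINF:10.133333," -> 10.133333 (parseFloat stops at the comma)
      pendingDuration = parseFloat(line.slice('#EXTINF:'.length));
    } else if (line === '#EXT-X-ENDLIST') {
      // VOD playlists end with this tag; live playlists omit it and keep growing
      ended = true;
    } else if (!line.startsWith('#')) {
      // A non-tag line is a segment URI; pair it with the preceding #EXTINF
      segments.push({ uri: line, duration: pendingDuration });
      pendingDuration = null;
    }
  }
  return { segments, live: !ended };
}
```

When `live` is true, a player would re-fetch this playlist periodically (roughly every #EXT-X-TARGETDURATION seconds) to pick up new segments.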

Client component

Distribution is handled by standard web servers, which accept client requests and deliver the associated resources to the client.

The videojs-contrib-hls.js parsing process

videojs-contrib-hls.js parsing diagram

HLS, simply put, breaks the entire stream into small pieces that are downloaded as HTTP files, a small portion at a time.

As mentioned earlier, playing live video in H5 requires an .m3u8 file, which stores the video stream's metadata as defined by the HLS protocol.

Each .m3u8 file references several TS files; those TS files hold the actual video data, while the m3u8 file stores only configuration information and the paths of the TS files. The parser (videojs-contrib-hls.js) parses this file and fetches the corresponding TS files to play. So, for speed, the .m3u8 file is usually placed on the web server while the TS files are served from a CDN.
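One detail behind this web-server/CDN split: segment URIs in a playlist are resolved relative to the playlist's own URL, so a relative URI fetches from the same host as the .m3u8, while an absolute URI can send segment traffic to a different host such as a CDN. A small sketch (both hostnames are hypothetical):

```javascript
// Segment URIs are resolved against the playlist URL using standard URL rules.
// Both hostnames below are illustrative placeholders.
const playlistUrl = 'http://live.example.com/hls/test.m3u8';

// Relative URI: the segment is fetched from the same host as the playlist.
const relative = new URL('fileSequence0.ts', playlistUrl).href;
// -> 'http://live.example.com/hls/fileSequence0.ts'

// Absolute URI in the playlist: the segment can live on a CDN instead.
const absolute = new URL('http://cdn.example.com/hls/fileSequence0.ts', playlistUrl).href;
// -> 'http://cdn.example.com/hls/fileSequence0.ts'
```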

RTMP

Real-Time Messaging Protocol (RTMP) is a family of live-video protocols developed by Macromedia and now owned by Adobe. This approach requires a dedicated RTMP streaming media server such as Adobe Media Server, and in the browser it can only be played through Flash. Its real-time performance is excellent and its latency low, but its inability to play on the mobile Web is a real pain point.

On the browser side, the HTML5 video tag cannot play RTMP video; you can use video.js to do it.

<link href="http://vjs.zencdn.net/5.8.8/video-js.css" rel="stylesheet">
 
<video id="example_video_1" class="video-js vjs-default-skin" controls preload="auto" width="640" height="264" loop="loop" webkit-playsinline>
<source src="rtmp://10.14.221.17:1935/rtmplive/home" type='rtmp/flv'>
</video>
 
<script src="http://vjs.zencdn.net/5.8.8/video.js"></script>
<script>
videojs.options.flash.swf = 'video.swf';
videojs('example_video_1').ready(function() {
this.play();
});
</script>

Pros and cons of HLS vs RTMP

| Protocol | Principle | Latency | Advantages | Disadvantages | Usage scenarios |
| --- | --- | --- | --- | --- | --- |
| HLS (HTTP) | Collects data over a time window, generating TS slice files and an updated m3u8 file | 10s–30s | Cross-platform | High latency | Mobile |
| RTMP (TCP) | Data is forwarded the moment it is received | 2s | Low latency, good real-time performance | Poor cross-platform support | PC + live + high real-time requirements + strong interaction |
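The trade-off in the table can be expressed as a page-level source-selection sketch: RTMP/Flash for low-latency desktop playback, HLS everywhere else. The crude user-agent check and the URLs are illustrative placeholders, not a production-grade capability test.

```javascript
// Pick a video.js source per the HLS vs RTMP trade-offs above.
// The UA sniff and URLs are illustrative placeholders only.
function pickSource(userAgent, flashEnabled) {
  const isMobile = /Android|iPhone|iPad/i.test(userAgent);
  if (!isMobile && flashEnabled) {
    // Desktop with Flash: RTMP for low latency and strong interaction
    return { src: 'rtmp://127.0.0.1:1935/rtmplive/home', type: 'rtmp/flv' };
  }
  // Mobile, or no Flash: fall back to cross-platform HLS
  return { src: 'http://www.tony.com/hls/test.m3u8', type: 'application/x-mpegURL' };
}
```

The chosen object can be passed straight to `player.src(...)` in video.js.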

Hands-on: building RTMP and HLS streaming services

Install nginx

brew install nginx

Install nginx-rtmp-module

brew install nginx-full --with-rtmp-module

Then install FFmpeg (a complete open-source toolset for recording, converting, and encoding/decoding audio and video; we will use it for pushing the stream and for slicing).

brew install ffmpeg

Configure RTMP and HLS in nginx.conf

Find the nginx.conf configuration file (/usr/local/etc/nginx/nginx.conf) and add the corresponding live-stream configuration.

rtmp {
    server {
        listen 1935;
        application rtmplive {
            live on;
            # maximum number of connections for the rtmp engine; default: off
            max_connections 1024;
        }
        application hls {
            live on;
            hls on;
            hls_path /Users/youname/Documents/notes/live/public/hls;
        }
    }
}

Add the HLS configuration to the http block:

location /hls {
    # Serve HLS fragments
    types {
        application/vnd.apple.mpegurl m3u8;
        video/mp2t ts;
    }
    root /Users/youname/Documents/notes/live/public;
    add_header Cache-Control no-cache;
    expires -1;
}

Restart nginx

sudo nginx -s reload

Check whether the port is open

netstat -an | grep 1935

If output like the following appears, the port is open.

The same check works for the HTTP port.

The service setup is now complete.

  1. RTMP push URL: rtmp://127.0.0.1:1935/rtmplive/home
  2. HLS push URL: rtmp://localhost:1935/hls/test

Run the FFmpeg push commands

We push an MP4 file as the stream. My video file path is /Users/youname/Desktop/w01661pl9vw.p702.1.mp4.

The RTMP push command is as follows:

ffmpeg -re -i /Users/youname/Desktop/w01661pl9vw.p702.1.mp4 -vcodec libx264 -acodec aac -f flv rtmp://127.0.0.1:1935/rtmplive/home

The HLS push command is as follows:

ffmpeg -re -i /Users/youname/Desktop/w01661pl9vw.p702.1.mp4 -vcodec libx264 -vprofile baseline -acodec aac -ar 44100 -strict -2 -ac 1 -f flv -q 10 rtmp://127.0.0.1:1935/hls/test

Parameters:

  1. Video file path: /Users/youname/Desktop/w01661pl9vw.p702.1.mp4
  2. Push URLs: rtmp://127.0.0.1:1935/rtmplive/home and rtmp://localhost:1935/hls/test

If output like the figure below appears after running the command, it succeeded.

For more FFmpeg commands, refer to the FFmpeg command set.

Live streaming via the Web

With both streams being pushed, we use the video.js player for playback (in Niuniu, the Tencent Cloud player is used).

RTMP playback code

Note that the address in src is the RTMP push URL. If the player reports "the current system environment does not support this video format", you need to enable Flash in the browser to play the video.


      
<html>
<head>
<meta charset="UTF-8">
<title>Insert title here</title>
<link href="http://vjs.zencdn.net/5.19/video-js.min.css" rel="stylesheet">
<script src="http://vjs.zencdn.net/5.19/video.min.js"></script>
</head>
<body>
<video
    id="my-player"
    class="video-js"
    controls
    preload="auto"
    data-setup='{}'>
    <source src='rtmp://127.0.0.1:1935/rtmplive/home' type='rtmp/flv'/>
</video>
<script type="text/javascript">
   var options = {};
   var player = videojs('my-player', options, function onPlayerReady() {
     videojs.log('Your player is ready!');
     // In this context, `this` is the player that was created by Video.js.
     this.play();
     // How about an event listener?
     this.on('ended', function() {
       videojs.log('Awww...over so soon?!');
     });
   });
</script>
</body>
</html>

HLS playback code

The src below is the slice (m3u8) address.


      
<html>
<head>
<meta charset=utf-8 />
<title>videojs-contrib-hls embed</title>
  <link href="https://unpkg.com/video.js/dist/video-js.css" rel="stylesheet">
  <script src="https://unpkg.com/video.js/dist/video.js"></script>
  <script src="https://unpkg.com/videojs-contrib-hls/dist/videojs-contrib-hls.js"></script>
</head>
<body>
  <video id="my_video_1" class="video-js vjs-default-skin" controls preload="auto" width="640" height="268" 
  data-setup='{}'>
    <source src="http://www.tony.com/hls/test.m3u8" type="application/x-mpegURL">
  </video>
</body>
</html>

Results

Let's see how it actually works.

RTMP result

HLS result

TS and m3u8 files

Pitfalls encountered in practice

  • Autoplay restrictions
  • Inconsistent player behavior across platforms
  • Embedded pages are hard to debug
  • Communication between Native and the Web

Autoplay restrictions

In browsers with the X5 kernel, playback must be triggered from a standard user-gesture event: touchend, click, dblclick, or keydown.
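A common workaround is to defer play() until the first qualifying gesture and then detach the listeners. This sketch is illustrative (the helper name is ours); it is written as a plain function over anything with add/removeEventListener so the wiring can be exercised without a DOM.

```javascript
// Start playback on the first user gesture, then remove all the listeners
// so play() fires only once. `target` is e.g. `document`; `play` is e.g.
// `() => player.play()`. Illustrative helper, not a library API.
function playOnFirstGesture(target, play) {
  const events = ['touchend', 'click', 'dblclick', 'keydown'];
  function handler() {
    // Detach everything first so a rapid second gesture cannot re-trigger
    events.forEach(ev => target.removeEventListener(ev, handler));
    play();
  }
  events.forEach(ev => target.addEventListener(ev, handler));
  return handler;
}
```

In a page this would be wired as `playOnFirstGesture(document, () => player.play());` once the player is ready.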

Inconsistent player behavior across platforms

Many Android browsers replace the video tag with the native player's own styles and behaviors, making its appearance and behavior hard to control.

Embedded pages are hard to debug

We currently debug with Weinre, but Weinre cannot show what actually happens in native. For example, when the web calls native and native is supposed to respond with some effect, that effect is invisible in Weinre.

Communication between Native and the Web

The options are URL schemes and JSBridge. A scheme only lets the web call native, never the other way around; JSBridge lets native call the web, but if the iframe has not finished loading, native cannot notify the web.

Conclusion

End-to-end live streaming is a very complicated process, and many performance problems arise during implementation, requiring trade-offs between performance and timeliness. TS and m3u8 files should be cached as much as possible, and push streaming used in the browser wherever it is supported.

Follow the Futu Web development team for more. Original link
