Preface

Due to the impact of the epidemic, online modes such as live-stream shopping and live-streamed education have been pushed to a peak. Out of both business needs and personal interest, I have been studying live-streaming technology in H5. PS: if you already know something about live streaming, you can skip this whole article; it only covers the basics:

  1. A little about the architecture of live streaming.
  2. Common protocols used for live streaming.
  3. Live streaming in WeChat mini programs.
  4. Building a local H5 live-streaming demo.

Structure of live streaming

As shown above, there are three pieces. First, pushing the stream: you can think of it simply as your camera, which captures images and pushes a stream of data up to the server. Second, pulling the stream: you can think of it as your player, which is responsible for pulling the image data stream down from the server. In between sits the media streaming server; we usually use a third-party service directly, such as Tencent Cloud Live or Alibaba Cloud Live, which can be used right after binding a domain name and creating an application. Of course, at the end of this article we will build a very simple media streaming server ourselves.

Common live streaming protocols

HLS

There is a lot of similar information online. To sum up: designed by Apple, high latency, but superb compatibility (it works directly with the H5 video tag).

The principle of this protocol is simple: an HTTP request downloads a playlist file (.m3u8) that lists the slice files (.ts), and the player then downloads those TS files and feeds them to the video element to play.

As shown above, each m3u8 file lists three .ts slice files, each with targetDuration = 9s. The three TS files on the left are 38, 39, 40; the three on the right are 39, 40, 41; and so on. New playlist files keep being downloaded, and the new TS files they list are downloaded to the browser and played.

Such a large delay in HLS is understandable: the server must wait for multiple TS slices to be generated before they can be handed to the player, and with each TS slice file around 10s long, this delay is unavoidable by design.
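To make this concrete, here is a minimal sketch of what such a rolling live playlist might look like (the segment names and durations are hypothetical, matching the behavior described above):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:9
    #EXT-X-MEDIA-SEQUENCE:38
    #EXTINF:9.0,
    segment-38.ts
    #EXTINF:9.0,
    segment-39.ts
    #EXTINF:9.0,
    segment-40.ts

On the next refresh, EXT-X-MEDIA-SEQUENCE would advance to 39 and the playlist would list segments 39, 40 and 41.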

RTMP

This protocol can also be summed up quickly: it is Adobe's proprietary protocol (so browser playback requires a plug-in, namely Flash Player), it has extremely low latency, and it runs over a long-lived TCP connection with no repeated handshakes. While the latency is excellent, the dependence on Flash is a real pain, and browsers have announced that they will no longer support Flash, so for the web this protocol ends here.

DASH

Dash.js is compatible with almost all major browsers (those that support Media Source Extensions) and has low latency. This protocol also slices the stream into files; YouTube and Netflix are said to use it. If you open the Network panel it looks just like HLS: the player is constantly downloading new files.
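A minimal sketch of playing a live stream with dash.js, following its quick-start API (the manifest URL is a placeholder):

    // assumes dash.all.min.js has been loaded on the page
    var video = document.querySelector('video');
    var player = dashjs.MediaPlayer().create();
    // initialize(element, source, autoPlay)
    player.initialize(video, 'https://example.com/live/stream.mpd', true);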

HTTP-FLV

Mainstream domestic live-streaming platforms all use this protocol. Instead of continuously downloading slice files, the server omits the content-length header so that a single HTTP request can keep downloading without interruption. As shown in the picture above, I picked a random network request from a Bilibili live stream: the file name ends in .flv, and you can see that the file had already grown to more than 40 MB when I took the screenshot.

The latency of the HTTP-FLV protocol is very low; my own tests against Tencent Cloud stayed within 5s. Also worth mentioning is the open-source library flv.js, which arguably pioneered FLV-format web live streaming in China. It too uses the Media Source Extensions interface so that the H5 video tag can play FLV files.
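A minimal sketch of playing an HTTP-FLV live stream with flv.js (the stream URL is a placeholder):

    var videoElement = document.querySelector('video');
    if (flvjs.isSupported()) {
      var flvPlayer = flvjs.createPlayer({
        type: 'flv',
        isLive: true,  // a live stream rather than a VOD file
        url: 'https://example.com/live/stream.flv'
      });
      flvPlayer.attachMediaElement(videoElement);
      flvPlayer.load();
      flvPlayer.play();
    }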

Media Source Extensions (MSE)

I have mentioned Media Source Extensions twice now: once for playing dash and once for playing flv in H5. Its API structure diagram is as follows. As the diagram shows, a MediaSource contains SourceBuffers, and a SourceBuffer contains Tracks; each Track is what we commonly call an audio track or a video track. We can operate on tracks through the provided interfaces, for example switching tracks, playing or pausing a track, or even jumping to a certain frame (we won't dig deeper, because after all it hurts to know too much ~). The following example code briefly shows how to use MSE:

    var vidElement = document.querySelector('video');

    if (window.MediaSource) {
      var mediaSource = new MediaSource();
      // the video element plays from an object URL backed by the MediaSource
      vidElement.src = URL.createObjectURL(mediaSource);
      mediaSource.addEventListener('sourceopen', sourceOpen);
    } else {
      console.log('The Media Source Extensions API is not supported.');
    }

    function sourceOpen(e) {
      URL.revokeObjectURL(vidElement.src);
      var mime = 'video/webm; codecs="opus, vp9"';
      var mediaSource = e.target;
      var sourceBuffer = mediaSource.addSourceBuffer(mime);
      var videoUrl = 'droid.webm';
      fetch(videoUrl)
        .then(function (response) {
          return response.arrayBuffer();
        })
        .then(function (arrayBuffer) {
          sourceBuffer.addEventListener('updateend', function (e) {
            if (!sourceBuffer.updating && mediaSource.readyState === 'open') {
              mediaSource.endOfStream();
            }
          });
          // feed the downloaded bytes into the SourceBuffer
          sourceBuffer.appendBuffer(arrayBuffer);
        });
    }

When we assign the mediaSource object URL to the video's src, the sourceopen event fires. In its callback we fetch the video data, take it as an arrayBuffer, and appendBuffer it into the sourceBuffer. My understanding is that Media Source Extensions can turn content into a stream and feed it to the video tag, so that content the video tag cannot play natively becomes playable.

    MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E, mp4a.40.2"')

In addition, the interface above can be used to check whether MSE supports a particular encoding and container format.

In Conclusion

To recap: HLS has the best compatibility but the highest latency; RTMP has the lowest latency but depends on the now-abandoned Flash; DASH and HTTP-FLV both reach low latency on the web by feeding data to the video tag through Media Source Extensions.

Live streaming in WeChat mini programs

Embedding live streaming in a WeChat mini program is very simple: use the live-pusher component to push the stream (it basically feels like turning on the camera and recording) and the live-player component to play it. For the media server in the middle you can use Tencent Cloud directly (it is Tencent's own product, after all). The figure above shows the generated pull and push addresses. After binding a live-streaming domain name in the cloud console, you can enter a custom AppName and StreamName in the console's address-generator tool to produce the corresponding push and pull addresses, and then pass them to live-pusher and live-player as their URLs. Note that WeChat mini programs only support RTMP- and FLV-format pull streams.
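A minimal sketch of the two components in WXML (the addresses are placeholders for what the tool above generates):

    <!-- push side: url must be an RTMP push address -->
    <live-pusher url="rtmp://push.example.com/live/streamName" mode="RTC" autopush />
    <!-- play side: src accepts RTMP and FLV pull addresses -->
    <live-player src="rtmp://play.example.com/live/streamName" mode="live" autoplay />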

Build a live demo locally

Pushing the stream with OBS

This tool is very powerful: you can broadcast your entire desktop, your camera, or even a single window. In Settings there is a Stream option; enter the corresponding server address there and you are good to go.
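If you prefer the command line, ffmpeg can push a test stream to the same kind of address (a sketch; the input file and the RTMP address, which matches the node-media-server config below, are assumptions):

    ffmpeg -re -i test.mp4 -c copy -f flv rtmp://localhost:1936/live/stream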

Media server

node-media-server

This media server plays the same role as Tencent Cloud above: it takes the stream you push up, transcodes it, and serves it to the pulling side. Here I introduce a media server written in Node called node-media-server. Its configuration is simple: port 1936 serves as the RTMP push port, port 8002 is opened as the HTTP pull port, and a local ffmpeg is used for transcoding, so that on port 8002 the stream can be pulled under various protocols, as sketched below:
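A minimal sketch of such a configuration, following the shape of node-media-server's README (the ffmpeg path and the app name 'live' are assumptions; adjust them to your machine):

    const NodeMediaServer = require('node-media-server');

    const config = {
      rtmp: {
        port: 1936,        // RTMP push port
        chunk_size: 60000,
        gop_cache: true,
        ping: 30,
        ping_timeout: 60
      },
      http: {
        port: 8002,        // HTTP pull port (http-flv, hls, dash)
        allow_origin: '*'
      },
      trans: {
        ffmpeg: '/usr/local/bin/ffmpeg',  // path to the local ffmpeg binary
        tasks: [
          {
            app: 'live',
            hls: true,
            hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]',
            dash: true,
            dashFlags: '[f=dash:window_size=3:extra_window_size=5]'
          }
        ]
      }
    };

    new NodeMediaServer(config).run();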

NGINX RTMP Module

You can also build the media server with NGINX and its RTMP module; a minimal rtmp block looks like this:

rtmp {
  server {
    listen 1935; # Listen on standard RTMP port
    chunk_size 4000;

    application show {
        live on;
        # Turn on HLS
        hls on;
        hls_path /mnt/hls/;
        hls_fragment 3;
        hls_playlist_length 60;
        # disable consuming the stream from nginx as rtmp
        deny play all;
    }
  }
}
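Note that this only makes NGINX write HLS slices into /mnt/hls/; to pull them over HTTP you also need an http server block, along the lines of the standard nginx-rtmp HLS setup (the port and paths here are assumptions):

    http {
      server {
        listen 8080;

        location /hls {
          # serve the playlist and slices with the right MIME types
          types {
            application/vnd.apple.mpegurl m3u8;
            video/mp2t ts;
          }
          root /mnt;  # so that /hls maps to /mnt/hls
          add_header Cache-Control no-cache;
          add_header Access-Control-Allow-Origin *;
        }
      }
    }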

Pull stream player

Finally, pick a player for pulling the stream: you can use flv.js to play HTTP-FLV, or dash.js to play DASH. Besides web players, you can of course also download a local VLC player app, which can play content from all of these protocols ~ Run the local demo and you can see the effect.