In fact, building such a complete live video broadcast system involves most of today's mainstream Internet technologies, mainly in the following areas:

I. Building the live streaming platform source code: the mobile push end

The push end is the broadcaster's end. It captures video data through the phone's camera and audio data through the microphone; after a series of pre-processing, encoding, and packaging steps, the stream is pushed to a CDN for distribution.
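
As a rough architectural sketch, this pipeline can be modeled as a chain of stages. The Kotlin interfaces below are hypothetical names for illustration only, not any real SDK's API:

```kotlin
// Hypothetical stage interfaces for the push pipeline, for illustration only.
interface Capturer { fun nextFrame(): ByteArray }                 // camera / microphone
interface PreProcessor { fun process(raw: ByteArray): ByteArray } // beauty, watermark, ...
interface Encoder { fun encode(frame: ByteArray): ByteArray }     // H.264 / AAC
interface Packetizer { fun wrap(es: ByteArray): ByteArray }       // FLV tags / RTMP chunks
interface Pusher { fun send(packet: ByteArray) }                  // to the CDN ingest point

// Each captured frame flows through the stages in order.
fun pushLoop(c: Capturer, p: PreProcessor, e: Encoder, m: Packetizer, out: Pusher) {
    while (true) out.send(m.wrap(e.encode(p.process(c.nextFrame()))))
}
```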

1. Capture

The mobile live streaming SDK captures audio and video data directly through the phone's camera and microphone. On iOS there are relatively few hardware models, so this is comparatively simple; on Android the market contains a huge variety of hardware models, so it is hard for a single library to support all of them.
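
As a concrete example of the capture step, here is a minimal Kotlin sketch of microphone capture on Android using the platform AudioRecord API (video capture would go through Camera2 analogously); the loop is simplified, and a real app must hold the RECORD_AUDIO permission:

```kotlin
import android.media.AudioFormat
import android.media.AudioRecord
import android.media.MediaRecorder

// Minimal microphone capture. 44.1 kHz mono 16-bit PCM is a common
// choice for live streaming before AAC encoding.
fun captureAudio(onPcm: (ByteArray, Int) -> Unit) {
    val sampleRate = 44100
    val minBuf = AudioRecord.getMinBufferSize(
        sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT)
    val recorder = AudioRecord(
        MediaRecorder.AudioSource.MIC, sampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minBuf)
    recorder.startRecording()
    val buf = ByteArray(minBuf)
    while (true) {                       // in practice: check a running flag
        val n = recorder.read(buf, 0, buf.size)
        if (n > 0) onPcm(buf, n)         // hand the PCM data to the encoder stage
    }
}
```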

2. Pre-processing

This step applies beautification, watermarking, blurring, and other effects. Beautification is practically a standard feature of live streaming apps; in our survey we found that many products were abandoned simply because they lacked it. In addition, Chinese regulators have explicitly required that all live streams carry a watermark and remain available for playback for at least 15 days.
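
As a toy illustration of the watermarking part of pre-processing, the Kotlin snippet below stamps text onto a frame with Android's Canvas; production beauty filters typically run as GPU shaders (OpenGL ES) instead:

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint

// Stamp a semi-transparent text watermark onto a video frame.
fun watermark(frame: Bitmap, text: String): Bitmap {
    val out = frame.copy(Bitmap.Config.ARGB_8888, true)  // mutable copy
    val canvas = Canvas(out)
    val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        color = Color.WHITE
        alpha = 160          // semi-transparent
        textSize = 36f
    }
    canvas.drawText(text, 16f, out.height - 24f, paint)  // bottom-left corner
    return out
}
```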

3. Encoding

To make mobile video easier to push, pull, and store, video compression is normally applied to reduce its size. H.264 is currently the most widely used video codec; for audio, AAC is the most common format, with alternatives such as MP3 and WMA.
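
As a sketch of the encoding step, the following Kotlin snippet configures a hardware H.264 encoder through Android's MediaCodec; the bitrate and frame-rate values are illustrative, not recommendations:

```kotlin
import android.media.MediaCodec
import android.media.MediaCodecInfo
import android.media.MediaFormat

// Configure a hardware H.264 ("video/avc") encoder.
fun createVideoEncoder(width: Int, height: Int): MediaCodec {
    val format = MediaFormat.createVideoFormat(
            MediaFormat.MIMETYPE_VIDEO_AVC, width, height).apply {
        setInteger(MediaFormat.KEY_BIT_RATE, 1_500_000)   // ~1.5 Mbps
        setInteger(MediaFormat.KEY_FRAME_RATE, 25)
        setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2)   // a keyframe every 2 s
        setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface) // feed frames via a Surface
    }
    val codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC)
    codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
    return codec
}
```

The audio side is analogous: build a format with MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, 44100, 1) and create an AAC encoder the same way.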

4. Stream pushing

To push the stream, the encoded audio and video data must be encapsulated in a transport protocol and turned into stream data. Common streaming protocols include RTSP, RTMP, and HLS. RTMP's transmission latency is usually 1-3 seconds, and since mobile live streaming demands high real-time performance, RTMP has become the most widely used push protocol. Finally, the audio/video stream data is pushed onto the network through a QoS algorithm and distributed via CDN.
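
To make the encapsulation step concrete: RTMP messages carry what are essentially FLV tag bodies, each preceded by an 11-byte tag header. Here is a minimal Kotlin sketch of building that header; chunking the result into RTMP packets and the socket handshake (e.g. via librtmp) are omitted:

```kotlin
// Build the 11-byte FLV tag header that precedes each audio/video payload.
fun flvTagHeader(tagType: Int, dataSize: Int, timestampMs: Int): ByteArray {
    val h = ByteArray(11)
    h[0] = tagType.toByte()                // 8 = audio, 9 = video, 18 = script data
    h[1] = (dataSize ushr 16).toByte()     // DataSize, 24-bit big-endian
    h[2] = (dataSize ushr 8).toByte()
    h[3] = dataSize.toByte()
    h[4] = (timestampMs ushr 16).toByte()  // Timestamp, lower 24 bits
    h[5] = (timestampMs ushr 8).toByte()
    h[6] = timestampMs.toByte()
    h[7] = (timestampMs ushr 24).toByte()  // TimestampExtended (upper 8 bits)
    // h[8..10] = StreamID, always 0
    return h
}
```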

II. Building the live streaming platform source code: server-side processing

To adapt the stream to the different protocols of different terminals, the server must also transcode the stream and handle tasks such as screenshots, recording, and watermarking.
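
In practice, server-side transcoding is commonly delegated to FFmpeg rather than written from scratch. A Kotlin sketch that shells out to ffmpeg to pull an RTMP ingest stream, re-encode it to H.264/AAC, and repackage it as HLS; the URLs and paths are placeholders:

```kotlin
// Transcode one ingest stream to HLS by supervising an ffmpeg process.
fun transcodeToHls(ingestUrl: String, outDir: String) {
    val cmd = listOf(
        "ffmpeg", "-i", ingestUrl,        // e.g. "rtmp://ingest.example.com/live/room1"
        "-c:v", "libx264", "-c:a", "aac", // re-encode video and audio
        "-f", "hls",
        "-hls_time", "4",                 // 4-second segments
        "$outDir/index.m3u8"
    )
    val proc = ProcessBuilder(cmd).inheritIO().start()
    proc.waitFor()   // a real deployment would supervise and restart this process
}
```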

III. Building the live streaming platform source code: the player end

1. Stream pulling

Pulling a stream is essentially the reverse of pushing one. The player first fetches the bitstream; the standard pull formats include RTMP, HLS, and FLV. RTMP is a proprietary protocol from Adobe that enjoys good support from open-source software and libraries, such as the open-source librtmp library. Any client that supports Flash Player can play an RTMP live stream trivially, with a latency of generally 1-3 seconds.
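
On Android, the pull side is often just a player library pointed at the stream URL. A minimal sketch using the ExoPlayer 2 API (the URL is a placeholder; HLS plays with the exoplayer-hls module, while RTMP needs the separate RTMP extension):

```kotlin
import android.content.Context
import com.google.android.exoplayer2.ExoPlayer
import com.google.android.exoplayer2.MediaItem

// Start playback of a live stream URL and return the player handle.
fun playLiveStream(context: Context, url: String): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()
    player.setMediaItem(MediaItem.fromUri(url)) // e.g. "https://cdn.example.com/live/index.m3u8"
    player.prepare()
    player.play()
    return player
}
```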

2. Decoding and rendering

This refers to extracting the original data back out of the audio and video streams, that is, actually playing them. The H.264 encoding mentioned above (like the newer H.265) is lossy compression, so the extracted data carries some information loss relative to the original samples. Retaining the best possible picture at the smallest possible size through careful tuning of encoding parameters has therefore become a core trade secret of video companies.
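
For completeness, a minimal Kotlin sketch of the decode-and-render step, configuring a hardware H.264 decoder via MediaCodec to render directly onto a Surface (e.g. from a SurfaceView); the input-feeding loop through dequeueInputBuffer/queueInputBuffer is omitted:

```kotlin
import android.media.MediaCodec
import android.media.MediaFormat
import android.view.Surface

// Configure a hardware H.264 decoder that renders straight to a Surface.
fun createVideoDecoder(width: Int, height: Int, surface: Surface): MediaCodec {
    val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height)
    val codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC)
    codec.configure(format, surface, null, 0)   // decode directly onto the Surface
    codec.start()
    return codec
}
```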
In short, building a live broadcast system is very complicated. Most operators of live broadcast systems therefore go straight to professional development teams for project development, system testing, and final online deployment of the whole package, and afterwards a dedicated technical maintenance team is needed to handle the technical problems that arise during operation.