Flutter in brief

A common cross-platform practice is to push the audio, video, and networking modules down to the C++ or ARM layer and encapsulate them into an audio/video SDK that the UI layers on PC, iOS, and Android then call.

Flutter, as its name implies, takes the other half of this picture: it realizes cross-platform development at the UI layer. If Flutter continues to develop well, we can expect a gradual shift toward full-link cross-platform development, from the bottom layer all the way up to the UI, with engineers responsible for the SDK and the UI layer respectively.

What are the advantages of developing with Flutter?

Let's take a look at why Flutter can achieve high performance:

Taking iOS as an example, Apple's UIKit implements UI rendering through QuartzCore, Apple's own rendering framework, which in turn draws through low-level graphics APIs such as OpenGL and Metal.

Flutter follows the same logic as the native stack: it implements its UI layer on top of Skia, an underlying drawing library. By building a UI framework of its own this way, Flutter opens up the possibility of cross-platform performance that rivals, and potentially exceeds, the native APIs.

How a framework ultimately performs really comes down to its designers and developers. As for where things stand right now:

In practice, we found that on some low-end devices, under normal development with no deliberate optimization of the UI code, the fluidity of the Flutter interface was better than that of the native interface.

Audio and video design concept of Flutter

Why use Flutter for audio and video development

As Flutter is adopted by more and more large companies, it's not hard to see why audio and video matter: short video, IM, new media, and other relatively heavy businesses all involve audio and video in some form. So how do we build a powerful, high-performance, controllable audio and video playback feature on top of this cross-platform framework? Do we simply call a plugin's upper-layer API and stop there? If you have the patience to read through this article, you will understand the audio and video implementation on Flutter far better than before.

The implementation idea for audio and video in Flutter

Before we get started, think about what you would do if you were building a video player with Flutter. What difficulties would you encounter? It is often more rewarding to read an article with questions in mind. The first word that pops up for most of you is probably PlatformView. Indeed, PlatformView seems like a good solution: shipped with Flutter 1.0, it is relatively mature and quick to adopt. But that is not our focus today. Why not this lovely solution? Consider the following business scenario:

For example, we want to call the camera to take a picture or record a video, but during the capture we need to display a preview in our Flutter UI. If we used the message channel mechanism defined by Flutter to do this, we would have to transmit every frame captured by the camera from the native side to Flutter. That is extremely costly: the memory and CPU overhead of transmitting image or video data in real time through message channels is huge! (from the Flutter documentation)

This business scenario is exactly where today's main character, the Texture, comes in, and it is also what confuses many developers.

A brief introduction: a Texture can be understood as an object on the GPU that holds the image data to be drawn. The Flutter engine maps the texture data directly in memory, without copying it between the native side and Flutter. Flutter assigns an ID to each texture and provides a corresponding Texture widget on the Dart side:

const Texture({
  Key key,
  @required this.textureId,
})
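To make this concrete, here is a minimal sketch of how a texture ID obtained from the native side could be rendered. The channel and method names are hypothetical placeholders for illustration, not part of any real plugin:

import 'package:flutter/services.dart';
import 'package:flutter/widgets.dart';

// Hypothetical channel; a real plugin registers its own name.
const MethodChannel _channel = MethodChannel('com.example/texture_demo');

Future<Widget> buildPreview() async {
  // Ask the native side to create a texture and hand back its id.
  final int textureId = await _channel.invokeMethod('createTexture');
  // The Texture widget only tells the engine which GPU texture to draw;
  // no pixel data ever crosses the message channel.
  return Texture(textureId: textureId);
}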

video_player

Let's take a closer look at video_player, the audio and video playback plugin maintained as part of the official Flutter plugins. I will walk through the pieces of source code that I personally consider most critical and point out how the plugin is implemented.

FLTVideoPlayer

First, we can see that the iOS source code encapsulates a class called FLTVideoPlayer.

If this were just a simple PlatformView presentation, there would be no need to wrap such a complicated player class. I have annotated the methods and arguments in the class, line by line, so that everyone can follow along.

Note that the core of FLTVideoPlayer is not the play method that seems to take the spotlight, but the initialization method highlighted above. If you look at the source code, you can see that this method is called both when a local asset is loaded and when a URL is loaded.
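On the Dart side, this corresponds to the plugin's two controller constructors, both of which funnel into the same initialization path. A minimal usage sketch based on video_player's public API at the time of writing (the asset path and URL are placeholders):

import 'package:video_player/video_player.dart';

Future<void> demo() async {
  // Either constructor works; both end up in the same native init call.
  final VideoPlayerController fromAsset =
      VideoPlayerController.asset('assets/intro.mp4');
  final VideoPlayerController fromUrl =
      VideoPlayerController.network('https://example.com/demo.mp4');

  // initialize() is where the texture gets created and wired up.
  await fromAsset.initialize();
  await fromUrl.initialize();
}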

FLTVideoPlayerPlugin

How do we transfer data from the native layer to the Dart layer? That is the plugin's job, and this section posts the core code directly. Notice that the platform channel here is an EventChannel. Why not a MethodChannel or a BasicMessageChannel? The answer was already given in the previous section: we need to transmit the video data we receive continuously, or more precisely, as a stream, and EventChannel is designed for streaming.
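To see why EventChannel is the right fit, compare the two channel types from the Dart side. The channel names below are placeholders for illustration:

import 'package:flutter/services.dart';

// A MethodChannel is request/response: one call, one result.
const MethodChannel oneShot = MethodChannel('com.example/one_shot');

// An EventChannel is a stream: the native side keeps pushing events.
const EventChannel events = EventChannel('com.example/stream');

Future<void> compare() async {
  final dynamic single = await oneShot.invokeMethod('getState');
  print('one-shot result: $single');

  events.receiveBroadcastStream().listen((dynamic event) {
    print('streamed event: $event');
  });
}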

Taking a closer look at this method, it's obvious that when we create our EventChannel, we do not use a fixed channel name as simple plugins usually do. Here the channel name is tied to our textureId. Why? This enables the multi-window feature, the effect of playing multiple videos in one interface shown in the plugin's example. The same design can be applied to multi-window conversations when implementing video calls; in other words, the Texture widgets can be composed freely in Flutter.
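Here is a sketch of what that multi-window idea looks like from the Dart side: each controller owns its own texture, so several videos can live in one widget tree. The asset paths are placeholders:

import 'package:flutter/widgets.dart';
import 'package:video_player/video_player.dart';

// Two independent controllers, two textures, one screen.
final VideoPlayerController first =
    VideoPlayerController.asset('assets/clip_a.mp4');
final VideoPlayerController second =
    VideoPlayerController.asset('assets/clip_b.mp4');

Widget buildGrid() {
  // Each VideoPlayer widget wraps a Texture bound to its own textureId,
  // which is also what keys that player's EventChannel.
  return Column(
    children: <Widget>[
      Expanded(child: VideoPlayer(first)),
      Expanded(child: VideoPlayer(second)),
    ],
  );
}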

Flutter Source Code

The Dart implementation is also built around this EventChannel, and on top of it the plugin exposes its supported features, such as pause, looping, and so on.

We first find the EventChannel definition. Everything seems fine; the only big question is, where does textureId come from? How does it connect to the native side? Digging further, we find that this method is called inside a method of the MethodChannelVideoPlayer class, but we still can't see where textureId originates.
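The shape of that code is roughly as follows; this is a simplified sketch of the plugin's platform implementation at the time of writing, not a verbatim copy:

import 'package:flutter/services.dart';

// Each player's events arrive on a channel whose name embeds its textureId.
EventChannel _eventChannelFor(int textureId) {
  return EventChannel('flutter.io/videoPlayer/videoEvents$textureId');
}

Stream<dynamic> videoEventsFor(int textureId) {
  // The native side pushes maps such as {'event': 'initialized', ...},
  // {'event': 'completed'}, and {'event': 'bufferingUpdate', ...} here.
  return _eventChannelFor(textureId).receiveBroadcastStream();
}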

OK, so we keep digging and find the call site of videoEventsFor, but we still can't see it! We only notice that a private variable, coincidentally also named textureId, is passed in.

Clicking through to the implementation of the create method, we see where the VideoPlayer is initialized and its textureId returned.

Finally, we reach the end, and the end is a BasicMessageChannel, through which the Flutter side communicates with the native layer and receives our textureId in the reply.
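Schematically, that final hop looks like the snippet below. The channel name and payload are illustrative only, but the principle of a single request/reply message carrying back the textureId is the same:

import 'package:flutter/services.dart';

// A BasicMessageChannel exchanges structured messages with the native side.
const BasicMessageChannel<dynamic> createChannel = BasicMessageChannel<dynamic>(
    'com.example/videoPlayer.create', StandardMessageCodec());

Future<int> create(String uri) async {
  // Ask the native layer to build a player for this data source...
  final dynamic reply = await createChannel.send(<String, dynamic>{'uri': uri});
  // ...and it replies with the textureId that the Texture widget will render.
  return reply['textureId'];
}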

Conclusion

This article introduced one of the solutions for audio and video in Flutter, the Texture, which is the solution used by the official video plugin. It should also have overturned some of your assumptions about Flutter plugins. So, when choosing an implementation, should we select PlatformView or Texture?