Preface

Recently I was reading Flutter Technology Analysis and Practice and jumped straight to Chapter 2, where the capability-enhancement material left me a little confused. After a quick skim I went to the Flutter website, worked through the camera example, did some source analysis on it, and then came back to Chapter 2 on capability enhancement. It was much easier to follow.

This article starts from a camera example that takes pictures and records video (with source analysis of the camera and video interaction on the Android side) to learn how images are shared between native code and Flutter plugins, and how native components are nested inside Flutter. At the end I attach the links related to Chapter 2 of the book: same-layer rendering based on external textures.

Resources

Flutter Technology Analysis and Practice: Technology Evolution and Innovation, Chapter 2: Capability Enhancement

Flutter camera example

Plugin repositories

Never expected — this is how Flutter connects external textures

Optimizing the rendering performance of Flutter external textures with shared memory: real-time rendering is no longer a dream

Contents

I. The camera example

Flutter camera example

Screenshots

The code is the same as the official example; note that camera initialization and some of the settings are asynchronous.

pubspec.yaml

dependencies:
  camera: ^0.5.2+2
  video_player: ^0.10.12+2
  path_provider: ^0.4.1

main.dart

import 'package:camera/camera.dart';
import 'package:flutter/material.dart';
import 'package:fluttertwo/camera.dart';

List<CameraDescription> cameras;

void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  cameras = await availableCameras();
  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Flutter Demo',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: CameraHome(),
    );
  }
}

camera.dart

import 'dart:io';
​
import 'package:camera/camera.dart';
import 'package:flutter/material.dart';
import 'package:fluttertwo/main.dart';
import 'package:path_provider/path_provider.dart';
import 'package:video_player/video_player.dart';
​
class CameraHome extends StatefulWidget {
  @override
  _CameraHomeState createState() {
    return _CameraHomeState();
  }
}
​
class _CameraHomeState extends State<CameraHome> with WidgetsBindingObserver {
  CameraController controller;
  String imagePath; // Where the captured picture is saved
  String videoPath; // Where the recorded video is saved
  VideoPlayerController videoController;
  VoidCallback videoPlayerListener;
  bool enableAudio = true;
  final GlobalKey<ScaffoldState> _scaffoldKey = GlobalKey<ScaffoldState>();
​
  @override
  void initState() {
    super.initState();
    // Register this State as an observer of app lifecycle changes.
    WidgetsBinding.instance.addObserver(this);
  }
​
  @override
  void dispose() {
    WidgetsBinding.instance.removeObserver(this);
    super.dispose();
  }
​
  @override
  void didChangeAppLifecycleState(AppLifecycleState state) {
    // The app is no longer in the foreground
    if (state == AppLifecycleState.inactive) {
      controller?.dispose();
    } else if (state == AppLifecycleState.resumed) {
      // Back in the foreground
      if (controller != null) {
        onNewCameraSelected(controller.description);
      }
    }
  }
​
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      key: _scaffoldKey,
      appBar: AppBar(
        title: Text("相机示例"),
      ),
      body: Column(
        children: <Widget>[
          Expanded(
            child: Container(
              child: Padding(
                padding: EdgeInsets.all(1.0),
                child: Center(
                  child: _cameraPreviewWidget(),
                ),
              ),
              decoration: BoxDecoration(
                color: Colors.black,
                border: Border.all(
                  color: controller != null && controller.value.isRecordingVideo
                      ? Colors.redAccent
                      : Colors.grey,
                  width: 3.0,
                ),
              ),
            ),
          ),
          _captureControlRowWidget(),
          _toggleAudioWidget(),
          Padding(
            padding: EdgeInsets.all(5.0),
            child: Row(
              mainAxisAlignment: MainAxisAlignment.start,
              children: <Widget>[
                _cameraTogglesRowWidget(),
                _thumbnailWidget(),
              ],
            ),
          ),
        ],
      ),
    );
  }
​
  /// Thumbnail of the captured picture / recorded video
  Widget _thumbnailWidget() {
    return Expanded(
      child: Align(
        alignment: Alignment.centerRight,
        child: Row(
          mainAxisSize: MainAxisSize.min,
          children: <Widget>[
            videoController == null && imagePath == null
                ? Container()
                : SizedBox(
                    child: (videoController == null)
                        ? Image.file(File(imagePath),width: 64.0,height: 64.0,)
                        : Container(
                            child: Center(
                              child: AspectRatio(
                                aspectRatio: videoController.value.size != null
                                    ? videoController.value.aspectRatio
                                    : 1.0,
                                child: VideoPlayer(videoController),
                              ),
                            ),
                            decoration: BoxDecoration(
                              border: Border.all(color: Colors.pink),
                            ),
                            width: 64.0,
                            height: 64.0,
                          ),
                  ),
          ],
        ),
      ),
    );
  }
​
  /// Toggles for all available cameras
  Widget _cameraTogglesRowWidget() {
    final List<Widget> toggles = <Widget>[];
​
    if (cameras.isEmpty) {
      return Text("没有检测到摄像头");
    } else {
      for (CameraDescription cameraDescription in cameras) {
        toggles.add(SizedBox(
          width: 90.0,
          child: RadioListTile<CameraDescription>(
              title: Icon(getCameraLensIcon(cameraDescription.lensDirection)),
              groupValue: controller?.description,
              value: cameraDescription,
              onChanged: controller != null && controller.value.isRecordingVideo
                  ? null
                  : onNewCameraSelected),
        ));
      }
      return Row(
        children: toggles,
      );
    }
  }
​
  /// Toggle audio recording on or off
  Widget _toggleAudioWidget() {
    return Padding(
      padding: EdgeInsets.only(left: 25),
      child: Row(
        children: <Widget>[
          Text("开启录音"),
          Switch(
            value: enableAudio,
            onChanged: (value) {
              enableAudio = value;
              if (controller != null) {
                onNewCameraSelected(controller.description);
              }
            },
          ),
        ],
      ),
    );
  }
​
  /// Camera control toolbar
  Widget _captureControlRowWidget() {
    return Row(
      mainAxisAlignment: MainAxisAlignment.spaceEvenly, // Space the buttons evenly
      mainAxisSize: MainAxisSize.max,
      children: <Widget>[
        IconButton(
          icon: Icon(Icons.camera_alt),
          color: Colors.blue,
          onPressed: controller != null &&
                  controller.value.isInitialized &&
                  !controller.value.isRecordingVideo
              ? onTakePictureButtonPressed
              : null,
        ),
        IconButton(
          icon: Icon(Icons.videocam),
          color: Colors.blue,
          onPressed: controller != null &&
                  controller.value.isInitialized &&
                  !controller.value.isRecordingVideo
              ? onVideoRecordButtonPressed
              : null,
        ),
        IconButton(
          icon: Icon(Icons.stop),
          color: Colors.red,
          onPressed: controller != null &&
                  controller.value.isInitialized &&
                  controller.value.isRecordingVideo
              ? onStopButtonPressed
              : null,
        ),
      ],
    );
  }
​
  /// Start recording video
  void onVideoRecordButtonPressed() {
    startVideoRecording().then((value) {
      if (mounted) {
        setState(() {});
      }
      if (value != null) {
        showInSnackBar("正在保存视频于 ${value}");
      }
    });
  }
​
  /// Stop recording video
  void onStopButtonPressed() {
    stopVideoRecording().then((value) {
      if (mounted) {
        setState(() {});
      }
      showInSnackBar("视频保存在: ${videoPath}");
    });
  }
​
  Future<void> stopVideoRecording() async {
    if (!controller.value.isRecordingVideo) {
      return null;
    }
    try {
      await controller.stopVideoRecording();
    } on CameraException catch (e) {
      _showCameraException(e);
      return null;
    }
    await _startVideoPlayer();
  }
​
  Future<void> _startVideoPlayer() async {
    final VideoPlayerController vcontroller =
        VideoPlayerController.file(File(videoPath));
    videoPlayerListener = () {
      if (videoController != null && videoController.value.size != null) {
        if (mounted) {
          setState(() {});
        }
        videoController.removeListener(videoPlayerListener);
      }
    };
    vcontroller.addListener(videoPlayerListener);
    await vcontroller.setLooping(true);
    await vcontroller.initialize();
    await videoController?.dispose();
    if (mounted) {
      setState(() {
        imagePath = null;
        videoController = vcontroller;
      });
    }
    await vcontroller.play();
  }
​
  Future<String> startVideoRecording() async {
    if (!controller.value.isInitialized) {
      showInSnackBar("请选择一个摄像头");
      return null;
    }
    // Work out where to save the video
    final Directory extDir = await getApplicationDocumentsDirectory();
    final String dirPath = "${extDir.path}/Movies/flutter_test";
    await Directory(dirPath).create(recursive: true);
    final String filePath = "$dirPath/${timestamp()}.mp4";
​
    if (controller.value.isRecordingVideo) {
      return null; // Already recording
    }
    try {
      videoPath = filePath;
      await controller.startVideoRecording(filePath);
    } on CameraException catch (e) {
      _showCameraException(e);
      return null;
    }
    return filePath;
  }
​
  /// Callback for the take-picture button
  void onTakePictureButtonPressed() {
    takePicture().then((value) {
      if (mounted) {
        setState(() {
          imagePath = value;
          videoController?.dispose();
          videoController = null;
        });
        if (value != null) {
          showInSnackBar('Picture saved to $value');
        }
      }
    });
  }
​
  Future<String> takePicture() async {
    if (!controller.value.isInitialized) {
      showInSnackBar("错误: 请选择一个相机");
      return null;
    }
    final Directory extDir = await getApplicationDocumentsDirectory();
    final String dirPath = '${extDir.path}/Movies/flutter_test';
    await Directory(dirPath).create(recursive: true);
    final String filePath = '$dirPath/${timestamp()}.jpg';
    if (controller.value.isTakingPicture) {
      return null;
    }
    try {
      await controller.takePicture(filePath);
    } on CameraException catch (e) {
      _showCameraException(e);
      return null;
    }
    return filePath;
  }
​
  String timestamp() => DateTime.now().millisecondsSinceEpoch.toString();
​
  /// Preview window
  Widget _cameraPreviewWidget() {
    if (controller == null || !controller.value.isInitialized) {
      return Text(
        "选择一个摄像头",
        style: TextStyle(
          color: Colors.white,
          fontSize: 24.0,
          fontWeight: FontWeight.w900,
        ),
      );
    } else {
      // Size the child to the configured aspect ratio
      return AspectRatio(
        aspectRatio: controller.value.aspectRatio,
        child: CameraPreview(controller),
      );
    }
  }
​
  /// Called when a camera is selected
  void onNewCameraSelected(CameraDescription cameraDescription) async {
    if (controller != null) {
      await controller.dispose();
    }
    controller = CameraController(
      cameraDescription,
      ResolutionPreset.high,
      enableAudio: enableAudio,
    );
    controller.addListener(() {
      if (mounted) {
        setState(() {});
        if (controller.value.hasError) {
          showInSnackBar("Camera error ${controller.value.errorDescription}");
        }
      }
    });
    try {
      await controller.initialize();
    } on CameraException catch (e) {
      _showCameraException(e);
    }
  }
​
  _showCameraException(CameraException e) {
    logError(e.code, e.description);
    showInSnackBar("Error: ${e.code}\n${e.description}");
  }
​
  showInSnackBar(String message) {
    _scaffoldKey.currentState.showSnackBar(SnackBar(
      content: Text(message),
    ));
  }
}
​
/// Get the icon for each camera lens direction (rear, front, external)
IconData getCameraLensIcon(CameraLensDirection direction) {
  switch (direction) {
    case CameraLensDirection.back:
      return Icons.camera_rear;
    case CameraLensDirection.front:
      return Icons.camera_front;
    case CameraLensDirection.external:
      return Icons.camera;
  }
  throw ArgumentError("Unknown lens direction");
}
​
void logError(String code, String message) =>
    print('Error: $code\nError Message: $message');
​

II. How the Flutter camera works

From the code above you can see that, once the camera and video_player plugins are added (see the plugin repositories above), the Flutter side needs only the following key calls to take pictures and to record and play back video:

// Take a picture
await controller.takePicture(filePath);

// Video recording / playback
final VideoPlayerController vcontroller =
    VideoPlayerController.file(File(videoPath));
vcontroller.play();

1. Taking pictures

Let's look directly at the **takePicture** method of CameraController:

Future<void> takePicture(String path) async {
  if (!value.isInitialized || _isDisposed) {
    throw CameraException(
      'Uninitialized CameraController.',
      'takePicture was called on uninitialized CameraController',
    );
  }
  if (value.isTakingPicture) {
    throw CameraException(
      'Previous capture has not returned yet.',
      'takePicture was called before the previous capture returned.',
    );
  }
  try {
    // Note 1
    value = value.copyWith(isTakingPicture: true);
    await _channel.invokeMethod<void>(
      'takePicture',
      <String, dynamic>{'textureId': _textureId, 'path': path},
    );
    value = value.copyWith(isTakingPicture: false);
  } on PlatformException catch (e) {
    value = value.copyWith(isTakingPicture: false);
    throw CameraException(e.code, e.message);
  }
}

Ignore the rest of the method for now and focus on the key code. As note 1 shows, invokeMethod is called on a MethodChannel. If that mechanism is unfamiliar, see my earlier article on communication between Flutter and native code and its source analysis. Here is how the MethodChannel is constructed:

final MethodChannel _channel = const MethodChannel('plugins.flutter.io/camera');
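If the channel mechanism is new to you, here is a minimal, self-contained sketch of the Dart side. The channel name 'com.example/demo' and the method name 'greet' are made up for illustration; the only requirement is that the platform side creates a channel with the same name and handles the same method string.

import 'package:flutter/services.dart';

// Hypothetical channel: both sides must agree on the name.
const MethodChannel _demoChannel = MethodChannel('com.example/demo');

Future<String> fetchGreeting(String name) async {
  // A method name plus an argument map is all that crosses the boundary;
  // a platform-side handler switches on 'greet' and reads 'name'.
  return _demoChannel.invokeMethod<String>(
    'greet',
    <String, dynamic>{'name': name},
  );
}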

Next, go into the Android or iOS module of the camera plugin and find the corresponding MethodChannel. A global search turns up the following code in the Android module's MethodCallHandlerImpl:

    methodChannel = new MethodChannel(messenger, "plugins.flutter.io/camera");
    imageStreamChannel = new EventChannel(messenger, "plugins.flutter.io/camera/imageStream");
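The EventChannel registered alongside it is used to stream preview frames to the Dart side, where it surfaces as CameraController.startImageStream. A sketch of how it might be used, assuming the camera 0.5.x API:

// A sketch, assuming camera 0.5.x: each preview frame is delivered over the
// 'plugins.flutter.io/camera/imageStream' EventChannel as a CameraImage.
Future<void> streamFrames(CameraController controller) async {
  await controller.startImageStream((CameraImage image) {
    // On Android the frames are YUV_420_888: one byte buffer per plane.
    print('frame ${image.width}x${image.height}, planes: ${image.planes.length}');
  });
  // ...and when done:
  // await controller.stopImageStream();
}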

Back to taking pictures: the Dart-side call invokeMethod('takePicture', <String, dynamic>{'textureId': _textureId, 'path': path}) lands in the onMethodCall method on the Android side:

@Override
public void onMethodCall(@NonNull MethodCall call, @NonNull final Result result) {
  switch (call.method) {
    case "availableCameras":
      // Called at the start of the sample to get the list of cameras
      try {
        result.success(CameraUtils.getAvailableCameras(activity));
      } catch (Exception e) {
        handleException(e, result);
      }
      break;
    case "initialize": {
      if (camera != null) {
        camera.close();
      }
      cameraPermissions.requestPermissions(
          activity,
          permissionsRegistry,
          call.argument("enableAudio"),
          (String errCode, String errDesc) -> {
            if (errCode == null) {
              try {
                // Note 1
                instantiateCamera(call, result);
              } catch (Exception e) {
                handleException(e, result);
              }
            } else {
              result.error(errCode, errDesc, null);
            }
          });
      break;
    }
    case "takePicture": {
      // Note 2
      camera.takePicture(call.argument("path"), result);
      break;
    }
    case "prepareForVideoRecording": {
      // This optimization is not required for Android.
      result.success(null);
      break;
    }
    case "startVideoRecording": ....
    case "stopVideoRecording": ...
    case "pauseVideoRecording": ...
    case "resumeVideoRecording": ...
    case "startImageStream": ...
    case "stopImageStream": ...
    case "dispose": ...
    default:
      result.notImplemented();
      break;
  }
}

Focus on note 1, instantiateCamera(call, result), and note 2, takePicture. You may wonder why note 1 does not seem to be part of taking a picture; it isn't. In the sample code, the 'initialize' call is made when a camera is selected, and that is what invokes the method at note 1.

Internally, instantiateCamera looks like this:

  private void instantiateCamera(MethodCall call, Result result) throws CameraAccessException {
    String cameraName = call.argument("cameraName");
    String resolutionPreset = call.argument("resolutionPreset");
    boolean enableAudio = call.argument("enableAudio");
    TextureRegistry.SurfaceTextureEntry flutterSurfaceTexture =
        textureRegistry.createSurfaceTexture();
    DartMessenger dartMessenger = new DartMessenger(messenger, flutterSurfaceTexture.id());
    camera = new Camera(activity, flutterSurfaceTexture, dartMessenger, cameraName,
        resolutionPreset, enableAudio);

    camera.open(result);
  }

These lines configure the camera with the options passed from the Flutter side. The key code is the following two lines:

DartMessenger dartMessenger = new DartMessenger(messenger, flutterSurfaceTexture.id());
camera.open(result);

DartMessenger is used to send events back to the Flutter side, for example to the CameraController class there.
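On the Dart side, CameraController receives those events through an EventChannel whose name embeds the texture id. A minimal sketch of the receiving end; the channel name is the one used by the camera 0.5.x source and may differ in other versions:

import 'package:flutter/services.dart';

// A sketch of how events sent by DartMessenger arrive in Dart.
void listenToCameraEvents(int textureId) {
  EventChannel('flutter.io/cameraPlugin/cameraEvents$textureId')
      .receiveBroadcastStream()
      .listen((dynamic event) {
    final Map<dynamic, dynamic> map = event;
    // eventType is e.g. 'error' or 'camera_closing'.
    print('camera event: ${map['eventType']}');
  });
}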

So what happens inside camera.open?

public void open(@NonNull final Result result) throws CameraAccessException {
  pictureImageReader =
      ImageReader.newInstance(
          captureSize.getWidth(), captureSize.getHeight(), ImageFormat.JPEG, 2);

  // Used to stream image byte data to dart side.
  imageStreamReader =
      ImageReader.newInstance(
          previewSize.getWidth(), previewSize.getHeight(), ImageFormat.YUV_420_888, 2);

  cameraManager.openCamera(
      cameraName,
      new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice device) {
          cameraDevice = device;
          try {
            startPreview();
          } catch (CameraAccessException e) {
            result.error("CameraAccess", e.getMessage(), null);
            close();
            return;
          }
          Map<String, Object> reply = new HashMap<>();
          reply.put("textureId", flutterTexture.id());
          reply.put("previewWidth", previewSize.getWidth());
          reply.put("previewHeight", previewSize.getHeight());
          result.success(reply);
        }

        @Override
        public void onClosed(@NonNull CameraDevice camera) {
          dartMessenger.sendCameraClosingEvent();
          super.onClosed(camera);
        }
        ...
      });
}

Pay attention to the onOpened callback passed to openCamera above. Once the camera is opened, it calls startPreview to start the preview and then sends data back to the Flutter side: the textureId, previewWidth, and previewHeight. The textureId is important here; keep it in mind for the later analysis of how the preview is displayed. Starting the preview is not trivial either; the internal calls look like this:

public void startPreview() throws CameraAccessException {
  // Note 1
  createCaptureSession(CameraDevice.TEMPLATE_PREVIEW, pictureImageReader.getSurface());
}

private void createCaptureSession(
    int templateType, Runnable onSuccessCallback, Surface... surfaces)
    throws CameraAccessException {
  // Close any existing capture session.
  closeCaptureSession();

  // Create a new capture builder.
  captureRequestBuilder = cameraDevice.createCaptureRequest(templateType);

  // Build Flutter surface to render to
  SurfaceTexture surfaceTexture = flutterTexture.surfaceTexture();
  surfaceTexture.setDefaultBufferSize(previewSize.getWidth(), previewSize.getHeight());
  Surface flutterSurface = new Surface(surfaceTexture);
  captureRequestBuilder.addTarget(flutterSurface);

  List<Surface> remainingSurfaces = Arrays.asList(surfaces);
  ...
  // Note 4: collect all surfaces we want to render to.
  List<Surface> surfaceList = new ArrayList<>();
  surfaceList.add(flutterSurface);
  surfaceList.addAll(remainingSurfaces);

  // Start the session
  cameraDevice.createCaptureSession(surfaceList, callback, null);
}

The pictureImageReader at note 1 is an ImageReader, a class that gives the application direct access to image data rendered into a Surface.

At note 4, all of the Surfaces are collected and handed to the cameraDevice, which was delivered in the callback of CameraManager.openCamera. CameraDevice is an abstract representation of a single camera connected to an Android device (see CameraDeviceImpl for the concrete implementation; anyone doing audio/video work will know it well, so as a novice I won't go into it). Its createCaptureSession method takes three parameters: the set of output Surfaces for each CaptureRequest, a callback for session creation, and the Handler on which that callback runs.

Now let's look at the takePicture method:

// The first argument is the file path passed from the Flutter side;
// the second is used to send the result back.
camera.takePicture(call.argument("path"), result);

Note that the camera object here is of type io.flutter.plugins.camera.Camera; its takePicture method looks like this:

public void takePicture(String filePath, @NonNull final Result result) {
  ......
  pictureImageReader.setOnImageAvailableListener(
      reader -> {
        try (Image image = reader.acquireLatestImage()) {
          ByteBuffer buffer = image.getPlanes()[0].getBuffer();
          writeToFile(buffer, file);
          result.success(null);
        }
        ......
      });
}

To display the saved picture on the Flutter side, Image.file can then be used directly.

The last thing left to analyze is how the camera preview itself is displayed on the Flutter side — the top half of the example screenshot.

In the example it is displayed simply by calling CameraPreview(controller):

class CameraPreview extends StatelessWidget {
  const CameraPreview(this.controller);

  final CameraController controller;

  @override
  Widget build(BuildContext context) {
    return controller.value.isInitialized
        ? Texture(textureId: controller._textureId)
        : Container();
  }
}

The key line is Texture(textureId: controller._textureId). So how does the controller get its _textureId?

Future<void> initialize() async {
  ......
  final Map<String, dynamic> reply =
      await _channel.invokeMapMethod<String, dynamic>(
    'initialize',
    <String, dynamic>{
      'cameraName': description.name,
      'resolutionPreset': serializeResolutionPreset(resolutionPreset),
      'enableAudio': enableAudio,
    },
  );
  _textureId = reply['textureId'];
  ......
  return _creatingCompleter.future;
}

As mentioned earlier, the textureId is sent back from the Android side when the camera is initialized. So what is this textureId for? Let's continue with the Texture class on the Dart side:

class Texture extends LeafRenderObjectWidget {
  /// Creates a widget backed by the texture identified by [textureId].
  const Texture({
    Key key,
    @required this.textureId,
  }) : assert(textureId != null),
       super(key: key);

  /// The identity of the backend texture.
  final int textureId;

  @override
  TextureBox createRenderObject(BuildContext context) => TextureBox(textureId: textureId);

  @override
  void updateRenderObject(BuildContext context, TextureBox renderObject) {
    renderObject.textureId = textureId;
  }
}

There is very little code in Texture: it extends LeafRenderObjectWidget. This involves some knowledge of custom render widgets; what matters is that createRenderObject is guaranteed to run, so let's look at the TextureBox it returns:

class TextureBox extends RenderBox {
  TextureBox({@required int textureId})
      : assert(textureId != null),
        _textureId = textureId;

  int _textureId;
  set textureId(int value) {
    assert(value != null);
    if (value != _textureId) {
      _textureId = value;
      markNeedsPaint();
    }
  }

  @override
  void paint(PaintingContext context, Offset offset) {
    if (_textureId == null)
      return;
    context.addLayer(TextureLayer(
      rect: Rect.fromLTWH(offset.dx, offset.dy, size.width, size.height),
      textureId: _textureId,
    ));
  }
}

In the paint method, the textureId is wrapped in a TextureLayer and added to the PaintingContext. At this point we can reasonably guess that the textureId is the handle through which the Flutter side finds the natively rendered image.
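Put differently, any image source the native side registers with the engine's TextureRegistry can be drawn in Flutter just by handing its id to a Texture widget. A minimal sketch; the id here is assumed to have been obtained over a platform channel, as above:

import 'package:flutter/widgets.dart';

// A sketch: display a natively registered texture. The textureId is
// hypothetical and would normally come back over a platform channel.
Widget nativeImageView(int textureId) {
  return SizedBox(
    width: 320,
    height: 240,
    // At paint time the engine resolves this id in its texture registry.
    child: Texture(textureId: textureId),
  );
}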

2. Video

VideoPlayerController plays the same role as CameraController, except that one drives the camera and the other video playback. Its textureId is likewise obtained in the initialize method, via _videoPlayerPlatform.create(dataSourceDescription). The code flow is roughly as follows:

// 1
_textureId = await _videoPlayerPlatform.create(dataSourceDescription);

// 2. _videoPlayerPlatform
final VideoPlayerPlatform _videoPlayerPlatform = VideoPlayerPlatform.instance
  ..init();

// 3. instance -> _instance
static VideoPlayerPlatform _instance = MethodChannelVideoPlayer();

// 4
@override
Future<int> create(DataSource dataSource) async {
  .....
  TextureMessage response = await _api.create(message);
  return response.textureId;
}

// 5. The type of _api
VideoPlayerApi _api = VideoPlayerApi();

// 6. _api.create
Future<TextureMessage> create(CreateMessage arg) async {
  final Map<dynamic, dynamic> requestMap = arg._toMap();
  const BasicMessageChannel<dynamic> channel = BasicMessageChannel<dynamic>(
      'dev.flutter.pigeon.VideoPlayerApi.create', StandardMessageCodec());
  final Map<dynamic, dynamic> replyMap = await channel.send(requestMap);
  if (replyMap == null) {
    throw PlatformException(
        code: 'channel-error',
        message: 'Unable to establish connection on channel.',
        details: null);
  } else if (replyMap['error'] != null) {
    final Map<dynamic, dynamic> error = replyMap['error'];
    throw PlatformException(
        code: error['code'],
        message: error['message'],
        details: error['details']);
  } else {
    return TextureMessage._fromMap(replyMap['result']);
  }
}

Here a BasicMessageChannel is used for the communication, with the data encoded and decoded by StandardMessageCodec.

That is roughly how the textureId is obtained — much the same dance as in the photo case. I won't analyze the rest here; you can download the plugins from the resources above and read through them.
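For comparison with MethodChannel, a minimal BasicMessageChannel round trip on the Dart side looks like this. The channel name 'com.example/messages' is made up; StandardMessageCodec supports maps, lists, numbers, strings, and byte buffers:

import 'package:flutter/services.dart';

// Hypothetical channel: BasicMessageChannel exchanges free-form messages
// rather than named method calls.
const BasicMessageChannel<dynamic> _msgChannel = BasicMessageChannel<dynamic>(
    'com.example/messages', StandardMessageCodec());

Future<void> pingPlatform() async {
  // send() resolves with whatever the platform-side handler replies.
  final dynamic reply = await _msgChannel.send(<String, dynamic>{'op': 'ping'});
  print('reply from platform: $reply');
}

// Dart can also receive messages initiated by the platform side.
void listenForPlatformMessages() {
  _msgChannel.setMessageHandler((dynamic message) async {
    print('message from platform: $message');
    return 'ack';
  });
}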

III. Flutter Technology Analysis and Practice: the capability-enhancement chapter

I originally planned to paste the relevant content of the book here, but then found that the Xianyu tech team has published it on Zhihu, where the formatting is better than on CSDN, so I'll just link to it directly.

Never expected — this is how Flutter connects external textures

Optimizing the rendering performance of Flutter external textures with shared memory: real-time rendering is no longer a dream