Shi Jianping (Chu Yi)

Previous posts: "Taobao Widgets: A New Open Card Technology!", "Large-scale Application of Taobao Widgets in the 2021 Singles' Day"

This article explains the principles of Canvas rendering inside widgets from a technical perspective.

Before getting into the main topic, it is worth explaining what a "widget" is. A widget is Taobao's module/card-level open solution. It mainly provides private-domain merchants with mini-program-like, standardized, and consistent production, publishing, and operations capabilities. Widgets come in a variety of business forms, such as product cards, benefits cards, and interactive cards. Widgets developed by ISVs can be deployed at very low cost to business scenarios such as shop pages, product detail pages, and subscriptions, greatly improving operations and distribution efficiency.

From an end-to-end technical perspective, a widget is first and foremost a business container, characterized by DSL standardization, cross-platform rendering, and cross-scene flow:

  • DSL standardization means the widget DSL is fully compatible with the mini-program DSL (and not just the DSL: atomic API capabilities, the production toolchain, and so on are compatible as well), so developers can get started quickly with no extra learning cost;
  • Cross-platform rendering, as the name suggests, means the widget kernel (based on Weex 2.0) renders exactly the same result on different operating systems, including Android and iOS, using a Flutter-like self-drawing solution, so developers do not need to worry about compatibility;
  • Finally, cross-scene flow means the widget container can be "embedded" into business containers built on various technology stacks, such as Native, WebView, and mini-programs, shielding developers from the differences between underlying containers and achieving "develop once, run everywhere".

Coincidentally, the technical scheme for Canvas inside widgets is quite similar to the scheme used to embed the widget container into other business containers, so this article approaches the topic from the angle of Canvas rendering.

Unveiling the principles

End-to-end overall technical architecture

The overall technical architecture of the widget is shown in the figure below. From a macro perspective, it can be divided into two layers: the "shell" and the "core".

The "shell" is the widget container, which consists of the DSL, the widget JSFramework, atomic APIs, and extension modules such as Canvas.

The "core" is the widget kernel, based on the new Weex 2.0. Weex 1.0 used an RN-like native rendering scheme, while Weex 2.0, keeping up with the times, upgraded to a Flutter-like self-drawing rendering scheme. Weex 2.0 therefore takes on the core responsibilities of JS execution, rendering, and event handling for widgets, and is subdivided into three modules: the JS script engine, the Framework, and the rendering engine. The script engine uses the lightweight QuickJS on Android and JavaScriptCore on iOS, and supports script-engine-independent bindings via JSI. The Framework layer provides browser-consistent CSSOM and DOM capabilities, plus a C++ MVVM framework and some Web APIs (console, setTimeout, ...). Finally, the Unicorn rendering engine is invoked internally, providing rendering-related capabilities such as layout, painting, compositing, and rasterization. Both the Framework and the rendering engine are developed in C++, with a platform abstraction layer to better support cross-platform use.

It is worth noting that the Unicorn rendering engine has a built-in PlatformView capability, which allows a Surface provided entirely by a PlatformView developer to be embedded into the Weex-rendered Surface. With this extension capability, components such as Camera and Video can be integrated at low cost, and Canvas relied on it to quickly migrate the native Canvas used by mini-programs (internally called FCanvas) into the widget container.
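As a mental model only, such an extension point might look roughly like the sketch below. This is an assumption modeled on Flutter's PlatformView interface; Unicorn's actual API is not public and may differ.

```java
import android.view.View;

// Hypothetical extension point, modeled on Flutter's
// io.flutter.plugin.platform.PlatformView; not Unicorn's real interface.
interface PlatformView {
    // The platform View (e.g. FCanvas's TextureView, a Camera preview, a
    // Video view) whose Surface the engine embeds into the rendered page.
    View getView();

    // Called when the embedded region is removed, so the component can
    // release its Surface and other resources.
    void dispose();
}
```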

The rendering process from multiple perspectives

For more details, please refer to the author’s previous article “Design and Thinking of cross-platform Web Canvas Rendering Engine Architecture (including implementation scheme)”.

Getting to the point of this article: first, let's take a macroscopic view of the Canvas rendering process. Please read the following diagram from right to left.

For developers, the Canvas API is directly accessible, including the Canvas2D API specified by the W3C and the WebGL API specified by the Khronos Group. They are obtained via canvas.getContext('2d') and canvas.getContext('webgl'), respectively. These JS APIs are bound to native C++ implementations via JS bindings. Canvas2D is implemented on top of Skia, while WebGL calls the OpenGL ES interface directly. The graphics API needs to be bound to a platform windowing environment, the Surface, which on Android can be a SurfaceView or a TextureView.

Further to the left is the widget container layer. For Weex, the basic unit of render compositing is the LayerTree, which describes the page hierarchy and records the rendering commands of each node. Canvas is one Layer in this LayerTree, a PlatformViewLayer, which defines the position and size of the Canvas. The LayerTree is composited onto the Weex Surface by Unicorn's raster module. In the end, both the Weex Surface and the Canvas Surface participate in the Android rendering pipeline and are composited to the display by SurfaceFlinger.
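To make that structure concrete, here is a hedged sketch of what such a LayerTree might look like. Only LayerTree and PlatformViewLayer are named in the article; the other class names and fields are illustrative.

```java
import android.graphics.RectF;
import java.util.ArrayList;
import java.util.List;

// Illustrative LayerTree: the unit of compositing in the Weex/Unicorn pipeline.
abstract class Layer {}

// An interior node grouping child layers (the page hierarchy).
class ContainerLayer extends Layer {
    final List<Layer> children = new ArrayList<>();
}

// A leaf holding recorded drawing commands for ordinary DOM content.
class PictureLayer extends Layer {
    final List<Runnable> drawCommands = new ArrayList<>(); // stand-in for a display list
}

// The Canvas leaf: it carries no content of its own, only where the
// embedded PlatformView should appear.
class PlatformViewLayer extends Layer {
    final int viewId;   // which embedded PlatformView
    final RectF frame;  // position and size of the Canvas on the page
    PlatformViewLayer(int viewId, RectF frame) {
        this.viewId = viewId;
        this.frame = frame;
    }
}
```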

That is the macro rendering pipeline. Below, I describe the whole rendering process from several different perspectives: Canvas, the Weex engine, and the Android platform.

Canvas perspective

From the perspective of Canvas itself, the platform and container can be ignored for the time being. There are two key processes: Rendering Surface setup and the rendering pipeline. The sequence diagram below shows them; four threads are involved: the platform thread (i.e. the platform UI thread), the JS thread, the raster thread, and the IO thread.

  • Rendering Surface setup: when a message to create the PlatformView arrives from upstream, the Canvas API is bound asynchronously on the JS thread, and a TextureView/SurfaceView is created on the platform thread. When the SurfaceCreated signal is received, the raster thread initializes the EGL environment and binds it to the Surface; once the Rendering Surface has been created, the JS thread is notified that the rendering environment is ready (see the EGL sketch after this list). Unlike 2D, for a WebGL context the Rendering Surface is created on the JS thread by default (when Command Buffer is not enabled).

  • Rendering pipeline: after receiving the ready event, the developer can obtain the Canvas handle and select a 2D or WebGL rendering context via the getContext API. For 2D, calling a rendering API on the JS thread only records rendering commands without rendering anything; the actual rendering happens on the raster thread. For WebGL, the GL graphics API is by default called directly on the JS thread. In both cases rendering is driven by the platform VSYNC signal, which sends a RequestAnimationFrame message to the JS thread to actually start rendering a frame: for 2D, the previously recorded commands are played back on the raster thread (see the record-and-replay sketch after this list), the real rendering commands are submitted to the GPU, and SwapBuffer sends the frame to display; for WebGL, SwapBuffer is issued directly on the JS thread. If images need to be drawn, they are downloaded and decoded on the IO thread and consumed on the JS or raster thread.
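To make the surface setup concrete, here is a minimal sketch assuming a TextureView-backed canvas. The Android and EGL14 APIs are real; the class name, thread hand-off, and the elided JS-thread notification are illustrative, not FCanvas's actual code.

```java
import android.graphics.SurfaceTexture;
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;
import android.os.Handler;
import android.os.HandlerThread;
import android.view.TextureView;

final class CanvasSurfaceSetup implements TextureView.SurfaceTextureListener {
    private final HandlerThread rasterThread = new HandlerThread("canvas-raster");
    private final Handler rasterHandler;

    CanvasSurfaceSetup() {
        rasterThread.start();
        rasterHandler = new Handler(rasterThread.getLooper());
    }

    @Override // Platform (UI) thread: the "SurfaceCreated" signal.
    public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
        rasterHandler.post(() -> initEgl(st)); // hop to the raster thread
    }

    // Raster thread: initialize EGL and bind it to the platform Surface.
    private void initEgl(SurfaceTexture st) {
        EGLDisplay display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        int[] version = new int[2];
        EGL14.eglInitialize(display, version, 0, version, 1);

        int[] attribs = {
            EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
            EGL14.EGL_RED_SIZE, 8, EGL14.EGL_GREEN_SIZE, 8,
            EGL14.EGL_BLUE_SIZE, 8, EGL14.EGL_ALPHA_SIZE, 8,
            EGL14.EGL_NONE
        };
        EGLConfig[] configs = new EGLConfig[1];
        int[] num = new int[1];
        EGL14.eglChooseConfig(display, attribs, 0, configs, 0, 1, num, 0);

        int[] ctxAttribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
        EGLContext context = EGL14.eglCreateContext(
                display, configs[0], EGL14.EGL_NO_CONTEXT, ctxAttribs, 0);

        // Bind the GL context to the TextureView's SurfaceTexture.
        EGLSurface surface = EGL14.eglCreateWindowSurface(
                display, configs[0], st, new int[] { EGL14.EGL_NONE }, 0);
        EGL14.eglMakeCurrent(display, surface, surface, context);

        // Rendering environment is now ready: notify the JS thread (elided).
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) {}
    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture st) { return true; }
    @Override public void onSurfaceTextureUpdated(SurfaceTexture st) {}
}
```

Usage would simply be textureView.setSurfaceTextureListener(new CanvasSurfaceSetup()).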
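And here is a minimal sketch of the 2D record-and-replay pattern described above: the JS-thread bindings only append commands, and on VSYNC the raster thread plays them back against the real backend (Skia in FCanvas; android.graphics.Canvas stands in here). All names are illustrative.

```java
import android.graphics.Canvas;
import android.graphics.Paint;
import java.util.concurrent.ConcurrentLinkedQueue;

// One recorded 2D command; the JS thread only records, it never draws.
interface CanvasCommand {
    void replay(Canvas canvas, Paint paint);
}

final class Canvas2DRecorder {
    // Commands recorded on the JS thread, drained by the raster thread.
    private final ConcurrentLinkedQueue<CanvasCommand> pending =
            new ConcurrentLinkedQueue<>();

    // Called on the JS thread by the fillRect(...) binding: record only.
    void fillRect(float x, float y, float w, float h) {
        pending.add((canvas, paint) -> canvas.drawRect(x, y, x + w, y + h, paint));
    }

    // Called on the raster thread when VSYNC fires: replay, then swap.
    void rasterizeFrame(Canvas target) {
        Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
        CanvasCommand cmd;
        while ((cmd = pending.poll()) != null) {
            cmd.replay(target, paint);
        }
        // eglSwapBuffers(...) would follow here to send the frame to display.
    }
}
```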

Weex engine perspective

From the perspective of the Weex engine, Canvas is an extension component; Weex does not even know Canvas exists. It only knows that a certain area of the current page is embedded via a PlatformView, and it does not care about the specific content.

The left half of the diagram below depicts the core flow of the Weex 2.0 rendering pipeline: the widget's JS code is executed by the script engine, and the widget's DOM structure is converted into a series of Weex rendering instructions (such as AddElement for node creation and UpdateAttrs for attribute updates) through the generic CallNative binding interface. Unicorn then restores a static node tree from these rendering instructions, recording the parent-child relationships and each node's styles, attributes, and other information. On the Unicorn UI thread, the static node tree further generates the RenderObject render tree, which, through layout, painting, and other stages, produces multiple layers that are combined into a LayerTree structure. The LayerTree is sent to the raster module for compositing through the engine's BuildScene interface, and is finally rendered to the Surface and displayed via SwapBuffer.
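As an illustration of that instruction stream, here is a hedged sketch of how instructions like AddElement and UpdateAttrs might be applied to rebuild the static node tree. Only the instruction names come from the article; the classes and fields are assumptions for illustration.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// A node in the static node tree restored from the instruction stream.
final class StaticNode {
    final String ref;   // node id carried by the instruction
    final String type;  // e.g. "div", "canvas"
    final Map<String, String> attrs = new HashMap<>();
    final List<StaticNode> children = new ArrayList<>();
    StaticNode(String ref, String type) { this.ref = ref; this.type = type; }
}

final class NodeTreeBuilder {
    private final Map<String, StaticNode> nodes = new HashMap<>();

    // AddElement: create a node and attach it under its parent.
    void addElement(String parentRef, String ref, String type) {
        StaticNode node = new StaticNode(ref, type);
        nodes.put(ref, node);
        StaticNode parent = nodes.get(parentRef);
        if (parent != null) parent.children.add(node);
    }

    // UpdateAttrs: patch attributes on an existing node.
    void updateAttrs(String ref, Map<String, String> patch) {
        StaticNode node = nodes.get(ref);
        if (node != null) node.attrs.putAll(patch);
    }
}
```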

The right half is the rendering process of Canvas, which was already covered in the Canvas perspective above, so I will not repeat it. Here I focus on how Canvas is embedded. PlatformView has several implementations, and their processes differ considerably, so let's expand on them.

On Android, Weex 2.0 provides a variety of PlatformView embedding schemes, chief among them VirtualDisplay and Hybrid Composing, along with a self-developed hole-punching scheme.

VirtualDisplay

In this mode, the PlatformView content is ultimately converted into an external texture that participates in Unicorn's compositing process. First, a SurfaceTexture is created and stored in the Unicorn engine. Then an android.app.Presentation is created, with the Canvas TextureView as its child node, and is rendered to a VirtualDisplay. As is well known, a VirtualDisplay needs a Surface as its backend, so here the Surface is created from the SurfaceTexture. When the SurfaceTexture is filled with content, the engine side is notified and converts the SurfaceTexture into an OES texture that participates in Unicorn's rasterization. Finally, it is composited together with the other layers onto Unicorn's SurfaceView or TextureView.
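A minimal sketch of that flow follows, assuming we already have a GL_TEXTURE_EXTERNAL_OES texture id created on the raster thread. The Android APIs are real; names like embedPlatformView and "platform-view-vd" are illustrative, and real engines add lifecycle and resize handling omitted here.

```java
import android.app.Presentation;
import android.content.Context;
import android.graphics.SurfaceTexture;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.util.DisplayMetrics;
import android.view.Surface;
import android.view.View;

final class VirtualDisplayEmbedder {
    // oesTexture: a GL_TEXTURE_EXTERNAL_OES texture created on the raster thread.
    static void embedPlatformView(Context context, View platformView,
                                  int oesTexture, int width, int height) {
        // 1. SurfaceTexture kept by the engine, backed by the OES texture.
        SurfaceTexture surfaceTexture = new SurfaceTexture(oesTexture);
        surfaceTexture.setDefaultBufferSize(width, height);

        // 2. The VirtualDisplay renders into a Surface built on that SurfaceTexture.
        Surface surface = new Surface(surfaceTexture);
        DisplayManager dm = (DisplayManager)
                context.getSystemService(Context.DISPLAY_SERVICE);
        DisplayMetrics metrics = context.getResources().getDisplayMetrics();
        VirtualDisplay virtualDisplay = dm.createVirtualDisplay(
                "platform-view-vd", width, height, metrics.densityDpi, surface, 0);

        // 3. A Presentation hosts the PlatformView (e.g. the Canvas TextureView)
        //    and draws it onto the VirtualDisplay.
        Presentation presentation =
                new Presentation(context, virtualDisplay.getDisplay());
        presentation.setContentView(platformView);
        presentation.show();

        // 4. When the producer fills the SurfaceTexture, the engine is notified
        //    and samples the OES texture during Unicorn rasterization.
        surfaceTexture.setOnFrameAvailableListener(
                st -> { /* schedule a raster pass; call st.updateTexImage() there */ });
    }
}
```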

The performance of this mode is acceptable, but its main disadvantages are that it cannot respond to touch events, accessibility (a11y) features are lost, and TextInput cannot obtain focus. It is these compatibility issues that limit its applicable scenarios.

Hybrid Composing

In this mode, the widget is no longer rendered to a SurfaceView or TextureView, but to the Surfaces associated with one or more android.media.ImageReader instances. Unicorn encapsulates a custom Android View on top of each ImageReader, takes the Image objects the ImageReader produces as the data source, continuously converts them into Bitmaps, and participates in the native Android rendering process. Why can there be multiple ImageReaders? Because of possibly interleaved layouts: a PlatformView can have DOM nodes both above and below it. Meanwhile, the PlatformView itself (such as Canvas) is no longer converted into a texture; it participates in the Android rendering process as an ordinary View.
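Here is a hedged sketch of such an ImageReader-backed view (called UnicornImageView below, after the article; the implementation details are assumptions modeled on Flutter's equivalent, not Unicorn's actual code). Note the two paths: before Android 10 the pixels must be software-copied into a Bitmap, while on API 29+ the Image's HardwareBuffer can be wrapped directly.

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.ColorSpace;
import android.graphics.PixelFormat;
import android.hardware.HardwareBuffer;
import android.media.Image;
import android.media.ImageReader;
import android.os.Build;
import android.os.Handler;
import android.os.Looper;
import android.view.Surface;
import android.view.View;
import java.nio.ByteBuffer;

final class UnicornImageView extends View {
    private final ImageReader reader;
    private Bitmap latest; // most recent frame, drawn in onDraw

    UnicornImageView(Context context, int width, int height) {
        super(context);
        if (Build.VERSION.SDK_INT >= 29) {
            // GPU-usable buffers so the Image can be wrapped without a copy.
            reader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 3,
                    HardwareBuffer.USAGE_GPU_SAMPLED_IMAGE
                            | HardwareBuffer.USAGE_GPU_COLOR_OUTPUT);
        } else {
            reader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 3);
        }
        reader.setOnImageAvailableListener(r -> {
            Image image = r.acquireLatestImage();
            if (image == null) return;
            if (Build.VERSION.SDK_INT >= 29) {
                // Android 10+: wrap the hardware buffer, no software copy.
                latest = Bitmap.wrapHardwareBuffer(image.getHardwareBuffer(),
                        ColorSpace.get(ColorSpace.Named.SRGB));
                image.close(); // real code keeps it alive while the Bitmap is in use
            } else {
                // Pre-Android 10: software-copy the pixels into an ARGB bitmap.
                Image.Plane plane = image.getPlanes()[0];
                ByteBuffer buffer = plane.getBuffer();
                int widthPixels = plane.getRowStride() / plane.getPixelStride();
                Bitmap bmp = Bitmap.createBitmap(widthPixels, image.getHeight(),
                        Bitmap.Config.ARGB_8888);
                bmp.copyPixelsFromBuffer(buffer);
                image.close();
                latest = bmp;
            }
            invalidate(); // schedule onDraw on the main thread
        }, new Handler(Looper.getMainLooper()));
    }

    // The raster module renders this layer's content into this Surface.
    Surface getProducerSurface() { return reader.getSurface(); }

    @Override protected void onDraw(Canvas canvas) {
        if (latest != null) canvas.drawBitmap(latest, 0, 0, null);
    }
}
```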

Hybrid Composing solves most of the compatibility issues of VirtualDisplay, but it also introduces new problems. First, enabling a PlatformView in this mode requires merging threads: the tasks of the raster thread are posted to the Android main thread for execution, increasing main-thread pressure. Second, the ImageReader-based native Android View (the UnicornImageView mentioned above) must continually create Bitmaps and draw them; in particular, before Android 10 the Bitmap has to be produced by a software copy, which has a measurable performance impact.

Because of its better compatibility, Hybrid Composing is currently the engine's default PlatformView implementation.

Android Platform perspective

Finally, let's examine the process once more from the perspective of the Android platform (Weex + Hybrid Composing PlatformView).

In Hybrid Composing mode, widgets are rendered into one or more UnicornImageViews. From the Android platform's perspective, the view structure looks like the figure above: the Weex root view (UnicornView) is nested inside the Android root view (DecorView) and contains multiple UnicornImageViews plus one FCanvasPlatformView (a TextureView), stacked as UnicornImageView (overlay) -> FCanvasTextureView -> UnicornImageView (background) -> DecorView.

From the platform's perspective, we do not even need to care what UnicornImageView and FCanvas contain, only that both inherit from android.view.View and follow the native Android rendering process. Native rendering is driven by the VSYNC signal: the top-level function ViewRootImpl#performTraversals triggers the measure, layout, and draw passes. For drawing, the message is first delivered to the root DecorView, and dispatchDraw then calls each child View's onDraw in turn.

  • The FCanvas PlatformView is a TextureView, which essentially wraps a SurfaceTexture: when new content fills its internal buffer, the SurfaceTexture fires a frameAvailable callback, and the Android render thread then converts it into a texture via updateTexImage and composites it with the rest of the view hierarchy (a sketch of this consumer pattern appears after this list).
  • UnicornImageView is a custom View that essentially encapsulates an ImageReader. After the Surface buffer associated with the ImageReader is filled, the latest frame can be obtained via acquireLatestImage; in UnicornImageView#onDraw, that frame is converted into a Bitmap and drawn to the android.graphics.Canvas (see the sketch in the Hybrid Composing section above).
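For completeness, here is a minimal sketch of the SurfaceTexture consumer pattern. For a TextureView, the Android render thread performs these steps internally; the sketch shows the raw mechanism (as also used on the engine side in VirtualDisplay mode), with illustrative names.

```java
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

final class ExternalTextureConsumer {
    private final int[] tex = new int[1];
    private SurfaceTexture surfaceTexture;
    private final float[] transform = new float[16];

    // Must run on the GL (raster) thread with a context current.
    void init(int width, int height) {
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
        surfaceTexture = new SurfaceTexture(tex[0]); // the producer draws into this
        surfaceTexture.setDefaultBufferSize(width, height);
        surfaceTexture.setOnFrameAvailableListener(
                st -> scheduleFrame()); // the "frameAvailable" signal
    }

    // On the GL thread, before sampling the texture during compositing:
    void onFrame() {
        surfaceTexture.updateTexImage();              // latch the newest buffer
        surfaceTexture.getTransformMatrix(transform); // UV transform for sampling
        // ... draw a quad sampling a samplerExternalOES with `transform` ...
    }

    private void scheduleFrame() { /* post onFrame() to the GL thread */ }
}
```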

Android's view hierarchy is itself associated with a Surface, usually called the window Surface. After the view hierarchy completes the drawing pass, a DisplayList is generated; the Android render thread parses the DisplayList through the HWUI module, produces the actual graphics rendering instructions, and hands them to the GPU for hardware rendering. Everything is drawn to the window Surface, which SurfaceFlinger then composites with other Surfaces (the status bar, SurfaceViews, and so on) into the framebuffer that is finally shown on the device. This is the rendering process from the Android perspective.

Summary and Outlook

After the above analysis from multiple perspectives, readers should have a preliminary understanding of the rendering process. To sum up, Canvas, as a core capability of widgets, is supported by the Weex kernel's PlatformView extension mechanism. On the one hand, this loosely coupled, pluggable architecture lets the project iterate quickly, so Canvas can land in new scenarios and enable business rapidly; on the other hand, it makes the system more flexible and extensible.

At the same time, readers will have noticed that PlatformView itself has some performance defects, and performance optimization is one of the goals of our subsequent evolution. Next, we will try to integrate Canvas deeply into the Weex kernel's rendering pipeline so that Canvas and the Weex kernel share a Surface instead of embedding via the PlatformView extension, giving interactive widgets a more streamlined rendering path in the future. Stay tuned.
