Preface

  • While exploring iOS animation and rendering principles earlier, we learned about:
    • Computer graphics rendering principles;
    • Mobile screen imaging and jank principles;
    • The iOS rendering framework and iOS layer rendering principles;
  • As programmers, there are several ways we can improve our technical level, such as chain learning, loop learning, and comparative learning.
    • Obviously, while exploring "iOS animation and rendering principles" we used chain learning.
    • Next, to broaden our knowledge and build a relatively systematic understanding of the subject, we will adopt loop learning to explore several related topics:
      • iOS OffScreen Rendering: off-screen rendering principles
      • Causes and solutions of iOS CPU and GPU resource consumption
  • All of the above is from the iOS Native development perspective. From another dimension, we should of course also pay attention to WebApps and cross-platform development. You guessed right: here I am using the comparative learning method, with these related topics:
    • Rendering principles of the Web and RN-like big front ends
    • Flutter page rendering principles
    • Performance comparison of page rendering under three different technical schemes
  • Of course, in today's technology ecosystem, big companies such as Alibaba, Tencent, and Baidu have also launched mini programs, so we should also pay attention to:
    • Mini program framework rendering principles
  • In the previous topics, we started from the iOS Native perspective and expanded to the rendering principles of other, non-pure-Native channels. As we know, a project team should consider other client platforms when designing technical solutions, so we cannot leave the other Native platform behind:
    • Android page rendering principles. The purpose of this article is to explain Android page rendering principles, so let's unfold!

Android

The Android system was developed by Google. As one of the operating systems powering smart mobile devices, it has attracted much attention since its birth. To date it has iterated through more than nine major versions, and many domestic phone makers (such as Xiaomi) base their own systems on Android and ship them in their products.

The underlying kernel space of Android is based on the Linux kernel. The upper user space is composed of the Native system libraries, the virtual machine runtime environment, and the framework layer; kernel space and user space are connected through syscalls. User space is written mainly in C++ and Java, and the Java layer and the Native layer (C/C++) communicate through JNI, tying the whole system together.

Today we take Android page rendering principles as our theme and discuss the rendering frameworks and rendering pipeline of the Android system, as a knowledge reserve for technical selection during later project implementation. Let's get to today's topic!

Overview

This article mainly discusses Android page rendering principles. Before the main topic, we need some preparatory knowledge, so let's briefly review the principles of computer graphics rendering. If you already have a background in computer graphics, you can skip the preparation and read from the second section directly.

This article has the following chapters:

1. Preparatory knowledge

  • Computer graphics rendering principles;
  • Mobile screen imaging and jank principles;

2. Android view layer and window
3. Various rendering frameworks and rendering pipelines of the Android system
4. Summary, recommended reading, and related reading

1. Preparatory knowledge

The graphics rendering principles of the Android system are, at their core, the same as general computer graphics rendering principles. Therefore, before we look at Android's View system and its 2D and 3D rendering frameworks and pipelines, some preparation is useful; see my article Computer graphics rendering principles.

In the linked article you can learn about "the design philosophies and performance differences of CPUs and GPUs in smart hardware", "the birth of the GPU as a graphics rendering chip", "3D graphics rendering libraries built around the GPU (OpenGL, DirectX, etc.), graphics terminology, and how OpenGL works", "the electron-beam scanning principle of CRT screens", "screen imaging principles", and other relevant core points.

If you do not want to dive into CPU and GPU details and would rather understand the principles of screen imaging on mobile devices directly, you can also read my simplified article Screen imaging and jank on mobile terminals, written specifically for mobile devices. In that article we focus on two dimensions: first, the imaging problems encountered by the system, and second, their solutions. The key points can be briefly summarized as:

  • Problems: screen tearing, dropped frames (jank), and blank screens caused by mis-switching the imaging buffers
  • Solutions: VSync, double buffering, triple buffering
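The double-buffering solution above can be sketched in a few lines. This is a minimal, illustrative model (the class and method names are my own, not Android APIs): the GPU draws into a back buffer, and the swap to the front buffer happens only on a VSync tick, so the display never scans out a half-drawn frame.

```java
// Minimal double-buffering sketch (illustrative names, not Android APIs).
public class DoubleBufferingDemo {
    static class FrameBufferPair {
        String front = "frame-0";   // what the display controller scans out
        String back  = null;        // what the GPU is currently drawing into

        void draw(String frame) { back = frame; }        // GPU finishes a frame
        void onVsync() {                                 // swap only at VSync
            if (back != null) { front = back; back = null; }
        }
    }

    public static String simulate() {
        FrameBufferPair fb = new FrameBufferPair();
        fb.draw("frame-1");
        // Before VSync, the display still shows the old but complete frame.
        String beforeSwap = fb.front;
        fb.onVsync();
        return beforeSwap + "->" + fb.front;
    }

    public static void main(String[] args) {
        System.out.println(simulate()); // frame-0->frame-1
    }
}
```

Triple buffering adds one more back buffer so the GPU can start the next frame even while one finished frame is still waiting for its VSync.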

In conclusion, we mainly focus on the whole rendering pipeline of screen imaging, so that we can later discuss Android image rendering: ① obtain layer rendering data → ② the GPU processes the pixel data → ③ the frame buffer stores the pixel information → ④ the video controller reads the buffer → ⑤ digital-to-analog conversion and display

Our topic today focuses on the first link. There are several points to start with:

  • The view layer and view window of the Android system, and its various (2D/3D) graphics rendering frameworks
  • The Android rendering pipeline
  • The Android event mechanism

2. Android view layer and view window

2.1 The three drivers of the Android display system: View, AMS, WMS

During development, when we code and draw the interface, we care only about a single interface, but when the project runs on a device there may be multiple interfaces on screen. Those of you who have done Android development know that we inherit from an Activity and use it to present our interface. One of the Activity's responsibilities is managing the interface life cycle, which goes hand in hand with managing the view window. This involves the two main services in Android: AMS (Activity Manager Service) and WMS (Window Manager Service). View, AMS, and WMS can be called the troika of the whole upper display system.

In Android, a window is described by a Surface. Multiple windows (not all of which are Activities) that need to be displayed at the same time must be merged. This is done by a heavyweight service in the display system, SurfaceFlinger, which controls window composition: it merges multiple windows into one and then sends the result to the display.

SurfaceFlinger is a Native service. How does it describe a window? SurfaceFlinger uses the concept of layers. Its composition is based on the Display HAL and merges multiple layers; Display HAL implementations vary widely from vendor to vendor.

2.2 SurfaceUI

In general, the Android system uses a UI architecture called Surface to provide the user interface for applications.

  • In Android applications, each Activity component is associated with one or more windows, and every window has a Surface. A seemingly simple screen, broken down, may contain several windows: the app itself (e.g. WeChat), a floating toolbox, the notification bar, and the bottom virtual button bar.

    Each interface in the upper layer actually corresponds to a Surface object in SurfaceFlinger, and the upper layer draws its own content into the corresponding Surface.
  • With a Surface, applications can render a window's UI onto it. Eventually, all drawn Surfaces are submitted to the Surface management service, SurfaceFlinger, which composites them and displays them on the screen.
    • Next, SurfaceFlinger composites all the graphics in the Surfaces corresponding to the upper layer, as shown in the following figure:

    1. A Surface corresponds to an upper-level window (a dialog box, an Activity, the status bar)
    2. A Surface is a drawing board for upper-layer graphics
    3. Canvas is the brush: the upper layer draws graphics onto the Surface by calling the Canvas API
    4. A Surface contains multiple buffers internally, forming a BufferQueue
  • Both applications and SurfaceFlinger can use hardware such as the GPU to render the UI for smoother results.

To summarize the process of Android application display in a word:

An Android application, through a graphics rendering framework and with the help of the GPU, draws its measured, laid-out Surface, then calls the SurfaceFlinger service to composite it; finally, digital-to-analog conversion displays it on the screen.

2.3 Key members behind the Android graphical interface

Let’s start with a few key members:

  • Surface: each window of an Android app corresponds to a canvas, i.e. a Surface

    • You can think of it as a window for an Android application.
    • Surface is an interface for exchanging buffers between producers and consumers.
  • SurfaceView: The SurfaceView is a component that can be used to embed additional composition layers in the View hierarchy.

    • The SurfaceView takes the same layout parameters as other views, so you can operate on it just like any other View, but the SurfaceView’s contents are transparent.
    • When the View component of the SurfaceView is about to become visible, the framework asks the SurfaceControl to request a new Surface from the SurfaceFlinger.
  • Surface Buffer: The graphics are transferred through Buffer as a carrier, and Surface is the further encapsulation of Buffer.

    • In other words, there are multiple buffers inside the Surface for the upper layer to use. How to manage these buffers?

    • The Surface provides a BufferQueue, which forms a Producer and Consumer model with the upper layer corresponding to the Producer and the SurfaceFlinger corresponding to the Consumer.
    • The three are linked by Buffer, and each Buffer has four states:
      • Free: available for the upper layer to use
      • Dequeued: out of the queue, currently being used by the upper layer
      • Queued: in the queue; upper-layer rendering is complete, waiting for SurfaceFlinger to composite
      • Acquired: acquired by SurfaceFlinger, which holds it for composition
    • A Buffer's transfer process is roughly as follows:
      • It moves from the BufferQueue to the upper layer
      • It is put back into the BufferQueue when the upper layer finishes drawing
      • SurfaceFlinger then composites it
      • And then it goes back into the BufferQueue
      • This loop forms the Buffer recycling process.
  • EGLSurface and OpenGL ES: OpenGL ES (GLES) defines the graphics rendering API used in conjunction with EGL.

    • EGL is a library that specifies how windows are created and accessed from the operating system
    • Use the GLES call to draw texture polygons;
    • Use the EGL call when you want to put the render on screen;
  • WindowManager: WindowManager controls window objects, which are containers used to hold view objects.

    • Window objects are always supported by Surface objects.
    • WindowManager oversees the life cycle, input and focus events, screen orientation, transitions, animation, positioning, transformation, Z-order, and many other aspects of windows.
    • WindowManager sends all window metadata to SurfaceFlinger so that SurfaceFlinger can use this data to compose the Surface on the screen.
  • SurfaceFlinger: Android service. Manage frame buffer of Android system.

    1. Allocating graphics buffer
    2. Composite graphics buffer
    3. Manage VSYNC events
    • SurfaceFlinger takes buffers of data from multiple sources, composites them, and sends them to the display.
    • WindowManager provides buffers and window metadata to SurfaceFlinger, which SurfaceFlinger can use to compose the Surface onto the screen
    • SurfaceFlinger accepts buffers in two ways: through BufferQueue and SurfaceControl, or through ASurfaceControl.
  • ViewRootImpl: Used to control window rendering and communicate with WindowManagerService and SurfaceFlinger.

  • BufferQueue: The BufferQueue class connects the component (producer) that can generate a buffer of graphical data to the component (consumer) that receives the data for display or further processing.

    • Almost everything that moves the graphical data buffer in the system depends on the BufferQueue.

  • View and ViewGroup

    • View is the base class of all Android controls. Whether a simple TextView or Button, or a complex LinearLayout or ListView, their common base class is View, which represents a blank rectangular area.
    • View has an important subclass, ViewGroup, which is usually used as a container for other components.
    • All Android UI components are based on View and ViewGroup. Android designs View and ViewGroup with the composite pattern: ViewGroup is a subclass of View, so a ViewGroup can also be used as a View.
    • In the graphical user interface of an Android application, a ViewGroup serves as a container holding other components, and besides ordinary View components it can also contain other ViewGroup components.

    • A View has its own position, color, and so on, and also handles events such as clicks and touch gestures.
    • So we can introduce a diagram to make sense of the relationship between the graphics rendering frameworks and the Surface:

  • Canvas: Canvas is a 2D graphics API that acts as the actual renderer of the Android View tree.

    • Canvas can be divided into Skia software drawing and HWUI hardware accelerated drawing.

    • Before Android 4.0, Skia drawing was the default: drawing instructions were completed entirely by the CPU, all on the main thread; in complex scenes a single frame easily exceeds 16 ms, causing jank.

    • Starting with Android 4.0, hardware-accelerated rendering is enabled by default, and 5.0 split the rendering work across two threads:

      • The main thread, which is responsible for recording rendering instructions
      • The render thread, which is responsible for executing them via OpenGL ES; the two threads can run concurrently.
    • In addition to Canvas, developers can also render directly from OpenGL ES in asynchronous threads, generally suitable for games, video playback and other independent scenes.

  • OpenGL ES: OpenGL is a C-based 3D rendering API; OpenGL ES is its profile for embedded systems

  • Vulkan: Vulkan is a low-overhead, cross-platform API for high-performance 3D graphics.

    • Like OpenGL ES, Vulkan provides tools for creating high-quality real-time graphics in your applications.
    • Vulkan is intended to replace OpenGL; it supports both 3D and 2D and is much more lightweight
  • Skia: Skia is Android's underlying 2D graphics library
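The composite pattern behind View and ViewGroup can be sketched as a standalone model. Note this is my own illustrative code, not Android framework code, even though the class names mirror Android's: a ViewGroup is itself a View, so containers and leaves share one interface and can be nested arbitrarily.

```java
// Standalone sketch of the composite pattern behind View/ViewGroup
// (illustrative model, not framework code).
import java.util.ArrayList;
import java.util.List;

public class ViewCompositeDemo {
    static class View {
        final String name;
        View(String name) { this.name = name; }
        int count() { return 1; }            // a leaf counts only itself
    }

    static class ViewGroup extends View {    // a ViewGroup is also a View
        final List<View> children = new ArrayList<>();
        ViewGroup(String name) { super(name); }
        void addView(View v) { children.add(v); }
        @Override int count() {              // recurse over the subtree
            int n = 1;
            for (View c : children) n += c.count();
            return n;
        }
    }

    public static int buildAndCount() {
        ViewGroup root = new ViewGroup("LinearLayout");
        root.addView(new View("TextView"));
        ViewGroup row = new ViewGroup("FrameLayout");
        row.addView(new View("Button"));     // nested container works like a leaf
        root.addView(row);
        return root.count();                 // root + TextView + row + Button
    }

    public static void main(String[] args) {
        System.out.println(buildAndCount()); // 4
    }
}
```

This is exactly why measuring, laying out, and drawing can be expressed as a single recursive traversal of the view tree.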

3. Various rendering frameworks and rendering pipelines of the Android system

3.1 Evolution of Android rendering

It’s helpful to understand the history of Android’s constant optimization of rendering.

Android 4.1

Introduced Project Butter: VSync, triple buffering, and the Choreographer.

Android 5.0

Introduced the RenderThread (maintained by the system at the framework level), which takes over the rendering instructions (OpenGL/Vulkan/Skia) previously issued directly from the main thread. This reduces main-thread work: even if the main thread stalls, rendering is not affected.
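The split introduced in 5.0 can be modeled in a few lines. This is a hedged sketch with illustrative names, not framework APIs: the UI thread only records drawing commands into a display list, and a separate render thread replays them, so recording stays cheap on the main thread.

```java
// Sketch of "main thread records, render thread replays" (illustrative
// names; the real RenderThread replays display lists via OpenGL ES).
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class RenderThreadDemo {
    public static String renderFrame() {
        // 1. Main thread: record commands cheaply; no GPU work happens here.
        List<String> displayList = List.of("drawRect", "drawText", "drawBitmap");

        // 2. Render thread: replay the recorded commands (stand-in for GLES calls).
        ExecutorService renderThread = Executors.newSingleThreadExecutor();
        try {
            Future<String> frame =
                renderThread.submit(() -> String.join("|", displayList));
            return frame.get();           // the composed frame
        } catch (Exception e) {
            return "";
        } finally {
            renderThread.shutdown();
        }
    }

    public static void main(String[] args) {
        System.out.println(renderFrame()); // drawRect|drawText|drawBitmap
    }
}
```

Because the display list is an immutable snapshot, the main thread can start recording the next frame while the render thread is still replaying this one.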

Android 7.0

Introduced Vulkan support. OpenGL is a 3D rendering API; Vulkan is intended to replace it, supports both 3D and 2D, and is much more lightweight.

3.2 Graphics rendering frameworks: OpenGL, Vulkan, Skia

  • OpenGL is a cross-platform 3D graphics rendering specification. OpenGL ES is its variant optimized for embedded devices such as mobile phones.

  • Vulkan: Has the same functionality as OpenGL, but it supports 3D and 2D at the same time. It is lighter and has higher performance than OpenGL.

  • Skia: Skia is an image rendering library. It handles 2D graphics by itself, while 3D effects (hardware dependent) are supported through OpenGL, Vulkan, or Metal backends. It supports both CPU software drawing and GPU hardware acceleration. Android and Flutter both use it for drawing.

3.3 Graphics rendering pipeline

Before introducing the rendering pipeline in collaboration with various rendering frameworks, let’s introduce an official system diagram:

This diagram roughly describes the flow of graphic data:

  • Producers such as OpenGL ES and MediaPlayer produce graphic data into a Surface
  • The Surface transfers GraphicBuffers across processes, via IGraphicBufferProducer, to the consumer SurfaceFlinger.
  • SurfaceFlinger composites all Layers (each corresponding to a Surface) according to the window information provided by WMS; the specific composition strategy is decided and implemented by the hwcomposer HAL module,
  • which finally also sends the result to the Display, while the Gralloc module is responsible for allocating the graphics buffers.

Explanation of the concepts in the figure:

  • Image Stream Producers: Producers that can generate graphic buffers. For example, OpenGL ES, Canvas 2D, mediaserver’s video decoder.

  • Image Stream Consumers: the most common consumer is SurfaceFlinger, which uses OpenGL and the Hardware Composer to compose a set of Surfaces.

    • OpenGL ES apps can also consume graphics streams; for example, a camera app consumes the camera preview stream;
    • Non-OpenGL ES applications can consume them too, e.g. via the ImageReader class
  • Window Manager: used to manage windows, a window being a container for a set of Views. WM sends the window metadata (position, Z-order, etc.) to SurfaceFlinger, so SurfaceFlinger can use this information to composite Surfaces and output them to the display device.

  • Hardware Composer (hardware hybrid renderer): the hardware abstraction layer of the display subsystem.

    • SurfaceFlinger can delegate some composition work to the Hardware Composer to reduce the workload on OpenGL and the GPU.
    • To the Hardware Composer, SurfaceFlinger acts as just another OpenGL ES client: SurfaceFlinger uses OpenGL ES while composing one or two buffers into a third, and offloading the rest to the Hardware Composer makes power consumption much lower than performing all composition on the GPU.
  • HAL: the hardware abstraction layer, which displays graphic data on the device screen

    • The Hardware Composer HAL does the other half of the work and is at the heart of all Android graphics rendering.
    • The Hardware Composer must support events, one of which is VSYNC (another is hot-plug for plug-and-play HDMI)
  • Gralloc: short for graphics memory allocator, used by graphics producers to request buffer allocation.

From this figure we can see that the Android graphics rendering design embodies a producer-consumer pattern, and the bridge between the two is the buffer data:

Producers and consumers may run in different processes.

  • The producer requests a free buffer: dequeueBuffer()
  • The producer fills the buffer and returns it to the queue: queueBuffer()
  • The consumer acquires a filled buffer: acquireBuffer()
  • The consumer returns it to the queue: releaseBuffer()
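The four calls above, together with the four buffer states described in section 2.3, can be sketched as a tiny standalone state machine. The method names follow the text, but the class is a simulation of the handshake, not the real Android BufferQueue: each buffer cycles Free → Dequeued → Queued → Acquired → Free.

```java
// Standalone simulation of the BufferQueue producer/consumer handshake
// (illustrative model, not the real android BufferQueue implementation).
import java.util.ArrayDeque;

public class BufferQueueDemo {
    enum State { FREE, DEQUEUED, QUEUED, ACQUIRED }

    static class Buffer { State state = State.FREE; String pixels; }

    static class BufferQueue {
        final ArrayDeque<Buffer> free = new ArrayDeque<>();
        final ArrayDeque<Buffer> queued = new ArrayDeque<>();
        BufferQueue(int n) { for (int i = 0; i < n; i++) free.add(new Buffer()); }

        Buffer dequeueBuffer() {            // producer takes a free buffer
            Buffer b = free.remove(); b.state = State.DEQUEUED; return b;
        }
        void queueBuffer(Buffer b) {        // producer hands back a drawn frame
            b.state = State.QUEUED; queued.add(b);
        }
        Buffer acquireBuffer() {            // consumer (SurfaceFlinger) takes it
            Buffer b = queued.remove(); b.state = State.ACQUIRED; return b;
        }
        void releaseBuffer(Buffer b) {      // consumer returns it for reuse
            b.state = State.FREE; b.pixels = null; free.add(b);
        }
    }

    public static String oneCycle() {
        BufferQueue q = new BufferQueue(2);        // e.g. double buffering
        Buffer b = q.dequeueBuffer();
        b.pixels = "frame-1";                      // "the app draws the frame"
        q.queueBuffer(b);
        Buffer composited = q.acquireBuffer();     // "SurfaceFlinger composites"
        q.releaseBuffer(composited);
        return composited.state + ":" + q.free.size();
    }

    public static void main(String[] args) {
        System.out.println(oneCycle()); // FREE:2
    }
}
```

After a full cycle the buffer is Free again and back in the pool, which is exactly the recycling loop described above.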

Building on the official diagram above, we can redraw it as a layered diagram to understand the rendering process and the graphics frameworks involved:

In general, there are two ways that app developers can draw images to the screen:

  • Canvas
  • OpenGL ES

Canvas and the threading model behind it were introduced in section 2.3: before Android 4.0, Canvas drawing was done entirely in software by Skia on the CPU main thread; since 4.0 hardware acceleration is the default; and since 5.0 the main thread records rendering instructions while the RenderThread executes them via OpenGL ES. Besides Canvas, developers can also render directly with OpenGL ES on an asynchronous thread, which suits standalone scenarios such as games and video playback.

From the application's perspective, whether using Canvas or OpenGL ES, the final rendering target is a Surface. Flutter, the popular cross-platform UI framework, also renders directly to a Surface on Android.

  • A Surface is a window: an Activity corresponds to a Surface, and so does a Dialog. It carries the upper layer's graphic data and corresponds to a Layer on the SurfaceFlinger side.

    • The Native-layer Surface implements the ANativeWindow structure and, in its constructor, holds an IGraphicBufferProducer used to interact with the BufferQueue.

    • The BufferQueue is the link between the Surface and the Layer. When upper-layer graphics are rendered to the Surface, they are actually rendered into a GraphicBuffer in the BufferQueue; the GraphicBuffer is then queued through IGraphicBufferProducer so that SurfaceFlinger can do the subsequent composition and display.

    • SurfaceFlinger is responsible for compositing all Layers and sending them to the Display. Layers can be composited in two main ways:

      • OpenGL ES: compose the layers into the FrameBuffer, then submit the FrameBuffer to hwcomposer to complete the remaining composition and display work.
      • hwcomposer: the HWC module composes some of the layers together with the FrameBuffer and displays the result on the Display.
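The split between the two composition paths can be sketched as a simple planning step. This is a hedged illustration, not SurfaceFlinger's actual policy: the overlay limit and layer names are hypothetical, but the idea is that layers which fit into the HWC's overlay planes are composed by display hardware, and the overflow falls back to GLES on the GPU.

```java
// Illustrative sketch of splitting layers between HWC overlays and the
// GLES/GPU fallback (the overlay limit and layer names are hypothetical).
import java.util.ArrayList;
import java.util.List;

public class CompositionDemo {
    // Assign up to `overlayPlanes` layers to HWC; the overflow goes to the GPU path.
    public static String plan(List<String> layers, int overlayPlanes) {
        List<String> hwc = new ArrayList<>(), gles = new ArrayList<>();
        for (String layer : layers) {
            if (hwc.size() < overlayPlanes) hwc.add(layer);
            else gles.add(layer);
        }
        return "HWC=" + hwc + " GLES=" + gles;
    }

    public static void main(String[] args) {
        // e.g. a status bar, an app window, a navigation bar, and a toast,
        // on hardware with only 2 overlay planes
        System.out.println(plan(List.of("status", "app", "nav", "toast"), 2));
        // HWC=[status, app] GLES=[nav, toast]
    }
}
```

Keeping as many layers as possible on HWC overlay planes is what makes hardware composition cheaper in power than composing everything on the GPU.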

This article will not cover the detailed workflows of OpenGL ES or Skia; if you want to know more, please see the related and recommended reading below.

4. Summary

This article briefly introduced the UI composition of the Android system, some related concepts behind it, and several graphics rendering frameworks and rendering processes.

Topics not discussed in depth in this article include:

  1. How SurfaceFlinger works during the rendering process
  2. How data is passed between the UI thread, the RenderThread, and SurfaceFlinger during rendering
  3. The use of the rendering frameworks Skia, OpenGL ES, and Vulkan
  4. …

Recommended reading

  • 01-Android Skia Graphics Library
  • 02-Android Rendering series -App Full analysis of the entire rendering process

Related reading (14 articles in total)

  • 01 - Computer underlying principles | Computer graphics rendering principles
  • 02 - Computer underlying principles | Mobile screen imaging and jank

iOS related topics

  • 01 - iOS underlying principles | iOS rendering framework and iOS layer rendering principles
  • 02 - iOS underlying principles | iOS animation rendering principles
  • 03 - iOS underlying principles | iOS OffScreen Rendering principles
  • 04 - iOS underlying principles | Jank: causes and solutions of CPU and GPU resource consumption

WebApp related topics

  • 01 - Rendering principles of the Web and RN-like big front ends

Topics related to cross-platform development solutions

  • 01-Flutter page rendering principle

Stage summary: Performance comparison of Native, WebApp and cross-platform development schemes

  • 01- Performance comparison of Native, WebApp and cross-platform development schemes

Android and HarmonyOS page rendering topics

  • 01-Android page rendering principle
  • 02 - HarmonyOS page rendering principles (to be published)

Mini program page rendering topics

  • 01 - Mini program framework rendering principles

Conclusion

  • 01 - Big front-end technology stack and project scheme selection (to be published)
  • 02 - Thoughts on tech-stack accumulation for big front-end engineers (to be published)