[TOC]

preface

Recently I have been doing a lot of animation-related work in iOS development. Digging into the principles behind animation and rendering turned out to involve a whole chain of knowledge, from the hardware at the bottom up to the software frameworks. To understand how iOS renders images, we first needed to understand how computer graphics rendering works in general, which led to the article 01 – Computer Principles | Computer Graphics Rendering. That article covers the graphics rendering pipeline and how a terminal device displays an image on screen. In this article we pick up the thread of screen image display and take a deeper look at screen imaging and stutter (jank), laying the groundwork for further topics.

I. The Screen Imaging Process

In 01 – Computer Principles | Computer Graphics Rendering we covered the graphics rendering pipeline and how a terminal device displays an image on screen. Let's start with a brief review of the screen imaging process.

1. Displaying the rendered pixel information on the physical screen

  • After the image rendering process is complete, the next step is to display the resulting pixel information on the physical screen.

  • The GPU-processed collection of pixels, known as a bitmap, is cached in the frame buffer for later display.

  • After the GPU finishes rendering in the last step, the pixel information is stored in the frame buffer (Framebuffer). The video controller then reads that information from the frame buffer and, after digital-to-analog conversion, transfers it to the monitor for display. The complete process is shown below:

    • ① Rendering → ② Frame buffer (stores pixel information) → ③ Video controller reads the buffer → ④ Digital-to-analog conversion and display
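The four-step path above can be expressed as a toy model. This is only an illustrative sketch: the class names (`Framebuffer`, `VideoController`) and the string "bitmaps" are invented for the example and are not real iOS APIs.

```python
# Toy model of the display path: render -> frame buffer -> video controller -> monitor.
# All names here are illustrative, not real iOS or Core Animation APIs.

class Framebuffer:
    """Holds the bitmap (pixel data) produced by the GPU."""
    def __init__(self):
        self.bitmap = None

    def store(self, bitmap):
        self.bitmap = bitmap            # (2) rendered pixels are cached here

class VideoController:
    """Reads the frame buffer and sends its contents to the monitor."""
    def __init__(self, framebuffer):
        self.framebuffer = framebuffer

    def scan_out(self):
        # (3) read the cached bitmap; (4) D/A conversion + display would follow
        return self.framebuffer.bitmap

def render_frame(frame_id):
    """(1) stand-in for the CPU+GPU rendering work."""
    return f"bitmap-{frame_id}"

fb = Framebuffer()
vc = VideoController(fb)
fb.store(render_frame(0))
print(vc.scan_out())                    # the monitor shows frame 0: bitmap-0
```

The point of the model is only the data flow: the GPU writes into the frame buffer, and the video controller reads from it independently.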

2. How the electron beam scans out the pixel information

  • The display's electron beam scans the screen line by line, starting at the top-left corner; the image information for each point on the screen is read from the bitmap in the frame buffer and shown on screen. The scanning process works as follows:

  • During the beam scan, the screen displays the corresponding results, and each time the entire screen is scanned, it is equivalent to presenting a complete image.

  • By constantly refreshing the screen with new frames, a continuous moving image is presented.

  • The rate at which the screen refreshes is measured in frames per second (FPS). Because of the persistence of vision of the human eye, when the screen refreshes at a high enough rate (usually around 50 to 60 FPS), the picture looks continuous and fluid. For iOS, 60 FPS is generally the best experience you can get.
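A useful consequence of the refresh rate is the per-frame time budget: at a given FPS, the whole CPU+GPU pipeline must finish within 1000 ms / FPS. A quick sketch (the 20 ms render time is an invented example):

```python
# Per-frame time budget at a given refresh rate: 1000 ms / FPS.
refresh_rate_fps = 60
frame_budget_ms = 1000 / refresh_rate_fps     # ~16.67 ms per frame at 60 FPS

# If the CPU+GPU pipeline takes longer than the budget, the frame
# misses its refresh window (illustrative number):
render_time_ms = 20
print(round(frame_budget_ms, 2))              # 16.67
print(render_time_ms > frame_budget_ms)       # True: this frame is late
```

This 16.67 ms budget is the number behind everything that follows: tearing, Vsync, and frame drops are all about what happens when rendering does or does not fit inside it.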

3. The screen refresh as the basis of a continuous-image experience

  • With the groundwork above, it is clear that the screen imaging process is "render → frame buffer → scan → display on the physical screen", while a continuous picture on screen is the result of continuously switching between successive frames, each one rendered and then fully scanned out.
  • It is therefore easy to conclude that a good continuous-image experience requires that "successive frames are fully scanned and switched smoothly". But while building such an experience, the great engineers who first ran into these problems encountered "screen tearing" and "frame drops"; our smart and diligent engineer friends then came up with "double buffering", "triple buffering", "Vsync", "Hsync" and other solutions. We will go through them one by one.

II. Screen Tearing

  • So-called screen tearing refers to the phenomenon in which "the already-scanned part of the screen shows the previous frame, while the not-yet-scanned part shows a new frame". Parts of two adjacent frames appear on screen at the same time, so the picture no longer comes from a single frame.
  • In this single-buffer mode, the ideal screen imaging scenario is a smooth pipeline:
    • Every time the electron beam starts to scan a new frame from the top, the CPU+GPU rendering of that frame has already finished and the rendered bitmap has been placed in the frame buffer.
    • But this perfect situation is fragile and prone to screen tearing:
    • CPU+GPU rendering is time-consuming. If the bitmap is not ready when the electron beam starts scanning a new frame, but only finishes and lands in the frame buffer once the beam has reached the middle of the screen, then the already-scanned part shows the previous frame while the rest shows the new frame, and the screen tears.
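A minimal simulation of that failure mode, assuming an 8-line "screen" whose lines are single characters. Everything here is invented for illustration; the point is only that writing into the live buffer mid-scan splits the output between two frames.

```python
# Single-buffer tearing sketch: the GPU writes into the SAME buffer the
# electron beam is scanning. If the new frame lands mid-scan, the top of
# the screen shows the old frame and the bottom shows the new one.
SCREEN_LINES = 8

def scan_single_buffer(buffer, new_frame, write_at_line):
    """Scan line by line; the new frame overwrites the buffer mid-scan."""
    shown = []
    for line in range(SCREEN_LINES):
        if line == write_at_line:      # rendering finishes mid-scan...
            buffer[:] = new_frame      # ...and overwrites the live buffer
        shown.append(buffer[line])
    return shown

old = ["A"] * SCREEN_LINES             # previous frame
new = ["B"] * SCREEN_LINES             # freshly rendered frame
result = scan_single_buffer(old, new, write_at_line=4)
print(result)  # ['A', 'A', 'A', 'A', 'B', 'B', 'B', 'B'] -> a torn image
```

If `write_at_line` were 0 (the render finished exactly at the start of the scan), the output would be a clean frame of all `'B'`: that is the fragile "perfect situation" described above.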

III. Vsync + Double Buffering

  • One strategy for solving screen tearing and improving display efficiency is to use a Vsync signal together with double buffering.
    • According to Apple’s documentation, iOS devices will always use Vsync + Double Buffering.
    • Vsync: Vertical Synchronization, which effectively locks the frame buffer.
      • When the beam completes a frame of scanning and is about to start again from the top, the display sends out a Vsync signal.
      • Only when the video controller receives Vsync does it update the bitmap in the frame buffer to the next frame. This ensures that each refresh displays a single consistent frame, thus avoiding screen tearing.
    • Double Buffering: the double-buffering mechanism
      • However, with Vsync locking the buffer, the video controller needs the bitmap of the next frame as soon as it receives Vsync, which would mean the whole CPU+GPU rendering process has to complete in an instant; that is obviously unrealistic.
      • So double buffering adds a new back buffer.
        • The render results are pre-stored in the back buffer
        • When the Vsync signal arrives, the video controller swaps the back buffer with the frame buffer. This guarantees the swap completes almost instantly (in fact, only the memory addresses are swapped).
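The swap-on-Vsync idea can be sketched like this. `DoubleBuffer` and its method names are hypothetical, and the "bitmaps" are plain lists; the real mechanism lives in the display hardware and driver, not in app code.

```python
# Double-buffering sketch: the GPU renders into a back buffer; on each
# Vsync the front/back references are swapped. Swapping references is
# O(1), so the "replacement" is effectively instantaneous and the beam
# never observes a half-written frame.
class DoubleBuffer:
    def __init__(self):
        self.front = ["A"] * 8   # currently being scanned out
        self.back = ["A"] * 8    # currently being rendered into

    def render_into_back(self, frame):
        self.back = frame        # the GPU only ever writes off-screen

    def on_vsync(self):
        # swap the two references (an address swap, not a pixel copy)
        self.front, self.back = self.back, self.front

db = DoubleBuffer()
db.render_into_back(["B"] * 8)   # render the next frame off-screen
db.on_vsync()                    # swap during the vertical blank
print(db.front)                  # a whole frame of 'B': no tearing
```

Compare this with the single-buffer sketch in the previous section: because the scan only ever reads `front`, a slow render can no longer split the displayed image between two frames.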

IV. Frame Drops (Jank)

  • Enabling the Vsync signal and double buffering solves screen tearing, but introduces a new problem: frame drops.
  • What is frame drop?
    • Frame drop: if the CPU and GPU have not finished rendering the new bitmap by the time Vsync arrives, the video controller does not swap the bitmap in the frame buffer. The screen then rescans and shows exactly the same image as the previous frame. Two refresh cycles showing the same picture is what we call a dropped frame.
    • As shown in the figure, A and B represent two frame buffers. When Vsync arrives while B has not finished rendering, the screen can only display frame A again; this is the first dropped frame.
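A small sketch of the drop rule, assuming a 16.7 ms refresh interval and invented frame-finish times (the function and variable names are mine, for illustration only):

```python
INTERVAL_MS = 16.7  # one 60 Hz refresh interval

def displayed_frames(finish_times_ms, num_vsyncs):
    """finish_times_ms[i] is when frame i+1 finishes rendering.
    At each Vsync, show the newest finished frame; if nothing newer
    than what is already on screen is ready, the old frame repeats
    (a dropped frame)."""
    shown, on_screen = [], 0  # frame 0 = whatever was on screen before
    for tick in range(1, num_vsyncs + 1):
        now = tick * INTERVAL_MS
        ready = [i + 1 for i, t in enumerate(finish_times_ms) if t <= now]
        if ready and ready[-1] > on_screen:
            on_screen = ready[-1]
        shown.append(on_screen)
    return shown

# Frame 1 finishes at 10 ms; frame 2 is slow and finishes at 40 ms:
print(displayed_frames([10, 40], num_vsyncs=3))
# [1, 1, 2]: the second Vsync repeats frame 1, i.e. one dropped frame
```

Note that the drop is purely a consequence of the 40 ms render overrunning the 16.7 ms budget; Vsync itself never tears, it just repeats.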

V. Triple Buffering

  • In fact, there is still room for improvement. Notice that the CPU and GPU sit idle for a period of time when frames are dropped:

    • While the content of A is being scanned out and displayed on the screen, and the content of B has already finished rendering, the CPU and GPU are idle.
    • So if we add another frame buffer, we can use this idle time to render the next frame and temporarily store the result in the new buffer.
  • As shown in the figure, thanks to the extra frame buffer, the gap during a dropped frame can be put to use: CPU and GPU capacity is used more rationally, and the number of dropped frames is reduced.
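One way to see the benefit is a toy timeline model. The buffer-availability rule below (a frame can start rendering once the frame `back_buffers` positions earlier has been swapped onto the screen) is a simplification of real swap-chain behaviour, and all the numbers are illustrative:

```python
import math

INTERVAL_MS = 16.7  # one 60 Hz refresh interval

def simulate(render_ms, back_buffers, frames):
    """Return the Vsync tick at which each frame appears on screen.
    back_buffers = 1 models double buffering, 2 models triple buffering.
    Simplification: frame i can start rendering once a back buffer is
    free, i.e. once frame i - back_buffers has been swapped on screen."""
    finish, display = [], []
    for i in range(frames):
        start = finish[i - 1] if i else 0.0          # renders run back to back
        if i >= back_buffers:                        # wait for a free buffer
            start = max(start, display[i - back_buffers] * INTERVAL_MS)
        finish.append(start + render_ms)
        tick = math.ceil(finish[i] / INTERVAL_MS)    # first Vsync after finish
        if i:
            tick = max(tick, display[i - 1] + 1)     # frames appear in order
        display.append(tick)
    return display

# 25 ms renders: over the 16.7 ms budget, so some Vsyncs must repeat a frame.
print(simulate(25, back_buffers=1, frames=6))  # [2, 4, 6, 8, 10, 12]
print(simulate(25, back_buffers=2, frames=6))  # [2, 3, 5, 6, 8, 9]
```

With double buffering every frame waits two Vsync ticks (every other refresh is a drop), while with triple buffering the rendering never stalls waiting for the swap, so two out of every three ticks show a new frame. Triple buffering does not eliminate drops when renders exceed the budget, it only stops the pipeline from wasting the idle gap.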

VI. The Nature of Screen Stutter

  • The direct cause of stutter when using a phone is frame loss.
  • As mentioned earlier, the screen refresh rate must be high enough for motion to look smooth. For an iPhone, the maximum screen refresh rate is 60 FPS, and 50 FPS is generally still a good experience.
  • However, if too many frames are dropped and the effective refresh rate falls too low, the experience becomes choppy.
  • With that in mind, we can roughly sum up:
    • The root cause of screen stutter: the CPU and GPU rendering pipeline takes too long, resulting in dropped frames.
    • The significance of Vsync and double buffering: force the screen refresh to be synchronized, solving screen tearing at the cost of dropped frames.
    • The significance of triple buffering: make reasonable use of CPU and GPU rendering capacity and reduce the number of dropped frames.

Further Reading

  • 01 – Computer Principles | Computer Graphics Rendering
  • 02 – Computer Principles | Mobile Screen Imaging and Stutter