OpenGL ES

  • OpenGL ES is a subset of OpenGL
  • It targets advanced 3D graphics applications on embedded and mobile devices, such as iOS, Android, and Windows
  • OpenGL ES is cross-platform and does not provide window-management methods

This article focuses on OpenGL ES on iOS, the OpenGL ES API, and the OpenGL ES Programming Guide

The following figure, from the official Apple documentation, shows OpenGL ES as a client-server architecture

The rendering work of OpenGL ES is divided between two parts: the CPU and the GPU

  • The CPU part
    • App code calls into the OpenGL ES framework through the OpenGL ES API
    • The OpenGL ES client schedules the OpenGL ES server, which transfers vertex data to the GPU
  • The GPU part: performs the graphics hardware work, such as rasterization and display

OpenGL ES Graphics pipeline

The OpenGL ES graphics pipeline is illustrated by the following two diagrams; the principle is the same, only the presentation differs. Figure 1:

  • The API takes the vertex data and copies it from memory into the vertex buffer (video memory)
  • Once retrieved, the data is passed to the vertex shader through the attribute channel, while texture coordinates are passed to the vertex shader and the fragment shader through the texture channel
  • Next comes primitive assembly, i.e., choosing how vertices are connected (points, lines, triangles and their strip/fan variants); this step turns vertices into primitives
  • Rasterization: determines which screen positions the primitives cover
  • Slice/fragment/pixel shader: computes the color value of each corresponding pixel
  • The color value of each pixel is stored in the framebuffer and shown on the display
  • API: vertex buffers, vertex shaders, texture coordinates, and fragment shaders can all be manipulated through the API
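As a minimal, hedged illustration of these programmable stages, the sketch below shows a pass-through vertex shader and a constant-color fragment shader; the attribute name is illustrative, not from the original:

```glsl
// Vertex shader: receives each vertex through the attribute channel
attribute vec4 position;

void main()
{
    // No transformation: hand the position straight to primitive assembly
    gl_Position = position;
}

// Fragment shader: runs once per rasterized fragment
precision mediump float;

void main()
{
    // Every covered pixel is written to the framebuffer as solid red
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```

The two shaders are compiled separately and linked into one program; together they cover the two programmable boxes in the pipeline diagram.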

Apple's official diagram

OpenGL ES as a Graphics Pipeline

  • App: provides the vertex information and picture information used for primitive assembly
  • Vertex shader: handles vertices, i.e., graphic transformations (rotation, scaling, translation)
  • Geometry (primitive assembly): primitive assembly + clipping (the parts beyond the screen are clipped)
  • Fragment: texturing + fog
  • Framebuffer operations: alpha blending, stencil and depth testing, and finally blending; these operations all happen in the framebuffer just before display

Vertex shader

Simply put, the vertex shader is a shader program that processes vertices, as shown in the figure below

  • Input: there are three ways
    • Vertex data passed through the attribute channel, providing data for each vertex
    • Uniform variables passed through the uniform channel, i.e., data used in the vertex/fragment shader that stays constant across a draw
    • Sampler: a special uniform variable type that represents the texture used by the vertex shader
  • Output: the final processed vertex data. There are two kinds
    • gl_Position: a GLSL built-in variable to which the final vertex position is assigned after processing
    • gl_PointSize: the size of a point; you can change the size of each point in the vertex shader, though this is rarely used

What the vertex shader handles

  • Matrix transformation of positions
  • Computing lighting formulas to generate a per-vertex color (this can also be done in the fragment shader)
  • Generating/transforming texture coordinates: attributes cannot be passed to the fragment shader directly, so texture coordinate attributes are passed to it indirectly via the vertex shader
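For the lighting item above, here is a minimal sketch of per-vertex diffuse (Lambert) lighting; the attribute and uniform names (`normal`, `mvpMatrix`, `normalMatrix`, `lightDir`) are assumptions for illustration, not from the original:

```glsl
// Hypothetical per-vertex diffuse lighting sketch (names are illustrative)
attribute vec4 position;
attribute vec3 normal;
uniform mat4 mvpMatrix;     // model-view-projection matrix
uniform mat3 normalMatrix;  // inverse-transpose of the model-view matrix
uniform vec3 lightDir;      // normalized light direction
varying lowp vec4 varyColor;

void main()
{
    vec3 n = normalize(normalMatrix * normal);
    // Lambert term: brightness falls off with the angle to the light
    float diffuse = max(dot(n, lightDir), 0.0);
    varyColor = vec4(vec3(diffuse), 1.0); // per-vertex gray diffuse color
    gl_Position = mvpMatrix * position;
}
```

The resulting `varyColor` is interpolated across the primitive and read by the fragment shader, which is exactly the attribute-bridging pattern described above.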

Vertex shader GLSL code example

  • attribute and uniform indicate channels between the client and the server
  • vec4 and vec2 are vector types, representing four-dimensional and two-dimensional vectors
  • mat4: a 4 × 4 matrix
  • varying is a qualifier: texture coordinates are passed to the fragment shader through a varying
  • lowp: low precision
  • Operations in main
    • Bridges the texture coordinates over to the fragment shader
    • Multiplies the vertex by the rotation matrix (in GLSL, vPos * rotateMatrix treats the vector as a row vector on the left) to get the rotated vertex position
    • Assigns the resulting vertex position to gl_Position
attribute vec4 position;
attribute vec2 textCoordinate; 
uniform mat4 rotateMatrix; 
varying lowp vec2 varyTextCoord; 
void main()
{
    varyTextCoord = textCoordinate;
    vec4 vPos = position;
    vPos = vPos * rotateMatrix;
    gl_Position = vPos; 
}

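To make the row-vector multiplication concrete, assume rotateMatrix rotates about the z-axis by an angle θ (an illustrative choice; the article does not fix the axis); then vPos * rotateMatrix computes:

$$
(x,\; y,\; z,\; 1)
\begin{pmatrix}
\cos\theta & \sin\theta & 0 & 0\\
-\sin\theta & \cos\theta & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{pmatrix}
= (x\cos\theta - y\sin\theta,\; x\sin\theta + y\cos\theta,\; z,\; 1)
$$

Each component of the result is the dot product of the row vector with one column of the matrix; the z and w components pass through unchanged.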

Primitive assembly and rasterization

  • Primitive assembly: assembles the vertex data into primitives
  • Rasterization: the process of converting primitives into a set of 2D fragments; the screen is 2D, so the converted fragments are also 2D

Fragment shader

The following figure shows the inputs and outputs of the fragment shader

  • Input: like the vertex shader, there are three ways
    • Texture coordinates passed on by the vertex shader
    • Uniform variables passed through the uniform channel, i.e., data used in the vertex/fragment shader that stays constant across a draw
    • Sampler: a special uniform variable type that represents the texture used by the fragment shader, e.g. a texture passed in through a sampler
  • Output: the color of the fragment after processing by the fragment shader

What the fragment shader handles

  • Computing the color
  • Fetching the texture value
  • Filling the pixel with the color value (texture value/color value)

GLSL code example for the fragment shader

  • varying: must match the declaration in the vertex shader exactly in order to receive the texture coordinates
  • sampler2D: the sampler type
  • texture2D(sampler, texture coordinate): fetches the color value at the given position/coordinate
  • gl_FragColor (built-in variable): the final color value is assigned to it
varying lowp vec2 varyTextCoord; 
uniform sampler2D colorMap;
void main()
{
    gl_FragColor = texture2D(colorMap, varyTextCoord);
}


Conclusion

  • Vertex shaders and fragment shaders are snippets of code, similar to functions/methods in iOS, that produce return values
    • The vertex shader's result is assigned to gl_Position
    • The fragment shader's result is assigned to gl_FragColor
  • These two return values are GLSL built-in variables and can be assigned to directly
    • gl_Position: the result of a vertex after the vertex shader's processing
    • gl_FragColor: the color of a fragment after the fragment shader's processing

Per-fragment operations

This stage is handled internally by the GPU, and the developer does not need to worry about it. The processed data is stored in the framebuffer, and finally the framebuffer is read to display the graphics on the screen

OpenGL ES applications

Image filter

  • Get every pixel of the picture
  • Apply saturation processing to each pixel
  • Obtain the new color
  • Write the new color into the framebuffer
  • Then display it
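The saturation step can be sketched as a fragment shader. This is an illustrative example, not the article's exact filter: the `saturation` uniform and the luminance weights are assumptions.

```glsl
// Hypothetical saturation-filter fragment shader (uniform names are illustrative)
precision mediump float;
varying lowp vec2 varyTextCoord;
uniform sampler2D colorMap;
uniform float saturation; // 0.0 = grayscale, 1.0 = original color

void main()
{
    vec4 color = texture2D(colorMap, varyTextCoord);
    // Perceptual luminance weights give the grayscale value
    float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
    // Blend between grayscale and the original color
    gl_FragColor = vec4(mix(vec3(gray), color.rgb, saturation), color.a);
}
```

Because this runs per fragment, changing the filter only means swapping the shader; the rest of the pipeline stays the same.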

The principle and processing of video filters is the same (GLSL code): video is also processed frame by frame, and each frame is an image

  • Get the video MP4 file
  • Get the h264 stream (the compressed video)
  • Decode (decompress) the video, restoring it to pictures frame by frame
  • Filter and display it frame by frame

EGL (Embedded Graphics Library)

  • OpenGL ES commands require a rendering context and a drawing surface to complete the drawing of graphics and images
  • Rendering context: stores the relevant OpenGL ES state; it is a state machine
  • Drawing surface: the surface used to draw primitives. You need to specify the buffers used for rendering, such as the color buffer, depth buffer, and stencil buffer
  • The OpenGL ES API does not specify how to create a rendering context or how the context is connected to the native windowing system. EGL is the interface between Khronos rendering APIs (such as OpenGL ES) and the native window system. The only platform that supports OpenGL ES but not EGL is iOS: Apple provides its own iOS implementation of the EGL API, called EAGL
  • Because each windowing system has a different definition, EGL provides a basic opaque type, EGLDisplay, which encapsulates all the system dependencies and is used to interface with the native windowing system