[Statement]

First of all, this series is based on my own understanding and practice. There may be mistakes, and corrections are welcome.

Secondly, this is an introductory series that covers only enough to get started; for deeper knowledge there are many blog posts online. Finally, in writing these articles I refer to articles shared by others, which are listed at the end; my thanks to those authors for sharing.

Writing is not easy; please credit the source when reposting!

Tutorial code: [portal]

Contents

1. Android audio and video hard decoding:
  • 1. Basic audio and video knowledge
  • 2. The audio and video hard decoding process: encapsulating a basic decoding framework
  • 3. Audio and video playback: audio and video synchronization
  • 4. Audio and video de-encapsulation and re-encapsulation: generating an MP4
2. Using OpenGL to render video frames:
  • 1. A first look at OpenGL ES
  • 2. Using OpenGL to render video images
  • 3. OpenGL rendering multiple videos, and picture-in-picture
  • 4. A closer look at OpenGL's EGL
  • 5. OpenGL FBO data buffers
  • 6. Android audio and video hard encoding: generating an MP4
3. Android FFmpeg audio and video decoding:
  • 1. Compiling the FFmpeg .so library
  • 2. Introducing FFmpeg into Android
  • 3. Android FFmpeg video decoding and playback
  • 4. Android FFmpeg + OpenSL ES audio decoding and playback
  • 5. Android FFmpeg + OpenGL ES video playback
  • 6. Android FFmpeg simple MP4 synthesis: video de-encapsulation and re-encapsulation
  • 7. Android FFmpeg video encoding

What you can learn from this article

This article mainly introduces the basics of OpenGL: the coordinate system, shaders, and the basic rendering process.

1. Introduction

When OpenGL comes up, I think many people will say: I know it can render 2D images and 3D models. But at the same time they will say: OpenGL is very hard and very advanced, and I don't know how to use it.

1. Why does OpenGL “feel hard”?

  • Its functions are numerous and scattered, and the rendering process is complex
  • The GLSL shader language is hard to understand
  • Its process-oriented programming model differs from the object-oriented mindset of Java and similar languages

2. What is OpenGL ES?

To solve these problems and make OpenGL "not so hard to learn," we need to break it down into simple steps; once the simple pieces are put together, everything falls into place.

First, let’s see what OpenGL is.

  • CPU and GPU

A phone contains two processors, the CPU and the GPU, and correspondingly there are two ways to render the graphical interface: with the CPU or with the GPU. GPU rendering is essentially a form of hardware acceleration.

The GPU can greatly improve rendering speed because it excels at parallel floating-point arithmetic, which lets it operate on a great many pixels at once.

OpenGL (Open Graphics Library) is a tool for operating the GPU indirectly. It is a defined cross-platform, cross-language graphics API: an underlying graphics library usable for 2D and 3D rendering, whose programming interface is concretely implemented by the various hardware vendors.

  • OpenGL and OpenGL ES

OpenGL ES (full name: OpenGL for Embedded Systems) is a subset of OpenGL designed for small devices such as phones and tablets. It removes unnecessary methods, data types, and functions, reducing size and improving efficiency.

  • OpenGL ES versions

The main versions currently are 1.0/1.1/2.0/3.0/3.1:

  • 1.0: Android 1.0 and later support this API specification
  • 2.0: incompatible with OpenGL ES 1.x. Android 2.2(API 8) and later support this API specification
  • 3.0: Backward compatible with OpenGL ES 2.x. Android 4.3(API 18) and later support this API specification
  • 3.1: Backward compatible with OpenGL ES 3.0/2.0. Android 5.0 (API 21) and later support this API specification

Version 2.0 is the most widely supported version on Android, and it is the version this series uses for the introduction and sample code.

2. OpenGL ES coordinate system

In audio and video development, two coordinate systems are mainly involved: world coordinates and texture coordinates.

Since 3D is basically not involved here, we only look at the x/y coordinates and ignore z. If you need 3D-related knowledge, you can look it up yourself; it is outside the scope of this discussion.

Let's start with two diagrams:

  • OpenGL ES World coordinates

As the name suggests, these are the coordinates of OpenGL's own world: a normalized coordinate system ranging from -1 to 1, with the origin at the center.

  • OpenGL ES texture coordinates

Texture coordinates are, in effect, screen coordinates. Standard texture coordinates have their origin at the bottom left of the screen, while Android's coordinates have their origin at the top left. This is one of the things you need to watch out for when using OpenGL on Android.

Texture coordinates range from 0 to 1.

Note: the xy orientation of the coordinate system is important in determining how vertex coordinates and texture coordinates are mapped.

So, what is the relationship between these two coordinate systems?

World coordinates are the display coordinates: where a pixel appears on screen is determined by its world coordinates.

Texture coordinates indicate, for the position given by the world coordinates, where on the texture to fetch the color. In other words, the source of the color is determined by the texture coordinates.

Correct mapping between the two is required to display a normal picture.
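To make the relationship between Android's screen origin and OpenGL's world origin concrete, here is a small sketch (my own illustration with a hypothetical helper name, not code from this tutorial) that converts an Android pixel position into OpenGL world coordinates:

```kotlin
// Hypothetical helper (not part of the tutorial project): map an Android
// pixel position (origin at the top left, y pointing down) to OpenGL world
// coordinates (origin at the center, range -1..1, y pointing up).
fun pixelToWorld(px: Float, py: Float, width: Float, height: Float): Pair<Float, Float> {
    val x = px / width * 2f - 1f   // 0..width maps to -1..1
    val y = 1f - py / height * 2f  // flip, because Android's y axis grows downward
    return Pair(x, y)
}
```

For example, the screen center (width/2, height/2) lands on the world origin (0, 0), and the top-left pixel (0, 0) lands on (-1, 1).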

3. The OpenGL shader language: GLSL

From OpenGL 2.0 onward, a programmable rendering pipeline was added, allowing much more flexible control over rendering. The cost is having to learn another programming language specific to the GPU, with a syntax similar to that of C: GLSL.

  • Vertex shaders & fragment shaders

Before introducing GLSL, let's look at two possibly unfamiliar terms: vertex shader and fragment shader.

A shader is a small program that runs on the GPU, written in GLSL. As the names suggest, vertex shaders are programs that operate on vertices, while fragment shaders are programs that operate on the color of pixels (fragments).

Roughly speaking, the vertex shader works with the world coordinate system and the fragment shader with the texture coordinate system.

The vertex shader runs once for each vertex, and the fragment shader once for each fragment (roughly, each pixel); these executions happen in parallel on the GPU, and the results are rendered to the screen.

  • GLSL programming

Below is a brief introduction to GLSL through the simplest possible vertex shader and fragment shader.

# Vertex shader

attribute vec4 aPosition;

void main() {
    gl_Position = aPosition;
}

# Fragment shader

void main() {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}

First of all, GLSL is a C-like language: the shader skeleton is basically the same as C, with variable declarations at the top followed by a main function. Shaders also have several built-in variables that can be used directly (only those commonly used in audio/video development are listed here; there are others for 3D development):

  • Vertex shader built-in output variables

gl_Position: the vertex position
gl_PointSize: the size of a point; defaults to 1 if not assigned

  • Fragment shader built-in output variable

gl_FragColor: the color of the current fragment

Look back at the shader code above.

1) In the vertex shader, a vec4 vertex coordinate (x, y, z, w) is passed in and assigned directly to the built-in variable gl_Position; that is, the vertex is rendered at its given position without any transformation.

Note: vertex coordinates are passed in from Java code, as discussed later; w is the homogeneous component and has no effect on 2D rendering.

2) In the fragment shader, gl_FragColor is assigned directly, again a vec4, representing an RGBA color value; here it is red.

vec4 is a 4-component vector, which can represent coordinates (x, y, z, w) or a color (r, g, b, a); there are also vec3, vec2, and so on. For details, refer to the article "The shader language GLSL", which is very detailed and worth reading.

With these two simple shaders chained together, every vertex (pixel) is drawn red, producing a red screen.

Specific GLSL data types and syntax won't be expanded on here; later articles will go deeper into GLSL code. For more details, see the article "The shader language GLSL".

4. Android OpenGL ES rendering process

To be honest, OpenGL's rendering process is fairly tedious, which is also what scares many people off. But when you get down to it, the whole rendering process is essentially fixed; as long as you encapsulate it according to that fixed flow, it is really not that complicated.

Next, let's move into practice and peel away OpenGL's mystery layer by layer.

1. Initialization

On Android, OpenGL is usually used together with GLSurfaceView, in which Google has already encapsulated the basic rendering flow.

OpenGL is a thread-bound state machine. All OpenGL-related operations, such as creating a texture ID, initialization, and rendering, must happen on the same thread; otherwise exceptions will occur.

Developers new to OpenGL often don't notice this mechanism, because Google has already handled it inside GLSurfaceView. It is nonetheless a very important aspect of OpenGL and will be explored further in a later article on EGL.

  1. Create a new page
class SimpleRenderActivity : AppCompatActivity() {
    // Custom OpenGL renderer, see below for details
    lateinit var drawer: IDrawer

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_simpler_render)

        drawer = if (intent.getIntExtra("type", 0) == 0) {
            TriangleDrawer()
        } else {
            BitmapDrawer(BitmapFactory.decodeResource(resources, R.drawable.cover))
        }
        initRender(drawer)
    }

    private fun initRender(drawer: IDrawer) {
        gl_surface.setEGLContextClientVersion(2)
        gl_surface.setRenderer(SimpleRender(drawer))
    }

    override fun onDestroy() {
        drawer.release()
        super.onDestroy()
    }
}

      
<android.support.constraint.ConstraintLayout 
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">
    <android.opengl.GLSurfaceView
            android:id="@+id/gl_surface"
            android:layout_width="match_parent"
            android:layout_height="match_parent"/>
</android.support.constraint.ConstraintLayout>

The page is very simple: a full-screen GLSurfaceView, initialized to OpenGL ES version 2.0, with a SimpleRender, which implements GLSurfaceView.Renderer, set as its renderer.

IDrawer will be explained in detail when we get to drawing triangles; the interface exists only for extensibility. You could also write the rendering code directly in SimpleRender.

  2. Implement the Renderer interface
class SimpleRender(private val mDrawer: IDrawer): GLSurfaceView.Renderer {

    override fun onSurfaceCreated(gl: GL10?, config: EGLConfig?) {
        GLES20.glClearColor(0f, 0f, 0f, 0f)
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT)
        mDrawer.setTextureID(OpenGLTools.createTextureIds(1)[0])
    }

    override fun onSurfaceChanged(gl: GL10?, width: Int, height: Int) {
        GLES20.glViewport(0, 0, width, height)
    }

    override fun onDrawFrame(gl: GL10?) {
        mDrawer.draw()
    }
}

Notice the three callback methods implemented here. They are the hooks that Google's encapsulated flow exposes to the developer for initialization and rendering, and all three are called on the same (GL) thread.

  • onSurfaceCreated calls two lines of OpenGL ES code to clear the screen to black:
GLES20.glClearColor(0f, 0f, 0f, 0f)
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT)

At the same time, a texture ID is created and set to the Drawer as follows:

fun createTextureIds(count: Int): IntArray {
    val texture = IntArray(count)
    GLES20.glGenTextures(count, texture, 0) // Generate a texture
    return texture
}
  • In onSurfaceChanged, glViewport is called to set the position and size of the region OpenGL draws into

The drawing area here is OpenGL's drawing area within the GLSurfaceView, which usually covers the view completely.

GLES20.glViewport(0, 0, width, height)
  • onDrawFrame is where the actual drawing happens. This callback is invoked continuously to refresh the drawing area. A simple triangle is used below to illustrate the whole drawing process.
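A side note on the glViewport step above: the viewport is also the simplest place to deal with stretching when the content's aspect ratio differs from the surface's. Here is a sketch of the idea (my own illustration, not the tutorial's approach; the tutorial handles distortion with vertex-coordinate transformations in a later article):

```kotlin
// Sketch (not from the tutorial source): a centered viewport that keeps the
// aspect ratio of content sized srcW x srcH inside a surface of dstW x dstH.
// The result could be passed to GLES20.glViewport(x, y, width, height).
data class Viewport(val x: Int, val y: Int, val width: Int, val height: Int)

fun fitViewport(srcW: Int, srcH: Int, dstW: Int, dstH: Int): Viewport {
    val scale = minOf(dstW.toFloat() / srcW, dstH.toFloat() / srcH)
    val w = (srcW * scale).toInt()
    val h = (srcH * scale).toInt()
    // Center the scaled content inside the surface
    return Viewport((dstW - w) / 2, (dstH - h) / 2, w, h)
}
```

Passing the result to glViewport letterboxes the content instead of stretching it across the whole surface.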
2. Render a simple triangle

Start by defining a render interface class:

interface IDrawer {
    fun draw()
    fun setTextureID(id: Int)
    fun release()
}
class TriangleDrawer : IDrawer {
    // Vertex coordinates
    private val mVertexCoors = floatArrayOf(
        -1f, -1f,
        1f, -1f,
        0f, 1f
    )

    // Texture coordinates
    private val mTextureCoors = floatArrayOf(
        0f, 1f,
        1f, 1f,
        0.5f, 0f
    )

    // Texture ID
    private var mTextureId: Int = -1

    // OpenGL program ID
    private var mProgram: Int = -1

    // Handle for the vertex coordinates
    private var mVertexPosHandler: Int = -1
    // Handle for the texture coordinates
    private var mTexturePosHandler: Int = -1

    private lateinit var mVertexBuffer: FloatBuffer
    private lateinit var mTextureBuffer: FloatBuffer

    init {
        // [Step 1: initialize the coordinates]
        initPos()
    }

    private fun initPos() {
        val bb = ByteBuffer.allocateDirect(mVertexCoors.size * 4)
        bb.order(ByteOrder.nativeOrder())
        // Convert the coordinate data to a FloatBuffer, which is passed to the OpenGL ES program
        mVertexBuffer = bb.asFloatBuffer()
        mVertexBuffer.put(mVertexCoors)
        mVertexBuffer.position(0)

        val cc = ByteBuffer.allocateDirect(mTextureCoors.size * 4)
        cc.order(ByteOrder.nativeOrder())
        mTextureBuffer = cc.asFloatBuffer()
        mTextureBuffer.put(mTextureCoors)
        mTextureBuffer.position(0)
    }

    override fun setTextureID(id: Int) {
        mTextureId = id
    }

    override fun draw() {
        if (mTextureId != -1) {
            // [Step 2: create, compile and start the OpenGL shaders]
            createGLPrg()
            // [Step 3: start rendering]
            doDraw()
        }
    }

    private fun createGLPrg() {
        if (mProgram == -1) {
            val vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, getVertexShader())
            val fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, getFragmentShader())

            // Create the OpenGL ES program. Note: it must be created on the OpenGL render thread, otherwise nothing renders
            mProgram = GLES20.glCreateProgram()
            // Attach the vertex shader to the program
            GLES20.glAttachShader(mProgram, vertexShader)
            // Attach the fragment shader to the program
            GLES20.glAttachShader(mProgram, fragmentShader)
            // Link the shader program
            GLES20.glLinkProgram(mProgram)

            mVertexPosHandler = GLES20.glGetAttribLocation(mProgram, "aPosition")
            mTexturePosHandler = GLES20.glGetAttribLocation(mProgram, "aCoordinate")
        }
        // Use the OpenGL program
        GLES20.glUseProgram(mProgram)
    }

    private fun doDraw() {
        // Enable the vertex handles
        GLES20.glEnableVertexAttribArray(mVertexPosHandler)
        GLES20.glEnableVertexAttribArray(mTexturePosHandler)
        // Set the shader parameters
        GLES20.glVertexAttribPointer(mVertexPosHandler, 2, GLES20.GL_FLOAT, false, 0, mVertexBuffer)
        GLES20.glVertexAttribPointer(mTexturePosHandler, 2, GLES20.GL_FLOAT, false, 0, mTextureBuffer)
        // Start drawing
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 3)
    }

    override fun release() {
        GLES20.glDisableVertexAttribArray(mVertexPosHandler)
        GLES20.glDisableVertexAttribArray(mTexturePosHandler)
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0)
        GLES20.glDeleteTextures(1, intArrayOf(mTextureId), 0)
        GLES20.glDeleteProgram(mProgram)
    }

    private fun getVertexShader(): String {
        return "attribute vec4 aPosition;" +
                "void main() {" +
                "  gl_Position = aPosition;" +
                "}"
    }

    private fun getFragmentShader(): String {
        return "precision mediump float;" +
                "void main() {" +
                "  gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);" +
                "}"
    }

    private fun loadShader(type: Int, shaderCode: String): Int {
        // Create a vertex shader or fragment shader based on type
        val shader = GLES20.glCreateShader(type)
        // Attach the source to the shader and compile it
        GLES20.glShaderSource(shader, shaderCode)
        GLES20.glCompileShader(shader)

        return shader
    }
}

Even for a simple triangle, the code looks fairly complicated. Breaking it down into three steps makes it a little clearer.

1) Initialize vertex coordinates

Earlier we talked about OpenGL's world coordinates and texture coordinates. Both sets of coordinates need to be determined before drawing.

[Important Note]

What was left unsaid earlier is that all OpenGL ES shapes are made up of triangles. A quadrilateral, for example, consists of two triangles, and more complex shapes can likewise be decomposed into triangles of various sizes.

Therefore, vertex coordinates are also specified according to how the triangles connect. There are three drawing modes:

  • GL_TRIANGLES: each triangle uses its own independent vertices

  • GL_TRIANGLE_STRIP: consecutive vertices are reused to form triangles

  • GL_TRIANGLE_FAN: the first vertex is shared by all triangles

In general, the GL_TRIANGLE_STRIP mode is used, so the vertex order for a quadrilateral looks like (v1, v2, v3)(v2, v3, v4).

The texture coordinates must follow the same order as the vertex coordinates; otherwise the image will be flipped, distorted, or otherwise wrong.
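The reuse rule can be stated precisely: given n vertices, GL_TRIANGLE_STRIP forms one triangle from every three consecutive vertices. A small sketch of that expansion (illustration only, not tutorial code):

```kotlin
// Which vertex triples GL_TRIANGLE_STRIP forms from n vertices:
// triangle i uses vertices (i, i+1, i+2).
fun stripTriangles(vertexCount: Int): List<Triple<Int, Int, Int>> =
    (0..vertexCount - 3).map { i -> Triple(i, i + 1, i + 2) }
```

For a quad's four vertices this yields (0, 1, 2) and (1, 2, 3), matching the (v1, v2, v3)(v2, v3, v4) order above.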

Since we are drawing a triangle, the two coordinate sets are as follows (only x/y are set here, ignoring z; every two values form one coordinate point):

// Vertex coordinates
private val mVertexCoors = floatArrayOf(
    -1f, -1f,
    1f, -1f,
    0f, 1f
)
// Texture coordinates
private val mTextureCoors = floatArrayOf(
    0f, 1f,
    1f, 1f,
    0.5f, 0f
)

In the initPos method, the arrays are converted to a ByteBuffer/FloatBuffer, because the native layer cannot receive a Java float array directly.
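The conversion that initPos performs can be captured in one small reusable helper; here is a sketch of the same logic (the helper name is my own):

```kotlin
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.FloatBuffer

// The same conversion initPos performs: wrap a FloatArray in a direct,
// native-ordered FloatBuffer that OpenGL ES can read.
fun toFloatBuffer(data: FloatArray): FloatBuffer {
    val buffer = ByteBuffer.allocateDirect(data.size * 4) // 4 bytes per float
        .order(ByteOrder.nativeOrder())
        .asFloatBuffer()
    buffer.put(data)
    buffer.position(0) // rewind so reads start at the first value
    return buffer
}
```

The allocateDirect plus nativeOrder combination matters: OpenGL reads the buffer from native code, so it must be a direct buffer in the platform's byte order.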

2) Create, compile, and start OpenGL shaders

private fun createGLPrg() {
    if (mProgram == -1) {
        val vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, getVertexShader())
        val fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, getFragmentShader())

        // Create the OpenGL ES program. Note: it must be created on the OpenGL render thread, otherwise nothing renders
        mProgram = GLES20.glCreateProgram()
        // Attach the vertex shader to the program
        GLES20.glAttachShader(mProgram, vertexShader)
        // Attach the fragment shader to the program
        GLES20.glAttachShader(mProgram, fragmentShader)
        // Link the shader program
        GLES20.glLinkProgram(mProgram)

        mVertexPosHandler = GLES20.glGetAttribLocation(mProgram, "aPosition")
        mTexturePosHandler = GLES20.glGetAttribLocation(mProgram, "aCoordinate")
    }
    // Use the OpenGL program
    GLES20.glUseProgram(mProgram)
}

private fun getVertexShader(): String {
    return "attribute vec4 aPosition;" +
            "void main() {" +
            "  gl_Position = aPosition;" +
            "}"
}

private fun getFragmentShader(): String {
    return "precision mediump float;" +
            "void main() {" +
            "  gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);" +
            "}"
}

private fun loadShader(type: Int, shaderCode: String): Int {
    // Create a vertex shader or fragment shader based on type
    val shader = GLES20.glCreateShader(type)
    // Attach the source to the shader and compile it
    GLES20.glShaderSource(shader, shaderCode)
    GLES20.glCompileShader(shader)

    return shader
}

As mentioned above, GLSL is a programming language for GPUs, and a shader is a small program. To run this small program, the shader must be compiled, attached, and linked before use.

The shader in this example is the simplest shader mentioned above.

As you can see, a shader here is just a string.

In loadShader, GLES20.glCreateShader creates either a vertex shader or a fragment shader, depending on type.

Then the following calls compile the shader:

GLES20.glShaderSource(shader, shaderCode)
GLES20.glCompileShader(shader)

Once the shaders are compiled, they are attached and linked into the program, which is then started with glUseProgram.
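One thing the loadShader above omits: it never checks whether compilation actually succeeded, and a shader that fails to compile simply renders nothing. A defensive variant might look like this sketch (my own addition, not the tutorial's code; glGetShaderiv and glGetShaderInfoLog are standard GLES20 calls, and this must run on the GL thread with a current context):

```kotlin
import android.opengl.GLES20

// Sketch: load a shader and verify compilation, surfacing the GLSL error on failure.
fun loadShaderChecked(type: Int, shaderCode: String): Int {
    val shader = GLES20.glCreateShader(type)
    GLES20.glShaderSource(shader, shaderCode)
    GLES20.glCompileShader(shader)

    // Ask OpenGL whether compilation succeeded
    val status = IntArray(1)
    GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, status, 0)
    if (status[0] == 0) {
        val log = GLES20.glGetShaderInfoLog(shader) // the GLSL compiler's error message
        GLES20.glDeleteShader(shader)
        throw RuntimeException("Shader compile failed: $log")
    }
    return shader
}
```

If the GLSL string has a typo, the thrown message contains the compiler's error text, which is much easier to debug than a blank screen.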

Remember that coordinates in shaders are passed from Java to GLSL?

If you look carefully, you may have noticed these two lines of code:

mVertexPosHandler = GLES20.glGetAttribLocation(mProgram, "aPosition")
mTexturePosHandler = GLES20.glGetAttribLocation(mProgram, "aCoordinate")

Yes, this is how Java and GLSL interact: Java looks up handles to the shader's variables by name and uses them to set values for GLSL.

3) Start rendering

private fun doDraw() {
    // Enable the vertex handles
    GLES20.glEnableVertexAttribArray(mVertexPosHandler)
    GLES20.glEnableVertexAttribArray(mTexturePosHandler)
    // Set the shader parameters. The second argument is the number of values per vertex: x and y, so 2
    GLES20.glVertexAttribPointer(mVertexPosHandler, 2, GLES20.GL_FLOAT, false, 0, mVertexBuffer)
    GLES20.glVertexAttribPointer(mTexturePosHandler, 2, GLES20.GL_FLOAT, false, 0, mTextureBuffer)
    // Start drawing
    GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 3)
}

First activate the shader's vertex coordinate and texture coordinate attributes, then pass the initialized coordinates to the shader, and finally start drawing:

GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 3)

There are two drawing methods, glDrawArrays and glDrawElements. The difference: glDrawArrays draws directly using the defined vertex order, while glDrawElements requires an additional index array that determines how the vertices are combined and in what order they are drawn.
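To make the glDrawElements difference concrete, here is a sketch (my own illustration; the quad and the index values are assumptions) of the extra index buffer it requires:

```kotlin
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Sketch: the index buffer glDrawElements needs. Indices name vertices
// explicitly, so a quad is two triangles sharing vertices 0 and 2.
val indices = shortArrayOf(0, 1, 2, 0, 2, 3)
val indexBuffer = ByteBuffer.allocateDirect(indices.size * 2) // 2 bytes per short
    .order(ByteOrder.nativeOrder())
    .asShortBuffer()
    .apply { put(indices); position(0) }

// With a current GL context, drawing would then be (GLES20 call, left commented here):
// GLES20.glDrawElements(GLES20.GL_TRIANGLES, indices.size,
//                       GLES20.GL_UNSIGNED_SHORT, indexBuffer)
```

With GL_TRIANGLES, glDrawArrays would instead need the shared vertices duplicated in the vertex array; the index buffer avoids that duplication.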

Through the above steps, you can see a red triangle on the screen.

Drawing the triangle only set a pixel color directly and never used the texture, so what is a texture actually for?

Next, display an image with a texture and see how it works.

It is recommended to look at the triangle-drawing process first; drawing a picture builds on that process, and the repeated code won't be posted again.

3. Texture mapping: display a picture

Only the code that differs from triangle drawing is posted below; see the source code for the full version.

class BitmapDrawer(private val mBitmap: Bitmap): IDrawer {
    //------- [Note 1: the coordinates now form a quadrilateral] -------
    // Vertex coordinates
    private val mVertexCoors = floatArrayOf(
        -1f, -1f,
        1f, -1f,
        -1f, 1f,
        1f, 1f
    )

    // Texture coordinates
    private val mTextureCoors = floatArrayOf(
        0f, 1f,
        1f, 1f,
        0f, 0f,
        1f, 0f
    )

    //------- [Note 2: a new texture handle] -------
    // Texture handle
    private var mTextureHandler: Int = -1

    override fun draw() {
        if (mTextureId != -1) {
            // [Step 2: create, compile and start the OpenGL shaders]
            createGLPrg()
            //------- [Note 4: two new steps] -------
            // [Step 3: activate and bind the texture unit]
            activateTexture()
            // [Step 4: bind the image to the texture unit]
            bindBitmapToTexture()
            //----------------------------------
            // [Step 5: start rendering]
            doDraw()
        }
    }

    private fun createGLPrg() {
        if (mProgram == -1) {
            // Omitted: same as for the triangle
            // ...

            mVertexPosHandler = GLES20.glGetAttribLocation(mProgram, "aPosition")
            mTexturePosHandler = GLES20.glGetAttribLocation(mProgram, "aCoordinate")
            // [Note 3: get the texture handle]
            mTextureHandler = GLES20.glGetUniformLocation(mProgram, "uTexture")
        }
        // Use the OpenGL program
        GLES20.glUseProgram(mProgram)
    }

    private fun activateTexture() {
        // Activate the specified texture unit
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0)
        // Bind the texture ID to the texture unit
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureId)
        // Pass the active texture unit to the shader
        GLES20.glUniform1i(mTextureHandler, 0)
        // Configure the texture filtering mode
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR.toFloat())
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR.toFloat())
        // Configure the texture wrapping mode
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE)
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE)
    }

    private fun bindBitmapToTexture() {
        if (!mBitmap.isRecycled) {
            // Bind the image to the active texture unit
            GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, mBitmap, 0)
        }
    }

    private fun doDraw() {
        // Omitted: same as for the triangle
        // ...

        // Start drawing: the last argument changes the vertex count to 4
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4)
    }

    private fun getVertexShader(): String {
        return "attribute vec4 aPosition;" +
                "attribute vec2 aCoordinate;" +
                "varying vec2 vCoordinate;" +
                "void main() {" +
                "  gl_Position = aPosition;" +
                "  vCoordinate = aCoordinate;" +
                "}"
    }

    private fun getFragmentShader(): String {
        return "precision mediump float;" +
                "uniform sampler2D uTexture;" +
                "varying vec2 vCoordinate;" +
                "void main() {" +
                "  vec4 color = texture2D(uTexture, vCoordinate);" +
                "  gl_FragColor = color;" +
                "}"
    }

    // Omitted: the parts consistent with triangle drawing
    // ...
}

The places that differ from the triangle version are marked in the code (see the notes). Let's look at each one:

1) Vertex coordinates

The vertex coordinates and texture coordinates go from 3 points to 4, forming a rectangle. The combination mode is still GL_TRIANGLE_STRIP.

2) Shaders

Let’s start with qualifiers in GLSL

  • attribute: generally used for quantities that differ from vertex to vertex, such as vertex colors and coordinates.
  • uniform: generally used for quantities that are the same for every vertex of a 3D object, such as the light source position or a uniform transformation matrix.
  • varying: used for quantities that the vertex shader passes on to the fragment shader.
  • const: a constant.

The shader code, annotated line by line:

private fun getVertexShader(): String {
    return  // Vertex coordinates
            "attribute vec4 aPosition;" +
            // Texture coordinates
            "attribute vec2 aCoordinate;" +
            // Passes the texture coordinates on to the fragment shader; the name must match the one used there
            "varying vec2 vCoordinate;" +
            "void main() {" +
            "  gl_Position = aPosition;" +
            "  vCoordinate = aCoordinate;" +
            "}"
}

private fun getFragmentShader(): String {
    return  // Float precision: lowp (low) / mediump (medium) / highp (high)
            "precision mediump float;" +
            // The texture unit passed in from Java
            "uniform sampler2D uTexture;" +
            // The texture coordinates passed in from the vertex shader
            "varying vec2 vCoordinate;" +
            "void main() {" +
            // Fetch the color from the texture unit at the given texture coordinates
            "  vec4 color = texture2D(uTexture, vCoordinate);" +
            "  gl_FragColor = color;" +
            "}"
}

Two new steps have been added to the drawing process:

3) Activate and bind texture units

private fun activateTexture() {
    // Activate the specified texture unit
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0)
    // Bind the texture ID to the texture unit
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureId)
    // Pass the active texture unit to the shader
    GLES20.glUniform1i(mTextureHandler, 0)
    // Configure the texture filtering mode
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR.toFloat())
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR.toFloat())
    // Configure the texture wrapping mode
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE)
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE)
}

Since displaying an image requires a texture unit to carry the image content to the shader, the first step is to activate a texture unit.

What are these texture units?

OpenGL provides a series of texture units: GLES20.GL_TEXTURE0, GLES20.GL_TEXTURE1, GLES20.GL_TEXTURE2, and so on. GLES20.GL_TEXTURE0 is the default, and OpenGL activates the first texture unit by default. The constants are consecutive: GLES20.GL_TEXTURE1 == GLES20.GL_TEXTURE0 + 1, and so on.

After activating the specified texture unit and binding it to the texture ID, the unit is passed to the shader with GLES20.glUniform1i(mTextureHandler, 0). The second argument must match the index of the activated texture unit (0 for GL_TEXTURE0).

Note the correspondence between the Java calls and the GLSL qualifiers: GLES20.glUniform1i sets a uniform variable, GLES20.glGetAttribLocation retrieves an attribute variable, and so on.

The last four lines of code configure the texture filtering mode and the texture wrapping mode (the introductions to these two modes below are adapted from LearnOpenGL-CN).

  • Texture filter mode

Texture coordinates are resolution-independent and can be any floating-point value, so OpenGL needs to know how to map texture pixels (texels) onto texture coordinates.

GL_NEAREST and GL_LINEAR are generally used.

When set to GL_NEAREST, OpenGL selects the texel whose center is closest to the texture coordinate.

When set to GL_LINEAR, OpenGL computes an interpolation from the texels near the texture coordinate, approximating a color between those texels.
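The behavior of the two modes can be imitated in one dimension with plain code (a toy illustration of the idea, not OpenGL's actual implementation):

```kotlin
// Toy 1-D versions of the two filter modes over a row of texel values.
// coord is a texture coordinate in 0..1.
fun nearestSample(texels: FloatArray, coord: Float): Float {
    val pos = coord * (texels.size - 1)
    return texels[Math.round(pos)] // just pick the closest texel
}

fun linearSample(texels: FloatArray, coord: Float): Float {
    val pos = coord * (texels.size - 1)
    val i = pos.toInt().coerceAtMost(texels.size - 2)
    val t = pos - i
    return texels[i] * (1 - t) + texels[i + 1] * t // blend the two neighbors by distance
}
```

Sampling halfway between a black texel and a white one, GL_NEAREST snaps to one of them while GL_LINEAR returns the gray in between.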

  • Texture wrapping mode

Wrap mode | Description
GL_REPEAT | The default behavior for textures: the texture image repeats.
GL_MIRRORED_REPEAT | The same as GL_REPEAT, but each repetition is mirrored.
GL_CLAMP_TO_EDGE | Texture coordinates are clamped between 0 and 1; the out-of-range part repeats the texture's edge, producing a stretched-edge effect.
GL_CLAMP_TO_BORDER | Out-of-range coordinates receive a user-specified border color.

4) Bind image to texture unit

Once the texture unit is activated, the bitmap can be bound to the specified texture unit by calling the texImage2D method.

GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, mBitmap, 0)

5) Draw

When drawing, the last argument changes from the triangle's 3 vertices to the rectangle's 4. If you pass in 3 instead, you will see only half the image: a triangle, as if the image were cut along the diagonal.

GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4)

At this point, an image is displayed through a texture map.

Of course, you will notice that the image is distorted, stretched to cover the whole GLSurfaceView window. This involves transforming the vertex coordinates, which will be explained in the next article.

5. Summary

With the simple triangle drawing and texture mapping above, we can summarize the 2D drawing flow of OpenGL ES on Android:

  1. Configure OpenGL ES through GLSurfaceView and set a Renderer
  2. Implement GLSurfaceView.Renderer, override the exposed callbacks, configure the OpenGL display window, and clear the screen
  3. Create a texture ID
  4. Configure the vertex coordinates and texture coordinates
  5. Initialize the coordinate transformation matrix
  6. Initialize the OpenGL program; compile and link the vertex and fragment shaders, and get handles to the variables in the GLSL
  7. Activate the texture unit, bind the texture ID, and configure the texture filtering and wrapping modes
  8. Bind the texture (e.g. bind a bitmap to the texture)
  9. Start drawing

This is the general process. Rendering a video differs only slightly from rendering an image, and step 5 will be covered in the next article.

6. References

Understanding OpenGL ES 2.0

The shader language GLSL

LearnOpenGL-CN