An overview

This is a new series on learning OpenGL ES. It is really my study notes for the book "OpenGL ES Application Development Practice Guide (Android volume)"; if you are interested, you can read the book directly. Of course, I will record my own understanding here. What follows is only notes, in case I forget later.

The first nine chapters of the book will be covered in turn:

Android OpenGL ES learning (1): Create an OpenGL ES program

Android OpenGL ES learning (2): Define vertices and shaders

Android OpenGL ES learning (3): Build shaders

Android OpenGL ES learning (4): Add color

Android OpenGL ES learning (5): Adjust the aspect ratio

Android OpenGL ES learning (6): Enter 3D

Android OpenGL ES learning (7): Use textures

Android OpenGL ES learning (8): Build simple objects

Android OpenGL ES learning (9): Add touch feedback

The end result is a simple air hockey game, something like this:

Understanding textures

In OpenGL, textures can be used to represent images, photographs, and even data generated by an image algorithm. Each 2D texture is composed of many small texture elements (texels); these are small pieces of data, similar to the fragments and pixels discussed earlier. The most common usage is to load a texture directly with the data of an image.

Each 2D texture has its own coordinate space, ranging from (0, 0) at one corner to (1, 1) at the other. By convention, one dimension is called S and the other is called T. When we want to apply a texture to a triangle or a set of triangles, we specify a pair of ST texture coordinates for each vertex so that OpenGL knows which part of the texture to draw onto each triangle. These texture coordinates are also sometimes referred to as UV coordinates.

An OpenGL texture has no inherent orientation, so we can use different coordinates to orient it any way we like. However, most computer image formats do have a default orientation: they are usually defined with the y-axis pointing downward, so y increases as you move toward the bottom of the image. We need to remember this and take it into account if we want the image to appear the right way up.

In OpenGL ES 2.0, textures do not have to be square, but each dimension should be a power of two (POT). The reason for this is that non-POT textures can be used only in very limited situations, whereas POT textures can be used in all situations.
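The POT check itself is simple bit arithmetic. Below is a minimal sketch (a hypothetical helper of my own, not from the book) for validating a bitmap's dimensions before uploading it as a texture:

```java
// Hypothetical helper (not part of the book's code): check texture dimensions.
final class TextureUtils {
    // A positive integer is a power of two iff exactly one bit is set,
    // i.e. (n & (n - 1)) == 0.
    static boolean isPowerOfTwo(int n) {
        return n > 0 && (n & (n - 1)) == 0;
    }

    // True when both dimensions are POT, e.g. 256 x 512.
    static boolean isPotTexture(int width, int height) {
        return isPowerOfTwo(width) && isPowerOfTwo(height);
    }
}
```

For example, isPotTexture(512, 1024) is true, while isPotTexture(512, 600) is false.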

There is also a maximum texture size, which varies from implementation to implementation, but it is usually large, such as 2048 × 2048.

Load the texture into OpenGL

public static int loadTexture(Context context, int resourceId) {
        int[] textureObjectId = new int[1];
        GLES20.glGenTextures(1, textureObjectId, 0);
        if (textureObjectId[0] == 0) {
            Log.d("mmm", "Texture creation failed");
            return 0;
        }

        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inScaled = false;

        Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), resourceId, options);
        if (bitmap == null) {
            Log.d("mmm", "Loaded image is empty");
            GLES20.glDeleteTextures(1, textureObjectId, 0);
            return 0;
        }

        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureObjectId[0]);

        return textureObjectId[0];
    }
Method reference:

- GLES20.glGenTextures(1, textureObjectId, 0): creates a texture object. OpenGL stores the generated texture ID in textureObjectId; a value of 0 indicates failure, any nonzero value indicates success.
- GLES20.glDeleteTextures(1, textureObjectId, 0): deletes the created texture object.
- GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureObjectId[0]): binds the texture. The first parameter tells OpenGL to treat this as a 2D texture; the second is the texture object ID to bind to.

Walking through the code above:

First we call glGenTextures to create a texture object; if it returns 0 the call failed, otherwise it succeeded. Setting options.inScaled = false tells Android that we want the original image data rather than a scaled version of the image. We then call BitmapFactory.decodeResource to load the image resource. If loading the image fails, we call GLES20.glDeleteTextures to delete the texture object; if the image loads successfully, we call GLES20.glBindTexture to bind the texture object.

Understanding texture filtering

We also need texture filtering for when a texture is drawn larger or smaller than its actual size. When we render a surface with a texture on it, the texels may not map exactly onto OpenGL fragments. There are two cases: minification and magnification. Minification happens when we try to cram several texels into one fragment; magnification happens when we stretch one texel across many fragments. For each case we can configure OpenGL to use a texture filter.

First, let's introduce the two basic filtering modes: nearest-neighbor filtering and bilinear filtering.

Nearest neighbor filtering

When we magnify the texture, the jagged effect looks obvious: each texel clearly shows up as a small square. When we minify the texture, many details are lost, because there are not enough fragments to draw all of the texels, as shown below:

Bilinear filtering

Bilinear filtering uses bilinear interpolation to smooth the transitions between pixels. Instead of using the single nearest texel for each fragment, OpenGL uses the four nearest texels and blends them with linear interpolation. It is called bilinear because the interpolation is done along two dimensions. Below is an image magnified with bilinear filtering:
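To make the idea concrete, here is a rough CPU-side sketch of the interpolation itself (my own illustration of the math, not OpenGL's actual implementation; c00..c11 stand for the intensities of the four nearest texels):

```java
// Illustration only: bilinear interpolation between four texel values.
final class Bilinear {
    static float lerp(float a, float b, float t) {
        return a + (b - a) * t;
    }

    // s and t in [0, 1] give the sample position between the four texels:
    // first interpolate along S within each row, then along T between rows.
    static float sample(float c00, float c10, float c01, float c11,
                        float s, float t) {
        float top = lerp(c00, c10, s);
        float bottom = lerp(c01, c11, s);
        return lerp(top, bottom, t);
    }
}
```

Sampling exactly halfway between a black texel (0) and a white texel (1) yields 0.5, which is why the magnified image looks smooth instead of blocky.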

This texture looks much smoother than the one above

MIP map

Although bilinear filtering works well for magnification, it does not do well once a texture is minified beyond a certain size. The smaller the texture appears on the rendered surface, the more texels get crowded into each fragment. Because bilinear filtering uses only four texels per fragment, we lose a lot of detail, and because each frame may select different texels, moving objects also flicker.

To overcome these limitations, we can use the MIP mapping technique, which generates a set of optimized textures at different sizes. When generating this set of textures, OpenGL can use all of the texels to build each level, making sure every texel gets used when the texture is filtered. During rendering, OpenGL selects the most appropriate level for each fragment based on the number of texels it covers, as shown below:

Using MIP maps takes up more memory, but rendering can also be faster, because the smaller levels take up less space in the GPU's texture cache. As shown above, with MIP mapping OpenGL selects the most appropriate texture level and then applies bilinear filtering to that optimized texture. Since each level is built from the information of all of the texels, the resulting image looks better and retains more detail.
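The memory cost is easy to estimate: each level halves both dimensions until 1 × 1, so a texture has floor(log2(max(w, h))) + 1 levels, and the whole chain adds roughly one third on top of the base level. A small sketch of that arithmetic (my own, not an OpenGL call):

```java
// Illustration only: number of levels in a MIP map chain.
final class MipMath {
    // Halve the larger dimension until it reaches 1, counting levels.
    static int levelCount(int width, int height) {
        int n = Math.max(width, height);
        int levels = 1;
        while (n > 1) {
            n >>= 1;
            levels++;
        }
        return levels;
    }
}
```

A 512 × 512 texture has 10 levels (512, 256, ..., 1).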

Trilinear filtering

When we use bilinear filtering with MIP maps, OpenGL switches back and forth between different MIP map levels, and we can sometimes see obvious jumps or lines in the rendered scene where the levels change. We can switch to trilinear filtering, which tells OpenGL to also interpolate between the two nearest MIP map levels; this helps eliminate the transition between levels and produces a smoother image.

Set the default texture filter parameters

First, the code:

        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR_MIPMAP_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

We set each filter with a glTexParameteri call. GL_TEXTURE_MIN_FILTER is for minification and GL_TEXTURE_MAG_FILTER is for magnification. For minification we choose GL_LINEAR_MIPMAP_LINEAR, which tells OpenGL to use trilinear filtering; for magnification we choose GL_LINEAR, which tells OpenGL to use bilinear filtering.

Load the image data into OpenGL

   GLUtils.texImage2D(GLES20.GL_TEXTURE_2D,0,bitmap,0);

This call tells OpenGL to read the bitmap's image data and copy it into the currently bound texture object.

Now that the data has been loaded into OpenGL, we no longer need to hold on to the bitmap. Normally it would take Dalvik several garbage collection cycles to release this bitmap, so we call bitmap.recycle() to release it immediately.

Generate MIP maps

Generating the MIP maps is relatively easy; just call the following method:

   GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D);

Unbind texture

Now that we have finished loading the texture, we should unbind it so that we don't accidentally modify this texture with other texture calls:

       GLES20.glBindTexture(GLES20.GL_TEXTURE_2D,0);

Passing 0 unbinds from the current texture. Finally, the method returns the texture object ID.

Take a look at the complete code:

   // Returns the texture ID
    public static int loadTexture(Context context, int resourceId) {
        int[] textureId = new int[1];
        // Create a texture object
        GLES20.glGenTextures(1, textureId, 0);
        if (textureId[0] == 0) {
            Log.d("mmm", "Failed to create a new texture");
            return 0;
        }

        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inScaled = false;

        Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), resourceId, options);

        if (bitmap == null) {
            Log.d("mmm", "Failed to decode bitmap");
            GLES20.glDeleteTextures(1, textureId, 0);
            return 0;
        }

        // The first argument tells OpenGL to treat this as a 2D texture;
        // the second is the texture object ID to bind to
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);

        // Set the filters:
        // GL_TEXTURE_MIN_FILTER (minification): GL_LINEAR_MIPMAP_LINEAR, i.e. trilinear filtering
        // GL_TEXTURE_MAG_FILTER (magnification): GL_LINEAR, i.e. bilinear filtering
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR_MIPMAP_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        // Load the image data into OpenGL
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
        // Release the bitmap
        bitmap.recycle();
        // Generate the MIP maps
        GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D);

        // Unbind the texture by passing 0 as the second argument
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);

        return textureId[0];
    }

Create a new set of shaders

Before we can draw the texture onto the screen, we need to create a new set of shaders that can accept a texture and draw it onto the fragments.

Create a new vertex shader

 attribute vec4 a_Position;
 // S t texture coordinates with 2 components
 attribute vec2 a_TextureCoordinates;

 varying vec2 v_TextureCoordinates;

 uniform mat4 u_Matrix;

  void main() {
      gl_Position =  u_Matrix * a_Position;
      v_TextureCoordinates = a_TextureCoordinates;
   }

This code first defines a uniform matrix and the familiar position attribute a_Position. We add a new attribute, a_TextureCoordinates; it has two components, the S coordinate and the T coordinate, so it is defined as a vec2. We pass these coordinates on to an interpolated varying called v_TextureCoordinates for the fragment shader.

Create a new fragment shader

 precision mediump float;
// Texture specific data
 uniform sampler2D u_TextureUnit;

 // Texture coordinates st
 varying vec2 v_TextureCoordinates;
   void main() {
        //texture2D: Extract the specific color value according to the texture coordinate ST
        gl_FragColor = texture2D(u_TextureUnit,v_TextureCoordinates);
    }

To draw the texture onto an object, OpenGL calls the fragment shader for each fragment; each invocation receives the texture coordinates in v_TextureCoordinates. The fragment shader also receives the actual texture data via u_TextureUnit, which is defined as a sampler2D, a variable type that refers to an array of two-dimensional texture data.

The interpolated texture coordinates and the texture data are passed to the shader function texture2D(), which reads the color value at that particular coordinate in the texture; we then assign the result to gl_FragColor to set the fragment's color.

Organizing the classes in object-oriented form

Each class will represent one type of physical object: we create a Mallet class for the mallet data and a Table class for the table data.

Add table data

Define a class to store the table data. This class stores the table's position data, adds texture coordinates, and applies the texture to the table.

public class Table {

    private final int BYTES_PER_FLOAT = 4;
    private final FloatBuffer floatBuffer;
    private final int POSITION_COMPONENT_COUNT = 2;
    private final int TEXTURE_COMPONENT_COUNT = 2;
    private final int STRIDE = (POSITION_COMPONENT_COUNT + TEXTURE_COMPONENT_COUNT) * BYTES_PER_FLOAT;

    float[] tableVertices = {
            // Order of coordinates: X, Y (vertex), S, T (texture)
               0f,    0f, 0.5f, 0.5f,
            -0.5f, -0.8f,   0f, 0.9f,
             0.5f, -0.8f,   1f, 0.9f,
             0.5f,  0.8f,   1f, 0.1f,
            -0.5f,  0.8f,   0f, 0.1f,
            -0.5f, -0.8f,   0f, 0.9f
    };

    public Table() {
        floatBuffer = ByteBuffer
                .allocateDirect(tableVertices.length * BYTES_PER_FLOAT)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(tableVertices);
    }

    public void bindData(TextureSharderProgram textureSharderProgram) {
        setAttributeLocation(0, textureSharderProgram.getA_position(), POSITION_COMPONENT_COUNT, STRIDE);

        setAttributeLocation(POSITION_COMPONENT_COUNT, textureSharderProgram.getA_TextureCoordinates(), TEXTURE_COMPONENT_COUNT, STRIDE);
    }

    public void draw() {
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, 6);
    }

    public void setAttributeLocation(int dataOffset, int attributeLocation, int componentCount, int stride) {
        floatBuffer.position(dataOffset);
        GLES20.glVertexAttribPointer(attributeLocation, componentCount, GLES20.GL_FLOAT,
                false, stride, floatBuffer);
        GLES20.glEnableVertexAttribArray(attributeLocation);
        floatBuffer.position(0);
    }
}

Understanding the code above

First, we define the vertex coordinates and the texture coordinates.

The vertex coordinates are in the range [-1, 1] and the texture coordinates are in the range [0, 1]; drawn out, they look like the image below.

Notice that at, for example, the second and sixth vertices, the texture T coordinate runs opposite to the vertex y coordinate. The reason for this is that computer image coordinates and OpenGL's coordinates point in different directions along the y-axis.
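The mapping between the two coordinate systems can be written down explicitly. The sketch below is a hypothetical helper of my own (the actual code simply lists the coordinates in the array): it maps a table vertex with x in [-0.5, 0.5] and y in [-0.8, 0.8] to its S and T values, flipping T because image coordinates grow downward:

```java
// Hypothetical helper: derive the ST coordinates used in the table's vertex array.
final class TableTexCoords {
    static float[] toST(float x, float y) {
        float s = x + 0.5f;        // x in [-0.5, 0.5] -> s in [0, 1]
        float t = 0.5f - y * 0.5f; // y = -0.8 -> t = 0.9; y = 0.8 -> t = 0.1 (flipped)
        return new float[] { s, t };
    }
}
```

The center vertex (0, 0) maps to (0.5, 0.5), matching the first row of the array.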

Then the constructor loads the vertex data into native memory.

bindData then binds the vertex data to the shader attributes.

Finally, the draw method draws the table.

Add mallet data

public class Mallet {
    private final int BYTES_PER_FLOAT = 4;
    private final FloatBuffer floatBuffer;
    private final int POSITION_COMPONENT_COUNT = 2;
    private final int COLOR_COMPONENT_COUNT = 3;
    private final int STRIDE = (POSITION_COMPONENT_COUNT + COLOR_COMPONENT_COUNT) * BYTES_PER_FLOAT;

    float[] malletVertices = {
            // Order of coordinates: X, Y (vertex), R, G, B (color)
            0f, -0.4f, 1f, 0f, 0f,
            0f,  0.4f, 0f, 0f, 1f
    };

    public Mallet() {
        floatBuffer = ByteBuffer
                .allocateDirect(malletVertices.length * BYTES_PER_FLOAT)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer()
                .put(malletVertices);
    }

    public void bindData(ColorShaderProgram colorShaderProgram) {
        setAttributeLocation(0, colorShaderProgram.getA_position(), POSITION_COMPONENT_COUNT, STRIDE);
        setAttributeLocation(POSITION_COMPONENT_COUNT, colorShaderProgram.getA_color(), COLOR_COMPONENT_COUNT, STRIDE);
    }

    public void draw() {
        GLES20.glDrawArrays(GLES20.GL_POINTS, 0, 2);
    }

    public void setAttributeLocation(int dataOffset, int attributeLocation, int componentCount, int stride) {
        floatBuffer.position(dataOffset);
        GLES20.glVertexAttribPointer(attributeLocation, componentCount, GLES20.GL_FLOAT,
                false, stride, floatBuffer);
        GLES20.glEnableVertexAttribArray(attributeLocation);
        floatBuffer.position(0);
    }
}

This is written in much the same way as the Table class above, so I won't go over it again.

Add classes for shaders

We need to create two shader program classes: a texture shader program and a color shader program.

Texture shader

public class TextureSharderProgram {
    private final int u_matrix;
    private final int u_TextureUnit;
    private final int a_position;
    private final int a_TextureCoordinates;
    private final int program;

    public TextureSharderProgram(Context context) {
        // Read the shader source code
        String fragment_shader_source = ReadResouceText.readResoucetText(context, R.raw.texture_fragment_shader);
        String vertex_shader_source = ReadResouceText.readResoucetText(context, R.raw.texture_vertex_sharder);
        program = ShaderHelper.buildProgram(vertex_shader_source, fragment_shader_source);

        a_position = GLES20.glGetAttribLocation(program, "a_Position");
        a_TextureCoordinates = GLES20.glGetAttribLocation(program, "a_TextureCoordinates");

        u_matrix = GLES20.glGetUniformLocation(program, "u_Matrix");
        u_TextureUnit = GLES20.glGetUniformLocation(program, "u_TextureUnit");
    }

    public void setUniforms(float[] matrix, int textureId) {
        GLES20.glUniformMatrix4fv(u_matrix, 1, false, matrix, 0);

        // Set the active texture unit to texture unit 0
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);

        // Bind the texture to that unit
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);

        // Tell the u_TextureUnit sampler in the fragment shader to read from texture unit 0
        GLES20.glUniform1i(u_TextureUnit, 0);
    }

    public int getA_position() {
        return a_position;
    }

    public int getA_TextureCoordinates() {
        return a_TextureCoordinates;
    }

    public void useProgram() {
        // Use the program
        GLES20.glUseProgram(program);
    }
}

There are three new APIs here:

- GLES20.glActiveTexture(GLES20.GL_TEXTURE0): sets the active texture unit to texture unit 0.
- GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId): binds the texture to that unit.
- GLES20.glUniform1i(u_TextureUnit, 0): tells the u_TextureUnit sampler in the fragment shader to read from texture unit 0.

Color shader

public class ColorShaderProgram {
    private final int a_color;
    private final int a_position;
    private final int u_matrix;
    private final int program;

    public ColorShaderProgram(Context context) {

        // Read the shader source code
        String fragment_shader_source = ReadResouceText.readResoucetText(context, R.raw.fragment_shader1);
        String vertex_shader_source = ReadResouceText.readResoucetText(context, R.raw.vertex_shader2);
        program = ShaderHelper.buildProgram(vertex_shader_source, fragment_shader_source);

        a_color = GLES20.glGetAttribLocation(program, "a_Color");
        a_position = GLES20.glGetAttribLocation(program, "a_Position");
        u_matrix = GLES20.glGetUniformLocation(program, "u_Matrix");
    }

    public void setUniforms(float[] matrix) {
        GLES20.glUniformMatrix4fv(u_matrix, 1, false, matrix, 0);
    }

    public int getA_color() {
        return a_color;
    }

    public int getA_position() {
        return a_position;
    }

    public void useProgram() {
        // Use the program
        GLES20.glUseProgram(program);
    }
}

Drawing the texture

Go straight to the code

public class AirHockKeyRender4 implements GLSurfaceView.Renderer {

    private final Context mContext;

    // Projection matrix
    private float[] mProjectionMatrix = new float[16];
    // Model matrix
    private float[] mModelMatrix = new float[16];
    private Table table;
    private Mallet mallet;
    private TextureSharderProgram textureSharderProgram;
    private ColorShaderProgram colorShaderProgram;
    private int textureid;

    public AirHockKeyRender4(Context context) {
        this.mContext = context;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // Set the clear color to black
        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);

        table = new Table();
        mallet = new Mallet();

        textureSharderProgram = new TextureSharderProgram(mContext);
        colorShaderProgram = new ColorShaderProgram(mContext);

        textureid = TextureHelper.loadTexture(mContext, R.mipmap.text);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        // Called after the Surface is created and every time its size changes,
        // such as when switching between portrait and landscape
        // Set the viewport size
        GLES20.glViewport(0, 0, width, height);
        // Create a perspective projection whose frustum starts at z = -1 and ends at z = -10
        MatrixHelper.perspetiveM(mProjectionMatrix, 45, (float) width / (float) height, 1f, 10f);

        // Reset to the identity matrix
        Matrix.setIdentityM(mModelMatrix, 0);
        // Translate by -3 along the z-axis
        Matrix.translateM(mModelMatrix, 0, 0f, 0f, -3f);

        // Rotate -60 degrees about the x-axis
        Matrix.rotateM(mModelMatrix, 0, -60, 1.0f, 0f, 0f);

        float[] temp = new float[16];
        // Matrix multiplication
        Matrix.multiplyMM(temp, 0, mProjectionMatrix, 0, mModelMatrix, 0);
        // Copy the result back into the projection matrix
        System.arraycopy(temp, 0, mProjectionMatrix, 0, temp.length);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // Clear the screen and fill it with the glClearColor color
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        // Draw the table
        textureSharderProgram.useProgram();
        textureSharderProgram.setUniforms(mProjectionMatrix, textureid);
        table.bindData(textureSharderProgram);
        table.draw();

        // Draw the mallets
        colorShaderProgram.useProgram();
        colorShaderProgram.setUniforms(mProjectionMatrix);
        mallet.bindData(colorShaderProgram);
        mallet.draw();
    }
}

Run it to see the effect:

Blending two images

Modify the fragment shader

 precision mediump float;
// Texture specific data
 uniform sampler2D u_TextureUnit;

  uniform sampler2D u_TextureUnit1;
 // Texture coordinates st
 varying vec2 v_TextureCoordinates;
   void main() {
        //texture2D: Extract the specific color value according to the texture coordinate ST
        gl_FragColor = texture2D(u_TextureUnit1,v_TextureCoordinates)*texture2D(u_TextureUnit,v_TextureCoordinates);
    }

u_TextureUnit1 is used to carry the second texture's data; texture2D(u_TextureUnit1, v_TextureCoordinates) * texture2D(u_TextureUnit, v_TextureCoordinates) multiplies the two samples to give the final blended color.
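What the multiply does is easy to see on the CPU side. This is my own rough illustration (not how OpenGL implements it): each channel of one sampled color scales the corresponding channel of the other, so white leaves a color unchanged and black wipes it out:

```java
// Illustration only: component-wise multiply blend of two RGBA colors.
final class MultiplyBlend {
    // Colors as {r, g, b, a}, each component in [0, 1].
    static float[] blend(float[] c1, float[] c2) {
        float[] out = new float[4];
        for (int i = 0; i < 4; i++) {
            out[i] = c1[i] * c2[i];
        }
        return out;
    }
}
```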

Load the second image

        textureid1 = TextureHelper.loadTexture(mContext, R.mipmap.air_hockey_surface);


This is very easy: we use the same method we encapsulated earlier to load the second image.

Set the second texture uniform

   public void setUniforms1(int textureId) {

        // Set the active texture unit to texture unit 1
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1);

        // Bind the texture to that unit
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);

        // Tell the u_TextureUnit1 sampler in the fragment shader to read from texture unit 1
        GLES20.glUniform1i(u_TextureUnit1, 1);
    }

The second image is bound to texture unit 1 and assigned to u_TextureUnit1.

Finally, the draw call:

   @Override
    public void onDrawFrame(GL10 gl) {
        // Clear the screen and fill it with the glClearColor color
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        textureSharderProgram.useProgram();
        textureSharderProgram.setUniforms(mProjectionMatrix, textureid);
        textureSharderProgram.setUniforms1(textureid1);
        table.bindData(textureSharderProgram);
        table.draw();

        colorShaderProgram.useProgram();
        colorShaderProgram.setUniforms(mProjectionMatrix);
        mallet.bindData(colorShaderProgram);
        mallet.draw();
    }

Look at the effect

Summary

Textures are not drawn directly; they need to be bound to texture units, and those texture units are then passed to the shaders. By changing which texture is bound to a unit we can draw different textures in the scene, but excessive switching can degrade performance. We can also use multiple texture units to draw several textures at the same time.
