Background

This article is included in the column Data Visualization and Graphics

First of all, in the last article I made a simple comparison between Canvas 2D and WebGL, and we saw that the WebGL API is relatively difficult to understand. So the coding that follows will lean on WebGL to achieve our results (yes, let's play ~). The topics will go from simple to complex: visual physics, lighting/global illumination, anti-aliasing, deferred rendering, real-time rendering… Maybe I have drifted a bit from the reason I started this column; if you want to discuss further, feel free to message me privately.

The previous article implemented simple 2D graphics; refer to it if necessary.

Outline of this article

  1. What is texture? What’s the difference between 2D and 3D texture mapping?
  2. The Texturing Pipeline
  3. Coding (simple use of textures in WebGL)

1. What is texture? What’s the difference between 2D and 3D texture mapping?

In computer graphics, Texturing is the use of images, functions, or other data sources to change the appearance of an object’s surface.

2D texture

A 2D texture is a simple bitmap image that provides the color values of surface points for a 3D model.

3D texture

A 3D texture can be thought of as a stack of 2D textures that together describe a volume of 3D spatial data.
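
To make the "stack of slices" idea concrete, here is a minimal sketch in plain JavaScript (the 4x4x4 size is an arbitrary example, and this helper is not part of the WebGL code later in the article) of addressing one texel in a 3D texture stored as a flat array:

```js
// A 3D texture stored as a flat array: `depth` slices of width x height texels,
// one RGBA value (4 bytes) per texel. The sizes are arbitrary examples.
const width = 4, height = 4, depth = 4;
const texels = new Uint8Array(width * height * depth * 4);

// Read the RGBA texel at integer coordinates (x, y, z):
// z selects the 2D slice, then (x, y) indexes into that slice.
function texel3D(x, y, z) {
  const i = ((z * height + y) * width + x) * 4;
  return texels.subarray(i, i + 4); // [r, g, b, a]
}

console.log(texel3D(1, 2, 3)); // all zeros until real data is written
```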

2. The Texturing Pipeline

The rendering pipeline may be a black box in everyday use, but understanding it is a huge boost to your graphics programming. (It's fine if you don't fully follow this part; the introduction here is fairly shallow and mostly summarized from other sources. Ha ha)


In short, Texturing is an efficient technique for “modeling” the properties of an object’s surface. Pixels in an image texture are often called Texels and are different from pixels on the screen.

See the following figure for the detailed process of texture mapping with a single texture.

  1. The projector function converts a 3D point in space into texture coordinates: it takes the surface position and projects it into parameter space. Typical projector functions are spherical, cylindrical, and planar projections.
  2. The corresponder function converts parameter-space coordinates into texture-space locations.
  • Step 1. Apply the projector function to the point in space to obtain a set of parameter-space values for the texture.
  • Step 2. Before the texture is accessed with these values, one or more corresponder functions can be used to convert the parameter-space values into texture space.
  • Step 3. Use these texture-space locations to obtain the corresponding values from the texture; for example, an image texture can be indexed as an array to retrieve a texel value.
  • Step 4. Finally, a value transform function transforms the retrieved result, and the new value is used to change a surface attribute, such as the material or a shading normal. (A rough code sketch of these four stages follows this list.)
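
To tie the four stages together, here is a rough, hedged sketch in plain JavaScript (the 8x8 texture size, the spherical projector, and the repeat-wrap corresponder are illustrative choices, not part of the WebGL example below):

```js
// 1. Projector function: project a point on a unit sphere into parameter space (u, v).
function sphericalProject(p) {
  const u = 0.5 + Math.atan2(p.z, p.x) / (2 * Math.PI);
  const v = 0.5 - Math.asin(p.y) / Math.PI;
  return [u, v];
}

// 2. Corresponder function: map parameter space to texture space, here by wrapping (repeat).
function wrap(u, v) {
  return [u - Math.floor(u), v - Math.floor(v)];
}

// 3. Obtain the value: nearest-neighbour lookup into an 8x8 grayscale texture (array index).
const size = 8;
const texture = new Uint8Array(size * size).fill(128); // placeholder texels

function sample(u, v) {
  const x = Math.min(size - 1, Math.floor(u * size));
  const y = Math.min(size - 1, Math.floor(v * size));
  return texture[y * size + x];
}

// 4. Value transform: turn the retrieved texel into a surface attribute, e.g. a shading factor.
const [u, v] = wrap(...sphericalProject({ x: 0.0, y: 0.0, z: 1.0 }));
const shade = sample(u, v) / 255; // map 0..255 texel to a 0..1 shading factor
console.log(u, v, shade);         // 0.75 0.5 0.5019607843137255
```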

Normals (normal vectors) are very important in graphics, so it is worth brushing up on them.
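
As a quick refresher (a hedged sketch of standard vector math, not code from this article), a triangle's face normal is the normalized cross product of two of its edge vectors:

```js
// Face normal of a triangle (a, b, c): normalize the cross product of two edge vectors.
function subtract(p, q) { return [p[0] - q[0], p[1] - q[1], p[2] - q[2]]; }

function cross(u, v) {
  return [
    u[1] * v[2] - u[2] * v[1],
    u[2] * v[0] - u[0] * v[2],
    u[0] * v[1] - u[1] * v[0],
  ];
}

function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return len > 0 ? [v[0] / len, v[1] / len, v[2] / len] : [0, 0, 0];
}

function faceNormal(a, b, c) {
  return normalize(cross(subtract(b, a), subtract(c, a)));
}

// A triangle lying in the XY plane (counter-clockwise winding) points along +Z.
console.log(faceNormal([0, 0, 0], [1, 0, 0], [0, 1, 0])); // [0, 0, 1]
```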

3. Coding (simple 2D texture mapping in WebGL)

We will simply use WebGL to apply a texture (map) to a quad and animate it with a rotation matrix. (An animated GIF of the result goes here.)

1. Add texture coordinates and transform the vertex positions (rotation matrix)

```glsl
// vertex shader
attribute vec4 a_position;
attribute vec2 a_texcoord;

uniform mat4 u_matrix;

varying vec2 v_texcoord;

void main() {
  gl_Position = u_matrix * a_position;
  // pass the texture coordinates to the fragment shader
  v_texcoord = a_texcoord;
}
```

```glsl
// fragment shader
precision mediump float;

varying vec2 v_texcoord;

uniform sampler2D u_texture;

void main() {
  gl_FragColor = texture2D(u_texture, v_texcoord);
}
```

2. Initialize the WebGL context

```js
var canvas = document.querySelector("#canvas");
var gl = canvas.getContext("webgl");
if (!gl) {
  return;
}
```

3. Set up the GLSL program and create the buffers (positions and texture coordinates)

```js
// set up the GLSL program
var program = webglUtils.createProgramFromScripts(gl, ["vertex-shader", "fragment-shader"]);

// look up where the vertex data needs to go
var positionLocation = gl.getAttribLocation(program, "a_position");
var texcoordLocation = gl.getAttribLocation(program, "a_texcoord");

// look up the uniforms
var matrixLocation = gl.getUniformLocation(program, "u_matrix");
var textureLocation = gl.getUniformLocation(program, "u_texture");

// create a buffer and put the quad's vertex positions in it
var positionBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
var positions = [
  -1, -1,
  -1,  1,
   1, -1,
   1, -1,
  -1,  1,
   1,  1,
];
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(positions), gl.STATIC_DRAW);

// create a buffer and put the texture coordinates in it
var texcoordBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, texcoordBuffer);
var texcoords = [
  0, 0,
  0, 1,
  1, 0,
  1, 0,
  0, 1,
  1, 1,
];
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(texcoords), gl.STATIC_DRAW);
```

4. Create and load the texture

```js
// work around cross-origin restrictions when loading the image
function requestCORSIfNotSameOrigin(img, url) {
  if ((new URL(url, window.location.href)).origin !== window.location.origin) {
    img.crossOrigin = "";
  }
}

// creates a texture info object { width: w, height: h, texture: tex };
// the texture starts as a 1x1 pixel and is updated once the image has loaded
function loadImageAndCreateTextureInfo(url) {
  var tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // fill the texture with a 1x1 blue pixel
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE,
                new Uint8Array([0, 0, 255, 255]));

  // let's assume all images are not a power of 2
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

  var textureInfo = {
    width: 1,   // unknown until the image loads
    height: 1,
    texture: tex,
  };

  var img = new Image();
  img.addEventListener('load', function() {
    textureInfo.width = img.width;
    textureInfo.height = img.height;

    gl.bindTexture(gl.TEXTURE_2D, textureInfo.texture);
    // call texImage2D() to write the loaded image data into the texture
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);
    render();
  });
  requestCORSIfNotSameOrigin(img, url);
  img.src = url;

  return textureInfo;
}

// load the texture
var texInfo = loadImageAndCreateTextureInfo('https://webglfundamentals.org/webgl/resources/leaves.jpg');
```

5. Render: set the vertex and texture-coordinate attributes, compute the matrix, and draw with the shader program.

```js
function render(time) {
  time *= 0.001; // convert milliseconds to seconds so the image rotates at a reasonable speed

  // resize the canvas and set the viewport
  webglUtils.resizeCanvasToDisplaySize(gl.canvas);
  gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);

  gl.clear(gl.COLOR_BUFFER_BIT);

  gl.bindTexture(gl.TEXTURE_2D, texInfo.texture);

  // use the shader program
  gl.useProgram(program);

  // set up the position attribute
  gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
  gl.enableVertexAttribArray(positionLocation);
  gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, 0, 0);

  // set up the texture-coordinate attribute
  gl.bindBuffer(gl.ARRAY_BUFFER, texcoordBuffer);
  gl.enableVertexAttribArray(texcoordLocation);
  gl.vertexAttribPointer(texcoordLocation, 2, gl.FLOAT, false, 0, 0);

  // build the matrix: keep the aspect ratio, rotate over time, then scale down
  var aspect = gl.canvas.clientWidth / gl.canvas.clientHeight;
  var matrix = m4.scaling(1, aspect, 1);
  matrix = m4.zRotate(matrix, time);
  matrix = m4.scale(matrix, 0.5, 0.5, 1);

  // set the matrix
  gl.uniformMatrix4fv(matrixLocation, false, matrix);

  // tell the shader to use texture unit 0
  gl.uniform1i(textureLocation, 0);

  // draw the quad (2 triangles, 6 vertices)
  gl.drawArrays(gl.TRIANGLES, 0, 6);

  requestAnimationFrame(render);
}
```
  1. webgl-utils.js: a small library that wraps common WebGL helper functions
  2. m4.js: a library of 4x4 matrix math functions

    Complete code example: please see the Git repository for the full code.

Some topics about 2D rendering you may want to explore further:

  1. Texture cache
  2. Texture compression
  3. 2D/3D texture optimization
  4. Render optimization…

Final words

Finally, I strongly encourage you to study the relevant theory. It may not come up much in day-to-day work, but it determines how far you can go. I will speed up this column (1-2 articles a week, versus 1 for my others), cover all the basics related to computer graphics, and then focus mainly on data visualization.