Preface

I recently received a full-screen video project. The most important part assigned to me was the end page, which required a 3D greeting card display. It happened to be much like the one NingBo shared a few days ago. Ha ha ~

The implementation process

The demo address

20190426 - At this stage I have only made a basic version; easing curves and the like are minor details.

20190427 - Added the easeBackOut easing (d3-ease feels good to use) and the flower-floating logic, which is explained later. I'm starving, off to eat. Ha ha (updated; it's all plain text, feel free to ask questions anytime)

20190429 - The design was overhauled and no longer looks like this, so I'm keeping this version as a demo. Sweat ~~ Fortunately the approach is all the same.

Finger-drag rotation logic is not used in this project, so it was not added.

Ps: Thinking the DevTools here doesn't look like Chrome's usual interface? Ha ha ha, I've been working on a visualization tool recently.

Without further ado, let’s move on:

WebGL initialization (general operation)

  1. Get WebGLRenderingContext
   const gl = canvas.getContext('webgl');
  2. Compile the shaders and attach them to a newly created program

    // Vertex shader
    const vertShader = gl.createShader(gl.VERTEX_SHADER);
    gl.shaderSource(vertShader, vertSource); // vertSource: vertex shader source code
    gl.compileShader(vertShader);
    // Fragment shader
    const fragShader = gl.createShader(gl.FRAGMENT_SHADER);
    gl.shaderSource(fragShader, fragSource); // fragSource: fragment shader source code
    gl.compileShader(fragShader);
    // Program
    const program = gl.createProgram();
    gl.attachShader(program, vertShader); // Attach the vertex shader
    gl.attachShader(program, fragShader); // Attach the fragment shader
    gl.linkProgram(program);

3. Enable the depth test, since the greeting card is 3D

    gl.enable(gl.DEPTH_TEST);

4. The elements are not models but rectangles with transparent material, so when elements overlap, the incoming pixel overwrites what is already in the color buffer; for example, a fully transparent (0,0,0,0) would overwrite an existing (1,0,0,1). The solution is to enable blending.

    gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
    gl.enable(gl.BLEND);

Of course you don't have to enable this if you are sure every element is a JPG. blendFunc defines the blending method: the first parameter says what to do with the source pixel, the second what to do with the destination pixel (the color buffer). Final pixel = source color * source alpha + buffer color * (1 - source alpha)
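As a sanity check, that blend equation can be sketched in plain JS (the `blend` helper below is hypothetical, written just for illustration, not part of the project code):

```javascript
// gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA) computes:
// out = src.rgb * src.a + dst.rgb * (1 - src.a)
function blend(src, dst) {
  // src: [r, g, b, a] of the incoming fragment; dst: [r, g, b] already in the color buffer
  const a = src[3];
  return dst.map((d, i) => src[i] * a + d * (1 - a));
}

// A fully transparent fragment leaves the buffer color untouched:
blend([0, 0, 0, 0], [1, 0, 0]); // → [1, 0, 0]
// A half-transparent white fragment mixes 50/50 with a black buffer:
blend([1, 1, 1, 0.5], [0, 0, 0]); // → [0.5, 0.5, 0.5]
```

This is exactly why the transparent-black case from point 4 no longer wipes out the red pixel underneath once blending is on.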

Now that the initialization is done, let’s see how to add elements.

Add the Card element

First, here is the shader source code.

//vertSource
uniform mat4 uCameraMatrix;
uniform mat4 uTransformMatrix;
attribute vec3 aPosition;
attribute vec2 aUv;
varying vec2 vUv;
void main(){
    const float scale = 1.0/1.6;
    const mat4 viewAngle = mat4(
        1.0, 0.0, 0.0, 0.0,
        0.0, cos(-.1), -sin(-.1), 0.0,
        0.0, sin(-.1), cos(-.1), 0.0,
        0.0, -50.0, -1000.0, 1.0
    );
    vec3 cPosition = aPosition * vec3(scale, scale, -1.0);
    gl_Position = viewAngle * uCameraMatrix * (uTransformMatrix * vec4(cPosition.xy, 0.0, 1.0) + vec4(0.0, 0.0, cPosition.z, 0.0));
    vUv = aUv; // For sampling position
}
//fragSource
precision highp float;
uniform sampler2D uImage;
varying vec2 vUv;
void main(){
    vec4 color = texture2D(uImage, vUv);
    if(color.a == 0.0){ discard; }
    gl_FragColor = color;
}

Let's focus mainly on this line of the vertex shader:

    gl_Position = viewAngle * uCameraMatrix * (uTransformMatrix * vec4(cPosition.xy, 0.0, 1.0) + vec4(0.0, 0.0, cPosition.z, 0.0));

uCameraMatrix: the perspective matrix.

viewAngle: equivalent to lookAt. I wanted to combine the two matrices directly in JS, but gl-mat4's lookAt method didn't give me the right result, so I gave up and wrote the matrix inline. I also got the Z axis backwards at first, because the perspective matrix from gl-mat4 flips it on me and I'm not used to that 😓, and only realized it later.

Why didn't I just write

    uTransformMatrix * vec4(cPosition, 1.0)

instead of

    uTransformMatrix * vec4(cPosition.xy, 0.0, 1.0) + vec4(0.0, 0.0, cPosition.z, 0.0)?

Because I use the same matrix to rotate every element relative to its own base. If the z value is not zero when the rotation is applied, the element no longer rotates about its base. So I transform first with z set to 0, and only then apply the z-axis offset, which makes each element rotate relative to its own base.
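The difference can be checked numerically. This is a plain-JS sketch (the `rotateX` helper is hypothetical, written just to illustrate) rotating a point 90° about the X axis:

```javascript
// Rotate a point about the X axis, like the card-opening rotation in uTransformMatrix
function rotateX([x, y, z], angle) {
  return [
    x,
    y * Math.cos(angle) - z * Math.sin(angle),
    y * Math.sin(angle) + z * Math.cos(angle),
  ];
}

const angle = Math.PI / 2;

// Baking z into the point first: the pivot is the z = 0 plane, not the element's base
rotateX([0, 1, 5], angle); // ≈ [0, -5, 1]

// Rotating with z = 0, then shifting along z: the element turns about its own base
const p = rotateX([0, 1, 0], angle); // ≈ [0, 0, 1]
const shifted = [p[0], p[1], p[2] + 5]; // ≈ [0, 0, 6]
```

In the first case the point swings through a huge arc because its depth is part of the rotation; in the second it pivots in place and only then moves to its depth, which is the behavior the shader expression produces.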

After the shaders comes drawArrays.

I divided the whole scene into two kinds of elements: one that stays fixed, namely the ground, and one that rotates as the card opens, namely everything that is not the ground.

The ground differs from the other parts only in that its uTransformMatrix is the identity matrix, while the others' matrices change over time. So I chose to unify them into the same structure and add a rotateFlag to tell them apart. Each element's data structure is as follows:

    interface attribData{
        buffer:WebGLBuffer;
        data:Float32Array; // Records vertices and UVs
        texture?:WebGLTexture;
        rotateFlag:boolean; // Rotation switch
    }
  1. First, write the element data
    createStandEle(file, [x, y, z]){
        // file: image name; this.option.assets[file]: the image
        const scale = Math.sqrt((600 + z)/600);
        const imgWidth = (<HTMLImageElement>this.option.assets[file]).naturalWidth*scale;
        const imgHeight = (<HTMLImageElement>this.option.assets[file]).naturalHeight*scale;
        const name = file.match(/card\_([^\.]+)/)[1];
        const data = {
            buffer: this.gl.createBuffer(),
            data: new Float32Array([
                // Vertex data                    UV data
                x - imgWidth/2, y, z,             0, 1,
                x + imgWidth/2, y, z,             1, 1,
                x - imgWidth/2, y + imgHeight, z, 0, 0,
                x + imgWidth/2, y + imgHeight, z, 1, 0
            ]),
            texture: this.gl.createTexture(),
            rotateFlag: true
        };
        this.cardData[name] = data;
        this.gl.bindBuffer(this.gl.ARRAY_BUFFER, data.buffer);
        // Write the data into ARRAY_BUFFER
        this.gl.bufferData(this.gl.ARRAY_BUFFER, data.data, this.gl.STATIC_DRAW);
        this.gl.activeTexture(this.gl.TEXTURE0);
        this.gl.bindTexture(this.gl.TEXTURE_2D, data.texture);
        let format = this.gl.RGB; // Alpha is not necessary for JPG
        if(file.search(/\.png$/) >= 0){
            format = this.gl.RGBA;
        }
        // Upload the image to gl.TEXTURE_2D
        this.gl.texImage2D(this.gl.TEXTURE_2D, 0, format, format, this.gl.UNSIGNED_BYTE, this.option.assets[file]);
        this.gl.texParameteri(this.gl.TEXTURE_2D, this.gl.TEXTURE_MIN_FILTER, this.gl.LINEAR);
        this.gl.texParameteri(this.gl.TEXTURE_2D, this.gl.TEXTURE_MAG_FILTER, this.gl.LINEAR);
        this.gl.texParameteri(this.gl.TEXTURE_2D, this.gl.TEXTURE_WRAP_S, this.gl.CLAMP_TO_EDGE);
        this.gl.texParameteri(this.gl.TEXTURE_2D, this.gl.TEXTURE_WRAP_T, this.gl.CLAMP_TO_EDGE);
    }
The basic logic: create a buffer -> bind the buffer -> write data into the buffer.

The scale is necessary. The perspective matrix in the code makes near things large and far things small, but the design mockup is flat, with no concept of distance, and the elements in the PSD were not drawn with that near-large/far-small rule in mind. So the near elements would come out very large. You might say: then just shrink the images. Here is the catch: under the near-large rule, the closest elements are effectively being magnified, and everyone knows what magnifying a small image looks like, so simply shrinking the images is wrong. The only way is to change the element's size and let the image fill it, so no blurring appears.
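Numerically, the compensation factor from createStandEle (assuming its reference depth of 600) behaves like this:

```javascript
// The quad is pre-enlarged to offset the later perspective shrinking, so on screen
// it matches the flat design. Same formula as in createStandEle.
const scaleFor = z => Math.sqrt((600 + z) / 600);

scaleFor(0);   // → 1: elements at the reference depth keep their design size
scaleFor(600); // ≈ 1.414: a deeper element gets a larger quad, and the image just fills it
```

The image pixels themselves are never resampled; only the quad grows, which is what avoids the blur described above.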

  2. Perform rendering
    render(timeStamp, offsetTime){
        this.gl.clear(this.gl.COLOR_BUFFER_BIT|this.gl.DEPTH_BUFFER_BIT);
        if(this.rotateX < Math.PI/2){
            this.rotateX += 0.01*offsetTime;
        }else{
            this.rotateX = Math.PI/2;
        }
        Object.keys(this.cardData).forEach(i => {
            // Iterate over and render all elements
            this.renderBuffer(this.cardData[i]);
        });
        super.render(timeStamp, offsetTime);
    }
    renderBuffer(data:attribData){
        this.gl.useProgram(this.cardProgram);
        if(data.rotateFlag){
            const rotate = this.rotateX - Math.PI/2;
            this.gl.uniformMatrix4fv(this.cardParam.uTransformMatrix, false, new Float32Array([
                1, 0, 0, 0,
                0, Math.cos(rotate), -Math.sin(rotate), 0,
                0, Math.sin(rotate), Math.cos(rotate), 0,
                0, 0, 0, 1
            ]));
        }else{
            // If the element does not rotate, assign an identity matrix to uTransformMatrix
            this.gl.uniformMatrix4fv(this.cardParam.uTransformMatrix, false, this.identityMatrix);
        }
        this.gl.bindBuffer(this.gl.ARRAY_BUFFER, data.buffer);
        this.gl.vertexAttribPointer(<GLint>this.cardParam.aPosition, 3, this.gl.FLOAT, false, 4*5, 0);
        this.gl.vertexAttribPointer(<GLint>this.cardParam.aUv, 2, this.gl.FLOAT, false, 4*5, 4*3);
        this.gl.activeTexture(this.gl.TEXTURE0);
        this.gl.bindTexture(this.gl.TEXTURE_2D, data.texture);
        this.gl.drawArrays(this.gl.TRIANGLE_STRIP, 0, 4);
    }

The basic logic: bind the buffer & bind the texture -> tell the graphics card how to read vertex data from the currently bound buffer -> drawArrays.
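The magic numbers in the vertexAttribPointer calls come straight from the interleaved layout (3 position floats + 2 UV floats per vertex, 4 bytes each); a quick sketch:

```javascript
const FLOAT_BYTES = 4;
const POSITION_SIZE = 3; // x, y, z
const UV_SIZE = 2;       // u, v

// Bytes from the start of one vertex to the start of the next
const stride = FLOAT_BYTES * (POSITION_SIZE + UV_SIZE); // 20 bytes, i.e. 4*5
// The UV data starts right after the position floats
const uvOffset = FLOAT_BYTES * POSITION_SIZE; // 12 bytes, i.e. 4*3

// Matching calls:
// gl.vertexAttribPointer(aPosition, POSITION_SIZE, gl.FLOAT, false, stride, 0);
// gl.vertexAttribPointer(aUv, UV_SIZE, gl.FLOAT, false, stride, uvOffset);
```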

Making the flowers and grass sway efficiently

This is done in the fragment shader by manipulating vUv:

// Fragment shader
precision highp float;
uniform sampler2D uImage;
uniform int uType; // 0: non-grass 1: grass; these are all uniforms set from JS
uniform float uTime; // Current timestamp
varying vec2 vUv;
void main(){
    vec4 color = vec4(0);
    if(uType == 1){
        // The grass part
        float offset = distance(vUv, vec2(0.5, 1.0));
        offset = pow(offset, 2.)/8.0*sin(uTime);
        mat2 rotate = mat2(
            cos(offset), -sin(offset),
            sin(offset), cos(offset)
        );
        vec2 cUv = vec2(0.5, 1.0) + rotate*(vUv - vec2(0.5, 1.0));
        if(cUv.x<0.||cUv.y<0.||cUv.x>1.||cUv.y>1.) discard;
        color = texture2D(uImage, cUv);
    }else{
        color = texture2D(uImage, vUv);
    }
    if(color.a == 0.0){ discard; }
    gl_FragColor = color;
}

The grass part applies an offset to the current UV.

The grass is rooted to the ground, and the sway is not linear: the farther from the ground, the bigger the sway. And the vertex shader cannot perform a skew here (something like the skew of CSS transform).

What I chose is the squared distance from the current UV to the bottom center (the grass root): the farther from the bottom center, the more obvious the offset.

    float offset = distance(vUv, vec2(0.5, 1.0));
    offset = pow(offset, 2.)/8.0; // Without the /8.0 the amplitude is too large; UV values lie between 0 and 1, so scale it down

This is then multiplied by sin. When sin is 0 the result equals the vUv passed in, so the image looks the same as a plain mapped element; when sin is 1 or -1 the deviation is greatest, i.e. the most distorted image.

    offset = offset*sin(uTime);

This value is then used as the angle of a rotation matrix to generate the new UV coordinates.

    mat2 rotate = mat2( // Rotation matrix
        cos(offset), -sin(offset),
        sin(offset), cos(offset)
    );
    vec2 cUv = vec2(0.5, 1.0) + rotate*(vUv - vec2(0.5, 1.0)); // Rotate relative to the bottom center
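The same UV math can be checked on the CPU. A plain-JS sketch that mirrors the shader (the `swayUv` helper is hypothetical; note that GLSL's mat2 constructor is column-major, which the signs below follow):

```javascript
// Rotate a UV coordinate about the bottom center (0.5, 1.0), like the shader does
function swayUv([u, v], time) {
  const du = u - 0.5, dv = v - 1.0;
  // Squared distance from the grass root, damped, modulated over time
  const angle = Math.pow(Math.hypot(du, dv), 2) / 8.0 * Math.sin(time);
  // mat2(cos,-sin,sin,cos) has column-major columns (cos,-sin) and (sin,cos)
  return [
    0.5 + Math.cos(angle) * du + Math.sin(angle) * dv,
    1.0 - Math.sin(angle) * du + Math.cos(angle) * dv,
  ];
}

// When sin(time) is 0 the UV is unchanged, so the grass stands upright:
swayUv([0.2, 0.3], 0); // → [0.2, 0.3] (up to floating point)
// The root itself never moves, whatever the time:
swayUv([0.5, 1.0], 1.3); // → [0.5, 1.0]
```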

All that remains is to pass in the timestamp and the uType value from JS each frame; the rest is handled by the WebGL rendering pipeline.

This part should really be called a filter. Water ripple effects, flame effects, and the filters in pixiJS basically follow the same process; Douyin-style RGB color separation is handled here as well.

I'm not sure whether this plain-text writeup is easy to follow. I don't think there's much more to add; the rest is the non-WebGL part. Once the project wraps up in a couple of days, I'll show the WebGL section at the end. Sweat…

Anything unclear, leave a message ~~~ questions welcome ~ ha ha ha