Preface

While watching How I Built a Wind Map with WebGL, I noticed that it uses a framebuffer, so I looked into the relevant material and tried it out on its own.

Frame buffer object

WebGL can use its rendering results as textures by means of a framebuffer object.

By default, WebGL’s final drawing results are stored in the color buffer. A framebuffer object can be used in place of the color buffer, as outlined in the sketch below. Objects drawn into the framebuffer are not displayed directly on the canvas, so this technique is also called offscreen drawing.
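Here is a minimal sketch of that flow, assuming gl is an existing WebGLRenderingContext; drawSceneIntoTexture and drawQuadWithTexture are hypothetical placeholders for whatever actually issues the draw calls:

// Create an empty texture that will receive the rendering result
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 256, 256, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
// Attach the texture as the framebuffer's color attachment
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);
drawSceneIntoTexture();                   // hypothetical: renders into `texture`, not the canvas
gl.bindFramebuffer(gl.FRAMEBUFFER, null); // back to the default color buffer
drawQuadWithTexture(texture);             // hypothetical: samples the offscreen result on screen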

The sample

To verify the above functionality, this example draws an image in the frame buffer and displays it again as a texture.

Compared with the earlier example that draws an image, there are several major changes:

  • data

  • Frame buffer object

  • draw

data

Drawing into the framebuffer works the same as normal drawing, except that the result is not displayed, so it also needs its own drawing area size, vertex coordinates and texture coordinates.


offscreenWidth: 200, // Width of the offscreen drawing area
offscreenHeight: 150, // Height of the offscreen drawing area
// Some code is omitted
// Vertex and texture coordinates drawn for the framebuffer
this.offScreenBuffer = this.initBuffersForFramebuffer(gl);
// Some code is omitted
initBuffersForFramebuffer: function (gl) {
    const vertices = new Float32Array([
        0.5, 0.5, -0.5, 0.5, -0.5, -0.5, 0.5, -0.5,
    ]); // rectangle
    const indices = new Uint16Array([0, 1, 2, 0, 2, 3]);
    const texCoords = new Float32Array([
        1.0, 1.0, // top right corner
        0.0, 1.0, // top left corner
        0.0, 0.0, // bottom left corner
        1.0, 0.0, // bottom right corner
    ]);
    const obj = {};
    obj.verticesBuffer = this.createBuffer(gl, gl.ARRAY_BUFFER, vertices);
    obj.indexBuffer = this.createBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, indices);
    obj.texCoordsBuffer = this.createBuffer(gl, gl.ARRAY_BUFFER, texCoords);
    return obj;
},
createBuffer: function (gl, type, data) {
    const buffer = gl.createBuffer();
    gl.bindBuffer(type, buffer);
    gl.bufferData(type, data, gl.STATIC_DRAW);
    gl.bindBuffer(type, null);
    return buffer;
}
// Some code is omitted


New vertex and fragment shaders could be defined for this pass; for convenience, a single shared set of shaders is used here.
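As an illustration, a shared shader pair for drawing a textured rectangle might look like the following sketch; the names aPosition, aTexCoord and uSampler are assumptions, not taken from the original sample:

// Minimal shared shaders for a textured quad (names are illustrative assumptions)
const VERTEX_SHADER_SOURCE = `
    attribute vec2 aPosition;
    attribute vec2 aTexCoord;
    varying vec2 vTexCoord;
    void main() {
        gl_Position = vec4(aPosition, 0.0, 1.0);
        vTexCoord = aTexCoord;
    }
`;
const FRAGMENT_SHADER_SOURCE = `
    precision mediump float;
    varying vec2 vTexCoord;
    uniform sampler2D uSampler;
    void main() {
        gl_FragColor = texture2D(uSampler, vTexCoord);
    }
`;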

Frame buffer object

To draw in the frame buffer, create the corresponding frame buffer object.


// Framebuffer object
this.framebufferObj = this.createFramebufferObject(gl);
// Some code is omitted
createFramebufferObject: function (gl) {
    let framebuffer = gl.createFramebuffer();
    let texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texImage2D(
        gl.TEXTURE_2D,
        0,
        gl.RGBA,
        this.offscreenWidth,
        this.offscreenHeight,
        0,
        gl.RGBA,
        gl.UNSIGNED_BYTE,
        null
    );
    // Flip the image's Y axis
    gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
    // Clamp texture coordinates in the s (horizontal) direction
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    // Clamp texture coordinates in the t (vertical) direction
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
    // Magnification filter
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    // Minification filter
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    framebuffer.texture = texture; // Save the texture object
    // Attach the texture to the framebuffer object
    gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
    gl.framebufferTexture2D(
        gl.FRAMEBUFFER,
        gl.COLOR_ATTACHMENT0,
        gl.TEXTURE_2D,
        texture,
        0
    );
    // Check whether the configuration is correct
    var e = gl.checkFramebufferStatus(gl.FRAMEBUFFER);
    if (gl.FRAMEBUFFER_COMPLETE !== e) {
        console.log("Frame buffer object is incomplete: " + e.toString());
        return;
    }
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);
    gl.bindTexture(gl.TEXTURE_2D, null);
    return framebuffer;
}
// Some code is omitted

  • The createFramebuffer function creates a framebuffer object. The function for deleting an object is deleteFramebuffer.

  • The texture object created in the sample has several characteristics: 1. its width and height match the width and height of the drawing region; 2. the last parameter of texImage2D is null, which reserves a blank area to store the rendering result; 3. the texture is saved on the framebuffer object via framebuffer.texture = texture.

  • The bindFramebuffer function binds the framebuffer to the target, and framebufferTexture2D then attaches the previously created texture object to the framebuffer’s color attachment gl.COLOR_ATTACHMENT0.

  • checkFramebufferStatus checks whether the framebuffer object is configured correctly, as shown in the sketch after this list.
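For diagnostics that are friendlier than the raw numeric code, a small helper could translate the status into a readable name; this helper is an assumption for illustration, not part of the original sample:

// Hypothetical helper: turn the checkFramebufferStatus result into a readable name
function describeFramebufferStatus(gl, status) {
    switch (status) {
        case gl.FRAMEBUFFER_COMPLETE:
            return "FRAMEBUFFER_COMPLETE";
        case gl.FRAMEBUFFER_INCOMPLETE_ATTACHMENT:
            return "FRAMEBUFFER_INCOMPLETE_ATTACHMENT";
        case gl.FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT:
            return "FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT";
        case gl.FRAMEBUFFER_INCOMPLETE_DIMENSIONS:
            return "FRAMEBUFFER_INCOMPLETE_DIMENSIONS";
        case gl.FRAMEBUFFER_UNSUPPORTED:
            return "FRAMEBUFFER_UNSUPPORTED";
        default:
            return "Unknown status: " + status;
    }
}

It could be used in place of e.toString() in the check above.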

draw

The main difference when drawing is that there is a switching process:


// Some code is omitted
draw: function () {
    const gl = this.gl;
    const frameBuffer = this.framebufferObj;
    this.canvasObj.clear();
    const program = this.shaderProgram;
    gl.useProgram(program.program);
    // Switch the draw target to the framebuffer
    gl.bindFramebuffer(gl.FRAMEBUFFER, frameBuffer);
    gl.viewport(0, 0, this.offscreenWidth, this.offscreenHeight);
    this.drawOffFrame(program, this.imgTexture);
    // Unbind the framebuffer so the draw target becomes the color buffer again
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);
    gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);
    this.drawScreen(program, frameBuffer.texture);
},
// Some code is omitted

  • First, bindFramebuffer switches the draw target to the framebuffer, and the corresponding viewport is set.

  • After drawing into the framebuffer, it is unbound so that drawing returns to the default color buffer, and again the corresponding viewport is set. What is special here is that the framebuffer object’s texture is used for this pass, which means the content rendered into the framebuffer is what gets drawn on screen; a sketch of this second pass is shown below.
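The drawScreen pass is omitted in the snippet above. A minimal sketch of what it might look like, assuming the shared shaders shown earlier; the attribute/uniform names and the screenBuffer object are assumptions, not code from the original sample:

// Hypothetical sketch of the on-screen pass: bind the quad buffers,
// bind the framebuffer's texture and draw the 6 indices of the rectangle.
drawScreen: function (program, texture) {
    const gl = this.gl;
    const buffers = this.screenBuffer; // assumed buffer object for the on-screen quad
    gl.bindBuffer(gl.ARRAY_BUFFER, buffers.verticesBuffer);
    gl.vertexAttribPointer(program.aPosition, 2, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(program.aPosition);
    gl.bindBuffer(gl.ARRAY_BUFFER, buffers.texCoordsBuffer);
    gl.vertexAttribPointer(program.aTexCoord, 2, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(program.aTexCoord);
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, buffers.indexBuffer);
    // Use the texture that the framebuffer pass rendered into
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.uniform1i(program.uSampler, 0);
    gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0);
}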

Observations and thoughts

The examples I found online felt complicated, so here are some observations and thoughts from trying to simplify them.

Is framebuffer.texture a built-in property or one added manually?

The sample assigns framebuffer.texture = texture, which at first glance suggests that the framebuffer object itself has a texture property.

Printing the object shows that this property does not exist when the framebuffer is first created, so it must have been added manually.

When does framebuffer.texture get its content?

When the framebuffer object is initialized, the texture it stores is empty, yet after the framebuffer pass is drawn the texture has content.

In the drawing logic, texture-related statements are:


gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.uniform1i(program.uSampler, 0);
gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0);


While the framebuffer is bound, gl.drawElements writes its result into the framebuffer’s color attachment, and that color attachment was associated with the blank texture object at initialization; that is how the texture gets its content.

Why doesn’t the final display cover the entire canvas?

When the displayable content is finally drawn, you can see that the vertices correspond to the entire canvas, and the texture coordinates correspond to the entire texture, but why isn’t the entire canvas covered?

The final on-screen pass uses the framebuffer’s rendering result as its texture, and the rectangle drawn into the framebuffer only spans half of the clip-space range in each direction, so it covers just part of the buffer. When the whole framebuffer is then mapped as a texture and scaled to the visible area, that partially covered result scales with it, so it is expected that the final drawing does not cover the whole canvas.

For a full-canvas result, simply adjust the framebuffer vertices so that they cover the entire buffer area, as in the sketch below.
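For example, since clip space runs from -1.0 to 1.0 on both axes, vertices like the following make the rectangle fill the whole framebuffer; this is a small illustrative change, not code from the original sample:

// Clip space spans [-1.0, 1.0] on both axes, so these four corners
// make the rectangle fill the entire framebuffer.
const vertices = new Float32Array([
    1.0, 1.0, -1.0, 1.0, -1.0, -1.0, 1.0, -1.0,
]);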


Recently I watched the Love Myth, and it seemed like a myth to me to still be actively looking for love in middle age.