Normal mapping for WebGL learning

See the demo: Texture mapping, for a practical look.

To add extra detail and realism, we have used diffuse maps and specular maps, both of which apply textures to triangles. From the point of view of the light, however, it is the surface normal that makes a surface look flat and smooth: the lighting algorithm determines the shape of a surface from one thing only, the normal vector perpendicular to it. A brick surface has a single normal, so it is lit in a perfectly uniform way according to that one vector. What happens if each fragment uses a different normal? Then we can vary the normal according to the fine details of the surface, which creates the illusion of a far more complex surface:

With each fragment using its own normal, we can convince the lighting that the surface consists of many tiny planes (each perpendicular to its own normal), and the apparent detail of the object's surface improves dramatically. This technique of using a per-fragment normal instead of a single normal for the whole surface is called normal mapping or bump mapping.

The normal maps shown here are all from the related LearnOpenGL CN articles. For more in-depth knowledge than the WebGL material offers, you can read up on OpenGL; fortunately the principles are basically the same, since WebGL1 is based on OpenGL ES 2.0 and WebGL2 on OpenGL ES 3.0.

Normal map

A normal map stores the x, y, and z components of normal vectors in the R, G, and B channels of a texture. These normals live in tangent space, whose axes are T (tangent), B (bitangent), and N (normal), together forming the TBN coordinate system. Since color channels have the range [0, 1] while normal components range over [-1, 1], the values read from a normal map must be converted before use.
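To make the TBN idea concrete, here is a small JavaScript sketch (the function name and the basis vectors are mine, for illustration) that applies a TBN basis to a tangent-space normal, carrying it into world space:

```javascript
// Multiply a tangent-space normal n by the TBN basis to get a world-space normal.
// T, B, N are the world-space tangent, bitangent, and normal of the surface.
function tbnTransform(T, B, N, n) {
  // world = n.x * T + n.y * B + n.z * N (T, B, N are the columns of the TBN matrix)
  return [
    n[0] * T[0] + n[1] * B[0] + n[2] * N[0],
    n[0] * T[1] + n[1] * B[1] + n[2] * N[1],
    n[0] * T[2] + n[1] * B[2] + n[2] * N[2],
  ];
}

// For a surface lying in the XY plane the TBN basis is the identity,
// so the "straight up" tangent-space normal (0, 0, 1) stays (0, 0, 1).
const world = tbnTransform([1, 0, 0], [0, 1, 0], [0, 0, 1], [0, 0, 1]);
```

Note that the unperturbed tangent-space normal (0, 0, 1) always maps to the surface's own normal N, whatever the orientation of the surface.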

// Sample the normal map, then convert from the [0,1] color range to [-1,1]
vec3 normal = texture2D(u_normMap, v_texcoord).rgb;
normal = normalize(normal * 2.0 - 1.0);

Normal maps look bluish because most normals point along the positive Z axis, (0, 0, 1), which maps to the blue channel of RGB. Where the normal tilts away from the Z axis the color shifts slightly, and that variation is what conveys depth. For example, areas whose normals lean toward the positive Y axis, (0, 1, 0), take on a greenish tint, since Y maps to the green channel.
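As a quick sanity check of the [0, 1] to [-1, 1] conversion, here is the same unpacking in plain JavaScript (a sketch; the 8-bit texel values are illustrative):

```javascript
// Unpack an 8-bit RGB texel into a unit normal vector with components in [-1, 1].
function unpackNormal(r, g, b) {
  const v = [r / 255, g / 255, b / 255].map((c) => c * 2 - 1);
  const len = Math.hypot(v[0], v[1], v[2]);
  return v.map((c) => c / len); // normalize
}

// The typical "flat" normal-map color (128, 128, 255) decodes to
// almost exactly (0, 0, 1): straight along +Z, hence the blue tint.
const n = unpackNormal(128, 128, 255);
```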

The advantage of normal mapping is that it lets a low-polygon model display very fine detail, so that it looks like a high-polygon model.

A simple mesh of 500 triangles plus a normal map can look as good as a finely modeled mesh of 4 million triangles. That is a huge advantage, because the cost of processing 4 million triangles would be enormous.

But normal mapping is not a panacea; it has its drawbacks. Because it only changes how light is calculated across the surface, it is not suitable for surfaces with large bumps: real bumps occlude and shadow one another, an effect normal mapping cannot reproduce.

The article Normal Mapping gives a very detailed explanation of the principle and the derivation of the tangent vectors, arriving at the resulting formula, so I won't repeat it here.

Shader

Here I use a more convenient algorithm that achieves the same effect. The idea is to use the screen-space derivative functions (dFdx/dFdy) to get the rate of change of the height value at each interpolated fragment; see the function dHdxy_fwd. Taking cross products (cross) with the normal of the current face then yields vectors perpendicular to both directions; see the function perturbNormalArb, whose result is the perturbed normal we want. Very elegant. Here are the key GLSL built-in functions involved:

dFdx(p) // partial derivative of p in the x direction
dFdy(p) // partial derivative of p in the y direction
cross(p0, p1) // cross product of the vectors p0 and p1
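Conceptually, dFdx/dFdy just difference a value between neighboring fragments. Here is a hedged JavaScript sketch of the same idea on a height field (the ramp function and grid are mine, purely for illustration):

```javascript
// Approximate dFdx/dFdy of a height field h(x, y) by forward differences,
// which is essentially what the GPU does between adjacent fragments in a 2x2 quad.
function dFdx(h, x, y) { return h(x + 1, y) - h(x, y); }
function dFdy(h, x, y) { return h(x, y + 1) - h(x, y); }

// A simple ramp rising 0.1 per pixel in x and flat in y:
// its x-derivative is 0.1 everywhere, its y-derivative is 0.
const ramp = (x, y) => 0.1 * x;
```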

We use the derivative functions dFdx/dFdy; in WebGL1 this requires enabling the OES_standard_derivatives extension. The vertex shader does not need to change; the main changes are in the fragment shader. For details, see the following fragment shader code:

#extension GL_OES_standard_derivatives : enable // make sure this extension is enabled
// ...
uniform sampler2D u_diffMap;
uniform sampler2D u_specMap;
uniform sampler2D u_normMap;
uniform float bumpScale; // strength of the bump effect
varying vec2 v_texcoord;
varying vec3 v_position;
// ...
vec2 dHdxy_fwd() {
    vec2 dSTdx = dFdx( v_texcoord );
    vec2 dSTdy = dFdy( v_texcoord );
    float Hll = bumpScale * texture2D( u_normMap, v_texcoord ).x;
    float dBx = bumpScale * texture2D( u_normMap, v_texcoord + dSTdx ).x - Hll;
    float dBy = bumpScale * texture2D( u_normMap, v_texcoord + dSTdy ).x - Hll;
    return vec2( dBx, dBy );
}
vec3 perturbNormalArb( vec3 surf_pos, vec3 surf_norm, vec2 dHdxy ) {
    vec3 vSigmaX = vec3( dFdx( surf_pos.x ), dFdx( surf_pos.y ), dFdx( surf_pos.z ) );
    vec3 vSigmaY = vec3( dFdy( surf_pos.x ), dFdy( surf_pos.y ), dFdy( surf_pos.z ) );
    vec3 vN = surf_norm;
    vec3 R1 = cross( vSigmaY, vN );
    vec3 R2 = cross( vN, vSigmaX );
    float fDet = dot( vSigmaX, R1 );
    fDet *= ( float( gl_FrontFacing ) * 2.0 - 1.0 );
    vec3 vGrad = sign( fDet ) * ( dHdxy.x * R1 + dHdxy.y * R2 );
    return normalize( abs( fDet ) * surf_norm - vGrad );
}

// ...
// Compute the perturbed per-fragment normal from the bump map
normal = perturbNormalArb( -v_position, normal, dHdxy_fwd() );
// ...
// Total illumination
gl_FragColor = vec4(ambient + diffuse + specular, diffuseColor.a);
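To check the math in perturbNormalArb numerically, here is a direct JavaScript port (a sketch, not the demo's code; the front face is assumed, and the position derivatives are passed in explicitly since JS has no dFdx):

```javascript
// Small vector helpers mirroring the GLSL built-ins.
const cross = (a, b) => [
  a[1] * b[2] - a[2] * b[1],
  a[2] * b[0] - a[0] * b[2],
  a[0] * b[1] - a[1] * b[0],
];
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const normalize = (a) => {
  const l = Math.hypot(a[0], a[1], a[2]);
  return a.map((c) => c / l);
};

// vSigmaX/vSigmaY: dFdx/dFdy of the surface position;
// surfNorm: face normal; dHdxy: height derivatives from dHdxy_fwd.
function perturbNormalArb(vSigmaX, vSigmaY, surfNorm, dHdxy) {
  const R1 = cross(vSigmaY, surfNorm);
  const R2 = cross(surfNorm, vSigmaX);
  const fDet = dot(vSigmaX, R1); // gl_FrontFacing assumed true
  const s = Math.sign(fDet);
  const vGrad = R1.map((c, i) => s * (dHdxy[0] * c + dHdxy[1] * R2[i]));
  return normalize(surfNorm.map((c, i) => Math.abs(fDet) * c - vGrad[i]));
}

// A flat height field (zero gradient) leaves the normal unchanged,
// while a slope rising in +x tilts the normal toward -x, as expected.
const flat = perturbNormalArb([1, 0, 0], [0, 1, 0], [0, 0, 1], [0, 0]);
const tilted = perturbNormalArb([1, 0, 0], [0, 1, 0], [0, 0, 1], [0.5, 0]);
```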

Finally, see Demo: Texture mapping
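One last practical note: on the JavaScript side, a WebGL1 program must also request the derivatives extension before the fragment shader above will compile. A minimal sketch (the helper name is mine):

```javascript
// Request OES_standard_derivatives on a WebGL1 context.
// Returns true if dFdx/dFdy are available to fragment shaders.
function enableDerivatives(gl) {
  // WebGL2 has derivatives built in, so no extension is needed there.
  const ext = gl.getExtension("OES_standard_derivatives");
  return ext !== null;
}
```

With the extension enabled, the #extension GL_OES_standard_derivatives : enable line at the top of the fragment shader becomes valid.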

Afterword

LearnOpenGL CN