An overview of OpenGL

OpenGL

OpenGL is a software interface to graphics hardware for rendering 2D and 3D vector graphics. Essentially, it is a library of 3D graphics and modeling functions that is highly portable and renders very quickly. OpenGL is not a language but a library of C runtime functions; it provides prepackaged functionality to help developers write powerful three-dimensional applications. OpenGL runs on a variety of operating systems, such as Windows, UNIX/Linux, Mac OS, and OS/2. Today, OpenGL is widely used in games, medical imaging, geographic information systems, weather simulation, and other fields; it is the industry standard for high-performance graphics and interactive scene processing. Efficient implementations of OpenGL (using hardware graphics acceleration) exist on Windows, many UNIX platforms, and Mac OS. These implementations are typically provided by the display device vendor and rely heavily on that vendor's hardware.

OpenGL ES and WebGL

OpenGL ES (OpenGL for Embedded Systems) is a subset of the OpenGL 3D graphics API, designed for embedded devices such as mobile phones, PDAs, and game consoles.

WebGL (short for Web Graphics Library) is a 3D drawing standard that combines JavaScript with OpenGL ES. By adding JavaScript bindings for OpenGL ES, WebGL provides hardware-accelerated 3D rendering for the HTML5 Canvas, so Web developers can use the system's graphics card to display 3D scenes and models smoothly in the browser and to create complex navigation and data visualizations.

The history of OpenGL

OpenGL is an open standard. Although it was pioneered by SGI, the standard is not controlled by SGI but by the OpenGL Architecture Review Board (ARB). The ARB was founded in 1992 by SGI, DEC, IBM, Intel, Microsoft, and other well-known companies, and was later joined by graphics-chip giants such as NVIDIA and ATI. The ARB met regularly to maintain and improve the OpenGL specification and to plan upgrades to the standard, so that OpenGL keeps pace with the times.

In 2006, SGI transferred control of the OpenGL standard from the ARB to a new working group: the Khronos Group (www.khronos.org). Khronos is a member-funded industry consortium focused on the creation and maintenance of open media standards.

Software installation

Before we can learn OpenGL, we need to configure the OpenGL software environment.

IDE

Our OpenGL development environment is Visual Studio. You can download the latest version of Visual Studio from its official website.

GLFW

OpenGL is a graphics library; to draw anything, you need a window. Unfortunately, OpenGL itself does not provide any way to create windows, so you must create one yourself. Window creation is different on every operating system (and fairly cumbersome on Windows), so for convenience we use a windowing library to simplify the process. Common OpenGL windowing libraries include GLUT, GLFW, and SDL. Here we choose GLFW, which is very widely used.

Visual Studio's gl.h only supports OpenGL up to version 1.1, while we use OpenGL 3.3. However, OpenGL is implemented by the graphics card driver, and the driver already provides the OpenGL functions we need, so the function addresses have to be obtained dynamically at run time. Under Windows, using glGenBuffers as an example, it looks something like this:

#include <windows.h>
#include <GL/gl.h>
...
// define the functions' prototypes
typedef void * (*WGLGETPROCADDRESS)(const char *);
typedef void (*GLGENBUFFERS)(GLsizei, GLuint *);
// load opengl32.dll and query wglGetProcAddress' address
HMODULE hDll = LoadLibrary("opengl32.dll");
WGLGETPROCADDRESS wglGetProcAddress = (WGLGETPROCADDRESS)GetProcAddress(hDll, "wglGetProcAddress");
// query OpenGL functions' addresses
GLGENBUFFERS glGenBuffers = (GLGENBUFFERS)wglGetProcAddress("glGenBuffers");

// now the function can be used as normal
GLuint vbo;
glGenBuffers(1, &vbo);

GLFW can be downloaded from its official website; you can either download the prebuilt binaries directly or compile it yourself with CMake. If you build it with CMake yourself, you can refer to the article GLFW Environment Configuration: Creating a Window.

If you download the precompiled binaries, unzip the archive and you will find an include folder and several lib-xxxx folders (where xxxx is the compiler name). The include folder contains a GLFW folder, which holds glfw3.h (and glfw3native.h).

For detailed documentation, please refer to the official introduction or download the source code package directly from the download page of the official GLFW website.
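To verify that GLFW is installed and linked correctly, a minimal sketch like the following (window size and title are arbitrary) creates a window with an OpenGL context and clears it every frame:

#include <GLFW/glfw3.h>

int main() {
    // Initialize the GLFW library
    if (!glfwInit())
        return -1;

    // Create a window and its OpenGL context
    GLFWwindow *window = glfwCreateWindow(640, 480, "Hello GLFW", NULL, NULL);
    if (!window) {
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window);

    // Render loop: clear the screen until the window is closed
    while (!glfwWindowShouldClose(window)) {
        glClearColor(0.1f, 0.1f, 0.4f, 0.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}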

OpenGL basics

Data type and function name

OpenGL's data types are defined to be consistent with those of other languages, but under ANSI C it is recommended to use the OpenGL data types listed below, such as GLint and GLfloat.

| Prefix | Data type | Corresponding C type | OpenGL type |
| --- | --- | --- | --- |
| b | 8-bit integer | signed char | GLbyte |
| s | 16-bit integer | short | GLshort |
| i | 32-bit integer | long | GLint, GLsizei |
| f | 32-bit floating point | float | GLfloat, GLclampf |
| d | 64-bit floating point | double | GLdouble, GLclampd |
| ub | 8-bit unsigned integer | unsigned char | GLubyte, GLboolean |
| us | 16-bit unsigned integer | unsigned short | GLushort |
| ui | 32-bit unsigned integer | unsigned long | GLuint, GLenum, GLbitfield |

As the table shows, OpenGL library functions are named in a very regular way; once you understand the convention, programs become easier to read and write. First, every library function carries the prefix gl, glu, glX, or aux, indicating that it belongs to the core library, the utility library, the X Window extension library, or the auxiliary library. For example:

glVertex2i(2, 4);
glVertex3f(2.0, 4.0, 5.0);

As shown above, function names carry the suffix 2, 3, or 4, where 2 means two-dimensional, 3 means three-dimensional, and 4 means four components (homogeneous coordinates for vertices, or an alpha component for colors).

In addition, some OpenGL function names end with the letter v, indicating that the function takes a pointer to a vector (or array) rather than a series of individual argument values. The following two forms are equivalent ways of setting the current color to red.

glColor3f(1.0, 0.0, 0.0);

GLfloat color_array[] = {1.0, 0.0, 0.0};
glColor3fv(color_array);

Beyond these basic names, there is also a notation with an asterisk: glColor*(), for example, stands for all the variants of the function that sets the current color. Similarly, glVertex*v() stands for the whole family of functions that define a vertex from a pointer to a vector of any type.

Example

For example, consider the following program, which is typically the first example a beginner studies. The source code is as follows:

// main.cpp
// opengl_progress_struct
#include <GLUT/GLUT.h>
#include <OpenGL/OpenGL.h>

// Initialization
void init() {
    glClearColor(0.1, 0.1, 0.4, 0.0);
    glShadeModel(GL_SMOOTH);
}

// Draw callback
void display() {
    // clear the previous frame
    glClear(GL_COLOR_BUFFER_BIT);
    // Triangle
    glBegin(GL_TRIANGLES);
    glColor3f(1, 0, 0);
    glVertex3f(-1, -1, -5);
    glColor3f(0, 1, 0);
    glVertex3f(1, -1, -5);
    glColor3f(0, 0, 1);
    glVertex3f(0, 1, -5);
    glEnd();
    // execute drawing commands
    glFlush();
}

// Window resize callback
void reshape(int w, int h) {
    glViewport(0, 0, w, h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, (GLfloat)w/(GLfloat)h, 0.1, 100000.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}

int main(int argc, const char * argv[]) {
    // Initialize display mode
    glutInit(&argc, const_cast<char **>(argv));
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    // Initialize window
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(100, 100);
    glutCreateWindow(argv[0]);
    init();
    glutReshapeFunc(reshape);
    glutDisplayFunc(display);
    // Start the main loop
    glutMainLoop();
    return 0;
}

The running effect is as follows:

Geometric drawing

In the spatial Cartesian coordinate system, any point can be represented by a three-dimensional coordinate [x, y, z]. If the point is instead represented by a four-dimensional vector [Hx, Hy, Hz, H], this is called the homogeneous coordinate representation; the last component, H, is called the scaling factor. In OpenGL, all two-dimensional points are treated as three-dimensional points, and all points are described by homogeneous coordinates, i.e. uniformly treated as three-dimensional homogeneous points. Each homogeneous point is represented by a vector (x, y, z, w) in which not all four elements are zero. Homogeneous points have the following properties:

1) If the real number a is non-zero, (x, y, z, w) and (ax, ay, az, aw) represent the same point, just as x/y = (ax)/(ay).
2) The homogeneous coordinates of the three-dimensional point (x, y, z) are (x, y, z, 1.0), and the homogeneous coordinates of the two-dimensional point (x, y) are (x, y, 0.0, 1.0).
3) When w is non-zero, the homogeneous point (x, y, z, w) corresponds to the three-dimensional point (x/w, y/w, z/w); when w is zero, the homogeneous point (x, y, z, 0.0) represents a point at infinity in some direction.

Note: OpenGL requires w to be greater than or equal to 0.0.
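As a small illustration (plain C++, not an OpenGL call), the conversion from a homogeneous point to a Cartesian point can be sketched like this:

// Convert a homogeneous point (x, y, z, w), w != 0, to the Cartesian point (x/w, y/w, z/w)
void homogeneousToCartesian(float x, float y, float z, float w,
                            float &outX, float &outY, float &outZ) {
    outX = x / w;
    outY = y / w;
    outZ = z / w;
}

// Example: (2.4, 1.0, -2.2, 2.0) corresponds to the 3D point (1.2, 0.5, -1.1)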

The geometry

In geometric drawing, several concepts are involved:

point

Points represented by floating-point values are called vertices. All vertices are treated as three-dimensional points when processed internally by OpenGL; points defined with two-dimensional coordinates (x, y) get a default z value of 0. All vertex coordinates are represented by homogeneous coordinates (x, y, z, w). If w is not 0.0, these homogeneous coordinates correspond to the three-dimensional point (x/w, y/w, z/w). Programmers can specify the w value themselves, but they rarely do; by default, w is 1.0.

line

In OpenGL, lines mean line segments, not the mathematical lines that extend indefinitely in both directions. A line here consists of a series of connected vertices and may be either closed or open.

polygon

A polygon in OpenGL is a closed region bounded by a series of line segments connected in sequence. The segments must not cross, the region must not contain holes, and the polygon must be convex; otherwise it will not be accepted by the OpenGL functions.

Drawing primitives

Define the vertices

In OpenGL, all geometric objects are ultimately described by an ordered set of vertices. The function glVertex{234}{sifd}[v](TYPE coords) defines a vertex with two-dimensional, three-dimensional, or homogeneous coordinates. For example:

glVertex2s(2, 3);
glVertex3d(0.0, 1.0, 3.1415926535);
glVertex4f(2.4, 1.0, -2.2, 2.0);
GLfloat pp[3] = {5.0, 2.0, 10.2};
glVertex3fv(pp);

The first example defines the spatial vertex (2, 3, 0); the second defines a vertex with double-precision floating-point coordinates; the third defines a vertex with homogeneous coordinates whose actual coordinates are (1.2, 0.5, -1.1); and the last defines a vertex from a pointer (or array).

The geometric primitives

In practice, a geometric primitive is usually defined by a sequence of related vertices organized in a certain way, rather than by defining many vertices individually. In OpenGL, all vertex definitions must be placed between glBegin() and glEnd() to correctly describe a geometric primitive or object; otherwise glVertex*() does nothing. For example:

glBegin(GL_POLYGON);
glVertex2f(0.0, 0.0);
glVertex2f(0.0, 3.0);
glVertex2f(3.0, 3.0);
glVertex2f(4.0, 1.5);
glVertex2f(3.0, 0.0);
glEnd();

This program defines a polygon with five vertices. If you change GL_POLYGON in glBegin() to GL_POINTS, the figure becomes a set of five points instead.

Primitive types

The function glBegin(GLenum mode) marks the beginning of the vertex list that describes a geometric primitive; its mode parameter indicates the type of primitive. All types and their descriptions are shown in the following table:

| Type | Description |
| --- | --- |
| GL_POINTS | A set of individual points |
| GL_LINES | Multiple independent two-vertex line segments |
| GL_POLYGON | A single, simple, filled convex polygon |
| GL_TRIANGLES | Multiple independent filled triangles |
| GL_QUADS | Multiple independent filled quadrilaterals |
| GL_LINE_STRIP | An open (non-closed) polyline |
| GL_LINE_LOOP | A closed polyline |
| GL_TRIANGLE_STRIP | A strip of connected filled triangles |
| GL_TRIANGLE_FAN | A fan of connected filled triangles |
| GL_QUAD_STRIP | A strip of connected filled quadrilaterals |
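For example, the difference between an open and a closed polyline can be sketched with the same three (illustrative) vertices:

// GL_LINE_STRIP: two segments, the last vertex is not joined back to the first
glBegin(GL_LINE_STRIP);
glVertex2f(0.0, 0.0);
glVertex2f(1.0, 1.0);
glVertex2f(2.0, 0.0);
glEnd();

// GL_LINE_LOOP: the same vertices, plus a closing segment from the last vertex back to the first
glBegin(GL_LINE_LOOP);
glVertex2f(0.0, 0.0);
glVertex2f(1.0, 1.0);
glVertex2f(2.0, 0.0);
glEnd();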

The most important information between glBegin() and glEnd() is the vertices defined by glVertex*(). If necessary, each vertex can also be given a color, a normal, texture coordinates, or other attributes by calling the functions listed in the following table.

| Function | Description |
| --- | --- |
| glVertex*() | Sets vertex coordinates |
| glColor*() | Sets the current color |
| glIndex*() | Sets the current color index |
| glNormal*() | Sets the normal vector |
| glCallList(), glCallLists() | Executes display list(s) |
| glTexCoord*() | Sets texture coordinates |
| glEdgeFlag*() | Controls the drawing of edges |
| glMaterial*() | Sets the material |
glBegin(GL_POINTS);
glColor3f(1.0, 0.0, 0.0);   /* red */
glVertex(...);
glColor3f(0.0, 1.0, 0.0);   /* green */
glVertex(...);
glColor3f(0.0, 0.0, 1.0);   /* blue */
glVertex(...);
glEnd();

Example

To better understand OpenGL geometry drawing, let’s look at a comprehensive example.

#include <GLUT/GLUT.h>
#include <OpenGL/OpenGL.h>

// Initialization
void init() {
    glClearColor(0.1, 0.1, 0.4, 0.0);
    glShadeModel(GL_SMOOTH);
}

void DrawMyObjects(void) {
    /* draw some points */
    glBegin(GL_POINTS);
    glColor3f(1.0, 0.0, 0.0);
    glVertex2f(10.0, 11.0);
    glColor3f(1.0, 1.0, 0.0);
    glVertex2f(9.0, 10.0);
    glColor3f(0.0, 1.0, 1.0);
    glVertex2f(8.0, 12.0);
    glEnd();

    /* draw some line segments */
    glBegin(GL_LINES);
    glColor3f(1.0, 1.0, 0.0);
    glVertex2f(11.0, 8.0);
    glVertex2f(7.0, 7.0);
    glColor3f(1.0, 0.0, 1.0);
    glVertex2f(11.0, 9.0);
    glVertex2f(8.0, 6.0);
    glEnd();

    /* draw one open polyline */
    glBegin(GL_LINE_STRIP);
    glColor3f(0.0, 1.0, 0.0);
    glVertex2f(3.0, 9.0);
    glVertex2f(2.0, 6.0);
    glVertex2f(3.0, 8.0);
    glVertex2f(2.5, 6.5);
    glEnd();

    /* draw one closed polyline */
    glBegin(GL_LINE_LOOP);
    glColor3f(0.0, 1.0, 1.0);
    glVertex2f(7.0, 7.0);
    glVertex2f(8.0, 8.0);
    glVertex2f(9.0, 6.5);
    glVertex2f(10.3, 7.5);
    glVertex2f(11.5, 6.0);
    glVertex2f(7.5, 6.0);
    glEnd();

    /* draw one filled polygon */
    glBegin(GL_POLYGON);
    glColor3f(0.5, 0.3, 0.7);
    glVertex2f(7.0, 2.0);
    glVertex2f(8.0, 3.0);
    glVertex2f(10.3, 0.5);
    glVertex2f(7.5, 2.0);
    glVertex2f(6.0, 1.0);
    glEnd();

    /* draw some filled quadrilaterals */
    glBegin(GL_QUADS);
    glColor3f(0.7, 0.5, 0.2);
    glVertex2f(0.0, 2.0);
    glVertex2f(1.0, 3.0);
    glVertex2f(3.3, 0.5);
    glVertex2f(0.5, 1.0);
    glColor3f(0.5, 0.7, 0.2);
    glVertex2f(3.0, 2.0);
    glVertex2f(2.0, 3.0);
    glVertex2f(0.0, 0.5);
    glVertex2f(2.5, 1.0);
    glEnd();

    /* draw some filled strip quadrilaterals */
    glBegin(GL_QUAD_STRIP);
    glVertex2f(6.0, 2.0);
    glVertex2f(5.5, 1.0);
    glVertex2f(8.0, 1.0);
    glColor3f(0.8, 0.0, 0.0);
    glVertex2f(9.0, 2.0);
    glVertex2f(11.0, 2.0);
    glColor3f(0.0, 0.0, 0.8);
    glVertex2f(11.0, 2.0);
    glVertex2f(13.0, 1.0);
    glColor3f(0.0, 0.8, 0.0);
    glVertex2f(14.0, 1.0);
    glEnd();

    /* draw some filled triangles */
    glBegin(GL_TRIANGLES);
    glColor3f(0.2, 0.5, 0.7);
    glVertex2f(10.0, 5.0);
    glVertex2f(12.3, 7.5);
    glVertex2f(8.5, 6.0);
    glColor3f(0.2, 0.7, 0.5);
    glVertex2f(8.0, 7.0);
    glVertex2f(7.0, 4.5);
    glVertex2f(5.5, 9.0);
    glEnd();

    /* draw some filled strip triangles */
    glBegin(GL_TRIANGLE_STRIP);
    glVertex2f(1.0, 8.0);
    glVertex2f(2.5, 5.0);
    glColor3f(0.8, 0.8, 0.0);
    glVertex2f(1.0, 7.0);
    glColor3f(0.0, 0.8, 0.8);
    glVertex2f(2.0, 4.0);
    glColor3f(0.8, 0.0, 0.8);
    glVertex2f(4.0, 6.0);
    glEnd();

    /* draw some filled fan triangles */
    glBegin(GL_TRIANGLE_FAN);
    glVertex2f(8.0, 6.0);
    glVertex2f(10.0, 3.0);
    glColor3f(0.8, 0.2, 0.5);
    glVertex2f(12.5, 4.5);
    glColor3f(0.2, 0.5, 0.8);
    glVertex2f(13.0, 7.5);
    glColor3f(0.8, 0.5, 0.2);
    glVertex2f(10.5, 9.0);
    glEnd();
}

// Draw callback
void display() {
    // clear the previous frame
    glClear(GL_COLOR_BUFFER_BIT);
    DrawMyObjects();
    // execute drawing commands
    glFlush();
}

// Window resize callback
void reshape(int w, int h) {
    glViewport(0, 0, w, h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, (GLfloat)w/(GLfloat)h, 0.1, 100000.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0, 0, 25, 0, 0, -1, 0, 1, 0);
}

int main(int argc, const char * argv[]) {
    // Initialize display mode
    glutInit(&argc, const_cast<char **>(argv));
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    // Initialize window
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(100, 100);
    glutCreateWindow(argv[0]);
    init();
    glutReshapeFunc(reshape);
    glutDisplayFunc(display);
    // Start the main loop
    glutMainLoop();
    return 0;
}

The operation effect is shown as follows:

Coordinate system and coordinate transformation

Right hand coordinate system

OpenGL uses a right-handed coordinate system. For the difference between left-handed and right-handed coordinate systems, refer to the following figure.

Coordinate space

OpenGL space is divided into:

  • Local Space (or Object Space)
  • World Space
  • View Space (or Eye Space)
  • Clip Space
  • Screen Space

Local space

Local space refers to the coordinate space where the object is located, that is, where the object was originally located. Imagine you create a cube in a modeling program. The cube you create may have an origin at (0, 0, 0), even though it may end up in a completely different position in the program. It’s even possible that all models you create start with (0, 0, 0). So, all the vertices of your model are in local space, and they’re all local relative to your object.

The world space

If we import all of our objects into the program, they would probably all be crowded around the world origin (0, 0, 0), which is not what we want. We want to define a position for each object so that we can place them in a larger world. Coordinates in world space are exactly what the name suggests: the coordinates of the vertices relative to the world. This is the space you want your objects transformed into, so that they are scattered around the world (preferably in a realistic way). Object coordinates are transformed from local space to world space; this is done with the Model Matrix. The model matrix is a transformation matrix that moves, scales, and/or rotates an object to the position and orientation it should have. Think of it as transforming a house: you scale it down (it was too large in local space), move it to a small town in the suburbs, and rotate it a little to the left around the y axis so that it lines up with the neighboring houses. In the same way, a model matrix can be used to place multiple copies of an object at different positions in the scene/world.

The observation space

Observation Space is often referred to as OpenGL’s Camera (so it is sometimes called Camera Space or Eye Space). Observation space is the result of transforming the world space coordinates into the coordinates in front of the user’s field of vision. So the space of observation is the space observed from the point of view of the camera. This is usually done by a series of shifts and rotations, panning/rotating the scene so that a particular object is shifted in front of the camera. These combined transformations are usually stored in a View Matrix, which is used to transform the world coordinates into the View space.

Cut out space

At the end of a vertex shader run, OpenGL expects all coordinates to fall within a specific range; any point outside this range is clipped. Clipped coordinates are discarded, and the remaining coordinates become the fragments visible on the screen. This is where the name Clip Space comes from.

Because it is not intuitive to specify all visible coordinates in the range -1.0 to 1.0, we specify our own coordinate set and transform it back to normalized device coordinates, as OpenGL expects. To transform vertex coordinates from view space to clip space, we define a Projection Matrix that specifies a range of coordinates, for example -1000 to 1000 in each dimension. The projection matrix then maps coordinates within this range to normalized device coordinates (-1.0, 1.0); any coordinate outside the range is not mapped into -1.0 to 1.0 and is therefore clipped. With the range specified above, the coordinate (1250, 500, 750) would not be visible, because its x coordinate is out of range and is converted to a normalized device coordinate greater than 1.0.

If only part of a primitive, such as a triangle, lies outside the Clipping Volume, OpenGL reconstructs the triangle as one or more triangles that fit inside the clipping range. The viewing box created by the projection matrix is called a Frustum, and every coordinate that ends up inside the frustum will appear on the user's screen. The whole process of converting coordinates within a specified range to normalized device coordinates (which can then easily be mapped to 2D view-space coordinates) is called Projection, because the projection matrix projects 3D coordinates into the normalized device coordinate system that maps easily to 2D.

Screen space

The final coordinates are mapped to screen space (using the settings given to glViewport) and turned into fragments.

Space transformation

To transform coordinates from one space to the next, we use several transformation matrices, the most important being the Model, View, and Projection matrices. Vertex coordinates start in Local Space as Local Coordinates, are transformed into World Coordinates, then View Coordinates and Clip Coordinates, and finally end up as Screen Coordinates.

The following diagram illustrates the specific process and results of the spatial transformation process.

The relevant API

The APIs related to these spatial transformations are:

Transformation of model matrix

void glTranslate{fd}(TYPE x,TYPE y,TYPE z)
void glRotate{fd}(TYPE angle,TYPE x,TYPE y,TYPE z)
void glScale{fd}(TYPE x,TYPE y,TYPE z)
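For instance, a short sketch (the values are illustrative) that combines these calls with glPushMatrix()/glPopMatrix() so the transformation only affects one object:

glMatrixMode(GL_MODELVIEW);
glPushMatrix();                       // save the current model-view matrix
glTranslatef(2.0f, 0.0f, -5.0f);      // move the object
glRotatef(45.0f, 0.0f, 1.0f, 0.0f);   // rotate 45 degrees around the y axis
glScalef(0.5f, 0.5f, 0.5f);           // scale it to half size
// ... draw the object here ...
glPopMatrix();                        // restore the matrix for the next object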

View matrix transformation

void gluLookAt(GLdouble eyex,GLdouble eyey,GLdouble eyez,GLdouble centerx,GLdouble centery,GLdouble centerz,GLdouble upx,GLdouble upy,GLdouble upz);

Projection transformation

void glOrtho(GLdouble left,GLdouble right,GLdouble bottom,GLdouble top, GLdouble near,GLdouble far)
void gluOrtho2D(GLdouble left,GLdouble right,GLdouble bottom,GLdouble top)
void glFrustum(GLdouble left,GLdouble right,GLdouble bottom,GLdouble top, GLdouble near,GLdouble far);
void gluPerspective(GLdouble fovy,GLdouble aspect,GLdouble zNear, GLdouble zFar);
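As an illustration, a reshape callback might set up either kind of projection; this hedged sketch uses a perspective projection, with the orthographic alternative shown as a comment:

void reshape(int w, int h) {
    glViewport(0, 0, w, h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    // Perspective projection: 60-degree vertical field of view
    gluPerspective(60.0, (GLdouble)w / (GLdouble)h, 0.1, 1000.0);
    // For pure 2D drawing, an orthographic projection could be used instead:
    // gluOrtho2D(0.0, (GLdouble)w, 0.0, (GLdouble)h);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}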

Viewport transformation

void glViewport(GLint x,GLint y,GLsizei width, GLsizei height);

General transform

void glLoadMatrix{fd}(const TYPE *m)
void glMultMatrix{fd}(const TYPE *m)
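A minimal sketch of glMultMatrixf(): in the fixed-function pipeline the 16 values are read in column-major order, so the translation components occupy elements 12 to 14:

// A translation by (tx, ty, tz) written as a column-major 4x4 matrix
GLfloat tx = 1.0f, ty = 2.0f, tz = 3.0f;
GLfloat m[16] = {
    1.0f, 0.0f, 0.0f, 0.0f,   // column 0
    0.0f, 1.0f, 0.0f, 0.0f,   // column 1
    0.0f, 0.0f, 1.0f, 0.0f,   // column 2
    tx,   ty,   tz,   1.0f    // column 3: translation
};
glMatrixMode(GL_MODELVIEW);
glMultMatrixf(m);             // has the same effect as glTranslatef(tx, ty, tz)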

OpenGL texture

Texture Mapping is widely used in 3D graphics, especially to describe objects with realistic sense. For example, to draw a brick wall, you can use a real brick wall image or photo as a texture on a rectangle, so that a realistic brick wall is drawn. If texture mapping is not used, each brick on the wall must be drawn as a separate polygon. In addition, texture mapping ensures that as polygons are transformed, the texture pattern on the polygon also changes. For example, when a wall is viewed through a perspective projection, the size of the bricks that are farther from the point of view will shrink, while those closer to the point of view will grow. In addition, texture mapping is also often used in other fields. For example, in flight simulation, images of a large area of vegetation are often mapped to some large polygons to represent the ground, or images of marble, wood, cloth and other natural substances are used as texture mapping to represent corresponding objects on polygons.

Texture classification

According to the use scene and form of texture, texture is mainly divided into the following categories:

  • One-dimensional textures: used, for example, when all variation in a textured strip occurs along a single direction. A one-dimensional texture behaves like a two-dimensional texture with a height of 1.
  • Two-dimensional textures: the easiest to understand and the most common, with horizontal and vertical texture coordinates; an ordinary image can usually be used directly as a two-dimensional texture.
  • Three-dimensional textures: most commonly used for rendering in the medical and geoscience fields. In medical applications, 3D textures can represent a series of computed tomography (CT) or magnetic resonance imaging (MRI) slices; for oil and gas researchers, they can model rock strata. A 3D texture can be regarded as a stack of rectangular 2D sub-images.
  • Sphere textures, also known as environment textures: intended for rendering perfectly reflective objects, whose surfaces reflect their surroundings toward the viewer's eye.
  • Cube textures: a special texture technique that uses six two-dimensional texture images to form a texture cube centered at the origin. Cube textures are well suited to environment, reflection, and lighting effects.
  • Multi-texturing: allows several textures to be applied, one at a time, to the same polygon in the texturing pipeline.
  • ...

Texture definition

One dimensional texture

void glTexImage1D(GLenum target,GLint level,GLint components,GLsizei width, GLint border,GLenum format,GLenum type,const GLvoid *pixels);

Defines a one-dimensional texture map. All parameters have the same meaning as in glTexImage2D(), except that the first parameter target must be GL_TEXTURE_1D and the texture image is a one-dimensional array of texels. The width must be a power of 2, or 2^m + 2 if the texture has a border.
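A minimal sketch (the gradient data here is made up) of defining and enabling a one-dimensional texture:

// Build a simple 64-texel RGB gradient and upload it as a 1D texture
GLubyte stripe[64][3];
for (int i = 0; i < 64; i++) {
    stripe[i][0] = (GLubyte)(i * 4);          // red ramps up
    stripe[i][1] = 0;
    stripe[i][2] = (GLubyte)(255 - i * 4);    // blue ramps down
}
glTexImage1D(GL_TEXTURE_1D, 0, GL_RGB, 64, 0, GL_RGB, GL_UNSIGNED_BYTE, stripe);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glEnable(GL_TEXTURE_1D);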

Two-dimensional texture

void glTexImage2D(GLenum target,GLint level,GLint components,GLsizei width, GLsizei height,GLint border, GLenum format,GLenum type, const GLvoid *pixels);

Defines a two-dimensional texture map. The parameter target is the constant GL_TEXTURE_2D. The parameter level indicates the level of detail for multi-resolution texture images; if there is only one resolution, level is 0. The parameter components is an integer from 1 to 4 indicating which of the R, G, B, and A components are selected: 1 selects R; 2 selects R and A; 3 selects R, G, and B; and 4 selects R, G, B, and A. The width and height parameters give the dimensions of the texture image, and border is the width of the texture border, usually 0. width and height must have the form 2^m + 2b, where m is an integer (the two dimensions may differ) and b is the border value. The maximum texture size depends on the OpenGL implementation, but it must be at least 64x64 (66x66 with borders). If width or height is set to 0, the texture map is effectively turned off.

The format and type parameters describe the format and data type of the texture image; they have the same meaning as in the glDrawPixels() function, and in fact the texture data has the same format as the data used by glDrawPixels(). The format parameter can be GL_COLOR_INDEX, GL_RGB, GL_RGBA, GL_RED, GL_GREEN, GL_BLUE, GL_ALPHA, GL_LUMINANCE, or GL_LUMINANCE_ALPHA (note: GL_STENCIL_INDEX and GL_DEPTH_COMPONENT are not allowed). Similarly, type can be GL_BYTE, GL_UNSIGNED_BYTE, GL_SHORT, GL_UNSIGNED_SHORT, GL_INT, GL_UNSIGNED_INT, GL_FLOAT, or GL_BITMAP. The parameter pixels contains the texture image data, which describes the texture image itself as well as its border.
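As a sketch (sizes and colors are arbitrary), a classic way to exercise glTexImage2D() without loading an image file is to generate a checkerboard in memory:

#define CHECK_SIZE 64
GLubyte checkImage[CHECK_SIZE][CHECK_SIZE][3];

// Fill a black-and-white checkerboard pattern and upload it as the current 2D texture
void loadCheckTexture(void) {
    for (int i = 0; i < CHECK_SIZE; i++) {
        for (int j = 0; j < CHECK_SIZE; j++) {
            GLubyte c = (((i & 0x8) == 0) ^ ((j & 0x8) == 0)) * 255;
            checkImage[i][j][0] = c;
            checkImage[i][j][1] = c;
            checkImage[i][j][2] = c;
        }
    }
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, CHECK_SIZE, CHECK_SIZE,
                 0, GL_RGB, GL_UNSIGNED_BYTE, checkImage);
}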

Texture control function

Texture control functions in OpenGL are as follows:

void glTexParameter{if}[v](GLenum target,GLenum pname,TYPE param);

The first parameter, target, can be GL_TEXTURE_1D or GL_TEXTURE_2D, specifying whether the parameter applies to a one-dimensional or a two-dimensional texture; the possible values of the last two parameters are shown in the table below.

| Parameter (pname) | Possible values |
| --- | --- |
| GL_TEXTURE_WRAP_S | GL_CLAMP, GL_REPEAT |
| GL_TEXTURE_WRAP_T | GL_CLAMP, GL_REPEAT |
| GL_TEXTURE_MAG_FILTER | GL_NEAREST, GL_LINEAR |
| GL_TEXTURE_MIN_FILTER | GL_NEAREST, GL_LINEAR, GL_NEAREST_MIPMAP_NEAREST, GL_NEAREST_MIPMAP_LINEAR, GL_LINEAR_MIPMAP_NEAREST, GL_LINEAR_MIPMAP_LINEAR |

In general, texture images are square or rectangular. But when a texture is mapped onto a polygon or surface and transformed into screen coordinates, individual texels rarely correspond exactly to screen pixels. Depending on the transformation and texture mapping used, a single screen pixel can correspond to a small fraction of a texel (magnification) or to a large number of texels (minification). glTexParameter*() specifies how magnification and minification are handled:

glTexParameter*(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameter*(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

The first argument can be GL_TEXTURE_1D or GL_TEXTURE_2D, indicating whether the texture being configured is one- or two-dimensional. The second parameter specifies which filter is being set: GL_TEXTURE_MAG_FILTER selects the magnification filter, and GL_TEXTURE_MIN_FILTER selects the minification filter. The third parameter selects the filtering method, as listed in the table above. With GL_NEAREST, the texel whose center is nearest to the pixel is used, which may cause aliasing; with GL_LINEAR, a weighted average of the four texels nearest to the pixel center is used. GL_NEAREST requires less computation than GL_LINEAR and is therefore faster, but GL_LINEAR produces a smoother result.

Texture coordinates can also fall outside the range [0, 1], and during texture mapping the texture can either be repeated or clamped. With repeating, the texture is tiled in the s and t directions. For example:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

Texture coordinates

When drawing a texture-mapped scene, you define not only geometric coordinates for each vertex but also texture coordinates. After the various transformations, the geometric coordinates determine where the vertex is drawn on the screen, while the texture coordinates determine which texel in the texture image is assigned to that vertex. Texture coordinates are interpolated between vertices in the same way as the smooth-shading interpolation described in the basics section. Texture images are square arrays, and texture coordinates can be defined in one, two, three, or four dimensions, called the s, t, r, and q coordinates to distinguish them from object coordinates (x, y, z, w) and other coordinates. One-dimensional textures use the s coordinate, and two-dimensional textures use the (s, t) coordinates; the r coordinate is currently ignored. The q coordinate, like w, normally has a value of 1 and is mainly used to establish homogeneous coordinates. The OpenGL texture-coordinate function is defined as follows:

void glTexCoord{1234}{sifd}[v](TYPE coords);

Sets the current texture coordinates; subsequent calls to glVertex*() assign the current texture coordinates to the vertices they produce. For glTexCoord1*(), the s coordinate is set to the given value, t and r are set to 0, and q is set to 1; glTexCoord2*() sets s and t, with r set to 0 and q set to 1; for glTexCoord3*(), q is set to 1 and the other coordinates are set to the given values; glTexCoord4*() sets all four coordinates. The TYPE of the coordinates is specified by the suffix (s, i, f, or d) and the corresponding type (GLshort, GLint, GLfloat, or GLdouble). Note: integer texture coordinates are used directly; they are not mapped to the range [-1, 1] the way normal components are.

Example

#include <GLUT/GLUT.h>
#include <OpenGL/OpenGL.h>
#include "BMPLoader.h"

GLuint tex2D;
GLfloat angle;

// Initialization: depth test, clear color, and texture setup
void init() {
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);
    glClearColor(0.1, 0.1, 0.4, 0.0);
    glShadeModel(GL_SMOOTH);

    // Load a BMP image to use as the texture
    CBMPLoader bmpLoader;
    bmpLoader.LoadBmp("/123-bmp.bmp");

    // Create the texture
    glGenTextures(1, &tex2D);
    glBindTexture(GL_TEXTURE_2D, tex2D);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, bmpLoader.imageWidth, bmpLoader.imageHeight,
                 0, GL_RGB, GL_UNSIGNED_BYTE, bmpLoader.image);

    angle = 0;
}

// Draw a textured cube
void DrawBox() {
    glEnable(GL_TEXTURE_2D);
    /* select the texture */
    glBindTexture(GL_TEXTURE_2D, tex2D);
    /* start drawing the quads */
    glBegin(GL_QUADS);

    // front face
    glNormal3f(0.0f, 0.0f, 1.0f);    /* normal points toward the viewer */
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f,  1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f( 1.0f, -1.0f,  1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f,  1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f(-1.0f,  1.0f,  1.0f);

    // back face
    glNormal3f(0.0f, 0.0f, -1.0f);   /* normal points away from the viewer */
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, -1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f, -1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, -1.0f);

    // top face
    glNormal3f(0.0f, 1.0f, 0.0f);    /* normal points up */
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f,  1.0f, -1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f,  1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f,  1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f,  1.0f, -1.0f);

    // bottom face
    glNormal3f(0.0f, -1.0f, 0.0f);   /* normal points down */
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, -1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f( 1.0f, -1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f, -1.0f,  1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f(-1.0f, -1.0f,  1.0f);

    // right face
    glNormal3f(1.0f, 0.0f, 0.0f);    /* normal points to the right */
    glTexCoord2f(0.0f, 0.0f); glVertex3f( 1.0f, -1.0f, -1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f( 1.0f,  1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f,  1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f,  1.0f);

    // left face
    glNormal3f(-1.0f, 0.0f, 0.0f);   /* normal points to the left */
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, -1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f, -1.0f,  1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f(-1.0f,  1.0f,  1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f(-1.0f,  1.0f, -1.0f);

    glEnd();
    glDisable(GL_TEXTURE_2D);
}

// Draw callback
void display() {
    // clear the previous frame (color and depth buffers)
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glPushMatrix();
    glTranslatef(0.0f, 0.0f, -5.0f);
    glRotated(angle, 1, 1, 0);
    DrawBox();
    glPopMatrix();
    // execute drawing commands
    glFlush();
    angle++;
    glutPostRedisplay();
}

// Window resize callback
void reshape(int w, int h) {
    glViewport(0, 0, w, h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, (GLfloat)w/(GLfloat)h, 0.1, 100000.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}

int main(int argc, const char * argv[]) {
    // Initialize display mode
    glutInit(&argc, const_cast<char **>(argv));
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB | GLUT_DEPTH);
    // Initialize window
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(100, 100);
    glutCreateWindow(argv[0]);
    init();
    glutReshapeFunc(reshape);
    glutDisplayFunc(display);
    // Start the main loop
    glutMainLoop();
    return 0;
}

The running effect is as follows:

OpenGL Lighting and materials

When light strikes a surface, three things can happen. First, light can be reflected from the object's surface back into space, producing reflected light. Second, for transparent objects, light can pass through the object and exit on the other side, producing transmitted light. Finally, some of the light is absorbed by the surface and converted into heat. Of these, only the transmitted and reflected light can reach the human eye and produce a visual effect.

The simple lighting model introduced here considers only the reflected light from the illuminated surface; it assumes that the surface is smooth, opaque, made of an ideal material, and lit by white light. Reflected light is generally divided into three components: ambient reflection, diffuse reflection, and specular reflection.

The ambient component assumes that incident light reaches the surface uniformly from the surrounding environment and is reflected equally in all directions; the light reflected onto an object from its surroundings (the ground, the sky, walls, and so on) is collectively called Ambient Light. The diffuse component is the light from a particular light source that the surface reflects uniformly in all directions; this is called Diffuse Light. Specular Light is light reflected in a particular direction: for example, when a point light illuminates a metal sphere, it produces an especially bright area on the sphere, the so-called highlight, which is the specular light generated by the source on the metal surface. For a smooth object, the specular highlight is small and bright; for a rough surface, the reflected light is more spread out, so the highlight region is larger and less bright.

Light components

In OpenGL's simple lighting model, light is classified into several components: Emitted Light, Ambient Light, Diffuse Light, and Specular Light.

  • Emitted (radiant) light is the simplest kind of light: it comes directly from an object and is unaffected by any light source.
  • Ambient light is light from a source that has been scattered so many times by the environment that its direction cannot be determined; it appears to come from all directions. Indoors there is usually a lot of ambient light; outdoors there is much less, because most of the light travels in one direction and few objects reflect it. When ambient light hits a surface, it is scattered equally in all directions (similar to the light of a shadowless lamp).
  • Diffuse light comes from one direction; it appears brighter when it strikes a surface head-on than when it hits at an angle. Once it hits the object, it is scattered evenly in all directions, so it looks the same from any viewpoint. Light coming from a particular position or direction generally has a diffuse component.
  • Specular light comes from one direction and bounces off in another. A parallel laser beam hitting a high-quality mirror produces nearly 100% specular reflection. Shiny metals and plastics have a high specular component, while materials such as chalk and carpet have almost none. In this sense, how shiny an object looks corresponds to the intensity (or brightness) of the specular light it reflects.

Create the light source

Light sources have many characteristics, such as color, position, and direction. Different characteristic values give the light source a different effect on objects; these will be introduced gradually in later sections. The function used to create a light source is glLight*():

void glLight{if}[v](GLenum light , GLenum pname, TYPE param)

Creates a light source with the given characteristics. The first parameter, light, specifies which light to create, such as GL_LIGHT0, GL_LIGHT1, ..., GL_LIGHT7. The second parameter, pname, specifies the light source property; its possible values are listed in the table below. The last parameter sets the value of that property.

| pname | Default value | Description |
| --- | --- | --- |
| GL_AMBIENT | (0.0, 0.0, 0.0, 1.0) | Ambient light intensity in RGBA mode |
| GL_DIFFUSE | (1.0, 1.0, 1.0, 1.0) | Diffuse light intensity in RGBA mode |
| GL_SPECULAR | (1.0, 1.0, 1.0, 1.0) | Specular light intensity in RGBA mode |
| GL_POSITION | (0.0, 0.0, 1.0, 0.0) | Homogeneous coordinates (x, y, z, w) of the light source position |
| GL_SPOT_DIRECTION | (0.0, 0.0, -1.0) | Direction vector (x, y, z) of a spotlight |
| GL_SPOT_EXPONENT | 0.0 | Spotlight concentration exponent |
| GL_SPOT_CUTOFF | 180.0 | Spotlight cutoff angle |
| GL_CONSTANT_ATTENUATION | 1.0 | Constant attenuation factor |
| GL_LINEAR_ATTENUATION | 0.0 | Linear attenuation factor |
| GL_QUADRATIC_ATTENUATION | 0.0 | Quadratic attenuation factor |
For example, to set the position of light GL_LIGHT0:

GLfloat light_position[] = {1.0, 1.0, 1.0, 0.0};
glLightfv(GL_LIGHT0, GL_POSITION, light_position);

Here light_position is a pointer to an array holding the homogeneous coordinates of the light position; the other light source properties keep their default values. Similarly, we can define the other characteristic values of the light source in the same way. For example:

GLfloat light_ambient[]  = {0.0, 0.0, 0.0, 1.0};
GLfloat light_diffuse[]  = {1.0, 1.0, 1.0, 1.0};
GLfloat light_specular[] = {1.0, 1.0, 1.0, 1.0};
glLightfv(GL_LIGHT0, GL_AMBIENT, light_ambient);
glLightfv(GL_LIGHT0, GL_DIFFUSE, light_diffuse);
glLightfv(GL_LIGHT0, GL_SPECULAR, light_specular);

Enabling lighting

In OpenGL, you must explicitly enable lighting. If lighting is not enabled, the current color is simply mapped to the current vertex, without the more complex calculations involving normals, light sources, and materials, so the resulting image is not realistic (as seen in the examples of the previous sections). To make lighting take effect, you must first enable it with the following function.

glEnable(GL_LIGHTING);

To disable lighting, call glDisable(GL_LIGHTING) to turn it off. In addition, each light source you define must be enabled individually; if only one light source is used:

glEnable(GL_LIGHT0);

Other light sources are enabled in the same way, only with a different light number.

The material color

OpenGL approximates a material's color by the proportions of red, green, and blue light that it reflects. Like light sources, material colors are divided into ambient, diffuse, and specular components, which determine how much of the ambient, diffuse, and specular light the material reflects. In the lighting calculation, the material's ambient reflectance is combined with the ambient light of each light source, its diffuse reflectance with the source's diffuse light, and its specular reflectance with the source's specular light. The ambient and diffuse reflectances determine the color of the material, and they are usually very similar. The specular reflectance is usually white or gray (that is, the red, green, and blue specular reflectances are equal), so the brightest highlight takes on the color of the light source's specular light: on a shiny red plastic ball, for example, most of the ball appears red, but the bright highlight is white. The material definition is similar to the light source definition:

void glMaterial{if}[v](GLenum face,GLenum pname,TYPE param);

Defines the current material used in lighting calculations. face can be GL_FRONT, GL_BACK, or GL_FRONT_AND_BACK, indicating which faces of the object the material applies to. pname specifies the material property being set (see the table below). param is the value of that property: for the vector form of the function it is a pointer to an array of values, and for the non-vector form it is the value itself. The non-vector form is only used to set GL_SHININESS. Additionally, the GL_AMBIENT_AND_DIFFUSE parameter lets you set the ambient and diffuse colors to the same RGBA values with a single call.

| Parameter name | Default value | Description |
| --- | --- | --- |
| GL_AMBIENT | (0.2, 0.2, 0.2, 1.0) | Ambient color of the material |
| GL_DIFFUSE | (0.8, 0.8, 0.8, 1.0) | Diffuse color of the material |
| GL_AMBIENT_AND_DIFFUSE | | Ambient and diffuse color of the material |
| GL_SPECULAR | (0.0, 0.0, 0.0, 1.0) | Specular color of the material |
| GL_SHININESS | 0.0 | Specular exponent (shininess) |
| GL_EMISSION | (0.0, 0.0, 0.0, 1.0) | Emissive color of the material |
| GL_COLOR_INDEXES | (0, 1, 1) | Ambient, diffuse, and specular color indices of the material |
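For example, a rough sketch (the values are illustrative) of a shiny red plastic-like material applied to front faces:

GLfloat mat_ambient[]  = {0.2, 0.0, 0.0, 1.0};   // dark red ambient reflectance
GLfloat mat_diffuse[]  = {0.8, 0.0, 0.0, 1.0};   // red diffuse reflectance
GLfloat mat_specular[] = {1.0, 1.0, 1.0, 1.0};   // white specular highlight
GLfloat shininess[]    = {60.0};                 // fairly tight highlight

glMaterialfv(GL_FRONT, GL_AMBIENT,   mat_ambient);
glMaterialfv(GL_FRONT, GL_DIFFUSE,   mat_diffuse);
glMaterialfv(GL_FRONT, GL_SPECULAR,  mat_specular);
glMaterialfv(GL_FRONT, GL_SHININESS, shininess);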

RGB value of material and light source

Material colors work slightly differently from light source colors. For a light source, the R, G, and B values represent the percentage of full intensity for each component: if R, G, and B are all 1.0, the light is the brightest possible white; if all are 0.5, the color is still white but at half intensity, so it appears gray; if R = G = 1.0 and B = 0.0, the light is yellow. For a material, the R, G, and B values are the reflectances for those components: a material with R = 1.0, G = 0.5, and B = 0.0 reflects all the red light, half the green light, and none of the blue light. That is, if the light source's color is (LR, LG, LB) and the material's color is (MR, MG, MB), then, ignoring all other reflection effects, the light that reaches the eye is (LR*MR, LG*MG, LB*MB). Similarly, if two lights with values (R1, G1, B1) and (R2, G2, B2) shine on the object, OpenGL adds the components to get (R1+R2, G1+G2, B1+B2); if any sum is greater than 1.0 (beyond what the device can display), it is clamped to 1.0.
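As a tiny illustration of this rule (plain arithmetic, not an OpenGL call):

// Light color (LR, LG, LB) times material reflectance (MR, MG, MB),
// computed component by component and clamped to 1.0
float light[3]    = {1.0f, 0.5f, 0.0f};
float material[3] = {1.0f, 0.5f, 0.0f};
float result[3];
for (int i = 0; i < 3; i++) {
    result[i] = light[i] * material[i];
    if (result[i] > 1.0f) result[i] = 1.0f;
}
// result is (1.0, 0.25, 0.0)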

Example

The following example demonstrates the use of lighting and materials in OpenGL.

#include <GLUT/GLUT.h>
#include <OpenGL/OpenGL.h>
 
 
// Initialize lighting parameters
void init() {
    GLfloat ambient[] = { 0.0, 0.0, 0.0, 1.0 };
    GLfloat diffuse[] = { 1.0, 1.0, 1.0, 1.0 };
//    GLfloat specular[] = { 1.0, 1.0, 1.0, 1.0 };
    GLfloat position[] = { 0.0, 0, -1.0, 0.0 };
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);
    glLightfv(GL_LIGHT0, GL_AMBIENT, ambient);
    glLightfv(GL_LIGHT0, GL_DIFFUSE, diffuse);
//    glLightfv(GL_LIGHT0, GL_SPECULAR, specular);
    glLightfv(GL_LIGHT0, GL_POSITION, position);
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glClearColor(0.0, 0.1, 0.1, 0.0) ;
}
 
 
// Draw callback
void display() {
    GLfloat no_mat[] = { 0.0, 0.0, 0.0, 1.0 };
    GLfloat mat_ambient[] = { 0.7, 0.7, 0.7, 1.0 };
    GLfloat mat_ambient_color[] = { 0.8, 0.8, 0.2, 1.0 };
    GLfloat mat_diffuse[] = { 0.1, 0.5, 0.8, 1.0 };
    GLfloat mat_specular[] = { 1.0, 1.0, 1.0, 1.0 };
    GLfloat no_shininess[] = { 0.0 };
    GLfloat low_shininess[] = { 5.0 };
    GLfloat high_shininess[] = { 100.0 };
    GLfloat mat_emission[] = {0.3, 0.2, 0.2, 0.0};
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
 
     
 
    /* The sphere in row 1, column 1 has only diffuse light, with no ambient or specular light. */
    glPushMatrix();
    glTranslatef (-3.75, 3.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, no_mat);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, no_mat);
    glMaterialfv(GL_FRONT, GL_SHININESS, no_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, no_mat);
    glutSolidSphere(1.0, 20, 20);
    glPopMatrix();
 
     
 
    /* The sphere in row 1, column 2 has diffuse and specular light with a low-shininess highlight, and no ambient light. */
    glPushMatrix();
    glTranslatef (-1.25, 3.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, no_mat);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, mat_specular);
    glMaterialfv(GL_FRONT, GL_SHININESS, low_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, no_mat);
    glutSolidSphere(1.0, 20, 20);
 
    glPopMatrix();
 
     
 
    /* The sphere in row 1, column 3 has diffuse and specular light with a very bright highlight, and no ambient light. */
    glPushMatrix();
    glTranslatef (1.25, 3.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, no_mat);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, mat_specular);
    glMaterialfv(GL_FRONT, GL_SHININESS, high_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, no_mat);
    glutSolidSphere(1.0, 20, 20);
    glPopMatrix();
     
 
    /* The sphere in row 1, column 4 has diffuse and emissive light, with no ambient or specular light. */
    glPushMatrix();
    glTranslatef (3.75, 3.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, no_mat);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, no_mat);
    glMaterialfv(GL_FRONT, GL_SHININESS, no_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, mat_emission);
    glutSolidSphere(1.0, 20, 20);
    glPopMatrix();
     
 
    /* The sphere in row 2, column 1 has diffuse and ambient light, with no specular light. */
    glPushMatrix();
    glTranslatef (-3.75, 0.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, mat_ambient);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, no_mat);
    glMaterialfv(GL_FRONT, GL_SHININESS, no_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, no_mat);
    glutSolidSphere(1.0, 20, 20);
    glPopMatrix();
     
 
    /* The sphere in row 2, column 2 has diffuse, ambient, and specular light, with a low-shininess highlight. */
    glPushMatrix();
    glTranslatef (-1.25, 0.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, mat_ambient);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, mat_specular);
    glMaterialfv(GL_FRONT, GL_SHININESS, low_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, no_mat);
    glutSolidSphere(1.0, 20, 20);
    glPopMatrix();
 
 
    /* The sphere in row 2, column 3 has diffuse, ambient, and specular light, with a very bright highlight. */
    glPushMatrix();
    glTranslatef (1.25, 0.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, mat_ambient);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, mat_specular);
    glMaterialfv(GL_FRONT, GL_SHININESS, high_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, no_mat);
    glutSolidSphere(1.0, 20, 20);
    glPopMatrix();
  
 
    /* The sphere in row 2, column 4 has diffuse, ambient, and emissive light, with no specular light. */
    glPushMatrix();
    glTranslatef (3.75, 0.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, mat_ambient);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, no_mat);
    glMaterialfv(GL_FRONT, GL_SHININESS, no_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, mat_emission);
    glutSolidSphere(1.0, 20, 20);
    glPopMatrix();
 
 
    /* The sphere in row 3, column 1 has diffuse light and colored ambient light, with no specular light. */
    glPushMatrix();
    glTranslatef (-3.75, -3.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, mat_ambient_color);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, no_mat);
    glMaterialfv(GL_FRONT, GL_SHININESS, no_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, no_mat);
    glutSolidSphere(1.0, 20, 20);
    glPopMatrix();
     
 
    /* The sphere in row 3, column 2 has diffuse light, colored ambient light, and specular light, with a low-shininess highlight. */
    glPushMatrix();
    glTranslatef (-1.25, -3.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, mat_ambient_color);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, mat_specular);
    glMaterialfv(GL_FRONT, GL_SHININESS, low_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, no_mat);
    glutSolidSphere(1.0, 20, 20);
    glPopMatrix();
     
 
    /* The sphere in row 3, column 3 has diffuse light, colored ambient light, and specular light, with a very bright highlight. */
    glPushMatrix();
    glTranslatef (1.25, -3.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, mat_ambient_color);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, mat_specular);
    glMaterialfv(GL_FRONT, GL_SHININESS, high_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, no_mat);
    glutSolidSphere(1.0, 20, 20);
    glPopMatrix();
 
 
    /* The sphere in row 3, column 4 has diffuse light, colored ambient light, and emissive light, with no specular light. */
    glPushMatrix();
    glTranslatef (3.75, -3.0, 0.0);
    glMaterialfv(GL_FRONT, GL_AMBIENT, mat_ambient_color);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, mat_diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, no_mat);
    glMaterialfv(GL_FRONT, GL_SHININESS, no_shininess);
    glMaterialfv(GL_FRONT, GL_EMISSION, mat_emission);
    glutSolidSphere(1.0, 20, 20);
    glPopMatrix();
    // Execute drawing commands
    glFlush();
}
 
 
// Window resize callback
void reshape(int w, int h) {
    glViewport(0, 0, w, h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, (GLfloat)w/(GLfloat)h, 0.1, 100000.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0, 0, 10, 0, 0, -1, 0, 1, 0);
}
 
 
int main(int argc, const char * argv[]) {
    // Initialize display mode
    glutInit(&argc, const_cast<char **>(argv));
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGBA);

 
    // Initialize window
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(100, 100);
    glutCreateWindow(argv[0]);

 
    init();
    glutReshapeFunc(reshape);
    glutDisplayFunc(display);
 
    // Start the main loop
    glutMainLoop();
    return 0;
}

The operation effect is shown as follows: