Demo repository: https://github.com/filelife/FLVRPlayer

To help you read the Demo code, here is a brief introduction to the relevant background knowledge and implementation ideas.

Step 0.

0.1 The general idea of rendering VR video: map each video frame as a texture onto the inside of a sphere, render from a camera at the sphere's center, and change the viewing angle as the device rotates.

Step 1.

1.1 Model Acquisition

First, you need to create a sphere. Creating a model in OpenGL requires vertices and texture coordinates. The .obj models produced by tools such as 3ds Max cannot be used directly on iOS, so they need to be converted into the vertex arrays that OpenGL consumes.

There is an excellent tool that converts an .obj model into vertex arrays: https://github.com/HBehrens/obj2opengl

The output is a set of vertex data plus vertex indices for the sphere model.
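For orientation, the generated header looks roughly like the sketch below. The array names derive from the .obj file name, and the values and counts here are hypothetical placeholders, not the tool's verbatim output:

```objc
// Hypothetical excerpt of an obj2opengl-generated header for sphere.obj.
// The real file contains thousands of values; names and counts are illustrative.
unsigned int sphereNumVerts = 1944;

float sphereVerts[] = {
    // x, y, z for each vertex
    0.0f, 0.5f, 0.0f,
};

float sphereNormals[] = {
    // nx, ny, nz for each vertex
    0.0f, 1.0f, 0.0f,
};

float sphereTexCoords[] = {
    // u, v for each vertex
    0.5f, 1.0f,
};
```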

1.2 Model data description

To better understand vertex data and vertex indices, let's analyze the modeling data of a square.

```c
GLfloat squareVertexData[] = {
     0.5f, -0.5f, 0.0f,    1.0f, 0.0f, // lower right
    -0.5f,  0.5f, 0.0f,    0.0f, 1.0f, // upper left
    -0.5f, -0.5f, 0.0f,    0.0f, 0.0f, // lower left
     0.5f,  0.5f, 0.0f,    1.0f, 1.0f, // upper right
};
```
These four points are intended to form a square, but because the X and Y axes of the device screen's coordinate system are not at the same scale, the rendered result is stretched into a rectangle.

Another technical point that is easy for beginners to overlook: ==in OpenGL ES you can only draw triangles==, not polygons, whereas desktop OpenGL can draw polygons directly.
```c
GLuint indices[] = {
    0, 1, 2,
    1, 3, 0,
};
```
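A minimal sketch of how those six indices are submitted as two triangles, assuming the squareVertexData and indices arrays above and GLKit-style attribute slots (the function name and setup are illustrative, not the demo's actual code):

```objc
#import <GLKit/GLKit.h>

// Submit the square as two indexed triangles: {0,1,2} and {1,3,0}.
static void drawSquare(const GLfloat *squareVertexData, const GLuint *indices) {
    // Positions: the first 3 of every 5 floats.
    glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE,
                          5 * sizeof(GLfloat), squareVertexData);
    glEnableVertexAttribArray(GLKVertexAttribPosition);

    // Texture coordinates: the last 2 of every 5 floats.
    glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE,
                          5 * sizeof(GLfloat), squareVertexData + 3);
    glEnableVertexAttribArray(GLKVertexAttribTexCoord0);

    // Note: GL_UNSIGNED_INT indices require ES 3.0 or OES_element_index_uint;
    // GLushort with GL_UNSIGNED_SHORT is the safe choice on ES 2.0.
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, indices);
}
```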

We can also draw a 3D cube with the following vertices + indices.
```c
// Vertex data: the first three values are the vertex position, the middle
// three are the vertex color, and the last two are the texture coordinates.
GLfloat attrArr[] = {
    -0.5f,  0.5f, 0.0f,    0.0f, 0.0f, 0.5f,    0.0f, 1.0f, // upper left
     0.5f,  0.5f, 0.0f,    0.0f, 0.5f, 0.0f,    1.0f, 1.0f, // upper right
    -0.5f, -0.5f, 0.0f,    0.5f, 0.0f, 1.0f,    0.0f, 0.0f, // lower left
     0.5f, -0.5f, 0.0f,    0.0f, 0.0f, 0.5f,    1.0f, 0.0f, // lower right
    -0.5f,  0.5f, 1.0f,    0.0f, 0.0f, 0.5f,    0.0f, 1.0f, // rear upper left
     0.5f,  0.5f, 1.0f,    0.0f, 0.5f, 0.0f,    1.0f, 1.0f, // rear upper right
    -0.5f, -0.5f, 1.0f,    0.5f, 0.0f, 1.0f,    0.0f, 0.0f, // rear lower left
     0.5f, -0.5f, 1.0f,    0.0f, 0.0f, 0.5f,    1.0f, 0.0f, // rear lower right
};
```
```c
// Vertex indices
GLuint indices[] = {
    0, 3, 2,   0, 1, 3,
    0, 4, 1,   5, 4, 1,
    1, 5, 3,   7, 5, 3,
    2, 6, 3,   6, 7, 3,
    0, 6, 2,   0, 4, 6,
    4, 5, 7,   4, 6, 7,
};
```

Step 2.

2.1 Rendering basics – Buffers

To optimize efficiency and avoid starving the GPU of data, an OpenGL ES program can copy data from CPU memory into contiguous GPU-controlled RAM, called a buffer. Once the GPU obtains a buffer of data, it has exclusive use of it and can read and write that memory as efficiently as possible: geometry data, colors, lighting parameters, and so on.

The buffer life cycle:

2.1.1. Generate: asks OpenGL ES to generate a unique identifier for a GPU-controlled buffer.

Corresponding function: glGenBuffers()

2.1.2. Bind: tells OpenGL ES which buffer to use for subsequent operations.

Corresponding function: glBindBuffer()

2.1.3. Enable or disable: tells OpenGL ES whether to use the buffer's data in subsequent renderings.

Corresponding function: glEnableVertexAttribArray()

2.1.4. Set pointers: tells OpenGL ES the type of data stored in the buffer and the memory offset at which the data to be accessed begins.

Corresponding function: glVertexAttribPointer()

2.1.5. Draw: tells OpenGL ES to render an entire scene, or a portion of one, using the data in the currently bound and enabled buffer.

Corresponding function: glDrawArrays()

2.1.6. Delete: tells OpenGL ES to delete a previously generated buffer and release the associated resources.

Corresponding function: glDeleteBuffers()
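Putting the six steps together, here is one minimal sketch. A small strip-ordered square and GLKit's position attribute slot are assumed; this is illustrative, not the demo's actual code:

```objc
#import <GLKit/GLKit.h>

// A strip-ordered square: 3 position floats per vertex.
static const GLfloat quad[] = {
    -0.5f, -0.5f, 0.0f,  // lower left
     0.5f, -0.5f, 0.0f,  // lower right
    -0.5f,  0.5f, 0.0f,  // upper left
     0.5f,  0.5f, 0.0f,  // upper right
};

static void drawQuadOnce(void) {
    GLuint vertexBuffer = 0;

    // 1. Generate: ask OpenGL ES for a buffer identifier.
    glGenBuffers(1, &vertexBuffer);

    // 2. Bind: make it the current GL_ARRAY_BUFFER, then copy the
    //    vertex data from CPU memory into GPU-controlled memory.
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(quad), quad, GL_STATIC_DRAW);

    // 3. Enable: use this vertex attribute in subsequent rendering.
    glEnableVertexAttribArray(GLKVertexAttribPosition);

    // 4. Set pointers: 3 floats per position, tightly packed, offset 0.
    glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE,
                          3 * sizeof(GLfloat), (const GLvoid *)0);

    // 5. Draw: render the 4 vertices as a two-triangle strip.
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // 6. Delete: release the buffer once it is no longer needed.
    glDeleteBuffers(1, &vertexBuffer);
}
```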

2.2 Geometric data of a 3D scene

2.2.1. Coordinate system (Cartesian): the OpenGL ES coordinate system has no units; the distance from point {1,0,0} to point {2,0,0} is 1 unit along the X axis.

2.2.2. Vectors: vectors are the key to understanding modern GPUs, because graphics processors are massive vector processors.

2.2.3. Points, lines, triangles: OpenGL ES renders only points, line segments, and triangles. As an example, the same ring can be rendered with points, with line segments, or with triangles.

2.3 Core Animation Layer

Because iOS does not let an app draw directly into the front or back frame buffer, nor directly control the swap between them, the Core Animation compositor controls how all drawing layers are combined into the final result (e.g., the status bar layer + the developer-provided rendering layer = the final screen pixels).

The Core Animation compositor uses OpenGL ES to control the GPU, blending layers and swapping frame buffers as efficiently as possible. The process by which the compositor combines layers into the composite image is therefore itself tied to OpenGL ES.

2.4 GLKView/GLKViewController:

GLKView and GLKViewController are part of the GLKit framework. GLKView is a subclass of UIView that simplifies the work of creating, managing, and drawing into the frame buffers that Core Animation composites. The GLKViewController associated with a GLKView acts as the view's delegate and receives a message whenever the view needs to be redrawn.
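A minimal sketch of that setup, assuming OpenGL ES 2.0 and a GLKView loaded as the controller's view (the class name is hypothetical, not taken from the demo):

```objc
#import <GLKit/GLKit.h>

@interface VRViewController : GLKViewController
@property (nonatomic, strong) EAGLContext *context;
@property (nonatomic, strong) GLKBaseEffect *baseEffect;
@end

@implementation VRViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    // Create the OpenGL ES 2.0 context and attach it to the GLKView,
    // which manages the frame buffers through Core Animation for us.
    self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    GLKView *view = (GLKView *)self.view;
    view.context = self.context;
    view.drawableDepthFormat = GLKViewDrawableDepthFormat24;
    [EAGLContext setCurrentContext:self.context];

    // GLKBaseEffect hides the shader differences between ES versions.
    self.baseEffect = [[GLKBaseEffect alloc] init];
}

// Called by GLKit whenever the view needs to be redrawn.
- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect {
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    [self.baseEffect prepareToDraw];
    // ... issue glDrawArrays / glDrawElements calls here ...
}

@end
```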

Step 3.

Understanding the viewing angle inside the sphere

The transformation between spherical coordinates (r, θ, φ) and rectangular coordinates (x, y, z):

x = r sin θ cos φ
y = r sin θ sin φ
z = r cos θ
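A sketch of how these formulas produce the sphere's vertex positions and texture coordinates by stepping θ and φ over a grid (the function name, resolution parameters, and buffer layout are illustrative, not the demo's actual code):

```objc
#include <math.h>
#include <stdlib.h>
#include <OpenGLES/ES2/gl.h>

// Sample theta (polar angle) in [0, pi] and phi (azimuth) in [0, 2*pi],
// applying x = r sin(theta) cos(phi), y = r sin(theta) sin(phi),
// z = r cos(theta). Emits 5 floats per vertex: position (x, y, z) plus
// texture coordinates (u, v). The caller frees the result.
static GLfloat *makeSphereVertices(GLfloat r, int stacks, int slices) {
    GLfloat *verts =
        (GLfloat *)malloc(sizeof(GLfloat) * 5 * (stacks + 1) * (slices + 1));
    int n = 0;
    for (int i = 0; i <= stacks; i++) {
        GLfloat theta = (GLfloat)(M_PI * i / stacks);
        for (int j = 0; j <= slices; j++) {
            GLfloat phi = (GLfloat)(2.0 * M_PI * j / slices);
            verts[n++] = r * sinf(theta) * cosf(phi); // x
            verts[n++] = r * sinf(theta) * sinf(phi); // y
            verts[n++] = r * cosf(theta);             // z
            verts[n++] = (GLfloat)j / slices;         // u maps phi to [0,1]
            verts[n++] = 1.0f - (GLfloat)i / stacks;  // v maps theta to [0,1]
        }
    }
    return verts;
}
```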

Step 4.

Code analysis.

1. GLKBaseEffect: hides the differences between the OpenGL ES versions supported by various iOS devices. Using GLKBaseEffect in an application reduces the amount of code, for example by standing in for hand-written OpenGL ES 2.0 Shading Language shaders.
EAGL: Embedded Apple GL
2. EAGLContext: the context exists so that multiple tasks can share the graphics hardware without their images interfering with one another. EAGLContext provides the methods through which those tasks work in parallel and controls the GPU to perform rendering operations.
3. EAGLSharegroup: manages the OpenGL ES objects belonging to a context. A context is created and managed through an EAGLSharegroup object.
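A sketch of how a sharegroup lets a second context (e.g., on a background thread for texture loading) see the same OpenGL ES objects; the variable names are illustrative:

```objc
#import <OpenGLES/EAGL.h>

// The first context creates (and implicitly owns) a sharegroup.
EAGLContext *mainContext =
    [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

// A second context created with the same sharegroup shares the same
// OpenGL ES objects (textures, buffers) with the first one.
EAGLContext *workerContext =
    [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2
                          sharegroup:mainContext.sharegroup];

// Each thread must make its own context current before issuing GL calls.
[EAGLContext setCurrentContext:mainContext];
```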


4. glTexParameteri: specifies how a texture is sampled and applied (the filtering mode).
```c
/* TextureParameterName */
// Texture magnification and minification filters
#define GL_TEXTURE_MAG_FILTER 0x2800
#define GL_TEXTURE_MIN_FILTER 0x2801
#define GL_TEXTURE_WRAP_S     0x2802
#define GL_TEXTURE_WRAP_T     0x2803

void glTexParameteri(GLenum target, GLenum pname, GLint param);

// glTexParameterf vs. glTexParameteri: the difference is the integer vs.
// float type of the last parameter. When the pname (second) parameter is
// GL_TEXTURE_WRAP_S, where you are passing an enum, you should use
// glTexParameteri; for other possible values such as GL_TEXTURE_MIN_LOD and
// GL_TEXTURE_MAX_LOD it makes sense to pass a float using glTexParameterf.

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
```
5. Wrap: when texture space is mapped onto object space, texture coordinates that fall outside [0, 1] must be extended. The extension (wrap) schemes apply independently in the S and T directions:
```c
#define GL_TEXTURE_WRAP_S 0x2802
#define GL_TEXTURE_WRAP_T 0x2803
```
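For example, wrap modes might be set like this for the currently bound texture (GL_CLAMP_TO_EDGE is shown as one plausible choice for panorama frames, GL_REPEAT as the tiling alternative):

```objc
// Clamp coordinates outside [0, 1] to the edge texels in both directions;
// this avoids visible seams at the borders of an equirectangular frame.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// Alternatively, GL_REPEAT tiles the texture outside [0, 1]:
// glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
```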
6. Texture minification schemes when mipmaps come into play:

Sampling schemes:

Point sampling: takes the texture value of the texel nearest to the sample point. Linear filtering: takes a weighted average over the nearest (2×2) neighborhood of texels.
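As a sketch, the two sampling schemes map onto the filter parameters like this (GL_NEAREST = point sampling, GL_LINEAR = linear filtering; the *_MIPMAP_* minification modes additionally blend between mipmap levels):

```objc
// Magnification: point sampling (GL_NEAREST) or linear filtering (GL_LINEAR).
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// Minification with mipmaps: filter linearly within a level and blend
// linearly between the two nearest levels ("trilinear" filtering).
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glGenerateMipmap(GL_TEXTURE_2D);
```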

Two topics to cover separately:

1. An introduction to shader syntax
2. Homogeneous coordinates