I. Foreword

This article works through the ThreeJs demo loader/obj2, analyzing how the obj file is loaded, how textures and materials are loaded, and how the 3D camera and camera controller are implemented.

II. Code analysis

1. The HTML part

<div id="glFullscreen">
    <!-- Canvas on which the 3D scene is rendered -->
    <canvas id="example"></canvas>
</div>
<!-- Placeholder div for dat.GUI -->
<div id="dat">
</div>
<div id="info">
    <a href="http://threejs.org" target="_blank" rel="noopener">three.js</a> - OBJLoader2 direct loader test
    <div id="feedback"></div>
</div>

The most important part of this markup is the <canvas> tag: WebGL does all of its drawing on this canvas, which is quite similar to the native canvas API on Android.

2. The scripts to import

<!-- Import the three.js core library -->
<script src="../build/three.js"></script>
<!-- Camera controller -->
<script src="js/controls/TrackballControls.js"></script>
<!-- Material loading -->
<script src="js/loaders/MTLLoader.js"></script>
<!-- dat.GUI library -->
<script src="js/libs/dat.gui.min.js"></script>
<!-- Third-party stats library -->
<script type="text/javascript" src="js/libs/stats.min.js"></script>
<!-- Creates meshes, textures, etc. -->
<script src="js/loaders/LoaderSupport.js"></script>
<!-- The main obj loading implementation -->
<script src="js/loaders/OBJLoader2.js"></script>

3. Model loading

3.1 Defining OBJLoader2Example

As covered in ThreeJS learning notes – Functions and Objects in JavaScript, JavaScript achieves object-oriented programming through prototypes. Assigning OBJLoader2Example.prototype an object whose constructor property points back to OBJLoader2Example defines a "class", OBJLoader2Example, that we can use to create new objects.

var OBJLoader2Example = function ( elementToBindTo ) { ...... };
OBJLoader2Example.prototype = {
	constructor: OBJLoader2Example,
	initGL: function () { ... },
	initContent: function () { ... },
	_reportProgress: function () { ... },
	resizeDisplayGL: function () { ... },
	recalcAspectRatio: function () { ... },
	resetCamera: function () { ... },
	updateCamera: function () { ... },
	render: function () { ... }
}
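To see the idiom in isolation, here is a minimal sketch of my own; Viewer is a hypothetical stand-in for OBJLoader2Example, not part of the demo:

```javascript
// Minimal sketch of the constructor + prototype idiom used above.
// "Viewer" is a hypothetical stand-in for OBJLoader2Example.
var Viewer = function ( canvasId ) {
	this.canvasId = canvasId;
	this.aspectRatio = 1;
};

Viewer.prototype = {
	// Replacing the whole prototype object would lose the original
	// constructor reference, so it is restored explicitly
	constructor: Viewer,
	recalcAspectRatio: function ( width, height ) {
		this.aspectRatio = ( height === 0 ) ? 1 : width / height;
		return this.aspectRatio;
	}
};

var viewer = new Viewer( 'example' );
viewer.recalcAspectRatio( 800, 400 );
```

Assigning to Viewer.prototype wholesale is exactly what OBJLoader2Example does, which is why restoring the constructor property matters.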

3.2 OBJLoader2Example constructor

var OBJLoader2Example = function ( elementToBindTo ) {
	// The renderer, which will be bound to the canvas node
	this.renderer = null;
	// Canvas
	this.canvas = elementToBindTo;
	// Aspect ratio
	this.aspectRatio = 1;
	this.recalcAspectRatio();
	// 3D scene
	this.scene = null;
	// Default camera parameters
	this.cameraDefaults = {
		// Camera position: where the camera is placed
		posCamera: new THREE.Vector3( 0.0, 175.0, 500.0 ),
		// Camera target
		posCameraTarget: new THREE.Vector3( 0, 0, 0 ),
		// Near clipping plane
		near: 0.1,
		// Far clipping plane
		far: 10000,
		// Field-of-view angle of the view frustum
		fov: 45
	};
	// 3D camera
	this.camera = null;
	// 3D camera target, i.e. where the camera looks
	this.cameraTarget = this.cameraDefaults.posCameraTarget;
	// Camera controller
	this.controls = null;
};

The constructor mainly defines attributes, and the comments in the code briefly describe what each one does. Broadly, they are the 3D scene, the 3D camera, the camera controller and, most importantly, the renderer. The renderer is bound to the canvas, and the 3D scene and all objects in it are rendered onto the canvas through it.

3.3 initGL()

initGL: function () {
	this.renderer = new THREE.WebGLRenderer( {
		// Bind the canvas
		canvas: this.canvas,
		// Antialiasing
		antialias: true,
		autoClear: true
	} );
	this.renderer.setClearColor( 0x050505 );

	this.scene = new THREE.Scene();

	// Initialize the perspective projection camera, whose view volume is a frustum
	this.camera = new THREE.PerspectiveCamera( this.cameraDefaults.fov, this.aspectRatio, this.cameraDefaults.near, this.cameraDefaults.far );
	this.resetCamera();

	// Initialize the controller
	this.controls = new THREE.TrackballControls( this.camera, this.renderer.domElement );

	// Light sources
	var ambientLight = new THREE.AmbientLight( 0x404040 );
	var directionalLight1 = new THREE.DirectionalLight( 0xC0C090 );
	var directionalLight2 = new THREE.DirectionalLight( 0xC0C090 );
	directionalLight1.position.set( -100, -50, 100 );
	directionalLight2.position.set( 100, 50, -100 );
	this.scene.add( directionalLight1 );
	this.scene.add( directionalLight2 );
	this.scene.add( ambientLight );

	// Grid helper for debugging
	var helper = new THREE.GridHelper( 1200, 60, 0xFF4444, 0x404040 );
	this.scene.add( helper );
},

The attributes are initialized in the initGL() method, along with ambient and directional light sources and a grid helper model for debugging. Many things in a 3D scene can be viewed as models, such as the light sources here. In some rendering frameworks the camera is also treated as a model, but really it is just a set of parameters that participate in 3D rendering: its main job is to determine the projection matrix. Objects inside the view frustum defined by that matrix are visible; objects outside it are not.
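As a rough sketch of my own (not code from the demo): given the fov and aspect ratio from cameraDefaults, the size of the visible cross-section of the frustum at a distance d in front of the camera follows from simple trigonometry:

```javascript
// Illustrative only: half-extents of the camera's visible cross-section at
// distance d, derived from a vertical field of view (in degrees) and an
// aspect ratio, like the fov/aspectRatio parameters above.
function frustumHalfExtents( fovDeg, aspectRatio, d ) {
	var halfHeight = d * Math.tan( ( fovDeg * Math.PI / 180 ) / 2 );
	return { halfWidth: halfHeight * aspectRatio, halfHeight: halfHeight };
}

// With fov = 90 degrees, the half-height at distance d equals d itself
var extents = frustumHalfExtents( 90, 2, 10 );
```

Anything farther from the camera axis than these half-extents at that depth falls outside the frustum and is clipped.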

4. initContent()

initContent: function () {
	var modelName = 'female02';
	this._reportProgress( { detail: { text: 'Loading: ' + modelName } } );

	var scope = this;
	// Build the obj loader
	var objLoader = new THREE.OBJLoader2();
	// Callback invoked when the model has finished loading
	var callbackOnLoad = function ( event ) {
		scope.scene.add( event.detail.loaderRootNode );
		console.log( 'Loading complete: ' + event.detail.modelName );
		scope._reportProgress( { detail: { text: '' } } );
	};
	// Callback invoked after the material has been loaded
	var onLoadMtl = function ( materials ) {
		objLoader.setModelName( modelName );
		objLoader.setMaterials( materials );
		objLoader.setLogging( true, true );
		// Start loading the obj
		objLoader.load( 'models/obj/female02/female02.obj', callbackOnLoad, null, null, null, false );
	};
	// Start loading the material
	objLoader.loadMtl( 'models/obj/female02/female02.mtl', null, onLoadMtl );
},

Content loading is the focus: OBJLoader2 first loads the material, then the model. As for the obj and MTL files, open female02.obj and female02.mtl and you will see they are plain text files. Use the comments below to get a feel for the file formats.

Partial data from female02.obj

# Blender v2.54 (sub 0) OBJ File: ''
# www.blender.org
# Material file corresponding to this obj
mtllib female02.mtl
# Object name
o mesh1.002_mesh1-geometry
# Vertices
v 15.257854 104.640892 8.680023
v 14.044281 104.444138 11.718708
v 15.763498 98.955704 11.529579
......
# Texture coordinates
vt 0.389887 0.679023
vt 0.361250 0.679023
vt 0.361250 0.643346
......
# Vertex normals
vn 0.945372 0.300211 0.126926
vn 0.794275 0.212683 0.569079
vn 0.792047 0.184729 0.581805
......
# Group
g mesh1.002_mesh1-geometry__03_-_Default1noCulli__03_-_Default1noCulli
# Material used for the current primitives
usemtl _03_-_Default1noCulli__03_-_Default1noCulli
s off
# f v1/vt1/vn1 v2/vt2/vn2 v3/vt3/vn3
f 1/1/1 2/2/2 3/3/3
f 1/1/1 4/4/4 2/2/2
f 4/4/4 1/1/1 5/5/5
......

Partial data from female02.mtl

......
# Define a material named _03_-_Default1noCulli__03_-_Default1noCulli
newmtl _03_-_Default1noCulli__03_-_Default1noCulli
# Specular exponent: the higher the value, the more concentrated the highlight. Values normally range from 0 to 1000.
Ns 154.901961
# Ambient color of the material (Ka)
Ka 0.000000 0.000000 0.000000
# Diffuse color (Kd)
Kd 0.640000 0.640000 0.640000
# Specular color (Ks)
Ks 0.165000 0.165000 0.165000
# Optical density (index of refraction), between 0.001 and 10. A value of 1.0 means light does not bend as it passes through the object; glass is about 1.5.
Ni 1.000000
# Dissolve factor: how much the object blends into the background, from 0.0 (fully transparent) to 1.0 (fully opaque)
d 1.000000
# Illumination model for the material: a number from 0 to 10, each value selecting a different lighting model
illum 2
# Color texture file for diffuse reflection
map_Kd 03_-_Default1noCulling.JPG
......

The comments explain the meaning of each field in the obj and MTL files; understanding how each parameter is actually used requires some knowledge of how OpenGL renders a model. Moving on to material loading and obj loading.
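To make the format concrete, here is a tiny parser of my own, purely illustrative and far simpler than what OBJLoader2 actually does; it handles only v/vt/vn lines and the v/vt/vn face form shown above:

```javascript
// Purely illustrative mini-parser for the obj excerpt above.
// Handles only v / vt / vn lines and the "f v/vt/vn ..." face form.
function parseMiniObj( text ) {
	var vertices = [], uvs = [], normals = [], faces = [];
	text.split( '\n' ).forEach( function ( line ) {
		var parts = line.trim().split( /\s+/ );
		switch ( parts[ 0 ] ) {
			case 'v':  vertices.push( parts.slice( 1, 4 ).map( Number ) ); break;
			case 'vt': uvs.push( parts.slice( 1, 3 ).map( Number ) ); break;
			case 'vn': normals.push( parts.slice( 1, 4 ).map( Number ) ); break;
			case 'f':
				// Each face corner is "v/vt/vn"; obj indices are 1-based
				faces.push( parts.slice( 1 ).map( function ( corner ) {
					return corner.split( '/' ).map( Number );
				} ) );
				break;
		}
	} );
	return { vertices: vertices, uvs: uvs, normals: normals, faces: faces };
}

var mini = parseMiniObj( [
	'v 15.257854 104.640892 8.680023',
	'vt 0.389887 0.679023',
	'vn 0.945372 0.300211 0.126926',
	'f 1/1/1 2/2/2 3/3/3'
].join( '\n' ) );
```

Real obj files also carry mtllib/o/g/usemtl/s lines and several face variants; handling those is exactly the job of OBJLoader2's processLine(), analyzed later in this article.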

4.1 OBJLoader2#loadMtl()

loadMtl: function ( url, content, onLoad, onProgress, onError, crossOrigin, materialOptions ) {
		......
		this._loadMtl( resource, onLoad, onProgress, onError, crossOrigin, materialOptions );
},

loadMtl() calls the internal _loadMtl(). The full implementation of _loadMtl() is rather long, but that doesn't matter; I have simplified it here.

_loadMtl: function ( resource, onLoad, onProgress, onError, crossOrigin, materialOptions ) {
	......
	// 7. Invoked once the materialCreator has been created; the caller is
	// finally notified via onLoad and goes on to load the model
	var processMaterials = function ( materialCreator ) {
		......
		// 8. Create the materials
		materialCreator.preload();
		// 9. Notify the caller
		onLoad( materials, materialCreator );
	};
	......
	// Build the MTLLoader
	var mtlLoader = new THREE.MTLLoader( this.manager );
	......
	// Callback invoked when the file has been loaded
	var parseTextWithMtlLoader = function ( content ) {
		......
		contentAsText = THREE.LoaderUtils.decodeText( content );
		......
		// 5. Parse the contents of the file; parsing yields a materialCreator
		// object, which is then passed to processMaterials
		processMaterials( mtlLoader.parse( contentAsText ) );
	};
	......
	// Build the FileLoader
	var fileLoader = new THREE.FileLoader( this.manager );
	......
	// 3. Load the file; on success parseTextWithMtlLoader is invoked
	fileLoader.load( resource.url, parseTextWithMtlLoader, onProgress, onError );
}

The comments trace the entire logic of the material-loading process, which consists of nine steps; here we will focus on the following three:

(1) File loading — FileLoader#load()

load: function ( url, onLoad, onProgress, onError ) {
    ......
    var request = new XMLHttpRequest();
    request.open( 'GET', url, true );
    ......
}

FileLoader is part of the three.js library itself. The code before and after this fragment of load() is omitted; the key point is that the file is fetched with a GET request.

(2) File parsing — MTLLoader#parse()

parse: function ( text, path ) {

		var lines = text.split( '\n' );
		var info = {};
		var delimiter_pattern = /\s+/;
		var materialsInfo = {};

		for ( var i = 0; i < lines.length; i ++ ) {

			var line = lines[ i ];
			line = line.trim();

			if ( line.length === 0 || line.charAt( 0 ) === '#' ) {

				// Blank line or comment ignore
				continue;

			}

			var pos = line.indexOf( ' ' );

			var key = ( pos >= 0 ) ? line.substring( 0, pos ) : line;
			key = key.toLowerCase();

			var value = ( pos >= 0 ) ? line.substring( pos + 1 ) : '';
			value = value.trim();

			if ( key === 'newmtl' ) {

				// New material

				info = { name: value };
				materialsInfo[ value ] = info;

			} else {

				if ( key === 'ka' || key === 'kd' || key === 'ks' ) {

					var ss = value.split( delimiter_pattern, 3 );
					info[ key ] = [ parseFloat( ss[ 0 ] ), parseFloat( ss[ 1 ] ), parseFloat( ss[ 2 ] ) ];

				} else {

					info[ key ] = value;

				}

			}

		}

		var materialCreator = new THREE.MTLLoader.MaterialCreator( this.resourcePath || path, this.materialOptions );
		materialCreator.setCrossOrigin( this.crossOrigin );
		materialCreator.setManager( this.manager );
		materialCreator.setMaterials( materialsInfo );
		return materialCreator;

	}

The parse() method looks like a lot of code, but it is really simple: it parses the MTL file line by line into materialsInfo, a map from material name to that material's fields, which is then handed to a newly created MaterialCreator. The most important value stored there is map_Kd, the texture to load when the material is created.
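The same line-splitting idea can be shown in miniature; this is my own toy version, covering only newmtl, the ka/kd/ks color triples, and pass-through fields such as map_Kd:

```javascript
// Toy version of the MTLLoader.parse() loop above, for illustration only.
function parseMiniMtl( text ) {
	var materialsInfo = {}, info = {};
	text.split( '\n' ).forEach( function ( line ) {
		line = line.trim();
		// Skip blank lines and comments
		if ( line.length === 0 || line.charAt( 0 ) === '#' ) return;
		var pos = line.indexOf( ' ' );
		var key = ( ( pos >= 0 ) ? line.substring( 0, pos ) : line ).toLowerCase();
		var value = ( pos >= 0 ) ? line.substring( pos + 1 ).trim() : '';
		if ( key === 'newmtl' ) {
			// Start a new material entry
			info = { name: value };
			materialsInfo[ value ] = info;
		} else if ( key === 'ka' || key === 'kd' || key === 'ks' ) {
			// Color triples become arrays of floats
			info[ key ] = value.split( /\s+/, 3 ).map( parseFloat );
		} else {
			// Everything else (Ns, map_Kd, ...) is kept as a string
			info[ key ] = value;
		}
	} );
	return materialsInfo;
}

var mtl = parseMiniMtl( [
	'# comment',
	'newmtl mat1',
	'Kd 0.64 0.64 0.64',
	'map_Kd 03_-_Default1noCulling.JPG'
].join( '\n' ) );
```

Note that keys are lowercased, so the MaterialCreator later looks up 'kd' and 'map_kd', exactly as in createMaterial_() below.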

(3) Create material — MaterialCreator#preload()

preload: function () {
	for ( var mn in this.materialsInfo ) {
		this.create( mn );
	}
},

preload() iterates over the materials and calls create() for each one; create() in turn calls the createMaterial_() method.

createMaterial_: function ( materialName ) {
		// Create material
		var scope = this;
		var mat = this.materialsInfo[ materialName ];
		var params = {
			name: materialName,
			side: this.side
		};
		function resolveURL( baseUrl, url ) {
			if ( typeof url !== 'string' || url === '' )
				return '';
			// Absolute URL
			if ( /^https?:\/\//i.test( url ) ) return url;
			return baseUrl + url;
		}
		function setMapForType( mapType, value ) {
			if ( params[ mapType ] ) return; // Keep the first encountered texture
			var texParams = scope.getTextureParams( value, params );
			var map = scope.loadTexture( resolveURL( scope.baseUrl, texParams.url ) );
			map.repeat.copy( texParams.scale );
			map.offset.copy( texParams.offset );
			map.wrapS = scope.wrap;
			map.wrapT = scope.wrap;
			params[ mapType ] = map;
		}
		for ( var prop in mat ) {
			var value = mat[ prop ];
			var n;
			if ( value === '' ) continue;
			switch ( prop.toLowerCase() ) {
				// Ns is material specular exponent
				case 'kd':
					// Diffuse color (color under white light) using RGB values
					params.color = new THREE.Color().fromArray( value );
					break;
				case 'ks':
					// Specular color (color when light is reflected from shiny surface) using RGB values
					params.specular = new THREE.Color().fromArray( value );
					break;
				case 'map_kd':
					// Diffuse texture map
					setMapForType( "map", value );
					break;
				case 'map_ks':
					// Specular map
					setMapForType( "specularMap", value );
					break;
				case 'norm':
					setMapForType( "normalMap", value );
					break;
				case 'map_bump':
				case 'bump':
					// Bump texture map
					setMapForType( "bumpMap", value );
					break;
				case 'map_d':
					// Alpha map
					setMapForType( "alphaMap", value );
					params.transparent = true;
					break;
				case 'ns':
					// The specular exponent (defines the focus of the specular highlight)
					// A high exponent results in a tight, concentrated highlight. Ns values normally range from 0 to 1000.
					params.shininess = parseFloat( value );
					break;
				case 'd':
					n = parseFloat( value );
					if ( n < 1 ) {
						params.opacity = n;
						params.transparent = true;
					}
					break;
				case 'tr':
					n = parseFloat( value );
					if ( this.options && this.options.invertTrProperty ) n = 1 - n;
					if ( n > 0 ) {
						params.opacity = 1 - n;
						params.transparent = true;
					}
					break;
				default:
					break;
			}
		}
		this.materials[ materialName ] = new THREE.MeshPhongMaterial( params );
		return this.materials[ materialName ];
	},

This shows how each field in the MTL file is used. Here we focus on how the texture image is loaded; the other field parameters can be read against the MTL comments above. map_Kd, map_Ks, norm, map_bump, bump, and map_d are all handled by calling setMapForType(), and each loads a texture for a different purpose.

function setMapForType( mapType, value ) {
	......
	var map = scope.loadTexture( resolveURL( scope.baseUrl, texParams.url ) );
	......
}

The texture URL is resolved against the base URL of the material file before loading. Moving on to loadTexture().

loadTexture: function ( url, mapping, onLoad, onProgress, onError ) {
	......
	var loader = THREE.Loader.Handlers.get( url );
	......
	loader = new THREE.TextureLoader( manager );
	......
	texture = loader.load( url, onLoad, onProgress, onError );
	return texture;
}

It basically builds a TextureLoader and calls its load() method.

load: function ( url, onLoad, onProgress, onError ) {
	......
	var loader = new ImageLoader( this.manager );
	......
	loader.load( url, function ( image ) { ...... } );
}

It is further loaded by ImageLoader.

load: function ( url, onLoad, onProgress, onError ) {
	......
	var image = document.createElementNS( 'http://www.w3.org/1999/xhtml', 'img' );
	......
	image.src = url;
	return image;
}

So the image is ultimately loaded by creating an <img> element. The element is never added to the DOM tree; simply assigning its src is enough to trigger the download.

At this point, the material and texture loading analysis is finished. Next, we will analyze the loading of OBJ.

4.2 OBJLoader2#load()

	load: function ( url, onLoad, onProgress, onError, onMeshAlter, useAsync ) {
		var resource = new THREE.LoaderSupport.ResourceDescriptor( url, 'OBJ' );
		this._loadObj( resource, onLoad, onProgress, onError, onMeshAlter, useAsync );
	},

Again this just delegates further: _loadObj() is called.

_loadObj: function ( resource, onLoad, onProgress, onError, onMeshAlter, useAsync ) {
	......
	var fileLoaderOnLoad = function ( content ) {
		......
		// Parse the obj content
		loaderRootNode: scope.parse( content ),
		......
	};
	// 1. Build the FileLoader
	var fileLoader = new THREE.FileLoader( this.manager );
	......
	// 2. Load the file (already analyzed for MTL loading); on success
	// fileLoaderOnLoad is invoked
	fileLoader.load( resource.name, fileLoaderOnLoad, onProgress, onError );
}

The code of _loadObj() has also been simplified here, with the logic explained in the comments. File loading was analyzed earlier for the MTL case, so let's focus on parsing the obj.

/**
* Parses OBJ data synchronously from arraybuffer or string.
*
* @param {arraybuffer|string} content OBJ data as Uint8Array or String
*/
parse: function ( content ) {
	......
	// 1. Initialize the meshBuilder
	this.meshBuilder.init();
	// Build the Parser
	var parser = new THREE.OBJLoader2.Parser();
	......
	var onMeshLoaded = function ( content ) {
		// 4. Obtain the meshes from the meshBuilder and add them to the root node
		var meshes = scope.meshBuilder.processPayload( content );
		var mesh;
		for ( var i in meshes ) {
			mesh = meshes[ i ];
			scope.loaderRootNode.add( mesh );
		}
	};
	......
	// Parse the text
	parser.parseText( content );
	......
}

The focus here is parseText().

parseText: function ( text ) {
	......
	for ( var char, word = '', bufferPointer = 0, slashesCount = 0, i = 0; i < length; i++ ) {
		......
		this.processLine( buffer, bufferPointer, slashesCount );
		......
	}
	......
}

Again, you can skip the omitted parts and go straight to processLine(), which does the actual parsing of the obj file.

     processLine: function ( buffer, bufferPointer, slashesCount ) {
		if ( bufferPointer < 1 ) return;

		var reconstructString = function ( content, legacyMode, start, stop ) {
			var line = ' ';
			if ( stop > start ) {

				var i;
				if ( legacyMode ) {

					for ( i = start; i < stop; i++ ) line += content[ i ];

				} else {


					for ( i = start; i < stop; i++ ) line += String.fromCharCode( content[ i ] );

				}
				line = line.trim();

			}
			return line;
		};

		var bufferLength, length, i, lineDesignation;
		lineDesignation = buffer[ 0 ];
		switch ( lineDesignation ) {
			case 'v':
				this.vertices.push( parseFloat( buffer[ 1 ] ) );
				this.vertices.push( parseFloat( buffer[ 2 ] ) );
				this.vertices.push( parseFloat( buffer[ 3 ] ) );
				if ( bufferPointer > 4 ) {

					this.colors.push( parseFloat( buffer[ 4 ] ) );
					this.colors.push( parseFloat( buffer[ 5 ] ) );
					this.colors.push( parseFloat( buffer[ 6 ] ) );

				}
				break;

			case 'vt':
				this.uvs.push( parseFloat( buffer[ 1 ] ) );
				this.uvs.push( parseFloat( buffer[ 2 ] ) );
				break;

			case 'vn':
				this.normals.push( parseFloat( buffer[ 1 ] ) );
				this.normals.push( parseFloat( buffer[ 2 ] ) );
				this.normals.push( parseFloat( buffer[ 3 ] ) );
				break;

			case 'f':
				bufferLength = bufferPointer - 1;

				// "f vertex ..."
				if ( slashesCount === 0 ) {

					this.checkFaceType( 0 );
					for ( i = 2, length = bufferLength; i < length; i ++ ) {
						this.buildFace( buffer[ 1 ] );
						this.buildFace( buffer[ i ] );
						this.buildFace( buffer[ i + 1 ] );
					}

				// "f vertex/uv ..."
				} else if  ( bufferLength === slashesCount * 2 ) {

					this.checkFaceType( 1 );
					for ( i = 3, length = bufferLength - 2; i < length; i += 2 ) {
						this.buildFace( buffer[ 1 ], buffer[ 2 ] );
						this.buildFace( buffer[ i ], buffer[ i + 1 ] );
						this.buildFace( buffer[ i + 2 ], buffer[ i + 3 ] );
					}

				// "f vertex/uv/normal ..."
				} else if  ( bufferLength * 2 === slashesCount * 3 ) {

					this.checkFaceType( 2 );
					for ( i = 4, length = bufferLength - 3; i < length; i += 3 ) {
						this.buildFace( buffer[ 1 ], buffer[ 2 ], buffer[ 3 ] );
						this.buildFace( buffer[ i ], buffer[ i + 1 ], buffer[ i + 2 ] );
						this.buildFace( buffer[ i + 3 ], buffer[ i + 4 ], buffer[ i + 5 ] );
					}

				// "f vertex//normal ..."
				} else {

					this.checkFaceType( 3 );
					for ( i = 3, length = bufferLength - 2; i < length; i += 2 ) {
						this.buildFace( buffer[ 1 ], undefined, buffer[ 2 ] );
						this.buildFace( buffer[ i ], undefined, buffer[ i + 1 ] );
						this.buildFace( buffer[ i + 2 ], undefined, buffer[ i + 3 ] );
					}
				}
				break;

			case 'l':
			case 'p':
				bufferLength = bufferPointer - 1;
				if ( bufferLength === slashesCount * 2 )  {

					this.checkFaceType( 4 );
					for ( i = 1, length = bufferLength + 1; i < length; i += 2 ) this.buildFace( buffer[ i ], buffer[ i + 1 ] );

				} else {

					this.checkFaceType( ( lineDesignation === 'l' ) ? 5 : 6 );
					for ( i = 1, length = bufferLength + 1; i < length; i ++ ) this.buildFace( buffer[ i ] );

				}
				break;

			case 's':
				this.pushSmoothingGroup( buffer[ 1 ] );
				break;

			case 'g':
				// 'g' leads to creation of mesh if valid data exists (faces declaration was done before), otherwise only groupName gets set
				this.processCompletedMesh();
				this.rawMesh.groupName = reconstructString( this.contentRef, this.legacyMode, this.globalCounts.lineByte + 2, this.globalCounts.currentByte );
				break;

			case 'o':
				// 'o' is meta-information and usually does not result in creation of new meshes, but can be enforced with "useOAsMesh"
				if ( this.useOAsMesh ) this.processCompletedMesh();
				this.rawMesh.objectName = reconstructString( this.contentRef, this.legacyMode, this.globalCounts.lineByte + 2, this.globalCounts.currentByte );
				break;

			case 'mtllib':
				this.rawMesh.mtllibName = reconstructString( this.contentRef, this.legacyMode, this.globalCounts.lineByte + 7, this.globalCounts.currentByte );
				break;

			case 'usemtl':
				var mtlName = reconstructString( this.contentRef, this.legacyMode, this.globalCounts.lineByte + 7, this.globalCounts.currentByte );
				if ( mtlName !== '' && this.rawMesh.activeMtlName !== mtlName ) {
					this.rawMesh.activeMtlName = mtlName;
					this.rawMesh.counts.mtlCount++;
					this.checkSubGroup();
				}
				break;

			default:
				break; }},Copy the code

This code is longer (more than 150 lines), but it is fairly simple: each line is parsed according to the obj file format. If you have forgotten the format, I suggest reviewing the annotated excerpt above first. The parsing itself is already commented in detail, so I won't go line by line. In the end, vertices, texture coordinates, and normals are assembled according to the face indices, yielding interleaved vertex arrays of the form vvv|vtvt|vnvnvn along with the corresponding index arrays. Vertex arrays, index arrays, and materials/textures together make up the 3D mesh used for rendering.
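That assembly step can be sketched as follows; this is my own simplification (the real buildFace() also batches and deduplicates), expanding one triangle's 1-based v/vt/vn indices into a single interleaved buffer:

```javascript
// Illustrative only: assemble an interleaved vvv|vtvt|vnvnvn buffer from
// 1-based face corner indices, the way parsed obj data is consumed.
function interleave( vertices, uvs, normals, faceCorners ) {
	var out = [];
	faceCorners.forEach( function ( c ) {
		// c = [ vIdx, vtIdx, vnIdx ], all 1-based as in the obj file
		out.push.apply( out, vertices[ c[ 0 ] - 1 ] ); // vvv
		out.push.apply( out, uvs[ c[ 1 ] - 1 ] );      // vtvt
		out.push.apply( out, normals[ c[ 2 ] - 1 ] );  // vnvnvn
	} );
	return out;
}

var buf = interleave(
	[ [ 0, 0, 0 ], [ 1, 0, 0 ], [ 0, 1, 0 ] ],  // positions
	[ [ 0, 0 ], [ 1, 0 ], [ 0, 1 ] ],           // texture coordinates
	[ [ 0, 0, 1 ], [ 0, 0, 1 ], [ 0, 0, 1 ] ],  // normals
	[ [ 1, 1, 1 ], [ 2, 2, 2 ], [ 3, 3, 3 ] ]   // one triangle "f 1/1/1 2/2/2 3/3/3"
);
```

Each corner contributes 3 + 2 + 3 = 8 floats, so one triangle yields 24 values, ready to upload as a vertex buffer.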

That concludes the analysis of obj loading. Loading the obj itself is the main task, but it is also the simplest part; most of the subtleties lie in material and texture loading.

5. render()

var render = function () {
    requestAnimationFrame( render );
    app.render();
};

This render is a standalone function, not the render() method of OBJLoader2Example (that method is called inside it). It first requests an animation-frame callback, passing itself as the callback, so the browser calls render again on every repaint; it then calls OBJLoader2Example's render() method to draw the 3D scene. Here is a quick look at MDN's description of requestAnimationFrame.

The window.requestAnimationFrame() method tells the browser that you wish to perform an animation and requests that the browser call a specified function to update the animation before the next repaint. The method takes a callback as its argument, which is invoked before the repaint. Call this method whenever you are ready to update the screen. The callback typically fires 60 times per second, but in most browsers it matches the display refresh rate, as recommended by the W3C.

The key point is that the callback rate matches the display refresh rate, typically 60 FPS. Let's take a quick look at OBJLoader2Example's render() method.

render: function () {
	if ( ! this.renderer.autoClear ) this.renderer.clear();
	this.controls.update();
	this.renderer.render( this.scene, this.camera );
}

As you can see, the actual drawing is done by the WebGLRenderer, which would take us further down into OpenGL. OpenGL is a big topic, so I won't analyze it here; this isn't the place for it.
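The self-scheduling loop from section 5 can be sketched with a stub standing in for requestAnimationFrame; the stub and its synchronous driving are my own simplification, since in the browser the real API paces the callbacks at the refresh rate:

```javascript
// Illustrative only: the render-loop pattern with a synchronous stub
// standing in for the browser's requestAnimationFrame.
var pending = [];
function requestAnimationFrameStub( cb ) { pending.push( cb ); }

var frames = 0;
var app = { render: function () { frames++; } };

var render = function () {
	requestAnimationFrameStub( render ); // schedule the next frame first
	app.render();                        // then draw the current one
};

render();          // frame 1
pending.shift()(); // the "browser" fires the callback: frame 2
```

Because the function re-registers itself before drawing, there is always exactly one callback queued for the next repaint.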

III. Afterword

This article mainly analyzes how ThreeJs loads an obj model and renders it. The analysis is long, but it is not complicated and involves no difficult concepts. Because my JavaScript was admittedly limited before this analysis, I first wrote "ThreeJS learning notes – JavaScript functions and objects" to fill the gap. With a deeper understanding of functions and objects, plus a basic OpenGL foundation, the step-by-step analysis of the loading process is actually fairly easy.

Finally, thank you for reading to the end. I hope this simple analysis and sharing are helpful to you; a thumbs-up would encourage me to keep the analysis going.