Background

Last night I couldn’t sleep, and some VR videos on a foreign video site gave me an idea. I spent about 40 minutes putting together this small demo to share with you.

The result

How it works

First, we create a sphere.

Each frame of the video is uploaded as a texture and unwrapped in the fragment shader according to the sphere’s UV coordinates.

The camera sits at the origin, inside the sphere.

The renderer is updated every frame with requestAnimationFrame (rAF).

Preparing the assets

Video asset

First, grab a panoramic (equirectangular) video from some video site, domestic or foreign, by whatever means works for you. Opened in VLC, it looks something like this:

Code implementation

First, import three.js and its bundled OrbitControls (the orbit controller):

import * as THREE from "three/build/three.module";
import { OrbitControls } from "three/examples/jsm/controls/OrbitControls";

Add the following markup to the component’s template:

 <div class="player">
    <div>
      <button @click="$refs.video.play()">play</button>
    </div>
    <video
      preload
      ref="video"
      controls
      loop
      style="width: 100%; visibility: hidden; position: absolute"
      :src="src"
    ></video>

    <canvas
      style="width: 80%; height: 823px"
      width="1920"
      height="823"
      ref="canvas"
    ></canvas>
  </div>

Initializing the video texture

initVideoTexture() {
      this.videoTexture = new THREE.VideoTexture(this.$refs.video);
      this.videoTexture.needsUpdate = true;
      this.videoTexture.updateMatrix();
    }

Alternatively, you can create an off-screen canvas, drawImage each video frame into it, and build a plain THREE.Texture from that canvas; just remember to set the texture’s needsUpdate property to true whenever the canvas content changes.
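
A minimal sketch of that canvas-based alternative (the method name, canvas size, and frame-copy loop here are illustrative, not from the original demo):

    initCanvasTexture() {
      const video = this.$refs.video;
      // Off-screen canvas that receives a copy of each video frame
      const canvas = document.createElement("canvas");
      canvas.width = 1920;
      canvas.height = 960;
      const ctx = canvas.getContext("2d");

      // A plain Texture backed by the canvas instead of a VideoTexture
      this.videoTexture = new THREE.Texture(canvas);

      const copyFrame = () => {
        if (video.readyState >= video.HAVE_CURRENT_DATA) {
          ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
          // Tell three.js to re-upload the canvas to the GPU on the next render
          this.videoTexture.needsUpdate = true;
        }
        requestAnimationFrame(copyFrame);
      };
      copyFrame();
    }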

Initializing the scene

    initScene() {
      this.scene = new THREE.Scene();
    }

Initializing the camera

Note that the perspective camera is created at the origin; here it is moved to z = 30, still well inside the 100-radius sphere, so that OrbitControls has a distance to orbit around.

    initCamera() {
      this.camera = new THREE.PerspectiveCamera(45, 1024 / 768, 1, 1000);
      this.camera.position.z = 30;
      this.controls = new OrbitControls(this.camera, this.renderer.domElement);
      this.controls.maxDistance = 100;

      this.controls.update();
      // const helper = new THREE.CameraHelper(this.camera);
      // this.scene.add(helper);
      this.scene.add(this.camera);
    }

Initializing the mesh

Note that the texture is handed to the shader through the material’s uniforms: we pass in the VideoTexture we just initialized under the uniform name tex_0, which is the name we will use in the fragment shader.

    initMesh() {
      this.geometry = new THREE.SphereGeometry(100, 32, 16);
      this.material = new THREE.ShaderMaterial({
        wireframe: false,
        side: THREE.DoubleSide,
        map: this.videoTexture,
        uniforms: {
          tex_0: new THREE.Uniform(this.videoTexture),
        },
        vertexShader: require("@/components/v.glsl").default,
        fragmentShader: require("@/components/f.glsl").default,
      });
      this.mesh = new THREE.Mesh(this.geometry, this.material);
    }
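
By the way, the shaders are loaded with require("@/components/v.glsl").default, which only works if webpack is configured to import .glsl files as raw strings. The post does not show that configuration; one possible setup in vue.config.js, assuming raw-loader is installed:

    // vue.config.js
    module.exports = {
      chainWebpack: (config) => {
        // Import *.glsl files as raw strings via raw-loader
        config.module
          .rule("glsl")
          .test(/\.glsl$/)
          .use("raw-loader")
          .loader("raw-loader")
          .end();
      },
    };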

Tick function

  update() {
      this.renderer.render(this.scene, this.camera);

      requestAnimationFrame(this.update);
    }

Mounted function

Since this is a Vue single-file component, everything is wired up in the mounted hook:

 mounted() {
    this.initRenderer();
    this.initScene();
    this.initVideoTexture();
    this.initMesh();
    this.initCamera();
    this.addMeshToScene();
    this.update();
  }
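
The initRenderer and addMeshToScene helpers referenced above are not shown in the post; a minimal sketch of what they might look like, using the canvas ref from the template:

    initRenderer() {
      // Render into the <canvas> declared in the template
      this.renderer = new THREE.WebGLRenderer({
        canvas: this.$refs.canvas,
        antialias: true,
      });
    },
    addMeshToScene() {
      this.scene.add(this.mesh);
    }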

Shader implementation

Vertex shader

Here we transform the incoming vertex position with three.js’s built-in model-view and projection matrices (the MVP transform), then pass the UV coordinates on to the fragment shader through a varying variable named v_uv.

precision highp float;
varying vec2 v_uv;
void main() {
    gl_Position = projectionMatrix *
        modelViewMatrix *
        vec4(position.xyz, 1.0);
    v_uv = uv;
}

Fragment shader

Note that the default sphere’s UVs are laid out for viewing from the outside (its faces point outward), which is why we enabled double-sided rendering. Since our camera sits inside the sphere, the texture appears mirrored when viewed from within, which is clearly wrong, so we flip the U coordinate. That is very simple: just sample with 1.0 - v_uv.x.

precision highp float;
varying vec2 v_uv;
uniform sampler2D tex_0;
void main() {
    vec4 texColor = texture2D(tex_0, vec2(1. - v_uv.x, v_uv.y));
    gl_FragColor = texColor;
}


And that’s our panoramic video player. I have put the demo on Gitee; if you have any ideas or something to share, feel free to message me.