Preface

In 2019, terms such as VR, AR, XR, 5G, and the Industrial Internet appeared constantly in our field of view. Information sharing and the combination of the virtual and the real have become the general trend: 5G is an important direction in the upgrade of the new generation of information and communication technology, and the Industrial Internet is the development trend of the transformation and upgrading of manufacturing. The VR discussed in this article is another way of communicating with machinery and equipment in the manufacturing industry. When a rising technology meets a manufacturing trend, it will undoubtedly become an important driving force for the digital transformation of the manufacturing and industrial control industries, and “5G + VR + Industrial Internet” will be a recurring topic in the new year. Combining the problems encountered in today's industry with virtual reality lets us communicate more closely and feel the changes technology brings us. As we all know from Apple's press conference this year, Apple has not yet released a 5G phone, which shows that the application and development of 5G is still in a stage of rapid growth; yet mobile apps with AR functions were released long ago. The speed of 5G combined with the immersion of AR and VR lets us feel not only technological innovation but also the practical application scenarios of the technology in different fields. I believe the new year of 2020 will surely be another new beginning for the application of “5G + VR + Industrial Internet”; the concrete application case developed with HT for Web combined with WebVR is discussed in the rest of this article.

Preview system

Preview address: a 3D virtual reality training system based on HTML5 WebGL and WebVR visualization, at http://www.hightopo.com/demo/vr-training/

VR disassemble and restore

VR operating

VR Scene Switching

PC disassemble and restore

PC test

System introduction

The system is divided into three practical application levels:

  • 3D training: the user can disassemble the device by finger touch on mobile or mouse drag on PC, restore the device to its original state with one-button restoration, or, after disassembly, watch the device being automatically disassembled or automatically restored via the disassemble and restore buttons.
  • Examination system: this part tests your familiarity with equipment disassembly. After the 3D training in the first step, you can test your understanding of the disassembly process in this system.
  • VR mode: this part is the concrete application of the 3D scene combined with WebVR. After entering VR, the device can be disassembled and restored by operating the VR handle.

This article focuses on the VR mode of part 3, showing how to build VR scenes with HT. The following describes the main operations in VR. The six buttons described below do not appear before you enter VR: when you enter WebVR, the system automatically displays the six operation buttons in the VR scene; conversely, when you exit VR, the system automatically hides them in the 3D scene.

  • Device switching: as the name implies, you can align the gamepad ray with the list on the left of the scene and press the touchpad to switch the scene's device.
  • Operation switching: there are the following two operations on the device in VR; you can click the mode button in the lower right corner to switch between them.
  • Pan mode: in this mode, the user can aim at the device and press the touchpad to move the device from one position to another, and slide the touchpad to pull the device parts closer or farther.
  • Grab mode: in this mode, the user can aim at the device and press the touchpad to grab it. After grabbing the device, the user can rotate, enlarge, or shrink the parts by sliding the touchpad.
  • One-button restoration: restores all parts of the equipment to their original positions.
  • Disassembly animation: disassembles all parts of the equipment step by step along their predetermined paths.
  • Restore animation: this operation can be understood as the reverse of the disassembly animation; that is, the disassembly process is played back in reverse order.
  • Wireframe switch: HT supports a triangle view of a device node, letting you see the wireframe outline of the device.

System development

3D scene

HT supports importing OBJ models, and the equipment parts in the VR scene are all OBJ models. Since the equipment needs to be disassembled later, each part of the equipment must be modeled separately rather than modeling the equipment as a whole. If the equipment were modeled as a whole, it would be a single Data node in the HT scene, so the parts could not be disassembled. If it is split up, multiple OBJ files can be loaded in HT, producing multiple Data nodes; with one Data node per part, the parts of the equipment can be moved, rotated, and so on. For the specific meaning of Data in HT, please refer to the HT for Web Data Model manual.

The OBJ models imported into the scene look as follows:

As can be seen above, the parts are scattered after we import the OBJ files, so we need to adjust the initial positions of the parts to assemble a complete piece of equipment made up of many parts. This adjustment need not, of course, be done through code; it can be done in the corresponding 3D editor by dragging the different parts together. The assembled equipment looks as follows:

VR setup

The VR scene is built on the basis of the first step. The buttons mentioned above that are only displayed in the VR scene are also built into the scene; in normal scenes, we can hide the corresponding nodes. node.s('3d.visible', false) is the HT code for hiding a node in 3D. Because HT sends the corresponding state to the user when entering and leaving VR, the corresponding pseudocode is as follows:

```javascript
var vr = graph3dView.vr;
vr.mp(function(e) {
    // property specifies the VR event type; detail specifies the event state
    var property = e.property,
        detail = e.newValue;
    // 'present' indicates entering or leaving the VR scene
    if (property === 'present') {
        // detail is true when entering VR, false when leaving VR
        if (detail) {
            // show the nodes that only need to be displayed in the VR scene
        } else {
            // hide the nodes that need to be hidden outside the VR scene
        }
    }
});
```

In HT, the property above will be one of the following types, mainly covering the VR state and the gamepad operation types:

  • enable: the enable state of the VR changed
  • present: the present state of the VR changed, indicating entering or leaving the VR world
  • gamepad.pose: the handle's position or rotation changed; parameters id, position, rotation
  • gamepad.axes: the touch point on the handle's touchpad changed; parameters id, axes. axes takes the form [0.2, 0.7], the two values representing the horizontal and vertical percentages respectively
  • gamepad.button.thumbpad: the thumbpad button was pressed; parameters id, state, where state is down or up
  • gamepad.button.trigger: the trigger button was pressed; parameters id, state, where state is down or up
  • gamepad.button.grips: the grips button was pressed; parameters id, state, where state is down or up
  • gamepad.button.menu: the menu button was pressed; parameters id, state, where state is down or up
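As an illustration of how these event types might be dispatched, the hypothetical helper below maps a (property, detail) pair to an action name. It is not an HT API, and the action names and the axes-to-action mapping are assumptions made for this sketch:

```javascript
// Hypothetical helper, not part of HT: map a VR event from the list above
// to an illustrative action name.
function handleVREvent(property, detail) {
    if (property === 'present') {
        // entering VR shows the six operation buttons, leaving hides them
        return detail ? 'show-vr-buttons' : 'hide-vr-buttons';
    }
    if (property === 'gamepad.button.trigger') {
        // state is 'down' or 'up'
        return detail.state === 'down' ? 'start-grab' : 'end-grab';
    }
    if (property === 'gamepad.axes') {
        // axes like [0.2, 0.7]: horizontal and vertical percentages
        const [h, v] = detail.axes;
        return Math.abs(v) > Math.abs(h) ? 'zoom' : 'rotate';
    }
    return 'ignore';
}
```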

One of the key configurations in VR is the scale, because units in VR correspond to real-world lengths: if we walk 1 m forward while wearing the headset, we need a rule for how far that movement should go in the HT 3D scene. The VR plug-in provides a measureOflength configuration item for this, as follows:

```javascript
var vr_config = {
    measureOflength: 0.01
};
```

0.01 means that one unit of length in the HT scene is 0.01 meter in the real scene, so if you move 1 meter in the real scene while wearing the headset, the corresponding eye in HT will move 100 units forward. Therefore, when building a VR scene, you should pay attention to the difference between the modeling scale of the scene and the real world and model everything at a uniform scale; otherwise devices of inconsistent size will appear in the VR scene, producing an illusion. In the comparison below, the left uses a ratio of 0.01, and the ray's dot is very small; the right uses a ratio of 0.001, which causes the ray's dot to become larger.
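The ratio arithmetic can be sketched with two small helper functions (the names are illustrative, not HT APIs):

```javascript
// Convert between real-world meters and HT scene units for a given
// measureOflength (meters per scene unit).
function metersToSceneUnits(meters, measureOflength) {
    return meters / measureOflength;
}
function sceneUnitsToMeters(units, measureOflength) {
    return units * measureOflength;
}
// With measureOflength = 0.01, walking 1 m in reality corresponds to
// moving 100 units in the HT scene.
```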

HT encapsulates the WebVR interfaces provided by the browser. This includes obtaining the device with navigator.getVRDisplays(), the first step in entering the VR world; if executing this code returns an empty result, obtaining the VR device failed. The controllers are obtained with navigator.getGamepads(). Users can type these two lines into the browser console to check whether the browser has obtained the VR device and VR controller information; an empty return means the acquisition failed. HT sets the VR enable flag to true internally to turn on VR; the user does not need to execute this code, since the VR plug-in provided by HT offers the corresponding configuration item vrEnable, where true means VR is enabled. This configuration also hangs off the vr_config object above, as follows:
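A minimal feature-detection sketch for the two browser entry points mentioned above (the helper name is an assumption; it takes the navigator object as a parameter so it can be exercised outside a browser):

```javascript
// Check whether the two WebVR entry points mentioned in the text exist.
function detectWebVR(nav) {
    return {
        hasDisplays: typeof nav.getVRDisplays === 'function',
        hasGamepads: typeof nav.getGamepads === 'function'
    };
}
// In a browser you would call: detectWebVR(navigator)
```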

```javascript
var vr_config = {
    measureOflength: 0.01,
    vrEnable: true // enable VR
};
```

The system in this demo can switch scenes directly in VR. Since the values of the configuration items in vr_config may differ between scenes (for example, the measureOflength scale of the first scene is 0.01 while that of the second is 0.02), the VR plug-in provides a destruction function to destroy the resources of the previous scene. The destruction includes clearing all nodes in the previous scene, so when loading a new scene there is no need to clear the scene nodes yourself; there is no need to execute dataModel.clear(), because the destruction function provided by the VR plug-in has already cleared them. The gamepad and ray are each a Data node in the scene, so there is no need to remove the gamepad and ray nodes in the new scene either; the plug-in manages these nodes for you. Call graph3dView.deserialize('Scene resource JSON address') to deserialize the new scene's JSON file, modify the vr_config values for the new scene, and then call graph3dView.initVRForScene() to re-initialize the VR scene. The pseudocode for these steps is as follows:

```javascript
// window.gvr is a global VR plug-in variable initialized after calling
// graph3dView.initVRForScene(), for user access to the plug-in object
window.gvr.destory();
// Deserialize the new scene
graph3dView.deserialize('Scene resource JSON address', function(json, dm, g3d, datas) {
    // Modify the new scene's scale to 0.02
    window.vr_config.measureOflength = 0.02;
    window.vr_config.vrEye = ht.Default.clone(g3d.getEye());
    // Initialize the VR scene again
    graph3dView.initVRForScene();
});
```
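One way to organize the per-scene configuration values described above is a simple lookup table. This is a sketch under assumed scene file names; only the 0.01/0.02 ratios come from the example in the text:

```javascript
// Illustrative per-scene vr_config overrides; the scene URLs are assumptions.
const sceneConfigs = {
    'scene1.json': { measureOflength: 0.01 },
    'scene2.json': { measureOflength: 0.02 }
};

// Merge a scene's overrides onto shared defaults before re-initializing VR.
function configForScene(url, defaults) {
    return Object.assign({}, defaults, sceneConfigs[url] || {});
}
```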

Of course, the VR plug-in provided by HT also has many other configuration items to help users tune VR scenes, including terrain brushing, scene movement mode, and scene operation mode. The main process of building VR with HT is shown in the following flow chart:

Through the flow chart above, we can get a general idea of how to quickly build VR scenes with the VR plug-in provided by HT.

At present, Google Chrome and Firefox support WebVR well. You can experience the VR scenes provided by the WebVR demos on the official Firefox website online.

Disassembly rules

As can be seen from the renderings earlier in the article, the equipment in each scene can be disassembled, and the number of parts, their positions, the direction of disassembly, and the length of the offset differ for each device. It is therefore impractical to hard-code the offset length and direction; a set of disassembly rules is needed to make producing the disassembly animation for each scene more convenient. With such rules, the designer only needs to configure the disassembly animation of different devices in different scenes according to the rules agreed with the programmer. Disassembly in this system is divided into two cases:

  • Single-unit movement: a single device part moves along the direction of the line connecting its parent node's position and its own position
  • Combined movement: a combination of multiple parts moves along a certain direction; after the combined movement, the parts can move again from their new positions along some direction. This can be nested indefinitely; that is, after one combination, further combined or single movements are possible

The diagram of single-unit movement is as follows:

The combined movement diagram is as follows:

In HT, data.setDisplayName('name') can be used to set a node's name, and we agree that the offset information for different devices is derived from their names. For example, data.setDisplayName('1-0.5-1000') is a configuration rule agreed with the designer: 1 represents the first disassembly step (of course there can be several 1s in the scene, meaning those parts are disassembled simultaneously in the first step); 0.5 means an offset toward the parent node of 50% of the length of the line connecting the parent node's position and the node's position; 1000 means the offset lasts 1000 milliseconds, after which the rotation axis and rotation angle can also be specified. Once the designer knows these configuration rules, different parts can be configured through the visual editor, so the programming side only needs to write one set of common logic to disassemble and restore different equipment.
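Parsing the agreed naming rule can be sketched as follows (the helper name is illustrative, not part of the system's code):

```javascript
// Parse a display name such as '1-0.5-1000' into its three agreed fields.
function parseDisassembleRule(displayName) {
    const [step, percent, duration] = displayName.split('-');
    return {
        step: parseInt(step, 10),        // disassembly step index
        percent: parseFloat(percent),    // offset as a fraction of the parent-to-node distance
        duration: parseInt(duration, 10) // animation duration in milliseconds
    };
}
```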

The system maintains a queue and a stack: the queue records the disassembly order, and the stack records the restore order. Parts are pushed into the queue in order of their configured step numbers. Because of the queue's first-in, first-out property, the first part pushed into the queue is the first to be disassembled, and the last part pushed in is the last. As parts are dequeued for disassembly, they are simultaneously pushed onto the stack; the stack records the restore order because the first part disassembled is the last to be restored. The two different data structures are thus each used to record the data they fit best. The relevant JS pseudocode follows:

```javascript
// Record the disassembly order
const queue = [];
// Record the restore order
const stack = [];
dataModel.each((node) => {
    const displayName = node.getDisplayName();
    if (displayName) {
        const [index, distancePer, during] = displayName.split('-');
        if (index !== void 0) {
            if (queue[index]) {
                if (queue[index] instanceof Array) {
                    queue[index].push(node);
                } else {
                    const tempNode = queue[index];
                    queue[index] = [tempNode, node];
                }
            } else {
                queue[index] = node;
            }
        }
    }
});
```

The related logic is as follows:

Through the convention above, designers can use visual editors to configure the movement rules of different parts, greatly improving the efficiency of animation production.

Code analysis

This part mainly analyzes the code of the disassembly and restore animations. Vectors and some basic trigonometry are used to calculate the positions of the different parts in 3D space. At initialization, two positions must be recorded for each part: its position after all preceding combined moves, and its initial position before any move. These two position vectors are needed, respectively, to move the part after disassembly as described above and to restore the part to its place in the fully assembled device. Both play an important role; the relevant pseudocode is as follows:

```javascript
// Vector3 is the vector class provided by HT
const Vector3 = ht.Math.Vector3;
// The first important position vector: the part's initial position before any move
node.a('initP3Vec', new Vector3(node.p3()));
// moveQueue is the queue of ancestor combinations of node that move before it
for (let i = 0, l = moveQueue.length; i < l; i++) {
    const moveNode = moveQueue[i],
        parentMoveNode = moveNode.getParent();
    if (parentMoveNode) {
        const [, distancePer] = moveNode.getDisplayName().split('-');
        moveNode.a('defP3', moveNode.p3());
        moveNode.p3(new Vector3().lerpVectors(
            new Vector3(moveNode.p3()),
            new Vector3(parentMoveNode.p3()),
            parseFloat(distancePer)).toArray());
    }
}
// Record the second important position vector, after the combined moves
node.a('relativeP3Vec', new Vector3(node.p3()));
// Restore the combinations' parent positions in reverse order
for (let i = moveQueue.length - 1; i >= 0; i--) {
    const moveNode = moveQueue[i];
    moveNode.p3(moveNode.a('defP3'));
    moveNode.a('defP3', undefined);
}
```
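The interpolation used above can be illustrated with plain arrays in place of ht.Math.Vector3 (a simplified sketch of the same lerp):

```javascript
// Linearly interpolate a 3D position: move `t` of the way from `from`
// toward `to`, mirroring Vector3.lerpVectors(from, to, t).
function lerpP3(from, to, t) {
    return from.map((v, i) => v + (to[i] - v) * t);
}
// Moving a part at the origin 50% of the way toward its parent at
// [100, 200, 0] yields [50, 100, 0].
```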

During scene disassembly, the device part nodes must be set unselectable, so each part's selectable state must be recorded beforehand in order to restore the node's initial state afterwards. The relevant pseudocode is as follows:

```javascript
dm3d.each((node) => {
    node.a('defSelectable', node.s('3d.selectable'));
});
```

The wireframe effect shown in this article is the wireframe mode supported by the HT core package, which can be configured with the following code:

```javascript
dm3d.each((data) => {
    if (data.s('shape3d') && data.s('shape3d').startsWith('models/')) {
        data.s({
            'shape3d.transparent': true,
            'shape3d.opacity': 0,       // hide the original model
            'wf.geometry': true,        // enable the wireframe
            'wf.combineTriangle': 2,    // wireframe triangle merge type
            'wf.color': 'rgba(96,172,252,0.3)' // wireframe color
        });
    }
});
```

The values of wf.combineTriangle above mainly include:

  • false, 0: do not merge triangles
  • true, 1: merge adjacent triangles into quadrilateral faces (the original effect)
  • 2: merge all connected coplanar triangles
  • 3: merge all smooth triangle faces according to normal information

VR software and hardware installation

The VR hardware device used in this system is HTC VIVE. The following is the process and steps of installing HTC VIVE.

Step 1: Match HTC VIVE with PC host

Go to the HTC official website, find the connection guide, and follow the steps to install; we only need to look at the directory in the screenshot below.

Step 2: Download the software

Download Steam VR from Steam's official website.

Step 3: Open Steam VR to check the device status

Open Steam VR and the following screen appears, showing the working status of the HTC VIVE headset. Through the icons, we can check the working status of the headset, controllers, positioners, and other accessories.

Step 4: Select room setting mode

If your room is large, you can choose the first option; I chose the second option, standing mode. It is recommended to choose whichever mode your room size allows you to set up completely.

Step 5: Place the helmet and two handle controllers within the visual range of the two positioners to establish positioning

Step 6: Calibrate helmet center point

This step sets the default orientation of the helmet.

Step 7: Locate the ground

Place the two handle controllers within sight of the positioners, then click the "calibrate the ground" button on the computer screen and wait for the system to calibrate.

Step 8: Enter Steam VR’s own room for testing

Once set up, you can enter Steam VR’s own room.

To sum up, when people talk about new applications in the 5G era, VR and AR are always a hot topic. Mobile networks in the 4G era can already carry high-definition video, so the 5G era will surely be able to transmit immersive VR and AR images with their larger data volumes. Many people therefore regard 5G as the stepping stone for the rise of VR and AR, and visiting the ends of the earth anytime, anywhere no longer seems a far-fetched dream. Current 4G networks introduce a delay of about 70 ms into VR/AR applications, which produces a dizzying experience, while 5G data transmission delay can reach the millisecond level, effectively solving the vertigo caused by latency and paving the way for large-scale VR/AR applications. At present, with the gradual popularization of 5G networks, the VR/AR industry is gradually recovering and market enthusiasm is warming up, with virtual reality games, virtual reality live broadcasting, and other concrete applications of 5G in VR/AR emerging. As technology progresses, safety is also an important topic, and combining VR with simulation is likewise the trend. Simulation can give users a realistic, first-person experience: for fire warnings or pipeline warnings, for example, users can practice firefighting and other firefighter operations in the VR world, immersing themselves in what to actually do when a fire breaks out. The application of VR is therefore far more than simulated experience; more importantly, it provides real practical value for people rather than gimmicks.

Screenshots of application running on mobile phone: