Note

ARKit tutorial series directory

This is a translation of the free ARKit sample chapter from raywenderlich.com's ARKit by Tutorials, chapter 7 of the original book. Chapters 7 through 9 of the book build a complete portal app. The original article is at www.raywenderlich.com/195361/buil…


This article is an excerpt from our book ARKit by Tutorials. The book shows you how to build five immersive, good-looking AR apps using Apple’s augmented reality framework, ARKit. Let’s get started!

Through this series of tutorials, you will implement a portal app using ARKit and SceneKit. Portal apps can be used for educational purposes, such as a virtual tour of the solar system, or for leisure, such as a virtual beach vacation.

The portal app

In this app, you place a virtual doorway to a futuristic room onto a horizontal plane in the real world. You can then walk in and out of the room and explore what’s inside.

In this tutorial, you will build the foundation of the portal app. Along the way, you will learn how to:

  • Set up an ARSession
  • Use ARKit to detect and render horizontal planes

Are you ready to build a gateway to another world?

Getting started

In Xcode, open the starter project, Portal.xcodeproj. Build and run it, and you’ll see a blank screen.

Ah yes, a blank canvas of opportunity! Open Main.storyboard and expand the Portal View Controller Scene.

The PortalViewController is the interface presented to the user after the application is started. It includes an ARSCNView to show the camera preview. Two UILabels are also included to provide instructions and feedback to users.

Now, open PortalViewController.swift. In this file, you’ll see the following variables, which represent the elements in the storyboard:

// 1
@IBOutlet var sceneView: ARSCNView?
// 2
@IBOutlet weak var messageLabel: UILabel?
// 3
@IBOutlet weak var sessionStateLabel: UILabel?

Here’s what each one is for:

  1. sceneView is used to display 3D SceneKit objects over the camera view.
  2. messageLabel is a UILabel that shows instructional messages to the user, for example telling them how to interact with your app.
  3. sessionStateLabel is another UILabel that notifies the user of session interruptions, such as when the app goes into the background or when the ambient lighting is inadequate.

Note: ARKit processes all of the sensor and camera data, but it doesn’t actually render any of the virtual content. To render content in your scenes, you use one of the renderers that work alongside ARKit, such as SceneKit or SpriteKit. ARSCNView is a class provided by Apple that makes it easy to combine ARKit data with SceneKit. There are many benefits to using ARSCNView, which is why you’ll use it for the project in this tutorial.
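To make that relationship concrete, here is a minimal sketch (my own illustration, not part of the starter project) of how an ARSCNView exposes both sides:

import ARKit

// An ARSCNView owns both the ARKit session (tracking and sensor data) and the
// SceneKit scene (virtual content rendered over the camera feed).
let arView = ARSCNView(frame: .zero)
let session: ARSession = arView.session        // ARKit side
let rootNode: SCNNode = arView.scene.rootNode  // SceneKit side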

In the starter project, you’ll find a number of helpers in the Helpers group. You’ll use them later as you develop the app.

Setting up ARKit

The first step is to capture the video stream from the camera, which is where the ARSCNView object comes in.

Open PortalViewController.swift and add the following method:

func runSession() {
  // 1
  let configuration = ARWorldTrackingConfiguration()
  // 2
  configuration.planeDetection = .horizontal
  // 3
  configuration.isLightEstimationEnabled = true
  // 4
  sceneView?.session.run(configuration)
  // 5
  #if DEBUG
    sceneView?.debugOptions = [ARSCNDebugOptions.showFeaturePoints]
  #endif
}

Code description:

  1. First, you instantiate an ARWorldTrackingConfiguration object, which defines the configuration for the ARSession. Two configuration types are available for an ARSession: ARSessionConfiguration and ARWorldTrackingConfiguration. Using ARSessionConfiguration is not recommended because it only tracks the device’s rotation, not its position. For devices with an A9 or later processor, ARWorldTrackingConfiguration is the best choice because it tracks all of the device’s movement.
  2. The configuration’s planeDetection is set to detect horizontal planes. The extent of a detected plane can change, and multiple planes can merge into one as the camera moves. Planes can be found on any horizontal surface, such as a floor, table or bed.
  3. This enables light estimation, which the rendering framework can use to make virtual objects look more realistic (a short sketch of reading the estimate follows this list).
  4. This runs the session’s AR processing with the specified configuration. It starts the ARKit session along with video capture, which is displayed in sceneView.
  5. In debug builds, this adds visible feature points, overlaid on top of the camera view.
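To give a concrete idea of what light estimation provides, here is a small sketch (a hypothetical helper, not a tutorial step) of reading the per-frame estimate; it assumes the sceneView outlet from the storyboard:

// Hypothetical helper, not part of the starter project: reads the light
// estimate ARKit computed for the most recent frame.
func currentAmbientIntensity() -> CGFloat? {
  // lightEstimate is nil until ARKit has gathered enough data for the frame.
  guard let estimate = sceneView?.session.currentFrame?.lightEstimate else {
    return nil
  }
  // A value of about 1000 lumens represents neutral, well-lit conditions.
  return estimate.ambientIntensity
}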

Now it’s time to set up the default settings for the labels. Replace resetLabels() with the following:

func resetLabels() {
  messageLabel?.alpha = 1.0
  messageLabel?.text = "Move the phone around and allow the app to find a plane. " +
    "You will see a yellow horizontal plane."
  sessionStateLabel?.alpha = 0.0
  sessionStateLabel?.text = ""
}

This sets the messageLabel and sessionStateLabel with transparency and text. Remember, the messageLabel is used to show instructions to the user, and the sessionStateLabel is used to show error messages, in case something goes wrong.

Now, add a call to runSession() in viewDidLoad() of PortalViewController:

override func viewDidLoad() {
  super.viewDidLoad()
  resetLabels()
  runSession()
}

This will run the ARKit session when the app starts and loads the view.

Next, build and run the app. Don’t forget — you need to grant camera access to the app.

ARSCNView does the heavy lifting of camera video capture and display. Since this is debug mode, you can see the rendered feature points, which form a point cloud that shows the intermediate results of the scene analysis.
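If you’d also like to see where ARKit places the session’s origin, one optional variation (an assumption on my part, not a step from the book) is to add a second debug option in runSession():

#if DEBUG
  sceneView?.debugOptions = [
    ARSCNDebugOptions.showFeaturePoints,  // the detected feature points
    ARSCNDebugOptions.showWorldOrigin     // draws the world coordinate axes
  ]
#endif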

Plane detection and rendering

Previously, in runSession(), you set planeDetection to .horizontal, which means your app can detect horizontal planes. You receive the captured plane information in the delegate callbacks of the ARSCNViewDelegate protocol.

Add a class extension to PortalViewController to implement the ARSCNViewDelegate protocol:

extension PortalViewController: ARSCNViewDelegate {}

Add the following code at the end of runSession():

sceneView?.delegate = self

This line sets PortalViewController as the ARSCNViewDelegate delegate of the sceneView object.

ARPlaneAnchor objects are automatically added to the ARSession’s list of anchors, and ARSCNView automatically converts each ARPlaneAnchor into an SCNNode.
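As an aside (not something the tutorial asks you to do), you can inspect this anchor bookkeeping yourself by looking at the latest frame’s anchors; the following hypothetical helper assumes the sceneView outlet:

// Hypothetical helper: lists the plane anchors the session currently tracks.
func detectedPlaneAnchors() -> [ARPlaneAnchor] {
  let anchors = sceneView?.session.currentFrame?.anchors ?? []
  return anchors.compactMap { $0 as? ARPlaneAnchor }
}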

Now, to render the planes, all you need to do is implement the following ARSCNViewDelegate method:

// 1
func renderer(_ renderer: SCNSceneRenderer,
              didAdd node: SCNNode,
              for anchor: ARAnchor) {
  // 2
  DispatchQueue.main.async {
    // 3
    if let planeAnchor = anchor as? ARPlaneAnchor {
      // 4
      #if DEBUG
        // 5
        let debugPlaneNode = createPlaneNode(
          center: planeAnchor.center,
          extent: planeAnchor.extent)
        // 6
        node.addChildNode(debugPlaneNode)
      #endif
      // 7
      self.messageLabel?.text =
        "Tap on the detected horizontal plane to place the portal"
    }
  }
}

Code Meaning:

  1. The renderer(_:didAdd:for:) method is called when ARSession detects a new plane and adds an ARPlaneAnchor for it; ARSCNView then adds an empty SCNNode for that anchor to the scene.
  2. This callback runs on a background thread, so you dispatch to the main thread here, because UI updates must happen on the main thread.
  3. Check whether the ARAnchor is an ARPlaneAnchor.
  4. Check whether the app is running in debug mode.
  5. If so, create a plane SCNNode using the center and extent coordinates of the planeAnchor detected by ARKit. createPlaneNode() is a helper function you’ll implement shortly.
  6. The node object is an empty SCNNode that ARSCNView automatically adds to the scene; its coordinates correspond to the ARAnchor’s position. Here, you add debugPlaneNode as a child node so that it’s placed at the node’s location.
  7. Finally, whether or not you’re in debug mode, update the message label to tell the user that the app is now ready to place a portal into the scene.
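ARKit can also remove a plane anchor, for example when two detected planes merge into one. Handling that case isn’t part of this tutorial, but a minimal sketch of the companion delegate method, under the same assumptions as the code above, might look like this:

func renderer(_ renderer: SCNSceneRenderer,
              didRemove node: SCNNode,
              for anchor: ARAnchor) {
  guard anchor is ARPlaneAnchor else { return }
  // Remove the debug plane that renderer(_:didAdd:for:) attached to this node.
  node.childNodes.forEach { $0.removeFromParentNode() }
}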

Now it’s time to create that helper method.

Create a new Swift file named SCNNodeHelpers.swift. It will hold all of the utility methods related to rendering SCNNode objects.

Import SceneKit at the top of the file:

import SceneKit

Now, add the following helper method:

// 1
func createPlaneNode(center: vector_float3, extent: vector_float3) -> SCNNode {
  // 2
  let plane = SCNPlane(width: CGFloat(extent.x),
                       height: CGFloat(extent.z))
  // 3
  let planeMaterial = SCNMaterial()
  planeMaterial.diffuse.contents = UIColor.yellow.withAlphaComponent(0.4)
  // 4
  plane.materials = [planeMaterial]
  // 5
  let planeNode = SCNNode(geometry: plane)
  // 6
  planeNode.position = SCNVector3Make(center.x, 0, center.z)
  // 7
  planeNode.transform = SCNMatrix4MakeRotation(-Float.pi / 2, 1, 0, 0)
  // 8
  return planeNode
}

Code interpretation:

  1. The createPlaneNode function takes two arguments: the center and extent of the plane to be rendered, both of type vector_float3. This type represents the coordinates of a point. The function returns an SCNNode object.
  2. Create an SCNPlane with the specified width and height. The width is the x coordinate of the extent and the height is its z coordinate.
  3. Initialize an SCNMaterial object and assign its diffuse contents. The diffuse layer color is set to a translucent yellow.
  4. The SCNMaterial object is then added to the plane’s materials array. This defines the texture and color of the plane.
  5. Create an SCNNode with the plane geometry. SCNPlane inherits from SCNGeometry, which only provides the form of a visible object rendered by SceneKit. You specify the location and orientation of the geometry by attaching it to an SCNNode. Multiple nodes can reference the same geometry object, allowing it to appear at different locations in a scene.
  6. Set the planeNode’s position. Note that the node is translated to the coordinates (center.x, 0, center.z) reported by ARKit in the ARPlaneAnchor instance.
  7. Planes in SceneKit are vertical by default, so you need to rotate the plane by 90 degrees to make it horizontal (an equivalent euler-angle form is sketched after this list).
  8. Return the planeNode object created above.
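If you prefer working with euler angles, the rotation in step 7 can also be expressed like this; it’s only an alternative spelling, not what the starter code uses:

// Rotate the plane -90 degrees around the x-axis so it lies flat.
planeNode.eulerAngles.x = -Float.pi / 2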

Run the app, and if ARKit detects the right surface, you’ll see a yellow horizontal surface.

ARKit continuously updates the position and extent of the plane based on newly discovered feature points. To receive these updates, add the following renderer(_:didUpdate:for:) delegate method to PortalViewController.swift:

// 1
func renderer(_ renderer: SCNSceneRenderer,
              didUpdate node: SCNNode,
              for anchor: ARAnchor) {
  // 2
  DispatchQueue.main.async {
    // 3
    if let planeAnchor = anchor as? ARPlaneAnchor,
      node.childNodes.count > 0 {
      // 4
      updatePlaneNode(node.childNodes[0],
                      center: planeAnchor.center,
                      extent: planeAnchor.extent)
    }
  }
}

Explanation:

  1. renderer(_:didUpdate:for:) is called when the corresponding ARAnchor is updated.
  2. UI updates are dispatched to the main thread.
  3. Check that the ARAnchor is an ARPlaneAnchor and that it has at least one child node corresponding to the plane’s SCNNode.
  4. The updatePlaneNode(_:center:extent:) method will be implemented shortly. It updates the coordinates and size of the plane using the information in the ARPlaneAnchor.

Open SCNNodeHelpers.swift and add the following code:

func updatePlaneNode(_ node: SCNNode,
                     center: vector_float3,
                     extent: vector_float3) {
  // 1
  let geometry = node.geometry as? SCNPlane
  // 2
  geometry?.width = CGFloat(extent.x)
  geometry?.height = CGFloat(extent.z)
  // 3
  node.position = SCNVector3Make(center.x, 0, center.z)
}

Explanation of code:

  1. Check whether the node has SCNPlane geometry.
  2. Update the node’s geometry with the new values: the plane’s width and height come from the extent (size) of the ARPlaneAnchor (an optional animated variant is sketched after this list).
  3. Update the plane node’s position to the new position.
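As an optional refinement (my own suggestion, not something the tutorial requires), SCNPlane’s width and height and SCNNode’s position are animatable, so you could wrap the update in an SCNTransaction to smooth out the resizing. This hypothetical wrapper would sit next to updatePlaneNode in SCNNodeHelpers.swift, which already imports SceneKit:

// Hypothetical wrapper around updatePlaneNode(_:center:extent:) that animates
// the change instead of snapping to the new size and position.
func updatePlaneNodeAnimated(_ node: SCNNode,
                             center: vector_float3,
                             extent: vector_float3) {
  SCNTransaction.begin()
  SCNTransaction.animationDuration = 0.1
  updatePlaneNode(node, center: center, extent: extent)
  SCNTransaction.commit()
}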

Now that you have successfully updated the position of the plane, run the app. You will see that the size and position of the plane are adjusted as new feature points are detected.

There is still one problem to solve. Once the app has detected a plane, if you exit the app and come back, the previously detected plane is still visible in the camera view, floating in front of other objects, and it no longer matches the real surface.

To fix this, you need to remove the plane node when the ARSession is interrupted. We’ll deal with this in the next chapter.

What’s the next step?

You may not realize it, but you’re well on your way to creating a portal app! Yes, there’s still a lot to do, but you’re well on your way to virtual space.

A brief summary of this chapter:

  • Explore the Starter project and review the basics of ARKit.
  • Configure an ARSession to display the camera’s output in the app.
  • Add plane detection and other functions so that the app can render horizontal planes using the ARSCNViewDelegate protocol.

In the next tutorial, you’ll learn how to handle session interruptions and use SceneKit to render 3D objects in views. Click here to continue part 2 of this tutorial series!

If you enjoyed this tutorial, check out our full book, ARKit by Tutorials.
