ARSCNView

Integrate SceneKit’s virtual 3D content into the view of your augmented reality experience.

Declaration

class ARSCNView : SCNView

Overview

The ARSCNView class provides the easiest way to create an augmented reality experience that combines virtual 3D content with a device camera view of the real world. When you run the ARSession object provided by the view:

– The view automatically renders the live video feed from the device’s camera as the scene background.

– The world coordinate system of the view’s SceneKit scene directly responds to the AR world coordinate system established by the session configuration.

– The view will automatically move its SceneKit camera to match the actual movement of the device.

Because ARKit automatically matches SceneKit space to the real world, placing a virtual object so that it appears to maintain a real-world position requires only that you set the object’s SceneKit position appropriately. (See Providing 3D Virtual Content with SceneKit below.)

You don’t necessarily need to use the ARAnchor class to track the positions of objects you add to the scene, but by implementing ARSCNViewDelegate methods you can add SceneKit content to any anchor that ARKit automatically detects.
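As a rough sketch of that setup, the code below runs the view’s session with a world-tracking configuration when the view appears and pauses it when the view disappears. The view controller and outlet names (ARViewController, sceneView) are assumptions for illustration, not part of the API.

import UIKit
import ARKit

class ARViewController: UIViewController {
    // Hypothetical outlet; any ARSCNView created in a storyboard or in code works the same way.
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Run the session the view provides, using world tracking.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view is no longer visible.
        sceneView.session.pause()
    }
}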

Essentials

Providing 3D Virtual Content with SceneKit

Use SceneKit to add realistic 3D objects to your AR experience.

var session: ARSession

The AR session that manages motion tracking and camera image processing for the view’s contents.

var scene: SCNScene

The SceneKit scene to display in the view.

Responding to AR Updates

var delegate: ARSCNViewDelegate?

An object you provide to mediate synchronization of the view’s AR scene information with SceneKit content.

protocol ARSCNViewDelegate

Methods you can implement to mediate the automatic synchronization of SceneKit content with an AR session.

Mapping Content to Real-World Locations

func anchor(for: SCNNode) -> ARAnchor?

Returns the AR anchor (if any) associated with the specified SceneKit node.

func node(for: ARAnchor) -> SCNNode?

Returns the SceneKit node (if any) associated with the specified AR anchor.

func unprojectPoint(CGPoint, ontoPlane: simd_float4x4) -> simd_float3?

Projects a point from the view’s 2D coordinate space onto a plane in the 3D world space detected by ARKit (see the sketch below).
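The hypothetical helpers below sketch one way these lookups might be used; the function names and parameters are assumptions for illustration, not part of the API.

import ARKit
import SceneKit

// Hypothetical helper: look up the AR anchor (if any) that ARKit associated with a SceneKit node,
// for example a node returned by a hit test on the view.
func anchor(forTapped node: SCNNode, in sceneView: ARSCNView) -> ARAnchor? {
    return sceneView.anchor(for: node)
}

// Hypothetical helper: convert a 2D screen point into a 3D world position on a detected plane.
func worldPosition(of screenPoint: CGPoint,
                   on planeAnchor: ARPlaneAnchor,
                   in sceneView: ARSCNView) -> simd_float3? {
    return sceneView.unprojectPoint(screenPoint, ontoPlane: planeAnchor.transform)
}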

Providing 3D Virtual Content with SceneKit

Use SceneKit to add realistic 3D objects to your AR experience.

Overview

Because ARKit automatically matches SceneKit space to the real world, placing a virtual object so that it appears to maintain a real-world position requires only that you set the object’s SceneKit position appropriately. For example, in the default configuration, the following code places a 10 cm cube 20 cm in front of the camera’s initial position:

let cubeNode = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
cubeNode.position = SCNVector3(0, 0, -0.2) // SceneKit/AR coordinates are in meters
sceneView.scene.rootNode.addChildNode(cubeNode)

The code above places the object directly in the view’s SceneKit scene. Because ARKit matches SceneKit space to real-world space, the object automatically appears to track a real-world position.

Alternatively, you can use the ARAnchor class to track real-world locations, either by creating your own anchors and adding them to the session or by observing the anchors that ARKit creates automatically. For example, when plane detection is enabled, ARKit adds and updates an anchor for each detected plane. To add visual content for these anchors, implement ARSCNViewDelegate methods such as renderer(_:didAdd:for:), as sketched below.
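A minimal sketch of such a delegate implementation follows, assuming plane detection is enabled in the session configuration. The class name, color, and plane sizing are illustrative choices, not part of the API.

import UIKit
import ARKit
import SceneKit

class PlaneVisualizer: NSObject, ARSCNViewDelegate {
    // ARKit calls this when it adds a node for a newly detected anchor, such as a plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        // Visualize the detected plane with a translucent rectangle matching its initial extent.
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = UIColor.blue.withAlphaComponent(0.3)

        let planeNode = SCNNode(geometry: plane)
        planeNode.simdPosition = planeAnchor.center
        planeNode.eulerAngles.x = -.pi / 2   // SCNPlane is vertical by default; lay it flat
        node.addChildNode(planeNode)
    }
}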

ARWorldTrackingConfiguration

A configuration that tracks the device’s position and motion relative to objects in the environment.

Declaration

class ARWorldTrackingConfiguration : ARConfiguration

Overview

The ARWorldTrackingConfiguration class tracks the device’s movement with six degrees of freedom (6DOF): the three rotation axes (roll, pitch, and yaw) and the three translation axes (movement in x, y, and z).

This kind of tracking can create an immersive augmented reality experience: a virtual object can appear to stay in the same place relative to the real world, even as the user tilts the device to look above or below the object, or moves the device around to see the object’s sides and back.

Figure 1. 6DOF tracking maintains the AR illusion no matter how the device rotates or moves

A world-tracking session also provides several ways for your app to recognize elements of the real-world scene visible to the camera, or to interact with them (a combined configuration sketch follows this list):

– Use planeDetection to find horizontal or vertical surfaces in the real world. ARKit adds each detected surface to the session as an ARPlaneAnchor object.

– Use detectionImages to recognize and track the movement of known 2D images, which ARKit adds to the scene as ARImageAnchor objects.

– Use detectionObjects to recognize known 3D objects, which ARKit adds to the scene as ARObjectAnchor objects.

– Use ray casting to find the 3D position of a real-world feature corresponding to a point of contact on the device screen.
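As a rough illustration of these options, the sketch below enables all three detection features on a single world-tracking configuration. The function name, the asset-catalog resource group names ("AR Images", "AR Objects"), and the use of an ARSCNView are assumptions for illustration, not part of the API.

import ARKit

// Hypothetical setup: enable plane, image, and object detection on one configuration.
func configureWorldTracking(for sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()

    // Detect flat surfaces; ARKit delivers them as ARPlaneAnchor objects.
    configuration.planeDetection = [.horizontal, .vertical]

    // Recognize and track known 2D images (ARImageAnchor), assuming a resource group named "AR Images".
    if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Images", bundle: nil) {
        configuration.detectionImages = referenceImages
    }

    // Recognize previously scanned 3D objects (ARObjectAnchor), assuming a resource group named "AR Objects".
    if let referenceObjects = ARReferenceObject.referenceObjects(inGroupNamed: "AR Objects", bundle: nil) {
        configuration.detectionObjects = referenceObjects
    }

    sceneView.session.run(configuration)
}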

ARSession

The object that manages the major tasks associated with every augmented reality experience, such as motion tracking, camera passthrough, and image analysis.

Declaration

class ARSession : NSObject

Overview

The ARSession object coordinates the major processes that ARKit performs on your behalf to create an augmented reality experience. These processes include reading data from the device’s motion-sensing hardware, controlling the device’s built-in camera, and performing image analysis on captured camera images. The session synthesizes all of these results to establish a correspondence between the real-world space the device inhabits and the virtual space in which you model AR content.

Create a session

Each AR experience requires an ARSession. If you implement a custom renderer, instantiate the session yourself.

let session = ARSession()
session.delegate = self

If you use one of the standard renderers (such as ARView, ARSCNView, or ARSKView), the renderer creates a session object for you. When you want to interact with your app’s session, access it through the renderer.

let session = myView.session

Running a session

Running a session requires a configuration. Subclasses of ARConfiguration determine how ARKit tracks the device’s position and motion relative to the real world, and thus determine the kind of AR experience you create. ARWorldTrackingConfiguration, for example, lets you augment the user’s view of the world around them through the device’s rear camera. A brief sketch of running and pausing a session appears after the list below.

Configure and run the session

func run(ARConfiguration, options: ARSession.RunOptions)

Begins AR processing for the session with the specified configuration and options.

var identifier: UUID

Unique identifier of the running session.

struct ARSession.RunOptions

Options for transitioning an AR session’s current state when you change its configuration.

var configuration: ARConfiguration?

An object that defines motion and scene tracking behaviors for the session.

func pause()

Pauses processing in the session.
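The hypothetical helpers below sketch the session-control calls listed above; the function names are illustrative, not part of the API.

import ARKit

// Rerun an existing session, resetting tracking and discarding anchors from the previous run.
func restart(_ session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}

// Pause processing, for example when the hosting view disappears.
func suspend(_ session: ARSession) {
    session.pause()
}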