Here is a recap of the SALON session:

PPT and video

Video link · PPT link

Speaker: Roc Zhang

www.roczhang.com/
@RocZhang

Roc's talk is organized into four parts.

WWDC 2018

First, Roc shared a first-hand look at WWDC 2018. AR is the future!

ARKit

  • ARKit is a mobile AR platform for developing augmented reality apps on iOS.
  • ARKit provides high-level APIs with simple interfaces and a range of powerful features.
  • More importantly, it is also available on the tens of millions of existing iOS devices. The full functionality of ARKit requires an A9 chip or later, which covers most devices running iOS 11, including the iPhone 6s.

1. Features

Tracking

Tracking is a core feature of ARKit: it tracks the device in real time.

  • World Tracking provides the relative location of the device in the physical environment.
  • Visual-Inertial Odometry (VIO) fuses camera images with motion data from the device to give an accurate estimate of where the device is and how it is oriented (a small sketch of reading the resulting pose follows this list).
  • Face tracking is exclusive to the iPhone X.
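
As a simple illustration of what world tracking produces, here is a minimal sketch (assuming a running ARSession is passed in; the function name is just for illustration) that reads the device pose from the current frame:

import ARKit

// Read the device pose that world tracking produces
func logDevicePosition(in session: ARSession) {
    guard let frame = session.currentFrame else { return }
    let transform = frame.camera.transform   // 4x4 pose matrix in world space
    let position = transform.columns.3       // device position (x, y, z, 1)
    print("Device position:", position.x, position.y, position.z)
}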

Scene Understanding

The next layer above tracking is scene understanding, which determines the attributes and characteristics of the environment around the device.

  • Plane detection is used to find surfaces in the physical environment, such as a floor or a table.
  • Apple also provides hit testing for placing virtual objects: you get intersection points with the real-world topology and use them to place virtual content in the physical world.
  • Light estimation is used to light your virtual geometry so that it matches the physical world (a small sketch follows this list).
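
To illustrate the light estimation bullet, here is a minimal sketch (assuming an ARSCNView named sceneView with a running session; the function name is just for illustration) that feeds ARKit's ambient intensity estimate into the SceneKit lighting environment:

import ARKit

// Apply ARKit's light estimate to the SceneKit lighting environment
func updateLighting(for sceneView: ARSCNView) {
    guard let estimate = sceneView.session.currentFrame?.lightEstimate else { return }
    // 1000 lumens is roughly "neutral"; scale to the 1.0-based environment intensity
    sceneView.scene.lightingEnvironment.intensity = estimate.ambientIntensity / 1000.0
}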

Rendering

Combined with the above capabilities, virtual content can be seamlessly integrated into the physical environment. So the last layer of ARKit is rendering.

  • Apple made it easy to integrate any renderer. They provide a continuous stream of camera images, tracking information, and scene understanding that can be imported into any renderer.
  • For those of you using SceneKit or SpriteKit, Apple offers custom AR Views that do most of the rendering for you. So it’s really easy to get started.
  • And for those doing custom renderers, Apple provides a Metal template via Xcode that you can incorporate into your custom renderer.
  • Unity and Unreal Engine will also support all of ARKit's features.

2. The evolution of ARKit

  • Apple introduced ARKit 1.0 at WWDC 2017.
  • ARKit 1.5 was released in early 2018 alongside iOS 11.3.
    • 2D image detection, integrating real-world images into the AR experience;
    • Placing virtual objects on vertical surfaces;
    • Autofocus for a clearer view.
  • At WWDC 2018, ARKit 2.0 arrived with several new features:
    • Saving and loading world maps, enabling persistence and shared multiuser experiences;
    • Environment texturing, which renders AR scenes more realistically;
    • Image tracking: real-time tracking of 2D images;
    • Object detection: detection of 3D objects in the scene;
    • Improvements to face tracking.

SwiftShot

Next, Roc walked through SwiftShot, the multiplayer AR game Apple showed at WWDC 2018.

What does the game show about ARKit 2?

  • Multi-user experience. SwiftShot uses the MultipeerConnectivity framework to connect with other local players and to send game data between devices (a connectivity sketch follows this list).
  • The player who starts the session creates an ARWorldMap, which contains ARKit's understanding of the space around the game board.
  • Other players receive a copy of the map and see a picture of the host's view of the scene. They move their devices around so that ARKit can process the incoming map and establish a shared frame of reference for the multiplayer game.
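
For context, here is a minimal sketch of the local connectivity piece. The service type "ar-demo-game" and the surrounding setup are illustrative placeholders, not SwiftShot's actual implementation:

import UIKit
import MultipeerConnectivity

// Advertise this device and browse for nearby players
// ("ar-demo-game" is a placeholder service type, not SwiftShot's real one)
let myPeerID = MCPeerID(displayName: UIDevice.current.name)
let mcSession = MCSession(peer: myPeerID, securityIdentity: nil, encryptionPreference: .required)
let advertiser = MCNearbyServiceAdvertiser(peer: myPeerID, discoveryInfo: nil, serviceType: "ar-demo-game")
let browser = MCNearbyServiceBrowser(peer: myPeerID, serviceType: "ar-demo-game")
advertiser.startAdvertisingPeer()
browser.startBrowsingForPeers()

// Game data (for example an archived ARWorldMap) is then sent to connected peers:
// try mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)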

How to Create an ARKit App: Roc's Approach

Next, Roc explained how to create an ARKit application, using his own project idea as an example.

ARKit is a session-based API, so the first thing you need to do is create an ARSession. The ARSession object controls all of the processing needed to build an augmented reality app. But first you need to decide what kind of tracking the app will do, so you also create an ARSessionConfiguration. ARSessionConfiguration and its subclasses determine what kind of tracking the session will run. By setting the corresponding properties, you get different types of scene understanding and the ARSession performs the corresponding processing.

To run a session, simply call the run(_:) method on the ARSession with the required configuration, and processing begins immediately. At the same time, the lower layers start capturing data: an AVCaptureSession and a CMMotionManager are created automatically behind the scenes to capture the image data and motion data used for tracking.

As processing runs, the ARSession outputs ARFrames.

An ARFrame is a snapshot of the current moment: it contains the full session state and all the information needed to render the augmented reality scene. To access an ARFrame, read the currentFrame property of ARSession, or set yourself as the session's delegate to receive each new ARFrame (a minimal delegate sketch follows).
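
A minimal sketch of the delegate approach (the class name is just for illustration):

import ARKit

// Receive every new ARFrame through the session delegate
final class FrameReceiver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Called once per frame: camera pose, anchors, light estimate, etc.
        let cameraTransform = frame.camera.transform
        _ = cameraTransform   // use the pose for rendering or game logic
    }
}

// Usage: session.delegate = FrameReceiver() — keep a strong reference, since the delegate is held weakly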

Code implementation

import UIKit
import ARKit

class ViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Load the virtual scene shown by the AR view
        let scene = SCNScene(named: "XXX")!
        sceneView.scene = scene
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking with horizontal plane detection
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }
}

Scene Understanding

Plane detection

  • Plane detection finds planes that are horizontal with respect to gravity, such as floors and table tops. ARKit 1.5 added support for vertical plane detection.
  • ARKit aggregates information from multiple frames in the background, so it learns more about the plane as the user moves the device around the scene.
  • Plane detection also estimates the extent of a plane by fitting a rectangle around all the detected parts and aligning it with the main surface, so it also tells you the principal orientation of the physical plane.
  • If multiple virtual planes are detected for the same physical plane, ARKit takes care of merging them: the merged plane expands to cover both areas, and the redundant plane is removed from the session.

Code implementation

// Create a new world tracking configuration
let configuration = ARWorldTrackingConfiguration()
// Enable horizontal plane detection
configuration.planeDetection = .horizontal
// Run the session with the configuration
mySession.run(configuration)
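
Beyond the configuration above, detected planes arrive as ARPlaneAnchors through the ARSCNView delegate. A minimal sketch, assuming the ViewController shown earlier is set as sceneView.delegate:

import UIKit
import ARKit

// Assumes the ViewController from the earlier section is the sceneView.delegate
extension ViewController: ARSCNViewDelegate {
    // Called when ARKit adds a node for a new anchor (e.g. a detected plane)
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        // Visualize the detected extent with a translucent rectangle
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x), height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = UIColor.blue.withAlphaComponent(0.3)
        let planeNode = SCNNode(geometry: plane)
        planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
        planeNode.eulerAngles.x = -.pi / 2   // SCNPlane is vertical by default; lay it flat
        node.addChildNode(planeNode)
    }
}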

Hit Test

  • A hit test casts a ray from the device, intersects it with the real world, and finds where it hits.
  • ARKit uses all available scene information, including all detected planes and the 3D feature points that world tracking uses to determine position.
  • ARKit then casts the ray against all available scene information and returns every intersection in an array sorted by ascending distance, so the first element of the array is the intersection closest to the camera.
  • There are different ways to intersect, defined by the hit-test type. There are four:
    • Existing plane using extent: if plane detection is running and ARKit has already detected a plane in the environment, you can hit-test against that plane and respect its extent. If you want the user to move objects around on a plane, for example, you usually want the extent: a ray that hits within the extent produces an intersection, while a ray that falls outside it returns nothing.
    • Existing plane: if only a small part of the floor has been detected but you want to move furniture back and forth across it, you can ignore the extent and treat the detected plane as infinite. In this case you always get an intersection.
    • Estimated plane: if plane detection is not running, or no plane has been detected yet, a plane can be estimated from the current 3D feature points. ARKit looks for points in the environment that lie on a common plane, fits a plane to them, and returns the intersection with that plane.
    • Feature point: if you want to place something on a very small surface that cannot generate a plane, or in a very irregular environment, you can intersect directly with the feature points. The ray is intersected with the feature points and the closest one is returned as the result.

Code implementation

// Hit-test from the screen center against an estimated horizontal plane
if let result = sceneView.hitTest(screenCenter, types: [.estimatedHorizontalPlane]).first {
    // Create an ARAnchor at the intersection and add it to the session
    let anchor = ARAnchor(transform: result.worldTransform)
    sceneView.session.add(anchor: anchor)
}

Some gestures

We can add gestures to control scaling, rotation, and movement of virtual objects; a minimal pinch-to-scale sketch follows.
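
A minimal sketch of the pinch gesture, assumed to live inside the ViewController shown earlier (selectedNode is a placeholder for whichever virtual node the user has picked):

// Pinch to scale the currently selected virtual node
var selectedNode: SCNNode?   // placeholder: whichever node the user has picked

func setUpGestures() {
    let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
    sceneView.addGestureRecognizer(pinch)
}

@objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
    guard let node = selectedNode else { return }
    let factor = Float(gesture.scale)
    node.scale = SCNVector3(node.scale.x * factor, node.scale.y * factor, node.scale.z * factor)
    gesture.scale = 1   // reset so each callback applies an incremental change
}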

Multiuser operation

  • Capture the AR World Map

  • ARKit provides a worldMappingStatus value (.notAvailable, .limited, .extending, .mapped) that indicates whether this is a good time to capture the world map

  • Retrieve the current world map

  • Connect the devices via MultipeerConnectivity

  • Receive the shared map and relocalize to it (see the sketch below)
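
A minimal sketch of these steps, assuming an already-connected MCSession (the function names are illustrative):

import ARKit
import MultipeerConnectivity

// Host: capture the current world map and send it to connected peers
func shareWorldMap(from session: ARSession, over mcSession: MCSession) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map, requiringSecureCoding: true)
        else { return }
        try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
    }
}

// Guest: unarchive the received map and relocalize into it
func joinSharedSession(with data: Data, on session: ARSession) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map   // relocalize into the shared map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}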

Beat PM

MultipeerConnectivity & Peertalk

Finally, Roc gave an extended introduction to MultipeerConnectivity and Peertalk (an iOS and Mac Cocoa library for communicating over USB). If you are interested, you can look into them in more detail.

Conclusion

Roc shared ARKit's features and how to use ARKit.

  1. World tracking provides the relative position of the device in the physical environment. Scene understanding determines the attributes and features of the environment around the device so that objects can be placed convincingly in the real world. ARKit integrates seamlessly with SceneKit and SpriteKit, and Apple also provides a Metal template to help developers build custom rendering engines. ARKit also offers multiuser features that allow multiple people to participate in the same game.

  2. How to use ARKit, in a nutshell: ARSCNView combines the virtual-world information in an SCNScene with the real-world information captured by the ARSession to render the AR world. ARSessionConfiguration and its subclasses tell the ARSession how to track the world, and the tracking results are returned as ARFrames. The ARAnchor information in an ARFrame provides attachment points for SCNNodes in SceneKit, binding virtual nodes to real-world anchors.
