
ARKit article series directory

These are my reading notes on Ray Wenderlich’s ARKit by Tutorials.

Yes, this article is about AR effects built on the iPhone X’s front-facing TrueDepth camera! The main features:

  • Face detection and tracking
  • Real-time facial expression tracking
  • Capture both color images and depth images
  • Light estimation

Initial Configuration

Starting a face-tracking session is the same as starting any other AR session; you just replace the configuration with ARFaceTrackingConfiguration(). Error handling is also similar:

func resetTracking() {
  // 1
  guard ARFaceTrackingConfiguration.isSupported else {
    updateMessage(text: "Face Tracking Not Supported.")
    return
  }

  // 2
  updateMessage(text: "Looking for a face.")

  // 3
  let configuration = ARFaceTrackingConfiguration()
  configuration.isLightEstimationEnabled = true /* default setting */
  configuration.providesAudioData = false /* default setting */

  // 4
  session.run(configuration, options:
              [.resetTracking, .removeExistingAnchors])
}
func session(_ session: ARSession, didFailWithError error: Error) {
  print("** didFailWithError")
  updateMessage(text: "Session failed.")
}

func sessionWasInterrupted(_ session: ARSession) {
  print("** sessionWasInterrupted")
  updateMessage(text: "Session interrupted.")
}

func sessionInterruptionEnded(_ session: ARSession) {
  print("** sessionInterruptionEnded")
  updateMessage(text: "Session interruption ended.")
}

Face tracking

When ARKit detects a face, it adds an ARFaceAnchor to the scene, which we can use to position and track the face.

If there are two faces, ARKit tracks only the largest, most recognizable one. It also attaches a face coordinate system to the person’s face:
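
As a rough illustration (not the book’s code), you can read that coordinate system’s placement from the anchor’s transform; its axes are +x to the viewer’s right, +y up, and +z pointing out of the face:

// Rough sketch (not from the book): read the face anchor's position from
// the current frame. The transform maps the face coordinate system into
// world space.
if let faceAnchor = sceneView.session.currentFrame?.anchors
    .compactMap({ $0 as? ARFaceAnchor })
    .first {
  let origin = faceAnchor.transform.columns.3
  print("Face origin in world space: (\(origin.x), \(origin.y), \(origin.z))")
}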

We create the mask’s geometry with ARSCNFaceGeometry:

func createFaceGeometry() {
  updateMessage(text: "Creating face geometry.")

  let device = sceneView.device!

  // Mask is a custom SCNNode subclass from the book's sample project
  let maskGeometry = ARSCNFaceGeometry(device: device)!
  mask = Mask(geometry: maskGeometry)
}

This method can be called directly from within viewDidLoad, so you don’t have to wait for the face to appear.
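
A minimal sketch of how this might be wired up (the call site is my assumption; the book’s project may differ):

override func viewDidLoad() {
  super.viewDidLoad()
  sceneView.delegate = self

  // Safe to call before any face is detected
  createFaceGeometry()

  // Start (or restart) the face-tracking session
  resetTracking()
}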

Finally, add the mask to the view to display it:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
  anchorNode = node
  // Display the mask
  setupFaceNodeContent()
}

func setupFaceNodeContent() {
  guard let node = anchorNode else { return }

  node.childNodes.forEach { $0.removeFromParentNode() }

  if let content = mask {
    node.addChildNode(content)
  }
}

The ARSCNFaceGeometry we created earlier may be empty, because no face has been recognized yet. So we need to update it later in the didUpdate callback:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
  guard let faceAnchor = anchor as? ARFaceAnchor else { return }
  updateMessage(text: "Tracking your face.")
  // Update the geometry based on the anchor
  mask?.update(withFaceAnchor: faceAnchor)
}

// Tag: ARFaceAnchor Update
func update(withFaceAnchor anchor: ARFaceAnchor) {
  let faceGeometry = geometry as! ARSCNFaceGeometry
  faceGeometry.update(from: anchor.geometry)
}

The default lighting settings look like this:

/* default settings */
sceneView.automaticallyUpdatesLighting = true 
sceneView.autoenablesDefaultLighting = false 
sceneView.scene.lightingEnvironment.intensity = 1.0 

But we can also adjust them to match the estimated lighting conditions and achieve different results:

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
  // 1
  guard let estimate = session.currentFrame?.lightEstimate else {
    return
  }

  // 2: in ARKit, 1000 means medium (neutral) light intensity
  let intensity = estimate.ambientIntensity / 1000.0
  sceneView.scene.lightingEnvironment.intensity = intensity

  // 3
  let intensityStr = String(format: "%.2f", intensity)
  let sceneLighting = String(format: "%.2f",
    sceneView.scene.lightingEnvironment.intensity)

  // 4
  print("Intensity: \(intensityStr) - \(sceneLighting)")
}

Blend Shapes Expression animation

How does Animoji animate its emoji? With blend shapes. A blend shape set is essentially a dictionary: the keys are ARFaceAnchor.BlendShapeLocation constants, and the values are floating-point numbers ranging from 0.0 (neutral) to 1.0 (maximum movement).

For example, if we want to add a blinking effect, we read out the eyeBlinkLeft coefficient.
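
A minimal sketch of what that might look like (leftEyeNode and the scaling approach are my assumptions, not the book’s code):

func blink(withFaceAnchor anchor: ARFaceAnchor) {
  // 0.0 = eye fully open, 1.0 = eye fully closed
  guard let eyeBlinkLeft = anchor.blendShapes[.eyeBlinkLeft] as? Float else {
    return
  }
  // Hypothetical eye node: squash it vertically as the lid closes
  leftEyeNode.scale.y = 1.0 - eyeBlinkLeft
}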

This dictionary is updated as ARFaceAnchor is updated:

// - Tag: ARFaceAnchor Update
func update(withFaceAnchor anchor: ARFaceAnchor) {
  blendShapes = anchor.blendShapes
}

// - Tag: BlendShapeAnimation
var blendShapes: [ARFaceAnchor.BlendShapeLocation: Any] = [:] {
  didSet {
    guard
      // Brow
      let browInnerUp = blendShapes[.browInnerUp] as? Float,
      // Right eye
      let eyeLookInRight = blendShapes[.eyeLookInRight] as? Float,
      let eyeLookOutRight = blendShapes[.eyeLookOutRight] as? Float,
      let eyeLookUpRight = blendShapes[.eyeLookUpRight] as? Float,
      let eyeLookDownRight = blendShapes[.eyeLookDownRight] as? Float,
      let eyeBlinkRight = blendShapes[.eyeBlinkRight] as? Float
      else { return }

    // Processing animation
  }
}
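The snippet stops at the "Processing animation" comment; a hypothetical sketch of that step, with made-up node names and scale factors (none of these come from the book):

// Hypothetical: raise the brow and aim/blink the right eye from the coefficients
browNode.position.y = browBaseY + browInnerUp * 0.03
rightEyeNode.eulerAngles.x = (eyeLookDownRight - eyeLookUpRight) * 0.25
rightEyeNode.eulerAngles.y = (eyeLookInRight - eyeLookOutRight) * 0.25
rightEyeNode.scale.y = 1.0 - eyeBlinkRight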

The effect after implementation

ReplayKit video

ReplayKit, introduced in iOS 9, records the screen’s video and audio along with microphone input. It has two main classes:

  • RPScreenRecorder: singleton used to start/stop recording.
  • RPPreviewViewController: preview controller after recording.

ReplayKit 2

  • iOS system-level screen recording and broadcasting
  • Broadcast pairing
  • Fast camera switching
  • In-app screen capture

Start recording

The main code

let sharedRecorder = RPScreenRecorder.shared()
private var isRecording = false 
// Private functions
private func startRecording() {
  // 1
  self.sharedRecorder.isMicrophoneEnabled = true

  // 2
  sharedRecorder.startRecording(handler: { error in
    guard error == nil else {
      print("There was an error starting the recording: \(String(describing: error?.localizedDescription))")
      return
    }

    // 3
    print("Started Recording Successfully")
    self.isRecording = true

    // 4
    DispatchQueue.main.async {
      self.recordButton.setTitle("[ STOP RECORDING ]", for: .normal)
      self.recordButton.backgroundColor = UIColor.red
    }
  })
}
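A hypothetical button action tying the two halves together (the record button outlet is from the sample project; the action name is my assumption):

@IBAction func recordButtonTapped(_ sender: UIButton) {
  // Toggle between recording and idle
  if isRecording {
    stopRecording()
  } else {
    startRecording()
  }
}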

Stop recording

The main code

func stopRecording() {
  // 1
  self.sharedRecorder.isMicrophoneEnabled = false

  // 2
  sharedRecorder.stopRecording(handler: {
    previewViewController, error in
    guard error == nil else {
      print("There was an error stopping the recording: \(String(describing: error?.localizedDescription))")
      return
    }

    // 3
    if let unwrappedPreview = previewViewController {
      unwrappedPreview.previewControllerDelegate = self
      self.present(unwrappedPreview, animated: true, completion: {})
    }
  })

  // 4
  self.isRecording = false
  DispatchQueue.main.async {
    self.recordButton.setTitle("[ RECORD ]", for: .normal)
    self.recordButton.backgroundColor = UIColor(red: 0.0039,
      green: 0.5882, blue: 1, alpha: 1.0) /* #0196ff */
  }
}
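
Because stopRecording() sets previewControllerDelegate, we should also dismiss the preview when the user is done with it; previewControllerDidFinish(_:) is the RPPreviewViewControllerDelegate callback for that:

// RPPreviewViewControllerDelegate: dismiss the preview when the user is done
func previewControllerDidFinish(_ previewController: RPPreviewViewController) {
  previewController.dismiss(animated: true, completion: nil)
}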

The effect is as follows:

In addition, you can implement delegate methods for sharedRecorder to listen for error callbacks and availability changes:

// RPScreenRecorderDelegate methods
func screenRecorder(_ screenRecorder: RPScreenRecorder,
                    didStopRecordingWith previewViewController: RPPreviewViewController?,
                    error: Error?) {
  guard error == nil else {
    print("There was an error recording: \(String(describing: error?.localizedDescription))")
    self.isRecording = false
    return
  }
}

func screenRecorderDidChangeAvailability(_ screenRecorder: RPScreenRecorder) {
  recordButton.isEnabled = sharedRecorder.isAvailable
  if !recordButton.isEnabled {
    self.isRecording = false
  }
}

ReplayKit isn’t hard to use, so I won’t expand on it further here; for the details, please buy and read the official book.

End of the Part 4 reading notes!