0x00 Introduction

What can ReplayKit do?

  • Record the device screen and stream it to third-party live services

  • Broadcast audio and video live directly from the device (the screen plus system-played sound)

  • Capture the microphone and camera for commentary

  • Content stays secure: throughout the process, samples are delivered only to the third-party live-streaming service

Use a little imagination:

  • Stream games to platforms such as Mobcrush or YouTube, or to Chinese platforms like Inke and Huajiao

  • Share your screen during a WebEx online meeting

  • Provide remote customer support with TeamViewer QuickSupport

  • Live stream a drawing app to Facebook

  • Teach your parents how to use their phones remotely, in real time, by broadcasting the screen

ReplayKit vs. ReplayKit 2

Let’s review the interaction flow of ReplayKit (the live-broadcast API has been available since iOS 10). Say you have a game app that lets users live stream to a third-party platform, such as Mobcrush or Twitch, while playing.

First, your app needs to do a few things:

  • Provide UI for starting/pausing/stopping the broadcast

  • Call ReplayKit APIs to start/pause/stop the broadcast

What the third-party live-streaming platform needs to do:

  • Provide UI for login and for setting the broadcast title (Broadcast UI Extension)

  • Implement the push-stream logic that transfers the screen, audio, etc. to the server (Broadcast Upload Extension)

The process for users to start live streaming is as follows:

The user opens your game app and taps the record button you provide; a prompt pops up to select a live-broadcast platform. After selection, the page provided by the third-party app’s Broadcast UI Extension is shown for login and for setting the broadcast title. Once that’s done, ReplayKit starts recording the screen and hands the audio and video samples to the third-party app’s Upload Extension for encoding and streaming.
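The in-app flow above can be sketched with the iOS 10 ReplayKit APIs. The class, delegate, and method names below match the public API; the wiring itself is a minimal sketch, not a complete implementation:

```swift
import ReplayKit
import UIKit

class GameViewController: UIViewController, RPBroadcastActivityViewControllerDelegate {

    // Called when the user taps your in-app "Go Live" button
    @objc func startBroadcastTapped() {
        // Presents the system sheet listing installed Broadcast UI Extensions
        RPBroadcastActivityViewController.load { broadcastAVC, error in
            guard let broadcastAVC = broadcastAVC else { return }
            broadcastAVC.delegate = self
            self.present(broadcastAVC, animated: true)
        }
    }

    // The third-party extension's setup UI has finished (login, title, etc.)
    func broadcastActivityViewController(_ broadcastActivityViewController: RPBroadcastActivityViewController,
                                         didFinishWith broadcastController: RPBroadcastController?,
                                         error: Error?) {
        broadcastActivityViewController.dismiss(animated: true)
        // Start recording; samples now flow to the platform's Upload Extension
        broadcastController?.startBroadcast { error in
            if let error = error {
                print("Broadcast failed to start: \(error)")
            }
        }
    }
}
```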

Since users must start the broadcast from inside your app, this flow is also called In-App Broadcast.

Obviously, a drawback of In-App Broadcast is that if the user leaves the app, by backgrounding it or switching to another app, the broadcast stops.

ReplayKit 2 process

In contrast to ReplayKit, users of ReplayKit 2 initiate the live stream from Control Center (the same button that records the screen; pressing it again lets you select a third-party streaming platform). For developers, the rest of the process is similar to ReplayKit. It is therefore also called iOS System Broadcast. ReplayKit 2 requires iOS 11+.

ReplayKit 2 has several advantages:

  • Users can start and stop the broadcast directly from Control Center, a system-level operation independent of any app. They can seamlessly broadcast across apps without worrying about backgrounding or switching apps.

  • Your app no longer needs to provide any broadcast UI. In other words, unlike with ReplayKit, your app can support live streaming to third-party platforms without doing anything at all.

What’s new with iOS 12 ReplayKit?

A picture is worth a thousand words, so I won’t say much more. However, since you can now initiate the broadcast from your own app, do you need to provide a custom start/stop button and control the start/stop logic as in ReplayKit 1? No: you can use the system-provided RPSystemBroadcastPickerView directly.

0x01 System Broadcast Picker

Create an RPSystemBroadcastPickerView

As mentioned above, iOS 12 lets you initiate a system-level broadcast from your own app, so we need to provide a start button. ReplayKit supplies the RPSystemBroadcastPickerView class; all we have to do is create an instance and add it to a parent view. RPSystemBroadcastPickerView internally calls the ReplayKit 2 APIs to handle the start/stop logic. Easy!


import ReplayKit

class ViewController: UIViewController {
    var broadcastPicker: RPSystemBroadcastPickerView?

    override func viewDidLoad() {
        super.viewDidLoad()
        // kPickerFrame: a frame constant you define for the picker button
        let picker = RPSystemBroadcastPickerView(frame: kPickerFrame)
        view.addSubview(picker)
        broadcastPicker = picker
    }
}

preferredExtension

RPSystemBroadcastPickerView has another attribute, preferredExtension, which lets you specify your preferred third-party broadcast extension.

broadcastPicker?.preferredExtension = "com.your-app.broadcast.extension"

iOS 12 ReplayKit Interaction Flowchart

Whether the broadcast is initiated from Control Center or from the app, it is system level. You can stop recording from the status bar. As you can also see, no API is provided to start or stop the live stream programmatically.

0x02 Developing a Broadcast Extension

If you are a live-streaming platform, you will need to provide an Upload Extension. Let’s take a look at the data flow diagram.

Responsibilities of the Upload Extension:

  • Receive unencoded audio and video samples

  • Encode the audio and video samples and push the stream to a CDN

  • Handle screen-rotation events

  • Be notified of app switches during the broadcast and receive information about the app being switched to (cool!)

How do you create an Upload Extension?

After creating the Upload Extension target, Xcode generates a code template. The overridden methods are the extension’s lifecycle callbacks; their names are self-explanatory and need no further explanation.

// SampleHandler created by the Xcode template for an Upload Extension
class SampleHandler: RPBroadcastSampleHandler {
    // User has requested to start the broadcast
    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?)
    // User has requested to finish the broadcast
    override func broadcastFinished()
    // Handle the sample buffer here
    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType)
    // Use details of the application to annotate the broadcast
    override func broadcastAnnotated(withApplicationInfo info: [String: NSObject])
}

Let’s focus on sample processing, which happens in the processSampleBuffer method.

// Both audio and video samples are handled by the processSampleBuffer routine
override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    switch sampleBufferType {
    case RPSampleBufferType.video:
        // Handle the recorded video frames of the app
        let imageBuffer: CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer) as CMTime
        // Hardware-encode the video sample with VideoToolbox
        VTCompressionSessionEncodeFrame(session, imageBuffer, pts, kCMTimeInvalid, nil, nil, nil)
    case RPSampleBufferType.audioApp:
        // Handle the app's audio (code omitted here)
        break
    case RPSampleBufferType.audioMic:
        // Handle the audio from the microphone (code omitted here)
        break
    }
}
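The `session` used above is a VideoToolbox compression session, typically created in broadcastStarted. A minimal sketch follows (Swift 5 argument labels; the 720×1280 size and the push-stream step are illustrative assumptions, not part of the original):

```swift
import VideoToolbox

var session: VTCompressionSession?

// Called for every encoded frame; push the compressed sample to your CDN here
let outputCallback: VTCompressionOutputCallback = { _, _, status, _, sampleBuffer in
    guard status == noErr, let sampleBuffer = sampleBuffer else { return }
    // packageAndPush(sampleBuffer)  // hypothetical push-stream step
}

let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 720, height: 1280,          // portrait screen size, illustrative
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: outputCallback,
    refcon: nil,
    compressionSessionOut: &session)

if status == noErr, let session = session {
    // Real-time mode suits live streaming
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
}
```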

How to handle app switching?

During a broadcast, the user may switch apps. How can we tell which app they are using?

// Use application details to help viewers find your broadcast
override func broadcastAnnotated(withApplicationInfo applicationInfo: [AnyHashable: Any]) {
    let bundleIdentifier = applicationInfo[RPApplicationInfoBundleIdentifierKey]
    if bundleIdentifier != nil {
        session.addMetadataWithApplicationInfo(bundleIdentifier)
    }
}

When the user switches apps, the broadcastAnnotated callback tells us what the current app is: the RPApplicationInfoBundleIdentifierKey key gives the current app’s bundle ID. You can use this to inform viewers of the live broadcast. ReplayKit is very thoughtful.

0x03 Content Protection

There may be content in your app that you do not want recorded or broadcast. You can check whether the screen is being recorded and, if so, hide sensitive information or stop playing sensitive video or audio. So how do you check whether recording is in progress?

  • Check UIScreen’s isCaptured property and UIScreen.screens.count

  • Register for the UIScreenCapturedDidChangeNotification notification

Sample Code:

import UIKit

func handleScreenCapturedChange() {
    let isScreenMirroring = UIScreen.screens.count > 1
    if UIScreen.main.isCaptured && !isScreenMirroring {
        // Stop audio playback and remove sensitive content from the screen
    }
}

UIScreen.screens.count > 1 means the screen is being mirrored to another display. A look at the documentation for UIScreen’s isCaptured shows that it returns true when the screen is being recorded, mirrored, or captured via AirPlay. Checking isCaptured && !isScreenMirroring therefore ensures the screen is actually being recorded, not merely mirrored.
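Wiring up the notification from the bullet above might look like the following sketch (UIScreen.capturedDidChangeNotification is the Swift spelling of UIScreenCapturedDidChangeNotification, available on iOS 11+):

```swift
import UIKit

// Observe capture-state changes so sensitive content can be hidden promptly
let observer = NotificationCenter.default.addObserver(
    forName: UIScreen.capturedDidChangeNotification,
    object: UIScreen.main,
    queue: .main) { _ in
        let isScreenMirroring = UIScreen.screens.count > 1
        if UIScreen.main.isCaptured && !isScreenMirroring {
            // Recording is in progress: hide sensitive content here
        }
}
```

Remember to remove the observer (NotificationCenter.default.removeObserver) when it is no longer needed.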

Read more:

  1. WWDC 2016 – Session 601 – Go Live with ReplayKit

  2. WWDC 2017 – Session 606 – What’s New with Screen Recording and Live Broadcast

That’s all, thank you!