1. Introduction to iOS game live broadcast schemes

  • Prior to iOS 9, screen recording was possible via the private CoreSurface.framework. Because a private framework is used, such an app can only be installed on users' devices as an enterprise package. The advantage of this method is that it is very efficient, but there is no way to capture the game sound.

  • After iOS 9, Apple removed CoreSurface, making the above method ineffective. For a long time after iOS 9 was released, there was no way to record the screen for live broadcast. Later, a different approach emerged: cracking the AirPlay mirroring protocol, running a virtual AirPlay server on the phone to receive the screen image, and then decoding it for the live broadcast. At present, most live streaming platforms integrate a third-party SDK for this, such as Lebo or Xindawn. The disadvantage of this scheme is that each iOS upgrade changes the AirPlay mirroring protocol, so the cracking cost and technical threshold are both high.

  • In iOS 10, Apple provided ReplayKit, which can record the screen in games for live broadcast, but it requires support from the game developer and therefore has low versatility. So the major platforms mostly stayed with the AirPlay approach.

  • In iOS 11, Apple enhanced ReplayKit into ReplayKit2, a more versatile, system-level recording solution, which is the focus of this article.

2. ReplayKit2 overview

Screen recording first appeared as ReplayKit in iOS 9, which could only save the recording to a video file; in iOS 10 Apple added live video streaming on top of it. For the official introduction, see the WWDC session "Go Live with ReplayKit". iOS 11 upgraded this to ReplayKit2, which further improves ReplayKit's ease of use and versatility: it can record the screen of the entire phone, not only apps that integrate ReplayKit. Therefore, ReplayKit2 screen recording is the recommended approach on iOS 11. The system implements screen recording as an extension, which runs in its own process. To keep the system running smoothly, iOS allocates relatively few resources to the extension, and the extension is killed if it uses too much memory.
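As a concrete starting point, a ReplayKit2 broadcast runs inside a Broadcast Upload Extension whose principal class subclasses RPBroadcastSampleHandler. Below is a minimal sketch of that entry point; the comments mark where a streaming SDK would be wired in, and are placeholders rather than any specific SDK's API.

#import <ReplayKit/ReplayKit.h>

@interface SampleHandler : RPBroadcastSampleHandler
@end

@implementation SampleHandler

// Called when the user starts the broadcast from the system UI
- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
    // Connect to the push-stream server here
}

// Called when the user stops the broadcast
- (void)broadcastFinished {
    // Tear down the push stream and release resources
}

// Video, app-audio, and mic-audio buffers all arrive through this callback;
// see section 3 for a full implementation
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
}

@end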

3. Implementation of some key functions

# System callback processing
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    @synchronized(self) {
        KSYRKStreamerKit *kit = [KSYRKStreamerKit sharedInstance];
        switch (sampleBufferType) {
            case RPSampleBufferTypeVideo: {
                if (!sampleBuffer || !CMSampleBufferIsValid(sampleBuffer)) return;
                // Drop frames that arrive faster than one per 25 ms
                if (tempVideoTimeStamp && (CFAbsoluteTimeGetCurrent() - tempVideoTimeStamp < 0.025)) {
#ifdef DEBUG
                    NSLog(@"Frame incoming too fast, dropping frame");
#endif
                    return;
                }
                if (tempVideoPixelBuffer) {
                    CFRelease(tempVideoPixelBuffer);
                    tempVideoPixelBuffer = NULL;
                }
                CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
                // iOS 11.1 and above attach the orientation to each sample buffer
                if (UIDevice.currentDevice.systemVersion.floatValue > 11.1) {
                    CGImagePropertyOrientation orientation = ((__bridge NSNumber *)CMGetAttachment(sampleBuffer, (__bridge CFStringRef)RPVideoSampleOrientationKey, NULL)).unsignedIntValue;
                    pixelBuffer = [kit resizeAndRotatePixelBuffer:pixelBuffer withOrientation:orientation];
                } else {
                    // 0 means unknown; treat it as the landscape default
                    pixelBuffer = [kit resizeAndRotatePixelBuffer:pixelBuffer withOrientation:0];
                }
                CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
                [kit.streamerBase processVideoPixelBuffer:pixelBuffer timeInfo:pts];
                // Cache the frame so it can be re-sent when callbacks stop (see 4.3)
                tempVideoTimeStamp = CFAbsoluteTimeGetCurrent();
                tempVideoPts = pts;
                tempVideoPixelBuffer = pixelBuffer;
                CFRetain(tempVideoPixelBuffer);
            }
                break;
            case RPSampleBufferTypeAudioApp:
                [kit mixAudio:sampleBuffer to:kit.appTrack];
                break;
            case RPSampleBufferTypeAudioMic:
                [kit mixAudio:sampleBuffer to:kit.micTrack];
                break;
            default:
                break;
        }
    }
}

4. Some known problems and solutions

4.1 Screen frame orientation

The video frames delivered by the system callback are full-size portrait images, which must be scaled and rotated before being pushed:

/** Scale and rotate */
- (CVPixelBufferRef)resizeAndRotatePixelBuffer:(CVPixelBufferRef)sourcePixelBuffer withOrientation:(CGImagePropertyOrientation)orientation {
    @synchronized(self) {
        CIImage *outputImage;
        if (self.privacyMode) {
            if (_privacyImage && outputPixelBuffer) {
                return outputPixelBuffer;
            }
            outputImage = self.privacyImage;
        } else {
            if (_privacyImage) {
                _privacyImage = nil;
            }
            CIImage *sourceImage = [CIImage imageWithCVPixelBuffer:sourcePixelBuffer];
            // Recompute the transform only when the orientation changes
            if (lastSourceOritation != orientation) {
                CGFloat outputWidth = self.videoSize.width;
                CGFloat outputHeight = self.videoSize.height;
                CGFloat inputWidth = sourceImage.extent.size.width;
                CGFloat inputHeight = sourceImage.extent.size.height;
                // Landscape output with a landscape source (e.g. iPad Pro), or portrait output
                // with a portrait source: only scaling is needed
                if ((inputWidth > inputHeight && self.isLandscape) || (inputWidth <= inputHeight && !self.isLandscape)) {
                    if (orientation == kCGImagePropertyOrientationUp) {
                        lastRotateOritation = kCGImagePropertyOrientationUp;
                    } else if (orientation == kCGImagePropertyOrientationDown) {
                        lastRotateOritation = kCGImagePropertyOrientationDown;
                    }
                    lastRotateTransform = CGAffineTransformMakeScale(outputWidth / inputWidth, outputHeight / inputHeight);
                } else {
                    // A rotation is required
                    if (orientation == kCGImagePropertyOrientationLeft) {
                        lastRotateOritation = kCGImagePropertyOrientationRight;
                    } else if (orientation == kCGImagePropertyOrientationRight) {
                        lastRotateOritation = kCGImagePropertyOrientationLeft;
                    } else {
                        lastRotateOritation = kCGImagePropertyOrientationLeft;
                    }
                    lastRotateTransform = CGAffineTransformMakeScale(outputWidth / inputHeight, outputHeight / inputWidth);
                }
            }
            sourceImage = [sourceImage imageByApplyingOrientation:lastRotateOritation];
            outputImage = [sourceImage imageByApplyingTransform:lastRotateTransform];
            lastSourceOritation = orientation;
        }
        // Lazily create the reusable output buffer
        if (!outputPixelBuffer) {
            NSDictionary *pixelBufferOptions = @{(NSString *)kCVPixelBufferWidthKey : @(self.videoSize.width),
                                                 (NSString *)kCVPixelBufferHeightKey : @(self.videoSize.height),
                                                 (NSString *)kCVPixelBufferOpenGLESCompatibilityKey : @YES,
                                                 (NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{}};
            CVReturn ret = CVPixelBufferCreate(kCFAllocatorDefault, self.videoSize.width, self.videoSize.height, kCVPixelFormatType_32BGRA, (__bridge CFDictionaryRef)pixelBufferOptions, &outputPixelBuffer);
            if (ret != noErr) {
                NSLog(@"Failed to create streamer buffer");
                outputPixelBuffer = nil;
                return outputPixelBuffer;
            }
        }
        // Render with MetalPetal (MTIImage / MTIContext)
        MTIImage *mtiImage = [[MTIImage alloc] initWithCIImage:outputImage];
        if (cicontext) {
            NSError *error;
            [cicontext renderImage:mtiImage toCVPixelBuffer:outputPixelBuffer error:&error];
            NSAssert(error == nil, @"Render failed");
        }
        return outputPixelBuffer;
    }
}

4.2 Implementation of privacy mode

During a live broadcast, the host may switch to QQ or type a password, operations that should not be visible to the audience. For these cases privacy mode replaces the screen content with one or more static images.

# The UIImage is converted to a CIImage, which can then be resized, reoriented, and rendered directly to a CVPixelBufferRef via CIContext
UIImage *privacyImage = [UIImage imageNamed:privacyImageName];
CIImage *sourceImage = [[CIImage alloc] initWithImage:privacyImage];
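The render step mentioned in the caption above might look like the following minimal sketch; outputPixelBuffer is the reusable buffer created in 4.1, and in practice the CIContext should be created once and reused rather than per frame.

// A sketch, assuming outputPixelBuffer already exists (see 4.1)
CIContext *ciContext = [CIContext contextWithOptions:nil];
[ciContext render:sourceImage toCVPixelBuffer:outputPixelBuffer];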

4.3 Video frame callbacks stop in some cases

In some cases (typically when the screen content is static) the system stops delivering video frames. Cache the last video frame and re-send it at the push stream's FPS to fill the gap, as sketched below.
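A minimal sketch of such a frame filler, reusing the tempVideoPixelBuffer / tempVideoPts / tempVideoTimeStamp cache from the handler in section 3; the 15 fps figure and the fillTimer property are assumptions for illustration.

- (void)startFrameFiller {
    NSTimeInterval frameInterval = 1.0 / 15.0; // assumed push-stream FPS
    self.fillTimer = [NSTimer scheduledTimerWithTimeInterval:frameInterval repeats:YES block:^(NSTimer * _Nonnull timer) {
        @synchronized(self) {
            // Only fill when no fresh frame arrived within one frame interval
            if (tempVideoPixelBuffer && (CFAbsoluteTimeGetCurrent() - tempVideoTimeStamp) > frameInterval) {
                // Advance the cached PTS by one frame so the encoder sees a steady timeline
                CMTime pts = CMTimeAdd(tempVideoPts, CMTimeMakeWithSeconds(frameInterval, tempVideoPts.timescale));
                [[KSYRKStreamerKit sharedInstance].streamerBase processVideoPixelBuffer:tempVideoPixelBuffer timeInfo:pts];
                tempVideoPts = pts;
                tempVideoTimeStamp = CFAbsoluteTimeGetCurrent();
            }
        }
    }];
    // Invalidate fillTimer when the broadcast ends, since the timer keeps self alive
}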

4.4 Display of barrage and gift information

There are two ways to display barrage and gift information:

1. A socket connection is established in the main App; after a message is received, a local notification is created to display gifts and barrage (a minimal sketch follows this list). Most live streaming applications use this method, and the changes to an existing barrage system are relatively small.

2. Similar to the approach of Penguin e-sports, the barrage and gift notifications can be delivered through APNs remote push notifications.
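A minimal sketch of the local-notification path in option 1; showBarrageMessage: is a hypothetical helper the socket layer would call on each incoming message, and notification authorization is assumed to have been requested beforehand.

#import <UserNotifications/UserNotifications.h>

- (void)showBarrageMessage:(NSString *)message {
    UNMutableNotificationContent *content = [[UNMutableNotificationContent alloc] init];
    content.title = @"Barrage";
    content.body = message;
    // A nil trigger delivers the notification immediately
    UNNotificationRequest *request = [UNNotificationRequest requestWithIdentifier:[NSUUID UUID].UUIDString
                                                                          content:content
                                                                          trigger:nil];
    [[UNUserNotificationCenter currentNotificationCenter] addNotificationRequest:request
                                                           withCompletionHandler:nil];
}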

4.5 Background keep-alive

Adopting the first barrage scheme raises the question of keeping the main App alive in the background:

Common background keep-alive approaches: VoIP, background location, and playing silent audio.

Considering power consumption and App Store review, we currently use a background task combined with playing silent audio.

# Create the background task
__weak typeof(self) weakSelf = self;
self.taskIdentifier = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
    [[UIApplication sharedApplication] endBackgroundTask:weakSelf.taskIdentifier];
    weakSelf.taskIdentifier = UIBackgroundTaskInvalid;
}];

# Periodically play silent audio to refresh the remaining background time
self.taskTimer = [NSTimer scheduledTimerWithTimeInterval:20.0f repeats:YES block:^(NSTimer * _Nonnull timer) {
    if ([[UIApplication sharedApplication] backgroundTimeRemaining] < 61.f) {
        // Create the player
        AVAudioSession *session = [AVAudioSession sharedInstance];
        [session setActive:YES error:nil];
        [session setCategory:AVAudioSessionCategoryPlayback withOptions:AVAudioSessionCategoryOptionMixWithOthers error:nil];
        AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:weakSelf.musicUrl error:nil];
        [audioPlayer prepareToPlay];
        [audioPlayer play];
        // Beginning a new background task refreshes backgroundTimeRemaining
        [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
    }
}];
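For the silent-audio playback to continue once the App is backgrounded, the main App also needs the audio background mode enabled. In Info.plist this corresponds to:

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>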

4.6 Some games have no sound

With Tencent games such as King of Glory and PUBG Mobile (刺激战场), if the game is launched first and the broadcast started afterwards, the game sound is silenced and the host can no longer hear it.

The reason is that we use AVAudioPlayer for background keep-alive and set the AVAudioSession option to AVAudioSessionCategoryOptionMixWithOthers to share the speaker with other applications. However, activating this session causes non-mixable audio that is already playing to be stopped.

The only workaround is to tell the user to start the live broadcast first and then enter the game, so that audio started afterwards plays normally.

4.7 Screen recording disconnects after the screen is locked

There is no good solution to this problem other than notifying the user after the disconnection, as sketched below.
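One hedged sketch of that notification, assuming the extension's RPBroadcastSampleHandler receives broadcastFinished when the screen is locked and that the extension is permitted to post a local notification (reusing the helper style from 4.4); the exact callback your extension receives on lock may vary by iOS version.

- (void)broadcastFinished {
    UNMutableNotificationContent *content = [[UNMutableNotificationContent alloc] init];
    content.body = @"Screen recording has stopped. Return to the app to restart the broadcast.";
    UNNotificationRequest *request = [UNNotificationRequest requestWithIdentifier:[NSUUID UUID].UUIDString
                                                                          content:content
                                                                          trigger:nil];
    [[UNUserNotificationCenter currentNotificationCenter] addNotificationRequest:request
                                                           withCompletionHandler:nil];
}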
