When building an iOS/tvOS application with video playback capabilities, the best solution can be to build a custom player that matches the exact requirements of the application. This article is a technical overview of the basics of building a custom player on top of AVFoundation.

The idea is to use AVPlayer as the playback engine, taking advantage of its efficiency and stability.

Key points

To create a great player experience, the following points are important:

  • Provide the best stream for the player
  • Observe the player
  • Manage DRM
  • Display the controls

Provide the best stream for the player

AVPlayer receives its stream through an AVPlayerItem. Preparing the item should be completed as soon as possible to ensure a seamless user experience. Apple released a WWDC video on this topic.

Before providing the AVPlayerItem to the AVPlayer, there are a few things you can do to prepare it:

  • Manage DRM
  • Seek to the correct playback position
  • Select audio and subtitle languages

Manage DRM

In this case, we’re talking about FairPlay DRM, which is the DRM supported by AVPlayer.

When an HLS stream is protected by FairPlay, the HLS playlist contains an EXT-X-SESSION-KEY tag with a URI, which is picked up by the player item so that the application and the operating system can load the key needed to decrypt the stream. To obtain a license, we need to set an AVAssetResourceLoaderDelegate on the AVPlayerItem’s AVURLAsset. The delegate creates a hook on the AVURLAsset resource loading (the license, in our case) that lets the application fetch the license at playback time. Inside the hook, the application needs to:

  • Obtain the application certificate
  • Use AVAssetResourceLoadingRequest’s streamingContentKeyRequestData method to generate the request data
  • Ask the application server for the content key corresponding to the request data
  • Respond to the AVAssetResourceLoadingRequest with the content key
  • Close the hook
// MARK: - AVAssetResourceLoaderDelegate

func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                    shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
    // isContentKeyRelated and uriData are convenience helpers on AVAssetResourceLoadingRequest
    guard loadingRequest.isContentKeyRelated else { return false }
    loadContentKey(for: loadingRequest) { result in
        switch result {
        case let .failure(error):
            loadingRequest.finishLoading(with: error)
        case let .success(data):
            loadingRequest.dataRequest?.respond(with: data)
            loadingRequest.finishLoading()
        }
    }
    return true
}

// MARK: - Private

private func loadContentKey(for loadingRequest: AVAssetResourceLoadingRequest,
                            completion: @escaping (Result<Data, Error>) -> Void) {
    let uriData = loadingRequest.uriData
    // getApplicationCertificate and fetchContentKey(with:uriData:completion:) call the application's servers
    getApplicationCertificate { certificate in
        do {
            let challengeData = try loadingRequest.streamingContentKeyRequestData(forApp: certificate,
                                                                                  contentIdentifier: uriData,
                                                                                  options: nil)
            self.fetchContentKey(with: challengeData, uriData: uriData, completion: completion)
        } catch {
            completion(.failure(error))
        }
    }
}

Licenses can also be prefetched before playback rather than fetched while playing; we won’t cover prefetching here, but the mechanics are pretty much the same. Prefetching keys is a great way to improve the user experience, because it removes the content key fetch, a task that takes a few tenths of a second, from the playback startup.

Seek to the correct playback position

To start playing at the position requested by the user, the idea is to use one of the AVPlayerItem seek(to:) methods. Some of these methods take a tolerance before and after the target time; these tolerance parameters should be used to allow faster seeking by taking advantage of the stream encoding.

AVPlayer uses the CMTime type as its time reference. If the stream is time-based, which should be the case for any non-live stream, it is natural to seek from a TimeInterval representing the position in the stream, starting at zero. Converting a TimeInterval to a CMTime is then very simple.

With live streams, on the other hand, it is natural to navigate through the playback with a date, because the stream represents an ongoing event. For this purpose, the HLS stream carries an EXT-X-PROGRAM-DATE-TIME tag that allows the player to convert a Date into its own CMTime reference.

One final tip: the seek(to: Date) method of AVPlayerItem will not complete if the item’s status is not readyToPlay.
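As an illustration, here is a minimal sketch of both kinds of seek; playerItem, startPosition and startDate are hypothetical values provided by the application, and the two-second tolerance is just an example:

let time = CMTime(seconds: startPosition, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
let tolerance = CMTime(seconds: 2, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
// Time-based seek (non-live): the tolerances let the player snap to a nearby keyframe
playerItem.seek(to: time, toleranceBefore: tolerance, toleranceAfter: tolerance, completionHandler: nil)

// Date-based seek (live): only call this once the item status is readyToPlay
let didSeek = playerItem.seek(to: startDate, completionHandler: nil)
// didSeek is false when the stream carries no program date information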

Select audio and subtitle languages

The player allows the user to select the audio language and the subtitles. Automatically applying the user’s preferences at the start of playback can also be an interesting feature. To ensure the best performance, this can even be done before playback starts. To do this, the idea is to asynchronously load the AVPlayerItem asset’s media selection options and select the appropriate option among those available.

extension AVPlayerItem {

    func ad_select(languageTag: String, subtitlesTag: String, completion: @escaping () -> Void) {
        ad_loadAssetContent { [weak self] in
            self?.ad_selectOption(tagged: languageTag, for: .audible)
            self?.ad_selectOption(tagged: subtitlesTag, for: .legible)
            completion()
        }
    }

    // MARK: - Private

    private func ad_loadAssetContent(with completion: @escaping () -> Void) {
        let selector = #selector(getter: AVAsset.availableMediaCharacteristicsWithMediaSelectionOptions)
        let selectorString = NSStringFromSelector(selector)
        asset.loadValuesAsynchronously(forKeys: [selectorString], completionHandler: completion)
    }

    private func ad_selectOption(tagged optionTag: String, for mediaCharacteristic: AVMediaCharacteristic) {
        guard let option = asset
            .mediaSelectionGroup(forMediaCharacteristic: mediaCharacteristic)?
            .options
            .first(where: { $0.extendedLanguageTag == optionTag }) else { return }
        ad_select(mediaSelectionOption: option, for: mediaCharacteristic)
    }

    private func ad_select(mediaSelectionOption: AVMediaSelectionOption,
                           for mediaCharacteristic: AVMediaCharacteristic) {
        guard let group = asset.mediaSelectionGroup(forMediaCharacteristic: mediaCharacteristic) else { return }
        select(mediaSelectionOption, in: group)
    }
}

Observe the player

Once the stream is playing on screen, the application will certainly need feedback about the playback: first to display the relevant controls, and possibly to monitor player performance and user activity.

Periodic observation

Once playback has started, the first thing to keep up to date is the controls’ progress bar, which can be done with a periodic time observer on AVPlayer.

observer = avPlayer.addPeriodicTimeObserver(
    forInterval: CMTime(seconds: 1, preferredTimescale: CMTimeScale(NSEC_PER_SEC)),
    queue: DispatchQueue.main
) {  _ in
    // Compute the current player state and provide it to the controls
}

Event observation

Periodic observation is perfect when the player is playing the video nicely and nothing else is happening. But video playback is full of pitfalls.

Most events come from AVPlayerItem and can be captured via KVO. loadedTimeRanges, isPlaybackBufferEmpty, isPlaybackLikelyToKeepUp, isPlaybackBufferFull or seekableTimeRanges help understand how the player buffer behaves. The status property is essential because it defines whether the player can play the content; this is usually where a playback startup error shows up.

avPlayer.replaceCurrentItem(with: playerItem)
observer = playerItem.observe(\.status) { [weak self] (item, _) in
    switch item.status {
    case .readyToPlay:
        self?.avPlayer.play()
    case .failed:
        break // handle item.error
    default:
        break
    }
}
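The buffer-related properties mentioned above can be observed the same way. A small sketch, assuming the observation token is kept in a bufferObserver property and showLoadingIndicator(_:) is a hypothetical method of the controller:

bufferObserver = playerItem.observe(\.isPlaybackLikelyToKeepUp) { [weak self] (item, _) in
    // Show a spinner while the player is rebuffering
    self?.showLoadingIndicator(!item.isPlaybackLikelyToKeepUp)
}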

Some AVPlayer properties are also important to observe: rate, which drives the play/pause button in the controls, or isExternalPlaybackActive, which tells whether AirPlay is active.
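A sketch of these two observations, assuming the tokens are stored in properties and updatePlayPauseButton(isPlaying:) / updateAirplayUI(isActive:) are hypothetical UI helpers:

rateObserver = avPlayer.observe(\.rate) { [weak self] (player, _) in
    // rate == 0 means paused, anything else means playing
    self?.updatePlayPauseButton(isPlaying: player.rate != 0)
}
externalPlaybackObserver = avPlayer.observe(\.isExternalPlaybackActive) { [weak self] (player, _) in
    self?.updateAirplayUI(isActive: player.isExternalPlaybackActive)
}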

NotificationCenter also delivers interesting events, including applicationWillResignActive / applicationDidBecomeActive, or the AVAudioSession route change notification; registering for these notifications lets the player pause and resume when the application goes to the background and comes back, or update the UI when AirPlay starts or stops.
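A minimal sketch of these registrations, assuming a player wrapper class that owns avPlayer; the token properties and updateRouteDependentUI() are hypothetical:

willResignActiveToken = NotificationCenter.default.addObserver(
    forName: UIApplication.willResignActiveNotification, object: nil, queue: .main
) { [weak self] _ in
    self?.avPlayer.pause()
}
didBecomeActiveToken = NotificationCenter.default.addObserver(
    forName: UIApplication.didBecomeActiveNotification, object: nil, queue: .main
) { [weak self] _ in
    self?.avPlayer.play()
}
routeChangeToken = NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification, object: nil, queue: .main
) { [weak self] _ in
    // Update the UI, e.g. when AirPlay starts or stops
    self?.updateRouteDependentUI()
}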

Display the controls

Once the stream is ready and the player is playing properly, the final step is to display the controls on top of the player view. To keep the controls up to date during playback, a good solution is to collect all the playback information into a single state object, publish this object every time playback changes, and register the view controller responsible for displaying the controls to those changes. Each UI element is then easily bound to its associated piece of information.
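One possible shape for such a state object, as a sketch (all names here are assumptions, not part of any API):

struct PlayerState {
    var isPlaying = false
    var currentTime: TimeInterval = 0
    var duration: TimeInterval = 0
    var isBuffering = false
}

protocol PlayerStateObserver: AnyObject {
    func playerStateDidChange(_ state: PlayerState)
}

// Inside the player wrapper: every KVO callback, periodic observer tick or
// notification updates `state`, and the view controller displaying the
// controls implements PlayerStateObserver to refresh its UI.
var state = PlayerState() {
    didSet { stateObserver?.playerStateDidChange(state) }
}
weak var stateObserver: PlayerStateObserver?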

Conclusion

These mandatory steps are a good foundation for building a custom player; they handle the most important parts of playback. However, it may take some extra work to provide a great user experience. Depending on the project, possible additional features include monitoring player performance, providing advanced controls (such as pan and dismiss gestures), or supporting AirPlay or Google Chromecast.
