AnyLive overview

AnyLive is anyRTC's open source live streaming (push/pull stream) project. Thanks to its cross-platform architecture, a single codebase supports Android, iOS, Windows, macOS, Ubuntu, and other platforms. This article focuses on the iOS implementation of AnyLive.

Download the source code

  • AnyLive source code: see the GitHub download address at the end of this article.

Development environment

  • Development tool: Xcode 13 (run on a real device)
  • Development languages: Objective-C, Swift
  • Features implemented: live stream publishing (push) and playback (pull)

Platform compatibility

| System  | Minimum version | Compile environment  | CPU architecture        |
|---------|-----------------|----------------------|-------------------------|
| Android | 4.4 and above   | Android Studio, NDK  | armeabi-v7a, arm64-v8a  |
| iOS     | 9.0 or later    | Xcode 13             | arm64                   |
| Windows | 7 and later     | VS2015, VS2017       | x86, x86-64             |

Project structure

AnyLive implements stream publishing (push), stream playback (pull), screen sharing, beauty filters, and other features.
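Before looking at each page in detail, it helps to see how the core objects fit together: a single ARLiveEngineKit instance is created once, and the pusher (publishing) and player (playback) objects are created from it. Below is a minimal, hypothetical sketch based only on the calls that appear in the sample code later in this article; the LiveManager wrapper and the ARLiveKit module name are assumptions, not part of the project.

```swift
import ARLiveKit  // assumed module name; use the SDK module the project actually imports

/// Hypothetical wrapper showing the relationship between the core objects.
final class LiveManager {
    /// One engine instance, created once (the sample creates it on the main page)
    let liveEngine = ARLiveEngineKit(delegate: nil)

    /// Publishing side, used on the "Live push" page
    lazy var livePusher: ARLivePusher = liveEngine.createArLivePusher()

    /// Playback side, used on the "Live pull" page
    lazy var livePlayer: ARLivePlayer = liveEngine.createArLivePlayer()
}
```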

Sample code

Effect display (main page)

Code implementation
// Inside the main menu UITableViewController subclass

private let menus: [[MenuItem]] = [
    [MenuItem(imageName: "icon_push", title: "Live push", subTitle: "Live push")],
    [MenuItem(imageName: "icon_pull", title: "Live pull", subTitle: "Live pull")],
    [MenuItem(imageName: "icon_video", title: "Small video playback", subTitle: "Small video playback")]
]

private let identifier = "ARLiveMainCell"

private lazy var identifierArr: [String] = {
    ["Live_JoinVC", "Player_JoinVC", "Video_JoinVC"]
}()

override func viewDidLoad() {
    super.viewDidLoad()

    // Footer label
    let label = UILabel(frame: .zero)
    label.textColor = UIColor(hexString: "#C4C4CE")
    label.font = UIFont(name: PingFang, size: 12)
    label.textAlignment = .center
    label.text = "Power by anyRTC"
    view.addSubview(label)

    /// Instantiate the live engine
    liveEngine = ARLiveEngineKit(delegate: nil)
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    navigationController?.setNavigationBarHidden(true, animated: true)
}

// MARK: - Table view data source

override func numberOfSections(in tableView: UITableView) -> Int {
    return menus.count
}

override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
    return menus[section].count
}

override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let cell: ARMainCell = tableView.dequeueReusableCell(withIdentifier: identifier, for: indexPath) as! ARMainCell
    // Configure the cell with the corresponding menu item
    let menuItem = menus[indexPath.section][indexPath.row]
    cell.mainImageView.image = UIImage(named: menuItem.imageName)
    cell.mainLabel.text = menuItem.title
    cell.subLabel.text = menuItem.subTitle
    // Only the "small video playback" section shows the "coming soon" badge
    cell.expectedImageView.isHidden = (indexPath.section != 2)
    return cell
}

override func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
    if indexPath.section != 2 {
        // Push the corresponding live push / live pull page from the storyboard
        guard let vc = storyboard?.instantiateViewController(withIdentifier: identifierArr[indexPath.section]) else { return }
        navigationController?.pushViewController(vc, animated: true)
    } else {
        ARToast.showText(text: "Please look forward to it!", duration: 1.0)
    }
}
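The controller above relies on a MenuItem model and an ARMainCell that are not shown. A minimal sketch of the model, assuming it is a plain value type with exactly the three fields the cell configuration reads (the real project may declare it differently):

```swift
/// Sketch of the menu model assumed by the table view controller above.
struct MenuItem {
    let imageName: String   // asset name, e.g. "icon_push"
    let title: String       // main label text
    let subTitle: String    // sub label text
}
```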
Effect display (push stream)

Code implementation
func initializePusher() {
    /// Instantiate the pusher object
    livePusher = liveEngine!.createArLivePusher()
    /// Set the pusher delegate
    livePusher.setDelegate(self)

    /// Set the video encoder parameters for the push stream
    let param = ARLiveVideoEncoderParam(resolution!)
    livePusher.setVideoQuality(param)

    /// Start camera and microphone capture
    livePusher.startCamera(true)
    livePusher.startMicrophone()

    /// Set the local camera preview view
    livePusher.setupCameraRender(renderView)
    livePusher.setRenderFill(.fill)

    /// Start pushing the stream
    livePusher.startPush(pushUrl)
}

// MARK: - ARLivePushDelegate

extension ArLiveViewController: ARLivePushDelegate {
    func onError(_ code: ARLiveCode, message msg: String?, extraInfo: [AnyHashable: Any]?) {
        /// Pusher error notification
        Logger.log(message: "onError \(code.rawValue)", level: .error)
    }

    func onWarning(_ code: ARLiveCode, message msg: String?, extraInfo: [AnyHashable: Any]?) {
        /// Pusher warning notification
        Logger.log(message: "onWarning \(code.rawValue)", level: .warning)
    }

    func onCaptureFirstAudioFrame() {
        /// First audio frame captured
        Logger.log(message: "onCaptureFirstAudioFrame", level: .info)
    }

    func onCaptureFirstVideoFrame() {
        /// First video frame captured
        Logger.log(message: "onCaptureFirstVideoFrame", level: .info)
    }

    func onMicrophoneVolumeUpdate(_ volume: Int) {
        /// Microphone capture volume callback
        Logger.log(message: "onMicrophoneVolumeUpdate volume = \(volume)", level: .info)
    }

    func onPushStatusUpdate(_ status: ARLivePushStatus, message msg: String?, extraInfo: [AnyHashable: Any]?) {
        /// Pusher status change callback
        Logger.log(message: "onPushStatusUpdate status = \(status.rawValue)", level: .info)
        stateLabel.text = "\(status.description)"
    }

    func onStatisticsUpdate(_ statistics: ARLivePusherStatistics) {
        /// Pusher statistics callback
        // Logger.log(message: "onStatisticsUpdate width = \(statistics.width), height = \(statistics.height), fps = \(statistics.fps), videoBitrate = \(statistics.videoBitrate), audioBitrate = \(statistics.audioBitrate)", level: .info)
    }

    func onSnapshotComplete(_ image: UIImage) {
        /// Snapshot callback
        Logger.log(message: "onSnapshotComplete", level: .info)
    }
}
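For context, here is a minimal sketch of how a publishing page could hold these objects and call initializePusher(). The property names mirror the ones used above; the push URL is a placeholder and the resolution type name is an assumption (check the SDK header for the actual enum):

```swift
class ArLiveViewController: UIViewController {
    @IBOutlet weak var renderView: UIView!   // local camera preview
    @IBOutlet weak var stateLabel: UILabel!  // shows the push status

    var liveEngine: ARLiveEngineKit?
    var livePusher: ARLivePusher!

    let pushUrl = "rtmp://example.com/live/streamKey"              // placeholder URL
    let resolution: ARLiveVideoResolution? = .resolution640x480    // type name assumed

    override func viewDidLoad() {
        super.viewDidLoad()
        liveEngine = ARLiveEngineKit(delegate: nil)
        initializePusher()   // configure and start publishing as shown above
    }
}
```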
Effect display (pull stream)

Code implementation
func initializePlayer() {
    /// Instantiate the player object
    livePlayer = liveEngine!.createArLivePlayer()
    /// Set the player delegate
    livePlayer.setDelegate(self)

    /// Set the player's video render view and fill mode
    livePlayer.setRenderView(renderView)
    livePlayer.setRenderFill(renderMode)

    /// Set the minimum and maximum auto-adjusted cache time of the player (in seconds)
    livePlayer.setCacheParams(1.0, maxTime: 100)

    /// Start playback
    livePlayer.startPlay(pullUrl)
}

// MARK: - ARLivePlayDelegate

extension ArPlayerViewController: ARLivePlayDelegate {
    func onError(_ player: ARLivePlayer, code: ARLiveCode, message msg: String?, extraInfo: [AnyHashable: Any]?) {
        /// Player error notification
        Logger.log(message: "onError code = \(code.rawValue)", level: .info)
    }

    func onWarning(_ player: ARLivePlayer, code: ARLiveCode, message msg: String?, extraInfo: [AnyHashable: Any]?) {
        /// Player warning notification
        Logger.log(message: "onWarning code = \(code.rawValue)", level: .info)
    }

    func onVideoPlayStatusUpdate(_ player: ARLivePlayer, status: ARLivePlayStatus, reason: ARLiveStatusChangeReason, extraInfo: [AnyHashable: Any]?) {
        /// Video playback status change notification
        Logger.log(message: "onVideoPlayStatusUpdate status = \(status.rawValue), reason = \(reason.rawValue)", level: .info)
        liveStatus = status
        stateLabel.text = "\(status.description)"
    }

    func onAudioPlayStatusUpdate(_ player: ARLivePlayer, status: ARLivePlayStatus, reason: ARLiveStatusChangeReason, extraInfo: [AnyHashable: Any]?) {
        /// Audio playback status change notification
        Logger.log(message: "onAudioPlayStatusUpdate status = \(status.rawValue) reason = \(reason.rawValue)", level: .info)
    }

    func onPlayoutVolumeUpdate(_ player: ARLivePlayer, volume: Int) {
        /// Playback volume callback
        Logger.log(message: "onPlayoutVolumeUpdate volume = \(volume)", level: .info)
    }

    func onStatisticsUpdate(_ player: ARLivePlayer, statistics: ARLivePlayerStatistics?) {
        /// Player statistics callback
        if statistics != nil {
            Logger.log(message: "onStatisticsUpdate width = \(statistics!.width), height = \(statistics!.height), fps = \(statistics!.fps), videoBitrate = \(statistics!.videoBitrate), audioBitrate = \(statistics!.audioBitrate)", level: .info)
        }
    }

    func onSnapshotComplete(_ player: ARLivePlayer, image: UIImage) {
        /// Snapshot callback: save the image to the photo album and show a temporary thumbnail
        UIImageWriteToSavedPhotosAlbum(image, self, #selector(saveImage(image:didFinishSavingWithError:contextInfo:)), nil)
        NSObject.cancelPreviousPerformRequests(withTarget: self, selector: #selector(removeSnapshot), object: nil)
        snapImageView.image = image
        let imageWidth = image.size.width / 2
        let imageHeight = image.size.height / 2
        snapImageView.frame = CGRect(x: ARScreenWidth - imageWidth - 24, y: 150, width: imageWidth, height: imageHeight)
        view.addSubview(snapImageView)
        perform(#selector(removeSnapshot), with: nil, afterDelay: 2)
        Logger.log(message: "onSnapshotComplete success, imageWidth = \(image.size.width), imageHeight = \(image.size.height)", level: .info)
    }

    func onRenderVideoFrame(_ player: ARLivePlayer, frame videoFrame: ARLiveVideoFrame?) {
        /// Custom video render callback
        Logger.log(message: "onRenderVideoFrame", level: .info)
    }

    func onReceiveSeiMessage(_ player: ARLivePlayer, payloadType: Int32, data: Data?) {
        /// SEI message callback
        Logger.log(message: "onReceiveSeiMessage payloadType = \(payloadType)", level: .info)
    }
}
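The snapshot callback above references two helpers, saveImage(image:didFinishSavingWithError:contextInfo:) and removeSnapshot(), that are not shown in the excerpt. A minimal sketch of what they might look like (the project's actual implementations may differ):

```swift
extension ArPlayerViewController {
    /// Completion selector passed to UIImageWriteToSavedPhotosAlbum above
    @objc func saveImage(image: UIImage, didFinishSavingWithError error: NSError?, contextInfo: UnsafeRawPointer) {
        if let error = error {
            Logger.log(message: "save snapshot failed: \(error.localizedDescription)", level: .error)
        } else {
            Logger.log(message: "snapshot saved to photo album", level: .info)
        }
    }

    /// Removes the temporary snapshot thumbnail added in onSnapshotComplete
    @objc func removeSnapshot() {
        snapImageView.removeFromSuperview()
    }
}
```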
Effect display (screen sharing)

Code implementation
// In the broadcast upload extension's RPBroadcastSampleHandler subclass
override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    DispatchQueue.main.async {
        switch sampleBufferType {
        case RPSampleBufferType.video:
            // Handle video sample buffer
            ARUploader.sendVideoBuffer(sampleBuffer)
        case RPSampleBufferType.audioApp:
            // Handle audio sample buffer for app audio
            ARUploader.sendAudioAppBuffer(sampleBuffer)
        case RPSampleBufferType.audioMic:
            // Handle audio sample buffer for mic audio
            ARUploader.sendAudioMicBuffer(sampleBuffer)
        @unknown default:
            // Handle other sample buffer types
            fatalError("Unknown type of sample buffer")
        }
    }
}

// ARUploader: creates and configures the pusher used for screen sharing
private static let liverPusher: ARLivePusher = {
    let livePusher = liveEngine.createArLivePusher()

    // Screen size (available if you want to derive encoder dimensions from it)
    let screenSize = UIScreen.main.currentMode?.size
    let screenWidth = screenSize?.width
    let screenHeight = screenSize?.height

    /// Set the video encoder parameters for the push stream
    let videoParam = ARLiveVideoEncoderParam()
    videoParam.videoResolution = .resolution640x480
    videoParam.videoResolutionMode = .portrait
    videoParam.videoScaleMode = .fit
    livePusher.setVideoQuality(videoParam)

    livePusher.startMicrophone()

    /// Enable custom audio and video capture
    livePusher.enableCustomAudioCapture(true)
    livePusher.enableCustomVideoCapture(true)

    /// Start pushing the stream
    livePusher.startPush(<#T##String#>)
    return livePusher
}()

static func sendAudioAppBuffer(_ sampleBuffer: CMSampleBuffer) {
    ARAudioTube.liverPusher(liverPusher, pushAudioCMSampleBuffer: sampleBuffer, resampleRate: audioSampleRate, type: .app)
}

static func sendAudioMicBuffer(_ sampleBuffer: CMSampleBuffer) {
    ARAudioTube.liverPusher(liverPusher, pushAudioCMSampleBuffer: sampleBuffer, resampleRate: audioSampleRate, type: .mic)
}
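The capture code above runs inside a ReplayKit broadcast upload extension; the host app only needs to give the user a way to start that extension. A minimal sketch using Apple's RPSystemBroadcastPickerView (iOS 12+); the extension bundle identifier below is a placeholder you would replace with the real one:

```swift
import ReplayKit
import UIKit

/// Adds the system broadcast picker so the user can start the screen-sharing extension.
func addBroadcastPicker(to containerView: UIView) {
    guard #available(iOS 12.0, *) else { return }  // the picker requires iOS 12 or later
    let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
    // Restrict the picker to this app's broadcast upload extension (placeholder bundle id)
    picker.preferredExtension = "com.example.AnyLive.BroadcastUploadExtension"
    picker.showsMicrophoneButton = true
    containerView.addSubview(picker)
}
```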

Conclusion

Finally, due to limited time, the project still has some bugs and feature points to be improved; it is provided for reference only. You are welcome to fork it and to open issues for anything that falls short. The GitHub open source download address is posted below.

GitHub open source download address.