For recording, we’re going to need the microphone.

Every app on an iOS device has an Audio Session.

An app that works with audio naturally uses the device’s audio hardware.

The audio session is all about managing those audio operations.

iOS manages audio at a very fine granularity.

What do you think: should the music playing in the background be mixed with the audio of your app?

The audio session manages audio through its category setting (AVAudioSession.Category).

The default category, .soloAmbient, behaves like this:

1. Playback is allowed; recording is not.

2. When the ring/silent switch is set to silent, your app is muted: playing audio produces no sound.

3. When the screen is locked, your app is muted as well.

4. If another app is playing audio in the background, it goes silent when your app starts playing.

More categories:

1. .ambient: playback only; mixes with other apps’ audio; silenced by the mute switch and screen lock.

2. .soloAmbient: the default, described above.

3. .playback: playback only; keeps playing when the phone is muted or locked.

4. .record: recording only.

5. .playAndRecord: both recording and playback.

6. .multiRoute: input and output on multiple hardware routes at once.

The first thing to do is configure the audio session.

In general we’ll use the AVFoundation framework to work with audio, so let’s import AVFoundation.

Set the audio session category with the option AVAudioSession.CategoryOptions.defaultToSpeaker, which lets our app record from the built-in microphone and play audio through the speaker.

Since we’re implementing recording here, the category option changes too.

By default, playAndRecord routes audio to the receiver, the small speaker at the top of the phone, the one you press to your ear during a call.

defaultToSpeaker redirects the audio path to the loudspeaker at the bottom of the phone, next to the microphone.

```swift
// Global flag recording whether the app has microphone permission
var appHasMicAccess = true

// ...
// First, get the shared AVAudioSession instance
let session = AVAudioSession.sharedInstance()
do {
    // Set the category here
    try session.setCategory(AVAudioSession.Category.playAndRecord,
                            options: AVAudioSession.CategoryOptions.defaultToSpeaker)
    try session.setActive(true)
    // Check whether the app has permission to use the device's microphone
    session.requestRecordPermission({ (isGranted: Bool) in
        if isGranted {
            // Your app is allowed to record audio
            appHasMicAccess = true
        } else {
            appHasMicAccess = false
        }
    })
} catch let error as NSError {
    print("AVAudioSession configuration error: \(error.localizedDescription)")
}
```

Now for recording.

```swift
var audioStatus: AudioStatus = AudioStatus.stopped
var audioRecorder: AVAudioRecorder!

func setupRecorder() {
    // getURLforMemo() returns the file URL for the recording;
    // for details see the GitHub link below
    let fileURL = getURLforMemo()
    // Recording settings:
    // linear PCM, an uncompressed data format
    // 44.1 kHz sample rate, CD-quality
    // mono (one channel)
    let recordSettings = [
        AVFormatIDKey: Int(kAudioFormatLinearPCM),
        AVSampleRateKey: 44100.0,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ] as [String: Any]
    do {
        // Instantiate the audioRecorder
        audioRecorder = try AVAudioRecorder(url: fileURL, settings: recordSettings)
        audioRecorder.delegate = self
        audioRecorder.prepareToRecord()
    } catch {
        print("Error creating audio recorder.")
    }
}

// Start recording
func record() {
    startUpdateLoop()
    // Track the app's current recording state
    audioStatus = .recording
    // This line starts the actual recording
    audioRecorder.record()
}

// Stop recording
func stopRecording() {
    recordButton.setBackgroundImage(UIImage(named: "button-record"), for: UIControl.State.normal)
    audioStatus = .stopped
    audioRecorder.stop()
    stopUpdateLoop()
}
```
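The `getURLforMemo()` helper used above lives in the GitHub repo linked at the end of the article. A minimal sketch might look like the following; the file name `memo.caf` is an assumption for illustration, not necessarily what the repo uses:

```swift
import Foundation

// Sketch of getURLforMemo(): build a file URL in the app's Documents
// directory for the recording. The "memo.caf" name is a hypothetical choice.
func getURLforMemo() -> URL {
    let documentsURL = FileManager.default.urls(for: .documentDirectory,
                                                in: .userDomainMask)[0]
    return documentsURL.appendingPathComponent("memo.caf")
}
```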

When recording finishes, update the status in the AVAudioRecorderDelegate callback.

```swift
func audioRecorderDidFinishRecording(_ recorder: AVAudioRecorder, successfully flag: Bool) {
    audioStatus = .stopped
    // In this flow, recording is always stopped by a manual tap,
    // so there is no need to update the UI here
}
```

The recording is ready. Let’s play it.

Playing the recording

```swift
var audioPlayer: AVAudioPlayer!

func play() {
    // getURLforMemo() returns the recording's file URL
    let fileURL = getURLforMemo()
    do {
        // Instantiate the audioPlayer
        audioPlayer = try AVAudioPlayer(contentsOf: fileURL)
        if audioPlayer.duration > 0.0 {
            setPlayButtonOn(flag: true)
            audioPlayer.play()
            audioStatus = .playing
            startUpdateLoop()
        }
    } catch {
        print("Error loading audio player")
    }
}

// Stop playback
func stopPlayback() {
    setPlayButtonOn(flag: false)
    audioStatus = .stopped
    audioPlayer.stop()
    stopUpdateLoop()
}
```

After playback finishes, update the UI in the AVAudioPlayerDelegate callback.

```swift
func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
    // Only here do we know that playback has actually finished
    setPlayButtonOn(flag: false)
    audioStatus = .stopped
    stopUpdateLoop()
}
```

Displaying recording/playback progress in the UI

To display recording/playback progress, use a timer, because the recording/playback position changes from moment to moment.

Timer in three steps:

Start the timer:

```swift
var soundTimer: CFTimeInterval = 0.0
var updateTimer: CADisplayLink!

func startUpdateLoop() {
    if updateTimer != nil {
        updateTimer.invalidate()
    }
    updateTimer = CADisplayLink(target: self, selector: #selector(ViewController.updateLoop))
    updateTimer.preferredFramesPerSecond = 1
    updateTimer.add(to: RunLoop.current, forMode: RunLoop.Mode.common)
}
```

Do the work in the timer’s callback:

```swift
@objc func updateLoop() {
    if audioStatus == .recording {
        // While recording, refresh periodically
        if CFAbsoluteTimeGetCurrent() - soundTimer > 0.5 {
            timeLabel.text = formattedCurrentTime(UInt(audioRecorder.currentTime))
            soundTimer = CFAbsoluteTimeGetCurrent()
        }
    } else if audioStatus == .playing {
        // While playing, refresh periodically
        if CFAbsoluteTimeGetCurrent() - soundTimer > 0.5 {
            timeLabel.text = formattedCurrentTime(UInt(audioPlayer.currentTime))
            soundTimer = CFAbsoluteTimeGetCurrent()
        }
    }
}
```

Destroy the timer:

Call this method whenever you need to stop: for example, in the playback-finished delegate method, or when the play button is tapped again…

```swift
func stopUpdateLoop() {
    updateTimer.invalidate()
    updateTimer = nil
    timeLabel.text = formattedCurrentTime(UInt(0))
}
```
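The `formattedCurrentTime` helper used in both places above is in the GitHub repo at the end of the article. A minimal sketch, assuming an hours:minutes:seconds display:

```swift
import Foundation

// Sketch of formattedCurrentTime: turn a second count into "HH:MM:SS"
func formattedCurrentTime(_ time: UInt) -> String {
    let hours = time / 3600
    let minutes = (time / 60) % 60
    let seconds = time % 60
    return String(format: "%02d:%02d:%02d", Int(hours), Int(minutes), Int(seconds))
}
```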

Metering the volume

AVAudioPlayer has built-in audio metering: while audio plays, it can report, for example, the average power level of the waveform.

The AVAudioPlayer method averagePower(forChannel:) returns the current power in decibels, ranging from -160 dB to 0 dB, where 0 is loud and -160 is near silence.


We’ll make a mouth-opening animation as a simple visualization of volume: the louder the volume, the wider the mouth opens. See the GitHub repo at the end of this article.

```swift
// Create a MeterTable struct.
// Audio metering returns floats in the range -160 ... 0; first convert
// decibels to an amplitude between 0 and 1.
// The mouth animation has 5 images, so that range is split into 5 levels.
// For details, see the GitHub repo at the end of this article.
let meterTable = MeterTable(tableSize: 100)

// ...
// Before playing, enable decibel-level metering
audioPlayer.isMeteringEnabled = true

// ...
// Map the volume to an image frame number.
// Status updates need a timer; this method is called from the timer method.
// For details, see the GitHub repo.
func meterLevelsToFrame() -> Int {
    guard let player = audioPlayer else { return 1 }
    // Refresh the meter values before reading them
    player.updateMeters()
    let avgPower = player.averagePower(forChannel: 0)
    let linearLevel = meterTable.valueForPower(power: avgPower)
    // Continue processing: convert the energy level to a frame index
    let powerPercentage = Int(round(linearLevel * 100)) + 1
    let frame = (powerPercentage / totalFrames) + 1
    return min(frame, totalFrames)
}
```
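The MeterTable struct itself is in the GitHub repo; a minimal sketch of the decibel-to-amplitude conversion it performs is below. The -60 dB floor is an assumption here (anything quieter is treated as silence); the real table may precompute values for speed:

```swift
import Foundation

// Sketch of MeterTable: map a decibel power value (-160 ... 0)
// to a linear amplitude in 0 ... 1.
struct MeterTable {
    let tableSize: Int
    private let minDecibels: Float = -60.0  // assumed silence floor

    init(tableSize: Int) {
        self.tableSize = tableSize
    }

    func valueForPower(power: Float) -> Float {
        if power < minDecibels { return 0.0 }
        if power >= 0.0 { return 1.0 }
        // Decibels to linear amplitude: amplitude = 10^(dB / 20)
        let minAmp = powf(10.0, 0.05 * minDecibels)
        let amp = powf(10.0, 0.05 * power)
        // Normalize so minDecibels maps to 0 and 0 dB maps to 1
        return (amp - minAmp) / (1.0 - minAmp)
    }
}
```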

Audio playback controls: volume, left/right pan, looping, playback rate, and so on.

Controlling playback volume

The volume ranges from 0 to 1. 0 indicates mute, and 1 indicates maximum

```swift
func toSetVolumn(value: Float) {
    guard let player = audioPlayer else { return }
    // Apple wraps this all up: just set the player's volume
    player.volume = value
}
```

Setting the left/right channel balance (pan)

Pan ranges from -1 to 1: -1 is fully left, 1 is fully right, and 0 is centered.

```swift
func toSetPan(value: Float) {
    guard let player = audioPlayer else { return }
    // Apple wraps this all up: just set the player's pan
    player.pan = value
}
```

Setting looped playback

numberOfLoops ranges from -1 to Int.max. A value from 0 to Int.max plays the audio that many additional times; -1 loops indefinitely.

```swift
func toSetLoopPlayback(loop: Bool) {
    guard let player = audioPlayer else { return }
    // Set the player's numberOfLoops
    if loop == true {
        // numberOfLoops = -1 loops indefinitely, until the player is stopped
        player.numberOfLoops = -1
    } else {
        // numberOfLoops = 0 plays the audio once
        player.numberOfLoops = 0
    }
}
```

Setting the Playback rate

The playback rate of the audioPlayer ranges from 0.5 to 2.0

0.5 is half-speed playback, 1.0 is normal playback, 2.0 is double speed playback

```swift
// Enable rate control (must be set before playback starts)
audioPlayer.enableRate = true

// ...
func toSetRate(value: Float) {
    guard let player = audioPlayer else { return }
    // Apple wraps this all up: just set the player's rate
    player.rate = value
}
```

Github.com/BoxDengJZ/A…

Reprinted from: juejin.cn/post/684490…