MIDI: MIDI is not audio data

MIDI stands for Musical Instrument Digital Interface

MIDI is a musical-instrument format that both computers and electronic instruments can understand and process; it describes music in a way a computer can interpret

MIDI is not an audio signal and does not contain a PCM buffer

To play MIDI, a sequencer sends the MIDI event data to an instrument, which turns the events into audio

(The instrument's sound comes from a sound library, e.g. a SoundFont)
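To make the "events, not samples" point concrete: a single note boils down to a few bytes of event data. This is just an illustration of the raw MIDI message format (the values are arbitrary), not tied to any framework used below:

    // Note-on followed by note-off: three bytes each, no audio samples involved.
    // 0x90 = note-on, channel 0; 60 = middle C; 100 = velocity (how hard the key is hit)
    let noteOn: [UInt8]  = [0x90, 60, 100]
    // 0x80 = note-off, channel 0; release velocity 0
    let noteOff: [UInt8] = [0x80, 60, 0]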

Simple playback via AVAudioSequencer

Connect the input and output of AVAudioEngine:

input AVAudioUnitSampler → mixer engine.mainMixerNode → output engine.outputNode

Take the AVAudioEngine and create an AVAudioSequencer on top of it to play the MIDI

Configure the input and output of AVAudioEngine

  • Input: connect the AVAudioUnitSampler to the mixer engine.mainMixerNode
    var engine = AVAudioEngine()
    var sampler = AVAudioUnitSampler()
    engine.attach(sampler)

    // bus 0 is the output, bus 1 is the input
    let outputHWFormat = engine.outputNode.outputFormat(forBus: 0)
    engine.connect(sampler, to: engine.mainMixerNode, format: outputHWFormat)

    guard let bankURL = Bundle.main.url(forResource: soundFontMuseCoreName, withExtension: "sf2") else {
        fatalError("\(self.soundFontMuseCoreName).sf2 file not found.")
    }
    // load the sound font resource
    do {
        try self.sampler.loadSoundBankInstrument(at: bankURL,
                                                 program: 0,
                                                 bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
                                                 bankLSB: UInt8(kAUSampler_DefaultBankLSB))
        try engine.start()
    } catch {
        print(error)
    }
  • The mixer engine.mainMixerNode and the output engine.outputNode need no extra handling; just use the AVAudioEngine defaults

Use AVAudioSequencer to play MIDI

AVAudioSequencer can use different audio tracks to correspond to different musical instrument sounds

tracks[index] can each point to a different audio-generating node


        var sequencer = AVAudioSequencer(audioEngine: engine)
        guard let fileURL = Bundle.main.url(forResource: "sibeliusGMajor", withExtension: "mid") else {
            fatalError("\"sibeliusGMajor.mid\" file not found.")
        }
        
        do {
            try sequencer.load(from: fileURL, options: .smfChannelsToTracks)
            print("loaded \(fileURL)")
        } catch {
            fatalError("something screwed up while loading midi file \n \(error)")
        }
            // Handled simply here: every track is routed to the same sampler
        for track in sequencer.tracks {
            track.destinationAudioUnit = self.sampler
        }
        
        sequencer.prepareToPlay()
        do {
            try sequencer.start()
        } catch {
            print("\(error)")
        }
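The loop above routes every track to the same sampler, which is the simplest setup. If two samplers were attached to the engine instead (the names here are hypothetical), each track could be given its own instrument:

    // Assumes pianoSampler and drumSampler are AVAudioUnitSampler instances,
    // each attached and connected to the engine with its own sound bank loaded.
    if sequencer.tracks.count >= 2 {
        sequencer.tracks[0].destinationAudioUnit = pianoSampler
        sequencer.tracks[1].destinationAudioUnit = drumSampler
    }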

The AudioKit source does this with MusicSequence and MusicPlayer

MIDI gives monotonous audio a sense of rhythm

Calling code:

    let engine = AudioEngine()
    // The sampler that receives MIDI; an AVAudioUnitSampler under the hood
    let drums = MIDISampler(name: "Drums")
    // Load a MIDI file from the app bundle
    let sequencer = AppleSequencer(filename: "4tracks")
    engine.output = drums

    do {
        try engine.start()
    } catch {
        // ...
    }

    do {
        let bassDrumURL = Bundle.main.resourceURL?.appendingPathComponent("Samples/bass_drum_C1.wav")
        let bassDrumFile = try AVAudioFile(forReading: bassDrumURL!)
        let clapURL = Bundle.main.resourceURL?.appendingPathComponent("Samples/clap_D#1.wav")
        let clapFile = try AVAudioFile(forReading: clapURL!)
        // ...
        try drums.loadAudioFiles([bassDrumFile, clapFile /* , ... */])
    } catch {
        // ...
    }

With the above, simple playback of the audio files is done

Next, add MIDI events to drive the audio files and adjust what plays on each track

A track is not a channel. A channel can have more than one track

A MIDI note event carries the playback start time, the duration, the volume (velocity) and the pitch (note number)

  • A note's pitch (note number) ranges from 0 to 127

  • Volume is described by velocity, which also ranges from 0 to 127

Sometimes different velocities produce different timbres on the same instrument

    // Clear any existing events in the first 100 beats
    sequencer.clearRange(start: Duration(beats: 0), duration: Duration(beats: 100))
    // Point the sequencer's MIDI output at the sampler's MIDI input
    sequencer.setGlobalMIDIOutput(drums.midiIn)
    // Loop playback; MIDI timing can be expressed in beats (or seconds)
    sequencer.enableLooping(Duration(beats: 4))   // loop length: example value
    sequencer.setTempo(150)

    sequencer.tracks[0].add(noteNumber: 24, velocity: 80, position: Duration(beats: 0), duration: Duration(beats: 1))
    sequencer.tracks[0].add(noteNumber: 24, velocity: 80, position: Duration(beats: 2), duration: Duration(beats: 1))
    sequencer.tracks[1].add(noteNumber: 26, velocity: 80, position: Duration(beats: 2), duration: Duration(beats: 1))
    // ...
    // set up sequencer.tracks[2] and sequencer.tracks[3] the same way

    // play
    sequencer.play()

AudioKit source code implementation:

AppleSequencer is an encapsulation of MusicSequence, MusicTrack (MusicTrackManager), and MusicPlayer.

MusicTrack is encapsulated by MusicTrackManager

Initialization

Create a sequence, load a MIDI file, and provide playback resources

Create MusicPlayer to play MIDI files

open class AppleSequencer: NSObject {
    /// Music sequence
    open var sequence: MusicSequence?

    /// Array of AudioKit Music Tracks
    open var tracks = [MusicTrackManager]()

    /// Music Player
    var musicPlayer: MusicPlayer?

    public override init() {
        // create the underlying MusicSequence
        NewMusicSequence(&sequence)
        // set up and attach to the MusicPlayer
        NewMusicPlayer(&musicPlayer)
        if let existingMusicPlayer = musicPlayer {
            // associate the sequence with the player
            MusicPlayerSetSequence(existingMusicPlayer, sequence)
        }
    }
}

How the initializer above gets called, and how the MIDI file is loaded:

// Instantiate with a file name
public convenience init(filename: String) {
    self.init()
    loadMIDIFile(filename)
}

public func loadMIDIFile(_ filename: String) {
    // build the URL from the file name and the "mid" extension
    let bundle = Bundle.main
    guard let file = bundle.path(forResource: filename, ofType: "mid") else {
        Log("No midi file found")
        return
    }
    let fileURL = URL(fileURLWithPath: file)
    loadMIDIFile(fromURL: fileURL)
}

public func loadMIDIFile(fromURL fileURL: URL) {
    // ...
    // load the MIDI file into the existing MusicSequence
    if let existingSequence = sequence {
        let status: OSStatus = MusicSequenceFileLoad(existingSequence,
                                                     fileURL as CFURL,
                                                     .midiType,
                                                     MusicSequenceLoadFlags())
        if status != OSStatus(noErr) {
            // error log
            // ...
        }
    }
    initTracks()
}

Initialize the tracks

This completes the initialization of open var tracks = [MusicTrackManager]()

func initTracks() {
    var count: UInt32 = 0
    if let existingSequence = sequence {
        MusicSequenceGetTrackCount(existingSequence, &count)
    }

    for i in 0 ..< count {
        var musicTrack: MusicTrack?
        if let existingSequence = sequence {
            MusicSequenceGetIndTrack(existingSequence, UInt32(i), &musicTrack)
        }
        if let existingMusicTrack = musicTrack {
            tracks.append(MusicTrackManager(musicTrack: existingMusicTrack,
                                            name: "InitializedTrack"))
        }
    }

    // ...
    // loop-control handling
}

Add playback resources to a track

The calling code:

sequencer.setGlobalMIDIOutput(drums.midiIn)

In the class AppleSequencer,

the setting is applied uniformly, in the simplest way:

the MIDI endpoint passed in is assigned to every track

public func setGlobalMIDIOutput(_ midiEndpoint: MIDIEndpointRef) {
        for track in tracks {
            track.setMIDIOutput(midiEndpoint)
        }
    }

In class MusicTrackManager,

the MIDI endpoint is set as the destination of the underlying MusicTrack

   public func setMIDIOutput(_ endpoint: MIDIEndpointRef) {
        if let track = internalMusicTrack {
            MusicTrackSetDestMIDIEndpoint(track, endpoint)
        }
    }

Playing the sequenced audio is very simple

/// Play the sequence
    public func play() {
        if let existingMusicPlayer = musicPlayer {
            MusicPlayerStart(existingMusicPlayer)
        }
    }

MIDISampler: the audio input

MIDISampler inherits from AppleSampler,

AppleSampler encapsulates an AVAudioUnitSampler and does three main things (sketched in code after the list):

  • Loading audio resource files

  • Basic playback: play / stop

  • Basic playback control: volume and left/right pan
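A rough sketch of how those three pieces might look from the calling side. This assumes AudioKit 5's AppleSampler API; the sample file name is hypothetical, and whether play/stop throw depends on the AudioKit version:

    let sampler = AppleSampler()

    // 1. Load an audio resource file (hypothetical file in the bundle)
    if let url = Bundle.main.url(forResource: "snare", withExtension: "wav"),
       let file = try? AVAudioFile(forReading: url) {
        try? sampler.loadAudioFile(file)
    }

    // 2. Basic playback: play / stop a note
    try? sampler.play(noteNumber: 60, velocity: 100, channel: 0)
    try? sampler.stop(noteNumber: 60, channel: 0)

    // 3. Basic control: volume and left/right pan
    sampler.volume = 0.8
    sampler.pan = -0.5   // pan toward the left channel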

Initialize the MIDI sampler
open class MIDISampler: AppleSampler, NamedNode {
    /// Node name
    open var name = "MIDI Sampler"

    /// Initialize the MIDI sampler
    public init(name midiOutputName: String? = nil) {
        super.init()
        name = midiOutputName ?? name
        enableMIDI(name: name)
        // ...
    }

    /// Enable MIDI input; midiIn is the MIDIEndpointRef this sampler listens on
    public func enableMIDI(_ midiClient: MIDIClientRef = MIDI.sharedInstance.client,
                           name: String = "MIDI Sampler") {
        CheckError(MIDIDestinationCreateWithBlock(midiClient, name as CFString, &midiIn) { packetList, _ in
            for e in packetList.pointee {
                e.forEach { (event) in
                    if event.length == 3 {
                        do {
                            try self.handle(event: event)
                        } catch let exception {
                            // error log for the exception
                        }
                    }
                }
            }
        })
    }
}

Each piece of MIDI data is an event


private func handle(event: MIDIEvent) throws {
        try self.handleMIDI(data1: event.data[0],
                            data2: event.data[1],
                            data3: event.data[2])
    }
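handleMIDI(data1:data2:data3:) then interprets those three bytes as a standard MIDI channel message. The library's actual implementation is in the repo; as a simplified illustration of the idea (not AudioKit's exact code), dispatching on the status byte to the wrapped AVAudioUnitSampler might look like this:

    import AVFoundation

    /// Simplified dispatch of a 3-byte MIDI channel message to an AVAudioUnitSampler.
    /// (Illustrative only; the real handleMIDI covers more message types.)
    func handleMIDIBytes(_ data1: UInt8, _ data2: UInt8, _ data3: UInt8,
                         on sampler: AVAudioUnitSampler) {
        let status  = data1 & 0xF0   // upper nibble: message type
        let channel = data1 & 0x0F   // lower nibble: MIDI channel
        switch status {
        case 0x90 where data3 > 0:   // note-on with non-zero velocity
            sampler.startNote(data2, withVelocity: data3, onChannel: channel)
        case 0x80, 0x90:             // note-off, or note-on with velocity 0
            sampler.stopNote(data2, onChannel: channel)
        default:
            break                    // other message types ignored in this sketch
        }
    }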

For the rest, see the GitHub repo