• Requirement

    The company uses hybrid development, and the short-video recording experience on the uni-app side was not ideal. To get a WeChat-style recording experience, we set out to build a native plugin.

  • Approach

    Step 1: One AVCaptureSession and one AVCaptureVideoPreviewLayer [for compatibility, consider replacing it with AVPreView]

    Step 2: Recording needs both video and audio, so each needs its own AVCaptureDeviceInput, plus the corresponding AVCaptureVideoDataOutput and AVCaptureAudioDataOutput

    Step 3: In the output delegate, distinguish video from audio and write the corresponding CMSampleBufferRef to the video file

    Step 4: Hand the file writing to an AVAssetWriter; for video & audio you need two AVAssetWriterInputs on that writer

    Step 5: CMSampleBufferRefs keep arriving and the AVAssetWriter keeps writing until recording stops

  • Implementation

    The initialization from Step 1 is not shown here; if anything is unclear, see my previous blog post.
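
    For completeness, here is a minimal sketch of that Step 1 setup. The property names (`session`, `previewLayer`) and the preset are assumptions, not the exact code from the earlier post.

    ```objc
    // Minimal Step 1 sketch: one session, one preview layer (names are illustrative)
    - (void)setupSession {
        self.session = [[AVCaptureSession alloc] init];
        if ([self.session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
            self.session.sessionPreset = AVCaptureSessionPresetHigh;
        }
        // Preview layer that renders the live camera feed
        self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        self.previewLayer.frame = self.view.bounds;
        [self.view.layer insertSublayer:self.previewLayer atIndex:0];
    }
    ```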

    Step 2: Two AVCaptureDeviceInputs and two outputs, with the delegate set on each output

    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    if (error) {
        NSLog(@"Error creating the videoInput from the device: %@", error);
        return;
    }
    // Add the device input to the session
    if ([self.session canAddInput:self.videoInput]) {
        [self.session addInput:self.videoInput];
    }
    [self.videoOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.session canAddOutput:self.videoOutput]) {
        [self.session addOutput:self.videoOutput];
    }
    // Audio side
    AVCaptureDevice *adevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    self.audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:adevice error:&error];
    if ([self.session canAddInput:self.audioInput]) {
        [self.session addInput:self.audioInput];
    }
    [self.audioOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.session canAddOutput:self.audioOutput]) {
        [self.session addOutput:self.audioOutput];
    }

    // Lazy getters for the two outputs
    - (AVCaptureVideoDataOutput *)videoOutput {
        if (!_videoOutput) {
            _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
            _videoOutput.alwaysDiscardsLateVideoFrames = YES;
        }
        return _videoOutput;
    }

    - (AVCaptureAudioDataOutput *)audioOutput {
        if (!_audioOutput) {
            _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        }
        return _audioOutput;
    }

    Step 3: Start the session and handle the CMSampleBufferRef in the delegate

    #pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate & AVCaptureAudioDataOutputSampleBufferDelegate
    - (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        @autoreleasepool {
            // Video
            if (connection == [self.videoOutput connectionWithMediaType:AVMediaTypeVideo]) {
                if (!self.manager.outputVideoFormatDescription) {
                    @synchronized (self) {
                        CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                        self.manager.outputVideoFormatDescription = formatDescription;
                    }
                } else {
                    @synchronized (self) {
                        if (self.manager.state == StateRecording) {
                            [self.manager appendBuffer:sampleBuffer type:AVMediaTypeVideo];
                        }
                    }
                }
            }
            // Audio
            if (connection == [self.audioOutput connectionWithMediaType:AVMediaTypeAudio]) {
                if (!self.manager.outputAudioFormatDescription) {
                    @synchronized (self) {
                        CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                        self.manager.outputAudioFormatDescription = formatDescription;
                    }
                }
                @synchronized (self) {
                    if (self.manager.state == StateRecording) {
                        [self.manager appendBuffer:sampleBuffer type:AVMediaTypeAudio];
                    }
                }
            }
        }
    }

    Step 4: The AVAssetWriter and its corresponding inputs

    _writer = [AVAssetWriter assetWriterWithURL:_videoUrl fileType:AVFileTypeMPEG4 error:nil];
    _videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:_videoSettings];
    // expectsMediaDataInRealTime must be YES: _videoInput receives its data live from the capture session
    _videoInput.expectsMediaDataInRealTime = YES;
    _audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:_audioSettings];
    _audioInput.expectsMediaDataInRealTime = YES;
    if ([_writer canAddInput:_videoInput]) {
        [_writer addInput:_videoInput];
    }
    if ([_writer canAddInput:_audioInput]) {
        [_writer addInput:_audioInput];
    }
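
    The `_videoSettings` / `_audioSettings` dictionaries are not shown in the original; here is one plausible shape using the standard AVFoundation keys. The concrete resolution, bit rate, and sample rate are illustrative assumptions you should tune per project.

    ```objc
    // Example output settings (keys are standard; the numbers are assumptions)
    _videoSettings = @{
        AVVideoCodecKey  : AVVideoCodecTypeH264,
        AVVideoWidthKey  : @(720),
        AVVideoHeightKey : @(1280),
        AVVideoCompressionPropertiesKey : @{
            // Bit rate and frame rate drive both quality and file size
            AVVideoAverageBitRateKey          : @(720 * 1280 * 3),
            AVVideoExpectedSourceFrameRateKey : @(30),
            AVVideoMaxKeyFrameIntervalKey     : @(30),
        },
    };
    _audioSettings = @{
        AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
        AVNumberOfChannelsKey : @(1),
        AVSampleRateKey       : @(44100),
        AVEncoderBitRateKey   : @(64000),
    };
    ```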

    Step 5: The CMSampleBufferRef from Step 3 is written to the video file by the AVAssetWriter

    - (void)appendBuffer:(CMSampleBufferRef)buffer type:(NSString *)mediaType {
        if (buffer == NULL) {
            NSLog(@"empty sampleBuffer");
            return;
        }
        @synchronized (self) {
            if (self.state < StateRecording) {
                NSLog(@"not ready yet");
                return;
            }
        }
        CFRetain(buffer);
        dispatch_async(self.queue, ^{
            @autoreleasepool {
                @synchronized (self) {
                    if (self.state > StateFinish) {
                        CFRelease(buffer);
                        return;
                    }
                }
                // Start the writer session on the first video buffer
                if (!self.canWrite && mediaType == AVMediaTypeVideo) {
                    [self.writer startWriting];
                    [self.writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(buffer)];
                    self.canWrite = YES;
                }
                if (!self.timer) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        self.timer = [NSTimer scheduledTimerWithTimeInterval:TIMER_INTERVAL target:self selector:@selector(updateProgress) userInfo:nil repeats:YES];
                        [[NSRunLoop currentRunLoop] addTimer:self.timer forMode:NSDefaultRunLoopMode];
                    });
                }
                // Write video data
                if (mediaType == AVMediaTypeVideo) {
                    if (self.videoInput.readyForMoreMediaData) {
                        BOOL success = [self.videoInput appendSampleBuffer:buffer];
                        if (!success) {
                            @synchronized (self) {
                                [self stop:^{}];
                                [self destroy];
                            }
                        }
                    }
                }
                // Write audio data
                if (mediaType == AVMediaTypeAudio) {
                    if (self.audioInput.readyForMoreMediaData) {
                        BOOL success = [self.audioInput appendSampleBuffer:buffer];
                        if (!success) {
                            @synchronized (self) {
                                [self stop:^{}];
                                [self destroy];
                            }
                        }
                    }
                }
                CFRelease(buffer);
            }
        });
    }
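
    The `stop:` call above is not shown in the original; one possible shape for it, reusing the same state and property names, might look like this. Treat it as an assumption about the manager's design, not the exact code.

    ```objc
    // Hypothetical stop path: flip the state, tear down the timer, finish the writer
    - (void)stop:(void (^)(void))completion {
        @synchronized (self) {
            if (self.state != StateRecording) { return; }
            self.state = StateFinish;
        }
        [self.timer invalidate];
        self.timer = nil;
        dispatch_async(self.queue, ^{
            // Mark both inputs finished, then let the writer close the file
            [self.videoInput markAsFinished];
            [self.audioInput markAsFinished];
            [self.writer finishWritingWithCompletionHandler:^{
                if (completion) { completion(); }
            }];
        });
    }
    ```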

  • Final notes:

    1. Configure the AVAssetWriterInput video settings to suit your own needs: the bit rate and frame rate directly affect the quality and file size of the recorded video, so choose them per project requirements
    2. If the video orientation is wrong, you can adjust it in any of three places

      1. Set videoOrientation on the preview layer's connection

      2. Set videoOrientation on the AVCaptureOutput's connection

      3. Set the transform on the video AVAssetWriterInput, e.g. a rotation of M_PI/2
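
    The three orientation fixes above can be sketched as follows; the exact rotation value depends on the device orientation at capture time, so treat the M_PI/2 here as an example, not a universal answer.

    ```objc
    // Option 3: rotate the written video track 90° so a portrait recording plays upright
    _videoInput.transform = CGAffineTransformMakeRotation(M_PI / 2);

    // Options 1 & 2: set the orientation on the relevant connection instead
    AVCaptureConnection *connection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    ```

    Note that options 1 and 2 rotate the buffers (or preview) at capture time, while option 3 only writes a display transform into the file's metadata; players that ignore the transform will still show the unrotated frames.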