AVAssetReader introduction

AVAssetReader lets you obtain media samples from an asset: you can read the raw, undecoded samples directly, or have them decoded into a renderable form. According to the documentation, the AVAssetReader pipeline is multithreaded internally. After initialization, the reader loads and processes a reasonable amount of sample data ahead of use, so retrieval operations such as copyNextSampleBuffer (on AVAssetReaderOutput) have very low latency. Still, AVAssetReader is not designed for real-time sources, and its performance is not guaranteed for real-time operations.

Because sample data is loaded and processed before the video is used, a reader can occupy a large amount of memory, and the higher the video resolution, the more memory it takes. Be careful not to keep too many readers alive at the same time. AVAssetReader is initialized with an AVAsset; as mentioned above, sample data is loaded right after initialization, so this step already has an impact on memory. If memory is tight, do not initialize the reader earlier than necessary.

NSError *createReaderError;
_reader = [[AVAssetReader alloc] initWithAsset:_asset error:&createReaderError];

 

Setting the output on AVAssetReader

Before you start reading, you need to add an output that controls which tracks of the asset are read and how they are decoded. AVAssetReaderOutput has several subclasses, such as AVAssetReaderVideoCompositionOutput, AVAssetReaderAudioMixOutput and AVAssetReaderSampleReferenceOutput. This demonstration uses AVAssetReaderTrackOutput, which is initialized with a track retrieved from the asset.

NSArray *tracks = [_asset tracksWithMediaType:AVMediaTypeAudio];
if (tracks.count > 0) {
    AVAssetTrack *audioTrack = [tracks objectAtIndex:0];
}

or

NSArray *tracks = [_asset tracksWithMediaType:AVMediaTypeVideo];
if (tracks.count > 0) {
    AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
}

You can also configure the output format; see the documentation for the full set of options.

NSDictionary * const VideoAssetTrackReaderOutputOptions = @{(id)kCVPixelBufferOpenGLESCompatibilityKey : @(YES),
                                                            (id)kCVPixelBufferIOSurfacePropertiesKey : @{},
                                                            (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
_readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:_track
                                                           outputSettings:VideoAssetTrackReaderOutputOptions];
if ([_reader canAddOutput:_readerOutput]) {
    [_reader addOutput:_readerOutput];
}

 

The seek operation

AVAssetReader is not well suited to frequent random access; if you need to seek often, you may have to take a different approach. You can set the time range to read before reading starts, but once reading has started you can only move forward. There are two ways to adjust the read range:

  1. Set supportsRandomAccess to YES on the output before reading starts; the read range can then be reset, but the caller must first drain the output by calling copyNextSampleBuffer until it returns NULL.
  2. Alternatively, reinitialize a new AVAssetReader with the desired read time range.

If you take the first option for seeking, prefer a short time range each time, so that reading through the whole range does not take too long, and ideally align the range boundaries with key frames.
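The first option can be sketched as follows (a hedged sketch, assuming `_readerOutput` had `supportsRandomAccess` set to YES before `startReading` was called, and that `seekTime` is the hypothetical target time):

```objc
// Drain the current range first: the output requires all pending buffers
// to be consumed (copyNextSampleBuffer returning NULL) before a reset.
CMSampleBufferRef buffer = NULL;
while ((buffer = [_readerOutput copyNextSampleBuffer]) != NULL) {
    CFRelease(buffer);
}
// Jump to a new short range without recreating the reader.
// A 1-second window is used here purely for illustration.
CMTimeRange newRange = CMTimeRangeMake(seekTime, CMTimeMake(1, 1));
[_readerOutput resetForReadingTimeRanges:@[[NSValue valueWithCMTimeRange:newRange]]];
```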

Read the data

_reader.timeRange = range; // must be set before startReading
[_reader startReading];
_sampleBuffer = [_readerOutput copyNextSampleBuffer];
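In practice, copyNextSampleBuffer is usually called in a loop until it returns NULL, after which the reader's status tells you whether the asset finished or failed. A sketch, where `processSampleBuffer` stands in for whatever per-frame handling you need:

```objc
CMSampleBufferRef buffer = NULL;
while ((buffer = [_readerOutput copyNextSampleBuffer]) != NULL) {
    processSampleBuffer(buffer);   // hypothetical per-frame handler
    CFRelease(buffer);             // each copied buffer must be released
}
if (_reader.status == AVAssetReaderStatusFailed) {
    NSLog(@"Reading failed: %@", _reader.error);
}
```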

CMSampleBuffer provides accessors for the decoded data; for example, the image data can be obtained with:

CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(_sampleBuffer);
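Besides the image buffer, CMSampleBuffer also carries timing information for the decoded frame, which is useful when scheduling rendering. For example:

```objc
// Presentation timestamp and duration of the decoded frame.
CMTime pts = CMSampleBufferGetPresentationTimeStamp(_sampleBuffer);
CMTime duration = CMSampleBufferGetDuration(_sampleBuffer);
NSLog(@"frame at %.3fs, duration %.3fs",
      CMTimeGetSeconds(pts), CMTimeGetSeconds(duration));
```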

Note that when you are finished with a CMSampleBuffer, you must call CFRelease to release it:

CFRelease(_sampleBuffer);

 

Code sample

NSDictionary * const AssetOptions = @{AVURLAssetPreferPreciseDurationAndTimingKey : @YES};
NSDictionary * const VideoAssetTrackReaderOutputOptions = @{(id)kCVPixelBufferOpenGLESCompatibilityKey : @(YES),
                                                            (id)kCVPixelBufferIOSurfacePropertiesKey : @{},
                                                            (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};

_videoAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:filePath] options:AssetOptions];
_videoTrack = [[_videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
if (_videoTrack) {
    NSError *createReaderError;
    _reader = [[AVAssetReader alloc] initWithAsset:_videoAsset error:&createReaderError];
    if (!createReaderError) {
        _readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:_videoTrack
                                                                   outputSettings:VideoAssetTrackReaderOutputOptions];
        _readerOutput.supportsRandomAccess = YES;
        if ([_reader canAddOutput:_readerOutput]) {
            [_reader addOutput:_readerOutput];
        }
        [_reader startReading];
        if (_reader.status == AVAssetReaderStatusReading || _reader.status == AVAssetReaderStatusCompleted) {
            CMSampleBufferRef sampleBuffer = [_readerOutput copyNextSampleBuffer];
            if (sampleBuffer) {
                // Draw the picture
                CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
                CVPixelBufferLockBaseAddress(imageBuffer, 0);
                uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
                size_t width = CVPixelBufferGetWidth(imageBuffer);
                size_t height = CVPixelBufferGetHeight(imageBuffer);
                size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);
                size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);
                CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
                CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress, bufferSize, NULL);
                CGImageRef cgImage = CGImageCreate(width, height, 8, 32, bytesPerRow,
                                                   rgbColorSpace,
                                                   kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrderDefault,
                                                   provider, NULL, true, kCGRenderingIntentDefault);
                // ... use cgImage ...
                CGImageRelease(cgImage);
                CGDataProviderRelease(provider);
                CGColorSpaceRelease(rgbColorSpace);
                CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
                CFRelease(sampleBuffer);
            }
        }
    }
}