Goal: use AVCaptureSession in AVFoundation to set the camera resolution and frame rate (including high frame rates), switch between the front and rear cameras, adjust focus, handle screen rotation, adjust exposure, and more.


Prerequisites for reading:

  • See the companion article: an overview of iOS video capture (AVCaptureSession)
  • Based on the AVFoundation framework

GitHub (with full code): iOS Video Capture in Practice (AVCaptureSession)

Jianshu: iOS Video Capture in Practice (AVCaptureSession)

Blog: iOS Video Capture in Practice (AVCaptureSession)

Juejin: iOS Video Capture in Practice (AVCaptureSession)


1. Set the resolution and frame rate

1.1. Low Frame Rate mode (FPS <= 30)

When the required frame rate is at most 30 fps, resolution and frame rate are configured independently: one API sets the frame rate and a separate API sets the resolution; the two are not coupled.

  • Set resolution

    This method sets the camera resolution through session presets; jump into the API documentation to see the full list of presets you can choose from. The current maximum is 3840x2160. If your camera never needs more than 30 fps, this approach is all you need.

- (void)setCameraResolutionByPresetWithHeight:(int)height session:(AVCaptureSession *)session {
    /*
     Note: this method only supports frame rates <= 30, because above 30 fps we must use
     `activeFormat`, and `activeFormat` and `sessionPreset` are mutually exclusive.
     */
    AVCaptureSessionPreset preset = [self getSessionPresetByResolutionHeight:height];
    if ([session.sessionPreset isEqualToString:preset]) {
        NSLog(@"No need to set the camera resolution again!");
        return;
    }
    
    if (![session canSetSessionPreset:preset]) {
        NSLog(@"Can't set the sessionPreset!");
        return;
    }
    
    [session beginConfiguration];
    session.sessionPreset = preset;
    [session commitConfiguration];
}
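The helper getSessionPresetByResolutionHeight: used above is not shown in this excerpt. A minimal sketch, assuming a mapping from the common 16:9 heights (plus 4:3 for 480) to presets — adjust it to the presets your app actually needs:

- (AVCaptureSessionPreset)getSessionPresetByResolutionHeight:(int)height {
    // Assumed mapping; extend with other presets as required.
    switch (height) {
        case 2160: return AVCaptureSessionPreset3840x2160;
        case 1080: return AVCaptureSessionPreset1920x1080;
        case 720:  return AVCaptureSessionPreset1280x720;
        case 480:  return AVCaptureSessionPreset640x480;
        default:   return AVCaptureSessionPreset1280x720;
    }
}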
  • Set the frame rate

    This method sets the camera's frame rate. It only supports frame rates up to 30 fps.

- (void)setCameraForLFRWithFrameRate:(int)frameRate {
    // Only for frame rates <= 30
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([captureDevice lockForConfiguration:NULL]) {
        [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice unlockForConfiguration];
    }
}
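Putting the two together, a minimal usage sketch, assuming self.session is a configured AVCaptureSession on the same class:

// Example: 1080p at 25 fps in low-frame-rate mode.
[self setCameraResolutionByPresetWithHeight:1080 session:self.session];
[self setCameraForLFRWithFrameRate:25];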
1.2. High Frame Rate mode (FPS > 30)

If you need a high frame rate at a given resolution, such as 50, 60, or 120 fps, the original setActiveVideoMinFrameDuration and setActiveVideoMaxFrameDuration calls alone are no longer enough: Apple requires them to be used together with the new resolution mechanism, the device's activeFormat property.

The new resolution mechanism, activeFormat, and sessionPreset are mutually exclusive: once one is used, the other becomes unavailable. If you need high frame rates, it is recommended to use the activeFormat-based approach throughout and drop the preset-based approach, to avoid compatibility problems.

With this change, Apple merged what used to be two independent settings into one: resolution and frame rate must now be configured together. Each AVCaptureDeviceFormat supports a specific range of frame rates at a specific resolution, so we have to iterate the device's formats to find one that matches both. If your project is certain it will never need more than 30 fps, the simpler preset-based method above remains simple and effective.

Note: once activeFormat is set, the session's sessionPreset automatically becomes AVCaptureSessionPresetInputPriority. So if your project previously relied on canSetSessionPreset: checks and must support high frame rates, it is best to abandon sessionPreset entirely.

+ (BOOL)setCameraFrameRateAndResolutionWithFrameRate:(int)frameRate andResolutionHeight:(CGFloat)resolutionHeight bySession:(AVCaptureSession *)session position:(AVCaptureDevicePosition)position videoFormat:(OSType)videoFormat {
    AVCaptureDevice *captureDevice = [self getCaptureDevicePosition:position];
    
    for (AVCaptureDeviceFormat *vFormat in [captureDevice formats]) {
        CMFormatDescriptionRef description = vFormat.formatDescription;
        float maxRate = ((AVFrameRateRange *)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;
        if (maxRate >= frameRate && CMFormatDescriptionGetMediaSubType(description) == videoFormat) {
            // Compare the format's dimensions with the requested resolution
            CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(description);
            if (dims.height == resolutionHeight && dims.width == [self getResolutionWidthByHeight:resolutionHeight]) {
                [session beginConfiguration];
                if ([captureDevice lockForConfiguration:NULL]) {
                    captureDevice.activeFormat = vFormat;
                    [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice unlockForConfiguration];
                } else {
                    NSLog(@"%s: lock failed!", __func__);
                }
                [session commitConfiguration];
                
                NSLog(@"%s: set frame rate = %d, resolution height = %f", __func__, frameRate, resolutionHeight);
                return YES;
            }
        }
    }
    
    NSLog(@"%s: no matching format for frame rate %d and resolution height %f", __func__, frameRate, resolutionHeight);
    return NO;
}

+ (AVCaptureDevice *)getCaptureDevicePosition:(AVCaptureDevicePosition)position {
    NSArray *devices = nil;
    
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }
    
    for (AVCaptureDevice *device in devices) {
        if (position == device.position) {
            return device;
        }
    }
    return NULL;
}
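The width helper getResolutionWidthByHeight: referenced in the format-matching code is also not shown in this excerpt. A minimal sketch, under the same assumed aspect ratios as before:

+ (int)getResolutionWidthByHeight:(CGFloat)height {
    // Assumed 16:9 for HD sizes, 4:3 for 480p; adjust as needed.
    switch ((int)height) {
        case 2160: return 3840;
        case 1080: return 1920;
        case 720:  return 1280;
        case 480:  return 640;
        default:   return (int)(height * 16 / 9);
    }
}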

2. Switch the front and rear cameras

Switching between the front and rear cameras looks simple, but it causes plenty of problems in practice, because the two cameras on the same device support different resolution and frame-rate combinations; switching without accounting for this will fail. A concrete case:

For example, the iPhone X's rear camera supports up to (4K, 60 fps), while its front camera supports up to (2K, 30 fps). If you switch from the rear camera running at (4K, 60 fps) to the front camera without any handling, the switch fails and the session ends up in a broken state.

Note:

In the code below, the line session.sessionPreset = AVCaptureSessionPresetLow; matters. When switching from rear to front we need to query the maximum resolution and frame rate the new input device supports, but we cannot query them until the input has been added to the session. So we first drop to a preset every camera can accept, add the new input, query the actual maximums, and only then apply the desired resolution and frame rate.

- (void)setCameraPosition:(AVCaptureDevicePosition)position session:(AVCaptureSession *)session input:(AVCaptureDeviceInput *)input videoFormat:(OSType)videoFormat resolutionHeight:(CGFloat)resolutionHeight frameRate:(int)frameRate {
    if (input) {
        [session beginConfiguration];
        [session removeInput:input];
        
        AVCaptureDevice *device = [self.class getCaptureDevicePosition:position];
        
        NSError *error = nil;
        AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                               error:&error];
        
        if (error) {
            NSLog(@"%s: error:%@", __func__, error.localizedDescription);
            [session commitConfiguration];
            return;
        }
        
        // The rear camera may support 4K while the front camera tops out at 2K, so we have to
        // downgrade when switching. We can't query the new camera's maximums until its input is
        // attached, so first drop to a preset every camera accepts, then add the input.
        session.sessionPreset = AVCaptureSessionPresetLow;
        if ([session canAddInput:newInput]) {
            self.input = newInput;
            [session addInput:newInput];
        } else {
            NSLog(@"%s: add input failed.", __func__);
            [session commitConfiguration];
            return;
        }
        
        int maxResolutionHeight = [self getMaxSupportResolutionByPreset];
        if (resolutionHeight > maxResolutionHeight) {
            resolutionHeight = maxResolutionHeight;
            self.cameraModel.resolutionHeight = resolutionHeight;
            NSLog(@"%s: Current support max resolution height = %d", __func__, maxResolutionHeight);
        }
        
        int maxFrameRate = [self getMaxFrameRateByCurrentResolution];
        if (frameRate > maxFrameRate) {
            frameRate = maxFrameRate;
            self.cameraModel.frameRate = frameRate;
            NSLog(@"%s: Current support max frame rate = %d", __func__, maxFrameRate);
        }

        BOOL isSuccess = [self.class setCameraFrameRateAndResolutionWithFrameRate:frameRate
                                                              andResolutionHeight:resolutionHeight
                                                                        bySession:session
                                                                         position:position
                                                                      videoFormat:videoFormat];
        
        if (!isSuccess) {
            NSLog(@"%s: Set resolution and frame rate failed.", __func__);
        }
        
        [session commitConfiguration];
    }
}
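The two query helpers above, getMaxSupportResolutionByPreset and getMaxFrameRateByCurrentResolution, are not shown in this excerpt. A sketch of what they might look like, assuming the class keeps the session in self.session, the active input in self.input, and the target height in self.cameraModel.resolutionHeight:

// Highest preset the session (with the current input attached) accepts.
- (int)getMaxSupportResolutionByPreset {
    if ([self.session canSetSessionPreset:AVCaptureSessionPreset3840x2160]) {
        return 2160;
    } else if ([self.session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
        return 1080;
    } else if ([self.session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
        return 720;
    } else if ([self.session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
        return 480;
    }
    return 0;
}

// Highest frame rate any device format offers at the current resolution.
- (int)getMaxFrameRateByCurrentResolution {
    CGFloat height = self.cameraModel.resolutionHeight;
    int maxFrameRate = 0;
    for (AVCaptureDeviceFormat *vFormat in [self.input.device formats]) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(vFormat.formatDescription);
        if (dims.height == height && dims.width == [self.class getResolutionWidthByHeight:height]) {
            float maxRate = ((AVFrameRateRange *)vFormat.videoSupportedFrameRateRanges.firstObject).maxFrameRate;
            if (maxRate > maxFrameRate) {
                maxFrameRate = (int)maxRate;
            }
        }
    }
    return maxFrameRate;
}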

3. Switch screen video direction

First we need to distinguish two concepts: device orientation (UIDeviceOrientation), which describes how the hardware is held, and video orientation (AVCaptureVideoOrientation), which describes how the captured video is oriented. When using AVCaptureSession, if we want to support rotation, we must rotate the video to match whenever the screen rotates.

Screen rotation can be observed via UIDeviceOrientationDidChangeNotification; the details are not covered here.
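A minimal registration sketch (the handler name and the self.previewLayer / self.videoOutput properties are assumptions):

// Typically in viewDidLoad:
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(deviceOrientationDidChange:)
                                             name:UIDeviceOrientationDidChangeNotification
                                           object:nil];

- (void)deviceOrientationDidChange:(NSNotification *)notification {
    [self adjustVideoOrientationByScreenOrientation:[UIDevice currentDevice].orientation
                                       previewFrame:self.view.bounds
                                       previewLayer:self.previewLayer
                                        videoOutput:self.videoOutput];
}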

- (void)adjustVideoOrientationByScreenOrientation:(UIDeviceOrientation)orientation previewFrame:(CGRect)previewFrame previewLayer:(AVCaptureVideoPreviewLayer *)previewLayer videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    [previewLayer setFrame:previewFrame];
    
    // Note: in landscape, device and video orientations are mirrored --
    // UIDeviceOrientationLandscapeLeft corresponds to AVCaptureVideoOrientationLandscapeRight
    // and vice versa.
    switch (orientation) {
        case UIDeviceOrientationPortrait:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationPortrait];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortrait
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationPortraitUpsideDown];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortraitUpsideDown
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeLeft:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeRight
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeRight:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeLeft
                                    videoOutput:videoOutput];
            break;
            
        default:
            break;
    }
}

-(void)adjustAVOutputDataOrientation:(AVCaptureVideoOrientation)orientation videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    for(AVCaptureConnection *connection in videoOutput.connections) {
        for(AVCaptureInputPort *port in [connection inputPorts]) {
            if([[port mediaType] isEqual:AVMediaTypeVideo]) {
                if([connection isVideoOrientationSupported]) {
                    [connection setVideoOrientation:orientation];
                }
            }
        }
    }
}


4. Focus adjustment

For focus we need to set the point of interest manually, and the focus APIs only accept coordinates in a normalized system where the top-left corner is (0,0) and the bottom-right is (1,1). So we must convert from the UIView coordinate system, and the conversion has to handle several cases:

  • Whether the video output is mirrored: the front camera usually has mirroring enabled, which flips the x coordinate.
  • Screen orientation: with the Home button on the right the origin is the top-left corner; with it on the left the origin is the bottom-right corner.
  • Video gravity: whether the aspect ratio is preserved or the frame fills the screen. Depending on the device, this may leave black bars or crop beyond the screen, so the focus point must be recalculated accordingly.

If we render directly with AVCaptureSession's AVCaptureVideoPreviewLayer, we can use captureDevicePointOfInterestForPoint: to do the conversion automatically; it accounts for everything above. But if we render the frames ourselves, we have to compute the focus point ourselves, considering all of the cases listed. Both the automatic and the manual conversion are shown below.

- (void)autoFocusAtPoint:(CGPoint)point {
    AVCaptureDevice *device = self.input.device;
    if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            [device setExposurePointOfInterest:point];
            [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            [device setFocusPointOfInterest:point];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        }
    }
}
4.1. Calculate the focal point automatically
- (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    CGSize frameSize = [captureVideoPreviewLayer frame].size;
    
    if ([captureVideoPreviewLayer.connection isVideoMirrored]) {
        viewCoordinates.x = frameSize.width - viewCoordinates.x;
    }

    // Convert UIKit coordinates to a focus point in (0,0)~(1,1)
    pointOfInterest = [captureVideoPreviewLayer captureDevicePointOfInterestForPoint:viewCoordinates];
    
    // NSLog(@"Focus - Auto test: %@",NSStringFromCGPoint(pointOfInterest));
    
    return pointOfInterest;
}

4.2. Calculate the focal point manually
  • If the screen's aspect ratio matches the video's, simply map the tap point into the (0,0)~(1,1) coordinate system.
  • If the ratios differ, account for how the video is rendered: when the aspect ratio is preserved there will be black bars, and the bar length must be subtracted from the tap point; when the video fills the screen by cropping, some pixels are sacrificed, and the cropped length must be added back when computing the focus point.
- (CGPoint)manualConvertFocusPoint:(CGPoint)point frameSize:(CGSize)frameSize captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer position:(AVCaptureDevicePosition)position videoDataOutput:(AVCaptureVideoDataOutput *)videoDataOutput input:(AVCaptureDeviceInput *)input {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    
    if ([[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] isVideoMirrored]) {
        point.x = frameSize.width - point.x;
    }
    
    for (AVCaptureInputPort *port in [input ports]) {
        if ([port mediaType] == AVMediaTypeVideo) {
            CGRect cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
            CGSize resolutionSize = cleanAperture.size;
            
            CGFloat resolutionRatio = resolutionSize.width / resolutionSize.height;
            CGFloat screenSizeRatio = frameSize.width / frameSize.height;
            CGFloat xc = .5f;
            CGFloat yc = .5f;
        
            if (resolutionRatio == screenSizeRatio) {
                xc = point.x / frameSize.width;
                yc = point.y / frameSize.height;
            } else if (resolutionRatio > screenSizeRatio) {
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    CGFloat needScreenWidth = resolutionRatio * frameSize.height;
                    CGFloat cropWidth = (needScreenWidth - frameSize.width) / 2;
                    xc = (cropWidth + point.x) / needScreenWidth;
                    yc = point.y / frameSize.height;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    CGFloat needScreenHeight = frameSize.width * (1 / resolutionRatio);
                    CGFloat blackBarLength   = (frameSize.height - needScreenHeight) / 2;
                    xc = point.x / frameSize.width;
                    yc = (point.y - blackBarLength) / needScreenHeight;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            } else {
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    CGFloat needScreenHeight = (1 / resolutionRatio) * frameSize.width;
                    CGFloat cropHeight = (needScreenHeight - frameSize.height) / 2;
                    xc = point.x / frameSize.width;
                    yc = (cropHeight + point.y) / needScreenHeight;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    CGFloat needScreenWidth = frameSize.height * resolutionRatio;
                    CGFloat blackBarLength   = (frameSize.width - needScreenWidth) / 2;
                    xc = (point.x - blackBarLength) / needScreenWidth;
                    yc = point.y / frameSize.height;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            }
            pointOfInterest = CGPointMake(xc, yc);
        }
    }
    
    if (position == AVCaptureDevicePositionBack) {
        if (captureVideoPreviewLayer.connection.videoOrientation == AVCaptureVideoOrientationLandscapeLeft) {
            pointOfInterest = CGPointMake(1 - pointOfInterest.x, 1 - pointOfInterest.y);
        }
    } else {
        pointOfInterest = CGPointMake(pointOfInterest.x, 1 - pointOfInterest.y);
    }
    
    //NSLog(@"Focus - manual test: %@", NSStringFromCGPoint(pointOfInterest));
    return pointOfInterest;
}
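Tying it together, a tap-to-focus sketch (the gesture wiring and the self.previewLayer property are assumptions):

// Attach a UITapGestureRecognizer to the preview view, then:
- (void)handleTap:(UITapGestureRecognizer *)gesture {
    CGPoint viewPoint = [gesture locationInView:gesture.view];
    CGPoint focusPoint = [self convertToPointOfInterestFromViewCoordinates:viewPoint
                                                  captureVideoPreviewLayer:self.previewLayer];
    [self autoFocusAtPoint:focusPoint];
}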

5. Exposure adjustment

If we use a UISlider as the adjustment control, the simplest approach is to give it the same range as the device's exposure target bias (typically -8 to 8), so the slider value can be passed straight through without conversion. With a gesture or another control, scale the value as needed; it is straightforward, so it is not described further here.

- (void)setExposureWithNewValue:(CGFloat)newExposureValue device:(AVCaptureDevice *)device {
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        // Values outside [minExposureTargetBias, maxExposureTargetBias] are rejected, so clamp first.
        newExposureValue = MAX(device.minExposureTargetBias, MIN(newExposureValue, device.maxExposureTargetBias));
        [device setExposureTargetBias:newExposureValue completionHandler:nil];
        [device unlockForConfiguration];
    }
}
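A wiring sketch for the UISlider case (the outlet and handler names are assumptions):

// Configure the slider once the device is available; query the bias range rather than hard-coding it.
self.exposureSlider.minimumValue = device.minExposureTargetBias;  // typically -8
self.exposureSlider.maximumValue = device.maxExposureTargetBias;  // typically +8
self.exposureSlider.value = 0;

// Target-action handler: the value can be passed straight through.
- (void)exposureSliderChanged:(UISlider *)slider {
    [self setExposureWithNewValue:slider.value device:self.input.device];
}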

6. Flashlight mode

  • AVCaptureTorchModeAuto: automatic
  • AVCaptureTorchModeOn: on
  • AVCaptureTorchModeOff: off
- (void)setTorchState:(BOOL)isOpen device:(AVCaptureDevice *)device {
    if ([device hasTorch]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = isOpen ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
            [device unlockForConfiguration];
        }
    } else {
        NSLog(@"The device does not support torch!");
    }
}

7. Video stability adjustment

Note: on some models and resolutions this property can cause rendering problems if you render the frames yourself (observed on iPhone XS with custom rendering).

-(void)adjustVideoStabilizationWithOutput:(AVCaptureVideoDataOutput *)output {
    NSArray *devices = nil;
    
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:self.cameraModel.position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }
    
    for (AVCaptureDevice *device in devices) {
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeAuto]) {
                for (AVCaptureConnection *connection in output.connections) {
                    for (AVCaptureInputPort *port in [connection inputPorts]) {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            if (connection.supportsVideoStabilization) {
                                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
                                NSLog(@"activeVideoStabilizationMode = %ld", (long)connection.activeVideoStabilizationMode);
                            } else {
                                NSLog(@"connection doesn't support video stabilization");
                            }
                        }
                    }
                }
            } else {
                NSLog(@"device doesn't support video stabilization");
            }
        }
    }
}

8. White balance adjustment

  • Temperature: adjust the white balance by color temperature (range used here: about -150 to 250)
  • Tint: adjust the white balance by tint (range used here: about -150 to 150)

Note: before calling setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:, each component of the AVCaptureWhiteBalanceGains value must be within the valid range (from 1.0 up to the device's maxWhiteBalanceGain), otherwise the call crashes.

-(AVCaptureWhiteBalanceGains)clampGains:(AVCaptureWhiteBalanceGains)gains toMinVal:(CGFloat)minVal andMaxVal:(CGFloat)maxVal {
    AVCaptureWhiteBalanceGains tmpGains = gains;
    tmpGains.blueGain   = MAX(MIN(tmpGains.blueGain , maxVal), minVal);
    tmpGains.redGain    = MAX(MIN(tmpGains.redGain  , maxVal), minVal);
    tmpGains.greenGain  = MAX(MIN(tmpGains.greenGain, maxVal), minVal);
    
    return tmpGains;
}

-(void)setWhiteBlanceValueByTemperature:(CGFloat)temperature device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        [device lockForConfiguration:nil];
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        CGFloat currentTint = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].tint;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = temperature,
            .tint        = currentTint,
        };
        
        AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
        deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
        
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
        [device unlockForConfiguration];
    }
}

-(void)setWhiteBlanceValueByTint:(CGFloat)tint device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        [device lockForConfiguration:nil];
        CGFloat maxWhiteBalaceGain = device.maxWhiteBalanceGain;
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        currentGains = [self clampGains:currentGains toMinVal:1 andMaxVal:maxWhiteBalaceGain];
        CGFloat currentTemperature = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].temperature;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = currentTemperature,
            .tint        = tint,
        };
        
        AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalaceGain];
        
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
        [device unlockForConfiguration];
    }
}
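A usage sketch wiring two UISliders to these methods (the handler and property names are assumptions):

- (void)temperatureSliderChanged:(UISlider *)slider {
    [self setWhiteBlanceValueByTemperature:slider.value device:self.input.device];
}

- (void)tintSliderChanged:(UISlider *)slider {
    [self setWhiteBlanceValueByTint:slider.value device:self.input.device];
}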

9. Screen filling method

  • AVLayerVideoGravityResizeAspect: preserves the aspect ratio; if the video and screen ratios differ, black bars are shown.
  • AVLayerVideoGravityResizeAspectFill: preserves the aspect ratio while filling the screen; some pixels on the longer dimension are sacrificed because the frame extends beyond the screen.
  • AVLayerVideoGravityResize: fills the screen by stretching; no pixels are lost, but the image may be distorted.
- (void)setVideoGravity:(AVLayerVideoGravity)videoGravity previewLayer:(AVCaptureVideoPreviewLayer *)previewLayer session:(AVCaptureSession *)session {
    [session beginConfiguration];
    [previewLayer setVideoGravity:videoGravity];
    [session commitConfiguration];
}