This section mainly involves:

  1. AVCapturePhotoOutput basics
  2. Bracketed capture
  3. Live Photo
  4. Thumbnail
  5. Scene monitoring

AVCapturePhotoOutput basics

Starting with iOS 10, Apple replaced AVCaptureStillImageOutput with AVCapturePhotoOutput. Compared with AVCaptureStillImageOutput, I think it has several advantages:

  • AVCapturePhotoOutput directly takes over the packaging, management, and enablement of all photo-related features such as Live Photo, depth of field, and Portrait;
  • Using a delegate to call back images makes the life cycle clearer and more controllable, while avoiding the risk of retain cycles;
  • The addition of AVCapturePhotoSettings makes each capture's settings independent of the others;
  • The callback interface returns resolved photo settings that describe how the image was actually captured, such as whether the flash fired.

The image above shows Apple’s new camera architecture since iOS 10.

AVCapturePhotoOutput structure

In addition to capturing still images, AVCapturePhotoOutput can also capture RAW images, Live Photos, bracketed multi-image captures, and wide color gamut captures. Its class structure is roughly organized as follows:


From the above two figures, we can see that the organization of AVCapturePhotoOutput is very clear. It can be roughly divided into:

  • Readable properties: these provide feedback on which capture features the device supports, such as Live Photo;
  • Feature-enabling properties: after checking that a feature is supported, these decide whether to enable it; they are off by default, such as Live Photo;
  • Methods: these are the common initialization and capture methods, the most important being capturePhoto(with:delegate:) (a minimal setup sketch follows this list).
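To make the flow concrete, here is a minimal sketch of attaching an AVCapturePhotoOutput to a session and triggering one capture. The session setup and the photoCaptureProcessor delegate object are assumptions for the example, not part of the original text:

import AVFoundation

// Minimal sketch: attach a photo output to an (assumed) session and take one photo.
let session = AVCaptureSession()
let photoOutput = AVCapturePhotoOutput()

session.beginConfiguration()
session.sessionPreset = .photo
// ... camera device input setup omitted ...
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
}
session.commitConfiguration()

// photoCaptureProcessor is an assumed object conforming to AVCapturePhotoCaptureDelegate.
let photoSettings = AVCapturePhotoSettings()
photoSettings.flashMode = .auto   // must be one of photoOutput.supportedFlashModes on a real device
photoOutput.capturePhoto(with: photoSettings, delegate: photoCaptureProcessor)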

AVCapturePhotoCaptureDelegate capture cycle

As mentioned, AVCapturePhotoOutput delivers images through a delegate of type AVCapturePhotoCaptureDelegate, which helps developers manage the life cycle of the generated image. Let’s look at the life cycle of the image callbacks.

The diagram above may seem confusing at first glance, so let me walk through it:

  1. The system first calls photoOutput(_:willBeginCaptureFor:), which tells the caller that the photo is about to be taken. It also delivers an AVCaptureResolvedPhotoSettings instance, resolvedSettings, which accompanies the whole life cycle and tells the caller whether a given feature is actually in use. For example, if we set flashMode = .auto when taking the photo, it tells us whether the flash will actually fire. It carries other information as well, which I will not describe here;
  2. photoOutput(_:willCapturePhotoFor:) is then called; at this point the shutter fires and the shutter sound plays;
  3. After a short wait, photoOutput(_:didFinishProcessingPhoto:error:) is called back, meaning image processing is finished and the image data is available;
  4. Finally photoOutput(_:didFinishCaptureFor:error:) is called, meaning the whole capture process is over and the captured data can be saved to the photo album.

Here are a few details to note:

  • AVCapturePhotoCaptureDelegate provides the complete life-cycle state to developers, who can feed it back to the UI for a better user experience;
  • All of the callback methods are optional;
  • Which callbacks actually fire depends on the AVCapturePhotoSettings passed to capturePhoto(with:delegate:), for example whether Live Photo is used;
  • Every callback method carries an AVCaptureResolvedPhotoSettings instance that records the attributes of the current photo along with a unique ID, which can be used to tell captures apart when several are in flight (a minimal delegate sketch follows this list).
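As a hedged illustration of the cycle, here is a minimal delegate sketch implementing the four callbacks described above; the class name and the debug prints are assumptions for the example:

import AVFoundation

class PhotoCaptureProcessor: NSObject, AVCapturePhotoCaptureDelegate {

    func photoOutput(_ output: AVCapturePhotoOutput,
                     willBeginCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
        // Capture is about to start; resolvedSettings tells us what will actually be used.
        debugPrint("will begin capture, flash enabled: \(resolvedSettings.isFlashEnabled)")
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     willCapturePhotoFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
        // The shutter is firing; a good moment for UI feedback such as a screen flash animation.
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Image processing is done; the image data is available here.
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        debugPrint("got \(data.count) bytes, uniqueID \(photo.resolvedSettings.uniqueID)")
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings,
                     error: Error?) {
        // The whole capture is over; this is the place to save the collected data to the album.
    }
}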

Bracketed capture

Bracketed capture, also known as hierarchical or graded capture, is an API that has been available since iOS 8, so it predates AVCapturePhotoOutput. In a bracketed capture you supply a set of per-shot parameters (currently exposure bias and ISO are supported) and the system takes multiple images in quick succession; photoOutput(_:didFinishProcessingPhoto:error:) of AVCapturePhotoCaptureDelegate is then called once per image, and the caller can fuse the images with an algorithm such as HDR to get a satisfactory picture.

In addition, it seems that the system HDR mode cannot be enabled while taking bracketed photos. This is a pitfall to watch out for.

The base class for bracketed capture settings is AVCaptureBracketedStillImageSettings. It cannot be instantiated directly, but it provides two subclasses:

  • AVCaptureAutoExposureBracketedStillImageSettings: used to set different exposure target biases;
  • AVCaptureManualExposureBracketedStillImageSettings: used to set different ISO values.

Next, let’s look at how to use it, taking a bracket of different exposure biases as an example:

let bracketedStillImageSettings = [
    AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: -1),
    AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: 0),
    AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: 1)
]
photoSettings = AVCapturePhotoBracketSettings(rawPixelFormatType: 0,
                                              processedFormat: nil,
                                              bracketedSettings: bracketedStillImageSettings)

self.photoOutput.capturePhoto(with: photoSettings, delegate: photoCaptureProcessor)

After the capture call, photoOutput(_:didFinishProcessingPhoto:error:) of AVCapturePhotoCaptureDelegate is called back three times. The figure shows the three results, with exposure biases of -1, 0, and 1 from left to right:



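As a rough sketch (under the assumption that results are collected in a bracketedImages array inside the delegate, which is not part of the original text), the three callbacks could be gathered like this:

// Inside the AVCapturePhotoCaptureDelegate implementation:
private var bracketedImages: [AVCapturePhoto] = []

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard error == nil else { return }
    bracketedImages.append(photo)

    // sequenceCount is 1-based; expectedPhotoCount is how many shots this bracket produces.
    if photo.sequenceCount == photo.resolvedSettings.expectedPhotoCount {
        // All exposures (-1, 0 and +1 here) have arrived; fuse them (e.g. an HDR merge).
        debugPrint("bracket complete with \(bracketedImages.count) images")
    }
}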
Live Photo

Live Photo is not a new feature of iOS 10; in essence, a Live Photo is a JPEG plus a MOV.

  • The video is 3s long in total, covering the 1.5s before and the 1.5s after the still image;
  • The resolutions are 1440×1080 and 1290×960;
  • In addition to the video frames, there is also an audio track.

The figure above helps us understand Live Photo more easily.

Live Photo capture

So how do we capture a Live Photo? There are certain prerequisites, which are as follows:

  • Use AVCapturePhotoOutput’s isLivePhotoCaptureSupported to check whether the device supports Live Photo;
  • Live Photo is only available with the AVCaptureSessionPresetPhoto session preset;
  • You need to set isLivePhotoCaptureEnabled to true on AVCapturePhotoOutput;
  • Live Photo is automatically disabled if an AVCaptureMovieFileOutput is present on the session.

In code this is organized roughly as follows:

self.photoOutput.isLivePhotoCaptureEnabled = self.photoOutput.isLivePhotoCaptureSupported
if self.photoOutput.isLivePhotoCaptureSupported {
    let livePhotoMovieFileName = NSUUID().uuidString
    let livePhotoMovieFilePath = (NSTemporaryDirectory() as NSString)
        .appendingPathComponent((livePhotoMovieFileName as NSString).appendingPathExtension("mov")!)
    photoSettings.livePhotoMovieFileURL = URL(fileURLWithPath: livePhotoMovieFilePath)
}

Live Photo captures callbacks

We said earlier that which AVCapturePhotoCaptureDelegate callbacks fire depends on the AVCapturePhotoSettings passed in. So when Live Photo is involved, how do the callbacks change?

As you can see from the figure above, two additional Live Photo callbacks appear between the two callbacks that complete the photo process. They are:

  • photoOutput(_:didFinishRecordingLivePhotoMovieForEventualFileAt:resolvedSettings:): the movie segment has finished recording, but it has not been written to the sandbox yet;
  • photoOutput(_:didFinishProcessingLivePhotoToMovieFileAt:duration:photoDisplayTime:resolvedSettings:error:): the movie file has been written to the sandbox and can be read (a minimal sketch of these two callbacks follows this list).
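Here is a minimal sketch of implementing the two callbacks; stashing the movie URL in a livePhotoMovieURL property is an assumption for illustration:

// Inside the AVCapturePhotoCaptureDelegate implementation:
private var livePhotoMovieURL: URL?

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishRecordingLivePhotoMovieForEventualFileAt outputFileURL: URL,
                 resolvedSettings: AVCaptureResolvedPhotoSettings) {
    // Recording of the movie segment is done, but the file is not fully written yet.
    debugPrint("Live Photo movie recording finished")
}

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingLivePhotoToMovieFileAt outputFileURL: URL,
                 duration: CMTime,
                 photoDisplayTime: CMTime,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 error: Error?) {
    // The movie file is now in the sandbox and readable; remember it so it can be saved later.
    guard error == nil else { return }
    livePhotoMovieURL = outputFileURL
}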

Thanks to Apple’s good API design, a Live Photo is very simple to use and save, but there are two things to be aware of when storing one:

  • Usually we use PhotoKit to save photos into the system album by building a PHAssetCreationRequest. For a Live Photo, the still photo and its paired video must be added to the same PHAssetCreationRequest so they are associated; otherwise the save fails (see the sketch after this list);
  • The system writes the Live Photo movie into the sandbox, but it is not responsible for deleting it.
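A minimal sketch of saving the pair with PhotoKit; photoData (for example from fileDataRepresentation()) and movieURL (the movie file collected in the callbacks above) are assumed values for this example:

import Photos

PHPhotoLibrary.shared().performChanges({
    // The still image and the paired movie must be added to the SAME creation request.
    let creationRequest = PHAssetCreationRequest.forAsset()
    creationRequest.addResource(with: .photo, data: photoData, options: nil)

    let movieOptions = PHAssetResourceCreationOptions()
    movieOptions.shouldMoveFile = true   // let PhotoKit move the temporary movie out of our sandbox
    creationRequest.addResource(with: .pairedVideo, fileURL: movieURL, options: movieOptions)
}, completionHandler: { success, error in
    if !success {
        debugPrint("Saving Live Photo failed: \(String(describing: error))")
    }
})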

Thumbnail

AVCapturePhotoSettings, added in iOS 10, lets the system generate a thumbnail while taking the photo, so developers no longer need to scale the full-size image down themselves, and it has little performance impact on the original pipeline. Let’s look at the usage code:

if !photoSettings.__availablePreviewPhotoPixelFormatTypes.isEmpty {
    photoSettings.previewPhotoFormat = [
        kCVPixelBufferPixelFormatTypeKey as String: photoSettings.__availablePreviewPhotoPixelFormatTypes.first!,
        kCVPixelBufferWidthKey as String: 160,
        kCVPixelBufferHeightKey as String: 160
    ]
}

It is also very simple to use, and you can choose the size. When the capture completes, the preview image is accessible through the previewPixelBuffer property of the AVCapturePhoto delivered in the callback (which callback this is depends on the settings, but essentially the preview is returned together with the original image).
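For example, a rough sketch of reading the preview inside didFinishProcessingPhoto (what you do with the buffer afterwards is up to you; the debug print is just illustrative):

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    // The reduced-size preview comes back alongside the full-size image.
    if let previewBuffer = photo.previewPixelBuffer {
        debugPrint("preview: \(CVPixelBufferGetWidth(previewBuffer)) x \(CVPixelBufferGetHeight(previewBuffer))")
        // Convert the pixel buffer to a displayable image (e.g. via CIImage) for the UI.
    }
}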

Scene monitoring

“Scene monitoring” may be an unfamiliar term, but as the name implies it analyzes the video the camera is capturing. Apple currently provides two kinds of monitoring: one for the flash and one for still image stabilization, and the latter was deprecated in iOS 13 (another pitfall; I still need to find the exact reason). So let’s use the flash as the example.

When using the system camera with the flash set to auto (off disables scene monitoring, and with on the flash always fires, so monitoring is meaningless), if you cover the lens with a finger, the UI immediately shows a hint that the flash will be turned on. This is the flash scene monitoring Apple provides: when the system senses that the image would be too dark, it signals the developer to turn on the flash, and when it is bright enough it signals to turn the flash off.

You might say: I’ll just set flashMode to .auto. Yes, you can do that, but the difference from scene monitoring is that the user then doesn’t know in advance whether taking a picture will trigger the flash, or the user may simply not want the flash in a dark environment, and so on. That is higher-level business logic, though; let’s see how to use this feature in detail:

@objc dynamic private let photoOutput = AVCapturePhotoOutput()

if self.session.canAddOutput(self.photoOutput) {
    let photoSettings = AVCapturePhotoSettings()
    photoSettings.flashMode = .on
    self.photoOutput.photoSettingsForSceneMonitoring = photoSettings
    // Keep a reference to this observation, otherwise it is deallocated and stops firing.
    let flashKeyValueObservation = self.photoOutput.observe(\.isFlashScene, options: .new) { (_, change) in
        guard let isFlashScene = change.newValue else { return }
        debugPrint("Recommended flash state: \(isFlashScene ? "on" : "off")")
    }
}

One note: as in the example above, flashMode needs to be .auto or .on for monitoring information to be returned; with .off, no meaningful flash recommendation is made.

Conclusion

There are a few more advanced AVCapturePhotoOutput topics that are somewhat larger, such as depth capture, Portrait mode, RAW, and wide color gamut capture. Depth and Portrait will be added later; RAW has seen big changes in iOS 14 and is worth tracking, and wide gamut is a hardware-dependent feature that is supported on some iPads, but needs extra attention on iPhone. If anything in this article is described inaccurately or improperly, corrections are welcome so we can maintain it together.