Basic iOS camera development

Contents

1, Set up the session

1. Initialize session hub

//Create a capture session. AVCaptureSession is the central hub for capturing scenes
self.captureSession = [[AVCaptureSession alloc] init];

/*
 AVCaptureSessionPresetHigh
 AVCaptureSessionPresetMedium
 AVCaptureSessionPresetLow
 AVCaptureSessionPreset640x480
 AVCaptureSessionPreset1280x720
 AVCaptureSessionPresetPhoto
 */
//Sets the resolution of the image
self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;

2. Create video input

//Set input
//Get the default video capture device; on iOS this returns the back camera
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

//Wrap the capture device in an AVCaptureDeviceInput
//Note: to add a capture device to a session, it must be wrapped in an AVCaptureDeviceInput object
NSError *error;
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    

3. Add to session

//Determine whether the videoInput is valid
if (videoInput)
{
    //canAddInput: test whether it can be added to the session
    if ([self.captureSession canAddInput:videoInput])
    {
        //Add videoInput to captureSession
        [self.captureSession addInput:videoInput];
        self.activeVideoInput = videoInput;
    }
}else
{
    return NO;
}

4. Create audio input

//Get the default audio capture device; this returns the built-in microphone
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

//Create a capture input for this device
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

5. Add audio input

//Check whether audioInput is valid
if (audioInput) {
    
    //canAddInput: test whether it can be added to the session
    if ([self.captureSession canAddInput:audioInput])
    {
        //Add audioInput to captureSession
        [self.captureSession addInput:audioInput];
    }
}else
{
    return NO;
}

6. Still image output settings

//An AVCaptureStillImageOutput instance captures still images from the camera
self.imageOutput = [[AVCaptureStillImageOutput alloc] init];

//Configure the output settings dictionary: capture images in JPEG format
self.imageOutput.outputSettings = @{AVVideoCodecKey:AVVideoCodecJPEG};

//Check whether the output can be added to the session; add it if so
if ([self.captureSession canAddOutput:self.imageOutput])
{
    [self.captureSession addOutput:self.imageOutput];
}

7. Set up the movie file output

//Create an AVCaptureMovieFileOutput instance to record QuickTime movies to the file system
self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];

//Check whether the output can be added to the session; add it if so
if ([self.captureSession canAddOutput:self.movieOutput])
{
    [self.captureSession addOutput:self.movieOutput];
}

// Initialize the serial dispatch queue used for session work
self.videoQueue = dispatch_queue_create("videoQueue", NULL);
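
All of these snippets live inside a single camera-controller class whose declarations the original does not show. A minimal sketch of the interface they imply (the class, property, and delegate names here are assumptions inferred from the code):

#import <AVFoundation/AVFoundation.h>

//Hypothetical error-handling delegate implied by calls like deviceConfigurationFailedWithError:
@protocol YCameraControllerDelegate <NSObject>
- (void)deviceConfigurationFailedWithError:(NSError *)error;
- (void)mediaCaptureFailedWithError:(NSError *)error;
- (void)assetLibraryWriteFailedWithError:(NSError *)error;
@end

@interface YCameraController : NSObject
@property (weak, nonatomic) id<YCameraControllerDelegate> delegate;
@property (strong, nonatomic) AVCaptureSession *captureSession;
@property (strong, nonatomic) AVCaptureDeviceInput *activeVideoInput;
@property (strong, nonatomic) AVCaptureStillImageOutput *imageOutput;
@property (strong, nonatomic) AVCaptureMovieFileOutput *movieOutput;
@property (strong, nonatomic) dispatch_queue_t videoQueue;
@property (strong, nonatomic) NSURL *outputURL;
@end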

2, Start the session

1. Start the session

//Check whether the session is already running
if (![self.captureSession isRunning])
{
    //startRunning is a blocking call that can take time, so handle it asynchronously
    dispatch_async(self.videoQueue, ^{
        [self.captureSession startRunning];
    });
}
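
For symmetry, a matching stop method might look like this (a minimal sketch, reusing the same videoQueue):

//Stop the session
- (void)stopSession {
    if ([self.captureSession isRunning])
    {
        //stopRunning also blocks, so dispatch it off the main thread as well
        dispatch_async(self.videoQueue, ^{
            [self.captureSession stopRunning];
        });
    }
}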

2. Camera processing

2.1 Get the camera at a specified position

AVCaptureDevicePosition has three values:

  • AVCaptureDevicePositionUnspecified = 0  // unspecified
  • AVCaptureDevicePositionBack = 1         // back camera
  • AVCaptureDevicePositionFront = 2        // front camera

//Find the camera device at the specified position
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    
    //Get the available video devices
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    
    //Traverse them and return the device whose position matches the parameter
    for (AVCaptureDevice *device in devices)
    {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}

2.2 Get the currently active camera

//The currently active capture device
- (AVCaptureDevice *)activeCamera {

    //Returns the device backing the capture session's current video input
    // activeVideoInput is the input we track when switching between front and back cameras
    return self.activeVideoInput.device;
}

2.3 Get the inactive camera and the number of available cameras

//Returns the currently inactive camera
- (AVCaptureDevice *)inactiveCamera {

    //Found by taking the opposite of the currently active camera. If the device has only one camera, returns nil
    AVCaptureDevice *device = nil;
    if (self.cameraCount > 1)
    {
        if ([self activeCamera].position == AVCaptureDevicePositionBack) {
            device = [self cameraWithPosition:AVCaptureDevicePositionFront];
        }else
        {
            device = [self cameraWithPosition:AVCaptureDevicePositionBack];
        }
    }
    return device;
}

//Number of video capture devices available
- (NSUInteger)cameraCount {

    return [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count];
}
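
Note that devicesWithMediaType: (used here and in 2.1) is deprecated as of iOS 10. A discovery-session based count is the modern alternative; a sketch, assuming only built-in wide-angle cameras matter:

- (NSUInteger)cameraCount {
    //AVCaptureDeviceDiscoverySession supersedes devicesWithMediaType: on iOS 10+
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:AVCaptureDevicePositionUnspecified];
    return discovery.devices.count;
}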

2.4 Switching cameras

//Switch camera
- (BOOL)switchCameras {

    //Determine whether there are multiple cameras
    if (![self canSwitchCameras]){
        return NO;
    }
    
    //Get the opposite of the currently active device
    NSError *error;
    AVCaptureDevice *videoDevice = [self inactiveCamera];
    
    
    //Encapsulate the input device into AVCaptureDeviceInput
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    
    //Check that videoInput was created successfully
    if (videoInput)
    {
        //Mark the beginning of the configuration change
        [self.captureSession beginConfiguration];
        
        //Remove the original capture input device from the capture session
        [self.captureSession removeInput:self.activeVideoInput];
        
        //Check whether the new device's input can be added
        if ([self.captureSession canAddInput:videoInput])
        {
            //If so, use videoInput as the new video capture input
            [self.captureSession addInput:videoInput];
            
            //Track videoInput as the active input
            self.activeVideoInput = videoInput;
        }else
        {
            //If the new input cannot be added, re-add the original video capture input
            [self.captureSession addInput:self.activeVideoInput];
        }
        
        //commitConfiguration applies all of the changes as a single atomic batch
        [self.captureSession commitConfiguration];
    }else{
        //If there is an error creating AVCaptureDeviceInput, notify the delegate to handle the error
        [self.delegate deviceConfigurationFailedWithError:error];
        return NO;
    }
    
    return YES;
}
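
switchCameras calls canSwitchCameras, which the original does not show. Given the cameraCount helper from 2.3, a minimal implementation would presumably be:

- (BOOL)canSwitchCameras {
    //Switching only makes sense when the device has more than one camera
    return self.cameraCount > 1;
}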

3. Focus processing

3.1 Check whether tap-to-focus is supported

- (BOOL)cameraSupportsTapToFocus {
        
    //Ask whether the active camera supports a focus point of interest
    return [[self activeCamera] isFocusPointOfInterestSupported];
}

3.2 Setting focus

- (void)focusAtPoint:(CGPoint)point {
    
    AVCaptureDevice *device = [self activeCamera];
    
    //Check support for a focus point of interest & auto focus mode
    if (device.isFocusPointOfInterestSupported && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        
        NSError *error;
        //Lock the device for configuration; proceed only if the lock is acquired
        if ([device lockForConfiguration:&error]) {
            
            //Set the focusPointOfInterest property to CGPoint
            device.focusPointOfInterest = point;
            
            //focusMode is set to AVCaptureFocusModeAutoFocus
            device.focusMode = AVCaptureFocusModeAutoFocus;
            
            //Release the lock
            [device unlockForConfiguration];
        }else{
            //On error, hand it to the error-handling delegate
            [self.delegate deviceConfigurationFailedWithError:error];
        }
        
    }
    
}
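
Note that focusAtPoint: expects a point in the device's coordinate space ((0,0) top-left through (1,1) bottom-right), not screen coordinates. One way to wire it to a tap gesture, assuming a hypothetical previewLayer property holding the session's AVCaptureVideoPreviewLayer:

- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    //Convert the tap location from layer coordinates to the
    //device's point-of-interest coordinate space before focusing
    CGPoint layerPoint = [recognizer locationInView:recognizer.view];
    CGPoint devicePoint = [self.previewLayer captureDevicePointOfInterestForPoint:layerPoint];
    [self focusAtPoint:devicePoint];
}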

4. Exposure processing

4.1 Check whether tap-to-expose is supported

- (BOOL)cameraSupportsTapToExpose {
    
    //Ask whether the device supports an exposure point of interest
    return [[self activeCamera] isExposurePointOfInterestSupported];
}

4.2 setting exposure

static const NSString *YCameraAdjustingExposureContext;

- (void)exposeAtPoint:(CGPoint)point {

    // Get current device
    AVCaptureDevice *device = [self activeCamera];
    
    // Set the exposure mode to continuous auto exposure
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    
    //Check whether an exposure point of interest and continuous auto exposure are supported
    if (device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
        
        NSError *error;
        
        //Lock the device for configuration
        if ([device lockForConfiguration:&error])
        {
            //Set the exposure point and mode
            device.exposurePointOfInterest = point;
            device.exposureMode = exposureMode;
            
            //Determine whether the device supports the mode of locked exposure.
            if ([device isExposureModeSupported:AVCaptureExposureModeLocked]) {
                
                //If so, use KVO to observe the device's adjustingExposure property
                [device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:&YCameraAdjustingExposureContext];
            }
            
            //Release the lock
            [device unlockForConfiguration];
            
        }else
        {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

// Observer method
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {

    //Check whether the context is YCameraAdjustingExposureContext
    if (context == &YCameraAdjustingExposureContext) {
        
        //Get device
        AVCaptureDevice *device = (AVCaptureDevice *)object;
        
        //Check that the device is no longer adjusting exposure, and confirm it supports AVCaptureExposureModeLocked
        if(!device.isAdjustingExposure && [device isExposureModeSupported:AVCaptureExposureModeLocked])
        {
            //Remove self as observer so we are not notified of subsequent changes
            [object removeObserver:self forKeyPath:@"adjustingExposure" context:&YCameraAdjustingExposureContext];
            
            //Hop back to the main queue asynchronously to apply the change
            dispatch_async(dispatch_get_main_queue(), ^{
                NSError *error;
                if ([device lockForConfiguration:&error]) {
                    
                    //Modify exposureMode
                    device.exposureMode = AVCaptureExposureModeLocked;
                    
                    //Release the lock
                    [device unlockForConfiguration];
                    
                }else
                {
                    [self.delegate deviceConfigurationFailedWithError:error];
                }
            });
        }
        
    }else
    {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
    
    
}

4.3 Resetting focus and exposure

//Reset focus & exposure
- (void)resetFocusAndExposureModes {

    AVCaptureDevice *device = [self activeCamera];
    
    AVCaptureFocusMode focusMode = AVCaptureFocusModeContinuousAutoFocus;
    
    //Check whether a focus point of interest and continuous auto focus mode are supported
    BOOL canResetFocus = [device isFocusPointOfInterestSupported]&& [device isFocusModeSupported:focusMode];
    
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    
    //Confirm that the exposure can be reset
    BOOL canResetExposure = [device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode];
    
    //The center of device space, where (0,0) is the top-left corner and (1,1) the bottom-right
    CGPoint centPoint = CGPointMake(0.5f, 0.5f);
    
    NSError *error;
    
    //Lock the device and prepare for configuration
    if ([device lockForConfiguration:&error]) {
        
        //If focus can be reset, apply the mode and the center point
        if (canResetFocus) {
            device.focusMode = focusMode;
            device.focusPointOfInterest = centPoint;
        }
        
        //If exposure can be reset, apply the mode and the center point
        if (canResetExposure) {
            device.exposureMode = exposureMode;
            device.exposurePointOfInterest = centPoint;
            
        }
        
        //Release lock
        [device unlockForConfiguration];
        
    }else
    {
        [self.delegate deviceConfigurationFailedWithError:error];
    }
}

5. Flash & torch

5.1 Check whether the device has a flash

//Determine whether there is a flash
- (BOOL)cameraHasFlash {

    return [[self activeCamera] hasFlash];
}

5.2 Flash mode

The flash has three modes:

  • AVCaptureFlashModeOff = 0   // off
  • AVCaptureFlashModeOn = 1    // on
  • AVCaptureFlashModeAuto = 2  // auto

//Current flash mode
- (AVCaptureFlashMode)flashMode {

    return [[self activeCamera] flashMode];
}

5.3 Setting the flash mode

//Set flash
- (void)setFlashMode:(AVCaptureFlashMode)flashMode {

    //Get the active device
    AVCaptureDevice *device = [self activeCamera];
    
    //Determine whether flash mode is supported
    if ([device isFlashModeSupported:flashMode]) {
    
        //Lock the device if supported
        NSError *error;
        if ([device lockForConfiguration:&error]) {

            //Modify flash mode
            device.flashMode = flashMode;
            //After modification, unlock and release the device
            [device unlockForConfiguration];
            
        }else
        {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

5.4 Check whether the device has a torch

//Does the device have a torch (flashlight)
- (BOOL)cameraHasTorch {

    return [[self activeCamera] hasTorch];
}

5.5 Torch mode

The torch likewise has three modes:

  • AVCaptureTorchModeOff = 0
  • AVCaptureTorchModeOn = 1
  • AVCaptureTorchModeAuto = 2

//Current torch mode
- (AVCaptureTorchMode)torchMode {

    return [[self activeCamera] torchMode];
}

5.6 Setting the torch mode

//Set the torch mode
- (void)setTorchMode:(AVCaptureTorchMode)torchMode {

    AVCaptureDevice *device = [self activeCamera];
    
    if ([device isTorchModeSupported:torchMode]) {
        
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            
            device.torchMode = torchMode;
            [device unlockForConfiguration];
        }else
        {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}
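
For UI code, a small toggle built on the two methods above can be convenient (a sketch):

//Toggle the torch between on and off
- (void)toggleTorch {
    AVCaptureTorchMode newMode = ([self torchMode] == AVCaptureTorchModeOn)
        ? AVCaptureTorchModeOff : AVCaptureTorchModeOn;
    [self setTorchMode:newMode];
}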

3, Shooting

1. Taking still pictures

/*
    AVCaptureStillImageOutput is a subclass of AVCaptureOutput, used to capture still images
 */
- (void)captureStillImage {
    
    //Get connection from output
    AVCaptureConnection *connection = [self.imageOutput connectionWithMediaType:AVMediaTypeVideo];
    
    //The app's UI is portrait-only, but if the user shoots in landscape,
    //the orientation of the resulting photo must be adjusted
    //Check whether setting the video orientation is supported
    if (connection.isVideoOrientationSupported) {
        
        //Get direction value
        connection.videoOrientation = [self currentVideoOrientation];
    }
    
    //Define a completion handler block that will receive the picture data
    id handler = ^(CMSampleBufferRef sampleBuffer, NSError *error){
        
        if (sampleBuffer != NULL)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            
            //Key: hand off the captured image here, e.g. [self writeImageToAssetsLibrary:image];
        }
        else
        {
            NSLog(@"NULL sampleBuffer:%@",[error localizedDescription]);
        }
    };
    
    //Capture still pictures
    [self.imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:handler];
}

//Get direction value
- (AVCaptureVideoOrientation)currentVideoOrientation {
    
    AVCaptureVideoOrientation orientation;
    
    //Get the orientation of UIDevice
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortrait:
            orientation = AVCaptureVideoOrientationPortrait;
            break;
        //Note: device orientation and video orientation are mirrored in landscape
        case UIDeviceOrientationLandscapeRight:
            orientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            orientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        default:
            orientation = AVCaptureVideoOrientationLandscapeRight;
            break;
    }
    
    return orientation;
}

2. Writing to the media library

/*
    Assets Library framework
    Lets developers access the iOS Photos library from code
    Note: accessing the photo album requires the photo-library usage-description entry in Info.plist, otherwise the app will crash
 */

- (void)writeImageToAssetsLibrary:(UIImage *)image {

    //Create ALAssetsLibrary instance
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc]init];
    
    //Parameter 1: picture (the parameter is CGImageRef, so image.CGImage)
//Parameter 2: orientation, cast to NSUInteger
    //Parameter 3: write success and failure processing
    [library writeImageToSavedPhotosAlbum:image.CGImage
                             orientation:(NSUInteger)image.imageOrientation
                         completionBlock:^(NSURL *assetURL, NSError *error) {
                             //On success, post a notification carrying the image,
                             //used to draw the thumbnail in the lower-left corner of the UI
                             if (!error)
                             {
                                 [self postThumbnailNotification:image];
                             }else
                             {
                                 //On failure, log the error message
                                 id message = [error localizedDescription];
                                 NSLog(@"%@",message);
                             }
                             }
                         }];
}

//Post the thumbnail notification
- (void)postThumbnailNotification:(UIImage *)image {
    
    //Back on the main thread
    dispatch_async(dispatch_get_main_queue(), ^{
        //Post the notification
        NSNotificationCenter *nc = [NSNotificationCenter defaultCenter];
        [nc postNotificationName:THThumbnailCreatedNotification object:image];
    });
}
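
On the receiving side, a view controller can subscribe to this notification to update its thumbnail view (a sketch; the selector and outlet names are assumptions):

//Register for the thumbnail notification, e.g. in viewDidLoad
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(updateThumbnail:)
                                             name:THThumbnailCreatedNotification
                                           object:nil];

//The image travels as the notification's object
- (void)updateThumbnail:(NSNotification *)notification {
    UIImage *image = notification.object;
    self.thumbnailImageView.image = image; //hypothetical UI outlet
}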

3. Capture video

3.1 Check whether recording is in progress

//Return the current recording status
- (BOOL)isRecording {
    // movieOutput is the movie file output added during session setup
    return self.movieOutput.isRecording;
}

3.2 Start recording

//Start recording
- (void)startRecording {

    if (![self isRecording]) {
        
        //Get the video capture connection, used to configure some core properties of the captured video
        AVCaptureConnection * videoConnection = [self.movieOutput connectionWithMediaType:AVMediaTypeVideo];
        
        //Determine whether setting the videoOrientation property is supported.
        if([videoConnection isVideoOrientationSupported])
        {
            //If yes, modify the direction of the current video
            videoConnection.videoOrientation = [self currentVideoOrientation];
            
        }
        
        //Check whether video stabilization is supported; it can significantly improve quality, and only applies when recording to a movie file
        if([videoConnection isVideoStabilizationSupported])
        {
            videoConnection.enablesVideoStabilizationWhenAvailable = YES;
        }
        
        
        AVCaptureDevice *device = [self activeCamera];
        
        //Smooth autofocus slows the lens's focusing movements so that focus changes
        //are less jarring while recording; check support before enabling it
        if (device.isSmoothAutoFocusSupported) {
            NSError *error;
            if ([device lockForConfiguration:&error]) {
                
                device.smoothAutoFocusEnabled = YES;
                [device unlockForConfiguration];
            }else
            {
                [self.delegate deviceConfigurationFailedWithError:error];
            }
        }
        
        //Find a unique file system URL to write the captured video to
        self.outputURL = [self uniqueURL];
        
        //Start recording. Parameter 1: output file URL. Parameter 2: recording delegate
        [self.movieOutput startRecordingToOutputFileURL:self.outputURL recordingDelegate:self];
        
    }
}

- (CMTime)recordedDuration {
    
    return self.movieOutput.recordedDuration;
}

- (NSURL *)uniqueURL {

    NSFileManager *fileManager = [NSFileManager defaultManager];
    
    //temporaryDirectoryWithTemplateString: creates a uniquely named directory to write the file into
    //(note: this is a helper category method, not part of the standard NSFileManager API; see the sketch below)
    NSString *dirPath = [fileManager temporaryDirectoryWithTemplateString:@"kamera.XXXXXX"];
    
    if (dirPath) {
        
        NSString *filePath = [dirPath stringByAppendingPathComponent:@"kamera_movie.mov"];
        return [NSURL fileURLWithPath:filePath];
        
    }
    
    return nil;
}
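
Since temporaryDirectoryWithTemplateString: is not a standard NSFileManager method, here is a plausible implementation of such a category built on mkdtemp (a sketch under that assumption):

#import <Foundation/Foundation.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

@implementation NSFileManager (TempDirectoryAdditions)

- (NSString *)temporaryDirectoryWithTemplateString:(NSString *)templateString {
    //Build a template path like .../tmp/kamera.XXXXXX for mkdtemp to fill in
    NSString *mkdTemplate = [NSTemporaryDirectory() stringByAppendingPathComponent:templateString];
    const char *templateCString = [mkdTemplate fileSystemRepresentation];
    char *buffer = strdup(templateCString);
    
    NSString *directoryPath = nil;
    //mkdtemp replaces the XXXXXX suffix and creates the directory atomically
    if (mkdtemp(buffer)) {
        directoryPath = [self stringWithFileSystemRepresentation:buffer
                                                          length:strlen(buffer)];
    }
    free(buffer);
    return directoryPath;
}

@end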

3.3 Stop recording

//Stop recording
- (void)stopRecording {

    //Only stop if currently recording
    if ([self isRecording]) {
        [self.movieOutput stopRecording];
    }
}

3.4 Callback after recording finishes

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {

    //error
    if (error) {
        [self.delegate mediaCaptureFailedWithError:error];
    }else
    {
        //Write the video to the media library
        [self writeVideoToAssetsLibrary:[self.outputURL copy]];
        
    }
    
    self.outputURL = nil;
}

3.5 Write the video to the media library

//Write captured video
- (void)writeVideoToAssetsLibrary:(NSURL *)videoURL {
    
    //The ALAssetsLibrary instance provides an interface for writing video
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc]init];
    
    //Before writing, check that the video is compatible with the Saved Photos album
    //(it is good practice to check before writing)
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {
        
        //Create the completion block
        ALAssetsLibraryWriteVideoCompletionBlock completionBlock;
        completionBlock = ^(NSURL *assetURL,NSError *error)
        {
            if (error) {
                
                [self.delegate assetLibraryWriteFailedWithError:error];
            }else
            {
                //Video thumbnail for interface display
                [self generateThumbnailForVideoAtURL:videoURL];
            }
            
        };
        
        //Perform the actual write to the library
        [library writeVideoAtPathToSavedPhotosAlbum:videoURL completionBlock:completionBlock];
    }
}

3.6 Generating a video thumbnail

//Generate the video thumbnail shown in the lower-left corner of the UI
- (void)generateThumbnailForVideoAtURL:(NSURL *)videoURL {

    //Run on the videoQueue to avoid blocking the main thread
    dispatch_async(self.videoQueue, ^{
        
        //Create an AVAsset and an AVAssetImageGenerator
        AVAsset *asset = [AVAsset assetWithURL:videoURL];
        
        AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
        
        //Set maximumSize to width 100, height 0; the height is then derived from the video's aspect ratio
        imageGenerator.maximumSize = CGSizeMake(100.0f, 0.0f);
        
        //Apply the preferred track transform so the thumbnail respects the video's orientation; without it the thumbnail may come out rotated
        imageGenerator.appliesPreferredTrackTransform = YES;
        
        //Get the CGImageRef image. Note that you need to manage its creation and release
        CGImageRef imageRef = [imageGenerator copyCGImageAtTime:kCMTimeZero actualTime:NULL error:nil];
        
        //Convert picture to UIImage
        UIImage *image = [UIImage imageWithCGImage:imageRef];
        
        //Release the CGImageRef to prevent a memory leak
        CGImageRelease(imageRef);
        
        //Back to the main thread
        dispatch_async(dispatch_get_main_queue(), ^{
            
            //Post the thumbnail notification, delivering the image
            [self postThumbnailNotification:image];
            
        });
        
    });
}

4, QuickTime format notes

Normally a movie file's header is at the beginning of the file, so a video player can quickly read the information it needs about the file's content, structure, and sample locations.

When AVCaptureMovieFileOutput records video, however, the data is written to the file in real time, and the header can only be written accurately once recording has finished.

Like this:

A normal, finished movie file:

  header + media data

A file being recorded live:

  media data + header (written only at the end)

The problem: if recording is interrupted before the header has been fully written, the file's contents cannot be read.

To mitigate this, a core feature of AVCaptureMovieFileOutput is that it captures QuickTime movies in fragments.

Namely:

  header + data + fragment + data + fragment + data + fragment ...

By default a fragment is written every 10 seconds, and the complete header is updated each time a fragment finishes.

The movieFragmentInterval property changes how often fragments are written, as in the sketch below.
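
For example, to trade a little write overhead for less footage lost on interruption, the interval could be shortened to 5 seconds (movieFragmentInterval is a CMTime):

//Write a movie fragment every 5 seconds instead of the default 10
self.movieOutput.movieFragmentInterval = CMTimeMake(5, 1);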
