
iOS Video Capture in Practice (AVCaptureSession)

source link: https://xiaodongxie1024.github.io/2019/04/15/20190413_iOS_VideoCaptureExercise/

Goal: use AVFoundation's AVCaptureSession to set the camera's resolution and frame rate (including high frame rates), switch between front and rear cameras, focus at a point, handle screen rotation, adjust exposure, and more.

GitHub (sample code included): iOS视频采集实战(AVCaptureSession)

The article is also mirrored on Jianshu (简书), Juejin (掘金), and the author's blog.

1. Setting Resolution and Frame Rate

1.1. Low Frame Rate Mode (fps <= 30)

When the required frame rate is at most 30 fps, resolution and frame rate are configured independently: one method sets the frame rate, another sets the resolution, and the two are not coupled.

  • Setting the resolution

    The following method sets the camera resolution. The available presets are listed in the API documentation; the largest currently supported is 3840x2160. If you never need a frame rate above 30 fps, this method is sufficient.

- (void)setCameraResolutionByPresetWithHeight:(int)height session:(AVCaptureSession *)session {
    /*
     Note: this method only supports frame rates <= 30 fps. Above 30 fps you must use
     `activeFormat` instead, and `activeFormat` and `sessionPreset` are mutually exclusive.
     */
    AVCaptureSessionPreset preset = [self getSessionPresetByResolutionHeight:height];
    if ([session.sessionPreset isEqualToString:preset]) {
        NSLog(@"No need to set the same camera resolution again!");
        return;
    }
    
    if (![session canSetSessionPreset:preset]) {
        NSLog(@"Can't set the sessionPreset!");
        return;
    }
    
    [session beginConfiguration];
    session.sessionPreset = preset;
    [session commitConfiguration];
}
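
The `getSessionPresetByResolutionHeight:` helper used above is not shown in this excerpt; a minimal sketch, assuming the standard presets referenced throughout this article (the exact mapping in the author's repository may differ), could look like this:

// Hypothetical sketch: map a target height to one of the common session presets.
- (AVCaptureSessionPreset)getSessionPresetByResolutionHeight:(int)height {
    switch (height) {
        case 2160: return AVCaptureSessionPreset3840x2160;
        case 1080: return AVCaptureSessionPreset1920x1080;
        case 720:  return AVCaptureSessionPreset1280x720;
        case 480:  return AVCaptureSessionPreset640x480;  // note: 4:3
        default:   return AVCaptureSessionPreset1280x720;
    }
}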
  • Setting the frame rate

    The following method sets the camera frame rate; it only supports rates up to 30 fps.

- (void)setCameraForLFRWithFrameRate:(int)frameRate {
    // Only for frame rates <= 30 fps
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([captureDevice lockForConfiguration:NULL]) {
        [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice unlockForConfiguration];
    }
}
    

1.2. High Frame Rate Mode (fps > 30)

If you need a high frame rate at a given resolution, such as 50, 60, or 120 fps, the old preset-based configuration can no longer do it. Apple requires setting the frame rate through setActiveVideoMinFrameDuration / setActiveVideoMaxFrameDuration, and these calls must be paired with the new resolution API, activeFormat.

The new resolution API activeFormat is mutually exclusive with sessionPreset: using one invalidates the other. If you need high frame rates at all, it is best to use the activeFormat path everywhere and drop the preset-based methods, avoiding compatibility problems.

With this change Apple merged the previously separate resolution and frame-rate settings into one: each AVCaptureDeviceFormat carries both a resolution and the frame-rate ranges it supports, so we must enumerate the device's formats to find a match. In high-frame-rate mode the old separate setters are effectively deprecated. Choose according to your project: if it will never need more than 30 fps, the old methods remain simple and effective.

Note: once activeFormat is used, the resolution previously set through sessionPreset automatically becomes AVCaptureSessionPresetInputPriority, so any existing canSetSessionPreset checks stop being meaningful. If the project must support high frame rates, abandon the sessionPreset path entirely.

+ (BOOL)setCameraFrameRateAndResolutionWithFrameRate:(int)frameRate andResolutionHeight:(CGFloat)resolutionHeight bySession:(AVCaptureSession *)session position:(AVCaptureDevicePosition)position videoFormat:(OSType)videoFormat {
    AVCaptureDevice *captureDevice = [self getCaptureDevicePosition:position];
    
    for (AVCaptureDeviceFormat *vFormat in [captureDevice formats]) {
        CMFormatDescriptionRef description = vFormat.formatDescription;
        float maxRate = ((AVFrameRateRange *)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;
        if (maxRate >= frameRate && CMFormatDescriptionGetMediaSubType(description) == videoFormat) {
            // Compare the format's dimensions with the requested resolution
            CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(description);
            if (dims.height == resolutionHeight && dims.width == [self getResolutionWidthByHeight:resolutionHeight]) {
                [session beginConfiguration];
                if ([captureDevice lockForConfiguration:NULL]) {
                    captureDevice.activeFormat = vFormat;
                    [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice unlockForConfiguration];
                    [session commitConfiguration];
                    return YES;
                }
                [session commitConfiguration];
                NSLog(@"%s: lock failed!", __func__);
                return NO;
            }
        }
    }
    
    NSLog(@"%s: no format supports frame rate %d at resolution height %f", __func__, frameRate, resolutionHeight);
    return NO;
}

+ (AVCaptureDevice *)getCaptureDevicePosition:(AVCaptureDevicePosition)position {
    NSArray *devices = nil;
    
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession =  [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }
    
    for (AVCaptureDevice *device in devices) {
        if (position == device.position) {
            return device;
        }
    }
    return nil;
}
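
The `getResolutionWidthByHeight:` lookup used above is also omitted from the excerpt; a sketch, under the assumption that every preset except 480p is 16:9:

// Hypothetical height-to-width mapping matching the presets used in this article.
+ (int)getResolutionWidthByHeight:(int)height {
    switch (height) {
        case 2160: return 3840;
        case 1080: return 1920;
        case 720:  return 1280;
        case 480:  return 640;  // 640x480 is 4:3
        default:   return 0;
    }
}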

2. Switching Between Front and Rear Cameras

Switching cameras looks simple, but in practice it raises real problems: on the same device the front and rear cameras support different resolutions and frame rates, so switching from a supported configuration to an unsupported one fails. A concrete case:

On iPhone X, for example, the rear camera supports up to (4K, 60 fps) while the front camera tops out at (2K, 30 fps). Switching from the rear camera at (4K, 60 fps) to the front camera fails and leaves the session in a broken state unless the configuration is downgraded first.

Note

In the code below, note the line session.sessionPreset = AVCaptureSessionPresetLow;. After switching from rear to front we must recompute the maximum resolution and frame rate the new input device supports, but we cannot query that until the input has been added to the session. So we first set a low preset that is always acceptable, add the input, and only then compute and apply the device's maximum supported resolution and frame rate.

- (void)setCameraPosition:(AVCaptureDevicePosition)position session:(AVCaptureSession *)session input:(AVCaptureDeviceInput *)input videoFormat:(OSType)videoFormat resolutionHeight:(CGFloat)resolutionHeight frameRate:(int)frameRate {
    if (input) {
        [session beginConfiguration];
        [session removeInput:input];
        
        AVCaptureDevice *device = [self.class getCaptureDevicePosition:position];
        
        NSError *error = nil;
        AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                               error:&error];
        
        if (newInput == nil || error) {
            NSLog(@"%s: error:%@",__func__, error.localizedDescription);
            [session commitConfiguration];
            return;
        }
        
        // e.g. the rear camera is at 4K while the front camera supports at most 2K, so the
        // switch needs a downgrade. We can't query the new camera's maximum resolution until
        // its input is attached, so temporarily fall back to a preset every device accepts.
        session.sessionPreset = AVCaptureSessionPresetLow;
        if ([session canAddInput:newInput])  {
            self.input = newInput;
            [session addInput:newInput];
        }else {
            NSLog(@"%s: add input failed.",__func__);
            [session commitConfiguration];
            return;
        }
        
        int maxResolutionHeight = [self getMaxSupportResolutionByPreset];
        if (resolutionHeight > maxResolutionHeight) {
            resolutionHeight = maxResolutionHeight;
            self.cameraModel.resolutionHeight = resolutionHeight;
            NSLog(@"%s: Current support max resolution height = %d", __func__, maxResolutionHeight);
        }
        
        int maxFrameRate = [self getMaxFrameRateByCurrentResolution];
        if (frameRate > maxFrameRate) {
            frameRate = maxFrameRate;
            self.cameraModel.frameRate = frameRate;
            NSLog(@"%s: Current support max frame rate = %d",__func__, maxFrameRate);
        }

        BOOL isSuccess = [self.class setCameraFrameRateAndResolutionWithFrameRate:frameRate
                                                              andResolutionHeight:resolutionHeight
                                                                        bySession:session
                                                                         position:position
                                                                      videoFormat:videoFormat];
        
        if (!isSuccess) {
            NSLog(@"%s: Set resolution and frame rate failed.",__func__);
        }
        
        [session commitConfiguration];
    }
}
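
The two queries used above, `getMaxSupportResolutionByPreset` and `getMaxFrameRateByCurrentResolution`, are not shown in this excerpt. Minimal sketches, assuming the `session`, `input`, and `cameraModel` properties this class already uses elsewhere, might look like this:

// Hypothetical sketch: walk the presets from high to low and return the first one
// the session (with its current inputs) accepts.
- (int)getMaxSupportResolutionByPreset {
    AVCaptureSession *session = self.session; // assumed property
    if ([session canSetSessionPreset:AVCaptureSessionPreset3840x2160]) return 2160;
    if ([session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) return 1080;
    if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720])  return 720;
    if ([session canSetSessionPreset:AVCaptureSessionPreset640x480])   return 480;
    return 0;
}

// Hypothetical sketch: the highest maxFrameRate among the active device's formats
// whose height matches the currently configured resolution.
- (int)getMaxFrameRateByCurrentResolution {
    int maxFrameRate = 0;
    AVCaptureDevice *device = self.input.device;
    for (AVCaptureDeviceFormat *format in device.formats) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        if (dims.height != (int)self.cameraModel.resolutionHeight) continue;
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            if (range.maxFrameRate > maxFrameRate) {
                maxFrameRate = (int)range.maxFrameRate;
            }
        }
    }
    return maxFrameRate;
}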

3. Video Orientation and Screen Rotation

First, distinguish two concepts: device orientation, expressed as UIDeviceOrientation, and video orientation, expressed as AVCaptureVideoOrientation. When using AVCaptureSession, supporting rotation means rotating the video image whenever the screen rotates.

Screen rotation is delivered through the UIDeviceOrientationDidChangeNotification notification.
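A typical registration, assuming a `handleDeviceOrientationChange:` handler you define yourself:

// Hypothetical wiring; the handler name is illustrative.
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleDeviceOrientationChange:)
                                             name:UIDeviceOrientationDidChangeNotification
                                           object:nil];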

- (void)adjustVideoOrientationByScreenOrientation:(UIDeviceOrientation)orientation previewFrame:(CGRect)previewFrame previewLayer:(AVCaptureVideoPreviewLayer *)previewLayer videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    [previewLayer setFrame:previewFrame];
    
    // Note: the device's landscape left/right is the mirror of the video orientation's:
    // a device rotated left (home button on the right) shows a landscape-right interface.
    switch (orientation) {
        case UIDeviceOrientationPortrait:
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortrait
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortraitUpsideDown
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeRight:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeLeft
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeLeft:
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeRight
                                    videoOutput:videoOutput];
            break;
            
        default:
            break;
    }
}

-(void)adjustAVOutputDataOrientation:(AVCaptureVideoOrientation)orientation videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    for(AVCaptureConnection *connection in videoOutput.connections) {
        for(AVCaptureInputPort *port in [connection inputPorts]) {
            if([[port mediaType] isEqual:AVMediaTypeVideo]) {
                if([connection isVideoOrientationSupported]) {
                    [connection setVideoOrientation:orientation];
                }
            }
        }
    }
}

4. Focus Adjustment

Manual focus deserves special attention: the focus APIs only accept points in a normalized coordinate space with (0,0) at the top-left and (1,1) at the bottom-right, so a tap point in UIView coordinates must be converted. The conversion depends on several factors:

  • Whether the video output is mirrored: the front camera is often mirrored, which flips the x coordinate.
  • Whether the device is in landscape with the Home button on the right or on the left: with the button on the right the origin is the top-left corner; with it on the left, the bottom-right corner.
  • How the video is rendered: aspect-fit keeps the full frame but adds black bars, while aspect-fill crops pixels; either way the tap point must be adjusted, and the amount differs by device model.

If we render with AVCaptureSession's own AVCaptureVideoPreviewLayer, the captureDevicePointOfInterestForPoint: method computes the point for us and already accounts for all of the above. If we render the frames ourselves, we must do the computation manually, considering every case. Both the automatic and the manual conversions are shown below.

- (void)autoFocusAtPoint:(CGPoint)point {
    AVCaptureDevice *device = self.input.device;
    if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            // Move the exposure point along with the focus point
            if ([device isExposurePointOfInterestSupported]) {
                [device setExposurePointOfInterest:point];
                [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            }
            [device setFocusPointOfInterest:point];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        }
    }
}

4.1. Automatic Focus-Point Computation

- (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    CGSize frameSize = [captureVideoPreviewLayer frame].size;
    
    if ([captureVideoPreviewLayer.connection isVideoMirrored]) {
        viewCoordinates.x = frameSize.width - viewCoordinates.x;
    }

    // Convert the UIKit coordinate to a focus point in (0.0 ~ 1.0)
    pointOfInterest = [captureVideoPreviewLayer captureDevicePointOfInterestForPoint:viewCoordinates];
    
    // NSLog(@"Focus - Auto test: %@",NSStringFromCGPoint(pointOfInterest));
    
    return pointOfInterest;
}
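
A typical call site, assuming a tap gesture recognizer on the preview view (the property names are illustrative):

// Hypothetical handler wiring the conversion to the focus call.
- (void)handleTapToFocus:(UITapGestureRecognizer *)gesture {
    CGPoint viewPoint  = [gesture locationInView:gesture.view];
    CGPoint focusPoint = [self convertToPointOfInterestFromViewCoordinates:viewPoint
                                                  captureVideoPreviewLayer:self.previewLayer]; // assumed property
    [self autoFocusAtPoint:focusPoint];
}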

4.2. Manual Focus-Point Computation

  • If the screen's aspect ratio exactly matches the video's, simply normalize the tap point into the (0,0)-(1,1) space.
  • If the ratios differ, the rendering mode matters: preserving the aspect ratio (aspect-fit) leaves black bars whose length must be subtracted from the tap point, while filling the screen (aspect-fill) sacrifices pixels that must be added back when computing the focus point.
- (CGPoint)manualConvertFocusPoint:(CGPoint)point frameSize:(CGSize)frameSize captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer position:(AVCaptureDevicePosition)position videoDataOutput:(AVCaptureVideoDataOutput *)videoDataOutput input:(AVCaptureDeviceInput *)input {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    
    if ([[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] isVideoMirrored]) {
        point.x = frameSize.width - point.x;
    }
    
    for (AVCaptureInputPort *port in [input ports]) {
        if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
            CGRect cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
            CGSize resolutionSize = cleanAperture.size;
            
            CGFloat resolutionRatio = resolutionSize.width / resolutionSize.height;
            CGFloat screenSizeRatio = frameSize.width / frameSize.height;
            CGFloat xc = .5f;
            CGFloat yc = .5f;
            
            if (resolutionRatio == screenSizeRatio) {
                xc = point.x / frameSize.width;
                yc = point.y / frameSize.height;
            } else if (resolutionRatio > screenSizeRatio) {
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    CGFloat needScreenWidth = resolutionRatio * frameSize.height;
                    CGFloat cropWidth = (needScreenWidth - frameSize.width) / 2;
                    xc = (cropWidth + point.x) / needScreenWidth;
                    yc = point.y / frameSize.height;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    CGFloat needScreenHeight = frameSize.width * (1 / resolutionRatio);
                    CGFloat blackBarLength   = (frameSize.height - needScreenHeight) / 2;
                    xc = point.x / frameSize.width;
                    yc = (point.y - blackBarLength) / needScreenHeight;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            } else {
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    CGFloat needScreenHeight = (1 / resolutionRatio) * frameSize.width;
                    CGFloat cropHeight = (needScreenHeight - frameSize.height) / 2;
                    xc = point.x / frameSize.width;
                    yc = (cropHeight + point.y) / needScreenHeight;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    CGFloat needScreenWidth = frameSize.height * resolutionRatio;
                    CGFloat blackBarLength   = (frameSize.width - needScreenWidth) / 2;
                    xc = (point.x - blackBarLength) / needScreenWidth;
                    yc = point.y / frameSize.height;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            }
            pointOfInterest = CGPointMake(xc, yc);
        }
    }
    
    if (position == AVCaptureDevicePositionBack) {
        if (captureVideoPreviewLayer.connection.videoOrientation == AVCaptureVideoOrientationLandscapeLeft) {
            pointOfInterest = CGPointMake(1 - pointOfInterest.x, 1 - pointOfInterest.y);
        }
    } else {
        pointOfInterest = CGPointMake(pointOfInterest.x, 1 - pointOfInterest.y);
    }
    
    return pointOfInterest;
}
    

5. Exposure Adjustment

If a UISlider drives the adjustment, the simplest approach is to give the slider the same range as the exposure target bias, roughly (-8 ~ 8) on current devices, so its value can be passed straight through without conversion; with gestures or other controls, map the value as needed.

- (void)setExposureWithNewValue:(CGFloat)newExposureValue device:(AVCaptureDevice *)device {
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        // Clamp to the device's supported range to avoid an out-of-range exception
        newExposureValue = MAX(device.minExposureTargetBias, MIN(device.maxExposureTargetBias, newExposureValue));
        [device setExposureTargetBias:newExposureValue completionHandler:nil];
        [device unlockForConfiguration];
    }
}
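
Rather than hard-coding -8 ~ 8, the range can be read from the device itself; a small sketch (the slider and the `input` property are assumptions):

// Hypothetical wiring: configure the slider from the device's actual bias range.
AVCaptureDevice *device = self.input.device;
self.exposureSlider.minimumValue = device.minExposureTargetBias;
self.exposureSlider.maximumValue = device.maxExposureTargetBias;
self.exposureSlider.value = 0; // 0 means no bias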

6. Torch Mode

  • AVCaptureTorchModeAuto: automatic
  • AVCaptureTorchModeOn: on
  • AVCaptureTorchModeOff: off

- (void)setTorchState:(BOOL)isOpen device:(AVCaptureDevice *)device {
    if ([device hasTorch]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = isOpen ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
            [device unlockForConfiguration];
        }
    }else {
        NSLog(@"The device does not support torch!");
    }
}

7. Video Stabilization

Note: on some models and at some resolutions, enabling this property can break rendering (e.g. iPhone XS with custom rendering).

-(void)adjustVideoStabilizationWithOutput:(AVCaptureVideoDataOutput *)output {
    NSArray *devices = nil;
    
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession =  [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:self.cameraModel.position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }
    
    for(AVCaptureDevice *device in devices){
        if([device hasMediaType:AVMediaTypeVideo]){
            if([device.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeAuto]) {
                for(AVCaptureConnection *connection in output.connections) {
                    for(AVCaptureInputPort *port in [connection inputPorts]) {
                        if([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            if(connection.supportsVideoStabilization) {
                                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
                                NSLog(@"activeVideoStabilizationMode = %ld",(long)connection.activeVideoStabilizationMode);
                            }else {
                                NSLog(@"connection don't support video stabilization");
                            }
                        }
                    }
                }
            }else{
                NSLog(@"device don't support video stablization");
            }
        }
    }
}

8. White Balance Adjustment

  • temperature: adjust by color temperature (control range used here: -150 ~ 250)
  • tint: adjust by tint (control range used here: -150 ~ 150)

Note: before calling setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains: you must clamp the AVCaptureWhiteBalanceGains values to the valid range (each gain between 1 and the device's maxWhiteBalanceGain); out-of-range gains raise an exception.

-(AVCaptureWhiteBalanceGains)clampGains:(AVCaptureWhiteBalanceGains)gains toMinVal:(CGFloat)minVal andMaxVal:(CGFloat)maxVal {
    AVCaptureWhiteBalanceGains tmpGains = gains;
    tmpGains.blueGain   = MAX(MIN(tmpGains.blueGain , maxVal), minVal);
    tmpGains.redGain    = MAX(MIN(tmpGains.redGain  , maxVal), minVal);
    tmpGains.greenGain  = MAX(MIN(tmpGains.greenGain, maxVal), minVal);
    
    return tmpGains;
}

-(void)setWhiteBlanceValueByTemperature:(CGFloat)temperature device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        if (![device lockForConfiguration:nil]) {
            return;
        }
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        CGFloat currentTint = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].tint;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = temperature,
            .tint        = currentTint,
        };
        
        AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
        deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
        
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
        [device unlockForConfiguration];
    }
}

-(void)setWhiteBlanceValueByTint:(CGFloat)tint device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        if (![device lockForConfiguration:nil]) {
            return;
        }
        CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        currentGains = [self clampGains:currentGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
        CGFloat currentTemperature = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].temperature;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = currentTemperature,
            .tint        = tint,
        };
        
        AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
        
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
        [device unlockForConfiguration];
    }
}
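
A typical call site, assuming two UISliders whose ranges match the values listed above (all names illustrative):

// Hypothetical slider callbacks driving the two setters above.
- (void)temperatureSliderChanged:(UISlider *)slider {
    [self setWhiteBlanceValueByTemperature:slider.value device:self.input.device];
}

- (void)tintSliderChanged:(UISlider *)slider {
    [self setWhiteBlanceValueByTint:slider.value device:self.input.device];
}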
