How to Design a General, Stable Audio/Video Framework with AVFoundation?



Following up on the previous post, 《AV Foundation开发秘籍——实践掌握iOS & OS X应用的视听处理技术 阅读指南》, this article explains how to use AVFoundation to design a general, stable audio/video framework.

Core Idea

An AVCaptureSession runs the capture task; AVCaptureDeviceInput instances define the task's input sources (the various cameras); the various data outputs in AVFoundation deliver the data (metadata, video frames, audio frames); and an AVAssetWriter runs the write task, archiving the audio/video data into a media file.

Features

Live video preview, recording to file, photo capture, camera switching, face detection, frame-rate configuration, and fine-grained camera configuration.

Framework Source

github.com/caixindong/…

Design

Core modules: XDCaptureService & XDVideoWritter

XDCaptureService is the single entry point of the public API and the core class of the framework; it does the audio/video input/output configuration and the scheduling. XDVideoWritter is the writing module; it provides the primitive write-and-archive operations and is not exposed externally. The public API:

@class XDCaptureService;


@protocol XDCaptureServiceDelegate <NSObject>

@optional
//Service lifecycle
- (void)captureServiceDidStartService:(XDCaptureService *)service;

- (void)captureService:(XDCaptureService *)service serviceDidFailWithError:(NSError *)error;

- (void)captureServiceDidStopService:(XDCaptureService *)service;

- (void)captureService:(XDCaptureService *)service getPreviewLayer:(AVCaptureVideoPreviewLayer *)previewLayer;

- (void)captureService:(XDCaptureService *)service outputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

//Recording
- (void)captureServiceRecorderDidStart:(XDCaptureService *)service;

- (void)captureService:(XDCaptureService *)service recorderDidFailWithError:(NSError *)error;

- (void)captureServiceRecorderDidStop:(XDCaptureService *)service;

//Photo capture
- (void)captureService:(XDCaptureService *)service capturePhoto:(UIImage *)photo;

//Face detection
- (void)captureService:(XDCaptureService *)service outputFaceDetectData:(NSArray <AVMetadataFaceObject*>*) faces;

//Depth data
- (void)captureService:(XDCaptureService *)service captureTrueDepth:(AVDepthData *)depthData API_AVAILABLE(ios(11.0));

@end

@protocol XDCaptureServicePreViewSource <NSObject>

- (AVCaptureVideoPreviewLayer *)preViewLayerSource;

@end

@interface XDCaptureService : NSObject

//Whether to record audio; default NO
@property (nonatomic, assign) BOOL shouldRecordAudio;

//Native iOS face detection; default NO
@property (nonatomic, assign) BOOL openNativeFaceDetect;

//Camera position; default AVCaptureDevicePositionFront (front camera)
@property (nonatomic, assign) AVCaptureDevicePosition devicePosition;

//Whether depth capture is supported; currently only the rear camera of the 7 Plus/8 Plus and the front and rear cameras of the X, and iOS 11 or later is required
@property (nonatomic, assign, readonly) BOOL depthSupported;

//Whether depth capture is enabled; default NO
@property (nonatomic, assign) BOOL openDepth;

//Depth data is only produced with one of these sessionPresets: AVCaptureSessionPresetPhoto, AVCaptureSessionPreset1280x720, AVCaptureSessionPreset640x480
@property (nonatomic, assign) AVCaptureSessionPreset sessionPreset;

//Frame rate; default 30
@property (nonatomic, assign) int frameRate;

//Temporary URL of the recording; it is recommended to move the file elsewhere after each recording
@property (nonatomic, strong, readonly) NSURL *recordURL;

//If preViewSource is set, no AVCaptureVideoPreviewLayer is created internally
@property (nonatomic, assign) id<XDCaptureServicePreViewSource> preViewSource;

@property (nonatomic, assign) id<XDCaptureServiceDelegate> delegate;

@property (nonatomic, assign, readonly) BOOL isRunning;


//Video encoding settings (affect the codec and size of the recorded video)
@property (nonatomic, strong) NSDictionary *videoSetting;

///Advanced camera settings; generally leave them alone unless you have specific needs
//ISO (iOS 8+)
@property (nonatomic, assign, readonly) CGFloat deviceISO;
@property (nonatomic, assign, readonly) CGFloat deviceMinISO;
@property (nonatomic, assign, readonly) CGFloat deviceMaxISO;

//Lens aperture
@property (nonatomic, assign, readonly) CGFloat deviceAperture;

//Exposure
@property (nonatomic, assign, readonly) BOOL supportsTapToExpose;
@property (nonatomic, assign) AVCaptureExposureMode exposureMode;
@property (nonatomic, assign) CGPoint exposurePoint;
@property (nonatomic, assign, readonly) CMTime deviceExposureDuration;

//Focus
@property (nonatomic, assign, readonly) BOOL supportsTapToFocus;
@property (nonatomic, assign) AVCaptureFocusMode focusMode;
@property (nonatomic, assign) CGPoint focusPoint;

//White balance
@property (nonatomic, assign) AVCaptureWhiteBalanceMode whiteBalanceMode;

//Torch
@property (nonatomic, assign, readonly) BOOL hasTorch;
@property (nonatomic, assign) AVCaptureTorchMode torchMode;

//Flash
@property (nonatomic, assign, readonly) BOOL hasFlash;
@property (nonatomic, assign) AVCaptureFlashMode flashMode;

//Camera permission check
+ (BOOL)videoGranted;

//Microphone permission check
+ (BOOL)audioGranted;

//Switch camera
- (void)switchCamera;

//Start
- (void)startRunning;

//Stop
- (void)stopRunning;

//Start recording
- (void)startRecording;

//Cancel recording
- (void)cancleRecording;

//Stop recording
- (void)stopRecording;

//Take a photo
- (void)capturePhoto;

@end
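To make the API concrete, here is a minimal usage sketch. The view-controller context, the self.service property and the plain alloc/init call are assumptions for illustration; every property and method used comes from the header above:

//Minimal usage sketch (hypothetical view controller; [[XDCaptureService alloc] init] is assumed)
- (void)setupCaptureService {
    if (![XDCaptureService videoGranted] || ![XDCaptureService audioGranted]) {
        return; //no camera or microphone permission
    }
    XDCaptureService *service = [[XDCaptureService alloc] init];
    service.delegate = self;
    service.shouldRecordAudio = YES;
    service.devicePosition = AVCaptureDevicePositionBack;
    service.frameRate = 30;
    [service startRunning];
    self.service = service; //assumed retained property on the controller
}

//Receive the internally created preview layer and install it in the view hierarchy
- (void)captureService:(XDCaptureService *)service getPreviewLayer:(AVCaptureVideoPreviewLayer *)previewLayer {
    dispatch_async(dispatch_get_main_queue(), ^{
        previewLayer.frame = self.view.bounds;
        [self.view.layer addSublayer:previewLayer];
    });
}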

GCD queue separation

Starting audio/video capture and reading/writing audio/video data on the main thread would block it, so these tasks have to be dispatched onto background threads. We use GCD queues for this job. The framework creates three queues in total: sessionQueue, writtingQueue and outputQueue. All three are serial queues, because audio/video operations are order-sensitive (they carry timing requirements) and a serial queue guarantees that only one operation (configuration, writing, reading) runs at a time. sessionQueue schedules the starting and stopping of the capture task, writtingQueue schedules the writes so that every frame is archived into the file correctly, and outputQueue delivers the frames to the outside.

@property (nonatomic, strong) dispatch_queue_t sessionQueue;
@property (nonatomic, strong) dispatch_queue_t writtingQueue;
@property (nonatomic, strong) dispatch_queue_t outputQueue;

 _sessionQueue = dispatch_queue_create("com.caixindong.captureservice.session", DISPATCH_QUEUE_SERIAL);
_writtingQueue = dispatch_queue_create("com.caixindong.captureservice.writting", DISPATCH_QUEUE_SERIAL);
_outputQueue = dispatch_queue_create("com.caixindong.captureservice.output", DISPATCH_QUEUE_SERIAL);

Audio/video capture

Initializing the capture session

sessionPreset specifies the resolution of the output video frames, e.g. 640x480.

@property (nonatomic, strong) AVCaptureSession *captureSession;
 _captureSession = [[AVCaptureSession alloc] init];
_captureSession.sessionPreset = _sessionPreset;

Configuring the capture input

First obtain the input device: _cameraWithPosition returns an abstract representation of a camera, and since the TrueDepth camera and the dual camera can only be obtained through newer APIs, the method already contains the compatibility handling. The device is then used to configure the capture input, an AVCaptureDeviceInput.

@property (nonatomic, strong) AVCaptureDeviceInput *videoInput;

- (BOOL)_setupVideoInputOutput:(NSError **) error {
    self.currentDevice = [self _cameraWithPosition:_devicePosition];
    
    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_currentDevice error:error];
    if (_videoInput) {
        if ([_captureSession canAddInput:_videoInput]) {
            [_captureSession addInput:_videoInput];
        } else {
            *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2200 userInfo:@{NSLocalizedDescriptionKey:@"add video input fail"}];
            return NO;
        }
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2201 userInfo:@{NSLocalizedDescriptionKey:@"video input is nil"}];
        return NO;
    }
    
    //Lock to a stable frame rate
    CMTime frameDuration = CMTimeMake(1, _frameRate);
    if ([_currentDevice lockForConfiguration:error]) {
        _currentDevice.activeVideoMaxFrameDuration = frameDuration;
        _currentDevice.activeVideoMinFrameDuration = frameDuration;
        [_currentDevice unlockForConfiguration];
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2203 userInfo:@{NSLocalizedDescriptionKey:@"device lock fail(input)"}];
        
        return NO;
    }

……Other code
}

- (AVCaptureDevice *)_cameraWithPosition:(AVCaptureDevicePosition)position {
    if (@available(iOS 10.0, *)) {
        //AVCaptureDeviceTypeBuiltInWideAngleCamera is the default wide-angle camera; AVCaptureDeviceTypeBuiltInTelephotoCamera is the telephoto camera; AVCaptureDeviceTypeBuiltInDualCamera is the rear dual camera; AVCaptureDeviceTypeBuiltInTrueDepthCamera is the front TrueDepth camera
        NSMutableArray *mulArr = [NSMutableArray arrayWithObjects:AVCaptureDeviceTypeBuiltInWideAngleCamera,AVCaptureDeviceTypeBuiltInTelephotoCamera,nil];
        if (@available(iOS 10.2, *)) {
            [mulArr addObject:AVCaptureDeviceTypeBuiltInDualCamera];
        }
        if (@available(iOS 11.1, *)) {
            [mulArr addObject:AVCaptureDeviceTypeBuiltInTrueDepthCamera];
        }
        AVCaptureDeviceDiscoverySession *discoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:[mulArr copy] mediaType:AVMediaTypeVideo position:position];
        return discoverySession.devices.firstObject;
    } else {
        NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        for (AVCaptureDevice *device in videoDevices) {
            if (device.position == position) {
                return device;
            }
        }
    }
    return nil;
}


Configuring the capture outputs

Depending on the features required, different outputs are added to the capture session: AVCaptureVideoDataOutput to capture raw video frames, AVCaptureAudioDataOutput to capture audio data, AVCaptureMetadataOutput to capture face data. Because audio output setup is largely identical to video, only the key video output code is listed here (a hedged audio sketch follows the video code below). There are several key design points:

1. Because of how the camera sensor is mounted, the output video stream is rotated by 90 degrees, so we fetch the videoConnection associated with the output and configure the rotation on it.

2. Frames (video or audio) are all delivered as CMSampleBufferRef. A video frame may pass through several pipelines (written to file, or handed up to the business layer), so the frame is retained once per pipeline before processing, which keeps each pipeline's frame reference independent; see _processVideoData.

3. To clean up temporary variables promptly (the operations applied to a frame may need a lot of memory), the frame delivery to the outside is wrapped in an autorelease pool, preventing memory spikes.

@property (nonatomic, strong) AVCaptureVideoDataOutput *videoOutput;

- (BOOL)_setupVideoInputOutput:(NSError **) error {
……Other code

self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    _videoOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
    //Drop frames that arrive late
    _videoOutput.alwaysDiscardsLateVideoFrames = YES;
    
    dispatch_queue_t videoQueue = dispatch_queue_create("com.caixindong.captureservice.video", DISPATCH_QUEUE_SERIAL);
    //Set the data output delegate
    [_videoOutput setSampleBufferDelegate:self queue:videoQueue];
    
    if ([_captureSession canAddOutput:_videoOutput]) {
        [_captureSession addOutput:_videoOutput];
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.video" code:-2204 userInfo:@{NSLocalizedDescriptionKey:@"add video output fail"}];
        return NO;
    }
    
    self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    //The recorded video is rotated 90 degrees because of the camera sensor, so set the orientation of the output video stream here
    if (_videoConnection.isVideoOrientationSupported) {
        _videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    return YES;
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate && AVCaptureAudioDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    //Video and audio callbacks can arrive on different queues
    if (connection == _videoConnection) {
        @synchronized(self) {
            [self _processVideoData:sampleBuffer];
        };
    } else if (connection == _audioConnection) {
        @synchronized(self) {
            [self _processAudioData:sampleBuffer];
        };
    }
}

#pragma mark - process Data
- (void)_processVideoData:(CMSampleBufferRef)sampleBuffer {
    //CFRetain ensures each pipeline (file writing, frame delivery) holds its own independent reference to the sampleBuffer
    if (_videoWriter && _videoWriter.isWriting) {
        CFRetain(sampleBuffer);
        dispatch_async(_writtingQueue, ^{
            [_videoWriter appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        });
    }
    
    CFRetain(sampleBuffer);
    //Release temporary variables promptly to avoid memory spikes
    dispatch_async(_outputQueue, ^{
        @autoreleasepool{
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:outputSampleBuffer:)]) {
                [self.delegate captureService:self outputSampleBuffer:sampleBuffer];
            }
        }
        CFRelease(sampleBuffer);
    });
}
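The audio path is set up the same way. The framework's actual audio code is not reproduced in this article, so the following is only a sketch under that assumption; the method name _setupAudioInputOutput, the audioOutput property and the error codes are illustrative (the audioConnection ivar matches the delegate code above):

- (BOOL)_setupAudioInputOutput:(NSError **)error {
    //Assumed counterpart to _setupVideoInputOutput: default microphone as input
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioDevice error:error];
    if (!audioInput) {
        return NO;
    }
    if ([_captureSession canAddInput:audioInput]) {
        [_captureSession addInput:audioInput];
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.audio" code:-2207 userInfo:@{NSLocalizedDescriptionKey:@"add audio input fail"}];
        return NO;
    }

    self.audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    //Reuse the same sample-buffer delegate as video; the callback distinguishes the two by connection
    dispatch_queue_t audioQueue = dispatch_queue_create("com.caixindong.captureservice.audio", DISPATCH_QUEUE_SERIAL);
    [_audioOutput setSampleBufferDelegate:self queue:audioQueue];
    if ([_captureSession canAddOutput:_audioOutput]) {
        [_captureSession addOutput:_audioOutput];
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.audio" code:-2208 userInfo:@{NSLocalizedDescriptionKey:@"add audio output fail"}];
        return NO;
    }
    self.audioConnection = [_audioOutput connectionWithMediaType:AVMediaTypeAudio];
    return YES;
}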

Configuring the still image output

The still image output exists to implement photo capture; through setOutputSettings we configure the format of the output image.

@property (nonatomic, strong) AVCaptureStillImageOutput *imageOutput;

- (BOOL)_setupImageOutput:(NSError **)error {
    self.imageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSetting = @{AVVideoCodecKey: AVVideoCodecJPEG};
    [_imageOutput setOutputSettings:outputSetting];
    if ([_captureSession canAddOutput:_imageOutput]) {
        [_captureSession addOutput:_imageOutput];
        return YES;
    } else {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.image" code:-2205 userInfo:@{NSLocalizedDescriptionKey:@"add image output fail"}];
        return NO;
    }
}

//Photo capture implementation
- (void)capturePhoto {
    AVCaptureConnection *connection = [_imageOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
    
    __weak typeof(self) weakSelf = self;
    [_imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef  _Nullable imageDataSampleBuffer, NSError * _Nullable error) {
        __strong typeof(weakSelf) strongSelf = weakSelf;
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:imageData];
            if (strongSelf.delegate && [strongSelf.delegate respondsToSelector:@selector(captureService:capturePhoto:)]) {
                [strongSelf.delegate captureService:strongSelf capturePhoto:image];
            }
        }
    }];
}

Configuring the face metadata output

The key is to configure a face metadata output, AVCaptureMetadataOutput, with metadataObjectTypes set to AVMetadataObjectTypeFace. The captured face data covers every face in the current frame, and each face's bounds, position and rotation angles can be extracted from it. One caveat: the raw face coordinates are in the camera's coordinate system and have to be converted into screen coordinates before the business layer can use them, which is what the face data output code below does.
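The output setup itself is not shown in the article's excerpt; a minimal sketch (the method name _setupMetadataOutput and the error code are illustrative) might look like this:

- (BOOL)_setupMetadataOutput:(NSError **)error {
    self.metadataOutput = [[AVCaptureMetadataOutput alloc] init];
    if (![_captureSession canAddOutput:_metadataOutput]) {
        *error = [NSError errorWithDomain:@"com.caixindong.captureservice.metadata" code:-2206 userInfo:@{NSLocalizedDescriptionKey:@"add metadata output fail"}];
        return NO;
    }
    [_captureSession addOutput:_metadataOutput];
    //metadataObjectTypes must be set after the output has been added to the session
    _metadataOutput.metadataObjectTypes = @[AVMetadataObjectTypeFace];
    dispatch_queue_t metadataQueue = dispatch_queue_create("com.caixindong.captureservice.metadata", DISPATCH_QUEUE_SERIAL);
    [_metadataOutput setMetadataObjectsDelegate:self queue:metadataQueue];
    return YES;
}

The delegate callback then converts each raw face object into preview-layer (screen) coordinates via transformedMetadataObjectForMetadataObject::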

@property (nonatomic, strong) AVCaptureMetadataOutput *metadataOutput;

-(void)captureOutput:(AVCaptureOutput *)output didOutputMetadataObjects:(NSArray<__kindof AVMetadataObject *> *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    NSMutableArray *transformedFaces = [NSMutableArray array];
    for (AVMetadataObject *face in metadataObjects) {
        @autoreleasepool{
            AVMetadataFaceObject *transformedFace = (AVMetadataFaceObject*)[self.previewLayer transformedMetadataObjectForMetadataObject:face];
            if (transformedFace) {
                [transformedFaces addObject:transformedFace];
            }
        };
    }
    @autoreleasepool{
        if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:outputFaceDetectData:)]) {
            [self.delegate captureService:self outputFaceDetectData:[transformedFaces copy]];
        }
    };
}


Configuring the preview source

There are two options here: either the outside has already supplied a preview source by implementing the preview-source method, or the framework creates its own AVCaptureVideoPreviewLayer and configures it as the preview source.

if (self.preViewSource && [self.preViewSource respondsToSelector:@selector(preViewLayerSource)]) {
        self.previewLayer = [self.preViewSource preViewLayerSource];
        [_previewLayer setSession:_captureSession];
        [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    } else {
        self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
        //Fill the whole screen
        [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
        
        if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:getPreviewLayer:)]) {
            [self.delegate captureService:self getPreviewLayer:_previewLayer];
        }
    }
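For the external option, one way to supply the preview source is a view backed by an AVCaptureVideoPreviewLayer. This is a sketch, not framework code; XDPreviewView is a hypothetical name:

@interface XDPreviewView : UIView <XDCaptureServicePreViewSource>
@end

@implementation XDPreviewView

//Back the view with an AVCaptureVideoPreviewLayer so the preview resizes with the view
+ (Class)layerClass {
    return [AVCaptureVideoPreviewLayer class];
}

- (AVCaptureVideoPreviewLayer *)preViewLayerSource {
    return (AVCaptureVideoPreviewLayer *)self.layer;
}

@end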

Handling foreground/background state changes

The foreground/background behavior of audio/video on iOS is fairly complex, with many lifecycle transitions. To make sure the framework does the right thing in the right state, the state handling for reading frames and for writing frames is decoupled: each side maintains its own notification handling. The outer business layer never needs to observe AVFoundation notifications and manage the video stream state by hand. Notification setup for the read module:

//CaptureService and VideoWritter each maintain their own lifecycle: the state of capturing the stream is decoupled from the state of writing it. State transitions are managed inside captureservice; the upper layer never handles stream changes manually
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_captureSessionNotification:) name:nil object:self.captureSession];
    
    //Compatibility for iOS versions below 9: before iOS 9, if the app was backgrounded before session start had finished, AVCaptureSessionRuntimeErrorNotification fired on returning to the foreground and the session had to be restarted manually. From iOS 9 on the system caches the pending session start and replays it automatically on foregrounding, so no manual call is needed
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_enterForegroundNotification:) name:UIApplicationWillEnterForegroundNotification object:nil];

#pragma mark - CaptureSession Notification
- (void)_captureSessionNotification:(NSNotification *)notification {
    NSLog(@"_captureSessionNotification:%@",notification.name);
    if ([notification.name isEqualToString:AVCaptureSessionDidStartRunningNotification]) {
        if (!_firstStartRunning) {
            NSLog(@"session start running");
            _firstStartRunning = YES;
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureServiceDidStartService:)]) {
                [self.delegate captureServiceDidStartService:self];
            }
        } else {
            NSLog(@"session resunme running");
        }
    } else if ([notification.name isEqualToString:AVCaptureSessionDidStopRunningNotification]) {
        if (!_isRunning) {
            NSLog(@"session stop running");
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureServiceDidStopService:)]) {
                [self.delegate captureServiceDidStopService:self];
            }
        } else {
            NSLog(@"interupte session stop running");
        }
    } else if ([notification.name isEqualToString:AVCaptureSessionWasInterruptedNotification]) {
        NSLog(@"session was interupted, userInfo: %@",notification.userInfo);
    } else if ([notification.name isEqualToString:AVCaptureSessionInterruptionEndedNotification]) {
        NSLog(@"session interupted end");
    } else if ([notification.name isEqualToString:AVCaptureSessionRuntimeErrorNotification]) {
        NSError *error = notification.userInfo[AVCaptureSessionErrorKey];
        if (error.code == AVErrorDeviceIsNotAvailableInBackground) {
            NSLog(@"session runtime error : AVErrorDeviceIsNotAvailableInBackground");
            _startSessionOnEnteringForeground = YES;
        } else {
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:serviceDidFailWithError:)]) {
                [self.delegate captureService:self serviceDidFailWithError:error];
            }
        }
    } else {
        NSLog(@"handel other notification : %@",notification.name);
    }
}

#pragma mark - UIApplicationWillEnterForegroundNotification
- (void)_enterForegroundNotification:(NSNotification *)notification {
    if (_startSessionOnEnteringForeground == YES) {
        NSLog(@"为了适配低于iOS 9的版本,在iOS 9以前,当session start 还没完成就退到后台,回到前台会捕获AVCaptureSessionRuntimeErrorNotification,这时需要手动重新启动session,iOS 9以后系统对此做了优化,系统退到后台后会将session start缓存起来,回到前台会自动调用缓存的session start,无需手动调用");
        _startSessionOnEnteringForeground = NO;
        [self startRunning];
    }
}

Notification setup for the write module:

//The write module registers only for the notifications relevant to writing state
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_assetWritterInterruptedNotification:) name:AVCaptureSessionWasInterruptedNotification object:nil];

- (void)_assetWritterInterruptedNotification:(NSNotification *)notification {
    NSLog(@"assetWritterInterruptedNotification");
    [self cancleWriting];
}

Starting & stopping capture

Start asynchronously to avoid blocking the main thread. Start and stop both run on the same serial queue, which rules out abnormal cases such as the session being stopped while a start is only half done.

- (void)startRunning {
    dispatch_async(_sessionQueue, ^{
        NSError *error = nil;
        BOOL result =  [self _setupSession:&error];
        if (result) {
            _isRunning = YES;
            [_captureSession startRunning];
        } else {
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:serviceDidFailWithError:)]) {
                [self.delegate captureService:self serviceDidFailWithError:error];
            }
        }
    });
}

- (void)stopRunning {
    dispatch_async(_sessionQueue, ^{
        _isRunning = NO;
        NSError *error = nil;
        [self _clearVideoFile:&error];
        if (error) {
            if (self.delegate && [self.delegate respondsToSelector:@selector(captureService:serviceDidFailWithError:)]) {
                [self.delegate captureService:self serviceDidFailWithError:error];
            }
        }
        [_captureSession stopRunning];
    });
}

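stopRunning calls a private _clearVideoFile: helper that the article does not show. Assuming it simply deletes the leftover temporary recording, a plausible sketch is:

- (void)_clearVideoFile:(NSError **)error {
    //Assumption: remove the leftover temporary recording, if one exists
    NSString *path = _recordURL.path;
    if (path && [[NSFileManager defaultManager] fileExistsAtPath:path]) {
        [[NSFileManager defaultManager] removeItemAtURL:_recordURL error:error];
    }
}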

Switching cameras

Switching is more than swapping the device: the old capture input has to be removed and a new device input added. The videoConnection changes when the camera is switched, so it has to be fetched again.

- (void)switchCamera {
    if (_openDepth) {
        return;
    }
    
    NSError *error;
    AVCaptureDevice *videoDevice = [self _inactiveCamera];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    
    if (videoInput) {
        [_captureSession beginConfiguration];
        
        [_captureSession removeInput:self.videoInput];
        
        if ([self.captureSession canAddInput:videoInput]) {
            [self.captureSession addInput:videoInput];
            self.videoInput = videoInput;
            //The videoConnection changes when the camera switches, so fetch it again
            self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
            if (_videoConnection.isVideoOrientationSupported) {
                _videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
            }
        } else {
            [self.captureSession addInput:self.videoInput];
        }
        
        [self.captureSession commitConfiguration];
    }
    
    _devicePosition = _devicePosition == AVCaptureDevicePositionFront?AVCaptureDevicePositionBack:AVCaptureDevicePositionFront;
}

- (AVCaptureDevice *)_inactiveCamera {
    AVCaptureDevice *device = nil;
    if (_devicePosition == AVCaptureDevicePositionBack) {
        device = [self _cameraWithPosition:AVCaptureDevicePositionFront];
    } else {
        device = [self _cameraWithPosition:AVCaptureDevicePositionBack];
    }
    return device;
}

Recording

videoSetting configures the encoding of the recorded video. The framework defaults to H.264, an efficient and very widely used video codec (a follow-up article will cover it in more detail; see AVVideoCodecType for the other options). In XDCaptureService's startRecording method we initialize the write module, XDVideoWritter, which configures the corresponding encoder from videoSetting. An example setting is sketched right below, followed by the framework code.
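For reference, a typical videoSetting dictionary could look like this. The keys are standard AVFoundation AVAssetWriterInput settings keys; the concrete numbers are illustrative, not values mandated by the framework, and service stands for an XDCaptureService instance:

NSDictionary *videoSetting = @{
    AVVideoCodecKey: AVVideoCodecH264, //H.264, the framework default
    AVVideoWidthKey: @720,
    AVVideoHeightKey: @1280,
    AVVideoCompressionPropertiesKey: @{
        AVVideoAverageBitRateKey: @(2000 * 1024),  //about 2 Mbps, illustrative
        AVVideoMaxKeyFrameIntervalKey: @30         //one keyframe per second at 30 fps
    }
};
service.videoSetting = videoSetting;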

- (void)startRecording {
    dispatch_async(_writtingQueue, ^{
        @synchronized(self) {
            NSString *videoFilePath = [_videoDir stringByAppendingPathComponent:[NSString stringWithFormat:@"Record-%llu.mp4",mach_absolute_time()]];
            
            _recordURL = [[NSURL alloc] initFileURLWithPath:videoFilePath];
            
            if (_recordURL) {
                _videoWriter = [[XDVideoWritter alloc] initWithURL:_recordURL VideoSettings:_videoSetting audioSetting:_audioSetting];
                _videoWriter.delegate = self;
                [_videoWriter startWriting];
                if (self.delegate && [self.delegate respondsToSelector:@selector(captureServiceRecorderDidStart:)]) {
                    [self.delegate captureServiceRecorderDidStart:self];
                }
            } else {
                NSLog(@"No record URL");
            }
        }
    });
}

//XDVideoWritter.m
- (void)startWriting {
    if (_assetWriter) {
        _assetWriter = nil;
    }
    NSError *error = nil;
    
    NSString *fileType = AVFileTypeMPEG4;
    _assetWriter = [[AVAssetWriter alloc] initWithURL:_outputURL fileType:fileType error:&error];
    
    if (!_assetWriter || error) {
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]){
            [self.delegate videoWritter:self didFailWithError:error];
        }
    }
    
    if (_videoSetting) {
        _videoInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:_videoSetting];
        
        _videoInput.expectsMediaDataInRealTime = YES;
        
        if ([_assetWriter canAddInput:_videoInput]) {
            [_assetWriter addInput:_videoInput];
        } else {
            NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2210 userInfo:@{NSLocalizedDescriptionKey:@"VideoWritter unable to add video input"}];
            if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                [self.delegate videoWritter:self didFailWithError:error];
            }
            return;
        }
    } else {
        NSLog(@"warning: no video setting");
    }
    
    if (_audioSetting) {
        _audioInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:_audioSetting];
        
        _audioInput.expectsMediaDataInRealTime = YES;
        
        if ([_assetWriter canAddInput:_audioInput]) {
            [_assetWriter addInput:_audioInput];
        } else {
            NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2211 userInfo:@{NSLocalizedDescriptionKey:@"VideoWritter unable to add audio input"}];
            if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                [self.delegate videoWritter:self didFailWithError:error];
            }
            return;
        }
    } else {
        NSLog(@"warning: no audio setting");
    }
    
    if ([_assetWriter startWriting]) {
        self.isWriting = YES;
    } else {
        NSError *error = [NSError errorWithDomain:@"com.xindong.captureservice.writter" code:-2212 userInfo:@{NSLocalizedDescriptionKey: [NSString stringWithFormat: @"VideoWritter startWriting fail error: %@",_assetWriter.error.localizedDescription]}];
        if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
            [self.delegate videoWritter:self didFailWithError:error];
        }
    }
    
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(_assetWritterInterruptedNotification:) name:AVCaptureSessionWasInterruptedNotification object:nil];
}


Recording works like this: while frames are being delivered, XDVideoWritter's appendSampleBuffer: method writes the data into a temporary file; when stopRecording is called, XDVideoWritter's stopWriting method stops writing and archives the temporary file as an MP4.

- (void)stopRecording {
    dispatch_async(_writtingQueue, ^{
        @synchronized(self) {
            if (_videoWriter) {
                [_videoWriter stopWriting];
            }
        }
    });
}

//XDVideoWritter.m
- (void)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CMFormatDescriptionRef formatDesc = CMSampleBufferGetFormatDescription(sampleBuffer);
    
    CMMediaType mediaType = CMFormatDescriptionGetMediaType(formatDesc);
    
    if (mediaType == kCMMediaType_Video) {
        CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        
        if (self.firstSample) {
            [_assetWriter startSessionAtSourceTime:timestamp];
            self.firstSample = NO;
        }
        
        if (_videoInput.readyForMoreMediaData) {
            if (![_videoInput appendSampleBuffer:sampleBuffer]) {
                NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2213 userInfo:@{NSLocalizedDescriptionKey:[NSString stringWithFormat: @"VideoWritter appending video sample buffer fail error:%@",_assetWriter.error.localizedDescription]}];
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                    [self.delegate videoWritter:self didFailWithError:error];
                }
            }
        }
    } else if (!self.firstSample && mediaType == kCMMediaType_Audio) {
        if (_audioInput.readyForMoreMediaData) {
            if (![_audioInput appendSampleBuffer:sampleBuffer]) {
                NSError *error = [NSError errorWithDomain:@"com.caixindong.captureservice.writter" code:-2214 userInfo:@{NSLocalizedDescriptionKey:[NSString stringWithFormat:@"VideoWritter appending audio sample buffer fail error: %@",_assetWriter.error]}];
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:didFailWithError:)]) {
                    [self.delegate videoWritter:self didFailWithError:error];
                }
            }
        }
    }
}

- (void)stopWriting {
    if (_assetWriter.status == AVAssetWriterStatusWriting) {
        self.isWriting = NO;
        [_assetWriter finishWritingWithCompletionHandler:^{
            if (_assetWriter.status == AVAssetWriterStatusCompleted) {
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:completeWriting:)]) {
                    [self.delegate videoWritter:self completeWriting:nil];
                }
            } else {
                if (self.delegate && [self.delegate respondsToSelector:@selector(videoWritter:completeWriting:)]) {
                    [self.delegate videoWritter:self completeWriting:_assetWriter.error];
                }
            }
        }];
    } else {
        NSLog(@"warning : stop writing with unsuitable state : %ld",_assetWriter.status);
    }
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureSessionWasInterruptedNotification object:nil];
}

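Since recordURL is only a temporary location (the header advises relocating the file after each recording), the app-side delegate might move the finished MP4 somewhere permanent once recording stops. A sketch, with an illustrative destination path:

- (void)captureServiceRecorderDidStop:(XDCaptureService *)service {
    //recordURL is temporary; move the finished MP4 to a permanent location (path is illustrative)
    NSString *docDir = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
    NSURL *destURL = [NSURL fileURLWithPath:[docDir stringByAppendingPathComponent:@"latest.mp4"]];
    NSError *moveError = nil;
    [[NSFileManager defaultManager] removeItemAtURL:destURL error:NULL]; //overwrite any previous file
    if (![[NSFileManager defaultManager] moveItemAtURL:service.recordURL toURL:destURL error:&moveError]) {
        NSLog(@"move recording failed: %@", moveError);
    }
}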

That's all for this article. I hope it helps you in your study or work.
