In live-streaming apps, video capture is generally done with the AVFoundation framework: it lets us customize the capture parameters; it supports camera operations such as switching between the front and back cameras, taking photos, and turning the torch on and off; and, most importantly, it gives us access to the raw video data for encoding and other processing. This article covers the following:
- The key classes involved in video capture
- The steps of video capture
- How to change capture parameters, e.g. resolution, frame rate, zooming the preview layer in and out, and setting exposure
- Camera operations in detail, e.g. taking photos, switching between the front and back cameras, and turning the torch on and off
Code:
- GitHub
- Forks & stars welcome
Key classes for video capture
AVCaptureDevice
This class represents a hardware device; through it we can access the phone's cameras, microphone, and other sensors. When we need to change a device property (e.g. the flash mode or the focus mode), we must call `lockForConfiguration` to lock the device before making the change, and call `unlockForConfiguration` afterwards to unlock it.
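As a minimal sketch of this lock/configure/unlock pattern (focus mode is used here purely as an illustration, not as code from the project):

```objc
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
// Always lock before touching device properties, and unlock when done
if ([device lockForConfiguration:&error]) {
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
        device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
    }
    [device unlockForConfiguration];
} else {
    NSLog(@"lockForConfiguration failed: %@", error);
}
```

If the lock fails (for example, another client holds it), the property change would throw, which is why the lock call is checked first.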
AVCaptureDeviceInput
An input-device management object. We can create an AVCaptureDeviceInput from an AVCaptureDevice; the resulting object is then added to an AVCaptureSession, which manages it. It represents an input device and configures the hardware device's ports; typical input devices are the microphone and the cameras.
AVCaptureOutput
Represents the data output. The output can be still images (AVCaptureStillImageOutput, deprecated since iOS 10 in favor of AVCapturePhotoOutput) or video files (AVCaptureMovieFileOutput).
AVCaptureSession
The media capture session, responsible for routing the captured audio and video data to the outputs. One AVCaptureSession can have multiple inputs and outputs. It is the bridge between AVCaptureInput and AVCaptureOutput, coordinating the flow of data from inputs to outputs. The session is started and stopped with the `startRunning` and `stopRunning` methods.
If the session's configuration needs to change while the app is running (e.g. switching cameras), the changes must be wrapped between `beginConfiguration` and `commitConfiguration` so they are applied as one batch.
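A minimal sketch of such an atomic reconfiguration (the 720p preset here is just an example value):

```objc
// Batch configuration changes so the session applies them atomically
[session beginConfiguration];
if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    session.sessionPreset = AVCaptureSessionPreset1280x720;
}
[session commitConfiguration];
```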
AVCaptureConnection
As Apple's documentation puts it: "AVCaptureConnection represents a connection between an AVCaptureInputPort or ports, and an AVCaptureOutput or AVCaptureVideoPreviewLayer present in an AVCaptureSession." That is, it is a connection either between an input port and an output, or between the preview layer and the current session.
AVCaptureVideoPreviewLayer
The preview layer. How do the captured photos and video frames get displayed on screen? By adding this object as a sublayer of a UIView's layer.
Steps of video capture
Below is the video capture code; the frame rate is 30 FPS and the resolution is 1920x1080.
```objc
#import "MiVideoCollectVC.h"
#import <AVFoundation/AVFoundation.h>

@interface MiVideoCollectVC () <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureVideoDataOutput *video_output;
@property (nonatomic, strong) AVCaptureSession *m_session;
@property (weak, nonatomic) IBOutlet UIView *m_displayView;
@end

@implementation MiVideoCollectVC

- (void)viewDidLoad {
    [super viewDidLoad];
    [self startCaptureSession];
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    [self startPreview];
}

- (IBAction)onpressedBtnDismiss:(id)sender {
    [self dismissViewControllerAnimated:YES completion:^{
        [self stopPreview];
    }];
}

- (void)startCaptureSession {
    NSError *error = nil;
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
        session.sessionPreset = AVCaptureSessionPreset1920x1080;
    } else {
        session.sessionPreset = AVCaptureSessionPresetHigh;
    }

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (error || !input) {
        NSLog(@"get input device error...");
        return;
    }
    [session addInput:input];

    _video_output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:_video_output];

    // Specify the pixel format
    _video_output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
                                                              forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    _video_output.alwaysDiscardsLateVideoFrames = NO;
    dispatch_queue_t video_queue = dispatch_queue_create("MIVideoQueue", NULL);
    [_video_output setSampleBufferDelegate:self queue:video_queue];

    // Pin the frame rate to 30 FPS if the active format supports it
    CMTime frameDuration = CMTimeMake(1, 30);
    BOOL frameRateSupported = NO;
    for (AVFrameRateRange *range in [device.activeFormat videoSupportedFrameRateRanges]) {
        if (CMTIME_COMPARE_INLINE(frameDuration, >=, range.minFrameDuration) &&
            CMTIME_COMPARE_INLINE(frameDuration, <=, range.maxFrameDuration)) {
            frameRateSupported = YES;
        }
    }
    if (frameRateSupported && [device lockForConfiguration:&error]) {
        [device setActiveVideoMaxFrameDuration:frameDuration];
        [device setActiveVideoMinFrameDuration:frameDuration];
        [device unlockForConfiguration];
    }

    [self adjustVideoStabilization];
    _m_session = session;

    // Insert the preview layer at the bottom of the display view's layer stack
    CALayer *previewViewLayer = [self.m_displayView layer];
    previewViewLayer.backgroundColor = [[UIColor blackColor] CGColor];
    AVCaptureVideoPreviewLayer *newPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_m_session];
    [newPreviewLayer setFrame:[UIApplication sharedApplication].keyWindow.bounds];
    [newPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [previewViewLayer insertSublayer:newPreviewLayer atIndex:0];
}

- (void)adjustVideoStabilization {
    NSArray *devices = [AVCaptureDevice devices];
    for (AVCaptureDevice *device in devices) {
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeAuto]) {
                for (AVCaptureConnection *connection in _video_output.connections) {
                    for (AVCaptureInputPort *port in [connection inputPorts]) {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            if (connection.supportsVideoStabilization) {
                                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
                                NSLog(@"now videoStabilizationMode = %ld", (long)connection.activeVideoStabilizationMode);
                            } else {
                                NSLog(@"connection does not support video stabilization");
                            }
                        }
                    }
                }
            } else {
                NSLog(@"device does not support video stabilization");
            }
        }
    }
}

- (void)startPreview {
    if (![_m_session isRunning]) {
        [_m_session startRunning];
    }
}

- (void)stopPreview {
    if ([_m_session isRunning]) {
        [_m_session stopRunning];
    }
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"%s", __func__);
}

// Called when a frame is dropped
- (void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"MediaIOS: frame dropped...");
}

@end
```
The steps of video capture can be summarized as follows:
- Create an AVCaptureSession object, create the input and output devices, and add them to the session.
- Set the session's video resolution (sessionPreset).
- Set the capture frame rate.
- Create a video preview layer and insert it into the view's layer.
Changing capture parameters: resolution and frame rate
Before covering how to change the resolution and frame rate, let's look at how to monitor these parameters, because only by observing them can we verify that our settings actually took effect.
Monitoring the resolution:
We can read it directly from the AVCaptureSession's `sessionPreset` property. It is a string, so after configuring the session we can simply print it.
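For example, assuming `_m_session` is the session configured earlier:

```objc
// The preset is a plain string constant, so printing it verifies the setting
NSLog(@"current preset: %@", _m_session.sessionPreset);
```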
Monitoring the frame rate:
The frame rate is the number of frames captured per second. We can start a timer that fires once per second and prints the current capture frame rate. Below is the code that counts the frames captured within one second:
```objc
// Count how many video frames are captured per second
static int captureVideoFPS;

+ (void)calculatorCaptureFPS {
    static int count = 0;
    static float lastTime = 0;
    CMClockRef hostClockRef = CMClockGetHostTimeClock();
    CMTime hostTime = CMClockGetTime(hostClockRef);
    float nowTime = CMTimeGetSeconds(hostTime);
    count++;  // count the current frame before checking the window boundary
    if (nowTime - lastTime >= 1) {
        captureVideoFPS = count;
        lastTime = nowTime;
        count = 0;
    }
}

// Read the measured frame rate
+ (int)getCaptureVideoFPS {
    return captureVideoFPS;
}
```
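The wiring is not shown above; a sketch, under the assumption that these class methods live on a hypothetical `MiTool` class, is to feed the counter from the sample-buffer callback and read it from a one-second timer:

```objc
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    [MiTool calculatorCaptureFPS];  // count this captured frame
}

// Elsewhere, e.g. in viewDidLoad:
[NSTimer scheduledTimerWithTimeInterval:1.0 repeats:YES block:^(NSTimer *timer) {
    NSLog(@"current capture FPS: %d", [MiTool getCaptureVideoFPS]);
}];
```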
Changing the resolution
```objc
/**
 * Reset the capture resolution
 *
 * @param m_session  AVCaptureSession instance
 * @param resolution target vertical resolution (1080 / 720 / 480 / 360)
 */
+ (void)resetSessionPreset:(AVCaptureSession *)m_session resolution:(int)resolution {
    [m_session beginConfiguration];
    switch (resolution) {
        case 1080:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset1920x1080] ? AVCaptureSessionPreset1920x1080 : AVCaptureSessionPresetHigh;
            break;
        case 720:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset1280x720] ? AVCaptureSessionPreset1280x720 : AVCaptureSessionPresetMedium;
            break;
        case 480:
            m_session.sessionPreset = [m_session canSetSessionPreset:AVCaptureSessionPreset640x480] ? AVCaptureSessionPreset640x480 : AVCaptureSessionPresetMedium;
            break;
        case 360:
            m_session.sessionPreset = AVCaptureSessionPresetMedium;
            break;
        default:
            break;
    }
    [m_session commitConfiguration];
}
```
Changing the frame rate
```objc
+ (void)settingFrameRate:(int)frameRate {
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    if (![captureDevice lockForConfiguration:&error]) {
        NSLog(@"MediaIOS, lockForConfiguration error: %@", error.description);
        return;
    }
    @try {
        // Setting min == max pins the capture to a fixed frame rate
        [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
    } @catch (NSException *exception) {
        NSLog(@"MediaIOS, the device does not support this frame rate, error: %@", exception.description);
    }
    [captureDevice unlockForConfiguration];
}
```
Adding a pinch gesture to the preview layer
A pinch gesture lets the user zoom the previewed video in and out.
```objc
#define MiMaxZoomFactor 5.0f
#define MiPrinchVelocityDividerFactor 20.0f

+ (void)zoomCapture:(UIPinchGestureRecognizer *)recognizer {
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([recognizer state] == UIGestureRecognizerStateChanged) {
        NSError *error = nil;
        if ([videoDevice lockForConfiguration:&error]) {
            // Derive the zoom step from the pinch velocity, then clamp it
            CGFloat desiredZoomFactor = videoDevice.videoZoomFactor + atan2f(recognizer.velocity, MiPrinchVelocityDividerFactor);
            videoDevice.videoZoomFactor = desiredZoomFactor <= MiMaxZoomFactor ? MAX(1.0, MIN(desiredZoomFactor, videoDevice.activeFormat.videoMaxZoomFactor)) : MiMaxZoomFactor;
            [videoDevice unlockForConfiguration];
        } else {
            NSLog(@"error: %@", error);
        }
    }
}
```
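Attaching the recognizer is not shown; since `zoomCapture:` is a class method, one hypothetical wiring (assuming the preview is hosted in `m_displayView` as earlier) is:

```objc
// Pass the class object as the target because zoomCapture: is a class method
UIPinchGestureRecognizer *pinch =
    [[UIPinchGestureRecognizer alloc] initWithTarget:[self class]
                                              action:@selector(zoomCapture:)];
[self.m_displayView addGestureRecognizer:pinch];
```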
Camera operations
While capturing video, we may also need to switch between the front and back cameras, toggle the torch, take photos, and so on.
Switching between front and back cameras
After switching cameras, the resolution here is reset to a fixed 720p, because on some devices the front camera does not support 1080p. In a real project you would instead restore the preset you had configured before, with a fallback for presets the front camera doesn't support.
```objc
// Switch between front and back cameras
- (void)switchCamera {
    [_m_session beginConfiguration];
    if ([[_video_input device] position] == AVCaptureDevicePositionBack) {
        NSArray *devices = [AVCaptureDevice devices];
        for (AVCaptureDevice *device in devices) {
            if ([device hasMediaType:AVMediaTypeVideo] && [device position] == AVCaptureDevicePositionFront) {
                [self rePreviewWithCameraType:MiCameraType_Front device:device];
                break;
            }
        }
    } else {
        NSArray *devices = [AVCaptureDevice devices];
        for (AVCaptureDevice *device in devices) {
            if ([device hasMediaType:AVMediaTypeVideo] && [device position] == AVCaptureDevicePositionBack) {
                [self rePreviewWithCameraType:MiCameraType_Back device:device];
                break;
            }
        }
    }
    [_m_session commitConfiguration];
}

- (void)rePreviewWithCameraType:(MiCameraType)cameraType device:(AVCaptureDevice *)device {
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) return;

    [_m_session removeInput:_video_input];
    // Drop to a low preset first so the new input can always be added
    _m_session.sessionPreset = AVCaptureSessionPresetLow;
    if ([_m_session canAddInput:input]) {
        [_m_session addInput:input];
    } else {
        return;
    }
    _video_input = input;
    _m_cameraType = cameraType;

    NSString *preset = AVCaptureSessionPreset1280x720;
    if ([device supportsAVCaptureSessionPreset:preset] && [_m_session canSetSessionPreset:preset]) {
        _m_session.sessionPreset = preset;
    } else {
        // Fall back if the new camera does not support 720p
        _m_session.sessionPreset = AVCaptureSessionPresetMedium;
    }
}
```
Turning the torch on and off
```objc
// Toggle the torch
- (void)switchTorch {
    [_m_session beginConfiguration];
    [[_video_input device] lockForConfiguration:NULL];
    self.m_torchMode = [_video_input device].torchMode == AVCaptureTorchModeOn ? AVCaptureTorchModeOff : AVCaptureTorchModeOn;
    if ([[_video_input device] isTorchModeSupported:_m_torchMode]) {
        [_video_input device].torchMode = self.m_torchMode;
    }
    [[_video_input device] unlockForConfiguration];
    [_m_session commitConfiguration];
}
```
Taking a photo and saving it to the photo album
The approach: take a CMSampleBufferRef from the video data output callback, convert it to a UIImage, and write that image to the photo album.
Note: the code below saves a correct color photo only when the session's pixel format is set to an RGB format (e.g. kCVPixelFormatType_32BGRA).
```objc
- (UIImage *)convertSameBufferToUIImage:(CMSampleBufferRef)sampleBuffer {
    // Get the Core Video image buffer backing the sample buffer
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Get the base address, bytes per row, width, and height of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Create a bitmap graphics context backed by the pixel data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // Release the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    // Create a UIImage from the Quartz image, then release the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}

+ (void)saveImageToSysphotos:(UIImage *)image {
    // Note: ALAssetsLibrary is deprecated since iOS 9; PHPhotoLibrary is the modern replacement
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage metadata:nil completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"MediaIos, save photo to photos error, error info: %@", error.description);
        } else {
            NSLog(@"MediaIos, save photo success...");
        }
    }];
}
```
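How the capture is triggered is not shown in the original; one sketch (the `_capturePhoto` ivar is hypothetical) is to set a flag from a shutter button and consume the next frame in the delegate callback:

```objc
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    if (_capturePhoto) {
        _capturePhoto = NO;  // consume only one frame per shutter tap
        UIImage *image = [self convertSameBufferToUIImage:sampleBuffer];
        [[self class] saveImageToSysphotos:image];
    }
}
```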
Setting autofocus
```objc
// Tap to focus
- (void)mifocus:(UITapGestureRecognizer *)sender {
    CGPoint point = [sender locationInView:self.m_displayView];
    [self miAutoFocusWithPoint:point];
    NSLog(@"MediaIos, auto focus complete...");
}

- (void)miAutoFocusWithPoint:(CGPoint)touchPoint {
    // Note: focusPointOfInterest expects normalized (0..1) coordinates; in production code,
    // convert the view point with -[AVCaptureVideoPreviewLayer captureDevicePointOfInterestForPoint:]
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([captureDevice isFocusPointOfInterestSupported] && [captureDevice isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([captureDevice lockForConfiguration:&error]) {
            // Set the exposure point
            [captureDevice setExposurePointOfInterest:touchPoint];
            [captureDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            // Set the focus point
            [captureDevice setFocusPointOfInterest:touchPoint];
            [captureDevice setFocusMode:AVCaptureFocusModeAutoFocus];
            [captureDevice unlockForConfiguration];
        }
    }
}
```
Adjusting exposure
```objc
// Exposure adjustment
- (void)changeExposure:(id)sender {
    UISlider *slider = (UISlider *)sender;
    [self michangeExposure:slider.value];
}

- (void)michangeExposure:(CGFloat)value {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        // value should lie within [device.minExposureTargetBias, device.maxExposureTargetBias]
        [device setExposureTargetBias:value completionHandler:nil];
        [device unlockForConfiguration];
    }
}
```
Adjusting the white balance
```objc
- (AVCaptureWhiteBalanceGains)recalcGains:(AVCaptureWhiteBalanceGains)gains
                                 minValue:(CGFloat)minValue
                                 maxValue:(CGFloat)maxValue {
    // Clamp each channel gain into [minValue, maxValue]
    AVCaptureWhiteBalanceGains tmpGains = gains;
    tmpGains.blueGain  = MAX(MIN(tmpGains.blueGain,  maxValue), minValue);
    tmpGains.redGain   = MAX(MIN(tmpGains.redGain,   maxValue), minValue);
    tmpGains.greenGain = MAX(MIN(tmpGains.greenGain, maxValue), minValue);
    return tmpGains;
}

- (void)setWhiteBlanceUseTemperature:(CGFloat)temperature {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        [device lockForConfiguration:nil];
        // Keep the current tint and change only the temperature
        AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
        CGFloat currentTint = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].tint;
        AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
            .temperature = temperature,
            .tint = currentTint,
        };
        AVCaptureWhiteBalanceGains gains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
        CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
        gains = [self recalcGains:gains minValue:1 maxValue:maxWhiteBalanceGain];
        [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:gains completionHandler:nil];
        [device unlockForConfiguration];
    }
}

// White balance slider callback
- (void)whiteBlanceChange:(id)sender {
    UISlider *slider = (UISlider *)sender;
    [self setWhiteBlanceUseTemperature:slider.value];
}
```
That's all for this article; I hope it helps with your study of video capture on iOS.