I am using OpenGL on iOS 7 to render the front-camera video capture into a UIView on the iPhone display (the same iPhone 5). I am using AVCaptureSessionPreset640x480 and passing it to the AVCaptureSession method:
[captureSession setSessionPreset:AVCaptureSessionPreset640x480];
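For reference, here is a minimal sketch of that configuration step, assuming captureSession already has the front-camera input attached; the canSetSessionPreset: check is just an illustrative way to confirm the preset is accepted before applying it:

// Sketch: confirm the preset is supported for the current inputs before applying it.
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    [captureSession setSessionPreset:AVCaptureSessionPreset640x480];
} else {
    NSLog(@"AVCaptureSessionPreset640x480 is not supported for this input");
}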
However, the rendered video appears to have a lower resolution than the one set above; it looks like AVCaptureSessionPreset352x288. In fact, no matter which of the following presets I pass, it makes no difference and the resolution stays the same:
NSString *const AVCaptureSessionPresetPhoto;
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480;
NSString *const AVCaptureSessionPreset1280x720;
NSString *const AVCaptureSessionPreset1920x1080;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPresetiFrame1280x720;
NSString *const AVCaptureSessionPresetInputPriority;
How can I check what resolution the camera is actually capturing at?
Thanks
Answer 0 (score: 2)
Read the dimensions of the buffers being captured, as shown below. (For AVCaptureSessionPresetPhoto you would of course need to capture a still image rather than read video frames...)
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Get the pixel buffer backing this video frame.
    CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the buffer while reading its properties, then unlock it.
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    size_t width  = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // "width" and "height" now hold the actual capture dimensions...
}
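Alternatively, a short sketch under the assumption that you keep a reference to the AVCaptureDevice used as the session input (the "camera" variable below is hypothetical): on iOS 7 the device's activeFormat should also report the dimensions it is currently delivering.

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Sketch: read the current capture dimensions from the device's active format.
// "camera" is assumed to be the AVCaptureDevice attached to the session input.
CMFormatDescriptionRef formatDescription = camera.activeFormat.formatDescription;
CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(formatDescription);
NSLog(@"Capturing at %dx%d", dimensions.width, dimensions.height);

Comparing this against the width and height read in the delegate above should tell you whether the session preset you set is actually being applied.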