We are trying to capture images from multiple external webcams, each at its best supported resolution, from a Cocoa application. We have not been able to capture all the feeds at once, i.e. open every webcam simultaneously and call the capture method on each.
What we have managed is to take an image from each webcam at a supported resolution, but only with a 3-second delay between two images, triggering the capture for each webcam separately.
We have 2 very specific requirements for this application...
Source code that captures images at a supported resolution, but with the 3-second delay:
- (void)initailzeCamera {   // method name kept as spelled in the original selectors
    if ([cameraArray count] == 0) {
        exit(1);
    }
    if (cameraCount >= [cameraArray count]) {
        cameraCount = 0;
    }

    videoDevice = [cameraArray objectAtIndex:cameraCount];
    if (![videoDevice isOpen]) {
        [videoDevice open:nil];
    }
    if (!videoDevice) {
        exit(1);
    }

    if (mCaptureDeviceInput) {
        [mCaptureDeviceInput release];
    }
    mCaptureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:videoDevice];

    // A new session is created on every pass; release the previous one
    // so it is not leaked each time a camera is switched.
    [mCaptureSession release];
    mCaptureSession = [[QTCaptureSession alloc] init];
    [mCaptureSession addInput:mCaptureDeviceInput error:nil];

    [mCaptureView setCaptureSession:mCaptureSession];
    [mCaptureView setVideoPreviewConnection:
        [[mCaptureView availableVideoPreviewConnections] objectAtIndex:0]];
    [mCaptureView setHidden:YES];
    [mCaptureSession startRunning];

    // Give the device time to start delivering frames before stopping.
    [self performSelector:@selector(stopCamera) withObject:nil afterDelay:2.0];
}
// QTCaptureView delegate method, called before each frame is displayed.
- (CIImage *)view:(QTCaptureView *)view willDisplayImage:(CIImage *)image {
    NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCIImage:image];
    [jpegData release];   // drop the previous frame's data to avoid leaking it
    jpegData = [[bitmapRep representationUsingType:NSJPEGFileType properties:nil] retain];
    [bitmapRep release];
    return image;
}
- (void)stopCamera {
    @try {
        [mCaptureSession stopRunning];

        NSString *imagePath = [locationLabel stringValue];
        // Build a file name from the current timestamp; ':' is replaced
        // because it is not a valid path character on HFS volumes.
        NSString *imageName = [NSString stringWithFormat:@"%@.jpg",
            [[[NSDate date] description] substringToIndex:19]];
        imageName = [imageName stringByReplacingOccurrencesOfString:@":" withString:@"."];
        NSString *appFile = [imagePath stringByAppendingPathComponent:imageName];

        if (jpegData != nil) {
            if (![jpegData writeToFile:[appFile stringByExpandingTildeInPath] atomically:YES]) {
                NSLog(@"Failed to write %@", appFile);
            }
        } else {
            @throw [NSException exceptionWithName:@"JpegDataNullException"
                                           reason:@"jpegData null found"
                                         userInfo:nil];
        }

        cameraCount++;
        if (!flagForPause) {
            if (cameraCount < [cameraArray count]) {
                [self performSelector:@selector(initailzeCamera) withObject:nil afterDelay:1.0];
            } else if (flagForAutoMode) {
                [self performSelector:@selector(initailzeCamera) withObject:nil
                           afterDelay:[timeREcquire intValue] - 5];
            }
        }
    }
    @catch (NSException *e) {
        NSLog(@"stopCamera failed: %@", e);
    }
}
Answer 0 (score: 3)
Capturing from multiple cameras with QTKit works just fine. You simply open two sessions and let them run in parallel (if you don't need to control them separately, you can even open one session and add both input devices). Starting a device capture takes a couple of seconds, which is one reason the approach above doesn't work well: you have to keep both sessions running. The other problem is that you are capturing from the preview, which degrades quality. If you want full-resolution capture, you should use captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: to receive full-size frames.
An example of how to do this is http://svn.rforge.net/osx/trunk/tools/wcam.m
That code is intended for slow capture (1 fps), so for your use you may want to remove or adjust [dvo setMinimumVideoFrameInterval:1.0];
to suit your needs (note, however, that storing the images must be fast enough not to stall the capture).
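A minimal sketch of the parallel-session approach this answer describes: one QTCaptureSession per device, all kept running, with a QTCaptureDecompressedVideoOutput delivering full-size frames instead of the preview. The class name ParallelCapture and the ivar sessions are illustrative, not from the original post; error handling is abbreviated.

```objc
#import <QTKit/QTKit.h>

@interface ParallelCapture : NSObject {
    NSMutableArray *sessions;
}
- (void)startAllCameras;
@end

@implementation ParallelCapture

- (void)startAllCameras {
    sessions = [[NSMutableArray alloc] init];
    NSArray *devices = [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo];

    for (QTCaptureDevice *device in devices) {
        NSError *error = nil;
        if (![device open:&error]) continue;

        QTCaptureSession *session = [[QTCaptureSession alloc] init];
        QTCaptureDeviceInput *input =
            [[QTCaptureDeviceInput alloc] initWithDevice:device];
        [session addInput:input error:&error];

        // Full-resolution frames come from a decompressed video output,
        // not from the preview view.
        QTCaptureDecompressedVideoOutput *output =
            [[QTCaptureDecompressedVideoOutput alloc] init];
        [output setDelegate:self];
        [session addOutput:output error:&error];

        [session startRunning];          // keep every session running
        [sessions addObject:session];
        [input release];
        [output release];
        [session release];
    }
}

// Delegate callback: invoked for each captured frame.
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection {
    // Convert and store the frame here; this must be fast enough
    // not to stall the capture pipeline.
}

@end
```

Because every session keeps running, the multi-second device startup cost is paid once per camera rather than on every shot.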
Answer 1 (score: 1)
I tried the source code you provided, but the problem I ran into is that one webcam takes only one picture and stops, while the second webcam keeps taking pictures.
devices = [[NSMutableArray alloc] initWithArray:
    [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo]];
//[devices removeObject:[QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo]];

// Bail out before the loop if no cameras are attached.
if ([devices count] == 0) {
    NSLog(@"no devices found, terminating");
    exit(1);
}

int devId = 0;
for (QTCaptureDevice *device in devices) {
    NSLog(@"device: %@", device);
    Capture *cap = [[Capture alloc] initWithDevice:device];
    [cap setFileName:[NSString stringWithFormat:@"image.%d.%%04d.jpeg", ++devId]];
    [cap start];
}