I am currently trying to work out how to manipulate frames from a webcam for a motion-detection game. I am new to Objective-C and have not been able to find a simple way to do this.
My question concerns the error messages produced by this method:
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    CIContext *myCIContext;
    const NSOpenGLPixelFormatAttribute attr[] = {
        NSOpenGLPFAAccelerated,
        NSOpenGLPFANoRecovery,
        NSOpenGLPFAColorSize, 32,
        0
    };
    NSOpenGLPixelFormat *pf = [[NSOpenGLPixelFormat alloc] initWithAttributes:(void *)&attr];
    myCIContext = [CIContext contextWithCGLContext:CGLGetCurrentContext()
                                       pixelFormat:[pf CGLPixelFormatObj]
                                           options:nil];

    CVImageBufferRef releasedImageBuffer;
    CVBufferRetain(videoFrame);

    CIImage *picture = [CIImage imageWithCVImageBuffer:releasedImageBuffer];
    NSRect frame = [self frame];
    CGRect imageRect;
    imageRect = [picture extent];

    [colorCorrectionFilter setValue:picture forKey:@"inputImage"];
    [effectFilter setValue:[colorCorrectionFilter valueForKey:@"outputImage"] forKey:@"inputImage"];

    // render our resulting image into our context
    [ciContext drawImage:[compositeFilter valueForKey:@"outputImage"]
                 atPoint:CGPointMake((int)((frame.size.width - imageRect.size.width) * 0.5),
                                     (int)((frame.size.height - imageRect.size.height) * 0.5)) // use integer coordinates to avoid interpolation
                fromRect:imageRect];

    @synchronized(self)
    {
        // basically, have the frame to be released refer to the current frame,
        // then update the reference to the current frame with the next frame in the "video stream"
        releasedImageBuffer = mCurrentImageBuffer;
        mCurrentImageBuffer = videoFrame;
    }
    CVBufferRelease(releasedImageBuffer);
}
The resulting error messages are:
warning: 'MyRecorderController' may not respond to '-frame'
error: invalid initializer
and the highlighted line is
NSRect frame = [self frame];
My header currently looks like this:
#import <QuickTime/ImageCompression.h>
#import <QuickTime/QuickTime.h>
#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>
#import <OpenGL/OpenGL.h>
#import <QuartzCore/QuartzCore.h>
#import <CoreVideo/CoreVideo.h>
@interface MyRecorderController : NSObject {
    IBOutlet QTCaptureView *mCaptureView;
    IBOutlet NSPopUpButton *videoDevicePopUp;
    NSMutableDictionary *namesToDevicesDictionary;
    NSString *defaultDeviceMenuTitle;
    CVImageBufferRef mCurrentImageBuffer;
    QTCaptureDecompressedVideoOutput *mCaptureDecompressedVideoOutput;

    // filters for CI rendering
    CIFilter *colorCorrectionFilter;    // hue saturation brightness control through one CI filter
    CIFilter *effectFilter;             // zoom blur filter
    CIFilter *compositeFilter;          // composites the timecode over the video
    CIContext *ciContext;

    QTCaptureSession *mCaptureSession;
    QTCaptureMovieFileOutput *mCaptureMovieFileOutput;
    QTCaptureDeviceInput *mCaptureDeviceInput;
}
@end
I have looked at tutorial code, but I don't understand what I am doing wrong. As far as I can tell (judging from various sample code), I don't need to declare a protocol here, which is what other sites suggested. I tried that anyway, and although it did compile, it ended up outputting:
2011-01-18 10:19:11.511 MyRecorder[9972:c903] -[MyRecorderController frame]: unrecognized selector sent to instance 0x1001525f0
2011-01-18 10:19:11.512 MyRecorder[9972:c903] *** Ignoring exception: -[MyRecorderController frame]: unrecognized selector sent to instance 0x1001525f0
Is something I did wrong causing this? If not, is there a better way to manipulate frames from the webcam (and output them to the screen)?
Thanks heaps!
Answer (score: 2):
You are trying to call the method frame on MyRecorderController, which does not have such a method. Either that class should inherit from a view class (NSView, since this is Cocoa rather than UIKit), or you need to implement the method yourself.
Ask yourself what you mean by "frame" here, and write an appropriate method.
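For example, here is a minimal sketch of the second option, assuming the rectangle you actually want is that of the view you are rendering into. Since MyRecorderController already has the mCaptureView outlet, you could take the bounds from that view instead of asking the controller for a frame it does not have; the method name renderTargetBounds is made up for illustration.

// In MyRecorderController.h, declared inside the @interface ... @end block:
- (NSRect)renderTargetBounds;

// In MyRecorderController.m:
- (NSRect)renderTargetBounds
{
    // Use the bounds of the existing QTCaptureView outlet as the area in
    // which the rendered CIImage should be centred.
    return [mCaptureView bounds];
}

With something like that in place, the offending line in the capture callback would become NSRect frame = [self renderTargetBounds];, which compiles because the selector is now declared and returns an NSRect rather than id.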