Do I need an NSRunLoop when calling Foundation from a non-Cocoa app?

Date: 2016-09-17 00:48:01

Tags: avfoundation nsrunloop

I'm writing a plugin for a program that doesn't live in a Cocoa environment (think of a C++ command-line program). If you're curious, it's the v8 Node.js addon system. I'd like the plugin to record the screen, making use of AVCaptureSession and friends. Basically, something like this:

void start(/*entry*/)
{
    // No run loop is *necessarily* present.
    AVCaptureSession * session = ...
}

void stop (/*entry*/)
{
    // etc..
}

In practice I'll probably start a new pthread to do this work, so none of it blocks. My question is how much surrounding infrastructure I need to set up. I'm almost certain I need an @autoreleasepool {}, but should I also spin up the default NSRunLoop on that thread? I have the impression that if I don't, some of the trickier parts of AVCapture may fail:

BOOL isStillRecording = YES;
void start(/*entry*/)
{
   // setup avcapture and what have you.
   // Note: NSRunLoop shouldn't be created with +new; each thread already
   // has one, obtained lazily via +currentRunLoop.
   NSRunLoop *theRL = [NSRunLoop currentRunLoop];
   while (isStillRecording && [theRL runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]]);
}

void stop(/*entry*/)
{
   // Tear down AVCapture, possibly via dispatch_async so as not to collide with start().
   isStillRecording = NO;
}

1 Answer:

Answer (score: 0):

UPDATE: Actually, I take that back. It looks like you do need the current run loop to receive some delegate callbacks, such as AVCaptureFileOutput's. This records a QuickTime movie of the screen and microphone:

#import <AVFoundation/AVFoundation.h>

@interface Capturer : NSObject
- (void)start;
- (void)stop:(void (^)(NSError *))finishBlock;
@end

@interface Capturer () <AVCaptureFileOutputRecordingDelegate>

@property(nonatomic) AVCaptureSession *session;
@property(nonatomic) AVCaptureMovieFileOutput *movieOutput;
@property(nonatomic, copy) void(^finishBlock)(NSError*);

@end

@implementation Capturer
- (instancetype)init
{
    self = [super init];
    if (self) {
        [self setup];
    }
    return self;
}

- (void)setup {
    self.session = [[AVCaptureSession alloc] init];

    // capture the screen
    CGDirectDisplayID displayId = CGMainDisplayID();
    AVCaptureScreenInput *screenInput = [[AVCaptureScreenInput alloc] initWithDisplayID:displayId];

    [self.session addInput:screenInput];

    // capture microphone input
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
    [self.session addInput:audioInput];

    self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];

    [self.session addOutput:self.movieOutput];
}

- (void)start {
    [self.session startRunning];
    NSURL *movieURL = [[NSURL fileURLWithPath:[[NSFileManager defaultManager] currentDirectoryPath]] URLByAppendingPathComponent:@"output.mov"];
    [[NSFileManager defaultManager] removeItemAtURL:movieURL error:nil];
    NSLog(@"recording to %@", movieURL.path);
    [self.movieOutput startRecordingToOutputFileURL:movieURL recordingDelegate:self];
}

- (void)stop:(void (^)(NSError *))finishBlock {
    self.finishBlock = finishBlock;
    [self.movieOutput stopRecording];
}

// MARK: AVCaptureFileOutputRecordingDelegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    NSLog(@"Finished recording to %@ with error %@", outputFileURL.path, error);
    [self.session stopRunning];
    self.finishBlock(error);
}

@end


int main(int argc, const char * argv[]) {
    @autoreleasepool {
        Capturer *c = [[Capturer alloc] init];
        NSRunLoop *runLoop = [NSRunLoop currentRunLoop];

        [c start];

        // record 10s' worth
        __block BOOL finished = NO;
        [c performSelector:@selector(stop:) withObject:^(NSError *error) {
            finished = YES;
        } afterDelay:10];

        // cribbed from https://gist.github.com/syzdek/3220789
        while(!finished && [runLoop runMode:NSDefaultRunLoopMode beforeDate:[NSDate dateWithTimeIntervalSinceNow:2]]) NSLog(@"waiting");

    }
    return 0;
}

PREVIOUSLY

I've used AVFoundation in command-line apps before without needing an NSRunLoop. What I did need was to:

  1. Create an @autoreleasepool (as you said).
  2. Because AVFoundation is very asynchronous, use a semaphore to make sure I didn't exit before processing had finished.