performSelector:withObject:afterDelay: does not call the selector

Asked: 2013-12-06 08:12:56

Tags: ios objective-c delayed-execution

I'm currently developing an app that captures images at different exposurePointOfInterest settings. The steps are basically:

  1. Set the point of interest to point A
  2. Capture
  3. Set the point of interest to point B
  4. Capture

I had to put a dummy for loop between steps 1 and 2, and again between steps 3 and 4, to give the lens time to actually adjust to the intended point; otherwise the captures in steps 2 and 4 produce identical pictures. This works, but I don't think it's the right way to solve the problem.

I tried using this code instead of the for loop:

    [self performSelector:@selector(captureStillImage) withObject:@"Grand Central Dispatch" afterDelay:1.0];
    

But when I run it, it behaves as if the selector captureStillImage were never executed. Is there something I'm doing wrong? Or is there a better solution anyone can suggest?

The function in which I capture multiple images looks like this:

    -(void)captureMultipleImg
    {
        // CAPTURE FIRST IMAGE WITH EXPOSURE POINT (0,0)
        [self continuousExposeAtPoint:CGPointMake(0.0f, 0.0f)];

        NSLog(@"Looping..");
        for(int i=0; i<100000000; i++){
        }
        NSLog(@"Finish Looping");
        [self captureStillImage];

        // CAPTURE SECOND IMAGE WITH EXPOSURE POINT (0.5,0.5)
        [self continuousExposeAtPoint:CGPointMake(0.5f, 0.5f)];

        NSLog(@"Looping..");
        for(int i=0; i<100000000; i++){
        }
        NSLog(@"Finish Looping");

        [self captureStillImage];
    }
    

The code for captureStillImage looks like this:

    -(void)captureStillImage
    {
        AVCaptureConnection *connection = [stillImage connectionWithMediaType:AVMediaTypeVideo];

        typedef void(^MyBufBlock)(CMSampleBufferRef, NSError*);

        MyBufBlock h = ^(CMSampleBufferRef buf, NSError *err){
            NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buf];
            [self setToSaveImage:[UIImage imageWithData:data]];

            NSLog(@"Saving to Camera Roll..");
            // Saving photo to camera roll
            UIImageWriteToSavedPhotosAlbum(toSaveImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
            toSaveImage = NULL;
        };

        [stillImage captureStillImageAsynchronouslyFromConnection:connection completionHandler:h];
    }
    

The code for the continuousExposeAtPoint: method:

    -(void)continuousExposeAtPoint:(CGPoint)point
    {
        if([device isExposurePointOfInterestSupported] && [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]){
            if([device lockForConfiguration:NULL]){
                [device setExposurePointOfInterest:point];
                [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
                [device unlockForConfiguration];
                NSLog(@"Exposure point of interest has been set to (%f,%f)", point.x, point.y);
            }
        }
    }
    

Thanks in advance!

8 answers:

Answer 0 (score: 2)

You can use performSelector:withObject:afterDelay: instead of the dummy loop.

Answer 1 (score: 1)

I'm going out on a limb here, because I'd like to suggest a different approach that avoids "busy waiting" and "run-loop waiting" altogether.

If I understand the camera correctly, it may take a certain amount of time until the camera has set the exposure point. There is a property adjustingExposure that reflects this state of the camera. This property is KVO-compliant, so we can use KVO to observe its value.

So the idea is to set the exposure point and then observe the property adjustingExposure. When its value changes to NO, the camera has finished setting the exposure point.

Now we can leverage KVO to invoke a completion handler as soon as the setup has finished. The method that sets the exposure point becomes asynchronous, with a completion handler:

typedef void (^completion_t) ();
-(void)continuousExposeAtPoint:(CGPoint)point 
                    completion:(completion_t)completionHandler;

Assuming you have implemented KVO correctly in the method above, you could use it as follows:

-(void)captureMultipleImg
{
    [self continuousExposeAtPoint:CGPointMake(0.0f, 0.0f) completion:^{
        [self captureStillImage];
        [self continuousExposeAtPoint:CGPointMake(0.5f, 0.5f) completion:^{
            [self captureStillImage];
        }];
    }];
}

Edit:

Note that the method captureMultipleImg has now become asynchronous as well.

Note:

A method that invokes an asynchronous method becomes itself asynchronous.

Thus, in order to let the call site know when the underlying asynchronous task has finished, we can provide a completion handler:

typedef void (^completion_t)();
-(void)captureMultipleImagesWithCompletion:(completion_t)completionHandler
{
    [self continuousExposeAtPoint:CGPointMake(0.0f, 0.0f) completion:^{
        [self captureStillImage];
        [self continuousExposeAtPoint:CGPointMake(0.5f, 0.5f) completion:^{
            [self captureStillImage];
            if (completionHandler) {
                completionHandler();
            }
        }];
    }];
}

The button action could then be implemented as follows:

- (void)captureImages {
    [self showLabel];
    self.captureImagesButton.enabled = NO;
    [manager captureMultipleImagesWithCompletion:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self hideLabel];
            // Re-enable the button once all captures have finished.
            self.captureImagesButton.enabled = YES;
        });
    }];
}

Edit:

For a quick start, you could implement KVO and your method as shown below. Caveat: not tested!

-(void)continuousExposeAtPoint:(CGPoint)point 
                    completion:(completion_t)completionHandler
{
    AVCaptureDevice* device; // ...;

    if([device isExposurePointOfInterestSupported] && [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]){
        if([device lockForConfiguration:NULL]){

            [device addObserver:self forKeyPath:@"adjustingExposure"
                        options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
                        context:(__bridge_retained void*)([completionHandler copy])];
            [device setExposurePointOfInterest:point];
            [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
        }
    }
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object change:(NSDictionary *)change
                       context:(void *)context
{
    AVCaptureDevice* device; // = ...;

    if ([keyPath isEqual:@"adjustingExposure"]) {
        if ([[change objectForKey:NSKeyValueChangeNewKey] boolValue] == NO) {
            CGPoint point = device.exposurePointOfInterest;
            NSLog(@"Exposure point of interest has been set to (%f,%f)", point.x, point.y);

            [device removeObserver:self forKeyPath:@"adjustingExposure"];
            [device unlockForConfiguration];
            completion_t block = CFBridgingRelease(context);
            if (block) {
                block();
            }
        }
    }
    else {
        // Forward unhandled notifications to the superclass's implementation
        // *if* it implements it. NSObject's default implementation raises an
        // exception for observations it did not register, so only forward
        // key paths we don't handle ourselves.
        [super observeValueForKeyPath:keyPath
                             ofObject:object
                               change:change
                              context:context];
    }
}

The caveat here is that KVO is a hassle to set up. But once you manage to wrap it into a method with a completion handler, it looks much nicer ;)

Answer 2 (score: 0)

Use dispatch_after:

double delayInSeconds = 2.0;
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delayInSeconds * NSEC_PER_SEC));
dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
    // Code
});
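Applied to the question's captureMultipleImg, a sketch of this approach (assuming the continuousExposeAtPoint: and captureStillImage methods from the question, and a 1-second delay chosen arbitrarily) could replace the busy loops with nested dispatch_after calls:

```objectivec
// Replace the busy-wait loops with nested dispatch_after calls, so the
// main thread stays responsive while the camera adjusts its exposure.
-(void)captureMultipleImg
{
    [self continuousExposeAtPoint:CGPointMake(0.0f, 0.0f)];
    dispatch_time_t t1 = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC));
    dispatch_after(t1, dispatch_get_main_queue(), ^{
        [self captureStillImage];
        [self continuousExposeAtPoint:CGPointMake(0.5f, 0.5f)];
        dispatch_time_t t2 = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC));
        dispatch_after(t2, dispatch_get_main_queue(), ^{
            [self captureStillImage];
        });
    });
}
```

A fixed delay is still a guess about how long the camera needs, though; the KVO-based completion handler in answer 1 removes that guesswork.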

Answer 3 (score: 0)

Personally I tend to use a delay inside a block on the main thread, like this:

double delayInSeconds = 0.5;
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delayInSeconds * NSEC_PER_SEC));
dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
     //Do your thing here
});

Answer 4 (score: 0)

You should know that performSelector:withObject:afterDelay: works on the calling thread; if the calling thread is no longer running, the selector is never invoked.

So I think the reason performSelector:withObject:afterDelay: doesn't work is that the thread that executed your captureMultipleImg method is no longer alive once the delay expires.

The same applies if you call captureMultipleImg with dispatch_async.

Suppose you call the method inside dispatch_async:
- (void)testCode
{
    [self performSelector:@selector(mywork:) withObject:nil afterDelay:0.1] ;
    [self endWork] ;
}

After endWork finishes executing, the calling thread may be torn down, so - (void)mywork:(id)obj is never invoked.
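One way around this (a sketch, not part of the original answer; mywork: and endWork are the hypothetical methods from the snippet above) is to schedule the delayed selector on the main thread, whose run loop keeps running and can therefore fire the timer:

```objectivec
// Hop to the main queue before scheduling the delayed selector: the main
// thread's run loop is always running, so the timer backing
// performSelector:withObject:afterDelay: will actually fire.
- (void)testCode
{
    dispatch_async(dispatch_get_main_queue(), ^{
        [self performSelector:@selector(mywork:) withObject:nil afterDelay:0.1];
    });
    [self endWork];
}
```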

Answer 5 (score: 0)

Have you tried using a timer?

If performSelector:withObject:afterDelay: doesn't work, you can try:

[NSTimer scheduledTimerWithTimeInterval:1.5 target:self selector:@selector(captureStillImage) userInfo:nil repeats:NO];

Answer 6 (score: 0)

Try it with a shorter delay (here, 0.2 seconds):

[self performSelector:@selector(yourMethod:) withObject:yourObject afterDelay:0.2];

Answer 7 (score: 0)

Maybe your code is running while the run loop is in a mode other than the default mode? Try this:

[self performSelector:@selector(mywork:)
           withObject:nil
           afterDelay:delay
              inModes:@[[[NSRunLoop currentRunLoop] currentMode]]];