How do I dynamically change pixel colors in the iPhone camera preview window?

Asked: 2010-11-22 02:31:15

Tags: iphone uiimagepickercontroller

I'm using UIImagePickerController to take photos on the iPhone, and I'd like to adjust the picture live. It seems I can use UIImagePickerController to adjust the shape of the picture dynamically, but I can't find a way to change the colors on the fly. For example, changing all colors to black and white.

Thanks.

3 Answers:

Answer 0 (Score: 4)

The best way to do this is to use an AVCaptureSession object. I'm doing exactly what you're describing in my free app "Live Effects Cam".

There are several code examples online that can also help you implement this. Here is a sample block of code that may help:

- (void) activateCameraFeed
    {
    videoSettings = nil;

#if USE_32BGRA
    pixelFormatCode = [[NSNumber alloc] initWithUnsignedInt:(unsigned int)kCVPixelFormatType_32BGRA];
    pixelFormatKey = [[NSString alloc] initWithString:(NSString *)kCVPixelBufferPixelFormatTypeKey];
    videoSettings = [[NSDictionary alloc] initWithObjectsAndKeys:pixelFormatCode, pixelFormatKey, nil]; 
#endif

    videoDataOutputQueue = dispatch_queue_create("com.jellyfilledstudios.ImageCaptureQueue", NULL);

    captureVideoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureVideoOutput setAlwaysDiscardsLateVideoFrames:YES]; 
    [captureVideoOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    [captureVideoOutput setVideoSettings:videoSettings];
    [captureVideoOutput setMinFrameDuration:kCMTimeZero];

    dispatch_release(videoDataOutputQueue); // AVCaptureVideoDataOutput uses dispatch_retain() & dispatch_release() so we can dispatch_release() our reference now

    if ( useFrontCamera )
        {
        currentCameraDeviceIndex = frontCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationLeftMirrored;
        }
    else
        {
        currentCameraDeviceIndex = backCameraDeviceIndex;
        cameraImageOrientation = UIImageOrientationRight;
        }

    selectedCamera = [[AVCaptureDevice devices] objectAtIndex:(NSUInteger)currentCameraDeviceIndex];

    captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:selectedCamera error:nil];

    captureSession = [[AVCaptureSession alloc] init];

    [captureSession beginConfiguration];

    [self setCaptureConfiguration];

    [captureSession addInput:captureVideoInput];
    [captureSession addOutput:captureVideoOutput];
    [captureSession commitConfiguration];
    [captureSession startRunning];
    }


// AVCaptureVideoDataOutputSampleBufferDelegate
// AVCaptureAudioDataOutputSampleBufferDelegate
//
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
    {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    if ( captureOutput==captureVideoOutput )
        {
        [self performImageCaptureFrom:sampleBuffer];
        }

    [pool drain];
    } 



- (void) performImageCaptureFrom:(CMSampleBufferRef)sampleBuffer
    {
    CVImageBufferRef imageBuffer;

    if ( CMSampleBufferGetNumSamples(sampleBuffer) != 1 )
        return;
    if ( !CMSampleBufferIsValid(sampleBuffer) )
        return;
    if ( !CMSampleBufferDataIsReady(sampleBuffer) )
        return;

    imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 

    if ( CVPixelBufferGetPixelFormatType(imageBuffer) != kCVPixelFormatType_32BGRA )
        return;

    CVPixelBufferLockBaseAddress(imageBuffer,0); 

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    size_t bufferSize = bytesPerRow * height;

    // Work on a copy so the effect code never writes into the live CVPixelBuffer
    uint8_t *tempAddress = malloc( bufferSize );
    memcpy( tempAddress, baseAddress, bufferSize );

    baseAddress = tempAddress;

    //
    // Apply affects to the pixels stored in (uint32_t *)baseAddress
    //
    //
    // example: grayScale( (uint32_t *)baseAddress, width, height );
    // example: sepia( (uint32_t *)baseAddress, width, height );
    //

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef newContext = nil;

    if ( cameraDeviceSetting != CameraDeviceSetting640x480 )        // not an iPhone4 or iTouch 5th gen
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,  kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst);
    else
        newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    CGImageRef newImage = CGBitmapContextCreateImage( newContext );
    CGColorSpaceRelease( colorSpace );
    CGContextRelease( newContext );

    free( tempAddress );

    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    if ( newImage == nil )
        {
        return;
        }

    // To be able to display the CGImageRef newImage in your UI you will need to do it like this
    // because you are running on a different thread here…
    // (newCameraImageNotification: then owns newImage and must call CGImageRelease() on it)
    //
    [self performSelectorOnMainThread:@selector(newCameraImageNotification:) withObject:(id)newImage waitUntilDone:YES];
    }
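The comments above reference `grayScale()` and `sepia()` helpers without showing them. As a rough sketch (the function name, signature, and luma weights are assumptions, not code from the app), a per-pixel grayscale pass over a 32BGRA buffer could look like this in plain C:

```c
#include <stdint.h>
#include <stddef.h>

// Convert a 32BGRA pixel buffer to grayscale in place.
// In memory each pixel is laid out as B, G, R, A.
static void grayScale(uint32_t *pixels, size_t width, size_t height)
    {
    for (size_t i = 0; i < width * height; i++)
        {
        uint8_t *p = (uint8_t *)&pixels[i];
        uint8_t b = p[0], g = p[1], r = p[2];

        // Rec. 601 luma weights, integer approximation
        uint8_t luma = (uint8_t)((r * 299 + g * 587 + b * 114) / 1000);

        p[0] = p[1] = p[2] = luma;   // leave alpha (p[3]) untouched
        }
    }
```

Since the code above copies the frame into `tempAddress` before applying effects, this can safely be called as `grayScale( (uint32_t *)baseAddress, width, height );` at the marked spot.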

Answer 1 (Score: 1)

You can overlay a view on top of the image and change the blend mode to achieve a black/white effect.

Check out Apple's QuartzDemo, particularly the Blending Modes example in that demo.
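As a rough illustration of that approach (this helper is a hypothetical sketch, not code from QuartzDemo): `kCGBlendModeColor` takes hue and saturation from the fill color and luminosity from the backdrop, so filling over an image with gray desaturates it:

```objc
// Sketch: produce a black-and-white copy of an image by drawing it,
// then filling over it with a gray color in kCGBlendModeColor.
- (UIImage *)grayscaleVersionOfImage:(UIImage *)image
    {
    UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGRect rect = (CGRect){ CGPointZero, image.size };

    [image drawInRect:rect];

    // Gray has zero saturation, so the result keeps only luminosity
    CGContextSetBlendMode(ctx, kCGBlendModeColor);
    [[UIColor grayColor] setFill];
    CGContextFillRect(ctx, rect);

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
    }
```

Note that a transparent overlay view only blends within its own drawing context, so for a live camera feed you would still need to composite each frame yourself rather than simply stacking views.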

Answer 2 (Score: 1)

Another approach is to transform each frame using AVFoundation. I don't have experience with this, but the WWDC 2010 video "Session 409 - Using the Camera with AVFoundation" and its sample projects should help you work it out.

Of course, that's only an option if you can use iOS 4 classes.