iOS 9 - problem cropping a CVImageBuffer

Asked: 2015-09-24 17:05:33

Tags: ios image crop ios9 ciimage

I am running into a problem with cropping using the iOS 9 SDK.

I have the following code to resize an image (converting from 4:3 to 16:9 by cropping out the middle). It worked fine up through the iOS 8 SDK. On iOS 9, the bottom area of the output comes out blank.

 - (CMSampleBufferRef)resizeImage:(CMSampleBufferRef)sampleBuffer {
     {
         CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
         CVPixelBufferLockBaseAddress(imageBuffer,0);

         int target_width = CVPixelBufferGetWidth(imageBuffer);
         int target_height = CVPixelBufferGetHeight(imageBuffer);
         int height = CVPixelBufferGetHeight(imageBuffer);
         int width = CVPixelBufferGetWidth(imageBuffer);

         int x=0, y=0;

         // 4:3 landscape -> 16:9: crop the height, centered vertically
         if (((target_width*3)/target_height) == 4)
         {
             target_height = ((target_width*9)/16);
             target_height = ((target_height + 15) / 16) * 16;
             y = (height - target_height)/2;
         }
         else
         if ((target_width == 352) && (target_height == 288))
         {
             target_height = ((target_width*9)/16);
             target_height = ((target_height + 15) / 16) * 16;
             y = (height - target_height)/2;
         }
         else
         if (((target_height*3)/target_width) == 4)
         {
             target_width = ((target_height*9)/16);
             target_width = ((target_width + 15) / 16) * 16;
              x = ((width - target_width)/2);
         }
         else
         if ((target_width == 288) && (target_height == 352))
         {
             target_width = ((target_height*9)/16);
             target_width = ((target_width + 15) / 16) * 16;
              x = ((width - target_width)/2);
         }

         CGRect cropRect;

         NSLog(@"resizeImage x %d, y %d, target_width %d, target_height %d", x, y, target_width, target_height );
         cropRect = CGRectMake(x, y, target_width, target_height);
         CFDictionaryRef empty; // empty value for attr value.
         CFMutableDictionaryRef attrs;
         empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                                    NULL,
                                    NULL,
                                    0,
                                    &kCFTypeDictionaryKeyCallBacks,
                                    &kCFTypeDictionaryValueCallBacks);
         attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                           1,
                                           &kCFTypeDictionaryKeyCallBacks,
                                           &kCFTypeDictionaryValueCallBacks);

         CFDictionarySetValue(attrs,
                              kCVPixelBufferIOSurfacePropertiesKey,
                              empty);

        OSStatus status;
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer]; //options: [NSDictionary dictionaryWithObjectsAndKeys:[NSNull null], kCIImageColorSpace, nil]];
        CVPixelBufferRef pixelBuffer;
        status = CVPixelBufferCreate(kCFAllocatorSystemDefault, target_width, target_height, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, attrs, &pixelBuffer);
        if (status != 0)
        {
            NSLog(@"CVPixelBufferCreate error %d", (int)status);
        }

        [ciContext render:ciImage toCVPixelBuffer:pixelBuffer bounds:cropRect colorSpace:nil];
        CVPixelBufferUnlockBaseAddress( pixelBuffer, 0 );
        CVPixelBufferUnlockBaseAddress( imageBuffer,0);

        CMSampleTimingInfo sampleTime = {
            .duration = CMSampleBufferGetDuration(sampleBuffer),
            .presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
            .decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)
        };

        CMVideoFormatDescriptionRef videoInfo = NULL;
        status = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &videoInfo);
        if (status != 0)
        {
            NSLog(@"CMVideoFormatDescriptionCreateForImageBuffer error %d", (int)status);
        }
        CMSampleBufferRef oBuf;
        status = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL, videoInfo, &sampleTime, &oBuf);
        if (status != 0)
        {
            NSLog(@"CMSampleBufferCreateForImageBuffer error %d", (int)status);
        }
        CFRelease(pixelBuffer);
        ciImage = nil;
        pixelBuffer = nil;
        return oBuf;
    }
}
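For illustration, the crop arithmetic in the first branch can be checked in isolation. The sketch below is mine, not part of the question's code; the helper names are made up, but the expressions mirror the listing above (a 640x480 buffer yields a 640x368 crop at y = 56, since 360 is rounded up to the next multiple of 16):

```c
#include <stddef.h>

/* Round n up to the next multiple of 16, mirroring the question's
 * ((n + 15) / 16) * 16 idiom (video encoders often want 16-aligned
 * dimensions). */
static int round_up_16(int n) {
    return ((n + 15) / 16) * 16;
}

/* Centered 16:9 crop of a 4:3 landscape buffer, as in the first branch:
 * e.g. 640x480 -> crop_height = round_up_16(640 * 9 / 16)
 *               = round_up_16(360) = 368, crop_y = (480 - 368) / 2 = 56. */
static void centered_16_9_crop(int width, int height,
                               int *crop_y, int *crop_height) {
    *crop_height = round_up_16((width * 9) / 16);
    *crop_y = (height - *crop_height) / 2;
}
```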

Any ideas or suggestions about this? I tried changing the crop rect, but it had no effect.

Thanks

1 Answer:

Answer 0 (score: 1)

Did you know that the documentation comment for -[CIContext render:toCVPixelBuffer:bounds:colorSpace:] distinguishes between iOS 8 and iOS 9+? (I couldn't find an online version of it to link to.)

/* Render 'image' to the given CVPixelBufferRef.
 * The 'bounds' parameter has the following behavior:
 *    In OS X and iOS 9 and later:  The 'image' is rendered into 'buffer' so that
 *      point (0,0) of 'image' aligns to the lower left corner of 'buffer'.
 *      The 'bounds' acts like a clip rect to limit what region of 'buffer' is modified.
 *    In iOS 8 and earlier: The 'bounds' parameter acts to specify the region of 'image' to render.
 *      This region (regardless of its origin) is rendered at upper-left corner of 'buffer'.
 */

With that in mind, I fixed my problem, which looked the same as yours.
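In concrete terms (a minimal sketch, not code from the original answer): under the iOS 9+ semantics quoted above, one way to recover the old behavior is to translate the source image so the desired crop region starts at the origin, then render with a bounds rect anchored at (0,0). The names ciContext, ciImage, cropRect, and pixelBuffer refer to the question's code; whether an extra vertical flip is needed depends on how the crop's y origin was computed, since Core Image uses a lower-left origin.

```objc
// iOS 9+: 'bounds' clips the destination, so shift the crop region of the
// source image to the origin first, then render the full target rect.
CIImage *shifted =
    [ciImage imageByApplyingTransform:
        CGAffineTransformMakeTranslation(-cropRect.origin.x,
                                         -cropRect.origin.y)];
[ciContext render:shifted
  toCVPixelBuffer:pixelBuffer
           bounds:CGRectMake(0, 0, cropRect.size.width, cropRect.size.height)
       colorSpace:nil];
```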