Exporting UIImages as a movie fails

Date: 2014-05-23 22:24:01

Tags: ios objective-c uiimage avfoundation avassetwriter

Question

After I append around 5 images with AVAssetWriter, my AVAssetWriterInputPixelBufferAdaptor fails, and I can't figure out why.

Details

This popular question helped, but it doesn't quite meet my needs:

How do I export UIImage array as a movie?

Everything runs as planned; I even stall the assetWriterInput until it can handle more media. But for some reason it always fails after about 5 images. The images I'm using are frames extracted from a GIF.

Code

Here is my iteration code:

-(void)writeImageData
{
    __block int i = 0;
    videoQueue = dispatch_queue_create("com.videoQueue", DISPATCH_QUEUE_SERIAL);
    [self.writerInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0) usingBlock:^{

        while (self.writerInput.readyForMoreMediaData) {
            if (i >= self.imageRefs.count){
                [self endSession];
                videoQueue = nil;
                [self saveToLibraryWithCompletion:^{
                    NSLog(@"Saved");
                }];
                break;
            }

            if (self.writerInput.readyForMoreMediaData){
                CGImageRef imageRef = (__bridge CGImageRef)self.imageRefs[i];
                CVPixelBufferRef buffer = [self pixelBufferFromCGImageRef:imageRef];

                CGFloat timeScale = (CGFloat)self.imageRefs.count / self.originalDuration;
                BOOL accepted = [self.adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(i, timeScale)];
                CVBufferRelease(buffer);
                if (!accepted){
                    NSLog(@"Buffer did not add %@, index %d, timescale %f", self.writer.error, i, timeScale);
                }else{
                    NSLog(@"Buffer did nothing wrong");
                }
                i++;
            }
        }
    }];
}

The rest of my code matches the code in the link above. Only this part is slightly different:

-(CVPixelBufferRef)pixelBufferFromCGImageRef:(CGImageRef)image
{
   NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                         nil];
   CVPixelBufferRef pxbuffer = NULL;
   CGFloat width = 640;
   CGFloat height = 640;
   CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width,
                                      height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                      &pxbuffer);

   NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

   CVPixelBufferLockBaseAddress(pxbuffer, 0);
   void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
   NSParameterAssert(pxdata != NULL);

   CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
   CGContextRef context = CGBitmapContextCreate(pxdata, width,
                                             height, 8, 4*width, rgbColorSpace,
                                             kCGImageAlphaNoneSkipFirst);
   NSParameterAssert(context);

   CGContextDrawImage(context, CGRectMake(0, 0, width,
                                       height), image);
   CGColorSpaceRelease(rgbColorSpace);
   CGContextRelease(context);

   CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
   return pxbuffer;

}

1 Answer:

Answer 0 (score: 1)

One thing that jumps out at me is your use of CMTimeMake(adjustedTime, 1).

You need to calculate the time of each frame properly. Note that CMTimeMake takes two integers, and passing floating-point values truncates them.
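
For example, here is a minimal sketch of one way to compute a per-frame presentation time, assuming (as in the question's code) that self.originalDuration is the GIF's total duration in seconds and the frames are spaced evenly:

    // Sketch only: use an integer timescale instead of passing a fractional
    // CGFloat to CMTimeMake, which silently truncates it to an int32_t.
    int32_t timescale = 600; // 600 divides common frame rates evenly
    CMTime frameDuration = CMTimeMakeWithSeconds(self.originalDuration / self.imageRefs.count, timescale);
    CMTime presentationTime = CMTimeMultiply(frameDuration, i);
    BOOL accepted = [self.adaptor appendPixelBuffer:buffer withPresentationTime:presentationTime];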

The second problem is that you aren't using your serial dispatch queue :)
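
The question's writeImageData creates a serial videoQueue but then passes a concurrent global queue to requestMediaDataWhenReadyOnQueue:, so the block can be invoked concurrently with itself. A minimal sketch of the intended call, reusing the queue the question already creates:

    // Sketch only: hand AVFoundation the serial queue so appends happen in order.
    videoQueue = dispatch_queue_create("com.videoQueue", DISPATCH_QUEUE_SERIAL);
    [self.writerInput requestMediaDataWhenReadyOnQueue:videoQueue usingBlock:^{
        // ... append pixel buffers exactly as in the question ...
    }];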