Creating a video from an array of images in iOS

Posted: 2013-06-13 06:35:22

Tags: ios avassetwriter cmtime

I am trying to create a video from a sequence of images.

I have managed to create a video, but I have a problem with the presentation times, i.e. CMTime.

I am using the following code to build the video:

    int frameCount = 1;

    // Adding images here to buffer
    for( int i = 0; i<[ fileArray count]; i++ )
    {
        // Create Pool
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        NSString *imageTag = [fileArray objectAtIndex:i];
        // Create file path
        NSString *imgPath = [folderPath stringByAppendingPathComponent:imageTag];
        // Get image
        UIImage *img = [self getImageFromPath:imgPath];
        buffer = [self pixelBufferFromCGImage:[img CGImage] size:size];

        BOOL append_ok = NO;
        int j = 0;

        // Retry up to 30 times if the append fails
        while ( !append_ok && j < 30 )
        {
            if (adaptor.assetWriterInput.readyForMoreMediaData)
            {
                printf("appending %d attemp %d\n", frameCount, j);

                NSTimeInterval duration = 7.0;
                if ( [mArrAudioFileNames objectAtIndex:i] != [NSNull null] )
                {
                    // Get Audio file
                    NSString *docsDir = [[self dataFolderPathForAudio]
                                         stringByAppendingPathComponent:
                                         [mArrAudioFileNames objectAtIndex:i]];

                    NSURL *soundFileURL = [NSURL fileURLWithPath:docsDir];

                    // Create AudioPlayer
                    NSError *error;
                    AVAudioPlayer  *audioPlayer = [[AVAudioPlayer alloc]
                                                   initWithContentsOfURL:soundFileURL
                                                   error:&error];

                    // Get Audio duration
                    duration = [audioPlayer duration];
                    [audioPlayer release];
                }

                CMTime frameTime = CMTimeMake(frameCount,(int32_t)1);
                append_ok = [adaptor appendPixelBuffer:buffer
                                  withPresentationTime:frameTime];

                [NSThread sleepForTimeInterval:0.05];
            }
            else
            {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d times %d\n", frameCount, j);
            isError = YES;
        }
        frameCount++;
        CVBufferRelease(buffer);

        // drain the pool
        [pool drain];
    }

    [videoWriterInput markAsFinished];
    [videoWriter finishWriting];

    [videoWriterInput release];
    [videoWriter release];

If I have 7 images in the array, this creates a 7-second video, which means each image is shown for 1 second.
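
That one-second spacing follows directly from the timestamps being passed in: with a timescale of 1, `CMTimeMake(frameCount, 1)` means frameCount / 1 seconds, so consecutive frames are stamped exactly one second apart. For illustration:

    // With a timescale of 1 the value is read as whole seconds, so frames
    // appended with frameCount = 1, 2, 3, ... land at 1 s, 2 s, 3 s, ...
    CMTime first  = CMTimeMake(1, 1);               // frame 1 -> 1 second
    CMTime second = CMTimeMake(2, 1);               // frame 2 -> 2 seconds
    CMTime gap    = CMTimeSubtract(second, first);  // exactly 1 second apart
    CMTimeShow(gap);                                // log the gap for inspection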

My question is: how can I make a video from the same set of images (say 7) where each image is shown for a different amount of time?

For example, out of the total video duration the first image might be shown for 9 seconds, the second for 5 seconds, the third for 20 seconds, and so on.
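
One reason the code above always produces one second per image is that the audio duration it computes is never folded into `frameTime`; every frame is appended at `CMTimeMake(frameCount, 1)`, i.e. at whole-second timestamps. Below is a minimal sketch (not the original code) of one way to give each image its own duration by accumulating the presentation time, assuming a hypothetical `durations` array of NSNumber seconds parallel to `fileArray`; the retry loop and autorelease pool from the original are omitted for brevity.

    // Sketch: accumulate the presentation time from per-image durations.
    // `durations` is a hypothetical NSArray of NSNumber seconds, e.g. @[@9, @5, @20].
    int32_t timescale = 600;                        // divides evenly by 24/25/30 fps
    CMTime presentTime = CMTimeMake(0, timescale);  // first image starts at 0 s

    for (NSUInteger i = 0; i < [fileArray count]; i++)
    {
        NSString *imgPath = [folderPath stringByAppendingPathComponent:
                             [fileArray objectAtIndex:i]];
        UIImage *img = [self getImageFromPath:imgPath];
        CVPixelBufferRef buf = [self pixelBufferFromCGImage:[img CGImage] size:size];

        // Each image is appended at the moment the previous one should disappear.
        if (adaptor.assetWriterInput.readyForMoreMediaData)
        {
            [adaptor appendPixelBuffer:buf withPresentationTime:presentTime];
        }
        CVBufferRelease(buf);

        // Advance the clock by this image's own duration (9 s, 5 s, 20 s, ...).
        Float64 seconds = [[durations objectAtIndex:i] doubleValue];
        presentTime = CMTimeAdd(presentTime,
                                CMTimeMakeWithSeconds(seconds, timescale));
    }

    // presentTime now holds the total length; it can be passed to
    // [videoWriter endSessionAtSourceTime:presentTime] before finishing.

A timescale of 600 is a common choice because it is evenly divisible by typical frame rates, but any timescale that represents the durations exactly would work. In the setup above, `seconds` could also come from the `AVAudioPlayer` duration that the original code already reads.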

Thanks in advance.

0 Answers:

No answers yet.