How do I create a fake AVPlayerItem with a real duration?

Asked: 2014-01-20 22:09:09

Tags: ios iphone avfoundation

I'm using an AVPlayer to play CAKeyframeAnimations on an AVSynchronizedLayer. To keep the player playing, since I don't play any AVAsset during the animation, I set the forwardPlaybackEndTime of the AVPlayerItem to the duration of the desired animation. Unfortunately, seekToTime: seems impossible during this forwardPlaybackEndTime: the AVPlayer always snaps back to the beginning, probably because it will only seek within the AVPlayerItem's actual duration.

How can I create a fake AVPlayerItem with a real duration, to trick the AVPlayer into playing an empty AVPlayerItem and let me seekToTime:?

2 Answers:

Answer 0 (score: 0):

Unfortunately, seekToTime: will only seek within the AVPlayerItem's actual duration, so a fake player item has to be created to provide a seekable duration. The quickest way to do this is to write a tiny dummy video file and build the asset from it. Here is an example implementation that generates such an item. It's long, but it's necessary. Good luck!

#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

@interface FakeAsset : NSObject

+ (void)assetWithDuration:(CMTime)duration
          completionBlock:(void (^)(AVAsset *))callBack;

@end

@interface FakeAsset ()

+ (CVPixelBufferRef)blackImagePixelBuffer;

@end

@implementation FakeAsset

+ (void)assetWithDuration:(CMTime)duration
          completionBlock:(void (^)(AVAsset *))callBack
{
    NSError * error      = nil;
    NSString * assetPath = nil;
    NSUInteger i         = 0;
    // find a temporary file name that is not already taken
    do
    {
        assetPath =
        [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"dummyAsset%lu.m4v", (unsigned long)i]];
        i++;
    }
    while ([[NSFileManager defaultManager] fileExistsAtPath:assetPath]);

    NSURL * fileURL = [NSURL fileURLWithPath:assetPath];

    NSParameterAssert(fileURL);

    AVAssetWriter * videoWriter =
    [[AVAssetWriter alloc] initWithURL:fileURL
                              fileType:AVFileTypeAppleM4V
                                 error:&error];
    NSParameterAssert(videoWriter);

    // a deliberately tiny bitrate keeps the dummy file small; it only needs a duration
    NSDictionary * compression  =
  @{
    AVVideoAverageBitRateKey      : @10,
    AVVideoProfileLevelKey        : AVVideoProfileLevelH264Main31,
    AVVideoMaxKeyFrameIntervalKey : @300
    };

    NSDictionary * outputSettings =
  @{
    AVVideoCodecKey                 : AVVideoCodecH264,
    AVVideoCompressionPropertiesKey : compression,
    AVVideoWidthKey                 : @120,
    AVVideoHeightKey                : @80
    };

    AVAssetWriterInput * videoWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:outputSettings];
    NSParameterAssert(videoWriterInput);

    NSDictionary * parameters =
    @{(NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32ARGB),
      (NSString *)kCVPixelBufferWidthKey           : @120,
      (NSString *)kCVPixelBufferHeightKey          : @80
      };

    AVAssetWriterInputPixelBufferAdaptor * adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                                     sourcePixelBufferAttributes:parameters];
    NSParameterAssert(adaptor);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);

    videoWriterInput.expectsMediaDataInRealTime = NO;

    [videoWriter addInput:videoWriterInput];

    BOOL didStartWriting = [videoWriter startWriting];
    NSParameterAssert(didStartWriting); // keep the call outside the assert: NSParameterAssert is compiled out in release builds

    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    dispatch_queue_t dispatchQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    [videoWriterInput requestMediaDataWhenReadyOnQueue:dispatchQueue
                                            usingBlock:^
    {
        int frame = 0;
        while (videoWriterInput.isReadyForMoreMediaData)
        {
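            // append exactly two black frames: one at time zero and one at the
            // requested duration, so the finished asset reports that duration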
            if (frame < 2)
            {
                CMTime frameTime = frame ? duration : kCMTimeZero;
                CVPixelBufferRef buffer = [self blackImagePixelBuffer];

                [adaptor appendPixelBuffer:buffer
                      withPresentationTime:frameTime];

                CVBufferRelease(buffer);

                ++frame;
            }
            else
            {
                [videoWriterInput markAsFinished];
                [videoWriter endSessionAtSourceTime:duration];

                dispatch_async(dispatch_get_main_queue(), ^
                {
                    [videoWriter finishWritingWithCompletionHandler:^()
                     {
                         NSLog(@"did finish writing the video!");
                         AVURLAsset * asset =
                         [AVURLAsset assetWithURL:videoWriter.outputURL];
                         callBack(asset);
                     }];
                });
                break;
            }
        }
    }];
}

// Returns a 120x80 all-black pixel buffer. The caller is responsible for
// releasing it (the writing block above calls CVBufferRelease).
+ (CVPixelBufferRef)blackImagePixelBuffer
{
    NSDictionary * options =
    @{
      (id)kCVPixelBufferCGImageCompatibilityKey         : @YES,
      (id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES
      };

    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status =
    CVPixelBufferCreate(kCFAllocatorDefault, 120, 80, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options, &pxbuffer);

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);

    void * pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, 120, 80, 8, 4 * 120, rgbColorSpace, (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);

    NSParameterAssert(context);
    CGContextSetFillColorWithColor(context, [UIColor blackColor].CGColor);
    CGContextFillRect(context,CGRectMake(0.f, 0.f, 120.f, 80.f));
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

@end
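
A minimal usage sketch (the 30-second duration and the player setup are illustrative assumptions, not part of the answer above):

[FakeAsset assetWithDuration:CMTimeMake(30, 1)
             completionBlock:^(AVAsset * asset)
{
    dispatch_async(dispatch_get_main_queue(), ^
    {
        AVPlayerItem * item   = [AVPlayerItem playerItemWithAsset:asset];
        AVPlayer     * player = [AVPlayer playerWithPlayerItem:item];
        // attach the AVSynchronizedLayer to item here; seekToTime: now
        // works anywhere within the 30-second duration
        [player play];
    });
}];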

Answer 1 (score: 0):

I think it would be much easier to just create an AVMutableComposition with an AVMutableCompositionTrack and use insertEmptyTimeRange: to insert an empty range of the desired duration into it, as sketched below.

Then use this composition to create the AVPlayerItem via playerItemWithAsset:, since AVMutableComposition is a subclass of AVAsset.

This requires no generating, writing, and re-reading of a file, and far less code.
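
A minimal sketch of this approach, assuming a 30-second duration (the duration value and the video track type are illustrative):

CMTime duration = CMTimeMake(30, 1); // assumed animation duration

AVMutableComposition * composition = [AVMutableComposition composition];
AVMutableCompositionTrack * track  =
[composition addMutableTrackWithMediaType:AVMediaTypeVideo
                         preferredTrackID:kCMPersistentTrackID_Invalid];

// insert an empty range covering the desired duration; the composition
// then reports that duration without containing any actual media
[track insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, duration)];

// AVMutableComposition is an AVAsset subclass, so it can back a player item
AVPlayerItem * item   = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer     * player = [AVPlayer playerWithPlayerItem:item];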