- copyPixelBufferForItemTime:itemTimeForDisplay: returns a null value

Date: 2015-10-09 11:02:14

Tags: ios9 avasset avplayeritem cvpixelbuffer

The problem I have is that when my app tries to get a CVPixelBufferRef from AVPlayerItemVideoOutput using -copyPixelBufferForItemTime:itemTimeForDisplay:, I get a null value from the moment the video is loaded and all the instances are created, whenever the app is compiled with the iOS 9 SDK.

With iOS 8 my app works fine, but iOS 9 gives me this problem. Even the version of my app currently available for download in the App Store, which was compiled with the iOS 8 SDK, shows the same problem when installed on iOS 9.

When the problem occurs and I get a null CVPixelBufferRef, if I press the home button so the app resigns active, and then open the app again so it becomes active, the AVPlayerItemVideoOutput instance that had been giving me a null CVPixelBufferRef starts working normally and the problem is gone.

Here is a YouTube video in which I reproduce the problem:

https://www.youtube.com/watch?v=997zG08_DMM&feature=youtu.be

Here is the sample code that creates all the item instances:

NSURL *url ;
url = [[NSURL alloc] initFileURLWithPath:[_mainVideo objectForKey:@"file"]];

NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
_videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
_myVideoOutputQueue = dispatch_queue_create("myVideoOutputQueue", DISPATCH_QUEUE_SERIAL);
[_videoOutput setDelegate:self queue:_myVideoOutputQueue];

_player = [[AVPlayer alloc] init];


// Do not take mute button into account
NSError *error = nil;
BOOL success = [[AVAudioSession sharedInstance]
                setCategory:AVAudioSessionCategoryPlayback
                error:&error];
if (!success) {
   // NSLog(@"Could not use AVAudioSessionCategoryPlayback", nil);
}

asset = [AVURLAsset URLAssetWithURL:url options:nil];


if(![[NSFileManager defaultManager] fileExistsAtPath:[[asset URL] path]]) {
   // NSLog(@"file does not exist");
}

NSArray *requestedKeys = [NSArray arrayWithObjects:kTracksKey, kPlayableKey, nil];

[asset loadValuesAsynchronouslyForKeys:requestedKeys completionHandler:^{

    dispatch_async( dispatch_get_main_queue(),
                   ^{
                       /* Make sure that the value of each key has loaded successfully. */
                       for (NSString *thisKey in requestedKeys)
                       {
                           NSError *error = nil;
                           AVKeyValueStatus keyStatus = [asset statusOfValueForKey:thisKey error:&error];
                           if (keyStatus == AVKeyValueStatusFailed)
                           {
                               [self assetFailedToPrepareForPlayback:error];
                               return;
                           }
                       }

                       NSError* error = nil;
                       AVKeyValueStatus status = [asset statusOfValueForKey:kTracksKey error:&error];
                       if (status == AVKeyValueStatusLoaded)
                       {
                           //_playerItem = [AVPlayerItem playerItemWithAsset:asset];


                           [_playerItem addOutput:_videoOutput];
                           [_player replaceCurrentItemWithPlayerItem:_playerItem];
                           [_videoOutput requestNotificationOfMediaDataChangeWithAdvanceInterval:ONE_FRAME_DURATION];

                           /* When the player item has played to its end time we'll toggle
                            the movie controller Pause button to be the Play button */
                           [[NSNotificationCenter defaultCenter] addObserver:self
                                                                    selector:@selector(playerItemDidReachEnd:)
                                                                        name:AVPlayerItemDidPlayToEndTimeNotification
                                                                      object:_playerItem];

                           seekToZeroBeforePlay = NO;

                           [_playerItem addObserver:self
                                         forKeyPath:kStatusKey
                                            options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                                            context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];

                           [_player addObserver:self
                                     forKeyPath:kCurrentItemKey
                                        options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                                        context:AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext];

                           [_player addObserver:self
                                     forKeyPath:kRateKey
                                        options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                                        context:AVPlayerDemoPlaybackViewControllerRateObservationContext];


                           [self initScrubberTimer];

                           [self syncScrubber];


                       }
                       else
                       {
                         //  NSLog(@"%@ Failed to load the tracks.", self);
                       }
                   });
}];

Below is the sample code that returns the null pixel buffer:

CVPixelBufferRef pixelBuffer =
    [_videoOutput copyPixelBufferForItemTime:[_playerItem currentTime]
                          itemTimeForDisplay:nil];

NSLog(@"the pixel buffer is %@", pixelBuffer);
NSLog (@"the _videoOutput is %@", _videoOutput.description);
CMTime dataTime = [_playerItem currentTime];
//NSLog(@"the current time is %f", dataTime);
return pixelBuffer;

4 Answers:

Answer 0 (score: 1):

I ran into the same problem and found the answer in this thread: https://forums.developer.apple.com/thread/27589#128476

You have to wait until the video is ready to play before adding the output; otherwise it will fail and return nil. My Swift code is below:

func retrievePixelBufferToDraw() -> CVPixelBuffer? {
  guard let videoItem = player.currentItem else { return nil }
  if videoOutput == nil || self.videoItem !== videoItem {
    videoItem.outputs.flatMap({ return $0 as? AVPlayerItemVideoOutput }).forEach {
      videoItem.remove($0)
    }
    if videoItem.status != AVPlayerItemStatus.readyToPlay {
      // see https://forums.developer.apple.com/thread/27589#128476
      return nil
    }

    let pixelBuffAttributes = [
      kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
      ] as [String: Any]

    let videoOutput = AVPlayerItemVideoOutput.init(pixelBufferAttributes: pixelBuffAttributes)
    videoItem.add(videoOutput)
    self.videoOutput = videoOutput
    self.videoItem = videoItem
  }
  guard let videoOutput = videoOutput else { return nil }

  let time = videoItem.currentTime()
  if !videoOutput.hasNewPixelBuffer(forItemTime: time) { return nil }
  return videoOutput.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil)
}

Answer 1 (score: 0):

I ran into a similar problem today and found that it only happens on 64-bit devices running iOS 9.0 or later when the project is not built for the arm64 architecture.

Changing the build settings to build for the arm64 architecture solved the problem for me.
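As a sketch of what that change looks like (setting names as they appear in Xcode's build settings; the exact values depend on the project):

```
// Build Settings (xcconfig style) – make sure arm64 is included
ARCHS = $(ARCHS_STANDARD)    // "Standard architectures" includes arm64
VALID_ARCHS = armv7 arm64    // arm64 must not be excluded here
```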

Answer 2 (score: 0):

*Since you said it worked for you as well, I decided to post this as an answer rather than a comment, to make it as visible as possible.

Answer: I am still looking for a more elegant way to do this. I found that the alloc of the AVPlayerItemVideoOutput is relative to the format settings passed into it, but the time it takes is not absolute. Forcing a one-second wait between the alloc and loading/playing fixed it for me. Also, I only create one AVPlayerItemVideoOutput and reuse it, so I only need the one delay.
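That one-second wait could be sketched like this (a hedged sketch, not the answerer's exact code; `pixBuffAttributes` and `playerItem` are assumed to exist as in the question's setup code):

```objc
// Allocate the output once, up front, and reuse it for every item.
_videoOutput = [[AVPlayerItemVideoOutput alloc]
                initWithPixelBufferAttributes:pixBuffAttributes];

// Force a one-second wait between the alloc and loading/playing.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    [playerItem addOutput:_videoOutput];
    [_player replaceCurrentItemWithPlayerItem:playerItem];
    [_player play];
});
```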

Meanwhile, using hasNewPixelBufferForItemTime: below is a small sample from a Unity plugin I made, which simply uploads the contents of the pixel buffer to a texture.

//////////////////////
if (g_TexturePointer)
{
    if ([plug.playerOutput hasNewPixelBufferForItemTime:[plug.player currentTime]])
    {
        pbuffer = [plug.playerOutput copyPixelBufferForItemTime:plug.player.currentItem.currentTime
                                             itemTimeForDisplay:nil];
    } ... .. . (No need to show the rest.)

Happy coding!

Answer 3 (score: 0):

While this is not a real solution to the problem (judging by your comments, it seems to be a bug in AVFoundation), I found a better workaround than waiting one second as Brian Hodge suggested: recreating the AVPlayer whenever it fails to deliver pixel buffers. Depending on how long your actual restart routine takes, this can be noticeably faster and less irritating to the user. Furthermore, it only introduces a (slight) delay when the AVPlayer actually exhibits the problem, rather than on every launch.

However, the AVPlayerItemVideoOutput will stop delivering pixel buffers once the player has finished playing. So you should probably guard against that case by remembering whether you have already received any pixel buffers; otherwise, your player would perform an unintentional restart loop.

In the class interface:

@property (nonatomic) BOOL videoOutputHadPixelBuffer;

Then, before attempting to copy a pixel buffer:

if ([self.videoOutput hasNewPixelBufferForItemTime:self.player.currentTime])
{
    self.videoOutputHadPixelBuffer = YES; // remember that we received pixel buffers, so we don't restart after playback finished
}
else if (!self.videoOutputHadPixelBuffer)
{
    [self restartPlayer]; // call your custom restart routine where you create a new AVPlayer object
}