AVAsset for an HTTP Live Stream

Asked: 2014-03-06 06:51:20

Tags: opencv

I am implementing an HTTP Live Streaming player on OS X using AVPlayer. I can seek correctly and get the duration and so on. Now I want to take screenshots and process the frames with OpenCV. I went to use AVAssetImageGenerator, but the AVAsset associated with player.currentItem has no audio or video tracks.

The tracks appear in player.currentItem.tracks instead, so I am unable to use AVAssetImageGenerator. Can anyone help me find a solution for extracting screenshots and individual frames in this situation?

Below is the code showing how I initiate the HTTP live stream.

Thanks in advance.

NSURL* url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
playeritem = [AVPlayerItem playerItemWithURL:url];

[playeritem addObserver:self forKeyPath:@"status" options:0 context:AVSPPlayerStatusContext];
[self setPlayer:[AVPlayer playerWithPlayerItem:playeritem]];
[self addObserver:self forKeyPath:@"player.rate" options:NSKeyValueObservingOptionNew context:AVSPPlayerRateContext];
[self addObserver:self forKeyPath:@"player.currentItem.status" options:NSKeyValueObservingOptionNew context:AVSPPlayerItemStatusContext];
AVPlayerLayer *newPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:[self player]];
[newPlayerLayer setFrame:[[[self playerView] layer] bounds]];
[newPlayerLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable];
[newPlayerLayer setHidden:YES];
[[[self playerView] layer] addSublayer:newPlayerLayer];
[self setPlayerLayer:newPlayerLayer];
[self addObserver:self forKeyPath:@"playerLayer.readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:AVSPPlayerLayerReadyForDisplay];
[self.player play];    

Here is how I check whether the asset has video tracks:

case AVPlayerItemStatusReadyToPlay:

                [self setTimeObserverToken:[[self player] addPeriodicTimeObserverForInterval:CMTimeMake(1, 10) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
                    [[self timeSlider] setDoubleValue:CMTimeGetSeconds(time)];
                    NSLog(@"%f,%f,%f",[self currentTime],[self duration],[[self player] rate]);
                    AVPlayerItem *item = playeritem;
                    if(item.status == AVPlayerItemStatusReadyToPlay)
                    {
                    AVAsset *asset = (AVAsset *)item.asset;
                    long audiotracks = [[asset tracks] count];
                    long videotracks = [[asset availableMediaCharacteristicsWithMediaSelectionOptions]count];

                    NSLog(@"Track info Audio = %ld,Video=%ld",audiotracks,videotracks);
                    }
                }]];



                AVPlayerItem *item = self.player.currentItem;
                if(item.status != AVPlayerItemStatusReadyToPlay)
                    return;
                AVURLAsset *asset = (AVURLAsset *)item.asset;
                long audiotracks = [[asset tracksWithMediaType:AVMediaTypeAudio]count];
                long videotracks = [[asset tracksWithMediaType:AVMediaTypeVideo]count];

                NSLog(@"Track info Audio = %ld,Video=%ld",audiotracks,videotracks);

1 Answer:

Answer 0 (score: 1)

This is an older question, but in case anyone needs help, here is the answer.

AVURLAsset *asset = /* Your Asset here! */;
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.requestedTimeToleranceAfter =  kCMTimeZero;
generator.requestedTimeToleranceBefore =  kCMTimeZero;
for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) *  /* Put the FPS of the source video here */ ; i++){
    @autoreleasepool {
        CMTime time = CMTimeMake(i, /* Put the FPS of the source video here */);

        NSError *err;
        CMTime actualTime;
        CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];

        // Do what you want with the image, for example wrap it in a UIImage
        // (on OS X, as in the question, use NSImage instead)
        UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];

        CGImageRelease(image);
    }
}
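
Since the question asks about processing frames with OpenCV, one possible way to get at the raw pixel data of each extracted `CGImageRef` is to draw it into a `CGBitmapContext`; the resulting buffer can then be wrapped in a `cv::Mat` or similar. This is a sketch, not part of the original answer, and the 8-bit RGBA format is an assumption; adjust it to your pipeline:

```objc
// Sketch: copy a CGImageRef into an RGBA byte buffer (e.g. for OpenCV).
// Assumes 8 bits per channel, 4 channels.
size_t width  = CGImageGetWidth(image);
size_t height = CGImageGetHeight(image);
size_t bytesPerRow = width * 4;
uint8_t *pixels = calloc(height * bytesPerRow, 1);

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8, bytesPerRow,
                                         colorSpace,
                                         kCGImageAlphaPremultipliedLast);
CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), image);

// `pixels` now holds width * height RGBA values, which could be wrapped
// without copying, e.g.: cv::Mat mat(height, width, CV_8UC4, pixels);

CGContextRelease(ctx);
CGColorSpaceRelease(colorSpace);
free(pixels);
```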

You can easily get the FPS of a video with the following code:

float fps = 0.00;
if (asset) {
    // Guard against an empty track array before indexing into it
    NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if (videoTracks.count > 0) {
        fps = [videoTracks[0] nominalFrameRate];
    }
}

Hope this helps anyone asking how to get all the frames of a video, or just some specific frames (at a given CMTime, for example). Keep in mind that saving all the frames into an array can have a heavy impact on memory!
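
As a side note for the live-stream case in the question, where the item's AVAsset exposes no tracks and AVAssetImageGenerator therefore cannot be used: attaching an AVPlayerItemVideoOutput to the player item is a common alternative, since it vends pixel buffers directly from the playing stream. A rough sketch, assuming the `playeritem` variable from the question; the pixel-format choice and the idea of polling from a display-link or timer callback are assumptions, not code from either post:

```objc
// Sketch: grab frames from a live HLS item via AVPlayerItemVideoOutput.
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                             @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *output =
    [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
[playeritem addOutput:output];

// Poll periodically, e.g. from a CVDisplayLink or timer callback:
CMTime now = [output itemTimeForHostTime:CACurrentMediaTime()];
if ([output hasNewPixelBufferForItemTime:now]) {
    CVPixelBufferRef buffer =
        [output copyPixelBufferForItemTime:now itemTimeForDisplay:NULL];
    // Lock the buffer and hand its base address to OpenCV, then release it:
    CVPixelBufferRelease(buffer);
}
```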