Decoding audio samples from an HLS stream on iOS?

Time: 2014-11-24 02:04:09

Tags: ios m3u8 hls

I am trying to decode audio samples from a remote HLS (m3u8) stream on an iOS device so I can further process the data, e.g. record it to a file. As a reference stream I use http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8

By combining an AVURLAsset with an AVPlayer I can show the video as a preview on a CALayer. I can also get the raw video data (CVPixelBuffer) via an AVPlayerItemVideoOutput. The audio is audible through the iOS device's speaker as well.

This is the code I currently use for the AVURLAsset and AVPlayer:

NSURL* url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
NSString *tracksKey = @"tracks";

[asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler: ^{

    dispatch_async(dispatch_get_main_queue(), ^{

        NSError* error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];

        if (status == AVKeyValueStatusLoaded) {
            NSDictionary *settings = @
            {
                (id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA),
                @"IOSurfaceOpenGLESTextureCompatibility": @YES,
                @"IOSurfaceOpenGLESFBOCompatibility": @YES,
            };
            AVPlayerItemVideoOutput* output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
            AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:asset];
            [playerItem addOutput:output];
            AVPlayer* player = [AVPlayer playerWithPlayerItem:playerItem];
            [player setVolume: 0.0];    // no preview audio
            self.playerItem = playerItem;
            self.player = player;
            self.playerItemVideoOutput = output;

            AVPlayerLayer* playerLayer = [AVPlayerLayer playerLayerWithPlayer: player];
            [self.preview.layer addSublayer: playerLayer];
            [playerLayer setFrame: self.preview.bounds];
            [playerLayer setVideoGravity: AVLayerVideoGravityResizeAspectFill];

            [self setPlayerLayer: playerLayer];

            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerItemNewAccessLogEntry:) name:AVPlayerItemNewAccessLogEntryNotification object:self.playerItem];

            [_player addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:&PlayerStatusContext];
            [_playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:&PlayerItemStatusContext];
            [_playerItem addObserver:self forKeyPath:@"tracks" options:0 context:nil];
        }
    });
}];

-(void) observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (self.player.status == AVPlayerStatusReadyToPlay && context == &PlayerStatusContext) {
        [self.player play];
    }
}
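Since the code above registers KVO observers and a notification observer, they need matching removal before the player and item are released. A minimal sketch (assuming only the observers registered above, with the same contexts):

```objc
- (void)dealloc
{
    // Balance the addObserver calls made after the asset's tracks loaded.
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [_player removeObserver:self forKeyPath:@"status" context:&PlayerStatusContext];
    [_playerItem removeObserver:self forKeyPath:@"status" context:&PlayerItemStatusContext];
    [_playerItem removeObserver:self forKeyPath:@"tracks"];
}
```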

To get the raw video data of the HLS stream I use:

CVPixelBufferRef buffer = [self.playerItemVideoOutput copyPixelBufferForItemTime:self.playerItem.currentTime itemTimeForDisplay:nil];

if (!buffer) {
    return;
}
CMSampleBufferRef newSampleBuffer = NULL;
CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
timingInfo.duration = CMTimeMake(33, 1000);
int64_t ts = timestamp * 1000.0;    // 'timestamp' is a running playback time in seconds, maintained elsewhere
timingInfo.decodeTimeStamp = CMTimeMake(ts, 1000);
timingInfo.presentationTimeStamp = timingInfo.decodeTimeStamp;

CMVideoFormatDescriptionRef videoInfo = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, buffer, &videoInfo);
CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                   buffer,
                                   true,
                                   NULL,
                                   NULL,
                                   videoInfo,
                                   &timingInfo,
                                   &newSampleBuffer);

// do something here with the sample buffer...

CFRelease(videoInfo);    // the format description must be released as well
CFRelease(buffer);
CFRelease(newSampleBuffer);
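The pull above only makes sense when a new frame is actually ready, so a CADisplayLink is a convenient clock for polling the output. A minimal sketch (the `displayLink` property and method names are my own; `playerItemVideoOutput` is the output set up above):

```objc
// Sketch: drive the pixel-buffer pull from a CADisplayLink.
- (void)startVideoPolling
{
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(displayLinkDidFire:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
    self.displayLink = link;    // hypothetical property
}

- (void)displayLinkDidFire:(CADisplayLink *)link
{
    // Map the host clock to the item's timebase before asking for a frame.
    CMTime itemTime = [self.playerItemVideoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.playerItemVideoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef buffer =
            [self.playerItemVideoOutput copyPixelBufferForItemTime:itemTime
                                                itemTimeForDisplay:NULL];
        if (buffer) {
            // hand the buffer to the CMSampleBuffer wrapping shown above...
            CVPixelBufferRelease(buffer);
        }
    }
}
```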

Now I would also like access to the raw audio data, but have had no luck so far. I tried to use MTAudioProcessingTap as described here:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
Unfortunately I could not get it to work properly. I managed to access the underlying assetTrack of the AVPlayerItem, but the callback methods "prepare" and "process" of the MTAudioProcessingTap are never called. I am not sure whether I am on the right track here.
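For reference, the tap wiring I tried looks roughly like this (a sketch following the linked article, not a working solution for this stream; the callback and method names are my own):

```objc
#import <AVFoundation/AVFoundation.h>
#import <MediaToolbox/MediaToolbox.h>

static void tapInit(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut)
{
    *tapStorageOut = clientInfo;
}

static void tapPrepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames,
                       const AudioStreamBasicDescription *format)
{
    // never called for the HLS stream in question
}

static void tapProcess(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                       MTAudioProcessingTapFlags flags,
                       AudioBufferList *bufferListInOut,
                       CMItemCount *numberFramesOut,
                       MTAudioProcessingTapFlags *flagsOut)
{
    // Pull the samples through the tap; this is where raw PCM would arrive.
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                       flagsOut, NULL, numberFramesOut);
}

- (void)attachTapToPlayerItem:(AVPlayerItem *)playerItem
{
    AVAssetTrack *audioTrack =
        [[playerItem.asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    if (!audioTrack) {
        return;    // for a remote HLS asset this lookup may come up empty
    }

    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = NULL;
    callbacks.init = tapInit;
    callbacks.finalize = NULL;
    callbacks.prepare = tapPrepare;
    callbacks.unprepare = NULL;
    callbacks.process = tapProcess;

    MTAudioProcessingTapRef tap = NULL;
    OSStatus status = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                                 kMTAudioProcessingTapCreationFlag_PostEffects,
                                                 &tap);
    if (status != noErr || tap == NULL) {
        return;
    }

    AVMutableAudioMixInputParameters *params =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
    params.audioTapProcessor = tap;
    CFRelease(tap);    // the input parameters retain the tap

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = @[params];
    playerItem.audioMix = audioMix;
}
```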

AVPlayer is playing the audio of the stream through the speaker, so internally the audio seems to be available as raw audio data. Is it possible to access that raw audio data? If it is not possible with AVPlayer, are there other approaches?

If possible, I would prefer not to use ffmpeg, because the hardware decoder of the iOS device should be used for decoding the stream.

0 Answers