What I want is to play a video (from a local file or a remote URL) together with its audio track, and to retrieve the pixel buffer of each video frame so that I can draw it as an OpenGL texture.

Here is the code I use on iOS 6 (where it works fine):

Starting the video
- (void)readMovie:(NSURL *)url {
    NSLog(@"Playing video %@", url);
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            NSError *error = nil;
            AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
            if (status == AVKeyValueStatusLoaded) {
                NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
                AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
                AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
                [playerItem addOutput:output];
                AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
                [self setPlayer:player];
                [self setPlayerItem:playerItem];
                [self setOutput:output];
                // Observe this item only (not object:nil, which would also catch other players' items)
                [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(bufferingVideo:) name:AVPlayerItemPlaybackStalledNotification object:playerItem];
                [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoEnded:) name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];
                [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoFailed:) name:AVPlayerItemFailedToPlayToEndTimeNotification object:playerItem];
                [[self player] addObserver:self forKeyPath:@"rate" options:0 context:NULL];
                [[self player] addObserver:self forKeyPath:@"status" options:0 context:NULL];
                [player play];
            } else {
                NSLog(@"%@ Failed to load the tracks.", self);
            }
        });
    }];
}
Reading the video buffer (in an update function that is called once per frame)
- (void)readNextMovieFrame {
    CMTime outputItemTime = [[self playerItem] currentTime];
    float interval = [self maxTimeLoaded];
    CMTime t = [[self playerItem] currentTime];
    CMTime d = [[self playerItem] duration];
    NSLog(@"Video : %f/%f (loaded : %f) - speed : %f", (float)t.value / (float)t.timescale, (float)d.value / (float)d.timescale, interval, [self player].rate);
    [videoBar updateProgress:(interval / CMTimeGetSeconds(d))];
    [videoBar updateSlider:(CMTimeGetSeconds(t) / CMTimeGetSeconds(d))];
    if ([[self output] hasNewPixelBufferForItemTime:outputItemTime]) {
        CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:outputItemTime itemTimeForDisplay:nil];
        // Lock the image buffer
        CVPixelBufferLockBaseAddress(buffer, 0);
        // Get information about the image
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(buffer);
        size_t width = CVPixelBufferGetWidth(buffer);
        size_t height = CVPixelBufferGetHeight(buffer);
        // Fill the texture
        // (Note: this assumes CVPixelBufferGetBytesPerRow(buffer) == width * 4;
        // if the buffer has row padding, the rows must be uploaded individually.)
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, baseAddress);
        // Unlock and release the image buffer
        CVPixelBufferUnlockBaseAddress(buffer, 0);
        CVBufferRelease(buffer);
    }
}
So this code works fine on iOS 6, and I would like it to run on iOS 5 as well. However, AVPlayerItemVideoOutput is not part of iOS 5, so while I can still play the video, I don't know how to retrieve the pixel buffer of each frame of the video.
Do you know what I could use instead of AVPlayerItemVideoOutput to retrieve the pixel buffer of each frame? (It has to work for both local and remote videos, and I also want the audio track to play.)
Thanks a lot for your help!
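For what it's worth, the closest pre-iOS-6 API I'm aware of is AVAssetReader with an AVAssetReaderTrackOutput (available since iOS 4.1), but as far as I know it only reads local assets and does not play audio by itself, so I'm not sure it covers my remote-URL case. A rough sketch of that approach, for a local file:

```
// One-time setup: create a reader on the (local) asset and attach a
// track output that decodes to 32BGRA, the same format as above.
NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
AVAssetReaderTrackOutput *trackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:settings];
[reader addOutput:trackOutput];
[reader startReading];

// Then, once per frame:
CMSampleBufferRef sampleBuffer = [trackOutput copyNextSampleBuffer];
if (sampleBuffer) {
    CVImageBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... lock the buffer and upload it to the GL texture as above, then unlock ...
    CFRelease(sampleBuffer);
}
```

The drawback is that copyNextSampleBuffer delivers frames at decode speed rather than at the playback clock, so I would still need to pace the frames (and play the audio track separately) myself.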