I've been able to use AVFoundation's AVAssetReader class to upload video frames into OpenGL ES textures. It has a caveat, however, in that it fails when used with an AVURLAsset that points to remote media. This failure isn't well documented, and I'm wondering whether there is any way to work around this shortcoming.
Answer (score: 31)
I've been able to simplify this using some of the APIs released with iOS 6. It doesn't use AVAssetReader at all, and instead relies on a class named AVPlayerItemVideoOutput. An instance of this class can be added to any AVPlayerItem instance via the new -addOutput: method.

Unlike AVAssetReader, this class works with AVPlayerItems that are backed by a remote AVURLAsset, and it also has the benefit of allowing a more sophisticated playback interface that supports non-linear playback via -copyPixelBufferForItemTime:itemTimeForDisplay: (instead of AVAssetReader's severely limiting -copyNextSampleBuffer method).
#import <AVFoundation/AVFoundation.h>

// Initialize the AVFoundation state
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    if (status == AVKeyValueStatusLoaded)
    {
        NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
        // Note: -autorelease implies manual reference counting; omit it under ARC.
        AVPlayerItemVideoOutput *output = [[[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings] autorelease];
        AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
        [playerItem addOutput:output];
        AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
        // Assume some instance variables exist here. You'll need them to control the
        // playback of the video (via the AVPlayer), and to copy sample buffers (via the
        // AVPlayerItemVideoOutput).
        [self setPlayer:player];
        [self setPlayerItem:playerItem];
        [self setOutput:output];
    }
    else
    {
        NSLog(@"%@ Failed to load the tracks.", self);
    }
}];

// Now at any later point in time, you can get a pixel buffer
// that corresponds to the current AVPlayer state like this:
CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:[[self playerItem] currentTime] itemTimeForDisplay:nil];
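In practice, you would typically drive that copy from a display-timed callback rather than on demand. Below is a minimal sketch of such a polling loop; it assumes a CADisplayLink configured elsewhere to invoke it once per screen refresh, and the same playerItem/output properties as above (the selector name is illustrative):

// Assumed to be invoked once per screen refresh by a CADisplayLink.
- (void)displayLinkDidFire:(CADisplayLink *)link
{
    CMTime itemTime = [[self playerItem] currentTime];
    // Only copy when the output actually has a new frame for this timestamp.
    if ([[self output] hasNewPixelBufferForItemTime:itemTime])
    {
        CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:itemTime itemTimeForDisplay:nil];
        if (buffer != NULL)
        {
            // ... upload the buffer to an OpenGL texture here ...
            // Per the Core Foundation "copy" rule, we own the buffer and must release it.
            CVBufferRelease(buffer);
        }
    }
}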
Once you have your buffer, you can upload it to OpenGL however you want. I recommend the horribly documented CVOpenGLESTextureCacheCreateTextureFromImage() function, because you'll get hardware acceleration on all the newer devices, which is much faster than glTexSubImage2D(). See Apple's GLCameraRipple and RosyWriter demos for examples.
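For reference, here is a minimal sketch of that texture-cache path, modeled on those demos. It assumes an EAGLContext stored in _context and a CVOpenGLESTextureCacheRef ivar named _textureCache (both names are illustrative), plus the BGRA pixel buffer copied above:

#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

// One-time setup: create a texture cache bound to your EAGLContext.
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, _context, NULL, &_textureCache);

// Per frame: wrap the pixel buffer in a GL texture without a CPU-side copy.
CVOpenGLESTextureRef texture = NULL;
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                            _textureCache,
                                                            buffer, // the BGRA CVPixelBufferRef from above
                                                            NULL,
                                                            GL_TEXTURE_2D,
                                                            GL_RGBA,
                                                            (GLsizei)CVPixelBufferGetWidth(buffer),
                                                            (GLsizei)CVPixelBufferGetHeight(buffer),
                                                            GL_BGRA,
                                                            GL_UNSIGNED_BYTE,
                                                            0,
                                                            &texture);
if (err == kCVReturnSuccess)
{
    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    // ... draw with the texture, then clean up ...
    CFRelease(texture);
    CVOpenGLESTextureCacheFlush(_textureCache, 0);
}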