I'm building a 64-bit native OS X (not iOS) app targeting 10.7+. Handling video files in the Cocoa universe is somewhat new to me.
I want to be able to open a video file and display its output in an OpenGL render; i.e., I want efficient access to the video's framebuffer so I can turn each frame into an OpenGL texture.
Conceptually this seems simple, but I'm struggling to wade through the various (old and deprecated) examples and options, all of which seem to have recently been deprecated in favor of AVFoundation. I may be missing something, but examples of using AVFoundation with OpenGL seem thin on the ground.
To clarify further, this sample application (QTCoreVideo101 from Apple) does more or less exactly what I want, except that it is built around the deprecated QTKit, so it won't even compile as 64-bit.
I'm now reading through the AVFoundation documentation, but I'm still not sure whether it makes sense to try to get a GL texture out of AVFoundation, or whether I should be looking elsewhere.
Here is the solution I ended up going with. "thisLayer.layerSource.videoPlayerOutput" is an AVPlayerItemVideoOutput object.
if ([thisLayer.layerSource.videoPlayerOutput hasNewPixelBufferForItemTime:playerTime]) {
    frameBuffer = [thisLayer.layerSource.videoPlayerOutput copyPixelBufferForItemTime:playerTime itemTimeForDisplay:NULL];
    CVReturn result = CVOpenGLTextureCacheCreateTextureFromImage(NULL,
                                                                 textureCache,
                                                                 frameBuffer,
                                                                 NULL,
                                                                 &textureRef);
    if (result == kCVReturnSuccess) {
        // These appear to be GL_TEXTURE_RECTANGLE_ARB
        thisLayer.layerSource.vid_glTextureTarget = CVOpenGLTextureGetTarget(textureRef);
        thisLayer.layerSource.vid_glTexture = CVOpenGLTextureGetName(textureRef);
        thisLayer.layerSource.vid_glTextureSize = NSMakeSize(CVPixelBufferGetWidth(frameBuffer), CVPixelBufferGetHeight(frameBuffer));
        thisLayer.layerSource.vid_ciimage = [CIImage imageWithCVImageBuffer:frameBuffer];
        CFRelease(textureRef);
        CVOpenGLTextureCacheFlush(textureCache, 0);
    } else {
        NSLog(@"INTERNAL ERROR FAILED WITH CODE: %i", result);
    }
    CVBufferRelease(frameBuffer);
}
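For context, the snippet above relies on some one-time setup that isn't shown. A rough sketch of what that setup could look like follows; the names playerItem, textureCache, and playerTime are illustrative (matching the snippet's usage), and the BGRA pixel-format choice is an assumption, not necessarily what the original code used:

```objc
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>
#import <OpenGL/OpenGL.h>

// Ask AVFoundation for pixel buffers in a GL-friendly format (assumed here: BGRA).
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoPlayerOutput =
    [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
[playerItem addOutput:videoPlayerOutput];   // playerItem: your AVPlayerItem

// Texture cache tied to the GL context that will draw the frames.
CVOpenGLTextureCacheRef textureCache = NULL;
CVOpenGLTextureCacheCreate(kCFAllocatorDefault,
                           NULL,
                           CGLGetCurrentContext(),
                           CGLGetPixelFormat(CGLGetCurrentContext()),
                           NULL,
                           &textureCache);

// Per frame, map the host clock onto the item's timeline:
CMTime playerTime = [videoPlayerOutput itemTimeForHostTime:CACurrentMediaTime()];
```

Note that the texture cache must be created against the same CGL context (or a shared one) that later renders the resulting textures.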
Answer 0 (score: 0)
An AVAssetReaderTrackOutput (added to your AVAssetReader) will vend CVPixelBufferRefs, and you can specify your preferred pixel format for upload to OpenGL via glTexImage or a similar call.
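A minimal sketch of the AVAssetReader path this answer describes might look like the following; movieURL is assumed to point at a local video file, and the BGRA output setting is one illustrative choice among several GL-friendly formats:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
NSError *error = nil;
AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];

// Pull the first video track and ask for decoded BGRA pixel buffers.
NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoTrack = [videoTracks count] ? [videoTracks objectAtIndex:0] : nil;
NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVAssetReaderTrackOutput *trackOutput =
    [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:settings];
[reader addOutput:trackOutput];
[reader startReading];

// Each sample buffer wraps a CVPixelBufferRef ready for upload to GL.
CMSampleBufferRef sample = [trackOutput copyNextSampleBuffer];
if (sample) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sample);
    // ...upload pixelBuffer via glTexImage2D or a texture cache here...
    CFRelease(sample);
}
```

The reader path decodes as fast as you can pull samples, so it suits offline processing; for playback-synchronized frames, AVPlayerItemVideoOutput (as in the accepted solution) is the better fit.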
Answer 1 (score: 0)
I've been looking at this as well; here is my current solution:
- (BOOL)renderWithCVPixelBufferForTime:(NSTimeInterval)time
{
    CMTime vTime = [self.playeroutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.playeroutput hasNewPixelBufferForItemTime:vTime]) {
        if (_cvPixelBufferRef) {
            CVPixelBufferUnlockBaseAddress(_cvPixelBufferRef, kCVPixelBufferLock_ReadOnly);
            CVPixelBufferRelease(_cvPixelBufferRef);
        }
        _cvPixelBufferRef = [self.playeroutput copyPixelBufferForItemTime:vTime itemTimeForDisplay:NULL];
        CVPixelBufferLockBaseAddress(_cvPixelBufferRef, kCVPixelBufferLock_ReadOnly);
        GLsizei texWidth = (GLsizei)CVPixelBufferGetWidth(_cvPixelBufferRef);
        GLsizei texHeight = (GLsizei)CVPixelBufferGetHeight(_cvPixelBufferRef);
        GLvoid *baseAddress = CVPixelBufferGetBaseAddress(_cvPixelBufferRef);
        glBindTexture(GL_TEXTURE_RECTANGLE_ARB, self.textureName);
        glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_STORAGE_HINT_APPLE, GL_STORAGE_CACHED_APPLE);
        glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGB, texWidth, texHeight, 0, GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, baseAddress);
        glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
    }
    return YES;
}
However, I wonder whether there is a more efficient solution; I've also asked a question comparing several approaches to the same problem:
Best path from AVPlayerItemVideoOutput to openGL Texture
The lock-base-address call is a hog, and I'm not sure it's really needed.
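On OS X, a per-frame method like renderWithCVPixelBufferForTime: above is typically driven from a CVDisplayLink rather than a timer. A minimal sketch, assuming ARC and a hypothetical MyRenderer class exposing that method (the userInfo wiring is illustrative):

```objc
#import <CoreVideo/CoreVideo.h>
#import <QuartzCore/QuartzCore.h>

// Display-link callback; fires on a background thread at each display refresh,
// so the GL context must be made current on that thread before uploading.
static CVReturn DisplayLinkCallback(CVDisplayLinkRef displayLink,
                                    const CVTimeStamp *inNow,
                                    const CVTimeStamp *inOutputTime,
                                    CVOptionFlags flagsIn,
                                    CVOptionFlags *flagsOut,
                                    void *userInfo)
{
    MyRenderer *renderer = (__bridge MyRenderer *)userInfo;
    [renderer renderWithCVPixelBufferForTime:CACurrentMediaTime()];
    return kCVReturnSuccess;
}

// One-time setup:
CVDisplayLinkRef displayLink = NULL;
CVDisplayLinkCreateWithActiveCGLDisplays(&displayLink);
CVDisplayLinkSetOutputCallback(displayLink, DisplayLinkCallback, (__bridge void *)renderer);
CVDisplayLinkStart(displayLink);
```

Because the callback runs off the main thread, any shared GL state between it and the main rendering path needs a CGL context lock or equivalent synchronization.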