Is it possible to render a movie to an OpenGL texture in real time using Apple's iOS frameworks? I've seen it done with glTexSubImage2D in an old NeHe tutorial, but how do I get at the RGB data through the Apple frameworks?
Initialization
NSString *mPath = [[NSBundle mainBundle] pathForResource:@"movie" ofType:@"m4v"];
NSURL *url = [NSURL fileURLWithPath:mPath];
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:options];
imgGen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
// Zero tolerance forces an exact (and therefore expensive) seek for every frame
imgGen.requestedTimeToleranceBefore = kCMTimeZero;
imgGen.requestedTimeToleranceAfter = kCMTimeZero;
Each frame
double time = fraction * CMTimeGetSeconds(asset.duration); // fraction: playback position in [0, 1]
CMTime reqTime = CMTimeMakeWithSeconds(time, asset.duration.timescale), actTime;
NSError *err = nil;
CGImageRef ref = [imgGen copyCGImageAtTime:reqTime actualTime:&actTime error:&err];
//... GL calls to make an image from the CGImageRef
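For reference, the elided GL upload looks roughly like the sketch below. It assumes a texture named tex that was already created with glTexImage2D at the movie's dimensions (tex is my name, not from any Apple sample), and it redraws the CGImage into a plain RGBA buffer each frame:

size_t w = CGImageGetWidth(ref), h = CGImageGetHeight(ref);
void *data = malloc(w * h * 4);
CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
// Draw the frame into an RGBA bitmap that GL can consume directly
CGContextRef ctx = CGBitmapContextCreate(data, w, h, 8, w * 4, cs, kCGImageAlphaPremultipliedLast);
CGContextDrawImage(ctx, CGRectMake(0, 0, w, h), ref);
glBindTexture(GL_TEXTURE_2D, tex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, (GLsizei)w, (GLsizei)h, GL_RGBA, GL_UNSIGNED_BYTE, data);
CGContextRelease(ctx);
CGColorSpaceRelease(cs);
CGImageRelease(ref); // copyCGImageAtTime: follows the Create/Copy ownership rule
free(data);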
This approach is far too slow for real-time rendering; I only get ~15 fps. One option might be to generate the frames asynchronously (see the sketch below), but surely this can be done in real time? The most time-consuming part is the copyCGImageAtTime call.
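To frame the question, the asynchronous variant I have in mind would use AVAssetImageGenerator's generateCGImagesAsynchronouslyForTimes:completionHandler:, roughly as sketched below. Here frameCount and uploadFrame: are hypothetical names of mine, and the upload is bounced back to the main thread because GL contexts are not thread-safe:

NSMutableArray *times = [NSMutableArray array];
for (int i = 0; i < frameCount; i++) { // frameCount: hypothetical number of frames to request
    double t = (i / (double)frameCount) * CMTimeGetSeconds(asset.duration);
    [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(t, asset.duration.timescale)]];
}
[imgGen generateCGImagesAsynchronouslyForTimes:times
    completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                        AVAssetImageGeneratorResult result, NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            CGImageRetain(image); // the image is only guaranteed valid inside this block
            dispatch_async(dispatch_get_main_queue(), ^{
                [self uploadFrame:image]; // hypothetical method doing the GL upload
                CGImageRelease(image);
            });
        }
    }];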