I'm developing a video-conferencing application, and the following code successfully draws a frame on screen:
- (int)drawFrameOnMainThread {
    if (mBitmapContext) {
        if (mDisplay) {
            CGImageRef imageRef = CGBitmapContextCreateImage(mBitmapContext);
#if TARGET_OS_IPHONE
            UIImage *image = [UIImage imageWithCGImage:imageRef];
            [self performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];
#elif TARGET_OS_MAC
            [mDisplay setCurrentImage:imageRef];
#endif
            CGImageRelease(imageRef);
        }
    }
    return 0;
}
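(The setImage: method itself is not shown in the question; a hypothetical implementation on the iOS side might simply hand the image to a UIImageView. The ivar name mImageView is an assumption, not part of the original code.)

// Hypothetical receiver for performSelectorOnMainThread:@selector(setImage:).
// mImageView is an assumed UIImageView ivar, not shown in the question.
- (void)setImage:(UIImage *)image {
    mImageView.image = image;
}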
I want to apply a CIFilter to the frames as they are drawn, so I modified the iOS portion of the code as follows:
UIImage *image = [UIImage imageWithCGImage:imageRef];
CIImage *beginImage = image.CIImage;
CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                              keysAndValues:kCIInputImageKey, beginImage,
                                            @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImg = [UIImage imageWithCGImage:cgimg];
[self performSelectorOnMainThread:@selector(setImage:) withObject:newImg waitUntilDone:YES];
The result is that my video screen stays black. Can anyone spot the mistake here? I've been at this for hours and can't figure it out.
Answer 0 (score: 4)
I've solved the problem: the issue was the line that initializes the CIImage:
// Wrong: image.CIImage is nil here, because the UIImage was created from a CGImage, not from a CIImage
CIImage *beginImage = image.CIImage;
// Right: build the CIImage directly from the CGImageRef
CIImage *beginImage = [CIImage imageWithCGImage:imageRef];
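Pulling that fix back into the drawing routine, a minimal sketch of the corrected iOS branch could look like the following (ivar names follow the question's code; note that the CGImage returned by -createCGImage:fromRect: is owned by the caller and should be released):

#if TARGET_OS_IPHONE
            // Build the CIImage directly from the bitmap's CGImage.
            CIImage *beginImage = [CIImage imageWithCGImage:imageRef];
            CIContext *ciContext = [CIContext contextWithOptions:nil];
            CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                                          keysAndValues:kCIInputImageKey, beginImage,
                                                        @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
            CIImage *outputImage = [filter outputImage];
            CGImageRef cgimg = [ciContext createCGImage:outputImage fromRect:[outputImage extent]];
            UIImage *newImg = [UIImage imageWithCGImage:cgimg];
            CGImageRelease(cgimg); // createCGImage:fromRect: returns a +1 reference
            [self performSelectorOnMainThread:@selector(setImage:) withObject:newImg waitUntilDone:YES];
#endif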
As Brad said, the performance is unacceptable: on an iPad 2 the video lags the audio by about 5 seconds. So I'll look for other solutions for this, but I'm still happy to have seen it work, more as a proof of concept than anything else :)
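One likely contributor to the slowdown is that the code above creates a new CIContext for every frame, which is expensive; a common mitigation is to create the context once and reuse it across frames. A minimal sketch, assuming an ivar named mCIContext that is not in the original code:

// Create the Core Image context once and reuse it for every frame.
// mCIContext is an assumed ivar; the original code builds a new context per frame.
if (mCIContext == nil) {
    mCIContext = [[CIContext contextWithOptions:nil] retain]; // drop the retain under ARC
}
CGImageRef cgimg = [mCIContext createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *newImg = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);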