Objective-C: fastest way to overlay a video

Time: 2016-04-25 09:57:27

Tags: objective-c video overlay avasset avmutablecomposition

After recording a video, I want to overlay it with a dynamic UIView whose content depends on the current video frame timestamp. In fact, I am trying to do the same thing as the Vidometer app. Following the Apple example, I am able to extract the video frames into a buffer and composite a UIImage of my UIView over them.
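For context, the reader/writer pair that the steps below rely on is set up roughly like this (a minimal sketch, not my exact code: only `assetReaderVideoOutput` and `assetWriterVideoInput` appear in the snippets below; `videoURL`, `outputURL`, and the other names are placeholders):

```objectivec
// Sketch of the AVAssetReader/AVAssetWriter pipeline (placeholder names).
NSError *error = nil;
AVAsset *asset = [AVAsset assetWithURL:videoURL];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

// Reader: decode frames as BGRA pixel buffers so Core Image can work on them.
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error];
NSDictionary *readerSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVAssetReaderTrackOutput *assetReaderVideoOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                               outputSettings:readerSettings];
[assetReader addOutput:assetReaderVideoOutput];

// Writer: nil outputSettings means pass-through; sample buffers are appended as-is.
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outputURL
                                                      fileType:AVFileTypeQuickTimeMovie
                                                         error:&error];
AVAssetWriterInput *assetWriterVideoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:nil];
[assetWriter addInput:assetWriterVideoInput];

[assetReader startReading];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
```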

Here are the steps:

  1. Extract the video frame:

    CMSampleBufferRef sampleBuffer = [assetReaderVideoOutput copyNextSampleBuffer];
    // copyNextSampleBuffer returns NULL at the end of the stream; guard before using the buffer
    if (sampleBuffer == NULL || cancelled) {
        completedOrFailed = YES;
        [assetWriterVideoInput markAsFinished];
        return; // or break, depending on the enclosing loop
    }
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    
  2. Update the subviews of my UIView according to the frame timestamp:

    mergeTime = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer));
    [self updateWidget:mergeTime*1000];
    
  3. Get a UIImage of my UIView:

    mSurcoucheImage = [self imageFromSurcouche];
    

    using

    -(UIImage*)imageFromSurcouche{
        CGSize mSize = CGSizeMake(self.mSurcouche.bounds.size.width, self.mSurcouche.bounds.size.height);
        UIGraphicsBeginImageContextWithOptions(mSize, NO, 0.0);
        if (videoOrientation == UIInterfaceOrientationLandscapeRight) {
            CGContextTranslateCTM(UIGraphicsGetCurrentContext(), mSize.width, mSize.height);
            CGContextRotateCTM(UIGraphicsGetCurrentContext(), M_PI);
        }
        [self.mSurcouche.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage* image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return image;
    }
    
  4. Apply a filter to preserve the alpha of my UIImage:

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:options];
    filter = [CIFilter filterWithName:@"CISourceOverCompositing"];
    [filter setValue:maskImage forKey:kCIInputImageKey];
    [filter setValue:inputImage forKey:kCIInputBackgroundImageKey];
    outputImage = [filter outputImage];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    

    using

    colorSpace = CGColorSpaceCreateDeviceRGB();
    options = [NSDictionary dictionaryWithObject:(__bridge id)colorSpace forKey:kCIImageColorSpace];
    
  5. Render my UIImage into the video frame buffer:

    // reuse the colorSpace created in step 4; calling CGColorSpaceCreateDeviceRGB() inline here leaks one color space per frame
    [ciContext render:outputImage toCVPixelBuffer:pixelBuffer bounds:[outputImage extent] colorSpace:colorSpace];
    

    using

    eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    ciContext = [CIContext contextWithEAGLContext:eaglContext options:@{kCIContextWorkingColorSpace : [NSNull null]}];
    if (eaglContext != [EAGLContext currentContext])
       [EAGLContext setCurrentContext:eaglContext];
    
  6. I am trying to achieve the same performance as the Vidometer app, whose overlay processing takes less time than the video duration, so I am wondering about this approach:

    Question 1: Is this the most efficient method?

    Question 2: I have also seen another method that uses AVMutableComposition, but I don't think it lets me synchronize my UIView with the video frame timestamps. Can I?
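Regarding question 2, the AVMutableComposition route typically pairs with AVVideoCompositionCoreAnimationTool, which composites a CALayer tree over the video at export time. As far as I can tell it offers no per-frame callback, so a live UIView cannot be driven from frame timestamps; the overlay's timing has to be pre-baked as Core Animation animations. A minimal sketch, assuming the composition's video instructions are configured elsewhere and `renderSize` is a placeholder:

```objectivec
// Sketch: overlay via AVMutableVideoComposition + AVVideoCompositionCoreAnimationTool.
CGSize renderSize = CGSizeMake(1280, 720); // placeholder

CALayer *videoLayer = [CALayer layer]; // the tool renders the video frames into this layer
videoLayer.frame = CGRectMake(0, 0, renderSize.width, renderSize.height);

CALayer *overlayLayer = [CALayer layer]; // stand-in for the UIView content
overlayLayer.frame = videoLayer.frame;
// Dynamic behaviour is expressed as CAAnimations with
// beginTime = AVCoreAnimationBeginTimeAtZero (+ offset), not per-frame updates.

CALayer *parentLayer = [CALayer layer];
parentLayer.frame = videoLayer.frame;
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:overlayLayer];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = renderSize;
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                            inLayer:parentLayer];
// videoComposition.instructions must still describe the composition's video track.
```

Exported with AVAssetExportSession, this can be faster than per-frame Core Image compositing, at the cost of having to express the dynamic widget as CAAnimations rather than live UIView updates.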

0 Answers:

No answers