GPUImage's GPUImageOpacityFilter is not behaving correctly and does not change the alpha channel

Date: 2013-01-23 00:58:12

Tags: iphone ios objective-c opengl-es gpuimage

I'm trying to do an Overlay Blend of a stock image with the output of the camera feed, where the stock image is at less than 100% opacity. I figured I could place a GPUImageOpacityFilter in the filter stack and everything would be fine:

  1. GPUImageVideoCamera -> MY_GPUImageOverlayBlendFilter
  2. GPUImagePicture -> GPUImageOpacityFilter (opacity 0.1f) -> MY_GPUImageOverlayBlendFilter
  3. MY_GPUImageOverlayBlendFilter -> GPUImageView

But instead of a 0.1f-alpha version of the GPUImagePicture being blended into the GPUImageVideoCamera feed, the result looks as if the GPUImagePicture's color/contrast has been slightly softened before blending. So I did some searching, and tried pulling a UIImage out of the GPUImageOpacityFilter with imageFromCurrentlyProcessedOutput and sending that to the blend filter:

    1. GPUImagePicture -> MY_GPUImageOpacityFilter (opacity 0.1f)
    2. [MY_GPUImageOpacityFilter imageFromCurrentlyProcessedOutput] -> MY_alphaedImage
    3. GPUImagePicture(MY_alphaedImage) -> MY_GPUImageOverlayBlendFilter
    4. GPUImageVideoCamera -> MY_GPUImageOverlayBlendFilter
    5. MY_GPUImageOverlayBlendFilter -> GPUImageView

    This works exactly as I expected. So why do I need imageFromCurrentlyProcessedOutput? Shouldn't the first chain just work in-line? Here are code snippets for both scenarios:

      The first:

      //Create the GPUPicture
      UIImage *image = [UIImage imageNamed:@"someFile"];
      GPUImagePicture *textureImage = [[[GPUImagePicture alloc] initWithImage:image] autorelease];
      
      //Create the Opacity filter w/0.5 opacity
      GPUImageOpacityFilter *opacityFilter = [[[GPUImageOpacityFilter alloc] init] autorelease];
      opacityFilter.opacity = 0.5f;
      [textureImage addTarget:opacityFilter];
      
      //Create the blendFilter
      GPUImageFilter *blendFilter = [[[GPUImageOverlayBlendFilter alloc] init] autorelease];
      
      //Point the cameraDevice's output at the blendFilter
      [self._videoCameraDevice addTarget:blendFilter];
      
      //Point the opacityFilter's output at the blendFilter
      [opacityFilter addTarget:blendFilter];
      
      [textureImage processImage];
      
      //Point the output of the blendFilter at our previewView
      GPUImageView *filterView = (GPUImageView *)self.previewImageView;
      [blendFilter addTarget:filterView];
      

      The second:

      //Create the GPUPicture
      UIImage *image = [UIImage imageNamed:@"someFile"];
      GPUImagePicture *textureImage = [[[GPUImagePicture alloc] initWithImage:image] autorelease];
      
      //Create the Opacity filter w/0.5 opacity
      GPUImageOpacityFilter *opacityFilter = [[[GPUImageOpacityFilter alloc] init] autorelease];
      opacityFilter.opacity = 0.5f;
      [textureImage addTarget:opacityFilter];
      
      //Process the image so we get a UIImage with 0.5 opacity of the original
      [textureImage processImage];
      UIImage *processedImage = [opacityFilter imageFromCurrentlyProcessedOutput];
      GPUImagePicture *processedTextureImage = [[[GPUImagePicture alloc] initWithImage:processedImage] autorelease];
      
      //Create the blendFilter
      GPUImageFilter *blendFilter = [[[GPUImageOverlayBlendFilter alloc] init] autorelease];
      
      //Point the cameraDevice's output at the blendFilter
      [self._videoCameraDevice addTarget:blendFilter];
      
      //Point the opacityFilter's output at the blendFilter
      [processedTextureImage addTarget:blendFilter];
      
      [processedTextureImage processImage];
      
      //Point the output of the blendFilter at our previewView
      GPUImageView *filterView = (GPUImageView *)self.previewImageView;
      [blendFilter addTarget:filterView];
      

0 Answers:

No answers yet