Video as an OpenGL ES texture

Asked: 2016-07-10 04:29:01

Tags: ios swift video opengl-es render-to-texture

I want to pass a local video as a texture to an OpenGL shader. I know there are many posts on related topics; some are outdated or quirky, and others I couldn't get to work.

The approach that sounds workable is:

  • Load the video
  • Get the video output as a CVPixelBuffer
  • From there the methods diverge: yuv vs. rgb, CVOpenGLESTextureCacheCreateTextureFromImage vs. glTexImage2D, and so on. If there is no particular reason to use yuv, I would rather stick with rgb.

My code can render UIImages, but I haven't been able to adapt it to video.

The current advice seems to be that CVOpenGLESTextureCacheCreateTextureFromImage is preferred over glTexImage2D for passing video frames to an OpenGL program. Some people convert the video output buffer to an image and then push that through the pipeline, but that sounds inefficient.
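From what I've gathered, the texture-cache route would look roughly like this (an untested sketch in my own words; the helper names are mine, and since I request BGRA buffers I'm trying the single-texture rgb path):

    import CoreVideo
    import OpenGLES

    // Sketch: the cache is created once alongside the rest of the GL setup;
    // `context` is the EAGLContext the GL program renders with.
    func makeTextureCache(context: EAGLContext) -> CVOpenGLESTextureCache? {
        var cache: CVOpenGLESTextureCache?
        CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, context, nil, &cache)
        return cache
    }

    // Wraps a BGRA pixel buffer in a GL texture without copying the pixels.
    func texture(from pixelBuffer: CVPixelBuffer, cache: CVOpenGLESTextureCache) -> CVOpenGLESTexture? {
        var texture: CVOpenGLESTexture?
        let width = GLsizei(CVPixelBufferGetWidth(pixelBuffer))
        let height = GLsizei(CVPixelBufferGetHeight(pixelBuffer))
        // kCVPixelFormatType_32BGRA maps onto a single RGBA texture with a BGRA
        // source format, so no yuv conversion is needed in the shader.
        let err = CVOpenGLESTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, cache, pixelBuffer, nil,
            GLenum(GL_TEXTURE_2D), GL_RGBA, width, height,
            GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE), 0, &texture)
        return err == kCVReturnSuccess ? texture : nil
    }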

First, here is how I get the video pixel buffer that is handed to the view managing the GL program (you can skip this part, as I believe it works correctly):

import UIKit
import AVFoundation

class ViewController: UIViewController {
    // video things
    var videoOutput: AVPlayerItemVideoOutput!
    var player: AVPlayer!
    var playerItem: AVPlayerItem!
    var isVideoReady = false

    override func viewDidLoad() {
        super.viewDidLoad()
        self.setupVideo()
    }

    func setupVideo() -> Void {
        let url = Bundle.main.url(forResource: "myVideoName", withExtension: "mp4")!

        // Note: the pixel-format key must be the CoreVideo constant bridged to String,
        // not the literal name of the constant.
        let outputSettings: [String: Any] = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
        self.videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: outputSettings)
        self.player = AVPlayer()
        let asset = AVURLAsset(url: url)


        asset.loadValuesAsynchronously(forKeys: ["playable"]) {
            var error: NSError? = nil
            let status = asset.statusOfValue(forKey: "playable", error: &error)
            switch status {
            case .loaded:
                self.playerItem = AVPlayerItem(asset: asset)
                self.playerItem.add(self.videoOutput)
                self.player.replaceCurrentItem(with: self.playerItem)
                self.isVideoReady = true
            case .failed:
                print("failed")
            case .cancelled:
                print("cancelled")
            default:
                print("default")
            }
        }
    }

    // this function is called just before the OpenGL program renders a frame
    // and can be used to update the texture (the GL program is fully initialized at this point)
    func onGlRefresh(glView: OpenGLView) -> Void {
        if self.isVideoReady {
            let pixelBuffer = self.videoOutput.copyPixelBuffer(forItemTime: self.playerItem.currentTime(), itemTimeForDisplay: nil)
            glView.pixelBuffer = pixelBuffer
        }
    }
}

This seems to work fine, even though I can't really test it :)
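One caveat I'm aware of: copyPixelBuffer(forItemTime:itemTimeForDisplay:) only returns frames once playback is running, so I assume the .loaded case also needs a self.player.play(), and that onGlRefresh can skip redundant uploads with hasNewPixelBuffer(forItemTime:). A sketch:

    // Assumed addition to the .loaded case: start playback so that the
    // video output actually produces frames.
    self.player.play()

    // And in onGlRefresh, only copy when a fresh frame is available:
    let time = self.playerItem.currentTime()
    if self.videoOutput.hasNewPixelBuffer(forItemTime: time) {
        glView.pixelBuffer = self.videoOutput.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil)
    }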

So now that I have a CVPixelBuffer available (once the video is loaded), how do I pass it to the GL program?

This code works with a CGImage?:

    // textureSource is a CGImage?
    guard let textureSource = textureSource else { return }
    let width: Int = textureSource.width
    let height: Int = textureSource.height

    // Allocate a zeroed RGBA buffer (4 bytes per pixel)
    let spriteData = calloc(width * height * 4, MemoryLayout<GLubyte>.size)

    let colorSpace = textureSource.colorSpace!

    let spriteContext: CGContext = CGContext(data: spriteData, width: width, height: height, bitsPerComponent: 8, bytesPerRow: width * 4, space: colorSpace, bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
    spriteContext.draw(textureSource, in: CGRect(x: 0, y: 0, width: CGFloat(width), height: CGFloat(height)))

    glBindTexture(GLenum(GL_TEXTURE_2D), _textureId!)
    glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA, GLsizei(width), GLsizei(height), 0, GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), spriteData)

    free(spriteData)

but I can't figure out how to adapt it efficiently to a CVPixelBuffer.
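The most direct adaptation I can think of (my own guess at the glTexImage2D route: lock the buffer and upload its base address; I'm ignoring the case where bytesPerRow differs from width * 4) would be:

    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    // Base address of the BGRA pixel data requested from AVPlayerItemVideoOutput.
    let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)

    glBindTexture(GLenum(GL_TEXTURE_2D), _textureId!)
    glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA, GLsizei(width), GLsizei(height), 0,
                 GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE), baseAddress)

    CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)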

I'm happy to share more code if needed, but I think this post is already long enough :)

========== EDIT ==========

I have looked at a bunch of repos (all variations on Apple's CameraRipple and the Ray Wenderlich tutorial), and here is the GitHub repo of what I have done so far (I'll keep it up so the link stays valid). It isn't ideal, but I didn't want to paste too much code here. I have been able to get some video as a texture, but:

  • The colors are wrong
  • The display in the simulator differs from the display on a device. In the simulator, only the left half of the video is shown (stretched to cover the whole screen), and there is some vertical distortion.

The simulator issue may be related to Xcode 8 being in beta, but I'm not sure about that...

2 Answers:

Answer 0 (score: 2):

I faced the same problem some time ago; a good starting point is the sample provided by Apple (CameraRipple).

What you really need:

  1. You should get a CVPixelBufferRef (per your post, this is already done). It should be supplied to the OpenGL program repeatedly in order to display live video.
  2. Use a shader that can handle video (by which I mean a shader that converts yuv into normal colors).
  3. Example:

        precision mediump float;

        varying lowp vec2 v_texCoord;

        uniform sampler2D SamplerUV;
        uniform sampler2D SamplerY;
        uniform mat3 colorConversionMatrix;

        void main()
        {
            mediump vec3 yuv;
            lowp vec3 rgb;

            // Subtract constants to map the video range so that it starts at 0
            yuv.x = (texture2D(SamplerY, v_texCoord).r - (16.0/255.0));
            yuv.yz = (texture2D(SamplerUV, v_texCoord).ra - vec2(0.5, 0.5));

            rgb = yuv * colorConversionMatrix;

            gl_FragColor = vec4(rgb, 1.0);
        }
    
    1. To display the video, Apple recommends using the following colorConversion matrix (I use it as well; a Swift sketch of uploading it appears after this list):

      static const GLfloat kColorConversion709[] = {
          1.1643,  0.0000,  1.2802,
          1.1643, -0.2148, -0.3806,
          1.1643,  2.1280,  0.0000
      };
      
    2. As for how to display the buffer as a texture in OpenGL, you can use something like this:
      -(void)displayPixelBuffer:(CVPixelBufferRef)pixelBuffer
      {
          CVReturn err;
          if (pixelBuffer != NULL) {
              int frameWidth = (int)CVPixelBufferGetWidth(pixelBuffer);
              int frameHeight = (int)CVPixelBufferGetHeight(pixelBuffer);

              if (!_videoTextureCache) {
                  NSLog(@"No video texture cache");
                  return;
              }
              [self cleanUpTextures];

              // Create Y and UV textures from the pixel buffer. These textures will be drawn on the frame buffer.

              // Y-plane.
              glActiveTexture(GL_TEXTURE0);
              err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_LUMINANCE, frameWidth, frameHeight, GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &_lumaTexture);
              if (err) {
                  NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
              }

              glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture));
              glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
              glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
              glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
              glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

              // UV-plane.
              glActiveTexture(GL_TEXTURE1);
              err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_LUMINANCE_ALPHA, frameWidth / 2, frameHeight / 2, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &_chromaTexture);
              if (err) {
                  NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
              }

              glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture));
              glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
              glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
              glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
              glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

              glEnableVertexAttribArray(_vertexBufferID);
              glBindFramebuffer(GL_FRAMEBUFFER, _vertexBufferID);

              CFRelease(pixelBuffer);

              glUniformMatrix3fv(uniforms[UNIFORM_COLOR_CONVERSION_MATRIX], 1, GL_FALSE, _preferredConversion);
          }
      }
      
    3. Don't forget to clean up the textures:

      -(void)cleanUpTextures
      {
          if (_lumaTexture) {
              CFRelease(_lumaTexture);
              _lumaTexture = NULL;
          }
          if (_chromaTexture) {
              CFRelease(_chromaTexture);
              _chromaTexture = NULL;
          }
          // Periodic texture cache flush every frame
          CVOpenGLESTextureCacheFlush(_videoTextureCache, 0);
      }
      
    4. PS: it's not Swift, but converting the Obj-C to Swift shouldn't really be a problem, I guess.
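
As promised above, here is a sketch of uploading the conversion matrix from step 1 in Swift. colorConversionUniform is a hypothetical GLint, assumed to have been obtained via glGetUniformLocation for "colorConversionMatrix":

    // colorConversionUniform is assumed to be the uniform location of
    // "colorConversionMatrix", fetched once after the program is linked.
    let kColorConversion709: [GLfloat] = [
        1.1643,  0.0000,  1.2802,
        1.1643, -0.2148, -0.3806,
        1.1643,  2.1280,  0.0000,
    ]
    glUniformMatrix3fv(colorConversionUniform, 1, GLboolean(GL_FALSE), kColorConversion709)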

Answer 1 (score: 0):

Regarding the colors: you are missing the glUniform1i() calls that bind each texture unit to its sampler uniform at the end of each section in refreshTextures():

func refreshTextures() -> Void {
    guard let pixelBuffer = pixelBuffer else { return }
    let textureWidth: GLsizei = GLsizei(CVPixelBufferGetWidth(pixelBuffer))
    let textureHeight: GLsizei = GLsizei(CVPixelBufferGetHeight(pixelBuffer))

    guard let videoTextureCache = videoTextureCache else { return }

    self.cleanUpTextures()

    // Y plane
    glActiveTexture(GLenum(GL_TEXTURE0))

    var err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, videoTextureCache, pixelBuffer, nil, GLenum(GL_TEXTURE_2D), GL_RED_EXT, textureWidth, textureHeight, GLenum(GL_RED_EXT), GLenum(GL_UNSIGNED_BYTE), 0, &lumaTexture)

    if err != kCVReturnSuccess {
        print("Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err)
        return
    }
    guard let lumaTexture = lumaTexture else { return }

    glBindTexture(CVOpenGLESTextureGetTarget(lumaTexture), CVOpenGLESTextureGetName(lumaTexture))
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
    glTexParameterf(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GLfloat(GL_CLAMP_TO_EDGE))
    glTexParameterf(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_T), GLfloat(GL_CLAMP_TO_EDGE))

    glUniform1i(_locations.uniforms.textureSamplerY, 0)


    // UV plane
    glActiveTexture(GLenum(GL_TEXTURE1))

    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, videoTextureCache, pixelBuffer, nil, GLenum(GL_TEXTURE_2D), GL_RG_EXT, textureWidth/2, textureHeight/2, GLenum(GL_RG_EXT), GLenum(GL_UNSIGNED_BYTE), 1, &chromaTexture)

    if err != kCVReturnSuccess {
        print("Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err)
        return
    }
    guard let chromaTexture = chromaTexture else { return }

    glBindTexture(CVOpenGLESTextureGetTarget(chromaTexture), CVOpenGLESTextureGetName(chromaTexture))
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
    glTexParameterf(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GLfloat(GL_CLAMP_TO_EDGE))
    glTexParameterf(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_T), GLfloat(GL_CLAMP_TO_EDGE))

    glUniform1i(_locations.uniforms.textureSamplerUV, 1)
}

Here, the types of the uniforms were also corrected to:

private struct Uniforms {
    var textureSamplerY = GLint()
    var textureSamplerUV = GLint()
}
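
For completeness, those locations would be fetched once after the program links; a sketch, assuming the shader declares SamplerY and SamplerUV as in the other answer (_program stands for the linked program handle):

    // Fetch each sampler's uniform location once after glLinkProgram succeeds.
    _locations.uniforms.textureSamplerY = glGetUniformLocation(_program, "SamplerY")
    _locations.uniforms.textureSamplerUV = glGetUniformLocation(_program, "SamplerUV")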

Now it seems we get the correct colors.