I'd rather not reinvent the wheel if I don't have to, and this must have been done before. Is there an implementation of a Sobel filter using OpenGL ES?
Answer (score: 17)
If Objective-C is acceptable, you could look at my GPUImage framework and its GPUImageSobelEdgeDetectionFilter. It performs Sobel edge detection using OpenGL ES 2.0 fragment shaders. You can see the output of this in the "sketch" example in this answer.
If you don't want to dig into the Objective-C code, the critical work here is done by two sets of shaders. In the first pass, I reduce the image to its luminance and store that value in the red, green, and blue channels. I do this using the following vertex shader:
attribute vec4 position;
attribute vec4 inputTextureCoordinate;

varying vec2 textureCoordinate;

void main()
{
    gl_Position = position;
    textureCoordinate = inputTextureCoordinate.xy;
}
and this fragment shader:
precision highp float;

varying vec2 textureCoordinate;

uniform sampler2D inputImageTexture;

const highp vec3 W = vec3(0.2125, 0.7154, 0.0721);

void main()
{
    float luminance = dot(texture2D(inputImageTexture, textureCoordinate).rgb, W);
    gl_FragColor = vec4(vec3(luminance), 1.0);
}
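For illustration, the per-pixel math of this luminance pass can be sketched on the CPU in Python (a reference sketch, not part of the framework; the shader does the same dot product per fragment):

```python
# Rec. 709 luma weights, matching the constant W in the shader above.
W = (0.2125, 0.7154, 0.0721)

def luminance(rgb):
    """rgb: floats in [0, 1]; returns the luma value the shader
    writes into all three color channels."""
    return sum(c * w for c, w in zip(rgb, W))

# The weights sum to 1, so a pure-white pixel stays at full brightness:
print(round(luminance((1.0, 1.0, 1.0)), 4))  # prints 1.0
```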
After that, I perform the actual Sobel edge detection (with lighter pixels being edges in this case) using this vertex shader:
attribute vec4 position;
attribute vec4 inputTextureCoordinate;

uniform highp float imageWidthFactor;
uniform highp float imageHeightFactor;

varying vec2 textureCoordinate;
varying vec2 leftTextureCoordinate;
varying vec2 rightTextureCoordinate;
varying vec2 topTextureCoordinate;
varying vec2 topLeftTextureCoordinate;
varying vec2 topRightTextureCoordinate;
varying vec2 bottomTextureCoordinate;
varying vec2 bottomLeftTextureCoordinate;
varying vec2 bottomRightTextureCoordinate;

void main()
{
    gl_Position = position;

    vec2 widthStep = vec2(imageWidthFactor, 0.0);
    vec2 heightStep = vec2(0.0, imageHeightFactor);
    vec2 widthHeightStep = vec2(imageWidthFactor, imageHeightFactor);
    vec2 widthNegativeHeightStep = vec2(imageWidthFactor, -imageHeightFactor);

    textureCoordinate = inputTextureCoordinate.xy;
    leftTextureCoordinate = inputTextureCoordinate.xy - widthStep;
    rightTextureCoordinate = inputTextureCoordinate.xy + widthStep;
    topTextureCoordinate = inputTextureCoordinate.xy + heightStep;
    topLeftTextureCoordinate = inputTextureCoordinate.xy - widthNegativeHeightStep;
    topRightTextureCoordinate = inputTextureCoordinate.xy + widthHeightStep;
    bottomTextureCoordinate = inputTextureCoordinate.xy - heightStep;
    bottomLeftTextureCoordinate = inputTextureCoordinate.xy - widthHeightStep;
    bottomRightTextureCoordinate = inputTextureCoordinate.xy + widthNegativeHeightStep;
}
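The offset arithmetic in this vertex shader is easy to check with a small Python sketch (illustrative only; names are mine, but the offsets mirror the shader above):

```python
def neighbor_coords(coord, w_step, h_step):
    """Given a texture coordinate and the per-texel steps (the
    imageWidthFactor / imageHeightFactor uniforms), return the nine
    sample positions the vertex shader hands to the fragment shader."""
    x, y = coord
    return {
        "center":      (x,          y),
        "left":        (x - w_step, y),
        "right":       (x + w_step, y),
        "top":         (x,          y + h_step),
        "bottom":      (x,          y - h_step),
        "topLeft":     (x - w_step, y + h_step),  # coord - widthNegativeHeightStep
        "topRight":    (x + w_step, y + h_step),  # coord + widthHeightStep
        "bottomLeft":  (x - w_step, y - h_step),  # coord - widthHeightStep
        "bottomRight": (x + w_step, y - h_step),  # coord + widthNegativeHeightStep
    }
```

Computing these in the vertex shader (rather than the fragment shader) is the point of the exercise: the fragment shader then samples at coordinates it did not compute itself, which avoids dependent texture reads on mobile GPUs.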
and this fragment shader:
precision highp float;

varying vec2 textureCoordinate;
varying vec2 leftTextureCoordinate;
varying vec2 rightTextureCoordinate;
varying vec2 topTextureCoordinate;
varying vec2 topLeftTextureCoordinate;
varying vec2 topRightTextureCoordinate;
varying vec2 bottomTextureCoordinate;
varying vec2 bottomLeftTextureCoordinate;
varying vec2 bottomRightTextureCoordinate;

uniform sampler2D inputImageTexture;

void main()
{
    float i00 = texture2D(inputImageTexture, textureCoordinate).r;
    float im1m1 = texture2D(inputImageTexture, bottomLeftTextureCoordinate).r;
    float ip1p1 = texture2D(inputImageTexture, topRightTextureCoordinate).r;
    float im1p1 = texture2D(inputImageTexture, topLeftTextureCoordinate).r;
    float ip1m1 = texture2D(inputImageTexture, bottomRightTextureCoordinate).r;
    float im10 = texture2D(inputImageTexture, leftTextureCoordinate).r;
    float ip10 = texture2D(inputImageTexture, rightTextureCoordinate).r;
    float i0m1 = texture2D(inputImageTexture, bottomTextureCoordinate).r;
    float i0p1 = texture2D(inputImageTexture, topTextureCoordinate).r;

    float h = -im1p1 - 2.0 * i0p1 - ip1p1 + im1m1 + 2.0 * i0m1 + ip1m1;
    float v = -im1m1 - 2.0 * im10 - im1p1 + ip1m1 + 2.0 * ip10 + ip1p1;
    float mag = length(vec2(h, v));

    gl_FragColor = vec4(vec3(mag), 1.0);
}
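The gradient arithmetic in this shader can be verified on the CPU; here is a minimal Python sketch of the same Sobel magnitude for one 3×3 neighborhood (a reference for checking the math, not part of the framework):

```python
import math

def sobel_magnitude(p):
    """p: 3x3 neighborhood of luminance values, p[row][col], with row 0
    at the bottom and column 0 at the left to match the shader's
    bottom/top/left/right naming. Returns the gradient magnitude the
    shader writes to gl_FragColor."""
    # Horizontal kernel, matching `h` above: bottom row minus top row.
    h = (p[0][0] + 2.0 * p[0][1] + p[0][2]) - (p[2][0] + 2.0 * p[2][1] + p[2][2])
    # Vertical kernel, matching `v` above: right column minus left column.
    v = (p[0][2] + 2.0 * p[1][2] + p[2][2]) - (p[0][0] + 2.0 * p[1][0] + p[2][0])
    # length(vec2(h, v)) in GLSL is the Euclidean norm:
    return math.hypot(h, v)

# A hard vertical edge (dark left column, bright right) yields a strong
# response, while a flat region yields zero:
print(sobel_magnitude([[0.0, 1.0, 1.0]] * 3))  # prints 4.0
print(sobel_magnitude([[0.5, 0.5, 0.5]] * 3))  # prints 0.0
```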
imageWidthFactor and imageHeightFactor are simply the inverse of the size of the input image, in pixels.
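In other words, each uniform is the size of one texel in normalized texture coordinates. A trivial Python sketch of the calculation (the function name is mine, for illustration):

```python
def texel_size(width_px, height_px):
    """Compute the imageWidthFactor / imageHeightFactor uniforms:
    one texel expressed in normalized [0, 1] texture coordinates."""
    return 1.0 / width_px, 1.0 / height_px

# e.g. for a hypothetical 640x480 input:
w_factor, h_factor = texel_size(640, 480)
print(w_factor)  # prints 0.0015625
```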
You may notice that this two-pass approach is more involved than the one in the answer linked above. That's because the original implementation wasn't the most efficient when run on mobile GPUs (the PowerVR series in iOS devices, at least). By removing all dependent texture reads and precalculating the luminance so that I only need to sample the red channel in the final shader, this tuned edge-detection routine is roughly 20X faster in my benchmarks than the naive one that does everything in a single pass.