You can find my original question about LUMINANCE_ALPHA below, but I realized I was asking the wrong question. The real question should be:
How do we reliably check the output values drawn on a canvas with webgl?
Is it a good idea to draw the webgl canvas into a 2D canvas as an image and read the values with getImageData()?
const webglCanvas = ...;
const offCanvas = document.createElement('canvas');
offCanvas.style.background = 'black';
offCanvas.width = webglCanvas.width;
offCanvas.height = webglCanvas.height;
const context = offCanvas.getContext('2d');
context.drawImage(webglCanvas, 0, 0);
console.log( context.getImageData(0, 0, webglCanvas.width, webglCanvas.height).data );
I don't understand how gl.LUMINANCE_ALPHA works. From my understanding it should take the bytes two by two and assign the first value to rgb and the second value to alpha. However, when I do this with webgl:
gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE_ALPHA, 1, 1, 0, gl.LUMINANCE_ALPHA, gl.UNSIGNED_BYTE, new Uint8Array([1, 30]));
I get a color of (8, 8, 8, 30) when I was expecting (1, 1, 1, 30).
I got this definition from those specs:
Each element is a luminance/alpha double. The GL converts each component to floating point, clamps it to the range [0,1], and assembles them into an RGBA element by placing the luminance value in the red, green and blue channels.
I'm not sure how this applies to webgl, since there is no double here. Maybe I'm missing the meaning of converts each component to floating point, or I'm missing some packing/unpacking configuration.
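To make my understanding concrete, here is a small sketch of what I expect to happen (just my reading of the quoted spec text, not actual GL code):
// My expectation (a sketch, not GL code): each pair of bytes is normalized to
// [0, 1] and the luminance value is replicated into the r, g and b channels.
function expandLuminanceAlpha(bytes) {
  const rgba = [];
  for (let i = 0; i < bytes.length; i += 2) {
    const luminance = bytes[i] / 255;   // converted to floating point, clamped to [0, 1]
    const alpha = bytes[i + 1] / 255;
    rgba.push(luminance, luminance, luminance, alpha);
  }
  return rgba;
}
// expandLuminanceAlpha(new Uint8Array([1, 30])) -> [1/255, 1/255, 1/255, 30/255],
// i.e. (1, 1, 1, 30) when read back as bytes.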
Here is a snippet that reproduces the problem:
const vertShaderStr = `
  attribute vec2 a_position;
  void main() {
    gl_Position = vec4(a_position, 0, 1);
  }
`;
const fragShaderStr = `
  precision mediump float;
  uniform sampler2D u_texture;
  void main() {
    gl_FragColor = texture2D(u_texture, vec2(0, 0));
  }
`;
var canvas = document.getElementById('canvas');
canvas.width = 1;
canvas.height = 1;
const gl = canvas.getContext('webgl');
const program = gl.createProgram();
const vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, vertShaderStr);
gl.compileShader(vertexShader);
if ( !gl.getShaderParameter(vertexShader, gl.COMPILE_STATUS) )
  throw new Error('Vertex shader error: ' + gl.getShaderInfoLog(vertexShader));
gl.attachShader(program, vertexShader);
const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, fragShaderStr);
gl.compileShader(fragmentShader);
if ( !gl.getShaderParameter(fragmentShader, gl.COMPILE_STATUS) )
  throw new Error('Fragment shader error: ' + gl.getShaderInfoLog(fragmentShader));
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
if ( !gl.getProgramParameter(program, gl.LINK_STATUS) )
throw new Error(gl.getProgramInfoLog(program));
gl.useProgram(program);
const positionBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
   1,  1,
  -1,  1,
   1, -1,
  -1, -1
]), gl.STATIC_DRAW);
const positionLocation = gl.getAttribLocation(program, 'a_position');
gl.enableVertexAttribArray(positionLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, 0, 0);
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
/*** Interesting part here ***/
gl.pixelStorei(gl.UNPACK_ALIGNMENT, 2);
gl.pixelStorei(gl.PACK_ALIGNMENT, 2);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE_ALPHA, 1, 1, 0, gl.LUMINANCE_ALPHA, gl.UNSIGNED_BYTE,
new Uint8Array([1, 30]));
gl.activeTexture(gl.TEXTURE0);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
const offCanvas = document.createElement('canvas');
offCanvas.style.background = 'black';
offCanvas.width = canvas.width;
offCanvas.height = canvas.height;
const context = offCanvas.getContext('2d');
context.drawImage(canvas, 0, 0);
console.log( context.getImageData(0, 0, canvas.width, canvas.height).data );
<canvas id="canvas"></canvas>
I just found out that the alpha value (30) influences the resulting rgb. But I can't figure out exactly what is going on, whether alpha is used to compute rgb or whether the wrong bytes are being read from the buffer.
Answer (score: 1)
When drawing a webgl canvas onto another 2d canvas, conversion, filtering and blending operations are applied, which may lead to a skewed result. While you can disable blending by setting the globalCompositeOperation on the 2d context to copy, you're still running through a conversion and filtering process which is not standardized and is not guaranteed to provide a precise result.
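One concrete way the readback can get skewed in this particular case (my assumption, not part of the original answer, since the exact conversion is browser dependent): with the default premultipliedAlpha: true context attribute the WebGL canvas is composited as premultiplied, so the color is effectively un-premultiplied on its way to getImageData, roughly like this:
// Hypothetical illustration of a premultiplied-alpha round trip, which would
// turn the texel bytes (1, 1, 1, 30) into roughly (8, 8, 8, 30) on readback.
const unpremultiply = (c, a) => a === 0 ? 0 : Math.min(255, Math.round(c * 255 / a));
const src = [1, 1, 1, 30];          // bytes written by the shader
const shown = [
  unpremultiply(src[0], src[3]),    // 1 * 255 / 30 ≈ 8.5 -> 8 or 9 depending on rounding
  unpremultiply(src[1], src[3]),
  unpremultiply(src[2], src[3]),
  src[3],
];
console.log(shown);                 // [9, 9, 9, 30] with this rounding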
Using readPixels returns correct results and is the only way to get guaranteed accurate readings from the current color framebuffer. If you need that data to be available to a 2D context you may use ImageData in conjunction with putImageData.
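A minimal sketch of that ImageData route (assuming a gl context and a same-sized 2d context already exist):
// Sketch: copy the current WebGL color framebuffer into a 2D canvas without
// going through drawImage(). Assumes `gl` and a same-sized 2D `context`.
const w = gl.drawingBufferWidth;
const h = gl.drawingBufferHeight;
const pixels = new Uint8Array(w * h * 4);
gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
// Note: readPixels returns rows bottom-up, so a vertical flip may be needed
// before the image is displayed; it is omitted here.
const imageData = new ImageData(new Uint8ClampedArray(pixels.buffer), w, h);
context.putImageData(imageData, 0, 0);
The snippet below additionally logs the readPixels result next to the getImageData result so the two can be compared: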
const vertShaderStr = `
  attribute vec2 a_position;
  void main() {
    gl_Position = vec4(a_position, 0, 1);
  }
`;
const fragShaderStr = `
  precision mediump float;
  uniform sampler2D u_texture;
  void main() {
    gl_FragColor = texture2D(u_texture, vec2(0, 0));
  }
`;
var canvas = document.getElementById('canvas');
canvas.width = 1;
canvas.height = 1;
const gl = canvas.getContext('webgl');
const program = gl.createProgram();
const vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, vertShaderStr);
gl.compileShader(vertexShader);
if ( !gl.getShaderParameter(vertexShader, gl.COMPILE_STATUS) )
  throw new Error('Vertex shader error: ' + gl.getShaderInfoLog(vertexShader));
gl.attachShader(program, vertexShader);
const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, fragShaderStr);
gl.compileShader(fragmentShader);
if ( !gl.getShaderParameter(fragmentShader, gl.COMPILE_STATUS) )
  throw new Error('Fragment shader error: ' + gl.getShaderInfoLog(fragmentShader));
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
if ( !gl.getProgramParameter(program, gl.LINK_STATUS) )
throw new Error(gl.getProgramInfoLog(program));
gl.useProgram(program);
const positionBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
   1,  1,
  -1,  1,
   1, -1,
  -1, -1
]), gl.STATIC_DRAW);
const positionLocation = gl.getAttribLocation(program, 'a_position');
gl.enableVertexAttribArray(positionLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, 0, 0);
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
/*** Interesting part here ***/
gl.pixelStorei(gl.UNPACK_ALIGNMENT, 2);
gl.pixelStorei(gl.PACK_ALIGNMENT, 2);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE_ALPHA, 1, 1, 0, gl.LUMINANCE_ALPHA, gl.UNSIGNED_BYTE,
new Uint8Array([1, 30]));
gl.activeTexture(gl.TEXTURE0);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
var readback = new Uint8Array(4);
gl.readPixels(0,0,1,1,gl.RGBA,gl.UNSIGNED_BYTE,readback);
const offCanvas = document.createElement('canvas');
offCanvas.style.background = 'black';
offCanvas.width = canvas.width;
offCanvas.height = canvas.height;
const context = offCanvas.getContext('2d');
context.globalCompositeOperation = 'copy';
context.drawImage(canvas, 0, 0,1,1);
console.log("Canvas",context.getImageData(0, 0, canvas.width, canvas.height).data);
console.log("readPixels", readback );
<canvas id="canvas"></canvas>