Moving a texture in OpenGL ES 2.0

Date: 2015-05-21 07:14:47

Tags: android opengl-es textures

I'm trying to implement a sprite sheet with 8 columns and 8 rows in OpenGL ES 2.0. I got the first image working, but I can't figure out how to translate the texture matrix in OpenGL ES 2.0. I'm looking for the equivalent of this OpenGL ES 1.0 code:

        gl.glMatrixMode(GL10.GL_TEXTURE);
        gl.glLoadIdentity();
        gl.glPushMatrix();
        gl.glTranslatef(0.0f, 0.2f, 0f);
        gl.glPopMatrix();

These are the matrices I'm using:

/**
 * Store the model matrix. This matrix is used to move models from object space (where each model can be thought
 * of being located at the center of the universe) to world space.
 */
private float[] mModelMatrix = new float[16];

/**
 * Store the view matrix. This can be thought of as our camera. This matrix transforms world space to eye space;
 * it positions things relative to our eye.
 */
private float[] mViewMatrix = new float[16];

/** Store the projection matrix. This is used to project the scene onto a 2D viewport. */
private float[] mProjectionMatrix = new float[16];

/** Allocate storage for the final combined matrix. This will be passed into the shader program. */
private float[] mMVPMatrix = new float[16];

/** 
 * Stores a copy of the model matrix specifically for the light position.
 */
private float[] mLightModelMatrix = new float[16];  

My vertex shader:

uniform mat4 u_MVPMatrix;       // A constant representing the combined model/view/projection matrix.
uniform mat4 u_MVMatrix;        // A constant representing the combined model/view matrix.

attribute vec4 a_Position;      // Per-vertex position information we will pass in.                             
attribute vec3 a_Normal;        // Per-vertex normal information we will pass in.      
attribute vec2 a_TexCoordinate; // Per-vertex texture coordinate information we will pass in.       

varying vec3 v_Position;        // This will be passed into the fragment shader.                            
varying vec3 v_Normal;          // This will be passed into the fragment shader.  
varying vec2 v_TexCoordinate;   // This will be passed into the fragment shader.            

// The entry point for our vertex shader.  
void main()                                                     
{                                                         
    // Transform the vertex into eye space.     
    v_Position = vec3(u_MVMatrix * a_Position);                 

    // Pass through the texture coordinate.
    v_TexCoordinate = a_TexCoordinate;                                      

    // Transform the normal's orientation into eye space.
    v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));

    // gl_Position is a special variable used to store the final position.
    // Multiply the vertex by the matrix to get the final point in normalized screen coordinates.
    gl_Position = u_MVPMatrix * a_Position;                               
}

My fragment shader:

precision mediump float;        // Set the default precision to medium. We don't need as high of a
                                // precision in the fragment shader.
uniform vec3 u_LightPos;        // The position of the light in eye space.
uniform sampler2D u_Texture;    // The input texture.

varying vec3 v_Position;        // Interpolated position for this fragment.
varying vec3 v_Normal;          // Interpolated normal for this fragment.
varying vec2 v_TexCoordinate;   // Interpolated texture coordinate per fragment.

// The entry point for our fragment shader.
void main()
{
    // Will be used for attenuation.
    float distance = length(u_LightPos - v_Position);

    // Get a lighting direction vector from the light to the vertex.
    vec3 lightVector = normalize(u_LightPos - v_Position);

    // Calculate the dot product of the light vector and vertex normal. If the normal and light vector are
    // pointing in the same direction then it will get max illumination.
    float diffuse = max(dot(v_Normal, lightVector), 0.0);

    // Add attenuation.
    diffuse = diffuse * (1.0 / (1.0 + (0.25 * distance)));

    // Add ambient lighting.
    diffuse = diffuse + 0.7;

    // Multiply the color by the diffuse illumination level and texture value to get final output color.
    gl_FragColor = (diffuse * texture2D(u_Texture, v_TexCoordinate));
}

1 Answer:

Answer 0 (score: 1):

You need to perform the texture coordinate transformation yourself. You can do this in one of four places:

  • Apply the transformation to the original model data.
  • Apply the transformation on the CPU (not recommended unless you have a good reason, since this is what the vertex shader is for).
  • Apply the transformation in the vertex shader (recommended).
  • Apply the transformation in the fragment shader.

If you want to apply a translation to the texture coordinates, the most flexible way is to create a translation matrix with a math library and pass the new matrix to the vertex shader as a uniform (the same way you already pass mMVPMatrix and mLightModelMatrix). In the vertex shader you can then multiply the texture coordinate by this matrix and output the result as a varying.
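
A minimal sketch of the Java side, assuming a new u_TexMatrix uniform in the vertex shader and hypothetical mTextureMatrix, mTextureMatrixHandle and mProgramHandle members in the renderer (these names are illustrative, not from the original code):

import android.opengl.GLES20;
import android.opengl.Matrix;

// Hypothetical additions to the renderer class:
private final float[] mTextureMatrix = new float[16];
private int mTextureMatrixHandle;

// After the shader program (mProgramHandle) has been linked:
mTextureMatrixHandle = GLES20.glGetUniformLocation(mProgramHandle, "u_TexMatrix");

// Each frame, build a translation for the texture coordinates and upload it before drawing:
Matrix.setIdentityM(mTextureMatrix, 0);
Matrix.translateM(mTextureMatrix, 0, 0.0f, 0.2f, 0.0f);  // same shift as the old glTranslatef
GLES20.glUniformMatrix4fv(mTextureMatrixHandle, 1, false, mTextureMatrix, 0);

Note that a_TexCoordinate is a vec2, so in the shader it has to be promoted to a vec4 (for example vec4(a_TexCoordinate, 0.0, 1.0)) before it can be multiplied by a mat4 uniform.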

Vertex shader:

texture_coordinate_varying = texture_matrix_uniform * texture_coordinate_attribute;

Fragment shader:

gl_FragColor = texture2D(texture_sampler, texture_coordinate_varying);
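
For the 8x8 sprite sheet from the question, the translation can be rebuilt from the current frame index every frame. A rough sketch, assuming the quad's texture coordinates cover a single 1/8 x 1/8 cell and that an int field mFrameIndex (hypothetical) tracks the frame:

// Hypothetical per-frame update for an 8 x 8 sprite sheet (64 frames).
mFrameIndex = (mFrameIndex + 1) % 64;
float u = (mFrameIndex % 8) / 8.0f;   // column offset in texture space
float v = (mFrameIndex / 8) / 8.0f;   // row offset (integer division picks the row)

Matrix.setIdentityM(mTextureMatrix, 0);
Matrix.translateM(mTextureMatrix, 0, u, v, 0.0f);
GLES20.glUniformMatrix4fv(mTextureMatrixHandle, 1, false, mTextureMatrix, 0);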

Please note: your GLES 1.0 code does not actually perform the translation, because you surround it with a push and a pop.
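
For reference, a sketch of how the fixed-function version would have to be ordered so the translation is still active when the draw call happens (assuming the draw is issued between the translate and the pop):

gl.glMatrixMode(GL10.GL_TEXTURE);
gl.glPushMatrix();
gl.glLoadIdentity();
gl.glTranslatef(0.0f, 0.2f, 0.0f);
// ... issue the draw call here, while the translated texture matrix is active ...
gl.glPopMatrix();
gl.glMatrixMode(GL10.GL_MODELVIEW);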