Changing the behavior of a WebGL screen-space projected line shader during zoom

Date: 2019-06-18 11:42:12

Tags: three.js glsl shader vertex-shader

I am building a 2D graph structure based on Three.js and I am stuck with the behavior of screen-space projected lines during camera zoom. The problem is that the lines become noticeably thinner when I zoom in and thicker when I zoom out.

Examples:

A normal line with the defined thickness:

Normal line with defined thickness

The line after zooming out:

Line after zooming out

The line after zooming in:

Line after zooming in

All the other elements I build in shaders (circles, rectangles for arrows) behave "normally": they change size linearly with the camera position and in the opposite direction (bigger when zooming in, smaller when zooming out). I need exactly the same behavior for the lines, but since I am new to this area I do not know how to achieve it.
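For contrast, here is a minimal sketch (not from the question; all attribute names are hypothetical) of how a billboarded circle could be expanded in its vertex shader. The corner offset is applied in camera space, before projection, so the projection scales it with the camera zoom/distance, which is the "normal" behavior described above:

`precision highp float;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;

attribute vec3 instanceCenter;  // hypothetical: circle center
attribute vec2 corner;          // hypothetical: quad corner in [-1, 1]
attribute float instanceRadius; // hypothetical: circle radius in world units

void main() {

    // offset the quad corner in camera space, in world units
    vec4 mvPosition = modelViewMatrix * vec4( instanceCenter, 1.0 );
    mvPosition.xy += corner * instanceRadius;

    // the size is fixed in camera space, so projecting it makes the circle
    // grow when zooming in and shrink when zooming out
    gl_Position = projectionMatrix * mvPosition;
}`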

My line vertex shader is a slightly adapted version of WestLangley's LineMaterial shader; the code is shown below. One observation I have made:

If I remove the `dir = normalize( dir )` line, the scaling behavior of the lines becomes normal, but the line thickness then starts to depend on the distance between the nodes, which is also not acceptable.
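The lines in question, excerpted from the shader below and annotated with my own reading of why the two variants behave differently (this interpretation is not part of the question):

`vec2 dir = ndcEnd - ndcStart;         // length = projected segment length in NDC

dir = normalize( dir );               // kept: offset gets unit length, so the width
                                      // is a fixed pixel count and looks thinner when
                                      // zooming in, thicker when zooming out

vec2 offset = vec2( dir.y, - dir.x ); // with normalize removed: offset inherits the
                                      // NDC segment length, so it scales with zoom,
                                      // but the width depends on the node distance`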

Here is the vertex shader:

`precision highp float;

#include <common>
#include <color_pars_vertex>
#include <fog_pars_vertex>
#include <logdepthbuf_pars_vertex>
#include <clipping_planes_pars_vertex>

uniform float linewidth;
uniform vec2 resolution;

attribute vec3 instanceStart;
attribute vec3 instanceEnd;
attribute vec3 instanceColorStart;
attribute vec3 instanceColorEnd;
attribute float alphaStart;
attribute float alphaEnd;
attribute float widthStart;
attribute float widthEnd;

varying vec2 vUv;
varying float alphaTest;

void trimSegment( const in vec4 start, inout vec4 end ) {

    // trim end segment so it terminates between the camera plane and the near plane
    // conservative estimate of the near plane
    float a = projectionMatrix[ 2 ][ 2 ]; // 3rd entry in 3rd column
    float b = projectionMatrix[ 3 ][ 2 ]; // 3rd entry in 4th column
    float nearEstimate = - 0.5 * b / a;

    float alpha = ( nearEstimate - start.z ) / ( end.z - start.z );

    end.xyz = mix( start.xyz, end.xyz, alpha );
}

void main() {

    #ifdef USE_COLOR
        vColor.xyz = ( position.y < 0.5 ) ? instanceColorStart : instanceColorEnd;
        alphaTest = ( position.y < 0.5 ) ? alphaStart : alphaEnd;
    #endif

    float aspect = resolution.x / resolution.y;
    vUv = uv;

    // camera space
    vec4 start = modelViewMatrix * vec4( instanceStart, 1.0 );
    vec4 end = modelViewMatrix * vec4( instanceEnd, 1.0 );

    // special case for perspective projection, and segments that terminate either in, or behind, the camera plane
    // clearly the gpu firmware has a way of addressing this issue when projecting into ndc space
    // but we need to perform ndc-space calculations in the shader, so we must address this issue directly
    // perhaps there is a more elegant solution -- WestLangley

    bool perspective = ( projectionMatrix[ 2 ][ 3 ] == - 1.0 ); // 4th entry in the 3rd column

    if ( perspective ) {
        if ( start.z < 0.0 && end.z >= 0.0 ) {
            trimSegment( start, end );
        } else if ( end.z < 0.0 && start.z >= 0.0 ) {
            trimSegment( end, start );
        }
    }

    // clip space
    vec4 clipStart = projectionMatrix * start;
    vec4 clipEnd = projectionMatrix * end;

    // ndc space
    vec2 ndcStart = clipStart.xy / clipStart.w;
    vec2 ndcEnd = clipEnd.xy / clipEnd.w;

    // direction
    vec2 dir = ndcEnd - ndcStart;

    // account for clip-space aspect ratio
    dir.x *= aspect;
    dir = normalize( dir );

    // perpendicular to dir
    vec2 offset = vec2( dir.y, - dir.x );

    // undo aspect ratio adjustment
    dir.x /= aspect;
    offset.x /= aspect;

    // sign flip
    if ( position.x < 0.0 ) offset *= - 1.0;

    // endcaps, to round line corners
    if ( position.y < 0.0 ) {
        // offset += - dir;
    } else if ( position.y > 1.0 ) {
        // offset += dir;
    }

    // adjust for linewidth
    offset *= ( linewidth * widthStart );

    // adjust for clip-space to screen-space conversion // maybe resolution should be based on viewport ...
    offset /= resolution.y;

    // select end
    vec4 clip = ( position.y < 0.5 ) ? clipStart : clipEnd;

    // back to clip space
    offset *= clip.w;

    clip.xy += offset;

    gl_Position = clip;

    vec4 mvPosition = ( position.y < 0.5 ) ? start : end; // this is an approximation

    #include <logdepthbuf_vertex>
    #include <clipping_planes_vertex>
    #include <fog_vertex>
}`

And the fragment shader:

`precision highp float;

#include <common>
#include <color_pars_fragment>
#include <fog_pars_fragment>
#include <logdepthbuf_pars_fragment>
#include <clipping_planes_pars_fragment>

uniform vec3 diffuse;
uniform float opacity;

varying vec2 vUv;
varying float alphaTest;

void main() {

    if ( abs( vUv.y ) > 1.0 ) {

        float a = vUv.x;
        float b = ( vUv.y > 0.0 ) ? vUv.y - 1.0 : vUv.y + 1.0;
        float len2 = a * a + b * b;

        if ( len2 > 1.0 ) discard;
    }

    vec4 diffuseColor = vec4( diffuse, alphaTest );

    #include <logdepthbuf_fragment>
    #include <color_fragment>

    gl_FragColor = vec4( diffuseColor.rgb, diffuseColor.a );

    #include <premultiplied_alpha_fragment>
    #include <tonemapping_fragment>
    #include <encodings_fragment>
    #include <fog_fragment>
}`

Any help with this would be greatly appreciated. Thank you!

1 Answer:

Answer 0 (score: 1)

An unverified idea, based on your comment about the effect of the normalize() call. Swap the order of these lines:

dir = normalize( dir );
vec2 offset = vec2( dir.y, - dir.x );

so that they become

vec2 offset = vec2( dir.y, - dir.x );
dir = normalize( dir );

so that offset still depends on the original length of dir (which I would expect to make the visible line width behave correctly), while you still get the normalization (which I would expect to make the line length behave correctly).
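Applied to the vertex shader from the question, the suggested reordering would look roughly like this (only the surrounding lines are shown; the comments are my reading of the intent):

`// direction
vec2 dir = ndcEnd - ndcStart;

// account for clip-space aspect ratio
dir.x *= aspect;

// build the perpendicular while dir still has its original length,
// so the offset follows the projected segment length
vec2 offset = vec2( dir.y, - dir.x );

// normalize afterwards, so dir is still a unit vector wherever it is
// reused (e.g. for the endcap extension)
dir = normalize( dir );

// undo aspect ratio adjustment
dir.x /= aspect;
offset.x /= aspect;`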