
I am attempting to wrap a texture around a sphere without any warping using Unity's Shader Graph. I have been following this StackExchange answer, and have semi-successfully converted their shader code into this graph:

[Image: the shader code converted into a Shader Graph]

Unfortunately, however, I end up with a seam from pole to pole of the sphere. It is very noticeable in-game, but a little hard to see in a screenshot:

[Image: in-game screenshot showing the pole-to-pole seam]

After some research, my understanding is that it is rather difficult to wrap a texture around a sphere without generating a seam. But if I use the texture on one of Unity's default shaders, no seam is generated, so I thought I would ask anyway. Of course, I cannot use a default shader, because then I would have a warped texture again.

  • This is due to mipmapping with misleading texture gradients. You can fix this by computing your own screenspace derivatives for the x and y texture coordinates rather than using the default/auto-generated ones. – DMGregory Jun 26 '23 at 17:21
  • @DMGregory Thanks! After googling mipmapping I found the Sample Texture 2D LOD node and set LOD to 0, which I think basically overrides mipmapping, and it seems to work. Does that sound right? I want to make sure I'm not using any bad practices. – Harper Rhett Jun 26 '23 at 19:18
  • Disabling mipmapping will tend to show sparkly aliasing and can impact performance. I recommend keeping it on. Is there a Sample Texture Grad(ient) node that will accept custom derivatives? – DMGregory Jun 26 '23 at 21:04
  • @DMGregory there does not seem to be. And, I am not sure where else to send the outputs either. Would I have to modify the texture before sampling? – Harper Rhett Jun 27 '23 at 12:43

1 Answer


First, the reason this happens:

When rendering a triangle, your GPU evaluates 2x2 blocks of pixels at a time. That means when you sample a texture in a fragment shader, the GPU has 4 sets of texture coordinates to work with. It looks at the differences between these 4 texture coordinates (screenspace partial derivatives) to estimate how much the texture is being stretched in this neighbourhood - either due to the camera being close/far, looking at the surface at an angle, or due to intrinsic stretching in the model's UV unwrap. With this information, it can automatically select the right mipmap to read from, or apply the right amount of anisotropic filtering, so that the texture comes out looking the best it can: not sparkly and aliased from undersampling, and not unnecessarily blurry.
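As a rough sketch of what the hardware is doing implicitly, the derivative-to-mip relationship looks something like this in HLSL (simplified; real hardware also factors in anisotropic filtering and sampler state, and the function name here is illustrative):

```hlsl
// Simplified sketch of hardware mip selection from screenspace derivatives.
float EstimateMipLevel(float2 uv, float2 textureSize)
{
    float2 dx = ddx(uv * textureSize); // texel-space change across one pixel, horizontally
    float2 dy = ddy(uv * textureSize); // texel-space change across one pixel, vertically
    float maxSquaredFootprint = max(dot(dx, dx), dot(dy, dy));
    return 0.5 * log2(maxSquaredFootprint); // log2 of the longest footprint axis
}
```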

The trouble is that when we calculate texture coordinates for an equirectangular projection / spherical coordinates, one of these 2x2 screenspace pixel blocks can straddle the wrap-around point from just-shy-of-one to just-past-zero in the longitude direction. As far as the GPU is concerned, that means the entire texture has been compressed down, so that 100% of its width fits across a 1-pixel stride, and it auto-selects the smallest mip to sample from. You end up with the average colour of the whole texture, instead of the appropriately-filtered colour for the part of the sphere you were actually trying to render.
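To make the discontinuity concrete, here's a typical equirectangular UV calculation (a sketch; names are illustrative):

```hlsl
// Typical equirectangular UV from a unit direction vector.
// u jumps from just-shy-of-one back to just-past-zero where atan2 wraps,
// so a 2x2 pixel block straddling that line reports a texture-coordinate
// difference of nearly the full texture width.
float2 EquirectangularUV(float3 dir)
{
    const float PI = 3.14159265;
    float u = atan2(dir.z, dir.x) / (2.0 * PI) + 0.5; // longitude: wraps at the seam
    float v = asin(dir.y) / PI + 0.5;                 // latitude: no wrap
    return float2(u, v);
}
```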

Here is a graph that fixes this seam in Unity URP:

[Image: Shader Graph with least-of-two LoD selection]

The idea here is that we calculate two sets of texture coordinates: one with the wrap-around point in its usual place, and one where we've moved the wrap-around point to the other side of the sphere. We calculate what level of detail would be selected for each, and when they disagree (when one says the block straddles the wrap-around and the other doesn't), we pick the smaller one. Then, finally, we sample the texture with the lesser of the two LoD estimates.
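Expressed as HLSL rather than graph nodes, the same trick might look like this (a sketch assuming Shader Model 4.1+ for Texture2D.CalculateLevelOfDetail; the wrapper function name is illustrative):

```hlsl
// "Least-of-two" LoD selection: whichever coordinate set straddles the
// wrap-around reports a huge (blurry) LoD, so the smaller one is correct.
float4 SampleSeamless(Texture2D tex, SamplerState samp, float2 uv)
{
    // Second coordinate set with the wrap-around moved half a texture away.
    float2 uvShifted = float2(frac(uv.x + 0.5), uv.y);

    float lodA = tex.CalculateLevelOfDetail(samp, uv);
    float lodB = tex.CalculateLevelOfDetail(samp, uvShifted);

    return tex.SampleLevel(samp, uv, min(lodA, lodB));
}
```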

One downside with this approach is that we lose anisotropic filtering, so you may find the glancing edges of the sphere appear more blurry than you'd like.

In shader code, we can use the tex2Dgrad function or equivalents to do a similar trick while supporting anisotropy, but as you've found, this function isn't currently exposed to the Unity shader graph - we'd have to use a Custom Function node to inject raw shader code to access it.
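If you do take the Custom Function route, the body might look roughly like this (a sketch; Texture2D.SampleGrad is the Texture2D-object equivalent of tex2Dgrad, and the exact input types a Custom Function node expects vary between Shader Graph versions):

```hlsl
// Sample with explicit gradients so anisotropic filtering still works.
// uvShifted is the seam-moved coordinate set described above.
void SampleWithGradients_float(Texture2D Tex, SamplerState Samp,
                               float2 UV, out float4 Color)
{
    float2 uvShifted = float2(frac(UV.x + 0.5), UV.y);

    // Take derivatives of both coordinate sets; near the seam, the shifted
    // set has the small, meaningful ones.
    float2 dx = ddx(UV),         dy = ddy(UV);
    float2 dxS = ddx(uvShifted), dyS = ddy(uvShifted);
    if (dot(dxS, dxS) + dot(dyS, dyS) < dot(dx, dx) + dot(dy, dy))
    {
        dx = dxS;
        dy = dyS;
    }

    Color = Tex.SampleGrad(Samp, UV, dx, dy);
}
```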

A simpler solution though, and one that may scale better for your needs, is to store your spherical textures not as 2D equirectangular rectangles, but as cubemaps. This gives a more even texel density over the surface of the sphere, so you can get away with smaller textures for the same worst-case detail, or better detail with the same texture memory. Filtering for these is already supported out-of-the-box, so your shader stays simpler too.
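For comparison, sampling a cubemap needs no seam handling at all, since the direction vector itself is the coordinate (a minimal sketch; texture names are illustrative):

```hlsl
TextureCube _SphereTex;
SamplerState sampler_SphereTex;

// The hardware filters correctly across cube-face boundaries,
// so there's no wrap-around seam to special-case.
float4 SampleSphere(float3 dir)
{
    return _SphereTex.Sample(sampler_SphereTex, normalize(dir));
}
```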

DMGregory
  • Thanks a million for your continued help. I have run into another problem, however. I added bump mapping to this code and it works, but the result is... sparkly. I assume this is because of the lack of anisotropy. Would you be able to go into detail about how to calculate the DDX and DDY values to input into the tex2Dgrad function? Thanks again for all the help. – Harper Rhett Jun 28 '23 at 19:25
  • That seems like a new problem to ask about separately. I wouldn't expect a lack of anisotropy to manifest as sparkling — usually it would just look blurrier than you want at shallow angles. Do you observe the sparkling everywhere, only at the seam, or only at the outside edges of the sphere? Have you enabled mipmapping on the normal map? Try posting a question showing a minimal complete verifiable example. – DMGregory Jun 28 '23 at 23:41
  • I just created it here. – Harper Rhett Jun 29 '23 at 13:46