Custom Atlas Shader

Unity Shader Graph subgraph for highly customized texture atlassing

Optimized vector mathematics, discrete UV manipulation, and clean texture re-atlassing for mobile stereo rendering. A reusable shader node that is easy to implement and offers a nearly endless range of ways to combine textures into atlases.

Below is an extensive breakdown of the functionality, including some of the original code, which is free to use. I built this shader subgraph from the ground up, including the development of all of the equations that make it up; I also wrote out the mathematical functions and documented the shader.
The Custom Atlas Shader was built to solve the problem of having up to 9 characters in a scene at a time on a mobile VR platform. Even with single-pass stereo rendering, that many skinned meshes, all animating, were stressing the hardware's rendering capabilities.
This shader feature, built initially in HLSL and then ported to Shader Graph, allows for an incredible amount of customization in atlassing textures when draw calls are an active concern. Its primary function is to re-atlas textures that are already atlassed and rearrange them according to the needs of the user. It scales to any number of atlas divisions, uses no conditional statements or ternary operations, and is contained in a single pass. It also has functionality built in to support multiple UV sets and texture layering within the same grid space.
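For reference, these are the inputs the function later in this page expects. The declarations here are only a sketch of how they might appear in the surface shader; the exact property setup in the project may differ.
sampler2D _MainTex;      //the atlassed base texture
sampler2D _DetailTex;    //detail/mask texture sampled with the raw uv0 set
float _MainAtlasDiv;     //divisions per axis of the base atlas (e.g. 3 for a 3x3 atlas)
float _MainAtlasIndex;   //which atlas cell to read from
float _AtlasQuadrant;    //quadrant (nonant) selector, snapped with a modulus and a floor in the function below
fixed4 _Color;           //tint blended in through the detail mask
fixed4 _SubsurfColor;    //subsurface tint driven by the detail mask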
The actual implementation of the atlas is very simple. We snap values on the inputs so the user never has to worry about in-betweens disrupting their workflow or creating undesirable UV misalignment.
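A minimal sketch of that snapping, assuming the values arrive as raw floats (in the HLSL below, the same effect comes from the modulus and floor operations on the index and quadrant):
//snap user-facing inputs to whole numbers so a stray value like 3.4 divisions
//or 2.7 for an index can never produce a partial cell offset
float atlasDiv   = max(1.0, round(_MainAtlasDiv));
float atlasIndex = floor(_MainAtlasIndex);
float quadrant   = floor(_AtlasQuadrant);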
Characters with multiple textures can be reduced to a single draw call on a single texture in memory, and the texture can be changed at will during runtime with no added tech cost. The performance implications of this, especially on the mobile platform that this feature was designed for, are significant.
Here, we are able to keep the same mesh in the scene, but provide a sense of randomness and variance between different scenarios that the user might encounter by changing the skin tone of the baby being treated, along with all the skin conditions that are linked to that skin tone. Different skin tones display skin ailments differently, so it was important to provide functionality that supports that learning point.
Perhaps the best use of this shader feature is on a mesh that has been automatically mesh-combined and needs different texture sets on the same UV set. For example, a character's skin and hair texture might be constrained to the upper left quadrant of UV space, while the lower left quadrant contains that person's clothing. This is true on every character, regardless of how we choose to combine the meshes for the skin and clothes, because the pipeline mandates it. That lets us use only 2 textures across a nearly infinite number of skin and outfit combinations, and change them at will, as sketched below.
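As a rough sketch of that pattern, here the remapping math (listed in full in the next section) is wrapped in a helper and applied once per texture set. AtlasUV, _SkinIndex, _SkinQuadrant, _ClothTex, _ClothIndex, and _ClothQuadrant are hypothetical names used only for illustration; the production material drives these values through its own setup.
//the atlas math from the listing below, wrapped as a reusable helper
float2 AtlasUV(float2 uv, float div, float index, float quadrant)
{
    float scale = 1.0 / div;
    float2 cell = float2(fmod(index, div), floor(index / div));
    float2 quad = float2(fmod(quadrant, 2.0), floor(quadrant / 2.0));
    return uv * (2.0 * scale) + (cell - quad) * scale;
}
//each texture set gets its own index/quadrant remap over the shared uv0 set;
//how the two samples are combined (mesh UV layout, vertex colors, or a mask) is up to the material
fixed4 skin  = tex2D(_MainTex,  AtlasUV(IN.uv_MainTex, _MainAtlasDiv, _SkinIndex,  _SkinQuadrant));
fixed4 cloth = tex2D(_ClothTex, AtlasUV(IN.uv_MainTex, _MainAtlasDiv, _ClothIndex, _ClothQuadrant));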
Below is the original HLSL function, commented for clarity. Please feel free to utilize this feature in your own projects. I've condensed it into a single function here, but in the actual code the section above "//BASE TEXTURE IMPLEMENTATION" is its own shader function.
//define the scale of the atlas based on how many divisions the user declares
float scaleM = 1 / (_MainAtlasDiv);
//create a 2-dimensional template based on the scale defined
float2 uvScaleM = float2(scaleM * 2, scaleM * 2);
//the '0' or left-most point of the x coordinate for the Us to start laying out on our texture
float xPosS = ((_MainAtlasIndex) % (_MainAtlasDiv));
//the '0' or lower-most point of the y coordinate for the Vs to start laying out on our texture 
float yPosS = floor((_MainAtlasIndex) / (_MainAtlasDiv));
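//e.g. (illustrative values): with _MainAtlasDiv = 3 and _MainAtlasIndex = 4, xPosS = 4 % 3 = 1 and yPosS = floor(4 / 3) = 1, i.e. the center cell of a 3x3 atlas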
//defining the quadrant (or nonant, as it were) on the x coordinate. Using a modulus snaps to usable values
float xPosT = ((_AtlasQuadrant) % 2);
//doing the same for y. Flooring snaps to usable values.
float yPosT = floor((_AtlasQuadrant) / 2);
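//e.g. with _AtlasQuadrant = 3: xPosT = 3 % 2 = 1 and yPosT = floor(3 / 2) = 1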
//creating the UV coordinate system based on the starting positions defined above
float2 offM = float2(xPosS, yPosS);
//adding the offset from the quadrant (nonant) definitions above
float2 uVDelta = offM - float2(xPosT,yPosT);
//scaling UVs to fit our new coordinate definition
float2 sourceUV = uVDelta * scaleM;
//applying our model's uv0 set to our definition. Use this when you need to define a UV set for unpacking a texture that is atlassed with the same number of divisions as the base texture.
float2 targetUV = (IN.uv_MainTex * (2 / _MainAtlasDiv) + sourceUV);
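//alternate UV set that applies only the atlas cell offset (no quadrant offset) before scaling by uvScaleM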
float2 matchUV = (IN.uv_MainTex + offM) * uvScaleM;
//BASE TEXTURE IMPLEMENTATION
fixed4 detail = tex2D(_DetailTex, IN.uv_MainTex);
float3 subsurf = (_SubsurfColor.rgb * detail.g * _SubsurfColor.a) * detail.b;
//as you can see here, I've used targetUV. Make sure any other implementation of targetUV is generic and not use-case specific.
fixed4 c = tex2D(_MainTex, targetUV) * lerp(fixed4(1,1,1,1), _Color, detail.b);
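To make the math concrete, with purely illustrative values of _MainAtlasDiv = 3, _MainAtlasIndex = 4, and _AtlasQuadrant = 0: scaleM = 1/3, offM = (1, 1), the quadrant offset is (0, 0), uVDelta = (1, 1), sourceUV = (1/3, 1/3), and targetUV = IN.uv_MainTex * (2/3) + (1/3, 1/3).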

This is a sped-up tutorial for implementing the atlas subgraph. As you can see, it takes fewer than 7 minutes to get several atlassable textures onto a material, ready for customization.

Here, you can find a small sample of the documentation for this shader feature.
