UV Set Based Deformation

𝙐𝙨𝙞𝙣𝙜 𝙐𝙄 𝙩𝙤 𝙙𝙧𝙞𝙫𝙚 𝙨𝙝𝙖𝙙𝙚𝙧
This was my first approach to the project, but I hit an impasse when I went to apply an animation and learned that skeletal meshes can only use 4 UV sets in UE5.

𝙐𝙨𝙞𝙣𝙜 𝙃𝙞𝙩 𝙙𝙚𝙩𝙚𝙘𝙩𝙞𝙤𝙣
My first intention with this scene was to use render targets and project color onto the UV space of the mesh to get localized deformations, but I couldn't quite get the projection to work as intended.

𝘽𝙖𝙨𝙚 𝙈𝙚𝙨𝙝
I started by making a base sculpt in ZBrush, which I then retopologized in Maya.

𝙑𝙖𝙧𝙞𝙖𝙣𝙩𝙨
I then took the base mesh back to ZBrush and created the deformation variants so that I could bake them in Houdini.

𝙏𝙚𝙭𝙩𝙪𝙧𝙞𝙣𝙜
I hand-painted texture variations to easily get variety in each albedo.

𝙃𝙤𝙪𝙙𝙞𝙣𝙞 𝙎𝙚𝙩𝙪𝙥
I settled on using 1.5 UV sets per displacement, as the displacement was inaccurate with just one UV set. This also allowed me to swizzle the vectors later on.
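
The "half" UV set comes from each set holding two floats: a full XYZ delta fills one whole set plus one channel of a second, leaving the spare channel free for the next delta. A simplified Python sketch of that packing (the function names are illustrative; the real work happens in Houdini attributes and the shader):

```python
# Packing a 3-component displacement delta into "1.5" UV sets.
# Each UV set stores two floats (u, v), so an XYZ delta fills one
# whole set plus one channel of a second; the leftover channel is
# free for the next delta, which is why two deltas fit in three sets.

def pack_delta(delta):
    """Split an (x, y, z) displacement across UV-set channels."""
    x, y, z = delta
    full_set = (x, y)   # e.g. one UV set carries X and Y
    half_set = z        # e.g. the next set's U channel carries Z
    return full_set, half_set

def unpack_delta(full_set, half_set):
    """Reassemble the displacement vector on the shader side."""
    return (full_set[0], full_set[1], half_set)
```

Keeping each axis in its own named channel, rather than inside a packed texture sample, is also what makes the later swizzling straightforward.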

𝘿𝙞𝙨𝙥𝙡𝙖𝙘𝙚𝙢𝙚𝙣𝙩 𝘽𝙡𝙚𝙣𝙙𝙞𝙣𝙜

This is how the shader blends between the different UV sets to create the displacement. 
The swizzling of the coordinates can be seen here.
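
Stripped of the material nodes, the blend amounts to scaling each stored delta by its driving attribute and adding the sum to the vertex position. A simplified sketch of that idea (the real version runs as a world position offset in the shader, not in Python):

```python
# Simplified model of the displacement blend: each UV-set-stored
# delta is scaled by its driving attribute (0..1) and the summed
# offset is added to the base vertex position.

def displaced_position(base_pos, strength_delta, agility_delta,
                       strength, agility):
    offset = tuple(s * strength + a * agility
                   for s, a in zip(strength_delta, agility_delta))
    return tuple(p + o for p, o in zip(base_pos, offset))
```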

𝘼 𝙗𝙚𝙩𝙩𝙚𝙧 𝙫𝙞𝙚𝙬
Here, some UV coordinates needed to be inverted so that the displacement pointed in the right directions.
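
Conceptually the fix is a per-axis sign flip when decoding the stored values. A rough sketch; the default flip mask below is a placeholder, since which axes come out mirrored depends on the coordinate conventions of the tools involved (Unreal, Houdini, Maya):

```python
# Decoding UV-stored deltas with per-axis sign flips. The invert
# mask here is illustrative, not the project's actual flip mask.

def decode_delta(u, v, w, invert=(False, True, False)):
    components = (u, v, w)
    return tuple(-c if flip else c
                 for c, flip in zip(components, invert))
```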

𝘽𝙡𝙚𝙣𝙙𝙞𝙣𝙜 𝙩𝙚𝙭𝙩𝙪𝙧𝙚𝙨
This is how I blend between the different texture maps. This segment blends between the strength textures, which in turn blend with the agility textures.
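
Stripped of the material nodes, the chain is two lerps. A simplified sketch of one plausible reading of the graph (texture names are illustrative):

```python
# Two chained lerps: blend within the strength pair by the strength
# weight, then blend that result with the agility texture by the
# agility weight. Colors are RGB tuples; the real blend runs
# per-pixel in the shader.

def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def blend_maps(strength_a, strength_b, agility_tex, strength, agility):
    strength_blend = lerp(strength_a, strength_b, strength)
    return lerp(strength_blend, agility_tex, agility)
```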

𝘽𝙡𝙚𝙣𝙙𝙞𝙣𝙜 𝙩𝙚𝙭𝙩𝙪𝙧𝙚𝙨
To get the blending to pivot around a midpoint of 0.5, I had to multiply in the base texture, using an approach similar to the displacement logic, since the shader needed a solid midpoint when both attributes are at 50%.
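
The actual shader multiplies in the base texture; as a hedged approximation of the same midpoint behavior, the weight can be remapped into a two-sided blend so that 0.5 lands exactly on the untouched base:

```python
# A guess at the midpoint behavior described above, written as a
# two-sided lerp rather than the multiply used in the real graph:
# t = 0 gives tex_a, t = 1 gives tex_b, and t = 0.5 gives the base.

def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def blend_with_midpoint(base, tex_a, tex_b, t):
    if t < 0.5:
        return lerp(tex_a, base, t * 2.0)
    return lerp(base, tex_b, (t - 0.5) * 2.0)
```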

I wanted to explore a more performant alternative to traditional blendshapes by storing deformations in UV sets using Houdini. My goal was to understand both the benefits and limitations of this approach compared to conventional methods.

The concept revolves around offloading deformation calculations to the GPU rather than relying on CPU-intensive blendshapes. Compared to texture-based deformation storage, this method requires fewer samples, freeing up resources for other aspects of the project.

As I developed the implementation, I discovered that splitting coordinates into three separate channels (using 1.5 UV sets per deformation instead of a single UV set) improved accuracy. This approach also simplified the shader implementation, particularly when dealing with the different coordinate systems used by Unreal, Houdini, and Maya.

During development, I encountered several engine limitations that forced me to pivot. Unreal Engine 5 can only handle 8 UV sets total, and more critically, only 4 UV sets in a shader for rigged meshes. Since my shader was already using 7 UV sets for attributes, I had to abandon my rigging and animation work, a lesson in checking engine constraints before going too deep into implementation!

While this vertex displacement method proved less than ideal for character work, I believe it could be used for other applications like dynamic vegetation growth and other dynamic environment effects.

- 𝙏𝙖𝙠𝙚𝙖𝙬𝙖𝙮𝙨 -
- GPU-based deformations can offer better real-time performance than traditional blendshapes.
- Always research engine limitations before committing to a technical approach.
- Splitting displacement coordinates across channels improves accuracy and simplifies implementation.
- The technique trades some precision for performance gains.

- 𝙄𝙛 𝙄 𝙝𝙖𝙙 𝙢𝙤𝙧𝙚 𝙩𝙞𝙢𝙚 -
- The first would be to get the localized displacement I experimented with during development working, so that each limb could grow or shrink individually.

- I'd create a version using the 4 available UV sets in UE5 with a rigged character to test the approach in a more gameplay-relevant context.