Over the past little while, I've been looking at UV texture mapping via Grasshopper. Grasshopper itself does not provide any UV tools, so to do any mapping, the geometry had to be baked, UV-mapped, and then exported for every single frame.
To automate this process, I relied heavily on being able to construct untrimmed NURBS surfaces, since one property of an untrimmed NURBS surface is that it fills the 0–1 UV space by default. From there, it was easy to transfer the UV data onto a mesh via Rhino's ApplyCustomMapping command.
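The core of the automation is really just remapping: an untrimmed NURBS surface's (u, v) domain can be normalised into the 0–1 square, and each mesh vertex can borrow the normalised parameters of its closest point on the surface. Here is a minimal pure-Python sketch of that remapping step (the domain values and parameter list are made up for illustration; inside Rhino, Surface.ClosestPoint plus ApplyCustomMapping handle the heavy lifting):

```python
def normalize_uv(u, v, u_domain, v_domain):
    """Remap a surface parameter pair into the 0-1 UV square."""
    u0, u1 = u_domain
    v0, v1 = v_domain
    return ((u - u0) / (u1 - u0), (v - v0) / (v1 - v0))

# Hypothetical surface domains -- untrimmed surfaces often span arbitrary ranges
u_dom, v_dom = (0.0, 25.0), (-5.0, 5.0)

# Surface parameters found for three mesh vertices (e.g. via closest-point queries)
params = [(0.0, -5.0), (12.5, 0.0), (25.0, 5.0)]

# Each vertex now carries a texture coordinate inside the unit square
uvs = [normalize_uv(u, v, u_dom, v_dom) for u, v in params]
```

Since every untrimmed surface covers its full domain, this normalisation never leaves gaps in the unit square, which is exactly why the untrimmed constraint matters.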
For the first animation, I tried blending a plane into a sphere (based on a technique I outlined in a previous tutorial), and then ran a wave cycle simulation over the top using Kangaroo, which gave the wobbly appearance. The texture was also animated using data from the meshes, such as displacement from the pre-wave simulation, to create lighter and darker spots across the object.
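At its simplest, a plane-to-sphere blend like this is a per-point linear interpolation: evaluate both surfaces at the same UV coordinate and lerp between the two positions. A rough pure-Python sketch of that idea (the parameterisations and function names are my own illustration, not the exact setup from the tutorial):

```python
import math

def sphere_point(u, v, radius=1.0):
    """Map a 0-1 UV pair onto a sphere (u = longitude, v = latitude)."""
    theta = u * 2.0 * math.pi
    phi = v * math.pi
    return (radius * math.sin(phi) * math.cos(theta),
            radius * math.sin(phi) * math.sin(theta),
            radius * math.cos(phi))

def plane_point(u, v, size=2.0):
    """Map the same UV pair onto a flat plane centred at the origin."""
    return (size * (u - 0.5), size * (v - 0.5), 0.0)

def blend(u, v, t):
    """Linearly blend from plane (t = 0) to sphere (t = 1) at one UV sample."""
    p, s = plane_point(u, v), sphere_point(u, v)
    return tuple(pc + t * (sc - pc) for pc, sc in zip(p, s))
```

Because both evaluations share the same UV coordinate, the texture mapping stays consistent through the whole blend, which is what makes the animated texture tricks possible.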
For the second animation, the same wave-cycle simulation setup was used, but applied slightly differently. The raw surface blend was exported with no displacement, and the displacement was calculated per frame and rendered out as an animated normal map. This can be seen in the second pass; the first pass is just a checkerboard pattern to visualise UV stretching. Not baking the displacement into the geometry means any sort of animated displacement map can be applied further down the line, and this approach probably worked better than the first animation.
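One common way to turn a per-frame displacement field into a normal map is to treat it as a heightfield and take finite differences of neighbouring samples. A small sketch of that conversion, assuming the displacement has been sampled onto a regular UV grid (the function name and grid are hypothetical):

```python
def height_to_normals(height, scale=1.0):
    """Convert a 2D displacement grid into per-sample tangent-space normals
    using central differences, clamped at the grid borders."""
    rows, cols = len(height), len(height[0])
    normals = []
    for j in range(rows):
        row = []
        for i in range(cols):
            # Slope in each direction via central differences
            dx = (height[j][min(i + 1, cols - 1)] - height[j][max(i - 1, 0)]) * scale
            dy = (height[min(j + 1, rows - 1)][i] - height[max(j - 1, 0)][i]) * scale
            # The (unnormalised) heightfield normal is (-dx, -dy, 1)
            length = (dx * dx + dy * dy + 1.0) ** 0.5
            row.append((-dx / length, -dy / length, 1.0 / length))
        normals.append(row)
    return normals
```

Rendering these normals into an image each frame gives the animated normal map, while the exported geometry itself stays undisplaced.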
Furthermore, in the second animation I tried a different method for blending between forms. Instead of mapping each surface point directly to an end point on the target surface, I aligned several objects along their U direction to generate a series of tracks through their collections of UV points, and then set up an interpolation track to create the animated meshes, as visible in the last video.
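The track idea can be sketched as follows: for one UV sample, collect its corresponding position on each of the aligned keyframe surfaces, then interpolate along that sequence of points as time advances. A piecewise-linear version in plain Python (the coordinates are made up; the real setup interpolates every UV sample this way):

```python
def track_point(track, t):
    """Piecewise-linear interpolation along a track of corresponding points
    (one per keyframe surface) at normalised time t in [0, 1]."""
    if t <= 0.0:
        return track[0]
    if t >= 1.0:
        return track[-1]
    # Work out which segment of the track t falls into
    scaled = t * (len(track) - 1)
    i = int(scaled)
    local = scaled - i
    a, b = track[i], track[i + 1]
    return tuple(ac + local * (bc - ac) for ac, bc in zip(a, b))

# One UV sample's positions on three aligned surfaces (made-up coordinates)
track = [(0.0, 0.0, 0.0), (1.0, 2.0, 0.0), (2.0, 0.0, 1.0)]
```

A smoother curve (e.g. a Catmull-Rom spline through the same points) would drop in here without changing the overall structure; the key point is that every UV sample travels along its own track, so UVs stay stable across the whole animation.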