In this AR effect sticker, 32 daisy flowers bloom on the user's face. Blinking the left eye makes the existing daisies disappear one by one and a new set of flowers bloom at other locations on the face. The flowers vary in color, scale, and animation timing.
These effects were made in Unity 2020.3.0f1, with iOS ARKit used for face tracking. To keep both algorithm development and art direction interactive, I used the AR Foundation Remote 2.0 asset.
In a pre-process step, data were generated and encoded into textures in Houdini, targeting an optimized workflow and better performance at run time.
I used Unity's GPU instancing framework to render the 32 flowers. A compute shader calculated the positions and rotations of the instanced flowers at run time. A custom vertex shader animated the blooming / disappearing flowers with varied timings, and a custom surface shader applied the color variation across flowers and petals.
A global seed is updated on each trigger; it controls all the variation and randomness in the effect.
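For reference, a minimal C# sketch of the per-trigger seed update; the shader property name `_GlobalSeed` is an assumption, not the project's actual name:

```csharp
// A minimal sketch of the per-trigger seed update; _GlobalSeed is assumed.
using UnityEngine;

static class EffectSeed
{
    public static float Advance()
    {
        float seed = Random.value; // fresh seed on every wink trigger
        // All variation (arrangement choice, colors, timing offsets)
        // derives its randomness from this one value.
        Shader.SetGlobalFloat("_GlobalSeed", seed);
        return seed;
    }
}
```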
The flowers were procedurally animated following the concept of CFPA.
Unlike flat face painting, using real flowers as make-up elements creates richer aesthetics. Real-life daisy flowers also show a lot of color variation.
Since the daisy flowers' geometric topologies were identical, they were a good use case for GPU instancing. I used Unity's Graphics.DrawMeshInstancedIndirect function to run the instanced rendering. Four custom structured buffers stored the flower data, carried the updated face mesh data, and fed the instanced draw. A compute shader calculated the instance positions and normals, and a custom shader applied per-flower procedural animation and look development at the vertex and pixel stages. A sketch of the setup follows.
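A minimal sketch of the indirect-instancing setup, assuming a hypothetical buffer layout; the stride, property names (`_FlowerBuffer`), and kernel index are illustrative, not the project's actual ones:

```csharp
// A minimal sketch of DrawMeshInstancedIndirect with a compute-updated buffer.
using UnityEngine;

public class DaisyRenderer : MonoBehaviour
{
    const int kFlowerCount = 32;

    public Mesh flowerMesh;
    public Material flowerMaterial;   // uses the custom vertex/surface shader
    public ComputeShader placementCS; // computes per-instance position/rotation

    ComputeBuffer flowerBuffer;       // per-flower data (position, rotation, seed...)
    ComputeBuffer argsBuffer;         // indirect draw arguments

    void Start()
    {
        // Per-instance data written by the compute shader, read by the vertex shader.
        // The 8-float stride is an assumed layout.
        flowerBuffer = new ComputeBuffer(kFlowerCount, sizeof(float) * 8);

        // Index count per instance, instance count, start index, base vertex, start instance.
        uint[] args = {
            flowerMesh.GetIndexCount(0), kFlowerCount,
            flowerMesh.GetIndexStart(0), flowerMesh.GetBaseVertex(0), 0
        };
        argsBuffer = new ComputeBuffer(1, args.Length * sizeof(uint),
                                       ComputeBufferType.IndirectArguments);
        argsBuffer.SetData(args);

        flowerMaterial.SetBuffer("_FlowerBuffer", flowerBuffer);
    }

    void Update()
    {
        // Update per-instance transforms on the GPU, then issue the indirect draw.
        placementCS.SetBuffer(0, "_FlowerBuffer", flowerBuffer);
        placementCS.Dispatch(0, 1, 1, 1); // 32 instances fit in one thread group

        Graphics.DrawMeshInstancedIndirect(
            flowerMesh, 0, flowerMaterial,
            new Bounds(Vector3.zero, Vector3.one * 10f), argsBuffer);
    }

    void OnDestroy()
    {
        flowerBuffer?.Release();
        argsBuffer?.Release();
    }
}
```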
By giving each object its own timeline, we can drive the same pre-determined procedural animation with different timing per instance, as sketched below. A more detailed explanation follows.
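A minimal sketch of such a per-instance timeline; the per-flower start offset is a hypothetical attribute derived from the global seed:

```csharp
// Remap global time into a normalized age in [0, 1] for one instance.
using UnityEngine;

static class FlowerTimeline
{
    public static float NormalizedAge(float globalTime, float startOffset, float duration)
    {
        // Each flower starts at its own offset, so identical animations
        // play back with staggered timing across the 32 instances.
        return Mathf.Clamp01((globalTime - startOffset) / duration);
    }
}
```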
The procedural animation is triggered by the user's wink gesture. The distance between two vertices on the upper and lower eyelids of the left eye was used to design the interaction: the animation activates when the two vertices stay close enough for more than 0.5 seconds, as in the sketch below.
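A minimal sketch of the wink trigger, assuming access to the updated face-mesh vertices; the eyelid vertex indices and the distance threshold are illustrative values to be tuned against the tracked mesh:

```csharp
// A minimal sketch of the hold-to-trigger wink detection.
using UnityEngine;

public class WinkTrigger : MonoBehaviour
{
    const float kClosedThreshold = 0.004f; // meters; tune against the tracked mesh
    const float kHoldDuration = 0.5f;      // seconds the eye must stay closed

    public Mesh trackedFaceMesh;             // updated by ARKit face tracking
    public int upperLidIndex, lowerLidIndex; // left-eye eyelid vertex indices

    float closedTimer;
    public event System.Action OnWink;

    void Update()
    {
        Vector3[] verts = trackedFaceMesh.vertices;
        float lidGap = Vector3.Distance(verts[upperLidIndex], verts[lowerLidIndex]);

        if (lidGap < kClosedThreshold)
        {
            closedTimer += Time.deltaTime;
            if (closedTimer >= kHoldDuration)
            {
                OnWink?.Invoke(); // advance the global seed, restart the animation
                closedTimer = float.NegativeInfinity; // fire once per wink
            }
        }
        else
        {
            closedTimer = 0f; // eye opened; reset the hold timer
        }
    }
}
```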
I modeled a low-resolution daisy flower and generated its textures.
I designed 2,048 unique arrangement variations.
To art-direct the overall placement, keeping flowers away from the eyes and mouth, I drew 16 guide curves. They were drawn on the mesh generated with the ARKit tracking tool and exported from Unity.
I made a rig that procedurally and randomly distributes 32 spheres, each representing a daisy flower, around the guide curves. The system relaxed the spheres to avoid inter-collision and kept them stuck to the face surface. 128 procedural arrangements were made per guide curve; all in all, 2,048 (16 x 128) arrangement variations of the daisy flowers were produced.
Flowers should not interpenetrate each other. First, I ran 15 iterations of relaxation to push the flowers slightly apart and reduce the chance of inter-collision. This is intentionally not a full separation; I left some room for overlap, because the references show that stacking multiple flowers looks more natural. Along with the art-directed arrangement variations, this partial relaxation produced evenly yet randomly arranged flowers. A sketch of the idea follows.
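A minimal sketch of the partial relaxation, expressed in C# for clarity (the actual step ran in Houdini); the sphere data layout and the 0.25 step factor are illustrative:

```csharp
// Push overlapping spheres apart by a fraction of their overlap per iteration.
using UnityEngine;

struct FlowerSphere { public Vector3 position; public float radius; }

static class Relaxation
{
    public static void Relax(FlowerSphere[] flowers, int iterations = 15)
    {
        for (int it = 0; it < iterations; it++)
        {
            for (int i = 0; i < flowers.Length; i++)
            for (int j = i + 1; j < flowers.Length; j++)
            {
                Vector3 delta = flowers[j].position - flowers[i].position;
                float dist = delta.magnitude;
                float minDist = flowers[i].radius + flowers[j].radius;
                if (dist < minDist && dist > 1e-6f)
                {
                    // Separate by only a fraction of the overlap, so some
                    // natural-looking stacking survives the relaxation.
                    Vector3 push = delta / dist * ((minDist - dist) * 0.25f);
                    flowers[i].position -= push;
                    flowers[j].position += push;
                    // The Houdini rig also re-projects positions onto the
                    // face surface after each iteration.
                }
            }
        }
    }
}
```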
The custom vertex shader needs the neighbors' attributes, such as position and radius, to deform the flowers and reduce interpenetration. I ran a neighbor search in Houdini and encoded the resulting data into a texture.
To make the daisy flower instances track the correct positions on the updated face mesh and stick to it, a TriangleId and PrimUV (barycentric coordinates) were stored per flower. The 2,048 arrangement variations of these were encoded in a single 512 x 512 texture.
The normalized Petal Id was stored in the R channel of the vertex color of the instance mesh.
The Triangle Id and the nearest neighbors' Ids are integer values and need full precision. I deconstructed their digits and encoded them in separate channels; the custom shader reassembles the numbers by reversing the deconstruction, as in the sketch below.
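A minimal sketch of one possible digit encoding, assuming base-10 digits packed into normalized channels; the exact scheme here is an assumption:

```csharp
// Encode an integer id (0..9999) as one decimal digit per color channel,
// then reverse the deconstruction as the custom shader would.
using UnityEngine;

static class IdCodec
{
    public static Color Encode(int id)
    {
        return new Color(
            (id / 1000 % 10) / 10f,
            (id / 100  % 10) / 10f,
            (id / 10   % 10) / 10f,
            (id        % 10) / 10f);
    }

    public static int Decode(Color c)
    {
        // Rounding makes the decode robust to 8-bit quantization.
        return Mathf.RoundToInt(c.r * 10f) * 1000
             + Mathf.RoundToInt(c.g * 10f) * 100
             + Mathf.RoundToInt(c.b * 10f) * 10
             + Mathf.RoundToInt(c.a * 10f);
    }
}
```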
I made a compute shader that drove the position and rotation updates at run time. It decoded the textures to extract the data, calculated the position and rotation of each instance, and updated a custom buffer to deliver the results to the custom vertex and fragment shaders.
The decoded barycentric coordinate determines the position and normal on the triangle identified by the decoded triangle id, as sketched below.
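A minimal sketch of the barycentric look-up, expressed in C# for clarity; in the project this runs inside the compute shader on the tracked face mesh:

```csharp
// Interpolate position and normal on a face-mesh triangle from (u, v).
using UnityEngine;

static class FaceMeshSampler
{
    public static void Sample(
        Vector3[] verts, Vector3[] normals, int[] tris,
        int triangleId, Vector2 primUV, // decoded from the texture
        out Vector3 position, out Vector3 normal)
    {
        int i0 = tris[triangleId * 3];
        int i1 = tris[triangleId * 3 + 1];
        int i2 = tris[triangleId * 3 + 2];

        // Barycentric weights: (1 - u - v, u, v).
        float w0 = 1f - primUV.x - primUV.y;
        position = w0 * verts[i0] + primUV.x * verts[i1] + primUV.y * verts[i2];
        normal = (w0 * normals[i0] + primUV.x * normals[i1] + primUV.y * normals[i2]).normalized;
    }
}
```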
I implemented a collision-estimation logic that runs a collision-based deformation on the flowers in the custom vertex shader at run time. Each petal checks its own radius, its neighbors' radii, and the distances between them, and bends by a rotation angle (θ) to avoid interpenetration. This produced a very natural collision-based deformation.
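A minimal sketch of the bend-angle estimation; the overlap-to-angle mapping below is an assumption, not the exact production formula:

```csharp
// Estimate how far a petal must bend to clear an overlapping neighbor.
using UnityEngine;

static class PetalCollision
{
    public static float BendAngle(
        Vector3 selfCenter, float selfRadius,
        Vector3 neighborCenter, float neighborRadius)
    {
        float dist = Vector3.Distance(selfCenter, neighborCenter);
        float overlap = selfRadius + neighborRadius - dist;
        if (overlap <= 0f) return 0f; // no contact, no bending

        // Bend just enough to clear the overlap, clamped so heavily
        // overlapped petals do not fold unnaturally far.
        float theta = Mathf.Asin(Mathf.Clamp01(overlap / selfRadius));
        return Mathf.Min(theta, Mathf.PI * 0.5f);
    }
}
```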
The daisies' disappearing and blooming animation runs in a custom vertex shader. A normalized age, determined by the interaction and delivered to the shader, drives four individual procedural sub-animations; combined, they compose the final blooming / disappearing animation.
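A minimal sketch of splitting one normalized age into four sub-animation weights; the phase ranges and sub-animation names here are illustrative assumptions, not the production breakdown:

```csharp
// Drive four overlapping sub-animations from a single normalized age.
using UnityEngine;

static class BloomAnimation
{
    // Remap the global age into a local 0..1 timeline for one sub-animation.
    static float Phase(float age, float start, float end)
        => Mathf.Clamp01((age - start) / (end - start));

    public static void Weights(float age,
        out float stemGrow, out float headScale, out float petalUnfold, out float sway)
    {
        stemGrow    = Phase(age, 0.00f, 0.35f);
        headScale   = Phase(age, 0.20f, 0.55f);
        petalUnfold = Phase(age, 0.40f, 0.85f);
        sway        = Phase(age, 0.70f, 1.00f);
    }
}
```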
Look variation was applied in the custom fragment shader. A color palette was generated in the pre-process; the color options were stored in a 1024 x 1 texture and delivered to the shader. Each flower samples two colors from it and creates a gradient across the petals by mixing them with uv.y as the bias, as sketched below.
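A minimal sketch of the palette sampling, expressed in C# for clarity; in the project this runs in the fragment shader, and the hash-style index picking is an assumption:

```csharp
// Pick two palette colors per flower and blend them along the petal.
using UnityEngine;

static class FlowerPalette
{
    public static Color PetalColor(Texture2D palette, float flowerSeed, float uvY)
    {
        // Pick two of the 1024 palette entries from the per-flower seed.
        int a = Mathf.FloorToInt(Mathf.Repeat(flowerSeed * 1024f, 1024f));
        int b = Mathf.FloorToInt(Mathf.Repeat(flowerSeed * 7919f, 1024f)); // 7919: arbitrary prime

        Color colorA = palette.GetPixel(a, 0);
        Color colorB = palette.GetPixel(b, 0);

        // Gradient along the petal, with uv.y as the blend bias.
        return Color.Lerp(colorA, colorB, uvY);
    }
}
```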
The ARKit face mask was used for masking the flowers. I wrote a mask shader to matte out the flowers occluded by the face.