1,000 GPU-instanced cartoon fish, running in Unity and captured with an Oculus Quest 2.
The cartoon fish model and texture were purchased from CGTrader. I modified the fish's posture into a neutral pose, and edited the texture to avoid artifacts around UV edges when mipmaps are used.
The fish kinematic equation generates a natural swimming animation. Its derivative can be used to compute the proper deformation of the vertices and the updated normals of the deformed surface.
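For illustration, a common form of such a kinematic equation, using the coefficients c1, c2, k, and w that appear later in this post, displaces each vertex sideways as a function of its position x along the spine (this exact form is my assumption, not necessarily the project's):

y(x, t) = (c1 x + c2 x^2) · sin(k x + w t)

Differentiating with respect to x gives the local slope of the deformed surface, which is what the normal correction needs:

∂y/∂x = (c1 + 2 c2 x) · sin(k x + w t) + k (c1 x + c2 x^2) · cos(k x + w t)

The amplitude envelope c1 x + c2 x^2 keeps the head nearly still while letting the tail swing wide, and k and w set the wavelength and speed of the traveling wave.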
A: Position update with the fish kinematic equation. Without the normal correction, the mesh was skewed.
B: Position update with normal correction. The derivative of the kinematic equation was used to analytically calculate the normals of the deformed mesh. However, this rule stretched the fins.
C: A separate transformation was applied to the fins: they were transformed with the corrected normal of their pivots.
D: Additional procedural animation was added on the fins.
The moving speed of each agent particle drove the speed of the fish kinematic animation. During the simulation, a nonlinear time attribute, whose value was accumulated with the agent's speed, drove the fish kinematics.
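A minimal HLSL sketch of this accumulation, with hypothetical buffer and field names that are not the project's actual code:

```hlsl
// Hypothetical per-agent state; names are illustrative.
struct FishAgent
{
    float3 position;
    float  speed;      // current agent speed from the Boids step
    float  animTime;   // nonlinear time, accumulated with speed
};

RWStructuredBuffer<FishAgent> _Agents;
float _DeltaTime;

[numthreads(64, 1, 1)]
void AccumulateAnimTime(uint3 id : SV_DispatchThreadID)
{
    FishAgent a = _Agents[id.x];
    // Faster fish advance their kinematic phase faster, so the
    // sin(k*x + w*t) wave beats in proportion to swim speed.
    a.animTime += a.speed * _DeltaTime;
    _Agents[id.x] = a;
}
```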
Boids is a flocking algorithm that drives agents with weighted separation, cohesion, and alignment. I implemented this system with compute shaders to make the fish swarm.
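As a sketch of the technique, a minimal O(n^2) Boids kernel in HLSL; the buffer names, weights, and the brute-force neighbor loop are my assumptions, not the project's actual implementation:

```hlsl
struct Agent { float3 pos; float3 vel; };

RWStructuredBuffer<Agent> _Agents;
uint  _AgentCount;
float _NeighborRadius, _SeparationW, _CohesionW, _AlignmentW, _DeltaTime;

[numthreads(64, 1, 1)]
void Boids(uint3 id : SV_DispatchThreadID)
{
    Agent self = _Agents[id.x];
    float3 sep = 0, coh = 0, ali = 0;
    uint n = 0;

    for (uint i = 0; i < _AgentCount; i++)
    {
        if (i == id.x) continue;
        float3 toOther = _Agents[i].pos - self.pos;
        float d = length(toOther);
        if (d > _NeighborRadius) continue;
        sep -= toOther / max(d * d, 1e-4);  // push away, stronger when close
        coh += _Agents[i].pos;              // pull toward the local center
        ali += _Agents[i].vel;              // match neighbors' heading
        n++;
    }

    if (n > 0)
    {
        coh = (coh / n) - self.pos;
        ali = (ali / n) - self.vel;
        self.vel += (_SeparationW * sep + _CohesionW * coh + _AlignmentW * ali) * _DeltaTime;
    }
    self.pos += self.vel * _DeltaTime;
    _Agents[id.x] = self;
}
```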
Improvement
I implemented a rotation delay function that helps agents avoid aggressive, sudden turns.
Sudden turn from the default Boids system
Turn fix with rotation delay
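A minimal sketch of such a rotation delay, as a fragment of the agent-update kernel; the exponential-smoothing form and the _TurnRate parameter are my assumptions:

```hlsl
// Ease the facing direction toward the velocity instead of snapping to it.
// _TurnRate is an assumed tuning parameter; higher values mean snappier turns.
float3 currentDir = agent.forward;
float3 targetDir  = normalize(agent.vel);
// Exponential smoothing keeps the turn delay frame-rate independent.
float blend = 1.0 - exp(-_TurnRate * _DeltaTime);
agent.forward = normalize(lerp(currentDir, targetDir, blend));
```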
Another improvement was fixing gimbal lock. The rotation quaternion was generated from an up vector and an aim vector, in order to make the fish swim with their up direction aligned to the y axis. When the aim direction pointed nearly straight up or down, this caused a gimbal-lock issue. I manipulated the up vector so that it does not face straight up while a fish is moving upward or downward.
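One possible construction of that up-vector fix (this is my sketch, not necessarily the project's exact math):

```hlsl
// When the aim approaches +/-Y, tilt the up reference toward the fish's
// horizontal travel direction so the look-rotation basis never degenerates.
float3 aim   = normalize(agent.forward);
float3 up    = float3(0, 1, 0);
float  vert  = abs(aim.y);                         // 1 when aiming straight up/down
float3 horiz = normalize(float3(aim.x, 0, aim.z) + float3(1e-5, 0, 0));
// Lean the up vector back when climbing, forward when diving.
up = normalize(lerp(up, -sign(aim.y) * horiz, vert));
// Re-orthogonalize so (right, up, aim) forms a valid rotation basis.
float3 right = normalize(cross(up, aim));
up = cross(aim, right);
```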
I used Unity's Graphics.DrawMeshInstancedIndirect function to render the population of fish agents with GPU instancing.
Compute shaders were used to run the dynamics of the fish agents, including the Boids behavior. This dynamic system updated the transformation of each agent every timestep, independently of the render pipeline. Compute shaders were also used to decode and store the vertex data and per-fish data in buffers. Only 3 extra batches were required for rendering thousands of fish.
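For reference, a sketch of the indirect-draw setup on the C# side; the class, counts, and bounds are illustrative, not the project's actual values:

```csharp
using UnityEngine;

public class FishRenderer : MonoBehaviour
{
    public Mesh fishMesh;
    public Material fishMaterial;   // samples the structured buffers in its shaders
    public int instanceCount = 1000;

    ComputeBuffer argsBuffer;

    void Start()
    {
        // Indirect args: index count, instance count, start index, base vertex, start instance.
        uint[] args =
        {
            fishMesh.GetIndexCount(0),
            (uint)instanceCount,
            fishMesh.GetIndexStart(0),
            fishMesh.GetBaseVertex(0),
            0
        };
        argsBuffer = new ComputeBuffer(1, args.Length * sizeof(uint),
                                       ComputeBufferType.IndirectArguments);
        argsBuffer.SetData(args);
    }

    void Update()
    {
        // One call draws every fish; per-agent transforms live in GPU buffers.
        Graphics.DrawMeshInstancedIndirect(
            fishMesh, 0, fishMaterial,
            new Bounds(Vector3.zero, Vector3.one * 100f), argsBuffer);
    }

    void OnDestroy() => argsBuffer?.Release();
}
```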
Textures were used to encode and decode the data: the vertex position, normal, UV, and segmentation data are stored in textures. Although the OpenEXR format guarantees high precision, it is platform dependent, so PNG textures were used for the data transfer instead. To avoid losing precision with 8-bit PNGs, each data value was split into two parts, separating out the decimals; a decoding sketch follows the list below. These high-precision texture pairs stored the positions, normals, and UVs of the vertices. The data were reassembled in a compute shader and stored in structured buffers to be accessed from the vertex shader.
The position, normal, and UV data were each baked as a low-precision and a high-precision texture pair:
Vertex position (4096 x 57 x 2)
Vertex normal (4096 x 57 x 2)
Vertex UV (4096 x 57 x 2)
Vertex existence and isFin (4096 x 57)
Geometry data (2,778 triangles)
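A minimal sketch of how such a two-texture split can be reassembled on the GPU; the hi/lo split scheme, the [0, 1] normalization, and all names are my assumptions:

```hlsl
// Each component was baked into two 8-bit PNG channels: a coarse part and
// the residual it leaves behind, giving roughly 16 bits of precision.
Texture2D<float4> _PositionHigh;
Texture2D<float4> _PositionLow;
RWStructuredBuffer<float3> _VertexPositions;
float3 _BoundsMin, _BoundsMax;   // object-space range assumed at bake time

[numthreads(8, 8, 1)]
void DecodePositions(uint3 id : SV_DispatchThreadID)
{
    float3 hi = _PositionHigh[id.xy].xyz;   // quantized value, 8 bits
    float3 lo = _PositionLow[id.xy].xyz;    // residual, also 8 bits
    float3 v01 = hi + lo / 255.0;           // reassemble in [0, 1]
    // Remap from the normalized range back to object space, then store
    // for the vertex shader to read from the structured buffer.
    _VertexPositions[id.y * 4096 + id.x] = lerp(_BoundsMin, _BoundsMax, v01);
}
```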
I made a hue-shift mask to apply color variation per fish agent. Combining the mask with the fish agent id generated a random value per agent, so every fish agent could have its own distinct color. The per-agent hue offset values were generated and stored in a structured compute buffer when the app started, and applied in the fragment shader to render a different color variation for each instanced fish.
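For illustration, a common way to apply such a per-agent hue offset in a fragment shader; the HSV round-trip below is a standard snippet, while the mask channel, the agentId input, and the buffer names are my assumptions:

```hlsl
StructuredBuffer<float> _HueOffsets;   // one offset per agent, filled at startup
sampler2D _MainTex, _HueMask;

float3 rgb2hsv(float3 c)
{
    float4 K = float4(0.0, -1.0 / 3.0, 2.0 / 3.0, -1.0);
    float4 p = lerp(float4(c.bg, K.wz), float4(c.gb, K.xy), step(c.b, c.g));
    float4 q = lerp(float4(p.xyw, c.r), float4(c.r, p.yzx), step(p.x, c.r));
    float d = q.x - min(q.w, q.y);
    return float3(abs(q.z + (q.w - q.y) / (6.0 * d + 1e-10)), d / (q.x + 1e-10), q.x);
}

float3 hsv2rgb(float3 c)
{
    float4 K = float4(1.0, 2.0 / 3.0, 1.0 / 3.0, 3.0);
    float3 p = abs(frac(c.xxx + K.xyz) * 6.0 - K.www);
    return c.z * lerp(K.xxx, saturate(p - K.xxx), c.y);
}

// In the fragment shader: shift the hue only where the mask allows it.
// i.agentId is assumed to be passed down from the instance id.
float  mask    = tex2D(_HueMask, i.uv).r;
float3 albedo  = tex2D(_MainTex, i.uv).rgb;
float3 shifted = hsv2rgb(rgb2hsv(albedo) + float3(_HueOffsets[i.agentId], 0, 0));
float3 color   = lerp(albedo, shifted, mask);
```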
I applied unique attribute values per fish species as follows.
Max speed
Speed multiplier
Fish kinematic coefficients (c1, c2, k, w)
Per-agent variation for each attribute was applied and stored in the same compute buffer, during the same pre-process in which the hue-shift offsets were stored (see the sketch below). This added another level of variation.
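A sketch of that one-time pre-process on the C# side; the struct layout, jitter ranges, and names are illustrative assumptions:

```csharp
using UnityEngine;

// Per-agent variation packed for the GPU; layout is an assumption.
struct FishVariation
{
    public float hueOffset;
    public float maxSpeed;
    public float speedMultiplier;
    public Vector4 kinematicCoeffs;   // (c1, c2, k, w)
}

public class FishVariationSetup : MonoBehaviour
{
    public ComputeBuffer variationBuffer;

    public void Init(int count, FishVariation[] speciesDefaults, int[] speciesOfAgent)
    {
        var data = new FishVariation[count];
        for (int i = 0; i < count; i++)
        {
            // Start from the species defaults, then jitter per agent.
            FishVariation v = speciesDefaults[speciesOfAgent[i]];
            v.hueOffset = Random.value;                    // per-agent hue shift
            v.maxSpeed *= Random.Range(0.9f, 1.1f);        // small per-agent jitter
            v.speedMultiplier *= Random.Range(0.9f, 1.1f);
            data[i] = v;
        }
        variationBuffer = new ComputeBuffer(count, 7 * sizeof(float));
        variationBuffer.SetData(data);
    }
}
```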
I showed the fish to my 5-year-old daughter Chloe. She asked me, "Daddy, can you make the fishes pop when I poke them?" I thought it was a brilliant idea.
I used the Oculus Integration asset to detect the position of the right hand's index fingertip.
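A sketch of that fingertip lookup via the Oculus Integration hand skeleton; the exact setup varies by SDK version, so treat this as an assumption rather than the project's code:

```csharp
using UnityEngine;

public class IndexTipTracker : MonoBehaviour
{
    public OVRSkeleton rightHandSkeleton;   // from the Oculus Integration hand rig

    // Returns the world position of the right index fingertip, if tracked.
    public Vector3? GetIndexTip()
    {
        if (rightHandSkeleton == null || !rightHandSkeleton.IsDataValid) return null;
        foreach (var bone in rightHandSkeleton.Bones)
            if (bone.Id == OVRSkeleton.BoneId.Hand_IndexTip)
                return bone.Transform.position;
        return null;
    }
}
```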
While running the Boids behavior, the compute shader measured the distance between the index fingertip and each fish to check whether it was poked. A structured compute buffer was used to generate the emitters of the popping pieces from the poked fish. Only 4 emitters were used; they were recycled to create instanced popping pieces for newly poked fish. The popping-piece data were generated and handled with an additional structured buffer. During the IsPoked detection, the FishIds were transferred from the fish buffer to the emitter buffer so that the same textures and the same color hue shift could be applied to the pieces in the graphics shader. With this data transfer, the color and pattern stayed synchronized between the poked fish and their popping pieces. Pixels of the popping-piece instances sampled their color from the textures projected in spherical coordinates.
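A sketch of that poke check inside the agent-update kernel, reusing the FishAgent idea from earlier with assumed position/isPoked fields; the round-robin emitter recycling and all names are illustrative:

```hlsl
float3 _IndexTipPos;   // right index fingertip, uploaded each frame
float  _PokeRadius;
float  _Time;

struct Emitter { uint fishId; float startTime; uint active; };
RWStructuredBuffer<Emitter> _Emitters;      // fixed pool of 4, recycled
RWStructuredBuffer<uint>    _EmitterCursor; // single counter for round-robin

void CheckPoke(uint fishId, inout FishAgent fish)
{
    if (fish.isPoked != 0) return;
    if (distance(fish.position, _IndexTipPos) < _PokeRadius)
    {
        fish.isPoked = 1;
        // Claim the next emitter slot atomically, wrapping over the pool of 4.
        uint slot;
        InterlockedAdd(_EmitterCursor[0], 1, slot);
        // Carrying fishId into the emitter lets the popping pieces reuse
        // this fish's texture and hue shift in the graphics shader.
        Emitter e;
        e.fishId    = fishId;
        e.startTime = _Time;
        e.active    = 1;
        _Emitters[slot % 4] = e;
    }
}
```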
Along with the popping pieces, I made the neighboring fish swim away when a fish was poked, which visually enhanced the impact of the pop. Each fish checked its neighbors' IsPoked attribute and swam away if a neighbor had been poked on that frame.
After a fish was poked, it reappeared at a random position inside the bounding sphere.
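A sketch of sampling that respawn point uniformly inside the bounding sphere on the GPU; the hash and parameter names are assumptions:

```hlsl
// Wang hash for a cheap per-fish random stream on the GPU.
uint WangHash(uint s)
{
    s = (s ^ 61u) ^ (s >> 16); s *= 9u; s ^= s >> 4;
    s *= 0x27d4eb2du; s ^= s >> 15;
    return s;
}
float Rand01(uint s) { return WangHash(s) / 4294967295.0; }

float3 _BoundCenter;
float  _BoundRadius;

// Uniform sample inside a ball: uniform direction, cube-root radius.
float3 RespawnPosition(uint fishId, uint frame)
{
    uint seed = fishId * 9781u + frame * 6271u;
    float u   = Rand01(seed) * 2.0 - 1.0;            // cos(theta), uniform on the sphere
    float phi = Rand01(seed + 1u) * 6.2831853;
    float r   = pow(Rand01(seed + 2u), 1.0 / 3.0);   // cube root => uniform in volume
    float s   = sqrt(1.0 - u * u);
    return _BoundCenter + float3(s * cos(phi), s * sin(phi), u) * r * _BoundRadius;
}
```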
Chloe's fish-pop idea sketch
This is the pipeline of the fish-pop system. One specific challenge was triggering the pop sounds in sync with the pops: transferring data from the GPU back to the CPU normally causes a significant performance drop. The solution was Unity's AsyncGPUReadback function, which delivered the activation timing to the pop-sound system without any performance cost. The trade-off was a 2-3 frame delay before the sound played, but it was not noticeable.
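A sketch of that readback path; the per-emitter counter layout and names are my assumptions, and a real implementation would throttle to one in-flight request at a time:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class PopSoundBridge : MonoBehaviour
{
    public ComputeBuffer emitterBuffer;   // written by the compute shader, assigned elsewhere
    public AudioSource popSource;

    uint[] lastPopCounts = new uint[4];   // one slot per recycled emitter

    void Update()
    {
        // Non-blocking: the callback fires a few frames later, so the main
        // thread never stalls waiting on the GPU.
        AsyncGPUReadback.Request(emitterBuffer, OnReadback);
    }

    void OnReadback(AsyncGPUReadbackRequest req)
    {
        if (req.hasError) return;
        var data = req.GetData<uint>();   // assumed layout: a pop counter per emitter
        for (int i = 0; i < 4; i++)
        {
            if (data[i] != lastPopCounts[i])
            {
                lastPopCounts[i] = data[i];
                popSource.PlayOneShot(popSource.clip);   // plays 2-3 frames after the pop
            }
        }
    }
}
```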
Pop sounds
A custom shader visualized the impact of the fish popping on the user's hand. The shader read the emitter buffer to detect the pop activation and determine the normalized age and animation of the ripple. The normalized distance from the fingertip, a mask, and alpha values were encoded in the mesh's vertex colors; the shader decoded them to drive the ripple traveling out from the index fingertip.
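A fragment-level sketch of that decode; the vertex-color channel assignments and the uniforms are my assumptions:

```hlsl
// Vertex color is assumed to carry: r = normalized distance from the
// fingertip, g = ripple mask, b = alpha.
struct Emitter { uint fishId; float startTime; uint active; };
StructuredBuffer<Emitter> _Emitters;
float _RippleDuration, _RingWidth;
float4 _RippleColor;

float4 frag(v2f i) : SV_Target
{
    // Normalized age of the most recent pop drives the wavefront outward.
    float age = saturate((_Time.y - _Emitters[0].startTime) / _RippleDuration);
    // The ring lights up where the encoded fingertip distance matches the age.
    float ring = saturate(1.0 - abs(i.color.r - age) / _RingWidth);
    float alpha = ring * i.color.g * i.color.b;
    return float4(_RippleColor.rgb, alpha);
}
```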
Pre-cached, loopable animated ocean spectrum with real-time hex-tiling.
I built the app, sideloaded it with SideQuest, and ran it on an Oculus Quest Pro. As a standalone app, it performs well with around 120 fish in the scene.