Koi Pond: Ray Tracing over a Plane
Introduction
My original idea for a final project was a koi pond simulation that used hierarchical modeling to animate fish swimming under the surface of the water. A ray tracer would be used to accurately depict the refraction of light at the water's surface as seen by an outside observer. Using camera controls under OpenGL, the observer could move across the water and dive underneath to see how the light changes.
Problems
As it turns out, this is a much more difficult problem than I had originally anticipated. Ray tracing is a painfully slow process, and true real-time ray tracing is still an open research problem. Instead, approximations such as vertex shaders are commonly used: a shader takes the known angle from the camera eye, through the water surface, to the object underneath and uses it to fake the refraction. However, I did not do enough research into vertex shaders to implement them in my project.
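For reference, the geometry such an approximation relies on is Snell's law applied to the view ray at the water surface. The sketch below is a standalone illustration of my own (the vector type and function names are not from the project); it computes the bent ray direction much like GLSL's built-in refract().

#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static Vec3 scale(Vec3 v, double s) { return {v.x * s, v.y * s, v.z * s}; }
static Vec3 sub(Vec3 a, Vec3 b)     { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

// incident and normal must be unit vectors; eta = n_air / n_water (about 1.0 / 1.33).
// Returns the refracted direction, or {0,0,0} on total internal reflection.
Vec3 refractDir(Vec3 incident, Vec3 normal, double eta) {
    double cosI  = -dot(normal, incident);
    double sinT2 = eta * eta * (1.0 - cosI * cosI);
    if (sinT2 > 1.0) return {0.0, 0.0, 0.0};           // total internal reflection
    double cosT = std::sqrt(1.0 - sinT2);
    // T = eta * I + (eta * cosI - cosT) * N
    return sub(scale(incident, eta), scale(normal, cosT - eta * cosI));
}

int main() {
    // Looking down into the pond at roughly 45 degrees.
    Vec3 eyeRay = {0.7071, -0.7071, 0.0};
    Vec3 up     = {0.0, 1.0, 0.0};
    Vec3 r = refractDir(eyeRay, up, 1.0 / 1.33);
    std::printf("refracted: (%f, %f, %f)\n", r.x, r.y, r.z);
}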
Attempted Solution
Instead, I attempted an optimization by dividing the surface of the water into multiple zones. Each animation cycle, only the zones adjacent to a fish are checked to see whether the fish has passed into them; the remaining zones are not processed. The zone covering the underwater portion of the fish is then rotated and transformed to mimic the appearance of refraction.
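A minimal sketch of the zoning idea, assuming a fixed grid over the water surface (the grid size, pond dimensions, and function names are all hypothetical, not taken from the project):

#include <cstdio>

const int   GRID      = 8;       // 8 x 8 zones over the water surface
const float POND_SIZE = 40.0f;   // pond spans [-20, 20] in x and z

// Map a surface position to a zone (column, row) index.
void zoneIndexFor(float x, float z, int& col, int& row) {
    col = (int)((x + POND_SIZE * 0.5f) / POND_SIZE * GRID);
    row = (int)((z + POND_SIZE * 0.5f) / POND_SIZE * GRID);
    if (col < 0) col = 0; if (col >= GRID) col = GRID - 1;
    if (row < 0) row = 0; if (row >= GRID) row = GRID - 1;
}

// True if (col,row) is the fish's zone or one of its 8 neighbours;
// only these zones need the refraction transform applied this cycle.
bool zoneNeedsUpdate(int col, int row, int fishCol, int fishRow) {
    int dc = col - fishCol, dr = row - fishRow;
    return dc >= -1 && dc <= 1 && dr >= -1 && dr <= 1;
}

int main() {
    int fc, fr;
    zoneIndexFor(3.2f, -11.7f, fc, fr);              // the fish's current zone
    for (int r = 0; r < GRID; ++r)
        for (int c = 0; c < GRID; ++c)
            if (zoneNeedsUpdate(c, r, fc, fr))
                std::printf("process zone (%d, %d)\n", c, r);
}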
Outcome
However, things did not come together by the end of the quarter. I could not get the fish model to look right in my program, and the water layer refraction created holes in the bottom of the pond bed. Instead of a hierarchical model, I used a vertex transform that displaced all of the model's vertices along a sine wave, with the phase offset by each vertex's distance along the z-axis. The final product looked like a fluid swimming motion rather than joint rotations. The water's partial opacity worked when lighting was turned off, but the surface became fully opaque when lighting mode was switched on.
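The swimming deformation can be sketched as below. This is a rough reconstruction; the mesh layout, amplitude, wavelength, and speed values are assumptions for illustration rather than the project's actual numbers.

#include <cmath>
#include <vector>

struct Vertex { float x, y, z; };

// restMesh holds the undeformed fish; time is the animation clock.
std::vector<Vertex> applySwimWave(const std::vector<Vertex>& restMesh, float time,
                                  float amplitude = 0.15f, float wavelength = 2.0f,
                                  float speed = 4.0f) {
    std::vector<Vertex> deformed = restMesh;
    for (Vertex& v : deformed) {
        // The phase shifts with z, so vertices farther toward the tail swing
        // later, making the body undulate instead of bending at joints.
        float phase = speed * time + v.z * (2.0f * 3.14159265f / wavelength);
        v.x += amplitude * std::sin(phase);
    }
    return deformed;
}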
School of Fish Simulation: Collision Avoidance
Introduction
Instead of a ray tracer, I decided to try my hand at a simulation of fish schooling behavior. Within the constraints of a "fish tank", individual fish are spawned at random positions with random direction vectors. At each animation step, a fish samples its local surroundings within a short field of view for other fish and attempts to match their overall average direction. It then locates the center of the largest cluster of fish in its field of view, estimated as the center of mass of the nearby fish positions, and gravitates toward that center. Finally, if it calculates that it will collide with the hitbox of another fish or with the fish tank boundaries, it moves in the opposite direction to avoid the collision. To precipitate changes in movement, a random point signifying an "area of interest" is created every 20 animation steps; this slightly and uniformly pulls every fish's direction vector toward the area of interest. These four criteria together produce the new direction vector for each individual fish and, hopefully, emulate fish school behavior.
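A minimal sketch of that per-fish update, assuming simple weighted steering (the weights, radii, and type names are my own, and tank-boundary avoidance is omitted for brevity):

#include <cmath>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

static Vec3  add(Vec3 a, Vec3 b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  sub(Vec3 a, Vec3 b)  { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float len(Vec3 a)          { return std::sqrt(a.x*a.x + a.y*a.y + a.z*a.z); }
static Vec3  norm(Vec3 a)         { float l = len(a); return l > 0 ? mul(a, 1.0f / l) : a; }

struct Fish { Vec3 pos, dir; };

const float VIEW_RADIUS  = 3.0f;   // the "short field of view"
const float AVOID_RADIUS = 0.8f;   // hitbox distance that triggers avoidance

Vec3 newDirection(const Fish& self, const std::vector<Fish>& school,
                  Vec3 areaOfInterest) {
    Vec3 avgDir, centre, avoid;
    int neighbours = 0;
    for (const Fish& other : school) {
        Vec3 offset = sub(other.pos, self.pos);
        float d = len(offset);
        if (d == 0.0f || d > VIEW_RADIUS) continue;    // outside field of view
        avgDir = add(avgDir, other.dir);               // alignment
        centre = add(centre, other.pos);               // cohesion
        if (d < AVOID_RADIUS)                          // collision avoidance
            avoid = add(avoid, mul(offset, -1.0f / d));
        ++neighbours;
    }
    Vec3 steer = self.dir;
    if (neighbours > 0) {
        steer = add(steer, mul(norm(avgDir), 0.5f));
        steer = add(steer, mul(norm(sub(mul(centre, 1.0f / neighbours), self.pos)), 0.3f));
        steer = add(steer, mul(avoid, 0.8f));
    }
    // Small uniform pull toward the current area of interest.
    steer = add(steer, mul(norm(sub(areaOfInterest, self.pos)), 0.1f));
    return norm(steer);
}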
Outcome
Again, I did not have enough time to fully complete my implementation. I had fish spawn at random locations (represented by red square placeholders) and cluster as they swam together away from the center of the screen. The fish tank boundaries, the area of interest, and collision avoidance have not been implemented.