The raytracer supports four parameterized primitives: spheres, planes, boxes, and triangles. To sample a primitive's color, it translates three-dimensional intersection points into two-dimensional texture coordinates.
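As one illustration of this mapping, here is a minimal sketch of the standard spherical-coordinate UV mapping for a point on a unit sphere. The names (`Vec3`, `sphereUV`) are illustrative, not the raytracer's actual API; it assumes the sphere is centered at the origin with unit radius.

```cpp
#include <cmath>

// Illustrative types, not the project's actual code.
struct Vec3 { double x, y, z; };
struct UV   { double u, v; };

const double PI = std::acos(-1.0);

// Map a point on a unit sphere (assumed |p| = 1) to (u, v) in [0, 1]^2
// using spherical angles: azimuth around y, polar angle from y.
UV sphereUV(const Vec3& p) {
    double theta = std::atan2(p.z, p.x);       // azimuth, in [-pi, pi]
    double phi   = std::acos(p.y);             // polar angle, in [0, pi]
    double u = (theta + PI) / (2.0 * PI);      // normalize to [0, 1]
    double v = phi / PI;                       // normalize to [0, 1]
    return {u, v};
}
```

A point on the sphere's equator facing the +x axis, for example, maps to the center of the texture, (0.5, 0.5).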
Mipmapping iteratively filters a texture into progressively coarser levels of detail. It supplements pixel filtering when ray density is low, and the effect is a reduction in noise. The image shows the different mipmap levels as different colors on the plane.
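The iterative filtering step can be sketched as a 2x2 box-filter downsample, which is the usual way each successive mip level is produced. This is a hedged example, not the project's implementation; it assumes a square grayscale texture whose side length is a power of two, stored row-major.

```cpp
#include <vector>
#include <cstddef>

// Build the next (coarser) mip level by averaging each 2x2 block of
// texels. 'tex' is a side x side grayscale image, row-major.
std::vector<float> downsample(const std::vector<float>& tex, std::size_t side) {
    std::size_t half = side / 2;
    std::vector<float> out(half * half);
    for (std::size_t y = 0; y < half; ++y) {
        for (std::size_t x = 0; x < half; ++x) {
            // Average the four texels covering this output texel.
            float sum = tex[(2 * y)     * side + 2 * x]
                      + tex[(2 * y)     * side + 2 * x + 1]
                      + tex[(2 * y + 1) * side + 2 * x]
                      + tex[(2 * y + 1) * side + 2 * x + 1];
            out[y * half + x] = sum / 4.0f;
        }
    }
    return out;
}
```

Applying this repeatedly until the texture is a single texel yields the full mip chain.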
Ray differentials are used to select the appropriate mipmap level to sample. They are based on a first-order Taylor expansion of the ray equations: the differentials approximate the size of a ray's footprint on an intersected object or, equivalently, the approximate distance to a neighboring ray.
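Once the footprint is approximated, level selection typically reduces to a base-2 logarithm of the footprint size measured in base-level texels. The sketch below shows this final step under that assumption; `footprintTexels` is a made-up parameter name for the footprint width that the ray differentials would supply.

```cpp
#include <cmath>
#include <algorithm>

// Pick a mip level so that the ray's footprint covers roughly one texel
// at the chosen level. footprintTexels is the approximate footprint
// width measured in base-level (level 0) texels.
int selectMipLevel(double footprintTexels, int maxLevel) {
    if (footprintTexels <= 1.0) return 0;  // fits inside one texel: full detail
    int level = static_cast<int>(std::floor(std::log2(footprintTexels)));
    return std::min(level, maxLevel);      // clamp to the coarsest level
}
```

A footprint four texels wide selects level 2, since each level-2 texel covers a 4x4 block of base-level texels.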
Finally, the raytracer uses a bounding volume hierarchy (BVH) to speed up intersection tests, and OpenMP to thread the eye-ray casts across CPU cores. Lastly, it applies pixel filtering and gamma correction to the final image.
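The OpenMP threading of eye-ray casts can be sketched in a few lines: because each pixel is independent, a single pragma distributes the scanlines across threads. This is a minimal sketch, assuming a hypothetical `tracePixel(x, y)` that returns a scalar radiance value (a stand-in, not the project's actual function).

```cpp
#include <vector>

// Stand-in for the real per-pixel eye-ray trace; returns a dummy value.
float tracePixel(int x, int y) { return static_cast<float>(x + y); }

std::vector<float> render(int width, int height) {
    std::vector<float> image(width * height);
    // Each row is independent, so rows can be distributed across CPU
    // threads. Without -fopenmp the pragma is ignored and the loop runs
    // serially, producing identical output.
    #pragma omp parallel for
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            image[y * width + x] = tracePixel(x, y);
    return image;
}
```

Writing each result to a distinct index keeps the loop free of data races, so no synchronization is needed beyond the implicit barrier at the end of the parallel region.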
I tried to make a scene that looked underwater, using a thousand spheres and several planes. One of the planes is texture-mapped with a watery texture. The spheres are both refractive and reflective.