Final Ray Traced Renders and Pokemon Snap VR Project

by Reed Garmsen for CPE 473

Overview of the Final Renders

For my final three renders from the ray tracer I'd been building all quarter, I wanted to emphasize both its more distinctive features and its ability to render reflections and refractions, both of which I thought worked well. I also wanted to create my own scenes that were artistically pleasing and interesting. As a final note, all of the renders below were created using the anti-aliasing feature of my ray tracer, which allows for much smoother edges on the scene geometry.
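
The anti-aliasing works by shooting several jittered rays through each pixel and averaging the results, so hard geometric edges get blended instead of stair-stepping. A simplified sketch of that idea (the types and helper functions here are illustrative stand-ins, not my actual code):

```cpp
#include <random>

struct Color { float r, g, b; };
struct Ray   { /* origin and direction */ };

// Stand-ins for the tracer's own routines (declarations only, for illustration).
Color TraceRay(const Ray &ray);              // full trace of one primary ray
Ray   MakeCameraRay(float px, float py);     // camera ray through pixel coords (px, py)

// Average an n x n grid of jittered samples inside the pixel (x, y).
Color ShadePixel(int x, int y, int n, std::mt19937 &rng) {
    std::uniform_real_distribution<float> jitter(0.0f, 1.0f);
    Color sum{0.0f, 0.0f, 0.0f};
    for (int i = 0; i < n; ++i) {
        for (int j = 0; j < n; ++j) {
            // Jitter each sample within its own sub-pixel cell.
            float px = x + (i + jitter(rng)) / n;
            float py = y + (j + jitter(rng)) / n;
            Color c = TraceRay(MakeCameraRay(px, py));
            sum.r += c.r; sum.g += c.g; sum.b += c.b;
        }
    }
    float count = float(n * n);
    return {sum.r / count, sum.g / count, sum.b / count};
}
```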

Other techniques implemented in my final ray tracer but not demonstrated below include support for multiple lights, shape transforms, and a bounding volume hierarchy (BVH) data structure to improve performance. Also, while it is subtle, Schlick's approximation is present in the refractions shown below. All three images were rendered at a 640 x 480 resolution.
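
For reference, Schlick's approximation estimates the Fresnel reflectance, that is, how much of the light hitting a refractive surface reflects rather than transmits, and it is cheap enough to evaluate per ray. A minimal, self-contained version (illustrative, not lifted from my tracer):

```cpp
#include <algorithm>
#include <cmath>

// Schlick's approximation of Fresnel reflectance:
//   R(theta) = R0 + (1 - R0) * (1 - cos(theta))^5,  where R0 = ((n1 - n2) / (n1 + n2))^2
// cosTheta is the cosine of the angle between the ray and the surface normal;
// n1 and n2 are the indices of refraction on either side of the surface.
float SchlickReflectance(float cosTheta, float n1, float n2) {
    float r0 = (n1 - n2) / (n1 + n2);
    r0 *= r0;
    cosTheta = std::clamp(cosTheta, 0.0f, 1.0f);
    return r0 + (1.0f - r0) * std::pow(1.0f - cosTheta, 5.0f);
}
```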

Render 1 (Source .pov file)

"The Jaws of Mordor"

How many pyramids are in the scene? Only two are physical (the two closest to the foreground); the rest are reflections! I liked how the reflections made this one turn out; I think it conveys a Mordor-esque feeling as well as looking really cool. From a technical/complexity perspective, this showcases my renderer's ability to reflect multiple surfaces and objects across multiple bounces.
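
Those repeated inter-reflections don't need any special handling: each reflective hit simply spawns another ray recursively, capped at a maximum bounce depth so two mirrored surfaces facing each other can't recurse forever. A rough sketch of that structure (the scene and shading helpers are illustrative declarations, not my actual code):

```cpp
struct Vec3  { float x, y, z; };
struct Color { float r, g, b; };
struct Ray   { Vec3 origin, dir; };
struct Hit   { Vec3 point, normal; float reflectivity; };

// Stand-ins for the tracer's own routines (illustrative declarations only).
bool  Intersect(const Ray &ray, Hit &hit);        // closest hit against the scene
Color LocalShade(const Hit &hit);                 // diffuse/specular lighting at the hit
Ray   ReflectRay(const Ray &ray, const Hit &hit); // mirror the ray about the normal
Color Background();

// Each reflective bounce is just another recursive call, limited by maxDepth.
Color TraceRay(const Ray &ray, int depth, int maxDepth) {
    Hit hit;
    if (!Intersect(ray, hit)) return Background();
    Color color = LocalShade(hit);
    if (hit.reflectivity > 0.0f && depth < maxDepth) {
        Color bounce = TraceRay(ReflectRay(ray, hit), depth + 1, maxDepth);
        color.r += hit.reflectivity * bounce.r;
        color.g += hit.reflectivity * bounce.g;
        color.b += hit.reflectivity * bounce.b;
    }
    return color;
}
```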

Render 2 (Source .pov file)

"Camera Refraction"

Originally my goal was to build some sort of physical camera in the scene (hence the sphere in the box). However, I ended up sticking with a simpler approach because I thought it looked cooler. Here, refraction comes into play along with reflection. The left sphere produces a cool morphing effect on the edge of the box and on its shadow, which shows off my renderer's ability to both reflect and refract spheres. I also really liked how the refractive sphere looked embedded in the box and how my renderer was able to handle that.
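
The bending you see through the glassy sphere comes from Snell's law: the refracted direction depends on the ratio of the indices of refraction, and when the term under the square root goes negative the ray reflects internally instead. A small sketch of that computation (illustrative, with a minimal vector type):

```cpp
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };
static Vec3  operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  operator*(float s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }
static float dot(Vec3 a, Vec3 b)        { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Refract a unit incident direction d through a surface with unit normal n,
// where eta = n1 / n2 (ratio of indices of refraction).
// Returns nothing when the term under the square root is negative
// (total internal reflection), in which case only a reflected ray is traced.
std::optional<Vec3> Refract(Vec3 d, Vec3 n, float eta) {
    float cosI = -dot(d, n);
    float k = 1.0f - eta * eta * (1.0f - cosI * cosI);
    if (k < 0.0f) return std::nullopt;
    return eta * d + (eta * cosI - std::sqrt(k)) * n;
}
```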

Render 3 (Source .pov file)

"Staring Past It All"

While I had issues getting my global illumination model fully working with some shapes, I was able to implement a simple depth of field effect as part of my ray tracer for Part 5. This render showcases that effect as well as how reflection and refraction mix with it. I think the combination of the middle sphere's reflection with the other blurred object creates a cool effect. I also wanted to create a "lighter" picture, as my other renders had all turned out pretty dark! Finally, of the three renders, this one took the longest at 2.89 minutes. It was rendered with 256 sample rays per pixel for anti-aliasing (not counting any additional reflective or refractive rays) and showcases the BVH performance improvement; without the BVH, the render would most likely have taken much longer.
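
The depth of field effect follows the usual thin-lens idea: instead of firing every primary ray from a single eye point, the ray origin is jittered across a small aperture and each jittered ray is re-aimed at the same point on the focal plane, so only objects near that plane stay sharp. A sketch of generating one such ray (illustrative names; assumes the pixel direction is normalized and that right/up are the camera's basis vectors):

```cpp
#include <random>

struct Vec3 { float x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator*(float s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }

struct Ray { Vec3 origin, dir; };

// Thin-lens depth of field: offset the ray origin within the aperture and aim
// it at the point where the original pixel ray crosses the focal plane.
Ray DepthOfFieldRay(Vec3 eye, Vec3 pixelDir, Vec3 right, Vec3 up,
                    float focalDist, float aperture, std::mt19937 &rng) {
    std::uniform_real_distribution<float> u(-1.0f, 1.0f);
    // Rejection-sample a point on the unit disk, then scale by the aperture size.
    float dx, dy;
    do { dx = u(rng); dy = u(rng); } while (dx * dx + dy * dy > 1.0f);
    Vec3 focalPoint = eye + focalDist * pixelDir;   // where all samples converge
    Vec3 origin = eye + (aperture * dx) * right + (aperture * dy) * up;
    return {origin, focalPoint - origin};           // direction left unnormalized
}
```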

What I Learned From Making a Ray Tracer

Rendering is hard. To be more specific, while remaining brief (going over everything I learned would take a while), I learned a whole lot about the math that makes rendering work, from the goal of the rendering equation to the specifics of how to implement shape intersections. I also learned a lot about software development and design in C++ (and in general). This has been by far my longest single project at school, and I really started to feel the effects of "technical debt" toward the end of it. Also, just getting some real work done in C++ from scratch was awesome.
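
As an example of the kind of math involved, the most basic shape intersection, ray versus sphere, reduces to solving a quadratic: substitute the ray p(t) = o + t*d into the sphere equation |p - c|^2 = r^2 and keep the nearest positive root. A self-contained sketch:

```cpp
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };
static Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b)       { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Intersect a ray (origin o, direction d) with a sphere (center, radius).
// Returns the nearest positive hit distance t, or nothing on a miss.
std::optional<float> IntersectSphere(Vec3 o, Vec3 d, Vec3 center, float radius) {
    Vec3 oc = o - center;
    float a = dot(d, d);
    float b = 2.0f * dot(oc, d);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - 4.0f * a * c;
    if (disc < 0.0f) return std::nullopt;          // ray misses the sphere
    float sqrtDisc = std::sqrt(disc);
    float t = (-b - sqrtDisc) / (2.0f * a);        // nearer root first
    if (t < 0.0f) t = (-b + sqrtDisc) / (2.0f * a);
    if (t < 0.0f) return std::nullopt;             // sphere is behind the ray
    return t;
}
```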


Overview of My Final Project: Pokemon Snap VR

Screenshot of Pokemon Snap VR

As soon as I saw the design of the physical Google Cardboard unit and how much its button made it resemble an actual camera, I knew what I wanted to make for my final project in VR. If you are not familiar, Pokemon Snap was a game released for the Nintendo 64 in 1999 in which the player rode around in a cart taking pictures of Pokemon in various environments. While not a particularly amazing game, it holds a fair amount of nostalgia for me, and it seemed ripe for reimagining (albeit much more simply) in VR!

The game works by moving the player forward (I tested this on my roommates to make sure no nausea was induced, as I wanted to avoid a "moving train" effect on players) while allowing them to look around 360 degrees using their head. The player must then snap photos of all four Pokemon in the scene by centering the in-game reticle on them and clicking the Cardboard button. Once the player has taken pictures of all the Pokemon, they can reset the score counter and play again.
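
The game itself is built in Unity, but conceptually the photo check just asks whether a Pokemon sits close enough to the center of the player's gaze when the button is clicked. A rough, language-agnostic sketch of that test (purely illustrative, written in C++ rather than the game's actual code):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b)       { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float length(Vec3 v)            { return std::sqrt(dot(v, v)); }

// Does the target fall within a small cone around the camera's forward vector?
// Compare the cosine of the angle to the target against the cone's half-angle.
bool ReticleOnTarget(Vec3 cameraPos, Vec3 cameraForward, Vec3 targetPos,
                     float coneHalfAngleRadians) {
    Vec3 toTarget = targetPos - cameraPos;
    float cosAngle = dot(cameraForward, toTarget) /
                     (length(cameraForward) * length(toTarget));
    return cosAngle >= std::cos(coneHalfAngleRadians);
}
```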

Some of the interesting technology I used to build the game included particle effects, transforms, materials, and, of course, VR. Most of the more complicated models and effects were taken from the free section of the Unity Asset Store, with the exception of the Pokemon-specific assets and the pools. The game was heavily modified from the "Ninja Attack" tutorial (see the references below under "Unity and Google Cardboard Tutorial/Resources").

What I Learned From Making a Game in VR

Rendering is hard. Wait, that was the first section! To be fair, that notion still applies here, just in real time instead of having as much time as necessary to render a single frame. Most of the challenge here was in setting up the tools and getting familiar with how Unity worked. After that, the next biggest challenge was understanding how making a game in VR changes the rules of good design and how to apply those new rules. But to be totally honest, 95% of the time was spent trying to get Unity to do what I wanted it to do (not a complaint; it ended up being a lot of fun)!


References

Pokémon is Copyright Game Freak, Nintendo, and The Pokémon Company 2001-2013.