# Global Illumination + Color Bleeding

## Rohin Chander

For my project, I set out to create a globally illuminated world. Originally, I planned to do this with photon mapping, but the memory limitations of the computer I was working on caused a lot of trouble getting it to work, so I settled on backwards ray tracing instead. I started from our Lab 4 code for a simple ray tracer and modified that program from there.

### How It Works

First, I create a box around the objects that I am trying to draw and shoot rays from the light to every integer point within this cube. If a ray hits an object, we calculate the Blinn-Phong shading at that point for the direct illumination, and we also begin the backwards ray tracing to figure out the color bleeding from indirect illumination (more on this in the next section). We then combine the direct and indirect illumination to get the color of the point the ray hit. Once the color for a point has been calculated, it is converted to pixel coordinates and sent to the image being generated.
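The shading step can be sketched roughly as follows. This is a minimal Python illustration, not the submitted code: `shade` and its clamping behavior are illustrative choices, and the indirect term is simply passed in rather than computed.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def blinn_phong(normal, to_light, to_eye, albedo, shininess=32.0):
    """Direct illumination: Lambertian diffuse plus a Blinn-Phong specular term."""
    diffuse = max(dot(normal, to_light), 0.0)
    # Half-vector between the light and view directions drives the highlight.
    half = normalize([a + b for a, b in zip(to_light, to_eye)])
    specular = max(dot(normal, half), 0.0) ** shininess
    return [c * diffuse + specular for c in albedo]

def shade(direct, indirect, albedo, indirect_weight=1.0):
    """Combine direct and indirect contributions, clamping each channel to 1."""
    return [min(d + indirect_weight * i * a, 1.0)
            for d, i, a in zip(direct, indirect, albedo)]
```

For a surface facing the light head-on, the diffuse and specular terms are both at their maximum, and the final color is the clamped sum of both contributions.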

### Color Bleeding From Indirect Illumination

I based my indirect illumination on the pseudocode found here: https://www.scratchapixel.com/lessons/3d-basic-rendering/global-illumination-path-tracing.

Essentially, this piece of the code receives the point that was hit by the initial light ray, along with the normal of the object at that point, starting at a depth of 0. I then create a half-sphere around that point and loop through it, shooting random rays (Monte Carlo sampling) through different points in the half-sphere. For each ray that hits something, the function calculates the color at the point it hit via a recursive call to the indirect illumination function. The color at that new point is then returned to the initial point, which combines it with the results of all the other rays, weighted by how close or far away each intersected point was. The simple idea can be represented in the following diagram:

Though this diagram does not show the recursive bounces that occur after a ray hits an object, it still gives the basic idea of how colors come back to a point on an object. The combined color represents the indirect illumination, which is then combined with the direct illumination.
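The hemisphere sampling and recursion described above could look roughly like this Python sketch. The `trace(origin, direction)` callback is a hypothetical stand-in for the renderer's existing intersection routine (returning the hit point, its normal, and the direct color there), and the weighting here is a simplified cosine term rather than the project's exact distance-based combination.

```python
import math
import random

def sample_hemisphere(normal, rng=random):
    """Uniformly sample a direction on the hemisphere around `normal`."""
    # Uniform sample on the unit hemisphere around +Z.
    r1, r2 = rng.random(), rng.random()
    sin_theta = math.sqrt(max(0.0, 1.0 - r1 * r1))
    phi = 2.0 * math.pi * r2
    local = (sin_theta * math.cos(phi), sin_theta * math.sin(phi), r1)
    # Build an orthonormal basis (t, b, normal) around the surface normal.
    nx, ny, nz = normal
    if abs(nx) > abs(ny):
        inv = 1.0 / math.sqrt(nx * nx + nz * nz)
        t = (nz * inv, 0.0, -nx * inv)
    else:
        inv = 1.0 / math.sqrt(ny * ny + nz * nz)
        t = (0.0, nz * inv, -ny * inv)
    b = (ny * t[2] - nz * t[1], nz * t[0] - nx * t[2], nx * t[1] - ny * t[0])
    # Rotate the local sample into world space.
    return tuple(local[0] * t[i] + local[1] * b[i] + local[2] * normal[i]
                 for i in range(3))

def indirect_illumination(point, normal, trace, depth, max_depth=2, n_samples=16):
    """Monte Carlo estimate of the light bounced onto `point` from other surfaces.

    `trace(origin, direction)` is a hypothetical scene function returning
    (hit_point, hit_normal, hit_color) or None, where hit_color stands for
    the direct shade at the hit.
    """
    if depth >= max_depth:
        return (0.0, 0.0, 0.0)
    total = [0.0, 0.0, 0.0]
    for _ in range(n_samples):
        direction = sample_hemisphere(normal)
        hit = trace(point, direction)
        if hit is None:
            continue
        hit_point, hit_normal, hit_color = hit
        # Recurse to gather light arriving at the point this ray hit.
        bounced = indirect_illumination(hit_point, hit_normal, trace, depth + 1)
        cosine = max(sum(d * n for d, n in zip(direction, normal)), 0.0)
        for i in range(3):
            total[i] += cosine * (hit_color[i] + bounced[i])
    return tuple(c / n_samples for c in total)
```

If every sample ray misses, the estimate is simply black; as rays start hitting colored surfaces, those surface colors bleed back into the point being shaded.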

### Results

#### Full Room Image:

Vanilla Ray Tracer:

Global illumination + Color Bleeding:

#### Close-ups:

Vanilla Ray Tracer:

Global illumination + Color Bleeding:

### Photon Mapping

I would also like to mention that I did some work with photon mapping as well, though due to the hardware limitations of my computer, I had trouble getting any results. I used https://graphics.stanford.edu/courses/cs348b-00/course8.pdf as my main source of information while working on this.

I was able to get a few photons emitting and hitting objects, but as soon as I tried to bounce photons, I started running into memory issues. This can be seen in the code that I submitted, where I simply commented out the parts of the photon mapping I had been working on. With a bit more time and hardware to support it, this is definitely something I would like to work on further in the future.
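For reference, photon emission with Russian roulette termination is one common way to cap the bounce chains that can otherwise exhaust memory. The Python sketch below is illustrative only, not the submitted code; `trace` is again a hypothetical intersection callback returning (hit_point, hit_normal, albedo) or None.

```python
import random

def emit_photons(light_pos, light_power, trace, n_photons=1000,
                 max_bounces=3, rng=random):
    """Emit photons from a point light and record each surface hit.

    Each stored record is (position, incoming_direction, power); a real
    photon map would store these in a kd-tree for radiance estimates.
    """
    photon_map = []
    for _ in range(n_photons):
        # Uniform random direction on the unit sphere (rejection sampling).
        while True:
            d = [rng.uniform(-1.0, 1.0) for _ in range(3)]
            length_sq = sum(x * x for x in d)
            if 0.0 < length_sq <= 1.0:
                break
        direction = [x / (length_sq ** 0.5) for x in d]
        origin = light_pos
        # Each photon carries an equal share of the light's power.
        power = [p / n_photons for p in light_power]
        for _ in range(max_bounces):
            hit = trace(origin, direction)
            if hit is None:
                break
            hit_point, hit_normal, albedo = hit
            photon_map.append((tuple(hit_point), tuple(direction), tuple(power)))
            # Russian roulette: continue bouncing with probability equal to
            # the average albedo, so chains terminate probabilistically
            # instead of growing without bound.
            p_continue = sum(albedo) / 3.0
            if rng.random() >= p_continue:
                break
            power = [pw * a / p_continue for pw, a in zip(power, albedo)]
            # Diffuse bounce: random direction flipped into the hemisphere
            # around the surface normal.
            while True:
                d = [rng.uniform(-1.0, 1.0) for _ in range(3)]
                length_sq = sum(x * x for x in d)
                if 0.0 < length_sq <= 1.0:
                    break
            direction = [x / (length_sq ** 0.5) for x in d]
            if sum(a * b for a, b in zip(direction, hit_normal)) < 0.0:
                direction = [-x for x in direction]
            origin = hit_point
    return photon_map
```

Because each surviving bounce divides the photon's power by the continuation probability, the expected power stays unbiased while the number of stored photons remains bounded.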

### What I Learned

- Monte Carlo ray tracing
- Backward ray tracing
- Indirect illumination color calculations
- How to represent color bleeding using only ray tracing
- Photon mapping

Overall, this project came with a fair number of headaches. Since it took around 30-40 minutes to generate an image, probably the most frustrating part was that it was hard to know whether I had done something wrong until after such a long wait. Through it all, though, I definitely feel like I accomplished something pretty cool, and it was a great feeling when the images started generating properly and I could see the results of that accomplishment.