Simplex/Perlin Noise Shader

Sean Quinlan


Simplex Noise

Using the algorithm described in Stefan Gustavson's Simplex Noise Demystified (based on Ken Perlin's simplex noise), I implemented my own noise shader with a Blum Blum Shub pseudo-random number generator.

Alpha Blending

Uniforms let the host program configure the shader as needed. One important feature was transparency through alpha blending, using the diffuse lighting term to fade the texture.

Parametric Sphere Drawing

Because of alpha blending, I had to use an alternative way of drawing a sphere so that the back half was drawn before the front, and so that the texture coordinates could be smoothed to leave the sphere with no seams.



Introduction


The initial proposal for my project was to implement a skybox with a noise shader for clouds. Using GLSL, I did this quickly thanks to the built-in noiseN() functions of GLSL. These functions ran slowly, however, and reduced my framerate to ~3 fps, so I was forced to write my own implementation. I ended up making a simulation of a planet textured only with the output of a noise shader.

Simplex Noise


Instead of writing the standard Perlin noise algorithm as outlined by Ken Perlin, I decided to use his more efficient version: simplex noise. The main difference between the two algorithms is that simplex noise fills n-dimensional space with simplices, shapes with n+1 vertices (a triangle in 2D), instead of the hypercubes used in classic Perlin noise.
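The cell-lookup step on the simplex grid can be sketched as follows. This is an illustrative reconstruction of the standard 2D skew/unskew step from Gustavson's paper, not the project's actual shader code:

```python
import math

# 2D simplex skew/unskew constants.
F2 = 0.5 * (math.sqrt(3.0) - 1.0)  # skew factor
G2 = (3.0 - math.sqrt(3.0)) / 6.0  # unskew factor

def simplex_cell(x, y):
    """Skew (x, y) onto the simplex grid and return the base cell
    corner (i, j) plus the offset (x0, y0) from that corner."""
    s = (x + y) * F2          # skew the input point
    i = math.floor(x + s)     # cell origin in skewed space
    j = math.floor(y + s)
    t = (i + j) * G2          # unskew the cell origin
    x0 = x - (i - t)          # offset from the cell origin
    y0 = y - (j - t)          # in the original input space
    return i, j, x0, y0
```

The offsets then determine which of the two triangles of the cell contains the point, and the corners contribute attenuated gradient values.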

The algorithm was straightforward after reading Simplex Noise Demystified, but I was unable to use the permutation-table strategy for pseudo-randomness that Stefan Gustavson used. Instead, I implemented Blum Blum Shub to supply the randomness the noise function needs.
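A Blum Blum Shub generator is simple to sketch. The tiny primes below are illustrative only; a real generator uses large primes p and q, both congruent to 3 mod 4:

```python
def bbs(seed, p=11, q=23):
    """Blum Blum Shub: x_{n+1} = x_n^2 mod (p * q).
    p = 11 and q = 23 are toy primes (both 3 mod 4) for illustration."""
    m = p * q
    x = seed % m
    while True:
        x = (x * x) % m  # repeated squaring modulo m
        yield x
```

In a shader this would be computed inline with fixed constants rather than as a generator, but the recurrence is the same.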

A possible improvement to the noise function would be eliminating the visible multiples of the primes used; they create an obvious pattern that looks unnatural at large scales.

Alpha Blending


I decided that I wanted to model a planet in space, with a cloud layer that one could 'fly' through with the camera. This cloud layer would require alpha blending so that the sky is visible from the planet and the planet from space (through the cloud layer). Additionally, I wanted the cloud layer to fade to clear sky at night by applying a diffuse shading term to the alpha channel of the color in the fragment shader.
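The fade can be sketched as scaling the fragment's alpha by the diffuse term. The function name is hypothetical, but the idea matches the fragment-shader approach described above:

```python
def cloud_alpha(normal, light_dir, base_alpha):
    """Scale cloud alpha by the Lambertian diffuse term max(dot(N, L), 0),
    so clouds fade to clear sky on the night side.
    Both vectors are assumed to be unit length."""
    diffuse = max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)
    return base_alpha * diffuse
```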

Initially, I couldn't explain the banding effect that alpha blending created, until I read in the OpenGL documentation that blending happens after depth testing: fragments behind already-drawn geometry are discarded before they can blend, so transparent surfaces must be rendered back to front. This meant I had to draw the sphere (1) oriented towards the camera and (2) in halves, so that spheres could be drawn inside of spheres.
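The standard 'over' blend (glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)) shows why order matters: each fragment blends against whatever is already in the framebuffer, so anything behind a transparent fragment must already have been drawn. A minimal sketch:

```python
def blend_over(src_rgb, src_alpha, dst_rgb):
    """OpenGL-style 'over' blending: out = src * a + dst * (1 - a).
    The destination must already hold everything behind the fragment,
    which is why transparent geometry is drawn back to front."""
    return tuple(s * src_alpha + d * (1.0 - src_alpha)
                 for s, d in zip(src_rgb, dst_rgb))
```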

Parametric Sphere Drawing


Using a parametric sphere drawing method inspired by Paul Bourke's Spherical Texture Mapping technical descriptions made both drawing the sphere and texturing it far easier. I altered Paul's original algorithm significantly.

My first alteration changed the way the texture was mapped to the sphere. I didn't like how the texture was compacted near the poles of the drawn planet, so to smooth this out I added a factor of sin(theta) to the x texture coordinate. This has the effect of smoothing the poles, but there was still a seam in the texture at the positive x-axis.

To solve the seam (my second alteration), I had the x coordinate go from 0 to 1 and back to 0 again. This unfortunately creates a mirror image on either side of the sphere, but because one can only see one side of the sphere at a time, the effect is far less noticeable than one would imagine.
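The first two alterations can be sketched together. This is an illustrative reconstruction with hypothetical conventions (theta measured from the pole, phi around the axis), not the project's exact code:

```python
import math

def tex_coords(theta, phi):
    """theta in [0, pi] from the pole, phi in [0, 2*pi) around the axis.
    u runs 0 -> 1 -> 0 around the sphere (mirrored to hide the seam)
    and is scaled by sin(theta) to reduce pinching at the poles."""
    u = phi / math.pi           # 0 -> 2 over one full revolution
    if u > 1.0:
        u = 2.0 - u             # mirror back down: 0 -> 1 -> 0
    u *= math.sin(theta)        # compress toward the poles
    v = theta / math.pi
    return u, v
```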

My third, and most significant, alteration was to rotate the sphere towards the camera so that the back of the sphere would be drawn before the front for proper alpha blending. Finally, I attempted to remap the texture coordinates back to an unrotated sphere so that the sphere wouldn't appear to rotate with the camera. This was done by using the rotation matrix to find where a normal would lie in the rotated space, calculating theta and phi (spherical coordinates) from it, and then computing the proper texture coordinates.
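The remapping step can be sketched as recovering the spherical angles from a unit normal. This is an assumed reconstruction (z-up convention, hypothetical names):

```python
import math

def normal_to_angles(n):
    """Convert a unit normal (x, y, z) back to spherical angles:
    theta from the +z pole, phi around the axis in [0, 2*pi)."""
    theta = math.acos(max(-1.0, min(1.0, n[2])))    # clamp for safety
    phi = math.atan2(n[1], n[0]) % (2.0 * math.pi)  # wrap to [0, 2*pi)
    return theta, phi
```

Feeding these angles into the texture mapping yields coordinates in the unrotated frame.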

Unfortunately, I was unable to perform this task completely. I did reorient the poles successfully, but they shift slightly when the camera moves, and the cloud texture rotates around the poles along with the camera.


Lastly...


My demo creates an environment with natural depth without the use of static textures. I think this is impressive and shows the usefulness of noise in graphics for adding pseudo-random realism to a world.

In the future, I would like to add a few more things to this project: level of detail, more efficiency, and more functionality.

By level of detail, I mean that I could use the parametric equation for the sphere (since it is now oriented towards the camera) to place more vertices on the parts of the sphere closest to the viewer and fewer further away, adding detail where it is most visible. Additionally, the noise function could be used in the vertex shader to add terrain to the planet on this more highly detailed surface.

More efficiency would mean that the shader could be used on more polygons; at the moment, it seems a full solar system would greatly slow performance. More functionality would mean adding more parameters to the shader in the form of uniforms, allowing for more colors, adjustable blending, larger scales, a texture offset, and many more options for interesting applications.

In conclusion, I enjoyed the challenge of making an interesting space environment and hope it showcases my abilities in graphics programming.

Screen Shots


The basic output of the noise shader



The view from inside the cloud sphere.



The view from outside the cloud sphere.



Sean Quinlan - CSC 471 - Fall 2010