Week 9: Particle Systems

Reading

Monday

This week we will look at particle systems, a method for creating visual effects such as smoke, fire, fountains, weather, crowds, or anything else that can be modeled as a group of entities interacting under internal and external forces. We’ll start off small and try to explore some new WebGL techniques along the way.

Our initial example will be a fountain. A typical particle system has one or more emitters. An emitter is a source of particles. Over time, new particles are created at the emitter with various properties. Common particle properties include the initial velocity of the particle, the lifetime of the particle, and the particle color. Once emitted, we trace the particle through the scene for its lifetime, at which point it can be recycled and perhaps re-emitted.

In our initial demo, we will have a single fixed emitter near the center of the scene. We will create a number of particles with an initial velocity vector that is generally upwards, sampled within a cone of possible velocity vectors. Additionally, we will stagger the start times of the particles to create a continuous stream or a short burst, instead of emitting all particles at once.
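One common way to sample such a cone around the up axis (the notation here is our own, not something the demo requires) is to pick a polar angle θ up to a maximum spread and an azimuthal angle φ around the axis:

\[v_0 = s \, (\sin\theta \cos\phi, \; \cos\theta, \; \sin\theta \sin\phi), \quad \theta \in [0, \theta_{max}], \quad \phi \in [0, 2\pi)\]

where s is the particle’s initial speed. Sampling θ and φ uniformly biases the directions toward the cone’s axis, which is often fine, or even desirable, for a fountain.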

We pack the particle info into a vertex buffer object, but note that this buffer has no geometry/position information. Nevertheless, we will pass the properties of each particle on to the vertex shader through this buffer and have the vertex shader create the position needed for the rest of the pipeline.
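Concretely, the vertex shader inputs for this buffer might look like the following (the attribute names are our own sketch; use whatever layout you packed):

attribute vec3 a_velocity; // initial velocity sampled from the cone
attribute float a_start_time; // when this particle is (re)emitted
attribute vec3 a_color; // per-particle color

Note that there is no position attribute; the shader computes the position itself.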

Initially we will use the gl.POINTS rendering mode. Within the vertex shader, we can write to the built-in gl_PointSize variable to control the size of the points.
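For example, one simple rule (u_point_size is a hypothetical uniform giving a base size in pixels) that shrinks points with distance:

// Inside main(), after gl_Position has been computed.
gl_PointSize = u_point_size / gl_Position.w;

Dividing by the clip-space w component gives a rough perspective falloff; a constant size also works.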

We can apply basic kinematics to compute the position of a particle, relative to the emitter, given its initial velocity and the time since it was emitted.

\[p = v_0 t + \frac{1}{2} a t^2\]

We can limit a particle to a given lifetime and recycle particles by using GLSL’s mod function, resetting each particle back to the emitter at the end of its lifetime.

float t = mod(u_time - a_start_time, u_lifetime);
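Putting the pieces together, a minimal vertex shader might look like this (the uniform and attribute names are our own sketch):

attribute vec3 a_velocity;
attribute float a_start_time;

uniform mat4 u_matrix; // combined projection and view matrix
uniform vec3 u_emitter; // emitter position in world space
uniform vec3 u_acceleration; // e.g. gravity, vec3(0.0, -9.8, 0.0)
uniform float u_time; // elapsed time in seconds
uniform float u_lifetime; // lifetime of each particle

void main() {
  // Wrap time so the particle restarts at the emitter after u_lifetime.
  float t = mod(u_time - a_start_time, u_lifetime);

  // Basic kinematics, offset from the emitter.
  vec3 p = u_emitter + a_velocity * t + 0.5 * u_acceleration * t * t;

  gl_Position = u_matrix * vec4(p, 1.0);
  gl_PointSize = 8.0;
}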

Wednesday

A couple of brief observations/comments on the raymarcher project.

  • Yes, technically you can use bracket notation to index the components of a vec3 in GLSL, but almost nobody does this in practice. Please try to use .x, .y, .z, or the rgb/stp alternatives. They tend to convey more meaning, in addition to requiring fewer keystrokes and no shift key. GLSL is generally trying to make it easy on graphics people trying to do graphics things.

  • There is a lot of confusion about and misuse of the material IDs flowing through sceneSDF and rayMarch. The vec2 returned by sceneSDF attaches a material or object ID to the basic geometry SDF functions, which only need to return floats. Inside sceneSDF you are absolutely free to make up the material IDs based on your preferences. In the rayMarch function, however, you should return the same ID returned by sceneSDF when you are close to a surface, or -1.0 when you do not hit anything. Once you return the vec2 from rayMarch, you can look up the real Material properties by ID using getMaterial. The mapping of IDs to colors is again up to you. (See the sketch below.)
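As a rough sketch of that flow (the names sceneSDF and getMaterial come from the project, but the loop bounds, thresholds, and exact signature here are just one plausible arrangement; note the .x/.y swizzles rather than bracket indexing):

// sceneSDF returns vec2(distance, materialID); the IDs are your choice.
vec2 rayMarch(vec3 ro, vec3 rd) {
  float t = 0.0;
  for (int i = 0; i < 100; i++) {
    vec2 scene = sceneSDF(ro + t * rd);
    if (scene.x < 0.001) {
      return vec2(t, scene.y); // hit: pass the material ID along
    }
    t += scene.x;
    if (t > 100.0) { break; }
  }
  return vec2(t, -1.0); // miss: no material
}

Back in the caller, if the returned ID is nonnegative, getMaterial can retrieve the actual material for shading.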

Particle Systems and alpha blending

We’ll try to keep it a bit light today. I don’t think graphics is the top agenda item on most people’s minds in the US today. We’ll revisit our particle system from Monday and explore ways to add textures to the particles even though we are only drawing points. Then we’ll look at ways to use the fourth component of the fragment color, the alpha channel, to do some blending and transparency effects.
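One standard tool for texturing points is the built-in gl_PointCoord, a vec2 that runs from (0,0) to (1,1) across the square of each rasterized point and can be used directly as a texture coordinate in the fragment shader (u_texture is a hypothetical sampler uniform):

precision mediump float;

uniform sampler2D u_texture;

void main() {
  // gl_PointCoord sweeps across the point sprite, so each point
  // gets a full copy of the texture.
  vec4 texColor = texture2D(u_texture, gl_PointCoord);
  gl_FragColor = texColor; // the alpha channel feeds the blend stage
}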

Blending can get tricky when combining opaque and transparent objects. The order in which objects are rendered can have a significant impact on the final appearance. See the Blending tutorial on Learn OpenGL for a more detailed discussion.
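With the common gl.SRC_ALPHA / gl.ONE_MINUS_SRC_ALPHA configuration, for example, the incoming (source) fragment is mixed with the color already in the framebuffer (destination):

\[C_{result} = \alpha_{src} C_{src} + (1 - \alpha_{src}) C_{dst}\]

Since the destination term depends on what has already been drawn, the result is inherently order dependent; a standard approach is to draw opaque objects first, then transparent ones sorted back to front.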