In this section, we will extend our knowledge of shaders: we will see how to pass parameters from your C++ code, how to use Perlin noise, and how to process several images. The examples use fragment shaders, but all the principles extend to vertex and geometry shaders.
In order to make a shader interactive, we need a way to pass some parameters into it, such as time, mouse position, and arrays. To add a parameter, you need to declare it in the shader's code using the uniform keyword. For example, to declare the time parameter, use the following line:
uniform float time;
To specify the parameter's value in openFrameworks, you need to add the following line after the shader.begin() call:
shader.setUniform1f( "time", time );
The 1f suffix in the setUniform1f() function name means that you pass one float value to the shader. The first parameter, "time", is the parameter's name as declared in the shader. The second parameter, time, is a float variable holding the time value:
float time = ofGetElapsedTimef();
Let's illustrate this in a simple example.
This example uses a fragment shader to distort the geometry of an image. It transforms the image by shifting its horizontal lines by a sine wave, which also changes with time. This example is similar to the one described in the A simple geometrical distortion example section in Chapter 4, Images and Textures, but it is based on shaders, so it works much faster and the resultant image has no aliasing effect.
This is example 08-Shaders/02-ShaderHorizDistortion. The project is based on the example given in the A simple fragment shader example section.
The fragment shader's code is as follows:
#version 120
#extension GL_ARB_texture_rectangle : enable
#extension GL_EXT_gpu_shader4 : enable

uniform sampler2DRect texture0;
uniform float time;    //Parameter which we will pass from OF

void main(){
    //Getting the coordinates of the current pixel in texture
    vec2 pos = gl_TexCoord[0].st;

    //Changing pos by sinewave
    float amp = sin( pos.y * 0.03 );
    pos.x += sin( time * 2.0 ) * amp * 50.0;    //Shifting x-coordinate

    //Getting pixel color from texture tex0 in position pos
    vec4 color = texture2DRect( texture0, pos );

    //Output of shader
    gl_FragColor = color;
}
This shader sets its output value gl_FragColor to the color obtained by shifting the original position pos along the x axis. The amount of shifting depends on time as sin( time * 2.0 ) and on the y coordinate as sin( pos.y * 0.03 ).
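To see the distortion formula in isolation, here is a small CPU-side sketch of the same math in plain C++ (the function name horizShift is ours, for illustration only; it is not part of openFrameworks or the shader):

```cpp
#include <cmath>

// CPU reference of the shader's horizontal shift: for a pixel at row y
// and a given time, the shader adds sin(time*2.0) * sin(y*0.03) * 50.0
// to the x coordinate before reading the texture.
float horizShift(float y, float time) {
    float amp = std::sin(y * 0.03f);             // amplitude varies with the row
    return std::sin(time * 2.0f) * amp * 50.0f;  // oscillates with time, at most 50 px
}
```

At time 0 the shift is zero for every row (sin(0) = 0), and it can never exceed 50 pixels in magnitude, since both sine factors lie in [-1, 1].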
In the openFrameworks project, add the following lines to the draw() function's body, just after the shader.begin() line:
float time = ofGetElapsedTimef();
shader.setUniform1f( "time", time );
This code sets the time variable equal to the number of seconds elapsed since the application's start, and passes it to the shader's time parameter.
Running the project, you will see a waving sunflower image as shown in the following screenshot:
Play with different distortion functions. For example, find the following line in shaderFrag.c:
float amp = sin( pos.y * 0.03 );
Replace the preceding line with the following line:
float amp = sin( pos.x * 0.03 );
Sometimes it is necessary to pass to the shader not just a single float value but an array of floats. To do this, declare the array in the shader's code as follows:
#define N (256)
uniform float myArray[N];
Now bind the array from the openFrameworks project's code:
shader.setUniform1fv( "myArray", myArray, 256 );
In the preceding line of code, myArray is a float array with 256 elements.
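For instance, you could fill such an array on the CPU each frame before binding it. Here is a minimal sketch in plain C++ (the makeMyArray helper and its sine-wave contents are our illustration, not part of openFrameworks):

```cpp
#include <cmath>
#include <vector>

// Fill a 256-element float array, here with one period of a sine wave.
// In openFrameworks you would then bind it with:
//   shader.setUniform1fv( "myArray", myArray.data(), 256 );
std::vector<float> makeMyArray() {
    const int N = 256;   // must match "#define N (256)" in the shader
    std::vector<float> myArray(N);
    for (int i = 0; i < N; i++) {
        myArray[i] = std::sin(2.0f * 3.14159265f * i / N);
    }
    return myArray;
}
```

The array size passed to setUniform1fv() must equal the size declared in the shader, otherwise only part of the data will be available on the GPU side.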
It is a good idea to use Perlin noise in shaders (see Appendix B, Perlin Noise, for details on Perlin noise). Though the GLSL language specification has built-in functions noise1(), noise2(), noise3(), and noise4() for computing Perlin noise, most video cards return a zero value when these are called. So we need to implement it ourselves. Fortunately, there are several ready-to-use Perlin and simplex noise implementations for GLSL, which are open for use.
We will use the Perlin and simplex noise developed by Ashima Arts and Stefan Gustavson in the webgl-noise library located at https://github.com/ashima/webgl-noise. This library is distributed under the MIT license. Download and unpack the library, and then copy and paste the necessary functions right into your shader's code. Don't forget to include information about the license as requested in the library's description. Let's illustrate the usage of Perlin noise in an example.
Let's implement a fragment shader that shifts each pixel using Perlin noise. The resultant effect is a liquid-like waving of the input image.
The project is based on the 08-Shaders/02-ShaderHorizDistortion example, which was explained in the A simple geometrical distortion example section.
Replace the fragment shader's code with the following:
#version 120
#extension GL_ARB_texture_rectangle : enable
#extension GL_EXT_gpu_shader4 : enable

uniform sampler2DRect texture0;
uniform float time;

//Classic Perlin noise function declaration
float cnoise( vec3 P );

void main(){
    vec2 pos = gl_TexCoord[0].xy;

    //Shift pos using Perlin noise
    vec2 shift;
    shift.x = cnoise( vec3( pos*0.02, time * 0.5 + 17.0 ) )*30.0;
    shift.y = cnoise( vec3( pos*0.02, time * 0.5 + 12.0 ) )*30.0;
    pos += shift;

    vec4 color = texture2DRect( texture0, pos );

    //Output of the shader
    gl_FragColor = color;
}

//Insert src/classicnoise3D.glsl file contents here
//---------
Also, you need to add the definition of the cnoise() function by pasting the contents of the src/classicnoise3D.glsl file, located in the webgl-noise library, at the end of the shader's code.
The declared cnoise() function computes Perlin noise as a function of three parameters. We use it to compute the shift vector, which pseudo-randomly depends on the current position pos and on time (see Appendix B, Perlin Noise, for details). Then we shift pos and read the resulting color from this shifted position.
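Note that the two noise samples use different z offsets (17.0 and 12.0) purely to decorrelate the x and y shifts. The pattern can be sketched on the CPU in plain C++; here toyNoise is a deliberately crude hash-based stand-in for cnoise() (it is high-frequency hash noise, not smooth Perlin noise), used only to show the shape of the computation:

```cpp
#include <cmath>

// Toy deterministic "noise" in [-1, 1); a stand-in for cnoise(),
// good enough to illustrate the shift pattern, NOT real Perlin noise.
float toyNoise(float x, float y, float z) {
    float s = std::sin(x * 12.9898f + y * 78.233f + z * 37.719f) * 43758.5453f;
    return 2.0f * (s - std::floor(s)) - 1.0f;
}

// Shift a position the same way the shader does: two decorrelated
// noise samples (different z offsets), each scaled to +-30 pixels.
void liquifyShift(float px, float py, float time, float& outX, float& outY) {
    outX = px + toyNoise(px * 0.02f, py * 0.02f, time * 0.5f + 17.0f) * 30.0f;
    outY = py + toyNoise(px * 0.02f, py * 0.02f, time * 0.5f + 12.0f) * 30.0f;
}
```

With real Perlin noise the shifts change smoothly across neighboring pixels, which is what produces the liquid look; the hash version would instead scramble the image.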
Run the example, and you will see the liquid-like waving of the sunflower image as shown in the following screenshot:
For some effects, such as masking, the fragment shader should read colors from more than one image. To use several images, you should declare additional uniform sampler2DRect parameters in the shader's code:
uniform sampler2DRect texture1; //Second image
uniform sampler2DRect texture2; //Third image
//and so on
In the openFrameworks project's code, you should link your images' textures to these shader parameters, right after the shader.begin() call:
shader.setUniformTexture( "texture1", image2.getTextureReference(), 1 );
shader.setUniformTexture( "texture2", image3.getTextureReference(), 2 );
//and so on
Here, the first parameter is the name of the shader's uniform parameter, the second is the texture, and the third is the OpenGL texture identifier, which should be greater than 0, because the identifier 0 is used for the default binding to texture0 in the shader (see the Structure of a shader's code section for details on texture0).
Let's demonstrate the processing of several images by creating a fragment shader that masks the drawing image with some predefined mask.
The project is based on the 08-Shaders/02-ShaderHorizDistortion example, which was explained in the A simple geometrical distortion example section.
Create the fragment shader with the following code:
#version 120
#extension GL_ARB_texture_rectangle : enable
#extension GL_EXT_gpu_shader4 : enable

uniform sampler2DRect texture0;
uniform sampler2DRect texture1; //Second texture

void main(){
    vec2 pos = gl_TexCoord[0].xy;
    vec4 color0 = texture2DRect( texture0, pos );
    vec4 color1 = texture2DRect( texture1, pos );

    //Compute resulted color
    vec4 color;
    color.rgb = color0.rgb;
    color.a = color1.r;

    //Output of the shader
    gl_FragColor = color;
}
This shader assumes that both images have the same size, and uses the red component of the texture1 pixel to set the alpha value of the output color. To use the shader, make a grayscale mask, enable the shader, bind the mask to texture1, and then draw your fbo image. The pixels corresponding to black pixels in the mask will have zero alpha in the output picture, and so will be invisible.
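The masking rule itself is just per-pixel component copying. Here is a CPU sketch of the same rule in plain C++ (the Pixel struct and maskPixel function are our illustration, with components in [0, 1] as in a GLSL vec4):

```cpp
// RGBA pixel with components in [0, 1], like a GLSL vec4.
struct Pixel { float r, g, b, a; };

// The shader's masking rule: keep the image color,
// take the alpha from the red channel of the mask.
Pixel maskPixel(const Pixel& image, const Pixel& mask) {
    Pixel out = image;
    out.a = mask.r;   // black mask (r == 0) makes the pixel fully transparent
    return out;
}
```

Because only the red channel of the mask is read, the mask can be any grayscale image; intermediate gray values produce a smooth, semi-transparent border.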
See the full example code in 08-Shaders/04-ShaderMasking, where we use this shader to mask the sunflower image with a rotating triangle. The following screenshot shows the original image, the mask, and the result of applying the shader:
This is the end of the Creating video effects with fragment shaders section. Let's consider a concluding example, which combines music and images to obtain an audio-reactive visualization using a shader.
This example plays music and computes the spectrum array spectrum of the current sound (see the Getting spectral data from sound section in Chapter 6, Working with Sounds). This array is converted into an image spectrumImage, which is passed to the shader as texture2. Finally, the shader uses texture2 to affect the process of masking the two input images texture0 and texture1.
As a result, we obtain an animated picture that gleams and pulsates in time with the music beats:
You might ask why we pass the sound spectrum into the shader as a texture and not as a float array. The reason is simple: using a float array would result in steps in the output image, so some interpolation is needed to get a smooth result. Fortunately, GLSL performs smooth interpolation of textures, so we just represent the spectrum array as a texture and delegate the interpolation to GLSL.
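What linear texture filtering gives us for free is, in essence, per-sample linear interpolation. A CPU sketch of sampling a spectrum array at a fractional index (the sampleSpectrum helper is our illustration, not an openFrameworks or GLSL API):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Sample a spectrum array at a fractional index with linear interpolation,
// mimicking what GLSL's linear texture filtering does between texels.
float sampleSpectrum(const std::vector<float>& spectrum, float pos) {
    int i = (int)std::floor(pos);                        // left neighbor
    int j = std::min(i + 1, (int)spectrum.size() - 1);   // right neighbor, clamped
    float t = pos - i;                                   // fractional part in [0, 1)
    return spectrum[i] * (1.0f - t) + spectrum[j] * t;
}
```

Sampling halfway between two spectrum bins yields their average, so a shader reading the spectrum texture at arbitrary coordinates gets smooth values instead of a staircase.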
Until now, we have considered the basic capabilities of processing 2D images with fragment shaders. Now let's look at an example of using a vertex shader to deform 3D objects.