
LAB7: Programmable Shaders

This lab is based on a variety of sources, including the Ogre Shaders Wiki.

Discussion

The discussion thread for this lab is here: Lab 7 Discussion Thread

Goal

The goal of this lab is to understand how you can upload and execute simple shader programs on the GPU. Those programs will be associated with Ogre materials that you can assign to objects in a scene.

Preparation

From previous labs you should already have a Models folder inside the folder LabFiles. To keep your resources organized, you should now create another folder to hold your custom materials. Call this folder Materials and put it next to your Models folder. For your Ogre applications to find the contents of these two folders, make sure the following lines are in your resources_d.cfg:

FileSystem=../LabFiles/Models
FileSystem=../LabFiles/Materials

From now on we assume that all your custom materials and shader programs are stored in the Materials folder.

Lab Project

Follow these steps to complete the lab project:

  1. Create a New Project Create a new empty project called “Lab7” in the same way you have created projects for previous labs. You can use your own application, or you can use the starter code provided with this lab.
  2. Fixed Diffuse Color Fragment Shader The first shader program we create is a fragment shader that simply returns a fixed color for each fragment that gets processed. In Ogre you can write shader programs in any of the major high-level shading languages, but we will be using Cg. Create a new shader program file called diffuseshader.cg in your Materials folder and place the following code inside it:
    float4 main_orange_fp(in float3 TexelPos : TEXCOORD0) : COLOR {
        float4 oColor;
     
        // Return the same fixed, fully opaque orange color for every fragment
        oColor.r = 1.0;
        oColor.g = 0.8;
        oColor.b = 0.0;
        oColor.a = 1.0;
     
        return oColor;
    }

    Now that the shader program is ready, you have to instantiate it inside a material before you can apply it to objects in Ogre. Create a new material file called myshaders.material in the same folder and place the following material script inside it:

    fragment_program shader/orangeFP cg {
        source diffuseshader.cg
        entry_point main_orange_fp
        profiles ps_1_1 arbfp1
    }
    
    material shader/orange {
        technique {
            pass {           
                fragment_program_ref shader/orangeFP {
                }       
                texture_unit {       
                }             
            }
        }
    }

    Essentially, the new material shader/orange references the shader/orangeFP program definition, which in turn invokes the main_orange_fp function inside diffuseshader.cg. Since this is a fragment program, it is executed for every fragment rendered with this material. Finally, you simply assign this material to the entities in your Ogre application:

    _ogre->setMaterialName("shader/orange");
    _ground->setMaterialName("shader/orange"); 

    Verify that both your ground and your ogre model show up orange on the screen.

  3. Parametric Diffuse Color Fragment Shader You can use the same shader program in many materials and simply let each material pass a parameter into the program to tell it how to paint a given surface. This makes more sense than writing a new shader program every time you want to paint an object in a different color, for example. You can pass parameters into Cg shader programs through so-called uniform parameters. In the diffuseshader.cg file, add the following new shader program:
    float4 main_color_fp(in float3 TexelPos : TEXCOORD0, uniform float4 color) : COLOR {
        // Simply return the color that was passed in as a uniform parameter
        float4 oColor = color;
        return oColor;
    }

    This fragment shader expects a new parameter called color, and assigns that value to the color returned for this fragment. In the same material file as before, you now add the following shader program definition and material script:

    fragment_program shader/diffuseFP cg {
        source diffuseshader.cg
        entry_point main_color_fp
        profiles ps_1_1 arbfp1 
     
        default_params {
            param_named color float4 0.7 0.2 0.2 1.0
        }
    } 
     
    material shader/white {
        technique {
            pass {             
                fragment_program_ref shader/diffuseFP {
                    param_named color float4 0.8 0.8 0.8 1.0
                }    
                texture_unit {       
                }                      
            }
        }
    }

    Here you have created a new material shader/white that passes the color value <0.8,0.8,0.8,1.0> into the shader program referenced by the shader/diffuseFP program definition. Notice that if you omit the color value in the material script, the program definition will pass its own default value into the shader program (in this case <0.7,0.2,0.2,1.0>). Now use this material for your ogre model:

    _ogre->setMaterialName("shader/white");

    You can of course create several new materials now, each with a different color!
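    For example, a second material that reuses the same program definition but overrides the default color could look like this (the name shader/green and its color value are just examples):

    material shader/green {
        technique {
            pass {
                fragment_program_ref shader/diffuseFP {
                    param_named color float4 0.1 0.8 0.1 1.0
                }
                texture_unit {
                }
            }
        }
    }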

  4. Custom Diffuse Color Fragment Shader You may also want to control a shader program parameter from within the application code. To do this, you indicate that the parameter is a custom parameter. You do not have to change your diffuse fragment shader program (it already expects an external color parameter), but you do need to add a new shader program definition and a new material to the materials file:
    fragment_program shader/customFP cg {
        source diffuseshader.cg
        entry_point main_color_fp
        profiles ps_1_1 arbfp1 
        
        default_params {
            param_named_auto color custom 1
        }
    }
    
    material shader/custom {
        technique {
            pass {             
                fragment_program_ref shader/customFP {
                    param_named_auto color custom 1
                }          
                texture_unit {       
                }                     
            }
        }
    }

    With this you are essentially telling Ogre that whenever this material is applied to an object, it should look up the value in the object's custom parameter slot number 1 and pass it on to the fragment program. So, when you assign this material to entities in an Ogre application, you have to remember to set this custom parameter as well. This is how you do that (setCustomParameter takes the parameter slot number as its first argument):

    _ogre->getSubEntity(0)->setCustomParameter(1, Ogre::Vector4(0.0, 0.0, 1.0, 1.0));
    _ogre->setMaterialName("shader/custom");

    You should now have a blue ogre model.
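    Note that the custom parameter lives on the sub-entity, not on the entity as a whole. If your model consists of several sub-entities, a small loop such as this sketch sets the same value on all of them:

    // Give every sub-entity of the model the same custom color (slot 1)
    for (unsigned int i = 0; i < _ogre->getNumSubEntities(); ++i) {
        _ogre->getSubEntity(i)->setCustomParameter(1, Ogre::Vector4(0.0, 0.0, 1.0, 1.0));
    }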

  5. Texture Vertex and Fragment Shaders Painting an object in a single color is not particularly interesting. More commonly, we read diffuse color information from a texture as we process each fragment. We can do this if we supply each fragment with texture coordinates interpolated from the texture coordinates stored at the nearest vertices. To do texturing, we create two shader programs: (1) a vertex program that provides texture coordinates, and (2) a fragment program that uses the interpolated texture coordinates to return the right color value from a texture. Now create a new shader program file called textureshader.cg and place the following code inside:
    void main_vp(
            // Per-vertex information
            float4 vtx_position         : POSITION,     // Vertex position in model space
            float2 vtx_texcoord0        : TEXCOORD0,    // Texture UV set 0
            // Provided parameters
            uniform float4x4  mat_modelproj,
            // Shader outputs
            out float4 l_position       : POSITION,     // Transformed vertex position
            out float2 l_texcoord0      : TEXCOORD0)    // UV0
     
    {
        // Calculate output position (a vertex shader is expected to at least do this!)
        l_position = mul(mat_modelproj, vtx_position);
        // Simply copy the input vertex UV to the output
        l_texcoord0 = vtx_texcoord0;
    }
     
    void main_fp(
            // Interpolated fragment values
            float2 l_texcoord0        : TEXCOORD0,    // UV interpolated for current pixel
            // Provided parameters and data
            uniform sampler2D texture,        // Texture we're going to use
            // Shader output
            out float4 o_color    : COLOR)    // Output color we want to write
    {
        // Just sample texture using supplied UV
        o_color = tex2D(texture, l_texcoord0);
    }

    Now that the texture shader programs are ready, we need to instantiate them in an ogre material. First we provide the shader program definitions in our materials file:

    vertex_program shader/textureVP cg {
        source textureshader.cg         
        entry_point main_vp    
        profiles vs_1_1 arbvp1    
    
        default_params {
            param_named_auto mat_modelproj worldviewproj_matrix   
        }
    }
    
    fragment_program shader/textureFP cg {
        source textureshader.cg
        entry_point main_fp
        profiles ps_1_1 arbfp1  
    }

    Notice that we are passing one parameter, mat_modelproj, into the vertex shader. But instead of specifying a value, we simply give the name worldviewproj_matrix. This name refers to one of the values that an Ogre application can supply automatically to a shader program (that's why we use param_named_auto). We obviously can't know the world-view-projection matrix when we write the material, so we want it supplied at run-time instead. Finally, create the material that uses these two new shader program definitions and additionally loads a texture into the available texture unit (copy the actual texture from \OgreSDK_vc10_v1-8-1\media\materials\textures\Water02.jpg into your Materials folder so that your material can find it for sure):

    material shader/texture {
        technique {
            pass {      
                vertex_program_ref shader/textureVP {
                }
                fragment_program_ref shader/textureFP {
                }
                texture_unit {
                    texture Water02.jpg 2d            
                }                      
            }
        }
    }

    Apply this material to your _ground entity and make sure you see the texture on the ground!
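    Assigning it works just like before:

    _ground->setMaterialName("shader/texture");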

  6. Animated Vertex Shader To have a vertex shader do something a little more interesting, how about actually moving each vertex a little bit based on the time that passes? That way you can very cheaply animate vertices according to any formula you like! To do this, you need to pass a time parameter into the vertex shader. Luckily, time is one of the parameters that Ogre applications can provide automatically to a shader program. Add the following alternate texture vertex shader to textureshader.cg:
    void main_time_vp(
            // Per-vertex information
            float4 vtx_position         : POSITION,     // Vertex position in model space
            float2 vtx_texcoord0        : TEXCOORD0,    // Texture UV set 0
            // Provided parameters
            uniform float4x4  mat_modelproj,
            uniform float t,                            // Expecting time here
            // Shader outputs
            out float4 l_position       : POSITION,     // Transformed vertex position
            out float2 l_texcoord0      : TEXCOORD0)    // UV0
     
    {
        // Displace the vertical coordinate based on x-location and time
        float4 temp = vtx_position;
        temp.y = temp.y+cos(temp.x+t);
     
        // Calculate output position
        l_position = mul(mat_modelproj, temp);
        // Simply copy the input vertex UV to the output
        l_texcoord0 = vtx_texcoord0;
    }

    Now all you have to do is to supply an automatic time value in the shader program definition in the materials file:

    vertex_program shader/timetextureVP cg {
        source textureshader.cg         
        entry_point main_time_vp    
        profiles vs_1_1 arbvp1    
    
        default_params {
            param_named_auto mat_modelproj worldviewproj_matrix   
            param_named_auto t time 
        }
    }

    Now create a new material that uses this vertex program definition instead of the regular texture vertex program definition, and apply that material to a new plane; a sketch of such a material follows the code below.

    Ogre::Plane water(Ogre::Vector3::UNIT_Y, 0.0f);
    Ogre::MeshManager::getSingleton().createPlane("AnimatedWater", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME, water, 30, 30, 40, 40, true, 1, 10, 10, Ogre::Vector3::UNIT_Z);
    Ogre::Entity* waterEnt = _sceneManager->createEntity("waterEntity", "AnimatedWater");
    waterEnt->setMaterialName("shader/timetexture");  // the new material sketched below
    _sceneManager->getRootSceneNode()->createChildSceneNode(Ogre::Vector3(0, 0, -40))->attachObject(waterEnt);
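    Assuming you name the new material shader/timetexture (to match the code above), a minimal sketch simply swaps in the time-driven vertex program definition:

    material shader/timetexture {
        technique {
            pass {
                vertex_program_ref shader/timetextureVP {
                }
                fragment_program_ref shader/textureFP {
                }
                texture_unit {
                    texture Water02.jpg 2d
                }
            }
        }
    }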
  7. Per-Pixel Phong Shader Finally, let's try calculating the color value of a fragment based on an actual lighting model, such as the Phong lighting model. Since we will be calculating the lighting value inside each fragment, this is called per-pixel lighting. It basically means that instead of using interpolated color values from the nearby vertices, we use interpolated vector values (model-space vertex position, normal, view direction and light direction) to calculate the color value inside the fragment program. Create a new shader program file called lightingshader.cg and place the following code inside:
    // Cg
    void main_vp(
      float4 vtx_position       : POSITION,
      float3 vtx_normal         : NORMAL,
      float2 vtx_texcoord0      : TEXCOORD0,
     
      uniform float4x4 mat_modelproj,
      uniform float4   mspos_light,
      uniform float4   mspos_camera,
     
      out float4 l_position  : POSITION,
      out float2 l_texcoord0 : TEXCOORD0,
      out float3 l_N         : TEXCOORD1,
      out float3 l_L         : TEXCOORD2,
      out float3 l_V         : TEXCOORD3,
      out float3 l_P         : TEXCOORD4
    )
    {
      l_position = mul(mat_modelproj, vtx_position);
      l_texcoord0 = vtx_texcoord0;
     
      // The principal vectors for our Phong lighting model calculation:
      // L = light vector, N = vertex normal, V = view vector, R = light reflection vector
      l_N = vtx_normal;  // The normal of the vertex itself was passed in automatically
      // Ogre supplies the light and camera positions in model (object) space through
      // the "mspos_" parameters; everything here should be done in model space.
      // The w component of mspos_light indicates whether the light is a point light (w == 1)
      // or a directional light (w == 0), so we must check whether to calculate the light
      // direction from the position or use the supplied direction directly.
      if (mspos_light.w == 1.0) {
          l_L = normalize(mspos_light.xyz - vtx_position.xyz);
      } else {
          l_L = mspos_light.xyz;
      }
      l_V = normalize(mspos_camera.xyz - vtx_position.xyz);
      l_P = vtx_position.xyz;
      // We can't calculate the R vector here because it won't interpolate correctly for each fragment
      // (it relies on a dot product which complicates things for it), so we'll calculate it inside the 
      // fragment shader. The other vectors will all get interpolated and passed to the fragments.
     
    }
     
    void main_fp(
      float2 l_texcoord0    : TEXCOORD0,
      float3 l_N            : TEXCOORD1,
      float3 l_L            : TEXCOORD2,
      float3 l_V            : TEXCOORD3,
      float3 l_P            : TEXCOORD4,
     
      uniform float4 k_ambientc,
      uniform float4 k_diffusec,
      uniform float4 k_specularc,
     
      out float4 o_color : COLOR)
    {
      // Inside the fragment shader we receive the interpolated vectors.
      // Re-normalize them first, since interpolation does not preserve unit length.
      float3 N = normalize(l_N);
      float3 L = normalize(l_L);
      float3 V = normalize(l_V);
     
      // The diffuse attenuation depends on the angle at which the light hits the fragment
      float diffuse_attn = saturate(dot(L, N));
     
      // The specular attenuation depends on how closely the view direction
      // aligns with the light reflection vector
      float3 R = normalize(2*N*dot(N, L) - L);
      float specular_attn = pow(saturate(dot(R, V)), 6.0);
     
      // Combine the ambient, diffuse and specular terms of the Phong lighting model
      o_color = 0.2*k_ambientc + diffuse_attn*k_diffusec + specular_attn*k_specularc;
     
    }

    As you can see, we expect the application to pass the positions of both the camera and the light into the vertex shader (in model-space coordinates!). Luckily, these are available in the list of automatically provided values in Ogre. The shader program definition for the vertex shader is then:

    vertex_program shader/lightingVP cg {
        source lightingshader.cg
        entry_point main_vp
        profiles vs_1_1 arbvp1
        
        default_params {
            param_named_auto mat_modelproj worldviewproj_matrix
            param_named_auto mspos_light light_position_object_space 0
            param_named_auto mspos_camera camera_position_object_space
        }
    }

    Notice that the light position that is provided is indexed as light number 0. This refers to the light source closest to the object (which in this case is the point light, if you used the application code provided in the first step of this lab). We also expect the colors of our lighting model to be passed into the fragment shader; each material using this shader can provide its own values. This is the fragment shader program definition:

    fragment_program shader/lightingFP cg {
        source lightingshader.cg
        entry_point main_fp
        profiles ps_2_0 arbfp1
      
        default_params {
            param_named k_ambientc float4 0.5 0.5 0.5 1.0
            param_named k_diffusec float4 0.8 0.1 0.1 1.0
            param_named k_specularc float4 0.6 0.6 0.6 1.0
        }
    }

    Now create a new material that uses these two shader program definitions and assign the material to the ogre model; a minimal sketch follows below. You should see a properly shaded (albeit single-colored) model!
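    A minimal sketch of such a material, assuming you name it shader/lighting, could look like this:

    material shader/lighting {
        technique {
            pass {
                vertex_program_ref shader/lightingVP {
                }
                fragment_program_ref shader/lightingFP {
                }
            }
        }
    }

    Assigning it to the ogre model then works as before:

    _ogre->setMaterialName("shader/lighting");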

Bonus

  1. Create a render texture For many post-processing and special-effects shaders, we need to capture the current scene from some angle and draw it into a texture. This is done by telling a certain camera to draw the scene from its point of view into a texture, often referred to as a RenderTexture.
    1. Create a variable in MyApplication.h to hold the render texture:
      Ogre::RenderTexture*	_renderTexture;
    2. Create a member function that assigns a camera to the render texture and initializes it. This function must accept an Ogre::Camera* to indicate which camera to draw from and an Ogre::RenderWindow* to get the window size. Notice the name “RttTex”; this is the name we will use to address the texture in the material script. Call this function in the startup function after you create the camera.
      void MyApplication::createRenderTarget(Ogre::Camera* camera, Ogre::RenderWindow* window) {
       
      	Ogre::TexturePtr rtt_texture = Ogre::TextureManager::getSingleton().createManual("RttTex", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME, Ogre::TEX_TYPE_2D, window->getWidth(), window->getHeight(), 0, Ogre::PF_R8G8B8A8, Ogre::TU_RENDERTARGET);
      	_renderTexture = rtt_texture->getBuffer()->getRenderTarget();
       
      	_renderTexture->addViewport(camera);
      	_renderTexture->getViewport(0)->setClearEveryFrame(true);
      	_renderTexture->getViewport(0)->setBackgroundColour(Ogre::ColourValue::Black);
       
      	_renderTexture->setAutoUpdated(true);
      } 
    3. Now we must create a plane so that we will be able to see the rendered texture. Add the following member function:
       void MyApplication::createPlane() {
      	Ogre::ManualObject* manual = _sceneManager->createManualObject("Board");
      	manual->begin("custom/BoardRtt", Ogre::RenderOperation::OT_TRIANGLE_LIST);
       
      	manual->position(-6, -5, 0);
      	manual->textureCoord(0, 1);
       
      	manual->position(6, -5, 0);
      	manual->textureCoord(1, 1);
       
      	manual->position(6, 5, 0.0);
      	manual->textureCoord(1, 0);
       
      	manual->position(-6, 5, 0);
      	manual->textureCoord(0, 0);
       
      	manual->index(0);
      	manual->index(1);
      	manual->index(3);
      	manual->index(1);
      	manual->index(2);
      	manual->index(3);
       
      	manual->end();
      	Ogre::SceneNode* boardNode = _sceneManager->getRootSceneNode()->createChildSceneNode("Board");
      	boardNode->setPosition(3, 3, -10);
      	boardNode->attachObject(manual);
      }
    4. Now we must create the custom/BoardRtt material script :) We will make it a very simple material that uses the fixed-function vertex and fragment pipeline and accepts one texture. Alternatively, you can reuse the texture material made earlier, changing only the input texture to texture RttTex:
      material custom/BoardRtt
      {
      	technique
      	{
      		pass
      		{
                              // Leaving out the vertex and fragment program references
                              // makes Ogre fall back to the fixed-function pipeline.
                              
      			texture_unit {
                                      // Notice: This is the name we gave in the createRenderTarget method.
      				texture RttTex
      			}
      
      		}
      	}
      } 

      Now build and observe.

  2. Let's make something practical with the render target.
    1. Let's add a top-down view of the scene that will be displayed on our board. To do that, we create another camera, position it above the scene, and make it look down:
      Ogre::Camera* camera2 = _sceneManager->createCamera("Camera2");
      camera2->setPosition(Ogre::Vector3(0, 100, 0));
      camera2->lookAt(Ogre::Vector3(0, 0, 5));
      camera2->setNearClipDistance(5); 
    2. Now make sure to send the new camera into the function that creates the render target.
      createRenderTarget(camera2, window);

      Now build and behold :)

  3. Finally, let's create some water effects This is not an exhaustive method, but it points you in the general direction of distorting what is under the water :)
    1. Let's start by creating a plane for our water surface; make it a bit smaller than the floor and place it a few units higher.
       // Add the water surface
      Ogre::Plane waterSurface(Ogre::Vector3::UNIT_Y, -5);
      Ogre::MeshManager::getSingleton().createPlane("WaterSurface", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME, waterSurface,
      		200, 200, 1, 1, true, 1, 5, 5, Ogre::Vector3::UNIT_Z);
       
      _water = _sceneManager->createEntity("WaterPlane", "WaterSurface");
      _sceneManager->getRootSceneNode()->createChildSceneNode("_water")->attachObject(_water);
      _water->getParentSceneNode()->translate(Ogre::Vector3::UNIT_Y*3);

      Don't forget to add the member variable:

       Ogre::Entity* _water;
    2. Water surface If you think about a water surface, it has no actual colour of its own. Seen through our simple goggles, it generally has three elements: reflection, some colouring from light scattering in the water volume, and a distorted image of the bottom, since the water is mostly transparent.
    3. For this exercise we will skip the reflection :) and focus on the bottom. The bottom is really just what lies behind the water surface in the view direction, slightly distorted. So we want to draw what the camera sees, excluding the water surface, and save it in a RenderTarget. To do that, we send our normal camera to the createRenderTarget function, but we will need to add the following line to that function:
      _renderTexture->getViewport(0)->setVisibilityMask(0xFFFFFFF0); 

      The default visibility mask for objects is 0xFFFFFFFF (this can depend on the version of Ogre, but you don't have to worry about that now). The viewport's mask is bitwise-ANDed with each object's visibility flags, so with the mask 0xFFFFFFF0 the render texture will exclude everything whose flags are 0xF or lower. That is therefore where we want to place our water surface.

    4. Exclude the water from the RenderTarget Now we have to set the visibility flags of our water surface:
      _water->setVisibilityFlags(0xF); 
    5. Now we have to create our own distortion shader, “custom/MyWater”:
    6. Create two files: MyWater.material and distortShader.cg.
    7. In the file distortShader.cg add the following code:
      uniform sampler2D RttTex : TEXUNIT0;
      uniform sampler2D offsetMap : TEXUNIT1;
      uniform float4x4  mat_modelproj;
      uniform float myTime;
       
      // Vertex program input
      struct VP_input {
          float4 pos  : POSITION;
          float4 uv   : TEXCOORD0;
      };
       
      // Vertex program output / fragment program input.
      struct VP_output {
          float4 pos      : POSITION;
          float4 uv       : TEXCOORD0;
          float4 uvPos    : TEXCOORD1;
      };
       
       
      VP_output main_distort_vp( VP_input p_in ) {
          VP_output output;
       
          // Transform the current vertex into projection space.
          output.pos = mul(mat_modelproj, p_in.pos  );
          output.uv = p_in.uv;
       
          // Since the fragment shader cannot read the POSITION semantic directly,
          // we pass the position down to it via a texture coordinate.
          // Note: in DirectX 10+, fragment shaders do have access to the current pixel's position :)
          output.uvPos = output.pos;
       
          return output;
      }
       
      float4 main_distort_fp( VP_output p_in ) : COLOR {
          // Create a variable to hold the screen space uv coordinates
          float2 screenUV;
       
          // Since the divide by w is not done automatically between the vertex and
          // fragment shaders, we have to do it explicitly here ourselves to get viewport coordinates.
          screenUV = float2(p_in.uvPos.x, -p_in.uvPos.y) / p_in.uvPos.w;
          // Viewport coordinates range from -1 to 1, so we must map them to the range 0..1 to get valid UV coordinates.
          screenUV = ( screenUV + 1.0f ) * 0.5f;
       
          // Read the current offset from the offset map.
          float2 offset = tex2D( offsetMap, p_in.uv.xy + myTime*.01).xy ;
       
          // Textures cannot store negative values, so we must map the offset vectors back from the stored (0..1) range to (-1..1)
          offset = (offset - 0.5f) * 2.0 ;
       
          // You can tweak the two constants below to configure the distortion intensity (0.07) and the brightness (0.5).
          float4 oColor = tex2D( RttTex, screenUV.xy + offset * 0.07 ) * 0.5 ;
       
          return oColor;
      } 
    8. And in the material script, create a material that uses these two programs:
      vertex_program shader/waterVP cg {
      	source distortShader.cg
      	entry_point main_distort_vp
      	profiles vs_1_1 arbvp1
      	default_params {
      		param_named_auto mat_modelproj worldviewproj_matrix
      	}
      }
      
      fragment_program shader/waterFP cg {
      	source distortShader.cg
      	entry_point main_distort_fp
      	profiles ps_2_0 arbfp1
      	default_params {
      		param_named_auto myTime time 1
      	}
      }
      
      material custom/MyWater {
      	technique {
      		pass {
      			vertex_program_ref shader/waterVP {
      			}
      
      			fragment_program_ref shader/waterFP {
      			}
      
      			texture_unit 0 {
      				tex_address_mode mirror
      				texture RttTex
      			}
      			texture_unit 1 {
      				tex_address_mode mirror
      				texture offsetMap.png
      			}
      		}
      	}
      } 
    9. Then, finally, we need to put that material on the water surface and obtain the offset texture (offsetMap.png) that the material expects:
      _water->setMaterialName("custom/MyWater");
       

When You Are Finished

Upload your commented source files into Lab7 in MySchool along with your material files and shader programs (zip them up if there is more than one). The lab projects will not be graded, but their completion counts towards your participation grade.