public:t-gede-13-1:lab6
        float2 vtx_texcoord0        : TEXCOORD0,    // Texture UV set 0
        // Provided parameters
        uniform float4x4  mat_modelproj,
        // Shader outputs
        out float4 l_position       : POSITION,     // Transformed vertex position
        out float2 l_texcoord0      : TEXCOORD0)    // UV0
{
    // Calculate output position (a vertex shader is expected to at least do this!)
    l_position = mul(mat_modelproj, vtx_position);
    // Simply copy the input vertex UV to the output
    l_texcoord0 = vtx_texcoord0;
}

void main_fp(
        float2 l_texcoord0        : TEXCOORD0,    // UV interpolated for current pixel
        // Provided parameters and data
        uniform sampler2D texture,      // Texture we're going to use
        // Shader output
        out float4 o_color    : COLOR)    // Output color we want to write
{
    // Just sample texture using supplied UV
    o_color = tex2D(texture, l_texcoord0);
}
</code>Now that the texture shader programs are ready, we need to instantiate them in an Ogre material. First we provide the shader program definitions in our **materials** file:<code>
vertex_program shader/textureVP cg {
    source textureshader.cg
    entry_point main_vp
    profiles vs_1_1 arbvp1

    default_params {
        param_named_auto mat_modelproj worldviewproj_matrix
    }
}

fragment_program shader/textureFP cg {
    source textureshader.cg
    entry_point main_fp
    profiles ps_1_1 arbfp1
}
</code>Notice that we are passing one parameter, ''mat_modelproj'', into the vertex shader. But instead of specifying a value, we simply give the name **worldviewproj_matrix**. This name refers to a [[http://stderr.org/doc/ogre-doc/manual/manual_20.html#SEC89|list of values]] that an Ogre application can supply automatically to a shader program (that is why we use **param_named_auto**). We obviously won't know the combined world-view-projection matrix when we create the material, so we want it supplied at run-time instead. Finally, create the material that uses these two new shader program definitions and additionally loads a texture into the available texture unit (copy the actual texture from ''\OgreSDK_vc10_v1-8-1\media\materials\textures\Water02.jpg'' into your **Materials** folder so that your material can find it for sure):<code>
material shader/texture {
    technique {
        pass {
            vertex_program_ref shader/textureVP {
            }
            fragment_program_ref shader/textureFP {
            }
            texture_unit {
                texture Water02.jpg 2d
            }
        }
    }
}</code>Apply this material to your ''_ground'' entity and make sure you see the texture on the ground!
  - **Animated Vertex Shader**To have a vertex shader do something a little more interesting, how about actually moving each vertex a little bit based on the time that passes? That way you can very cheaply animate vertices according to any formula you like! To do this, you need to pass a time parameter into the vertex shader. Luckily, time is one of the parameters that Ogre applications can provide automatically to a shader program. Add the following alternate texture vertex shader to ''textureshader.cg'':<code c>
void main_time_vp(
        // Per-vertex information
        float4 vtx_position         : POSITION,     // Vertex position in model space
        float2 vtx_texcoord0        : TEXCOORD0,    // Texture UV set 0
        // Provided parameters
        uniform float4x4  mat_modelproj,
        uniform float t,                            // Expecting time here
        // Shader outputs
        out float4 l_position       : POSITION,     // Transformed vertex position
        out float2 l_texcoord0      : TEXCOORD0)    // UV0
{
    // Displace the vertical coordinate based on x-location and time
    float4 temp = vtx_position;
    temp.y = temp.y + cos(temp.x + t);

    // Calculate output position
    l_position = mul(mat_modelproj, temp);
    // Simply copy the input vertex UV to the output
    l_texcoord0 = vtx_texcoord0;
}
</code>Now all you have to do is to supply an automatic **time** value in the shader program definition in the materials file:<code>
vertex_program shader/timetextureVP cg {
    source textureshader.cg
    entry_point main_time_vp
    profiles vs_1_1 arbvp1

    default_params {
        param_named_auto mat_modelproj worldviewproj_matrix
        param_named_auto t time
    }
}
</code>Now create a new material that uses this vertex program definition instead of the regular texture vertex program definition and apply that material to the ground object in your application. You should see your ground move!
  - **Per Pixel Phong Shader**Finally, let's try calculating the color value of a fragment based on an actual lighting model such as the Phong lighting model. Since we will be calculating the lighting value inside each fragment, we call this **per-pixel lighting**. This basically means that instead of using interpolated color values from the nearby vertices, we use interpolated vector values (model space vertex position, normal, view direction and light direction) to calculate the color value inside the fragment program. Create a new shader program file called ''lightingshader.cg'' and place the following code inside:<code c>
// Cg
void main_vp(
  float4 vtx_position       : POSITION,
  float3 vtx_normal         : NORMAL,
  float2 vtx_texcoord0      : TEXCOORD0,

  uniform float4x4 mat_modelproj,
  uniform float4   mspos_light,
  uniform float4   mspos_camera,

  out float4 l_position  : POSITION,
  out float2 l_texcoord0 : TEXCOORD0,
  out float3 l_N         : TEXCOORD1,
  out float3 l_L         : TEXCOORD2,
  out float3 l_V         : TEXCOORD3,
  out float3 l_P         : TEXCOORD4
)
{
  l_position = mul(mat_modelproj, vtx_position);
  l_texcoord0 = vtx_texcoord0;

  // The principal vectors for our Phong lighting model calculation:
  // L = Light Vector, N = Vertex Normal, V = View Vector, R = Light Reflection Vector
  l_N = vtx_normal;  // The normal of the vertex itself was passed in automatically
  // The application supplies the light and camera positions in model space
  // through the "mspos_" auto parameters. Everything here should be done in model space.
  l_L = normalize(mspos_light.xyz - vtx_position.xyz);
  l_V = normalize(mspos_camera.xyz - vtx_position.xyz);
  l_P = vtx_position.xyz;
  // We can't calculate the R vector here because it won't interpolate correctly for each
  // fragment (it depends on a dot product, which doesn't interpolate linearly), so we'll
  // calculate it inside the fragment shader. The other vectors will all get interpolated
  // and passed to the fragments.
}

void main_fp(
  float2 l_texcoord0 : TEXCOORD0,
  float3 l_N         : TEXCOORD1,
  float3 l_L         : TEXCOORD2,
  float3 l_V         : TEXCOORD3,
  float3 l_P         : TEXCOORD4,

  uniform float4 k_ambientc,
  uniform float4 k_diffusec,
  uniform float4 k_specularc,

  out float4 o_color : COLOR)
{
  // Inside the fragment shader, we get all the interpolated vectors
  // The diffuse attenuation depends on the angle at which the light hits the fragment
  float diffuse_attn = saturate(dot(l_L, l_N));

  // The specular attenuation depends on how closely the view direction
  // aligns with the light reflection vector
  float3 R = normalize(2*l_N*dot(l_N, l_L) - l_L);
  float specular_attn = pow(saturate(dot(R, l_V)), 6.0);

  // Here we return the color based on the full Phong lighting model
  o_color = 0.2*k_ambientc + diffuse_attn*k_diffusec + specular_attn*k_specularc;
}
</code>As you can see, we expect the application to pass the locations of both the camera and the light into the vertex shader (in model space coordinates!). Luckily, these are available in the [[http://stderr.org/doc/ogre-doc/manual/manual_20.html#SEC89|list of automated values]] provided by Ogre. The shader program definition for the vertex shader is then: <code>
vertex_program shader/lightingVP cg {
    source lightingshader.cg
    entry_point main_vp
    profiles vs_1_1 arbvp1

    default_params {
        param_named_auto mat_modelproj worldviewproj_matrix
        param_named_auto mspos_light light_position_object_space 0
        param_named_auto mspos_camera camera_position_object_space
    }
}
</code>Notice that the light position is indexed as **light number 0**. This refers to the light source closest to the object (in this case the point light, if you used the application code provided in the first step of this lab). We also expect the three colors in our lighting model to be passed into the fragment shader; each material that uses this shader can provide its own. This is the fragment shader program definition: <code>
fragment_program shader/lightingFP cg {
    source lightingshader.cg
    entry_point main_fp
    profiles ps_2_0 arbfp1

    default_params {
        param_named k_ambientc float4 0.5 0.5 0.5 1.0
        param_named k_diffusec float4 0.8 0.1 0.1 1.0
        param_named k_specularc float4 0.6 0.6 0.6 1.0
    }
}
</code>Now create a new material that uses these two shader program definitions and assign the material to the ogre model. You should see a properly shaded (albeit single-colored) model!
   
    