Lab 7 Materials

- Download and unzip the Lab 7 Asset File into your working directory

**The Simplest Shader:**

Make sure you can run **testshader.py** and that you can see a white teapot on the screen. The color blue is being passed into the shader through the **tint** constant. Try different color values for this constant. Then open up **myshader.sha** and notice how the value is used in the vertex shader to assign a color to a vertex. Play with the color values inside the vertex shader and see how it affects the final render.

**Diffuse Vertex Shading:**

Instead of sending a preset color value to all of the vertices inside the vertex shader, let the vertex shader calculate the color of each vertex based on a lighting model. Use the following lighting model to do this calculation: **color = 0.2*ambientcolor + diffuse_attenuation*diffusecolor + specular_attenuation*specularcolor**. Instead of *tint* you now have to send **ambientcolor**, **diffusecolor** and **specularcolor** into the shader. Inside the vertex shader, you then have to calculate **diffuse_attenuation** and **specular_attenuation**, which are values between 0 and 1 that indicate how much of each color will be seen at that vertex.
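The combining step of this lighting model is simple per-channel arithmetic. Here is a minimal plain-Python sketch of it (the function name and RGB tuples are just for illustration, not part of the lab code):

```python
def combine(ambient, diffuse, specular, diffuse_attn, specular_attn):
    """Combine a vertex color per the lab's lighting model:
    color = 0.2*ambient + diffuse_attn*diffuse + specular_attn*specular."""
    return tuple(0.2 * a + diffuse_attn * d + specular_attn * s
                 for a, d, s in zip(ambient, diffuse, specular))

# Fully lit, no highlight: mostly the diffuse color plus a little ambient.
print(combine((1, 1, 1), (1, 0, 0), (1, 1, 1), 1.0, 0.0))  # (1.2, 0.2, 0.2)
```

Note that the result can exceed 1.0 per channel; the graphics pipeline clamps the final color.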

- The **diffuse_attenuation** essentially represents how well the vertex is lit, which relates to the angle between the vertex normal and the direction to the light source. This is the formula (using Cg functions): **diffuse_attenuation = saturate(dot(LightVector, NormalVector))**.

- You already receive the NormalVector inside the vertex shader, but you have to calculate the LightVector. To do that, you first have to pass the **light** into the shader from Panda. You can pass any NodePath into shaders using the **<model>.setShaderInput(<name>, <NodePath>)** method. When you do that, you get access to various render state attributes of that NodePath, including its model space position - it will be accessible inside the shader as a **uniform float4 mspos_<name>**.

- When you have the model space position of the light source, you can easily calculate the LightVector (i.e. the vector from the vertex position to the light position - remember to normalize it before using it in the dot product).

- Now you should be able to calculate the **diffuse_attenuation**. Set the **specular_attenuation** to 0 for now and test your lighting model so far.
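The steps above can be put together in plain Python (a sketch of the math only, not shader code; **saturate** clamps to the 0..1 range, as in Cg):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def saturate(x):
    return max(0.0, min(1.0, x))

def diffuse_attenuation(vertex_pos, light_pos, normal):
    # LightVector: from the vertex towards the light, normalized.
    light_vec = normalize(tuple(l - p for l, p in zip(light_pos, vertex_pos)))
    n = normalize(normal)
    return saturate(sum(a * b for a, b in zip(light_vec, n)))

# Light directly above a vertex whose normal points up: fully lit.
print(diffuse_attenuation((0, 0, 0), (0, 0, 10), (0, 0, 1)))  # 1.0
# Light behind the surface: the dot product is negative, clamped to 0.
print(diffuse_attenuation((0, 0, 0), (0, 0, -10), (0, 0, 1)))  # 0.0
```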

**Specular Vertex Shading:**

Now calculate the **specular_attenuation** value in the lighting model. The formula for this is: **specular_attenuation = pow(saturate(dot(ReflectionVector, ViewVector)), 6.0)**. This essentially says that this component gets a sharp increase (from raising the result to the power of 6) when the reflection of the light lines up with the direction towards our eyes. Just like you needed the light position passed in to calculate the LightVector, you'll need to pass in the camera position to calculate the ViewVector. Once you have the ViewVector, you need to calculate the light ReflectionVector with this formula: **R = normalize(2*N*dot(N,L) - L)**. Try to make this specular lighting work in your shader.

**Diffuse and Specular Pixel Shading:**
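The reflection and specular formulas above are ordinary vector algebra, so they can be checked outside the shader. A plain-Python sketch (all vectors assumed normalized, using the exponent 6 from the formula):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(c / n for c in v)

def saturate(x):
    return max(0.0, min(1.0, x))

def reflect(n, l):
    # R = normalize(2*N*dot(N,L) - L)
    d = dot(n, l)
    return normalize(tuple(2 * c * d - lc for c, lc in zip(n, l)))

def specular_attenuation(n, l, view):
    return saturate(dot(reflect(n, l), view)) ** 6.0

# Light straight above an upward-facing surface reflects straight up;
# a viewer looking along that reflection sees the full highlight.
N, L = (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)
print(reflect(N, L))                                # (0.0, 0.0, 1.0)
print(specular_attenuation(N, L, (0.0, 0.0, 1.0)))  # 1.0
# A viewer at 90 degrees to the reflection sees no highlight at all.
print(specular_attenuation(N, L, (1.0, 0.0, 0.0)))  # 0.0
```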

So far you have only been calculating the color values for each of the vertices in a model, which can result in rough lighting artifacts (this occurs when simple interpolation of the vertex colors is not enough). To prevent these artifacts, you can move the lighting calculations into the pixel/fragment shader and perform them for each pixel. You do not have to calculate the vectors inside the pixel shader; if you pass these vectors from the vertex shader to the pixel shader using the TEXCOORD semantics (registers), they will be correctly interpolated. The results of the dot products and the power function do not interpolate linearly, which is why the final calculations have to take place inside the pixel shader. See if you can now create per-pixel diffuse and specular lighting. Notice how much better it looks than the per-vertex lighting (but also realize that it is more GPU intensive).

**Texture:**

The **teapot** model doesn't have any texture coordinates, so in your vertex shader you will have to create them and pass them on to the pixel/fragment shader. For now, let's create a very simple texture coordinate mapping: let (u,v) = (x,y) in each vertex. You can do this in the vertex shader like so: **l_texcoord0 = vtx_position.xy;**. The next step is to replace the **diffusecolor** in your lighting model with the color sampled from an actual texture. First, load the texture you want into the Panda program and assign it to the model's NodePath (the textures are in Models/Textures). It's a good idea to explicitly create a texture stage (a texture channel) for each texture. Example:

  diffusecolormap = loader.loadTexture("./Models/Textures/bricks.png")
  ts0 = TextureStage('level0')
  model.setTexture(ts0, diffusecolormap)
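The planar mapping and the idea of a texture as a lookup table can be sketched in plain Python (a toy example; the tiny 2x2 texture and nearest-neighbour sampling are simplifying assumptions, real texture filtering is fancier):

```python
# A 2x2 "texture": nothing but a lookup table of RGB values.
TEX = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (255, 255, 255)]]

def planar_uv(vertex):
    # Same idea as `l_texcoord0 = vtx_position.xy;` in the vertex shader.
    x, y, z = vertex
    return (x, y)

def sample(tex, uv):
    # Nearest-neighbour lookup with repeat wrapping.
    u, v = uv
    h, w = len(tex), len(tex[0])
    return tex[int(v * h) % h][int(u * w) % w]

uv = planar_uv((0.6, 0.1, 5.0))
print(uv)               # (0.6, 0.1)
print(sample(TEX, uv))  # (0, 255, 0)
```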

You now have access to this texture inside your pixel shader as the **tex_0** color sampler (the new input variable is **uniform sampler2D tex_0 : TEXUNIT0**). A texture is nothing but a lookup table for color values. Now multiply your output color (which already includes your lighting calculations) in the pixel shader with the color value found in **tex_0** at the texture coordinates that are passed into the fragment shader in **l_texcoord0** (or whatever you named the texture coordinates you created in the vertex shader). The lookup function is **tex2D(<texture>, <texturecoordinates>)**. You should now see a nicely textured and shaded model.

**Environment Map:**

To make things truly interesting, imagine that the teapot is made of a shiny material and therefore should reflect an image of the surrounding environment. Start by loading any kind of environment you like around your teapot (you can use your room from the earlier labs, or simply use the supplied Panda environment: **environment = loader.loadModel("environment")**). Now in your Panda program, you need to have Panda generate a texture, essentially a photo, of the environment from the perspective of the teapot. In fact, this texture should be a set of photos, one in each direction from the teapot (6 textures in total). Such a texture is called a **CubeMap** and both Panda and the Cg shader language can use them. To create a cube map from the perspective of the teapot, you do this in your Panda program:

  ts1 = TextureStage("ts1")
  viewpoint = NodePath('viewpoint')  # The photos are taken from this perspective
  buffer = base.win.makeCubeMap('env', 64, viewpoint)
  viewpoint.reparentTo(model)  # Attach to the teapot
  model.setTexture(ts1, buffer.getTexture())
  # The following seems to be needed to stop the cube map
  # from rotating with the teapot...
  viewpointmover = lerpHprInterval(viewpoint, 15.0, Vec3(0,0,0), Vec3(359, 0, 0))
  viewpointmover.loop()
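A cube map is indexed by a 3D direction vector rather than by (u,v): the face to read from is the one on the axis with the largest-magnitude component of that vector. A plain-Python sketch of just the face selection (illustration only, not part of the lab code):

```python
def cube_face(direction):
    """Pick which of a cube map's 6 faces a direction vector hits:
    the face on the axis with the largest absolute component."""
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+x" if x > 0 else "-x"
    if ay >= ax and ay >= az:
        return "+y" if y > 0 else "-y"
    return "+z" if z > 0 else "-z"

print(cube_face((0.9, 0.1, -0.2)))   # +x
print(cube_face((-0.1, 0.2, -5.0)))  # -z
```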

You will have access to this texture inside your pixel shader as the **tex_1** color sampler (the new input variable is **uniform samplerCUBE tex_1 : TEXUNIT1**). Notice that this is a special cube texture. The final stage is to look up the color value of this cube map and multiply it with your existing diffuse color component (which already includes the color value from your other texture). You look up values in a cube map with the function **texCUBE(<texture>, <3Dtexturecoordinates>)**. Notice that you need 3D texture coordinates here. Actually, all you need is the View ReflectionVector. That is, the part of the environment that you see reflected in the surface of the teapot is exactly the part pointed at by a vector that has the same angle as your view angle, but in the opposite direction. You can calculate this vector just like the Light ReflectionVector above, but with the light vector replaced by the view vector: **VR = normalize(2*N*dot(N,V)-V);**. So, pass **VR** into the **texCUBE** function as your texture coordinates and you should be all set. Test it out!

**Normal Map:**

For this part, you need to be using a **sphere** model and you need to assign the **bricks.png** and **bricks-n.png** textures to two separate texture stages on the model. The latter texture is a normal map (passed in as **uniform sampler2D tex_1 : TEXUNIT1**). To add lighting detail with this normal map, a new coordinate space, called tangent space, needs to be created inside the vertex shader. This space is local to each vertex and is defined by the axes formed by the vertex normal, binormal and tangent. Therefore the following automatic varying inputs need to be added to the vertex shader:

  float3 vtx_binormal0 : BINORMAL,
  float3 vtx_tangent0 : TANGENT,

Now a transformation matrix that is capable of mapping other vectors into this local tangent space can be created like this:

  float3x3 mat_tangent;
  mat_tangent[0] = vtx_tangent0;
  mat_tangent[1] = vtx_binormal0;
  mat_tangent[2] = vtx_normal;
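What this matrix does can be sanity-checked in plain Python: each row of the result of multiplying it with a vector is one basis vector dotted with that vector, i.e. the vector expressed in the vertex's local frame (the basis values below are made up for illustration):

```python
def mat_vec(m, v):
    # Row i of the result is row_i(m) . v  -- what multiplying the
    # row-built mat_tangent with a vector does in the shader.
    return tuple(sum(a * b for a, b in zip(row, v)) for row in m)

# A hypothetical vertex basis (rows: tangent, binormal, normal),
# rotated about the z axis relative to model space.
mat_tangent = ((0.0, 1.0, 0.0),    # vtx_tangent0
               (-1.0, 0.0, 0.0),   # vtx_binormal0
               (0.0, 0.0, 1.0))    # vtx_normal

# A model-space x-axis vector expressed in this tangent frame:
print(mat_vec(mat_tangent, (1.0, 0.0, 0.0)))  # (0.0, -1.0, 0.0)
```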

Once this new transformation matrix has been created inside the vertex shader, both the ViewVector and the LightVector should be projected into tangent space through a simple multiplication with this matrix. Once these vectors arrive inside the fragment shader, they will already be in tangent space. The third important component of the lighting calculations, the normal, should now be looked up in the normal map. Note, however, that the texture stores components in the 0 to 1 range while we want them in the -1 to 1 range, so you need to correct the value when you look it up, like this: **tex2D(normalmap, texturecoordinate)*2-1**. Since the normal read from the normal map is already in tangent space (that's how normal maps store them - and that explains why we needed to transform the other vectors), we can now proceed with the lighting calculations, just like we did in model space. See if you can make this work.
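The decode step and the final dot product can be sketched in plain Python (a toy example, not shader code; the texel value is made up):

```python
def decode_normal(rgb):
    """Expand a normal-map texel from [0,1] storage to a [-1,1] vector:
    the `tex2D(normalmap, uv)*2-1` step."""
    return tuple(c * 2.0 - 1.0 for c in rgb)

def saturate(x):
    return max(0.0, min(1.0, x))

def diffuse_attn(normal, light_vec):
    return saturate(sum(a * b for a, b in zip(normal, light_vec)))

# A "flat" texel (0.5, 0.5, 1.0) decodes to the straight-up tangent-space
# normal (0, 0, 1); a light vector already transformed into tangent space
# can be dotted with it directly.
n = decode_normal((0.5, 0.5, 1.0))
print(n)                                 # (0.0, 0.0, 1.0)
print(diffuse_attn(n, (0.0, 0.0, 1.0)))  # 1.0
```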

Last modified: 2009/02/26 15:12 by hannes