diffusecolormap = loader.loadTexture("./Models/Textures/bricks.png")
ts0 = TextureStage('level0')
model.setTexture(ts0, diffusecolormap)
You now have access to this texture inside your pixel shader as the tex_0 color sampler (the new input variable is uniform sampler2D tex_0 : TEXUNIT0). A texture is nothing but a lookup table for color values; the lookup function is tex2D(&lt;texture&gt;, &lt;texturecoordinates&gt;). Now multiply your output color (which already includes your light calculations) in the pixel shader with the color value found in tex_0 at the texture coordinates that are passed into the fragment shader in l_texcoord0 (or whatever you named the texture coordinates you created in the vertex shader). You should now see a nicely textured and shaded model.
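As a sketch, the lookup and multiplication might look like the following fragment shader excerpt (the names o_color and l_texcoord0 follow the conventions above and are assumptions about your own shader):

```
// Fragment shader excerpt: modulate the lit color by the diffuse texture.
float4 texcolor = tex2D(tex_0, l_texcoord0);  // sample the brick texture
o_color = o_color * texcolor;                 // combine lighting with texture color
```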
ts1 = TextureStage("ts1")
viewpoint = NodePath('viewpoint')  # The photos are taken from this perspective
buffer = base.win.makeCubeMap('env', 64, viewpoint)
viewpoint.reparentTo(model)  # Attach to the teapot
model.setTexture(ts1, buffer.getTexture())
# The following seems to be needed to stop the cube map from rotating with the teapot...
viewpointmover = LerpHprInterval(viewpoint, 15.0, Vec3(0, 0, 0), Vec3(359, 0, 0))
viewpointmover.loop()
You will have access to this texture inside your pixel shader as the tex_1 color sampler (the new input variable is uniform samplerCUBE tex_1 : TEXUNIT1). Notice that this is a special cube texture. The final stage is to look up the color value of this cube map and multiply it with your existing diffuse color component (which already includes the color value from your other texture). You look up values in a cube map with the function texCUBE(&lt;texture&gt;, &lt;3Dtexturecoordinates&gt;). Notice that you need 3D texture coordinates here. In fact, all you need is the View Reflection Vector: the part of the environment that you see reflected in the surface of the teapot is exactly the part pointed at by the view vector mirrored about the surface normal. You can calculate this vector just like the Light Reflection Vector above, replacing the light vector with the view vector: VR = normalize(2*N*dot(N,V)-V);. Now pass VR into the texCUBE function as your texture coordinates and you should be all set. Test it out!
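Put together, the reflection lookup might look like this fragment shader excerpt (N and V are assumed to be the normalized surface normal and view vector from your earlier lighting code; o_color follows the conventions above):

```
// Fragment shader excerpt: look up the environment reflection.
float3 VR = normalize(2 * N * dot(N, V) - V);  // view reflection vector
float4 envcolor = texCUBE(tex_1, VR);          // sample the cube map
o_color = o_color * envcolor;                  // modulate the existing diffuse result
```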
float3 vtx_binormal0 : BINORMAL,
float3 vtx_tangent0 : TANGENT,
Now a transformation matrix that is capable of mapping other vectors into this local tangent space can be created like this:
float3x3 mat_tangent;
mat_tangent[0] = vtx_tangent0;
mat_tangent[1] = vtx_binormal0;
mat_tangent[2] = vtx_normal;
Once this new transformation matrix has been created inside the vertex shader, both the View Vector and the Light Vector should be projected into tangent space through a simple multiplication with this matrix. By the time these vectors arrive inside the fragment shader, they are already in tangent space. The third important component in our lighting calculations, the normal, should now be looked up in the normal map texture. Note, however, that the texture stores components in the 0 to 1 range while we want them in the -1 to 1 range, so you need to remap the value when you look it up: tex2D(normalmap, texturecoordinate)*2-1. Since the normal that we read from the normal map is already in tangent space (that is how normal maps store their data, and it explains why we needed to transform the other vectors), we can now proceed with the lighting calculations just as we did in model space. See if you can make this work.
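Putting the pieces together, a minimal sketch might look like this (the names lightvec, viewvec, l_lightvec, l_viewvec, and the normalmap sampler are assumptions for illustration, not fixed by the engine):

```
// Vertex shader excerpt: build the tangent-space matrix and project
// the light and view vectors into tangent space before interpolation.
float3x3 mat_tangent;
mat_tangent[0] = vtx_tangent0;
mat_tangent[1] = vtx_binormal0;
mat_tangent[2] = vtx_normal;
l_lightvec = mul(mat_tangent, lightvec);  // light vector, now in tangent space
l_viewvec  = mul(mat_tangent, viewvec);   // view vector, now in tangent space

// Fragment shader excerpt: fetch the tangent-space normal and light as before.
float3 N = normalize(tex2D(normalmap, l_texcoord0).xyz * 2 - 1);  // remap 0..1 to -1..1
float3 L = normalize(l_lightvec);
float diffuse = max(dot(N, L), 0);  // same diffuse term as in model space
```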