diffusecolormap = loader.loadTexture("./Models/Textures/bricks.png")
ts0 = TextureStage('level0')
model.setTexture(ts0, diffusecolormap)
You now have access to this texture inside your pixel shader as the tex_0 color sampler (the new input variable is uniform sampler2D tex_0 : TEXUNIT0). A texture is essentially a lookup table for color values. Now multiply your output color (which already includes your lighting calculations) in the pixel shader by the color value found in tex_0 at the texture coordinates that are passed into the fragment shader in l_texcoord0 (remember to receive the texture coordinates in the vertex shader and pass them along). The lookup function is tex2D(<texture>, <texturecoordinates>). You should now see a nicely textured and shaded model. If you are using the car model, you should be able to retrieve separate NodePaths for the body and wheels parts of the model (using <NodePath>.find(<name>)) and then apply the same shader to both parts, each with its own set of uniform input values (including the texture).
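Putting those pieces together, a minimal fragment shader along these lines might look like the following sketch. The interpolant l_shade (carrying the lighting result computed earlier) and the exact parameter list are illustrative assumptions, not fixed by the text:

```cg
// Hypothetical sketch: sample the diffuse map and modulate the lit color.
// l_shade is assumed to hold the lighting term computed per vertex.
void fshader(in float2 l_texcoord0 : TEXCOORD0,
             in float4 l_shade     : COLOR0,
             uniform sampler2D tex_0 : TEXUNIT0,
             out float4 o_color : COLOR)
{
    // Look up the diffuse color at the interpolated texture coordinates
    float4 diffuse = tex2D(tex_0, l_texcoord0);
    // Modulate the shaded color by the texture color
    o_color = l_shade * diffuse;
}
```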
float3 vtx_binormal0 : BINORMAL,
float3 vtx_tangent0 : TANGENT,
Now a transformation matrix that is capable of mapping other vectors into this local tangent space can be created like this:
float3x3 mat_tangent;
mat_tangent[0] = vtx_tangent0;
mat_tangent[1] = vtx_binormal0;
mat_tangent[2] = vtx_normal;
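In the vertex shader this matrix is then applied with mul to map a vector into tangent space; a sketch under assumed names (lightvec and eyevec standing for the model-space light and view vectors computed earlier, l_lightvec and l_eyevec for the outputs passed to the fragment shader):

```cg
// Vertex-shader sketch (assumed names): rotate the light and view
// vectors into tangent space using the matrix built above.
l_lightvec = mul(mat_tangent, lightvec);
l_eyevec   = mul(mat_tangent, eyevec);
```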
Once this new transformation matrix has been created inside the vertex shader, both the view vector and the light vector should be projected into tangent space through a simple multiplication with this matrix. By the time these vectors arrive in the fragment shader they are already in tangent space, and the third important component of our lighting calculations, the normal, should now be looked up in the normal map with tex2D. Note, however, that the texture stores components in the 0 to 1 range while we want them in the -1 to 1 range, so you need to remap the value when you look it up, like this: tex2D(normalmap, texturecoordinate) * 2 - 1. Since the normal we read from the normal map is already in tangent space (that is how normals are stored in normal maps, which also explains why we needed to transform the other vectors), we can now proceed with the lighting calculations just as we did in model space. See if you can make this work.
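A hedged sketch of how these pieces could fit together on the fragment side follows. The sampler name normalmap, the interpolant names, and the simple diffuse term are illustrative assumptions, not prescribed by the text:

```cg
// Fragment-shader sketch: tangent-space normal mapping (assumed names).
void fshader(in float2 l_texcoord0 : TEXCOORD0,
             in float3 l_lightvec  : TEXCOORD1,  // light vector, tangent space
             in float3 l_eyevec    : TEXCOORD2,  // view vector, tangent space
             uniform sampler2D tex_0     : TEXUNIT0,
             uniform sampler2D normalmap : TEXUNIT1,
             out float4 o_color : COLOR)
{
    // Remap the stored normal from [0,1] to [-1,1]; it is already in tangent space
    float3 normal = normalize(tex2D(normalmap, l_texcoord0).xyz * 2 - 1);
    float3 light  = normalize(l_lightvec);
    // Simple diffuse term, computed exactly as in the model-space version
    float diffuse = max(dot(normal, light), 0);
    o_color = tex2D(tex_0, l_texcoord0) * diffuse;
}
```

Because all three vectors are expressed in the same (tangent) space, the dot products behave exactly as they did in model space.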