TGE normal mapping
by Koushik · 06/16/2008 (6:22 am) · 47 comments
Download Code File
After being asked about this by a lot of people, I've decided to release this code to the community. I initially thought I'd polish it, make it part of my larger framework, and release the whole thing once it looked stunning... but that might take a while, and I think it best to satisfy hungry GGers...
You can find a manual outlining the exact procedure (along with an introduction on how to set up the scripts and so on) over here.
The corresponding shaders are attached with this resource. If you find any bugs, or have any suggestions, feel free to email me.
It is likely that some of the parameters will need tweaking based on your level's lighting and textures. If you need any assistance with these, post a screenshot and I'll be able to help out.
Thanks to Jon and Ari of BrokeAss Games for trying this out first and giving me some feedback initially.
Cheers!
Koushik
#22
07/18/2008 (8:35 am)
Hey Daniel, if you figure out the fix for your red character, I'd be interested in what you found. I currently have a bright red character too, and the hardware seems to be OK, as it ran the MK just fine.
#23
07/23/2008 (10:57 am)
I'm going nowhere fast mate, sorry. I figure I'm either going to have to learn shaders myself, or see if I can use a program like RenderMonkey or ShaderDesigner.
Koushik - I've found a tutorial. I've seen some basic shaders, and what they're doing makes sense to me. However, I don't know how to get them working in Torque. Could you provide a simple example of a shader that just makes the material a flat colour, controlled in the material definition? [Yes, I know, I've already got a flat colour shader, but not by design :P] I think if I can see how to do that, I can figure out where to go from there.
#24
07/28/2008 (11:43 am)
Is it deliberate that anything without a material script is invisible? That seems to be what's happening (I've only just noticed :P).
Wait a second... no...
That can't be right.
Okay, it must be something I've done - my player renders without a shader applied, but nothing else does. I'll try adding shaders to them...
I'm an idiot. Never mind... :P
#25
07/29/2008 (9:12 pm)
@ Daniel:
The material system falls back to fixed function if you haven't defined any shaders. That was one of the plus points of the whole system.
The MK also doesn't have support for DTS transparency right now.
#26
07/31/2008 (6:45 am)
Yeah, I realised that it wasn't a problem with your code at all :P
But is it possible to implement shaders that don't rely on a texture? For example, in the material definition, setting 'colours[0] = "...";' and 'samplers[0] = colour;'
One last question - can we use this for texture UV animation? Like, rolling tank treads?
#27
07/31/2008 (6:56 am)
Yeah, sure. You don't even have to define the sampler arrays in the script; you only need those if you want access to a texture. The textures array holds the corresponding file on the disk, and the samplers array points to the particular variable in your shader program.
About the tank treads: definitely possible. One method off the top of my head is using variable texture co-ordinates. You basically send a variable across to the shader (unfortunately, with the current system for GLSL shaders, this has to be hard-coded in the engine). Then, instead of using the interpolated texture co-ordinates, offset them a bit in the direction you need.
So basically, set newTexCoord = texCoord + vec2(x,y);
where texCoord is the interpolated co-ordinate passed from the vertex shader and newTexCoord is the co-ordinate you would use when doing a texture look-up. x and y are float variables passed into the program; these could then be varied in code depending on time or whatever.
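A minimal sketch of that idea as a GLSL fragment shader. The names uvOffset and diffuseMap are my own placeholders, not names from the engine - the engine-side binding of the uniform is exactly the hard-coded part mentioned above:

```glsl
// Fragment shader sketch: scroll the texture by a CPU-supplied offset.
// 'uvOffset' would be updated each frame from C++ (e.g. from tread
// speed * time); 'diffuseMap' is whatever sampler the material binds.
uniform sampler2D diffuseMap;
uniform vec2 uvOffset;

varying vec2 texCoord;   // interpolated from the vertex shader

void main()
{
    vec2 newTexCoord = texCoord + uvOffset;  // offset the look-up
    gl_FragColor = texture2D(diffuseMap, newTexCoord);
}
```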
#28
07/31/2008 (12:17 pm)
Quote: "(unfortunately, with the current system for GLSL shaders, this has to be hard-coded in the engine)"
So... no speed scaling? Or just not scriptable?
Okay, so if I want to define a solid-colour shader, I just write that straight into the shader program. What if I want to have the colour defined by scripts? Like, a 'red' material and a 'green' material, but both using the same shader? Just changing a parameter in the material definition?
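For reference, the flat-colour shader itself is tiny; the open question in the thread is only how the engine would set the uniform per material. A sketch, where flatColour is a hypothetical uniform that stock TGE has no script hook for:

```glsl
// Vertex shader: just transform the vertex.
void main()
{
    gl_Position = ftransform();
}

// Fragment shader (separate file): output a single uniform colour.
// 'flatColour' is a hypothetical uniform - binding it per material
// (e.g. one value for 'red', another for 'green') is exactly what
// the engine would need to be taught to do.
uniform vec4 flatColour;

void main()
{
    gl_FragColor = flatColour;
}
```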
#29
08/02/2008 (7:44 am)
I just had the editor crash, too. It was in materialList.h, I think, here:
const char * getMaterialName(U32 index) { return mMaterialNames[index]; }
It's trying to render a SpawnSphere, and mMaterialNames is empty.
#30
08/02/2008 (8:36 am)
Wow... Thanks mate, I was meaning to get around to that problem sometime... darn...
Here's a quick fix (again off the top of my head). Just before the line
Material *shaderMaterial = MaterialManager->findMaterialByMapToTexture(name);
add a check on name, so it looks like:
Material *shaderMaterial = NULL;
if(name)
shaderMaterial = MaterialManager->findMaterialByMapToTexture(name);
(declaring it first keeps shaderMaterial in scope after the if). I guess that should solve it - the spawn spheres don't have a materialList defined! Shoot it!
If it doesn't solve the problem, I'll get back to my system and take a closer look at it...
#31
08/02/2008 (8:38 am)
Oh, and regarding your previous post: you CAN scale stuff, but it would be inextensible and crude to put it in C++ only. The only workaround currently is making a copy of the shader and changing the appropriate parameters in the copy.
I'm working on a forked-off version of the MK, this time I assure you I'll post something up tomorrow :)
#32
08/03/2008 (6:19 am)
RE the fix: thanks :) I figured it was to do with spawn spheres having no material, but would have had no idea how to solve the problem.
EDIT: Should the rest of the code that relies on shaderMaterial be in the if as well?
EDIT: The problem actually seems to come from the line:
const char *name = materials->getMaterialName(registeredMaterial);
I fixed it by adding, before that line:
if(materials->getMaterialCount())
I've started playing around with shader programming, and am finding my feet. I've got a basic texture shader working; now time to add lighting... Is there any reason to restrict the maximum lights to 4 in your shader? I guess it's an engine limitation...? Is the sun light guaranteed to be any one of these, or does it change?
RE scaling, I'm thinking that it might be possible to bind a uniform float in the shader, that tells the thing how far to animate. This float can then be set as a property of tsMesh or something. I don't know - I haven't looked at all at how shaders are bound.
My goal is to have a mesh animate its texture in one dimension, and at a speed determined by how fast a WheeledVehicle's wheels are turning. So the faster the vehicle goes, the faster the material animates. Tricky.
EDIT: Well, my first Kork shader came out... strangely. I'm happy that it works, though :)
EDIT: It works! I fixed some strange problems, added point lights, and it looks... exactly like stock TGE lighting. Whoop-de-doo. But it runs on a shader!
#33
08/03/2008 (9:10 am)
Regarding the number of lights being 4 - no real reason. I generally assumed people wouldn't have too many lights in a scene. You can have as many as you like; just remember that the computations will be slower (both the pre-mission lighting and the realtime shader performance). That's the only tradeoff. You are basically limited by the number of lights TGE uses in a mission (which is definitely more than 4, though I don't know the exact value).
And about the tire tread thingy - I personally think putting it in TSMesh would be a very bad idea. TSMesh is the most basic unit in the entire render pipeline. The render() method in TSMesh.cc is the one that actually does the triangle drawing and such, which means it will be used for any mesh at all (including pseudo-meshes like the spawn spheres, which only appear in the editor). So you could have a lot of overhead.
The best bet would be to alter the material system. It's a little complicated to explain here. I was actually writing a blog entry when I saw this post; I think it will outline things in a bit of detail. I'll put up the link here once it's done.
#34
08/03/2008 (10:18 am)
Random question that I should probably just research myself: how do I get the vector from the camera to a surface? I figured that in your example this was what ecPosition and eye were doing, but I seem to be wrong. I'm just seeing what I can do with shaders, and thought I'd try something with the dot product between a face normal and the vector from the face to the camera - like the way things are calculated for lights.
EDIT: I think I stumbled on a solution. I have a list of all the matrices provided by default, and I just tried them all until I found the one that looked right :P. I'm using this for my vertex shader:
varying vec3 normal;
varying vec3 eyeVec;
void main()
{
gl_Position = ftransform();
normal = gl_Normal;
vec4 ec = gl_ModelViewMatrixInverse * gl_Vertex;
vec3 ec3 = vec3(ec) / ec.w;
eyeVec = -normalize(ec3);
}
Then in the frag shader I dot eyeVec and normal (both normalised) and apply a colour based on the result. Now... to figure out how to do transparency... :P
EDIT: Hmm. Went back to my sensible shader (the one that worked :P) and applied it to the crossbow. It gets all screwed up... is this something to do with images? The texture coordinates seem to be going all over the place.
#35
08/10/2008 (8:00 am)
The ModelViewInverse matrix is, as the name suggests, the inverse of the ModelView matrix. In OpenGL, the ModelView matrix converts world-space co-ordinates to viewing co-ordinates. You might want to look up a good reference book on these transformations.
As for transparency, it needs some changes in the base render code. I've worked on that and do have a solution; I'll just have to dig it up, and will put it out as soon as I find it.
And finally, about the shapeImage, there must be a problem with your shader. Maybe, your texture co-ordinates transformations are wrong? The shader code does not differentiate between TSShapeInstance, ShapeBaseImages, TSStatics and stuff. So, if there is a problem, it might well be in your GLSL code.
If you could post a pic or something, I might be of more help.
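For what it's worth, the conventional way to get the eye vector uses gl_ModelViewMatrix (not its inverse, as in the shader quoted above): in eye space the camera sits at the origin, so the negated, normalised eye-space position points from the surface towards the camera. A sketch of the usual vertex-shader version:

```glsl
// Vertex shader sketch: surface-to-camera vector in eye space.
varying vec3 normal;
varying vec3 eyeVec;

void main()
{
    gl_Position = ftransform();
    // Transform the normal into eye space too, so both vectors
    // live in the same space before the fragment shader dots them.
    normal = gl_NormalMatrix * gl_Normal;

    vec4 ecPosition  = gl_ModelViewMatrix * gl_Vertex;  // eye-space position
    vec3 ecPosition3 = ecPosition.xyz / ecPosition.w;   // homogeneous divide
    eyeVec = -normalize(ecPosition3);                   // towards the camera
}
```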
#36
08/15/2008 (4:46 am)
Hmm... that's odd, because the shader works for my Player all right (using both the Kork mesh and one of my own, with admittedly very simple UV unwrapping).
I might try to improve the shader itself first, though. It seems to be lighting vertices differently when the camera moves, even if the lights remain stationary. Also, no lights are coloured. I copied a lot of code from your example shader without really understanding it, so that could be the problem.
Just in case someone wants a look:
//Vertex shader
varying vec2 texCoord;
varying vec4 diffuse;
varying vec4 ambient;
void main()
{
texCoord = gl_MultiTexCoord0.xy;
gl_Position = ftransform();
//Calculate simple vertex diffuse
//Lights are all directional for now
vec3 lightVec;
float amount;
float len;
vec3 norm = gl_Normal;
vec4 ecPosition = gl_ModelViewMatrix * gl_Vertex;
vec3 ecPosition3 = ecPosition.xyz / ecPosition.w;
for(int i = 0; i < 4; i++)
{
if(gl_LightSource[i].position.w == 0.0)
{
lightVec = gl_LightSource[i].position.xyz;
amount = max(0.0, dot(norm,lightVec));
diffuse += gl_LightSource[i].diffuse * amount;
ambient += gl_LightSource[i].ambient;
}
else
{
//Get light vector
lightVec = gl_LightSource[i].position.xyz - ecPosition3;
//Ambient based on distance
len = length(lightVec);
amount = 1.0 / (gl_LightSource[i].constantAttenuation +
gl_LightSource[i].linearAttenuation * len +
gl_LightSource[i].quadraticAttenuation * len * len);
ambient += gl_LightSource[i].ambient * amount;
//Factor in dot product for diffuse
amount *= max(0.0, dot(norm,lightVec));
//Add diffuse
diffuse += gl_LightSource[i].diffuse * amount;
}
}
}
//Fragment shader
varying vec2 texCoord;
uniform sampler2D colourTex;
varying vec4 diffuse;
varying vec4 ambient;
void main()
{
gl_FragColor = texture2D(colourTex,texCoord) * (diffuse + ambient);
}
The main thing I don't get is ecPosition. Other than that, I pretty much know what's going on. The lighting does change across the mesh when it rotates, but also depending on what part of the screen the mesh is visible in (in one corner of the screen, the character appears dark; in the other corner, he appears bright). Also, what's in position.w? How can you have a 4D position vector? :P
I guess a lot of these questions can be answered by me simply reading the shader binding code. I'll go do that - don't answer any questions that are answered in there :P
#37
08/23/2008 (4:10 am)
Let me try to answer those questions in order.
1) You are calculating lighting per-vertex, and the result is interpolated across each face. Unless you actually want this, I suggest passing the normal vector as a varying variable into the fragment shader and computing the lighting there, using the dot product with the interpolated normal.
2) ecPosition is the position in gl_Vertex transformed into eye-space co-ordinates. In OpenGL there are three basic transformations - modeling, viewing and projection - before a fragment is converted into window co-ordinates and written to the frame-buffer. The first two are combined in OpenGL into one, the modelview transformation; the projection transformation then takes the modelview result and applies perspective, if required. You might want to check out the entire transformation process in a book.
3) position.w has to do with homogeneous co-ordinates, which would take too long to explain fully here. The idea is to represent all the basic co-ordinate transformations (scaling, translation and rotation) with a single matrix; to do that, the matrix math is done in four dimensions, by adding a fourth component to the 3D position vector. The transformation matrix is then 4x4, and dividing the transformed result by its fourth component yields the normalized positional co-ordinates. Again, it's hard to explain without a diagram (or a blackboard so I could draw some for you), so look it up on the net - there are a lot of resources available.
If I've missed anything, please feel free to prompt me.
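A minimal sketch of suggestion 1 - moving the dot product into the fragment shader so that the normal, not a pre-lit colour, is interpolated. Only a single directional light is shown here, following the w == 0 convention from the shader earlier in the thread:

```glsl
// Vertex shader: pass the eye-space normal instead of a lit colour.
varying vec3 normal;
varying vec2 texCoord;

void main()
{
    gl_Position = ftransform();
    texCoord = gl_MultiTexCoord0.xy;
    normal = gl_NormalMatrix * gl_Normal;  // eye-space normal
}

// Fragment shader (separate file): compute lighting per pixel.
varying vec3 normal;
varying vec2 texCoord;
uniform sampler2D colourTex;

void main()
{
    // Re-normalise: interpolation shortens the vector between vertices.
    vec3 n = normalize(normal);
    // Directional light: position.xyz holds the direction when w == 0.
    vec3 lightDir = normalize(gl_LightSource[0].position.xyz);
    float amount = max(0.0, dot(n, lightDir));
    vec4 lit = gl_LightSource[0].ambient
             + gl_LightSource[0].diffuse * amount;
    gl_FragColor = texture2D(colourTex, texCoord) * lit;
}
```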
#38
08/24/2008 (12:58 pm)
Okay, thanks for all that! I realise I do just need to do a lot more reading :P
I did intend for this to be per-vertex lighting, but I guess I may as well make it per-pixel since I'm using shaders anyway. I guess that won't solve my issue with the lighting changing based on viewpoint, but it'll mean the operation of my shader is closer to yours, and therefore I can copy more of your code to make my shader more correct ;)
Oh - is coloured lighting possible? I'm guessing that for the moment the problem is my implementation of the shader, but I'm suspecting that it's not... that or I just don't have the right light setup. I tried walking Kork into the DIF in the level with the goo lake, where there's a red fire light, but the light on Kork was white.
#39
10/05/2008 (9:05 am)
Do you guys know how to fix the problem with the F11 editor crashing Torque yet? I tried both fixes and neither of them worked.
#40
10/27/2008 (7:26 am)
Thank you for making my artists melt from a spontaneous joygasm.
I have a couple of questions about how to use the Material and SubShader objects. Looking through the source, these must be client-side objects, yes? Also, it seems that materials bind to a DTS file, not to an object using that DTS file - also correct? So I have to have the materials created on the client before it connects to the server and gets a load of ghosted shapes that use that material - is that also true?
- Zaque