Generating and using normal maps in Maya

drone, added 2006-03-31 00:06:24 UTC

(some sample shaders and scenes)


Faster, bigger, better! If you're already familiar with normal maps, head straight to displacement_maps for a faster application of the method described below, using my rayDisplace plugin.
January 10, 2005: recompiled rayDisplace6.zip for Maya 6.0; rayDisplaceSource.zip is now available.


I recently came upon the website www.crytek.de, where they showcase a really nice plugin for Max that apparently uses normal maps to add visual detail to low poly models. This is done by converting the surface detail of the high resolution version of a model into a normal map, which is then used at render time (here in real time) on the low poly version (screenshots here: Crytek)


A normal map is close to a bump map. But where a bump map returns a scalar value that indicates a perturbation of the surface along its original normal, a normal map returns a vector (encoded as an RGB color) that can replace or be added to the original surface normal.
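Since that vector is packed into the 0 to 1 range of a color, it has to be decoded before use. A minimal MEL sketch of the difference (the texel value is made up for illustration):

    // bump map: a single scalar height perturbs the surface along
    // the existing normal
    // normal map: the RGB color itself encodes a normal, remapped
    // from the 0..1 color range back into -1..1
    vector $rgb = <<0.5, 0.5, 1.0>>;          // one sample texel
    vector $n = 2 * $rgb - <<1.0, 1.0, 1.0>>; // decodes to <<0, 0, 1>>
    print $n;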

It's easy to generate a view or map of an object's normals using false colors (red for x, green for y, blue for z). This is quite useful for creating "2D surface previews" for texturing in a 2D paint package, when you don't have access to a 3D paint application.


Example scene: display_object_normals.zip
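If you'd rather rebuild that scene than download it, the heart of the network is just a samplerInfo node feeding a surface shader. A minimal MEL sketch (the shader setup is mine, not necessarily identical to the scene file):

    // samplerInfo exposes the normal at each point being shaded
    string $samp = `shadingNode -asUtility samplerInfo`;
    string $shader = `shadingNode -asShader surfaceShader`;
    // quick and dirty false-color view: the camera-space normal
    // drives the color directly (see the notes below on spaces
    // and on remapping -1..1 into 0..1)
    connectAttr ($samp + ".normalCamera") ($shader + ".outColor");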



Note: the value returned by the samplerInfo node is expressed in camera space. To be most useful, a normal map should depend neither on the camera nor on where the object sits in space. So I use the camera's (here "persp") worldMatrix to transform the output of the samplerInfo node and obtain normals in world space, then the object's worldInverseMatrix to express them in object space. It would theoretically be cleaner to obtain this result using samplerInfo.matrixEyeToWorld, since connecting the camera directly to a shading network causes it to be reevaluated each time you change the viewpoint in the interactive window, but samplerInfo.matrixEyeToWorld still seems to be quite bugged at the moment.
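In node terms, the two matrix transforms can be chained with vectorProduct utilities set to "vector matrix product". A sketch continuing the one above ("persp" and "lowPolyShape" are stand-in names for your own camera and object):

    // camera space -> world space, via the camera's worldMatrix
    string $toWorld = `shadingNode -asUtility vectorProduct`;
    setAttr ($toWorld + ".operation") 3;  // vector matrix product
    connectAttr ($samp + ".normalCamera") ($toWorld + ".input1");
    connectAttr "persp.worldMatrix[0]" ($toWorld + ".matrix");

    // world space -> object space, via the object's worldInverseMatrix
    string $toObject = `shadingNode -asUtility vectorProduct`;
    setAttr ($toObject + ".operation") 3;
    setAttr ($toObject + ".normalizeOutput") 1;
    connectAttr ($toWorld + ".output") ($toObject + ".input1");
    connectAttr "lowPolyShape.worldInverseMatrix[0]" ($toObject + ".matrix");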

Finally, the normals obtained this way have x, y and z coordinates ranging from -1 to 1. To express them in the 0 to 1 color space of red, green and blue, I have to run them through a setRange node before outputting them to the color channel of my shader.
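Continuing the sketch, the remap is a single setRange node, which now replaces the quick direct connection made earlier:

    // remap the normals from -1..1 into the 0..1 color range
    string $range = `shadingNode -asUtility setRange`;
    setAttr ($range + ".oldMin") -type float3 -1 -1 -1;
    setAttr ($range + ".oldMax") -type float3 1 1 1;
    setAttr ($range + ".max") -type float3 1 1 1;  // min stays at 0
    connectAttr ($toObject + ".output") ($range + ".value");
    connectAttr -f ($range + ".outValue") ($shader + ".outColor");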

If I understood correctly, the idea behind Crytek's plugin is to use the normal map generated from the high resolution model on the low poly version. The results looked so nice that I decided to see if it was possible to pull off a similar stunt using a Maya shading network. Luckily, Alias recently released a free plugin collection for polygons ("Bonus Tools") that you can find on their website. I included the one I'm specifically using:

closestPointOnMesh.zip

Thanks to this nice plugin (though the node is quite slow to evaluate), for each sampled point of the low poly model I can look up the closest point on the high resolution version (which means the low poly and high poly models must sit in the same place during the normal map texture creation) and get the value of the normal at that point. A faster alternative would be to use Maya's new "Evaluate Surface" functionality or rayDisplace6.zip
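With the plugin loaded, the lookup can be wired into the same network. A sketch under the same assumptions as above ("highResShape" is a stand-in name, and I'm using the attribute names of the closestPointOnMesh node as I know them):

    // load the plugin first (Plug-in Manager or the loadPlugin command)
    string $cpm = `createNode closestPointOnMesh`;
    // the high resolution mesh, in world space
    connectAttr "highResShape.worldMesh[0]" ($cpm + ".inMesh");
    // for each shaded point of the low poly model, ask for the
    // closest point on the high res mesh...
    connectAttr ($samp + ".pointWorld") ($cpm + ".inPosition");
    // ...and use the high res normal instead of the low poly one,
    // feeding it into the world -> object space transform from before
    connectAttr -f ($cpm + ".normal") ($toObject + ".input1");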