I’ve experimented a lot with curves in Maya over the past year, so recently I decided to see if I could put together my own Maya hair plugin for fun. Currently I’m building this as a node-based system; I figure this gives artists the most freedom and also makes it easier to track down problems while developing.

At this point the system relies on guide curves at each vertex and provides the ability to use a density map, cache a hair pose, sample any Maya texture based on hair curve points, adjust length based on a sampled texture, vector-deform based on a sampled texture, and clump based on a sampled texture. Going forward there are a ton of features I’d like to add to make this more artist-friendly, but this is where I’m at today.

Here are some example videos I put together using nHair to drive dynamics:

and Clumping:

Recently I ran into an interesting problem which prompted me to create a node: I needed to run a particle simulation on top of a surface which had an animated vector displacement map assigned to it. The problem was that the original mesh didn’t carry any of this deformation, so I had to come up with a way to create an animated mesh representation of the displacement the renderer was producing. The requirements were something which could gather the points, normals, tangents and UVs of the mesh, supply them to the sampleShadingNetwork function and then displace with the returned values. Here’s a walkthrough of the code I’ve ended up with:

The first step is to get access to the name of the shading node and attribute plugged into inColor. This requires retrieving the MPlug associated with inColor and then querying its connections. As a backup in case nothing is plugged in, I’m storing the color value as queried from the datablock.

```cpp
MString shadingNodeAttribute = "";
MPlugArray connectedAttrArray;
MPlug inColorPlug = MPlug(thisMObject(), aInColor);
inColorPlug.connectedTo(connectedAttrArray, 1, 0);
if (connectedAttrArray.length()) {
    shadingNodeAttribute = connectedAttrArray[0].name();
} else {
    shadingNodeAttribute = "";
}

// get inColor in case nothing is plugged into the color attribute
MDataHandle inColorDataHandle = data.inputValue(aInColor);
MFloatVector inColor = inColorDataHandle.asFloatVector();
```

I’m querying all of the UV, point, tangent and normal data at once to save processing time; however, the UV, tangent and normal lists are poly-centric, meaning there is a separate vector or value for each polygon surrounding a vertex. This means a loop is required to retrieve the proper tangent IDs in order to pluck them from the array. I chose to do this per polygon and then average the polygons around each point in order to provide the smoothest information, especially in the case that the user provides a triangulated mesh.

```cpp
// arrays for sampleShadingNetwork
MFloatPointArray points;
MFloatVectorArray normals;
MFloatVectorArray binormals;
MFloatVectorArray tangents;
MFloatArray uCoords;
MFloatArray vCoords;

// face arrays for tangents
MFloatVectorArray faceBinormals;
MFloatVectorArray faceTangents;

// arrays which come in poly-centric and must be pruned
MFloatArray dirtyUCoords;
MFloatArray dirtyVCoords;
MFloatVectorArray dirtyBinormals;
MFloatVectorArray dirtyTangents;

// fill arrays
transMeshFn.getPoints(points, MSpace::kObject);
transMeshFn.getUVs(dirtyUCoords, dirtyVCoords);
transMeshFn.getTangents(dirtyTangents, MSpace::kObject);
transMeshFn.getVertexNormals(false, normals, MSpace::kObject);

// loop through polys, retrieve tangent IDs and populate the face tangent array
int numPolys = transMeshFn.numPolygons();
for (int i = 0; i < numPolys; i++) {
    MFloatVector avgTangent;
    MIntArray polyVerts;
    transMeshFn.getPolygonVertices(i, polyVerts);
    int polyVertsLen = polyVerts.length();
    for (int d = 0; d < polyVertsLen; d++) {
        int vertId = polyVerts[d];
        int tangentId = transMeshFn.getTangentId(i, vertId);
        avgTangent += dirtyTangents[tangentId];
    }
    faceTangents.append(avgTangent);
}
```

In the case of the UVs which will be supplied to sampleShadingNetwork, I’m querying the UV indices for the point and then using the first one in the list, as averaging these doesn’t make much sense. To get the binormal, I calculate the cross product of the tangent and the normal and multiply it by -1.

```cpp
while (!transMeshIt.isDone()) {
    int index = transMeshIt.index();
    MIntArray uvInd, faceInd;

    // get the faces connected to this point
    transMeshIt.getConnectedFaces(faceInd);
    int faceLen = faceInd.length();

    // average the tangents of the surrounding faces
    MVector avgTangent;
    for (int i = 0; i < faceLen; i++) {
        avgTangent += faceTangents[faceInd[i]];
    }
    avgTangent.normalize();
    tangents.append(avgTangent);

    // cross the tangent and normal to retrieve the binormal
    MVector avgBinormal = avgTangent ^ normals[index];
    binormals.append(-avgBinormal);

    // use the first uv index
    transMeshIt.getUVIndices(uvInd);
    uCoords.append(dirtyUCoords[uvInd[0]]);
    vCoords.append(dirtyVCoords[uvInd[0]]);

    transMeshIt.next();
}
```

With all of this data now in vertex-centric arrays, I’m ready to perform the sample, which returns an array of colors to work with. To perform the tangent-based displacement, I use the same arrays constructed for sampleShadingNetwork to build a matrix for each point. The sampled color values are then transformed by this matrix as a point, or, if world space is chosen, simply added to the current point.

```cpp
if (shadingNodeAttribute != "") {
    MRenderUtil::sampleShadingNetwork(
        shadingNodeAttribute, points.length(),
        false, false, cameraMatrix,
        &points, &uCoords, &vCoords, &normals,
        &points, &tangents, &binormals, NULL,
        colors, transps);
}

transMeshIt.reset(); // rewind the iterator after the first pass
while (!transMeshIt.isDone()) {
    int index = transMeshIt.index();
    MFloatVector normal = normals[index];
    MFloatVector tangent = tangents[index];
    MFloatVector binormal = binormals[index];
    normal.normalize();
    binormal.normalize();
    tangent.normalize();

    // construct a matrix from the tangent, normal and binormal
    double matrix[4][4] = {
        {tangent.x, tangent.y, tangent.z, 0},
        {normal.x, normal.y, normal.z, 0},
        {binormal.x, binormal.y, binormal.z, 0},
        {points[index].x, points[index].y, points[index].z, 1}};
    MMatrix instMatrix(matrix);

    MPoint colorPoint = inColor;
    // use the sampled color if a network was actually sampled
    if (shadingNodeAttribute != "") {
        colorPoint = colors[index];
    }

    // set the point based on the displacement type chosen
    if (inDisplacementType == 0) {
        post_trans_points.set(MFloatPoint(((colorPoint * inScale) * instMatrix) * inverseMatrix), index);
    } else if (inDisplacementType == 1) {
        post_trans_points.set(MFloatPoint(((colorPoint * inScale) + points[index]) * inverseMatrix), index);
    } else {
        post_trans_points.set(MFloatPoint(colorPoint * inScale), index);
    }

    transMeshIt.next();
}
```

After testing this I realized that, from inside the shading network feeding my node, you can gain access to all of the variables provided to sampleShadingNetwork through the samplerInfo node. So I chose to add the option to have the retrieved color simply become the new position of the point. This way you can perform more complex operations on the points using the nodes of the shading network and not be limited to a relative transformation.

Here’s a sample video of a mesh with vector displacement exported from MudBox:

Here’s a demo of what’s possible just by tapping into the samplerInfo node:

Email: vfxhoneybadger(at)gmail.com