Physical Sun and Sky and Linear Workflow by Xuan Prada

  • First of all activate Mental Ray in the Rendering Options.
  • Create a Physical Sun and Sky system.
  • Activate Final Gather. For now, selecting the Preview Final Gather preset should be enough; it’s just for testing purposes.
  • Check that the mia_exposure_simple lens shader has been added to the camera, and that its gamma is set to 2.2.
  • Launch a render and you’ll see that everything looks washed out.
  • We need to add a gamma correction node after each texture node, even procedural color shaders.
  • Connect the texture file’s outColor to the gammaCorrect node’s value, then connect the gammaCorrect node’s outValue to the shader’s diffuse (see the Python sketch after this list).
  • Use the value 0.455 in the gamma node.
  • The gamma correction for sRGB devices (with a gamma of approximately 2.2) is 1/2.2 = 0.4545. If your texture files are gamma corrected for gamma 2.2, put 0.455 into the Gamma attribute text boxes.
  • If you launch a render again, everything should look fine.
  • Once you are happy with the look of your scene, to do a batch render you need to set the gamma of the camera’s lens shader to 1.
  • Under the Quality tab, in the Framebuffer options, select RGBA float, set the gamma to 1 and the colorspace to Raw.
  • Render using OpenEXR and that’s it.
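
For reference, this gamma node setup can also be done with a couple of lines of Python in Maya. This is just a minimal sketch assuming a file texture called file1 and a shader called blinn1 (both placeholder node names); point the last connection at the diffuse attribute instead of color if your shader uses that name.

# Minimal sketch: insert a gammaCorrect node (0.455 = 1/2.2) between a file
# texture and a shader's color input. "file1" and "blinn1" are placeholders.
import maya.cmds as cmds

def insert_gamma(file_node='file1', shader='blinn1', gamma=0.455):
    gc = cmds.shadingNode('gammaCorrect', asUtility=True)
    for axis in 'XYZ':
        cmds.setAttr('{0}.gamma{1}'.format(gc, axis), gamma)
    cmds.connectAttr(file_node + '.outColor', gc + '.value', force=True)
    cmds.connectAttr(gc + '.outValue', shader + '.color', force=True)
    return gc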

Mari to Maya by Xuan Prada

Yes, I know: making your Mari textures work inside Maya can be a bit weird, especially if you have never worked with multiple UV spaces before.

I hope to give you some clues with this quick and dirty step by step tutorial.

I’m using the blacksmith guy from The Foundry, which has 40 textures at 4K resolution each.

  • First of all check your model and UVs.
  • Export all your textures from Mari. You know, right click on the desired channel and export.
  • Now you can type the naming convention that you want to use. I like to use COMPONENT_UDIM.tif, for example COL_1001.tif.
  • Check your output folder. All your textures should have been exported.
  • Import your model into Maya and check the UV mapping. You need to understand how the UV tiles are addressed inside Maya in order to offset your texture maps correctly.
  • The default UV tile has offset 0-0; the next tile to the right is offset by one in U, the next row up by one in V, and so on.
  • Open the first texture map called COL_1001.tif in the hypershade and rename the image node to COL_1001 and the 2D placement node to UDIM_1001.
  • Do the same with all the textures.
  • Select all the texture nodes and open the attribute spread sheet.
  • Set the default color RGB to 0.
  • Select all the 2D place texture nodes and open again the attribute spread sheet.
  • Switch off wrapU and wrapV.
  • Type the proper offsets into Translate Frame U and Translate Frame V.
  • Create a layered texture node.
  • Select all the texture image nodes and middle-mouse drag them from an empty space of the Hypershade onto the layered texture node’s attributes tab. This will create one layer for each texture map.
  • Delete the default layer.
  • Set the blending mode of all the layers to Lighten.
  • Connect the layered texture to the color input of a shader of your choice.
  • Repeat the whole process for all your channels (SPEC, BUMP, DISP, etc.). A scripted version of this setup is sketched right after this list.
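
If you have to repeat this for many channels, the whole setup can be scripted. Below is a rough Python sketch of the same idea: one file and place2dTexture node per UDIM, frame offsets computed from the tile number, wrapping switched off, default color set to black and everything combined in a layeredTexture set to Lighten. The texture folder, the naming convention and the blendMode enum value (8 for Lighten) are assumptions you should check against your own scene and Maya version.

# Rough sketch: rebuild the manual UDIM setup above with script.
# Paths follow the COMPONENT_UDIM.tif convention used in this post.
import maya.cmds as cmds

def udim_offsets(udim):
    # UDIM 1001 -> (0, 0), 1002 -> (1, 0), 1011 -> (0, 1), and so on.
    index = udim - 1001
    return index % 10, index // 10

def build_udim_layered_texture(channel='COL', udims=(1001, 1002, 1003, 1004),
                               folder='/textures'):
    layered = cmds.shadingNode('layeredTexture', asTexture=True,
                               name=channel + '_layered')
    for i, udim in enumerate(udims):
        u, v = udim_offsets(udim)
        tex = cmds.shadingNode('file', asTexture=True,
                               name='{0}_{1}'.format(channel, udim))
        place = cmds.shadingNode('place2dTexture', asUtility=True,
                                 name='UDIM_{0}'.format(udim))
        cmds.connectAttr(place + '.outUV', tex + '.uvCoord')
        cmds.connectAttr(place + '.outUvFilterSize', tex + '.uvFilterSize')
        cmds.setAttr(tex + '.fileTextureName',
                     '{0}/{1}_{2}.tif'.format(folder, channel, udim), type='string')
        cmds.setAttr(tex + '.defaultColor', 0, 0, 0, type='double3')
        cmds.setAttr(place + '.wrapU', 0)
        cmds.setAttr(place + '.wrapV', 0)
        cmds.setAttr(place + '.translateFrameU', u)
        cmds.setAttr(place + '.translateFrameV', v)
        cmds.connectAttr(tex + '.outColor',
                         '{0}.inputs[{1}].color'.format(layered, i))
        # 8 should be the Lighten entry of the blendMode enum; verify in your version.
        cmds.setAttr('{0}.inputs[{1}].blendMode'.format(layered, i), 8)
    return layered

Then connect the layeredTexture’s outColor to the color input of your shader, as in the last steps above.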

Mari to Softimage by Xuan Prada

Recently I was involved in a master class about texturing and shading for animation movies, and as promised I’m posting here the technical way to set up different UV sets inside Softimage.
Super simple process and a really efficient methodology.

  • I’m using this simple asset.
  • These are the UVs of the asset. I’m using different UV sets to increase the quality. In this particular asset you can find four 4K textures for each channel: Color, Specular and Bump.
  • You probably noticed that I’m using my own background image in the texture editor. I think this one is clearer for UV mapping than the default one. If you want, you can download the image, convert it to .pic and replace the original one located in C:\Program Files\Autodesk\Softimage 2012\Application\rsrc
  • This is the render tree set-up. Four 4K textures for color, specular and bump. Each set of four textures is mixed with a mix8color node.
  • Once everything is connected, you still need to offset each image node to match the UV ranges.
  • I know that the UV coordinates in Softimage are a bit weird, so find below a handy chart that will be helpful for further tasks (and, right after this list, a small script that prints the same offsets).
  • Keep in mind that you should turn off wrap U and wrap V for each texture in the UV editor.
  • Really quick render set-up for testing purposes.
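
Since the offset chart mentioned above is an image, here is a tiny plain-Python stand-in that prints the standard UDIM-to-offset relationship. It is only a quick reference; the exact remap values you type into Softimage may still need adjusting, as noted above.

# Stand-in for the chart: print the standard UDIM -> U/V tile offsets.
# Tile 1001 is the default one; tiles advance by one in U and wrap every ten into V.
def udim_chart(first=1001, last=1012):
    print('UDIM  U offset  V offset')
    for udim in range(first, last + 1):
        index = udim - 1001
        print('{0}  {1:8d}  {2:8d}'.format(udim, index % 10, index // 10))

udim_chart()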

Faking SSS in Softimage by Xuan Prada

SSS is a very nice shader that works really well with a good lighting setup, but it can be a very expensive shader when you’re using Mental Ray.
Find below a couple of techniques to deal with SSS more efficiently. Just keep in mind that these tricks can improve your render times a bit, but they will never reach the same quality as using real SSS.

  • I’m using this simple scene, with one key light (left), one fill light (right) and one rim light.
  • An SSS compound is connected to the material’s surface input, and the SSS_lightmap node (you can find it in the render tree -> user tools) is connected to the lightmap input of the Simple SSS. Then the Simple SSS lightmap is connected to the material’s lightmap input.
  • Set the output path and resolution of your lightmap.
  • Hit a render and check the render time.
  • Disconnect the lightmap.
  • Render again and check the render times as well; we have improved them.
  • If you really need to fake the SSS and render very fast, you can bake the SSS to a texture using RenderMap, but keep in mind that the result will be much worse than using real SSS. Anyway, you can do that for background assets or similar.
  • Now you can use another cheaper shader like blinn, phong or even constant with your baked SSS.
  • As you can see, the render is now much faster.

Dealing with normal maps in Softimage by Xuan Prada

Yes I know, working with normal maps in Softimage is a bit weird sometimes, especially if you have worked before with the 3ds Max normal+bump preset.

I’ve been using the same method over the years and it has worked fine for me; maybe it will be useful for you too.
I prefer to generate the normal maps inside Softimage rather than in Mudbox or ZBrush; it usually works much better, according to my tests with different assets.

  • So, you should import both geometries, high and low, into the same scene. Don’t be afraid of high-poly meshes: Softimage allows you to import meshes with millions of polygons directly from Mudbox or ZBrush.
  • With both meshes in your scene, make sure they are perfectly aligned.
  • Check the UV mapping of the low resolution mesh.
  • Select the low resolution mesh and open the ultimapper tool.

- The most important options are:

  • Source: You have to click on your high resolution mesh.
  • Path: Where your normal map texture will be placed.
  • Prefix: A prefix for your texture.
  • Type: You can choose between different image formats.
  • Normal in tangent space: The most common normal map type.
  • Resolution: Speaks for itself.
  • Quality: Medium is fine. If you choose High, the baking time will increase a lot.
  • Distance to surface: Click the Compute button to generate this parameter.
  • Click Generate and Softimage will take some time to generate the normal map.
  • The normal map is ready.
  • Hide your high resolution mesh.
  • Grab one of the MR shaders and drag it to your mesh.

- Use a normal map node connected to the bump map input of the shader.

  • Choose the normal map generated before.
  • Select the correct UVs.
  • Select tangents mode.
  • Uncheck unbiased tangents.
  • Hit a render and you’ll see your normal map in action.
  • Cool. But now one of the most common procedures is combining a normal map with a bump map.
  • I’m using the image above.
  • If you use a bump map generator connected into the bump map input you will have a nice bump map effect.
  • Find below the final render tree combining both maps, normal and bump.
  • The first bump map generator has two inputs: a color matte, which is plain white, and the normal map with the options I already commented on before. Be sure to select Relative to Input Normal in the Base Normal option of the bump map generator.
  • The second bump map generator is your bump texture, where you can control the intensity by increasing or decreasing the factor value.
  • The vector math node allows you to combine both bump map generators.
  • Connect the first bump map generator to the first input and the second one to the second input.
  • In the Operation option, select vector input1 + vector input2 (a rough sketch of what this does follows after this list).
  • Final render.
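
To make the vector math step less abstract, here is a rough conceptual sketch in plain Python/NumPy (not Softimage code): each generator outputs a perturbation vector, the vector math node adds them (vector input1 + vector input2), and the result is normalized and used as the shading normal.

# Conceptual sketch only, not the Softimage API: combine a normal-map normal
# with a bump perturbation by adding the two vectors and renormalizing,
# which is roughly what "vector input1 + vector input2" does in the render tree.
import numpy as np

def combine_normal_and_bump(normal_map_n, bump_delta):
    combined = np.asarray(normal_map_n, dtype=float) + np.asarray(bump_delta, dtype=float)
    return combined / np.linalg.norm(combined)

# Example: a tangent-space normal tilted a little further by a bump gradient.
print(combine_normal_and_bump([0.1, 0.2, 0.97], [0.05, 0.0, 0.0]))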

Baking between UV sets in Maya by Xuan Prada

One of the most useful workflows when you are texturing is baking your textures from one UV set to another. You will need to do this for different reasons; one of them, for example, could be using models at different resolutions with different UV mapping, or using a separate UV mapping for grooming, etc.

The first time I tried to do this in Maya I realized that the Mental Ray Batch Bake tool doesn’t work properly; I don’t know why, but I couldn’t use it.

I solved the problem using Maya’s Transfer Maps tool and decided to write it down for future reference.

  • Check the different UV sets in Maya.
  • Apply your textures and shaders.
  • I'm using six different shaders with six different texture maps.
  • If you use the Mental Ray Batch Bake tool (commonly used for baking) and configure all the parameters, you’ll realize that the baked maps are completely black. Something is wrong related to UV sets. A bug? I don’t know.
  • You need to use the Maya Transfer Maps tool: Lighting/Shading -> Transfer Maps (a scripted version is sketched after this list).
  • Duplicate the mesh and rename the copies to source and target.
  • Select the target and its UV set.
  • Select source.
  • Select the desired map to bake (probably diffuse).
  • Select the path.
  • Select resolution.
  • Bake.
  • Your baked texture is ready.
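
The Transfer Maps UI is driven by the surfaceSampler command, so the same bake can be scripted. Below is a hedged Python sketch: the mesh names, UV set, path and format are placeholders, and the exact flags should be checked against the surfaceSampler documentation for your Maya version.

# Hedged sketch: bake the diffuse color of "source" onto the UV set of "target".
# Names, path and format are placeholders; verify the flags in the docs.
import maya.cmds as cmds

cmds.surfaceSampler(
    source='source',          # mesh carrying the shaders and textures
    target='target',          # mesh whose UV set receives the bake
    uvSet='map2',             # UV set on the target to bake into
    mapOutput='diffuseRGB',   # bake the diffuse color
    mapWidth=4096,
    mapHeight=4096,
    filename='/textures/baked_diffuse',
    fileFormat='tif',
)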

Mudbox and UDIMs by Xuan Prada

When you’re going to texture an asset that already has a displacement map, you’ll probably want to apply that displacement to your mesh before starting the painting process.

In my pipeline, I usually apply the displacement map in Mudbox and then I export the high resolution mesh to Mari.

The problem here is that Mudbox doesn’t allow you to work with displacement maps and multiple UV shells.

Below I try to find a solution to this problem.

  • Check your UV mapping in Maya.
  • I’m using these simple displacement maps here.
  • One map for each UV shell.
  • Export as .obj
  • Open in Mudbox and subdivide.
  • Go to maps -> sculpt model using displacement map.
  • Select your mesh and your displacement map.

As you’ll realize, Mudbox doesn’t allow you to choose different maps for each UV shell, which means that Mudbox will only be able to sculpt using the displacement map for the default U0-1 V0-1 coordinates. Big problem.

The way I’ve found to solve this problem is:

  • Go back to Maya.
  • Select your mesh and open the UV Texture Editor.
  • Select one of the UV shells that is outside the default U0-1 V0-1 range.
  • Open the script editor and type -> polyEditUV -u -1 -v 0 ;
  • You’ll notice that the second UV shell is now placed inside the default UV range, moved exactly one tile, so your displacement texture will match perfectly.
  • Export again as .obj
  • Now you can use your displacement map in Mudbox without problems.
  • Repeat the process for each UV shell.
  • Commands to move UV shells exactly one tile (a Python sketch that automates these shifts follows below):

Move left -> polyEditUV -u -1 -v 0 ;

Move right -> polyEditUV -u 1 -v 0 ;

Move up -> polyEditUV -u 0 -v 1 ;

Move down -> polyEditUV -u 0 -v -1 ;
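
If you have many shells to move, the same polyEditUV shifts can be automated. This is a minimal Python sketch, assuming you have the UVs of one shell selected; it moves them back into the default tile by a whole number of positions.

# Minimal sketch: shift the selected UVs back into the default U0-1 V0-1 tile
# by whole tiles, the scripted equivalent of the polyEditUV commands above.
import math
import maya.cmds as cmds

def move_shell_to_default_tile():
    uvs = cmds.ls(selection=True, flatten=True)   # the selected .map[] components
    if not uvs:
        cmds.warning('Select the UVs of one shell first.')
        return
    coords = [cmds.polyEditUV(uv, query=True) for uv in uvs]
    u_shift = -math.floor(min(c[0] for c in coords))   # whole tiles back in U
    v_shift = -math.floor(min(c[1] for c in coords))   # whole tiles back in V
    cmds.polyEditUV(uvs, relative=True, u=u_shift, v=v_shift)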

Selection groups in Mari by Xuan Prada

When you are working with huge assets, it is very useful to keep everything organized.
One of the best ways to do it inside Mari is using selection groups.

  • Go to view -> palettes -> selection groups.
  • Select faces, elements or objects.
  • Click on the plus icon to create a new selection group based on your current selection.
  • You can create different selection groups based on different parts of your asset.
  • Now you can focus on just one specific area of your asset.