look-dev

Multi UDIM workflow in Modo by Xuan Prada

As you know, I’m migrating to Modo.
The multi UDIM workflow is part of my daily tasks, so this is how I do the setup.

  • First of all, check textures and UDIMs in Mari and export all the textures.
  • Check the asset and UVs in Modo.
  • Load all your textures in Modo’s image manager.
  • Create a new material for the asset.
  • Add all the UDIM textures as image layers for each required channel.
  • In the texture locator for each texture, set the horizontal repeat and vertical repeat to Reset, and change the UV offset. Modo works with negative offset values (not like Softimage or Maya); see the sketch after this list.
  • That’s it. Do a test render to check that everything works fine.
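
For reference, this is the arithmetic I use to work out the offsets for each tile. It’s a minimal Python sketch of the UDIM numbering, not a Modo API call; the sign flip at the end matches the negative offsets mentioned above, so double check one tile in the viewport before applying it everywhere.

    # UDIM to tile offsets, returned as negative values for Modo's texture locator
    def udim_to_modo_offset(udim):
        index = udim - 1001
        u_tile = index % 10       # column, 0-9
        v_tile = index // 10      # row
        return -float(u_tile), -float(v_tile)

    print(udim_to_modo_offset(1001))   # (0.0, 0.0)
    print(udim_to_modo_offset(1012))   # (-1.0, -1.0)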

Arm texture breakdown by Xuan Prada

I did a simple and quick texture breakdown for a human arm.
These are the textures that I usually create when I need to texture digital doubles for films or any kind of humanoid character.

These are the most basic textures used.
Usually when working on movies we need additional textures depending on the render engine, other pipeline tools or artistic decisions.
But as I said, take this example as a base or starting point for your work.

These are quick renders using a neutral lighting rig for look-dev.

Diffuse textures.

Overall textures.

Scatter textures.

Displacement textures.

Fine displacement textures.

Specular textures.

Zbrush displacement in V-Ray for Maya by Xuan Prada

It is always a bit tricky to set up Zbrush displacements in the different render engines.
If you recently moved from Mental Ray or another engine to V-Ray for Maya, there are a few things you should know about displacement maps extracted from Zbrush.

Here is a simple example of my workflow for dealing with these kinds of maps in V-Ray.

  • First of all, drag and drop your 16-bit displacement map into the displacement channel inside the shading group attributes.
  • Maya will create a displacement node for you in the Hypershade. Don’t worry too much about this node, you don’t need to change anything there.
  • Select your geometry and add a V-Ray extra attribute to control the subdivisions and displacement properties.
  • If you exported your displacement subdividing the UVs, you should check that property in the V-Ray attributes.
  • Edge length and max subdivs are the most important parameters. Play with them until you reach nice results.
  • Displacement amount is the strength of your displacement, and displacement shift should be negative half of your displacement amount if you are using 16-bit textures (see the sketch after this list).
  • If you are using 32-bit .exr textures, the displacement shift should be 0 (zero).
  • Select your 32-bit .exr file and add the V-Ray attribute called allow negative colors.
  • Render and check that your displacement is looking good.
  • I’ve been using these displacement maps: 16-bit and 32-bit.
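
If you prefer to set this up with a script, below is a minimal Maya Python sketch of the same idea. It assumes V-Ray for Maya is loaded; the shape name is a placeholder, and the attribute group and attribute names (vray_subdivision, vray_displacement, vrayDisplacementAmount, vrayDisplacementShift) are the ones I get on my build, so verify them with cmds.listAttr if yours differ.

    import maya.cmds as cmds
    import maya.mel as mel

    shape = 'bodyShape'      # placeholder shape node
    amount = 1.0             # displacement amount used when exporting from Zbrush
    is_32bit = False         # True for 32-bit floating point .exr maps

    # add the V-Ray subdivision and displacement extra attributes
    mel.eval('vray addAttributesFromGroup "%s" "vray_subdivision" 1;' % shape)
    mel.eval('vray addAttributesFromGroup "%s" "vray_displacement" 1;' % shape)

    # 16-bit maps are mid-grey based: shift by minus half the amount.
    # 32-bit float maps store real displacement values: no shift needed.
    shift = 0.0 if is_32bit else -amount / 2.0
    cmds.setAttr(shape + '.vrayDisplacementAmount', amount)
    cmds.setAttr(shape + '.vrayDisplacementShift', shift)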

Vray sss test by Xuan Prada

Just testing Vray’s SSS shader for realistic skin look-dev purposes.
I came to the conclusion that it would be quite simple to set up a nice, realistic and cheap SSS shader for human and creature assets. I love the raytraced solid scatter, but with complex models I can’t get rid of some of the artifacts in the SSS channel.
I will post more soon.

  • To achieve better results, I like to combine SSS shaders with VRayMtl shaders, which have better solutions for speculars and reflections. With this method the reflection of the surface is controlled by the VRayMtl BRDF instead of the poor spec control of the SSS shader. A quick sketch of this setup follows below.
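
This is just a minimal Maya Python sketch of that idea: create the SSS shader, the VRayMtl and a VRayBlendMtl to layer them. I’m deliberately not wiring the blend connections here, because the blend node’s plug names change between V-Ray versions; list them first and connect by hand.

    import maya.cmds as cmds

    sss   = cmds.shadingNode('VRayFastSSS2', asShader=True, name='skin_sss')
    spec  = cmds.shadingNode('VRayMtl',      asShader=True, name='skin_spec')
    blend = cmds.shadingNode('VRayBlendMtl', asShader=True, name='skin_blend')

    # the SSS shader's own specular gets turned off in its attributes;
    # the VRayMtl provides the BRDF based reflections instead.
    # print the blend node's material slots to find the base/coat plugs
    # on your V-Ray build before connecting sss and spec to it.
    print(cmds.listAttr(blend, string='*material*'))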

Texturing for VFX film projects. Case study by Xuan Prada

These are the key points of an introductory lecture I gave about texturing for VFX film projects.
We used different assets in the class, but this is the only one that is not copyrighted and can be shown here.
I created this asset specifically for this course.

Summary

- Check the model.
- Render a checker scene.
- Decide about the quality needed for the textures. Is it a hero asset?
- UV mapping.
- Organization methods.
- How many UDIMs?
- Photo Shoot.
- What kind of lighting do I need?
- Accessories. (Color checkers, tripod, polarized filters, angular base, etc).
- Bakes. (dirt maps, dust maps, UVs, etc).
- Grading reference images. Create presets.
- Clean reference images for projections.
- Create cameras and guides in Maya/Softimage for projections.
- Adapt graded and cleaned reference images for projection guides.
- Project in 3D software or Mari. (Mari should be faster).
- Work on the projections inside Mari. (We can use Photoshop, Mari or both of them. Even Nuke).
- Create 16-bit sRGB colour textures.
- Test colour channel in the light rig.
- Create 16-bit grayscale specular textures.
- Create 16-bit grayscale bump textures.
- Create 16-bit grayscale displacement textures.
- Create 8-bit grayscale ISO textures.
- Look-Dev blocking.
- Import the light rig.
- Create a basic pass.
- Checker render (matte).
- Checker render (reflective).
- Create clusters.
- Block materials.
- Look-Dev primary.
- Set up diffuse.
- Set up specular and reflections.
- Balance materials.
- Look-Dev secondary.
- Set up bump.
- Set up displacement.
- Rebalance materials.
- Set up ISOs.
- Look-Dev refinement.
- Rebalance materials if needed.
- Create material libraries.
- Render turntables.

Basic displacement in RenderMan by Xuan Prada

  • Select the object’s shape node in the Attribute Editor and then go to Attributes -> RenderMan -> Add Subdivision Scheme. This will create a smooth surface.
  • Load your displacement texture in the Hypershade.
  • Play with the Alpha Gain and Alpha Offset to scale the image.
  • Alpha Offset should be negative half of Alpha Gain. So if Alpha Gain is 2, Alpha Offset should be -1 (see the sketch after this list).
  • Drag the displacement texture on to the displacement material in the shading group attributes.
  • This will create a displacement node.
  • Select the displacement node and go to Attributes -> RenderMan -> Add Displacement Attributes.
  • Set the displacement bound to something similar to your highest displacement value.
  • If you are using ray trace rendering you need to add ray traced attributes to your displacement.
  • Select your shape node and go to Attribute -> RenderMan -> Manage attributes and select TraceDisplacement.
  • Turn the shading rate down to increase the quality of your displacement. You can add a RenderMan attribute to control this instead of changing the global render options; you’ll save a lot of render time.
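
The Alpha Gain / Alpha Offset relationship from the list above, as a tiny Maya Python sketch (the file node name is a placeholder):

    import maya.cmds as cmds

    disp_file = 'dispFile'   # placeholder: the file node holding the displacement map
    gain = 2.0               # overall displacement scale
    cmds.setAttr(disp_file + '.alphaGain', gain)
    cmds.setAttr(disp_file + '.alphaOffset', -gain / 2.0)   # always minus half the gain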

Linear Workflow in Maya with Vray 2.0 by Xuan Prada

I’m starting a new job with V-Ray 2.0 for Maya. I have never worked with this render engine before, so first things first.
One of my first tasks is to create a nice neutral light rig for testing shaders and textures. Setting up a linear workflow is one of my priorities at this point.
Find below a quick way to set this up.

  • Set up your gamma. In this case I’m using 2.2.
  • Click on “don’t affect colors” if you don’t want to bake the gamma correction into the final render; with it enabled the output stays linear and you’ll have to correct your gamma in post. No big deal.
  • The linear workflow option is something created by Chaos Group to fix old V-Ray scenes which don’t use lwf. You shouldn’t use it at all.
  • Click on affect swatches to see color pickers with the gamma applied.
  • Once you are working with gamma applied, you need to correct your color textures. There are two different options to do it.
  • First option: add a gamma correction node to each color texture node. In this case I’m using gamma 2.2, which means I need to use a value of 0.455 (1/2.2) on my gamma node. See the sketch after this list.
  • Second option: Instead of using gamma correction nodes for each color texture node, you can click on the texture node and add a V-Ray attribute to control this.
  • By default all the texture nodes are being read as linear. Change your color textures to be read as sRGB.
  • Click on view as sRGB in the V-Ray frame buffer, otherwise you’ll see your renders in the wrong color space.
  • This is the difference between rendering with the option “don’t affect colors” enabled or disabled. As I said, no big deal.
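
For the first option, this is a minimal Maya Python sketch of inserting a gamma correction node between a colour file texture and a material’s colour input; the file and material names are placeholders.

    import maya.cmds as cmds

    gamma = 2.2
    inv = 1.0 / gamma    # 1 / 2.2 = approximately 0.455

    gc = cmds.shadingNode('gammaCorrect', asUtility=True, name='colour_degamma')
    for axis in 'XYZ':
        cmds.setAttr(gc + '.gamma' + axis, inv)

    # placeholder node names: a colour file texture and a material colour input
    cmds.connectAttr('colourFile.outColor', gc + '.value', force=True)
    cmds.connectAttr(gc + '.outValue', 'test_mtl.color', force=True)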

Mari to Softimage by Xuan Prada

Recently I was involved in a master class about texturing and shading for animation movies, and as promised I’m posting here the technical way to set up different UV sets inside Softimage.
It’s a super simple process and a really efficient methodology.

  • I’m using this simple asset.
  • These are the UVs of the asset. I’m using different UV sets to increase the quality. In this particular asset you can find four 4k textures for each channel: color, specular and bump.
  • You probably noticed that I’m using my own background image in the texture editor. I think this one is clearer for UV mapping than the default one. If you want, you can download the image, convert it to .pic and replace the original one located in C:\Program Files\Autodesk\Softimage 2012\Application\rsrc
  • This is the render tree set-up. Four 4k textures for color, specular and bump. Each set of four textures is mixed with a mix8color node.
  • Once everything is connected, you still need to offset each image node to match the UV ranges.
  • I know that the UV coordinates in Softimage are a bit weird, so find below a nice chart which will be helpful for further tasks.
  • Keep in mind that you should turn off wrap U and wrap V for each texture in the UV editor.
  • Really quick render set-up for testing purposes.

Faking SSS in Softimage by Xuan Prada

SSS is a very nice shader which works great with a good lighting setup, but it can be quite an expensive shader when you’re using Mental Ray.
Find below a couple of techniques to deal better with SSS. Just keep in mind that these tricks can improve your render times a bit, but they will never reach the same quality as real SSS.

  • I’m using this simple scene, with one key light (left), one fill light (right) and one rim light.
  • An SSS compound is connected to the material surface input, and the SSS_lightmap node (you can find it in the render tree -> user tools) is connected to the lightmap input of the SimpleSSS. Then the SimpleSSS lightmap is connected to the material lightmap input.
  • Set the output path and resolution of your lightmap.
  • Hit a render and check the render time.
  • Disconnect the lightmap.
  • Render again and check the render times as well. We have improved the times.
  • If you really need to fake the SSS and render very fast, you can bake the SSS to a texture using RenderMap, but keep in mind that the result will be much worse than using SSS. Anyway, you can do that for background assets or similar.
  • Now you can use a cheaper shader like Blinn, Phong or even constant with your baked SSS.
  • As you can see, the render is now much faster.

Dealing with normal maps in Softimage by Xuan Prada

Yes I know, working with normal maps in Softimage is a bit weird sometimes, especially if you have worked before with the 3ds Max normal+bump preset.

I’ve been using the same method over the years and it has suited me fine; maybe it will be useful for you too.
I prefer to generate the normal maps inside Softimage rather than in Mudbox or Zbrush; it usually works much better according to my tests with different assets.

  • So, you should import both geometries, high and low, into the same scene. Don’t be afraid of high poly meshes, Softimage allows you to import meshes with millions of polygons directly from Mudbox or Zbrush.
  • With both meshes in your scene be sure that they are perfectly aligned.
  • Check the UV mapping of the low resolution mesh.
  • Select the low resolution mesh and open the ultimapper tool.

- The most important options are:

  • Source: You have to click on your high resolution mesh.
  • Path: Where your normal map texture will be placed.
  • Prefix: A prefix for your texture.
  • Type: You can choose between different image formats.
  • Normal in tangent space: The most common normal map type.
  • Resolution: Speaks for itself.
  • Quality: Medium is fine. If you choose high, the baking time will increase a lot.
  • Distance to surface: Click on Compute button to generate this parameter.
  • Click on generate and Softimage will take some time to generate the normal map.
  • The normal map is ready.
  • Hide your high resolution mesh.
  • Grab one of the MR shaders and drag it to your mesh.

- Use a normal map node connected to the bump map input of the shader.

  • Choose the normal map generated before.
  • Select the correct UVs.
  • Select tangents mode.
  • Uncheck unbiased tangents.
  • Hit a render and you’ll see your normal map in action.
  • Cool. But now one of the most common procedures is combining a normal map with a bump map.
  • I’m using the image above.
  • If you use a bump map generator connected into the bump map input you will have a nice bump map effect.
  • Find below the final render tree combining both maps, normal and bump.
  • The first bump map generator has two inputs: the color matte, which is a plain white color, and the normal map with the options I already mentioned before. Be sure to select “relative to input normal” in the base normal option of the bump map generator.
  • The second bump map generator is your bump texture where you can control the intensity increasing or decreasing the factor value.
  • The vector math vector node allows you to combine both bump map generators.
  • Connect the first bump map generator to the first input and the second one to the second input.
  • In the operation option select vector input1 + vector input2.
  • Final render.

Inverted occlusion in 3D Max by Xuan Prada

People asked me for a step by step installation and usage of Binary Alchemy Color Ray Length shader in 3D Max.
Here we go.

Installation

  • Download BA Shaders for 3D Max.
  • Copy .dll files here -> “3ds Max 2010\mentalray\shaders_3rdparty\shaders”
  • Copy .mi files here -> “3ds Max 2010\mentalray\shaders_3rdparty\include”
  • Edit “3rdparty.mi” located here -> 3ds Max 2010\mentalray\shaders_3rdparty
  • Your “3rdparty.mi” should end up looking something like this.
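
In case the screenshot doesn’t make it here, the entries in “3rdparty.mi” are just a link line for each .dll plus an include line for each .mi declaration file. The file names and paths below are placeholders, so mirror the entries already in your file and use the names that ship with the BA package:

    link "shaders_3rdparty/shaders/BA_color_raylength.dll"
    $include "shaders_3rdparty/include/BA_color_raylength.mi"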

Usage

  • Create a matte/shadow shader and uncheck “receive shadows” and “use ambient occlusion”.
  • In the “camera mapped background” input, connect a “BA_color_raylength” shader.
  • Play with the “spread” to control the behaviour of the occlusion.
  • Once rendered you’ll have something similar to this.
  • Mix the “BA_color_raylength” with procedural maps or bitmaps to improve the result.

Edit: The most important parameters to play with are “spread” and “far output”.

Inverse dirt maps by Xuan Prada

Sometimes it is very useful to generate inverse occlusion bakes as an interesting starting point to paint our dirt maps.
The Vray dirt map is perfect for this goal, but if you don’t work with Vray, it is very easy to do the same with Mental Ray and the Binary Alchemy shaders.

  • You need to install the Binary Alchemy shaders. Some packages are free and you will have to pay for others.
  • Apply a “surface shader” to the object and connect a “BA_color_raylength” to it.
  • Put this shader in “Inverted Normal” mode and play with its parameters.
  • We get an inverted “ambient occlusion”.
  • Use a “blend colors”, “layered shader” or similar to combine this inverted occlusion with a nice bitmap.

Worn edges by Xuan Prada

This technique is based on “worn edges techniques” by Neil Blevins.

Requirements

  • 3D Max Scanline Render
  • SoulburnScripts
  • Warp Texture Script
  • All the objects must have a correct UV mapping

Procedure

  • We must complete the UV mapping of the objects perfectly, without overlapping and similar common issues.
  • To reach better results, we need more geometry information, especially in the corners.
  • For that purpose, duplicate the objects, rename them and apply some bevels to the corners and one or two TurboSmooths if necessary (but first try adding only bevels).
  • Note: All the object meshes must be “Editable Poly”.
  • Select the object and execute “Corner edge to vertex map” script.
  • We will have to play with the low and high angle parameters, especially decreasing the intensity of the low angle the more complex the object’s geometry is.
  • The next step is to distort this mask created by vertex color, to give it a more chaotic shape and therefore a more realistic look.
  • We need to download the “warp texture” plugin.
  • In a standard material connect the warp texture to the diffuse channel.
  • In the target input connect a vertex color map. By default 3ds Max uses the vertex information which we generated previously with the corner edge to vertex script.
  • In the warp input connect a procedural noise, whose parameters will vary depending on the scene scale and object size.
  • If we hit a render we get a pretty decent result, but we need to define our mask better.
  • If we put an output in the vertex color channel, we can play with the curve to emphasize the result.
  • In the noise we can also play with its output.
  • To finish, we can bake this mask to paint it in a more appropriate software.

Dust shader by Xuan Prada

This shader is completely procedural, easy to set up and quick to render. It's basic, but it gives you the sense of real dust, adding some realistic properties and richness to your props.

It might be useful for environments like basements, storage rooms, etc.
I'll be proposing two different ways to create this shader.

The first version is probably better, in that it looks more realistic, but render times are also higher.
The second one is simpler and less realistic, but it renders way faster.
It is always a good idea to know how to set up both versions, depending on your production needs, so here we go.

Dust_material_vray_v001

This material has been created using a blend node that mixes two different shaders using a mask. You can use a procedural or a bitmap mask.

The first shader in the mix is the one designed to create the surface properties of your asset. In this case it's called "teapot", a very simple shader with some reflection properties.

The second shader in the mix is the one designed to be the dust. It's a simple shader without any kind of reflection, just matte properties.

You can also add some procedural noise as a bump map, to simulate dusty areas and differentiate them from the teapot material.

As mask I'm using a towards/away falloff in z direction. Playing with the mix-curve you can control the behaviour of the mix.

In the white colour of the mask we can add some procedural noise to create variation.

The best way to adjust all the shader parameters is to test them one by one in different renders.

Teapot shader.

Dust shader.

Dust mask.

Final render.

Improved dirt maps by Xuan Prada

When we start the task of texturing props, environments or characters, one of the first steps we take after completing the UV mapping is baking different texture maps or procedural maps (or any other type) to later use them as a base to paint our textures.
Probably the most commonly baked map is the ambient occlusion, to get a base of dirt in logical areas, which we will then modify in Photoshop, BodyPaint or Mudbox.

In my own experience occlusion bakes are incredibly useful but a little bit boring.
Find below a simple method to create more fun and lively ambient occlusion bakes.

  • Using a constant shader or any other that is not affected by lighting, we mix two inputs using mix2colors, blendColors or Mixer.
  • In channel 1 of the mix we use a map; it can be procedural or a bitmap. In this example I’m using a tiled bitmap.
  • In channel 2 of the mix we use a white color.
  • In the mask of the mix we use an ambient occlusion.

With this method the bitmaps or procedural maps get masked by the occlusion, creating dirt in the logical, less exposed areas, but with the variation given by the map.
The keys to good results are the configuration of the occlusion and the quality of the maps.
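
As a reference, here is a minimal Maya sketch of that network using blendColors and mental ray’s mib_amb_occlusion (the mentalray plugin has to be loaded and the file names are placeholders). Note that blendColors weights colour 1 by the mask value, so the white and the dirt map end up in the opposite slots compared to the description above.

    import maya.cmds as cmds

    dirt = cmds.shadingNode('file', asTexture=True, name='dirt_tile')
    cmds.setAttr(dirt + '.fileTextureName', 'dirt_tile.tif', type='string')  # placeholder path

    blend = cmds.shadingNode('blendColors', asUtility=True, name='dirt_mix')
    cmds.setAttr(blend + '.color1', 1, 1, 1, type='double3')                 # open areas: white
    cmds.connectAttr(dirt + '.outColor', blend + '.color2', force=True)      # occluded areas: dirt map

    # occlusion drives the mix; the output attribute name is the one I get
    # on my build, check it with cmds.listAttr if it differs
    occ = cmds.shadingNode('mib_amb_occlusion', asTexture=True, name='dirt_occ')
    cmds.connectAttr(occ + '.outValueR', blend + '.blender', force=True)

    # a constant style shader so lighting doesn't affect the bake
    shader = cmds.shadingNode('surfaceShader', asShader=True, name='dirt_bake_mtl')
    cmds.connectAttr(blend + '.output', shader + '.outColor', force=True)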

This is the scene that I'm using for this example.

Regular ambient occlusion render.

Ambient occlusion render mixed with bitmap textures.

Dirt shader in Softimage.

Dirt shader in Maya.

Dirt shader in Max part 1/2

Dirt shader in Max part 2/2