UDIM workflow in Nuke by Xuan Prada

Texture artists, matte painters and environment artists often have to deal with UDIMs in Nuke. This is a very basic template that hopefully can illustrate how we usually handle this situation.

Cons

  • Slower than using Mari. Each UDIM is treated individually.
  • No virtual texturing, so a slower workflow. Yes, you can use Nuke's proxies, but they are not as good as virtual texturing.

Pros

  • Not dependent on a paint buffer: always the best resolution available.
  • Non-destructive workflow: nodes!
  • Save around £1,233 on Mari's license.

Workflow

  • I'll be using this simple footage as the base for my matte.
  • We need to project this in Nuke and bake it onto different UDIMs to use it later in a 3D package.
  • As geometry support I'm using this plane with 5 UDIMs.
  • In Nuke, import the geometry support and the footage.
  • Create a camera.
  • Connect the camera and footage using a Project 3D node.
  • Disable the crop option of the Project 3D node. If you don't, the projection won't extend beyond the 0-1 UV range.
  • Use a UV Tile node to point to the UDIM that you need to work on.
  • Connect the img input of the UV Tile node to the geometry support.
  • Use a UV Project node to connect the camera and the geometry support.
  • Set projection to off.
  • Import the camera of the shot.
  • Look through the camera in the 3D view and the matte should be projected on to the geometry support.
  • Connect a Scanline Render to the UV Project.
  • Set the projection model to UV.
  • In the 2D view you should see the UDIM projection that we set previously.
  • If you need to work with a different UDIM just change the UV Tile.
  • So this is the basic setup (a Nuke Python sketch of it follows this list). Do whatever you need in between, like projections, painting and so on, to finish your matte.
  • Then export all your UDIMs individually as texture maps to be used in the 3D software.
  • Here I just rendered the UDIMs extracted from Nuke in Maya/Arnold.
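
If you prefer to build this setup through Nuke's Python API, here is a minimal sketch of the node graph described above. The file paths, the UDIM number and some node class and knob names are assumptions (they vary slightly between Nuke versions), so treat it as a starting point rather than a drop-in script.

```python
# Minimal Nuke Python sketch of the UDIM projection/bake setup described above.
# File paths, the UDIM number and some class/knob names are assumptions; adjust
# them to your scene and Nuke version (some classes carry a "2" suffix).
import nuke

footage = nuke.nodes.Read(file='/path/to/matte_plate.####.exr')
geo = nuke.nodes.ReadGeo2(file='/path/to/plane_5_udims.obj')
cam = nuke.nodes.Camera2(name='projection_cam')   # or the imported shot camera

# Project the footage through the camera. Crop must be off, otherwise the
# projection is clipped to the 0-1 UV range.
project = nuke.nodes.Project3D(inputs=[footage, cam])
project['crop'].setValue(False)

# Attach the projection to the geometry (an ApplyMaterial node is one way to
# do this; the original post wires this part up in its screenshots).
shaded_geo = nuke.nodes.ApplyMaterial(inputs=[geo, project])

# Shift the UDIM you want to bake into the 0-1 range. The knob name for the
# tile differs between Nuke versions, so check the UVTile properties panel,
# e.g. something like: uv_tile['udim'].setValue(1002)
uv_tile = nuke.nodes.UVTile(inputs=[shaded_geo])

# UV Project connecting the camera and the geometry, with projection set to off.
uv_project = nuke.nodes.UVProject(inputs=[uv_tile, cam])
uv_project['projection'].setValue('off')

# Render in UV space to bake the projection into the selected UDIM,
# then write it out as a texture map.
scanline = nuke.nodes.ScanlineRender(inputs=[None, uv_project])
scanline['projection_mode'].setValue('uv')
write = nuke.nodes.Write(inputs=[scanline], file='/path/to/matte_1002.exr')
```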

RAW lighting and albedo AOVs in Arnold by Xuan Prada

If you are new to Arnold you are probably looking for RAW lighting and albedo AOVs in the AOV editor. And yes, you are right: they are not there, at least when using AiStandard shaders.
The easiest and fastest solution is to use AlShaders, which include both RAW lighting and albedo AOVs. But if you need to use AiStandard shaders, you can create your own AOVs quite easily.

  • In this capture you can see available AOVs for RAW lighting and albedo for the AlShaders.
  • If you are using AiStandard shaders you won't see those AOVs.
  • If you still want/need to use AiStandard shaders, you will have to render your beauty pass with the standard AOVs and utility passes, and create the albedo pass by hand. You can easily do this by replacing the AiStandard shaders with Surface shaders.
  • If we have a look at them in Nuke, they will look like this.
  • If we divide the beauty pass by the albedo pass we will get the RAW lighting.
  • We can now modify only the lighting without affecting the colour.
  • We can also modify the colour component without modifying the lighting.
  • In this case I'm color correcting and cloning some stuff in the color pass.
  • With a multiply operation I can combine both elements again to obtain the beauty render (see the Nuke Python sketch after this list).
  • If I disable all the modifications to both lighting and colour, I should get exactly the same result as the original beauty pass.
  • Finally I'm adding a ground using my shadow catcher information.
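
The divide-and-recombine step is simple enough to script. Below is a minimal Nuke Python sketch of that part of the comp, assuming the beauty and albedo come in as separate Reads; the file paths are placeholders.

```python
# Minimal Nuke Python sketch of the RAW lighting / albedo split described above.
# File paths are placeholders.
import nuke

beauty = nuke.nodes.Read(file='/path/to/beauty.exr')
albedo = nuke.nodes.Read(file='/path/to/albedo.exr')

# RAW lighting = beauty / albedo (A = beauty on input 1, B = albedo on input 0).
raw_lighting = nuke.nodes.Merge2(inputs=[albedo, beauty], operation='divide')

# ...grade the lighting and/or the albedo independently here...

# Recombine: beauty = RAW lighting * albedo.
rebuilt_beauty = nuke.nodes.Merge2(inputs=[albedo, raw_lighting],
                                   operation='multiply')

# With no grades applied in between, rebuilt_beauty should match the original
# beauty render (apart from precision issues where the albedo is zero).
```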

Film dictionary by Xuan Prada

Pantheon
The group of directors with the highest level of achievement (Chaplin, Griffith, Hitchcock, Welles, etc.). The term is used in auteur criticism (Andrew Sarris).

Safety shot
A second shot of a scene made either as insurance in case the previous shot might be faulty or as an alternative to offer the editor another camera angle or camera distance.

Import UDIMs in Zbrush by Xuan Prada

One of the most common tasks once your colour textures are painted is going to Zbrush or Mudbox to sculpt some heavy details based on what you have painted in your colour textures.
We all use UDIMs of course, but importing UDIMs in Zbrush is not that easy. Let's see how this works.

  • Export all your colour UDIMs out of Mari.
  • Import your 3D asset in Zbrush and go to Polygroups -> UV Groups. This will create polygroups based on the UDIMs.
  • With ctrl+shift you can isolate UDIMs.
  • Now you have to import the texture that corresponds to the isolated UDIM.
  • Go to Texture -> Import. Do not forget to flip it vertically.
  • Go to Texture Map and activate the texture.
  • At this point you are only viewing the texture, not applying it.
  • Go to Polypaint, enable Colorize and click on Polypaint from texture.
  • This will apply the texture to the mesh. As it's based on polypaint, the resolution of the texture will be based on the resolution of the mesh. If it doesn't look right, just go and subdivide the mesh.
  • Repeat the same process for all the UDIMs and you'll be ready to start sculpting.

Manfrotto Befree for visual effects by Xuan Prada

I've been using Manfrotto Befree tripods for a while now, and I've just realised that they are a perfect tool for my on-set work.
I rarely use them as my primary tripod, especially when working with big, heavy professional DSLRs and zoom lenses. In my opinion these tripods are not stable enough to support such heavy pieces of gear.

I mean, they are if you are taking "normal" photos, but in VFX we do bracketing all the time, for texture references or HDRIs for example. The combination of the heavy gear, the movement of the mirror and the quick pace of the bracketing results in slightly misaligned brackets, which obviously means that the alignment process will not be perfect. I wouldn't recommend using these tripods for bracketing with big camera bodies and zoom lenses. I do use them for bracketing with prime lenses such as a 28mm or 50mm; they are not that heavy, and the tripods seem to be stable enough with these lenses.

I do strongly recommend these tripods for photogrammetry purposes when you have to move around the subject or set. Mirrorless cameras such as a Sony A7 or Sony a6000 with prime lenses are the best combination when you need to move a lot around the set.

I also use Befrees a lot as support tripods. They fit my Akromatic kits perfectly, both Mono and Twins. Befree tripods are tiny and light, so I can easily move around with two or three at once, and they even fit in my backpacks or hard cases.

As you can see below, these tripods offer great flexibility in terms of height and expansion. They are tiny when compact and middle sized when expanded completely. Check the features on Manfrotto's site.

I also use these tripods as support for my photogrammetry turntable.
Moving around with such a small setup has never been so easy.

Obviously I also use them for regular photography. I just attach my camera to the provided ball head and start shooting around the set.
Finally, I also use the Befree to mount my Nodal Ninja. Again, you need to be careful while bracketing and always use a remote trigger, but being able to move around with two or three of these tripods is just great.

There are two different versions (both available in aluminium and carbon fibre). Both come with a ball head and a quick release plate, but the ball head on the smallest tripod is fixed and can't be removed, which is quite limiting because you won't be able to attach most of the accessories normally used for VFX.

Export from Maya to Mari by Xuan Prada

Yes, I know that Mari 3.x supports OpenSubdiv, but I've had some bad experiences already where Mari creates artefacts on the meshes.
So for now, I will be using the traditional way of exporting subdivided meshes from Maya to Mari. These are the settings that I usually use to avoid distortions, stretching and other common issues.
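
The exact settings live in the screenshots of the original post, which aren't reproduced here. As a rough, hypothetical substitute, the Maya Python sketch below shows one common way of getting a subdivided mesh out of Maya for Mari: duplicate the asset, subdivide the copy once with polySmooth, and export it as an OBJ. The mesh name, path and polySmooth values are assumptions, not the settings from the screenshots.

```python
# Hypothetical Maya Python sketch: subdivide a copy of a mesh and export it as
# an OBJ for Mari. Mesh name, path and polySmooth values are assumptions, not
# the exact settings from the original post's screenshots.
import maya.cmds as cmds

mesh = 'asset_geo'                               # placeholder mesh name
export_path = '/path/to/asset_subdivided.obj'    # placeholder path

# Work on a duplicate so the original mesh stays untouched.
dup = cmds.duplicate(mesh, name=mesh + '_subdiv')[0]

# One level of subdivision. Whether to smooth the UVs and keep borders depends
# on how your renderer subdivides; mismatching this is a common source of the
# distortions and stretching mentioned above.
cmds.polySmooth(dup, divisions=1, keepBorder=True,
                smoothUVs=True, constructionHistory=False)

# Export the duplicate as an OBJ (requires the objExport plug-in).
cmds.loadPlugin('objExport', quiet=True)
cmds.select(dup, replace=True)
cmds.file(export_path, force=True, exportSelected=True, type='OBJexport',
          options='groups=1;ptgroups=1;materials=0;smoothing=1;normals=1')
```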

Quick renders by Xuan Prada

Quick and dirty exercises that I do when I have nothing else to do.

Combining Zbrush and Mari displacements in Clarisse by Xuan Prada

We all have to work with displacement maps painted in both Zbrush and Mari.
Sometimes we use 32-bit floating point maps, sometimes 16-bit maps, etc. Combining different displacement depths and scales is a common task for a look-dev artist working in the film industry.

Let's see how to setup different displacement maps exported from Zbrush and Mari in Isotropix Clarisse.

  • First of all, have a look at all the individual displacement maps to be used.
  • The first one has been sculpted in Zbrush and exported as a 32-bit .exr displacement map. The non-displacement value is zero.
  • The second one has been painted in Mari and also exported as a 32-bit .exr displacement map. Technically this map is exactly the same as the Zbrush one; the only difference here is the scale.
  • The third displacement map in this exercise also comes from Mari, but in this case it's a 16-bit .tif displacement map, which means that the mid-point will be 0.5 instead of zero.
  • We need to combine all of them in Clarisse and get the expected result.
  • Start by creating a displacement node and assigning it to the mesh.
  • We consider the Zbrush displacement our main displacement layer, so the displacement node has to be set up like the image below. The offset or non-displacement value has to be zero, and the front value 1. This will give us exactly the same look that we have in Zbrush.
  • In the material editor I'm connecting a multiply node after every single displacement layer. The input 2 value is 1,1,1 by default. Increasing or reducing this value controls the strength of each displacement layer. It is not necessary to adjust the intensity of the Zbrush layer unless you want to, but it is necessary to reduce the intensity of the Mari displacement layers, as they are way off compared with the Zbrush intensity.
  • I also added an add node right after the 16-bit Mari displacement, with a value of -0.5, in order to remap its mid-point to the same level as the other 32-bit maps, whose non-displacement value is zero.
  • Finally I used add nodes to mix all the displacement layers (the overall math is sketched in the snippet after this list).
  • It is a good idea to setup all the layers individually to find the right look.
  • No displacement at all.
  • Zbrush displacement.
  • Mari high frequency detail.
  • Mari low frequency detail.
  • All displacement layers combined.
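
To make the node math above easier to follow, here is the same combination written out as plain Python. This is just the arithmetic, not Clarisse's scripting API, and the strength values are placeholders.

```python
# Plain-Python sketch of the displacement combination described above. The
# strength values are placeholders: they correspond to the multiply nodes'
# input 2 in Clarisse, and the -0.5 offset to the add node after the 16-bit map.

def combine_displacement(zbrush_32, mari_exr_32, mari_tif_16,
                         zbrush_strength=1.0,
                         mari_exr_strength=0.1,
                         mari_tif_strength=0.1):
    """Combine three displacement samples into a single value.

    zbrush_32, mari_exr_32 : 32-bit float maps, non-displacement value = 0.
    mari_tif_16            : 16-bit map, non-displacement value = 0.5, so it
                             is remapped to a zero mid-point before scaling.
    """
    zbrush = zbrush_32 * zbrush_strength
    mari_exr = mari_exr_32 * mari_exr_strength
    mari_tif = (mari_tif_16 - 0.5) * mari_tif_strength   # remap mid-point to 0
    return zbrush + mari_exr + mari_tif


# A flat area in all three maps stays flat after combining:
print(combine_displacement(0.0, 0.0, 0.5))   # -> 0.0
```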

Film dictionary by Xuan Prada

As every Wednesday, here are a couple more cinema terms for my film dictionary.

Reaction shot
A shot of a character, generally a close-up, reacting to someone or something seen in the preceding shot. The shot is generally a cutaway from the main action.

Smoke pot
A small container that produces smoke for mechanical effects. The container holds some chemical, such as naphthalene or bitumen, which is fired either by electricity or a burning fuse.