In this video I show you my process for converting 3D scans into production-ready assets. I believe the audio is in Spanish, so feel free to mute it or try to learn some of Cervantes' language :)
What if you are working with Ptex but need to do some kind of Zbrush displacement work?
How can you render that?
As you probably know, Zbrush doesn't support Ptex. I'm not a huge fan of Ptex (but I will be soon), but sometimes I don't have the time, or simply don't want, to do proper UV mapping. So, if Zbrush doesn't export Ptex and my assets don't have any sort of UV coordinates, can't I use Ptex at all for my displacement information?
Yes, you can use Ptex.
- In this image below, I have a detailed 3D scan which has been processed in Meshlab to reduce the crazy amount of polygons.
- Now I have imported the model into Zbrush via .obj. Only 500,000 polys, but it looks great nonetheless.
- We are going to use Zbrush to create a very quick retopology for this demo. We could use Maya or Modo to create a production-ready model.
- Using the ZRemesher tool, which is great for some types of retopology tasks, we get this low-res model. Good enough for our purpose here.
- The next step is to export both models, high and low resolution, as .obj.
- We are going to use these models in Mudbox to create our Ptex based displacement. Yes, Mudbox does support Ptex.
- Once imported, keep both of them visible.
- Export displacement maps. Have a look in the image below at the options you need to tweak.
- Basically, you need to activate Ptex displacement, 32-bit depth, set the texel resolution, etc.
- To set up your displacement in Maya and V-Ray, just follow the 32-bit displacement rule.
- And that's it. You should be able to render your Zbrush details using Ptex now.
Short and sweet (hopefully).
Combining displacement maps seems to be quite a common topic these days. Mari and Zbrush are widely used by texture artists, and combining their displacement maps in look-dev is a must.
I'll be using Maya and Arnold for this demo but any 3D software and renderer is welcome to use the same workflow.
- Using Zbrush displacements is a no-brainer. Just export them as 32-bit .exr and that's it. Set your render subdivisions in Arnold and leave the default settings for displacement. The zero value is always 0, and the height should be 1 to match your Zbrush sculpt.
- These are the maps that I'm using. First the Zbrush map and below the Mari map.
- No displacement at all in this render. This is just the base geometry.
- In this render I'm only using the Zbrush displacement.
- In order to combine Zbrush and Mari displacement maps, you need to normalise their ranges. If you use the same range, your Mari displacement will be huge compared with the Zbrush one.
- A multiply node makes it easy to control the strength of the Mari displacement. Connect the map to input1 and play with the values in input2.
- To mix both displacement maps you can use an average node. Connect the Zbrush map to input0 and the Mari map (multiply node) to input1.
- The average node can't be connected straight to the displacement node. Use a ramp node with the average node connected to its color, then connect the ramp to the displacement default input.
- In this render I'm combining both, Zbrush map and Mari map.
- In this other example I'm going to combine two displacements using a mask. I'll use a Zbrush displacement as the general displacement, and then a mask painted in Mari to reveal another displacement, also painted in Mari.
- As the mask, I'm going to use the same symbol that I used before as the second displacement.
- And as new displacement I'm going to use a procedural map painted in Mari.
- The first thing to do is exactly the same operation as before: control the strength of the Mari displacement using a multiply node.
- Then use another multiply node with the Mari map (multiply) connected to its input1 and the mask connected to its input2. This reveals the Mari displacement only in the white areas of the mask.
- The rest is exactly the same as before. Connect the Zbrush displacement to input0 of the average node and the Mari displacement (multiply) to input1. Then connect the average node to the ramp's color and the ramp to the displacement default input.
- This is the final render.
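The node wiring above boils down to simple per-texel arithmetic. Here is a minimal plain-Python sketch of that math (an illustration, not Maya code; the 0.1 strength is just an example value, and the ramp is only a passthrough so it is omitted):

```python
def multiply(value, factor):
    """The multiply node: scales one value by another."""
    return value * factor

def average(a, b):
    """The average node: mixes two displacement values."""
    return (a + b) / 2.0

def masked_mix(zbrush, mari, mask, strength=0.1):
    """Zbrush map as the base; Mari map scaled, then revealed by the mask."""
    mari_scaled = multiply(mari, strength)     # first multiply node (normalise range)
    mari_masked = multiply(mari_scaled, mask)  # second multiply node (mask reveal)
    return average(zbrush, mari_masked)        # average node -> ramp -> displacement

# Example texel: Zbrush value 0.25, Mari value 1.0, mask fully white (1.0)
print(masked_mix(0.25, 1.0, 1.0))  # -> 0.175
# Where the mask is black, only the Zbrush contribution remains (halved by the average)
print(masked_mix(0.25, 1.0, 0.0))  # -> 0.125
```

Note that the average node halves the Zbrush contribution too, which is part of why the combined result needs the strength value balanced by eye.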
It is always a bit tricky to set up Zbrush displacements in the different render engines.
If you recently moved from Mental Ray or another engine to V-Ray for Maya, maybe you should know a few things about displacement maps extracted from Zbrush.
I wrote down here a simple example of my workflow dealing with that kind of map in V-Ray.
- First of all, drag and drop your 16-bit displacement map onto the displacement channel inside the shading group attributes.
- Maya will create a displacement node for you in the Hypershade. Don't worry too much about this node; you don't need to change anything there.
- Select your geometry and add a V-Ray extra attribute to control the subdivisions and displacement properties.
- If you exported your displacement subdividing the UVs, you should check that property in the V-Ray attributes.
- Edge length and max subdivs are the most important parameters. Play with them until you reach nice results.
- Displacement amount is the strength of your displacement, and the displacement shift should be negative half of your displacement amount if you are using 16-bit textures.
- If you are using 32-bit .exr textures, the displacement shift should be 0 (zero).
- Select your 32-bit .exr file and add a V-Ray attribute called "allow negative colors".
- Render and check that your displacement is looking good.
- I’ve been using these displacement maps: 16-bit and 32-bit.
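The 16-bit vs 32-bit shift rule above can be sketched as a couple of lines of plain Python (an illustration of the arithmetic, assuming a simple texel * amount + shift model; V-Ray's actual evaluation is more involved):

```python
def displaced_height(texel, amount, bit_depth):
    """Displacement shift rule: 16-bit maps store 'no displacement' as
    mid-grey (0.5), so we shift down by half the amount; 32-bit float
    maps store signed values directly, so the shift stays at zero."""
    if bit_depth == 16:
        shift = -amount / 2.0   # shift is negative half of the amount
    elif bit_depth == 32:
        shift = 0.0             # 32-bit .exr: no shift needed
    else:
        raise ValueError("expected 16 or 32")
    return texel * amount + shift

# Mid-grey in a 16-bit map gives zero displacement:
print(displaced_height(0.5, 1.0, 16))  # -> 0.0
# A 32-bit map stores 0.0 as "no displacement" already:
print(displaced_height(0.0, 1.0, 32))  # -> 0.0
```

This is why forgetting the shift on a 16-bit map inflates the whole mesh by half the displacement amount.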