September 22, 2021
My name is Dawid Cencora. I am a senior 3D artist working in the VFX industry and I am passionate about characters, creatures, and monsters.
In this short tutorial I will show you how I created the Wendigo creature. I hope you enjoy it and find some interesting tips and tricks. Don't forget to download the project file (2GB) to follow along!
The Wendigo (Windigo) originates from Native American folklore. It has many symbolic interpretations, but the most common ones are the incarnation of winter, the embodiment of hunger, and the personification of selfishness (Dark!).
According to the legends, a Wendigo is created whenever a human resorts to cannibalism to survive. It is described as an astonishingly tall creature with razor-sharp claws and teeth, and lips that are often ragged or gone completely.
I find creatures in legends and mythology fascinating. There are many beautiful beasts and truly hideous monstrosities. The stories around them are engaging and captivating, yet usually negative, but if you dig a little deeper you will often find that things are not so black and white.
To start the project I browsed through many mythological creatures. I wanted to create something with fairly humanoid anatomy but with much more exaggerated features and interesting shapes. As always, Pinterest and ArtStation are great places to look for inspiration, with so many great artists posting incredible concepts every day.
This is what gets me pumped and releases my creativity. If you find something that inspires you, do not wait for another day, better weather, or a full moon; just start creating. Even if you do not know the software that well or you are missing some knowledge on how to do it, you will learn along the way and have much more fun doing it.
At the same time, I gather as many references as I can. I look not only at concepts of Wendigos to get an idea of what it could look like, but also at references for skin, anatomy, secondary shapes, lighting, and many other details that I might draw inspiration from. I use PureRef to keep all my references and concepts in one file.
I know this is not a very appealing topic, but I believe that keeping your files organized and structured is crucial when working on a more complex project. I always split my work files, exports, and caches into create and publish folders, and I name them accordingly with a version number. This way I always know where to find a specific file, which goes a long way toward improving my referencing process for lighting.
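If you like to automate this kind of structure, here is a minimal Python sketch of the versioned publish folders I am describing. The root path, asset name, and step names are only placeholders, not my actual project layout:

```python
import os

def publish_path(root, asset, step, version):
    """Build a versioned publish folder like <root>/publish/<asset>/<step>/v003."""
    folder = os.path.join(root, "publish", asset, step, "v{:03d}".format(version))
    os.makedirs(folder, exist_ok=True)  # create the folder chain if it does not exist yet
    return folder

# Example (placeholder paths): /projects/wendigo/publish/wendigo/texturing/v003
print(publish_path("/projects/wendigo", "wendigo", "texturing", 3))
```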
There are a few ways to start the sculpting process in ZBrush, depending on your pipeline, time, budget, and needs. In my case I started with a simple sphere and quickly pulled out the humanoid shape using the Sculptris Pro feature. While you are working on your sculpt, make sure your primary shapes are solid before jumping to secondary forms. Your sculpt needs to look good from every angle, have a strong gesture, good rhythm (especially when posed), and a clear, recognizable silhouette. The forms need to flow and feel natural. Primary and secondary shapes are the most important; your sculpture will work even without tertiary details if those are resolved correctly.
To make the secondary and tertiary details even more organic, you can project micro details from Mari in a layer and use them as a guide to sculpt your secondary forms. This way your secondary and tertiary forms work better together and have more depth.
If you start your project directly in ZBrush using built-in primitives, you might want to adjust the scale of your model. I like to work at real-life scale, set to centimeters in every software, to avoid issues with values, lights, maps, and simulations. How you handle that is totally dependent on your workflow; just make sure everything lives in the same realm to make your life easier.
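On the Maya side of a pipeline like this, a tiny sanity check can save a lot of pain. This is just a sketch of that idea, assuming you run it in Maya's Python environment:

```python
# Make sure the working linear unit is centimeters before importing or exporting anything.
import maya.cmds as cmds

cmds.currentUnit(linear="cm")                       # set the scene's linear unit
print(cmds.currentUnit(query=True, linear=True))    # should print 'cm'
```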
Before moving to texturing it is important to have clean, even topology and unfolded UVs with the same texel density spread evenly across UDIMs. I made sure I had nice edge loops following the main features of the forms and even quads across the entire character. If you are going to fully animate and deform the character, make sure you have edge loops that will support the deformation. For offline rendering you don't need to keep your geometry at the lowest level; semi-dense topology (100k-200k polygons) will hold the shape nicely while tessellating at render time. For this project I mixed manual retopo with ZRemesher, which was completely enough for its needs.
To pose my Wendigo I used good ol' Transpose Master in ZBrush. At this point I did not know I would be animating and simulating the entire character down the line. Transpose Master allowed me to pose my character exactly how I wanted and then fix all the geometry stretching problems after posing.
I exported a few types of maps from ZBrush to jump-start the texturing in Mari. To make my life a bit easier and the result closer to the original ZBrush sculpt, I decided to use ZBrush vector displacement instead of normal scalar displacement. Vector displacement gives you more accurate results by displacing along all three axes instead of just one, and it also mixes nicely with scalar displacement carrying the tertiary details from Mari. If you want to export vector displacement for RenderMan, go to ZBrush Preferences > ImportExport > Vector Displacement Map and set Tangent FlipAndSwitch to 43.
Once all my maps and geometry are ready I start bringing everything into Mari. It is important to set your color space at the beginning of the project. I would recommend using OCIO ACEScg for the entire project, which is fully supported by RenderMan. If you are using a Non-Commercial version of Mari, you can go with the default settings and then convert your textures to ACEScg .tex files. As for the geometry, I usually work on a decimated mesh; there is no need to use a multi-million-poly mesh. The more optimized my scene is from the beginning, the better.
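One common way to keep color management consistent (not necessarily how I set it up, but a useful sketch) is to point every OCIO-aware app at the same ACES config from a small launcher script. The config path and executable name below are placeholders:

```python
import os
import subprocess

# Point Mari (and any other OCIO-aware app such as Katana or Nuke) at one shared
# ACES config so every package resolves ACEScg the same way.
os.environ["OCIO"] = "/pipeline/color/aces_1.2/config.ocio"   # placeholder path
subprocess.run(["mari"])   # the launched app inherits the OCIO variable
```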
Due to the limitations of my Non-Commercial version of Mari I split the UVs into only 6 UDIMs for the body and heart. The cloth and antlers were imported as separate objects. This way I was able to texture each model in a separate node graph, see the part I was working on in context, and have more UDIMs to work with.
On top of what I exported from ZBrush, I usually bake a few maps like thickness, curvature, ambient occlusion, and world-space normals out of Substance Painter; basically everything that could help me take my texturing to another level. I will overlay, multiply, and blend those maps with my base color later in the process. To create those maps I use a high-poly model with all the details I sculpted in ZBrush.
To create displacement for this guy I used a few types of displacement textures. To make the tertiary details feel organic, I made sure the projected texture follows the skin features correctly and stretches where it should. Mari has great tools that allow you to move, stretch, and bend your texture before you commit and bake it to the model.
If you plug your displacement texture into the bump slot of the shader, you can preview how it looks in real time while projecting.
To connect the albedo and displacement even more I overlaid scratches and scars on top of projected displacement details. The beauty of working with nodes is that I can reuse and blend any detail with any channel to make it look better and more natural.
To create the albedo texture I started with a mixture of soil and marble textures, then multiplied the result by a skin color that felt right to me. My base color was ready. From there I started blending and mixing utility maps and painting different skin tones, dirt, scratches, bruises, and scars to show what he went through up to this point in time. A secret to good-looking textures is to tell a story: showing the history of the object or character through texturing increases its believability.
Very often I take parts of my base color and displacement and convert them into a roughness texture. This way the skin feels more grounded and realistic. My creature was pretty dirty all over the place, so all the dents and cavities should be rougher than the cleaner parts of the body. To make the blood feel fresh, I used the mask from the baseColor and graded a tileable grunge map so the roughness is not just a flat color.
I like to preview my textures directly in Mari. The PBR viewport is fairly close to what you would get in a raw render. It's nice to get fast feedback, especially when you mix different materials together. Painting masks for blood splatter on the skin and cloth never felt so good. Having a real-time preview of the roughness, color, and even displacement makes a huge difference in making creative decisions while speeding up the process.
If you are using a commercial version of Mari, you can directly export ACEScg .tex textures with a very simple command:
txmake -mode periodic -short -format pixar -resize up- "$EXPORTED" "path/to/your/project/folder/texturename_$CHANNEL_acescg.$UDIM.tex"
Otherwise you can just export the textures as they are and then either use the Texture Manager directly from RenderMan (available in Maya, Blender, and Houdini) or run txmake on them yourself as shown above.
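If you go the manual route, here is a rough Python sketch of how you could batch the txmake call above over a folder of exported textures. It assumes txmake is on your PATH, and the folder paths and the .exr extension are placeholders:

```python
import glob
import os
import subprocess

SRC = "/projects/wendigo/textures/export"   # exported textures from Mari (placeholder)
DST = "/projects/wendigo/textures/tex"      # destination for .tex files (placeholder)

for src in glob.glob(os.path.join(SRC, "*.exr")):
    name = os.path.splitext(os.path.basename(src))[0]
    dst = os.path.join(DST, name + ".tex")
    # Same flags as the command above: periodic wrap, 16-bit, Pixar format, resize up-
    subprocess.run(["txmake", "-mode", "periodic", "-short",
                    "-format", "pixar", "-resize", "up-", src, dst], check=True)
```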
I always keep different versions of my textures. It is important to keep all the iterations and progress in case I want to revert, compare, or debug something.
This was my first Katana project. I believe that learning software on a project is one of the best ways to master it. Being engaged in your own personal creation keeps your creativity and curiosity levels high, and those are important for solving problems and figuring out new workflows. Katana allowed me to update my assets constantly without any pain or weird unexpected bugs. Figuring out the basics is painless, and within a few minutes you can start rendering. RenderMan's stability in Katana is on another level, and the entire workflow feels like smooth sailing. It is very easy to start rendering your shot: just bring in the camera and objects, assign shaders procedurally, place some lights, and you are good to go.
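To give a feel for how little wiring that takes, here is a bare-bones NodegraphAPI sketch of that "geo + camera + lights into a render" setup. It is only an illustration, not my actual node graph, and the exact ports are worth double-checking in your Katana version:

```python
from Katana import NodegraphAPI

root   = NodegraphAPI.GetRootNode()
geo    = NodegraphAPI.CreateNode('Alembic_In', root)     # the creature geometry
camera = NodegraphAPI.CreateNode('CameraCreate', root)
lights = NodegraphAPI.CreateNode('GafferThree', root)    # rig the rect lights in here
merge  = NodegraphAPI.CreateNode('Merge', root)
render = NodegraphAPI.CreateNode('Render', root)

# Wire geometry, camera, and lights into the merge, then feed the render node.
for i, node in enumerate((geo, camera, lights)):
    port = merge.addInputPort('i{}'.format(i))
    port.connect(node.getOutputPortByIndex(0))
render.getInputPortByIndex(0).connect(merge.getOutputPortByIndex(0))
```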
I used the PxrSurface shader with the Non-Exponential Path Traced subsurface model for the Wendigo skin. This model gave me very realistic, deep skin results, making my creature even scarier. The teeth, claws, tongue, and heart got their own shaders, so I could adjust the look of each part individually. I created a skin depth mask for some thinner parts of the body like the hands and arms to avoid a deep, uniform wax look. Rendering everything with just one material would mean more nodes, more masks, and a more complicated workflow. I like to keep my pattern nodes to a minimum: instead of feeding the material an infinite amount of channels and masks from Mari, I make the changes directly in Mari and feed the renderer baked textures. I always plug color correction and remap nodes between the texture and the shader for fine adjustments.
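For readers working in Maya rather than Katana, the texture-to-shader chain described above looks roughly like this. This is only a Maya-flavored sketch of the idea; the node names and file path are placeholders:

```python
import maya.cmds as cmds

# Skin shader plus a simple baseColor -> color correct -> diffuse chain.
skin = cmds.shadingNode("PxrSurface", asShader=True, name="wendigoSkinMat")
tex  = cmds.shadingNode("PxrTexture", asTexture=True, name="wendigoBaseColor")
cc   = cmds.shadingNode("PxrColorCorrect", asTexture=True, name="wendigoBaseColorCC")

# Point the texture node at a baked .tex file (placeholder path).
cmds.setAttr(tex + ".filename",
             "/projects/wendigo/textures/tex/baseColor_acescg.1001.tex",
             type="string")

# Texture feeds the color correct, which feeds the diffuse color of the skin shader.
cmds.connectAttr(tex + ".resultRGB", cc + ".inputRGB", force=True)
cmds.connectAttr(cc + ".resultRGB", skin + ".diffuseColor", force=True)

# Pick "Non-Exponential Path Traced" from the Subsurface Type dropdown in the
# Attribute Editor; the enum index varies between RenderMan versions.
```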
To create the fur shader I used PxrMarschnerHair driven by PxrHairColor. My goal was to keep the hair as realistic as possible. Instead of using a flat color, I plugged in PxrVoronoise to drive the hair melanin. This gave me varied melanin values within set boundaries, keeping the color I wanted and realistic-looking fur across the entire character.
The environment was lit by small rect lights standing in for moonlight. Having them separate allowed me complete creative control over the background, for maximum art direction. I put fire VDB caches inside cylinder lights to enhance the illumination in the background; my fire shader had illumination turned on, but it was not enough to light up the scene. Using a volume container and smoke VDBs greatly enhanced the mood I was aiming for.
The Wendigo was also lit by rect lights. My main goal was to use cold and warm rim lights to create a complementary color palette, one from the moon and another from the torch fire on the ground; back-lit hair and skin always look nice. I placed the key light right above his head to enhance all the cavities in the skull and light up his face. Two fill lights from the front left helped to brighten up his body.
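A quick Maya-style sketch of that cool/warm rim idea, using color temperature rather than hand-picked RGB values. The names, Kelvin values, and exposures are placeholders, not the exact settings from the final shot:

```python
import maya.cmds as cmds

def make_rim(name, kelvin, exposure):
    """Create a PxrRectLight and drive its color with a blackbody temperature."""
    xform = cmds.createNode("transform", name=name)
    shape = cmds.createNode("PxrRectLight", name=name + "Shape", parent=xform)
    cmds.setAttr(shape + ".enableTemperature", 1)
    cmds.setAttr(shape + ".temperature", kelvin)
    cmds.setAttr(shape + ".exposure", exposure)
    return shape

make_rim("moonRimLight", 8000, 3.0)   # cold rim from the moon side
make_rim("fireRimLight", 2000, 2.5)   # warm rim from the torch fire
```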
The rendering setup is pretty basic, as RenderMan these days is very simple to set up; I only adjusted a few sliders to get the most out of my creation. I set the mip-map bias to -2 on the Wendigo's displacement and baseColor textures. The camera was quite far away from the character, and achieving the best possible quality was a priority even if it meant rendering a bit longer. The micropolygon length for the skin was set to 0.25.
This setting gave me very crisp-looking displacement details from afar; for the close-up renders I changed the value back to the default of 1. I rendered the entire shot with the PxrPathTracer integrator and 4 indirect bounces. To get a clean-looking render I set the sample mode to bxdf 3/4/4, pixel variance to 0.01, and max samples to 512.
This project was a perfect start for learning Katana. It contained most of the elements you would encounter in a production environment: volumes, fire, cloth simulations, USD scatter, hair, procedurals, animations, and simulations. Diving deep into the Katana and RenderMan world was pretty seamless, although some of the workflows were completely new to me. Luckily there is a great community willing to help and answer all my questions.
To finish this little breakdown I just want to say: no matter what, stay humble and keep on improving. Keep everything balanced. Improving your physical and mental health is just as important as improving your artistic and technical skills.
Dawid is a Senior VFX Modeler and Lookdev Artist with a strong passion for characters and creatures and a desire to become a character artist. He is currently working at Infected GmbH in Hamburg.
This asset is available under an Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) License. This allows you to share and redistribute it for non-commercial purposes, as long as you credit the original author.