October 28, 2018
In this series, we will be exploring the technical side of rendering with RenderMan by taking a "how-to" approach to the lessons. The classes are extended with technical breakdowns, explainer diagrams, and renders. We also dive into live Maya sessions to show concepts in action!
Even though this training has been designed using version 21, most concepts are directly applicable to version 22.
How do textures, procedural patterns, BxDFs, integrators, displacement, and display filters all fit into the picture of the RenderMan 21 RIS architecture? In our first class, we take a look at the overall architecture of pattern graph node connectivity in RenderMan RIS 21, and shed light on the general philosophy of patterns feeding into BxDFs evaluated by integrators.
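As a minimal sketch of that philosophy in RIB terms (the texture path, node handles, and values below are illustrative, not from the class files): a pattern node feeds a BxDF through a reference connection, displacement is its own node, and the integrator is set once at the options level.

    # integrator (options level) evaluates the BxDFs
    Integrator "PxrPathTracer" "pathTracer"
    # a pattern node feeding a BxDF via a reference connection
    Pattern "PxrTexture" "woodTex" "string filename" ["textures/wood.tex"]
    Bxdf "PxrSurface" "woodSurf" "reference color diffuseColor" ["woodTex:resultRGB"]
    # displacement is a separate node bound to the same object
    Displace "PxrDisplace" "woodDisp" "float dispAmount" [0.1]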
What does Thin Shadows do on your light sources? How does it compare to having it off, and where does Allow Caustics fit in? We look at all of these topics with some nice renders from RenderMan 21, explain the differences, and look at a diagram of what's happening scientifically inside the renderer to create these realistic, art-directable effects!
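For reference, both controls are simple toggles: Thin Shadows lives on the analytic lights and Allow Caustics on the path-tracing integrator. A sketch, assuming the RenderMan 21 parameter names thinShadow and allowCaustics (light handle and values are illustrative):

    Integrator "PxrPathTracer" "pt" "int allowCaustics" [1]
    Light "PxrRectLight" "keyLight" "float intensity" [10] "int thinShadow" [1]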
Some effects in complex caustic and glass rendering require bi-directional path tracing, using the VCM integrator together with "Trace Light Paths" on our lights. We explore these features, show what's happening in the renderer, and do an interactive session in RenderMan for Maya.
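In RIB terms this amounts to switching the integrator to PxrVCM and enabling light path tracing per light; a sketch with illustrative handles and values, assuming the traceLightPaths parameter name:

    Integrator "PxrVCM" "vcm"
    Light "PxrRectLight" "causticLight" "float intensity" [20] "int traceLightPaths" [1]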
Continuing on from our last class, where we explored complex caustics with bi-directional path tracing, we expand into more features of the VCM integrator. In the last class we modified Trace Light Paths on individual lights but always kept the default "Connect Paths" and "Merge Paths" on in the VCM: we now explain what happens when configuring different combinations of these core features of the bi-directional path tracing integrator, with respect to complex caustics through liquid and glass, as well as hidden light sources.
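The combinations discussed here boil down to two integrator toggles; a sketch, assuming the PxrVCM parameter names connectPaths and mergePaths:

    # both on (the default): full vertex connection and merging
    Integrator "PxrVCM" "vcm" "int connectPaths" [1] "int mergePaths" [1]
    # connections only: bi-directional path tracing without photon merging
    # Integrator "PxrVCM" "vcm" "int connectPaths" [1] "int mergePaths" [0]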
How do emissive mesh lights differ from the brand-new analytic lights in RenderMan 21? We take a look at the sampling strategies and compare the two to understand the differences between these light sources and when you would want to use one over the other. We jump into a Maya session to create various mesh lights, compare them with the new analytic PxrSphereLight, show diagrams of the sampling strategies, look at the RIB file created, and much more.
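As a rough idea of how the two show up in a RIB file (the exact output RenderMan for Maya writes will differ; handles, values, and the stand-in geometry here are illustrative): an analytic light defines its own shape, while a mesh light is emission attached to geometry in the same attribute block.

    # analytic light: the shape comes from the light type itself
    AttributeBegin
      Light "PxrSphereLight" "keyLight" "float intensity" [5]
    AttributeEnd
    # mesh light: emission bound to arbitrary geometry
    AttributeBegin
      Light "PxrMeshLight" "emitterLight" "float intensity" [5]
      Sphere 1 -1 1 360
    AttributeEnd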
You can use the standard LPEs (Light Path Expressions) for your typical AOVs, but did you know you can write your own custom ones in an expression language? LPEs are also not just for AOVs: you can use them to control light paths in your beauty renders. We introduce custom LPEs, with diagram explainers, and spend time in Maya showing where and how they are set up, including custom script locations to share your LPEs with others in the studio.
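As a quick taste, an AOV is just a display channel whose source is an LPE string; the first line below is the standard direct-diffuse expression, the second the standard indirect-specular one (the channel names are arbitrary):

    DisplayChannel "color directDiffuse" "string source" ["color lpe:C<RD>[<L.>O]"]
    DisplayChannel "color indirectSpecular" "string source" ["color lpe:C<RS>[DS]+[<L.>O]"]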
Picking up from last class, we go further into more complex examples of LPEs. The powerful "lpegroup" feature is explained and set up in our scene in Maya, and we use the geometry tag to modify our LPE expressions. Now we can control not only specific lights, but also specific geometry in our complex LPEs! We also build up some of the more complex expressions, showing grouping via [] () and the * + operations for indirect lighting.
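A sketch of the idea, assuming a light tagged with light group "key" and geometry tagged with lpegroup "bottle" (the group and channel names are hypothetical): light groups are referenced in the light event, geometry groups in the scattering event.

    # all diffuse (direct and indirect) arriving from the "key" light group only
    DisplayChannel "color keyDiffuse" "string source" ["color lpe:C<RD>[DS]*<L.'key'>"]
    # direct diffuse reflected off geometry in the "bottle" lpegroup
    DisplayChannel "color bottleDiffuse" "string source" ["color lpe:C<RD'bottle'>[<L.>O]"]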
Jensen Dipole, d'Eon Better Dipole, Burley Normalized, Multiple Mean Free Paths, Single Scatter - that's 5 new subsurface models in PxrSurface! We compare all of them, give the scientific breakdowns, and do some hands-on Maya sessions in this class on subsurface scattering.
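For orientation, the model is chosen per material with a single enum on PxrSurface, alongside the gain, color, and mean-free-path controls. A minimal sketch, assuming the RenderMan 21 parameter names subsurfaceType, subsurfaceGain, subsurfaceColor, subsurfaceDmfp, and subsurfaceDmfpColor; the integer-to-model mapping should be checked against the PxrSurface documentation:

    Bxdf "PxrSurface" "skinSurf"
      "int subsurfaceType" [2]               # selects the model (assumed: 2 = Burley Normalized)
      "float subsurfaceGain" [1.0]
      "color subsurfaceColor" [0.83 0.79 0.75]
      "float subsurfaceDmfp" [10]            # mean free path in scene units
      "color subsurfaceDmfpColor" [0.85 0.6 0.45]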
Continuing from last class, we dive deeper into setting specific "per-lobe" LPEs for PxrSurface in the context of our subsurface scattering and MFP Color. We explore the "diffuse optimization" that happens under certain circumstances with subsurface in PxrSurface, and introduce per-lobe LPEs so we can write our own custom ones and get all our desired results in one AOV. We show how to edit the RIB file to get this result and render from the command line with our changes, as well as through Maya.
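As a rough sketch of the RIB edit (assuming the lobe-indexed LPE syntax, where the digit after the scattering type refers to a lobe group defined in rendermn.ini; the channel name and lobe index here are illustrative, not the ones used in class):

    # collect only the subsurface lobe of PxrSurface into its own AOV
    DisplayChannel "color sssOnly" "string source" ["color lpe:C<.D3>[<L.>O]"]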
When doing complex layered materials, there is a handy template called PxrLayerSurface that sits on top of PxrSurface: we look at the differences between PxrSurface and PxrLayerSurface, along with some tips and tricks that can save frustration if you don't know them. We also show how to modify the XML .args file that defines the user interface, to customize the behavior for you, your team, or your entire studio. We then build up an example with our wine bottle scene, adding labels and wax drips on top of the glass.
In this bonus class, we take the same topic as Class 10, layering with PxrLayerSurface, and build it up from scratch in Katana with RfK (RenderMan for Katana), using some different examples. This way you can see the same nodes being used in completely different 3D packages! We also cover general shading workflow in Katana with RfK.
Sections:
A comparison of the PathTracer and VCM Integrators and a look at ideal usage scenarios.
Christos Obretenov was educated at Simon Fraser University with a BSc in Computing Science specializing in Computer Graphics, and started contributing to the animation and film industry at Mainframe Entertainment in conjunction with Simon Fraser University. He continued his career by designing and developing shading software for Walt Disney's "The Wild" feature film, followed by shading and lighting on Superman Returns, Spider-Man 3, Beowulf, A Christmas Carol, Mars Needs Moms, and Life of Pi (which won the 2013 Oscar for Best Visual Effects). Having recently co-founded LollipopShaders.com, Christos develops procedural solutions for shading and lighting and is currently experimenting with physically plausible shading. He resides in Vancouver, Canada.