April 04, 2018
Written by Leif Pedersen
Milk Visual Effects is an award-winning visual effects studio with offices in London and Cardiff; its accolades include an Oscar® for the feature film Ex Machina, an Emmy® for Sherlock, and three BAFTAs. The Milk team chose to rely on RenderMan for one of its latest challenges: creating highly ambitious stereo 3D renders at 6K resolution for an immersive theme-park-style experience titled Dinosaurs in the Wild.
This was Milk's first time using the renderer. "This was our first RenderMan project. The choice was based on a number of factors: the quality of the images, the speed of rendering (especially for feathers, as several dinosaurs were covered in them), as well as scalability and price. This was discussed with all the departments: Lighting, FX, Systems, and Pipeline, as everyone was affected," said Benoit Leveau, Head of Pipeline. Getting started quickly was a crucial need for the artists. "There's always a lead time in getting even experienced artists used to any new software, but we had time to learn and experiment at the start of the project. The IPR really helped the look-dev stage, and RenderMan's implementation within Maya, with its use of simple MEL calls, allowed much of the scene building and lighting process to be automated," said Darren Byford, CG Supervisor. "Within roughly two weeks we were producing work that appeared in the final project. By the end, we'd developed a lighting pipeline that could build, check and submit final shots to the farm in minutes," added Darren.
The team shot the dinosaur environment on a farm in Oregon, USA, with extensive VFX preparation for tracking. There, they shot and tracked a single four-minute plate for the safari-style 'drive' sequence, which takes the audience on a simulated drive across the plains to the dinosaur observatory. "We tried to use as much footage as possible but, in addition to the creatures, we created CG water, dust, debris, grass and a complete CG tree."
Milk relied on standard RenderMan bridge tools to output comprehensive renders with shadow passes. "We rendered a separate, optimized shadow layer and output the shadow using the standard Shadow AOV," said Darren.
They created a stereo rig to output four large "windows" in the ride. "As you'd expect, four stereo cameras aimed away from the center so that animation could be continuous between windows. The stereo settings were lifted from the on-set cameras," added Darren.
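The outward-facing four-window layout Darren describes can be sketched with a little camera math. This is an illustrative reconstruction, not Milk's actual rig: it assumes the four windows are evenly spaced around a shared center, and the interaxial distance is a placeholder for the values lifted from the on-set cameras.

```python
import math

def window_camera_yaws(num_windows=4):
    """Yaw angle (degrees) for each window's stereo camera pair,
    aimed outward from a shared center so animation can be
    continuous between adjacent windows (even spacing assumed)."""
    step = 360.0 / num_windows
    return [i * step for i in range(num_windows)]

def stereo_eye_offsets(yaw_deg, interaxial=0.065):
    """Left/right eye positions (in the ground plane) for one window,
    offset along the camera's local right axis. The interaxial
    distance here is a stand-in for the on-set camera settings."""
    rad = math.radians(yaw_deg)
    # Camera looks along (sin(yaw), cos(yaw)); its right vector is (cos(yaw), -sin(yaw)).
    right = (math.cos(rad), -math.sin(rad))
    half = interaxial / 2.0
    left_eye = (-right[0] * half, -right[1] * half)
    right_eye = (right[0] * half, right[1] * half)
    return left_eye, right_eye
```

Keeping the pairs on a shared yaw schedule like this is what lets a creature walk out of one window and into the next without a visible seam.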
Many RenderMan-centric workflows allowed Milk to optimize production. "The ground scan and grass geometry were replaced by RIB Archives to reduce RIB generation times and lighten network traffic," said Sam Lucas, Head of Modeling. The team also relied heavily on added detail at render time. "We used vector displacement maps on all of the creatures. Once we dialed in the displacement settings the detail was perfectly crisp, even for close-ups. Of course, it helps to have detailed maps. On the Alamosaurus, for example, we had nineteen 8K 32-bit vector displacement maps," continued Sam. Handling tremendous scene complexity is one of the core strengths of RenderMan.
-Darren Byford, CG Supervisor, Dinosaurs in the Wild
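The vector-displacement technique Sam describes differs from ordinary scalar displacement in that each map texel carries a full 3-D offset, not just a height along the normal. A minimal sketch of the per-point math (function and frame conventions are illustrative, not RenderMan's internals):

```python
def displace_point(p, normal, tangent, bitangent, vec, scale=1.0):
    """Apply a tangent-space vector displacement to point p.

    Scalar displacement can only push a point along its normal;
    the 3-component map value `vec` moves it along tangent,
    bitangent, and normal, which is how directional detail such as
    overlapping scales can be baked down from a sculpt.
    """
    return tuple(
        p[i] + scale * (vec[0] * tangent[i] + vec[1] * bitangent[i] + vec[2] * normal[i])
        for i in range(3)
    )
```

With a zero tangent/bitangent component this reduces to scalar displacement, which is why a vector map generalizes the simpler workflow.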
The Milk team relied on the out-of-the-box toolset for shading and lighting. "We're not a large company with the luxury of a dedicated shader writer, so we used RenderMan pretty much out-of-the-box. The standard PxrSurface and PxrLayerSurface shaders did everything we needed," said Darren Byford. "We also used the Jensen dipole subsurface model for the skin. Our favorite part of the shader system was the ability to use PxrLayerSurface to add dry, dirt and mud layers on top of the base skin layer. It gave a depth and complexity to the creatures," added Darren. The team also relied on the Marschner hair shader throughout the show. "Our favorite part of the hair shader was its predictability. Sometimes hair shaders can generate odd rendering artifacts; strange glints and flickers. But once a creature's look development had been approved we rarely had to tweak the hair shader, even in different lighting environments," said Darren.
Matching the plate's lighting was a crucial step they had to get right. "After some tests, we realized that we could get the quality we needed simply by using a good HDRI on a dome light. We rendered the shadows on a separate layer, speeding them up by replacing the dome light with a matched directional light just for that layer. The IPR certainly helped during the look-dev phase," said Darren. Their HDRI methodology was straightforward. "We used a camera with a fisheye lens, mounted on a tripod, took multiple exposures in multiple directions, then the images were stitched together to form the HDRI. We had maybe thirty or forty from the set, which we auditioned against the lighting balls from the footage to find the closest match. Once the best four were chosen, the crew and tripod were painted out and the images were graded for a perfect match. We also isolated the sun in the Nuke script so we could tweak its value and balance it against the blue sky fill and ground bounce light. All of this was done as soon as we got the reference from the set so the look-dev could be assessed in the context of our final four lighting environments," explained Darren.
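The sun-isolation step Darren mentions amounts to masking the brightest pixels of the linear HDRI and scaling only those, leaving the sky fill and ground bounce untouched. A minimal sketch, assuming a nested-list image and illustrative threshold/gain values (Milk did this in Nuke, not in Python):

```python
def rebalance_sun(hdri, sun_gain, sun_threshold=100.0):
    """Scale only the sun pixels of a linear HDRI.

    `hdri` is a list of rows, each row a list of [r, g, b] floats.
    Pixels whose brightest channel exceeds `sun_threshold` are
    treated as the sun disc and multiplied by `sun_gain`; everything
    else (sky fill, ground bounce) is passed through unchanged.
    """
    out = []
    for row in hdri:
        new_row = []
        for px in row:
            if max(px) > sun_threshold:
                new_row.append([c * sun_gain for c in px])
            else:
                new_row.append(list(px))
        out.append(new_row)
    return out
```

Because the image is linear, scaling the sun pixels directly changes the key-to-fill ratio of the dome light that uses the HDRI.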
Milk also relied on out-of-the-box RenderMan volume tools. The FX artists ran the simulations in Houdini and exported them to VDB files, then published the VDB caches through their pipeline tools. "All four observatory window shots and the drive sequence contain dust effects that were rendered in RenderMan using volume rendering. The east window, north window and drive all had around 15 separate dust volumes each," explained Benoit Leveau, Head of Pipeline. Lighting artists at Milk would then load the VDB files in Maya using the VDB visualization and render nodes provided by Pixar to integrate the dust effects with the rest of the scene.
Milk originally didn't use the denoiser for the project but discovered its usefulness during additional work late in the deliverables. "When we learned that the Denoiser could be applied to all of the AOVs, it allowed us to halve the render times in one simple step. The geniuses in our Pipeline department added a tickbox to our farm submission tool. If it was more complicated than that, then they didn't let on!" said Darren.
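The "tickbox" Darren describes can be pictured as a submission-tool flag that chains a denoise task after each render task. This is a hypothetical sketch of such a toggle; the field names and structure are illustrative, not Milk's actual farm tool:

```python
def build_farm_job(shot, frames, denoise=False):
    """Assemble a render-farm job spec (hypothetical pipeline helper).

    When `denoise` is on, a per-frame denoise task is chained after
    each render task, so every AOV is filtered before compositing
    picks the frames up. The single boolean is all artists see.
    """
    tasks = []
    for frame in frames:
        tasks.append({"type": "render", "shot": shot, "frame": frame})
        if denoise:
            # Downstream task consumes all AOVs written by the render task.
            tasks.append({"type": "denoise", "shot": shot,
                          "frame": frame, "inputs": "all_aovs"})
    return {"shot": shot, "tasks": tasks}
```

Hiding the extra pass behind one flag is what made the change a "one simple step" for lighters: the dependency chain lives in the submission tool, not in the artists' scenes.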
The team's compositing pipeline also benefited from RenderMan arbitrary output variables (AOVs). "Not only did we have the flexibility of fine-tuning the renders against the plate, but we shared look-dev setups between compers. This meant we all worked on the same node layouts, so our scripts looked quite similar, making it extremely easy for artists to find what they were looking for. As soon as one creature reached the level we wanted, we could share that and know it was consistent through the whole project," said Matias Derkacz, 2D Supervisor. Given the ambitions of the project, the team had to deal with very high complexity in both data sets and image files. "We had to be extremely organized and methodical to manage the amount of data that we had on each comp. We tried to group dinosaurs based on their interactions with what was shot. We also had to use depth merge to manage complex layering in a more efficient way. But the key was pre-comps, because as soon as we finished with one asset we baked all the work done on every layer into a single image file," added Matias.
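The depth merge Matias mentions resolves layering per pixel from rendered depth instead of hand-ordered layer stacks. A simplified sketch of the core idea (real comps also blend with alpha at edges; the data layout here is illustrative):

```python
def depth_merge(layer_a, layer_b):
    """Merge two render layers per pixel by depth (nearer Z wins).

    Each layer is a flat list of (rgb, z) samples, one per pixel.
    Instead of deciding up front which dinosaur layer sits over
    which, the comp picks the closer sample at every pixel, which
    handles creatures that cross in front of each other.
    """
    merged = []
    for (rgb_a, z_a), (rgb_b, z_b) in zip(layer_a, layer_b):
        merged.append((rgb_a, z_a) if z_a <= z_b else (rgb_b, z_b))
    return merged
```

Because the result carries depth too, merged pre-comps can themselves be depth-merged again, which is what makes baking finished assets into single image files workable.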
Beyond the production pipeline, the systems department had to meet the project's tremendous network-rendering demands, so they relied on the Google Cloud Platform. "Our cloud pipeline treats the cloud nodes as our own, so they all use our existing internal pipeline. No special measures were required for artists to run RenderMan in the cloud," said Dave Goodbourn, Head of Systems. Cloud rendering was a vital component for reaching the deadlines. "We pitched for the job knowing we couldn't do it without rendering it all in the cloud. Given the volume and scale of the project there was quite simply no other way to complete this job. We rendered 77 million 6K stereoscopic frames in the cloud in 10 weeks," added Dave.
The Milk team finished the project coming away with many "happy surprises" from RenderMan. "I was quite impressed by the quality of the denoiser and the ease with which we could integrate it into our pipeline," said Darren Byford. "Also, the support we received from Pixar was excellent. With RenderMan developers in the US and in the UK, we could get very quick replies, including a few custom builds to test changes before they were officially released," concluded Dave.