The Creator

© 20th Century Studios


RenderMan at ILM | The Creator

Written by Leif Pedersen

Innovation through collaboration

The Creator is set in a world torn by conflict between humans and technology, where innovation and heart drive hope against all odds. Interestingly, this premise could also describe the creative process behind this awe-inspiring film, where innovative, collaborative filmmaking techniques were used to deliver Director Gareth Edwards’ creative vision.

Edwards’ background in visual effects helped him relate to the technology and reach creative consensus throughout pre-production. “Gareth had a specific vision in his head throughout the film process, and his ability to relate to visual effects meant that we needed less technology on set to achieve creative clarity,” said Charmaine Chan, VFX supervisor at ILM. “We traditionally have to account for many LiDAR scans, HDRIs, etc. … but we instead resorted to traditional filmmaking techniques, where we get a plate and make it work,” she added.

The spectacular imagery in The Creator was made with an efficient budget

Ian Comley, VFX supervisor at ILM, noted that the creative process on The Creator was refreshingly welcome, as abstracting highly technical ideas on set takes a specific visual effects language. “Ultimately, Gareth had these ideas in his head and knew where the shot and the characters were going. When we got the shots, we could further pick things apart and refine, but his creative formula was always there from the start,” added Comley.

“It’s hard to put Gareth and ILM artists in a room and not want to push shots to their creative potential … it’s who we are.”

Pushing and Pulling

ILM’s vast talent pool of creatives tried to push the boundaries of what was possible for each shot, even when ambitions called for a modest approach at first. “We looked at enhancing vehicles and characters in small ways, but many times a full replacement was needed to take the shot where it needed to be creatively,” said Ian. “It’s hard to put Gareth and ILM artists in a room and not want to push shots to their creative potential … it’s who we are,” he added.

As part of the overall budget constraint, The Creator used minimal tracking data on set, which also had a performance benefit for the actors, as they weren’t distracted by tracking markers on their face or lengthy setup times. “Many times, the actors didn’t know if they would end up being a robot or not, as Gareth would simply roll the camera and make the call in post. This really helped actors act naturally, which in turn gave the film some fantastic and natural performances,” shared Comley.

Minimal setup time and tracking markers resulted in natural performances.

Effective Communication

Clear communication and expectations also helped the production further down the pipeline, as ILM was able to plan shots much more accurately. “We would have creative conversations with Gareth during our plate cleanup process, which helped us streamline the painting process, avoiding unnecessary plate cleaning since we knew things were going to be covered by an asset ahead of time,” explained Ian. “This also helped our three-strikes approval process for the show, where we really needed to keep the approval flow efficient to meet the budget. This was incredibly hard to commit to, but many shots were able to proceed in that fashion, which meant we had healthy discussions about where to really spend the extra time on the VFX of a major story point,” Comley added.

The Creator’s limited budget didn’t necessarily limit creative expectations, as ILM was able to work through the visual storytelling in other ways, explained Chan. “You can have amazing-looking visual effects if you keep it simple. We’re here to help tell a story and sometimes less is more,” she said. “Philosophically, it’s in the best interest of artists to get things in camera when possible; not everything needs to be CG. That’s something worth celebrating in this show,” added Comley.

Director Gareth Edwards in beautiful Thailand


Edwards was always excited to introduce new technology into the process where there was a clear benefit, including StageCraft, ILM’s revolutionary virtual filmmaking process. “Besides giving us great virtual cameras for our full CG shots, we had huge environment shoots with StageCraft where Gareth’s expertise really made a difference in the quality of the cinematography,” said Chan. “We were very optimized throughout the production process, making only what was needed for a shot and implying a larger world without having to build out massive environments all the time, extending things with matte painting layers when needed,” added Comley.

ILM’s StageCraft

“The little imperfections are what really make a shot feel grounded in reality, and the way the film was shot lent itself really well to this philosophy,” explained Chan. “We tried to really match the plates by profiling everything and making things imperfect, including lens aberrations and noise. Our lighters were amazing at getting us beautiful renders and we also extended the fidelity of the renders with some wonderful compositing using RenderMan AOVs and LPEs in Nuke,” she added.
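The AOV-and-LPE workflow Chan describes can be sketched in miniature. The snippet below is an illustration, not ILM's pipeline: it pairs a few per-lobe AOV names with Light Path Expressions in the style RenderMan documents, then rebuilds a beauty pixel by summing the lobes, which is what a simple Merge(plus) tree in Nuke does. The AOV names and the pixel values are invented for the example.

```python
# Sketch only: per-lobe AOVs defined with Light Path Expressions,
# recombined additively in comp. LPE strings follow the patterns
# documented for RenderMan; pixel data is invented for illustration.

# AOV name -> Light Path Expression (RenderMan-style lobe splits)
AOVS = {
    "directDiffuse":    "lpe:C<RD>[<L.>O]",
    "indirectDiffuse":  "lpe:C<RD>[DS]+[<L.>O]",
    "directSpecular":   "lpe:C<RS>[<L.>O]",
    "indirectSpecular": "lpe:C<RS>[DS]+[<L.>O]",
}

def recombine(aov_pixels):
    """Rebuild a beauty pixel by summing its per-lobe AOV
    contributions, mirroring an additive merge tree in comp."""
    return tuple(sum(px[c] for px in aov_pixels.values()) for c in range(3))

# Made-up RGB contribution for one pixel of each AOV
pixel = {
    "directDiffuse":    (0.20, 0.15, 0.10),
    "indirectDiffuse":  (0.05, 0.04, 0.03),
    "directSpecular":   (0.30, 0.30, 0.30),
    "indirectSpecular": (0.02, 0.02, 0.02),
}
beauty = recombine(pixel)
```

Splitting the render this way is what lets compositors grade or re-balance individual lobes (say, pulling down indirect specular) without re-rendering, since the recombined result still sums back to the beauty.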

The Creator sets the stage for a dystopian yet amazing world

“We relied so much on RenderMan geometry lights, especially for embedded robot lights. We leaned on that very heavily,” said Comley. “We also used light filters heavily on the show, especially to artfully craft shadows for clouds, which allowed us to really dial in the softness and scattering in shots,” he added.

“I’m sure XPU will be our future workflow, because of what it’ll offer interactively and offline.”

The RenderMan machine learning denoiser really helped achieve converged images in a significantly shorter time than before. “We leaned heavily on the denoiser. This allowed us to get predictable convergence, giving us the opportunity to re-grain renders in a consistent manner to match the plates,” explained Comley. “We didn’t have to resort to semi-converged frames to simulate grain. We were able to rely on measured and captured grain structures very predictably,” he added.
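The denoise-then-re-grain loop Comley describes can be illustrated with a toy example. This is a deliberate simplification of what a production pipeline does (real grain is measured per channel and per frequency band, not as a single sigma): estimate grain strength from a flat patch of the plate, and re-apply statistically matched grain to the clean, denoised render. All names and numbers here are invented for the sketch.

```python
import random
import statistics

def measure_grain(flat_patch):
    """Estimate grain strength as the standard deviation of a plate
    region that should be flat (e.g. a grey card or clear sky)."""
    return statistics.pstdev(flat_patch)

def regrain(clean_pixels, sigma, seed=0):
    """Re-apply grain to a denoised render with the measured sigma,
    keeping CG grain consistent with the plate."""
    rng = random.Random(seed)
    return [p + rng.gauss(0.0, sigma) for p in clean_pixels]

# Synthetic plate patch: a 0.5-grey card carrying ~0.02-sigma grain
rng = random.Random(42)
patch = [0.5 + rng.gauss(0.0, 0.02) for _ in range(10000)]
sigma = measure_grain(patch)

# Denoised CG render (perfectly clean here), re-grained to match
render = [0.18] * 10000
grained = regrain(render, sigma, seed=1)
```

The point Comley makes follows from this: because the denoiser delivers a predictably clean frame, the grain that goes back on is the measured one, rather than leftover render noise from a semi-converged frame standing in for grain.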

Matching the imperfections of film plates is essential to convincing CG


Worldwide Scale

A film of this magnitude needs a lot of artist collaboration, and ILM’s worldwide creative network allowed the team to scale very predictably, with work being done in the company’s studios in San Francisco, London, Vancouver, Sydney, and Singapore, as well as at a number of smaller VFX vendors. “We very much like to have a common technology core between studios. This came in handy when the number of shots grew substantially, from 800 to 1,800,” said Comley. “We could rely on Lama, our standard layered material system, to collaborate more effectively with partner studios. This was especially beneficial for NOMAD, the giant space station, which needed to be seen in a multitude of shot types, from extreme closeups to very wide shots,” explained Comley.

Leveraging new workflows that benefit this need for collaboration is also a fundamental and ongoing exploration with every film ILM takes on, including adopting new tech, such as Pixar’s OpenUSD, or further developing open standards such as EXR or MaterialX. “Although we’re still very much leveraging ILM’s Zeno as our main content creator, we’ve got a fair amount of USD in the mix already, including a proprietary USD plugin for Zeno,” said Comley. “Pivoting any company of scale to a completely new standard such as USD is not an easy task, but it’s definitely getting there,” he added.

One of the many amazing establishing shots in The Creator

Rendering the Future

Looking ahead, RenderMan is prioritizing interactivity, aiming to support artists in achieving a seamless creative process. “I’m sure XPU will be our future workflow, because of what it’ll offer interactively and offline,” said Comley. “For the longest time we’ve had a deficit between the interactivity that an animator expects and the visual fidelity of a final render and we’re looking forward to closing that gap with XPU, where we can use a common shading language to simplify the feedback loop between animation and look development,” concluded Comley.

“This was a show that absolutely leaned on RenderMan and it was a very seamless process,” explained Chan. “It helped us make the show incredibly successful,” she concluded.
