ILM is no stranger to the SciTech Awards; numerous innovations crafted by its artists and engineers have been recognized with prestigious Scientific and Technical Awards from the Academy of Motion Picture Arts and Sciences. The studio’s latest two SciTech honors were awarded recently for ILM’s GPU-based simulation and volume renderer Plume and the Zeno application framework. fxguide spoke to key members of ILM’s team about these in-house visual effects tools.
In full Plume
With the release of The Last Airbender in 2010, audiences first witnessed the power of ILM’s Plume. For that film, director M. Night Shyamalan sought a specific and art-directable pyro effects solution, and ILM delivered a GPU-based system that let artists iterate quickly through multiple versions to show the director. The system, Plume, both simulates and renders final-quality volumes on the GPU. It was recognized this year at the SciTechs with a Technical Achievement Award. The full award citation reads: “To Olivier Maury, Ian Sachs and Dan Piponi for the creation of the ILM Plume system that simulates and renders fire, smoke and explosions for motion picture visual effects.”
ILM R&D engineer Olivier Maury told fxguide that what sets Plume apart from many GPU renderers is that both facets of simulation and rendering of things like smoke, explosions and fire are combined into the one tool. “The artist uses Plume to go from inputting simulation parameters and tweaking them, all the way to final rendering,” says Maury. “So what they get at the end is final frames that they can pass onto compositing.”
– The SciTech Award is presented for Plume.
The origins of Plume lie ultimately in the limitations ILM had faced previously in creating fast iterations for heavy simulations. “We had experimented on Harry Potter and the Half-Blood Prince with a GPU-based solution (for a scene of Dumbledore summoning a tornado of fire),” recalls Maury. “But we were a little bit limited in the look we could get out of it. It was a slice-based solution, essentially 2.5D – it didn’t allow us to move the camera freely around the fire or to art direct the fire too much.”
When The Last Airbender show was in its infancy, the team looked to create a fully-3D Eulerian solver “akin to what you would build for CPUs,” says Maury. This was tied to the volume renderer. “At the very beginning we weren’t quite sure that we were going to render directly,” adds Maury, “but very quickly we realized that the simulation was so fast and the renderer was so fast that what was actually taking the most time was saving to and loading from disk. The grids were getting big and the IO time was becoming the bottleneck. We got rid of that stage and started going direct to the renderer, and that’s really one of the major characteristics of Plume – an artist can go directly from simulation to render without having to visualize the simulation results in some kind of intermediary tool.”
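To illustrate the idea (a toy sketch, not ILM’s actual code), here is a minimal Python version of a direct sim-to-render loop: a simple semi-Lagrangian advection step stands in for the Eulerian solver, and the grids stay in memory from simulation straight through to “render”, with only final frames leaving the loop.

```python
import numpy as np

def advect(q, vx, vy, dt):
    """Toy semi-Lagrangian advection: trace each cell centre backwards along
    the velocity field and sample the old quantity there (nearest-neighbour
    sampling keeps the sketch short; real solvers interpolate)."""
    ny, nx = q.shape
    ys, xs = np.mgrid[0:ny, 0:nx].astype(np.float64)
    sx = np.clip(np.round(xs - dt * vx), 0, nx - 1).astype(int)
    sy = np.clip(np.round(ys - dt * vy), 0, ny - 1).astype(int)
    return q[sy, sx]

def render(density):
    """Stand-in 'renderer': integrate density along one axis, i.e. march
    straight rays through the grid."""
    return density.sum(axis=0)

# Direct sim-to-render loop: the density grid never touches disk.
n = 64
density = np.zeros((n, n))
density[n // 2, n // 2] = 1.0               # a single puff of smoke
vx = np.ones((n, n))                        # constant wind to the right
vy = np.zeros((n, n))
frames = []
for frame in range(8):
    density = advect(density, vx, vy, dt=1.0)
    frames.append(render(density))          # only final frames leave the loop
```

The design point Maury describes is the loop structure itself: because the solver and renderer share the same in-memory grids, there is no save/load stage in between to become the bottleneck.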
This promotional video from NVIDIA discusses the use of their GPU cards at ILM on The Last Airbender.
Plume was written as a ray-marching renderer with fire and smoke shading built in. Primitives and particles generated in ILM’s Zeno software (see below) fueled the simulations. The firepower required to run these sims came from the adoption of NVIDIA’s GPUs and CUDA, beginning with the Quadro FX 5800s (for The Last Airbender, ILM ran a 12-machine GPU render farm of the Quadros). Today the farm numbers 128 GPUs.
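As a rough illustration of emission–absorption ray marching in general (not Plume’s implementation; the parameter names here are assumptions), the following marches front-to-back through a density grid, accumulating emitted light attenuated by the transmittance accumulated so far:

```python
import numpy as np

def ray_march(density, emission, step=1.0, sigma=0.5):
    """Front-to-back emission-absorption ray marching: one straight ray per
    pixel, marching down axis 0 of the volume. `sigma` is an extinction
    coefficient; `emission` holds per-voxel emitted radiance (e.g. fire)."""
    depth = density.shape[0]
    transmittance = np.ones(density.shape[1:])
    radiance = np.zeros(density.shape[1:])
    for z in range(depth):
        # fraction of light that survives this slice of the volume
        slice_transparency = np.exp(-sigma * density[z] * step)
        # light emitted in this slice, seen through everything in front of it
        radiance += transmittance * emission[z] * (1.0 - slice_transparency)
        transmittance *= slice_transparency
    return radiance, transmittance
```

A production renderer adds lighting, scattering and adaptive stepping, but the structure is the same per-ray loop, which maps naturally onto one GPU thread per pixel.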
Initial sims run by artists at ILM were 640x320x320 – about 65 million voxels. “It’s not a huge sim, but fairly decent,” says Maury. “Even today on CPU a lot of artists would choose this res to be able to iterate quickly, even though they can go much higher res on CPU.”
That number has now increased to 180 million voxels, with artists able to carry out GPU sims on their local workstations or via the farm. “At the beginning,” notes Maury, “we had artists at a workstation with local GPUs, and they had a second GPU so they wouldn’t tie up the server. Then we were pumping out so many sims that the question came up: we should have some kind of farm. But if you have only a farm, you give away the immediacy of being able to iterate very quickly. We tweaked the software (ILM’s SciTech-winning ObaQ render queue management system) so that the pick-up times are very low. Basically we can launch a job, have it picked up very quickly and have it dispatched to one of the GPUs in the farm.”
Over time, Plume has been used on just about all major shows for smoke, fire, dust and explosion effects, including, among others, The Avengers, Battleship and the upcoming Transformers: Age of Extinction. Interestingly, although Plume is a full fluid solver, it is not used by ILM for liquid simulations. “It’s a gas solver,” explains Maury. “We can’t do liquids with Plume because of that – it’s not tracking an interface or a level set, and today’s modern liquid solvers are mostly FLIP-based.”
Plume continues to be refined, of course. One addition was support for deep output to aid deep compositing. Maury notes that the extra data generated can offset some of Plume’s speed advantage, but the option is there, along with a hold-out feature that accepts deep data as input.
The system has also become so adept at serving artists with quick iterations that “instead of running huge sims, people here now tend to run many smaller sims,” says Maury. “And this is how we work with Plume now which is to layer a lot of sims together.”
Many things to many artists: Zeno
The Academy also acknowledged this year the innovation behind another ILM tool – Zeno. In use at the studio since 1997, Zeno may not be a name most people recognize, but the content creation system for both 2D and 3D visual effects work is at the core of ILM’s production pipeline. The full Technical Achievement Award citation reads: “To Florian Kainz, Jeffery Yost, Philip Hubbard and Jim Hourihan for the architecture and development of the Zeno application framework.”
The development of Zeno came about following ILM’s need for artists from a wide number of effects disciplines to work together and exchange data within both proprietary and commercial tools. “A mechanism that allows artists to collaborate and transfer data back and forth between them has always been a critical part of an efficient workflow,” states ILM’s R&D supervisor Cary Phillips, who is not named on the SciTech award but has been a key figure in Zeno’s continued development at the studio.
“Most significant is the mechanism we here call ‘shot files’, the scene graph component,” says Phillips. “Engineers use these to build networks of data that compute values and store data and manipulate data. Built into this system is a very robust file referencing mechanism that allows you to take all of the data, the scene graph that represents the information in your shot and parcel it out in a completely arbitrary way to different files on disk. It allows us to make decisions about what gets stored in particular files according to the workflow that we want to craft around it.”
As visual effects shots have become more complex – involving live action plates, complex CG characters and effects sims – Zeno has given ILM a way to represent correspondingly complex scenes. “You have a framework that allows you to build more or less arbitrary stuff,” says Florian Kainz, ILM’s computer graphics principal engineer. “Then you break it into units that make sense for an individual artist to work on, and then pass onto other artists. You can build the kinematic rig of a creature, have skin simulations and what have you. The system is extensible, and the core Zeno framework doesn’t impose a whole lot of restrictions on how the work is split up amongst groups of artists.”
An early application in the late 1990s that took advantage of Zeno’s framework was ILM’s camera matchmove system MARS (another tool recognized with a SciTech award). Zeno provided the mechanisms to connect the typical inputs, outputs and solvers needed across the various elements of a scene: geometric objects, coordinate transforms, cameras, images, 2D elements and three-dimensional axes in space.
Watch the Zeno team receive their SciTech Award.
Phillips explains: “There’s a two-layered scene graph. At the lowest level there are mechanisms for defining what we call operators – objects that perform a computation and provide an output. That’s roughly equivalent to the Maya hypergraph. But what Zeno has is a higher-level construct, another scene graph made up of data objects we call ‘Oids’. At the higher level there are Oids, and Oids are connected via relationships.”
“You’d have a camera, a background plate, and a solver,” continues Phillips, “and you have relationships that connect between them. So the user’s view of the data is Oids that have attributes on them and they connect them to other parts of the scene. The engineering that this award is recognizing is the higher level mechanism of defining what are the data objects in the scene, how are they connected together and how are they distributed out to files.”
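In rough Python terms (hypothetical names and attributes, not Zeno’s API), the higher-level layer Phillips describes might look like this: Oids carry attributes and are wired together through named relationships, while the value-computing operators live a level below.

```python
class Oid:
    """Hypothetical stand-in for Zeno's higher-level scene objects: a bag of
    attributes plus named relationships to other Oids."""
    def __init__(self, name, **attrs):
        self.name = name
        self.attrs = dict(attrs)
        self.relationships = {}   # relation name -> target Oid

    def connect(self, relation, target):
        self.relationships[relation] = target

# A matchmove-style setup: a camera, a background plate and a solver,
# wired together by relationships rather than raw operator connections.
camera = Oid("shotCam", focal_length=35.0)
plate = Oid("bgPlate", path="plates/sh010.exr")
solver = Oid("mmSolver")
solver.connect("camera", camera)
solver.connect("plate", plate)
```

The user-facing view is the relationship graph; how each Oid’s values are actually computed, and which file each Oid is stored in, are separate concerns handled underneath.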
Although Zeno came into existence around 1997, the tool gained significant prominence for its use in the production of the visual effects for War of the Worlds (2005). “That was the time we retired our old lighting editor and adopted the use of Zeno primarily for lighting at the front end and rendering with RenderMan,” says Phillips, who also highlights Pirates of the Caribbean: Dead Man’s Chest and ILM’s realization of the Davy Jones character as a show taking full advantage of Zeno. The system is now in major version 3.0, having experienced a major re-tooling of its interface around the time of Transformers: Dark of the Moon (2011).
Internal tools are built directly on Zeno’s framework, while third-party commercial tools are still very much a part of the system. For example, ILM now relies on The Foundry’s KATANA lighting tool instead of its earlier Lux lighting editor. And texture painting, once handled solely via Zeno, is carried out in MARI (artists still utilize Zeno’s UV layout tools, however).
Phillips says: “You can ask ten different artists or engineers at ILM, ‘What is Zeno?’, and you’d probably get ten different answers.” But perhaps that is the beauty of a framework that, as an interactive content creation system, works for a diverse group of VFX specialists and facilitates the creation of ILM’s world class imagery.
In addition to Plume & Zeno… FLUX & Voodoo were honored
It is important to note that at the 2014 SciTechs, held at the Beverly Hills Hotel in LA last Saturday night, Ronald D. Henderson was also honored for the development of the FLUX gas simulation system at DreamWorks Animation. Henderson’s use of the Fast Fourier Transform (FFT) for solving partial differential equations gives DreamWorks’ FLUX a greater level of algorithmic efficiency when multi-threading on modern CPU hardware. This innovation enables very high-resolution fluid effects with fast turnaround times, but on CPUs rather than ILM’s GPU-based Plume. Developed completely independently, Henderson’s and ILM’s teams first publicly showed their work at the same SIGGRAPH. FLUX has been used in DreamWorks films such as the recent animated feature The Croods. Henderson spoke to fxguide about how honored he was to receive the award and about his mutual respect for the ILM team’s GPU work on Plume.
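The core trick of solving a Poisson-type equation spectrally can be sketched with NumPy (a generic illustration of the technique, not DreamWorks’ code): with periodic boundaries, the Laplacian becomes a per-mode multiply in Fourier space, so the solve reduces to two FFTs and an element-wise divide.

```python
import numpy as np

def poisson_fft(rhs):
    """Solve lap(p) = rhs on a periodic 2D grid (unit spacing) spectrally.
    In Fourier space the Laplacian of mode k is a multiply by -|k|^2,
    so the solve is: forward FFT, divide, inverse FFT."""
    ny, nx = rhs.shape
    ky = 2 * np.pi * np.fft.fftfreq(ny)        # radians per grid cell
    kx = 2 * np.pi * np.fft.fftfreq(nx)
    k2 = ky[:, None] ** 2 + kx[None, :] ** 2
    rhs_hat = np.fft.fft2(rhs)
    k2[0, 0] = 1.0                 # avoid divide-by-zero for the mean mode
    p_hat = -rhs_hat / k2
    p_hat[0, 0] = 0.0              # pin the arbitrary constant: zero-mean p
    return np.real(np.fft.ifft2(p_hat))
```

Because each Fourier mode is handled independently, the work parallelizes cleanly across CPU threads, which is the efficiency property the citation highlights.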
Peter Huang, Chris Perry, Hans Rijpkema and Joe Mancewicz were also honored for the Voodoo application framework at R&H. For more than a decade, Voodoo’s design concepts and approaches have enabled a broad range of character animation toolsets to be developed at Rhythm & Hues. When accepting the award the team paid particular respect to the artists at R&H who helped, tested and suggested improvements to the Voodoo system.
(Voodoo was developed independently from Zeno)