Darren Aronofsky’s Noah is not quite the Noah tale you remember. While there is an ark and a flood, the director conjures up so much more in his imagining of the biblical story – from giant fallen angels to mythical powers and a timelapse depiction of 14 billion years of evolution. To help tell his Noah, the director looked to both practical and digital means for building an ark and showing large-scale flooding and ocean effects.
ILM’s Ben Snow oversaw the film’s visual effects, relishing the opportunity to work on a slightly ‘unusual’ project. “They sent us the script and Darren sent an explanatory note with the script,” recalls Snow. “And he sent a book of imagery with it too that had some pretty inspiring and interesting stuff that really showed this was going to be something a little bit different than what we were used to. So that’s the sort of thing I particularly love, and leaped at.”
Crafting the Watchers
Fallen angels known as Watchers exist on Noah’s world. Imagined as rock-like giant six-armed creatures, the Watchers play a crucial role in helping Noah build the ark and defending it against his adversary Tubal-cain and a horde of Tubal-cain’s followers. ILM was responsible for the Watchers.
“They were definitely one of our biggest creative challenges on the film,” says Snow. “When Darren first came to ILM one of the first things we did was a really interesting virtual cinema session where we took one of the things from the script which involved the Watchers, and had a few of our performers down on the stage in mocap suits and a little virtual camera and an ark exterior environment that we made up pretty quickly in Photoshop and a rough ark model. The Watchers were these sort of large humanoid creatures the way we’d imagined it.”
As production continued, Aronofsky enlisted the help of several art directors to keep exploring the look of the Watchers. “In the end,” says Snow, “the design we pinned down was from a sculptor working in Brooklyn, a colleague of the art director. They were mucking around in the art director’s backyard with some wax and sticks and stones and came up with this crazy design. Darren sent me a picture of it and I was like, ‘You’re kidding, you can’t be serious.’ And we got on the phone, and well actually it’s kind of cool. We started from this almost odd looking papier mâché sculpture and then had to do what we are experts at which is trying to make this weird crazy-ass design into a functional thing that you could believe on screen.”
The backstory of the Watchers is revealed as they crash into the Earth as energy and light ‘beings’, but then emerge from the liquified rock around them. “There’s this key shot where we see the impact of this comet,” describes Snow, “and we have a big explosion that we shot at 32TEN Studios as a miniature effect. Then you see this glowing creature we’ve just been introduced to in space, actually forming, and pushing up all this tar, and you can see this covering of molten tar. That was oddly enough done with the same fluid simulator that we used for our big water effects.”
Since the Watchers would effectively be moving pieces of rock, artists had to explore how this would appear in animation. “The characters are not superheroes,” explains ILM associate visual effects supervisor Philippe Rebours. “They fell down on Earth and they’re slightly broken as well because the Creator does not talk to them anymore. We studied the movement of people who were paralyzed or could not move their legs.”
“Darren even got some of the dancers he’d worked with on Black Swan to come into the mocap stage and do a bit of work before we’d designed it and after we’d designed it,” says Snow. “At one point Darren said ‘I really want them to be stuttery. And I said, ‘Well, it’s going to start looking like Ray Harryhausen with that stop motion quality,’ and he’s like, ‘That’s not a bad thing!’ He was all for that. We built stutter into their motion but then we were like, ‘Ah gee, it really does end up looking stop motion-y’ so let’s explore smoothing that out.”
The Watchers’ facial animation was also tested thoroughly. “We did some tests where we decided to go with much smaller rock pieces so they could have more movement and fluidity which looks more like a human,” says Rebours. “We did a facial mocap test where we took some dialogue from some tests we were doing separately from the movie and applied that to the creature. I think what happened is it looked too human and too realistic in a sense. Darren went in a completely opposite approach which is having the movement being minimal.”
The final animation relied on ILM’s system devised for Transformers, where, explains Snow, “you can take your base animation and override it so that you can specifically move pieces. That was how they added little impacts when the rocks collided.” The Watchers’ rocky surface was developed from photographic reference of rocks in Iceland where production filmed a number of scenes, varying surfaces and shapes so as to differentiate between Watcher characters. The final creatures were rendered in RenderMan.
Building an ark
Noah believes he has been chosen by the Creator to build an ark to prepare for ‘a great flood’. Noah plants a seed received from his grandfather Methuselah into the ground, from which an entire forest grows next to Methuselah’s mountain. Using the wood from the forest, the ark construction begins.
The sprouting forest started with live action plates captured on black volcanic ash soil in Iceland. Rivulets of water ultimately form fountains before a forest erupts at the location. A number of smaller shots envisaged for the sequence became one large spiraling shot as the forest grows. For this, helicopter plates of the actors in Iceland were filmed, with ILM launching into a fully CG build for the foliage.
“From the very start we knew we had to deal with the growing forest,” notes Snow. “We’d been using SpeedTree with some success here, so we actually got in touch with the guys at SpeedTree and said we’d have to be able to tweak the tools to be able to do the plant growth that we needed to do. We needed to grow a tree from nothing, basically. And we didn’t want it to just magically come up and unfold, we wanted it to start small like a sapling and have the twigs come out and the leaves expand at the end.”
Tree growth animation cycles were approved by Aronofsky and then choreographed for the final shots. These had to match the helicopter plate. “It was a tricky matchmove,” says Snow, “and then we got it matched and stabilized and then they wanted to extend the move with the mountain more, so we reprojected the mountain, extended the black plane they were on because it went back to normal arctic landscape, and then added this celestial sky around.”
The final forest was married to shots of the live action ark shoot on Long Island. “We re-set up the family camp there and put black gravel down to match what was in Iceland and extended that in CG,” says Snow, “but we added the same sort of tree growth in the shot following that so we’re looking up from the family’s point of view to see the trees finish growing. And it provides a nice tie-in. From the film’s point of view it represents the end of this pre-age of Earth with all these celestial skies. It’s a thematic turning point.”
Look Effects contributed shots of a rivulet forming – essentially a magical stream emanating from Noah’s camp that spreads out and leads the animals to the ark. “This proved to be a challenging sequence from a conceptual standpoint,” says Look visual effects supervisor Dan Schrecker, “because we needed to sell a couple of ideas: that time was passing and that the rivulets were branching out from their origin point in the forest. We initially had timelapse plates with which to work, but as the concept of the sequence changed we ended up creating a fully digital environment.”
To show the passage of time, Look assembled a series of sunset images for the skies, flashing each one for two frames and animating different elements in each shot to further sell the idea. “For instance,” says Schrecker, “in one of the shots we see a series of buildings crumble before our eyes and piles of trash disappear as it is scavenged by humans who flash through the frame. The entire effect is one of timelapse photography that portends the work that ILM did later in the film as Noah recounts the tale of Genesis to his family.”
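The two-frame sky flashing described above is, at its simplest, a frame-scheduling problem: each still is held for a fixed number of frames before cutting to the next. A minimal sketch of that idea (function name and sky labels are hypothetical, not Look’s actual comp setup):

```python
def timelapse_skies(skies, hold=2):
    """Expand a list of sky stills into a frame sequence, holding each
    still for `hold` frames -- the two-frame flashing used to sell the
    passage of time. A scheduling sketch only, not Look's pipeline."""
    return [sky for sky in skies for _ in range(hold)]

# three sunset stills become a six-frame sequence at the default hold of 2
frames = timelapse_skies(["dusk_01", "dusk_02", "dusk_03"])
```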
Shots of the animals arriving were shared by ILM, who created the land based creatures, and Look Effects, who concentrated on birds.
Interestingly, Aronofsky sought to differentiate the animals in his Noah from other re-tellings of the story. “In Darren’s first cover letter he sent with the script,” recalls Snow, “it said not even the animals will be clichéd – there won’t be elephants and giraffes – but he said there will be species that don’t quite exist on Earth. Everything will be a little different and fantastical. It was definitely from day dot to not do the children’s book version where you have the giraffe sticking out the top of the ark heading off into the sunset.”
ILM had to design thousands of animals. “We talked about ways we could do this,” says Snow. “There’s this game – Spore – where you can design animals, and Darren thought maybe we could put that on the web and everyone could design animals and if we like their designs they could go into the film?”
Ultimately, in order to realize so many varieties of creatures, ILM embarked on a ‘Zoo Project’ in which a basic animal toolkit was assembled that could be used to build the necessary kinds of animals that would board the ark. Artists looked to both real and extinct animals, particularly ones that were a little more unusual in nature. They then grouped animals into different types in an effort to work out what was similar between them.
“There were groups that were more visual or motion related rather than actual scientific classes,” notes Snow, “but we came up with those with the idea that we’d make a skeleton complementary to each of these groups of animals and be able to take animation and apply it to that skeleton across that range of animals, and similarly textures could be shared, because the topologies would be very similar.”
Those similar groupings applied to palettes of pelts and textures too. “For each of these different classes we made several variants that were pretty wildly different to one another,” says Snow. “Then we made a library of horns, trying to avoid anything too distinctive like moose horns, and a set of different patterns and fur designs – and with that we were able to end up with a whole set of mammals, two by two, each one being different.”
Reptiles and snakes proved particularly challenging since, as imagined by Aronofsky, they would be part of each wave of animals and would be crawling over one another as they arrived. “With the reptiles,” recounts Snow, “we did what we’d done with the mammals and we made two or three base lizard variants but then from one variant you could have a frilled-neck lizard and then the horn-toed lizard. We were able to add add-ons and things to make many varieties. With the snakes, it was much more pattern based. We had different head shapes we could put on the lizards.”
Although there were only relatively few animal shots, they were often quite long – and complex. ILM had 8,000 animals in one shot, for instance, and 2,500 in another. This was the first major show in which the studio used Massive to animate the creatures (and also people) in the scenes. “The big pullback through the ark was the biggest shot we’d had to render here in terms of processing time,” says Snow. The animals were rendered in RenderMan.
A significant challenge was proximity – ensuring the animals did not run into or walk on top of each other. “Also to have the pairs intelligently following each other,” adds Snow, “because you want to see they’re two by two. You don’t want them to go off wandering into the hinterland leaving their mate behind. They had to keep themselves close to one another and not run into much bigger or smaller animals.”
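The pairing behavior Snow describes maps onto classic steering rules: head for a goal, stay cohesive with your mate, and separate from everyone else. The toy step below is loosely modelled on that two-by-two behavior and is not ILM’s actual Massive setup; the goal point, weights and radii are all illustrative:

```python
import numpy as np

def steer(agents, mates, speed=0.1, pair_pull=0.3, avoid_radius=0.5):
    """One toy 2D steering step for paired crowd agents: each agent heads
    for a shared goal, is pulled back toward its mate, and pushes off any
    non-mate inside its avoidance radius."""
    goal = np.array([10.0, 0.0])                      # e.g. the ark's ramp
    new = agents.copy()
    for i, pos in enumerate(agents):
        to_goal = goal - pos
        vel = to_goal / (np.linalg.norm(to_goal) + 1e-9)
        vel += pair_pull * (agents[mates[i]] - pos)   # stay with your mate
        for j, other in enumerate(agents):
            if j in (i, mates[i]):
                continue                              # never avoid self or mate
            offset = pos - other
            d = np.linalg.norm(offset)
            if d < avoid_radius:
                vel += offset / (d * d + 1e-9)        # repel nearby strangers
        new[i] = pos + speed * vel / (np.linalg.norm(vel) + 1e-9)
    return new

# two pairs of animals walking toward the ramp, mate indices cross-linked
agents = np.array([[0.0, 0.0], [0.0, 0.2], [0.0, 2.0], [0.0, 2.2]])
step = steer(agents, mates=[1, 0, 3, 2])
```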
Animators would craft walk cycles, as well as cycles for setting down and going to sleep. “We’d go in and for the animals that were front and center, we’d do another layer of animation on top,” says Snow. “So we’d got the Massive system integrated well enough into the rest of our pipeline that we could go back and forth between the two and swap out things for hand animation where appropriate.”
Look Effects delivered complex bird animation and flocking shots for the arrival sequences, too, overseen by visual effects supervisor Dan Schrecker. The studio built a new proprietary feather and flocking system in Houdini in order to create what needed to be realistic but also varied birds.
“The whole thing started off with us figuring out how do we get a feathered bird to look like a feathered bird and not like hair,” explains Look Effects CG supervisor Dave Zeevalk. “We wrote a proprietary tool that generates a blue noise pattern that applies tons of points onto a 3D surface and then based on a pre-defined scale of each point, analyses the points around it and pushes other points away from it until it’s given itself enough room to meet its scale without colliding with another point.”
“Then we generated a feather tool,” adds Zeevalk, “to create a photoreal feather that gets instanced onto each point. And then through pencil-type tools we could sculpt really fine detail in terms of scale and where a feather lies and the direction that it lies, and then we’d paint maps directly onto the surface of your bird for dots and stripes. At render time, each of these feathers gets baked out as geometry with all the appropriate maps for detail.”
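The scatter-and-push step Zeevalk describes is essentially an iterative point relaxation: every point claims a clearance based on its scale, and crowded neighbours are nudged apart until each has room. A minimal 2D sketch of that idea follows – it is not Look’s proprietary tool, and the names, radii and iteration counts are illustrative:

```python
import numpy as np

def relax_points(points, radii, iterations=50, step=0.5):
    """Push overlapping points apart until each has room for its radius.

    A toy 2D analogue of the feather-scatter relaxation: each point claims
    a clearance equal to its radius, and any neighbour closer than the
    combined radii is nudged outward a little on each pass."""
    pts = points.copy()
    for _ in range(iterations):
        for i in range(len(pts)):
            delta = pts - pts[i]                  # vectors from point i to all others
            dist = np.linalg.norm(delta, axis=1)
            dist[i] = np.inf                      # ignore self
            min_sep = radii + radii[i]            # required clearance per pair
            crowded = dist < min_sep
            if crowded.any():
                # push crowded neighbours outward along the separation axis
                push = delta[crowded] / dist[crowded, None]
                pts[crowded] += step * push * (min_sep[crowded] - dist[crowded])[:, None]
    return pts

rng = np.random.default_rng(0)
pts = relax_points(rng.random((80, 2)), rng.uniform(0.02, 0.05, 80))
```

In 3D the same repulsion runs over a mesh surface rather than the plane, but the stopping condition – no point inside another point’s clearance – is the same.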
Look embraced a deep compositing pipeline for the show – a toolset Schrecker says was invaluable for one particular shot of Noah and his family walking down the aisles of bird roosts with their magic smoke. “Because they were essentially criss-crossing each other during the course of the shot, such that a single bird would appear in front of and behind other birds and the practical set, standard holdouts were impossible,” describes Schrecker. “We relied on deep techniques to combine the CG elements with the roto mattes that were required for the practical roosts. This allowed us to do things like pull individual birds out of the comp fairly late in the process, stay flexible with color and depth of field without introducing edge issues and also deal with the semi-transparent curtains that were present in the practical set.”
“A further approach that we took here was to render our elements with full deep RGB and break out passes,” adds Schrecker. “Normally, the approach is to render deep opacity and shadow maps because there is less data to manage and in most scenarios this is enough. But because of our volume of birds and the amount of flexibility we wanted, we took a more extensive approach. This led to huge amounts of data that we had to manage, but in the end enabled us to achieve what we set out to do.”
The battle and the deluge
As the deluge begins, Noah and his family are threatened by Tubal-cain and his army. An enormous rain-soaked battle ensues in which the storm increases in intensity while the Watchers and Noah fight off their attackers. This battle sequence would combine live action photography acquired at the ark set on Long Island, practical rain effects, ILM water simulations and crowd creature animation.
Snow says Aronofsky wanted the battle sequence to be consistent with the director’s overall goal of being a “realistic, grittier take on the biblical epic.” So discussions between the filmmakers, including the director, Snow, DOP Matthew Libatique and special effects supervisor Burt Dalton, centered on capturing as much battle footage in camera as possible on the ark set. Since maintaining consistent lighting over what would be an area two football fields in size would have been extremely challenging, the decision was made to film the battle as ‘night for day’ using artificial lighting and practical rain effects.
Dalton devised a system of rain bars that hung from cranes over the battlefield and could be computer-controlled from an iPad app to adjust the level of rain. In addition, the rain bar structure included balloon lights that offered a way of lighting the area without shadows. The result was that Dalton’s team could dump 5,000 gallons a minute of rain on the area while the fighting scenes were shot.
Of course, the practical rain shoot was always going to be augmented with visual effects work – for both rain and the more flood-like scenes and water geysers that end up erupting from the ground. ILM has had a number of recent heavy water shows, including Pacific Rim and Battleship. For Noah, the studio built on top of their existing fluid sim tools, with work led by Raul Essig. “Some of the efforts to revamp the tools for Pacific Rim were really aimed at getting things a lot more controllable and fast,” states Snow, “which is something we definitely wanted to do on Noah, and I felt that some of the Battleship stuff had some of the real sense of the qualities of the water simulations.”
“So a lot of the toolset was oriented to these needs – essentially these large fights amidst a large body of water,” says Snow. “We needed a lot more tweaks and controls to the water. We worked on our toolset to give us back more controllability, but now being able to utilize some of the increased speed so that we could iterate and do full water simulations.”
ILM looked to practical rain from the principal photography plus elements shot at 32TEN to augment the shots during compositing. Computer graphics sequence supervisor Polly Ing notes that the compositing team had a mammoth job on the battle shots. “Compositing had to help us compress some of the wet source-y looking specular highlights on all the people’s clothing, the leather, the weapons,” she says. “From there we had to figure out how are we going to integrate the live action into the plates.”
An on-set team had captured HDRIs, LIDAR scans, spheres and other reference photography during the shoot. “So we had the actual information of every single light that was on set and the cranes, and I built 3D rigs based on different camera angles and positions on the set,” explains Ing. “From there I pretty much used from the top of the tree lines down for the actual 3D environment. Then I changed the upper half of the sphere so that it would reflect more of the sky we wanted instead of pitch black. By combining those together we had the lights that we could use to light and get a base starting point to match the people. But then we could use artistic license.”
The shots made heavy use too of Massive agents attacking the Watchers and the ark. “Darren started realizing what we were doing and the quality of the digital doubles and how close we were getting to them,” comments Snow. “Like many a filmmaker who we unfortunately show our hand to, who say, ‘Oh my God these guys can do anything,’ they say, ‘I never cared for these four or five shots, why don’t we get rid of those and come up with a big shot that really showcases the Watcher going to town on the crowd?’” ILM therefore was able to re-create the battlefield as a digital matte painting and employ Massive crowd sims, hand-animated people, splashing water and of course the digital Watchers to achieve the desired shot.
For ILM it was vital that their live action people, Massive agents and water effects integrated perfectly. The studio relied on Katana to guide lighting and rendering. “The simulations for the crowd were done in Massive and we would bring in the particle files into Katana,” says Ing. “From a sequence level we had all of our different light rigs based on the position on the set, the camera angle, so we could lock off different sections of a sequence to different lighting TDs and it was super easy to share.”
Water impacts, splashing and other effects helped sell the interaction, too, including for the digital Watchers. “We had extensive look development on the Watchers and we used various shader tricks to make it look like water was running down them and displaced off of the rock,” adds Ing. “We did splatters of rain drops and things flinging off the Watchers that were set up in a canned kind of way so lighting TDs could run those out for their shots.”
ILM also took a unique approach to how rain appears on screen by dialing in some of the effects only when they reached the DI stage at Technicolor. “We’d already seen how well the colorists could use the mattes to do the magic that DI does these days,” notes Snow, “so I said, ‘If we could give you a matte of the rain, maybe we could do this in a DI session?’ I went in and selected rain elements that would be appropriate for everyone, then Technicolor could make mattes for those and add them as an extra channel into the DPX files. Then in the DI session, the colorist could use that matte to dial in more rain. It was a really elegant solution in the end.”
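Conceptually, the DI trick amounts to carrying the rain element as one extra matte channel in the DPX frames and letting the colorist scale it additively into the grade. The sketch below is a schematic of that idea only – the additive model, tint and values are illustrative, not Technicolor’s actual grading math:

```python
import numpy as np

def dial_in_rain(plate, rain_matte, gain, rain_color=(0.8, 0.85, 0.9)):
    """Additively mix a pre-rendered rain matte into a graded plate.

    `rain_matte` is a single-channel matte (like the extra channel carried
    in the DPX files); `gain` is the colorist's dial. Purely schematic."""
    rain = rain_matte[..., None] * np.asarray(rain_color)   # tint the matte
    return np.clip(plate + gain * rain, 0.0, 1.0)

plate = np.zeros((4, 4, 3))          # stand-in for a graded frame
matte = np.full((4, 4), 0.5)         # stand-in rain matte channel
more_rain = dial_in_rain(plate, matte, gain=1.0)
less_rain = dial_in_rain(plate, matte, gain=0.2)
```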
As the deluge continues and only the ark and its inhabitants are safe, ILM’s shots turned to the ark at sea. “The way we approached that sort of thing was to try and do some sort of base simulation,” explains Snow, “but then we find that we want to add a bit of animation on top of that to really make it work. You might start with a base simulation to make it work in your buoyancy rules for the rigid body simulation, and then usually you want to go in and do some animation to add the sort of qualities that look pleasing.”
The ark does not have any rudders or propulsion – it is literally built in a rectangular shape for survival. “We wanted it to ride in the water,” says Snow, “so we decided the water level would sit just below the top deck, where the family might actually come into the sun occasionally, and then it would need this sort of ballast. So we consulted with hydrographics people and worked out it’d need this amount of ballast to actually float properly.”
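For a box hull like the ark, the ballast question Snow mentions is plain Archimedes: the hull sinks until the displaced water weighs as much as the vessel, so the draft is mass over (density × waterline footprint). A back-of-envelope sketch – the 20,000-tonne mass is purely illustrative, and the 137 × 23 m footprint simply follows the traditional 300 × 50 cubit dimensions:

```python
def ark_draft(total_mass_kg, length_m, beam_m, water_density=1025.0):
    """Draft (submerged depth, in metres) of a rectangular hull.

    A box hull displaces length * beam * draft of water, so the draft that
    balances the weight is mass / (rho * length * beam). Seawater density
    defaults to ~1025 kg/m^3. Back-of-envelope only."""
    return total_mass_kg / (water_density * length_m * beam_m)

# illustrative: a 20,000-tonne loaded ark on a 137 x 23 m footprint
draft = ark_draft(2.0e7, 137.0, 23.0)   # ~6.2 m of hull under water
```

Dialing ballast up or down moves `total_mass_kg`, which is how one would tune the waterline to sit just below the top deck.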
Hundreds of millions of years of evolution…in one year
On the ark, Noah describes to his family the history of creation on Earth. The accompanying imagery is a two-minute long exploration of 14 billion years of evolution, told in a timelapse ‘staccato’ form. ILM handled the unique sequence, which took a year to complete, overseen by associate visual effects supervisor Grady Cofer. “Darren wanted it to be very cohesive and be very striking,” says Cofer, “and it had to work on multiple levels. It had to work allegorically in its depiction of the book of Genesis and it had to work scientifically.”
An early animatic developed the ‘agitated’ look of the footage that would later become more timelapse in style. “The previs was impressive,” recalls Cofer, “and I would show it to people at ILM and they’d say, ‘That’s awesome! How are you going to do that?’ We didn’t really know but we were excited for the challenge.”
ILM also referred to timelapse footage that was shot with still cameras and intervalometers as they continued to explore just how the final frames would appear. “Whereas in traditional timelapse you might set a camera to fire every 10 minutes or hour, in this case, because we’re talking about geological time, every frame might be hundreds of millions of years,” describes Cofer. “We were afraid what that might do to the frame. If you have a creature walking on a landscape, and every frame is millions of years, that means everything has to change – everything.”
“There’s a lot of stochastic noise in there,” adds Cofer, “so things are very agitated, but generally you’re telling the story of one day. We’d try to find threads like that so the audience could follow along and not be facing chaos.”
Storyboards of the sequence were reviewed by Aronofsky at ILM for further refinement. At the same time, ILM developed an ‘evolutionary chart’ – the same approach that had been taken to realize the animals entering the ark. “We tried to list every creature in the fossil record and put him on a timeline – every possible ancestor of humans going all the way back to a single cell organism,” says Cofer. “We had this spreadsheet and I brought in my modelers and View Painters and they kind of panicked! It was such a huge list of creatures, so what we did was grouped them into body types – fish, lizards, mammals and primates. And typically you would take an asset and develop it and paint it and maybe render a turntable to analyze it. But in our case because we were dealing with this stop motion language, a creature might exist for one or two frames and then it was off to a new evolved version of the same creature.”
The idea was that artists could find morph shapes between the models and body types that could be transitioned between each other. “Our modelers were trying to hit key creatures and then we’d maybe find shapes in between and come up with our own little variant of a creature half way through,” outlines Cofer. “Our View Painters were trying to do the same thing – we weren’t just doing random patterns. We would try to think of logical evolutionary progressions. You start with spots and the spots start stretching and maybe that leads to better survival and that turns into stripes. So if you actually step through it there’s a bit of logic in how things evolve.” ILM would also render alternate versions so that compositors had different options of the same creature on particular frames with varying textures.
The environments and backgrounds were also elements that had to be realized by ILM. The studio researched terrain generation tools, volumetric rendering techniques for nebulas and other landscape creation tools. In particular, the ‘sixth day’ of creation follows a creature over a landscape, something the filmmakers thought might be captured photographically and then treated to make it feel evolutionary.
For this, ILM carried out a live action test shoot. “We took two 5D cameras and mounted them underslung, so they’re upside down on a Steadicam rig,” explains Cofer. “We had a remote trigger and we went to a beach up here in Marin up near ILM and we created this path along this beach. And we walked the path firing in burst mode – maybe six frames a second. And we would walk that path and come back and we’d wait for the sun to change a little bit, and we’d walk it again. We kept doing those same runs over and over again and we took all of those images back to ILM and we almost edited randomly between them and the rocks would agitate a little and the water would be there for a frame and gone for the next frame. We presented that to Darren and he was into it and thought it had promise.” Ultimately, background live action plates were captured by Cofer and a team of ILM photographers in Iceland by shooting these ‘hyperlapse’ pieces with the rig.
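The ‘almost random’ edit Cofer describes can be sketched as picking, for each camera position along the path, the corresponding frame from a randomly chosen pass, so lighting and water pop from frame to frame while the camera move stays continuous. A toy version of that selection (hypothetical names; not ILM’s actual conform):

```python
import random

def hyperlapse_edit(passes, seed=0):
    """Cut between repeated walks of the same path: for camera position i,
    take frame i from a randomly chosen pass. `passes` is a list of
    frame-aligned walks of the same route. A sketch of the editing idea."""
    rng = random.Random(seed)
    n_frames = min(len(p) for p in passes)
    return [rng.choice(passes)[i] for i in range(n_frames)]

# three walks of the same beach path, frame-aligned
cut = hyperlapse_edit([["a0", "a1", "a2"], ["b0", "b1", "b2"], ["c0", "c1", "c2"]])
```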
All images and clips © 2014 Paramount Pictures.