In Journey 2: The Mysterious Island, director Brad Peyton enlisted visual effects supervisor Boyd Shermis to oversee 430 shots for the native stereo production set on a mythical island where creatures and environments are not what they seem. “In this film all kinds of wonderful and mythical things happen,” says Shermis. “There are all kinds of creatures, a hurricane, getting into the island, a bee chase, a giant lizard, an eel attack and then the island even sinks into the ocean.” Several vendors shared the workload, including Scanline, Pixomondo, Method Studios, MPC, Rising Sun Pictures, Trixter and ICOVFX. Previs duties were handled by The Third Floor and Pixomondo. We chat to each of the lead studios about their major shots.
Planning, previs and stereo
DoP David Tattersall filmed Journey 2 primarily with the Cameron-Pace rigs on Sony F35s (although Phantom, RED and SI-2K cameras were also used) on location in Hawaii and in studios in North Carolina. “The film was also designed from the start to be a 3D film,” says Shermis. “We knew it would be in 3D and wanted to take advantage of it – plus it’s a kids’ film. Kids still get a kick out of it when you poke them in the eye, so there are a lot of those gags. We would also use depth to enhance the emotion of the scene.”
Shermis found that the stereo nature of the production necessitated new thinking in terms of both designing shots and finishing the film. “Well, standard 2D projection systems typically operate at 14 foot lamberts of light, which is a pretty bright screen. But for 3D, with the projector, the screen and the polarizing glasses, you’re down to about 4 foot lamberts of light. So it’s roughly a third of the amount of light to display the same information. So your blacks will block up a little bit and your highlights will flip out to white much sooner than they otherwise would – the dynamic range is less, which is a challenge for the VFX process.”
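As a rough sanity check (our numbers, not the production’s actual pipeline math), the light loss Shermis describes works out to almost two stops:

```python
import math

# Screen luminance figures quoted by Shermis
fl_2d = 14.0  # foot-lamberts, standard 2D projection
fl_3d = 4.0   # foot-lamberts, after projector, screen and polarized glasses

ratio = fl_3d / fl_2d                  # ~0.29 -- "roughly a third" of the light
stops_lost = math.log2(fl_2d / fl_3d)  # ~1.8 stops of headroom gone

print(f"{ratio:.2f} of the light reaches the eye; {stops_lost:.2f} stops lost")
```

Nearly two stops less light is why, as Shermis says, blacks block up and highlights clip sooner in the stereo grade.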
One technique the VFX supe used to help achieve consistency across the shots was to rely on Toronto outfit Stereotech, part of Eyeon, to establish the correct colorspace, especially for some of the underwater greenscreen plates shot for an encounter with an electric eel. “I sent all my raw materials through Stereotech and they did a proper extraction of the raw material and put it in a properly mapped EXR colorspace. Then I took it into a DI environment and ran a neutralizing grade across all of it, so all the underwater greenscreens had the exact same look and contrast range.” Similar work was done for the RED and SI-2K footage, and Eyeon will soon be releasing its toolset, called Dimension, to deal with stereo compositing throughput.
Previs specialist studio The Third Floor was responsible for several large sequences in the film, such as the lizard and bee chases and a ‘powers of 10’ shot pulling back from a kitchen table, through the atmosphere out to space and landing on the moon. “We also completed a good deal of postvis,” says The Third Floor’s previs supervisor Patrick Smith, “which, in a first for us, was done in stereoscopic 3D. Our postvis was additionally output at 1920×1080 HD so it could be used directly in temp screenings for the studio.”
A significant aspect of the previs effort involved “determining what size the CG creatures would be in comparison to the live-action characters,” says Smith, “and figuring out how fast these creatures moved and what were the most exciting paths of action. We also built our shots as stereoscopic previs so the visual effects supervisors and executives could get an idea of what the 3D was going to look like prior to shooting.” Boyd Shermis notes that the 3D previs helped determine what shots would hold up in stereo. “Doing fast camera moves and very fast editing patterns in 3D does not work very well. Your brain just can’t fuse it. So we learned that we had to keep our editing patterns a lot smoother, and when we hand-held shots we knew we’d have to flatten the 3D somewhat to help us through those editing patterns.”
For the bee chase in which the heroes ride giant bees before being chased by killer birds, The Third Floor conceived shots in previs directly from the script (see below for a discussion of Rising Sun Pictures’ final effects for the sequence). “This was one of the more exciting assignments creatively because the options were pretty wide open since we weren’t working from boards,” recalls Smith. “One of the key goals with camera and composition was to create shots to show the thrill of being on these enormous creatures flying through a dense jungle but also keep the main characters legible on screen so the audience would be engaged and feel a high sense of peril and tension.”
The film features several scenes where digi-doubles were necessary to complete dynamic shots. For these, Shermis relied on contributions from USC ICT, Icon Imaging Studio, Lightstage LLC and House of Moves before the data was passed out to the various VFX vendors.
Here’s some info from a press release on the use of tech: “ICT’s Light Stage 6 captured full body lighting scenarios simultaneous to the scanning of character topology by Icon Imaging; Lightstage LLC recorded the actors’ facial shapes and appearances at the level of pore detail and fine creases; and HOM contributed character rigging and conducted mocap sessions. All facial and full body data, lighting and textures were captured in just a few hours resulting in the creation of high quality digital doubles that were VFX vendor-ready with minimal time required of the A-list talent.”
Into the storm
On board a helicopter, the film’s central characters – pilot Gabato (Luis Guzmán), his daughter Kailani (Vanessa Hudgens), Sean (Josh Hutcherson) and step-father Hank (Dwayne Johnson) – encounter a massive weather front. They have set out to find Sean’s grandfather, Alexander (Michael Caine), assumed missing on the mythical island. An ocean environment and the freak storm were created by Scanline VFX in LA under visual effects supervisor Bryan Grill. Boyd Shermis co-ordinated a live action shoot of ocean and sky backgrounds for the chopper flight, along with practical windscreen raindrops, while the actors were filmed on a helicopter mock-up against greenscreen.
Scanline then constructed a digital helicopter and debris, fluids, sparks and smoke flying off it as it heads into the storm. “We modeled some major parts that would be ripped off, and then we had a shredding effect set up with forces which affected them,” explains Grill. “When we were inside the twister we had three different layers of debris, one going around more towards the edges, then broken pieces that would smash the chopper. There’s one shot where we animated some hero pieces and then the others are procedural.”
The studio relied on its fluid sim expertise and specialist software Flowline to realize several different kinds of water work – the rain, ocean and two twisters that emerge from it. “There was basic turbulent, stormy water, and around the base of the twisters were water funnels,” says Grill. “There was a completely different sim too that was based on the rotation of the twisters. They started as water spouts and became these huge real-world-looking things. We found a nice balance that had water whipping off them and lots of mist, keeping them basically more like a water funnel instead of being made up of dirt and dust.”
Scanline also used Journey 2 as an opportunity to further adapt Flowline for a later sequence of the island breaking up and sinking into the ocean, which also called for earth-crumbling sims, smoke and lava – all combined with water effects. “The water crashed up on the shores, with rocks and debris hitting the water,” says Grill. “It added overall turbulence, plus added mist and atomization of the water – something Boyd really wanted. That moment where you see the water splash against a rock – it creates whitewater, and from the tips you see the mist taken away by the wind.”
For one shot of the characters at the water’s edge as the island begins to crumble, production shot the actors on Hawaii’s China Walls – a place with smooth lava rock. The actors were then also filmed on a gimbal platform dressed to match the original area they were standing on. “That was the trickiest part because it would shake and move but wasn’t connected to anything,” notes Grill. “We first tracked the gimbal, because in 3D it was pretty crazy. The surface was made out of fiberglass, so as it was moving, the actual portion they were standing on was flexing. One corner was flexing up and the other down – that made it very difficult, as we had to do real estate to real estate hooking up our CG to make sure the area around them was breaking into the water.”
In comp, Scanline added in a cracking wall behind the characters, an erupting volcano and general chaos. Thinking Particles was used for procedural breaking and cracking while Flowline dealt with water, mist and other particles. “We also built some hero trees that were modeled and animated,” says Grill. “Then our background trees were sim’d and from movement on the ground they would shift around. For closer trees we had a basic sim and then would animate on top.”
Building the island
Many of Journey 2’s mysterious environments were created by Method Studios in Vancouver, which completed 50 shots for the film. Visual effects supervisor Mark Breakspear oversaw the work behind the island reveal and some of its locations, the City of Atlantis, the treehouse, Spooky Bridge, a beach shot, a bevy of creatures and a bouncing berry scene. “Boyd came to us for environments that existed in this gray area of reality,” recalls Breakspear. “We had to make them look real, but make them look like something we’d never seen before. They were the type of environments that were a bit more fantastical than I’d done before.”
An early shot involved placing the actors on an isolated beach surrounded by cliffs. “Originally we needed to extend the cliff up from a helicopter shot, and we thought, well maybe they could swim around, so we had to bring the cliffs out, and then we made them even higher.” Digi-doubles of the actors were ultimately added to the wider shot of the actors on the beach, while the ocean was also re-projected, marking just one of Method Vancouver’s stereo fix-its in the studio’s first 3D project. “Usually you get these little cresting waves and glints off the ocean, but in the shot they would be in the left eye and not the right eye. In real life that happens day in and day out and your brain can disregard it, but when you’re watching a flat screen and your eye knows what it’s looking at, it doesn’t like disregarding a highlight in one eye and flatness in the other. So we took the left eye ocean and re-projected it in the right eye.”
For a reveal of the Mysterious Island, highlighting a beautiful landscape, waterfalls and forest, Method created a wide vista from geo in Maya and various projections in Nuke. “The biggest challenge was making it look real but also brand new – something the director wanted it to feel like,” says Breakspear. “So if you’re going to have a waterfall coming off a cliff, there better be enough land mass above it to justify having a giant waterfall. But also in this kind of film you can sometimes cheat it a bit. In all of our environments we also had giant butterflies, which are easy enough to create, but you take away things that are needed to give it scale, and it just looks like a miniature butterfly.”
Digital foliage was built up of plant assets based on reference photography taken in Hawaii and even house plants, plus live action stereo elements. “We used Maya PaintFX where we could to paint in a lot of grass and greenery detail,” says Breakspear. “Everything was meant to be shiny and high-spec, too. Then as things got further and further away we could make it more card-based and use matte painting until you’ve got a distant horizon line.”
Method’s most significant environments were for shots of the characters looking upon the City of Atlantis and then walking among its ruins. Another was the Spooky Bridge, part of a travel montage, which through careful placement of rocks, ground and vines reveals itself as a face once the camera pulls back. Method artists sketched the design out conceptually and even took reference photographs of vines at nearby Granville Island in Vancouver. Based on a helicopter plate, Method first tracked the shot and used that to build a point cloud of the environment. “Then we meshed that point cloud,” adds Breakspear, “and re-projected that texture back onto the mesh so we had a CG camera and real camera and could adjust things slightly.”
For another scene showing the characters walking towards a treehouse, production filmed a plate of the tree which was embellished digitally by Method. “The concept was that someone got shipwrecked on the island and they built themselves a boat out of the shipwreck,” explains Breakspear. “So up in the tree are all the bits of the boat – a mast, a keel, the hull – everything is up in the tree.”
“We Lidar’d the tree,” adds Breakspear, “but in the middle of a forest when you Lidar a tree, you literally cannot make out what you’ve Lidar’d. We worked on some meshing software that enabled us to mesh this tree together that gave us a basic mesh and framework. Then initially we took the boat and chopped it up into pieces and put it in the tree, and we were like, ‘That looks like someone in visual effects has chopped up a model of a boat and put it in a tree.’ It didn’t look like a treehouse! So we had our modeler look at treehouse building websites, and look at what you do when you build a treehouse from a structural point of view. From there he built planks of wood, and put the treehouse back into the tree that way – and the results were amazing.”
Various Method environments included waterfalls made up of Maya fluid sims and practical elements also filmed in Vancouver. Another aspect was adding consistency to jungle shots by incorporating flies and haze wherever possible – each time ensuring the shots worked in stereo. “One of the techniques Boyd and I both like – it’s a simple technique – is that once you comp a shot, you do an edge detect of your original matte and take your final shot and put it back on top itself using that original matte and just blur using the luma key the highlights – it just blends your shot back together. It’s a technique that’s been around for a long, long time. If you try to use some of the plugins out there, in stereo it doesn’t really work because you get this halo shape that’s just hovering in space, since it’s not left or right eye dependent. So that was a subtle way to add some realism into the shots.”
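The seam-blending trick Breakspear describes can be sketched in a few lines. This is a hedged, dependency-free illustration of the idea (edge-detect the matte, luma-key the highlights, and blur the comp back over itself along that band), not Method’s actual Nuke setup; the function names and thresholds are our own:

```python
import numpy as np

def box_blur(img, r=1):
    """Crude box blur over a (2r+1)^2 neighborhood, stand-in for a proper blur node."""
    pad = [(r, r), (r, r)] + [(0, 0)] * (img.ndim - 2)
    p = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape[:2]
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += p[dy:dy + h, dx:dx + w]
    return out / (2 * r + 1) ** 2

def seam_blend(comp, matte, thresh=0.7):
    """Blend the comp back over itself along the original matte edge,
    restricted to highlights by a luma key."""
    gy, gx = np.gradient(matte)
    edge = np.clip(np.hypot(gx, gy), 0.0, 1.0)        # where the comp seam lives
    luma = comp @ np.array([0.2126, 0.7152, 0.0722])  # Rec.709 luma
    key = np.clip((luma - thresh) / (1 - thresh), 0.0, 1.0)
    w = (edge * key)[..., None]                       # per-pixel blend weight
    return comp * (1 - w) + box_blur(comp) * w
```

Because the weight is computed per eye from each eye’s own matte and luma, nothing “hovers in space” the way a view-independent plugin halo would.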
One of Method’s simplest effects also turned out to be the most popular – a stereo gag of Sean throwing berries at Hank’s pecs. “Basically they just needed red spheres bouncing towards camera,” says Breakspear. “It was the first sequence turned over to us, and as is always the case, the first sequence is the one you jump at because you always get excited. We probably over-built them because they are so motion-blurred. But we did have a great animator who choreographed them off Dwayne Johnson. We worked on the sequence and got it working in our cut and sent it off – and it was a big hit. We weren’t prepared for how much they loved it. It’s such a ridiculous sequence about how to impress women but it just was so simple and funny.”
The troupe quickly encounter one of the island’s creature mysteries – a miniature elephant. Realized at the size of a small dog, the creature was stood in for on set by a Chinese pug, which production filmed in the live-action scenes. Trixter then created the elephant in CG under visual effects supervisors Dietrich Hasse in Munich and Simone Kraus in Los Angeles. The studio also completed CG fireflies for scenes inside the treehouse and the swamp, and a swarm of giant ants for a separate sequence.
Before work commenced on the final elephant shots, Trixter embarked on an R&D effort for muscle and skin simulation as well as stereo camera matchmoving – using 3D Equalizer and mocha – to deal with camera alignments. “We also developed stereo specific render layers,” says Hasse, “such as a disparity pass which allowed us to transfer specific work steps like roto masks very accurately from one eye to the other and therefore saved us quite some time in compositing.”
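The disparity-pass idea Hasse mentions boils down to a per-pixel horizontal resample. Below is a simplified, nearest-neighbor sketch of the concept (our own illustration, not Trixter’s pipeline code): given a disparity map in pixels, a roto mask painted for the left eye can be warped into the right eye instead of being re-rotoscoped:

```python
import numpy as np

def transfer_mask(mask_left, disparity):
    """Resample a left-eye roto mask into the right eye using a per-pixel
    horizontal disparity map (in pixels). Nearest-neighbor for clarity;
    a production version would filter and handle occlusions."""
    h, w = mask_left.shape
    ys, xs = np.indices((h, w))
    # right-eye pixel (y, x) looks up left-eye pixel (y, x + disparity)
    src_x = np.clip(np.round(xs + disparity).astype(int), 0, w - 1)
    return mask_left[ys, src_x]
```

Since the disparity pass comes out of the renderer per shot, one hand-painted mask per stereo pair is enough, which is where the compositing time savings come from.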
Trixter also invested time in studying adult and baby African elephants as animation reference. “We needed to define,” says Kraus, “how many wrinkles or how much hair did we see, how many cuts and veins do we recognize? What’s his weight? Is he bony or fat or normal? This all also helped in finding the elephant’s age. We ended up with an animal who was about 20 years old, not a teenager any more, but also not too skeptical an adult either. We also looked at dogs, walking and running, not only as a scale reference, but also for the speed their legs move in relation to their mass.”
As a test shot, the studio created a walk cycle of an original-sized CG adult elephant. “We experimented with different paces to find a cute walk,” says Kraus, “but one that did not look like a baby elephant. It was very important to Boyd that we could still identify it as an adult elephant, but just a miniature version of it. Boyd and the director chose one of the dog-like walks – it supported the scale and looked really cute. Later in production, we did a similar test for swimming elephants for a separate scene and again, we ended up with a version which referenced a dog’s movement in water.”
The stand-in pug, although vital for physical interaction, playfulness and eyelines, of course required meticulous tracking, paint-outs and plate reconstruction – all in stereo. Says Hasse: “Since the dog was permanently moving, plus his skin was sliding over his torso, it took a lot more effort for the matchmovers to extract the basic movement of his center of mass, which would serve as a starting point for the animators to work with. Also, the hands holding the dog would show a lot of relative movement against each other because of the flexibility of the dog’s torso.”
To create the CG elephant, Trixter rigged and lit a model in Maya – also using the software for muscle and skin – and added further details in MudBox for displacement and Mari for texturing. The final shots were rendered in RenderMan and composited in Nuke. “For recreating a believable movement of muscles and skin,” explains Hasse, “it was essential to break down all secondary movements into categories like active and passive muscle movement, skin sliding, skin jiggle et cetera and to figure out which of these details define the nature of an elephant’s skin and which ones could be simplified without losing the sense of realism.”
“The photorealistic look of the skin was created,” continues Hasse, “by adding several layers of textures on top of each other – just like it occurs in reality. The underlying skin with variations in pigment density and color is shaping the main wrinkles and is overlaid by finer details on the surface like smaller wrinkles or cavities. On top of that there’s an additional layer of dust – again with variations in density and color. In order to get the best results for each framing, from closeup to wide shots, we used texture sets with different levels of detail, which we could dial in and out according to the elephant’s screen size.”
Trixter also used a texture-based approach to finesse the contact deformation of the elephant’s skin being held in Dwayne Johnson’s hands. “Even though the geometry resolution allowed for highly detailed deformation during standard elephant movements,” notes Hasse, “it didn’t quite have enough detail for this kind of tight interaction. That’s why we developed a dynamic displacement technique which gave us detailed control over the way the skin would deform around the hands grabbing it. This technique was based on the roto masks for the hands – which we had to do anyway for compositing. We projected those animated masks directly onto the elephant’s skin, applied some manually edited filters to their edge and used it like a displacement map. This displacement pushed the skin inside where the hands were and had it bulge out around the edges of the hands.”
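The dynamic displacement Hasse describes can be illustrated with a toy version: take the animated hand mask, press the skin inward wherever the mask is on, and add a thin positive band just outside it for the bulge. This is a hedged sketch of the concept only; the parameter names and the one-pixel ring are our simplifications, not Trixter’s filters:

```python
import numpy as np

def dilate(mask, r=1):
    """Binary dilation with a (2r+1)^2 square structuring element."""
    h, w = mask.shape
    p = np.pad(mask, r, mode='constant')
    out = np.zeros_like(mask)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out = np.maximum(out, p[dy:dy + h, dx:dx + w])
    return out

def contact_displacement(hand_mask, press=1.0, bulge=0.3):
    """Turn an animated hand roto mask into a displacement map: negative
    inside the mask (skin pressed in by the grip), positive in a ring
    around it (flesh bulging out at the edges of the hand)."""
    ring = dilate(hand_mask) - hand_mask   # band just outside the hand
    return -press * hand_mask + bulge * ring
```

Projected onto the elephant’s UVs and applied at render time, a map like this deforms the skin around the grip without any extra simulation geometry.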
As they travel across the island, the heroes find themselves in a nest of eggs, which ultimately turn out to belong to a giant female lizard. Once awakened, the angry mother chases the troupe through the forest. MPC in Vancouver handled these stereo shots under visual effects supervisor Erik Nordby. On set in Hawaii, the actors were filmed amongst prop eggs for the first part of the sequence, and then a tennis ball at the end of a boom served as reference for the giant lizard, which would be added in as an entirely digital creation.
In modeling, rigging and animating the lizard, MPC looked to several species of reptiles as reference. “We wanted something that felt ferocious enough that it felt scary,” says Nordby, “but we also wanted to have this frill that pops out – your typical frilled lizard is skinny and not that scary. So it was a mix between a dragon iguana for scale reference and texturing, and then we built the frill to operate as mechanically like a frilled lizard’s as we could.”
An in-house iguana was studied for textures and movement. “It was clear that the musculature we needed to support a 15 tonne creature would need to be quite large,” recalls Nordby. “We wrapped the lizard in our muscles, and figured out the flexing and the skin on top of that. Our cloth system played a big role for the frill, and skin sliding also was done with cloth. We have a lot of proprietary tools all wrapped in a Maya shell.”
In one of the marquee shots, Sean is flung into the air as the lizard kicks up a tree branch – requiring a digi-double, CG lizard and foliage extensions. “They tried to do that shot in different parts and stitch it together,” says Nordby. “It was shot on stereo Phantoms. One of the cameras was a little out of alignment, so they didn’t get exactly what they wanted and it was quickly elevated to a lot of digital replacement. The digi-double was done at Rising Sun, since they were doing that for the bee sequence. We drove the shot in terms of what the lizard was doing, and from that they would match the trajectory. We grabbed their asset and blocked in where Sean would be. We took it as far as we could, then passed it back to RSP, who lit and rendered the asset, and we did the final comp.”
“For the CG foliage,” adds Nordby, “we started development in our fur system and built a spline and hair as the main stalk of these plants. That worked for regular trees, which were built as assets from on-set reference with a series of curves, and then we used our dynamics system to bake in some wind. And the lizard could drive everything else.”
Faced with the creature bearing down on the group, Hank punches it, but this only angers the creature into showing its full frill. The challenge of that shot, for MPC, was both making the reveal a surprise and then delivering the right amount of translucency through the skin. “The idea was that she would push a bunch of blood into this frill membrane and that would give us the ability to produce color,” says Nordby. “We had a bunch of maps that filled up the blood. We shot reference of onion skins for translucency and swapped plates to get things more backlit. Then we had a colorful frill that didn’t just feel like a surface texture. We used the back light to drive the color and got a vibrant, orange look.”
Another challenge was delivering the right amount of dappling effect from overhanging trees on the lizard as it makes its way through the forest. “I think we underestimated how much the dappling light would come into play,” admits Nordby, who oversaw the use of a Gobo-like rig to try and get the right effect. “But given the angle of light, what the lizard was doing and where the angle would fall, you’d get notes back saying ‘I loved the way the head looked, but the shape of the light wasn’t right.’ When you go in there and try and art direct the Gobo that’s grabbing the film light, we quickly found that we were getting notes about 55 spotlights as opposed to one big Gobo-driven dome. So we dedicated one lighter to Gobo design and on some of the shots we kept rolling the dice until something felt right – we’d have 50 different versions, say, all driven by a noise pattern, and then come in in the morning and pick the three that felt the closest and then lock that in.”
How to fly giant bees
At one point in the film the characters encounter a sheer cliff amongst a valley of giant flowers and realize that the only way forward is by hitching a ride on some over-sized bees. Soon they are pursued by giant birds amongst a lush forest. Rising Sun Pictures, with visual effects supervision by Sean Mathiesen, was responsible for the sequence.
On location, production shot ground and aerial background plates based on The Third Floor’s previs with various cameras and rigs, including SI-2Ks on a mini-helicopter flown through a forest grove. Mathiesen took reference photography of Hawaiian foliage and floor coverings, which became crucial when the sequence required a higher number of digital forest shots than first anticipated. A later shoot in North Carolina incorporated the actors on bee bucks shot against greenscreen, with RSP creating rough comps at plate turnover.
Final assets included three bees, five digi-doubles, the birds and an entire digital forest with trees, leaves and flowers. For the forest, in particular, Rising Sun relied on Houdini to instance the incredible amount of geometry required. “We actually animated and rendered the characters in Maya and 3delight and gave them off to Nuke for a 2D composite,” explains Mathiesen, “and then rendered the forest and the backgrounds and the interactive elements – leaves being kicked up for instance – in Houdini/Mantra.”
Rising Sun also introduced a more robust digi-double solution relying on Light Stage techniques and a new feather and fur pipeline for the bees and bird work. “In the Maya rig for the bird, we had a series of guide feathers which the animator has control over. We used an IK-FK system, and you could put some noise over it to get some procedural movement into the guide feathers. Then all that was taken over to the feather system – procedural application done in Maya.”
“In one shot where they crash,” continues Mathiesen, “the dynamic and feather system switches over to Houdini. The birds come together in Maya, but then there’s a feather storm – an explosion of feathers – that’s generated out of Houdini. From that we created a vortex and velocity system so that as they’re impacting – you’ve got the velocity of the birds, the feathers explode off – the trajectory of the original bird blowing them forward.”
To realize some of the dynamic flight shots, Rising Sun incorporated an ‘influence’ feel to the camera as it whooshes closely past trees and branches. “So as you’re moving forward through the forest, you’re pushing through the bushes and branches,” says Mathiesen. “It’s almost so subtle that you don’t really see it, but you have a visceral feeling. We used Ocula in Nuke combined with painted maps, so that motion blur registers less towards the center of frame. The plants are close to the camera but then wiping out at the edge of frame – being able to add the right sense of distortion and stretch gives a sense of reality to the CG work.”
In addition, a strong compositing component was essential, involving roto-planes for the live action actors to interact with the bee fur, putting them on cards to enable correct stereo and significant re-projection work. Rising Sun also completed a CG spider and web for the sequence and an additional CG ant shot. Mathiesen was particularly excited about using stereo to help tell the film’s story. “I really found that in the layout stage there was a further ability to tell a story in that z-depth. There’s a shot where there’s all three bees flying together – they duck into the forest and you’ve got the birds chasing them. Sean’s bird peels off in one direction, and you’re left with the other characters – so we forced the viewer’s eye through convergence and other 2D tricks of defocus and lighting – it’s as if you almost move off into the distance to tell a second story.”
Swimming with the fishes (and a giant eel)
Hank and Sean dive into an enormous underwater cavern in search of Captain Nemo’s submerged Nautilus submarine but are taunted by a giant electric eel. They use the creature to jump-start the submarine and negotiate their way through the collapsing cavern. A worldwide group of Pixomondo artists contributed 80 shots to the dynamic sequence, overseen by visual effects supervisor Bryan Hirota.
Pixomondo previs’d the sequence initially before taking on the final shots. Actors Dwayne Johnson and Josh Hutcherson were filmed in an underwater tank in North Carolina on both a partial set of the Nautilus and also on a completely greenscreen set. Pixomondo comp’d the actors into its digital Nautilus sets and also created some swimming digi-doubles for wider shots.
“Before they jump into water they are standing on the China Walls area in Hawaii,” says Hirota. “We had a Lidar scan of that area and we took that and we meshed that into some geometry. We wanted a grounding basis in the physical location they started, but as the sequence progresses we go from cliff side in the real world to a fantasy based gigantic cavern full of lots of fish and an eel. We got the point clouds from the scan, meshed them and we did a lot of sculpting in ZBrush to help carve out and lead the cliffs into a mouth opening into a gigantic cavern.”
Underwater shots as Hank and Sean approach the Nautilus were filled with all manner of fish, jellyfish and plant life. “The cave was also designed to be pretty dark,” notes Hirota, “but we tried to craft a lot of volume lights as though they were cracks and openings inside the cave to let some light through and define silhouettes. It would illuminate particulates in a volumetric fashion, and then when it hit the surfaces it would leave caustics on things. We put in actual volumetric lights and used V-Ray for lighting and rendering.”
The Nautilus was designed to look like it had been constructed with old-school techniques. “We tried to make sure it had very defined control surfaces – all of the hatches and moving parts were functional,” says Hirota. “We would reduce components and rods and pulleys to simple controls that could be manufactured. There were joins and seams and no cross-sections or silhouettes that were too perfectly designed.”
Realizing that the eel’s electrical attributes might allow them to start up the Nautilus, Hank and Sean concoct a plan to harpoon it. The eel itself was a 40 foot semi-bioluminescent creature created in ZBrush and 3ds Max with a distinctive Jacob’s Ladder lighting effect passing through its skin. “The electricity was created inside 3ds Max and we played with the sense of time in terms of how long these electrical events would happen,” details Hirota. “The film’s running 24fps, the shutter’s open 50% of the time at 180 degrees. These events would happen much faster than 1/48th of a second – so the electricity stuff all had to be calculated and rendered so it had the visual appearance of being much larger and longer than you’d get in real life.”
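The shutter arithmetic Hirota refers to is simple to verify. A quick sketch (our illustration of the standard formula):

```python
fps = 24            # film frame rate
shutter_deg = 180   # "shutter's open 50% of the time at 180 degrees"

# Fraction of each frame the virtual shutter is open, times frame duration
exposure_s = (shutter_deg / 360) / fps  # = 1/48 s, about 20.8 ms
```

A real electrical arc lives and dies well inside that 1/48th-of-a-second window, so a physically timed discharge would smear into near-invisibility under motion blur; hence the electricity had to be slowed down and enlarged to read on screen.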
“We gave it this real snakey spiral to its attack approach,” continues Hirota. “We had these style frames to show Boyd and Brad and had a cobra pose with the electricity – it coils up like that right before Hank spears it. There was also a secondary system on it that added procedural breathing and gill motion.”
Once the Nautilus is powered up, the whole environment begins to come down, with Pixomondo simulating crumbling cavern walls, clouds of silt and evidence of lava. “We took our geo and we laid in where we wanted cracks to expose lava or lava light,” says Hirota. “We were also playing with a few competing sources of light to get the scene – top light filtering down from surface of water as diffuse light, for example. The Nautilus had a number of lights and spotlights contributing to the scene as well.”
All images and clips copyright (c) 2012 Warner Bros. Pictures.