Making mutants for X-Men: First Class

X-Men: First Class, Matthew Vaughn’s prequel to the existing X-Men films, takes place in 1962 and brings together Professor Charles Xavier and Erik Lehnsherr, later to become Magneto, and a band of young mutants. The film contains over 1,000 visual effects shots overseen by visual effects designer John Dykstra. In this article, we go in-depth on the characters and final battle with the leading effects studios on the show: Rhythm & Hues, Cinesite, Luma Pictures, Digital Domain, MPC and Weta Digital.

The film was shot at Pinewood Studios in London, in Georgia and in California. Many of the film’s complicated sequences were previs’d by The Third Floor and other vendors. Prosthetic, practical and make-up effects were also a major part of the production. Dykstra was aided by visual effects supervisors Stephane Ceretti and Rob Hodgson and visual effects producer Denise Davis, and sought a fresh approach to the work that would still segue into the existing X-Men Marvel universe. “We really started with the characters,” says Dykstra, “and the coolest manifestation of their powers and fitted it into 1) the story we were telling, 2) the genealogy of the comic book and 3) what just looked best visually.”

Click here for our fxinsider Q&A with X-Men: First Class visual effects designer John Dykstra

Rhythm & Hues creates Emma Frost, Mystique and Angel

Rhythm & Hues, under visual effects supervisor Gregory Steele, delivered 160 shots made up of character work for Emma Frost, Angel and Mystique, as well as set extensions and environments at the Lincoln Memorial, the X-Jet hangar and a CIA facility.

In addition to her telepathic powers, Emma Frost is able to transform into a protective diamond form that can repel bullets. To develop that power for the screen, Rhythm was involved in early look development. “There was a small team that put together a test for John Dykstra,” says Steele. “They used some dialogue and then roto-mated a diamondesque look. It was a good starting point, but the look did drastically change over time.”

Once into production, and with January Jones cast as Frost, Rhythm developed further concept art to see what would read in diamond form. “One of the problems we had was having all the facets on the surface that had a certain thickness and size, so it was hard to read the facial features, even if you added CG eyes. We distilled down what really made up her features – her eyes and lips and eyebrows and hair, and then went about building the model from scratch in 3D. We would find that a certain density of facets would give a nice shape to certain areas but then over-complicate other areas.”

On set, the standard gray and chrome balls were used to gather reference, along with a four-inch cubic zirconia purchased specifically for the film to aid in understanding how a diamond surface would reflect the surroundings. “It wasn’t like a standard effects shoot,” notes Steele. “Matthew would shoot the whole sequence from the beginning all the way to the end, and then get different coverage from different positions. Because Frost was coming in and out of diamond form during the sequences, we didn’t have the luxury of putting her in a mo-cap or tracking suit. So we had to grin and bear it and do the hard-core match-moving work. We used witness cameras to help us along, but in the end it took a long time to match-move the performance.”

Artists relied on Maya to model the base geometry, with rendering in Rhythm’s proprietary Ren tool. The reflections on Frost’s surface were created both in the render and in compositing using the studio’s ICY system. “In the end,” says Steele, “it couldn’t just be a render-only solution. She just looked too messy if it was only a render, so we had to re-introduce some diffuse shading in there to show some of the shape of her. So we introduced some of the light passes into that and did it as a comp trick where we modulated the reflection/refraction that way.”
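
As a rough illustration of that comp-stage idea – not Rhythm & Hues’ ICY pipeline, whose tools and pass names are proprietary – here is a minimal plain-Python sketch of re-introducing a diffuse pass into a reflection/refraction render, modulated by a light pass. All values and weights are hypothetical.

```python
# Illustrative only: a plain-Python stand-in for the kind of comp-stage mix
# described above. ICY is proprietary; the pass names and weights here are
# hypothetical.

def lerp(a, b, t):
    """Linear interpolation between two values."""
    return a + (b - a) * t

def shade_diamond_pixel(refl_refr, diffuse, light_pass, diffuse_weight=0.25):
    """Blend a render-time reflection/refraction value with a diffuse pass,
    modulated by a light pass so the character's shape still reads."""
    # The light pass scales how much diffuse shading is re-introduced,
    # keeping bright facets crisp while filling in flat-looking areas.
    k = diffuse_weight * light_pass
    return lerp(refl_refr, diffuse, k)

# Example: one pixel from each pass (values in 0..1)
print(shade_diamond_pixel(refl_refr=0.9, diffuse=0.4, light_pass=0.6))
```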

The transitions from flesh to diamond form were created in Houdini by applying a frost-like effect. “We would combine some artwork along with several mattes to show this frosting that happens,” says Steele. “We went through a bit of a design process to work out what is the effect that happens as she goes from flesh to diamond shape – was it heat based, facets flipping over or would her skin be slightly transparent and melt out? We honed in on one with frost tendrils that grow out across her body and that shifts into a diamond form, and it evolved from there.”

Mystique (Jennifer Lawrence) is a character appearing in the first X-Men films, whose shapeshifting powers are just developing in First Class. “Everyone was really happy with what had been done in the past,” outlines Steele. “For this film, John Dykstra was looking to make sure we could get a real dimensionality to the effect. He didn’t want it to be compy or too morphy-looking. We really strove to do a full 3D version of the transformation as much as we could, and tried to keep it out of the 2D realm.”

For her transformations between human form and a blue scaly mutant – achieved with physical make-up – artists tracked both performances by Lawrence and created a morph between the geometries. “We used a full Mystique digi-double skin that would blend through,” says Steele, “and have the scales flipping over with the full geometry to get the lighting and the sheen reading and have the right shadows.”

“What was interesting,” continues Steele, “was that we would shoot these things kind of old school where you would have the actress sitting there and shoot one plate of one person and the next plate of the next person and then line them up as best you can in the video tap. We still had to spend a lot of time match-moving to complete the transitions. When she’s in the bedroom, on both the A and B side Jennifer Lawrence would try and give as close to the same performance as she could, but there’s always a lot of adjustment needed to get them into one place. We would leverage off one performance and pull that into the other one in 3D.”

Using Rhythm’s proprietary animation software, Voodoo, a single rigging supervisor created many of the scale transitions. “We had a bunch of scale geometry placed in all the right places that would line up with the make-up set-up and we animated them to come out of the skin and lie down,” says Steele. “We also had to transition the hair, which on the other films they didn’t always need to do – they could do a color wipe. On this one we had Jennifer’s long hair, so we used the approach we took with the skin for the hair. The scales would come up through the hair and then lie down and create a new surface which would be the new hair. Also, instead of the scales coming up and lying down and then transitioning, we tried to take some of the plate information. As the scales lie down in the last 10 to 15 per cent of their lifespan we would introduce some of the plate color as it moved into position.”
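
The timing trick Steele describes – blending in plate colour only over the tail end of each scale’s animation – can be sketched in a few lines of plain Python. This is purely illustrative and not the Voodoo rig itself, and the 85 per cent threshold is an assumed value.

```python
# A minimal sketch of the timing logic described above, not R&H's Voodoo rig.
# 'progress' is a scale's normalized animation (0 = emerging, 1 = lain flat);
# the plate colour is only introduced over roughly the final 10-15 per cent.

def scale_colour(cg_colour, plate_colour, progress, blend_start=0.85):
    """Return the colour of one scale at a given point in its lie-down."""
    if progress <= blend_start:
        return cg_colour
    # Remap the tail end of the animation to a 0..1 blend weight.
    t = (progress - blend_start) / (1.0 - blend_start)
    return tuple(c + (p - c) * t for c, p in zip(cg_colour, plate_colour))

print(scale_colour((0.1, 0.2, 0.6), (0.15, 0.25, 0.55), progress=0.95))
```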

For Angel (Zoë Kravitz), who could grow dragonfly-like wings from a tattoo on her body, Rhythm referenced production concept art and then sculpted wings in ZBrush. The tattoo proved challenging, since it required four hours of application time and two hours to remove on set. “If the tattoo was on for a sequence,” recalls Steele, “we either had to remove it later or add it back in. We were tracking just skin with only a few tracking marks on her. In ICY, we took some of the texture maps that we had done for her digi-double, put those on the match-moves of the character in a digital environment, rendered those out and composited it all in one step.”

The flying scenes were realized mostly with the actress on wires, although the dramatic aerial battle at the end included scenes shot with the character suspended under a helicopter over San Pedro. “It really added an amazing level of realism,” says Steele. “They could shoot it wild almost like Top Gun and get all these crazy camera moves that didn’t have any mechanical feel to them. In those cases we ended up removing the rigging and adding in the digital wings to her. The look-dev guys also put a nice iridescent shader on the wings that gave it a fairly appealing quality and took the edge off the nasty bits.”

Cinesite’s Azazel, Cerebro and environment effects

Cinesite contributed the Azazel character, Cerebro effect and various establishing environment shots to the film, under visual effects supervisor Matt Johnson.

Azazel (Jason Flemyng), a devilish-looking mutant who is able to transport between dimensions, had close associations with a character from the previous films, Nightcrawler, in that he would wisp on and off the screen, leaving smoke behind and creating smoke just before entering frame. “We went through a lot of development of trying to get a digital fire and smoke sim because he was more closely aligned with the devil,” explains Johnson. “It wasn’t a full-bodied flame but had to be very quick. There was a big fight scene where he appears and disappears. We roto-mated Azazel, so whatever movement he was doing would affect the sim of the smoke. And then we also had to do the tail, which we built and animated in CG and added in some transparency and textures for a real-world feel.”

Azazel’s fight scenes were filmed somewhat traditionally, by allowing the actor to step out of the shot where necessary to aid teleports, but without having to ask the extras to freeze in the frame. “We were trying to keep away from that as much as possible and let the camera moves be very dynamic,” says Johnson. “When we were shooting we’d keep an eye on what we were looking at, and then knowing when he’d disappear from the frame, we’d try and remember what was in the frame and think, ‘right, what can we shoot that would be helpful or can we get the camera move repeated to get a clean plate’.”

– Above: a making-of video about Azazel from The Daily

Cinesite developed a proprietary fluid shader for the smoke and fire simulation that could be packaged into an FX rig. That allowed artists to adjust settings based on Azazel’s body motion and speed, and to set the frame on which the transport effect would begin and how long it would last. “The result of this,” says FX TD Steve Shearston, “was that we could get each shot 90 per cent of the way to completion on the first pass, and only make minor adjustments to speed and turbulence, which also had custom controls, on a per shot basis.”
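
Cinesite’s rig is proprietary, but the idea of packaging per-shot transport controls – start frame, duration, speed and turbulence, driven by the roto-mated body motion – might look something like this hypothetical sketch. All parameter names and numbers are invented.

```python
# A hypothetical sketch of how a packaged transport rig might expose its
# per-shot controls; Cinesite's actual rig and parameter names are not public.

from dataclasses import dataclass

@dataclass
class TransportFXSettings:
    start_frame: int          # frame the teleport effect begins
    duration: int             # how many frames it lasts
    body_speed: float         # roto-mated character speed, units/frame
    speed_scale: float = 1.0  # per-shot tweak
    turbulence: float = 1.0   # per-shot tweak

    def emission_rate(self, base_rate=500.0):
        """Faster body motion drives a stronger smoke/fire emission."""
        return base_rate * self.body_speed * self.speed_scale

shot_040 = TransportFXSettings(start_frame=1012, duration=18, body_speed=2.3)
print(shot_040.emission_rate())
```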

“The final setup,” adds Shearston, “was to use an individual fire fluid simulation for each of the characters that transported and an overall smoke fluid simulation that was emitted from all the characters, all of which was created in Maya 2011. For the fire, we filled the body with particles, which would be used to emit the fire fluid, and then animated their size so that if it was an incoming transport the fire portal would move from the front to the back and linger a little behind Azazel, and the reverse for an outgoing transport. It was rendered with PRMan, through RenderMan Studio, with three heat ranges being assigned either a red, green or blue colour, which gave us a lot of control in Nuke to quickly adjust its colour and opacity to create realistic looking fire.”
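
The heat-range-to-channel trick Shearston describes can be illustrated with a small standalone sketch: bin a normalized fire temperature into the red, green or blue channel at render time, then re-dial each range’s colour in comp with a per-channel gain. The ranges and gains below are made up, and this is plain Python rather than the actual PRMan/Nuke setup.

```python
# Illustrative stand-in for the idea of binning fire temperature into the
# red, green and blue channels so colour and opacity can be re-dialled in comp.
# The ranges and colours below are made up.

def heat_to_channels(temperature):
    """Assign a normalized 0..1 temperature to one of three channels."""
    if temperature < 0.33:
        return (temperature / 0.33, 0.0, 0.0)           # coolest range -> red
    elif temperature < 0.66:
        return (0.0, (temperature - 0.33) / 0.33, 0.0)  # mid range -> green
    return (0.0, 0.0, (temperature - 0.66) / 0.34)      # hottest range -> blue

def comp_fire(channels, gains=(1.8, 1.2, 0.9)):
    """In comp, each channel gets its own colour gain, like a per-range grade."""
    return tuple(c * g for c, g in zip(channels, gains))

print(comp_fire(heat_to_channels(0.7)))
```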

The speed of the smoke simulation was animated, which allowed it to explode into the air and then quickly slow down to briefly hang and move at the same speed as the rest of the breeze on the set. “The red and black colour comes from Azazel’s skin and clothes,” says Shearston, “which we get by selecting ranges of the smoke sim’s density and assigning each range one of the colours, which stays with that part of the smoke for the duration of the shot. As Azazel’s transport effect included fire, and in a lot of the shots where he came or left he was in a fairly stationary position, we took a different approach from the squash-and-stretch disappearing technique used for X2’s Nightcrawler. We used the evolving fire as a mask to hide or reveal the body as it moves through him and applied some subtle heat distortion so that the body would ripple out from the centre of the effect, to make it feel like he came through the portal as opposed to a Star Trek style dissolve teleport effect.”

Professor Xavier encounters an early incarnation of Cerebro – the method by which he can perceive mutants at great distances – at the CIA facility. Scenes inside a geodesic dome lab were shot against greenscreen on a raised dais at Pinewood Studios, with the dome walls composited by Cinesite. The studio also added in light effects and flares for the Cerebro helmet Xavier wears. The actual Cerebro effect mirrored the look of the previous films, but was more retro and primitive. “The thing they most liked was the effect of the first movie which was a cloudy gaseous world where everyone appears and disappears,” notes Johnson. “We had to come up with a slightly simpler version of that.”

Production shot various actors and objects against bluescreen with a Genesis camera and did multiple camera moves, handing the plates over to Cinesite to orchestrate the final piece. “We went through a phase of using Nuke’s 3D capabilities to create some 3D cameras and move around the environment, so people could be placed in their right positions,” explains Johnson. “If you’re human, you’re black and white, but if you’re mutant you’re in color. Once we had that 3D camera, we then used a combination of full 3D smoke, with different layers, again using fluid dynamics. Then we took some real practical smoke elements that were wrapped around 3D objects and moved within the Nuke environment. It was a combination of real and fake elements.”

Cinesite’s work also included various establishing shots, such as a Soviet Red Square parade. “We couldn’t film at the actual location, but I went to Moscow with our visual effects photographer for a week,” says Johnson. “We got permission to take stills at different positions of Red Square using a robot-controlled nodal camera to shoot tiles. We used that for textures and then used our Photomesh tools to create the geometry.”

With that data, artists re-created the backgrounds for the parade, modeling and animating tanks and vehicles correct for the era. An army of Massive agents was also added in, based on cycles filmed of an extra who had been in the Russian military, augmented with surrounding digital crowds. Other environment work included a shot of the Kremlin made up of still images with added pieces of Canon 5D photography to give them some movement, and an alternative-reality shot of Sebastian Shaw and his mutant army seemingly taking over Washington – made up of a greenscreen foreground of the principal actors, practical fire elements, a Massive crowd and a matte painted background.

Luma delivers Banshee, Havok and Darwin

Luma Pictures took on visual effects duties for three characters – Banshee, Havok and Darwin – as well as contributing several set extensions and specific shots dealing with a radio telescope.

Banshee, played by Caleb Landry Jones, has the power of a sonic scream that can not only harm assailants but also allow him to use the physical sound vibrations as a gliding mechanism. As with all of the characters, overall visual effects designer John Dykstra looked for elements of the real world that could bridge the gap between the comic book and the film. “John really pushed us towards scientific papers and how people visualize sound waves,” says Luma visual effects supervisor Vincent Cirelli. “We saw renderings of sound waves and how they reverberate, and used that as a foundation for the effect.”

Luma also produced 2D concept work for Banshee’s sound wave effects based on preliminary production artwork, which moved into basic 3D geometry for blocking purposes. “We created this rig of discs that the animators could block out and place for timing,” explains Cirelli. “The animators were able to use that to get feedback about the velocities and shapes and diameter of the soundwaves, and when they hit a surface how they would reverberate back. That geometry then triggered and drove the effects in a fluid simulation. You see a hint of geometry that’s rendered like a refractive surface to portray it as though it’s disturbing the air. It also has a hint of chromatic aberration in it. All of that is layered into a significant amount of fluid dynamics.”
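
A toy version of that disc rig – staggered discs expanding away from the emitter and reverberating back when they hit a surface – is sketched below. It is only meant to show the blocking logic, not Luma’s animation setup, and the speeds and distances are invented.

```python
# A toy version of the 'rig of discs' idea for blocking: each disc expands
# away from the emitter and reverses (reverberates) when it reaches a wall.
# Purely illustrative; not Luma's animation rig.

def disc_positions(frames, speed=0.8, wall_distance=20.0, spacing=3.0, count=5):
    """Return per-frame centre distances for a train of sound-wave discs."""
    history = []
    for f in range(frames):
        discs = []
        for i in range(count):
            d = max(0.0, f * speed - i * spacing)   # staggered emission
            if d > wall_distance:                   # bounce back off the wall
                d = 2 * wall_distance - d
            discs.append(round(max(d, 0.0), 2))
        history.append(discs)
    return history

for frame, discs in enumerate(disc_positions(6)):
    print(frame, discs)
```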

The studio approached their effects work for Havok (Lucas Till) in a similar fashion, re-purposing some of the fundamental rigging from Banshee’s soundwaves to create beams of light that emanate from the character’s chest as glowing bright red concentric rings. Havok is also able to summon the discs as energy and propel them at foes, learning to control his power in an underground bunker. Here, Luma added in the rings and destruction effects.

“The rings were based more on rendering actual geometry and soft bodies and using cloth sims to define the geo,” says Richard Sutherland, CG supervisor. “The beam coming out and the rings couldn’t feel too rigid – it had to feel like they were made out of energy and expand and contract and deform slightly.” Artists created the geometry in Maya and would then bring that into 3ds Max to generate a fluid sim using FumeFX. The resulting particle cache was brought back into Maya and rendered in mental ray.

For Darwin (Edi Gathegi), a character who can summon various protective materials including a shell-like armour, Luma devised a rig that allowed platelets to unfold and slide out underneath each other. “We then just played with the timing of it,” says Raphael A. Pimentel, animation supervisor, “before it was tracked to our digital double of Darwin so that it would seamlessly blend in and out of the human form. We created a lot of mattes that would help with the blending so that it didn’t feel like the shells were just popping onto the surface all of a sudden. We had to use specialized shaders that were like gradients, essentially, so that as the shells were animated, gradients for each shell would change the opacity of that shell. It made it feel like a smooth transition as the shell turns on and off.”
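
The gradient-opacity idea Pimentel describes – each shell fading up along its own animation rather than popping on – reduces to a simple ramp. The sketch below is a hypothetical stand-in for those shaders, with an assumed smoothstep ramp over the first 30 per cent of the unfold.

```python
# A minimal sketch of the gradient-opacity idea for Darwin's shells: each
# shell's opacity ramps up along its own animation rather than popping on.
# Parameter names are hypothetical.

def shell_opacity(progress, ramp_in=0.3):
    """Opacity of one platelet as it unfolds (progress runs 0..1)."""
    if progress <= 0.0:
        return 0.0
    if progress >= ramp_in:
        return 1.0
    # Smoothstep over the first part of the unfold for a soft transition.
    t = progress / ramp_in
    return t * t * (3 - 2 * t)

for p in (0.0, 0.1, 0.2, 0.3, 0.8):
    print(p, round(shell_opacity(p), 3))
```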

In one scene, Darwin and the young mutants face off against Shaw and his gang at the CIA facility. Havok launches a disc against Shaw, who is able to compress the energy and place it in Darwin’s mouth. “He’s trying to contain this energy from within so he’s reactively swapping between all these different material properties,” explains Payam Shohadai, executive VFX supervisor. “He goes from human to metal, which starts to heat up. And then he transitions into stone thinking that will contain it. There’s a beat in there when he transforms almost back into human state and you think he’s going to be OK, but not for long.”

To realize Darwin’s various states of transition, Luma built a digital double, starting with a point cloud of the actor, re-building the necessary topology and detail. “We then created these transitional mattes which were animated 3D textures to help us blend between the different materials at different times and break it up, so that it didn’t feel like just a dissolve,” says Cirelli. “We didn’t want him to feel like it was a surface wipe. As the metal heated up, it had to feel like it was heating up from a central core and then that was the trigger for different portions of him to turn to stone.” Finally, artists painted in veining crack maps on Darwin’s surface, exposing hot magma seen through fissures in the stone surface, with animated textures used to suggest fire underneath.
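
The ‘heating up from a central core’ blend can be pictured as a per-point weight driven by distance from the core and time, so the material change sweeps outwards rather than reading as a flat dissolve. The sketch below is illustrative only; the core position, wave speed and softness are invented values, not Luma’s actual setup.

```python
# A sketch of the 'heats up from a central core' blend: each surface point's
# material weight depends on its distance from the core and the current time,
# so the change sweeps outwards instead of reading as a flat dissolve.
# Distances and rates are invented for illustration.

import math

def material_blend(point, core=(0.0, 0.0, 0.0), time=0.0, wave_speed=1.5, softness=0.5):
    """Return 0 (original material) .. 1 (new material) for one surface point."""
    dist = math.dist(point, core)
    # The transition front travels outward at wave_speed; softness widens it.
    x = (time * wave_speed - dist) / softness
    return min(1.0, max(0.0, x))

for t in (0.0, 1.5, 3.0):
    print(t, round(material_blend((2.0, 0.5, 0.0), time=t), 2))
```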

MPC adds Beast, Riptide and yacht destruction

MPC visual effects supervisor Nicolas Aithadi oversaw his studio’s work on transformations for Hank McCoy to Beast, Riptide’s tornadoes, the destruction of Shaw’s yacht and Shaw’s escape by submarine.

First transformation stage
Beast forming

Scientist Hank McCoy (Nicholas Hoult) desires to turn his mutant simian feet into human form, devising a serum to aid in the transformation. But when he injects it, the initial positive result reverses, and he becomes Beast in a graphic ripping of skin, cloth, muscles and fur. “Most of the effects in the shot were controlled by a rig moving the muscles, tendons and all the veins,” explains Aithadi. “The veins were actually triggered by the rig creating maps later used in lighting to displace the skin. Actually, by the end we even developed a foot fetish because we had been looking at feet every day for three months trying to understand how they move.”

MPC built a realistic CG foot based on Nicholas Hoult’s, as well as the slightly simian feet and Beast’s more monster-ish blue versions. “We really needed to have a near-perfect human foot to sell the rest of the effect,” says Aithadi. “So that meant 80 per cent of our time was actually spent on the human foot. We also had the challenge of going from a skin foot to a blue foot, which was difficult to make look real. All the work went into the transition – how we get from the skin color to the blue color. We tried to make the color sit out of the veins, so we had to develop a vein network – big veins for the arteries, medium-sized and down to the capillaries. We had them animated to come up to the surface during the transformation, so we could believe the serum was changing the color of the blood inside the vein and the color was seeping out like blotting paper.”

For the hair, artists relied on MPC’s proprietary Furtility grooming system that was linked to the veins and would trigger hair growth from real hairs already on the foot. According to Aithadi, the hardest elements of the transformation were the toenails and the cloth sim. “That took pretty much the whole project to get to a happy point with the trousers and shirt sleeves,” he says. “The cloth also had to rip and hair had to grow through the holes, so we had to simulate hair through the cloth sim and link the two. We created an occlusion map from the clothing to the leg and we used that in Furtility to trigger the hair growth where the trousers were ripped. When we did the previs for that and showed it to the cloth TD, he was quite scared!”
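
Furtility is proprietary, but the gating logic Aithadi describes – growing hair only where the cloth-to-leg occlusion map says the trousers have ripped open – can be sketched in plain Python. The follicle IDs and thresholds below are hypothetical.

```python
# A toy illustration of using a cloth-to-leg occlusion value to gate hair
# growth, as described above. Furtility is proprietary; this is just the
# gating logic, with made-up thresholds.

def hair_growth(follicles, occlusion, threshold=0.2, max_length=1.0):
    """Grow hair only where the trousers no longer cover the leg."""
    lengths = {}
    for follicle_id in follicles:
        occ = occlusion.get(follicle_id, 1.0)     # 1.0 = fully covered
        exposed = occ < threshold                 # rip in the cloth
        lengths[follicle_id] = max_length if exposed else 0.0
    return lengths

occ_map = {"f001": 0.05, "f002": 0.9, "f003": 0.12}
print(hair_growth(["f001", "f002", "f003"], occ_map))
```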

MPC also created two fully CG face replacements for Beast, seen in the X-Jet hangar and later on the plane. “We thought we might be just replacing the mouth and a bit of the nose,” recalls Aithadi, “but we quickly realized that to accommodate the dialogue we had to replace the whole thing. The eyebrows needed to move and the eyes were changing, and as he was wearing glasses, we had to match the nose movement.”

Lip sync was animated by hand since there was no facial capture data available. “As good as your animation can be,” says Aithadi, “it’s always a bit flatter than the real character because there are so many things happening with the face. So what we tried to do was use cloth sims to add skin and muscle movement. While the dialogue was running, we were sim’ing that on top of a bone and muscle system. The skin was doing some weird extra twitches but that actually helped convey the sense of facial mechanics. Another thing we did was keep the real eyes from the plate. That helps you sell the rest of the face because usually people are looking at eyes.”

Riptide (Álex González) demonstrates his mutant powers as tornadoes, something MPC needed to replicate in a variety of indoor and outdoor settings. “The first thing we did was look into the real thing and we gathered a collection of real tornadoes,” says Aithadi. “We took that to John Dykstra who quickly realized that tornadoes look the way they look because of what they actually suck in. It’s really just wind, but in the desert it would be sand, if it’s over the sea it will have water. But our tornadoes had to be in the living room! So we changed the look to a more vaporous thing that had the ‘quality’ of tornadoes.”

“We had our effects team create a rig with just Maya particles,” adds Aithadi. “It was particles controlled by expressions that were going around a curved rig that was the body of the tornado. These particles were used to control a Maya fluid simulation. Then these Maya fluids were used to control particles for rendering. Each tornado was made of an outer layer and a core layer, so they were both created the same way but one was a bit looser and wispier and the other was a bit denser and more volume-y. All these things were sent to lighting and rendering to give us multiple layers for compositing.”
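
A much-simplified take on that layered tornado rig – particles swirling around a spine with a tight core layer and a looser, wispier outer layer – is sketched below in plain Python. It stands in for the Maya particle expressions and fluid containers Aithadi describes, and every number in it is invented.

```python
# A simplified take on the tornado rig described above: particles orbit a
# spine curve, with a tight 'core' layer and a looser 'outer' layer. This is
# plain Python for illustration, not MPC's Maya setup.

import math, random

def tornado_particles(frame, spine_height=10.0, count=50):
    particles = []
    for i in range(count):
        h = random.uniform(0.0, spine_height)
        layer = "core" if i % 2 == 0 else "outer"
        radius = 0.5 + 0.15 * h if layer == "core" else 1.5 + 0.4 * h
        if layer == "outer":
            radius += random.uniform(-0.3, 0.3)       # wispier, looser layer
        angle = frame * 0.4 + i                       # swirl over time
        x = radius * math.cos(angle)
        z = radius * math.sin(angle)
        particles.append((layer, round(x, 2), round(h, 2), round(z, 2)))
    return particles

print(tornado_particles(frame=12)[:3])
```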

At one point in the film, Erik Lehnsherr confronts Sebastian Shaw on his luxurious yacht and uses his magnetic mutant powers to control the anchor and destroy the boat. “This was some of the hardest effects I’ve worked on for a long time,” says Aithadi. “Luckily, MPC Vancouver had worked on a destruction tool called Kali for the Japanese temple in Sucker Punch, and we adapted that to this show.”

Artists built an extremely detailed digital yacht in Maya, based on hundreds of reference photos taken of the real boat shot in Georgia. “We also had a LIDAR scan, but it was a hard job modeling the boat,” says Aithadi. “We realised very quickly there are a lot of details in a boat and so many little intricate parts. Once we had the big shape, our job was to create everything else, down to the door hinges and hooks. The planks on the deck had to be built separately because we knew they would be part of the destruction later. And then we had to build the inside as well with furniture and props and have those things casting shadows and be destroyed.”

The next step was simulating the destruction. Kali allowed artists to define where to break pieces of geometry and how they should react. “The tool divides the geometry into sub-geometries and sub-polygons and you can go as low as you want,” explains Aithadi. “The boat was also made of a lot of different materials – a steel base, wooden doors, plastic features and glass windows. We had to set up the simulation so we had the right amount of debris and the right physical feel to the materials, along with shards of glass and splinters of wood.”
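
Kali itself is proprietary, but the per-material behaviour Aithadi describes implies something like a table of fracture settings keyed by material, so steel, wood, plastic and glass each break differently. The sketch below is a hypothetical illustration of that idea, not MPC’s tool.

```python
# A hypothetical sketch of per-material destruction settings of the kind a
# tool like Kali would need; the actual tool and its parameters are proprietary.

FRACTURE_SETTINGS = {
    "steel_hull":   {"min_piece": 0.5,  "max_piece": 4.0, "shard_style": "sharp"},
    "wood_deck":    {"min_piece": 0.1,  "max_piece": 1.0, "shard_style": "splinter"},
    "glass_window": {"min_piece": 0.02, "max_piece": 0.3, "shard_style": "shard"},
    "plastic_trim": {"min_piece": 0.05, "max_piece": 0.5, "shard_style": "rounded"},
}

def settings_for(geometry_name):
    """Look up fracture behaviour by the material tag baked into the asset name."""
    for material, settings in FRACTURE_SETTINGS.items():
        if geometry_name.startswith(material.split("_")[0]):
            return material, settings
    raise KeyError(f"no fracture settings for {geometry_name}")

print(settings_for("glass_porthole_03"))
```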

Other challenges for the boat destruction included making metal tear almost like cloth, but not like plastic, and ensuring the resulting pieces had sharp angles. “If you scrunch a piece of foil up it has sharp angles, whereas cloth or plastic has a tendency to be rounded,” says Aithadi. “So we had to write that into our simulation, and we helped it as well with displacement maps that would have sharp edges.”

MPC also inserted CG water using Flowline, linked sometimes with Kali, around the yacht to enable objects to interact with the surface appropriately and create splashes. The right water look was important to maintain the yacht’s scale, as were reflections seen on the ship’s surface. “We had to find the right type of reflection, which was sharp close up but then would diminish into the distance,” notes Aithadi. “And there were highlights we noticed in the real boat. It wasn’t obvious where the light was coming from but we knew we had to replicate these. The more of them you had, the bigger the boat looked. We ended up using reflection maps which were black images with white hotspots.”

Shaw escapes underneath the yacht in a submarine, a Weta Digital asset that MPC adapted for the close underwater views. “We were left pretty open to design the way the sub was attached to the boat,” says Aithadi. “I asked one of our artists here who is into mechanical things and he went crazy designing an amazing full system inside the hull of the boat with hooks and arms and mechanical parts, all animated. The main other work was the way the water was reacting around the submarine. Everything had to do with the bubbles. We had big ones that looked like jellyfish with a tail, and volume-looking ones which were really small, plus plankton and atmosphere.”

Digital Domain and Sebastian Shaw’s echo weapon

Evil mutant Sebastian Shaw (Kevin Bacon) is empowered with the ability to absorb energy and even bullet hits and re-use that as a kinetic echo weapon. Digital Domain created around 100 Shaw effects shots for a grenade explosion, the atrium and the mirror room sequence, under visual effects supervisor Jay Barton.

DD’s initial work for Shaw involved establishing the look for his mutant power. “The character design was a challenge that we brainstormed with John Dykstra,” recalls Digital Domain digital effects supervisor Nikos Kalaitzidis. “How should it manifest itself – say through muscle ripples or his arms getting extended? Eventually, it occurred to us that we were doing a horror effect – a la Frankenstein. When he absorbs nuclear energy from the submarine, you could see his face distorting and metamorphosing into this monster. You see him transform up close full screen and then turn back into a photorealistic human Kevin Bacon.”

To achieve that kind of grotesque effect, Digital Domain received a 3D scan of Bacon and relied on witness camera footage for the actor’s scenes. Character modeller Dan Platt created the face detail. Animation supervisor Bernd Angerer led roto-mation of the plates as Shaw transforms into an almost phantom-like being. “It was interesting,” says Kalaitzidis. “Even though we had a 3D scan, and we spent a lot of time on look-dev and matching him photorealistically, what we noticed when they did a re-shoot and we dropped in our rendered Kevin Bacon was that our Kevin Bacon didn’t match the new Kevin Bacon, because the original scan was months earlier and his body had changed. So we had to re-adjust the model in the look-dev to match the new Kevin Bacon.”

With recent experience on complex facial animation for Benjamin Button and TRON: Legacy, artists at DD followed the same methodology used in those films, but this time adopted V-Ray for rendering. “Going into the show we knew for the mirror room it would be very highly reflective with mirrors and lots of motion blur,” says Kalaitzidis, “so we went with V-Ray. It had also come out with certain skin shaders and hair shaders that we were able to utilize.”

A first taste of Shaw’s power comes when he absorbs the explosion of a grenade and uses it to kill a government informant. “That was a real head-scratcher,” recalls Kalaitzidis. “We were trying to design the character and give him multiple hands and arms. Is there one arm that fans out? Is it high frequency? In the end, the character was entirely CG – his face, his sunglasses, his hair, his hands. The explosion was tricky because we didn’t have any reference of someone holding a grenade and it exploding – was it slow-motion? No, it was more or less real time how he absorbed this grenade. We went with the thought that the grenade energy was this zero gravity explosion that happened – he’s absorbing it and it’s going into his skin and up his sleeves. The fire and absorption was done in Houdini. We had a lot of rigging techniques for the animation of the multiple arms and multiple heads. There were different models of the heads we used that were introduced into the animation for the horror effect.”

At a later point Shaw destroys an atrium, an environment augmented by DD with fully CG explosions, debris and fire. “Our atrium explosions were led by effects animation supervisor Brian Gazdik,” says Kalaitzidis. “We used Houdini to break things and add fire on top of that. It included desks, and ceilings and windows and even typewriters on the desks. In the past we’d use real pyro elements but I felt like this was one of the shows where we didn’t need to do that for CG fire. And not only did we have CG fire but all the characters shooting at Shaw from the top of the atrium were all CG and mo-capped.”

After attempting to draw the nuclear power of the submarine, Shaw is confronted by Lehnsherr in a ‘mirror room’, an environment shot completely against greenscreen. Once filmed, DD assembled some rough post-viz to help establish the key beats and rudimentary, but multiple, reflections. “That was approved by John Dykstra and then we came up with a look-dev for the mirror room,” says Kalaitzidis. “We used reference from Enter the Dragon with Bruce Lee, in the mirror maze. That showed the imperfections and the hues and neutral density that we needed to add in.”

Witness-cam footage of Shaw and Lehnsherr, again, was used to copy their performances via roto-mation. “Using this animation,” explains Kalaitzidis, “we reflected CG doubles into the mirrors themselves. They’re fighting each other and there’s breaking glass, which was done with Maya and Houdini, and then composited in Nuke.”

The final showdown

Weta Digital handled extensive visual effects work – around 350 shots – for the film’s climactic face-off between the Soviet and US navies near Cuba (after Shaw has orchestrated a missile crisis), the destruction of Shaw’s submarine, scenes on a nearby beach and the near annihilation of the navies by their own missiles.

Using previs of the sequence as a base, production filmed scenes both on greenscreen stages and at Jekyll Island in Georgia, a location originally chosen to match Cuba’s white sandy beaches and palm tree setting. “Apparently before we got there, however,” says Weta visual effects supervisor Guy Williams, “it was 80 degrees and beautiful weather, then a week before we showed up the temperature dropped down below freezing, which is the most surreal thing in the world. It got insanely cold.”

“The set was this massive sandy beach with a big green buck for where the sub was,” adds Williams. “The front half of the X-Jet was there also but the back half was another big green buck. They actually trucked 300 to 500 palm trees and buried them into the sand so that it looked like it was a tropical beach as opposed to an east coast beach.” Due to the freezing temperatures many of the palm trees became brown or died only days into the shoot, necessitating significant digital color correction and foliage work for Weta. “Luckily, John Dykstra did an element shoot for us with palm trees against greenscreen so we could comp those in. We also made some CG palm trees with a little bit of wind simulation on them.”

For the missile stand-off, Weta built three battleship types per fleet, other smaller ships, a freighter, Shaw’s submarine and the X-Jet using its tried and tested modeling tools. “We have an elegant way of dealing with multiple datasets which lets us deal with large amounts of data and large datasets inside that large amount of data,” explains Williams. “The way the system works is that you don’t have to copy and paste say a gun turret for a ship – you just say on the other ships that you want that gun and it knows where to find the proper textures and proper shaders and makes it easier to recycle assets and add detail. So anything that was built on one ship could be used on the other ships, thus making heavily detailed ships an easier process.”

In addition, artists relied on Exotic Matter’s Naiad for water simulations, in combination with Weta’s own proprietary fluid simulation software called Synapse. “We can use that also to handle complex amounts of voxels so you can get really detailed simulations,” says Williams. “It allows us to manipulate the Naiad simulations and build upon them and move backwards and forwards amongst the volumes. We can use Naiad say to do the surface of the water, and then we can turn around and do particles into the surface of the water and air bubbles and use other driving factors like vorticity or velocity or even age to create foam.”
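
The foam criteria Williams lists – vorticity, velocity and age, evaluated on particles near the surface – can be boiled down to a simple predicate. The sketch below is a plain-Python illustration with invented thresholds, not Weta’s Synapse or Naiad code.

```python
# A simplified version of the foam criteria Williams describes: particles near
# the surface spawn foam when their vorticity, speed or age crosses a
# threshold. Thresholds are invented for illustration.

def emits_foam(particle, vort_limit=4.0, speed_limit=6.0, max_age=2.5, surface_band=0.2):
    """Decide whether one water particle should seed a foam particle."""
    near_surface = abs(particle["depth"]) < surface_band
    energetic = particle["vorticity"] > vort_limit or particle["speed"] > speed_limit
    old_enough = particle["age"] > max_age
    return near_surface and (energetic or old_enough)

p = {"depth": 0.1, "vorticity": 5.2, "speed": 3.0, "age": 0.8}
print(emits_foam(p))   # True: near the surface and spinning hard
```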

“We also use FFT systems to create the waveform of the open ocean,” continues Williams. “For one scene, we had baked that into a displacement map, and one of the things we were fighting with was a boat going through the middle of this waveform – how do we transfer from the waves to the isosurface or the created mesh of the actual wake of the boat? In the end, we came up with this way of putting everything into an isosurface so that it could all be unionized properly. The water rendered to the horizon is an implicit surface – all the various simulations can be integrated back into that surface and be rendered as a single mesh. It allowed us to have more than 12 boats in the water, each with a high resolution wave sim coming off of its bow wake, and each one of those can be stitched back into the water and rendered as one surface. So there are no blend areas – no comp tricks to try and put things back together. This meant it became relatively easy for us to do very complex shots.”
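
The ‘everything into an isosurface’ approach is easiest to picture with signed distance functions: if the FFT ocean and each boat’s wake sim are expressed as a signed distance to the water surface, their union is just the pointwise minimum, which can then be meshed as one seamless surface. The toy sketch below illustrates the idea with invented wave and wake functions; it is not Weta’s implementation.

```python
# The 'everything into an isosurface' idea, in miniature: if each water source
# (FFT ocean, bow-wake sim, splash sim) is expressed as a signed distance
# function, their union is the pointwise minimum, and the whole ocean can be
# meshed as one surface with no blend seams. A toy example follows.

import math

def ocean_sdf(p):
    """Signed distance to a gently displaced ocean plane (FFT stand-in)."""
    wave = 0.3 * math.sin(0.5 * p[0]) * math.cos(0.5 * p[2])
    return p[1] - wave

def wake_sdf(p, boat_pos=(0.0, 0.0, 0.0), radius=2.0):
    """Signed distance to a mound of water around a boat's wake."""
    d = math.dist((p[0], p[2]), (boat_pos[0], boat_pos[2]))
    bump = max(0.0, 1.0 - d / radius)
    return p[1] - 0.8 * bump

def unified_water(p):
    """Union of all water sources: min() of their signed distances."""
    return min(ocean_sdf(p), wake_sdf(p))

print(round(unified_water((0.5, 0.4, 0.0)), 3))   # inside the wake mound -> negative
```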

A particularly complicated scene included the emergence of Shaw’s submarine from the water, pulled out by Erik Lehnsherr’s magnetic powers. “The hardest part of that shot was scale,” recalls Williams. “One of the first things you do is depart from reality in the way you want to stage the action. For example, they wanted the sub to come out of the water really fast and the problem is that you have an animator animate it and you get something looking really cool moving fast. But then you look at the numbers and it’s moving 60 or 70 miles an hour backwards and its props are kicking water in a backwards direction, so you have to put thrust coming off the props.”

“Something at the scale with that much water really wants to do exotic things with the physics of water. If you turn on all the proper bells and whistles and everything, it starts to do amazingly energetic things – you get these massive explosions in the water with water being kicked up hundreds of feet in the air. If you move something that big and that fast in the water without any kind of dampening, that’s what will happen. So you get into this whole delicate balancing game of bending the physics of the situation so that it still looks real but it’s not falling apart like it would at those kinds of forces.”

Various simulations worked together to form the layers of water, bubbles and foam. “When it breaches the water it starts to create foam on the surface of the water,” says Williams, “which then slides along the surface of the water based off the motion of the actual water itself. As the props splash onto the water, you have another sim that’s doing little splashes. Then there’s another sim just doing the aeration and the bubbles underneath the water; as it starts to get close to the surface of the water you allow cavitation to occur – pulling little pockets of air out of the water and creating little bubbles that want to float around for a second. As the underwater bubbles get closer to the surface, they actually turn to surface foam. It’s this nice giant integrated family of effects that go into creating the end result.”

The final missile launch – thwarted by Lehnsherr, who then turns the missiles on the Soviet and US navies – allowed Weta to undertake some significant research into advanced weaponry. “We got to be completely nerdy fan boys and go off and research lots of Russian and U.S. military hardware,” says Williams. “The Americans make these missiles that are 12 feet long, like a Tomahawk or a Harpoon. The Russians make these monstrous missiles – about the size of a school bus! The problem we ran into there is that we had this flight of missiles coming in towards the beach and initially it felt too close to camera, but actually it was about 100 yards away because it was just enormous. So we had to keep that in the back of our heads in terms of the relative sizes and create a faux sense of perspective.”

Exhaust trails and the subsequent mid-air explosion of the missiles were created in Weta’s fluid simulation software. “I love doing that sort of stuff because it allows you freedom of camera,” says Williams. “If you shoot a big, black oily explosion you’re kind of locked into the angle of the camera. Since we’re doing them as fluids we can actually pass right by the explosions as they’re happening. Plus they’re also being lit by the same lights as are lighting the ships and the water. So there’s a sense of continuity and connectivity that helps with the reality of the whole situation.”

Handling the physics of the missile exhausts as they come to a spluttering stop was one of Williams’ biggest challenges for the sequence. “You have the missiles stopping in mid-air but the exhaust coming out the back of them needs to be moving at 700 miles an hour. If you take a fluid solver and create a voxel set that’s large enough that you can throw this exhaust 200 yards back, but also fine enough that you can see the texture to it, and then you put that kind of pressure into it, you start to get into some problems with the fluid solvers. So we used a fluid solver that was more robust to let us do high-end pressure physics. We could put a nozzle at the back end of where the missile was and shoot exhaust out at 700 miles an hour and have it travel backwards for a couple of hundred yards and then slow down until it was a larger cloud, and then we actually hung that off the back of a few of the missiles as they were starting to slow down, which helped with the sense of energy.”
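
The exhaust behaviour Williams describes – gas leaving a nozzle at roughly 700 miles an hour and then decelerating into a hanging cloud – amounts to a high initial velocity decaying under drag. The sketch below is a crude plain-Python illustration with invented drag numbers, not the production fluid solver.

```python
# A rough sketch of the exhaust behaviour described above: gas leaves a nozzle
# at the back of the stalled missile at very high speed, then drag slows it
# into a hanging cloud. Numbers are illustrative, not production values.

def exhaust_speed(initial_mph=700.0, drag=0.15, steps=12, dt=0.25):
    """Return the speed of an exhaust parcel over time under simple drag."""
    speeds, v = [], initial_mph
    for _ in range(steps):
        speeds.append(round(v, 1))
        v -= drag * v * dt     # exponential-style decay from air resistance
    return speeds

print(exhaust_speed())
```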

Effects around the world

Final composite by Method Studios (London)

Co-ordinating a worldwide effort for First Class’ visual effects relied on numerous cineSync sessions and constant communication between John Dykstra and his visual effects team. In particular, visual effects supervisor Stephane Ceretti from Method Studios in London (which itself completed 23 shots for Havok’s training scene and for the Hellfire Club sequence where Erik and Charles meet Angel for the first time) was loaned out to the X-Men production team for Fox to work in the UK. “It was great to be working with such a legend as John Dykstra,” says Ceretti. “He’s extremely focused and professional, yet retains a great sense of fun in the midst of all the madness. As I was in London, I was in charge of the seven smaller vendors on the show that were based in the UK and Europe, who completed around 300 shots.

“The process of working with more than one supervisor went very smoothly. John was the driving force and Rob Hodgson and I were there mainly to ensure that John could focus on the main vendors and the big shots and his interactions with editorial, Matthew and the studio. Rob did a fantastic job jumping onto the fast-moving train of the LA shoot and I was there to help make sure the VFX department was open 24 hours a day and no time was wasted with the time difference and feedback to all our European-based vendors. We were all pushing towards the same goal – finishing the movie in time and making sure we would push the shots until the very last minute.”

All images © 2011 Twentieth Century Fox Film Corporation. All rights reserved.
