Man of Steel vfx milestones

“Zack Snyder wanted Man of Steel to appear very natural because there’s some very fantastical things in there and he wanted people to suspend their disbelief, and we the visual effects team had to make it as easy as possible for them to do so.” So recounts overall visual effects supervisor John ‘DJ’ Desjardin on the philosophy behind Man of Steel’s visual style.

Superman takes to space. VFX by MPC.

Desjardin notes that the intent was to shoot a more handheld, documentary-style film (the DOP was Amir Mokri) than previous outings in this comic book character’s ‘verse. “We had to think about what that would mean since we also had to photograph some crazy action,” says Desjardin. “So for a lot of the previs we did, we’d start to think where our cameras were and where our cameraman was. A lot of the rules are the Battlestar Galactica rules for the space cams that Gary Hutzel developed for that mini-series where we want to make sure if we’re translating the camera at all it makes sense. Unless the action is so over the top, like in the end where Superman is beating up Zod – we had to break it a bit.”

fxguide talks to the major players responsible for bringing to life the visual effects of Man of Steel: overall supervisor John ‘DJ’ Desjardin, and Weta Digital, MPC and Double Negative. With so much work in the film, we delve down into just three of the many tech accomplishments:

1. The tech of Krypton
2. Live action and CG takeovers: the Smallville confrontation
3. Destroying a city: the invasion of Metropolis

And we also take a look at PLF’s previs work, Scanline’s tornado and oil rig effects and Look Effects’ work on the bus crash.

The tech of Krypton

A war brews on Krypton before its destruction.

Act I of the film takes place on Krypton, a planet facing destruction from an unstable core. Weta Digital created alien planet environments, creatures and also the key means of display – a technology the filmmakers came to call ‘liquid geo’, meaning liquid geometry. “Basically,” explains Weta Digital visual effects supervisor Dan Lemmon, “it’s a bunch of silver beads that are suspended through a magnetic field, and the machine is able to control that magnetic field so that the collection of beads behave almost like three-dimensional pixels, and they can create a surface that floats in the air and describes whatever the thing is you’re supposed to be seeing.”

The liquid geo devices appear in the planet Krypton scenes, as well as later sequences on the Kryptonian ship the Black Zero. Similar technology making up a panel display resembling a Greco-Roman bas-relief – but achieved via a different method – is present in a scene in which a hologram of Superman’s father, Jor-El, explains the history of Krypton to his son.

In creating the liquid geo – which took the form of anything from wide planet views, x-rays and displays on floating robots, and was even used to depict Jor-El communicating with his wife Lara – Weta Digital took these steps:

1. The look – The beads, which up close would appear to be pyramids with a slight bevel, were designed to create a surface of the object they were depicting inside some kind of console. “Essentially we would have the normals of the objects that we were targeting provide the simulation with an orientation that one of the most dominant sides of the pyramid would align with,” explains Weta Digital lead FX TD Brian Goodwin.
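To make that normal-targeting idea concrete, here is a rough numpy sketch; the pyramid face normals, function name and alignment scheme are illustrative assumptions, not Weta’s actual setup. It picks whichever bead face is already closest to the target surface normal, then rotates it the rest of the way.

```python
import numpy as np

# Unit normals of a beveled-pyramid bead's faces in its local frame.
# These values are illustrative, not the production asset's.
FACE_NORMALS = np.array([
    [0.0, -1.0, 0.0],                    # base
    [0.8, 0.6, 0.0], [-0.8, 0.6, 0.0],   # four sloped sides
    [0.0, 0.6, 0.8], [0.0, 0.6, -0.8],
])

def align_bead(surface_normal):
    """Rotation matrix pointing the bead's most dominant face along the
    surface normal it is targeting (antipodal case omitted for brevity)."""
    n = surface_normal / np.linalg.norm(surface_normal)
    face = FACE_NORMALS[np.argmax(FACE_NORMALS @ n)]  # closest face already
    v, c = np.cross(face, n), float(face @ n)
    if np.isclose(c, 1.0):
        return np.eye(3)                 # already aligned
    vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + vx + vx @ vx / (1.0 + c)       # Rodrigues' formula
```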


– See how Weta Digital created the liquid geo and History of Krypton sequences, thanks to our media partners at WIRED.

2. Modeling and animation – The models used for animation ranged from purpose-built (Lara’s face) to ones appearing in grander scenes (such as approaching scout ships). Says Goodwin: “We had to develop a pipeline to bring in assets. So instead of going down the route of reducing the polygon count to something usable, you would take the model in whatever way it was made and just scatter discrete points onto it, extract the matrix from the animation, copy these points onto the matrix and have these sparse points behaving the way the model would.”

“We had animation provide us with geometry that we would then track beads on,” adds Goodwin. “Those beads would then be turned active in front of the actual console and the console would decide what beads it needed to provide to actually draw particles from the actual earth we described or this invisible bowl.”
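A minimal sketch of that scatter-and-bind approach, assuming a triangle mesh and numpy (the function names are hypothetical, not Weta’s pipeline): scatter points once with area-weighted barycentric sampling, remember each point’s binding, then re-evaluate the same bindings on every animated pose.

```python
import numpy as np

def scatter_on_mesh(verts, tris, count, seed=0):
    """Scatter `count` points on a triangle mesh, remembering which
    triangle and barycentric weights each point uses."""
    rng = np.random.default_rng(seed)
    # Area-weighted triangle choice keeps density even over the surface.
    a, b, c = (verts[tris[:, i]] for i in range(3))
    areas = 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1)
    tri_ids = rng.choice(len(tris), size=count, p=areas / areas.sum())
    u, v = rng.random(count), rng.random(count)
    flip = u + v > 1.0                  # fold samples back into the triangle
    u[flip], v[flip] = 1.0 - u[flip], 1.0 - v[flip]
    bary = np.stack([1.0 - u - v, u, v], axis=1)
    return tri_ids, bary

def evaluate(verts_animated, tris, tri_ids, bary):
    """Carry the scattered points onto the current animated pose, so the
    sparse points behave the way the full model would."""
    corners = verts_animated[tris[tri_ids]]           # (count, 3, 3)
    return np.einsum('pi,pij->pj', bary, corners)
```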

3. Simulation – After animation, artists ‘copied’ little beads onto the animated geometry for a pre-sim’d lighting version to get approval on how the object would read. Sims were then run “on all the targets which would be discrete beads floating around on top of the surface which would have its own set of parameters,” says Goodwin. “The bead size or the turbulence that would crawl along the surface constantly updating the orientation was based on the normal provided by the surface. That was then saved to disk and we would use that sim as the final target for the simulation.”

Liquid geo console displays can be seen in this pic from the Black Zero on earth, along with Metropolis backgrounds.

The sims were based on a fluid sim. “We used Houdini’s internal FLIP solver that gave us the pressure, the sense of volume maintenance,” notes Goodwin. “We’d have the console sim inside an invisible membrane, and there would be currents that we would describe with what we would find aesthetically pleasing within the shot.”

“The console was like a cup turned to the side,” adds Goodwin, “so whereas gravity would be Y pulling ‘down’, in this case the Y is facing into the back of the ‘cup’ console,” which means essentially gravity pulls from the back of the cup towards the actors watching the display. The beads then fall (after they pass some threshold) towards the inside of the geometry. It is as if one poured the beads into a glass bowl, all filmed from below the bowl looking up – except everything is turned on its side, so the beads fall towards the viewer, and the geometry of, say, a planet Earth is a hollow glass bowl between the sea of original beads and the viewer. “We sort of reversed it by having gravity faced towards the front of the membrane which would be the meniscus, and the surface of the actual water was the front of the console, so we would have a constant force pushing towards the front which would give us a sense of a flat surface with water traveling around on it, but this surface was never rendered.”
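A toy particle step makes the sideways-gravity idea concrete. This is a simple Euler integrator standing in for Houdini’s FLIP solver, with the axis choice and constants assumed for illustration:

```python
import numpy as np

# Gravity points out of the back of the console toward the viewer (+Z
# here), so the plane the beads settle against is the front 'meniscus'.
GRAVITY = np.array([0.0, 0.0, 9.8])      # pulls toward the audience
FRONT_Z = 0.0                            # the invisible front membrane

def step(pos, vel, dt=1.0 / 24.0):
    """One toy integration step for (n, 3) bead positions/velocities."""
    vel = vel + GRAVITY * dt
    pos = pos + vel * dt
    # Beads may not pass the membrane: clamping and killing the normal
    # velocity is what reads as a flat liquid surface turned on its side.
    ahead = pos[:, 2] > FRONT_Z
    pos[ahead, 2] = FRONT_Z
    vel[ahead, 2] = 0.0
    return pos, vel
```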

4. Noise – After simulation, Weta Digital ran every bead through a temporal filter to remove jitter. “Even with the highest RenderMan settings we would still face a lot of noise, and that led us to taking out all the noise within the simulation,” continues Goodwin. “Even the most subtle twist of the bead – half a degree, 2 degrees – would, because it’s mostly specular, result in seeing a completely different point within the IBL (Image Based Lighting), and that would create a tremendous amount of variation from the slightest bit of movement. By filtering it, it softened the whole piece out, to the point that sometimes we needed to get a little bit of grittiness back in, because smoothing out the beads too far would look too boring.”
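In spirit, the temporal filter might look like the following sketch: a simple windowed average over per-bead orientation vectors, with the window size and data layout assumed for illustration.

```python
import numpy as np

def temporally_filter(orients, radius=2):
    """Smooth (frames, beads, 3) orientation vectors over time. Averaging
    a small window and renormalizing removes the sub-degree frame-to-frame
    twists that make highly specular beads sparkle against the IBL."""
    frames = len(orients)
    out = np.empty_like(orients)
    for f in range(frames):
        lo, hi = max(0, f - radius), min(frames, f + radius + 1)
        avg = orients[lo:hi].mean(axis=0)
        out[f] = avg / np.linalg.norm(avg, axis=-1, keepdims=True)
    return out
```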

The team could control the flow from back to front and back again. “We allow the simulation to go through a series of noise fields to make it a little bit more interesting and then join the target,” says Goodwin. “Then once it’s joined the target it would essentially no longer be registered in the simulation. Once the target disappears we release it and it finds its way back into the simulation, by using the opposite force – we essentially use level sets to create some sort of pressure, and it would know when it was inside or outside this world, and a bunch of rules would dictate whether it was allowed to be outside.”

5. Lighting and rendering – Lighting solutions were taken from the set. For the consoles, Weta moved to the next version of RenderMan to take advantage of improved raytracing and object instancing. Motion blur was also a particular challenge. “We had the traditional motion blur – in that our particles do technically move,” says Goodwin. “We did a test where we rendered the objects and we would compare the object’s motion blur literally straight from animation, and we would line that up with the render we would get out of the beads. In some cases we would have the vectors that were provided to the renderer shortened ever so slightly, because as the beads form a target across two frames, long vectors would result in spikes within the motion. You’d have a bead that travels across the length of the frame and you’d have a long streaky specular highlight – ultimately it’s shading at one end and smudging it – and in that case we’d shorten the motion blur so it wouldn’t create bright little spikes.”
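The vector-shortening trick reduces to something like this sketch, with made-up threshold and scale values:

```python
import numpy as np

def shorten_motion_vectors(mv, max_len=0.5, scale=0.85):
    """Slightly shorten per-bead motion-blur vectors handed to the
    renderer. A long vector on a tiny specular bead smears one bright
    shading sample across the frame; scaling the fastest vectors down
    kills the bright spikes without visibly changing the blur."""
    lengths = np.linalg.norm(mv, axis=-1, keepdims=True)
    return np.where(lengths > max_len, mv * scale, mv)
```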

The History of Krypton sequence with VFX by Weta Digital. The studio also worked on the liquid geo communication shots.

The history lesson bead shots were created slightly differently and without an underlying sim. “In addition to working out all the technical aspects, just figuring out aesthetically and creatively what actual images we would use to tell the story of Krypton was really important,” says Lemmon. “We did quite a lot of concept art based on various sculptures. We looked at bas-relief from the Rockefeller Center, we looked at Greco-Roman references and explored those kind of aesthetic looks but applied to spaceships and alien planets and alien technology – if you were depicting a sci-fi world through the medium of stone sculpture, what that might look like.”

Weta Digital had originally planned to do these shots with the liquid geo simulation engine as well, but ultimately the look required was a different one. “It’s more of a relief style,” says Goodwin. “The space each object exists within looks like it exists in a world that’s flattened. The idea was that it went from being a simulation forming things to being a relief. If you look closely the background is flowing with the text and graphics – the beads travel along it – but outside of that, things weren’t actually moving to and fro.”

The action for the history lesson was animated based on greenscreen performances by Russell Crowe and Henry Cavill in what became, according to Goodwin, ‘humongous spaces’. “It was traveling hundreds of thousands of meters in a Maya scene, and then we sent that through a projection,” says Goodwin. “It was all animated in world space and we would then send that through a transformation which we would then project onto a back wall and relief it. We needed to represent all the information in a confined space.”
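The world-space-to-relief transformation could be sketched like this hypothetical stand-in, with a translation-only camera (the production setup was far more involved): preserve the on-screen composition via a perspective projection, then re-inflate it with heavily squashed depth so vast travels fit on a back wall.

```python
import numpy as np

def reliefify(points_world, cam_pos, focal=50.0, wall_z=0.0,
              depth_scale=0.02):
    """Flatten world-space animation onto a bas-relief back wall."""
    p = np.asarray(points_world, dtype=float) - cam_pos   # camera space
    z = np.clip(p[:, 2], 1e-3, None)
    x, y = focal * p[:, 0] / z, focal * p[:, 1] / z       # perspective
    relief_z = wall_z + (z - z.min()) * depth_scale       # squashed depth
    return np.stack([x, y, relief_z], axis=1)
```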

The liquid geo shots on Krypton occur both while the planet is under siege from Zod’s crew and as it becomes unstable and, ultimately, implodes. Before that happens, wide views of Krypton depict an alien atmosphere – mostly Weta Digital environments, spacecraft and creatures.

Faora on Krypton. Note the women’s suits are real; the men’s are digital, thanks to Weta Digital.

One shot of Jor-El riding a winged creature made use of a buck and gimbal set-up to replicate the move fashioned in previs. “We shot elements of Russell Crowe in his flying costume on that buck,” explains Lemmon, “and using previs as a guide tried to match both camera and the movement of the gimbal to the previs. Then we put those elements onto a digital creature and a digital world. But of course some of the stuff moved in such a way that it wasn’t possible to get those movements. There were shots where we transitioned in and out – we go all digital – then Russell Crowe for just five frames or so, and then back to a digital character.”

For some dramatic shots of Zod’s ships approaching the House of El, Weta Digital referenced scenes from Apocalypse Now. “Zod flies in these attack ships and his descent on the House of El was modeled on the Ride of the Valkyries sequence,” says Lemmon. “There’s actually a shot that everybody thinks is in Apocalypse Now, but isn’t – it’s in the Apocalypse Now poster: the sun shot, the ships flying out of the sun.”

Suits of armour worn by characters on Krypton were mostly CG additions to the actors wearing gray suits with tracking markers (although female characters wore practical armour on set). The tracking of these shots was therefore particularly complex, matching all the movements of actors sometimes engaged in hand-to-hand combat, such as Zod’s attack after Superman’s pod is launched from Krypton.

A phaser battle contained a specific look for blasts with plasma residue. “Those were mostly Houdini simulations,” notes Lemmon. “We wanted to avoid a straight laser beam and do something that had a little bit more interest in it. The idea was the beam moves through the air and charges and ionizes particles in the air. On Krypton there’s particles that float in the air the same way that dust does here, but we treated them as if they were heavier and got more excited by the beam. As the beam moves through the air it glows and starts to swim a little bit and leave that residue, particularly when it hits somebody.”

Zod and his followers are captured and banished to the Phantom Zone (also completed by Weta Digital).

Aerial battle shots employed Krypton’s hazy environment and shafts of light through rock pillars to add depth. “In busy sequences like that it’s important to compose things so that you can actually see what’s going on and see who’s good and bad and who’s winning,” states Lemmon. “One thing that drives me nuts in big action sequences is when you can’t actually – it’s just noise – and you can’t see what’s going on.”

Later, as the planet begins to destroy itself, the studio worked to show various angles of the destruction including a ‘from space’ view. Lemmon says he enjoyed “figuring out how it would look – playing it out as a geo-thermal event that’s influenced by the planet’s magnetic field, and maybe have it collapse along the equator rather than blow out spherically and implode first then explode afterwards. Playing around with those ideas was a lot of fun.”

Live action and CG takeovers: the Smallville confrontation

VFX by MPC.

A major challenge faced by the filmmakers and visual effects crew on Man of Steel was to realize elaborate close-combat fight scenes between Superman and his Kryptonian foes. They wanted to take advantage of digital effects to portray superhuman strength and powers, but without what had been perceived previously as ‘cutting’ from live action to an obvious digi-double and environment. Instead, the filmmakers wanted these shots to be executed as seamless takeovers.

Desjardin explains: “When we do these fights and these hyper-real things, we don’t want to do the traditional, ‘OK I’m a cameraman, I’m shooting a clean plate, I’m going to pan over here to follow the action that’s not really there yet but we’ll put the action in later.’ Because that’s us animating the characters to the camera. So we would do that animation with the characters – grappling, punching or flying away – and we would take the real guys up until the point where they were supposed to do that and we’d cut. Then we’d put an environment camera there and take the environment. And then a camera for reference of the actors and get each moment. So then we had a set of hi-res stills for the environment and the characters. Then in post we take the digi-doubles and animate them according to the speeds we want them to move in our digital environment.”

See parts of the Smallville encounter in this TV spot.

This approach was pioneered for the Smallville encounter in which Superman confronts Zod and his crew after they have threatened Martha Kent. They fight on the streets of the town and are further attacked by the military via A-10s and ground assault troops. MPC handled visual effects for this sequence (in addition to many other shots in the film ranging from Arctic scenes to shots in the upper atmosphere when Lois and Superman are taken to the Black Zero).

In order for the seamless takeovers to occur – and for the shot to continue with a pan, tilt or other move – a new capture and post-production process was proposed by MPC visual effects supervisor Guillaume Rocheron, in conjunction with Desjardin. Here’s how it broke down for a typical Smallville shot:

1. The shot would be previs’d and the particular fight choreography established by stunt coordinator Damon Caro.

2. Knowing from the previs the shot that was required, live action portions of the scene would be filmed in little pieces. “If say Superman was being punched and would land 50 meters away, we would shoot our start position and end position, and then bridge that gap with the CG takeovers,” says Rocheron.

3. A camera rig dubbed the ‘Shandy-cam’ (named after on-set VFX coordinator Shandy Lashley) obtained keyframes of the actor. “It’s a six still camera rig that’s built on a pipe rig so that you can run it in at the end of a setup and get stills of keyframes of a performance or an expression,” says Desjardin, “and then we could use those hi-res stills to project onto the CG double and get really accurate transition lighting and color – right from the set.”

MPC also handled the energy masks worn by the Kryptonians.

4. On set, another camera rig was also used to capture the environment. “We ended up calling it Enviro-cam,” notes Rocheron. “It was a rig where we mount a Canon 5D on a motorized nodal head, and that allows us to capture full 360 environments at 55K resolution for every single shot. The capture time is very quick – we were taking between 2 and 4 minutes for every shot, so it was really easy – the same way we capture HDRIs.”

“The sets are there, so why not capture them?” says Rocheron. “It basically allows you to film what is not filmable. Here there are no cuts, no interruption. We also did a lot of entirely digital shots which had no live action. So we had our Enviro-cam, and we used that to capture the environment rather than a plate, and we could put our CG characters in there.”

The set capture resulted in lighting and textures that could be re-projected onto geometry (the sets were also LIDAR’d to aid in reconstruction). “We wrote a little pipeline in Nuke that allowed us to stitch all the photos together and then very simply calibrate them with the Smallville geometry,” says Rocheron. “We would calibrate just one angle, and then for the full dome all the photos would get automatically calibrated on the geometry. For us it was a very good process – it wasn’t just a sphere. Everything was re-projected in two and a half D onto the geometry to get parallax, and the camera could technically travel in all directions.”
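The geometry of that re-projection can be sketched with the standard lat-long mapping (numpy, hypothetical function name; the production version was a Nuke pipeline): given points on the LIDAR’d set and the nodal capture position, find where each point lands in the stitched panorama.

```python
import numpy as np

def latlong_uv(points, nodal_pos):
    """UVs in an equirectangular Enviro-cam stitch for 3D points on the
    set geometry, as seen from the nodal capture position. Re-projecting
    texture this way gives 2.5D parallax instead of a flat sphere."""
    d = np.asarray(points, dtype=float) - nodal_pos
    d /= np.linalg.norm(d, axis=-1, keepdims=True)
    u = np.arctan2(d[..., 0], d[..., 2]) / (2 * np.pi) + 0.5    # longitude
    v = 0.5 - np.arcsin(np.clip(d[..., 1], -1.0, 1.0)) / np.pi  # latitude
    return np.stack([u, v], axis=-1)
```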

Superman is also seen in several sequences, of course, flying through clouds. “We used volumetric clouds,” says Rocheron, “using an internal tool we have for clouds to voxelize geometry and transform it into volumes, and refine with layers of advection and noises for the fine details.” In terms of environments Superman flies through, such as over the Arctic circle, over canyons in Utah, over Africa, and over the Dover cliffs, MPC developed these first in Terragen and then took them through to matte paintings and geometry.

Smallville features in this Man of Steel trailer.

6. Full-screen digi-doubles were of course a major component. MPC led the creation of the digi-doubles for Superman, Zod, Faora and other Kryptonians, which were shared with other vendors. Digital armour was also added, along with the energy-based Kryptonian helmets. Cyberscan and FACS sessions were conducted with the actors, and polarized and non-polarized reference photos were taken. Superman’s cape and costume were scanned in high detail – the cape in particular became a direct extension of Superman’s actions. “Our main references for the cape were illustrations from Alex Ross,” states Rocheron. “We had the cape here at MPC so we could really study its thickness and the velvetness. The light is very soft on it. We did a cloth solve in nCloth and we wrote a number of tools in animation to be able to animate the cape and see it in real time. Once the animation was approved, we had a basic representation of the cape and we would then use that to drive the nCloth simulation.”
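The proxy-driven idea reduces to something like this toy blend (illustrative only, not MPC’s nCloth setup): each step nudges the simulated cloth toward the approved animation shape, so the solve keeps the animator’s cape silhouettes while adding folds and follow-through.

```python
import numpy as np

def blend_to_proxy(sim_points, proxy_points, stiffness=0.15):
    """Pull simulated cape points toward the approved animation proxy.
    `stiffness` (a made-up value) trades simulated detail for fidelity
    to the animated shape."""
    return sim_points + stiffness * (proxy_points - sim_points)
```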

MPC used the latest version of RenderMan and its raytracing capabilities to help with the chainmaille look of Superman’s suit and the Kryptonian suits and armour. “They were all painted as displacements but we did hi-res displacement,” says Rocheron. “Raytracing allows you to capture that very subtle detail between the reflective pattern of the chainmaille and the light absorption of the blue part of the suit, since the underlying layer is bleeding through. We have infinite area lights which are the dome and the finite direct area lights that are the direct light sources you want to position in space. And that physically based setup gave us a terrific look for the reflections and the fall-off of the light – really key to get the details of the suit and armour, which in reality was mostly black.”

In one shot, Superman fist fights with an eight foot Kryptonian. “We shot a live action piece and just replaced the performance capture stunt guy and added the cape onto Superman,” says Rocheron. “Then we just thought about it and said it would be much cooler – since the Kryptonian is very tall – that Superman should fight against him while he’s hovering. We did those shots as entirely digital shots. It has that very cool feeling of flying around and punching him from all different directions.”

6. For each shot, it then became a matter of choosing the right transition point. “There’s a little transition zone that’s maybe only one or two frames,” says Desjardin. “We knew that we wanted to keep Superman real in certain places because it was say super-sharp and we want to use that to anchor the shot, even if just for a couple of frames, and then we’re going to go into digital because it’s crazy right after this.”

MPC also worked on shots of Superman ‘learning’ to fly.

“We layered a couple of other things on top of that,” adds Desjardin. “One, if there’s a punch being thrown, you can lose the arm real fast if it’s too fast. So a lot of times the arm of a CG character may be going just slightly faster than a human’s – we put a sonic boom-type signature around the forearm and we might put a little heat luminance on the leading edge surfaces of the fist. It puts an idea in your brain that it’s moving really, really fast even if it isn’t.”
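That kind of speed-triggered cue could be driven by something as simple as this sketch (the threshold and mapping are invented for illustration): measure a fist locator’s speed and ramp in the boom/heat intensity above a trigger value.

```python
import numpy as np

def speed_cues(positions, fps=24.0, trigger=30.0):
    """Per-frame speed of a fist/forearm locator from (frames, 3)
    positions, plus a 0..1 intensity for the sonic-boom ring and
    leading-edge heat glow once the speed passes the trigger."""
    vel = np.diff(positions, axis=0) * fps       # units per second
    speed = np.linalg.norm(vel, axis=-1)
    boom = np.clip((speed - trigger) / trigger, 0.0, 1.0)
    return speed, boom
```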

7. Not only were fights being depicted with digi-doubles and environments, they also traversed cornfields, buildings, glass walls and roads, and went up against flying A-10s. That necessitated incredible destruction, and for this MPC looked to its finite element analysis tool Kali, which had first been developed for the wooden pagoda destruction in Snyder’s Sucker Punch. This time around, following a few years of development, Kali was able to handle many more kinds of materials. “So we could take a tarmac and break it differently,” explains Rocheron. “It’s more resistant so it has a crater but cracks at certain places near the surface. And Superman crashes into a bank, through a glass door with a metal frame and finally into the vault, which is made of super strong steel, so we made that bend and wrap around him.” Particle sims and Flowline were also incorporated into the destruction pipeline for Smallville.
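In spirit, multi-material destruction comes down to per-material failure behavior, as in this toy table (the values and schema are invented, not Kali’s):

```python
# Illustrative material responses: strong materials crater and crack
# locally, brittle ones shatter, metals bend and wrap.
MATERIALS = {
    "tarmac": {"yield_strength": 8.0e6, "mode": "crater_and_crack"},
    "glass":  {"yield_strength": 3.0e5, "mode": "shatter"},
    "steel":  {"yield_strength": 2.5e8, "mode": "bend_and_wrap"},
}

def respond(material, impact_stress):
    """Pick a failure response from an impact stress in pascals."""
    m = MATERIALS[material]
    if impact_stress < m["yield_strength"]:
        return "elastic"        # absorbs the hit, perhaps surface cracks
    return m["mode"]
```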

Destroying a city: the invasion of Metropolis

Superman takes on Zod at Metropolis.

Determined to conquer Earth and transform it into a new Krypton with ‘world engine’ machines, Zod launches northern and southern hemisphere strikes in Metropolis and the Indian Ocean, respectively. The result, until Superman battles and then defeats the Indian Ocean world engine, is that significant parts of Metropolis and its soaring skyscrapers are destroyed – a task given to Double Negative. The studio also realized further full-scale destruction as Superman and Zod wreak havoc on remaining buildings and each other.

“Down in Metropolis there was a very clear design edict that came from Zack about how the evolution of the battle was going to be,” says Desjardin of the lighting design for the film’s third act. “The sun had to be not quite setting when the Black Zero comes down, and then very quickly it’s in its setting position, and by the time Superman and Zod go to fight it’s down below the horizon and there’s a Hawaiian cloud and colorful clouds and it’s getting dark with a twilight sky – more an ambient look. Then once Supe and Zod jump up into that sky you have some other lighting options with the sky and certain lit billboards. It’s a way to make the city come alive, to make it even more dramatic, to keep characters backlit.”

Supe and Zod battle it out.

To create a convincing Metropolis, Dneg looked to Esri’s CityEngine to help procedurally deliver the city, a tool it had first employed for the sprawling future world of Total Recall. “That was a much more sci-fi based role,” notes Dneg visual effects supervisor Ged Wright, “so we took what they had done and extended it a great deal. The work we were doing was based around the downtowns of New York, LA and Chicago, and that gave us the building volumes and heights. We’d skin those volumes with kit parts, but most of it then had to fall down! So we had to rig it for destruction and use it for other aspects of the work as well.”
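A CityEngine-style split rule can be pictured with a toy like this (hypothetical, not Dneg’s actual rules): take a zoning volume’s height and stack façade kit parts floor by floor.

```python
import random

def skin_volume(height, kit, floor_height=4.0, seed=None):
    """Skin a downtown building volume with facade kit parts, one module
    per floor. `kit` is a list of panel asset names."""
    rng = random.Random(seed)
    floors = max(1, int(height // floor_height))
    return [rng.choice(kit) for _ in range(floors)]

# e.g. skin_volume(120.0, ["glass_panel", "brick_panel", "ledge_panel"])
```

Because every building is assembled from a known, limited set of kit parts, each part can be modeled and rigged for destruction once and reused city-wide.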

For building destruction in particular, the studio re-wrote its asset system to be geared towards dynamic events. An implementation of the Bullet engine inside Houdini – dubbed Bang – became Dneg’s main destruction solver, with a core philosophy of allowing quick iterations with heavy control. “We wanted to be able to run an RBD event and trigger all these secondary events, whether it was glass or dust simulations – all of those things needed to be chained up and handled in a procedural way,” says Wright. “One of the advantages of this was that, because it was all based around a limited number of input components, you can make sure they’re modeled in a way they’re useable in effects – you can model something but there’ll be another stage to rig it for destruction.”
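A toy event chain in the spirit of that description (the event names and fan-out are invented): one primary RBD event procedurally queues its secondary glass and dust sims, which can queue more work in turn.

```python
from collections import deque

# Which secondary events each event triggers (illustrative only).
SECONDARIES = {
    "rbd_collapse": ["glass_shatter", "dust_burst"],
    "glass_shatter": ["shard_particles"],
    "dust_burst": [],
    "shard_particles": [],
}

def run_event_chain(root):
    """Breadth-first walk of the event graph; a real system would launch
    a simulation for each event it visits."""
    queue, order = deque([root]), []
    while queue:
        event = queue.popleft()
        order.append(event)
        queue.extend(SECONDARIES[event])
    return order

print(run_event_chain("rbd_collapse"))
# ['rbd_collapse', 'glass_shatter', 'dust_burst', 'shard_particles']
```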

In addition, fire, smoke and water simulation tools were further developed at Dneg. The studio moved from its existing proprietary volume renderer DNB to working in Houdini and rendering in Mantra for elements such as fireball sims. Dneg’s in-house fluids tool Squirt also benefited from new development to handle larger scale sims and interaction for more tightly coupled volumes and particles. Overall, the studio’s rendering pipeline has moved to a more physically based approach in RenderMan.

Within the Metropolis sequence there were numerous other requirements, including attacking and destroying aircraft and, of course, digital representations of Superman and Zod when they fight. One particular element Dneg contributed and also shared with other facilities was Zod’s armour. “There was no practical armour for Zod,” states Wright, “he only wore a mocap suit. We took concept art, came up with a ZBrush sculpt of the armour and could show them turntables of what it would look like during filming.”

Dneg took MPC’s Superman and Zod models and adjusted them for its own pipeline in order to rig, groom hair and adjust shaders. “We also have more of a photogrammetry approach to facial,” says Wright, “so we made the actors sit there again with an eight camera rig – similar to a Light Stage but portable – which gives us polarized photography to reconstruct the facial expressions.”

Zod’s powers come to the fore in his battle with Superman.

Zod and Superman battle amongst buildings, and when they hit each other they tend to generate enormous shockwaves that rip skyscrapers in half. Although much of this was completely digital (some live action was shot in Chicago and then on Vancouver greenscreen soundstages), Wright says Dneg implemented real photography onto its digital doubles wherever possible. “Because you have their performances you engage with it – and your eyes go straight to their faces. If they’re big enough in frame and doing something, you want to use a photograph of them. As soon as you buy that and get what’s going on, you’re more willing to take on board what’s going on with the rest of the frame.”

Adds Wright: “There’s one shot where Zod hits Superman up the side of the building. Superman is hovering above. Zod starts running up the side of the building. This is just before he rips his armour off and is taking in more of the sun’s energy. Superman flies down to hit him and the two of them collide causing that shockwave. DJ and Zack were both really keen to make it feel like two Gods were fighting, and they were at the height of their powers right then.”

Rounding out Man of Steel’s effects

Helping to round out the effects work on the film were companies like Scanline and Look Effects. Here’s how they added crucial shots to the film.

Scanline – tornado and oil rig

Scanline delivered shots of the tornado sequence in which Smallville residents shelter in an underpass from an approaching twister, while Clark Kent’s father Jonathan returns to a vehicle to rescue a pet dog. “For the tornado itself,” explains Scanline visual effects supervisor Chad Wiebe, “we actually came up with a unique methodology by combining a number of individual fluid sims which would be wrapped around the funnel. This allowed us to create a bigger and denser funnel without some of the overhead that would have been generated by trying to do a single sim for the entire funnel. This also allowed us to pick and choose from a library of different sims which gave us greater control over the look and speed of the funnel, the variation of different parts of the funnel, as well as the technical aspects such as density and resolution for some of the more close up shots.”
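The wrapped-sims layout can be pictured with a sketch like this (dimensions invented): lay out transform slots spiralling up the funnel, each slot receiving one cached sim from the library.

```python
import numpy as np

def funnel_slots(height=400.0, turns=6.0, base_radius=15.0,
                 top_radius=120.0, count=48):
    """Positions spiralling up a tornado funnel; each slot would host one
    cached fluid sim, so the funnel is assembled from many small sims
    rather than one unaffordably large one."""
    t = np.linspace(0.0, 1.0, count)
    radius = base_radius + (top_radius - base_radius) * t  # widen upward
    angle = 2.0 * np.pi * turns * t
    x, z = radius * np.cos(angle), radius * np.sin(angle)
    return np.stack([x, height * t, z], axis=1)
```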

See part of the tornado work in this TV spot.

Along with the tornado, artists added ground dust and debris, farm buildings and uprooted trees. “We also had to create digital versions of all vehicles that were shot on set, as well as a number of additional vehicles to suggest a longer line-up of traffic stopped on the freeway,” says Wiebe. “As the sequence progresses most of the vehicles end up getting damaged or destroyed to some degree, so in addition to a typical vehicle rig for the basic motions and wind buffeting, we also created a system where we could dynamically damage the vehicles based on collisions with one another or based on forces, as was the case with the Kent truck which gets destroyed at the end of the sequence.”
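Dynamic damage of that sort can be sketched as an impulse-scaled dent (constants and falloff invented, not Scanline’s rig):

```python
import numpy as np

def dent(verts, hit_point, impact_normal, impulse,
         radius=0.6, stiffness=5.0e3):
    """Push vehicle-mesh vertices near `hit_point` along the impact
    normal, scaled by collision impulse with a smooth falloff."""
    dist = np.linalg.norm(verts - hit_point, axis=1)
    falloff = np.clip(1.0 - dist / radius, 0.0, 1.0) ** 2
    depth = impulse / stiffness              # bigger hits dent deeper
    return verts + impact_normal * (falloff * depth)[:, None]
```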

An earlier sequence of Clark saving workers from a burning oil rig made use of reference of the BP Deepwater Horizon explosion and the Toronto Sunrise Propane factory explosion. “We tried to make sure we were as accurate as possible regarding the look of the fire and smoke plumes that are generated by oil fires, which have a very unique and identifiable quality,” notes Wiebe. “The exterior plates were shot with the actors on a set-built helipad, with a real helicopter, and green screen surrounding 3/4 of the set. From there we created an entirely digital oil rig, and we would composite the actors and helipad onto our digital rig, or at times we would replace the helipad as well. Many of the hero helicopter shots also utilized a digital version of the helicopter in order to get the interactive lighting and reflections matching.”

“The oil rig collapse was a series of rigid body simulations created using Thinking Particles,” says Wiebe. “From there we would also add fire, smoke and dust trailing off the rig using Flowline, which was also used for the fluid sim when the rig came crashing down into the ocean below. There were also a series of explosions happening throughout the sequence, again using Thinking Particles for the RBDs and Flowline for the fire and smoke.”

Look Effects – the bus crash

In the film, Clark remembers key moments from his youth, including those that gave hints to himself – and others – of his tremendous powers. One is the crash of a school bus full of children. After it blows a tire and launches off a bridge into a river, Clark dives out of the bus and pushes it to the bank, and then rescues another child from under the water. Much of the crash was filmed practically on a bridge and quarry location, and then on a tank stage. Look Effects helped piece the scene together.

Some of the work included rig and camera removal and also clean-up of the bridge railing. “There was a POV camera angle from the bridge looking down at the bus as it was sinking into the water,” notes Look visual effects supervisor Max Ivins. “The bridge part was shot separately from the sinking shot, and the exterior sinking was shot in a rock quarry so there was no moving water like the river. We did some CG replacements of the vents on top of the bus and we had to make the remnants of the splash when someone runs up to look at the bus over the edge. We added the foam ring and the bubbles coming up, and used stock footage and CG elements to make a post-splash surface of the water.”

For interior shots of the bus with the children, Look altered water levels to make the danger appear more prominent. “They had a surface outside of the bus that was basically the same as the inside of the bus – they couldn’t really sink it because there are kids involved,” explains Ivins. “So they had to make it look like the outside water was 2 feet higher than the inside water that is rushing in, to give it that sinking feeling. So we did whole simulations and cleaned up some of the lighting. We made bubbles coming up and made it turbulent.”

Look’s other contributions to the film included several monitor comps, including ones at NORAD, and some artefact clean-up for a time-ramped signature flashback shot of Clark wearing a cape, hands on his hips, in front of some blowing dandelion heads.

All images and clips copyright 2013 Warner Bros. Pictures.
