G.I. Joe: The Rise of Cobra

With G.I. Joe: The Rise of Cobra, director Stephen Sommers delved into the world of the popular Hasbro action-figure toy line. Bringing the characters and vehicles to life was partly the role of overall visual effects supervisors Boyd Shermis and Greg McMurry and a host of effects vendors. fxguide profiles some of the key effects shots by Digital Domain, CIS Hollywood and CIS Vancouver, Prime Focus VFX, The Moving Picture Company, CafeFX and Framestore.

Digital Domain

Digital Domain handled 320 shots for the film, including the Paris chase sequence and Eiffel Tower destruction, along with various shots of a concussion cannon and an opening convoy sequence, under the visual effects supervision of Bryan Grill.

The Paris sequence was shot mostly in Prague. In it, newly inducted G.I. Joe soldiers Duke (Channing Tatum) and Ripcord (Marlon Wayans) – wearing accelerator suits – chase down the Baroness (Sienna Miller) and Storm Shadow (Lee Byung-hun), who plan to fire a highly destructive nanomite missile capable of eating away at the Eiffel Tower. The accelerator suits were animated by Digital Domain and placed into live action and augmented backgrounds featuring moving cars, trains and general street chaos.

“We basically had to make a lot of the streets ‘Parisian’,” remarked Digital Domain digital effects supervisor Darren Hendler. “We had a whole team doing set survey work and taking images, and we had some LIDAR scans of the environments.” Most of the Paris street scenes were accomplished using real photography projected onto simple geometry, combined with extra 3D, occlusion and reflection passes. CG cars and other vehicles, digital doubles or greenscreened extras were then added into the plates where necessary.

“One of the challenges of these shots was how fast the guys had to be running around in the accelerator suits,” said Hendler. “When we started animating we realised we had to speed up or slow down the plates and do a number of re-times and post camera moves. Sometimes we had to rebuild entire plates to get the camera looking in the right position.”

Practical accelerator suits, constructed by Stan Winston Studio, were made available on set for reference and scanning. Digital Domain relied on 3D scanning company XYZ RGB, who used a combination of projected light and stereo reconstruction in a portable setup to create suit geometry.

To animate the suits, Digital Domain animation supervisor Bernd Angerer completed some early tests and determined that motion capture run cycles would be necessary. “We developed a certain type of run for the accelerator suits,” said Angerer. “It was tricky because if you make someone run faster than a human can physically run it can look very cartoony very quickly. Instead of speeding up the run cycle to make them run faster, though, we left the actual run at the same speed but gained more distance by increasing the air time in between each step. That’s probably what an accelerator suit – if it really existed – would do. It wouldn’t really speed up your moves, it would enhance your moves.”
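Angerer's principle – keep the limb cycle at its natural rate, but gain ground through a longer, faster airborne phase – can be sketched numerically. The function and figures below are a hypothetical illustration, not Digital Domain's rigging code:

```python
# Hypothetical sketch of the "accelerator suit" run idea: the stride rate
# stays human, but the airborne phase of each stride is longer and covers
# more ground, so overall speed rises without the limbs cycling faster.

def suit_speed(v_run, contact_s, air_s, flight_boost):
    """Average horizontal speed over one stride.

    v_run        : normal run speed during ground contact (m/s, assumed)
    contact_s    : duration of the contact phase per stride (s)
    air_s        : duration of the airborne phase per stride (s)
    flight_boost : multiplier on horizontal speed while airborne
    """
    distance = v_run * contact_s + v_run * flight_boost * air_s
    return distance / (contact_s + air_s)

normal = suit_speed(6.0, 0.25, 0.15, 1.0)   # ordinary run: 6.0 m/s
boosted = suit_speed(6.0, 0.25, 0.45, 2.0)  # longer, faster flight phase
```

With the longer air time the stride rate actually drops (from 2.5 to about 1.4 strides per second) while average speed climbs – the suit "enhances your moves" rather than speeding them up, as Angerer puts it.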

Animation was mostly completed by hand, with some falls and other exaggerated movements performed on the motion capture stage. “The general approach was to stay in reality as much as we could,” said Angerer. “We wanted to stay in the world of physics and mechanics. If there was anything we could perform with an actor, we would.”

In one all-virtual shot, the two characters run down a street away from two missiles. As they duck and weave over cars, the shot plays out in slow-motion, complete with practical explosions layered into the scene. “Our animation and rigging for the characters was done in Maya,” said Hendler, “with effects work in Houdini, and rendering a combination of Maya through MTOR to RenderMan for the characters, with effects rendered using Mantra and our own volumetric rendering system, Storm.” Digital Domain then composited the shots in Nuke.

Duke and Ripcord are unsuccessful at stopping a nanomite missile from being fired at the Eiffel Tower. When it hits, the nanomites begin to eat away at the metal tower, causing it to crumple into the Seine. “Just building the tower itself,” noted Hendler, “was extremely complex and involved an incredible level of detail. We actually had the full set of plans from when the Eiffel Tower was originally built. The final model was so heavy we could never actually load it into a single file. We could only work on separate pieces of it until the very end.”

“At some point,” recalled Digital Domain computer graphics supervisor Kalaitzidis, “we noticed little artifacts on the model and we zoomed in and in and found out the modeler made planters with trees everywhere, and even telescopes!”

To start the buckling process, an animator would create some general movement through key-framing in Maya to see how the shot was looking. “It worked well, but there were a lot of things the animators weren’t getting for free,” said Kalaitzidis. “They would shake it and when it came down by a level it wouldn’t crunch at all.” The solution lay in running a cloth simulation in Houdini to get the feeling of elastic deformation on the actual Tower geometry. Digital Domain tested the sim first on a tin can model, then applied it to the girders of the Tower. “That’s when things got complicated,” said Hendler. “We would have to run dynamics for having pieces drop off – and then we still had to add the nanomites.”

The green nanomites and the associated dust and debris were achieved via a particle system and rendered using Storm, a renderer developed and used at Digital Domain since the late 1990s and still under continued development. “It was a fun effects task to work out what the nanomites would do once hitting the Tower,” said Kalaitzidis. The nanomites eat away at the surface of the metal to reveal highly detailed structure underneath. “We had to have really dense geometry to get nice edges and broken off pieces,” noted Hendler. “Then, if the nanomites were eating away at a girder, at some point the girder was not being held up by anything and would start falling away dynamically. Times that by 400,000 girders and it started to get very tricky.”

An earlier convoy sequence, in which a small military unit carrying nanomite warheads is attacked by Cobra, featured substantial Digital Domain effects work. “Most of the weaponry in that sequence uses a concussion gun, which is also in other parts of the film,” said Kalaitzidis. “The effect of the gun wasn’t just about the look of the concussion – although for that we did use fluid sims and solvers, gas animation, electricity, velocity vectors and vorticity – it was more about the effect the gun had on everything around it. The ground. The trees. We used Houdini again and Storm to build up a look and also added in practical elements for debris from our library.”

One particular shot in the convoy sequence involves an Apache helicopter being taken out by a concussion blast. “To make the helicopter explode,” explained Kalaitzidis, “we ripped it apart with blend shapes then added a lot of particles and simulations in Houdini, rendering it out in RenderMan. I really like the night-time lighting of that scene; there were reds and greens and even purples.”


CIS Hollywood and CIS Vancouver

CIS Hollywood and CIS Vancouver worked on 263 G.I. Joe shots, predominantly the Joes’ underground base known as ‘The Pit’. The base sequence starts with the approach of a Howler aircraft between some pyramids and up to a large sand-covered door. “The approach to the base was completely digital,” explained CIS Hollywood visual effects supervisor Bryan Hoita. “Production shot some Vista Vision sand dune plates in California which we replicated using photogrammetry techniques. We could then re-project those textures to do a virtual shot showing the Howler aircraft entering the door that leads to an enormous landing platform.”

The platform – intended to be a mile and a half long and half a mile wide – included numerous Joe vehicles and personnel. CIS matched a live action plate of the Joes on a cage set to a CG Howler and surrounding digital platform environment. “The challenge was how to light something that big and still make it hold up and look interesting,” said Hoita. “We needed a number of localised light sources for the landing area so we introduced global illumination calculations into all of that environment.” Computer graphics elements for the shot were modelled in a mixture of ZBrush and Maya, with smoke and other elements rendered in Maya and FumeFX, a 3ds max plug-in. The Joes’ tour of the Pit also reveals a vast aquatic training zone created by CIS and an urban combat zone.

The Baroness, Storm Shadow and Zartan (Arnold Vosloo) attack the Pit in search of nanomite warheads. CIS created their subterranean approach using volumetric fluid dynamic simulations as they surprise a group of camels and a herder. “We called that the Tremors or Bugs Bunny gag as the characters hit the base with their mo-pods,” said Hoita. A small set was built for shots of the mo-pods piercing the Pit walls, which CIS augmented with a virtual environment. A massive fight ensues, with further digital environments, pulse weapons and a camouflage suit all part of the effects work.

“The idea of the camo-suit was that it was made of cameras re-projecting what was behind you to give a pseudo invisibility,” explained Hoita. “They shot a stunt actor in a suit with tracking markers and reference using HD cameras. We used the film camera and HD cameras to basically create some motion capture data that drove a CG character. Then we replaced the stunt actor with our CG character and used velocity vectors on the surface of the suit to tell it where it was moving. If it was moving quickly, we gave it a bit of a delay. Also, the suit flickers when it comes on as if thousands of cameras are registering all at once. Finally, we added glitches to the suit to break up the projection lag look to make it seem like it wasn’t a hundred per cent developed yet.” Compositing was completed using a combination of Shake and Flame.
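The velocity-driven projection lag Hoita describes can be sketched as a lookup into cached background frames. This is a hypothetical illustration only – the function, frame storage and lag units are assumptions, not CIS pipeline code:

```python
# Hypothetical sketch of the camo-suit re-projection: each point on the
# suit shows the background, but delayed in proportion to how fast that
# point is moving on screen.

def camo_sample(background_frames, frame, row, col, speed, lag_per_speed=2.0):
    """Sample a delayed background frame for one point on the suit.

    background_frames : list of 2D images (oldest first)
    frame             : current frame index
    speed             : screen-space speed of the suit point (pixels/frame)
    lag_per_speed     : frames of lag per unit of speed (assumed constant)
    """
    lag = int(speed * lag_per_speed)   # faster movement -> older frame
    src = max(0, frame - lag)          # clamp at the first cached frame
    return background_frames[src][row][col]
```

The switch-on flicker and the deliberate glitches Hoita mentions would then be layered on top of this delayed projection in compositing.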

CIS also completed shots comprising Cobra’s Arctic ice hangar and the Joes arriving in a submarine. “That shot starts off with a polar bear digging around in the snow,” said Hoita. “The bear was created by MPC who gave us a bunch of layers to work with.” The bear then runs off as the submarine breaches the ice, a combination of Maya dynamics and RealFlow creating the water and spray. Further shots in the Arctic sequence involved the Joes entering a missile silo – achieved using digital characters complete with simulated furry collars – and shots of missiles being fired. A fight between Storm Shadow and Snake Eyes (Ray Park) in an energy shaft also required digital doubles and compositing of arcing laser beams using Flame.

Prime Focus VFX

Prime Focus VFX (formerly Frantic Films VFX) produced 124 shots, including the final aerial sequence. The studio also contributed significant previs to the film and look development for the nanomite technology.

The aerial sequence, in which Ripcord pursues nanomite missiles aimed at Washington D.C. in a Night Raven plane, was executed mostly in the digital realm. “Ninety-five percent of that sequence was synthetic,” said Prime Focus visual effects supervisor Chris Harvey. “We had to create the Night Raven, the environment it flies through, including cloudscapes and the ground, plus show the plane’s nanomite disintegration.”

All this occurred while the plane was travelling at tremendous speed. “Stephen Sommers kept saying ‘Mach 11, Mach 11!’”, recalled Harvey. “We had to create so much geometry because of the distance that thing travelled.” For views of the clouds and of the Potomac River, artists rendered fluid sims and combined these with matte paintings. “We could cache out fluid sim clouds and artistically place them or sculpt them into the environment,” said Harvey. Shots near the White House were created from thousands of stills gathered using a stereo rig. The environments were generally assembled in 3ds max and composited in Fusion.

Nanomites from one of the missiles begin eating into the Night Raven as it moves at supersonic pace. “They were supposed to act as an intelligent swarm,” said Harvey. “They had a threatening, evil viciousness about them that would consume anything they touched. We did research into magnetism and crystal growth to get the right look for the nanomites.”

Prime Focus used its proprietary volumetric particle renderer, Krakatoa, to help create the nanomites and the plane’s disintegration. “Rather than rely on texture maps to make it look like something was eating away at the plane,” added Harvey, “we converted all our surfaces into full 3D geometry with volume so you would see metal being eaten away with internal structures and thickness.”

“To simulate the destruction,” continued Harvey, “we would invert the motion on the shot so the jet was stationary and all the cameras contained the motion. Then we would fake all the speed stuff in the particle system with wind and turbulence. It was the only way we could wrangle control and art direct the particles.” A single shot of the jet’s destruction often generated two to three terabytes of particle data.
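The rest-frame trick Harvey describes – a stationary jet, with apparent speed faked by a wind force in the particle system – can be sketched as a minimal particle step. This is an illustrative toy, not Prime Focus code; the wind and turbulence values are assumed:

```python
import random

# Hypothetical sketch of simulating disintegration debris in the jet's
# rest frame: the jet sits at the origin, the camera carries the shot's
# motion, and a strong "slipstream" wind plus turbulence sells the speed.

def step_particles(particles, dt, wind=(-300.0, 0.0, 0.0), turb=5.0):
    """Advance particles one step (mutated in place).

    particles : list of [position, velocity] pairs, each a 3-element list
    wind      : constant force opposing the jet's travel direction (assumed)
    turb      : magnitude of random turbulence jitter (assumed)
    """
    for pos, vel in particles:
        for axis in range(3):
            jitter = random.uniform(-turb, turb)
            vel[axis] += (wind[axis] + jitter) * dt
            pos[axis] += vel[axis] * dt
```

Because the wind dominates the jitter, shed debris streams backwards past the stationary jet, which keeps the particles easy to art direct.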


The Moving Picture Company

The climactic underwater battle sequence was the responsibility of MPC, which completed 176 shots for the film. Working from concept paintings and previs provided to the studio, MPC built an entire 3D underwater base for the sequence, adding ships, attack subs, Cobra fighters, missiles and explosions.

Finding a balance between believability and style proved challenging. “From the beginning,” recalled MPC visual effects supervisor Greg Butler, “we knew we were dealing with slightly contradicting goals. The underwater look had to be believable enough for the audience to be kept in the film, but the battle we would be showing could never really be filmed. Given the clearest water that exists in the world’s oceans, half-kilometre visibility is about the maximum that’s possible.”

“But many of our bigger battle shots required an average of one to two kilometre visibility. Because of this, we were constantly manipulating the amount of underwater look effects that were applied to shots. When the action was staged close to camera on wide lenses, the visibility almost reached realistic levels. It’s a common thing in VFX these days. We’re asked to show something never seen before, but in a way that can be called photo-real. So you grab on to what real reference you can and then stretch it as far as you can hoping you don’t break it.”

Limited live action elements of actors in cockpits were tracked into CG ships for the sequence, with everything else entirely synthetic. “The underwater look was achieved in a number of ways,” said Butler. “Colour and value falloff based on distance from the virtual water surface (which was our main light source), colour and value falloff based on distance from camera, light scatter (highlight blooming) and most importantly, lots of fx bubbles and marine snow (stuff always floating around in the water). Our compositing supervisor, Stuart Lashley, worked with Mo Sobhy, Anders Langlands and Fabio Zangla, our shading and lighting supervisors, to develop a methodology for rendering 3D passes in such a way that the underwater effects could be dialed in primarily as a compositing process.”
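The two falloffs Butler lists – distance from camera and distance from the light-giving water surface – amount to a per-sample grade. The sketch below is a hypothetical approximation of that kind of comp operation, not MPC's actual setup; the water colour and the visibility figure (within the one-to-two-kilometre range the shots used) are assumptions:

```python
import math

# Hypothetical sketch of a depth-cued underwater grade: fade each rendered
# sample toward the water colour with camera distance, and darken it with
# depth below the water surface (the main light source).

WATER_COLOUR = (0.05, 0.25, 0.30)   # assumed murky blue-green

def underwater(rgb, cam_dist_m, depth_m, visibility_m=1500.0):
    """Apply distance and depth falloff to one RGB sample."""
    fog = 1.0 - math.exp(-cam_dist_m / visibility_m)   # camera falloff
    light = math.exp(-depth_m / visibility_m)          # surface-light falloff
    return tuple(
        (c * light) * (1.0 - fog) + w * fog
        for c, w in zip(rgb, WATER_COLOUR)
    )
```

Dialing `visibility_m` per shot is what lets close, wide-lens action read near-realistically while the big battle vistas stretch visibility well past anything real water allows.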

Underwater explosions were created in Flowline from Scanline. “I had used Flowline to create water and fire effects on Harry Potter and the Order of the Phoenix,” said Butler, “so I knew its solvers could be manipulated to work in non-standard gravity situations. The FX department, led by Nigel Ankers, created a small library of pre-rendered Flowline explosions that compositors could track into some shots, so that our FX TDs could spend more time on the closer-to-camera work, especially when these interacted with ships and the base.”


CafeFX

CafeFX contributed 102 shots to G.I. Joe, comprising holograms, facial transformations and Destro’s mask.

CafeFX visual effects supervisor David Ebner created hologram designs in Lightwave before handing them off to another artist for re-engineering in XSI. “The hardest part of the holograms was making characters come in and out of them,” said Ebner. “McCullen (Christopher Eccleston) steps out of a hologram at one point. For that effect, which was photographed on set without motion control, we had to reconstruct the whole background and roto all the other people in the shot.”

Facial transformation shots showcasing the all-consuming nanomites made up most of CafeFX’s work. For a shot of a Viper soldier’s face succumbing to the nanomites, artists created full 3D face animation with blistering skin. “Because of the rating,” noted Ebner, “they didn’t want to see muscles and bones and bloody tissue. Instead they wanted it to look like the nanomites had already eaten the inside of his head. So we had to turn it into a hollow mask and let it deflate. We didn’t want it to look too rubbery so we had the face crumble like ash as it deflates.” The shot was achieved using XSI’s node-based ICE interface.

A subsequent sequence dubbed ‘the screaming man’ featured nanomite tests being performed on unwilling participants. “It’s barely seen in the movie because the shots are integrated into some monitor composites,” said Ebner. “Up on the screen is a shot of the doctor injecting a guy and the guy is writhing in pain as his face starts to blister. We looked at a lot of skin diseases and rashes. Our initial concepts were more skin-peeling type effects where you got this yellowy bone stuff with muscles. In the end we went with something more time-lapse-looking, as if the skin was rotting away. Small things on the face, like chicken pox type rashes, tended to be freakier looking than big facial holes.”

ICE was also used for shots of Zartan being transformed to look like the U.S. President. “We referenced photos of boxers after they’ve been injured in a fight to see what their face and eyes looked like,” said Ebner. “We made some blend shapes of that kind of swelling with textures and had four controllers that interactively blended the shapes. This also created a shockwave between the shapes, so if you had two or more in the same area it would automatically calculate an overlapping blend.”
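The controller-driven swelling can be sketched as standard weighted blend shapes. This is a simplified, hypothetical illustration – CafeFX's ICE graph also computed the 'shockwave' between neighbouring shapes, which is approximated here by plain additive deltas:

```python
# Hypothetical sketch of controller-driven blend shapes: each controller
# weights a per-vertex delta from the neutral face, and deltas that land
# on the same vertices simply sum, so two nearby swells combine.

def blend(neutral, shapes, weights):
    """Combine weighted per-vertex deltas on top of the neutral face.

    neutral : flat list of vertex values
    shapes  : one delta list per blend shape, same length as neutral
    weights : one controller value per shape
    """
    out = list(neutral)
    for shape, w in zip(shapes, weights):
        for i, delta in enumerate(shape):
            out[i] += w * delta    # overlapping deltas accumulate
    return out
```

Driving the four swelling controllers interactively then just means re-evaluating this combination as the weights change.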


Towards the end of the film, McCullen dons a mask and becomes Destro. The mask was realised entirely in CG by CafeFX. “You see this stuff being injected into his face and it starts crawling over his skin, which had just been burned,” said Ebner. “It seems to heal at first, but then gets darker as it’s eaten away with more and more blisters. Then underneath you see this dark metal almost iron-like material starting to build until it becomes a silvery mask.” Again, shots of the mask were done in ICE. A final shot of Cobra Commander and Destro imprisoned on pedestals involved a CG environment created in Lightwave.

Framestore

Framestore was brought on late in production to work on four complicated standalone shots. “Our first shot involved a helicopter taking off from an airstrip,” explained Framestore visual effects supervisor John Thum. “We had to put an Afghanistan background completely behind motion-blurred rotor blades and heat distortion from the helicopter. You also had to see the background through the chopper’s glass.”

The next shot was a night-time matte painting of Paris inserted into a day-time helicopter plate that followed a car driving along a country lane. Framestore artists re-timed the plate, created a Paris matte painting and added CG headlight beams to the car, which was put together as a two-and-a-half-D matte painting in Nuke.


A fully CG environment was created for the inside of a missile silo, featuring digital trucks, digital doubles, scaffolding and other geometry. “That was a tricky shot because we basically had nothing to begin with, other than a concept painting,” noted Thum. Digital missiles in the shot were augmented with 2D liquid nitrogen elements coming down the sides and around the base.

Framestore’s final shot features Cobra Commander and Destro imprisoned inside an aircraft carrier. “It starts off in what looks like a corridor but then pulls back from the characters to reveal an aircraft carrier in the middle of the ocean,” said Thum. The live action plate was re-projected and then as the camera pulls back the shot becomes entirely CG. Artists used Maya and RenderMan to create the carrier, ocean, wake, bow splashes, sky, people on deck and aircraft taking off.