Ender’s Game: how DD made zero-g

Among the dazzling sequences in Gavin Hood’s Ender’s Game are several zero gravity training sessions in an above-Earth ‘Battle Room’, featuring visual effects by Digital Domain. The sessions are designed to prepare gifted child Ender Wiggin and his fellow cadets to orchestrate battles against an alien race known as the Formics. Director Hood worked in concert with his previs, stunt co-ordination and visual effects teams to replicate zero-g conditions as accurately as possible. “A huge amount with the actors had to be corrected in CG,” acknowledges Hood. “Not close-ups, but a lot of the physical body movements had to be adjusted so that it would do what it had to do perfectly.”

Above: watch a breakdown of Digital Domain’s zero-g effects for Ender’s Game, thanks to our media partners at WIRED.

Find out more below about the process the filmmakers followed to create the zero-g effects, including why flying rigs were not always the answer, and why completely CG characters weren’t either.

Step 1: Planning and previs

Before any shooting occurred, Hood visualized the Battle Room sequences in script, storyboard and previs form (using “rather bad versions of space children,” he told fxguide). This was crucial because it allowed the director to block the scenes virtually and then shoot the complicated greenscreen work as efficiently as possible later on. “The key to these things is really good prep,” he says.

A scene from the zero-G Battle Room.

Hood consulted with all the key department heads even while carrying out previs, including production designer Ben Proctor, stunt co-ordinator Garrett Warren and visual effects supervisor Matthew E. Butler. The director would then show the previs to the actors, who, before shooting, went through a six-week training course working with members of Cirque du Soleil to tailor the balletic and fluid movements required for space.

Step 2: Shooting

Production filmed the Battle Room sequences on a greenscreen stage in New Orleans (DOP Donald McAlpine, ASC shot on the RED EPIC). Armed with the previs, Garrett Warren devised a series of rigs that would be used to shoot the actors in ‘zero-g’. However, the filmmakers always intended to use multiple flying methods depending on whether the shots were wide or close-up, and CG augmentation and complete body replacement were known weapons in their arsenal before shooting began.


Original plate.
Final shot.

“We came up with a plan,” outlines Butler, “which was a hybrid marriage between elaborate rig stuntwork coupled with computer generated augmentations or generations. Either we took content that we photographed and then manipulated it or we created it synthetically from scratch.”

The zero-g rigs ranged from dual-axis ring rotators and a doggie-cam setup for spinning shots to puppeteered wire rigs, bicycle seats and ‘tuning forks’. For close-ups, the actors could often be shot in very simple ways. “For a lovely shot of Petra pushing off against Ender – just of her face – we laid down some apple boxes and pushed a camera in,” says Butler.

As well as zero-g shots, DD created enormous Formic battle scenes.

Hood notes that during shooting they would often carry out a specific gag for reference in the knowledge that it might well be digitally replaced later. “We had Garrett do a full rig of all those kids flying across the room in the final battle scene, for instance,” he says. “Obviously that was extremely hard to do, and we shot portions of it so that our animators would have a visual reference. That entire formation, aside from the close-up work, is full-CG. But by having the kids and stunt performers we had a visual performance.”

Step 3: Actor augmentation and replacement

Certainly, a large-scale visual effects effort was necessary for the zero-g battle location – low-Earth orbit – as well as the battles themselves, which contained hundreds of thousands of ships and massive destruction and action. But, as mentioned, the cadets too came in for their own effects manipulation, partly because of the rules of gravity that cannot be escaped in an on-Earth shooting environment.


Original plate.
Final shot. Note how the body has been completed in CG.

“It’s incredibly difficult because you’re constantly fighting gravity,” explains Butler. “If somebody moves like a pendulum, they swing back. So your pivot point, unless you’ve got extra thrust, is always going to be at your center of mass. Imagine I’m floating out in space and I go to touch my toes. As I touch my toes my arms go forward, I hinge at my waist, I touch my toes but my bum will move backwards as my feet move forwards, such that my center of mass will not move. That’s the giveaway of wire-rigs – they are wonderful toys but they are pinned as harnesses on a fixed point.”
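
Butler’s toe-touch example comes down to conservation of the centre of mass: with no external force, the mass-weighted average position of the body cannot move, however the limbs fold. The tiny Python sketch below uses invented segment masses purely to illustrate the point (it is not anything from DD’s pipeline): the hips drift back exactly enough to cancel the feet moving forward.

```python
# Toy illustration of Butler's point: with no external force, folding at the
# waist sends the feet forward and the hips backward, and the centre of mass
# along the fore-aft axis does not move. Segment masses are invented.
import numpy as np

masses = np.array([45.0, 30.0])    # hypothetical upper-body and leg masses (kg)
straight = np.array([0.0, 0.0])    # fore-aft positions of the two segments (m)

legs_forward = 0.5                 # the feet reach 0.5 m forward to touch the toes
hips_back = -(masses[1] * legs_forward) / masses[0]   # compensating backward drift
folded = np.array([hips_back, legs_forward])

com = lambda pos: np.dot(masses, pos) / masses.sum()
print(com(straight), com(folded))  # both 0.0, the behaviour a fixed wire rig cannot fake
```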

For Digital Domain, that mostly meant retaining facial performances and then replacing entire body movement. DD wrote tools that computed the center of mass for the characters to ensure animation looked right. “It knows that the center of mass is moving along in a linear trajectory and computed the difference and then effectively stabilized it in a 3D world prior to rendering,” says Butler.
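
As a rough illustration of the kind of correction Butler describes (not DD’s actual tool, and with an assumed joints/masses data layout), the sketch below computes the character’s per-frame centre of mass, fits the straight-line trajectory it should follow in zero-g, and offsets each frame so the corrected centre of mass lands on that line.

```python
# Minimal sketch of centre-of-mass stabilization for a zero-g shot.
# Assumptions: 'joints' is a (frames, num_joints, 3) array of world-space joint
# positions and 'masses' is a (num_joints,) array of per-joint mass estimates.
import numpy as np

def stabilize_com(joints, masses):
    weights = masses / masses.sum()
    com = np.einsum('fjk,j->fk', joints, weights)    # per-frame centre of mass

    # Fit a straight line com(t) = a*t + b per axis (least squares over frames),
    # i.e. the ballistic path the centre of mass should follow in zero-g.
    t = np.arange(len(com), dtype=float)
    A = np.stack([t, np.ones_like(t)], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, com, rcond=None)
    target = A @ coeffs                              # (frames, 3) ideal trajectory

    # Shift every joint in each frame by the difference, so the corrected
    # centre of mass moves linearly even if the wire-rigged plate did not.
    return joints + (target - com)[:, None, :]
```

A production tool would of course work on the animation rig rather than raw joint positions, but the principle is the same: measure the centre of mass, compare it to the ideal linear path, and remove the difference.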

In the Battle Room zero-g environment.

Full CG performances were also created, typically for wider shots and extreme action, but as a general rule not for moments when the actors were delivering dialogue. We asked DD, a company with a history of realizing fully digital actors in films such as Benjamin Button and TRON: Legacy, why they didn’t consider the same approach for Ender’s Game.

“The answer is money and time,” says Butler. “Yes, we pride ourselves on doing humans as synthetic performances, but it’s incredibly hard to do. If you want Asa Butterfield looking at camera and delivering a performance, and you can shoot him, you’re kind of silly not to. That’s why we did a hybrid approach. We did have him as a fully synthetic guy many a time, but when he’s really doing a clear performance we chose the performance of a human. This is not a movie about showing off a CG face. In Benjamin Button, it was the only way to do that. My point of view is, if you can shoot it, why not?”

To help create their digi-doubles, the actors were still scanned in USC ICT’s Light Stage. “We used it to get the textures and physical model and it was a lot less about performance capture,” says Digital Domain CG supervisor Hanzhi Tang. “We got some basic FACS shapes here,” adds Digital Domain associate VFX supervisor/DFX supervisor Dave Hodgins. “But because we had a very limited performance we only had to create about 30 shapes for the eye motion and the basic emotions.”
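
For context, a FACS-style blendshape setup of the kind Hodgins describes is typically evaluated as the neutral scan plus a weighted sum of per-shape vertex offsets, so with only around 30 shapes a facial pose reduces to roughly 30 animated weight curves. The sketch below is a generic illustration with placeholder data, not DD’s facial pipeline.

```python
# Generic FACS-style blendshape evaluation: final vertices = neutral mesh plus
# a weighted sum of per-shape offsets. All data here is placeholder.
import numpy as np

def evaluate_blendshapes(neutral, shape_deltas, weights):
    # neutral: (verts, 3); shape_deltas: (num_shapes, verts, 3); weights: (num_shapes,)
    return neutral + np.einsum('s,svk->vk', weights, shape_deltas)

neutral = np.zeros((5000, 3))                  # stand-in for the Light Stage head scan
deltas = np.random.randn(30, 5000, 3) * 0.001  # stand-in offsets for ~30 FACS shapes
weights = np.zeros(30)
weights[2] = 0.8                               # e.g. drive a single eye-motion shape
posed = evaluate_blendshapes(neutral, deltas, weights)
```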

DD also crafted large-scale Formic battles and destruction effects for Ender’s Game.

Looking back at the zero-g Battle Room shots, Gavin Hood says that the earlier prep work was absolutely crucial to the success of the final imagery. “Previs, previs, previs, edit your previs, edit your previs,” he suggests. “Work with your stunt co-ordinator, work with your visual effects supervisor, talk to everybody as much as possible beforehand so that on the day you can shoot it as fast as possible.”

All images and clips copyright © 2013 Summit Entertainment. All rights reserved.
