Double Negative has just completed Flyboys. Shot on Panavision's Genesis camera, the film saw Double Negative pull off an astonishing schedule to deliver digital environments, air-to-air combat and period drama for the first World War I aviation film in over 40 years.
For Dean Devlin's new movie Flyboys, Double Negative (DNeg) produced both period drama and vastly complex WWI dogfights. Flyboys was one of the first feature films to be shot on Panavision's new Genesis camera.
In this week's podcast we talk to Alex Wuttke and Jody Johnson from DNeg about the 3D and 2D approaches and solutions to the film. Alex is currently working on The Magic Flute for director Kenneth Branagh. His previous credits include Batman Begins, AVP: Alien vs. Predator and The Chronicles of Riddick. Jody has been at Double Negative since 1999, working across a multitude of shows, starting as a digital compositor and more recently as sequence supervisor and overall VFX supervisor. Jody recently completed work as the Visual Effects Supervisor on Penelope. In the podcast we briefly compare Jody's work on Flyboys with one of his much earlier films, the 2001 WWII classic Enemy at the Gates.
The film is the story of the men of the Lafayette Escadrille, the first American fighter-pilot squadron to see action in World War I. Following their success on films such as Batman Begins and Harry Potter and the Goblet of Fire, Double Negative were appointed by Flyboys' producer Devlin (Independence Day, The Patriot) and the film's director Tony Bill (The Sting) to help put the audience in the cockpit with the young flyers, across more than 740 shots and over 50 minutes of total screen time.
There were six battle sequences in the film. The speed of WWI planes made the dogfights intensely visual, yet such air battles had never been depicted with modern visual effects. Double Negative became involved with the film very early, aiming to ground as much of their visual effects work in reality as possible. The effects were therefore designed with a large live action component, and where that wasn't possible the motion of real planes was captured and used to drive the 3D animation.
Flyboys' director, Tony Bill, a pilot himself, provided the effects team with strong footage of stunts performed in real life by fellow pilots. Using all this material, Dean Devlin, Tony Bill and Double Negative's VFX Supervisor, Peter Chiang, met frequently to review the footage and between them devise the series of events for each battle. Based on this, Double Negative went straight into pre-visualisation, which was used for the live action shoots and acted as a 3D storyboard.
Knowing that the CG animated airplanes would have to blend seamlessly with live action stunt flying, Double Negative, led by Peter Chiang, came up with the idea of capturing real flight movement. An IMU (Inertial Measurement Unit) was attached to a Jungmann stunt plane, which was sent up to fly the same kinds of manoeuvres that would be seen in the film. The IMU records the exact movements of the plane 128 times a second. Once back on the ground, this data was cleaned up and the high frequency noise removed. The data was analysed by DNeg with their IMU importer, created by R&D Supervisor, Oliver James, allowing them to apply real flight characteristics to a CG plane. When a CG plane driven by this data was placed over the original plane in a shot, the movement matched perfectly. Using this technology, DNeg had successfully "motion-captured" the plane's flight over a 12 km wide "virtual stage".
The Double Negative artists were then able to apply real flight characteristics directly to the CG planes. Some of the action captured was amazing, including a death spin, but due to the speed and aerodynamics of the planes the live action did eventually have its limitations. The animators could then embellish the motion, adding touches of drama as required, and the IMU movements could be distilled by lowering or raising the noise frequency, so the animation could have more bob and weave where needed.
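DNeg's IMU importer is proprietary, so the following is only a minimal sketch of the idea described above: a recorded motion channel can be split into a smooth base flight path plus a high-frequency residual, and an animator-facing "slider" can then scale that residual up or down. The sample values, window size and channel name are hypothetical.

```python
# Illustrative sketch (assumed, not DNeg's actual tool): separate a 1-D
# motion channel (e.g. pitch in degrees, sampled at 128 Hz) into a smooth
# base plus high-frequency residual, then scale the residual with a slider.

def moving_average(samples, window):
    """Simple low-pass filter: a centred moving average."""
    half = window // 2
    smoothed = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        smoothed.append(sum(samples[lo:hi]) / (hi - lo))
    return smoothed

def apply_noise_slider(samples, window, slider):
    """slider = 0.0 gives calm flight, 1.0 the recorded turbulence,
    and values above 1.0 exaggerate the bob and weave."""
    base = moving_average(samples, window)
    return [b + slider * (s - b) for s, b in zip(samples, base)]

# Hypothetical pitch channel from an IMU recording:
pitch = [0.0, 1.5, -0.5, 2.0, 0.5, 1.0, -1.0, 0.5]
calm = apply_noise_slider(pitch, window=5, slider=0.0)
wild = apply_noise_slider(pitch, window=5, slider=2.0)
```

A slider value of 1.0 reproduces the recorded motion, which matches the article's description of dialling between turbulent and calm.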
Another piece of software, dnG-Force, was written by Senior R&D Developer, Jeff Clifford, to alert the animators when they were exceeding the aircraft's limits. Wider shots of the flying were handled through animation, while the 3D pilots' performances were motion captured, with that action applied and manipulated via a slider. Mattias Lindahl, Double Negative's Animation Supervisor, recalls, "When we first approached the work it certainly seemed like a daunting task, there were so many shots in a very tight schedule, so the strict pipeline we set up and the R&D tools developed were crucial and they really paid off".
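dnG-Force's internals are not public, but the underlying check it would need to make can be sketched: estimate the acceleration of an animated plane from its position samples by finite differences, and flag frames where the g-load exceeds a structural limit. The frame rate, limit and flight path below are all hypothetical.

```python
# Illustrative sketch (assumed, not dnG-Force itself): flag animation frames
# where the implied g-load on a plane exceeds a structural limit.

G = 9.81  # m/s^2, one g

def flag_over_g(positions, fps, limit_g):
    """positions: list of (x, y, z) in metres, one sample per frame.
    Returns the frame indices where |acceleration| exceeds limit_g * G."""
    dt = 1.0 / fps
    flagged = []
    for i in range(1, len(positions) - 1):
        # A central second difference approximates acceleration.
        acc = [(positions[i - 1][k] - 2 * positions[i][k] + positions[i + 1][k]) / dt**2
               for k in range(3)]
        magnitude = sum(a * a for a in acc) ** 0.5
        if magnitude > limit_g * G:
            flagged.append(i)
    return flagged

# A hypothetical path: steady level flight, then an abrupt vertical jerk.
path = [(t * 50.0, 0.0, 100.0) for t in range(5)] + [(250.0, 0.0, 140.0)]
over = flag_over_g(path, fps=24, limit_g=4.0)  # flags the frame with the jerk
```

In a production tool the warning would presumably appear in the animator's viewport rather than as a list of frame indices, but the physics check is the same.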
Many of the close-ups of the pilots in their planes were shot on set. A practical plane section was built and placed on a gimbal rig for movement. Double Negative's pre-visualisations were used for the gimbal shots to maintain the continuity of flight pattern on the greenscreen set. The pre-vis team knew the constraints of the set and where the plane should be, and were able to use this information to give the cameramen a start, middle and end frame, allowing them to replicate the camera move. In some cases the gimbal's centre of gravity proved to be too low, and in those shots the entire gimbal had to be replaced with a CG version. To do this, the pilots were rotoscoped out of the shot and placed back in with the CG gimbal before being composited with the backgrounds.
The planes were also augmented with CG extensions for the wider shots, and IMU noise was used to add realistic plane movement to the gimbal. The animation rig included a slider fed with a huge queue of IMU data, allowing the animator to dial the frequency up or down, from turbulent to quite calm. Jody Johnson, who was co-2D Supervisor with Charlie Noble, notes, "The hardest part was taking the greenscreen gimbal elements and making them work within our arena, making them look like they were really in the air". To achieve this the teams executed a multi-pronged attack: adjusting the camera move to a more handheld feel, adding more action to make the elements work within the violent battles, and using 2D lighting tricks to reinforce the exterior nature of the shots, manipulating all the material to create cohesive and dynamic battles.
While all this work was taking place, another team of DNeg professionals set about modelling the aircraft for the wider shots. Led by co-CG Supervisor, Rick Leary, the team started with research, examining books, pictures and historical footage for clues. All of the planes were built as CG models, including Nieuport 17s, Fokker Dr.1s, Gotha IVs and a Handley Page Type O/400. DNeg also took additional steps to incorporate as much realism as possible into the shots by capturing the reflectance characteristics of real Nieuports. Gaining access to replica planes built by enthusiasts, the team photographed every inch of these aircraft to get accurate examples of dirt, light reflection and other minute detail.
The team used a technique known as BRDF (Bidirectional Reflectance Distribution Function) solving, which entailed procuring a piece of the doped fabric from the plane, wrapping it around a cylinder and capturing its reflectance properties using calibrated flash photography from various angles. These images were then run through proprietary software that outputs data to surface shaders. The net result is that as the planes move through varying lighting conditions and turn and spin past camera, the highlights and sheen mirror their live action counterparts seamlessly. As the battles proceed, many of the planes are destroyed or damaged; DNeg also had to replicate the way the structure would break up, demanding an intimate understanding of that structure to be able to reproduce the cloth dynamics.
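DNeg's BRDF solver is proprietary, but the core idea of fitting an analytic reflectance model to calibrated photographs can be illustrated with a toy example: here a Phong-style specular exponent is grid-searched against reflectance samples measured at several angles off the mirror direction. The angles, samples and model choice are all assumptions for illustration.

```python
import math

# Illustrative sketch (assumed, not DNeg's solver): fit a Phong-style
# specular exponent to reflectance samples, as might be measured from
# calibrated flash photos of doped fabric wrapped around a cylinder.

def phong_lobe(cos_angle, exponent):
    """Specular falloff as the view moves off the mirror direction."""
    return max(0.0, cos_angle) ** exponent

def fit_exponent(angles_deg, measured, candidates=range(1, 201)):
    """Grid-search the exponent minimising squared error to the samples."""
    best_n, best_err = None, float("inf")
    for n in candidates:
        err = sum((phong_lobe(math.cos(math.radians(a)), n) - m) ** 2
                  for a, m in zip(angles_deg, measured))
        if err < best_err:
            best_n, best_err = n, err
    return best_n

# Hypothetical measurements, generated here from an exponent-20 lobe:
angles = [0, 5, 10, 15, 20, 30]
samples = [phong_lobe(math.cos(math.radians(a)), 20) for a in angles]
recovered = fit_exponent(angles, samples)
```

A production solver would fit a full multi-lobe BRDF per colour channel and bake the result into shader parameters; the grid search above only shows the measurement-to-model step in miniature.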
The final battle scene sees the Lafayette Escadrille challenging the onslaught of the German air force, whose approach is led by a troop of Zeppelins, and for this scene Double Negative were also set the task of creating a CG Zeppelin. Again this involved meticulous research into the Zeppelins and the materials used to make them, which were surprisingly hard and rigid. A 30ft practical model was used for the final explosion, as the team wanted it to take on a realistically organic feel. In addition, a special 'skin' section had to be built that allowed the crew to stage an even bigger pyrotechnic at the end, all resulting in a particularly thrilling sequence, which sees the Zeppelin's gunner pointlessly trying to run for safety from the ensuing flames.
The backgrounds for the aerial shots were also crucial, both for aesthetic and practical reasons. DNeg, led by co-CG Supervisor, Alex Wuttke, set about creating a 360-degree hero environment made up of 2D cards, matte paintings and 3D clouds and trees. The team procured a dataset of high-resolution aerial photography from a company called Aerofilms; covering a distance of around 12 km, this would form the basis for the ground the planes would be travelling over. Because of the obviously heavy loads that such a huge dataset would introduce into the paint and render pipeline, DNeg's Senior R&D Developer, Jon Stroud, devised a system to efficiently manage high resolution terrain models and textures, both offline and online at render time. Called simply Tecto, it presented a front end to artists, allowing them to view the whole landscape in one go and zoom in on particular areas. The artist could then drag a region over several tiles and export them for tasks such as painting out towns or other modern features. Once happy, the artist could republish their work back to the database.
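Tecto's internals are not public, but the workflow described above, terrain textures stored as a grid of tiles, a dragged region exported for paint work, and edits republished, can be sketched in miniature. The tile size, naming scheme and class shape below are all assumptions.

```python
# Illustrative sketch (assumed, not Tecto itself): a tiled terrain store
# where artists export the tiles under a dragged region and republish edits.

TILE_SIZE = 1000.0  # metres covered by one tile (assumed)

class TerrainDB:
    def __init__(self):
        self.tiles = {}  # (row, col) -> tile payload, e.g. a texture path

    def publish(self, row, col, payload):
        self.tiles[(row, col)] = payload

    def export_region(self, x0, y0, x1, y1):
        """Return the tile keys covering a rectangle in world metres,
        e.g. a region an artist dragged out to paint over."""
        c0, c1 = int(x0 // TILE_SIZE), int(x1 // TILE_SIZE)
        r0, r1 = int(y0 // TILE_SIZE), int(y1 // TILE_SIZE)
        return [(r, c) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)]

db = TerrainDB()
for r in range(12):
    for c in range(12):                      # roughly a 12 km x 12 km dataset
        db.publish(r, c, f"tile_{r}_{c}.exr")

# An artist drags out a 2.5 km x 1.2 km region; its tiles are exported:
keys = db.export_region(3200.0, 5600.0, 5700.0, 6800.0)
```

The real system would also stream tile mipmaps to the renderer so only the resolution actually visible in frame is loaded; the grid lookup is the common core.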
The database also contained other types of data: CG trees, created with proprietary software known as 'sprouts' (volumetric sprites that render millions of trees in the landscape, developed by Oliver James and R&D Developer, Ian Masters), and DEM (Digital Elevation Model) data describing the terrain's contours. Terrain and all associated data, including trees and hedgerows, for the region within the view of the camera were loaded dynamically and rendered on the fly within the EXR pipeline. Ultimately, a hero environment was created in which the artists were able to fully control the lighting and place the sun and clouds anywhere they needed them. The team were able to augment the environments for a series of places, among them backgrounds for the wide and close shots and extensions to the No Man's Land set.
VFX Supervisor, Chiang, knew that the shots would need clouds to create a sense of travel and as a marker for how fast the planes were flying, but he also knew that the artists would need to be able to control those clouds. In order to introduce a real sense of speed and danger to the dogfights, the decision was made to use clouds as geographical waypoints, helping to introduce a sense of scale to the proceedings and anchoring the battles into a real world continuity. This was important, as most of the time, the choreography of the battles dictated that the camera become unchained from its usual constraints and swung around the scene tracking the action as it unfurled across all three dimensions.
To help the audience maintain a sense of up and down, clouds would be used as static anchors in the shots. This meant that the clouds had to be full volumetric entities. Ian Masters created another new tool, dnCloud, used to model 3D clouds in Maya's viewport, which employed implicit spheres and noise functions to generate exactly the types of clouds that were needed. R&D also created custom volumetric shaders to make the clouds react to lighting in a believable way, incorporating effects such as multiple forward scattering and self-shadowing. Double Negative's proprietary voxel renderer DNB (originally developed by Jeff Clifford for Batman Begins) was used to render them. Once set up, the clouds could be pulled into position within the master battle arenas and rendered on a shot by shot basis.
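dnCloud and DNB are proprietary, but the modelling approach the article names, implicit spheres perturbed by a noise function, can be sketched as a density field that a volume renderer would march through. The falloff shape, noise function and sphere placements below are all assumptions for illustration.

```python
import math

# Illustrative sketch (assumed, not dnCloud): a cloud as a density field
# built from implicit spheres, modulated by noise to break up the shape.

def hash_noise(x, y, z):
    """Cheap deterministic pseudo-noise in [0, 1); a stand-in for real 3D noise."""
    n = math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453
    return n - math.floor(n)

def cloud_density(p, spheres, noise_amp=0.4):
    """Sum of smooth sphere falloffs, perturbed by noise.
    spheres: list of ((cx, cy, cz), radius)."""
    density = 0.0
    for centre, r in spheres:
        d = math.dist(p, centre)
        falloff = max(0.0, 1.0 - (d / r) ** 2)   # 1 at the centre, 0 at the surface
        density += falloff
    return density * (1.0 - noise_amp + noise_amp * hash_noise(*p))

# A small puff built from two overlapping implicit spheres:
puff = [((0.0, 0.0, 0.0), 2.0), ((1.5, 0.5, 0.0), 1.5)]
inside = cloud_density((0.5, 0.0, 0.0), puff)     # non-zero density
outside = cloud_density((10.0, 0.0, 0.0), puff)   # zero density
```

A voxel renderer like DNB would sample such a field onto a grid and integrate lighting through it; the shading effects mentioned above (forward scattering, self-shadowing) live in that rendering step, not in the density model.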
Matte paintings also play a crucial part in the period setting of the film and were used to help create the environments for many of the departure scenes as the young pilots left the United States for France. One of DNeg's digital matte painters, Diccon Alexander, created these beautiful paintings, one example of which is New York harbour, where one of the squadron, the wealthy Lowry, is seen off to war by his disciplinarian father. Using period photographs of New York as reference, the ship, modelled on the Aquitania, and the background are all digital matte painting.
A matte painting was also created of the Gare du Nord station in Paris; the live action was filmed in England, where a historically accurate location was found. To assist with this matte painting, reference stills were shot of the Gare du Nord, and although the Eiffel Tower cannot normally be seen from the famous Parisian station, the team were asked to place it in the background to help identify the setting. Finally, at a more modest train station in Nebraska, where all-American Jenson leaves his family and fiancée for France, the scene was set with a matte painting background of a beautiful prairie wilderness.
During the four month post-production schedule Double Negative also supplied many other auxiliary visual effects for Flyboys, such as CG bullet hits with period correct smoke trails, fire elements, CG propellers, CG battle damage, dynamic camera moves and camera shake.