In the film I Am Legend, Will Smith plays a character who might be the last man left in New York City, or at least the last one unaffected by a cure for cancer that goes horrifically wrong. Jim Berney supervised the visual effects at Sony Pictures Imageworks, and fxguide caught up with him to discuss shutting down one of the world’s busiest cities to take pictures of a stuffed toy deer.
The camera follows the brilliant scientist Robert Neville (Will Smith) as he walks through tall grass, stalking a deer. The camera pans up. He is hunting deer, yes – but hunting in Times Square. It is a Times Square that is 100% digital, at least today, which is of some relief to visual effects supervisor Jim Berney. On other days of the shoot, some of NY’s busiest streets were shut down, and that was when Berney found himself in the middle of a deserted street with thousands of people in off-camera crowds watching him. Most of New York City (NYC) was backed up waiting for him – while he shot visual reference of a stuffed deer.
Of course, visual reference is always highly desirable on a film. But stopping Manhattan to photograph a stuffed deer made Berney slightly more self-conscious than normal – as he explained to fxguide recently when we spoke to him from his home in LA.
Berney joined Sony Pictures Imageworks in 1996 and has served as visual effects and CG supervisor on a number of notable projects. Most recently, he supervised the creation of over 500 shots for the Academy Award nominated film The Chronicles of Narnia: The Lion, the Witch, and the Wardrobe. In 2004, Berney was visual effects supervisor on the IMAX version of the performance capture feature The Polar Express. Prior to that, he was the visual effects supervisor on The Matrix Reloaded, The Matrix Revolutions, The Lord of the Rings: The Two Towers and Harry Potter and the Sorcerer’s Stone. Berney also served as CG supervisor on the 2000 Best Visual Effects Nominee Hollow Man.
The film started with 400 shots but grew to over 800 by the end. The work ran the gamut from building the deserted streets of NYC, to creating full digital actors for the film’s “Infected”, to building the digital animals, which included the deer. “The city is completely empty of people,” Berney says. “But the film was shot on location in New York City. So we had to remove all signs of life, crack the streets, grow weeds, add wildlife, and so forth. In the beginning, the Infected were going to be played by actors, stunt people, and dancers in makeup and costumes.” However, in the end they were fully digital characters that had to be created by the crew at Imageworks.
NYC Empty Environment
Once again poor NYC is under attack. The city has been beaten up in Independence Day, knocked around in at least a few King Kong flicks, blown up in Heroes, stepped on in Godzilla, frozen in The Day After Tomorrow, had rocks thrown at it in Armageddon, and washed away in both Deep Impact and AI. It’s even been overrun by gangs in Escape from NY – while having actually been run by gangs in Gangs of New York. And now, in the case of I Am Legend, NYC ends up with an extremely bad case of the flu.
As most of the NYC population is gone in the film, Imageworks was given the task of producing a desolate city. The brief was not to ruin the entire city as if it had been in a war, but instead to make it look deserted. Director Francis Lawrence wanted as much realism as possible, so the production decided to shut down streets and film in the real Manhattan.
Given the relatively short amount of time the streets were available, the production could not fully dress them. Ruined cars and other trash were brought in, but it was still down to Imageworks to sell the shot and fix the parts of the frame in the distance that may not have been blocked off. Furthermore, with no power, the city’s lights and signs needed to be digitally altered, as did any office lights in the Manhattan skyline.
NYC is completely devastated in the film, and the world that comes to life is jarring and eerie as herds of wild animals run through the city. Every sign of life needed to be removed from every frame. Recognizable landmarks, including Times Square and the FDR Bridge, were flawlessly recreated and ultimately destroyed.
The destruction happens during flashbacks when the military seals off Manhattan after the evacuation. “The military comes through and blows up the Brooklyn Bridge right in your face,” says Berney. To create these shots and others of collapsing bridges, the Imageworks crew first built low-resolution representations of the bridges. They ran simulations that used rigid body dynamics to buckle and deform the girders and crumble sections of the low-resolution bridges. Then, they broke apart detailed versions of the bridges from motion data blocked in with the physics-based simulation.
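The proxy-to-hero workflow described above – simulate cheap low-resolution pieces, then drive the detailed bridge fragments from that motion data – can be sketched roughly as follows. This is a simplified illustration, not Imageworks' actual tools: it applies one proxy piece's per-frame rigid transform (a rotation about the piece's pivot plus a translation) to the vertices of its matching high-resolution fragment.

```python
# Hypothetical sketch of the proxy-to-hero transfer step: the rigid body
# solver animates low-resolution bridge pieces, and each detailed fragment
# simply inherits its proxy piece's per-frame transform.
import math

def rotate_z(point, angle):
    """Rotate a 3D point about the z axis."""
    x, y, z = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y, z)

def apply_proxy_transform(hero_verts, pivot, angle, translation):
    """Move a detailed fragment with its low-res proxy piece:
    rotate about the proxy pivot, then translate."""
    out = []
    for v in hero_verts:
        local = (v[0] - pivot[0], v[1] - pivot[1], v[2] - pivot[2])
        rx, ry, rz = rotate_z(local, angle)
        out.append((rx + pivot[0] + translation[0],
                    ry + pivot[1] + translation[1],
                    rz + pivot[2] + translation[2]))
    return out
```

Because only one transform per proxy piece is stored per frame, a hero fragment with thousands of vertices costs no more to animate than its low-resolution stand-in did to simulate.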
One of the key aspects of the change in the city was the concept of grass growing back and animals returning to the canyon-like, empty NYC. As the grass could not be achieved practically, Imageworks researched and developed a comprehensive grass-and-weeds pipeline to accurately depict what would grow in the world created for the film. Berney commented that while this was a lot of work, the grass actually came in handy for hiding items such as cars with inflated tires, or seam lines between the live action and matte paintings.
“The building crumbling involved more 2D projection than 3D buildings,” says Berney. “The biggest and hardest part was the weed pipeline. We had to track in CG work on every aspect of the frames, not just the edges or the center. All the shots were handheld and the camera was always moving. And, the plates were shot anamorphic. It was very difficult.”
To populate the cracked streets and sidewalks with rampant weeds, the Imageworks crew developed a system that started with a library of tiles ranging in size from 10 feet to 50 feet square. Each tile had its own pattern of grass, which they grew using a hair system from painted cracks and potholes. Layout artists laid these tiles onto match-moved ground planes in Maya. Then, they moved the layout data into a proprietary new plant-growing program written inside Houdini.
Starting from photo reference, look development artists developed algorithms for particular plant types; that is, recipes for growing the plants. Using these Houdini plug-ins, the artists could specify how tall or bushy to make a ragweed plant. Pre-modeled flowers, buds and leaves procedurally layered into the plants added variety. In addition, layers of procedural information could cause various parts of the plants to move as something passed by or as a virtual wind blew through the street.
The artists could place the plants in the grass, paint plants into other areas, or distribute them procedurally. “We also had procedural tricks for changing the density of plants in particular areas,” says Dave Stephens, effects animation supervisor. “We could manipulate the data in any way someone might think of.”
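The "recipe" idea – seed points from painted cracks, per-species growth parameters, and a procedural density control – might look something like the sketch below. All of the names and parameters here are illustrative assumptions, not Imageworks' actual Houdini plug-in API.

```python
# Illustrative sketch of a "plant recipe": seed points (painted cracks)
# plus per-species parameters drive procedurally generated plant instances.
import random

def grow_plants(seed_points, height=1.0, bushiness=3, density=1.0, seed=0):
    """Return plant instances: each has a position, a stem height with
    natural variation, and a count of pre-modeled leaf clusters."""
    rng = random.Random(seed)
    plants = []
    for p in seed_points:
        # density < 1.0 procedurally thins the plants out in this area
        if rng.random() > density:
            continue
        plants.append({
            "position": p,
            "height": height * rng.uniform(0.7, 1.3),
            "leaf_clusters": max(1, bushiness + rng.randint(-1, 1)),
        })
    return plants
```

Under this scheme a layout artist could call, say, `grow_plants(cracks, height=2.0, bushiness=5, density=0.4)` to get sparse, tall ragweed over a painted pothole, with the randomized variation keeping repeated tiles from looking cloned.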
A survey team measured everywhere Smith walked or drove in NYC during filming so that modelers could create simple geometry to match, and lidar scans provided data for higher-resolution geometry. The artists could age and alter some of the environments by removing details – cars, signs, and so forth – and adding others – grime, dirt, cracks – on the live action plates. For other environments, they painted on photographs taken on location, which they projected onto low-resolution geometry.
Around 30 shots had full CG environments, including those at a seaport where Smith’s character runs over infected creatures and in Times Square. Working from shots filmed on a blue screen set in a warehouse, the Imageworks crew created a virtual city around the seaport for part of that sequence. For other shots filmed on location at the seaport, they aged the surrounding buildings by painting on photographs projected onto geometry as they had for other areas in the city.
Berney decided to shoot blue screen rather than green, as a lot of the action took place in a cold blue environment or at night; blue was a more natural colour for the environment, hopefully making any bounce or spill more in keeping with the final look of the film.
Times Square, however, required more work. Modelers built several blocks of high-resolution geometry to extend the set piece – the Duffy Square “island” built inside a warehouse – into the general Times Square area. The team had a lidar scan for details as small as two inches. Artists then projected altered photographs taken at the same time as the lidar scans onto the detailed geometry.
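Projection mapping of this kind works by tracing each geometry point back through the camera that took the photograph to find which pixel should colour it. The sketch below is a deliberately minimal version, assuming an idealized pinhole camera at the origin looking down -z; real production projections account for lens distortion and the anamorphic plates mentioned earlier.

```python
# Minimal pinhole camera projection: map a 3D surface point to the
# (u, v) pixel in the photograph that should texture it.
def project_to_photo(point, focal_length, image_width, image_height):
    """Return the (u, v) pixel coordinate a 3D point maps to, or None
    if the point is behind the camera or outside the photograph."""
    x, y, z = point
    if z >= 0.0:          # camera looks down -z; points behind are skipped
        return None
    # perspective divide, then shift to the image centre
    u = focal_length * x / -z + image_width / 2.0
    v = focal_length * y / -z + image_height / 2.0
    if 0.0 <= u < image_width and 0.0 <= v < image_height:
        return (u, v)
    return None           # point falls outside the photograph
```

Points that return None for one photograph are filled from another projection camera, which is why location photos were shot to cover the lidar geometry from multiple angles.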
Numerous CG animals were created for the film – rats, dogs, butterflies, bugs, birds, deer and lions – which fly and roam through the deserted city. There were both normal and infected animals, such as a pack of infected dogs and infected lab rats. For the birds, the crew used a flocking system for generic movement and hand animation for more specific actions. Animators keyframed the other animals, including a family of lions and a herd of deer.
To create the herd, modelers procedurally generated 15 variants of a male and a female deer by giving them wider bellies, darker noses, and so forth. “We had six sets of antlers for every deer,” says digital effects supervisor Dave Smith. “If we didn’t like one, we could flip a switch and get another one.” The herd of deer appears early in the film, running past Smith as he drives through the empty city in a new red Mustang. All around him, the viewer sees damaged buildings and weedy streets created in CG or altered from live action plates. To provide visual cues for the sounds of birds and insects, the crew also layered swarms of insects into the live action plates. “We had a lot of environmental work,” says Berney. “We shot all over Manhattan.”
Effects were a large component of the show, especially in the flashback sequence when we see the city in the grip of panic. Fire, smoke, debris and dust were all used, both in the flashback sequence and in the various fight and action sequences of the film. All of these add a layer of textured reality to the supernatural world of the film.
The Infected are sensitive to sunlight, so every time they go outside, their skin burns painfully – an effect that required intricate, fully detailed smoke. To enhance and extend the practical pyro, the crew used Maya fluids and techniques developed earlier for the fiery Ghost Rider.
Destroying the creatures’ skin, though, depended on new lighting and rendering techniques. In the film, whenever the Infected are hit with bright light, their skin bubbles and boils, smolders, and emits smoke and steam. “We had techniques for doing that with one or two characters, but we had to apply these effects to more than 40 characters at a time,” says Berney.
To make this possible, the crew first lit the characters and then moved the illumination data into a point cloud file that Houdini read as particles. Because the particles accumulated light during the course of the animation, the lighters could, in effect, burn the skin by shining light on it. And, they could emit smoke from the burned areas.
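The accumulation idea described above can be sketched in a few lines: each skin point, stored as a particle, adds up the illumination it receives every frame, and once its stored energy crosses a threshold it is flagged as burning. The inverse-square falloff and the threshold value below are assumptions for illustration, not the production numbers.

```python
# Hedged sketch of point-cloud light accumulation: particles on the skin
# store incident light over the animation; a point "burns" once its
# accumulated energy crosses a threshold.
def accumulate_light(points, light_pos, intensity, energies, burn_threshold=5.0):
    """Add this frame's incident light (inverse-square falloff) to each
    point's stored energy; return the indices of points now burning."""
    burning = []
    for i, (p, e) in enumerate(zip(points, energies)):
        d2 = sum((a - b) ** 2 for a, b in zip(p, light_pos))
        energies[i] = e + intensity / max(d2, 1e-6)
        if energies[i] >= burn_threshold:
            burning.append(i)   # burned areas can now emit smoke and steam
    return burning
```

Because the burn state lives in the particles rather than in the lights, lighters could effectively "paint" damage onto the skin simply by shining light on it for longer – which is exactly the control the article describes.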
The mutant creatures — called the “Infected” — are a perfect blend of reality and pure imagination, highlighting Imageworks’ ability to create digital characters that can seamlessly integrate with live action characters and are able to hold their own in the center of a scene. The Infected are scary yet strangely beautiful at times, with translucent skin built with two main layers of skin and an anatomically correct bone structure underneath. Because the characters do not talk, their movements must convey every emotion and ring true in every scene. It is their behavior that carries these characters throughout the film.
“We were shooting parkour people doing these crazy stunts, but we weren’t getting the type of behavior Francis and Akiva wanted,” Berney says, referring to director Lawrence and producer Akiva Goldsman. “In December, we decided to go with full CG versions of the infected.” In fact, there are very few actors in the film, and other than Smith, most appear only in flashbacks. The other “stars,” the infected, are always digital, and Imageworks created dozens of them. Forty-three hero creatures to be precise.
While Berney was on location, digital effects supervisor Dave Smith oversaw the CG supervisors and artists at Imageworks who created the creatures and the environments. The crew had already begun working on digital doubles, and was waiting for reference material of the actors in makeup and costumes filmed on location, when word came that all the creatures would be CG. “One of the specifications was that the creatures would have a sick look,” Smith says. “Another was that their skin would be more translucent than a typical human’s, so we would see muscles underneath.”
Modelers worked from cyberscans of several stunt actors filmed on location, and then sculpted six creatures. The creatures look strong enough to have survived the virus, but extremely lean, as if the virus had eaten away some of their soft tissue and created cavities. The modelers first sculpted an alpha male and female. Next, they created stouter and slimmer versions of each to end up with six base models. Lastly, they modified heads, hands and various proportions of those models to produce the 43 hero creatures. “There’s no fat rounding out these characters,” says Smith. “You see muscles and tendons and pulsing veins. They wear tattered clothes and have a sickly look to their surface.”
To develop the creatures’ look, CG supervisors, texture painters, look developers and shader writers used designs created by Patrick Tatopoulos and images of human cadavers as reference. “We analyzed everything and came up with a vocabulary to describe the look,” says John Monos, CG supervisor. “The final design has a component of musculature visible through a semi-transparent, thin membrane of skin. You can see veins and striations of muscles.” Texture painters provided such details as veins, scars, and scabs, which also helped create variation in the 43 hero creatures, and modelers working in ZBrush added displacement and bump maps.
One difficulty with the creatures was in determining how deeply light could shine into the skin. “In some of our early renders, the creatures glowed like light bulbs,” says Monos. “If you were to stick a teapot into a creature’s head, you’d see the shadow of the teapot in the translucency of the skin (in the shadows).” The shader writers devised a solution using bones to control translucency – the closer a bone was to the surface, the less light would transmit into the surface. Texture maps provided additional control over the amount of light transmitted beneath the surface of the creatures’ bodies. In addition, pre-baking the illumination, rather than rendering shadows from a point in space, allowed the lighters to use more sophisticated subsurface calculations for the light moving through the surface, which helped create softer transitions between light and dark.
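The bone-proximity control can be reduced to a small function: translucency falls to zero where a bone sits right under the skin and ramps up with distance, with a painted texture map damping the result further. The linear falloff and the parameter names below are illustrative assumptions, not the actual shader.

```python
# Sketch of bone-driven translucency: the closer the nearest bone is to
# a skin point, the less light is allowed to transmit through it.
def translucency(skin_point, bone_points, max_transmission=1.0,
                 falloff_distance=2.0, texture_scale=1.0):
    """Return a 0..max_transmission translucency value for a skin point:
    zero right on top of a bone, full transmission beyond falloff_distance.
    texture_scale (0..1) is the painted-map override."""
    d = min(
        sum((a - b) ** 2 for a, b in zip(skin_point, bp)) ** 0.5
        for bp in bone_points
    )
    t = min(d / falloff_distance, 1.0)           # 0 at the bone, 1 far away
    return max_transmission * t * texture_scale  # texture map damps further
```

This is why the early "light bulb" renders went away: thick, bony regions like the skull automatically block transmission, while fleshy areas away from bone stay translucent.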
For the animators, riggers built a similar bone structure for all the scanned characters; the rigs changed shape and scaled to match the various body types. Thus, the animators could move more easily from one character to another without needing to learn an entirely new rigging system. David Schaub supervised the animation team. Motion capture by Giant Studios of the stunt actors on location provided a base from which the animators created the hero performances. “We amp’d up the animation for the runs, jumps and stumbles to make them faster and more violent,” Schaub says. “Also, since we were going to go all digital, the director wanted to do something with these creatures that we couldn’t do with real actors.”
For example, exposure to light burns the photosensitive creatures, and the actors on location reacted to light by writhing on the ground. But the animators made these reactions much more violent, slamming the creatures’ faces into the pavement and flopping them around as if jolted by a Taser.
“As we got deeper into production, the director gravitated back to performances that were more human-like, but still had the infected quality of craziness built in,” Schaub says. On top of the motion-captured hero performances, for example, animators had the creatures hyperventilate, flare their nostrils, flex the muscles and tendons in their necks, and so forth to give them an intense “infected” quality.
Once Schaub knew how Lawrence wanted the creatures to react to light, Imageworks had stunt actors Sharon and Kirk Maxwell motion captured at Giant Studios. “Sharon and Kirk gave us some amazing source material that we were able to build upon,” says Schaub. “I can’t imagine doing the big crowd scenes without the base performances that Sharon and Kirk gave us.” Massive software managed the animated cycles in large groups of infected creatures. Berney was very keen to keep the motion of the Infected away from anything that would break the laws of physics; in this respect, he commented, even getting a run or walk cycle for the characters was extremely hard.
All told, a crew of around 300 at Imageworks worked on I Am Legend – some for 14 months, others on a more compressed schedule to handle the CG creatures. By the end, the studio had returned Manhattan to nature and populated it with a cast of CG actors.
“The biggest challenge was the creatures,” Berney says. “To create dozens and dozens of these creatures all in the computer was fairly daunting. We used a good blend of new technologies, but they were still handcrafted by Dave Smith and his crew, which I love. We use the technology where we need it, but the artists bring the characters to life.”