Harry Potter and the Deathly Hallows

Several visual effects shops contributed to Harry Potter and the Deathly Hallows: Part 1 under overall vfx supe Tim Burke. In this article we continue our coverage of the film by focussing on the signature shots by MPC and Cinesite and the work of Double Negative, Framestore, Rising Sun Pictures and Baseblack.

Double Negative’s magic

Double Negative, under visual effects supervisor David Vickery, created 166 shots for the film. Dneg worked on extensions of the Burrow and its surrounding environments, the setting up of the marquee for Bill and Fleur’s wedding, as well as the Patronus, Death Eaters, fire, the magical battle and confetti butterflies. Additional shots included Diagon Alley set extensions, kitten plates in Professor Umbridge’s Ministry office and the Death Eater attack on the Lovegood house, with the kids apparating away and the house’s destruction.


Click here for an in-depth podcast with Dneg’s David Vickery on making magic for the film

Framestore’s character animation and the Three Brothers sequence


Dobby the house-elf and Kreacher were the creations of Framestore, led by visual effects supervisor Christian Manz. The characters were entirely realised by keyframe animation and used multiple subsurface scattering techniques.


Click here for our in-depth podcast with Christian Manz on how Dobby and Kreacher were created


Framestore’s Commercials team was behind the ‘Tale of the Three Brothers’ animated sequence, directed by Ben Hibon. Drawing on the early animation work of Lotte Reiniger’s hand-cut paper silhouettes, artists relied on Maya and textures created in Nuke to tell the story.


Click here for our interview with Framestore sequence supervisor Dale Newton

Rising Sun’s Dementors and Death Eaters


Rising Sun Pictures completed work on Death Eaters, Dementors and the locket horcrux. For the shot of Snape arriving at Malfoy Manor, the character initially appears in Death Eater form, created using Maya cloth for the robe and Houdini for the smoke that made up the body.

Rising Sun also created the Dementors, updated from previous appearances in the film franchise. “The Dementors were really just a black silhouette,” said Rising Sun visual effects supervisor Tony Clark. “But you can get a lot of emotion from them even though they are just a head, two sticky arms and the flowing cloth that has a seaweedy underwater feeling. The original brief was for them to be inside the courtroom which was a very dark environment, so they were enveloped in smoke to read the silhouette.”


Baseblack delivers significant work

Baseblack was behind 300 shots on Deathly Hallows: Part 1, more than any other vendor. Led by visual effects supervisor Matt Twyford, those shots included the snitch, Hermione’s magic handbag, the cafe wand shoot-out, various spells, the horcrux locket’s underwater attack on Harry, moving pictures in newspapers and photographs, environments and the final scene when Voldemort breaks open Dumbledore’s tomb to steal the Elder Wand.

MPC

MPC makes multiple Harrys, helps Order escape from Privet Drive and conjures Nagini

Early in the film, the Order of the Phoenix seeks to safely deliver Harry to the Burrow, deciding to disguise themselves as several Harrys in the hope of thwarting any attacks from Voldemort and his Death Eaters. They drink polyjuice potion, a magical formula that transforms each member into a version of Harry. “We talked about the polyjuice sequence very early in production, and basically started working from the pages of the book,” said MPC visual effects supervisor Nicolas Aithadi. “The idea was to have one single shot and it was originally to be around a table, but that changed a little later in the production. We talked to overall vfx supe Tim Burke about what technology to use and we tested different approaches and equipment, then very quickly settled on the Contour system from Mova, which was able to give us fine details.”


MPC produced some artwork of all the different formations to give the director an idea of what they would look like, going for both subtle and completely cartoony renderings. “The idea,” explained Aithadi, “was that we would never end up on Harry, meaning we would never have a complete transformation of any of the actors to the real Harry – we would keep say the nose of George – it was always a hybrid version.”

To shoot the plate, all the characters – George, Fred, Ron, Hermione, Fleur and Mundungus – were shot on set acting out the transformation, sometimes adjusting their height to be the same as Harry (Daniel Radcliffe). A second set of plates involved Radcliffe acting out every character. “He was dressed like George and Fred et cetera,” said Aithadi, “and we did beauty passes of him acting out the transformations, using some attributes of the other characters. Daniel picked up really nice little details which proves they know each other very well. They were quite funny plates to look at.”

The actors then spent a day on a London stage in front of Mova’s Contour system, acting out the scene but this time just for the facial performance. Phosphorescent make-up applied to the actors was picked up by a series of geometry cameras, producing a tracked surface and mesh. The resulting data was then processed at MPC in conjunction with a facial rig built for each actor. “We had also done a life cast of each actor,” recalled Aithadi, “in order for us to be able to do extreme hi-res cyberscans. They’re usually difficult to do hi-res with live actors because even the slightest movement can affect the data. The plaster face gave us all the nice skin pores and everything that we could scan and extract all the detail.”

To help model fast-moving eyelids, a group of animators had to manually rotoscope blinks based on video reference from the capture stage. “It was a time-consuming process,” said Aithadi, “because at the level we were working on, any discrepancy between the live action and CG was just jumping off the screen. It took us quite a long time – 2 to 3 months – to get to a point where the face was working and we were happy with the system.”

Production then began on animating the facial performances. To be able to change the shapes from the source character to Harry and to change them in a non-uniform way, animators relied on spheres. “We would animate around the face and when it would contact the model, this area would be blendshaping and morphing into the next character,” said Aithadi. “Another thing we also did was build skulls inside the heads to try and keep the facial mechanics of each character as true as possible. By morphing the skull we could keep the jaw bone and everything consistent between characters. Actually, for the first test we did, we applied the facial animation of Mundungus to Harry and Harry didn’t look like Harry. It had something of Harry but his face was weird because his jaw bones were moving differently than Daniel Radcliffe. So we quickly realised we needed to assign animation but keep the bone structure.”
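The sphere-driven morph Aithadi describes can be sketched in a few lines. This is a minimal illustration of the idea only – an assumption about the mechanics, not MPC's actual tool: an influence sphere animated around the face drives a per-vertex blendshape weight, so only the region the sphere touches morphs toward the target character, with a falloff band keeping the transition smooth.

```python
import numpy as np

def sphere_morph(src_verts, dst_verts, centre, radius, falloff=0.5):
    """Blend src toward dst where an animated influence sphere touches the mesh.

    Vertices inside `radius` morph fully to the target shape; the weight
    fades linearly to zero over an extra `falloff` band beyond the radius.
    All names here are illustrative, not from MPC's pipeline.
    """
    # Distance of every vertex from the sphere centre
    d = np.linalg.norm(src_verts - centre, axis=1)
    # 1 inside the sphere, linear falloff outside, clamped to [0, 1]
    w = np.clip(1.0 - (d - radius) / falloff, 0.0, 1.0)
    # Per-vertex linear blend between source and target character shapes
    return src_verts + w[:, None] * (dst_verts - src_verts)

# Toy usage: three vertices morphing toward a target displaced in Y.
# Only vertices within reach of the sphere (radius 1.5) transform.
src = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [5.0, 0.0, 0.0]])
dst = src + np.array([0.0, 1.0, 0.0])
out = sphere_morph(src, dst, centre=np.array([0.0, 0.0, 0.0]), radius=1.5)
```

Animating the sphere's `centre` over time would then sweep the transformation across the face non-uniformly, in the spirit of what the quote describes.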

Maintaining details became the most crucial part of MPC’s work on the transformation sequence. “We soon realised eyelashes or eyebrows and all those little things that we don’t think about all the time were very, very important,” said Aithadi. “We would do renders and see something that didn’t look quite right, and you’d realise it was that the eyelashes were not curled enough or were too short. Sometimes we had to groom the eyelashes of each character and then do morphs between the two and change the hair simulation from one to the other. We had to go back to MPC’s Furtility grooming system and write some options to be able to transform between grooms. We also did that for the peach fuzz on the skin – the very fine hair layer that everybody has on their face.”

“We were doing animated morphs of hairdos as well,” continued Aithadi. “It was funny, actually, because, for example, Mundungus is bald. In the beginning we thought we would be having to grow hair on his head. We had this animation where the hair was shooting out of his skull and then forming into Harry’s haircut. But we calmed everything down to focus on the change to Harry’s face, keeping the skin tone of the original actor. The other thing we started to do at that point was pump out the animation, like make the ears grow bigger and the nose change – doing some comical-type transformations. But the thought was that this would be too distracting. The next sequence is quite dramatic so it didn’t feel right to have a massive comical sequence. In the final shot, you see the transformation but it’s not jumping off the screen.”

MPC spent about seven months on the transformation sequence, which then transitions to shots of the Order taking flight to the Burrow on Thestrals and vehicles, including the real Harry, who rides with Hagrid on his motorbike. But the Death Eaters have been informed of the planned escape and attack the Order as they attempt to fly away. Harry and Hagrid seek refuge closer to the ground but are then engaged by Voldemort. After a wand duel, Harry is finally delivered safely to the Burrow.



The initial attack takes place amongst the clouds in a frenetic and chaotic sequence that referenced Battle of Britain-type footage. MPC animation supervisor Ferran Domenech worked with Tim Burke for nearly a year at Leavesden Studios on previsualising the scenes. For the clouds, MPC relied on Maya Fluids to create the initial shapes. “There were some resolution issues,” explained Aithadi, “so we divided each shot from the previs as having layers of cloud – extreme background clouds, background clouds, mid-ground clouds, foreground clouds and extreme foreground clouds – and then decided what technique to use. The extreme background clouds for example were a cyclo done by the environment team at MPC. The background and mid-ground clouds were Maya Fluids clouds rendered with mental ray. The foreground ones were Maya Fluids but rendered in RenderMan and the extreme foreground clouds were 2D elements that were comped in.”
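The depth-layered approach above – stacking separately rendered cloud layers from back to front – comes down to the standard “over” compositing operator. The sketch below is a generic illustration of that operator on premultiplied RGBA images, not MPC's comp setup; the toy plates are invented for the example.

```python
import numpy as np

def over(fg, bg):
    """Porter-Duff 'over' for premultiplied RGBA arrays of shape (H, W, 4)."""
    alpha = fg[..., 3:4]
    # Foreground plus background attenuated by the foreground's coverage
    return fg + bg * (1.0 - alpha)

def comp_layers(layers):
    """Composite a list of layers ordered back (first) to front (last)."""
    out = layers[0]
    for layer in layers[1:]:
        out = over(layer, out)
    return out

# Toy 1x1 "plates": an opaque grey background cloud layer and a
# 50%-coverage red foreground element (premultiplied values).
bg = np.array([[[0.2, 0.2, 0.2, 1.0]]])
fg = np.array([[[0.5, 0.0, 0.0, 0.5]]])
result = comp_layers([bg, fg])
```

With five depth layers – cyc, background, mid-ground, foreground and 2D extreme foreground – the list would simply hold five renders in back-to-front order, regardless of which renderer produced each one.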


Spells fired from both sides light up the clouds, with the original approach being based on the look of flak guns from World War II. “Our first test, though, made it look even more chaotic,” said Aithadi, “so instead we would stick the spell inside the cloud like a lightning storm and we used that as a tool to silhouette the action. We also used a lot of lens flares – not so much normal lens flares but more dirtying of the lens – and camera shake to get more chaos to the sequence. The CG characters were relatively straightforward in terms of modelling and animating them with all the proper cloth simulation.”

Hagrid soon realises the cloud fight is too dangerous and decides to fly away. He pushes a button on his bike and accelerates via a massive flame from the bike exhaust, called dragon’s fire. “It was a huge effect,” said Aithadi, “not really for the fire because we do quite a lot of that, but just because of the length of it and the speed it made the motorbike go. One of our effects TDs said, ‘Well I can’t see the background now because it’s going 1,000 miles an hour!’ When you’re flying that fast we had a trail that was one or two miles long, which we had to simulate. Every time we had the dragon’s fire, it took ages to sim and memory was a big issue because there was almost a terabyte of data to cache.”


Reaching street level, Harry and Hagrid encounter traffic, tunnels and more attacks. The production blocked the Dartford Tunnel outside London for a couple of nights, with additional plates of the motorbike shot at a clear airfield in Bobbington and inside another tunnel. “We had to take these three environments and make them one,” explained Aithadi. “So we went to the Dartford Tunnel and took 360 degree pictures, and our environment team built a 3D version of the highway based on all the stills, camera-projected it and created some matte paintings for the views of the action shot at Bobbington. So every time you see the cars and the bike on the highway, it’s all CG around them. Tim Burke and David Yates wanted the tunnel to be quite claustrophobic and dangerous for the bike, so they asked us to transform the four-lane tunnel into a two-lane tunnel. So we pretty much roto’d all the cars and did the whole thing as a CG object. But the original tunnel was used as reference for lighting.”

Voldemort then confronts Harry amongst large transmission pylons in the countryside (an MPC environment), arriving via a huge volume simulation of smoke before the pair engage in a wand duel. “For the duel, there’s a massive shot where Harry and Voldemort get connected inside the wand core, breaking it, and then you’re flying through the stream of the energy and getting very close to Voldemort’s face,” explained Aithadi. “Voldemort was projected onto 3D geometry and it was full CG smoke. Again, we used Maya Fluids to create the simulation because it was giving us a really nice organic feel.” For shots of the pylons exploding, MPC hand-animated the structures and relied on cloth sims for the cables and 2D elements for electric sparks running across the power lines.


The final part of the sequence shows Harry and Hagrid crashing the motorbike into a bog near the Burrow. MPC used CG versions of the characters and bike to crash into a CG water environment, matching to live action plates before and after the synthetic shot. “We also shared these with Double Negative,” said Aithadi. “When our shots were approved we extracted the camera and they rendered the environment in different layers.”

Another major MPC effect was Voldemort’s snake Nagini, present at a meeting of the Death Eaters and, in a later scene, attacking Harry after taking over the body of Bathilda Bagshot. “We had actually done Nagini before on Goblet of Fire,” commented Aithadi. “This time Tim Burke proposed to bring a professional snake wrangler to Leavesden and shoot reference and textures off a real python. We thought it would be scarier if Nagini looked like a real python. So after that shoot we decided to scrap the original model and do it from scratch, basing it on this python but also bringing in viper and cobra movements. I think that made the character that much more creepy. We went back to the rigging. We added a lot of skin sliding and muscle sliding and cloth simulation and just animation.”


“One of our artists created all the textures from the stills we shot on set for the scales,” continued Aithadi. “She actually outlined every single scale by hand to create the displacement map. I thought she was crazy! But it was really good because the scales matched the colour textures. That meant we had really hi-res textures of the skin. We wrote a shader that had the iridescence and all the various weird speculars and reflections that snake skin has.”


Revealing herself to Harry, Nagini breaks through Bathilda’s skin and goes on the attack. “You could see the violence of that sequence even in the previs,” noted Aithadi. “When Bathilda starts crumbling, we actually did a full CG face – the only thing that is real is the body. We made these weird displacements and blend shapes. Then everything explodes and the body falls on the ground, and there’s cloth sim and particles work there. Daniel Radcliffe had little to interact with on set, although the chair in the scene was rigged to break and some grips wearing green gloves held the actor down to make it feel like he was being restrained. Then we had to remove the guys and insert Nagini. We had a 3D model of Harry that we used for 3D rotoscoping of his movement and had it interact with the snake as they rolled around.”

Cinesite

Cinesite creates Voldemort’s nose

Lord Voldemort’s snake-like nose was one of Cinesite’s key effects on Deathly Hallows: Part 1, replacing the real nose of actor Ralph Fiennes in 46 shots. On set Fiennes played the scenes with 16 tracking markers applied via a mould. “It was a latex mask that had holes cut in it and they would put the markers through that so that they were always in the same place,” explained Cinesite 2D supervisor Andy Robinson. “That enabled the trackers to line up the head a lot better.” HDRs of the scene were also captured, and a texture shoot of Fiennes’ head gave Cinesite polarised and unpolarised images to derive both bump and spec passes for the face.


At Cinesite, the first step was to create a rigid track of the head. “We would then get the basic movement of the entire head which would actually get us a lot of the way,” said Robinson, “because then the lighters could start lighting it, the compers could start cleaning up the markers, and a lot of times that would be all we would do for a temp stage. Ultimately what we did was end up unwrapping the actor’s face completely, using a rigid track of the actor’s face. We would then unwrap that in Nuke and do little touch-up paints.”

The second stage of the process was to matchmove Fiennes’ speaking motions, more tracking than animating at this point. “Voldemort’s quite an expressive character so we’d have to completely articulate the lower jaw and upper lip,” said Robinson. “We had wire deformers for the lip line and the creases on his face – his laugh lines. So we sometimes had to pull those into different shapes to get them to line up. It wasn’t just the nose we were replacing – depending on the shot it could go into the cheeks or even the upper lip. We also had to translate that subtle motion of the nose as well, say when the character talks.”

Cinesite had a fully-built CG head based on a cyberscan of Fiennes that was lit and rendered and could be used for clean-up purposes, such as replacing an individual cheek or removing shadows cast by the real nose. The studio’s proprietary software csSkinShader, a multi-level subsurface scattering algorithm, was used to enhance the skin and final look.

In Nuke, artists used another proprietary tool to take the CG face and apply some of the lighting detail from the live action. “It takes the large details and applies them to the scene,” said Robinson. “It evaluates the luminance and pushes back the colour as well, using a subtraction process, and you remove all the details from the live action. You don’t get the subtle details but you get the overall colour.” Compositors combined subsurface passes, adding more rim lights or turning down specs or the bump where necessary.
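One plausible reading of the subtraction process Robinson describes can be sketched as follows. This is an assumption about the technique, not Cinesite's proprietary tool: blur the live-action plate to strip out fine detail, leaving only its broad luminance and colour, then offset the CG render by that low-frequency lighting so it inherits the plate's overall colour without its subtle detail.

```python
import numpy as np

def box_blur(img, radius):
    """Crude separable-in-spirit box blur; stands in for a production filter."""
    k = 2 * radius + 1
    pad = np.pad(img, ((radius, radius), (radius, radius), (0, 0)), mode="edge")
    out = np.zeros_like(img)
    # Sum every offset within the kernel window, then normalise
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def transfer_lighting(cg, plate, neutral=0.5, radius=2):
    """Add the plate's low-frequency colour shift onto the CG render.

    Subtracting a neutral grey from the blurred plate leaves only the
    broad lighting offset; fine live-action detail is blurred away first.
    `neutral` and `radius` are illustrative parameters, not from Cinesite.
    """
    low_freq = box_blur(plate, radius)
    return np.clip(cg + (low_freq - neutral), 0.0, 1.0)

# Toy usage: a warm-ish uniform plate lifts a darker uniform CG render.
plate = np.full((8, 8, 3), 0.7)
cg = np.full((8, 8, 3), 0.4)
relit = transfer_lighting(cg, plate)
```

In production this kind of grade would be one node among many, sitting alongside the subsurface passes and rim-light tweaks mentioned above.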

Nuke also allowed artists to make small adjustments, such as painting out practical make-up veins or fixing areas where the nose would block part of the face. “In the end they were not too difficult,” recalled Robinson, “because we just treated them like a traditional clean-up exercise. We could flop one of the eyes over to the other side, for example. Because we were working in Nuke, we were able to use UV textures and basically paint on a moving still, effectively.” For one particularly long scene in Malfoy Manor, Cinesite relied on a motion analysis tool to check for subtle movements in the skin, like twitches, that would not normally have been picked up in tracking but could then be applied to the CG version of the face.