Nanny 2 – Nanny McPhee at Framestore

For Nanny McPhee and the Big Bang, the sequel to 2005’s Nanny McPhee, director Susanna White brought Framestore on board as the sole visual effects provider. While completing a number of magical effects, the studio also delivered extensive character animation of pigs, an elephant and a jackdaw bird. In this interview we talk to Framestore visual effects supervisor Christian Manz about his work for the film.

fxg: There’s a neat mix of photoreal and magical effects in this film that I think works really well. Can you tell me about the brief for this Nanny McPhee?

Manz: The initial brief was that our pigs, elephant and other animals weren’t to be fantastical creatures – the audience had to believe they had been shot for real. That brief shifted a little as production went on: what the animals did became more fantastical once the director and the producers realised what the CG work was capable of. The challenge for us was to intercut the real animals with our CG performances. In the end we produced about 350 shots for the film, covering animation, magical effects and other composites. Pre-production began in February 2009, filming started in April and wrapped in September. We then had from September to January this year to complete the effects, which was an extremely accelerated schedule. The cut didn’t lock until two weeks before final delivery. But in the end it was a good, fun film.


fxg: What were some of the environmental and magical effects you created?

Manz: Nanny McPhee, played by Emma Thompson, has a magic stick that she bangs on the ground, sending magic sparks flying out of it. That was an effect we designed for the first film and enhanced here by adding more light interaction. Knowing ahead of time what the effect was going to look like certainly helped on set, so we shot some in-camera interactive lighting. We also had a journey through 1940s London. The war in the brief wasn’t specifically the Second World War, so we didn’t have to be absolutely historically accurate. But we did do some very early 4am and 5am shoots in London last summer to get plates of Buckingham Palace and Trafalgar Square, which we enhanced with matte paintings. Adam McInnes, the production VFX supervisor, led a shoot of practical vehicles against bluescreen on an airfield. In the end, because of editorial changes, most of the vehicles and scenery are CG. We went to museums around London for photographic period reference.

On top of that, the film ends with a sequence where Nanny McPhee magically harvests a barley field. Mr Edelweiss, her jackdaw, eats an explosive putty, expands and burps out a magic wind across the field, which whips the barley up into the air. The barley then forms the shapes of animals – elephants, whales, bears and pigs – in the sky before bursting into fireworks, and finally resolves into nice, neat bales of straw. The brief for that came about six weeks before the end. We wanted to do something animation-driven so the clients could sign off on something they understood, and so they had something to cut with. Once the barley was airborne it had to look like flocking starlings – there’s loads of reference of that on YouTube. We animated that first and then built effects animation on top of it to create the final effect of millions of bits of barley forming these animals. So the clients had to put a lot of trust in us, as they didn’t get to see the final shots until the final week of post – which was emotional to say the least!

fxg: What were some of the techniques you used to realise the barley shots?

Manz: We were keen to go with animation, as it was better for the client to sign off on movement rather than simulation. Simulation is always something that’s either right or wrong, takes a long time and can be frustrating. So we rigged up some creature models and various shapes in Maya that we were able to use to give a strong feeling for the timing and performance of the animals. Then we mainly used Houdini to simulate and animate on top of that, with lots of flocking sims. At the end there was one animator and about four Houdini guys turning it around. Then we comped it in Nuke, which let us do a little more work in 3D space. The barley field itself was specially grown for the film about two months ahead of its normal September harvest, so it would be ready for August. The field was actually 60 miles away, so any time you see the farm and the barley field together, it’s a CG barley field – a Maya hair-based model with simulation, so that we could blow wind through it.
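The flocking behaviour Manz describes is usually built on the classic boids rules: separation, alignment and cohesion among nearby particles. The sketch below is a minimal illustration of that idea in Python/NumPy, not Framestore’s actual Houdini setup – all weights, radii and the speed clamp are invented for the example.

```python
import numpy as np

def boids_step(pos, vel, dt=0.04, r=2.0,
               w_sep=1.5, w_ali=1.0, w_coh=0.8, max_speed=5.0):
    """One update of a simple boids flock: separation, alignment, cohesion."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        offsets = pos - pos[i]                       # vectors to every other boid
        dist = np.linalg.norm(offsets, axis=1)
        near = (dist > 0) & (dist < r)               # neighbours within radius r
        if not near.any():
            continue
        sep = -(offsets[near] / dist[near, None] ** 2).sum(axis=0)  # push apart
        ali = vel[near].mean(axis=0) - vel[i]        # match neighbours' velocity
        coh = pos[near].mean(axis=0) - pos[i]        # drift towards local centre
        acc[i] = w_sep * sep + w_ali * ali + w_coh * coh
    vel = vel + dt * acc
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    vel = np.where(speed > max_speed, vel * max_speed / speed, vel)  # clamp
    return pos + dt * vel, vel
```

In production the same rules would be layered under art-directed forces (and, as Manz notes, under keyframed animation), so the flock reads as starlings while still hitting the animal silhouettes.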

fxg: How were Nanny McPhee’s magical stick effects achieved?

Manz: We used a Houdini particle-based setup so that we could direct different interactions. That let us make the particles spiral rather than bounce – all of that was first done in Houdini, and then the whole look was achieved in compositing. We made sure we shot some practical lights on location: when Nanny McPhee bangs her stick down, we shone bright lights into the barley itself so we could use those light interactions in the comp. Nothing beats the real thing for making an effect sit into the plate.
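Making particles spiral rather than bounce typically comes down to applying a vortex-style force: a tangential push around an axis plus a pull towards it. Here is a hedged sketch of that idea in Python/NumPy – the `spiral_force` function, the axis and all constants are illustrative assumptions, not the actual Houdini network used on the film.

```python
import numpy as np

def spiral_force(pos, axis_point, axis_dir, swirl=3.0, inward=0.5):
    """Vortex-style force: tangential swirl around an axis plus an inward pull."""
    axis_dir = axis_dir / np.linalg.norm(axis_dir)
    radial = pos - axis_point
    radial -= (radial @ axis_dir)[:, None] * axis_dir  # strip axial component
    tangent = np.cross(axis_dir, radial)               # swirl direction
    return swirl * tangent - inward * radial

def step(pos, vel, dt=0.02):
    # Swirl the particles around a vertical axis through the origin.
    acc = spiral_force(pos, np.zeros(3), np.array([0.0, 1.0, 0.0]))
    vel = vel + dt * acc
    return pos + dt * vel, vel
```

Varying the swirl and inward weights per particle is one way to "direct different interactions" as Manz describes, with the final sparkle and glow built up in compositing.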

fxg: Can you talk about the piglets?

Manz: Because of how the film was shot, there had to be four sets of piglets. There were seven in the story, and on each day they had 14 they could shoot with, as they had to swap them out. It was decided that those would be ordinary plain pigs made up to look like Gloucestershire Old Spots. All their spots were make-up, and that was something we had to keep track of here for continuity. One of the key sequences has a pig running up a tree as the kids try to catch it. Nanny McPhee’s magic makes the pig able to run vertically up a branch and hop from one tree to another. As post went on, the director had been watching gymnastics reference and liked the idea of the pig looking as if it were doing a beam workout. So we took that reference and turned the shot into a much more controlled performance: the pig flick-flacks down the branch before leaping an impossible distance, something like 100 metres, into another tree, then does a loop-the-loop and lands gracefully on the ground. All of that was very tough. Kevin Spruce, our animation supervisor, worked very closely with the director to attain her vision and make it as real as possible.


fxg: Then there’s the synchronized swimming sequence with the pigs. How was that achieved?

Manz: We had talked about previs’ing the pig sequences, but production initially wanted the freedom of shooting those things on the day rather than locking down to something we had done. So we ended up doing some animation tests for ‘pig swim’ and ‘pig tree’ to ascertain what performances were obtainable from the creatures. It gave everybody more of an idea of what they had to shoot, and it gave the clients confidence in something they wouldn’t see finished from us for months. One of those tests was ‘pig swim’, where the pig runs, jumps, holds its nose and dives into the water. Then all six pigs jump in, come to the surface and do a synchronized swimming routine. They dive underneath the water and we’ve got an underwater shot of them swimming in a loop through each other. There’s even a shot where one of them jumps up towards camera to catch a lily in its mouth.


Even though it’s set in a mucky pond on a farm, the director wanted the top shots to have more of a David Hockney ‘swimming pool in LA’ look. It took quite a lot of work to make that look right, especially as it intercut with naturalistic shots. Chris Lawrence, the CG supervisor, did a great job realising aspects of that sequence. In all of those shots the entire water surface is CG: we replaced what was shot using voxel-based 3D sims in Houdini, and enhanced the shots with 2D simulations. I also directed a water elements shoot that let us match to what had been animated – it helped that we were already some way along with animation. In one part, the director wanted the pig to spit water out of its mouth as he’s lying on his back. We didn’t have that element, so initially it was me lying on my back in the Avid suite spitting out water, shot against a black door. Then we shot it again in a blacked-out room, with the effects artist who had been doing the CG water spitting out the water himself.


fxg: What kind of challenges did the jackdaw pose?

Manz: The jackdaw had to be a photorealistic bird intercut with a real bird at full screen height, opening and closing its wings. It took six months of rigging and development to get an animation rig, working right up until the end of production. We had a very fluid pipeline that let us tweak the model and do re-renders to check how the bird was working. We showed the bird to the clients for the first time a week before Christmas. When they saw the shot, they asked us when we would be putting our CG bird in – and we already had!

fxg: Did you ever think about doing CG replacements for the heads or parts of the animals?

Manz: It was under consideration initially. The elephant was always going to have a CG trunk, because he had to suck up pens at one point in the film. But we always felt it was more complicated to attach things, especially for the bird. Although we gave Mr Edelweiss a lot of facial performance, a real bird doesn’t necessarily show much facially. The animals were never going to go the 3D animated movie route: they definitely didn’t talk and they definitely didn’t have anthropomorphic expressions, so a lot of the performance is conveyed through body movement. It was great to get reference from the two trained birds, and Emma Thompson had worked with them – they would fly in, land on her shoulder and fly off. At one point it looked as if we wouldn’t have to do that much, but as the edit progressed they realised there were key character points that could only be accomplished with our CG bird. We ended up giving it more of a stage performance – making it fly in, look left, look a bit disgruntled and then walk off.

fxg: Were there any tools in particular you used for the animation and then final look of the feathers?

Manz: We developed rigging tools for the show that created complex deformations without slowing down Maya, enabling the animators to have full interactivity whilst working with hi-res rigs. These tools allowed us to keep animating until very late in the schedule, and plugged seamlessly into the creature simulation and render pipeline. That gave us confidence that what we were seeing in an animation scene would make it into the final render. The feathers were groomed using our proprietary fur system and then passed through our feather splatting tool, which dealt with collisions and how they lay against the skin. We were then able to generate proxy barbed feathers, which gave us the lighting response you would expect whilst keeping render times very reasonable. Animation could be changed in the morning and be in compositing by the afternoon.


fxg: How were the elephant shots accomplished?

Manz: The initial methodology was a bluescreen shoot with a real elephant, which would then be inserted into shots – a couple of other shots with a cow and a goat were done in a similar way. Unfortunately, while we were out at a shoot, the zoo phoned to tell us that the trained elephant had died of a flu-like virus. So they turned to us to provide the elephant in CG. That was such a big challenge, because everyone had fallen in love with Riddle, the real elephant, and we based our elephant on him. An elephant’s musculature and trunk are very intricate pieces of biological engineering. He doesn’t do anything too fantastical, but he certainly gets a reaction from the audience when he walks into the farmhouse – he looks very cute.


We rendered him and the other animals with our proprietary render tool, fRibgen – we were one of the first shows here to test it in full. fRibgen is our bridge to RenderMan and enables us to deal with complex scenes and large numbers of assets. It also worked seamlessly with our feather splatting and fur tools. Obviously there were a bunch of other 2D and greenscreen shots in the film; we used Nuke to do projections and other work in 3D. Our compositing leads were Russell Horth, Alex Payman and Sean Danischevsky. I think there’s basically an invisible join between 3D, lighting and compositing these days – it’s really just one team trying to make high-quality pictures.
