fxguide is in Stuttgart, Germany for FMX 2012, our second year at the festival. It looks to be an incredible week of visual effects, animation, virtual production and filmmaking presentations, plus a chance for students to talk to VFX studios face-to-face and to hear from software and hardware providers. Here’s our rundown of just some of the sessions featured on day one.

Virtual humans – MPC, Weta Digital and more

Visual effects by MPC.

A three-hour session today on virtual humans examined the work of MPC and Weta Digital in crafting virtual humans and characters on films such as Harry Potter and the Deathly Hallows, X-Men: First Class, Rise of the Planet of the Apes and Tintin. The session also looked at real-time realistic skin rendering research.

MPC’s Dan Zelcs and Mathieu Assemat broke down the Potter and X-Men work. For Deathly Hallows – Part 1 in particular, the Polyjuice Potion scene in which several characters transform into Harry lookalikes required detailed takeovers from plate to CG, full CG heads, recognizable performances and hair transformations – all in one long camera move. Here’s how fxguide covered the work. MPC walked through the internal process behind the shots, starting with physiological reference of the human body and the underlying skeletal and muscle structures they would be replicating. They then showed how each of the characters came to life via photographic reference, Mova facial capture, modeling, texturing (using polarized images to remove specular detail), rigging, hair grooming, matchmoving, animation and planning of how each would transform into Harry. A similar approach to the underlying physiology helped with the Beast transformation in X-Men: First Class, with the addition of layered detail for veins, cloth and hair (see fxguide’s article for more info).
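
As an aside on that polarized capture step: the general trick behind cross-polarized texture photography is that a polarizing filter on the light plus a 90-degree-rotated filter on the lens blocks the specular reflection, so a cross-polarized photo is close to pure diffuse albedo, and comparing it with a parallel-polarized photo isolates the specular component. Here is a minimal Python sketch of that idea – purely illustrative and not MPC’s actual pipeline; the separate_specular function and the synthetic image data are our own stand-ins.

```python
# Minimal sketch of cross-polarisation specular separation (illustrative only,
# not MPC's actual pipeline). Assumes two aligned captures of the same face:
#   cross    -- polariser on the lens rotated 90 deg to the light (specular blocked)
#   parallel -- polariser aligned with the light (diffuse + specular)
import numpy as np

def separate_specular(parallel: np.ndarray, cross: np.ndarray):
    """Both inputs are float32 HxWx3 images in linear light, already aligned."""
    diffuse = cross                                    # cross-polarised shot is ~diffuse-only
    specular = np.clip(parallel - cross, 0.0, None)    # what the cross filter removed
    return diffuse, specular

# Synthetic data standing in for real captures:
h, w = 4, 4
cross = np.full((h, w, 3), 0.35, dtype=np.float32)                    # flat skin albedo
parallel = cross + np.random.rand(h, w, 3).astype(np.float32) * 0.2   # plus highlights
diffuse_map, specular_map = separate_specular(parallel, cross)
print(diffuse_map.mean(), specular_map.mean())
```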

Stables also broke down how Snowy the dog was created by Weta Digital.

Weta Digital’s Wayne Stables then took the stage and discussed the development of his studio’s characters from Gollum through to Tintin. Interestingly, in an effort to make their creatures more realistic, they opened up old CG models and asked themselves what needed to be done to make them more real. Their answer was simple – base everything on real skeletons. This seems obvious for the simians in Rise of the Planet of the Apes (“Wellington Zoo probably knows us well by now,” said Stables) but the same approach applied equally to Weta’s humans in Tintin, even if they were more stylized.

Stables says the result of having all the correct bones and muscles – and even a fascia layer – is that you start with something anatomically correct and work from there. Weta has adopted that approach studio-wide, even producing realistic skin textures via silicone moulds taken of staff members that are digitally scanned and textured in Mari for incredibly fine natural detail. Real animal scans of bones, teeth, muscles and bodies are also regularly carried out for Weta projects. In fact, Stables says his eldest brother is a pathologist and that he would often phone to ask what the chances were of cutting up a dead body to get the right detail – to which his brother would give him a lecture on ethics. Stables was also (jokingly) sure that his brother “thinks we are monsters at Weta.”

See fxguide’s look at Tintin here.

Finally, Jorge Jimenez of the Graphics & Imaging Lab at Universidad de Zaragoza presented photorealistic skin rendering research, including real-time rendering. See below for a video on Jimenez’s look at separable subsurface scattering.
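
For context on the separable approach: screen-space subsurface scattering approximates the 2D diffusion blur across the skin with a 1D kernel applied in a horizontal pass and then a vertical pass, which is what makes it cheap enough for real time. The following Python/NumPy sketch illustrates that separable-blur idea offline – it is not Jimenez’s shader code, and the kernel weights are placeholder values rather than his published diffusion profile.

```python
# Simplified, offline illustration of the separable idea behind screen-space
# subsurface scattering: the 2D diffusion blur is approximated by a 1D kernel
# run horizontally and then vertically. Kernel weights here are placeholders,
# not Jimenez's published profile.
import numpy as np

def blur_1d(img: np.ndarray, kernel: np.ndarray, axis: int) -> np.ndarray:
    """Convolve each colour channel along one image axis (zero-padded borders)."""
    out = np.zeros_like(img)
    for c in range(img.shape[2]):
        out[..., c] = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), axis, img[..., c])
    return out

def separable_sss(diffuse: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """A horizontal pass followed by a vertical pass approximates a 2D diffusion blur."""
    return blur_1d(blur_1d(diffuse, kernel, axis=1), kernel, axis=0)

kernel = np.array([0.05, 0.1, 0.2, 0.3, 0.2, 0.1, 0.05])   # toy falloff profile, sums to 1
diffuse = np.random.rand(64, 64, 3).astype(np.float32)     # stand-in for a lit diffuse pass
scattered = separable_sss(diffuse, kernel)
print(scattered.shape)
```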

Douglas Trumbull on projection and production

fxguide has been able to speak to VFX legend Douglas Trumbull several times in recent years, including at the VES Awards and Sci-Techs where he has discussed his views on the future of cinema, frame rates and developing a new sci-fi film. At FMX, Trumbull continued his look at these aspects of filmmaking and also showed a number of behind the scenes videos from his own studio.

Doug Trumbull (right) and Ray Feeney were also part of a panel on frame rates.

Trumbull is certainly on a quest to educate people about film acquisition and projection. He was, of course, an early pioneer in large format films (he invented the Showscan process, for example), with many of the issues of high frame rates and stereo now re-surfacing thanks to digital cameras. What’s also clear is that Trumbull wants to make sure people understand the tech behind high frame rates – for so long, he says, people have been concentrating on production and on vfx, and not on projection, although James Cameron and Peter Jackson are changing that.

In addition to showing significant background material, Trumbull hinted at his own new research into varying frame rates by pixel or by object – he called it ‘frame integrated motion analysis’ and has applied for a patent covering it. He acknowledged that not all films need to be filmed or shown at, say, 48fps or 60fps. But he suggested that particular moments could be – such as a car crash, where part of the frame could be altered to take advantage of the higher frame rate and thus have greater impact on the audience. Trumbull showed an example of this featuring a martial arts fight shot at 120fps, then converted to 24 and 60fps, with the fighters isolated to operate at different speeds. From what we saw it looked impressive, and hopefully it forms part of Trumbull’s much-hinted-at science-fiction space movie that he is developing.
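
The details of ‘frame integrated motion analysis’ remain under wraps, but the broad idea of per-region temporal resampling from a high-frame-rate master can be sketched: from 120fps footage, build 24fps output frames in which a flagged region keeps a single crisp source frame while the rest of the image averages several frames to mimic conventional motion blur. The Python sketch below is our own illustration under those assumptions, not Trumbull’s patented method.

```python
# Illustrative sketch only -- not Trumbull's patented 'frame integrated motion
# analysis', whose details are unknown. It shows the general idea of per-region
# temporal resampling: from a 120fps master, build 24fps output frames in which
# a masked region keeps one crisp source frame while the rest of the image
# averages five source frames to mimic normal 24fps motion blur.
import numpy as np

def resample_region_based(master: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """master: (N, H, W, 3) frames at 120fps; mask: (H, W) bool, True = keep crisp.
    Returns (N // 5, H, W, 3) frames at 24fps."""
    n_out = master.shape[0] // 5
    out = np.empty((n_out,) + master.shape[1:], dtype=master.dtype)
    for i in range(n_out):
        window = master[i * 5:(i + 1) * 5]     # five 120fps frames per 24fps frame
        blurred = window.mean(axis=0)          # conventional-looking motion blur
        crisp = window[2]                      # middle frame, no temporal blur
        out[i] = np.where(mask[..., None], crisp, blurred)
    return out

master = np.random.rand(10, 48, 64, 3).astype(np.float32)    # fake 120fps footage
mask = np.zeros((48, 64), dtype=bool)
mask[:, 32:] = True                                           # right half flagged for emphasis
frames_24 = resample_region_based(master, mask)
print(frames_24.shape)   # (2, 48, 64, 3)
```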

Double Negative and John Carter

Original plate.

Final shot.

Visual effects supervisor Ken McGaugh spoke during a session covering Double Negative’s creature work for John Carter. (There were also sessions covering Cinesite’s VFX on the film.) Work began about three years ago when Dneg was approached by Disney to work on the film. At that point, Dneg had progressively been doing more and more character animation, but not at the level required for John Carter.

McGaugh said he’s often been asked why Disney approached Dneg about doing the film, since they had not done a film with so much character work before (Paul was still in production at the time). In general terms, Disney seemed to be looking for a partner in the process, since this was director Andrew Stanton’s first combined live action/animation effort. Instead of one party telling the other how to approach the project, the two companies could work together through the process and come up with the best way of tackling the movie.

Dneg did a proof of concept test to demonstrate that they could do the work and to establish how they would actually complete it. While the test was successful, with Disney awarding Dneg the project, it was “very difficult to do at the time” according to McGaugh, as it exposed the problems with their existing pipeline. So they undertook some serious development work to improve the animation pipeline.

At the same time, McGaugh and the other Dneg vfx supes began working on solving some of the problems they foresaw with principal photography combining the Tharks and humans. With the Tharks standing nine feet tall, connecting their eye lines with the humans’ would be critical to the process. It was especially critical because the film would be anamorphic, leaving very little room for framing errors between a nine-foot and a six-foot character. So the vfx supervisors at Dneg in London basically filmed scenes from the storyboards, attempting to troubleshoot potential errors before they happened. Since the Tharks had four arms, this would involve having another person behind the main “actor” miming out the actions for the test takes.
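
To give a sense of why those eye lines are so unforgiving, here is a small worked example (with made-up heights and distances, purely for illustration) of how quickly the look-up angle from a roughly six-foot actor to a nine-foot Thark changes with the distance between them – a small error in where the actor looks becomes an obvious error once the CG character is in the shot.

```python
# Hypothetical numbers illustrating the eye-line geometry: the angle a roughly
# six-foot human must look up to meet a nine-foot Thark's eyes changes quickly
# with the distance between them, which is why tools like heads on sticks were
# used on set to give the actors a precise target.
import math

human_eye_height = 5.6   # ft, eye level of a ~6 ft actor (assumed)
thark_eye_height = 8.5   # ft, eye level of a ~9 ft Thark (assumed)

for distance in (3.0, 6.0, 12.0):   # ft between the two characters
    angle = math.degrees(math.atan2(thark_eye_height - human_eye_height, distance))
    print(f"{distance:4.1f} ft apart -> look up {angle:4.1f} degrees")
```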

The FMX 2012 venue.

McGaugh joked during the presentation about the inherent humor of the scenes being poorly enacted by a vfx crew, but they did serve their purpose. The team was able to visualize problems that could come up on set – before they came up – and this led to a much smoother shoot. They ended up with a “shoot bible” that tried to anticipate all the possible scenarios they would run into during the shoot. It contained scene-by-scene storyboards, illustrations and diagrams of best shooting practices, and schematics of character sizes and dimensions. In addition, there were on-set boxes of varying sizes and heights to mimic the height differences between the characters, heads on sticks to obtain correct eye lines for the actors’ performances, as well as other tools. There were still problematic scenes where eye lines ended up not working, but all in all it was a very effective process.

The principal actors went through “Thark camp”, a way for them to get into their character roles. One aspect of the camp involved learning how to act on stilts in order to get the eye lines right for scenes. McGaugh pointed out that Willem Dafoe was especially adept at picking up walking on stilts. During the camp, the actors even used mocap suits so that they could visualize their actions on the animated characters. However, mocap was not used for principal photography; the characters were instead traditionally animated.

Check out our earlier coverage of Dneg’s work for John Carter here, and our fxguidetv on Cinesite’s VFX for the film.

Trixter tackles elephants and stereo

We recently covered the visual effects in Journey 2: The Mysterious Island here at fxguide, including Trixter’s work on the miniature elephants. At FMX, visual effects supervisor Dietrich Hasse delved into that work and also spoke of the challenges of working with native stereo plates. See our earlier fxguide coverage here.

The main challenges for Trixter ultimately ended up being matchmoving, contact deformation, retouching and compositing – since the plates were filmed with a stand-in pug for the elephant, and then required stereo fix-its for the left and right eye views. An initial thought that having two cameras might help with matchmoving by giving depth detail proved not quite true because of the slight differences that always exist between the two lenses. Hasse showed some nice examples of the matchmoving solutions; one was a ‘mops box’ around the pug (Mops being the German word for that breed of dog). Another was the use of roto masks of hands projected onto the elephant’s skin, which were then used to deform it for enhanced interaction.
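
The reasoning is easy to see from the basic stereo relation depth = focal length × interaxial / disparity: a mismatch of even a pixel between the two eyes – easily introduced by slight lens differences – shifts the recovered depth, and the error grows with distance. The numbers below are generic assumptions for illustration, not Trixter’s rig or solve.

```python
# Generic stereo arithmetic (not Trixter's actual solve) showing why small
# lens/rig differences hurt: depth = focal_px * baseline / disparity, so an
# error of just one pixel in measured disparity moves the recovered depth
# noticeably, and the effect grows with distance from camera.
focal_px = 2000.0     # focal length in pixels (assumed)
baseline_m = 0.06     # interaxial distance in metres (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    return focal_px * baseline_m / disparity_px

for true_depth in (2.0, 5.0, 10.0):                        # metres
    true_disp = focal_px * baseline_m / true_depth         # ideal disparity
    noisy_depth = depth_from_disparity(true_disp - 1.0)    # 1 px calibration/measurement error
    print(f"true {true_depth:5.1f} m -> recovered {noisy_depth:5.2f} m")
```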

Other things to see and do

On hand at FMX during the week will be several visual effects and production studios at the recruiting booths. Just some of those present are Lucasfilm Singapore, MPC, Sony Pictures Imageworks, Animal Logic, Framestore, Method Studios, Pixomondo, Prime Focus, Scanline VFX, Double Negative and The Mill.

The Framestore recruitment booth.

Other sessions that were featured on day one included Pixar presenting on La Luna, Aardman on Arthur Christmas, Dreamworks Animation on Madagascar 3, Prime Focus discussing its 3D stereo process, Platige highlighting its Witcher 2 cinematic, a focus on work from Poland and some transmedia and cloud computing talks. Day two will see the beginning of the virtual production thread here at FMX, and we plan of course to bring you coverage of that in detail. Autodesk will also be streaming the virtual production track live on AREA.

