fxpodcast: Dr. Mark Sagar

In the last of our series on the tech behind the major studios, we switch gears to facial animation and Dr. Mark Sagar of Weta Digital. Mark is a back-to-back 2010 and 2011 winner of the Sci-Tech Award (Scientific and Engineering Award) from the Academy. At Weta Digital he has directed the development of the performance-driven facial animation system for Avatar and King Kong.

Mark specializes in facial motion capture, animation and rendering technologies and is currently focusing on bio-mechanical simulation of the face. Mark was a Post-Doctoral Fellow at the Massachusetts Institute of Technology and holds a Ph.D. in Engineering from The University of Auckland, New Zealand, where he worked on Virtual Reality Surgical Simulation and Anatomic Modeling with Peter Hunter’s Bioengineering group.

Here’s a rundown of Mark’s background:


Pacific Title Mirage / LifeFX

Mark was the technology co-founder and co-director of Research and Development for LifeFX, which set milestones in realism for digital humans for film at Pacific Title Mirage. He also developed interactive Internet-based technologies for eCommerce and other Web-based applications, such as FaceMail, at LifeFX Inc.

Siggraph Electronic Theatre

The short film “The Jester” was selected for the Siggraph Electronic Theatre in 1999, showcasing the LifeFX technology. It is considered by the graphics community to be a milestone in computer-generated humans.

Watch ‘The Jester’

In 2000, “Young At Heart” was selected for the Siggraph Electronic Theatre, pushing the technology further: it put a fully digital face in a standard dramatic context and also demonstrated digital aging, creating an 80-year-old version of a 20-year-old actress (for context, this was 10 years before Benjamin Button).

Watch ‘Young at Heart’

Sony Pictures Imageworks

Prior to Weta, Mark was R&D Supervisor at Sony Pictures Imageworks, where he developed the Image-Based Rendering system for the Doctor Octopus and Peter Parker faces in Spider-Man 2 using Paul Debevec’s Light Stage, and the Performance-Driven Facial Animation system for Monster House.

In 2004, Monster House (not released until July 2006) marked the first use in film production of the Facial Action Coding System (FACS) for performance capture and mapping onto an arbitrary character face. The system analyzes the motion capture data for facial expression (rather than raw skin motion), represents that expression as the fundamental information of the performance, and then translates the expression data onto an arbitrary digital character.
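To make the retargeting idea concrete, here is a minimal sketch of how expression data decoupled from skin motion can drive any character. It assumes a simplified rig where each FACS action unit corresponds to one linear blendshape on the target character; the AU subset, the `retarget` helper and the toy rig data are illustrative assumptions, not the actual Imageworks pipeline.

```python
import numpy as np

# Illustrative subset of FACS action units (the AU numbers are standard FACS;
# the weights and rigs below are hypothetical).
AU_NAMES = ["AU1_inner_brow_raise", "AU4_brow_lower",
            "AU12_lip_corner_pull", "AU26_jaw_drop"]

def retarget(expression_weights, character_shapes, neutral):
    """Apply actor-derived FACS weights to a different character's rig.

    expression_weights: dict AU name -> activation in [0, 1], the
        'expression' information solved from the capture data.
    character_shapes: dict AU name -> (n_verts, 3) vertex offsets from the
        target character's neutral mesh at full activation of that AU.
    neutral: (n_verts, 3) neutral vertex positions of the target character.
    """
    mesh = neutral.copy()
    for au, weight in expression_weights.items():
        mesh += weight * character_shapes[au]   # linear blendshape mix
    return mesh

# --- toy usage: the same expression data can drive any character rig ---
n_verts = 5
neutral_character = np.zeros((n_verts, 3))
character_shapes = {au: np.random.randn(n_verts, 3) * 0.01 for au in AU_NAMES}

# A smile-like expression captured from the actor, expressed as AU weights.
smile = {"AU1_inner_brow_raise": 0.1, "AU4_brow_lower": 0.0,
         "AU12_lip_corner_pull": 0.8, "AU26_jaw_drop": 0.2}

posed = retarget(smile, character_shapes, neutral_character)
print(posed.shape)  # (5, 3) -- posed vertices for the target character
```

Because the performance is stored as AU activations rather than vertex motion, swapping in a different character only means supplying a different set of blendshapes.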

Weta Digital

In 2005, FACS was implemented at Weta Digital for King Kong, where it captured everything from extremely subtle to highly dramatic expressions and faithfully translated Andy Serkis’s performance onto King Kong’s face.

For Avatar, Weta knew that James Cameron wanted to use helmet cameras for motion capture, so Mark ran an initial proof-of-concept test to see whether a single video camera and 2D image-based tracking could be used for facial motion capture rather than 3D tracking. They used the FACS system to solve and map the performance onto Gollum for the test.

The FACS system was used for all of the Na’vi character performance capture in Avatar. The Real-time Facial Motion Capture framework uses video from a small, helmet-mounted camera to record the actor’s facial performance in real time. New custom facial tracking software based on FACS rapidly tracks the movements of the face and maps them onto a rigged model, replicating the actor’s expressions on a facial puppet in the virtual monitor used by the director on the virtual stage.

The facial expression solver is a real-time version of the system developed for King Kong. It takes a live stream from the tracking workstation and determines which muscle groups, or FACS units, are being used to make the actor’s expression, then automatically maps these FACS poses onto a puppet’s face. By using the standardised ‘language’ of FACS poses, the system could drive a rig capable of some 10,000 different expressions, with the detail needed to capture the subtle eye and mouth movement necessary to bring the characters to life. It also enabled faithful representation of the original actor’s expression on a face with quite different proportions (for example, the Na’vi), allowing the actor to drive the virtual puppet with their performance.
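The article does not describe the solver’s internals, but one common way to frame “which FACS units are being used” is as a non-negative least-squares fit of the tracked 2D features against a per-actor basis of FACS pose displacements. The sketch below is a minimal illustration under that assumption; `solve_facs_weights`, the calibration basis and the toy data are hypothetical, not Weta’s implementation.

```python
import numpy as np
from scipy.optimize import nnls

def solve_facs_weights(tracked, neutral, pose_basis):
    """Fit tracked 2D face features with a weighted sum of FACS poses.

    tracked:    (n_features, 2) feature positions in the current video frame.
    neutral:    (n_features, 2) feature positions for the actor's neutral face.
    pose_basis: (n_poses, n_features, 2) feature displacement from neutral
                for each FACS pose at full activation (a hypothetical basis
                built in an actor calibration session).

    Returns non-negative per-pose weights in [0, 1]; these would drive the
    corresponding poses on the character rig.
    """
    n_poses = pose_basis.shape[0]
    A = pose_basis.reshape(n_poses, -1).T          # (2*n_features, n_poses)
    b = (tracked - neutral).reshape(-1)            # observed 2D displacement
    weights, _residual = nnls(A, b)                # non-negative least squares
    return np.clip(weights, 0.0, 1.0)

# --- toy usage with a random calibration basis ---
rng = np.random.default_rng(0)
n_features, n_poses = 30, 6
neutral = rng.random((n_features, 2))
basis = rng.normal(scale=0.05, size=(n_poses, n_features, 2))

# Simulate a frame that is 70% pose 2 plus 30% pose 5, with tracking noise.
frame = (neutral + 0.7 * basis[2] + 0.3 * basis[5]
         + rng.normal(scale=0.001, size=(n_features, 2)))

print(solve_facs_weights(frame, neutral, basis).round(2))
```

Run per frame on the live tracking stream, the recovered weights are the same kind of character-independent expression data described above, which is what lets a face with quite different proportions be driven in real time.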

Sagar’s Awards

2010 Sci-Tech Award (known by some as a Technical Oscar)

Scientific and Engineering Academy Award – Dr. Mark Sagar, Paul Debevec, Tim Hawkins and John Monos for the design and engineering of the Light Stage capture devices and the image-based facial rendering system developed for character relighting in motion pictures.

2011 Sci-Tech Award

Scientific and Engineering Academy Award – Dr. Mark Sagar, for his early and continuing development of influential facial motion retargeting solutions.

Sagar’s Credits

AVATAR – Special Projects Supervisor
(2009) Director: James Cameron (20th Century Fox)

KING KONG – Special Projects Supervisor
(2005) Director: Peter Jackson (Universal)

MONSTER HOUSE – CG Special Projects Supervisor
(2006) Director: Gil Kenan (Sony Pictures Imageworks)

SPIDER-MAN 2 – CG Special Projects Supervisor
(2004) Director: Sam Raimi (Sony Pictures Imageworks)
