SciTech Awards: Scanning Hollywood with Paul Debevec

Creating photo-real digital actors is one of the biggest remaining challenges in feature film visual effects. Digitally rendered faces must achieve a high level of realism to cross the “Uncanny Valley” and appear close enough to real people to be accepted by the audience. Often, the digital faces must accurately bear a resemblance not just to the real actors, but to the characters they are portraying.

Sigourney Weaver in the Light Stage (before Light Stage X), using the new Polarized technique for Avatar.

Key to so many VFX digital doubles, character performances and high-end digital faces in recent films has been the ultra-high-quality scanning done in a Light Stage. This innovation is being recognised with an Academy Technical Achievement Award (or Sci-Tech Award) presented to Paul Debevec, Tim Hawkins, Wan-Chun Ma and Yu Xueming.

Tim Hawkins, Cyrus Wilson, Paul Debevec, Wan-Chun Ma, Yu Xueming and Jay Busch. (Cyrus Wilson and Jay Busch were key PSGI collaborators). Photo by Greg Downing

The award is actually in two parts, with the first three recipients being awarded for the invention of the ‘Polarized Spherical Gradient Illumination facial appearance capture method’, and Yu Xueming being recognised for the design and engineering of the Light Stage X capture system.

This award is not for the invention of the Light Stage itself, as the Light Stage is just part of the solution. In fact, the Light Stage has already been part of a previous Sci-Tech award. Debevec has been a key part of the team behind several incarnations of the Light Stage, first publishing a paper at SIGGRAPH 2000.

Face in a variety of HDR environments (Original Sci-tech Approach)

The original Light Stage received recognition for its contribution to film technologies in 2010, when Paul Debevec, Tim Hawkins, John Monos and Mark Sagar were awarded a Scientific and Engineering Award for the design and engineering of the Light Stage capture devices and the image-based facial rendering system developed for character relighting in motion pictures. The key difference between that first award and this second one is how the data is used to create faces; the two approaches are vastly different.

fxguide’s Mike Seymour in the Light Stage, showing the subtle light changes achievable extremely quickly. Mike was himself scanned for the MEETMIKE project.

The first award recognised a Light Stage that allowed, via weighted combinations of the captured images, any face filmed in the Light Stage to be seen under any lighting setup. In simple terms, you photograph an actor under a set of light patterns, and then by weighting and summing these patterns the computer can provide a photographic solution of what that face would look like inside any target HDR light probe. If our actor was photographed in the original Light Stage and we need to see what their face would look like under the red skies of dusk, a quick mathematical operation gives an accurate shot of the actor, lit exactly as if the LED lights in the Light Stage had all been set to emulate that red sky HDR. But this process worked through computational photography; it did not produce a high-resolution 3D face with surface normals, etc.
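To make that first approach concrete, below is a minimal sketch (in Python/NumPy) of this style of image-based relighting: the relit image is just a weighted sum of photographs taken one light at a time, with the weights read from the target HDR environment resampled to the Light Stage’s light directions. The function name, light count and array shapes are illustrative assumptions; the real system involved much more (HDR capture, calibration, compression), and this shows only the core linear idea.

```python
import numpy as np

def relight(basis_images: np.ndarray, hdr_weights: np.ndarray) -> np.ndarray:
    """Relight a face captured one light at a time (reflectance-field style).

    basis_images : (N, H, W, 3) array, one photograph per Light Stage light,
                   taken with only that light switched on.
    hdr_weights  : (N, 3) array, the target HDR environment (e.g. a red dusk
                   sky) resampled to the N light directions, RGB per light.
    Returns an (H, W, 3) image of the face as it would appear in that
    environment.
    """
    # Light transport is linear, so the relit image is simply the
    # per-channel weighted sum of the single-light photographs.
    return np.einsum('nhwc,nc->hwc', basis_images, hdr_weights)

# Illustrative usage with random data standing in for real captures:
rng = np.random.default_rng(0)
basis = rng.random((156, 64, 64, 3))   # e.g. 156 lights, 64x64 face crops
env = rng.random((156, 3))             # HDR probe resampled per light
print(relight(basis, env).shape)       # -> (64, 64, 3)
```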

Polarized Spherical Gradient Illumination was a breakthrough in facial capture technology, as it allowed the shape and reflectance of an actor’s face to be captured with sub-millimeter detail, enabling remarkably faithful digital recreations of the actor.

Sigourney Weaver’s Polarized Spherical Gradient Illumination based 3D model

The Light Stage X was a major innovation over all previous Light Stages, and it has been the foundation for all subsequent work and the keystone of the method’s evolution into a production system. Light Stage X does not aim to show you a photographic solution of a face under any lighting setup; instead it provides the 3D face itself, so it can be rendered in a traditional animation pipeline. The lights in Light Stage X can not only relight a face to a pattern, but also enable the computer to understand the geometry, pore texture and light-reflecting properties of a face.

Summary of the current Light Stage X pipeline. Image from the 2011 Eurographics paper: Comprehensive Facial Performance Capture by Graham Fyffe, Tim Hawkins, Chris Watts, Wan-Chun Ma and Paul Debevec.

At the time of the invention of Light Stage X, there was no efficient and reliable technique to digitize the shape and reflectance of an actor’s face in natural facial expressions at sufficiently high resolution to create photoreal digital characters. Light Stage X and the approach of using polarized spherical illumination provided such a solution, which has been used to help make digital characters in movies beginning with Avatar in 2009, through to films such as Logan, Valerian and Blade Runner 2049, and numerous others currently in production.

Avatar’s Stephen Lang

The actor is photographed in a series of polarized spherical gradient lighting conditions with a set of digital cameras, in such a way that customized algorithms can derive a high-resolution 3D model of the actor’s face, along with high-resolution diffuse and specular reflectance maps. The process is designed so that each scan requires just a few seconds, and the actor can be recorded ear-to-ear in any natural facial expression. The polarized light allows the specular reflection off the surface of the skin to be isolated, so that surface detail accurate to 0.1mm can be reconstructed without the need for an old school plaster facial cast. The recorded geometry data and reflectance maps can be used to create 3D digital doubles of actors for visual effects sequences and face replacements, or to provide source geometry that artists can retarget to a digital creature based on the real actor.
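For readers who want a feel for the maths, here is a simplified Python/NumPy sketch of two ideas at the heart of the method: recovering per-pixel surface orientation from the spherical gradient images, and separating specular from diffuse reflection using cross- and parallel-polarized photographs. This is emphatically not the production pipeline; it assumes registered, linear, single-channel images and ignores details such as the complement gradient patterns, multi-view stereo and colour calibration used in practice.

```python
import numpy as np

def normals_from_gradients(grad_x, grad_y, grad_z, full):
    """Estimate per-pixel normals from spherical gradient illumination.

    grad_x/y/z : (H, W) images shot under linear gradient lighting along the
                 x, y and z axes of the Light Stage.
    full       : (H, W) image shot with every light at full intensity.
    Returns an (H, W, 3) array of unit-length normals.
    """
    eps = 1e-6
    # The ratio of a gradient image to the fully lit image encodes the
    # corresponding normal component, remapped from [0, 1] back to [-1, 1].
    n = np.stack([2.0 * grad_x / (full + eps) - 1.0,
                  2.0 * grad_y / (full + eps) - 1.0,
                  2.0 * grad_z / (full + eps) - 1.0], axis=-1)
    return n / (np.linalg.norm(n, axis=-1, keepdims=True) + eps)

def separate_diffuse_specular(parallel, cross):
    """Split reflectance using the polarization difference.

    parallel : image with the camera polarizer parallel to the lights'
               polarization (sees diffuse + specular reflection).
    cross    : image with the polarizer crossed (the surface specular
               reflection is blocked, leaving the sub-surface diffuse term).
    Both terms are returned up to a constant scale factor.
    """
    specular = np.clip(parallel - cross, 0.0, None)
    diffuse = cross
    return diffuse, specular
```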

Paul Debevec with a Light Stage X in 2012.

When Light Stage X first came out, fxphd (our sister site) interviewed Paul, and that background fundamentals video is in this fxguide story from 2012. Click to see it.

The Award goes to…

Paul Debevec, Tim Hawkins, Wan-Chun Ma and Yu Xueming.

There are four people named on the Award, although Paul Debevec is quick to point out that, had it been allowed, other members of his team would have been included, since any enterprise as complex as the Light Stage is a collaborative effort involving irreplaceable contributions from both his current team and others.
Tim Hawkins now runs LightStage LLC. Wan-Chun ‘Alex’ Ma was Debevec’s first PhD student, co-supervised by Debevec and his other PhD supervisor at National Taiwan University (2008). He is the reason that the first Light Stage X outside the USA is currently in Taipei. He now also works at Google, having previously worked around the world at key research facilities such as Weta Digital and ETH Zurich.

Yu Xueming, who is credited with the design and engineering of the Light Stage X, is a Senior Hardware Engineer at Google; he came over from USC ICT as part of the team that joined Debevec in moving to Google. “Yu Xueming joined our laboratory at USC ICT. He had been a student at USC in computer science and I actually hired him as a software developer,” explains Debevec. “While he was working there, he mentioned that he also had an electrical engineering degree and that he might be able to help with building, you know, some circuits that could improve some of our lab’s projects!” This proved pivotal when the time came to upgrade the Light Stage lighting system. Yu Xueming led the electronics project to build custom circuit boards and a new polarized gradient illumination light source system.

Paul Debevec accepting the award at the 2019 Sci-Tech Awards with fellow recipients and host David Oyelowo. Photo by Greg Downing

 

Digital Emily

One of the first public examples of the Polarized Spherical Gradient Illumination approach was the Digital Emily Project with Image Metrics. As Avatar was yet to come out, it was the Emily project that introduced this technology to the broader effects community. The Emily project was insanely good, and brought enormous attention to the new work, with even Debevec impressed by how well it worked out. “I was blown away by it. There is no better thing than to be a project collaborator on such a project. You try and do your part, and everyone else does their part, and then you realize something pretty magical just happened,” he fondly recalls. “It was a fantastic collaboration with Image Metrics. When they originally approached me about doing a digital character, it was actually going to be in the context of the older light stage technology, as they did not know about the newer research… and Image Metrics proved to be a perfect partner for Emily because they have such advanced facial animation knowledge.”

Hollywood comes to USC ICT

Sam Worthington being extremely good natured in 2006 while the first Polarized scans were being sorted.

Avatar was the first film to use Polarized Spherical Gradient Illumination facial scanning, with numerous sessions conducted from August 2006 through to April 2009 at USC ICT. Actors Sam Worthington, Zoe Saldana, Joel David Moore, Stephen Lang and Sigourney Weaver, as well as several samples of plants and props, were scanned. Mark Sagar from Weta Digital had visited USC ICT for a demo of the Polarized Spherical Gradient Illumination face scanning, and he saw the high-res surface data from the first internal April 2006 test. “The next day Mark lets us know that Weta may want to scan actors for Avatar soon,” Debevec recalls. This work was done on Light Stage 5, and it would not be until 2010 that Light Stage X would be built, and 2011 before the first commercial Light Stage X scans were performed at USC ICT for Underworld 4.

Above is an unedited video from the first ever Light Stage scan of Sam Worthington for Avatar.

Since the days of Avatar, over 30 high-end motion picture VFX teams have had over 100 actors scanned in the Light Stage, and this has meant a host of Hollywood A-list actors working with the Light Stage team.

Dwayne Johnson, the most scanned actor in the Light Stage.

Dwayne Johnson is the most scanned actor. The Rock has been in the Light Stage “four times now and every single time he’s just the friendliest guy, in the best possible mood, truly professional,” explains Debevec.

Dwayne Johnson

Johnson was one of the first actors to also have the close-up micro-geometry capture process used on their face. In the early days, this required the actor to bend over and position different parts of their face up to a small metal aperture fitted with a macro lens, to try to capture skin detail down to a 100th of a millimeter. “Dwayne was one of the earlier people to go through this and since he had to have his head right in the middle of the Light Stage, he had to squat and hold a pose each time for this. Fortunately, he has no issue about holding a squat for a while!” Debevec jokes. “We’ve since made the process more ergonomic, but he was just totally game for it!”

But not all scanning sessions run the same.

Some actors, especially those with blue eyes rather than brown, can find the lights very bright, particularly if they have some form of light sensitivity. For example, Furious 7 created an extensively-used CGI version of Paul Walker, and required Walker’s brothers to be scanned. “Cody and Caleb Walker were both quite light sensitive and they were blinking or squinting a bit more in the Light Stage than a lot of other people do,” Debevec recalls. “When that happens, we do turn the Light Stage to half intensity. We then bump all of the cameras up to 1250 ISO instead of 650 ISO and we pretty much get the same images. We just get a little bit more noise, but it is just a minor hit to the resolution of the scans.” The Light Stage X runs at about 5,000 lux, which is similar to daylight on a cloudy day. It seems bright when one is inside a dark scanning studio, but outside on a full sunny day can be closer to 180,000 lux, so the Light Stage is not unusually bright or in any way harmful.
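The exposure trade-off Debevec describes is simple arithmetic: halving the light while roughly doubling the sensor gain records the same exposure, at the cost of slightly more noise. A small illustrative check, using an idealized doubling (640 to 1280 ISO) rather than the nominal camera steps quoted above:

```python
def relative_exposure(lux: float, iso: float, shutter_s: float, f_number: float) -> float:
    # Recorded exposure scales linearly with scene illuminance, shutter time
    # and ISO, and inversely with the square of the f-number.
    return lux * shutter_s * iso / (f_number ** 2)

full_light = relative_exposure(lux=5000, iso=640,  shutter_s=1/60, f_number=8)
half_light = relative_exposure(lux=2500, iso=1280, shutter_s=1/60, f_number=8)
print(full_light, half_light)  # both ~833.3: the two setups match in exposure
```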

Logan: Hugh Jackman

Facial hair can also be an issue, even the tiny hairs known as vellus hair or peach fuzz. Some actors are meant to be in character with beards – such as Hugh Jackman for Logan. “There’s been lots of shaving occurring in the ICT offices during Light Stage scans. Paul Walker’s brothers deliberately came in with a Paul Walker ‘scruff’, to appear as Paul would have appeared in the movie. We scanned them with a scruff and then we had them both go and shave clean. We then scanned them again.” Generally the actors are asked to shave beforehand, but the team needs to be very respectful when dealing with some lead female actors.
Many actors are keenly interested in the Light Stage process, with some even wanting to shoot their own ‘making of’ videos. Debevec and the team have also worked hard not to make the experience a chore for the actors. “I didn’t want it to become something like going to the dentist for any A-list actor,” Debevec jokes. “Overall folks have been good. There have been a couple of people who were in character as villains and a bit more ‘method actor’ and that’s a little bit problematic. You literally can’t really talk to them and you’re told not to talk to them. They are not as friendly, but that’s a particular kind of acting strategy.” Interestingly, this behaviour is not some childish indulgence. The original FACS research by Paul Ekman and Wallace V. Friesen (published in the late 70s) noted that method actors were better than almost any other group at delivering accurate, genuine emotions. As most people cannot consciously control certain facial muscles, being ‘in character’ and accessing real past emotions is technically the best way to fire those muscles and hit the correct FACS poses.

Logan’s Scans

“It’s been a great honor to work with some exceptionally talented actors and try to explain to them what the process is,” says Debevec. Many actors are fascinated with the process and keen to understand it fully. “Tom Cruise seemed to think it was cool, from what I judged,” recalls Debevec, who walked the actor through how the ICT team were going to do his scans and a little bit about the polarization technology. “I can honestly say I have never felt more that a human being was giving me 100% of their attention than when Tom Cruise was listening to what he needed to do in order to shoot a good Light Stage scan.”

Tom Cruise

Solaris: The Newest Light Stage

Dr Paul Debevec with the new Solaris lights at Google. The new generation of lights better models the light properties of the Sun.

Spread around the world, there are now about seven Light Stages licensed from USC ICT’s research. Polarized Spherical Gradient Facial Scanning is available through the USC Institute for Creative Technologies in Playa Vista and from LightStage LLC, in Burbank, which also operates a mobile unit.

Most recently, the newest Light Stage has been unveiled at Google. Paul Debevec’s team has built a brand new Light Stage in Los Angeles as part of their ongoing research. “I was very happy to come over to Google with Jay Busch, Graham Fyffe and Xueming Yu,” commented Debevec, who considers this the minimum core team needed to design and build a next-generation Light Stage. The team at Google’s Playa Vista research lab works on innovative Light Field research and other projects beyond the Light Stage, but Debevec is “very happy that Google was also interested in working with the Light Stage technology and very recently we’ve been able to show our newest Light Stage device. We’re calling it Light Stage X4. But it is nicknamed Solaris, because it’s particularly good at simulating the angular spread of the sun.”

Solaris has been shown publicly only once, when the team unveiled it to the surprise of a VES-SIGGRAPH group touring the Light Field work at Google.

The new capabilities have not yet been published or spoken about, but it is exciting to see where Google will help Debevec and his team take the Light Stage research next. Since the first Light Stage started working almost two decades ago, there have been many advances both to the Light Stage itself and in alternative approaches. Photogrammetry-based facial scanning has increased in popularity, and stereo correspondence software has seen wide adoption in recent years, but these approaches generally still fall short of the geometric detail and reflectance maps provided by the Light Stage X approach.

The Light Stage X4

 

The Films and Actors using Polarized Gradient Illumination Facial Scanning since 2009.

2009:

Joel David Moore seen being scanned and as an insert from Avatar.
  • Avatar (2009): Sam Worthington, Zoe Saldana, Joel David Moore, Stephen Lang, Sigourney Weaver

2010:

  • Endhiran / Robot (2010): “Superstar” Rajinikanth
  • G.I. Joe: The Rise of Cobra (2009): Adewale Akinnuoye-Agbaje, Sienna Miller, Marlon Wayans, Byung-hun Lee, Saïd Taghmaoui, Rachel Nichols, Christopher Eccleston, others
  • Tron: Legacy (2010): Bruce Boxleitner, Jeff Bridges, Olivia Wilde

2011:

  • Tintin (2011): Jamie Bell, Andy Serkis
  • Underworld: Awakening (2012): Kris Holden-Reid
A maquette for Underworld: Awakening inside the Light Stage.

2012:

  • Journey 2: The Mysterious Island (2012): Michael Caine, Dwayne Johnson, Vanessa Hudgens, Josh Hutcherson, Luis Guzman
  • X-Men: First Class (2011): Kevin Bacon, Michael Fassbender

 

Mark Ruffalo
  • The Avengers (2012): Mark Ruffalo, Robert Downey Jr, Scarlett Johansson, Tom Hiddleston, Chris Evans, Chris Hemsworth
  • Maattrraan (2012): Surya
  • Ek Tha Tiger (2012): Salman Khan
  • The Twilight Saga: Breaking Dawn, Part 2 (2012): Mackenzie Foy

2013:

  • The Lone Ranger (2013): Armie Hammer, Johnny Depp
  • Oblivion (2013): Tom Cruise, Olga Kurylenko

  • Ender’s Game (2013): Aramis Knight, Khylin Rhambo, Moises Arias, Asa Butterfield, Hailee Steinfeld, Suraj Partha, Conor Carroll
  • Gravity (2013): Sandra Bullock, George Clooney

2014:

  • Captain America: The Winter Soldier (2014): Chris Evans, Scarlett Johansson
Imelda Staunton inside the USC ICT Light Stage
Imelda Staunton (centre) and the ICT crew (Paul Debevec at right)

  • Maleficent (2014): Juno Temple, India Eisley, Lesley Manville, Imelda Staunton (Knotgrass), Sam Riley (Diaval), Sharlto Copley (Stefan), Angelina Jolie (Maleficent), Isobelle Molloy (Young Maleficent), Brenton Thwaites (Prince Phillip), Elle Fanning (Aurora)

 

See our story on Maleficent face generation here.


2015:

  • Furious 7 (2015): Caleb Walker, Cody Walker, John Brotherton, Vin Diesel, Jason Statham
Cody and Caleb Walker at USC ICT
  • Point Break (2015): Edgar Ramirez, Luke Bracey

2016:

  • Batman vs. Superman (2016): Ben Affleck, Henry Cavill, Gal Gadot
  • The Jungle Book (2016): Neel Sethi
  • The BFG (2016): Mark Rylance, Ruby Barnhill, Jemaine Clement
  • Central Intelligence (2016): Dwayne Johnson, Sione Maraschino

See our story on Central Intelligence face generation here

  • Suicide Squad (2016): Cara Delevingne, Margot Robbie, Jay Hernandez, Jared Leto, Karen Fukuhara, Alain Chanoine

2017:

  • Underworld: Blood Wars (2017): Kate Beckinsale
Kate Beckinsale
  • Logan (2017): Hugh Jackman, Dafne Keen
  • Valerian (2017): Rihanna, Sasha Luss, Pauline Hoarau
Pauline Hoarau
  • Blade Runner 2049 (2017): Sean Young, Loren Peta, Ana de Armas, Mackenzie Davis

  • Thor: Ragnarok (2017): Chris Hemsworth, Mark Ruffalo, Cate Blanchett
Cate Blanchett close-up skin pore scan
  • All I See is You (2017): Blake Lively, Jason Clarke

2018:

  • Ready Player One (2018): Tye Sheridan

And there are 5 or more in production now…


 

Medusa

Also honoured were the Medusa team from Disney Research Zurich, seen below with the Light Stage team. We will also be featuring the great work of the Medusa team in an upcoming fxguide story.

The Light Stage and Medusa teams pose for a combined photo. Both teams were honoured at the Sci-Tech Awards this year, and both teams have advanced facial modelling and animation enormously.