Epic’s Unreal ‘When the Serkis comes to town’: GDC Part 3

As part of its State of Unreal presentation, Epic showed the latest work from 3Lateral featuring Andy Serkis. Serkis is the world’s most famous and successful motion capture actor: he was the performer behind Caesar in the Planet of the Apes films, Gollum in The Lord of the Rings and Snoke in Star Wars: The Last Jedi. What made this demonstration jaw dropping was that the real time digital Andy Serkis was shown running in UE4 without any manual tweaking or polishing. While the process produces human- and computer-readable channels for editing, this demo ran without manual adjustment by any animator.

3Lateral in Serbia scanned the actor using a new 4D technique that allows real time animation with no manual intervention. Normally there is a lot of clean up with motion capture, and with facial animation in particular.

3Lateral are known for their facial rigging work, producing some of the best facial rigs in the world, but their new 4D scanning also allows them to model and animate a face completely without artist intervention, maintaining high fidelity throughout production. Normally the facial animation can only be approximated on set: the director and crew see accurate body movements, but much of the facial performance is not realised until the data is processed in post. With the new 3Lateral workflow it is possible to see vivid, accurate facial animation live, which allows the director to make creative decisions about facial MoCap in the same way they do with the other actors. For some time there has been a tendency to only discover issues in post and then build a composite performance from multiple takes, with a lot more artist interpretation. This can produce great results, but it is time consuming and isolates the MoCap performer’s acting from the rest of the cast.

Epic demoed the digital Andy using virtual cinematography controlled by an iPad, and also showed real time adjustment of the played-back performance. Using the iPad, the face can be further adjusted with delta offsets applied on top of the base animation, for example raising Digital Andy’s eyebrows while the character delivered its lines in real time.
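The delta-offset idea described above can be sketched very simply: the captured performance supplies per-control values for each frame, and live operator input is layered on top additively. This is a minimal illustration only; the control names and dictionary representation are assumptions, not 3Lateral’s or Epic’s actual API.

```python
# Minimal sketch of additive "delta offset" layering over a baked base
# performance frame. Control names and data layout are illustrative.

def apply_deltas(base_frame, deltas, weight=1.0):
    """Return a new frame: base control values plus weighted live offsets."""
    out = dict(base_frame)
    for control, offset in deltas.items():
        out[control] = out.get(control, 0.0) + weight * offset
    return out

# One frame of the captured base animation (values in rig units).
base = {"brow_raise_L": 0.20, "brow_raise_R": 0.18, "jaw_open": 0.45}

# Live adjustment from the operator, e.g. raising the eyebrows on the iPad.
live = {"brow_raise_L": 0.30, "brow_raise_R": 0.30}

adjusted = apply_deltas(base, live)
# Untouched controls such as jaw_open pass through unchanged.
```

Because the offsets are stored separately from the base animation, the original capture is never destructively edited, which is what makes this kind of live tweaking safe during playback.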

Not only was the Epic Unreal team able to show a digital Andy Serkis running in real time, but his Shakespearean performance was re-targeted to a 3Lateral dragon asset. The “Macbeth” performance data drove 3Lateral’s fictional digital creature, Osiris Black, to demonstrate how the same capture can drive two vastly different characters. What is significant is that the two faces are completely compatible, and 3Lateral will next be producing a demo that allows a user to morph between the two in real time, in engine. This is possible due to a key technology 3Lateral call Digital DNA. The DNA architecture is part of 3Lateral’s Rig Logic solver, which regresses a dense vector field to the animation rig.
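The reason one capture can drive two very different faces is that the performance is solved into a shared set of semantic expression channels, and each character supplies its own mapping from those channels to its rig. The sketch below assumes a simple linear mapping and made-up channel names; it is an illustration of the concept, not 3Lateral’s Rig Logic.

```python
# Illustrative sketch of a shared semantic space making two rigs
# "compatible": the capture is solved to universal expression channels,
# and each character maps those channels onto its own controls.
import numpy as np

# Universal expression channels solved from one frame of the performance.
channels = {"smile": 0.7, "brow_up": 0.2}
x = np.array([channels["smile"], channels["brow_up"]])

# Per-character map: semantic channels -> that character's rig controls.
serkis_map = np.array([[1.0, 0.0],   # smile  -> mouth corner control
                       [0.0, 1.0]])  # brow_up -> brow control
osiris_map = np.array([[0.8, 0.1],   # the creature weights shapes differently
                       [0.0, 1.5]])

# The same performance frame drives both characters.
serkis_controls = serkis_map @ x
osiris_controls = osiris_map @ x
```

Because both characters read from the same channel vector, swapping the target is just swapping the map, which is also what makes in-engine morphing between the two plausible.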

The Osiris Black character is part of a larger world, a passion project that 3Lateral is working on.

The entire demonstration came together in just five weeks. Serkis flew to 3Lateral’s Serbian headquarters in February and was scanned by the new 4D rig. This data was then used to build and texture a digital Serkis. The face was then faithfully rigged and animated based not on traditional FACS poses, but on Serkis’ actual performance.

3Lateral’s new approach is called the Meta Human Framework: a combination of volumetric capture, reconstruction, compression and 3Lateral’s Rig Logic technology, combining to bring digital performances to life.

The volumetric data was generated by capturing a series of high-quality, high frame rate (HFR) images of Andy Serkis from multiple angles under controlled lighting. 3Lateral’s process involved various capture scenarios, some focused on geometry, some on appearance and others on motion. All of these inputs were combined to generate a digital representation of Andy Serkis, and to extract universal facial semantics representing the muscular contractions that make the performance so lifelike.

In the resulting real-time cinematic, a high-fidelity digital replica of Andy Serkis recites lines from “Macbeth” with video and performance quality nearly indistinguishable from his real-life acting.

Serkis was a willing and ideal subject for this proof-of-concept demonstration: in addition to his remarkable acting talents, he is deeply versed and experienced in the digital performance process. While the process can capture teeth, for this demo Serkis’ teeth, tongue and mouth interior were created with off-the-shelf software.

To display these massive data sets, 3Lateral’s semantic compression reduces them while preserving the integrity of the data, making it possible to retarget the performance onto a digital character while easily altering gaze and subtle performance nuances. This incredibly high-fidelity capture is pre-processed offline into a data set that can be loaded into Unreal Engine to enable real-time volumetric performances.

Vladimir Mastilovic, 3Lateral’s founder and CEO, commented to fxguide, “what is critical is that the DNA solving and estimation is solving for identity and expression”. Once the identity of the individual is isolated, 3Lateral effectively have an offset from a generic rigged model to the specifics of that one performer. And because all the characters share this common base, multiple DNAs can be blended to create new high quality characters easily and quickly. For the Andy Serkis demo, the new 4D scanning process effectively used 7,000 scans resolved to a common FACS model. While the hardware is still under wraps, the machine vision cameras and rig are not the key IP; the innovation is the complete Meta Human software framework.
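Mastilovic’s point about solving separately for identity and expression can be illustrated with a toy decomposition: a face is a shared generic base, plus an identity offset (the “DNA”), plus the expression for the current frame. Because identities are offsets from the same base, blending two of them is simple interpolation while the performance carries over unchanged. The four-parameter “face” below is a deliberate simplification, not 3Lateral’s actual data format.

```python
# Toy identity/expression decomposition: face = base + identity + expression.
import numpy as np

generic_base = np.zeros(4)                      # shared neutral base face
dna_serkis   = np.array([0.5, -0.2, 0.1, 0.0])  # identity offset, performer A
dna_osiris   = np.array([-0.3, 0.4, 0.0, 0.2])  # identity offset, creature B
expression   = np.array([0.0, 0.1, 0.3, 0.0])   # one solved performance frame

def morph(t):
    """Blend the two identity DNAs with weight t, keeping the expression."""
    identity = (1.0 - t) * dna_serkis + t * dna_osiris
    return generic_base + identity + expression

face_a   = morph(0.0)   # pure Serkis identity, performing the expression
face_mid = morph(0.5)   # halfway morph, driven by the same performance
```

Under this model, the real-time in-engine morph 3Lateral describes is just animating `t`, since the expression term is independent of which identity is active.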