In the fifth episode of HBO's The Righteous Gemstones, Gradient Effects de-aged John Goodman using a new AI-assisted tool called Shapeshifter. While the tool represents a new approach, the underlying technology has been in development for years and has already been used, in various forms, for much simpler and very different work.

Dr. Eli Gemstone (John Goodman) is the larger-than-life patriarch of the Gemstones. Eli built an evangelist empire with his late wife Aimee-Leigh and fears his children Jesse, Kelvin and Judy may be lost without her. Episode 5 flashes back to 1989, to a time when the Gemstone empire was still growing and Aimee-Leigh was still alive. But going back also meant de-aging Goodman for an entire episode.

Dr. Eli Gemstone (present-day) played by John Goodman
De-aged Dr. Eli Gemstone (John Goodman)

Gradient Effects is a one-stop studio primarily for episodic TV visual effects, but the technology is shared with its feature-film division, SCRTLB – Secret Lab. The company has offices in Los Angeles and Munich. The film division has used Shapeshifter over the past eight years on projects such as adding lighting to Thor for Marvel Studios. Gradient has approximately 30 full-time artists, supplemented with freelancers.

Gradient sidestepped the Uncanny Valley to shave decades off Goodman for an entire episode, delivering nearly 30 minutes of film-quality VFX in only six weeks.

Shapeshifter starts by analyzing the underlying shape of Goodman's face. It then extracts important anatomical characteristics, such as skin details, stretching and muscle movements. This involves an advanced new tracker, seen in the video below. Note that the process requires no special makeup or tracking dots on the actor's face (in fact, it can track many other complex surfaces, such as water).
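Gradient has not published how its tracker works, but markerless tracking is commonly built on matching small image patches between frames. A minimal illustrative sketch of that general idea, in pure NumPy (a toy stand-in, not Shapeshifter's actual algorithm):

```python
import numpy as np

def track_patch(prev, curr, y, x, size=8, search=4):
    """Find where a small patch from `prev` moved in `curr` by
    exhaustive sum-of-squared-differences search. A toy stand-in
    for markerless tracking; the real tracker is proprietary."""
    ref = prev[y:y+size, x:x+size]
    best, best_dy, best_dx = np.inf, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + size > curr.shape[0] or xx + size > curr.shape[1]:
                continue
            err = float(np.sum((curr[yy:yy+size, xx:xx+size] - ref) ** 2))
            if err < best:
                best, best_dy, best_dx = err, dy, dx
    return best_dy, best_dx

# Synthetic check: shift a random frame by (2, 3) and recover the motion.
rng = np.random.default_rng(0)
frame0 = rng.random((64, 64))
frame1 = np.roll(frame0, shift=(2, 3), axis=(0, 1))
print(track_patch(frame0, frame1, 20, 20))  # (2, 3)
```

Production trackers are of course far more sophisticated (sub-pixel, multi-scale, robust to lighting changes), but the core idea of recovering motion directly from the footage, with no markers, is the same.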

 

With the extracted elements saved out as layers, to be reapplied at the end of the process, artists could start reshaping his face without breaking the original performance or footage. Artists could tweak additional frames in 3D down the line if required, but they often didn't need to, making the de-aging process nearly automated.
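The "extract as layers, edit, reapply" idea can be illustrated with a simple frequency separation, a common digital-makeup pattern (an assumption for illustration, not Gradient's actual pipeline): fine skin detail is split from the coarse shape, the coarse layer is edited, and the detail is layered back on unchanged.

```python
import numpy as np

def box_blur(img, k=5):
    """Simple box blur (edge-padded) used to isolate the coarse layer."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def separate(frame):
    base = box_blur(frame)   # coarse shape/lighting layer (gets reshaped)
    detail = frame - base    # fine skin-detail layer (preserved, reapplied)
    return base, detail

def recombine(base, detail):
    return base + detail

rng = np.random.default_rng(1)
frame = rng.random((32, 32))
base, detail = separate(frame)
# Untouched layers reassemble the original frame exactly:
print(np.allclose(recombine(base, detail), frame))  # True
```

Because the detail layer is preserved verbatim, edits to the coarse layer never destroy the fine texture of the original footage, which mirrors the article's point about not breaking the performance.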

“Shapeshifter is the first of its kind – an entirely new way to de-age people,” said Olcun Tan, owner and visual effects supervisor at Gradient Effects. “While most productions are limited by time or money, we can turn around award-quality VFX on a TV schedule, opening up new possibilities for shows and films.”

Traditionally, de-aging work for film and television has been done in one of two ways: through complex compositing or through CG head replacement, either of which can take months to achieve. Even with newer deepfake-style AI and training data, the range of motion and the fidelity of skin texture remain major issues with faces. Shapeshifter introduces a new method that not only preserves the actor's original performance but also interacts naturally with other objects in the scene. While Shapeshifter uses AI, it is not a deep learning (deepfake) solution. "No, it's not done that way," says Tan, who personally wrote the code and developed Shapeshifter. "It analyzes the motion of the face and the skin, how it stretches and moves. It then creates metadata for us in 3D," he explains. The program works like motion capture on the shot footage. "You're creating from the 'mocap' a skeleton system, with different layers, that is then available to us in Maya. From there the Gradient artists can go in and reshape one frame and it recalculates back, using the footage, and reshapes the face. We can still add to it or change it."

In other words, it produces a 3D representation of the face and adds the textures, all from the source camera footage, per shot. Goodman did not need to be scanned or go through a FACS session. "Every shot can have a slightly different result in the underlying mesh," he explains. "But whatever you do in reshaping the face in a shot will then be inherited throughout the whole shot. That's the most interesting part of our methodology: it doesn't throw away the filmed performance, and you don't have to reinvent the performance or clean it up with animators. It maintains all that because you're using the real actor's performance." As the face is rebuilt per shot, it picks up both the performance and the natural lighting of that shot. And because the animation is interpreted into muscle movements, an artist always has an accessible way of getting in and creatively adjusting the actor's face. It is not a FACS decomposition: Tan did not want to break the performance down into FACS Action Units and then reconstruct the face, though he did comment that he thinks he could eventually derive a FACS decomposition; the system does not require one and is not currently written to provide it.
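The "reshape one frame, inherit through the whole shot" workflow can be sketched as taking the artist's edit on a single frame as a per-vertex delta and applying it to the tracked mesh of every other frame. This is a deliberately simplified model of the idea, assuming a shared vertex correspondence across the shot (which the per-shot tracking provides):

```python
import numpy as np

# Hypothetical tracked face mesh: per-frame 3D vertex positions
# with consistent vertex correspondence across the shot.
num_frames, num_verts = 48, 100
rng = np.random.default_rng(2)
shot = rng.random((num_frames, num_verts, 3))

def propagate_edit(shot, edited_frame, edited_positions):
    """Take the artist's reshape on one frame as a per-vertex delta
    and inherit it across every frame of the shot."""
    delta = edited_positions - shot[edited_frame]
    return shot + delta[None, :, :]

# Toy edit: the artist lifts every vertex of frame 10 by 0.05 units in Y.
edit = shot[10].copy()
edit[:, 1] += 0.05
new_shot = propagate_edit(shot, 10, edit)
print(np.allclose(new_shot[10], edit))                        # True
print(np.allclose(new_shot[0, :, 1] - shot[0, :, 1], 0.05))   # True
```

The key property, as in the article, is that the underlying tracked motion is untouched: only the artist's offset rides on top of the captured performance.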

 

“One of the first shots of Interlude shows stage crew walking in front of John Goodman,” said Tan. “In the past, a studio would have recommended a full CGI replacement for Goodman’s character because it would be too hard or take too much time to maintain consistency across the shot. With Shapeshifter, we can just reshape one frame and the work is done.”

Shapeshifter runs as a plug-in to Autodesk Maya. Once a shot is finalled, it is rendered like any other 3D element; on The Righteous Gemstones the team rendered in Arnold. The tool has been in development for eight years and is currently being patented. The code is written in C++. For the first few projects it was used to replace actors' eyes and eye lines. One of the earliest digital makeup jobs: "we helped Leo DiCaprio lose weight in The Revenant," recalls Tan.

One huge advantage of the Shapeshifter tool is time. "I think the most important thing about this tool is that it came from thinking outside the box. Instead of doing 3D scanning of somebody's face, the mantra for our process from day one has always been: avoid that scanning process. We tried to use what we get in the footage, bypass those 3D processes and just get into it," says Tan.

This is possible because Shapeshifter continuously captures the face, including all of its essential details, using the source footage as its guide. With the data being constantly logged, artists can extract movement information from anywhere on the face, whenever they want, replacing expensive motion-capture stages, equipment and makeup teams. The approach is therefore not limited to de-aging; it could be used for a host of digital makeup and other applications. The team used an earlier version on season 1 of Stranger Things, for the plasma goo seen as one of the characters moves out of a tree trunk (see below: bottom).

 

One of the key aspects of the technology is that it captures not only the face and its texture but also its elasticity: "how the skin stretches and the elasticity of the skin."

As faces age they sag, ears get bigger and skin falls, so the process still requires compositing work to inpaint the missing background wherever Goodman's face is lifted. This work is very similar to hand-generated 3D stereo conversion work. The compositing was all done in Nuke. There are also filters applied to the face so that the skin texture does not appear softened by the image processing. The actual extent of any facial feature reduction is artist-controlled. Goodman's younger self also wore a wig, which needed minor blending work.

At the moment there are no plans to sell or license Shapeshifter; it is Tan's personal labour of love, and a major competitive advantage for both Gradient and SCRTLB – Secret Lab. Until the patents are final he is not talking publicly, but he is proud to now have a fully featured version of the software running, and once it is patented he does plan to publish.

Non-face effects

A good example of a non-face use comes from The Revenant. In one sequence, the team needed digital arrows to hit and fall into the water of a river, so they used Shapeshifter to produce what was effectively motion-capture data of the river's surface. "There's an epic arrow scene, when they get attacked by the Indians and they are fired on with arrows. There are of course no tracking markers in the water, it's a stream, but Shapeshifter tracked the arrows hitting the surface of the water." The output from Shapeshifter in this shot was an arbitrary set of tracking markers, but all as structured data. Shapeshifter does not track everything, but Tan estimates it works 95% of the time.
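The real output schema is not public, but "an arbitrary set of tracking markers as structured data" could look something like a per-frame table of tracked surface points that downstream tools can query. A hypothetical sketch (ids, frames and positions below are purely illustrative):

```python
import json

# Hypothetical structured tracker output: each observation carries a
# marker id, the frame it was seen in, its 2D image position and a
# recovered 3D surface position. All values are made up for illustration.
markers = [
    {"id": 0, "frame": 101, "px": [412.3, 220.8], "xyz": [1.2, 0.00, 3.4]},
    {"id": 0, "frame": 102, "px": [413.1, 221.5], "xyz": [1.2, 0.01, 3.4]},
    {"id": 1, "frame": 101, "px": [518.0, 240.2], "xyz": [1.9, 0.00, 3.1]},
]

def trajectory(markers, marker_id):
    """Collect one marker's observations across frames, sorted by frame."""
    obs = [m for m in markers if m["id"] == marker_id]
    return sorted(obs, key=lambda m: m["frame"])

print([m["frame"] for m in trajectory(markers, 0)])  # [101, 102]
print(json.dumps(markers[0]))  # serializes cleanly for other tools
```

Because the markers are structured data rather than baked pixels, an FX artist can attach simulated elements (such as arrow splashes) to any tracked point after the fact.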

 

The effects of Shapeshifter in a different form can be seen in the following sequence from Series 1 of Stranger Things. Here it was used to analyze and track the motion for the millions of ‘ooze’ elements added in post-production.

 

The Righteous Gemstones is on HBO and Episode 5 first aired on September 15, 2019.