Sweet Tooth is an American fantasy drama series developed by Jim Mickle. Based on the comic book of the same name by Jeff Lemire, it premiered on Netflix in June 2021. After an unexplained viral pandemic wipes out most of the world’s human population, a mysterious generation of hybrid babies is born, part human and part animal. The show centers on Gus (Christian Convery), a half-deer hybrid who lives in the wilderness with his father until his father passes away. Gus then teams up with a reluctant Tommy Jepperd (Nonso Anozie), and the two set off for Colorado to find Gus’ mother. General Abbot (Neil Sandilands), the leader of the Last Men, hunts hybrids and uses the children for medical research.
The show has now been watched by 60 million households, placing it at number six on Netflix’s list of most-watched English-language original series. Netflix has ordered a second season of the show, which is based on Jeff Lemire’s DC Comics series. Like the first, the new season will comprise eight episodes. In July 2020, New Zealand granted the production permission to film there despite the travel restrictions imposed during the COVID-19 pandemic. The visual effects were led by Rob Price as VFX Supervisor at Zoic Studios, with Matt Bramante, Pania Williams, and Jacob Leaf as the New Zealand on-set supervisors. Zoic got involved fairly early, working on the pilot for the series; this included concept art, visualizations, and pre-viz.
Digital and Practical Makeup
The baby and child hybrid facial effects were a mix of prosthetics and digital effects. For the very young babies, most of the work was digital, but for the main character Gus, 11-year-old Convery wore prosthetic makeup that was enhanced as needed. For example, in Episode 3, it was important to see Gus’ ears move in very distinctive ways, and in such cases the ears were replaced with digital ones. “We added a lot of subtle comp effects to the makeup, to add an extra layer of life to the costuming and special effects makeup,” comments Price. Not all the movement was digital, however: Gus had animatronic ears in many shots where the ear motion was not the primary focus. “For Gus, sometimes in editorial we would be called in to add that little something extra,” he adds. “And also for Wendy (Naledi Murray), whenever she needed a little nose snort… as her makeup was so tight to her face that it wasn’t practical to have animatronics at all.”
For the CG character work, the team mostly used Maya and V-Ray, but they also turned to specialist tools such as Ziva for the muscle simulation on the captured tiger in Episode 4 and the elephants in Episode 2. Fur grooming was done in Yeti, texturing in Substance, and sculpting in ZBrush.
The team of almost 200 artists completed 996 shots for season one. The compositing was done in Foundry’s Nuke, using several key plugins such as KeenTools FaceTracker, a Nuke node for high-quality facial tracking without mocap rigs or markers. “There was a lot of very interesting work done around Bobby, who was a practical puppet,” explains Price. “In a lot of shots, there would be three or four puppeteers to be removed in comp, and in addition to that, we also had to do quite a lot of facial animation.
“Eye movement, nose twitching, blinks, facial animation, we actually got quite good at it, all inside Nuke, not needing to do any 3D outside Nuke for the puppet enhancement work,” comments Price. “All of this was 2.5D, eye replacements, lip-sync when he talks, all done inside Nuke.”
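The core of any 2.5D approach, placing flat tracked patches at assumed depths rather than building a full 3D rig, can be shown with a toy pinhole-camera calculation. This is a generic sketch, not Zoic’s actual Nuke setup; the focal length, depths, and camera move below are invented purely for illustration:

```python
# 2.5D "card" parallax: a flat patch placed at depth z in front of a
# pinhole camera shifts on screen by f * dx / z when the camera
# translates laterally by dx. Nearer cards move more on screen, which
# is what sells the depth illusion without a full 3D scene.

def parallax_shift_px(focal_px, cam_dx, depth):
    """Horizontal screen shift (pixels) of a card at `depth` under a
    lateral camera move `cam_dx` (same units as depth)."""
    return focal_px * cam_dx / depth

# An eye-patch card close to camera vs. a background card:
# same camera move, very different screen motion (values invented).
eye = parallax_shift_px(focal_px=1800, cam_dx=0.02, depth=0.6)
bg = parallax_shift_px(focal_px=1800, cam_dx=0.02, depth=12.0)
print(eye, bg)  # → 60.0 3.0
```

Animating such cards (offset, scale, warp) per frame is enough for eye replacements or small face fixes, which is why this kind of work can stay entirely inside a compositor.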
For the wider shots where Bobby was walking in and out of frame, the team moved to a fully CG digital double of the character.
A similar problem occurred with the flashback sequence in the delivery room, where the babies were themselves puppets. There were so many puppeteers that everything but the babies was digitally redone: the entire room, floor to ceiling, was a digital environment.
All of the wide city shots were digital. The production did shoot in Auckland for many of the ground-based city street shots, so anything close to the camera, including a large amount of foliage, was practical, while the backgrounds were modified and extended.
Zoic used Houdini heavily for the cityscapes and procedural wide shots, including cars and some of the foliage. “We developed some tools inside Houdini for vegetation growth,” Price explains. “A lot of the out-of-the-box programs for vegetation don’t grow the greenery in a natural way.” The Zoic team wanted the plants not just to populate the scene but to be positioned according to plausible growth patterns, the kind that would naturally result from shade, exposure, and whatever objects or obstacles the plants would need to grow around. “We actually developed more natural tools that grow things, starting with vines, but then we implemented flowers and grasses… mostly inside Houdini, rendered with V-Ray.” The team also did some Maya work, but the primary pipeline was Houdini.
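The idea of growing vegetation iteratively, rather than simply scattering it, can be sketched in a toy 2D version. This is a simplified illustration, not Zoic’s Houdini tool; the grid, the crude “light” model, and the growth weights are all invented assumptions:

```python
import random

# Toy 2D scene: '#' = obstacle (e.g. a wall), '.' = open ground.
W, H = 20, 10
obstacles = {(8, y) for y in range(3, 10)}  # a vertical wall

def light(cell):
    """Crude exposure term: cells near the wall count as shadier."""
    x, y = cell
    d = min(abs(x - ox) + abs(y - oy) for ox, oy in obstacles)
    return min(d, 5) / 5.0  # 0 = full shade, 1 = full sun

def neighbors(cell):
    x, y = cell
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < W and 0 <= ny < H and (nx, ny) not in obstacles:
            yield (nx, ny)

def grow(seed, steps, rng):
    """Grow a vine from `seed`, biased toward better-lit open cells,
    so the plant appears to seek light and wind around obstacles."""
    vine = {seed}
    frontier = [seed]
    for _ in range(steps):
        if not frontier:
            break
        cell = rng.choice(frontier)
        options = [n for n in neighbors(cell) if n not in vine]
        if not options:
            frontier.remove(cell)  # this branch tip is blocked
            continue
        # Weight candidate cells by exposure so growth follows the light.
        weights = [0.1 + light(n) for n in options]
        nxt = rng.choices(options, weights=weights, k=1)[0]
        vine.add(nxt)
        frontier.append(nxt)
    return vine

rng = random.Random(7)
vine = grow(seed=(2, 8), steps=60, rng=rng)
for y in range(H):
    print(''.join('#' if (x, y) in obstacles else
                  'v' if (x, y) in vine else '.' for x in range(W)))
```

A production version would work on 3D geometry with real occlusion and surface attributes, but the principle is the same: growth decisions made per step against the environment, rather than a one-shot scatter.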
Yellowstone National Park is a key plot point, and for this iconic American location the team modified the New Zealand bushland, with some 2.5D matte paintings for the wide shots. “But having said that, it was remarkable to me how well New Zealand did work as Colorado and parts of the Midwest,” Price recalls. One distinctive location was the Visitor’s Center with its cable car. This was primarily a set with dressing immediately outside the windows. “What was fun for this production was being able to use LED screens,” he adds. “For any of the views outside the Visitor Center, we had these 60ft LED screens, which displayed landscapes we made in Epic’s Unreal Engine.” These daylight, dusk, or night exteriors provided great flexibility on set and strong contact lighting from the exterior into the set piece. “You can really see that in the light play across the set in the lightning storm during the attack at night… getting all that light play in camera was really great for us.” The screens were also used when Gus and Tommy leave in the cable car: “We ended up filming a drone coming down the side of a real mountain and then we added in CG towers and wires to the environment, so we would get all the reflections and shadows in-camera as well,” he adds.
The show was delivered as a 4K master and post-produced in an ACES pipeline, with close attention to the HDR grade, as is becoming standard on Netflix shows. This is very much a standard pipeline for Zoic, who are known for their high-quality episodic work. Zoic produced all the effects, with four or five episodes overlapping in post-production at any one time, handled by specialist 3D teams plus ‘evens’ and ‘odds’ comp teams that bounced between episodes to composite and finish the sequences.
The Train Escape
In Episode 6, Big Man, Bear, and Gus manage to board the moving train to Colorado. When Gus jumps onto the train, the production filmed a stunt performer, and the team extended the train, replaced the background, replaced sections of the train, and extensively cleaned up the shot.
In some cases, getting the foreground elements to align with the background plates, and selling the movement relative to both the train and the camera, meant the team would build a fully digital environment.
Video Game Tools
Zoic’s Real Time Group virtual art department provided previsualization assets that allowed the creative team to optimize set design, lighting, lensing, and camera ahead of shooting. These pre-rendered assets, combined with on-set virtual production in the UE4 game engine, drove the LED walls mentioned above. Additionally, the Zoic team also had to make an actual video game as a visual prop within the show.