Bringing Robots to Life: DNEG’s Westworld

As we mentioned in our previous story on HBO’s Westworld, season three takes place three months after the Westworld massacre of season two. Unlike previous seasons, this season unfolds in the outside world rather than in the Park, but it is a world of self-driving cars, drone helicopter taxis, and mechanical robots.

The CG generated “Mech, the Riot Control Robot”. Image courtesy of DNEG © 2020 Home Box Office, Inc.

Animating a character and bringing it to life is a complex problem, even more so when the character cannot be motion captured and has only a rigid metal head that gives the animator no way to directly display expressions. In Westworld Season 3, the non-human robots were crafted by the artists at DNEG, split between LA and Vancouver. DNEG did a range of visual effects for the show, including car bombings and environment work, but fxguide decided to focus on just three particular robots when we spoke with DNEG’s VFX Supervisor Jeremy Fernsler and Animation Supervisor Ben Wiggs.

The three non-host robots are:

  • George, the construction robot,
  • Harriet, the hijacked ‘heist robot’,
  • Mech, the 14-foot-tall riot control attack droid.

CG generated ‘Harriet’. Image Courtesy of DNEG © 2020 Home Box Office, Inc.

“When DNEG first got involved with Westworld season 3, we were given the opportunity to do all the non-host robotics,” recalls Fernsler. “We did the Lab robot, Harriet, the construction robot, George, and the big riot control Mech… and that body of work grew a bit more to include a car chase, as well as a variety of other shots.”

The on-set data collection pipeline was handled by the production visual effects crew under the show’s overall VFX Supervisor, Jay Worth. “Jay and his team were very good about getting absolutely everything that we might need in terms of HDRs, reference photography, and scans,” Fernsler adds. On set, there was motion capture, with performers wearing inertial suits. “It was good to have these as a reference and see them in-camera… but we actually ended up almost never using the data. But we had those guys on camera, which was great for our animators to work against.”

Source image, courtesy of DNEG © 2020 Home Box Office, Inc.
Final Image courtesy of DNEG © 2020 Home Box Office, Inc.

The reason the motion capture data was not used came down to the issue of weight and movement. The non-host robots needed to feel both heavy and much simpler in technology than the hero hosts, who are at the center of the series. “The brief we got from the showrunners was that we had to make them feel very gritty and grounded in reality, rather than showy sci-fi… after all, this is only a couple of decades in the future,” comments Wiggs. “We approached it as if you could conceivably imagine that they were created by a company like Boston Dynamics… and so everything had to look physically achievable. Getting the performances right was all about them looking heavy and real but letting their cognitive ability shine through.” While the team did not directly copy Boston Dynamics, DNEG did study the way those real-world robots move, and it proved a better source of reference than the actors who stood in on set and provided the MoCap data.

What makes a Boston Dynamics Robot empathetic?

In recent years, real-world robotics has produced seemingly large leaps in mobility and agility. That success has not come from advances in isolated super-intelligence, the kind associated with the hosts in the TV show. The seeming ‘humanity’ the Boston Dynamics robots exhibit comes from a simpler form of constant emulation, yet the resulting ‘appearance of intelligence’ has created a strong popular view that robots today are much more natural and almost alive. Both in Westworld and in the real world, it is the way the robots react and adapt to their environment that produces the illusion of life, not vast leaps in computer processing. Without soft bodies or digital faces, it is this environmental reaction and movement that the animators at DNEG had to capture if they were to produce ‘dumb’ robots that were still lifelike and appealing to the audience. The audience had to perceive that these simple robots had intelligence, but not the vast intelligence of Dolores (Evan Rachel Wood) or Maeve (Thandie Newton). “We just didn’t have much to play with… I actually think that was part of what made this show so enjoyable for us; this was almost like going back to the basics of Animation 101 to communicate emotion,” says Wiggs.

In the case of Boston Dynamics, their form of embodied cognition has proven successful in dealing with complex real-world technical issues such as agility and mobility. Honda’s famous ASIMO robot was built using an older, traditional cognitive-computational approach, sometimes called the Computational Theory of Mind (CTM), which relies on a representational view of the world. Honda was able to make the robot walk and climb, but even minor issues could disrupt the robot as it tried to interpret data, update its model of the world, and decide on a course of action in response. Compare this to the work of Boston Dynamics, which has been building a series of robots, starting with BigDog, using a very different approach. BigDog can handle complex terrain, and later versions even manage to recover from violent knocks from testers or from walking on slippery ice.

‘BigDog’, a real-world Boston Dynamics robot.

Boston Dynamics decided that a computational strategy like CTM would be too slow and so opted for a dynamic system. They built a robot with springy legs and joints that mimic those of animal quadrupeds. BigDog has a comparatively small computer, and its success was not due to having access to a more powerful computer than ASIMO. The specific movements that BigDog and the later Boston Dynamics bipeds exhibit emerge from the interaction between the robot’s moving legs, the surface they are on, and any other forces acting on the robot. If a tester or engineer tries to knock over the robot, the robot does not re-compute its behavior; it simply responds to the new force, and the details are left up to the limbs or legs, in line with Dynamic Systems Theory (DST). What the DNEG animators needed to capture through keyframe animation was this DST style of movement. DST comes from a broad approach imported from the physical sciences and used in cognitive science as an alternative to the computational, information-processing approach. DST, as seen in Boston Dynamics robots, is at the core of the “cognitive ability shining through” that Wiggs sought to capture. In academic circles, DST is best described as complex, non-linear, self-organizing, and emergent. In this form of robotics, the ‘cognition happens’ in real time as a probable outcome of many possible alternatives being rapidly explored, rather than as a linear assembly of symbolic processes, as in CTM. It is this reactive motion that a robot such as Harriet in Westworld exhibits so strongly.
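To make the contrast concrete, below is a minimal, purely illustrative Python sketch of the DST idea (not Boston Dynamics’ control code, and not anything DNEG used): a single springy, damped ‘leg’. A knock from a tester is never re-planned or re-modeled; the impulse simply enters the force balance and the system settles back to rest on its own.

```python
# Minimal sketch of DST-style reactive balance (illustrative only).
# A point mass on a springy, damped "leg": an external shove is not
# re-planned; it just enters the force balance and decays away.

def simulate(steps=2000, dt=0.001, shove_at=500, impulse=4.0):
    k, c, m = 400.0, 12.0, 1.0   # spring stiffness, damping, mass
    x, v = 0.0, 0.0              # displacement from rest, velocity
    trace = []
    for i in range(steps):
        if i == shove_at:
            v += impulse / m      # an impulsive knock from a "tester"
        a = (-k * x - c * v) / m  # spring-damper restoring acceleration
        v += a * dt
        x += v * dt
        trace.append(x)
    return trace

trace = simulate()
print(f"peak |x| = {max(map(abs, trace)):.3f}, final |x| = {abs(trace[-1]):.5f}")
```

No model of the world is consulted anywhere in the loop; the recovery is entirely a by-product of the spring-damper physics, which is the sense in which DST cognition is ‘emergent’.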

Source image courtesy of DNEG © 2020 Home Box Office, Inc.
Final image courtesy of DNEG © 2020 Home Box Office, Inc.

Without a face or any of the traditional soft-body tools at their disposal, the DNEG character animators very successfully show motivation, sadness, and personality. Harriet and George, in particular, seem grounded in the frame, as they react so closely to the environment. The DST type of motion produces a very natural-looking physical solution to movement, especially if the robot is interfered with, such as being shot. Both in the real world and in the show, the resulting movement of the robots can be described as ‘very biological’. Ironically, it is a style of balance and movement that a human motion capture data set does not provide.

George the Construction Robot

DNEG worked on all eight episodes of Westworld season 3, with significant contributions to episodes 2, 5, and 6. George was one of the first robots to appear this season; he is a construction robot seen working alongside Caleb Nichols (Aaron Paul).

The robots were animated in Maya and rendered in Isotropix Clarisse. The animators mainly reviewed their work in Maya playblasts, but almost everything, including the Houdini simulations, was rendered in Clarisse.
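As a generic illustration of that review step (standard Maya Python, not DNEG’s pipeline code, and the output path is invented), a playblast of the current timeline can be written like this:

```python
# Generic Maya playblast for animation review (illustrative only).
import maya.cmds as cmds

# Use the current playback range as the review range.
start = cmds.playbackOptions(query=True, minTime=True)
end = cmds.playbackOptions(query=True, maxTime=True)

cmds.playblast(
    startTime=start, endTime=end,
    format="qt",                              # QuickTime movie
    filename="review/george_blocking_v001",   # hypothetical output path
    widthHeight=(1920, 1080),
    percent=100, quality=70,
    viewer=False, showOrnaments=False,        # no viewer popup, no HUD
)
```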

For the most part, the designs for the robots came from the production art department. “George, the construction robot, was very straightforward. The Art Department defined how it should look and, for the most part, by the time we were done modeling and rigging George, he looked exactly the same, minus some minor corrections for colliding geometry and joint construction,” explained Fernsler. “For the Harriet robot, there were a few more gray areas, especially in the way her hands were initially designed. We discovered that they didn’t work for one of the sequences, so we ended up doing a redesign on those.”

For the animation of George, to make him more ‘likable’, the inventive approach the team ended up with was “mimicry of Caleb’s actions,” says Fernsler. “If you see Caleb wiggling his feet having lunch, then George will wiggle his feet. If Caleb looks around, then George also looks around a beat later. It was just that idea that when you meet someone for the first time, if you mimic their actions, they might like you a little bit more, and thus George would be more endearing.”
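A simple way to get that ‘a beat later’ mimicry in Maya is to copy the lead character’s animation curve onto the robot’s matching control with a fixed frame offset. The sketch below is hypothetical (the control names are invented), not DNEG’s actual setup:

```python
# Hypothetical 'mimic a beat later' helper: copy keys from one
# control's attribute onto another, shifted by a fixed delay.
import maya.cmds as cmds

def mimic(source, target, attribute, delay_frames=12):
    """Copy source's keys on `attribute` to target, delayed by a beat."""
    copied = cmds.copyKey(source, attribute=attribute)  # returns key count
    if copied:
        cmds.pasteKey(target, attribute=attribute,
                      option="replace", timeOffset=delay_frames)

# e.g. George's look-around follows Caleb's half a second later (24 fps):
mimic("caleb_head_ctrl", "george_head_ctrl", "rotateY", delay_frames=12)
```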

Harriet – the hijacked ‘heist robot’

When the audience first meets Harriet, she is yet to be taken over, but the dynamic system approach is evident. “With Harriet, when she is in the laboratory at the very beginning, you get the idea there’s not much going on; she has a set of tasks to do, but she was animated to retain sort of micro-movements and these little balance actions, until she becomes activated and controlled, at which point the animators gave her a lot more of a sentient feeling,” explains Fernsler.
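One plausible way to keyframe those idle micro-movements is to scatter tiny, low-amplitude offset keys on a balance control so the robot never freezes completely still. This is a generic Maya sketch with invented control names, not DNEG’s method:

```python
# Hypothetical idle micro-movement: key small random offsets every few
# frames so an otherwise static control keeps drifting subtly.
import random
import maya.cmds as cmds

def add_micro_movement(ctrl, attribute, start, end, amplitude=0.15, step=18):
    """Key small random offsets on ctrl.attribute every `step` frames."""
    base = cmds.getAttr(f"{ctrl}.{attribute}")
    for frame in range(int(start), int(end) + 1, step):
        jitter = random.uniform(-amplitude, amplitude)
        cmds.setKeyframe(ctrl, attribute=attribute,
                         time=frame, value=base + jitter)

# e.g. a faint sway on Harriet's chest while she works in the lab:
add_micro_movement("harriet_chest_ctrl", "rotateZ", start=1, end=240)
```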

Image Courtesy of DNEG © 2020 Home Box Office, Inc.

Weight and movement were such an issue for the team, and this is especially hard for MoCap actors to mimic, particularly with Harriet. “The MoCap performers were roughly 150 to 170 pounds, and they were great, but it’s very difficult for them to act like they weigh 300 or 400 pounds,” comments Fernsler. “We worked out that Harriet weighs about 450 pounds, George weighs 600 pounds, and the Mech weighs a whole lot more!” An example of how this materializes on screen is when Harriet goes through glass windows or doors: they just don’t slow her down, since the weight ratio and momentum are completely different than with an actual actor weighing one-third as much. “And I think that all gives credence to the power that she has behind her and the mass that she was carrying,” says Fernsler.
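The arithmetic behind that weight differential is simple: at the same speed, momentum scales linearly with mass, so the quoted figures put Harriet at roughly three times, and George at four times, the momentum of a 150-pound performer. A back-of-envelope check (the walking speed is an assumption):

```python
# Back-of-envelope momentum comparison from the quoted weights.
# p = m * v, so at equal speed momentum scales linearly with mass.
LB_TO_KG = 0.4536
speed = 1.5  # m/s, an assumed brisk walking pace

for name, pounds in [("MoCap performer", 150),
                     ("Harriet", 450),
                     ("George", 600)]:
    mass_kg = pounds * LB_TO_KG
    print(f"{name:15s} {pounds:4d} lb -> momentum {mass_kg * speed:6.1f} kg*m/s")
```

At three to four times the momentum, a pane of glass that would visibly check a stunt performer barely registers on Harriet, which is exactly the effect described above.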

Source image courtesy of DNEG © 2020 Home Box Office, Inc.
Final image courtesy of DNEG © 2020 Home Box Office, Inc.

Mech, the Riot Control / Attack Droid

One of the more unusual robots is the Mech riot control robot. This huge machine self-assembles from a set of packing crates and ends up attacking staff and security guards. The assembled Mech is fast-moving but huge, standing over 14 feet high.

Source image courtesy of DNEG © 2020 Home Box Office, Inc.
The actual size of the Mech can be seen clearly here (below) compared with what the actors had to react to on set (top image). Image Courtesy of DNEG © 2020 Home Box Office, Inc.

A key aspect of character animation is normally secondary motion. DNEG did a lot of testing and experimentation to see how this could be introduced into hard-body, metal robots such as the Mech. “We ended up using some quite neat tools that allowed the animators to very quickly try out a kind of secondary jiggle on the component parts of the machine,” says Wiggs. “So often what holds a piece of animation back from looking heavy is if the asset looks like plastic and it moves in too rigid a manner.” DNEG in Vancouver did a lot of work to visually sell that all the pieces of the Mech were actual pieces that fitted together on an underlying frame. “If you look at the design of the Mech, its body paneling is made up of these individual panels, with sort of little flaps that connect to places on the body…” DNEG’s animation team adopted an approach not unlike skin sliding over muscles on a biological creature, explained Wiggs, adding, “the idea is that there is an underlying sort of mechanical skeleton, and on top of that, everything can have a bit of secondary jiggle. This just upped the heaviness factor by several different factors.”
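The core idea of that secondary jiggle can be sketched as a simple per-panel damped spring: each panel chases the value of the skeleton it is mounted on, arriving slightly late and overshooting before settling. This is an illustrative stand-in for the concept, not DNEG’s actual tool:

```python
# Illustrative spring-lag "jiggle" on a baked animation channel:
# each panel value chases the underlying skeleton value through a
# damped spring, so hard stops produce a late, decaying overshoot.

def jiggle(frames, stiffness=0.25, damping=0.6):
    """frames: per-frame values of the underlying skeleton channel.
    Returns the lagging, overshooting values for the attached panel."""
    value, velocity = frames[0], 0.0
    out = []
    for target in frames:
        velocity += (target - value) * stiffness  # spring pull toward target
        velocity *= damping                       # per-frame energy loss
        value += velocity
        out.append(value)
    return out

# A hard stop at frame 10 makes the panel overshoot, then settle:
skeleton = [0.0] * 10 + [5.0] * 30
print([round(v, 2) for v in jiggle(skeleton)])
```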

The in-house tool the team used for this process was developed some time ago by the rigging team at DNEG and is called ‘Animator Friendly Dynamics’. “It was built in order to allow the rigs the animators use to be lighter weight,” Wiggs explains. “You can either have heavy rigs in animation, or lighter rigs with good tools to support them, and that is how this tool was born. You can create any piece of animation and then just input into this tool whatever components or secondary controls you want, such as, ‘I want this piece to have a bit of a timing delay’, and it solves that. I believe it’s built off Maya’s hair dynamics,” he adds. Once the effects were baked, the animators had full control to adjust or tweak the resulting movements to fine-tune them.
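The bake-then-tweak step Wiggs describes maps onto standard Maya functionality. As a generic illustration (the control name is hypothetical, and this is not the Animator Friendly Dynamics tool itself), a dynamic channel can be baked down to editable keys like this:

```python
# Generic bake of a dynamic channel to plain keyframes (illustrative).
import maya.cmds as cmds

start = cmds.playbackOptions(query=True, minTime=True)
end = cmds.playbackOptions(query=True, maxTime=True)

cmds.bakeResults(
    "mech_panel_ctrl.translateY",   # hypothetical secondary control
    time=(start, end),
    simulation=True,                # evaluate dynamics frame by frame
    sampleBy=1,
)
# The channel now holds ordinary keys an animator can hand-adjust
# in the Graph Editor, as described above.
```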

Source image courtesy of DNEG © 2020 Home Box Office, Inc.
Final image courtesy of DNEG © 2020 Home Box Office, Inc.

For the most part, the DNEG team stayed with the blocking and framings as they had been shot. As the actors did not have anything of exactly the right size to react to on set, some work was done in animation to make the eye lines work. “The actors for the sequence when the Mech crashes through the wall didn’t have anything to work against on set, and so they were pretty much pantomiming where they thought they would be seeing the 14-foot Mech,” says Fernsler. “It did give us some difficulty with eye lines, but we tried to solve as many of those problems through animation as possible, just making sure that our poses were putting the bot, or the bot’s appendages, in the places where it would make sense based on their eye lines filmed in-camera.”

As the Mech picks up and throws a security guard and kills several others, there was both digital takeover from stunt wire work and ragdoll simulation of fully digital (dead) guards.

As shot, source image courtesy of DNEG © 2020 Home Box Office, Inc.
Final with animation and simulations.  Image Courtesy of DNEG © 2020 Home Box Office, Inc.

In the end, the DNEG team struck the right balance: robots that read as intelligent yet still inferior to the hosts, grounded in reality but empathetic and visually appealing.