Alive and animated: how DD created ‘Robotarm’

In ‘Robotarm’, a fully-CG Cisco TVC completed by Digital Domain/Mothership, nine automotive assembly robots diagnose and fix themselves, all to the tune of Gary Numan’s ‘Cars’. We talk to visual effects supervisor Aladino Debert about the challenges of completing the spot on a tight four-week schedule.

Watch the final spot.

fxg: Congratulations on the spot. It’s very hard to believe that this was all-CG, and I guess photorealism was the mandate going in to make it really have resonance?

Debert: Well the only reason we agreed to do it in the crazy amount of time that we had was because we knew the realism was something we were comfortable pulling off. What we wanted to do wasn’t a ‘cool CG spot’ – what we wanted to do was a cool spot. In this particular case, CG was the one way you could pull it off. Neither David Rosenbaum the director nor I wanted to do it in CG just to pull off the latest greatest CG ever – we looked at it and thought, ‘Was it something we could shoot in the time?’ Well, no. ‘Can we do this all in CG?’ And we thought about it a little bit and said, ‘Yeah, I think we can do it.’ For us we were more concerned with the storytelling and art direction than whether we could make it look real or not. Obviously if it didn’t look real then it would fall on its face, because then it would look like a cartoon.

fxg: How did you work through the design process? How quickly did that get resolved?

Debert: Funnily enough, when we first did the job the idea was to only use three robots – and I thought, ‘Yeah, sure, we can pull off three robots, why not?’ It seems like this happens more in short shows than long ones, but the creative changed a little bit, and the agency (and us too) wanted to make it slightly more character-driven. For that we needed to create robots that had more of an anthropomorphic face, something that could almost pass as a face, while keeping it within the confines of what a real industrial arm would be.

So we did some sketches in the first week. As we were already modeling and rigging and doing the more traditional-looking robots, we did a bunch of sketches based on the drawings we already had – pulling them apart and saying, ‘What could you grab to make it look anthropomorphic?’ Using those sketches we got the client to pick the ones they liked – we went back and forth a few times to finesse a few things like making the beak a bit longer or the eyes bigger. Then we went ahead and started modeling right away. We had literally four weeks and three days from the day the show was awarded.

Watch a work in progress of the spot.

fxg: How much does this tap into technique and artists from shows like Real Steel, and how much is this a different group with just similarly good guys, who know what they’re doing?

Debert: Obviously we cross-pollinate whenever possible across features and commercials. We all have feature experience here and there but in this case we didn’t use any staffers from features. However, our pipeline has been growing and getting more sophisticated as the years go by thanks to the experience both in commercials and in features. I couldn’t tell you that just because we did Real Steel we know how to do robots, because we’ve been doing cars and robots since long before Real Steel was around. Things like Real Steel also pull from past experience. Lee Carlton was CG supervisor and was invaluable. Adrian Dimond and Derek Crosby were responsible for rigging all these crazy robots. Without their work, of course, there wouldn’t have been any animation.

fxg: I imagine as a commercials group, you have digital cars down pat – you’ve done a heck of a lot of great digital cars – but this spot has that great authenticity of scratches and nicks and the kind of aging stuff that most people don’t want on a real car.

Debert: We walked a fine line on this. In fact, on the first round of look-dev the robots were even more beat-up. We probably went a bit too far, and the agency was concerned that Cisco would be a bit ticked off, you know, ‘Hey, we’re trying to showcase our high technology and all these robots look like they came out of a fight’. It’s not like there were dents, but they were pretty scratched up and a little dirtier. We really liked it but they asked us to pull it back. I always say that you can make a cube look good if you have good lighting. However, a robot here is already doing something slightly unrealistic – behaving in a way that a real robot wouldn’t. So for us it was very important that the look felt as close as possible to what a real robot in a factory would look like.

We looked at a tremendous amount of reference of automotive and industrial types of robots to see what kind of damage they have. What we noticed was that they’re pretty banged up – they work with big pieces of metal, particularly the ones that work with cars. I’m pretty sure once in a while they hit something or something falls on them. They had a fair amount of scratches and paint chips, particularly on the leading edges of their bodies. So we painted a huge amount of very specific textures for each of them. Because their topology was pretty different, we had to do it manually. It’s not something we could do as a procedural shader. I think it helps a lot with the realism of the piece.

Also, Adrian Graham did all our FX rigging. He created a tool that essentially did all the FX in one place – you could do spot welding, regular arc welding, leave residue or completely change the look of the sparks depending on what you were doing. You could also do the metal flecks from the drills – I kept calling them the fingernails – the little pieces of metal you get out of a drill.

fxg: In terms of texturing, you’re still using UV maps rather than something like Ptex?

Debert: Well depending on the artist and their choice of platform, some were doing it in Modo and just painting straight up into geometry and some were doing it in Photoshop.

fxg: What were you rendering with?

Debert: We render everything in V-Ray. It’s exactly the same pipeline as something like Real Steel or Transformers used.

fxg: Was it a 1920×1080 finish?

Debert: As a matter of fact we ended up rendering a little bigger – we did an overscan of all our renders because we wanted the flexibility to do some slight push-ins and lens distortions in compositing. Even though the final piece was HD, all our renders had about a 10-12% overscan. In the end I think we took advantage of that overscan in about two or three shots, but we wanted to have that in our pocket in case we needed it.
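The overscan arithmetic above can be sketched as follows. This is an illustrative calculation only – the 10% figure is from the interview, but the rounding convention and function are assumptions, not DD's pipeline:

```python
# Hypothetical sketch: scale a delivery resolution up by an overscan
# fraction so the HD frame can be pushed in or lens-distorted later
# without revealing the frame edge.

def overscan_resolution(width, height, overscan=0.10):
    """Scale a resolution by (1 + overscan), rounding to even
    pixel counts for codec friendliness."""
    ow = int(round(width * (1 + overscan) / 2)) * 2
    oh = int(round(height * (1 + overscan) / 2)) * 2
    return ow, oh

print(overscan_resolution(1920, 1080, 0.10))  # (2112, 1188)
```

A 10% overscan on a 1920×1080 frame gives roughly a 2112×1188 render, leaving about 96 pixels of slop on each side horizontally.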

fxg: On a project like this, where it’s all CG and on a tight timeline, do you assume a good DI grade at the end, or do you assume you’re going to nail it out of the gate and just do what you have to in compositing? After all, it wasn’t a heavily graded piece like a live-action spot with a certain look.

Debert: It was a bit of a mix of both. We worked very closely with the compositors as we were comp’ing the stuff to get it as close as we could to what we felt was the right grade. However, we always felt that we were going to have a couple of days at the end in Flame with the final shots to do a final, final grade. Jeff Heusser was responsible for that work. (NB: Jeff also worked as part of the Nuke team.)

One thing we had was a contact sheet of the entire spot every day, so we could see every shot next to the others and what the overall grade was looking like, just to make sure that everybody was aware. We had to do a lot of things at the same time, which is not your preferred way. We were doing animation as we were doing modeling as we were doing look dev. And as we were doing lighting we were doing comping and effects. We couldn’t wait, so we were working on a lot of things in parallel, which created a lot of complexity because our pipeline had to be rock-solid – things couldn’t break in the middle. You could still work on a model as somebody else was working on the textures for that model as we were animating and lighting. We literally changed textures – and animation! – two days before delivery. We rendered and comp’d and delivered the next day.

Going back to the grade, we always know on a full-CG spot, and even one that’s not full-CG, that we will be doing a little bit of a grade at the end in Flame. What we try to do is get as close as we can in the comp. It’s important to know that our rendered elements were coming out of our lighters and they looked amazing. I pushed very hard not to leave anything as a ‘we’ll fix it in comp’ kind of thing – because we knew we wouldn’t have any time. We called it ‘Light Comp’, where essentially the compers render it out and put it together in a simple comp just to check everything is alright. The Light Comps were already looking pretty amazing.

fxg: If this was a live action set, you could do a real HDR in the middle of the set and use that for the lighting. Did you do a fake HDR – actually generate one and hand it to everybody as the base?

Debert: We did actually. We created the first lighting rig of the factory and then we went ahead and rendered a spherical camera in the center of the factory in multiple exposures and we created a real HDR out of our renders. And we used that to render.
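The multi-exposure merge Debert describes – combining bracketed renders of a spherical camera into one HDR – can be sketched per-pixel like this. The mid-tone weighting and exposure times are illustrative assumptions, not DD's actual pipeline:

```python
# Rough sketch of merging multiple exposures of one pixel into a single
# HDR radiance estimate. Each sample is divided by its exposure time to
# recover radiance, then samples are averaged with a weight that favors
# mid-tones and down-weights clipped (near 0 or 1) values.

def merge_exposures(pixel_values, exposure_times):
    """pixel_values: linear [0,1] samples of one pixel, one per exposure.
    exposure_times: matching shutter times. Returns estimated radiance."""
    num = den = 0.0
    for v, t in zip(pixel_values, exposure_times):
        w = 1.0 - abs(2.0 * v - 1.0)  # peak weight at v=0.5, zero at 0 and 1
        num += w * (v / t)
        den += w
    if den == 0.0:  # every sample clipped: fall back to the brightest estimate
        return max(v / t for v, t in zip(pixel_values, exposure_times))
    return num / den
```

For example, a pixel reading 0.25 at a half-length exposure and 0.5 at the full exposure both imply the same radiance of 0.5, and the merge returns exactly that.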

fxg: So then you got consistency, as everyone had the same lighting base, which rules out errors that could otherwise come in from the tight time scale.

Debert: Well, the advantage of this particular spot is that the entire action happens inside a relatively limited space, so we knew we could come up with a lighting rig that, if it worked in one shot, we could be pretty damn sure it could work for everything. We had to add a few lights for closer or wider shots here and there to get the kicks you would like, just like any cinematographer would do. We created a proper linearized HDR out of our render by creating a spherical camera and then we used that. That not only had the advantage of making sure the look was consistent across the board, but also improved our render times dramatically. Tim Jones was the first one to do the lighting development. He created our fake HDR based renders and the overall lighting rigs.

fxg: I guess the other thing is you had solid previs on this so you knew where you were going?

Debert: Well, when we started doing the previs, which was the day after it got awarded, the idea was still a little different. The idea for the spot was much more of a kind of ‘Daft Punk’ dance routine. They were going to be much more choreographed, rather than what it ended up being – they were almost written in spite of themselves. So the first few passes of previs were much more like a dance line and a much more traditional-looking assembly line, with literally 50 robots in line doing the same work. It was a different concept.

So for the first few days we were going ahead with that, and then the client requested a change of direction to something much more character-driven, which was not something we had planned on doing at the beginning. So normally, yes, the previs would dictate, but in this case it didn’t. We almost jumped into animation right away. Niles Heckman was my head of previs and became my lead compositor. I actually ended up animating four shots. It wasn’t planned, but it was so crazy and hard for us to find the right people at the right time. My idea was to have an animation lead, but we couldn’t get the right people to do that. So I decided to do one shot to kind of set the style for the animation, and that moved into the next. But it’s always a team effort – I kind of set the pace a little but we had really great artists working for us.

fxg: Was there any discussion going back and forwards on what one word or two words were going to appear on those screens, or was that decided at the start?

Debert: I think that the final approved screen text was done the day before we delivered, [laughs].

fxg: You had to have a very small number of words to convey the emotional feeling of the robots.

Debert: Yeah, we had to think, ‘Do we want to make it more high-tech, do we want to make it more analogue, like a screen, a smiley face.’ We did about four or five iterations and concepts, created by Cody Williams, but the actual one that ended up on the screen we got approval for the day before delivery. Obviously we had rendered everything out so we only had to render the screens and comp them.

fxg: But in a sense it’s hard to know until you’ve got it finished what’s going to play right.

Debert: And also because of the timing of the shots, one thing that we kept going back and forth on – and we changed animation a lot because of that – well, when it breaks down, do you want to go into a close-up of the screen so you really see? One thing I made sure that we kept doing was that I really wanted to see the conversation between the two robots be clean. In other words, ‘I’m broken’, ‘No problem’, ‘Let me fix you up’, ‘Thank you’. Some of the early iterations of the animation had him breaking down, then a few things would happen in the middle, and only later would the helper come in and say ‘No problem’. So I tried to make sure that not a lot of time happened between ‘I’m broken’ and ‘No problem’, otherwise that human connection we wanted to achieve between the two robots was lost.

fxg: I was going to say that even the font you would have chosen gives it personality. It’s almost like the font is the accent that the line is delivered in?

Debert: Absolutely, and we tried a lot of iterations of that too. It’s cool because what we ended up going with was very clean, very clear and simple, which is what I think needed to happen in order for this to be readable. It’s a 30 second spot with 20 shots, so we had two-and-a-half seconds to read something while you’re still digesting the performance.

fxg: I presume that right out of the gate you had the actual music ‘Cars’ because that would be pivotal to the editing?

Debert: [Laughs] You’re pointing out all the things that I wish I had had! The final track was approved and mixed a week before delivery.

fxg: So there was a chance you’d actually have different beats?

Debert: The main song changed three times – to completely different songs. First it was Daft Punk, then there was one about working on the railroad, then it ended up being ‘Cars’. We had a temp version of ‘Cars’ two weeks before delivery. We had to go into animation and tweak – not everything – but I wanted to make sure from the get-go that we kept the way we were animating the robots standardized. So the timing and the way they would get to places and how soon they would start doing an action after getting to a particular place was very specific. For instance, it would take 20 frames to rotate, and then it has a 5 frame hold, a 10 frame move, another 5 frame hold and then it goes into the action.

The reason why I wanted to do that is two-fold. Number one, that’s the way robots move in a factory – they’re not all loosey-goosey, moving all around; they’re very precise. I also wanted to keep it mathematically accurate in terms of the beat. Then, when we were pulling our hair out after finding out we had a different song, we actually had to go and shift animation here and there for the key moments to hit some beats.
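The beat-matching Debert describes – keeping key animation moments mathematically locked to the music – amounts to mapping beats onto frame numbers. A minimal sketch, with an assumed tempo and frame rate rather than the actual session values:

```python
# Illustrative only: find the frame nearest each musical beat so key
# animation hits (a rotate, a hold ending) can land on the beat.

def beat_frames(bpm, fps, n_beats):
    """Return the frame number nearest each of the first n_beats beats."""
    frames_per_beat = fps * 60.0 / bpm
    return [round(i * frames_per_beat) for i in range(n_beats)]

# e.g. at an assumed 120 BPM and 24 fps, a beat lands every 12 frames:
print(beat_frames(120, 24, 4))  # [0, 12, 24, 36]
```

Swapping in a different song just changes `bpm`, which is why a tempo change late in the schedule forced animation shifts on the key moments.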

When we got the final mix, lo and behold, it wasn’t exactly accurate to what we were working with. So what we had to do was go into editorial and shift things a little bit – 5 frames here and there. We were all working with 10 frame handles and thank God that was enough to shift some of the shots. Some of them we ended up using all the heads and all the tails and that’s all we had.
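The constraint here – each rendered shot carries extra head and tail frames ("handles"), and a shot can only be slipped by as many frames as its handles cover – can be sketched as a simple clamp. The function name and the 10-frame default are illustrative, matching the figure quoted above:

```python
# Hypothetical sketch: limit a requested editorial slip (in frames) to
# what a shot's rendered handles allow. A positive slip consumes tail
# frames; a negative slip consumes head frames.

def clamp_slip(requested_shift, head_handle=10, tail_handle=10):
    """Clamp requested_shift to the range the handles cover."""
    return max(-head_handle, min(tail_handle, requested_shift))

print(clamp_slip(5))    # 5   -- within the 10-frame handles
print(clamp_slip(-14))  # -10 -- capped at the head handle
```

A 5-frame nudge fits comfortably inside 10-frame handles; anything bigger than the handles means re-rendering or, as Debert says, using "all the heads and all the tails."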

fxg: Well congrats and thanks again for talking to us.

Credit List

Credits for: CISCO “ROBOTARM”
Directed by: David Rosenbaum
Client: CISCO
Agency: OgilvyWest
Chief Creative Officers: James Dawson-Hollis, Bill Wright
Associate Creative Directors: Jeff Heath, Dennis Lee
Agency Producer: Eric Rasco
Agency President: Heather MacPherson
Account Director: Efren Gonzalez
Account Management Supervisor: Sarah Tabbush

Production, Animation, Editorial & Visual Effects by: DIGITAL DOMAIN

Ed Ulbrich: President, Commercial Division; Executive Vice President
Tanya Cohen: Executive Producer
Scott Gemmell: Head of Production
Aladino Debert: Visual Effects Supervisor
Lee Carlton: CG Supervisor
William Lemmon: Producer
Alex Michael: Coordinator
Fred Fouquet: Editor
Colin Woods: Editor
Brian Creasey: Generalist, Previs Artist
Tim Jones: Generalist
Trisha McNamara: Generalist
Val Sinlao: Generalist
Anthony Ramirez: Generalist
Rick Thomas: Animator
Stephan Brezinsky: Animator
Ruel Smith: Animator
Tom St. Amand: Animator
Adrian Dimond: Technical Director
Derek Crosby: Technical Director
Tom Briggs: FX Artist
Adrian Graham: FX Artist
Cody Williams: Motion Graphics Artist
Jeff Heusser: Nuke Compositor, Flame Artist
Scott Hale: Nuke Compositor
Niles Heckman: Nuke Compositor, Previs Artist
Sven Dreesbach: Nuke Compositor

Music Composition: Gary Numan “Cars”
Re-record: kinseywhitehouse
Sound Design: Henryboy
Sound Designer: Bill Chesley
Producer: Kate Gibson
Mix: Rohan Young, Lime Studios
