Breaking down the Super Bowl spots – Part 2

In Part 2 of our 2012 Super Bowl TVCs coverage, we take a look at Psyop’s animation for the traveling-bugs Chevy piece and find out how Quietman executed a Pepsi MAX spot. Then we highlight the interactive Coca-Cola polar bears work from Framestore and Animal Logic, plus give a rundown of the effects by The Mill, Method and Eight VFX for several Super Bowl commercials. You can read Part 1 here.

Psyop’s joyous bugs

For this Chevy Sonic ‘Joy’ spot, Psyop worked with agency McCann New York/Detroit to deliver several adrenaline-seeking bugs who take a ride on a car grill. We asked Psyop about how the TVC came together.

fxg: At first it seems like a scary prospect for the bugs to take this journey, and obviously it turns into joy – can you tell me about the brief for this spot?

The spot was always intended to be an energetic ride with some badass bugs. These characters meet regularly to get their ‘buzz’ on – and there’s a real sense of anticipation and excitement as they get ready for the ride. It was always a high energy brief – and even from the pitch phase we ended every meeting with a “Weeeeeeee!”.

fxg: Each ‘bug’ seems to have a distinct personality – can you talk about the design process, and how you matched the animation to the design and voice actor?

In the design process we wanted to draw together a motley crew of thrill-seekers from different backgrounds. These guys would come from all over – and have the same intention. We needed to keep within the realm of photoreal insects though – so a lot of our subtle design features (eyebrows, chubbiness, subtle tattoos) were amplified by our animation and then the voice actors. We paired our bugs to certain celebrities for character reference – like the dragonfly was a crazy stuntman, and the spider had this creepy but outlandish character. It was a lot of fun.

fxg: What kind of ‘research’ could you do for this spot? Did you take any reference photography or footage of front grills and how did you look into bees, spiders, ants etc?

We used a lot of photographic and macro reference for the details on the bugs, and had a miniature museum exhibit here for a while with preserved bugs for reference. On the shoot we took a lot of reference of the grill so we could re-build it perfectly in CG (including all the fine details of the real dust, etc). For character reference and animation we drew from the creative minds of the Psyop team and, again, had a lot of fun.

fxg: How did you shoot plates for the close-up grill shots? What was done on the shoot to help with animation/compositing later on? Were the close-up shots of the grill all live action or did you reconstruct a section?

All of the close-up shots of the grill are completely reconstructed in CG. It was better to re-build and have full control of the shot. As always, we shot environmental and character lighting reference and background plates to give our artists as much to work with as possible.

fxg: What tools did you rely on for modeling, animation, rendering and compositing?

All of the modeling, rigging and animation was done in Autodesk Maya, and rendered in Arnold. The renders were then composited in Nuke and Flame. It was our first job in the LA office using a Maya Arnold rendering pipeline, and we’re more than thrilled with the results.

fxg: What were the extra details that made this spot work – such as wind rippling through wings and fur, and reflections in the car duco?

The speed and environmental effects certainly helped the bugs feel alive and interactive. The combination of the wind in the fur, fluttering antennae and interactive sunlight really gives that distinct feeling of speed. The finer details then help to subtly convince you of the realism – feeling the weight of the characters, the dust on the grill…the tiny hairs on the inchworm’s back.

fxg: How long did you have to work on the spot and how big was the crew?

The spot was about two and a half months from start to finish, with a crew of about 15.

From He-Man to Voltron – Psyop was also behind the Metlife ‘Everyone’ commercial which features a diverse range of cartoon characters. Check it out below.

Quietman checks in with Pepsi MAX

In a spot that continues the friendly rivalry between the Coke driver and the Pepsi driver, ‘Check Out’ for Pepsi MAX from TBWA\Chiat\Day features visual effects by Quietman. When the Coke driver reluctantly becomes a Pepsi winner, the store goes wild with falling confetti and balloons, pallets of Pepsi and the appearance of special presenter Regis Philbin.

“That was a whole greenscreen shot,” says Quietman creative director Johnnie Semerad. “It was a very efficient way to work. If you’re going to shoot in a grocery store, they usually make you do that from midnight – which means you don’t get all day. If you shoot it in a studio you get much more time and the actors and celebrities can all show up at normal times. And it makes it easier with all the confetti too.”

Quietman completed the entire visual effects work on Flame, initially completing a rough comp which was sent out for testing before finalizing the spot in just three days. “The thing about a spot like this,” says Semerad, “is that we stay up all night finishing the work, have a whole crew on it, and then you show it to your friends and they’re like, ‘What’d you do on it?’ That’s the challenge and that’s where you want to be – you want to be invisible.”

The bears are back, for Coca-Cola

In a unique take on interactivity and a throwback to the ads of days gone by, both Framestore and Animal Logic produced some of the most popular and successful Super Bowl advertising this year. Framestore partnered with Blitz Games Studios and agency Wieden & Kennedy to produce real-time rendered Coca-Cola CG polar bears during the actual game. A computer gaming engine, pre-rendered actions and Xbox controllers were all used to puppeteer the bears, who sat in an ‘ice cave’, performed motions responding to the game’s events and interacted with online viewers. We chat to Framestore Digital’s creative director Mike Woods.

fxg: This campaign seemed to capitalize on so many great things – social media, interactivity and the popularity of the bears themselves. How did the collaboration between Framestore and Blitz come about and what were the things you needed to solve early on to make the tech possible?

Woods: We’ve been doing work with games companies for a number of years. Cinematics, trailers, cut scenes, environments etc. We had good contacts, and had always kept a keen eye on real-time render options. Within Framestore Digital we have been very used to using Flash, Papervision and other real-time, user-controlled options, so we had a good idea of what we’d need to move this up to the next level. About 18 months ago we approached Blitz with an idea, as we were most impressed by their emotion and performance based studies, and felt that their intuitive talents, as well as BlitzTech, their engine, could offer us something very exciting.

I wanted to test a pipeline where we could take existing Framestore film and TV characters and place them onto game engine rigs. We worked this pipeline to perfection until I felt confident I could begin to roll out the tech to our best clients. Our early solves were the restrictions of gaming rigs, compared to what our animators were used to, and of course being able to achieve photoreal real-time shaders. Blitz blew us away with what they could do with texturing and lighting.

fxg: What kind of preparation and planning for the live interaction of the bears was necessary (ie. rigging, controls and pre-renders)?

Woods: There was a good mix of planning between what would be emotional response-based reaction animation, triggered by Xbox controllers, and what would be longer linear animation sequences, added by Framestore and triggered by a web-based UI.

fxg: Can you talk about the tech used to puppeteer the bears during the Super Bowl?

Woods: We had in excess of 100 different animations sitting within the BlitzTech game engine on a specced-up PC with an NVIDIA 550 graphics card. Two ‘puppeteers’ would each take control of a character. These character controllers could be switched into different emotional states of Positive, Neutral and Negative. Each of these joypad control states would engage different responses from the joypad buttons, through light, medium and heavy emotional responses. For example, a bear whose team was losing the game could be in a negative state, and if the opposition scored the puppeteer could hit the heavy response button to trigger an exaggerated animation of disapproval.

Both these bears are infinitely puppetable, and each button was loaded with more than one animation in each state to give us a really wide selection, so you never feel any kind of repetition or loop. Added to this was the director-controlled, web-based user interface, which could override the puppeteers and added another 65 animations covering things like either bear leaving the room, more exaggerated dances and all the other characters. All of these elements were effortlessly blended together in BlitzTech.
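The control scheme Woods describes – emotional states gating intensity buttons, with several animations per button to avoid visible loops – can be pictured with a minimal sketch. This is purely illustrative; the state, button and clip names are our own assumptions, not BlitzTech’s actual API.

```python
import random

# Hypothetical animation pools: emotional state -> button intensity -> clips.
# Multiple clips per button mean repeated presses don't look like a loop.
ANIMATIONS = {
    "negative": {
        "light":  ["slow_head_shake", "paw_over_eyes"],
        "medium": ["slump_on_couch", "groan_and_turn_away"],
        "heavy":  ["throw_scarf_down", "exaggerated_disapproval"],
    },
    "neutral": {
        "light":  ["idle_blink", "glance_at_tv"],
        "medium": ["sip_coke", "stretch"],
        "heavy":  ["stand_and_pace", "lean_forward"],
    },
    "positive": {
        "light":  ["nod_along", "ear_wiggle"],
        "medium": ["fist_pump", "high_five_gesture"],
        "heavy":  ["jump_off_couch", "victory_dance"],
    },
}

class BearPuppet:
    """One puppeteer's character: a state set by the joypad mode,
    plus intensity buttons that trigger a clip from the matching pool."""

    def __init__(self):
        self.state = "neutral"
        self.last_played = None

    def set_state(self, state):
        self.state = state

    def press(self, intensity):
        # Exclude the previously played clip so consecutive presses differ.
        pool = [a for a in ANIMATIONS[self.state][intensity]
                if a != self.last_played]
        clip = random.choice(pool)
        self.last_played = clip
        return clip

bear = BearPuppet()
bear.set_state("negative")        # the opposition just scored
clip = bear.press("heavy")        # an exaggerated disapproval animation
```

A director-controlled override layer, as described above, would simply bypass `press()` and queue its own longer linear sequences.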

fxg: What is the real-time render solution that Framestore/Blitz relied on? The results were incredible, but what are some of the current limitations and how do you think things will progress into the future?

Woods: Again, solely BlitzTech, with some Framestore handholding as we were receiving assets from the TV spot. Obviously fur is a problematic thing even in the pre-rendered world, so this needed a lot of thought. Any kind of particle, flocking, Houdini or grading issues were tricky, but not insurmountable. Real-time rendering is the future of our Digital department, so we were well experienced in being able to explain to the client in advance some of the obstacles that might be presented. Our pipeline excites us hugely. We’ve barely scratched the surface of what we could do with this. We are already embarking on another project where we can offer the control of characters to the consumer via a social network. This has huge potential consequences for any kind of entertainment, marketing, advertising or gaming.

fxg: So how did the day progress for you and your team? Were you able to do some rehearsals? What were some of your favorite results from the CG bears?

Woods: It was an incredible experience. We had rehearsed solidly for months in advance, continually using previous years’ games to practice with and to understand the range of animations that would be needed. Come the night of the game, we were locked into a broadcast room at MLB-BAM servers in NYC, working hand in hand with the Coke social media team as all tweets were indeed live, so needed absolute serendipity. It was a four-hour metaphysical marathon, as even the TV spots were made to correspond exactly.

Reintroducing the polar bears: In addition, Animal Logic, through both its Sydney and LA offices, completed three final spots – ‘Catch’ (above), ‘Superstition’ and ‘Arghh’ – featuring the polar bears that aired during the game. The director of the spots, David Scott, and CG supervisor Feargal Stewart tell us how the commercials were produced.

fxg: Can you talk about having to tap into the original bears and where you were able to take the spots?

David Scott (director): For the inception of the whole thing, it was key to Coca-Cola that we were able to reference back to those original spots. But then again we had to elevate the look and style of everything. One of the biggest challenges we had was making the bears relatable to a home audience. We made them look a little bit more like the fans at home, and tried to create a parallel between the look of the footballers and the bears themselves. The original polar bear campaign tried to go close to what a polar bear looked like, but in ours we gave them a little bit more upper body strength and really tried to get a lot more physicality across.

fxg: How were these spots designed, and how did you work with the agency and Framestore?

Scott: We were given a set of three scripts from the agency that were very funny, about these bears sitting on the couch watching this game. We submitted a treatment about how we’d go about the design process – what the approach would be to the environments, to the character design, to the look and feel and style. What resonated with the agency was that we wanted to make it look and feel like an actual football game. So I talked about how the Super Bowl is actually shot with a long lens coming across the field, for example.

So once we started working with the agency it was very collaborative from then on. We went through a lot of iterations with the bears and environments, and once we had our character design model we handed it off to Framestore. There was a little bit of back and forth to make sure they had what they needed. One of the things that was quite challenging was that our bears were going to be pre-rendered, so we wanted to do the full fur and shading, but of course Framestore’s had to be real-time.

fxg: What things did you guys have to solve in terms of how the bears looked and fitting them into the environments?

Feargal Stewart (CG supervisor): We were using a lot of technology that was developed on the film side of things at Animal. We have a solid fur system. For animation, we had a bald version of the bear, and the animators worked with a ‘volume’ model that takes up the same space a fully furred bear would. In terms of the icy environments, one of the things that really helped was the effects work we had developed on Happy Feet. We created footprints and scratches in the ice, and snow-kicks.

fxg: Was it a pure keyframe animation approach?

Scott: It was all keyframe. The fun part was coming up with all the moves, say in ‘Catch’. We would run around and come up with all these moves ourselves. Animation dailies would involve getting up, rolling around and brawling with each other.

fxg: Can you talk about the fine details, such as the cloth for the scarves?

Stewart: The scarves were definitely tricky to do. From an animation viewpoint we needed them to deform and move around with the bear as a base, and on top of that the animators had a lot of control to keyframe them. They could adjust the shapes if a scarf stretched too much, or if they needed to slide it up the neck they could do that. On top of that we could run cloth sims. So it was a three-layered approach.

Scott: From a directing point of view, it was handy to have all those layers. When the bears are running and colliding with each other it was really important to get strong silhouette value out of that. That’s something you can really only control by having an animator hone it. And then to get that extra detail from the cloth sim was fantastic.
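One way to picture the three-layered scarf setup described above: each scarf point’s final position is a base deformation that follows the bear, plus an animator’s keyframed offset, plus a cloth-sim correction. This is a toy sketch under that assumption – the function and layer names are ours, not Animal Logic’s actual rig.

```python
# Illustrative three-layer evaluation (not Animal Logic's actual rig):
# the layers simply stack, so the animator can override the base and the
# sim adds secondary motion on top of both.

def evaluate_scarf(points, frame, base_deform, keyframe_offset, cloth_sim):
    """Each layer returns a per-point (x, y, z) contribution; sum them."""
    result = []
    for i, p in enumerate(points):
        base = base_deform(p, frame)      # rides with the bear's neck
        anim = keyframe_offset(i, frame)  # animator's silhouette fixes
        sim = cloth_sim(i, frame)         # simulated flutter/settle
        result.append(tuple(b + a + s for b, a, s in zip(base, anim, sim)))
    return result

# Toy layers: the base passes the rest position through, the animator
# slides the scarf up the neck by 0.1 units, the sim adds a small flutter.
rest = [(0.0, 1.0, 0.0), (0.0, 0.8, 0.1)]
final = evaluate_scarf(
    rest, frame=12,
    base_deform=lambda p, f: p,
    keyframe_offset=lambda i, f: (0.0, 0.1, 0.0),
    cloth_sim=lambda i, f: (0.02, 0.0, 0.0),
)
# final[0] -> (0.02, 1.1, 0.0)
```

Because the layers are additive, zeroing out the sim layer leaves the animator’s hand-keyed silhouette intact – which is why, as Scott notes, the directing control stays with the animator.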

fxg: What were the tools you were using?

Stewart: We used XSI and Maya for animation and the cloth sims, and RenderMan for rendering, including the fur. We have some in-house shaders. Also, all the displacement and fine detail comes from really detailed displacement maps for the snow, which also go through RenderMan.

fxg: What about the product shot – the Coke bottle itself – how much was that scrutinized and how important was that to get right?

Scott: It was obviously a very important element. Through the whole process we set it up to give us the greatest amount of control in the final grading stage. We had control of nearly every single element on the bottle, from the quality of the amber glow to the label and the text on it. That came down to great shading in the first place, but also having overall control in the final grade.

fxg: What kind of time frame did you have on this and the team size?

Scott: It was a relatively modest team with about 30-40 people over four months, rolling on and off throughout the entire process.

fxg: How did you work between the LA and Sydney offices?

Stewart: Finishing the spot over in LA was a really good process after we had worked on it in Sydney. Having the client in a suite in LA and the rest of the team feeding stuff through – pushing renders across with grade maps, and then grading over there.

Scott: Throughout the process we were working remotely. We would have reviews with them in the morning at the end of their day, and then as their day was starting up it was the beginning of our day. Finally we went to LA for the entire last month – there’s nothing that beats having the client in the actual room and being able to hone the spot. We have a great pipeline between LA and Sydney, so we could identify changes quickly. You’d think a change might be hard, but we have it set up so that, say, a camera is just an asset along a long line of events. That camera can then go into an automated pipeline, and within half a day the entire shot can be put through the whole process again. We even had to leave the colors of the scarves until the last minute. The playoff final was only a week before the Super Bowl – but we had the scarf colors matched to the team colors from the day we knew who the teams would be.
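The automated pipeline Scott describes – a changed asset like a camera flowing back through a chain of events – can be sketched as a simple dependency lookup: find the earliest step that consumes the changed asset and re-run everything downstream. The step and asset names here are our own illustrative assumptions, not Animal Logic’s actual pipeline.

```python
# Minimal sketch of an asset-driven rebuild: a shot is an ordered chain of
# steps, each consuming upstream assets; swapping one asset re-triggers the
# first step that uses it and everything after it.

PIPELINE = ["layout", "animation_bake", "fur_render", "comp", "grade"]

def rebuild_shot(changed_asset, depends_on):
    """Return the steps to re-run after `changed_asset` is replaced."""
    first = min(PIPELINE.index(step)
                for step, assets in depends_on.items()
                if changed_asset in assets)
    return PIPELINE[first:]

# Hypothetical dependency map: which assets each step consumes.
deps = {
    "layout":         {"camera", "set"},
    "animation_bake": {"camera", "bear_rig"},
    "fur_render":     {"fur_groom"},
    "comp":           {"plates"},
    "grade":          {"grade_maps"},
}

steps = rebuild_shot("camera", deps)
# A new camera re-triggers layout onward: the whole shot goes back through.
```

This also illustrates the scarf-color scenario: a last-minute grade-map swap only re-runs the final step, which is why late team-color changes were cheap.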

A big year for The Mill

It was another significant year for The Mill – the studio contributed effects to several Super Bowl spots. There’s a neat round-up of them here at The Mill’s website, but we also feature a couple below, including behind the scenes of the Lexus ‘Beast’ ad.

More on Method

In Part 1 of our Super Bowl coverage we looked at Method Studios’ work on the epic Chevy Silverado spot. Among other spots, a worldwide effort from Method also contributed to Time Warner Cable’s ‘Enjoy Better Anthem’ TVC featuring CG Blackhawk helicopters, matte paintings, layered explosions and atmospheric effects. And the popular ‘Drive The Dream’ spot for the Kia 2012 Optima Limited combined forced perspective scale effects with CGI. Check out those two commercials below.

The transactions of Eight VFX

Finally, we take a quick look at Eight VFX’s work on the Acura ‘Transactions’ spot (the studio also worked on Skechers’ ‘Underdog’ and the Teleflora ‘Give’ commercials). The extended version of ‘Transactions’, below, follows Jerry Seinfeld in his attempt to become the first owner of an Acura NSX and required effects for holograms, a speed boat ride, a zip line through Manhattan and a jet-pack flying squirrel suit worn by Jay Leno.