In ‘Data Baby’, a beautiful spot for IBM directed by Mathew Cullen, Motion Theory visualised a world in which data envelops and surrounds a new-born infant. Part of a series of seven commercials, the spot utilised specially developed C code to give the data a more humanistic feel. We talk to visual effects supervisor John Fragomeni and art director Angela Zhu about the work.
fxg: What was your overall brief for the ‘Data Baby’ spot?
Angela Zhu: IBM came to us to visualise a world of data. The data had to be very fragile and humane. The difficulty was to find the balance between technology and humanity. One of the visual cues was creating a blanket of data hovering above the baby as a protective shell. The data blanket had to feel like a mother’s finger running over a baby’s face – that fragile love and protection is hard to recreate with technology. Technology is informational; humanity is emotional. We tried to give the data some character. Unlike conventional data visualisation, which often seems digital and divorced from reality, the IBM data has real-life physics and light properties and interacts with its environment. When the baby’s hand moves, the data moves with it. We also visualised a baby mobile. In the baby’s world, the first thing he sees, besides his parents, is the mobile. So when we designed the mobile we tried to capture the six categories of data IBM wanted to show through its technology: heart rate, respiratory rate, oxygen saturation, blood pressure, ECG and temperature. The mobile captures all six categories of essential information that tell of the baby’s well-being. When we were designing the mobile, it had to feel soft in camera, not scary.
John Fragomeni: One of the things about those categories was finding a voice or an expression for each of the elements. Part of the concept design was finding a common visual language so everything felt connected. We had to work out how to visualise it so that it’s grounded in data, but also so that the data drives the processing. We came up with a C visualisation of the data. That’s where the concepting came in, which was a jumping-off point for inspiration. In some ways we stayed true to that in the way the coders were directed to produce elements, but it also evolved, which is the creative process. We finally started taking this data and visualising it through C and various other methods. It was really important that this not only be a commercial but also a piece of generative art. It showed the really warm side of technology, which can be cold and faceless. This was more the humanity side – that every day, in every way, technology and data affect your life.
fxg: What was involved in the live action shoot?
Zhu: We recreated an IBM neonatal ward. Shooting the infant was a difficult task – four of our little stars were only two weeks old. We auditioned about 20 babies, all of them precious and beautiful, but we had to cut the cast down to four. It must have been tough for the parents!
Fragomeni: You could only shoot in 20-minute blocks, too. So we had close-up babies for the face shots and stunt babies for the arms and legs.
Zhu: It was a bit like a war zone… If one baby started crying on set, you’d have to swap in another. We’d have mums lining up ready for the swap. Everyone on set had to wear masks to minimize contamination. The camera could not get any closer than a foot or two away. There were many regulations and restrictions to protect the babies.
Fragomeni: Everything was shot on long lenses and in lower light. There were some filming restrictions but I don’t think when you look at the spot you can really tell. All kudos to Mathew Cullen, the director, and the production crew.
fxg: How did you go about translating the design work into the on-screen data graphics?
Zhu: There were two parts – design R&D and technical R&D. For the design R&D, we wanted to be inspired by nature’s smallest building blocks – elements that felt soft and light-based, such as the northern lights, bioluminescent animals, and complex, intricate geometries. From a technical point of view, we hired a professional MD to help us better understand real-life procedures in a neonatal ward – where and why each category is measured and placed. Basing our design on real-life data was fundamental; great solutions came with a better understanding of the problem. Each shape in the mobile had a distinct characteristic. For example, ECG is about connection, blood pressure is about circulation, heart rate is about pulsating, and respiratory rate is about expansion and contraction. They all had to hold their own concept and feel unified at the same time.
Fragomeni: It was all grounded in real-world information and there was a lot of technical accuracy that went with it. It was important to show how the data was interacting with the baby. It couldn’t be threatening in any way; it had to be comforting. The data ‘blanket’ was protective. The data that came off the baby was meant to be very organic, rather than like a digitised baby. In the early days we had the data much closer to the skin, but working that close, we found we needed to lift it further and further off the skin because it started to feel like a digital tattoo.
fxg: What visual effects techniques did you end up using?
Fragomeni: The data was grounded in the real world. What we did was take the data into C and that actually drove the expressions. The way blood pressure and heart rate work actually drove the coded expressions. We interpreted the heart rate as almost like a flower that bloomed from the surface and pulsed. We used the data to drive that shape so it moved in sync with the data. Beyond that, we reverse-engineered the whole thing from a visual effects point of view. This was particularly so because we couldn’t scan a baby. So we had to build four different baby models, and they had to be perfectly skinned to the baby’s surface and then matchmoved. We used those models as the basis to run curves or paths across the surface of the baby to follow the geometry and curvature of the body. Then we used that geometry to incorporate our code to create these visualisations of the data representing the six different technology functions. There was also a process of taking the CG body to create the actual blanket – we called it ‘bloated baby’. Then we used a UV mapping process and took the visualised data and almost wrapped the baby in a blanket of pure data. Everything had to be delicate and soft the whole time.
fxg: Can you talk about the tools involved?
Fragomeni: All our digital modeling was done in Maya, with additional CG work in Cinema 4D. We did a lot of lighting passes and Fresnel passes to help integrate the code into the CG. We used all those tricks and compositing tricks to make sure people didn’t think it was a digitised baby – that’s not what we wanted. The other challenge was that you never actually see the data emitted from the surface or skin of the baby; you see it coming off the blanket, almost like an aura. We used After Effects and Nuke for compositing, and Flame in the later stages.
fxg: What was the most challenging aspect of the spot do you think?
Fragomeni: It was all about finding that common language from a design point of view. Being pure to the design was important. You can get overwhelmed by a project like this and try to throw everything at it, but it can end up looking like you’ve thrown everything at it. Instead what we did was peel away the pieces that we thought didn’t work.
Zhu: The data being derived from real neonatal ward cases – we tried to keep it as authentic and realistic as possible. We were relieved and proud when the “Baby” finally delivered. Even though it was only a two-month project, it felt like a nine-month pregnancy!
IBM “Smarter Planet” Title: “Data Baby”
Agency: Ogilvy & Mather/NY
Executive Producer: Lee Weiss
Associate Producer: Rich Fiset
Sr. Partner/Worldwide ECD: Susan Westre
Sr. Partner/ECD: Tom Godici, Greg Ketchum
Creative Director: Rob Jamieson, Chris Van Oosterhout
Production Company: Motion Theory
Director: Mathew Cullen
Executive Producer: Javier Jimenez
Line Producer: John Marx
Director of Photography: Guillermo Navarro
VFX Company: Motion Theory
Creative Directors: Kaan Atilla, Mathew Cullen
Producer: Patrick Nugent
VFX Supervisor: John Fragomeni
Art Director: Angela Zhu
Design Leads: Paul B. Kim, Satomi Nagata
Designers: Heidi Berg, Leanne Dare, Kenneth Lee
Onset FX Supervisors: Sean Looper, Trevor Tuttle
3D/Nuke FX Lead: Marion Spates
3D/Lighting Lead: Trevor Tuttle
Pre-visualization: Trevor Tuttle
3D Artists/Animators: Brandon Lester, Gil Hacco, Casey Hupke, John Robson
Matchmover: Joe Cullen
Comp Lead: Danny Koenig
Compositors: Andrew Ashton, Chris Riehl, Daniel Raschko, John Stanch, Dorian West
Code Artist Supervisor: Keith Pasko
Lead Code Artist: Josh Nimoy
Code Artists: CJ Cenizal, Ekene Ijeoma, Jeremy Rotsztain
Finishing: Danny Yoon
Rotoscope Artists: Megan Gaffney, Gil Hacco, Rob Liscombe, Eva Snyder
Editorial Company: String
Editor: Jeff Consiglio
Assistant Editor: Jeff Aquino
SOUND DESIGN / MUSIC
Sound Design Company: Sound Lounge
V/O: Forest Whitaker
Music track: “Boatfriends” by Black Moth Super Rainbow
Executive Music Producer: Karl Westman