Smart Energy GB has enlisted the support of Albert Einstein to front its latest campaign, Join the Energy Revolution.
The new campaign, developed by AMV BBDO, with visual effects by The Mill, sees Albert Einstein transported from the 1950s to the present day using the latest CGI technology. He is amazed by some of the innovations that have happened since his time but can’t understand why Britain has not sufficiently upgraded its energy system. The integrated campaign, a series of ads across TV, YouTube, social, radio, and print, has Einstein explain the personal and environmental benefits of smart meters in his own charming way.
The 40” ad is designed to introduce Einstein to the audience, connect him memorably to smart meters, and land the core messaging around infrastructure. Four weeks down the line, with Einstein established as the voice of smart meters, a shorter-form 20” copy will be distributed, dramatizing the many benefits of smart meters from helping Great Britain reach net zero to saving consumers’ money.
In consumer research, Einstein was found to personify ‘smart’ as well as having the ability to create surprise and deliver a wide range of messages.
Global production partner The Mill were tasked with re-creating a digital version of Einstein using performance capture and CGI. A team of visual effects artists developed custom software and an innovative pipeline to ensure the digital Einstein was as realistic as possible.
Alex Hammond, Head of 3D at The Mill, comments, “Tasked with the exciting but hugely ambitious ask of re-creating a digital version of Einstein, we had to create a unique and ground-breaking visual effects pipeline in order to create an avatar that was truly convincing. Our visual effects team, including facial shape experts, spent months researching and developing a robust toolset so we could convincingly portray the nuances of Einstein’s personality.”
Alex Hammond served multiple roles on the project: Creative Director, VFX & Shoot Supervisor, and Lead 3D Artist. “We used cutting-edge 4D volumetric capture technology to capture the performance of an actor. This was then used to re-create subtle facial performances and intricate details in CGI. We developed a bespoke system at The Mill to process and export facial data before our team meticulously groomed each hair, wrinkle, and eye detail on the CGI model.”
FXG: What 4D tech was used?
Alex Hammond: 4D volumetric capture was used for facial shape and performance capture (we worked with DI4D). Essentially, it gives you a fully reconstructed mesh for every recorded video frame, so what you get back is very detailed interpolation and surface information.
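Since every captured frame shares the same topology, a 4D performance is effectively a time series of vertex positions, which makes sampling between recorded frames a simple per-vertex interpolation. A minimal sketch of that idea (toy data, not The Mill's or DI4D's actual format):

```python
def lerp_mesh(frame_a, frame_b, t):
    """Linearly interpolate vertex positions between two captured mesh frames.

    Both frames must share the same topology (same vertex count and order),
    which is exactly what per-frame 4D reconstruction provides.
    """
    return [
        tuple(a + t * (b - a) for a, b in zip(va, vb))
        for va, vb in zip(frame_a, frame_b)
    ]

# Two toy "captured" frames: one (x, y, z) tuple per vertex.
frame_10 = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
frame_11 = [(0.0, 0.2, 0.0), (1.0, 0.1, 0.0), (0.0, 1.0, 0.4)]

# Sample the performance halfway between the two recorded frames.
mid = lerp_mesh(frame_10, frame_11, 0.5)
print(mid)  # [(0.0, 0.1, 0.0), (1.0, 0.05, 0.0), (0.0, 1.0, 0.2)]
```

In production the data would be far denser (tens of thousands of vertices per frame), but the consistent-topology property is what makes the downstream shape extraction tractable.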
FXG: Was this a CG FACS rig or did you use Neural Rendering approaches?
Alex Hammond: This was entirely a FACS-driven rig; no machine learning was used in this process. We did, however, use sequence caches, extracted from the 4D volumetric data, to drive the facial shapes.
FXGUIDE Note: the team used DI4D to capture input data and ended up with a unique FACS rig made up of over 300 shapes.
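At its core, a FACS blendshape rig evaluates the face as the neutral pose plus a weighted sum of per-shape vertex deltas; the per-frame weights (here, the values a sequence cache would carry) select how much of each action unit fires. A hedged sketch of that evaluation, with invented shape names and tiny toy data standing in for the 300+ production shapes:

```python
def evaluate_facs(neutral, shape_deltas, weights):
    """Evaluate a blendshape rig: neutral mesh + sum of weighted shape deltas.

    neutral:      list of (x, y, z) vertex positions
    shape_deltas: {shape_name: list of (dx, dy, dz) per vertex}
    weights:      {shape_name: weight} for one frame of animation
    """
    result = [list(v) for v in neutral]
    for name, w in weights.items():
        if w == 0.0:
            continue  # skip shapes that aren't firing this frame
        for i, (dx, dy, dz) in enumerate(shape_deltas[name]):
            result[i][0] += w * dx
            result[i][1] += w * dy
            result[i][2] += w * dz
    return [tuple(v) for v in result]

# Toy two-vertex face with two FACS-style shapes (names are hypothetical).
neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
shape_deltas = {
    "AU12_lipCornerPuller": [(0.0, 0.1, 0.0), (0.0, 0.2, 0.0)],
    "AU26_jawDrop":         [(0.0, -0.3, 0.0), (0.0, 0.0, 0.0)],
}
# One frame of a "sequence cache": weights derived from the 4D capture.
weights = {"AU12_lipCornerPuller": 0.5, "AU26_jawDrop": 1.0}
print(evaluate_facs(neutral, shape_deltas, weights))
```

The real rig differs in scale and in how weights are solved from the 4D data, but this is the basic linear model a FACS-driven (non-ML) rig implements.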
FXG: What was the modelling/rendering pipeline used?
Alex Hammond: We utilized our traditional 3D pipeline (with some small tweaks to the caching).
- Modelling/look development was done inside Maya (with Arnold rendering); specifically, all the shape work was created using our millFACS data management system.
- These were exported and embedded into our face rig (a strict naming convention lets a few scripts update them automatically).
- Animation took place on a ‘Level 1’ subdivision mesh
- The animation was cached out and imported into a skin pin scene (this was to pin the mesh back to the live-action body)
- After the skin pin we then wrapped ‘Level 2’ subdivision mesh onto the cache for shot sculpting (to get the right neck shapes in particular)
- This Level 2 cache was then exported into Houdini so we could simulate smaller skin details and hair.
- The process is all managed via Ftrack, so updating caches within the rendering scene is taken care of automatically, meaning our lighting artists could focus on getting Einstein looking good!
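The "strict naming convention" step is what lets a few scripts reconnect hundreds of exported shape caches to their rig targets without hand work. A toy illustration of the idea, with hypothetical file and target names (The Mill's actual scheme is not public):

```python
import re

def match_caches_to_targets(cache_files, rig_targets):
    """Pair rig blendshape targets with exported cache files via a shared
    shape name embedded in the filename, e.g. einstein_<shape>_v<NNN>.abc."""
    pattern = re.compile(r"^einstein_(?P<shape>\w+)_v\d+\.abc$")
    by_shape = {}
    for f in cache_files:
        m = pattern.match(f)
        if m:
            # Files are assumed sorted, so later versions overwrite earlier ones.
            by_shape[m.group("shape")] = f
    return {t: by_shape[t] for t in rig_targets if t in by_shape}

caches = ["einstein_browRaise_v001.abc",
          "einstein_browRaise_v002.abc",
          "einstein_jawDrop_v001.abc"]
targets = ["browRaise", "jawDrop", "lipPucker"]
print(match_caches_to_targets(caches, targets))
# {'browRaise': 'einstein_browRaise_v002.abc', 'jawDrop': 'einstein_jawDrop_v001.abc'}
```

Targets with no matching cache (here `lipPucker`) are simply skipped, so a missing export surfaces as an absent connection rather than a wrong one.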
FXG: Any issues with lip sync?
Alex Hammond: Not specifically; however, there were a lot of dialogue changes throughout the project, so we had to accommodate these and often re-film/reference new performances.
FXG: Was this a Nuke comp with DOF added so the plate was fully in focus… and how much of the bathroom was digital?
Alex Hammond: The plate was live action (with in-camera DOF), so we matched this inside Nuke for the CG compositing. Naturally, we re-built a lot of the environment, as we had to render additional reflection passes so the CG could be comped and integrated properly. It’s worth mentioning that we render using light AOVs; we have a custom tool for Nuke which allows the 2D artists to re-build the lighting and tweak the temperature/exposure of individual lights in the 3D renders.
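The per-light AOV workflow rests on a simple property: with one render pass per light, the beauty image is just the sum of the light AOVs, so a comp artist can re-weight each light's exposure (and tint it for temperature) without a re-render. A minimal sketch of that recombination, with invented light names and pixel values (this is not The Mill's Nuke tool, just the underlying arithmetic):

```python
def rebuild_beauty(light_aovs, gains, tints):
    """Sum per-light RGB AOVs, applying a per-light exposure gain and
    colour tint, to reconstruct an adjustable beauty image in comp."""
    npix = len(next(iter(light_aovs.values())))
    beauty = [[0.0, 0.0, 0.0] for _ in range(npix)]
    for light, pixels in light_aovs.items():
        g = gains.get(light, 1.0)                 # exposure tweak per light
        tint = tints.get(light, (1.0, 1.0, 1.0))  # temperature-style tint
        for i, rgb in enumerate(pixels):
            for c in range(3):
                beauty[i][c] += g * tint[c] * rgb[c]
    return [tuple(p) for p in beauty]

aovs = {  # two lights, two pixels each (RGB)
    "key":  [(0.8, 0.7, 0.6), (0.4, 0.4, 0.4)],
    "fill": [(0.1, 0.1, 0.2), (0.2, 0.2, 0.3)],
}
# Warm up the key light slightly and halve the fill, all in "comp".
out = rebuild_beauty(aovs, gains={"fill": 0.5}, tints={"key": (1.1, 1.0, 0.9)})
print(out)
```

With all gains at 1.0 and neutral tints, the sum reproduces the original beauty exactly, which is why the technique is lossless until the artist starts grading individual lights.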
FXG: How long did it take, and how big was the team?
Alex Hammond: It took roughly nine months, on and off, as the lockdown naturally slowed certain decisions and production down overall. This was all done with a small team of around eight people.
Creative Director, VFX & Shoot supervisor, and Lead 3D artist: Alex Hammond
Executive Producer: Chris Allen
Producer: Sean Francis
2D Lead Artist: Ben Turner
2D Artist: Matthew McDougal
3D Artists: Harsh Borah, Clare Williams, Andreas Graichen, Maria Carriedo, Sefki Ibrahim, Joao Pires, Dan Yargici
Matte Painting: Ross Urien
Designer: Freya Barnsley
Production Coordinator: Gabriela Goncalves
Colourist: James Bamford