Echo: advanced motion control

Echo is a short film written and directed by visual effects director and fxphd.com senior contributor Victor Perez. It tells the story of a girl who wakes up in the middle of nowhere and sees her reflection in a mirror running ten seconds ahead of her in time. When she wakes up again, the nightmare has just started over.

ECHO is an innovative film in terms of storytelling language, and it was shot with extensive motion control at Stiller Studios in Stockholm, Sweden, on their two motion control rigs, a Mark Roberts Cyclops and a Bolt. The motion control team that runs the technology research at Stiller Studios worked for months to develop a new approach and set up the studio hardware to make it possible to film an effect never before seen on screen.

Perez was actually introduced to Stiller Studios through his work at fxphd.com while developing his popular Nuke, Motion Control & Deep Compositing course in late 2015 (find out more about Victor’s six courses at fxphd.com). “Without that course, ECHO never would have existed,” says Perez. “I had seen a few of their works but I’d never had the chance to work with them (and their expensive toys) [he laughs], and being able to create a course assisted by the guys at Stiller themselves was the best experience.”

He was able to use the days on-set filming for the fxphd course to ask as many questions as possible and learn as much as he could about Stiller Studios’ technology. “I questioned everything,” says Perez, “to try to understand what made them so special at motion control, and the more I learned the more I wanted to know.” It was this relationship that led the team at Stiller to reach out to Victor with new techniques they had developed.

By synchronising the two motion control rigs, the team led by Tomas Tjernberg created a formula that allows a reflection effect to be recorded simultaneously by both cameras, all without filming any actual mirrored surfaces.

The visual effects director at Stiller Studios and co-producer, Tomas Wall, pushed this technology even further by adding an extra element to the mirror effect: the time delay, or “time displacement”, factor. Time displacement makes the reflection in the virtual mirror run at a different speed and/or from a different synchronisation point in relation to what the actual hero camera is seeing. The reflection plane is always perfectly aligned, thanks to the physics calculations and software created by the Stiller Studios team.

The result is a mirror reflection that is out of sync and running at a different speed in relation to the main camera and scene: a totally unnatural effect, impossible to achieve with a moving camera without this complex research and staging.
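Stiller Studios has not published its solver, but the core geometry of a virtual mirror is straightforward: the reflection rig plays the hero camera mirrored across the mirror plane. The sketch below illustrates only that basic reflection math (the plane point, normal and camera values are illustrative, not production data):

```python
import numpy as np

def reflect_point(x, p0, n):
    """Reflect a world-space point x across the plane through p0 with normal n."""
    n = n / np.linalg.norm(n)
    return x - 2.0 * np.dot(x - p0, n) * n

def reflect_direction(v, n):
    """Reflect a direction vector (no translation component)."""
    n = n / np.linalg.norm(n)
    return v - 2.0 * np.dot(v, n) * n

def mirror_camera(cam_pos, cam_forward, cam_up, mirror_point, mirror_normal):
    """Return position/forward/up for the 'reflection' camera.

    Note: a mirror reflection flips handedness, so the resulting image
    also needs a horizontal flop in comp to read as a true reflection.
    """
    pos = reflect_point(cam_pos, mirror_point, mirror_normal)
    fwd = reflect_direction(cam_forward, mirror_normal)
    up = reflect_direction(cam_up, mirror_normal)
    return pos, fwd, up

# Example: hero camera looking down -Z, mirror plane at x = 0 facing +X.
pos, fwd, up = mirror_camera(np.array([1.5, 1.2, 3.0]),
                             np.array([0.0, 0.0, -1.0]),
                             np.array([0.0, 1.0, 0.0]),
                             mirror_point=np.array([0.0, 0.0, 0.0]),
                             mirror_normal=np.array([1.0, 0.0, 0.0]))
print(pos)  # [-1.5  1.2  3. ] -- where the second rig would sit behind the "mirror"
```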

 

During pre-production and shooting of ECHO at Stiller Studios, Perez supervised the visual effects team and oversaw the “time displacement” evolution. “I was fascinated. A couple of days after that first meeting I sent a script to them – something that I had written over 5 years earlier,” he explained. “It was a story I never thought achievable.” Stiller Studios solved the vast technical complexity and, a few months after that conversation, ECHO was in production.

The film was produced by the director Victor Perez (Left) and Tomas Wall (Right)

“I wanted to take this technology to its maximum expression in terms of narrative, to tell the story of ‘Echo’, and the team at Stiller Studios accepted the challenge,” comments Perez. The result is a short film shot entirely in just 5 takes. This is all achieved with an actress synchronised to the camera movements as if by a hidden musical choreography. “But beyond all the technicalities and technology I always had one key element in mind: tell a good story in a new way never seen before, integrating VFX at the service of storytelling, not the other way around,” says Perez.

The Main stage

Months of research and development, rehearsals, and planning were necessary to accomplish the ECHO mirroring effect. The synchronisation algorithm allowed the filmmakers to tell the story in just 5 long takes, but in those takes both motion control rigs were shooting simultaneously to create the effect of the virtual mirror. Once the sync was pitch perfect, Stiller Studios worked closely with Perez to accomplish the biggest challenge: a reflection in the mirror which was out of sync in relation to the hero camera and vari-speeded to alter time within the reflected image, while always maintaining the correct angle of reflection in relation to the main camera.
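The production’s actual algorithm is not published, but the “out of sync and vari-speeded” behaviour can be thought of as a frame-remapping function applied to the reflection rig while the geometric reflection stays locked. A hedged sketch of that idea (the offset and speed parameters are illustrative, not the values used on ECHO):

```python
def reflection_frame(hero_frame, sync_frame=0, offset_frames=240, speed=1.0):
    """Map a hero-camera frame to the frame the reflection rig should play.

    offset_frames : how far 'ahead' the reflection runs (e.g. 10 s at 24 fps = 240).
    speed         : vari-speed factor; 1.0 keeps both rigs at the same rate.
    The mirror geometry stays aligned every frame; only *when* in the
    choreography the reflection is sampled changes.
    """
    return sync_frame + offset_frames + speed * (hero_frame - sync_frame)

# 10 seconds ahead at 24 fps, same speed:
print(reflection_frame(100))             # 340.0
# running 10% faster, drifting further ahead over the take:
print(reflection_frame(100, speed=1.1))  # 350.0
```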

Stiller Studios is owned by Patrik Forsberg, who originally brought state-of-the-art motion control technology to the sound stage to allow filmmakers to design just this style of complex choreography, matching actors to the virtual world. Stiller Studios focuses on intricate motion control work, where virtual and real camera positions and paths need to be perfectly matched and output in real time as usable data.

The short film, done in just 5 takes, had 2 motion control rigs filming at different frame rates

The film was shot on RED Epic Dragon in 3K widescreen, with the original plates delivered at 3072 x 1296 resolution. All the post-production, CG, and digital matte paintings (DMPs) were completed at 3K resolution, with the final output delivery from comp being a 2K scope format. The 1K ‘extra’ was used to add handheld camera shake in post, by tracking real handheld camera movement and applying it during compositing.
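The article does not show the comp setup, but the idea behind that extra resolution is simple: translate the oversized frame by the tracked handheld offsets each frame, then crop the 2K scope window out of it so no plate edge ever enters shot. A rough, array-based sketch (the function name and offsets are hypothetical):

```python
import numpy as np

PLATE_W, PLATE_H = 3072, 1296      # 3K widescreen plate
OUT_W, OUT_H = 2048, 858           # 2K scope delivery

def shaken_crop(plate, dx, dy):
    """Extract a 2K scope window from the 3K plate, offset by a tracked
    hand-held shake (dx, dy) in pixels. The spare width is the shake
    budget before the crop would run off the plate."""
    cx = (PLATE_W - OUT_W) // 2 + int(round(dx))
    cy = (PLATE_H - OUT_H) // 2 + int(round(dy))
    cx = max(0, min(cx, PLATE_W - OUT_W))   # clamp so we never sample off-plate
    cy = max(0, min(cy, PLATE_H - OUT_H))
    return plate[cy:cy + OUT_H, cx:cx + OUT_W]

frame = np.zeros((PLATE_H, PLATE_W, 3), dtype=np.float32)  # stand-in plate
out = shaken_crop(frame, dx=37.5, dy=-12.0)
print(out.shape)  # (858, 2048, 3)
```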

The whole film was shot on greenscreen, but without a single tracking marker.  “The team at Stiller said it would not be necessary – the guys were really confident – and they were right!” remarked Perez. In total, the team rendered out 11,738 frames to add the CG environment to the footage. The DMPs were a combination of several photographs from different places, all stitched together in Photoshop.

A vast 32K digital matte painting was used.

One of the largest DMPs was a 32K lat-long for the exterior mountains. “We needed to cover almost the whole dome of the lat-long due to the angles of both cameras,” recalls Perez.

The production used only one lens, a RED Prime 18mm, for both cameras. “But then I played with virtual camera angles, changing the DoF and the distance of the foreground (greenscreen) in relation to the background CG. I was able to simulate both zoom-in/out and dolly-in/out movements,” he adds. The team used Bokeh by Peregrine Labs for the DoF. The longest of the five shots is the one where she wakes up for the first time: 3,383 frames long, with an additional 853 frames in the mirror image. It was captured as one continuous take, though it took 23 attempts on the sound stage to get it right.
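Perez does not spell out the math, but with the scene split into foreground and background layers the distinction he describes falls out naturally: a virtual zoom scales every layer by the same factor, while a virtual dolly scales each layer in proportion to its distance, re-introducing parallax. A simplified 2.5D sketch (the layer distances and push-in amount are made-up values):

```python
def zoom_scale(new_focal, old_focal=18.0):
    """Virtual zoom: every layer scales by the same factor (no parallax change)."""
    return new_focal / old_focal

def dolly_scale(layer_distance, dolly_move):
    """Virtual dolly: a layer at distance d scales by d / (d - move), so near
    layers grow faster than far ones -- that parallax difference is what
    sells a dolly rather than a zoom."""
    return layer_distance / (layer_distance - dolly_move)

layers = {"girl/bed (greenscreen)": 4.0, "mid background CG": 25.0, "mountains DMP": 2000.0}
move = 1.0  # push in one metre (illustrative)
for name, d in layers.items():
    print(f"{name:25s} dolly scale {dolly_scale(d, move):.3f}   zoom scale {zoom_scale(24.0):.3f}")
```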

DOP (ex-Aussie) Marcus Dineen with one of the RED cameras fitted with the 18mm RED Prime lens

In the film, the iPhone screen needed to be replaced entirely in order to add the reflections of the clouds, remove green reflections, and set the exact time. This process was done entirely using Nuke 3D space. All the blood in the film is digital, applied using smart painting in Nuke (vector tracking). This is the same technique that ILM used in The Revenant to apply the bear wounds to Leonardo DiCaprio.
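In broad strokes, vector-tracked paint works by anchoring a stroke on a reference frame and pushing it to other frames along per-frame motion vectors. The sketch below is a deliberately simplified, point-only illustration of that idea, not Nuke’s implementation (the field layout and the constant-drift example are assumptions):

```python
import numpy as np

def propagate_point(xy, vectors, from_frame, to_frame):
    """Push a painted point from its reference frame to another frame by
    walking per-frame motion vectors, in the spirit of smart-vector paint.
    vectors[f] is an (H, W, 2) array of (dx, dy) offsets from frame f to f+1.
    Going backwards simply negates the step; a real implementation would use
    proper backward vectors and sub-pixel filtering."""
    x, y = xy
    step = 1 if to_frame >= from_frame else -1
    for f in range(from_frame, to_frame, step):
        field = vectors[f] if step > 0 else vectors[f - 1]
        dx, dy = field[int(round(y)), int(round(x))]
        x, y = x + step * dx, y + step * dy
    return x, y

# Tiny synthetic example: the whole frame drifts 2 px right, 1 px down per frame.
H, W, frames = 64, 64, 10
vectors = [np.tile(np.array([2.0, 1.0]), (H, W, 1)) for _ in range(frames)]
print(propagate_point((10.0, 10.0), vectors, from_frame=0, to_frame=5))  # (20.0, 15.0)
```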

Every outdoor scene contains nine 3K plates (of which two are clean plate passes for the greenscreens), each comped individually; a sketch of the assembly order follows the list:

  • Hero camera (Cyclops) far background, DMP
  • Hero camera (Cyclops) mid background CG
  • Hero camera (Cyclops) mirror dirt layer live action elements
  • Hero camera (Cyclops) girl, bed, bedside tables: greenscreen + clean plate (to aid greenscreen extraction)
  • Reflection camera (Bolt) girl, bed, bedside tables: greenscreen + clean plate (to aid greenscreen extraction)
  • Reflection camera (Bolt) mid background CG
  • Reflection camera (Bolt) far background, DMP
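The actual comp trees are not shown in the article, but the natural ordering of those plates is back-to-front per camera, with the greenscreen layers keyed against their clean plates. A schematic sketch of that assembly (key_greenscreen and merge_over are stand-ins, not real Nuke nodes):

```python
def key_greenscreen(plate, clean_plate):
    # Stand-in for the keyer; the clean plate aids the greenscreen extraction.
    return plate

def merge_over(fg, bg):
    # Stand-in for a premultiplied 'over' merge.
    return fg

def comp_camera(far_dmp, mid_cg, live_action, clean_plate, extras=()):
    """Back-to-front assembly for one camera's stack of plates."""
    out = merge_over(mid_cg, far_dmp)
    out = merge_over(key_greenscreen(live_action, clean_plate), out)
    for layer in extras:                 # e.g. the Cyclops' mirror-dirt pass
        out = merge_over(layer, out)
    return out

# hero       = comp_camera(cyclops_dmp, cyclops_cg, cyclops_gs, cyclops_clean, extras=[mirror_dirt])
# reflection = comp_camera(bolt_dmp, bolt_cg, bolt_gs, bolt_clean)
```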

The virtual defocus was done using real lens data to simulate the exact DoF, “then I had to manipulate a few values to keep the photorealism when applying the zoom-in/out and dolly movement by eye,” Perez comments.
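The article does not give the formula, but driving a defocus from real lens data usually means computing a circle of confusion per layer from focal length, aperture, focus distance and layer distance. A small sketch using the standard thin-lens approximation (the stop, distances and sensor-width value are illustrative assumptions):

```python
def circle_of_confusion(focal_mm, f_stop, focus_dist_mm, subject_dist_mm):
    """Thin-lens circle of confusion (in mm on the sensor) for a subject
    away from the focus plane."""
    aperture = focal_mm / f_stop
    return aperture * focal_mm * abs(subject_dist_mm - focus_dist_mm) / (
        subject_dist_mm * (focus_dist_mm - focal_mm))

# 18 mm lens at T2.8, focused on the girl at 4 m, background layer at 25 m:
coc_mm = circle_of_confusion(18.0, 2.8, 4000.0, 25000.0)
sensor_width_mm = 15.4   # assumed width of the 3K sensor crop; use the real camera metadata
coc_px = coc_mm / sensor_width_mm * 3072
print(round(coc_mm, 4), "mm ->", round(coc_px, 1), "px of defocus for that layer")
```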

The team working on the effects worked via the cloud and were actually located in several different countries: UK, Italy, Sweden, Argentina, France, Spain, Netherlands, India, Germany, Greece, Denmark, Switzerland, Belgium.

The total storage of the short film (excluding any backups) was 48TB (CG, plates, other footage, tests, etc.).

The CG layers were rendered entirely in Arnold. Perez chose to render the following passes (a sketch of how they recombine in comp follows the list):

  • Beauty
  • Direct Diffuse
  • Indirect Diffuse
  • Primary Specular
  • Secondary Specular
  • Direct Specular
  • Indirect Specular
  • Reflection
  • Depth (Z) 32-bit
  • Normal World (N) 32-bit
  • Position World (P) 32-bit
  • Various Object IDs

“Over the 20 months of producing this film I was travelling with a suitcase containing a 24TB RAID storage, a Wacom, a keyboard, spare drives, and an HP ZBook mobile workstation… I worked in my free time, even during the shooting of the feature film that I was supervising… I worked on ECHO on the go in Italy, Spain, the UK, Denmark and the US,” remarks the director. In the end, Perez moved to working full time on the project for the final two months over the summer.

The lead role was played by Spanish film, television, and theatre actress Maria Ruiz. She had previously played the lead role in Antonio Banderas’ feature film Summer Rain (El camino de los Ingleses), which was awarded the Europa Cinemas Label at Berlin’s film festival (Berlinale). This is the second time Maria has played the lead in a Victor Perez film.

“I was working with Maria Ruiz in London for 3 weeks staging the movement and calculating the reactions to her own actions, so we had to create emotions and actions that served as both ‘emotional triggers’ and reactions 10 seconds later,” explained Perez.

Everything was choreographed to music in order to have time cues and adjust the scene in terms of timing. “I was aware that once the camera movements were defined I couldn’t change anything. In those 3 weeks we created the fabric of the story based on the movement of the actress.”

Once the movements of Maria were clear, the director started playing around with his iPhone to find the angles to work with. “I wanted to show all the beats in the story in terms of emotions,” says Perez. For the most complex setups, to make the audience feel the camera as an omnipresent force with no human limitations on its movement, he used a small mannequin to test.

“I took photographs of the impossible angles, for instance the high angle over the bed that then transforms into a handheld movement,” Perez relates. “I wanted to mix computer-perfect movements with dirty hand-held ones… but with a smooth progression so nobody feels the absence of a film edit, while also underlining certain frame ranges with different camera feelings.”

Once the choreography was set, two days prior to the shoot the team filmed Maria at Stiller from a fixed angle, just to capture her choreography and then translate her movement onto a 3D dummy (a very simplified digital double). This was done by hand by Tomas Tjernberg. Once they had her movement in 3D space, they were able to stage the hero Cyclops camera around this 3D animatic to recreate the camera movements that they all liked (based on the rehearsals). In summary, the project worked out to be three weeks of rehearsals, two days of staging, and one day of shooting (22 hours straight).

The virtual set was roughly built in 3D based on Patrik Forsberg’s (the owner of Stiller Studios) personal bedroom, for the size of the room, the position of the windows and the volumes of the furniture. The team took 360-degree photos and rebuilt the room as low-poly geo in Maya. Then, on Stiller Studios’ main stage, the team video-projected a top-down view from Maya onto the floor of the studio. This provided a 1:1 mapping from the 3D scene to the real studio for the four walls, the windows and, of course, the direction and amount of light, so the 3D and real worlds matched.

“That stage was one of the coolest things I’ve ever experienced in VFX, a translation of the 3D world into our real world… but if that wasn’t enough, they gave me an iPad with a real-time monitor able to show what the Cyclops camera sees within the whole environment (using UE4 real-time technology). So when I was shooting I was practically watching a rough slap comp of the environment instead of a meaningless greenscreen. For me, as a VFX artist, what demonstrated that the setup was rock-steady was the intersection of the real bed with the virtual wall and floor, no slipping at all. For the scenes outside, that setup wasn’t necessary as the place had no restrictions in terms of walls or virtual assets, just an open space.”

Victor Perez

Victor Perez is a senior visual effects artist with over 20 years of production experience in computer graphics and visual effects worldwide.

He is known for his work with Foundry’s Nuke (he is an official Foundry Certified Nuke Trainer).

Victor is also well known as a senior instructor at fxphd.com where he teaches advanced Nuke compositing.

He started his career as a digital compositing artist and 2D technical director. Perez has worked and researched with Oscar-winning studios such as Cinesite, Double Negative and Pixar Animation Studios, amongst others. His film credits include The Dark Knight Rises, Rogue One: A Star Wars Story, Harry Potter and the Deathly Hallows and many others.

 

9 thoughts on “Echo: advanced motion control”

  1. Very cool concept for a short film. Loved the look behind the scenes at the motion control rig. Every time I see one I am in awe. Way to go Victor.

  2. Hmm, not sure about that one? I honestly think that this might not be the best example to showcase the possibility of the moco rig. This is an emotional piece shot entirely on green screen, where for 80% of the time I am wondering why the background is all digital and keyed for a maybe 10s scene with a time-offset mirrored person??? I think sometimes VFX are used for the wrong purpose. Why not shoot this whole piece on location with one moco rig and have the real feeling and emotion of being out in nature, put real props and a mirror in front of the actress, and have her simply re-enact the mirror sequence time offset with some music beat markers. She is an actress, that’s what they do; they can perform this again and again if they have to, and since the offset is quite big I don’t think any audience would notice slight discrepancies between performances. Now, I know this is a tech demo and they want to show what they can do with the moco rig, but I wonder if there wasn’t a better, smarter setup that would showcase that exact feature for 80% of the time and in a more appropriate setting, without looking at green screen composites 80% of the time that have nothing to do with the moco FX calculation wizardry which happens only during the mirror sequence. I think this is a bad example since it would have transported emotion and storytelling better with an old-school approach. But that’s just my 2 cents.

    1. Shooting something like this takes far too long in set-up and execution to be done on location. I wasn’t stripped of the emotion. But that’s just 80% of my 2 cents.

    2. I agree with Peter. Shoot it on location, it’s not like we actually key anything anymore, we just send it out for roto. Might as well have proper lighting and a scene for the actors to work with.

  3. This Victor dude is so full of himself; always about sonorous self-promotion. And it seems he came up with nothing new after all. 20 years of production experience right? Looks like he graduated film school less than 10 years ago LOL

  4. Anyone else notice the blood on her shirt in the mirror was much bigger than when it happened the second time?
