Avatar arguably put virtual production firmly on the map with its successful workflow of previs, performance capture, ‘simulcam’ use and virtual worlds. Now, more than ever, virtual production is being used on numerous films, as well as on television shows, commercials and games. One company at the forefront of virtual production, and of motion capture in particular, is Animatrik Film Design. fxguide talked to President and CTO Brett Ineson and Director of Business Development Bruno Sargeant about their recent and future work in the field.
One of Animatrik’s recent projects involved droid motion capture work for Elysium. The studio collaborated with Image Engine to capture both digi-double performances and virtual cameras for several space sequences. Animatrik had done similar capture work with Image Engine for the aliens in District 9.
Elysium’s fully digital mocap took place at Animatrik’s 15,000 square foot recording stage, where the company has installed Vicon T series cameras. Until recently, most of Animatrik’s projects involved capture within this Vancouver studio space. “Historically, motion capture wasn’t able to go outside,” notes Ineson, whose experience before forming Animatrik included mocap work with Weta Digital on The Lord of the Rings: The Return of the King. “But with new sensors coming out on the market, these enable us to bring our cameras on set and not be seen by principal photography.”
“In traditional mocap,” he adds, “the light originates from the camera – it will have a strobe wrapped around the camera lens and will bounce light off a retro-reflective marker. But in this case, in the exterior, there’s a lot of infrared light in the outdoors so bouncing light is pretty inefficient. So what you do is turn the strobes off the cameras and you strobe light from the marker itself.”
Animatrik has invested in Naturalpoint Optitrack Prime 41s for outdoor work. “We’ve basically put our equipment on wheels and we’re transitioning into work coming from abroad, so now we can take it on the road, we can take it anywhere, not just Vancouver,” says Sargeant, who previously spent eight years at Autodesk, including leading the team that worked with Lightstorm Entertainment and Weta Digital to develop the next-gen tech for the Avatar sequels.
In addition to the principal motion capture cameras for body capture, Animatrik also provides facial capture services, with two solutions. “We have one headcam solution that is a lightweight single camera unit, designed to work with optical flow based solvers,” explains Ineson. “We’re also working with a stereo unit designed to work in a 3D markered dot paradigm.”
Autodesk MotionBuilder is heavily relied upon for realtime playback of the captured action. The studio has also added a SolidTrack system from SolidAnim for previs and virtual camera work. “The core of what we do,” says Sargeant, “is capturing performance on set or on location, playing back that performance, applying it to CG characters in realtime for the key creative – so that a director can make directorial decisions immediately for CG.”
Both Ineson and Sargeant see a continuing widening of the market in virtual production, especially in the area of gaming. Ineson comments that games work differs at times from the methodical nature of film and its matching of live action plates. “In the video game world,” he says, “you can let things run a little more freely in that the final renders are all in CG – you can move the camera anywhere you want and change things on the fly and that won’t typically affect things down the line.”
In VFX, too, virtual production is being deployed, not only on fully-CG productions, as can be seen with Elysium and other projects Animatrik has contributed to, such as the zombies in Warm Bodies. “There’s also more demand from non-technical creative people,” says Sargeant. “This comes from VFX driving the use of the technology on set, and I think this is a way for VFX to keep on track and deliver on budget under tight pressure.”