Halon Entertainment brought a deep-dive visual effects presentation on War for the Planet of the Apes to Comic-Con. Just before the team went on stage, we sat down and discussed their experiences on this masterpiece of visual effects aided storytelling.
Halon appeared this morning at Comic-Con, in front of a packed Hilton Bayfront ballroom, to discuss their work on War for the Planet of the Apes. The team showed exclusive clips from behind the opening scenes and battles of the latest blockbuster installment of the Apes films. VFX Producer and film Co-Producer Ryan Stafford was joined by Previs and Postvis Supervisor AJ Briones and Halon Entertainment Lead/Supervisor Casey Pyke. The panel took the audience through the process of creating the monumental battle between Caesar and the army of humans determined to settle the fate of both species and control of the future of the planet. The panel showed exclusive behind-the-scenes material, and was a must for fans of the film and fellow artists.
Halon reunited with their Dawn of the Planet of the Apes collaborator, director Matt Reeves, for War for the Planet of the Apes, utilizing Previs very early in the production process. Working closely with Reeves, Halon planned a number of huge sequences for the film. Halon developed proprietary tools and pipelines to render their Previs in the Unreal Engine, a first for Halon on a major feature film. This provided more realistic lighting, dynamic effects and an overall higher level of visual polish, which translated to a more effective means of communicating Reeves’ ideas to the production. “Our efforts continued on-set where we provided rapid turnaround Previs and Techvis to assist the production as needed,” commented Briones.
Halon started work in June/July of 2015, using a beta of UE4 4.9, which was only fully released at the end of August 2015, and then moved to 4.11 by the end of the project. Halon had done game cinematics using UE4 before, but never a major feature film. The third Apes film had a three year production schedule, a full year longer than Reeves had when he joined the second film. Briones had joined the previous film for the third act tower battle. While this film gave Reeves more time, Briones commented that this was “a healthy timeline, but it was still aggressive as there were a lot of things to do, and we wanted to switch to the new Unreal game engine pipeline, which would up the quality of our visuals and enable us to do some things like atmospherics effectively for free.” Briones points out that getting the pipeline up initially was quite a bit of work, as his Previs team is staffed with generalists. “We are not like a game company where people work all the time with the engine and have specialists and low level programmers, but once we got it up and going it was a good experience for us,” he adds. Halon estimated that on this project they spent far less time on rendering, and by the end of the production they were going considerably faster than they normally would. “So not only were we getting shots we could never have achieved, but we could get shots with many more apes in them, and the speed was much better than playblasts in Maya,” commented Briones.
With the continuing convergence of Previs, Postvis, virtual production and more real-time on-set cinematography, migrating to a real-time game engine pipeline was a very explicit objective for Halon. Halon is very keen to provide interactive real-time scouting, something they currently do in Maya.
Halon had unprecedented access on War due to their existing relationship, which meant that unlike most productions, which often need to rebuild everything from scratch, Halon accessed multiple assets from the second Apes film, and that was the basis for the Previs when they started working on War. “We started basically with the final assets from the second film and that is what we used to create our asset rigs, and then throughout the production anything I needed – we got access to, all the dailies…we were looped in on everything, which just made it so much easier,” recalls Briones.
Previs and Postvis used the UE4 engine (see above). The facial Ape Previs and Postvis work was never intended for, or used in, the final film; it was “just crude compared to the crazy good work Weta did,” joked Briones.
In post-production, Halon generated Postvis for several hundred shots in the film, working closely with editorial and Weta Digital to ensure a unified look and level of polish. Throughout, the team used just about every tool in their creative arsenal: keyframe animation plus both in-house and Weta Digital motion capture, in order to tell Matt Reeves’ epic story.
The shot above is an example of the flexibility that UE4 provided. Much like one would work with a massive crowd pipeline, “we would not have been able to achieve this in Maya without layering (in the playblast Previs pipeline),” explains Briones. Halon did not have Alembic import into UE4 (it is a more recent addition), “but at the time we could bring in animated assets and duplicate them in Unreal with their animation, so the way we set up these prison shots was not unlike the way you would set up a stadium with a crowd. In that case you would typically just have a handful of different people animations which you would duplicate out in comp to make a giant crowd of extras,” Briones explained.
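The duplication trick Briones describes can be sketched in a few lines: take a handful of source animation clips and instance them many times, each copy with its own placement and start-frame offset so the duplicates don't move in lockstep. This is a minimal, hypothetical Python sketch of the idea, not Halon's actual tooling; the clip names and data layout are invented for illustration.

```python
import random

# A handful of source "animations": each is just a named clip with a
# length in frames (hypothetical clips, for illustration only).
SOURCE_CLIPS = [
    {"name": "ape_idle",  "frames": 120},
    {"name": "ape_shift", "frames": 96},
    {"name": "ape_look",  "frames": 140},
]

def build_crowd(num_instances, area=(50.0, 20.0), seed=7):
    """Duplicate a few source clips into a large crowd.

    Each instance reuses one source clip with a random position and a
    random start-frame offset, so a handful of animations reads as a
    varied crowd -- the same approach described for the prison shots.
    """
    rng = random.Random(seed)
    crowd = []
    for i in range(num_instances):
        clip = SOURCE_CLIPS[i % len(SOURCE_CLIPS)]
        crowd.append({
            "instance": f"{clip['name']}_{i:03d}",
            "clip": clip["name"],
            # Random placement inside the set area.
            "position": (rng.uniform(0, area[0]), rng.uniform(0, area[1])),
            # Offset the clip start so copies don't animate in sync.
            "start_frame": rng.randrange(clip["frames"]),
        })
    return crowd

crowd = build_crowd(200)
print(len(crowd), "instances from", len(SOURCE_CLIPS), "source clips")
```

The same three clips cover two hundred extras; only position and timing vary per instance, which is why the memory and authoring cost stays close to that of the original handful of animations.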
The new Halon pipeline for War for the Planet of the Apes:
- MoCap fed the team’s Maya pipeline,
- Camera tracking from Syntheyes,
- Animation in Maya – along with set extensions and then exported,
- Imported into UE4 and rendered in passes if needed,
- And then composited in After Effects if needed.
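Read as data flow, the pipeline above is a linear chain of stages, each transforming the shot before handing it on. A toy Python sketch of that structure (all stage names and the shot-state dictionary are hypothetical, chosen only to mirror the list above):

```python
# Toy model of the previs/postvis pipeline described above:
# each stage takes the shot state and returns an updated one.

def ingest_mocap(shot):
    shot["animation_source"] = "mocap"           # MoCap feeds the Maya pipeline
    return shot

def track_camera(shot):
    shot["camera"] = "solved_in_syntheyes"       # camera tracking from SynthEyes
    return shot

def animate_in_maya(shot):
    shot["scene"] = "maya_scene_with_set_extensions"  # animation + set extensions
    return shot

def render_in_ue4(shot):
    shot["render_passes"] = ["beauty", "matte"]  # rendered in passes if needed
    return shot

def composite(shot):
    shot["comp"] = "after_effects"               # composited if needed
    return shot

PIPELINE = [ingest_mocap, track_camera, animate_in_maya, render_in_ue4, composite]

def run(shot):
    for stage in PIPELINE:
        shot = stage(shot)
    return shot

result = run({"name": "prison_042"})
print(sorted(result))
```

The point of the linear structure is that the last two stages are optional per shot: a shot can stop at the UE4 render, or carry on into compositing, without the upstream stages changing.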
There was no real pathway out of UE4 to export Alembic or camera data for Weta Digital, so UE4 was really seen as a real-time renderer first and foremost, rather than a tool for building or blocking assets. A key aspect of Previs is being able to hand assets on to the primary effects vendors as a starting point. UE4 has been improving its tools for this sort of use, with OpenEXR pathways, the new Alembic import, and the increasingly powerful Sequencer tool.
“We have our own MoCap pipeline,” explains Pyke. So when the project moved to Postvis, Halon was easily able to dovetail into receiving MoCap data from Weta Digital and use that data to help the editorial team and director Reeves edit the film. “We could order selects directly from MoCap via editorial and then we just modified our pipeline to take that motion in,” says Briones.
There were points where editorial was trying to combine plates with motion capture photography/data, and it was just much easier for Halon to take the data and render the elements into the plates for editorial. In addition to doing a large number of Postvis shots directly, the team also produced a series of set extension stills, especially of the prison set, so these could be slap comped into shots by the edit team themselves for simple green screen type shots.
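At its core, the slap comp the edit team could do themselves is just the standard “over” operation: a keyed or rendered element with an alpha, placed on top of the plate. A minimal, hypothetical Python sketch of the per-pixel math (not Halon's or editorial's actual tools; the sample pixel values are invented):

```python
def over(fg_rgba, bg_rgb):
    """Porter-Duff 'over': composite a premultiplied foreground pixel
    (r, g, b, a) onto a background plate pixel (r, g, b) -- the essence
    of a slap comp of a set-extension still over a green screen plate."""
    fr, fg, fb, fa = fg_rgba
    # Foreground contributes its premultiplied colour; the plate shows
    # through in proportion to the remaining transparency (1 - alpha).
    return tuple(f + b * (1.0 - fa) for f, b in zip((fr, fg, fb), bg_rgb))

# A half-transparent element pixel over a mid-grey plate pixel.
pixel = over((0.5, 0.4, 0.3, 0.5), (0.2, 0.2, 0.2))
print([round(c, 3) for c in pixel])
```

Because the stills carried their own alpha, editorial could drop them over keyed plates without any per-shot setup beyond positioning, which is what made the self-serve workflow practical.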
Weta Digital entirely handled the apes and their motion captured facial animation, but Halon did do temp keyframe animation of expressions on the Ape characters so that Reeves could have something roughly accurate to edit with before Weta’s incredibly nuanced shots were available. “They were crude, but sometimes Matt would want a temp for a screening, and you can tell from his style – it was all about the performance – it’s always about the closeup,” commented Briones. Pyke pointed out that Halon did make blendshapes for all the Ape characters, but never with the intention that these would be used for anything other than helping editorial.
The great thing about using the Unreal engine in Postvis, in addition to being able to access and import Weta’s MoCap data, was the ability to match lighting. “We were able to get elements to match into the plates a lot more,” comments Briones. For the first time, on this project the team brought the plates into UE4 so that they could colour sample them and adjust their UE4 lighting to match. While in 4.9 they could not frame-lock the plates, they intend to explore this further, and they are very keen to see IBL lighting dynamically feeding live rendering down the track.
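The colour-sampling step amounts to averaging picked plate pixels and using that average to tint the CG lighting toward the photography. A minimal, hypothetical Python sketch of that idea (invented sample values; not Halon's actual UE4 workflow, which was done interactively in the engine):

```python
def average_plate_colour(pixels):
    """Average RGB of sampled plate pixels (0-1 floats)."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)

def tint_light(base_intensity, plate_rgb):
    """Scale a neutral light by the plate's average colour so CG
    elements sit in the photographed lighting."""
    return tuple(base_intensity * c for c in plate_rgb)

# Hypothetical samples picked from a warm, dim region of a plate.
samples = [(0.42, 0.31, 0.22), (0.40, 0.30, 0.20), (0.44, 0.33, 0.24)]
avg = average_plate_colour(samples)
light = tint_light(2.0, avg)
print([round(c, 3) for c in avg])
```

However the sampling is done, the payoff is the one Briones describes: elements rendered under a light derived from the plate sit into that plate with far less per-shot grading.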
Final VFX shots
Halon’s contribution ran past the Postvis phase, and the team came full circle, completing finals for many of their Previs and Postvis shots. While Weta did the character animation, Halon contributed a range of final VFX shots, from adding things that would have been dangerous on set, such as spears and arrows, to rig removal of tracking markers and wires.
Briones believes that the ultimate aim of Previs is “to allow a director to walk on set and have the supreme confidence to know he is going to get exactly what he wants, because he’s already ‘been there’ and thus knows the entire set front to back.” He sees a lot of benefit in the real-time interactivity that game engines like UE4 provide, and nothing but upside in the way the technology is advancing. Halon is now brilliantly positioned to embrace the developing world of virtual production, real-time feedback and immediate turnaround. Halon is working to allow directors to do virtual camera scouting using the Unreal Engine on Halon’s MoCap stage, and to further take advantage of it in their pipeline.
The Previs & Postvis Supervisor for Halon was AJ Briones, with Casey Pyke and Kenny Di Giordano as Previs & Postvis leads in LA. The team did 10 months of Previs in America, 6 months in Canada and another 11 months of Postvis on the project for director Matt Reeves.
fxguide will have more coverage of War for the Planet of the Apes, from Weta Digital coming soon.