The tech behind Once Upon a Time’s Frozen adventures

Crafting up to 400 VFX shots for each episode of Once Upon a Time is no easy feat, especially when there are 22 episodes each season. But the effects company behind the work, Zoic Studios, has mastered the art of virtual sets and integrated pipeline tools to make those deliveries possible. We find out about some of the tech behind the new fourth season of the popular ABC show, which incorporates Frozen characters into its storyline.

App support

Having realized the effects for several episodic television series filmed largely on greenscreen sets, Zoic is well-versed in the area of virtual sets (see below). That experience culminated in the development of an iPad app called ZEUS:Scout – now also available to anyone for download – to aid in the planning of shots and shooting on virtual set stages.

An image from the ZEUS Scout website. The app lets you previs virtual sets and do live greenscreen composites.

“How it works,” explains Zoic visual effects supervisor Andrew Orloff, “is that in the pre-production process the art department put together a SketchUp file and a piece of concept art for each of the virtual sets that we’re going to use, and we build those for real-time 3D playback. We convert them through Unity to our ZEUS:Scout app.”

“We upload all of those into a protected FTP site for Once Upon a Time and they have their whole library in the Scout app,” adds Orloff. “So when we’re in pre-production we’ve got the ability now for the director to pull up any of the sets, move around the sets with the controllers that we have in the app, and also use the gyroscope in the iPad to do real-time tracking. If they’re sitting on a chair and they start moving around, they can look around the set as though they were standing in one spot on it and moving a magical window around it. If you’re on a greenscreen we can do a realtime composite as well.”
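The gyroscope-driven “magical window” behaviour Orloff describes amounts to mapping the device’s orientation onto a virtual camera’s view direction. A minimal sketch of that mapping, with hypothetical function names – this is illustrative, not code from the ZEUS:Scout app:

```python
import math

def look_direction(yaw, pitch):
    """Map device orientation (radians) to a unit view vector,
    so physically turning the tablet pans the virtual camera.

    yaw:   rotation about the vertical axis (left/right turn)
    pitch: tilt up/down
    """
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

# device held level, facing straight ahead
forward = look_direction(0.0, 0.0)  # → (0.0, 0.0, 1.0)
```

Feeding live gyroscope readings into a function like this each frame is what lets the viewer “look around the set” while standing in one spot.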

A scene from the show.

As Orloff notes, the Scout app is most helpful in pre-production, helping Zoic and crew members decide on the orientation of the actors to camera and the location of objects in the scene. Then, when the cameras start rolling, Zoic’s complete virtual set system, ZEUS, comes into play.

The virtual world

ZEUS, short for Zoic Environmental Unification System, is a combined previsualization and on-set realtime compositing system that relies on Lightcraft’s Previzion tech for realtime camera tracking and keying. It allows 3D sets to be ‘piped in’ to the Once Upon a Time greenscreen stage, helping the actors and crew to get an idea of what the final shots will look like.

Original plate.
Final shot.

Zoic has built extensively on top of the Lightcraft system with ZEUS. “ZEUS is also our conversion process and a bunch of proprietary tools that let us deal with that data,” says Orloff. “Once they edit the show we have some tools to take those edit points and extract the data from Lightcraft and put all the scenes together. When an edit list comes in we go to our asset library and to the greenscreen footage and we compile that and put it all together automatically for the artist to just start doing their compositing and 3D work right away.”

Above: this ZEUS reel showcases some of the previous work Zoic has done in episodic television.

“What we’re using Lightcraft for on set is direct visual feedback and then an editorial temporary composite so that you can see it and show it to people and they’ll get a good idea what the test’s going to look like,” adds Orloff. “But then we’re extracting that data and doing a more photorealistic render using Maya and V-Ray that piggybacks on the data we gathered with the Lightcraft system.”

Zoic has also carried out an extensive Shotgun implementation with ZEUS, with a file server that tracks dependencies across multiple locations (see our previous coverage here).
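The edit-driven assembly Orloff describes can be sketched as follows: for each edit point, look up the matching greenscreen plate, the extracted tracking data and the set assets, and bundle them into a per-shot package ready for an artist. Everything here – paths, field names, the schema – is hypothetical, not Zoic’s actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class Shot:
    name: str
    start: int      # cut-in frame
    end: int        # cut-out frame
    plate: str      # greenscreen footage path (hypothetical)
    tracking: str   # camera-tracking data path (hypothetical)
    assets: list    # virtual-set assets to load

def assemble_shots(edit_points, plate_index, tracking_index, asset_library):
    """Turn an edit list into per-shot work bundles.

    edit_points: list of (shot_name, start_frame, end_frame)
    the three indexes are dicts keyed by shot name (hypothetical schema)
    """
    shots = []
    for name, start, end in edit_points:
        shots.append(Shot(
            name=name, start=start, end=end,
            plate=plate_index[name],
            tracking=tracking_index[name],
            assets=asset_library.get(name, []),
        ))
    return shots

# toy example: one edit point compiled into a ready-to-work shot bundle
edl = [("ep401_010", 1001, 1060)]
shots = assemble_shots(
    edl,
    plate_index={"ep401_010": "/plates/ep401_010.exr"},
    tracking_index={"ep401_010": "/track/ep401_010.fbx"},
    asset_library={"ep401_010": ["castle_set", "throne"]},
)
```

The point of automating this step is the one Orloff makes: the artist opens a scene that already has plate, tracking and assets in place, and starts compositing immediately.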

Efficient CG

In addition to the scores of virtual sets, Zoic is often called upon to create complex digital characters – for the new season these include the snow creature Marshmallow and the troll Grand Pabbie. Some creatures, and indeed props, have to be built and kept online in case the asset returns later in the season. The work also includes a raft of magical effects, such as wispy smoke, snow and other particles. For that Zoic relies on a mix of Maya fluids, Phoenix and Krakatoa. “We’ve really found Krakatoa very efficient in rendering a lot of particles in a small cache size,” comments Orloff. “We do so many smoke effects and it has to be really detailed – swirly wispy smoke that goes along with all the magic. It creates a lot of particles with a really low footprint and that really works for us.”
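One common way particle caches stay small at high particle counts is quantization: storing positions as fixed-size integers relative to a bounding box rather than as full floats. The sketch below illustrates the idea only – it is not Krakatoa’s actual PRT file layout, and the format is invented for this example:

```python
import struct

def write_particle_cache(positions, bbox_min, bbox_max):
    """Quantize float positions to 16-bit integers relative to a
    bounding box: 6 bytes per particle instead of 12 for raw floats.
    (Illustrative format, not Krakatoa's real cache layout.)"""
    out = bytearray()
    span = [hi - lo or 1.0 for lo, hi in zip(bbox_min, bbox_max)]
    for p in positions:
        for axis in range(3):
            # normalize into [0, 1], then scale to the 16-bit range
            q = round((p[axis] - bbox_min[axis]) / span[axis] * 65535)
            out += struct.pack("<H", q)
    return bytes(out)

# one particle at the centre-top of a unit bounding box
cache = write_particle_cache([(0.0, 0.5, 1.0)], (0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
```

The trade-off is precision for footprint: position error is bounded by the bounding-box size divided by 65535, which is usually invisible for dense smoke.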

Grand Pabbie.

Another key approach Zoic takes for realizing shots quickly enough for an episodic TV schedule is in lighting. “Out of V-Ray we’re rendering what we call ‘light selects’,” says Orloff, “which is each individual light from Maya in our scene as a separate pass, and we’re actually doing the lighting mix in NUKE. We start with the blank set and we’re able to adjust the intensity and color of each light individually in each shot in NUKE without having to go back and re-render.” A look development artist at Zoic sets these light selects and then publishes the light mixes as a template for other artists – an important approach since there is often no on-set HDR capture for the greenscreen photography. “It’s really helpful for virtual sets,” states Orloff, “because you can’t have multiple artists doing multiple different color corrections per shot since it starts to bump in the edit. They all have to be consistent.”
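Because light transport is additive, per-light passes can be recombined in comp with arbitrary gains and tints and still match a full re-render. A minimal sketch of that mix, using NumPy in place of NUKE and hypothetical pass data:

```python
import numpy as np

def mix_light_selects(passes, gains, tints):
    """Additively combine per-light render passes ('light selects').

    passes: list of HxWx3 float arrays, one per light (hypothetical data)
    gains:  per-light scalar intensity multipliers
    tints:  per-light RGB colour multipliers
    """
    out = np.zeros_like(passes[0])
    for p, g, t in zip(passes, gains, tints):
        out += g * np.asarray(t) * p  # relight without re-rendering
    return out

# toy example: a key and a fill light on a tiny 2x2 image
key  = np.full((2, 2, 3), 0.8)
fill = np.full((2, 2, 3), 0.2)
img = mix_light_selects(
    [key, fill],
    gains=[1.5, 0.5],
    tints=[(1.0, 1.0, 1.0), (0.9, 0.9, 1.0)],  # fill tinted slightly blue
)
```

Changing a gain or tint here is the comp-side equivalent of re-dialling a Maya light, which is exactly why the template approach keeps shots consistent across artists.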

Water sims

This new season begins as Anna and Elsa’s parents’ ship battles the harsh ocean. For the digital water simulations, Zoic partnered with Fusion CI Studios, a company with extensive fluid sim experience. Their task was to re-create, as live action, the equivalent sequence from Frozen. Fusion capitalized on its suite of water and ocean tools for the work. “We had just recently finished some other ocean work where we had done a top to bottom re-design of our ocean tool,” notes Fusion visual effects supervisor Mark Stasiuk. “So we were so happy to be able to deploy those!”

Original plate.
Final shot.

The process involved a range of workflows depending on the type of water required. “We have a workflow for an ocean surface that just is natural in behavior and does not have any wide shots,” says Stasiuk. “Wide shots make a big difference! In that case you use a different simulation engine and a totally different approach where the first thing you do is design the surface and you do it in a bunch of components. The natural part of the surface is properly simulated but then we have all the art directed components, the art directed waves that come through which are the character aspects of this.”

“Our tools allowed us to isolate a particular wave form and add it to the surface,” continues Stasiuk. “We can actually compile surfaces that are made up of a natural ambient wave field combined with isolated wave forms we can place in position and timing. We can time warp them, deform them to make them steeper so they’re a little bit more scary, say. We can add those into this wave surface, and once we’ve got this dimensionally composited wave surface, we can then add on properly simulated foam and wave crests and white caps.”

Above: this Fusion CI breakdown shows their ocean work for a previous episode of Once Upon a Time called ‘The Stranger’. You can find out more in this case study here.
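The compositing of surfaces Stasiuk describes – a simulated ambient wave field plus isolated, art-directed hero waves that can be positioned, retimed and steepened – can be sketched in one dimension. All the wave shapes and parameters below are invented for illustration, not Fusion’s tools:

```python
import math

def ambient_field(x, t):
    """Natural ambient ocean height: a small sum of sines (toy spectrum)."""
    return (0.5 * math.sin(0.3 * x - 0.8 * t)
            + 0.2 * math.sin(0.9 * x - 1.7 * t + 1.3))

def hero_wave(x, t, centre, arrive, steepness=1.0, warp=1.0):
    """Art-directed wave: a Gaussian-windowed crest placed in space/time.

    centre/arrive position it; steepness scales the height ('more scary');
    warp retimes it. All parameters hypothetical.
    """
    local_t = warp * (t - arrive)
    envelope = math.exp(-((x - centre) ** 2) / 8.0) * math.exp(-local_t ** 2)
    return steepness * 1.5 * envelope

def surface(x, t, heroes):
    """Composite the simulated ambient field with isolated hero waves."""
    return ambient_field(x, t) + sum(hero_wave(x, t, **h) for h in heroes)

# one steepened hero wave arriving at x=10 around t=2
heroes = [dict(centre=10.0, arrive=2.0, steepness=1.4, warp=0.8)]
peak_height = surface(10.0, 2.0, heroes)
```

Foam, crests and white caps would then be simulated on top of the composited surface, as in the final step Stasiuk mentions.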

Zoic would send Fusion geometry for the boat as well as camera animation and a proxy ocean – a simplified ocean surface. “We do all our simulation design work,” says Stasiuk, “and send them back initially low res versions of the surface, low res versions of the wave crests and white caps. It gives them a chance to test out their lighting and do previews for the client and start getting approvals. As soon as they get those approvals we start cranking up the detail level. There were about 25-30 hero wave crests in each shot, and something like a couple of hundred white caps per shot and a foam pass on top of that. For the foam passes we give them interaction maps – the interaction between the particles and the surface – sometimes referred to as wet maps. And we have particle sets for the white caps, particle sets for the hero wave sets.” Fusion relied on RealFlow for the ocean surface, as well as in-house plugins to that software for specific looks. Lighting and rendering were handled by Zoic, who, in particular, used Krakatoa to render the particles that comprised the wave crests and white caps.
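A wet map of the kind Stasiuk mentions records where spray particles have interacted with the surface, typically with a decay so wet areas dry out over time. A minimal sketch of one frame of such an update – the data layout and constants are hypothetical:

```python
def update_wet_map(wet, particles, surface_points, radius=1.0, decay=0.95):
    """One frame of a wet-map update: decay old wetness, then stamp
    full wetness wherever a particle passes near the surface.

    wet: dict of surface-point index -> wetness in [0, 1] (hypothetical layout)
    particles / surface_points: 2D positions, for simplicity
    """
    wet = {i: w * decay for i, w in wet.items()}  # dry out over time
    for px, py in particles:
        for i, (sx, sy) in enumerate(surface_points):
            if (px - sx) ** 2 + (py - sy) ** 2 <= radius ** 2:
                wet[i] = 1.0  # freshly wetted by a nearby particle
    return wet

# toy example: one spray particle lands near the first of two surface points
pts = [(0.0, 0.0), (5.0, 0.0)]
wet = update_wet_map({0: 0.5, 1: 0.5}, particles=[(0.2, 0.1)], surface_points=pts)
```

Handing maps like this to the lighting side lets the renderer darken or sheen wetted geometry consistently with where the simulated foam and spray actually landed.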