The Midnight Sky used a ‘pop-up’ LED Volume stage for virtual production. ILM’s StageCraft, as used on The Mandalorian, is an option for filmmakers on any production: a temporary volume can be built and deployed, as the filmmakers did here for most of the sequences where George Clooney’s character looks out from the Barbeau Observatory.

In the film, also directed by George Clooney, Clooney’s scientist protagonist, Augustine Lofthouse, must venture further north into the Arctic Circle to warn off a returning spaceship following a global catastrophe. The film is set in 2049 and it is based on the novel Good Morning, Midnight by Lily Brooks-Dalton. Industrial Light & Magic (ILM) and Framestore were the principal VFX vendors, with Framestore covering most of the space footage around the interplanetary Aether spacecraft and ILM handling the ground-based visual effects and virtual production. Steve Ellis was ILM’s StageCraft CG Supervisor based in London.
The StageCraft LED Volume for The Midnight Sky was a bespoke volume built on a stage at Shepperton Studios in the UK. It was most prominently used for scenes in the film’s first act, where Lofthouse was in the Barbeau Observatory, looking out at the Arctic wilderness beyond, and communicating with the Aether ship from the Observatory control room. Two LED Volumes were constructed around the observatory set. The main wall was a relaxed ‘L’-shaped design measuring 43m wide and 6m high, which wrapped around the front and side of the set. A second curved wall was constructed and suspended from the stage ceiling to provide coverage for the window aperture of the control room in the upper level of the observatory. “All shots within the Barbeau Observatory used StageCraft not only for the exterior environment but for lighting the sets as well,” explains Ellis. “These were both very bespoke designs to fit around that particular set, which was really nice actually,” he remarked. The LED screens were primarily Black Pearl 2 (BP2) panels with a pitch of 2.84 millimeters and 1500 nits maximum brightness. The bottom of the smaller curved wall hung approximately 3m above the stage floor, allowing the production to shoot the observatory entrance scenes beneath it. “In addition to the two main volume walls, we also had two supplementary, height-adjustable LED ceiling panels, which would be used to balance the set lighting,” explains Ellis. “This allowed the DP to dial in the exact lighting scenario he wanted on a per-shot basis.”
Filming began in October 2019 in England and wrapped in Iceland in February 2020, just before the global pandemic. The Midnight Sky was shot by Martin Ruhe on Arri Alexa 65 cameras with the intent of screening it in IMAX theatres, but COVID put a stop to that plan. The scenes set on Earth and involving Clooney were shot before the end of 2019 and used the StageCraft LED Volume virtual production technology built around ILM’s Helios real-time renderer. StageCraft and the Helios renderer are entirely new toolsets that allow assets of final VFX quality to be displayed on the LED walls without requiring as much manual optimization, while allowing a high degree of reconfigurability “live on the day”. High-end VFX artists can focus on creating content without being as constrained by real-time performance requirements, which truly facilitates the creation of interactive on-set virtual production imagery.
StageCraft
For this to work, the team needs to start by calibrating the LEDs and making sure their output aligns with the VFX renders and the Helios on-set renders. This is one of the key calibration steps at the start of production and involves generating complex 3D LUTs.

“We have effectively complete control over the LEDs once we’ve sampled the screens. We built custom LUTs for the specific display device that we’ve constructed on set,” explains Ellis. There are several stages before the shoot, he adds. “We would produce test comps and get them signed off, in a normal workflow sense, by showing them to the production VFX supervisor, the DP and, of course, the Director, making sure everyone’s happy with the look of the content. We then engage the ILM advanced color science team to dial in the exact 3D LUT that will reproduce the correct colors when filmed through the Alexa camera on set.” Ellis believes the color science team are the “unsung heroes” of virtual production, “because, without this, none of it works effectively. I don’t think people realize the extreme output that you can get from these (LED) screens, and their color ranges can be wildly different from what you think due to the inherent technology. LED screens, or even professional LED lights, tend to have frequency spikes which can lead to very complex light responses on human skin, and that is actually remarkably hard to color grade out completely, even with modern grading desks.”

On season one of The Mandalorian, ILM worked with the ROE Black Pearl BP2 LEDs, and since then the team has refined calibrating and matching them to the color sensitivity and reproduction of the Alexa.
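As a rough illustration of what applying such a calibration involves, below is a minimal Python sketch of a 3D LUT being applied to a frame of wall content via trilinear interpolation. The lattice size, the identity LUT in the usage example, and the function names are hypothetical; ILM’s actual color pipeline is far more sophisticated.

import numpy as np

# Hypothetical sketch, not ILM's pipeline: apply a measured 3D calibration LUT
# (an N x N x N lattice of output RGB values indexed by input RGB) to a frame
# of wall content using trilinear interpolation.
def apply_3d_lut(image, lut):
    # image: (H, W, 3) float RGB in [0, 1]; lut: (N, N, N, 3) lattice
    n = lut.shape[0]
    coords = np.clip(image, 0.0, 1.0) * (n - 1)      # scale to lattice space
    lo = np.floor(coords).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    frac = coords - lo
    r0, g0, b0 = lo[..., 0], lo[..., 1], lo[..., 2]
    r1, g1, b1 = hi[..., 0], hi[..., 1], hi[..., 2]
    fr, fg, fb = frac[..., 0:1], frac[..., 1:2], frac[..., 2:3]
    # Blend the 8 surrounding lattice entries (trilinear interpolation).
    c00 = lut[r0, g0, b0] * (1 - fr) + lut[r1, g0, b0] * fr
    c10 = lut[r0, g1, b0] * (1 - fr) + lut[r1, g1, b0] * fr
    c01 = lut[r0, g0, b1] * (1 - fr) + lut[r1, g0, b1] * fr
    c11 = lut[r0, g1, b1] * (1 - fr) + lut[r1, g1, b1] * fr
    c0 = c00 * (1 - fg) + c10 * fg
    c1 = c01 * (1 - fg) + c11 * fg
    return c0 * (1 - fb) + c1 * fb

# Usage: a 33-point identity LUT leaves the frame unchanged; a measured LUT
# would instead encode the screen-plus-camera colour response.
grid = np.linspace(0.0, 1.0, 33)
identity_lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
frame = np.random.rand(1080, 1920, 3).astype(np.float32)
calibrated = apply_3d_lut(frame, identity_lut)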
Assets and Stage-Ready Conversion
Real-time rendering is clearly demanding, particularly in regard to asset creation for real-time, responsive playback. While it is commonly understood that assets need to be smaller or less complex for real-time systems, ILM’s approach is highly tuned and more nuanced than simply reducing the polygon count of a 3D asset for real-time use.
ILM does have automated processes to make their assets stage-ready, but it is much more complex than just reducing the poly count. Interestingly, with Helios and modern NVIDIA GPUs, polygon complexity is generally not the bottleneck. “Our renderer can actually handle huge amounts of polygons, as in millions of polygons, in real-time. It is more about the optimizations around texture memory,” explains Ellis.
The modeling and generalist teams produce assets for StageCraft projects at feature-film levels of detail and complexity, leveraging the many years of experience in the ILM artist base. From there, one of the automated processes handles tasks such as texture packing. An ILM feature-resolution asset can have many, many UDIMs, and some of the processes perform automatic texture packing that reduces the number of UDIMs, but at a higher resolution per tile. UDIM is an automatic UV offset system that was made popular by the Foundry’s MARI and Katana. It assigns an image to a specific UV tile, which allows you to use multiple lower-resolution texture maps for neighboring surfaces, producing a higher-resolution result without having to resort to a single, ultra-high-resolution image. But texture memory is a critical resource on GPUs, so packing several UDIMs into one higher-resolution texture is worth exploring.

ILM’s Helios real-time renderer is able to handle extreme scene complexity, with millions and millions of polygons. “We’ve recently introduced Vulkan into the renderer, and we are now doing real-time ray tracing in Helios, although it was not used this way on The Midnight Sky.” Vulkan is a new-generation graphics API that provides high-efficiency, cross-platform access to modern GPUs. Newer graphics APIs like Metal and Vulkan are designed with multi-threading in mind and are thus well suited to virtual production pipelines. The Vulkan Ray Tracing extensions were released to the community in November 2020, allowing ray tracing functionality alongside Vulkan’s rasterization framework, though initial versions had been available for some time. Vulkan was the first open, cross-vendor, cross-platform standard for hardware ray tracing through a fully vendor-agnostic API.
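To illustrate the texture-packing idea described above, here is a hypothetical Python sketch that packs several equal-sized UDIM tiles into one larger atlas and returns the UV remap for each tile. The grid layout, tile resolution, and function names are assumptions for illustration only, not ILM’s stage-ready conversion tools.

import math
import numpy as np

# Illustrative only, not ILM's stage-ready conversion: pack several equal-size
# UDIM tiles into one atlas texture so the GPU binds a single map, and return
# the UV remap (offset and scale) for each original tile.
def pack_udim_tiles(tiles):
    udims = sorted(tiles)
    res = tiles[udims[0]].shape[0]                    # assume square tiles
    cols = math.ceil(math.sqrt(len(udims)))           # simple square-ish grid
    rows = math.ceil(len(udims) / cols)
    channels = tiles[udims[0]].shape[2]
    atlas = np.zeros((rows * res, cols * res, channels),
                     dtype=tiles[udims[0]].dtype)
    uv_remap = {}
    for i, udim in enumerate(udims):
        r, c = divmod(i, cols)
        atlas[r * res:(r + 1) * res, c * res:(c + 1) * res] = tiles[udim]
        # new_uv = old_uv * scale + offset, expressed in the atlas' 0-1 space
        uv_remap[udim] = {"offset": (c / cols, r / rows),
                          "scale": (1.0 / cols, 1.0 / rows)}
    return atlas, uv_remap

# Usage: four hypothetical 1K tiles (UDIMs 1001-1004) become one 2K atlas,
# i.e. one texture bind instead of four.
tiles = {1001 + i: np.random.rand(1024, 1024, 3).astype(np.float32)
         for i in range(4)}
atlas, remap = pack_udim_tiles(tiles)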
Midnight Exterior
The content for the StageCraft volumes was shot in Iceland by a splinter unit a week before principal photography began. A custom camera array rig, built from three Arri Alexa 65 cameras with a combined FOV of around 160 degrees, was designed for the shoot by ILM in collaboration with the production camera department. A small crew, comprising the 1st and 2nd camera ACs, the B camera operator, and the key grip, as well as the production VFX Supervisor, an ILM stills photographer, and the art department’s drone operator, flew out to Iceland for a week-long shoot on the Joklasel Glacier.
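As a back-of-the-envelope illustration of how three cameras can reach roughly a 160-degree combined field of view, the sketch below uses a simple pinhole model. The Alexa 65 sensor is approximately 54mm wide; the focal length and seam overlap are assumptions chosen for illustration, not the production rig’s actual specification.

import math

# Simple pinhole-model estimate; the 50mm focal length and 5-degree seam
# overlap are assumptions, not the production rig's actual specification.
def horizontal_fov(sensor_width_mm, focal_length_mm):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

sensor_width = 54.12        # mm, approximate Alexa 65 sensor width
focal_length = 50.0         # mm, assumed lens choice
overlap = 5.0               # degrees of stitching overlap per seam, assumed

per_camera = horizontal_fov(sensor_width, focal_length)   # ~57 degrees
combined = 3 * per_camera - 2 * overlap                   # ~160 degrees
print(f"per camera: {per_camera:.1f} deg, combined: {combined:.1f} deg")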
Meanwhile on set, the team worked carefully to get the correct lighting, and the correct negative lighting, from structures that existed only on set but would still influence the digital landscape outside the observatory. For example, “for The Midnight Sky, we did have a digital model of the observatory itself that was also casting light back onto the environment; it’s quite a subtle thing, but it was there,” outlines Ellis. In the nighttime scenes, the DP could use virtual production technology to film and expose both the outside and the inside of the Observatory together, in ways a real location would not have allowed. “I think that was the thing that the DP liked the most. Obviously, if you’ve got all the lights on inside the observatory at night, it would be black outside, but it is also reasonable that that light would spill out onto the environment from the observatory.” Some of the setups with the exterior of the Observatory were solved with pre-baking. “We did use pre-baked lighting as a layer that we could use to illuminate the snow outside. We also used the render to pre-bake the beautiful shadows coming off the lumps in the snow and any kind of rocks, etc., outside.” The on-set team had that as a pre-baked texture that they could switch on and off. “If the DP wanted to have more light, we had control over that, as we could easily increase the exposure of that light.”
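A minimal sketch of that idea, a pre-baked light and shadow layer that can be toggled and re-exposed live on top of the base environment, might look like the following. The compositing math and names are illustrative assumptions, not ILM’s Helios implementation.

import numpy as np

# Illustrative compositing of a pre-baked light/shadow layer on top of the
# base environment, with a live exposure control in stops.
def shade_environment(base_env, baked_spill, enabled=True, exposure_stops=0.0):
    # base_env, baked_spill: (H, W, 3) linear radiance textures
    if not enabled:
        return base_env
    gain = 2.0 ** exposure_stops        # each stop doubles or halves the layer
    return base_env + baked_spill * gain

# Usage: nudge the baked observatory spill up by one stop for a given shot.
env = np.random.rand(1080, 1920, 3).astype(np.float32)
spill = np.random.rand(1080, 1920, 3).astype(np.float32) * 0.1
frame = shade_environment(env, spill, enabled=True, exposure_stops=1.0)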
The tone of the film and the shooting style selected by Clooney and DP Ruhe led to these melancholic, isolated, static viewpoints to show this solitary character within an isolated Observatory. “For one of the first scenes we shot, George had this idea to set focus very short and lock the camera off, then without racking focus, have Lofthouse walk through the Observatory into the focal plane; this worked perfectly when matched up with the StageCraft lighting and environments, resulting in the majority of shots in this sequence being captured in-camera, as opposed to the traditional approach of using green or blue screens and adding the background environments in post,” explains Ellis. The exquisite control of the StageCraft system allows the DP to set the color temperature and exposure of the outside environment to be exactly as they’d like for the shot, so that they can then compose their shot with control over both inside and outside set lighting. This is not possible with a greenscreen or bluescreen, where the DP has no reference for the exterior exposure, as it will be set later in post-production. Not only does this approach lead to better-looking images, but as Ellis noted, it also has the advantage of shifting a chunk of the visual effects work from post-production to pre-production, with the final shots captured in-camera during the shoot. This reduced the number of shots required in post-production, allowing the effects team to focus on the film’s other large sequences involving the space-based portion of the story in the later acts.

While most of the outside sets could be thought of as ‘3D’, there was one very interesting shot, added late in the production, that posed some unique problems. There is one key shot as our heroes are leaving the Observatory, seemingly filmed from inside the now empty observatory, with the Skidoo seen driving away for the last time. The shot was framed as if filmed from the deserted cafeteria area looking down at the snow. “That was a fascinating shot because that was actually shot out on location,” recalls Ellis. Right before the shot was to be filmed, the production came to the ILM team with that live-action clip and said, “Could you get this clip into StageCraft?” recounts Ellis. “To use the terminology of StageCraft, that shot alone would be its own load.” A ‘load’ is what ILM calls a virtual production LED scene. Ellis and the team immediately set to work stitching that footage into the rest of the panoramic photography and projection mapping it so the perspective of the Skidoo would look correct from the camera’s POV, as the Skidoo was not 3D, and so the normal virtual production tools of self-adjusting parallax would not apply. “That was a really interesting shot to do. It wasn’t something that we’d ever really planned to do, but it was just something that was brought to us on the day before.” It does, however, illustrate the flexible nature of both StageCraft and the way the ILM team works on set.
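Conceptually, projection mapping of this kind computes texture coordinates by projecting the environment geometry through a virtual camera placed where the plate was shot, so the footage lines up correctly only from that viewpoint. The sketch below is a simplified, hypothetical illustration of that projection step; the camera transform, focal length, and plate resolution are placeholder assumptions, not the shot’s actual metadata.

import numpy as np

# Simplified, hypothetical camera projection: compute plate UVs for
# world-space vertices by projecting them through a virtual camera at the
# plate's shooting position (camera looks down -Z).
def project_to_plate_uv(points_world, cam_to_world, focal_px, plate_res):
    world_to_cam = np.linalg.inv(cam_to_world)
    pts = np.concatenate([points_world,
                          np.ones((len(points_world), 1))], axis=1)
    cam = (world_to_cam @ pts.T).T[:, :3]
    x = focal_px * cam[:, 0] / -cam[:, 2] + plate_res[0] / 2.0
    y = focal_px * cam[:, 1] / -cam[:, 2] + plate_res[1] / 2.0
    return np.stack([x / plate_res[0], y / plate_res[1]], axis=1)

# Usage: project a few snowfield vertices into a hypothetical 4K plate.
cam_to_world = np.eye(4)                  # assumed plate camera transform
verts = np.array([[0.0, -2.0, -30.0],
                  [5.0, -2.0, -40.0],
                  [-5.0, -2.0, -50.0]])
uvs = project_to_plate_uv(verts, cam_to_world, focal_px=3000.0,
                          plate_res=(4096, 2160))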
As mentioned above, there are always memory limits with virtual production, and the team had to work within the real limits of the amount of RAM on the graphics cards. Even with the top-of-the-line cards used at the time of filming (Nov 2019), this high-resolution Skidoo sequence for The Midnight Sky was tricky to make fit. “We were able to play back a sequence of about a minute of the unedited clip,” Ellis recalls. Based on these production experiences, ILM has since upgraded and developed StageCraft even further to allow for much longer streaming clips in complex scenes, he reports.
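Some rough arithmetic shows why even a minute of high-resolution playback pushes against GPU memory: a minute of uncompressed 4K, 16-bit frames is far larger than the 24 to 48 GB of VRAM on top-end cards of that era, which is why streaming and compression matter. The resolution, bit depth, and frame rate below are assumptions for illustration only.

# Back-of-the-envelope memory estimate; resolution, bit depth and frame rate
# are assumptions for illustration.
width, height, channels = 4096, 2160, 3
bytes_per_channel = 2                 # 16-bit (half-float) textures
fps, seconds = 24, 60

frame_bytes = width * height * channels * bytes_per_channel
clip_bytes = frame_bytes * fps * seconds
print(f"one frame : {frame_bytes / 2**20:.1f} MiB")    # ~50.6 MiB
print(f"one minute: {clip_bytes / 2**30:.1f} GiB")     # ~71.2 GiB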

StageCraft is now being deployed around the world on ILM projects. In upcoming articles, we will see how this and other new technologies have been used in episodic TV as well as films.