Inside the (LED) bunker of Fallout

Fallout: Traditional VFX from last week’s FMX panel discussion. Copyright: Amazon Prime Video

Amazon Prime’s Fallout is based on one of the great video games of all time. Jonathan Nolan and Lisa Joy are executive producers, and the creators of the show (and showrunners) are Geneva Robertson-Dworet and Graham Wagner. In 2020, Amazon purchased the rights to produce a live-action project. Jay Worth was the visual effects supervisor, and he was at FMX in Germany last week, discussing the series with the award-winning visual effects team. The panel presentation, with Jay Worth (overall VFX supervisor), Andrea Knoll (producer), and Andreas Giesen (Rise FX), explored how physical and digital production methods were utilised to create the world of Fallout.

The bunker, retrofuturistic society, and above-ground resource war required an enormous amount of VFX. We will have an upcoming podcast with Jay Worth, but one technique the production used very effectively was the LED volume, both as a VFX tool and a story device. We spoke to Magnopus co-founder and Oscar winner Ben Grossmann.

EP & Director Jonathan Nolan on Fallout Season 1.   Photo: JoJo Whilden/Prime Video

Fallout is the story of haves and have-nots in a world in which there’s almost nothing left to have. Two hundred years after the apocalypse, the gentle citizens of a luxury fallout shelter are forced to return to the incredibly complex, gleefully weird and highly violent universe waiting for them above.

The story takes place after a global thermonuclear war on Saturday, October 23, 2077. It’s unclear who fired first or what happened, but billions died, world governments and economies collapsed, and the climate was ravaged, leading to permanent changes to the global ecosystem.

To build the LED volume, the production turned to Magnopus in LA. Magnopus is a content-focused technology studio that has been involved in virtual production for the past eight years. Based in Los Angeles and London, the studio creates award-winning work in VR/AR and virtual production on projects such as The Lion King and Westworld. The Magnopus team has been instrumental in developing and deploying LED volumes since their inception. Building a volume of this complexity during Covid, especially given the custom innovations that were required, meant getting extra firepower in a few key areas. Magnopus leveraged its close working relationship with Fuse Technical Group and the MBS Group to collaborate on volume design, deployment, and operations. This created a modular, flexible LED stage that could even be used by other filmmakers between shooting blocks.

Cinematographer Bruce McCleary

The brief was to bring the post-apocalyptic world to production using the most advanced LED volume technology available. Over the season, 37 members of Magnopus worked on the project, led by Virtual Production (VP) Supervisors Kalan Ray and Kathryn Brillhart, along with Virtual Art Department (VAD) Supervisor Craig Barron. The specific VP and VAD heads were Kat Harris and Devon Mathis. It is significant to note that during this production, the director and the creative team worked with the art department and the virtual art department as just one team. Both creative teams worked completely side by side, integrated in a way not often seen in LED stage work. “The virtual art department was brought in, not as a vendor but as part of the art department,” Ben Grossmann explains. “So the Production Designer worked with the VAD as just part of the art department. There weren’t two teams on either side of a divide – as there sometimes can be. It was a great relationship.”

Ben Grossmann also credits another production move with significantly improving the speed and quality of the LED capture volume work: having a dedicated cinematographer on the LED volume for all episodes. Multiple directors and creative teams are nearly always assigned to a series, so the work on individual episodes can overlap with the episodes following. By having Bruce McCleary, the highly experienced second unit DOP, as the one person running the stage, all the processes could be streamlined, and most of the lighting and calibration issues could be solved before the shooting of each new episode.

The creative and supervision team worked on the project for four months, creating content. Stage commissioning and testing took approximately five weeks, and the LED stage was used for principal photography for four weeks. Ben Grossmann recalls, in particular, that Jonathan Nolan, who also directed the first three episodes, had a deep understanding of the process. As Nolan “was brainstorming shots and sequences, he was as interested in the implications of what was being suggested as he was in the creative impact – which created a unified story. He wanted it to be a success, so he was very engaged with everyone (on the VAD and VP team).”

In total, the team created three unique and vital environments in UE for use on the LED volume; this granted the production flexibility and efficiency that outpaced traditional greenscreen methods. The story lent itself to the technology of the LED volume: in the series, a Telesonic projector simulates a Nebraska cornfield landscape within the vault, so the LED volume is used both as a technical tool and a story point, although in the story the wall is meant to be a projected image. Interestingly, in the first episode, this meant that the team had to simulate the digital cornfield images burning up, as if the film itself were on fire on the ‘projected’ screen – which was actually a real LED volume.

The Fallout setting provided several locations that work well on an LED volume. With In-Camera Visual Effects (ICVFX), Magnopus knew they could shoot the actors and the crew directly in a massive underground vault or a post-apocalyptic landscape. Expanding on Magnopus’ previous collaboration with Kilter Films on Westworld, it became evident that real-time 3D assets created within Unreal Engine would work best. UE offered flexibility, allowing the team to creatively modify the set during the pre-production phase.

Magnopus joined the creative process before the project was fully green-lit. They started by assisting in the initial design phase of a volume located in Bethpage, Long Island. At this time, very few stages were available, and almost none would meet the production’s requirements.

Working with their development partners, Magnopus completed construction of the LED stage with Manhattan Beach Studios and Fuse Technical Group. The team then commissioned the volume and brought the various technical systems online in preparation for the shoot. The stage combined camera tracking solutions with custom object tracking, allowing the filmmakers to move wild walls and re-project onto them in real-time using Unreal Engine. Magnopus tested lighting alongside the DP and gaffer by integrating a DMX lighting control system into the stage’s dim-board. Importantly, the project was always slated to be shot on 35mm film – which is unusual for LED projects. Critically, the team fine-tuned the genlock, data recording, timecode, content management, and systems operations with the Fuse TG team in preparation for principal photography. Ben Grossmann believes that the film component introduced a new dimension beyond playback in terms of calibration and colour science. “It certainly helped add from a realism standpoint,” he says, given the nature of film and its properties. Shooting on film required extensive testing and even minor tweaks in the grade. This entire process was so successful that, unlike on many productions, imagery on the LED screens was not replaced with similar high-resolution VFX clips in post. The production managed to capture final pixels on stage and use them seamlessly in the final edit.
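
To give a sense of the kind of DMX integration described above, here is a minimal Python sketch using the open-source sacn library to stream E1.31 (sACN) levels to a fixture. The universe number, unicast address, and channel layout are placeholders for illustration; the production stage used a dedicated dim-board rather than a script like this.

```python
import time
import sacn

# Minimal sACN (E1.31) sketch: stream DMX levels to one universe.
# Universe number and node address are hypothetical, not production values.
sender = sacn.sACNsender()
sender.start()

sender.activate_output(1)               # DMX universe 1
sender[1].destination = "192.168.1.50"  # hypothetical sACN node address

# Fade a single RGB fixture on channels 1-3 from black towards warm white.
for level in range(0, 256, 8):
    sender[1].dmx_data = (level, int(level * 0.8), int(level * 0.6)) + (0,) * 509
    time.sleep(0.05)

sender.stop()
```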

On the content front, Magnopus collaborated with the filmmakers to determine which elements from the script were best suited to an LED volume. Working with the production designer on concept artwork helped identify which portions of the sets could be built practically and which would be modelled virtually. Sets were built entirely virtually inside Unreal Engine, allowing for multiple opportunities to scout them by putting the filmmakers into VR and letting them block out the action with virtual cameras and blocked character animation. The team identified which areas of the sets would be seen in more detail than others; those areas had their lighting refined and more fidelity added. Overviews of the complete set were made in Unreal so the team could see where every storyboard frame was intended to be filmed from – in context – in real-time 3D.
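
As a rough illustration of how storyboard frames can be pinned into an Unreal scene, the sketch below uses Unreal’s editor Python API to spawn a labelled CineCamera at each board’s position. The board names and coordinates are invented for the example; this is not Magnopus’ actual tooling.

```python
import unreal

# Hypothetical storyboard data: frame id -> (location in cm, rotation in degrees).
boards = {
    "ep101_sc04_fr01": ((0.0, 0.0, 180.0), (0.0, 0.0, 0.0)),
    "ep101_sc04_fr02": ((350.0, -120.0, 160.0), (0.0, -5.0, 90.0)),
}

for name, (loc, rot) in boards.items():
    # Spawn a CineCamera at the position the storyboard frame implies.
    camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
        unreal.CineCameraActor,
        unreal.Vector(*loc),
        unreal.Rotator(*rot),
    )
    camera.set_actor_label(name)  # tag the camera with its storyboard frame id
```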

The Visualization team used storyboards, real-time sets, and the blocked cameras from the virtual scouts to conduct both previz and techviz reviews with the filmmakers. Every shot was visualised from the perspective of the camera, allowing the creatives to understand where and how the physical and virtual sets came together. The team found these methods provided ample opportunities for the filmmakers to make creative decisions on framing, lighting, and action blocking very early in development and even helped the ADs schedule shoot days by visualizing how large pieces of equipment would enter the set and where they could be staged.

Some sets used 180-degree photography instead of real-time virtual 3D sets. The team worked with drone operators to design a multi-camera capture solution and stitching workflows to create footage for the LED wall. They then helped design custom software for playing back that footage on a 20ft x 95ft screen in 6K resolution at 24fps. On the day of shooting, the scene was filmed by multiple simultaneous film cameras through the glass of a gimbaling aerial vehicle.
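
A quick back-of-the-envelope calculation suggests why off-the-shelf playback was not enough. Assuming “6K” means roughly 6144 pixels across the 95ft dimension (the actual panel specification was not published), the numbers work out as follows:

```python
# Back-of-the-envelope figures for the 20 x 95 ft playback screen.
# Assumes "6K" = 6144 px across the 95 ft width; real panel specs may differ.
WALL_W_FT, WALL_H_FT = 95, 20
PX_W = 6144

pitch_mm = WALL_W_FT * 304.8 / PX_W          # ~4.7 mm pixel pitch
px_h = round(PX_W * WALL_H_FT / WALL_W_FT)   # ~1293 px tall
frame_bytes = PX_W * px_h * 3 * 2            # 16-bit RGB per frame, ~48 MB
rate_gbps = frame_bytes * 24 * 8 / 1e9       # ~9 Gbit/s uncompressed at 24 fps

print(f"pitch {pitch_mm:.1f} mm, {PX_W}x{px_h} px, {rate_gbps:.1f} Gbit/s")
```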

Camera tests allowed for both creative and technical review of the images projected onto the LED wall by looking through the lens of a digital camera with an applied film Lookup Table (LUT) to approximate the show’s final look. Using this workflow, Magnopus adjusted the colour of each virtual scene to blend seamlessly with the physical environment so that it appeared as intended once captured on 35mm film. Throughout development, Magnopus collaborated closely with the director to fine-tune the shoot day requirements, ensuring the team could account for any special technical considerations that might arise from putting a camera in a given location.
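
The LUT step itself can be approximated in a few lines. The sketch below uses the open-source colour-science library, with a hypothetical .cube file standing in for the show’s actual film-emulation LUT:

```python
import numpy as np
import colour

# Load a 3D LUT; "film_emulation.cube" is a hypothetical stand-in for
# the show's actual film-print emulation LUT.
lut = colour.read_LUT("film_emulation.cube")

# A preview frame from the digital test camera, linear RGB in [0, 1].
frame = np.random.random((1080, 1920, 3))

# Apply the LUT so on-set monitoring approximates the final 35mm look.
graded = lut.apply(frame)
```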

“The set Howard and the art department built, combined with the assets that Ben Grossmann, AJ Sciutto and the team at Magnopus put together, created a flawless environment. We got the footage back after the first day, and we couldn’t tell where the practical set ended and the virtual one began.”

Jay Worth, Visual Effects Supervisor on Fallout

In addition to the main LED walls, multiple large portable panels were mounted on wheels and calibrated and tracked inside the volume. The team had set up object tracking inside the volume for props, so this was also applied to the portable panels. “These panels were custom built so they could flip up or lay flat, and they had tracking systems built into them so you could move them around in the set,” Ben explains. “With the tracking, the team could reconfigure them in the nDisplay system, so we did not have to go back and recalibrate the volume every time we wanted to move a ‘wild’ wall in quickly on set.” This meant that if an additional image was needed on the floor or inside a particular blocking, these portable screens could be wheeled in and, significantly, fully logged, displaying the right images relative to the master camera.
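
To make the idea concrete, here is a small numpy sketch of the geometry involved: converting a panel’s tracked pose into world-space corner positions, which is the kind of information a renderer such as nDisplay needs to re-project content correctly. The function and the panel dimensions are illustrative, not Magnopus code.

```python
import numpy as np

def panel_corners(position, rotation, width_m=3.0, height_m=2.0):
    """World-space corners of a flat tracked panel.

    position: (3,) translation from the tracking system.
    rotation: (3, 3) rotation matrix from the tracking system.
    Dimensions are illustrative defaults, not the production panels.
    """
    hw, hh = width_m / 2.0, height_m / 2.0
    local = np.array([[-hw, -hh, 0.0],   # bottom-left
                      [ hw, -hh, 0.0],   # bottom-right
                      [ hw,  hh, 0.0],   # top-right
                      [-hw,  hh, 0.0]])  # top-left
    return local @ rotation.T + np.asarray(position)

# Example: a panel wheeled 4 m forward and turned 30 degrees about vertical.
theta = np.radians(30)
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
print(panel_corners([0.0, 1.0, 4.0], R))
```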

Magnopus, Manhattan Beach Studios and Fuse Technical Group successfully delivered critical creative development and stage operations, running asset management, virtual art execution, camera tracking systems, and data wrangling – all in a flexible and very effective LED volume that did not require extensive removal and replacement in post. It is rare to have so much ICVFX work remain unreplaced in post, and thus the volume delivered on the promise of final pixels in camera.