Territory Studio is an award-winning independent design company established in 2010. Although the company is best known for the user interface designs seen in tentpole films, it also works on broader design, digital VFX, and title design. The team’s credits include motion graphics and VFX for The Batman, Dune, No Time to Die, Spider-Man: Far From Home, Avengers: Infinity War and Endgame, The Martian, and many more.

Recently the company received an Epic MegaGrant, and the team is now able to offer a dramatically new form of interactive on-set screen graphics, rendered in real time with Unreal Engine.

The company’s new proprietary technique harnesses Unreal Engine to enable live, in-camera, 3D screen graphics that track to screen placements on set. Marti Romances, Co-Founder and Creative Director at Territory Studio, is the first to point out that others have attempted something like this before; as he puts it, it is “a different use of something we have seen before.” He points to early experiments with Vive trackers or Wii sensors on people’s heads, among other attempts. For Marti, this is a genuine innovation and a very effective new tool for engaging the audience and helping the creative team on set. It is also a far more cost-effective solution for many productions, as they can deliver new visuals in camera as final pixels. “Part of me was ‘we just need to try and do this!’ If someone else does this first, I will hate myself for not trying, because our business is very much focused on this style of work,” he joked when discussing the development effort it took to perfect the new system.

The easiest way to think of this approach is as LED-volume-style camera mapping applied to desktop monitors or handheld props, displaying correct-perspective graphics so that, in camera, they appear to be ultra-high-tech holographic devices. Just as an LED volume updates the wall with the correct parallax to give the illusion that it is actually a vast vista, the new approach updates each screen so its graphics appear to float holographically in 3D.
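Under the hood, this kind of screen mapping is typically done with an off-axis (“generalized perspective”) projection: given the tracked camera position and the 3D corners of the physical screen, you build a view frustum pinned to the screen plane. A minimal sketch in Python/NumPy, following Kooima’s well-known formulation (the function name is illustrative, not Territory’s actual code):

```python
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Combined projection-view matrix for a physical screen with corners
    pa (lower-left), pb (lower-right), pc (upper-left), seen from the
    tracked camera/eye position pe (Kooima's generalized perspective)."""
    pa, pb, pc, pe = (np.asarray(v, float) for v in (pa, pb, pc, pe))
    vr = (pb - pa) / np.linalg.norm(pb - pa)       # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)       # screen up axis
    vn = np.cross(vr, vu)                          # screen normal
    vn /= np.linalg.norm(vn)
    va, vb, vc = pa - pe, pb - pe, pc - pe         # eye-to-corner vectors
    d = -(va @ vn)                                 # eye-to-screen distance
    l = (vr @ va) * near / d                       # frustum extents at near plane
    r = (vr @ vb) * near / d
    b = (vu @ va) * near / d
    t = (vu @ vc) * near / d
    P = np.array([[2*near/(r-l), 0.0, (r+l)/(r-l), 0.0],
                  [0.0, 2*near/(t-b), (t+b)/(t-b), 0.0],
                  [0.0, 0.0, -(far+near)/(far-near), -2*far*near/(far-near)],
                  [0.0, 0.0, -1.0, 0.0]])
    M = np.eye(4); M[:3, :3] = np.stack([vr, vu, vn])  # rotate into screen basis
    T = np.eye(4); T[:3, 3] = -pe                      # move eye to origin
    return P @ M @ T
```

As the tracker streams a new camera transform each frame, this matrix is rebuilt, so the graphics hold the correct parallax from the film camera’s point of view.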

Note that the object is tracked in real time, so it can move at the same time as the camera

On a set, these new motion graphics and high-tech user interfaces can be shown on static or mobile/portable screens as required. If the set has fixed monitors, such as in a control room, then once the system is calibrated only the camera needs to be tracked. In the example above, both the handheld tablet and the camera are tracked, so the illusion holds while actors handle the high-tech prop or screen.
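When both the prop and the camera are tracked, what the renderer needs each frame is the camera’s pose expressed in the prop’s local frame; the illusion then survives the actor moving the tablet and the operator moving the camera at the same time. A minimal sketch with hypothetical helper names:

```python
import numpy as np

def make_pose(rotation, translation):
    """Build a 4x4 world transform from a 3x3 rotation and a translation."""
    m = np.eye(4)
    m[:3, :3] = rotation
    m[:3, 3] = translation
    return m

def camera_in_prop_frame(world_from_prop, world_from_camera):
    """Camera pose relative to the tracked prop:
    prop_from_camera = inverse(world_from_prop) @ world_from_camera."""
    return np.linalg.inv(world_from_prop) @ world_from_camera
```

If the actor walks the tablet a metre to the left and the camera follows by exactly the same amount, the relative pose, and therefore the rendered parallax, is unchanged.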

The company is used to providing multiple screens for control rooms, such as in The Martian, which used traditional monitor screen graphics. (Image courtesy of Twentieth Century Fox)

At first, the team assumed the process would be computationally very expensive, but all the monitors in a high-tech control room can be treated as just one render target inside UE4, allowing a typical set to be run off a single machine. As the camera approaches any one prop or screen, that screen grows larger in frame, but it can never be more than the whole frame from the UE4 render perspective. Conceptually, there is no difference between one monitor filling the frame and a bank of monitors that together cover most of the frame. While a stack of physical monitors would need multiple graphics outputs, UE4 just needs to render one consistent set of monitor displays that align with their angles to the master real-world camera.
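One way to picture the single-render-target idea (a hypothetical sketch, not Territory’s implementation): pack every monitor feed into a sub-rectangle of one shared texture atlas, and tell each physical screen which UV rectangle to display. GPU cost is then bounded by the atlas resolution, not by the monitor count.

```python
import math

def atlas_rect(screen_index, screen_count):
    """Return the (u0, v0, u1, v1) sub-rectangle of a single render-target
    atlas assigned to one monitor, using a simple square grid layout."""
    cols = math.ceil(math.sqrt(screen_count))
    rows = math.ceil(screen_count / cols)
    col, row = screen_index % cols, screen_index // cols
    return (col / cols, row / rows, (col + 1) / cols, (row + 1) / rows)
```

Under this layout, stress-testing with 250 screens would give a 16 x 16 grid, with each screen sampling roughly a 1/16 x 1/16 slice of the shared texture.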

The team started with just one screen and then moved to stress testing with 250 screens; that was when they realized the network was not being capped. “Because you are effectively broadcasting one 3D environment, not 250,” Marti Romances explains. This assumes the screens are not moving individually and are scanned and locked at the outset.
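Some back-of-the-envelope arithmetic shows why sharing one 3D environment scales where per-screen video feeds would not. All figures below are illustrative assumptions, not measured numbers from the production system:

```python
# Illustrative bandwidth comparison; every figure here is an assumption.
screens = 250
fps = 30

# Streaming rendered video to every screen: 1080p, 3 bytes/pixel, uncompressed.
per_stream = 1920 * 1080 * 3 * fps       # bytes/s for one screen
video_total = per_stream * screens       # grows linearly with screen count

# Broadcasting one 3D environment: the graphics live in one shared scene and
# only the tracked camera/prop poses cross the wire (say ~200 bytes/frame).
pose_total = 200 * fps                   # independent of screen count

print(f"video: {video_total / 1e9:.1f} GB/s  poses: {pose_total / 1e3:.1f} KB/s")
```

With these assumptions the naive approach needs roughly 46.7 GB/s of video across the set, while broadcasting the scene state stays in the kilobytes-per-second range regardless of how many screens are added.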

For static screens, the system just needs to be calibrated; after that, only the camera needs to be tracked

The graphics can be synchronized to different camera setups, matching frame rate, resolution, and lenses. This approach replaces the typical post-production compositing workflow, which saves production teams time and cost, and gives greater creative control.
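Matching the virtual camera’s lens to the physical one largely comes down to the standard pinhole relationship between sensor size and focal length, fov = 2·atan(sensor_width / (2·focal_length)). A minimal sketch (real pipelines also apply measured lens distortion on top of this):

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal field of view of a distortion-free (pinhole) lens, in
    degrees, from the sensor width and focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))
```

For example, a 35 mm lens on a roughly 24.9 mm-wide Super 35 sensor gives about a 39-degree horizontal field of view, which is what the virtual camera must be set to for the in-camera graphics to line up.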

The virtual screen graphics can also be made interactive for actors. Live and on set, actors can use motion to trigger screen graphics cues during scenes to realistically convey important plot points. This means complex graphics can be captured as final pixels in camera, without the need for roto or green screens. Even if the screens do need to be adjusted at a later stage, the actors have approximately the right contact lighting and visual cues to aid their performances.
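A motion-triggered cue can be as simple as watching a tracked position cross into a volume placed around the screen. A toy sketch (the names and the one-shot design are hypothetical, not the studio’s actual system):

```python
def inside_trigger_volume(point, box_min, box_max):
    """True when a tracked position (e.g. an actor's hand) lies inside an
    axis-aligned trigger box placed around a prop screen."""
    return all(lo <= p <= hi for p, lo, hi in zip(point, box_min, box_max))

def update_cue(hand_pos, fired, box_min, box_max):
    """Fire a one-shot graphics cue on first entry into the trigger volume;
    once fired, the cue stays fired for the rest of the take."""
    if not fired and inside_trigger_volume(hand_pos, box_min, box_max):
        return True  # e.g. start the on-screen animation
    return fired
```

Polling this per frame against the live tracking data lets an actor’s gesture start an animation at exactly the right story beat.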


A virtual production approach to screen graphics gives directors, DOPs, and set teams immediate creative access to CG in live playback as the project comes together on set. The team has tested the technology with a wide variety of cameras and lenses, from professional RED Epics to iPhones.

Producing plausible screen graphics is an extremely challenging design task: the audience needs to understand an often futuristic computer display that may have to convey important plot points. This new form of virtual production screen hologram adds another element to the design language the team at Territory Studio can use to communicate the story. Marti Romances is the first to agree that, by itself, it does not solve any design issues, but it does provide an apparently active third dimension, a very exciting opportunity to push screen graphics in new and original ways.
