Visual Disruptors Podcast 7: The Future Group

In the latest podcast in our Visual Disruptors series, Mike Seymour chats with Øystein Larsen, Chief Creative Officer at The Future Group. They discuss the challenges of producing live cross-reality performances, and Larsen’s fascination with the technology that led him and his team to push the limits and produce the impossible.

Cross-reality (XR) encompasses virtual reality, mixed reality, and augmented reality. The Future Group has its fingers in every one of these pies, moving easily between them to produce such groundbreaking projects as Lost in Time, The Weather Channel’s series on environmental dangers, and performances by in-game characters live on stage at the 2018 League of Legends World Championship opening ceremony.
Listen to the Visual Disruptors Podcast with our media partner Epic Games on iTunes.

The Future Group cut its XR teeth on the Lost in Time series, which aired in Norway in 2017. In the show, contestants competed to perform challenges in virtual worlds sprinkled throughout time, such as the Jurassic era, 1920s New York, and the Ice Age. Lost in Time was shot in a green-screen studio, with the contestants composited live against these environments as they completed the challenges.

While the contestants could navigate and interact with the environment in real time, each episode was broadcast a few months later, after The Future Group enhanced the graphics. “Before releasing the show, we wanted to take it that extra mile,” says Larsen. “All the imagery was actually output down the line in Unreal Engine at an even higher quality. We were rendering out 4K raw linear files that went to three Flame machines that were actually compositing everything. So the work itself was fairly traditional post, but it was set up in a way that we could render 1000 frames per minute at 4K.”
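
To put those numbers in context, here is a quick back-of-the-envelope sketch of what 1,000 frames per minute at 4K implies for data throughput. It assumes uncompressed UHD (3840×2160) frames stored as 16-bit half-float RGB; the interview doesn't specify the actual resolution or file format, so treat the figures as rough:

```python
# Rough data-rate estimate for the Lost in Time post pipeline.
# Assumptions (not specified in the interview): UHD 3840x2160,
# uncompressed 16-bit half-float RGB.

WIDTH, HEIGHT = 3840, 2160      # UHD "4K" resolution (assumed)
CHANNELS = 3                    # RGB
BYTES_PER_CHANNEL = 2           # 16-bit half float

bytes_per_frame = WIDTH * HEIGHT * CHANNELS * BYTES_PER_CHANNEL
frames_per_minute = 1000        # throughput quoted by Larsen

mb_per_frame = bytes_per_frame / 1e6
mb_per_second = mb_per_frame * frames_per_minute / 60

print(f"{mb_per_frame:.1f} MB per frame")     # ~49.8 MB
print(f"{mb_per_second:.0f} MB/s sustained")  # ~829 MB/s
```

Sustained rates on that order go some way toward explaining why the compositing load was spread across three Flame machines.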

Lost in Time turned heads around the world, both for its unprecedented level of immersive play and a game component that gave home viewers the chance to compete on mobile devices during the broadcast. But that was a whole year and a half ago, and since then The Future Group has been determined to push the limits of cross-reality broadcast even further.

In 2018, they tackled a series of live broadcasts for The Weather Channel, which combined live projection behind the presenter with live mixed-reality action in front of them. In the TWC series, the live 3D elements included falling telephone poles, a crashing car, lightning, fire, and rising flood waters, together with pop-up signs carrying information about these weather dangers.

For these types of videos, The Future Group’s challenge was to show weather phenomena without making the project too computationally intensive. “We’re always trying to find ways of explaining the same things using in-engine tricks,” says Larsen. “On my team, I have a few very experienced games artists that have been in games for 20 years, that have these real-time tricks and things up their sleeves.”

Cross-reality K-pop

One of The Future Group’s most spectacular projects was for the opening ceremony at Riot Games’ 2018 League of Legends World Championship in Incheon, South Korea. While live singers and dancers put on a dazzling performance for the thousands in attendance, the huge onstage monitors augmented the story with virtual members of the in-game K-pop group K/DA. In addition to the crowd in the arena, over 90 million viewers tuned in to watch online.

As everyone on stage performed the group’s hit song “Pop/Stars,” live cameras roved around, showing the League of Legends characters seemingly singing and dancing beside the live performers in perfect sync. The game characters also flew in and out, threw flames, and generally showed off their unique skins and powers.

Central to the illusion were the characters’ realistic shadows and reflections on the shiny stage beneath them, which were rendered in real time based on tracking data from the roving cameras. Larsen estimates that during the performance, Unreal Engine was rendering 30 million polygons per second. The team chose to go with planar reflections over the less computationally intensive screen-space reflections “because it just basically sells it so much more,” Larsen says.
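
The trade-off Larsen describes comes down to how the two techniques work: screen-space reflections ray-march the frame that has already been rendered, so they are cheap but can only reflect what is visible on screen, while planar reflections re-render the scene from a camera mirrored across the reflecting plane, roughly doubling the work for that surface but reflecting everything, including performers outside the frame. Below is a minimal sketch of that mirroring step in NumPy; it is illustrative only, not The Future Group's implementation:

```python
import numpy as np

def reflect_point(p, n, d):
    """Reflect point p across the plane {x : dot(n, x) = d}.

    n must be a unit normal. This mirroring is the core of planar
    reflections: the scene is re-rendered from the reflected camera,
    costing roughly a second render pass for the surface -- unlike
    screen-space reflections, which reuse the main pass but miss
    anything that is off-screen.
    """
    n = np.asarray(n, dtype=float)
    p = np.asarray(p, dtype=float)
    return p - 2.0 * (np.dot(n, p) - d) * n

# Mirror a camera 1.8 m above a shiny stage floor (the plane z = 0).
stage_normal, stage_offset = [0.0, 0.0, 1.0], 0.0
camera_pos = [4.0, 2.0, 1.8]
print(reflect_point(camera_pos, stage_normal, stage_offset))
# [ 4.   2.  -1.8]  -- the same spot, mirrored below the floor
```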

With all the data coming through the pipeline—the live camera’s tracking data and visuals, the updated camera view and rendering from Unreal Engine, and the projection to the onstage screens—the augmented-reality performance rendered at a full broadcast rate of 59.94 frames per second, with the projection a mere six frames behind the live stage show.
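
For context, six frames at 59.94 fps is about a tenth of a second of end-to-end delay, and Larsen's 30-million-polygons-per-second estimate works out to roughly half a million polygons per frame:

```python
FPS = 59.94                         # NTSC-family broadcast frame rate
FRAMES_BEHIND = 6                   # projection delay quoted above

frame_time_ms = 1000.0 / FPS        # ~16.68 ms per frame
latency_ms = FRAMES_BEHIND * frame_time_ms
polys_per_frame = 30_000_000 / FPS  # Larsen's estimate, per frame

print(f"{latency_ms:.1f} ms behind live")            # ~100.1 ms
print(f"{polys_per_frame:,.0f} polygons per frame")  # ~500,500
```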

It was all in a day’s work for Larsen, who calls the setup “fairly traditional,” just another way of creating an amazing show with real-time technology. With FACS-based facial rigs from Digital Domain and the dance sequences from the group’s music video in hand, the team spent five days on site in Korea integrating all the data.

Originally, the plan was to limit lighting changes during the AR moments, but Larsen couldn’t help experimenting. “It’s always better to try and push the bar a little bit,” he says. “I’m trying to put some extra juice in there.”

Upping the cross-reality game

Over the course of creating these iconic cross-reality experiences, The Future Group developed custom tools that run on top of Unreal Engine. Now, they’re making these tools available as a suite called Pixotope.

Pixotope is designed to simplify the creation of cross-reality content at a high cinematic quality. With the combination of Unreal Engine and Pixotope, designers can rapidly create virtual sets and augmented content with terrain and foliage, make use of particle systems, and simulate camera and lens properties like lens distortion and depth of field.
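
As an example of what simulating lens properties involves, the sketch below applies the standard Brown–Conrady radial distortion model commonly used to match a virtual camera to a real lens. Pixotope's actual distortion model and API are not described here, so the function and coefficients are purely illustrative:

```python
def distort_radial(x, y, k1, k2):
    """Apply Brown-Conrady radial distortion to normalized image
    coordinates (origin at the optical center, focal length = 1).

    Matching the real lens's distortion on the virtual camera is
    what keeps CG elements locked to live footage shot through an
    imperfect physical lens.
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A point near the corner of frame under mild barrel distortion
# (negative k1 pulls points toward the center). The coefficients
# are made up for illustration, not taken from a real lens profile.
xd, yd = distort_radial(0.8, 0.6, k1=-0.1, k2=0.01)
print(xd, yd)  # ~(0.728, 0.546)
```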

The Future Group itself is also expanding, with new offices in the USA and even more projects on the horizon. And while they continue to push the boundaries of technology, Larsen is quick to remind us that advances in tools for cross-reality experiences are driven by aesthetics—improved color correction, better shadows—and not the other way around. “If you do have experience with real-time engines, that’s obviously a plus,” he says, “but we do create art.”

For more on virtual production, visit the Virtual Production hub of our media partner Epic Games for more videos, podcasts, and insights into this emerging field.

By Ben Lumsden