Project Arena: Chaos LED Volumes

Project Arena is a real-time ray tracing solution for virtual production from Chaos. The system is still in alpha and is being tested on several key production LED stages, with version one targeted for release later this year. The Chaos team believes Project Arena's ray tracing could transform virtual production workflows for LED walls. Building on the real-time ray-traced rendering technology pioneered with Chaos' Vantage, Project Arena allows for a fast and direct LED volume workflow. Assets do not need to be converted, which reduces preproduction time and makes deployment inside volumes easier. For example, Project Arena would allow smaller projects to consider using an LED volume for just a day or two on a commercial, since the same VFX assets can be reused and Project Arena produces extremely high-quality final pixels. Chaos believes Project Arena will give directors, cinematographers, virtual art departments, and VFX teams fully ray-traced real-time environments for LED walls in less time, and in a more familiar pipeline, than ever before.

Head of Innovation Vlado Koylazov

Chaos’ Head of Innovation Vladimir ‘Vlado’ Koylazov believes the level of interactivity and collaboration now possible on an LED stage is a whole step above current implementations, because with a full ray tracing solution the quality of light being modelled is more accurate.

Project Arena aims to make virtual production more accessible and efficient, allowing users to quickly bring their scenes onto LED walls without the need for extensive conversions. This is not the first time the team has explored this side of virtual production: see our 2018 fxguide story on how Director Kevin Margo and the team used a version of V-Ray in an on-set live context for the short film Construct.

Jumping forward six years, “what enabled the quality of rendering and at the desired resolution to start happening is when NVIDIA’s DLSS 3.5 (denoiser) came along last year with ray reconstruction,” says Chris Nichols, Director of Chaos Labs. “That enabled us to get the kind of performance we wanted because once we can get that quality render at the necessary resolution, then we’ve solved the biggest challenge of ray tracing, which is high resolution.” Advanced noise reduction has made an incredible difference to what the team can now do. “Vlado and I would both say that we were kind of shocked about what a big jump that was, and it’s a big jump that enables many things to happen.”

There are many challenges to a successful LED volume; it is not just a real-time rendering problem, as there are also complex issues of camera tracking, latency, calibration, and distributed computing to be considered. But the technology at the core of Project Arena is not limited to LED volumes: the scope for real-time ray tracing extends to VR/AR support, such as on the new Apple Vision Pro, location scouting, and live streaming at events such as concerts and theatre.

A still from the cult 2012 V-Ray I.R.L. demo film directed by Daniel Thron.

For the research team, the benefits of ray tracing in terms of accurate lighting, global illumination, and the ability to make real-time adjustments to scenes make this project significant both today and in the immediate future. While not yet released, the Project Arena system is being tested on various stages, and the response from customers has been positive. But to personally stress test the new system, Chris Nichols is ‘putting the V-Ray I.R.L. band’ back together. Chaos produced the landmark ‘ultra violent – everything has fresnel’ cult demo film 11 years ago. While there are no details yet, Chris is actively filming a new indie-style demo project, which he says “will have a lot of surprises and involve some incredible people”.

“True global illumination just comes naturally out of a ray tracer,” Chris explains. “You don’t need to bake lighting, you don’t need to create different hacks to achieve your lighting. If you want to move a light, you move a light, and you don’t have to bake it again; it’s done.” In talking to various DOPs, the team has found that this simulation of what they are used to, delivered immediately on set without having to wait, is perhaps the most significant appeal.

“We’re not necessarily looking to replace the current virtual production pipeline completely,” Chris comments. “We’re just looking to make certain scenarios much easier for a lot of people. Allow people to leverage what they already have to make that a simple transition.” In talking to customers, the message the team heard was a desire to quickly get a high-quality scene onto the LED wall. To test that, the team simply downloaded a .vrscene file from the internet and had it on a customer’s LED wall in five minutes. “We didn’t do anything to it, no conversion, nothing. It just worked. So it was pretty straightforward,” Chris explains.