VP stages

Around the world, new stages are opening as more companies respond to the enormous interest in virtual production (VP) LED stage volumes. Here are two (unrelated) stories of companies moving more heavily into VP.

New VP stages opening worldwide

MELS in Montreal is one of several companies that have recently opened new virtual production stages. MELS has 20 studios, located near downtown Montreal, designed to meet the needs of small, medium, and large film and television projects. The new VP stage, launched at the end of last year, allows VFX teams to collaborate in pre-production and on set with directors and DPs, rather than “fix it in post.” In this new era of film/TV production during COVID, the ability to create unlimited exotic landscapes, and any environment or place imaginable, in real-time on one stage saves time and money and creates more flexibility in filming schedules. The virtual production stage was built in collaboration with Solotech and ARRI and is powered by Epic Games’ Unreal Engine. The stage launched with the system running UE 4.24, and the team produced a demo piece, entirely in-camera, to show off the new stage.


The demo was completed in six weeks with support from the Unreal Engine team. The VP stage equips MELS to offer complete virtual production solutions: all services – visual effects, cameras, lighting, post-production, technical crews – are linked by fibre optics to all the sets. This turnkey model allows producers and filmmakers greater agility, efficiency, and creative latitude. MELS also offers the option of creating specific virtual stage configurations to meet the needs of individual productions.

Additionally, the MELS VP stage is part of a fully integrated studio offering everything from studio and stage rentals to equipment and in-house sound and VFX teams with the expertise to run the virtual stages. The company has 350 professionals providing filmmaking services across all areas. We spoke to Alan Wiseman, Director of Production Technology at MELS; Nicolas Fournier, Technical Director – Virtual Imaging; and Nikola Simeonov, CG Supervisor at MELS.

The front LED panels have a pixel pitch of 2.97mm; the ceiling panels are 6mm and are used only for lighting. “In the case where it would be needed for either reflection or to be in shot for some reason, the ceiling would be replaced with 2.97mm pitch panels,” commented Wiseman. “The choice is not actually weight based,” he points out, as the stage is built in a proper studio where the overhead grid systems are designed to take large weight loads, “much higher than any LED screen system could create.”
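Pixel pitch matters on camera because a coarse panel can alias (moiré) once the lens starts to resolve individual LEDs. A rough way to reason about it is to compare the angle one LED pixel subtends at the camera with the angle one sensor photosite covers through the lens. The Python sketch below does that simple geometry; all of the numbers (sensor width, resolution, focal length, distances) are illustrative assumptions, not MELS’ actual configuration.

```python
import math

def led_pixel_angle(pitch_mm: float, distance_m: float) -> float:
    """Angle (radians) subtended by one LED pixel at the camera position."""
    return 2 * math.atan((pitch_mm / 1000.0) / (2 * distance_m))

def sensor_pixel_angle(sensor_width_mm: float, h_res: int, focal_mm: float) -> float:
    """Angle (radians) one photosite covers through the lens (small-angle approx.)."""
    return (sensor_width_mm / h_res) / focal_mm

# Illustrative values: Super 35-ish sensor, 4K-wide capture, 50mm prime.
cam = sensor_pixel_angle(sensor_width_mm=26.4, h_res=4096, focal_mm=50.0)

for pitch in (2.97, 6.0):        # the wall and ceiling pitches quoted above
    for dist in (3.0, 6.0, 9.0): # camera-to-wall distances in metres
        ratio = led_pixel_angle(pitch, dist) / cam
        flag = "moire risk" if ratio > 1.0 else "ok"
        print(f"pitch {pitch}mm at {dist}m: LED/sensor ratio {ratio:.2f} ({flag})")
```

In practice, defocus, the camera’s optical low-pass filter, and viewing angle all shift this threshold, which is why stages shoot wall tests rather than trusting the arithmetic alone.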

The stage has a powerful camera tracking system driving the VP system, but for the test shoot the team did not capture that tracking data separately for VFX, simply because the test did not need it. Wiseman explained that “for the next step, we are looking at several different solutions that will allow for this, as well as looking at direct camera metadata with timecode in order to use in post if required.”
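Capturing tracking for post generally means logging the solved camera pose each frame against timecode, so a matchmove or comp team can reuse it later. The team has not said which system they will adopt, so the snippet below is only a hypothetical illustration of the kind of per-frame record involved; every field name and value is made up.

```python
import csv
from dataclasses import dataclass, asdict

@dataclass
class TrackedFrame:
    # Hypothetical per-frame camera sample: SMPTE timecode plus pose and lens data.
    timecode: str    # e.g. "01:02:03:12" at the shoot's frame rate
    pos_x: float     # stage-space position in cm (matching UE's 1 unit = 1 cm)
    pos_y: float
    pos_z: float
    pan: float       # rotations in degrees
    tilt: float
    roll: float
    focus_cm: float  # lens encoder values, if available
    zoom_mm: float

def write_track(path: str, frames: list) -> None:
    """Dump the solved camera track to CSV for later use in post."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(frames[0]).keys()))
        writer.writeheader()
        for frame in frames:
            writer.writerow(asdict(frame))

# A single made-up sample frame.
write_track("shot_010_cam_track.csv",
            [TrackedFrame("01:02:03:12", 120.0, -340.5, 165.0,
                          12.5, -3.0, 0.1, 350.0, 50.0)])
```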

Simeonov explained that for the demo project they used two UE4 blueprint techniques for grading and color correction. One was layer-based grading: given the nature of the Paris and NY sets, it allowed the team to color correct/grade the foreground, midground, and background independently. “The other was light card blueprints introduced in both sets (NY – Apartment) and Paris, allowing us to interactively change the mood inside the apartment and then on the set of Paris to integrate VR assets as well as practical props and subjects on set,” he adds. All of the grading was done in-engine, and an iPad UI was used for controlling the stage live.
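MELS drove this through blueprints, which the article does not show; as a rough analogue, Unreal’s editor Python API can set the same color-grading properties on a post-process volume. This is a minimal sketch, assuming UE4-era editor scripting and a hypothetical per-layer naming convention (“Grade_Foreground” and so on), not the team’s actual setup.

```python
import unreal

# Hypothetical per-layer saturation tweaks, keyed by actor label.
LAYER_SATURATION = {
    "Grade_Foreground": 1.10,
    "Grade_Midground": 1.00,
    "Grade_Background": 0.90,
}

for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if not isinstance(actor, unreal.PostProcessVolume):
        continue
    sat = LAYER_SATURATION.get(actor.get_actor_label())
    if sat is None:
        continue
    # UE Python returns structs by value, so modify a copy and write it back.
    settings = actor.get_editor_property("settings")
    settings.set_editor_property("override_color_saturation", True)
    settings.set_editor_property("color_saturation",
                                 unreal.Vector4(sat, sat, sat, 1.0))
    actor.set_editor_property("settings", settings)
```

A live iPad control surface like the one described would more likely push these values at runtime, for example via something like Unreal’s Remote Control API, rather than through editor scripts.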

The MELS VP stage is built to allow panels to move, though for the test shoot the moving panels were used only as a lighting source. “That said, we are capable of tracking them within the system, and they could be used for reflection generation in a scene, and the location data could then be captured for later use,” Wiseman explained.

The company used a classic UE4–Maya–UE4 round-trip pipeline throughout the VP process on the test piece. FBX was used, with scale driven by the UE scale system, i.e. 1 unit = 1 cm. Maya was used for 3D models and for editing UE assets. As with most of their standard VFX projects, Nuke and Photoshop were used for image generation and for texturing the UE assets.
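Keeping Maya’s working units aligned with Unreal’s 1 unit = 1 cm convention avoids scale surprises on import. Below is a minimal Maya Python sketch of that export step; the group and file names are hypothetical, and the exact FBX options a given pipeline sets will vary.

```python
# Run from Maya's script editor (or mayapy with a scene loaded).
import maya.cmds as cmds

cmds.loadPlugin("fbxmaya", quiet=True)  # make sure the FBX plug-in is available
cmds.currentUnit(linear="cm")           # match Unreal: 1 unit = 1 cm

cmds.select("env_paris_grp")            # hypothetical environment group
cmds.FBXResetExport()                   # start from default FBX export settings
cmds.FBXExport("-f", "/projects/vp_demo/env_paris.fbx", "-s")  # "-s" = selection only
```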

We asked the team how much camera calibration is required, and how focus and defocus are handled between the actual camera’s depth of field and any additional defocus rendered on the LED screen, separate from what happens naturally in the ARRI cameras.


“For that project, we did not capture the camera metadata (focus, zoom, etc.),” explained Fournier. “Unreal, however, is capable of generating very advanced camera relations. Within the engine we can profile pretty much any prime lens, which allows for ‘faking’ the focus changes in a scene.” While there are a lot of configuration options within the engine, the team found that, “with time and testing, a very realistic effect can be arranged for all prime lenses, when this is linked to on-set location tracking data,” he explains. “We are looking at different systems that allow for this information to be automatically adjusted via metadata and recalled easily.” For zoom lens use, an additional metadata capture system is required: “although some testing was done, getting realistic effects from it will require a bridging system,” Wiseman adds. For lens calibration moving forward, the team needs to calibrate each lens for distortion, flares, etc., “using a Calibrate -X to get the lens profile to import into Unreal,” says Fournier.
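In UE4, this kind of “faked” focus ultimately drives a CineCameraComponent’s focus and lens properties. As a minimal sketch, assuming a single CineCameraActor in the level and illustrative encoder values (not MELS’ actual configuration), editor Python can set them like this:

```python
import unreal

for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if not isinstance(actor, unreal.CineCameraActor):
        continue
    comp = actor.get_cine_camera_component()
    # Drive focus from (hypothetical) lens-encoder data; distance is in cm.
    focus = comp.get_editor_property("focus_settings")
    focus.set_editor_property("manual_focus_distance", 350.0)
    comp.set_editor_property("focus_settings", focus)
    comp.set_editor_property("current_aperture", 2.8)       # T-stop
    comp.set_editor_property("current_focal_length", 50.0)  # mm, prime lens
```

On a live stage these values would be pushed every frame from the tracking and lens-metadata feed rather than set once in the editor.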


Ncam and disguise

In general VP news, disguise has teamed up with Ncam. disguise has a long history in concert and live-event multi-camera displays, and the team has increasingly moved into virtual production in recent times. The two companies have announced a partnership that unites Ncam’s real-time camera tracking system with disguise xR, opening up a powerful blend of LED technology, real-time rendering, and point-cloud tracking for high-end virtual productions. The same tracking tech is used by Disney, CNN, and Netflix. The aim of the partnership is to bring more flexibility to broadcast and live events by blending CGI, video, and extended reality (XR) in the most effective way possible.

The disguise platform is now being used on everything from Frozen to Coachella. disguise xR has already powered immersive real-time productions for music artists such as Katy Perry and Billie Eilish, enterprise businesses like SAP and Lenovo, and broadcast TV shows like the MTV Video Music Awards and America’s Got Talent. With disguise xR technology, teams can blend virtual and physical worlds together, bringing immersive AR and MR to live production environments. The new xR solution expands disguise’s ability to deliver virtual filming systems to a wider variety of clients, who increasingly need trackers that aren’t limited to a single surface or location.

disguise xR can directly respond to growing trends built on LED walls and live real-time content, producing pixel-perfect live imagery on vast, complex surfaces. By combining Ncam Reality with disguise’s current offering, clients will be able to visualize live AR, MR & XR, real-time CGI environments, set extensions and CGI elements directly in-camera, using the most advanced tracker in the world.

Unusual for its use of computer vision and point clouds, Ncam Reality can track in any environment, on any camera, lens, or rig, making it easy to build into a wide variety of productions. This workflow flexibility allows productions to quickly swap out virtual sets during remote shoots, reducing both real-estate requirements and the number of staff needed on set.

“The COVID-19 pandemic has accelerated the transition to virtual graphics, leaving many companies wondering where to start,” said Nic Hatch, CEO of Ncam. “This partnership helps brands jump on a growing trend with trained experts and advanced tech, so they can set themselves up for a future that is only going to get more visual.”

“No matter what the project, our goal is to give our customers more flexibility as they plan their next adventure,” said Tom Rockhill, CCO at disguise. “Ncam’s ability to use natural markers and IR reflectors gives it unique versatility, which means it can be used both indoors and outdoors, on an LED wall or for AR graphics in an open space. The possibilities are endless, and that’s exactly how we want our clients to feel.”
