The soccer commercial for Caledon Football Club in Toronto was shot in virtual production using Pixomondo’s (PXO) new crowd system developed in Unreal. Pixomondo worked with collaborators alter ego, William F. White Int’l, and Virtual Production Academy.
The TVC is a leap forward in crowd sims for virtual production (VP). Many VP stages do little more than provide far-off vistas or environments. PXO is embracing bringing new levels of life to VP, helped by the massive advances Epic has been releasing in Unreal. In the process, they had to navigate and innovate around a score of key VP aspects:
- crowd animations at high speed, simulated to an apparent 800 fps
- black levels when shooting in smoke stages,
- in-camera seamless transitions from real grass floors to action-packed stadium turf,
- transitioning their VP from UE4 to UE5,
- reliable VP performance with MoCap crowds via efficient static mesh instancing and more.
fxguide sat down for a technical Q&A with Mahmoud Rahnama, Chief Creative Officer and Chief Innovation Officer at PXO, along with Gatis Kurzemnieks, the Unreal Technical Artist in London who created the PXO Crowd System. We also spoke with David Whiteson, co-director, alter ego, and Eric Whipp, Director of Photography, alter ego, along with Edward Hanrahan, VP Director, and Jack Chadwick, VP Manager, at PXO in Toronto.
FXGUIDE: How did this project come about?
Mahmoud Rahnama, (Chief Creative Officer and Chief Innovation Officer, Pixomondo): Earlier this year Pixomondo completed over 700 VFX shots for the HBO series Winning Time: The Rise of The Lakers Dynasty. On that project, 90% of the work consisted of CG stadiums and crowd simulations. After we wrapped, we started discussing how there should be a faster and more efficient way of doing 3D crowds. With everything going real-time, we decided to challenge ourselves and create a flexible crowd tool in Unreal Engine that can perform in real-time and could be used on our LED volumes. When the opportunity to work on this commercial came up, we knew we had to put our new tool to the test and really push it to its limits.
Our Virtual Production (VP) team in London did an amazing job putting it all together. The fact that we can change characters, LODs, jerseys, behaviors, density, and even do a ‘crowd-wave’, in real-time is incredible.
We are currently working on making the tool more interactive so the crowd can follow the real performers within the volume, react to scores, and even follow the ball. This is just the beginning.
FXG: How was the crowd created? Was it done with a crowd sim program like Massive, or were the crowds created in UE?
Gatis Kurzemnieks, (Unreal Technical Artist): The crowd was done in Unreal Engine. Pixomondo developed a new crowd system with the goal of achieving the highest framerate possible – suitable for use in virtual production. It works by using efficient static mesh instancing in Unreal, with all animation done in the vertex shader using bone animation textures. The toolset consists of two parts – a Houdini toolset for model and animation processing and bone animation texture baking, and an Unreal plugin, written in C++, to generate and control the crowd instances.
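For readers unfamiliar with the approach Kurzemnieks describes – instanced static meshes whose animation comes from baked bone textures read in the material – the broad pattern can be sketched in a few lines of Unreal C++. The class, property names, and custom-data slot layout below are our own illustrative assumptions, not PXO’s plugin code; the per-instance floats are what a vertex-animation-texture material would read to pick a clip, a phase offset, and a colour variation.

```cpp
// CrowdSection.h -- illustrative only; class, property, and slot layout are
// assumptions for this article, not PXO's actual plugin code.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/HierarchicalInstancedStaticMeshComponent.h"
#include "CrowdSection.generated.h"

UCLASS()
class ACrowdSection : public AActor
{
    GENERATED_BODY()

public:
    ACrowdSection()
    {
        // One instanced component per character mesh keeps a whole stand to a
        // handful of draw calls, however many thousand agents it holds.
        Crowd = CreateDefaultSubobject<UHierarchicalInstancedStaticMeshComponent>(TEXT("Crowd"));
        RootComponent = Crowd;

        // Per-instance floats the vertex-animation-texture material reads:
        // [0] clip index, [1] phase offset, [2] colour seed,
        // [3] target clip and [4] blend-start time (used when changing states).
        Crowd->NumCustomDataFloats = 5;
    }

    // Fill a block of seats with instances. Seat transforms would normally come
    // from the stadium asset; here they are simply passed in.
    void PopulateSeats(const TArray<FTransform>& SeatTransforms, int32 NumClips)
    {
        for (const FTransform& Seat : SeatTransforms)
        {
            const int32 Index = Crowd->AddInstance(Seat);
            Crowd->SetCustomDataValue(Index, 0, (float)FMath::RandRange(0, NumClips - 1)); // clip
            Crowd->SetCustomDataValue(Index, 1, FMath::FRandRange(0.f, 1.f));              // phase offset
            Crowd->SetCustomDataValue(Index, 2, FMath::FRandRange(0.f, 1.f), true);        // colour seed
        }
    }

    UPROPERTY(VisibleAnywhere)
    UHierarchicalInstancedStaticMeshComponent* Crowd;
};
```

Because the skinning work all lives in the vertex shader, the CPU only ever touches a transform and a handful of floats per seat, which is what makes tens of thousands of agents viable at LED-wall frame rates.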
FXG: In some of the discussions of the project, there has been talk of ‘random’ crowd movement – but a crowd needs to move as a unit, reacting to the moments on the ‘pitch’ as a real crowd does. I believe your crowd was actually anything but fully random?
Gatis Kurzemnieks: The new crowd system supports triggering different crowd emotions – like cheering, being angry, and many others. There are over a hundred different animation clips that can be used on any crowd member. They can be triggered on the whole crowd or just a subset of it. The system also supports smooth blending between animation states.
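How a state change like ‘cheer’ might be pushed onto an already-spawned crowd is easy to imagine on top of the same per-instance data. The sketch below is again our own illustration, not PXO’s trigger API: it assumes the two extra custom-data slots from the earlier sketch (target clip and blend-start time), which a material crossfade would read to blend smoothly out of the current clip.

```cpp
// Illustrative trigger for a crowd-wide reaction, extending the ACrowdSection
// sketch above; method name and slot usage are assumptions. The material is
// assumed to crossfade from the clip in slot 0 to the clip in slot 3, starting
// at the time written into slot 4.
void ACrowdSection::TriggerEmotion(int32 NewClipIndex, float Fraction, float WorldTime)
{
    const int32 Count = Crowd->GetInstanceCount();
    for (int32 i = 0; i < Count; ++i)
    {
        // Touch only a random subset so the reaction ripples through the stands
        // instead of snapping in perfect unison.
        if (FMath::FRand() > Fraction)
        {
            continue;
        }
        Crowd->SetCustomDataValue(i, 3, (float)NewClipIndex); // target clip
        Crowd->SetCustomDataValue(i, 4, WorldTime);           // blend-start time
    }
    Crowd->MarkRenderStateDirty();
}
```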
FXG: Was MoCap used?
Gatis Kurzemnieks: All animations were originally MoCap, and the 3D models of the people were also 3D scans of real people.
FXG: How did you get the crowd to look uniform as a sports crowd but still random and human? For example, were there any actual actors filmed and dropped into the crowd as ‘cards’ in UE4?
Gatis Kurzemnieks: PXO’s crowd system allows fine control of gender and ethnicity distribution as well as clothing color variation. No ‘cards’ were used.
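Steering distribution rather than leaving it to pure chance is conceptually simple: a weighted pick over character archetypes before each instance is spawned. The function and weights below are our own illustration, not the tool’s exposed controls.

```cpp
// Illustrative weighted pick over character archetypes (each archetype being one
// scanned mesh / clothing set with its own instanced component). The weights and
// the function itself are assumptions for this article.
int32 PickArchetype(const TArray<float>& Weights)
{
    float Total = 0.f;
    for (float W : Weights)
    {
        Total += W;
    }

    float R = FMath::FRandRange(0.f, Total);
    for (int32 i = 0; i < Weights.Num(); ++i)
    {
        if (R < Weights[i])
        {
            return i;
        }
        R -= Weights[i];
    }
    return Weights.Num() - 1; // guard against floating-point fallthrough
}
```

Clothing colour variation can then fall out of the per-instance colour seed shown earlier, with the material mapping the seed onto a palette of jerseys and jackets.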
FXG: Was it UE4 or UE5?
Gatis Kurzemnieks: The development actually started in UE4 but was moved to UE5, so it was UE5 that was used for the final spot.
FXG: How many days were you shooting? I assume it was one day?
David Whiteson, (co-director): It was actually a two-day shoot, for two reasons. First, we were dealing with kids and wanted to give them an appropriate number of breaks and keep them engaged, which takes more time out of the day. Secondly, we were shooting at high speed and wanted to give those shots the time they deserved to execute correctly.
Also, I’m fairly confident that this would have been impossible to shoot in one night at a real stadium location. Even two nights would have required 12-hour days. We did it in two very comfortable and relaxed 10-hour days with no worries or concerns about the weather. We also created nighttime during the day in a studio. It was all very civilized.
Additionally, in order to capture the feeling of a game happening all over the field, we would have had to constantly move around an actual stadium. With virtual production, we were able to set the camera and track once and then move the Unreal CG stadium asset on the wall digitally in a matter of seconds. No need for big equipment moves from one end of the field to the other.
FXG: Can you discuss the issues of maintaining black levels on the screen, especially when using smoke? The two side lights must have caused unwanted spill, I imagine?
Eric Whipp (DP): For maintaining black levels we used 2 x 18Ks on each side of the wall to create the hard light on the actors. The lights needed to be flagged so there was no spill on the wall, but once we did that, there were no issues with black levels on the wall. We also had three Orbiter lights rigged to the top of the wall to create the sense of stadium lighting.
FXG: Regarding the high-speed 200fps photography, can you discuss how you got around the flicker refresh issue, please?
Eric Whipp: When shooting high-speed, different frame rates produce different intervals of flicker. Ideally, we would have been able to shoot at a minimum of 400fps, but it’s virtually impossible to shoot at extremely high frame rates (like with a Phantom camera) due to the frame rate of the wall.
We started testing with the RED camera, then the Sony Venice, and then the ALEXA Mini. We chose the ALEXA Mini because it was able to hit 200fps, which the ALEXA Mini LF could not. We ended up using math to calculate the perfect frames per second to reduce the most amount of flicker and then did extensive testing in post to work out how to remove any remaining flicker. The footage in the final shot was then slowed down further to an apparent 800fps.
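Whipp doesn’t spell out the exact calculation, but the usual relationship on an LED wall is that each exposure has to span a whole number of the panel’s refresh cycles; otherwise successive frames catch different slices of the scan and the image flickers. The check below is purely illustrative – the refresh rate and shutter values are invented, not the production’s numbers.

```cpp
// Purely illustrative -- the refresh rate and shutter values are invented.
// Rule of thumb: the exposure must span a whole number of LED refresh cycles,
// or successive frames catch different slices of the panel's scan and flicker.
#include <cmath>
#include <cstdio>

bool ExposureIsFlickerSafe(double CameraFps, double ShutterDegrees, double PanelRefreshHz)
{
    const double ExposureSeconds = (ShutterDegrees / 360.0) / CameraFps;
    const double RefreshCycles   = ExposureSeconds * PanelRefreshHz;
    // Safe when the exposure covers (very nearly) an integer number of cycles.
    return std::fabs(RefreshCycles - std::round(RefreshCycles)) < 1e-6;
}

int main()
{
    // Hypothetical wall scanning at 3840 Hz: 200 fps at a 180-degree shutter is a
    // 1/400 s exposure = 9.6 cycles (flicker); opening up to 187.5 degrees gives
    // 1/384 s = exactly 10 cycles (clean).
    std::printf("%d\n", ExposureIsFlickerSafe(200.0, 180.0, 3840.0));  // prints 0
    std::printf("%d\n", ExposureIsFlickerSafe(200.0, 187.5, 3840.0));  // prints 1
    return 0;
}
```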
David Whiteson: One other issue that we ran into was that when we shot our actors in front of the wall with the CG crowd, we discovered that the crowd animation became steppy. The crowd was animated at 24fps, and there wasn’t enough information in the crowd’s movement to handle being captured at 200fps. To solve this we decided to create multiple crowd sims running at various frame rates. We had the wall running at 24 or 48fps while the crowd sim was running at 96, 120, or 200fps, and then filmed that at 200fps. It took a lot of testing to find the specific combination, but we feel like we ended up with a very successful result.
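The arithmetic behind the steppiness is simple: at a 200fps capture rate, a 24fps crowd animation only advances once every eight or so recorded frames, so in slow-motion playback each pose is visibly held. Raising the sim rate shrinks that hold, as this small illustration (using the frame rates Whiteson quotes) shows.

```cpp
// Illustrative arithmetic only, using the frame rates quoted above: how many
// captured (and therefore slow-motion playback) frames each crowd pose is held
// for before the animation advances.
#include <cstdio>

int main()
{
    const double CaptureFps = 200.0;
    const double SimRates[] = { 24.0, 96.0, 120.0, 200.0 };

    for (double SimFps : SimRates)
    {
        std::printf("sim at %5.0f fps -> each pose held for %.1f captured frames\n",
                    SimFps, CaptureFps / SimFps);
    }
    return 0;
}
// 24 fps -> 8.3 captured frames per pose (visibly steppy in slow motion);
// 120 fps -> 1.7; 200 fps -> 1.0 (a fresh pose on every captured frame).
```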
FXG: What a brilliant point David, I hadn’t considered that. Turning to the ground cover in the studio: the blending of the grass on the floor into the VP screen grass – how much of this got solved in camera, and how many shots required post work?
David Whiteson: Almost 90% of the blend was done in camera, both on the blend day and with constant adjustments during the shoot. As the smoke got heavier or lighter, and as we adjusted the lights, slight CCR adjustments were inevitable. The natural grass that we laid down on the studio floor was where we needed help in post with clean-up. As the players ran across the turf, the grass would get caught on their cleats and bunch up. The post team helped to smooth that out. But the blend was flawless and looked great on camera.
FXG: Is technology catching up to a point where we can expect more CG characters in virtual production?
Edward Hanrahan (Director, Virtual Production): Most shows are using virtual production for environments, obviously, but bringing those environments to life needs animation. With more and more powerful computers we can light and animate characters in real-time, along with everything else we need to compute. Plus, you’re also seeing a big push from Epic with animation toolsets and MetaHumans. So it’s definitely a direction the industry is heading.
FXG: I understand that there were some lessons learned from your work recently on Star Trek, in the way you implemented VP on this project. Can you elaborate, please?
Jack Chadwick, (Virtual Production Manager, Pixomondo): We learned that we needed to keep our real-time render requirements low to retain performance, and there were specific things we did to achieve that.
The biggest challenge on this project was retaining dynamic control of our full CG crowd while keeping our real-time render requirements low and ensuring we stayed performant on the wall.
When we first started testing the crowd system that our London Pixomondo office had developed, we had over 40,000 crowd agents in the scene and over 50 unique character animations. While the crowd system looked great on the wall, the performance was not there. We then looked into how we could reduce the real-time render requirements and still retain the dynamic nature of our crowd.
One technique we used to reduce our real-time render requirements was splitting the stadium asset up into sections so that only the areas visible on the wall would render in real-time, which would, in theory, alleviate the performance bottleneck we were facing. The next step was to adjust our crowd system so that only the visible sections of the stadium would be populated with crowd agents. This solution worked for all medium and closeup shots but was not a complete solution for some extreme wide shots.
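The principle – hide whatever the wall can’t see, so the renderer pays nothing for it – can be sketched as below. The section actors, the hand-rolled angle test, and the parameter names are our own simplification; in practice, what is visible on the wall is usually driven by the inner-frustum setup rather than code like this.

```cpp
// Illustrative only: the section actors, the angle test, and the parameter
// names are simplifications for this article, not PXO's implementation. The
// point is simply that hidden stadium sections cost the renderer nothing.
void UpdateStadiumSections(const TArray<AActor*>& Sections,
                           const FVector& CameraLocation,
                           const FVector& CameraForward,
                           float WallHalfFovDegrees)
{
    const float CosLimit = FMath::Cos(FMath::DegreesToRadians(WallHalfFovDegrees));

    for (AActor* Section : Sections)
    {
        const FVector ToSection =
            (Section->GetActorLocation() - CameraLocation).GetSafeNormal();
        const bool bOnWall = FVector::DotProduct(CameraForward, ToSection) > CosLimit;

        // A hidden section -- and every crowd instance parented under it -- is
        // skipped entirely, which is what keeps the frame rate up.
        Section->SetActorHiddenInGame(!bOnWall);
    }
}
```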
For the extreme wide shots, we needed to find a way to keep the energy and movement of the crowd while reducing our agents’ real-time render requirements. We were able to adjust the level-of-detail settings for the crowd agents until we found a sweet spot, where we retained enough fidelity for the characters to read well in the background but required much less computing power, keeping performance up on the wall.
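The ‘sweet spot’ Chadwick describes amounts to tuning a couple of knobs on the instanced crowd components – forcing a coarser LOD and culling instances beyond a given distance. The values in this sketch are invented, purely to show which levers are involved.

```cpp
// Illustrative knobs for the wide-shot trade-off; the values are invented.
// Forcing LOD 3 pins every agent in the section to a coarse mesh, and the cull
// distances fade instances out well before the far stands.
void ConfigureWideShotCrowd(UHierarchicalInstancedStaticMeshComponent* Crowd)
{
    Crowd->SetCullDistances(15000, 25000); // start/end cull, in centimetres
    Crowd->SetForcedLodModel(3);           // 1-based; 0 restores automatic LOD selection
}
```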
FXG: Thanks so much guys, great talking with you.
Epic Games will be hosting Broadcast & Live Events Week next month, which will cover insider stories, technical discussions, and a new Broadcast & Live Events field guide. For more, see updates on the Epic site.