Chalk Warfare, created by Sam Wickert & Eric Leigh, directed by Sam Wickert and produced by Micah Malinics, is our second Indie project in this series. Funded only by generous support from several key manufacturers and relying on the goodwill of friends, Chalk Warfare 4.0 is bigger, longer, and dustier than anything the team had done before.
Chalk Warfare 4.0
Created by SoKrispyMedia, the concept of Chalk Warfare is simple. Teams of friends draw chalk weapons inspired by popular games and media, which become real when pulled off the wall, and the war begins. The first instalment, 10 years ago, was limited to four guys in a park. By episode three, twelve players fought over a post-apocalyptic warehouse set, drawing items such as space portals, chalk drones, energy weapons and even an Iron Man suit. Chalk Warfare 4.0 reunites team members old and new for a massive new fight, all produced for a YouTube audience as a fun platform for the team’s creativity and technical abilities.
In true collaborative style, Chalk Warfare incorporated not only crazy sets and sequences but also a massive amount of computer animation work from both Director/VFX Supervisor Sam Wickert and in-house visual effects artist Brendan Forde, as well as from a variety of artists across the country. Work included extensive 3D tracking, particle FX, and fluid dynamics.
Chalk Warfare 4.0 truly took production to a new level, with shoots taking place in Los Angeles and South Carolina, as well as an aerial unit for a skydiving sequence. “The actual production began in California with the interior warehouse, about a year ago,” recalls Wickert. The team then shot in South Carolina for the warehouse exteriors. The final shoot was the opening skydiving sequence. In total, there were about five to six shoot days spread out over months, plus many minor pickup shoots to capture elements or smaller cutaways, such as falling debris. The team only ever had half the cast at any one time, since half were in the warehouse and the others had their parts played out in the Carolina exterior. Even the skydiving was just the same eight people, doubling up and changing helmets to appear as if 16 people had jumped.
The guns are made by filming the actors with cardboard or flat plastic weapon cut-outs, which are then tracked and replaced with chalk versions of the same shape. This meant the entire project required an enormous amount of object and camera tracking. The film was primarily shot on prime lenses. About 80% of the shots, once the team lands after the parachute sequence, were filmed on a rig and about 20% on some form of glidecam. Typically, close action fighting sequences were shot wide, under 20mm, with the main shooting normally closer to a 28mm lens.
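The core trick — tracking a flat cut-out and pinning a rendered replacement to it — can be sketched as a four-point homography fit. This is an illustrative sketch only, not the team’s actual pipeline (which used full 3D object and camera tracking); the corner coordinates below are invented for demonstration.

```python
import numpy as np

def fit_homography(src, dst):
    """Direct Linear Transform: fit a 3x3 homography from four point pairs."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A (last row of V in the SVD).
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, pt):
    """Map a 2D point through the homography (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

# Hypothetical numbers: corners of the flat cut-out in the chalk render's
# own pixel space, and the same corners as tracked in the live plate.
src = [(0, 0), (200, 0), (200, 100), (0, 100)]
dst = [(100, 120), (320, 140), (310, 260), (90, 230)]
H = fit_homography(src, dst)
# project(H, corner) now lands each render corner on its tracked position,
# so a chalk render could be warped over the cardboard stand-in per frame.
```

Re-solving this per frame from tracked corners is the flat-plane version of the job; foreshortened or rotating props need the full 3D track the article describes.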
The team shot with the URSA Mini Pro 4.6K G2, as well as the Pocket Cinema Camera 4K. “We would utilize a two-camera setup to shoot double coverage on the flagship action scenes,” said Malinics, “and at times even splitting into two units to maximize the day, since in some of our locations we only had a day or two to capture everything. The URSA Mini Pro also allowed us to use some of the cinema-grade glass and peripherals that we already had access to, which continued to help us push our cinematic capacity on this project. We also used the Pocket Cinema Camera 4K for a lot of our behind the scenes footage. It’s easy and versatile, and cuts well against URSA Mini Pro footage.”
One of the more interesting aspects of this project was the engagement the team had with the audience before and during production. For example, the team asked their fans to suggest weapons, and on one occasion fans even provided key dialogue suggestions. “In the film, I call out to Shama (Mrema), who’s up on top of the water tower, and ask him if he is ready. Shama replies, ‘locked and loaded’. I then posted a picture of that to social media. Now that was the actual line, but one of our followers posted the comment: ‘Chalked and loaded’. And I go, ‘Oh my gosh, why did we not think of that?’ So we re-dubbed it and it fits perfectly. And that is what we went with.”
Forde focused primarily on the massive task of tracking weapons, while Wickert handled the heavier 3D sequences and chalk integration. All materials were composited in DaVinci Resolve Fusion. “Fusion’s node-based workflow was incredibly helpful,” said Forde. “On a project like this, with so many shots with similar compositing methods needed, we were able to utilize the node workflow to easily swap in and out assets, as well as tracking data to replace weapons and add them to new shots.”
This project differed from early Chalk Warfare projects in two important ways. Firstly, the scope was much bigger so the team deployed more previz, especially on the complex sky diving opening scene. The second difference was a reflection of advances in technology. Brendan Forde built a much more robust tracking and post-pipeline using tools such as Frame.io to allow the team to work in a distributed fashion but still be highly productive.
Not only did the post process need to be tightened, but this film featured 16 people fighting, drastically more than in the previous Chalk Warfare instalments. “It was four teams of four, and just getting all of the cast together was one of the big hurdles,” says Wickert. “Especially as this time around we had some people that are well-known influencers, specifically in the California section, with higher profiles who were much harder to schedule.” Wickert makes the point that most of the cast and crew are personal friends, and so, with Indie productions such as this, the project relies very heavily on the generosity of those friends, who work on it only because they want to and because they loved the previous videos.
Cool New Tech
Amongst the innovations deployed in Chalk Warfare, there were a few particularly interesting advances for the team: the new chalk particles, the use of LIDAR, and real-time virtual cinematography.
The chalk effects in this film were significantly improved, with a full particle simulation being deployed. The team used tyFlow, a new particle system created by Tyson Ibele. It is an Autodesk 3ds Max plugin, currently in open beta, and one of its core features is that it is fully multithreaded. Wickert points out that, “Actually, I went out and bought a new computer with many more cores to be able to take advantage of tyFlow more completely and run even crazier particle simulations for elements like the water disintegration.” The team tried cloud computing systems but found cloud solutions problematic for the most part, given the large sim cache files and their own domestic internet pipelines and data transmission speeds.
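To give a sense of what a particle solver is doing on every frame, here is a minimal, hypothetical chalk-dust step in Python with NumPy: explicit Euler integration with gravity and heavy air drag. This is not tyFlow code (tyFlow runs inside 3ds Max), and all constants are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Emit dust particles from the surface of a disintegrating chalk prop.
n = 10_000
pos = rng.uniform(-1.0, 1.0, size=(n, 3))   # starting positions (metres)
vel = rng.normal(0.0, 0.3, size=(n, 3))     # initial outward scatter (m/s)
gravity = np.array([0.0, -9.8, 0.0])        # acceleration (m/s^2)
drag = 2.0        # chalk dust is light, so air drag dominates (invented value)
dt = 1.0 / 24.0   # one film frame at 24 fps

def step(pos, vel):
    """One explicit-Euler step: drag pulls velocity toward zero, gravity down."""
    vel = vel + (gravity - drag * vel) * dt
    pos = pos + vel * dt
    return pos, vel

for _ in range(24):   # simulate one second of falling dust
    pos, vel = step(pos, vel)
```

A production solver does far more (collisions, rigid bodies, millions of particles), which is where multithreading and extra CPU cores pay off: each particle’s update is independent, so the work parallelises cleanly.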
TyFlow uses the latest PhysX SDK rigid body simulations and works as a fast OpenCL-accelerated solver for simulating a variety of materials. “It’s incredible what Tyson was doing, and he was very instrumental in helping us. Actually, I stayed in direct contact with him throughout the production,” commented Wickert. “The product is able to compete with some of the higher-end particle systems, like Houdini. It works directly in 3ds Max, so it really brings out some incredible functionality that Max wasn’t really able to provide before.”
TyFlow was also used for the water simulation that neutralises the robot, with the actual water meshing done using Phoenix FD. “The water meshing was actually done with Phoenix and our render was V-Ray. A great guy and friend, Allan McKay, helped us out with that water simulation and water blast,” says Wickert. McKay provided the water cache, and Wickert handled the disintegration of the robot, as well as the disintegration of the gun it is holding as it falls apart.
While bullet hits and closeups on the weapons all worked well with the chalk dust particles, at a certain distance from chalk objects the fine dust and particles are no longer visible. This made the job of having props appear to be made of chalk difficult, and problematic for blocking and lighting. Once the fine detail is lost to distance from the camera, the chalk props become flat, even colours, which look remarkably odd and are difficult to sit into the shot. “That was a big thing for shading the robot. That’s honestly one of the big things we found, so we decided that we just had to make it with millions of particles. It was the easiest way to replicate chalk,” says Forde.
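The fall-off Forde describes is simple pinhole geometry: a grain’s projected size shrinks linearly with depth, and once it drops below a pixel the chalk texture stops reading as dust. A hypothetical back-of-envelope sketch (the focal length and grain size are invented values, not figures from the production):

```python
def projected_px(radius_m, depth_m, focal_px=1000.0):
    """Pinhole projection: on-screen diameter (in pixels) of a dust grain.

    focal_px is a hypothetical focal length expressed in pixels.
    """
    return focal_px * (2.0 * radius_m) / depth_m

# A 1mm chalk grain at increasing distances from the camera:
grain = 0.001
sizes = {depth: projected_px(grain, depth) for depth in (1, 5, 20, 100)}
# At 1m the grain covers about 2 pixels; by 20m it is a tenth of a pixel.
# Past that, the granular texture vanishes and the prop collapses into a
# flat, even colour -- hence shading the distant robot with millions of
# particles rather than a plain diffuse material.
```

This is why particle count, not shader tweaking, was the practical fix: enough sub-pixel grains accumulate into the mottled, dusty response that a flat diffuse surface cannot fake.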
One of the most unexpected aspects of the South Carolina shoot was the building scanning. The team needed photogrammetry of the exterior location, an old mill where some tradesmen happened to be working on renovating one of the buildings. “And one of the gentlemen who happened to be walking by was LIDAR scanning the entire environment. He asked if we wanted the files,” recalls Wickert. “So we gladly received something like hundreds of gigabytes of LIDAR scans of this entire environment.” These scans were then combined with the team’s own still photogrammetry images to help solve the exterior location visual effects.
Real-time virtual cinematography
For the scene where the chalk robot breaks through a wall, the team opted to use virtual production and virtual cinematography. Wickert had been using various forms of real-time virtual cameras since 2017, and thus the team ‘filmed’ the sequence using a live real-time virtual camera. “Things like the Unreal engine are just so cool. We are really excited to see how these powerful tools become available and get into the hands of people like us,” says Wickert. “I am keen to see talented people get inspired and put in the time to download the software and learn it – you no longer have these ‘tens of thousands of dollars’ barriers to stop you from making this stuff; anyone can download it and use it.”
The footage was not transcoded; the raw files were taken directly into DaVinci Resolve Studio, allowing the team a lot of flexibility with the footage’s dynamic range. The post-production was completed largely in Resolve, which Wickert found fast and efficient. “A big issue we’ve faced with software programs and editors in the past is that the software didn’t utilize the full performance of our computers. Just because you may have great hardware, it doesn’t mean your post-production software is utilizing it. This film has a combination of extensive live-action as well as many full CG sequences with no practical footage (aside from assets), so having a quick, multithreaded and efficient editor that could handle all this material was a must. Resolve was up to the task, and truly made our workflow more efficient.”