Wes Ball is a highly successful film director, known for the Maze Runner films, and a visual effects artist and animator in his own right.
Last week Ball posted a video showing the real-time UE4 sizzle reel that got the film Mouse Guard greenlit at Fox Studios. Unfortunately, he posted it in the shadow of the film not proceeding. Disney decided the film did not fit its slate and schedule, but the studio was generous in allowing Ball to keep the material and remains supportive of the filmmaker.
Ball posted the video, which is only part of the work he and the team were doing, but it stands as a great example of just how far real-time virtual production has come and gives an insight into how directors and studios are looking to explore projects with these new real-time tools. Ball, a long-time supporter of fxguide, walked us through the technical aspects of how he approached the project and how it came to be made.
History Of How It Was Made
In September 2017, it was announced that 20th Century Fox was courting Wes Ball to helm a big-screen adaptation of David Petersen’s Eisner-winning comic book series Mouse Guard. “I was finishing up the last Maze Runner movie and I saw Emma on the backlot,” he recalls, referring to Emma Watts, the President of Production at 20th Century Fox. “We were talking and I was jokingly telling her that I wanted to do a tiny movie next. I said I wanted to do something small and she said: ‘No, next you’re going to do something big… go figure out what it is’”. Since the time of the second Maze Runner film, Ball had been pitched the idea of Mouse Guard. He had not really seriously considered it: “I was thinking, oh yeah, cool, you know, mice with swords, that’s cute”. Meanwhile, he completed both the second and the third Maze Runner films. “When I finished, Emma said again, ‘Hey, what did you think about the Mouse Guard thing?’ And I responded that I didn’t know if I wanted to do a cartoon. Emma answered, ‘Don’t think of it as a cartoon. Think about what James Cameron would do with it?’”. At which point Ball was hooked. He read the books and started thinking about how he could make the film. “I just started imagining Avatar with medieval mice, it all kind of clicked in my head. I knew how to do it”, he recalls.
Ball came back to Watts and told her that he knew how to make Mouse Guard. He verbally pitched what he was planning and how he was going to attack it. Ball recalls that Watts said, “I like it when you get that look in your eye, go do it. Go put together a test, show me what you need”. With seed funding from Fox, Ball then contacted Glenn Derry. After Fox had acquired Technoprops, Derry became head of FOX VFX LAB and VP of Visual Effects, Fox Feature Films. Together they assembled a small team and started working on the initial pitch video that Ball posted online last week. That sizzle reel led Watts to greenlight the picture and early pre-production started. Unfortunately, after the Disney-Fox deal, Walt Disney Studios decided that Mouse Guard did not fit with its plans and the film was shelved. Despite some reports on social media, there was no falling out between Ball and Disney. The team at Disney, while unable to commit to the film, remain supportive of Ball. The director, while very disappointed, did not post the test to spite Disney, far from it. Ball posted the Unreal sizzle reel because he is incredibly proud of the work, and because he is a huge fan of this newly emerging area of virtual production.
Epic Games CTO Kim Libreri commented via Facebook that the test “really is the pinnacle of virtual production”, adding, “incredible UE4 previz scenes from Wes Ball and his amazing teams at Fox VFXlabs and Halon”.
How He Made it
The production had two distinct stages: the original sizzle test to get the film made, and a second stage when the team started building a robust pipeline to make the film. Wes Ball also posted a video online of the Mouse Guard offices after work had stopped. This second video shows the office they were using for the main production, but the sizzle reel was not made in this space. The reel that was posted online led to the film being formally approved; the office, including most of the art and models seen in that video, came after the sizzle reel.
As soon as Wes Ball decided to do a test reel he “started learning Unreal and I think it was anywhere from 12 to 13 weeks, I guess, from start until we’d built that whole sizzle reel”. The team during this period was split between Derry’s MoCap team and those directly working with Ball. In total, about 10 people helped to make the sizzle reel.
The reel was made in UE4 and runs in real time. In this case, that means any sequence can be interactively worked on, but for the final presentation, it was rendered out of UE4 into a QuickTime. The clip has greatly varying image complexity: some shots could run at 120 frames per second in UE4, while the most complex fire shot, with hundreds of characters, ran natively at just 11 frames per second. Ball points out that their aim was not to make a demo or a game but to productively use Unreal to tell a story. He adds that he is quite sure that if someone was doing a game or a demo they could optimize the blueprints and no doubt get far greater frame rate performance, but he simply did not need to do this, as the whole system ran very effectively without any optimizations. “It was good enough to work in and set any shot up. But ultimately we were just interested in being able to press render and get the shot in two minutes, whether or not it was actually all running at 30 frames a second,” he comments. The whole project was done with the single aim of very quickly and effectively telling a story, and the team happily used whatever worked. “There were no optimizations. We had dynamic lighting. I turned on LPV, which I know some of the Unreal folk hate me doing. LPV is Light Propagation Volumes, it is like UE4’s real-time GI (Global Illumination) that they discontinued, but it’s still in the engine… I just found that it was a good and very quick way to get to a look that worked for me,” Ball adds.
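For readers wanting to experiment, LPV in UE4 of that era was typically switched on with a console variable, for example in the project’s ConsoleVariables.ini, with the GI then contributed by a movable directional light. This is a minimal sketch of that setup, not the production’s actual configuration:

```ini
; ConsoleVariables.ini (project or engine config)
; Enable the experimental Light Propagation Volume GI, then restart the editor.
r.LightPropagationVolume=1
```

From there, a movable directional light’s dynamic indirect lighting feeds the volume, which is what made Ball’s fully dynamic lighting workflow possible without baking.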
Once they got the thumbs up from Fox to make the sizzle test, Ball started using Megascans assets to build his Mouse Guard world. “I used a ton of Megascans. I mean a ton of them… I just fell in love with using those assets,” explained Ball. Megascans is a massive online scan library of high-resolution, consistent, PBR-calibrated surfaces, vegetation, and 3D scans. It is the product of five years of scanning by Quixel in collaboration with top game and film studios worldwide. Quixel produced its own incredibly impressive demo at Epic’s GDC 2019 event in March: the stunning cinematic short “Rebirth”, which debuted during Epic’s “State of Unreal” opening session. It showcased both the photorealistic features of Unreal Engine and the assets in the Megascans 2D and 3D physically-based asset library.
Wes Ball has a long history with visual effects; in fact, he has taught at fxphd.com. His first short film, Ruin (done in Modo), was almost entirely made by Ball himself. His visual effects background was not lost on Matt Sloan, his visual effects supervisor on The Death Cure, who commented at the time, “it was amazing to have a director who was as engaged and savvy as Wes.” On that last Maze Runner film, the director did multiple shots himself that made it into the final film.
Mouse Guard was going to be Ball’s first major experience of making a MoCap movie, but as he pointed out, “I’ve always told everyone at Fox, I want to do a MoCap movie – when I made Ruin that was basically my plea to give me a fully CG MoCap movie!” he joked.
On the Mouse Guard project, Ball was once again very hands-on: “I was using Megascans and making shots in Unreal… for me it was like, ‘this is amazing!’ It was from my brain, to my Wacom tablet, to ‘I’m looking at the shot finished’, all in one night. It was incredible”. It was Glenn Derry who actually told Ball to check out the Unreal Engine, “and so I went home that night and basically I just looked up tutorials on Unreal and I saw what they were doing and I said, okay, we’re using Unreal,” he recalls. Ball is self-taught in UE4, learning by downloading examples from the net: “Honestly, I downloaded the UE4 digital Mike several times, opened it up and just kind of ripped it apart trying to figure out how the shaders worked. I learned a lot about Unreal just by downloading all of Epic’s free downloads. That’s basically how I learned Unreal. It’s great. Amazing”.
By using Megascans, Ball could quickly build a world in UE4, but it was devoid of characters. This was where the team at FOX VFX LAB became involved. “When I first started talking with Glenn (Derry) about how we were gonna pull off the test, we started talking about the Unreal engine. I started looking into it and how people were using it and it kind of blew my mind a little bit,” recalls Ball. “You know, I hope this doesn’t come off as egotistical, but more than half the environments in that sizzle reel I built myself, – at home, – in my underwear!” To populate his world, Ball used MoCap almost exclusively.
Glenn Derry’s team did the motion capture on his MoCap stage. At that time, FOX VFX LAB happened to host a small previz team from Halon Entertainment, headed by Casey Pike, which built and rigged the mouse characters in Maya for Mouse Guard. The team at FOX VFX LAB would walk out on stage and do the MoCap; the Halon team would check it, clean it up, and pass it straight over to go into Ball’s Unreal computer. At this point there was no facial capture; that would come later when the team regrouped after the film was greenlit. While the sizzle reel does have facial animation, it was all hand-animated later. At this time, the team established there was no easy way for them to do detailed fur on the mice. They looked at hair done as cards, and there are a couple of shots using shaders with NVIDIA’s HairWorks, but Ball decided it was not worth the effort for the purposes of his sizzle reel.
All the characters were MoCapped at Derry’s stage. While the team would later build their own small capture volume in the Mouse Guard offices, this was only used for virtual cinematography; all the actual MoCap was done at the Fox stage. Interestingly, as this was an exploratory process, the team would work out blocking and do MoCap even before they knew what the environment was going to look like. “Casey at Halon applied all the MoCap data to the rigged characters and then spit out an FBX. That went straight into UE4. For example, in the fight scene, I just kind of imagined what might work. We had no clue what the environment was going to be. We just shot a great fight scene and I said, ‘Oh, there’s probably gonna be something here or over there’… imagining what the shot might be. That way we could pretty quickly establish blocking that worked, even though I didn’t know what the environment was yet”.
Unreal became Wes Ball’s ‘stage’ where he made the film. As stated, there was a small camera capture volume in the production offices, but Ball did not even use this for exploring the world or ‘finding’ shots as one might expect. The director was simply more comfortable keyframing in UE4. However, he did use the system to add a more natural feel to the computer moves: he was very happy setting keyframed camera moves, and he would then hand-operate the virtual camera as it ‘moved’ along its digital track. “I am totally comfortable at a keyboard, laying down keyframes and making a camera shot. There are other directors that need a real ‘thing’. They need like a device in their hands. They need to walk and move, I don’t,” states Ball. “But what was really handy is I’d make these little simple camera splines inside UE4. And when I had the rough timing, I would take the virtual camera and I would essentially ride my own crude spline curves and add a handheld camera on top of it.” This approach turned out to be very effective later as production progressed, but for most of the sizzle reel, the camera work was traditionally hand-animated in UE4. The main exception is the virtual camera walk around the snake, where Ball walked around the snake character virtually, recording one long 360-degree shot.
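The ‘ride a crude spline, then layer handheld on top’ idea can be sketched in a few lines of plain Python. This is purely illustrative: the production layered a live hand-operated virtual camera over the spline, whereas here a cheap layered-sine jitter stands in for the operator’s hand, and all of the function names are hypothetical.

```python
import math
import random

def spline_position(t):
    """Hypothetical stand-in for sampling the keyframed camera spline at time t
    (in production this was a camera spline inside UE4's Sequencer)."""
    return (t * 10.0, 2.0 * math.sin(t), 1.7)

def handheld_offset(t, amplitude=0.05, seed=7):
    """Cheap layered-sine 'handheld' jitter; a live operator pass replaces this."""
    rng = random.Random(seed)
    phases = [rng.uniform(0.0, math.tau) for _ in range(3)]
    return tuple(
        amplitude * (math.sin(2.1 * t + p) + 0.5 * math.sin(5.3 * t + 1.7 * p))
        for p in phases
    )

def camera_position(t):
    """Final camera = smooth spline ride + handheld layer on top."""
    return tuple(b + j for b, j in zip(spline_position(t), handheld_offset(t)))
```

The design point is the separation of layers: the smooth spline carries the intended move and rough timing, while a small, independent offset supplies the organic imperfection, so either layer can be re-worked without touching the other.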
Ball used UE4’s Sequencer extensively in his process. “We were using Sequencer a lot, it was definitely my friend. What we actually found really helpful was that all the characters and assets that we were using for any given sequence were spawned in,” Ball outlines. “So what that meant is that I had my set, which was one icon, and I had my shot, which was essentially my take, in Sequencer. And when I opened up that Sequence, it loaded whatever the latest set was, – it all just worked”. If Ball changed anything in the set, the effects would just ripple through his shot, but not through any other shot that he had already done. This reflects how films often differ from games. In a game, there is one reality, one set. In filmmaking, there is a starting point with the set, but things are often dressed to camera and objects moved just to make that one camera angle work. It is a cheat only possible when approaching the work as a film project and not as a game.
The spawning also meant different artists could work on different parts of a larger overall set, and their work would stream into the one world. This requires an additional level of both asset/shot tracking and workflow management, compared to most games.
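The spawned-set workflow above can be modelled as a tiny data structure: every shot loads the latest shared set, then layers its own ‘dressed to camera’ cheats on top. This is an illustrative sketch only, not UE4’s actual Sequencer/spawnable API; the class and prop names are invented.

```python
# Toy model of the 'spawned set' idea: each shot resolves against the latest
# shared set, plus any cheats that are local to that one shot.

class Shot:
    def __init__(self, name):
        self.name = name
        self.overrides = {}              # per-shot cheats, local to this shot

    def dress(self, prop, position):
        """Move a prop for this one camera angle only."""
        self.overrides[prop] = position

    def resolve(self, latest_set):
        """What this shot sees: the latest set, with its own cheats on top."""
        world = dict(latest_set)
        world.update(self.overrides)
        return world

shared_set = {"lantern": (0, 0, 0), "barrel": (4, 0, 0)}

shot_12 = Shot("mg_012")
shot_12.dress("barrel", (2.5, 0, 0))     # cheat: barrel moved for this angle

shot_14 = Shot("mg_014")                 # no cheats: sees the set as-is

# An edit to the shared set flows into every shot the next time it loads,
# while each shot's own cheats stay local to that shot.
shared_set["lantern"] = (0, 1, 0)
```

The film-versus-game distinction in the article falls out of the two layers: the shared set is the game-style single reality, and the per-shot overrides are the film-style dress-to-camera cheats.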
Around this time, in early 2018, Ball commented that the Mouse Guard film was going to be very complex, telling ComingSoon.net, “the trick with this one is we have to thread that needle with tone. I’m not interested in doing a DreamWorks or Pixar-type movie, I’m interested in doing something closer to Planet of the Apes where you’re really gonna nail characters and show the harsh reality of what they live in. It’s gonna be a little bit of both, probably, but at the same time because of the cost I need as big an audience as possible. So I want 10-year-olds to see this as much as 40 and 50-year-olds.”
Weta Digital had a preexisting relationship with Wes Ball, and they were on track to do the visual effects and animation for the film, with Dan Lemmon as the visual effects supervisor. It was Lemmon who suggested that the team should get some practical model sets built, as this would help make the work seem real, by actually seeing things in scale. In the end, the models were used only as a learning exercise. Ball decided to scale everything in Mouse Guard up to human dimensions. The project was not done as if the mice were tiny; rather, the decision was made that this world would be to the scale of humans. This meant that the mice would not be filmed with extreme macro-style 3D lenses. “I wanted to make it feel as if you took the mice and their world and blew them up to six feet tall,” explains Ball, adding, “I wanted to make it more of an immersive thing, that we are in this world that David Petersen created. In his world, there are no humans. The mice are humans, basically. I wanted them to feel like we were there with them”.
The Weta Digital team would have produced a much more sophisticated final animation, in line with their work on the Apes films, had the production continued. The sizzle reel was never meant to show final image quality, but it is so remarkably good that some people have mistaken parts of it for final animation assets, which was not the case.
Lighting in UE4
Wes Ball lit every one of the shots in the sizzle reel: “that was kind of where I spent most of my time. Honestly, lighting can make things look good or bad. It can make a really boring shot look great, so the atmosphere stuff in UE4 blew my mind,” he comments. “We all know what it’s normally like to render real volumetrics, and this thing was doing it in real time,” he explains. Ball knew that with his real-time sizzle reel shots the lighting was still an approximation of reality and not physically accurate in the way Weta might render a scene in their Manuka renderer. “My whole goal was to try and rapidly prototype the movie in the Unreal engine so that we could take a lot of the mundane inefficiencies out of this kind of process, out of the whole movie-making process,” comments Ball. “Even though it was not a physically accurate kind of a thing that we were doing, it was at least going to be really, really informative of what we were trying for. This would mean Weta would start that much further along,” he adds. “This meant in turn that all the artistry and all those geniuses over in New Zealand could spend a lot more of their time on the last 20%, rather than on trying to get to the initial 80% mark.”
Part of that plan to reduce re-work and improve communication was to build the process around USD, Pixar’s open-source Universal Scene Description. “The whole process was built around USD. All the work that we were doing, all the choices that we were making,” he explains. In the past, MoCap went into Unreal, but UE4 was a data and metadata dead end. It was purely a previs reference tool; all the real hardcore data stayed in Maya or MotionBuilder. This annoyed Ball, as it was often very easy for him to make changes in UE4, and he did not want someone to have to go back and mirror that change in some other Maya master file. After the sizzle reel was done, the team wrote new tools to get the data out of Unreal so it could be passed through UE4 and on to production, without needing to be re-done and mirrored back in Maya. “The guys came up with these incredible tools, they wrote some really custom stuff. Everyone on the team was really excited by this, as a lot of these guys had done major MoCap movies like this before; they were very aware of all the issues with various data ‘gotchas’,” Ball recalls. The team developed some new and inventive ways to solve the problem. “We basically baked timecode into all the bones of our characters, so no matter what we did, we could always see the timecode for the take that we were using”. This was combined with a USD mechanism so that when Ball did any editing late at night, the system would automatically package out all the changes: the art department reference, the USD and scene information, along with the Unreal UE4 file itself, all of which could be sent to Weta as one package for the next morning. The process even accessed the high-density models that may have been decimated down for real-time use inside UE4, if any massive sets or assets were involved. The solution was both elegant and efficient. In many respects, it is this that moved the use of UE4 from that of a previz tool to a virtual production tool.
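The nightly hand-off described above can be sketched as a small packaging step. This is a hypothetical illustration of the idea only: the `package_nightly` helper, the zip container, and the file names are all invented for the example, and the real pipeline carried the timecode baked into the character bones rather than in a sidecar manifest.

```python
import json
import zipfile
from pathlib import Path

def package_nightly(shot_dir, out_zip, timecode):
    """Hypothetical sketch of the nightly hand-off: bundle the USD, scene
    information, art references and UE4 file from one shot directory into a
    single package for the vendor, recording the take's timecode alongside."""
    shot_dir = Path(shot_dir)
    manifest = {
        "timecode": timecode,                    # take timecode for the shot
        "files": [p.name for p in shot_dir.iterdir()],
    }
    with zipfile.ZipFile(out_zip, "w") as z:
        z.writestr("manifest.json", json.dumps(manifest, indent=2))
        for p in shot_dir.iterdir():
            z.write(p, p.name)                   # USD, scene info, UE4 file...
    return manifest
```

The point of automating this step, as the article describes, is that a late-night edit in UE4 arrives at the vendor the next morning as one self-describing package, rather than as a change someone must manually mirror back into a Maya master file.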
With these streamlined approaches, the team planned to make a $170 million film in just two years, at the highest level of hyper-real image quality. Hopefully, this wonderful project will be picked up elsewhere.