BebyFace in Bebylon

Last year, Los Angeles-based Kite & Lightning won 2018’s Real-Time Live!, SIGGRAPH’s showcase of live demonstrations, beating an impressive field of entries. Their presentation, “Democratizing MoCap: Real-Time Full-Performance Motion Capture with an iPhone X, Xsens, IKINEMA, and Unreal Engine”, showed a real-time character, “Beby”, driven by a combination of body and facial capture in UE4.

Their Bebylon presentation, which won Best Real-Time Graphics and Interactivity, featured a live performance driving fully expressive, real-time rendered CG characters, proving how an ingenious combination of readily available technology can level the playing field creatively and produce entertaining results.

Since SIGGRAPH 2018, the team has been working on their game Bebylon Battle Royale, and they are now making a short film (or, as they call it, a ‘very short feature film’).

Cory Strassburger co-founded Kite & Lightning with Ikrima Elhassan. We caught up with Strassburger recently to discuss how BebyFace and the Bebylon game are progressing.

Bebylon was the inspiration for the Real-Time Live! demonstration, explains Strassburger. “I was desperate for a way to bring our game’s immortal Beby characters to life, and facial capture was the big hurdle… I knew Apple had bought a company called Faceshift, who at the time was already democratizing facial capture on the desktop. I got curious about how much of their tech made it into the iPhone X and ARKit. My first tests showed they managed to miniaturise the whole core of their face capture tech into the iPhone X, and that was very, very exciting.”
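What made the iPhone X so exciting is that ARKit’s face tracking delivers, every frame, a set of named blendshape coefficients (around fifty of them, each a value between 0 and 1) plus a head transform. As a rough illustration only of what that per-frame data looks like and how it can drive matching morph targets on a character, here is a minimal Python sketch; the frame values are invented, and only the curve names (“jawOpen”, “eyeBlinkLeft” and so on) come from Apple’s set.

```python
# Illustrative sketch only: the shape of the per-frame data ARKit face
# tracking produces (named blendshape coefficients plus a head transform),
# and how it could drive matching morph targets on a character.
# The frame values below are made up; the curve names are from Apple's set.

from dataclasses import dataclass


@dataclass
class FaceFrame:
    timestamp: float                      # seconds
    blendshapes: dict[str, float]         # curve name -> weight in [0.0, 1.0]
    head_rotation: tuple[float, float, float] = (0.0, 0.0, 0.0)  # pitch/yaw/roll, degrees


# One captured frame (values invented for illustration).
frame = FaceFrame(
    timestamp=3.25,
    blendshapes={
        "jawOpen": 0.62,
        "mouthSmileLeft": 0.35,
        "mouthSmileRight": 0.31,
        "eyeBlinkLeft": 0.05,
        "eyeBlinkRight": 0.04,
        "browInnerUp": 0.48,
    },
    head_rotation=(4.0, -7.5, 1.2),
)


def apply_to_character(face_frame: FaceFrame, set_morph_weight) -> None:
    """Drive a character whose morph targets share Apple's curve names."""
    for curve, weight in face_frame.blendshapes.items():
        set_morph_weight(curve, weight)


# Example: just print what would be applied.
apply_to_character(frame, lambda curve, w: print(f"{curve}: {w:.2f}"))
```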

Strassburger did not know about the Real-Time Live! event, even though he had been going to SIGGRAPH for 15 years. “I learnt about it from one of the tech guys, Chris from Xsens. I was doing work with the suit and he said, ‘Hey, you gotta do Real-Time Live!’ I didn’t know at the time what it was, but I owed them a lot of favours for all the great work they had done for me, and I was happy to help them show off how well it all worked.” When Strassburger investigated further, he was stunned at how big and influential Real-Time Live! at SIGGRAPH was. “When I finally got there, I was a bit scared that I was in over my head, especially when I saw the other amazing things being done there,” he added.

Initially, the Kite & Lightning team was not focused on real-time rendering, other than as a motion capture tool for the game. But Strassburger and Elhassan set about getting everything working in Unreal, and what started as a weekend project turned into a major focus for the company.

BebyFace actually has fxguide’s Mike Seymour’s eyes!

Even with the new focus on presenting at SIGGRAPH, the team continued working on their game. This meant that Strassburger was working on the SIGGRAPH presentation right up until the last minute. “I had some real issues during the rehearsal, so I was in my hotel room in Vancouver for three or four days solid, just trying to work out a bunch of kinks and stuff before the SIGGRAPH event started.”

Strassburger also raided Epic’s Meet Mike digital human UE4 sample project, featuring our own Mike Seymour. “I pirated skin shaders, eyeballs and a very cool lighting rig, and BOOM, I had some seriously good looking Bebies rendering in real time,” he tells fxguide.

The tech and doing it today

Strassburger points out that for someone attempting this today, it is “night and day… Technically you don’t even have to figure out any of the stuff that I did. You can just download their sample projects”. Compared to this time last year, 90% of the ARKit tech that powers the facial capture is now available via a sample project downloadable from Epic. This was not the case last year, when Strassburger was working out how to connect his iPhone to UE4. “It’s down to the artistry at this point,” he says. “The key now is in the asset itself. You use the sample assets. Unity has this little sloth character, which isn’t very good. Unreal has their kite boy as the subject for their sample scene, and it’s really good. You can now just get in there and see how it all works and make expressions that drive a character.”

He goes on to point out that anyone can download this today and get it working, “assuming you have an iPhone X and an Apple Developer account… you just download that project, plug in your iPhone and build to the iPhone”. The app build process isn’t trivial, but for anyone who can handle a simple app, it is very accessible. Strassburger’s advice: “You just have to follow the instructions and plug in the data. It takes probably about half an hour to set up your Unreal scene in order to build to the iPhone and get it working with the sample kite character.”
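Conceptually, the phone app does the face capture on the device and streams the resulting curve values over the local network to the machine running the editor, which applies them to the character each frame. The sketch below illustrates that idea only; it is not Epic’s actual Live Link protocol, and the address and packet format are invented.

```python
# Conceptual sketch (not Epic's actual Live Link protocol): the phone app
# captures the face curves and streams them over the local network to the
# machine running the editor, which applies them to the character each frame.

import json
import socket

EDITOR_ADDR = ("192.168.1.20", 54321)   # hypothetical editor IP and port


def send_frame(sock: socket.socket, timestamp: float, curves: dict) -> None:
    """Phone side: serialise one frame of face curves and send it."""
    packet = json.dumps({"t": timestamp, "curves": curves}).encode("utf-8")
    sock.sendto(packet, EDITOR_ADDR)


def receive_frames(port: int = 54321) -> None:
    """Editor side: receive frames and hand the curves to the character rig."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _ = sock.recvfrom(65535)
        frame = json.loads(data)
        # In the real pipeline, this is where the engine would update the
        # character's morph target weights for this frame.
        print(frame["t"], frame["curves"])
```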

The artistry he refers to comes from moving from the sample face to something like his BebyFace character. “There’s definitely a lot of artistry involved in getting stuff to look good and working with the blend shapes in building your asset. To work with your own character, there’s definitely some steps involved to get it into the engine and working really well.”

Initially, the BebyFace facial rig was based on a Polywink rig from November 2016. Polywink offers facial rigs on demand, automatically generating a FACS facial rig with up to 236 blendshapes. They provide a rig adapted to the specific topology and morphology of any 3D character, whether it’s a scanned head, a photorealistic 3D model or a cartoonish character, and their facial rigs are compatible with Maya, Unity and Unreal. To get BebyFace FACS rigged, the team initially used this service, uploading their 3D model and downloading the rigged face.

Kite & Lightning later replaced this rig with their own work in Autodesk’s Maya, after the iPhone X came out and the team got access to Apple’s official blend shape set, “which was just more accurate and more natural”. Strassburger noted that he “lost a little of the baby’s organic character expression from that original set in the process. I’m now injecting that back in, that’s the artistry.” The nice thing about the Apple set, says Strassburger, is that “it’s the natural base set and it does look good, and you can push it from there”.
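As a rough sketch of the kind of remapping and “injecting back” Strassburger describes, assuming Apple’s base curves drive the matching shapes on the character rig while a few character-specific shapes from the original FACS rig are layered back on top: all the character shape names, scale factors and mix weights below are hypothetical, and only the Apple curve names are real.

```python
# Hypothetical remapping: Apple curve name -> (character blendshape, scale).
BASE_MAP = {
    "jawOpen":         ("beby_jawOpen", 1.0),
    "mouthSmileLeft":  ("beby_smile_L", 1.2),   # pushed a little past the base
    "mouthSmileRight": ("beby_smile_R", 1.2),
    "browInnerUp":     ("beby_browInnerUp", 1.0),
}

# Character-only shapes re-injected from the original rig, each driven by a
# weighted mix of the incoming Apple curves (all values invented).
CHARACTER_LAYER = {
    "beby_cheekChub": {"mouthSmileLeft": 0.4, "mouthSmileRight": 0.4},
    "beby_poutExtra": {"jawOpen": 0.2, "mouthFrownLeft": 0.5, "mouthFrownRight": 0.5},
}


def solve_rig_weights(apple_curves: dict[str, float]) -> dict[str, float]:
    """Turn one frame of Apple curves into weights for the character rig."""
    out: dict[str, float] = {}
    for curve, value in apple_curves.items():
        if curve in BASE_MAP:
            target, scale = BASE_MAP[curve]
            out[target] = min(1.0, value * scale)
    for shape, mix in CHARACTER_LAYER.items():
        out[shape] = min(1.0, sum(apple_curves.get(c, 0.0) * w for c, w in mix.items()))
    return out


print(solve_rig_weights({"jawOpen": 0.5, "mouthSmileLeft": 0.8, "mouthSmileRight": 0.7}))
```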

One of the often-neglected aspects of the iPhone capture is the stabilising and tracking that happens on the phone itself. Most head-mounted camera (HMC) rigs use tight helmets that are rigidly mounted so the camera won’t shake relative to one’s head. The iPhone, by contrast, works fine even handheld. The Kite & Lightning demo, for example, maps not only facial expressions into UE4 but also head tilt. For BebyFace, this means that Strassburger has to separate the head pivot from the rest of the iPhone data: he is wearing an Xsens suit, which already provides the head and body movement, so the iPhone data needs to be stripped back to just the facial solve. As a result, the Kite & Lightning solution works on a loose head rig, a cheap chest mount, or even handheld.
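A minimal sketch of that split, with purely illustrative data structures: the Xsens suit owns the head and body motion, so the incoming iPhone frame is reduced to just its facial curves before the two streams are combined.

```python
# Sketch of stripping the iPhone frame back to the facial solve: the head
# transform is discarded because the Xsens suit already drives the head.
# The frame layout here is illustrative, not a real pipeline format.

def strip_to_facial_solve(iphone_frame: dict) -> dict:
    """Keep only the blendshape curves from an iPhone face frame."""
    return {
        "t": iphone_frame["t"],
        "curves": dict(iphone_frame["curves"]),   # facial blendshapes only
        # "head_rotation" deliberately dropped
    }


frame = {"t": 12.4, "curves": {"jawOpen": 0.3}, "head_rotation": (2.0, -5.0, 0.5)}
print(strip_to_facial_solve(frame))
```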

Kite & Lightning combine the Xsens suit and iPhone data streams in UE4. Ultimately a third data stream will be folded in from hand/glove sensors. At SIGGRAPH 2018 the team were not incorporating glove or hand data, but they are looking at that option now. Strassburger has played with several gloves, including the Manus VR gloves, and he is hoping to work with the new Plexus gloves, which promise 0.01-degree accuracy, 24 degrees of freedom and haptic feedback (at the time of writing the Plexus gloves are still not shipping; they were slated for February 2019). The Manus gloves do currently work and stream into UE4 much like the Xsens suit data. “It is pretty easy to stream into Unreal. That is where the magic happens, as all of this is going into UE4, which records it in Sequencer. You literally just hit record and it all records together,” he says. UE4 does not care where the data streams come from. All the various data is recorded in one single animation file, even any audio that you might be streaming. “It’s then immediately usable, at that real-time quality, to immediately edit or do anything you want.”
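The “it all records together” idea can be sketched as a toy recorder: each named source just pushes timestamped samples, and one take collects them all side by side. This is only an illustration of the concept, not UE4’s Sequencer or Take Recorder, and the source names and samples are invented.

```python
# Toy take recorder: sources push timestamped samples; one take holds them all.
from collections import defaultdict


class Take:
    def __init__(self) -> None:
        self.tracks: dict[str, list[tuple[float, object]]] = defaultdict(list)
        self.recording = False

    def start(self) -> None:
        self.recording = True

    def push(self, source: str, timestamp: float, sample: object) -> None:
        if self.recording:
            self.tracks[source].append((timestamp, sample))

    def stop(self) -> dict:
        self.recording = False
        return dict(self.tracks)


take = Take()
take.start()
take.push("xsens_body",  0.016, {"hips_y": 92.1})
take.push("iphone_face", 0.016, {"jawOpen": 0.4})
take.push("manus_hands", 0.016, {"r_index_curl": 0.7})
take.push("audio",       0.016, b"\x00\x01")          # streamed audio chunk
print(list(take.stop().keys()))
```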

For the game, the team dual-records the Xsens suit, both in Xsens’ own software and in UE4. This allows post-processing of the suit data. Xsens offers HD processing, which Strassburger says “is remarkably good. Like, I’d put it on par with an optical system; it was surprising to me that it can be that good”. The Xsens and UE4 data have matching timecode, so once the HD processing is done, which is quick but not real time, the team can just swap that data into the corresponding Blueprint in UE4. “The beauty is that I can build my edit with my captures and then pick and choose which ones I want to process in HD (in Xsens). So when I’ve got my final selects, I go back into Xsens and process those in HD, which is a 5 or 10 minute process,” he adds. “It’s a super, surprisingly painless process.” Xsens-style inertial suit technology is prone to small ground plane issues and drift, especially with multiple floor planes (jumping on and off objects). Kite & Lightning estimate that the HD processing gets rid of around 95% to 98% of drift problems and cleans up all the data. In the case of BebyFace, the ground plane issue really only centres around sitting in seats, as the characters are not exactly action figures.
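Because the real-time track recorded in-engine and the HD-reprocessed export share the same timecode, the swap for a selected take amounts to replacing the body samples frame for frame. A minimal sketch of that idea, with invented take data and timecode keys:

```python
# Sketch of the timecode swap: replace the real-time body samples with the
# HD-processed ones, matched on their timecode keys. Data is illustrative.

def swap_in_hd_body(take: dict, hd_body: dict[str, dict]) -> dict:
    """Replace 'xsens_body' samples with HD-processed ones by timecode."""
    updated = dict(take)
    updated["xsens_body"] = [
        (tc, hd_body.get(tc, sample))        # fall back to the real-time sample
        for tc, sample in take["xsens_body"]
    ]
    return updated


take = {"xsens_body": [("01:00:00:01", {"hips_y": 92.1}),
                       ("01:00:00:02", {"hips_y": 92.3})]}
hd = {"01:00:00:01": {"hips_y": 91.8}, "01:00:00:02": {"hips_y": 92.0}}
print(swap_in_hd_body(take, hd))
```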

For SIGGRAPH, Strassburger was also looking at IKINEMA RunTime middleware to connect to UE4. The IKINEMA team were very helpful in the early days of getting the Xsens suit working in UE4. However, the Kite & Lightning team moved for a time to Autodesk MotionBuilder, and in the last four months to the new Xsens Live Link plugin for UE4. Strassburger believes the current pipeline with Xsens HD processing is good, but it lacks some “bells and whistles… it doesn’t do interbody penetration, for example”. Strassburger is keen to try integrating the IKINEMA tools back into his pipeline, and to try some new things, but he points out this only applies to the real-time pipeline. As the pipeline does not currently correct for interbody penetration, Strassburger tries to be mindful of this during performance captures, but if he does drop his arms, he normally just rotates the arms up before the final render, “which is not a great solution”, he comments, “but it works for the babies as it makes them look like they’re tougher, and their arms are just kind of always sticking out. But it’s something that I’m still working on.” IKINEMA allows this to be solved in real time. If BebyFace were to make more live appearances in front of an audience, the arm intersection would become much more of an issue. The Kite & Lightning team have also considered writing their own penetration correction tool in Unreal.
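A sketch of that manual workaround, purely as an illustration (this is not Kite & Lightning’s actual tool, nor IKINEMA’s solver): on the offending frames, a fixed lift is added to both shoulder rotations before the final render. The joint names, angle convention and values are all made up.

```python
# Hypothetical arm-lift fixup applied over an inclusive frame range.
ARM_FIX_DEGREES = 25.0   # invented lift applied to each shoulder


def lift_arms(frames: list[dict], start: int, end: int) -> list[dict]:
    """Add a fixed lift to both shoulders over frames start..end."""
    fixed = []
    for i, pose in enumerate(frames):
        pose = dict(pose)
        if start <= i <= end:
            pose["shoulder_L_pitch"] = pose.get("shoulder_L_pitch", 0.0) - ARM_FIX_DEGREES
            pose["shoulder_R_pitch"] = pose.get("shoulder_R_pitch", 0.0) - ARM_FIX_DEGREES
        fixed.append(pose)
    return fixed


frames = [{"shoulder_L_pitch": 60.0, "shoulder_R_pitch": 58.0} for _ in range(5)]
print(lift_arms(frames, 1, 3)[2])
```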

Short film

For Strassburger as a performer, one of the big advantages of the pipeline, for both the game and the upcoming film project, is the faithful real-time monitoring it offers as he works. He believes this is “incredibly important” to his role as an actor. Originally, he had no meaningful real-time feedback while recording his performances; he now has the UE4 BebyFace displayed on a screen next to where he is capturing. Strassburger clearly has technical reasons for wanting to review takes, but he is also learning how key this real-time virtual production pipeline is for him as an actor. The immediate feedback is allowing him to explore and understand “these characters and the kind of mannerisms they should have. It is really exploring the whole sort of ‘Andy Serkis school’ of really defining a character through their body language,” he explains. This process is further enhanced with dynamics, which add “jiggle to the belly and extra secondary movement to the babies. You can start to really develop an awesome connection with the characters and with your body as a performer.”

Strassburger is also already using Unreal’s cloth dynamics in UE4, and he intends to use a lot more of the original game-driven dynamics in his upcoming film.

Strassburger comments that two key aspects of Unreal Engine “have turned real-time game engine rendering into a viable cinematic tool”. The first is depth of field: “It’s such an essential component, because without it I feel like the quality level would be vastly less cinematic… the depth of field in UE4 is fricking amazing.” The second is volumetric rendering, “the kind of rendering that Epic added with that – atmosphere and fog. It’s something that most people don’t really realise is just such an essential part of cinematic lighting, and the way they have integrated it is awesome. Those two things literally turned the engine around in my mind, into this amazing cinematic tool.”

Strassburger is stepping away from game production to make a short film. He is doing this while the game team addresses a series of issues needed for the next stage of game production. “It helps them to not be dealing with new assets, and so it won’t slow the game down if I do the film for three months,” he explains. What also drove the decision is that after months and months of game development, the team came to see that they had developed not only a great game but a rich story space. “I ended up with this massive and really rich story world as I was doing all the character story backgrounds. In a sense the game just turned into this gigantic thing, almost like a novel, in a way. And it has always been a passion of mine to do the cinematic stuff. That’s where I come from and it’s really our background, rather than gaming.” Strassburger believes that by doing the short film now, he will be able to open people’s eyes to this world before the game. It may act as a marketing piece, but for Strassburger, knowing the characters from the film “will make the game more fun for the audience to play”.