Kara is a freshly produced android with artificial consciousness, discovering what it is like to live among humans in Detroit, USA. She struggles to find her place as an autonomous android in a world where androids remain without consciousness and are regarded as practical tools for improving the comfort of humans.
French game developer Quantic Dream is a pioneer of ‘interactive drama’, in which storytelling and action are cleverly intertwined to produce emotive, engaging videogame experiences. In 2013, Quantic Dream founder and CEO David Cage revealed that the company would build upon its work on Heavy Rain and Beyond: Two Souls, “but in a very, very different way”.
The resulting game is titled Detroit: Become Human, and it grew out of Kara, a PS3 tech demo shown at a developer conference in 2012. Though a whole console generation less advanced, the demo still excited gamers. Despite not originally planning to turn the demo into a full game, the company decided to move it into full production, and to do so it turned to IKinema to help produce the 50-plus hours of linear animation while maintaining the high level of realism it desired.
For this newly announced game, IKinema proved to be a key component in solving issues across the entire pipeline. Given the scale of the project, it was hoped the team could secure usable results the first time, reducing the number of reshoots and easing the burden on animators otherwise forced to clean up poor MoCap data.
Animation Director Jan Erik Sjovall is responsible for overseeing the quality of the data, and he works closely with the Set Director to ensure the performance captured on stage is replicated in game. “I make sure the visual quality is up to par with the benchmarks that we set for visual quality and [the director’s] expectations,” he explains. “I have a team of animators – facial animators, body animators, a MoCap team, a shooting team … with these people, we try to achieve those goals.” The challenge with Detroit: Become Human, he says, is that there is simply “a ton of animation”.
“I would say 80 to 90% is MoCap. When we started planning for this massive production, we looked at a strong partner for the retargeting and solving part of the pipeline. Given the data comes from the MoCap studio, we needed a strong process to get the setup onto the 3D meshes that we use in the game.” – Jan Erik Sjovall
Retargeting is a key element for the team, not least because of its use of known actors, such as Willem Dafoe and Ellen Page, who starred in BEYOND: Two Souls. As the actors aren’t always readily available, for certain scenes they use body doubles or stuntmen. “Because the actors are scanned – we scan the bodies, we scan the faces – it becomes necessary to have a really strong process in the pipeline for this step. To get the data from the marker cloud and MoCap to the final data on the 3D mesh we test the solve from the MoCap, and then retarget from that onto the final skeleton, for the final character.”
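The retargeting step Sjovall describes – moving a performance solved on the source skeleton onto the final character’s skeleton – can be sketched in simplified form. This is a rough illustration only, not IKinema’s actual algorithm; the bone-name mapping and the hip-height scaling are assumptions for the example:

```python
# Simplified sketch of skeleton-to-skeleton retargeting: copy joint
# rotations across a bone-name mapping and rescale the root translation
# by the ratio of the two characters' hip heights. A production solver
# (such as IKinema's) does far more, e.g. full-body IK to preserve
# foot contacts and hand placements.

SOURCE_TO_TARGET = {          # assumed bone-name mapping
    "Hips": "pelvis",
    "Spine": "spine_01",
    "LeftUpLeg": "thigh_l",
    "LeftLeg": "calf_l",
}

def retarget_frame(source_frame, source_hip_height, target_hip_height):
    """Retarget one frame of animation onto the target skeleton.

    source_frame: dict bone -> {"rotation": (x, y, z, w),
                                "translation": (x, y, z)}
    Rotations are copied verbatim; only the root translation is rescaled
    so a shorter or taller character covers the right distance.
    """
    scale = target_hip_height / source_hip_height
    target_frame = {}
    for src_bone, tgt_bone in SOURCE_TO_TARGET.items():
        data = source_frame.get(src_bone)
        if data is None:
            continue
        target_frame[tgt_bone] = {"rotation": data["rotation"]}
        if src_bone == "Hips":  # only the root joint carries translation
            tx, ty, tz = data["translation"]
            target_frame[tgt_bone]["translation"] = (
                tx * scale, ty * scale, tz * scale)
    return target_frame
```

The naive rotation copy works only when both skeletons share proportions and joint orientations; the reason Quantic Dream needed a strong solver is precisely that scanned actors, body doubles and game characters do not.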
Sjovall explains that they used to shoot their own MoCap and then outsource the data to specialist motion capture companies to be cleaned, solved and retargeted. “But we found the results sometimes inconsistent… The costs were relatively high and the results often spotty,” he confesses. “We were an extremely interesting client for some of these companies because we had a steady flow, for a year and a half, of MoCap that we were shooting three to four days a week. So you can imagine, it’s a ton of data that comes through.”
Whenever data comes in at a low quality, it falls to the animation team to fix it and solve the shot – although that’s not the best use of an experienced animator’s time or skillset, and it causes its own problems, he explains. “The animation is there to enhance the performance,” Sjovall asserts. “Not to change the performance. Not to create something different. But if the data comes in that dirty, that problematic, then the animators fix the data; they fix it until it works, and then it’s not the same performance.” The solution they came up with was to form an in-house team within Quantic Dream’s MoCap group. Initially they tried a different software package featuring solving and retargeting capabilities, but it still wasn’t up to the standard required. “Internally we created the same problems that we had with the outsourcers,” admits Sjovall.
Fortunately he was already familiar with IKinema. “I knew about it, and I knew it had different solving and retargeting capabilities. It was one of the first [apps] that allowed for custom rigs,” he recalls. “For example, quadrupeds – like animals – are a huge problem when you work with other software.”
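The “solving” Sjovall refers to – computing joint angles that place a limb where the captured data says it should be – is the core of any IK system. As a flavour of what a solver does at its simplest, here is a generic textbook two-bone analytic solve in 2D; it is not IKinema’s full-body method, and the angle conventions are assumptions for the example:

```python
import math

def two_bone_ik(l1, l2, target_x, target_y):
    """Analytic two-bone IK in 2D.

    Returns (shoulder, elbow) angles in radians that place the end
    effector of a two-segment limb (lengths l1, l2) at the target.
    Shoulder is measured from the x-axis; elbow is the bend away
    from a straight limb. A generic law-of-cosines solve, not
    IKinema's full-body solver.
    """
    d = math.hypot(target_x, target_y)
    d = min(d, l1 + l2 - 1e-9)          # clamp unreachable targets
    # Law of cosines for the elbow's interior angle
    cos_elbow = (l1 ** 2 + l2 ** 2 - d ** 2) / (2 * l1 * l2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Angle between the first bone and the shoulder-to-target line
    cos_inner = (l1 ** 2 + d ** 2 - l2 ** 2) / (2 * l1 * d)
    shoulder = math.atan2(target_y, target_x) \
        - math.acos(max(-1.0, min(1.0, cos_inner)))
    return shoulder, elbow
```

A full-body solver generalises this idea to an arbitrary rig with dozens of joints, constraints and priorities, which is why custom rigs such as quadrupeds are hard for packages built around fixed humanoid skeletons.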
From outsourced cleanup to in-house retargeting
Based on research done before he joined the studio, and on Sjovall’s recommendation, the Quantic Dream team evaluated IKinema. “The results were quite amazing, actually,” he declares. “There were some issues that we had to deal with, that come with new software, but the results we had were so much truer to the actual result. I could see on the video reference in the MoCap studio that every subtle move, everything would just be transferred over perfectly. Other things, like extreme posing such as touching my toes or bending backwards – or especially stunts where people roll – were always better with IKinema Action.”
“The results were quite amazing. I could see on the video reference in the MoCap studio that every subtle move, everything would just be transferred over perfectly.”
Even though the results proved impressive, there was still resistance to cancelling the outsourcing and moving to a new workflow so late in the project. However, Sjovall knew what it would cost to get the data retargeted, and when the estimates came in from the outsourcers it was clear they should handle the MoCap data in-house using IKinema. Initially the task fell to the animation team, but Quantic Dream is now building a dedicated retargeting team, says Sjovall. “We’re recruiting for a very specific profile – a junior animator with a bit more technical aptitude, who we believe we can train in the art of solving and retargeting.”

The end result of moving to IKinema completely in-house has been not only a significant increase in quality but a “dramatic” saving in time. “Data comes in so much cleaner, and therefore the animators can concentrate on just brushing over the data as needed,” Sjovall explains. “The initial performance is already there, so they can concentrate on things that haven’t come in – for example, hands or the head controls. I would say the overall quality and happiness of the animators working with this data has gone up quite dramatically.”

Not only that, artists and directors are happier with the results too. “The worst thing an actor could say to me is ‘I don’t move like that!’. Sometimes I have to say, ‘yes you do’, but that’s another problem,” Sjovall says wryly. “To me, it’s really important that the animators concentrate on the work that animators do.”