Coco VR

One of the most interesting VR talks at FMX this year was Magnopus Co-Founder Alex Henning discussing the Coco VR project for Disney-Pixar. With the new Pixar film, Incredibles 2, due out soon, we look back at the engineering solutions in the Coco VR application.

In 2017, Disney Pixar released the animated film Coco to great critical acclaim and commercial success. To support the release and further celebrate the film, they also released the studio’s first consumer virtual reality experience, Coco VR. 

Magnopus collaborated with the team from Pixar to create a rich social experience far beyond the scope of the typical movie marketing piece. Magnopus Co-Founder Alex Henning shepherded the project at the Los Angeles-based company from pitch to completion. At FMX 2018 in Germany, Henning presented how the company worked with the creative and technical teams of Disney to translate the world of Coco to VR.

Henning pointed out that at the beginning of the project no one was quite sure what the Coco VR experience would be, but they did decide early on that they wanted it to be ‘social’. Given that the material was from Pixar and Disney, the other key aspect for Magnopus was to make it fun, and something people would want to do more than once. It needed to be entertaining and engaging.

To be awarded the final project, Magnopus built a prototype over about three months with a team of three to five people, which aimed to address the major areas the team had identified as key to the success of the final project:

  • Technical visual development, and
  • Experience and interactivity design

The main project took an additional eight months, with a team that started with eight people but grew to 20 by the end.

Technical visual development

Central to the technical success of the project was answering two key questions:

  1. When the project is completed, will it look good?
  2. Can Magnopus work with the assets from the film?

The latter of these started with a more basic question: could Magnopus even open a file from the Pixar production animation team? It turned out the answer was no. Pixar uses Presto and other proprietary in-house animation tools, so Magnopus could not access the assets easily. The conversion from heavy-duty film assets to lightweight real-time assets was never going to be simple, but in doing the conversion neither Magnopus nor Pixar wanted to compromise the integrity of the characters.

Characters

As there was no direct export from Presto into a game engine, Magnopus faced a challenge in translating the characters faithfully to the Coco VR experience. The team used animation point caches, which were then used to puppeteer a properly skinned skeletal character that was faithfully built to match the source character. Initially, the team was helped by the fact that they were dealing with the afterlife skeleton characters, who did not have skin or hair. Once the full project was underway, the team then had to explore with Pixar more complex characters such as Miguel, who is fleshy, human and much more complex. “Pixar, as you would expect from who they are, were extremely precious about the fidelity and animation quality of the characters,” explained Henning. To make sure the character work was at the Pixar standard, the team did a series of side-by-side shots, comparing a clip of the rendered Pixar RenderMan characters with a screen recording from the game engine. “It was a challenge to get there. Pixar was not just looking at Miguel’s hands, his face or his gestures – they were looking at the folds of his clothing, and whether his sweatshirt was ‘on model’ in every frame, for example. These were all things we had to work out.”
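As a rough illustration of that point-cache puppeteering idea, here is a minimal Python sketch, assuming hypothetical `Bone` and `BoneCache` types rather than Magnopus’s actual pipeline:

```python
# Hypothetical sketch (not Magnopus's pipeline): drive a runtime
# skeleton from per-frame transform caches exported from a film rig.
from dataclasses import dataclass, field

@dataclass
class Bone:
    name: str
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0, 1.0)        # quaternion

@dataclass
class BoneCache:
    name: str
    frames: list = field(default_factory=list)    # (position, rotation) per frame

def apply_cache(bones, caches, frame):
    """Pose the skeleton for one frame by copying cached transforms
    onto the matching runtime bones."""
    by_name = {b.name: b for b in bones}
    for cache in caches:
        bone = by_name.get(cache.name)
        if bone is not None:
            bone.position, bone.rotation = cache.frames[frame]
```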

There were varying levels of characters in the project, at differing levels of complexity. The most important were the foreground characters, which were full-body IK, unusual for VR CG characters. One aspect that was hard to match in any close-ups of these characters was the complex Pixar RenderMan shaders. While the team had a plan to use projected textures for the environments, this would not work for the close-to-camera characters, which had to be modelled as full 3D assets. Luckily, the skeleton characters avoided the need to address issues such as subsurface scattering and hair. Clearly, no VR system can match RenderMan final film quality when rendering in real time, so the decision was made not to let the one main human character, Miguel, get close to camera.

Miguel is cleverly only ever seen at a slight distance, which avoids his character looking less than full Pixar quality. “We are working constantly on techniques that can get us closer and closer,” commented Henning. Miguel is a balance between a fully dynamic character and one defined by hero animation from Pixar. For example, during an animated move, he is adjusted to maintain eye contact with the user, even as the user moves around inside their capture volume. “We override his hero animation to make sure he maintains an eye-line with you as you step around. The more Miguel did not need to be a dynamic reactive character, the more we could cache out hair sims or cloth sims, to allow the user to get closer.” But if the team had needed dynamic interactive lighting on Miguel, the VR team would have been at a huge disadvantage compared to what Pixar could get from RenderMan (which can take hours per frame to render, versus the 11 milliseconds Magnopus had in VR).
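A minimal sketch of what such an eye-line override could look like, assuming a simple linear blend between the authored gaze direction and a direct look-at toward the user (the `look_at_blend` function below is an illustration, not the production solve):

```python
# Blend a hero animation's authored gaze toward the tracked user.
import numpy as np

def look_at_blend(authored_forward, head_pos, user_pos, weight=0.7):
    """Return a forward vector partway between the authored gaze and
    a direct look-at toward the user's head position."""
    to_user = np.asarray(user_pos, float) - np.asarray(head_pos, float)
    to_user /= np.linalg.norm(to_user)
    blended = (1.0 - weight) * np.asarray(authored_forward, float) + weight * to_user
    return blended / np.linalg.norm(blended)
```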

Environments

The environments for Coco VR were particularly challenging given the vast amount of geometry and the number of lights that Pixar used to create the afterlife world in Coco. The feature film “had 10,000 dynamic lights. Anyone who has worked in real time knows that these are not numbers or statistics that translate easily to game engines, where you can handle very small amounts of geometry and maybe one dynamic light, especially when working in VR,” Henning explained. The Magnopus team were faced with taking something that took “perhaps 200 hours a frame and find a way to express it well in under 11 milliseconds in stereo,” he added. (That 11 millisecond budget follows from the 90Hz refresh rate of the headset: 1/90th of a second is roughly 11.1 ms.)

The aim was to make users feel like they were standing inside the movie. The early test set was the Clerk’s office, which did not actually end up in the final Coco VR experience. Pixar also gave the team an exterior location, the central plaza, to test. Magnopus asked Pixar to render out some spherical 360° images from these locations, so that the team could try spherical projection inside the game engine.

Spherical projection has been used in VFX for some time. It involves projecting high-quality imagery over simplified geometry from a key target point. As long as the viewer stays close to the projection source, the illusion is created that there is much more detail in the scene than the geometry actually carries. The approach bakes a lot of detail into high-resolution projected textures, and with some clever work the net result is a world that appears vastly richer than its underlying geometry.
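At its core, a spherical projection is a direction lookup: the color of a surface point comes from the spherical render along the ray from the projection point. A minimal Python sketch of that mapping, assuming an equirectangular (lat-long) source image (an illustration, not the production shader):

```python
# Map a surface point to equirectangular UVs relative to the point
# the spherical image was rendered from. No authored UVs are needed.
import numpy as np

def equirect_uv(surface_pos, projection_center):
    d = np.asarray(surface_pos, float) - np.asarray(projection_center, float)
    d /= np.linalg.norm(d)
    u = 0.5 + np.arctan2(d[0], d[2]) / (2.0 * np.pi)   # longitude -> U
    v = 0.5 - np.arcsin(d[1]) / np.pi                  # latitude  -> V
    return u, v
```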

Henning commented that this approach isn’t normally used “that much in VR but we decided to leverage this approach heavily, as the source footage was so beautiful and complex. We’d never be able to translate that lighting or shading directly into real time, but we could take a RenderMan render and apply it to the real-time geometry so that most of what went into that beautiful look is intact.” Magnopus had used the approach previously for a special Disney project, but they had never done it on the scale required for Coco VR.

Luke Schloemer, Lead 3D Artist, was the primary interface between the Pixar team and the Magnopus technical team. At GDC 2018 in San Francisco, Schloemer outlined the process in detail. Magnopus had previously spent three months on a Moana VR project for Disney around the time of that film’s release, and developed many new techniques on it, but unlike Coco VR, that project was for Gear VR and the assets came from Disney’s Hyperion renderer. Coco VR was much more ambitious, as it involved the user walking around much larger environments.

This approach of spherical projection is much richer than 360° video, which would have no parallax in VR. Unfortunately, it also meant that if a user walked too far from the projection point, the illusion would break. As it happens, users in VR can’t walk that far: most are seated or confined to a fairly small capture volume on the Oculus. Regardless, the team had to allow users to move and progress through the virtual world. At this point the team came upon a solution that addressed both issues: in the VR experience, users ‘teleport’ forward to established points. This approach is called spherical multi-projection. With it, a user benefits from many of these projected images simultaneously: depending on where they are in the VR experience, each pixel is fed from one to nine of these spherical mappings at once. The decision as to which images any one pixel draws on is a function of the user’s exact position, the angle of view or incidence to that geometry, which objects are occluding it, and so on.
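A hedged sketch of what such per-pixel weighting might look like, assuming a simple distance-and-facing heuristic (the production weighting, which also folds in occlusion, is certainly more involved):

```python
# Blend colors sampled from several spherical projections for one
# pixel; nearer, better-aligned probes contribute more.
import numpy as np

def blend_projections(surface_pos, normal, probe_positions, probe_colors):
    weights = []
    for p in probe_positions:
        to_probe = np.asarray(p, float) - np.asarray(surface_pos, float)
        dist = np.linalg.norm(to_probe)
        facing = max(float(np.dot(normal, to_probe / dist)), 0.0)  # grazing angles fade out
        weights.append(facing / (dist * dist + 1e-6))              # nearer probes dominate
    w = np.asarray(weights)
    w /= max(w.sum(), 1e-6)
    return sum(wi * np.asarray(ci, float) for wi, ci in zip(w, probe_colors))
```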

“One success we figured out was that you could do a spherical projection entirely in the shader, so there is no UV required. It did make the shader more complicated, but it did allow for very fast iteration,” commented Schloemer. The team then explored blending multiple projections in the shader (since there were no UVs).

As the final shaders did all the heavy lifting, there were no separate render layers; the team could blend up to nine 360° projection points together directly in the shader.

Zavier Gonzalez, Senior Rendering Engineer at Magnopus, headed the shader team on Coco VR. Pixar rendered out separate passes for specular roughness, occlusion and basic reflections, which helped with shader integration. “Having tested visibility from a point for a pixel (from a probe depth cubemap), and then having worked out the best color, the code could then add on top specular, reflection, fog, just about anything,” he explained. Having the reflections laid on top gave view-dependent aspects that helped bring the world to life and make the Coco world seem more than just a simple projection.
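The visibility test Gonzalez describes is analogous to shadow mapping: a pixel may only use a probe’s projection if the probe actually ‘saw’ that surface. A minimal sketch, assuming a depth value pre-baked into the probe’s cubemap:

```python
# Shadow-map-style visibility test against a probe's depth cubemap.
import numpy as np

def probe_visible(surface_pos, probe_pos, stored_depth, bias=0.01):
    """stored_depth: the depth the probe recorded along the direction
    of this surface point (sampled from its depth cubemap)."""
    dist = np.linalg.norm(np.asarray(surface_pos, float) - np.asarray(probe_pos, float))
    # If something nearer blocked the probe's view, this surface was occluded.
    return dist <= stored_depth + bias
```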

The advantage of this approach is that it allowed the Pixar team to create all the baked lighting themselves. It was also very fast: for both eyes it took only 1.6 milliseconds, which left time for extra elements, effects and the characters to be added into these environments. It was memory-intensive, as the team ended up having to use high-resolution textures to maintain quality. “But I think the final quality managed to speak for itself,” Gonzalez commented proudly.

Stereo display

In terms of the stereo display, the team worked in zones. “We figured out that with the current level of resolution in consumer headset displays there are limits on where you can perceive intra- or inter-object parallax,” comments Henning. This means that objects close to the viewer need to be fully stereo, but there comes a point where the stereo that matters is between an object and its background, and the object itself does not need to exhibit internal parallax. “An example of this is the cobblestones underneath your feet; they are always going to be really close to the viewer, so those are completely real time, with displacement shaders that our team made, so that you have parallax on individual pebbles compared to the grout in between them. But as you get further and further away you get simpler and simpler geometry,” he explains. By the time the user is looking at the houses built on top of each other in the distance as skyline structures, there is still geometry, but it is very crude and low frequency. “Just enough to get a bit of curvature,” he adds.
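Back-of-envelope numbers illustrate why distant geometry can be crude: binocular disparity shrinks with distance, and once it falls below roughly a pixel on the display it is imperceptible. The figures below are illustrative assumptions, not measured headset specs:

```python
# Approximate binocular disparity in display pixels vs. distance.
import math

IPD = 0.064          # average interpupillary distance, metres
PIX_PER_DEG = 11.0   # assumed pixels per degree for a 2017-era headset

def disparity_pixels(distance_m):
    disparity_deg = math.degrees(IPD / distance_m)   # small-angle approximation
    return disparity_deg * PIX_PER_DEG

# Past ~40 m the disparity drops below a pixel, so geometry can be
# nearly flat, "just enough to get a bit of curvature".
for d in (1, 5, 20, 40):
    print(f"{d} m: {disparity_pixels(d):.2f} px")
```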

Audio

The sound in the VR experience was a collaborative effort between Pixar, Magnopus and a company called Source Sound, which Magnopus had worked with in the past. The team always had a Pixar audio engineer reviewing the work. While the team wanted to use as much of the original sound design and audio as possible, the audio, like the visuals, is interactive. “The audio of Hector is attached to Hector’s head,” says Henning. “If I go to the left or right, his audio moves, which is a combination of the spatial audio features that are in most game engines today, coupled with Oculus’ system for spatializing in their headset.” This provides a nice surround effect in the experience, even with only a pair of headphones.
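Real spatializers such as Oculus’ use HRTFs, but the simplest form of the head-tracked panning described here can be sketched as position-derived stereo gains (purely illustrative):

```python
# Derive left/right gains for a source attached to a character's
# head, from its position relative to the listener (y-up convention).
import math
import numpy as np

def stereo_gains(listener_pos, listener_forward, source_pos):
    to_src = np.asarray(source_pos, float) - np.asarray(listener_pos, float)
    to_src /= np.linalg.norm(to_src)
    right = np.cross([0.0, 1.0, 0.0], np.asarray(listener_forward, float))
    pan = float(np.dot(to_src, right))          # -1 = hard left, +1 = hard right
    angle = (pan + 1.0) * math.pi / 4.0         # constant-power pan law
    return math.cos(angle), math.sin(angle)     # (left_gain, right_gain)
```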

Experience Design and Interactivity

Magnopus had multiple artists working on the test application at once, both for time reasons “and to create some healthy competition,” joked Henning. Magnopus is engine-agnostic: in the early days of testing, the team used Unity, Unreal and other tools, regardless of the final target. “At this stage a lot of the tests are going to be thrown away anyway, so you just want to let the artists use whatever tools they are most comfortable with and work with the fastest. The idea here is to move as fast as possible… and hone in on what will work as soon as possible.” The team was also artistically neutral, in the sense that these tests were not designed to look polished or good; “in fact we go out of our way to make it look bad,” explains Henning. Magnopus does this in the early stages so people will not be distracted by how something looks while they are trying to judge whether it is fun or easy to use. “It is human nature that we are distracted by visuals, so it is actually better at this early stage to be working with grey boxes and plain grids,” he explains.

Magnopus had already been working on interaction tools, social networking technology and object-interaction social mechanics independently of Disney and Pixar, “so this was a great opportunity for us to explore using it further,” Henning explains. This core research became the wide-area-network social interaction base of the Coco VR experience, which elevates it beyond a simple VR game. It includes not only shared visual experiences but voice over IP driving moving jaws on the characters, and many other innovative interactive devices such as virtual clothes, digital mirrors and digital selfies.
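One common way to drive character jaws from voice over IP, and a plausible minimal sketch of the idea (the shipped implementation is not documented here), is to map the short-term RMS amplitude of the incoming audio to a jaw-open angle:

```python
# Map VoIP buffer loudness to a smoothed jaw-open angle.
import numpy as np

MAX_JAW_DEGREES = 18.0   # hypothetical full-open jaw angle
SMOOTHING = 0.6          # temporal smoothing factor to avoid jitter

_prev_angle = 0.0

def jaw_angle(voip_frame):
    """voip_frame: one buffer of decoded VoIP samples in [-1, 1]."""
    global _prev_angle
    rms = float(np.sqrt(np.mean(np.square(voip_frame))))
    target = min(rms * 4.0, 1.0) * MAX_JAW_DEGREES   # gain of 4 is an assumption
    _prev_angle = SMOOTHING * _prev_angle + (1.0 - SMOOTHING) * target
    return _prev_angle
```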

The Magnopus team knew the advanced prototype was succeeding when Disney’s Ed Catmull was invited to join an interactive session. Catmull had previously expressed concern about VR as a linear narrative storytelling form, so getting his approval on the project was key. Magnopus knew he was not against VR per se, but he believed just translating a film into a VR experience was not the best option. To test the prototype, Catmull joined a shared Coco VR session: he was in Emeryville and Magnopus was in LA. “We all played together for about 15 minutes… and then we got this quote from Ed: ‘this is weird, but it is awesome – I think you should continue’.” Actually, Catmull did not say this directly to Magnopus; the mic was still live after he took off his Oculus headset, and so the LA team heard him say it privately to the Emeryville Pixar team. “It was some brand new form or type of industrial espionage,” Henning laughingly recounts.

The final production version was produced in Unity. In the final experience, a user can choose either a single or multi-player experience, and follow the magical alebrije into the luminous world of Coco filled with lovable characters and beautiful settings from the film.

From a production point of view, there was a desire to have a script for the project. But Magnopus felt that rather than building a linear narrative, they were building a space for people to create their own narrative, not unlike visiting a theme park. “We were trying to give them agency, freedom to move along their own path, explore the different attractions at their own speed, depending upon what they are interested in. From this point of view the linear narrative of a script changed into being more of a set of user journeys,” explains Henning. Magnopus continued experimenting and trying new ideas right until the end of the project.

The final experience featured:

  • Meet Miguel, Ceci, and Hector in various locations from the movie.
  • Join friends in multi-player mode, complete with voice over IP.
  • Express your own style with clothes and props in Ceci’s costume shop.
  • Use the mirror, and capture selfies in the photo booth to a PC file.
  • Take a ‘gondola ride’ and see the gorgeous city from above.
  • Star alongside Hector in a musical celebration of Dia de los Muertos!

As one user in the US posted online when the experience was released on the Oculus store:

“What a VR experience should feel like. The setup is quick and intuitive, and the world is strangely immersive. After staying and enjoying art for close to half an hour, I stumbled outside where skeletons were playing music and the crowd was bumping along. After a while, I even went close to a chair in VR, grabbed my chair in my room, and sat and just enjoyed the world.”