Walt Disney Animation Studios is clearly hitting it out of the park right now. Big Hero 6 was a huge success, while Frozen remains the highest-grossing animated film of all time. Creatively, Disney Animation is telling highly compelling stories, but they’re also investing heavily in incredible artistry and tech – seen most clearly in the creation of their own renderer, Hyperion (which fxguide covered in depth previously, here). That same approach of taking advantage of both the art and the tech of animation carries through in the studio’s latest adventure, Zootopia, directed by Byron Howard and Rich Moore, where major solutions for hair, fur and vegetation were crucial to pulling off the all-mammal world of the film.
Down to the follicle level
Of course, it’s not as if Disney Animation hadn’t tackled the complexities of hair or fur before. There was Tangled’s lead character, of course, and then a myriad of others in, say, Wreck-It Ralph and Frozen. Disney is also behind the instancing tool XGen (now licensed to Autodesk), which is used widely for hair and fur.
But the studio hadn’t yet taken on a project that featured so many animals, with so many different types of hair – often with several species all in the same frame – like Zootopia. The film features 64 different species, equating to about 800,000 different character models. The lead characters Hopps (a rabbit) and Wilde (a fox) would require 2.5 million hairs each, a giraffe character had 9 million hairs and even a gerbil needed 480,000.
Still, the number of hairs is relatively meaningless unless you also have an efficient means to groom and control them. That’s what Disney Animation sought to do this time around. But first, they had to get down and dirty with all the kinds of hair and fur that would be required in Zootopia. And for that the production embarked on a series of animal research visits to find out exactly what fur really looked like.
“We visited lots of animal parks and a safari park,” visual effects supervisor Scott Kersavage told fxguide. “We went to a place where you could get a lot closer and be in contact with the animals. We went to natural history museums where you could examine fur right up close and even get it under a microscope to understand what makes fur and how it is different from what we’ve thought about in the past.”
The result of that research was an analysis of what fur looked like and how it behaved at the follicle level. “You suddenly realize that some hairs can be super opaque,” notes Kersavage. “We have a honey badger in the film and it has some white fur on it and it’s very opaque. But then we also have polar bears and their fur is actually more translucent, almost transparent, and you can’t really see any color in it. It’s more about the way the light actually goes through the polar bear’s fur and scatters through the rest of the fur that creates the white effect.”
On past films, Disney had ‘got away with’ several cheats for generating hair and fur. But with this new desire to replicate hairs at the follicle level and not make their animals look like “stuffed toys,” a new approach to hair was needed. “It would have been challenging for us to rebuild the actual hair follicles for every species,” admits Kersavage. “So we decided to take a shader approach to this and try to see if we could replicate, through some shader principles, the ways we can create fur that has the right opaqueness or the light passing through.”
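The shader principle Kersavage describes – dialing a fiber between opaque and translucent rather than rebuilding follicles per species – can be sketched as a toy model. This is illustrative only, not Disney’s actual shader; the function name and parameters are assumptions.

```python
def fiber_radiance(base_color, light_intensity, translucency):
    """Blend reflected vs. transmitted light for one hair fiber.

    translucency = 0.0 -> fully opaque fiber that shows its own color
                          (the honey badger's white fur)
    translucency = 1.0 -> light passes through and scatters onward
                          (polar bear fur, with little intrinsic color)
    """
    # Opaque component: the fiber reflects light tinted by its own color.
    reflected = tuple(c * light_intensity * (1.0 - translucency)
                      for c in base_color)
    # Translucent component: energy carries on through the fiber, to be
    # scattered again by neighboring fibers in a real renderer.
    transmitted = light_intensity * translucency
    return reflected, transmitted
```

A single translucency control per species is far cheaper than modeling each species’ follicle structure, which is the trade-off the quote points at.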
Movement of hair and fur had also previously been achieved with somewhat limited controls. “Typically,” explains Kersavage, “if we wanted to pass wind through the fur of the characters, we had some effects modules we used to create noise patterns and be able to send ripples through. What we wanted to explore instead was how we could get more control, going back to the follicle level, to be able to affect individual hairs at that moment.”
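The shift from patch-level ripples to per-follicle control can be pictured as each hair root sampling a wind field independently. A minimal sketch, with a cheap periodic stand-in for real noise – the function and its parameters are assumptions, not Disney’s tooling:

```python
import math

def follicle_wind_offset(root_pos, t, strength=1.0, freq=2.0):
    """Displacement for a single hair root at time t.

    Each follicle samples the wind field at its own position, so
    individual hairs respond independently instead of a whole patch
    rippling in unison.
    """
    x, y = root_pos
    # Cheap periodic stand-in for a proper noise function (e.g. Perlin).
    phase = math.sin(x * freq + t) * math.cos(y * freq + 0.7 * t)
    return strength * phase

# Five neighboring follicles get five different offsets at the same instant.
offsets = [follicle_wind_offset((i * 0.1, 0.0), t=0.5) for i in range(5)]
```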
The answer for both hair creation and movement came essentially via a continued adaptation of Disney’s XGen work. “XGen has been a way to do geometry instancing, basically,” states Kersavage. “It was also where we dialed in the materials and the shading, so we added the ability to change the properties shader-wise. That work done for the shader was leveraged off the BRDF that we did back on Wreck-It Ralph.”
Disney also brought back an earlier tool it had created for grooming the hair. “We had this tool in our past that was part of XGen that was called iGroom and we brought that back to life and were able to use it on more of a granular level,” says Kersavage. “It essentially lets you comb the hair and put in different kinds of patterns, say for the animals that have more cowlicks, and then different kinds of roughness and clumps.”
And then there’s the rendering, handled in Hyperion. Disney had already found that the path tracer allowed them to include an almost unlimited amount of geometry in their scenes – both for environments and characters. However, the fur and hair were a slightly different story. “With Hyperion being a path tracer, it wanted to bounce around inside the hair and scatter throughout,” says Kersavage. “That can get very complicated, so the big thing that the team did was try to figure out how to optimize the hair itself.”
In Hyperion, during the rendering process, the scatter would occur and “then we’d get back to a point of diminishing returns,” explains Kersavage. “We had to work out how to average the large number of bounces into something that’s a little bit more digestible and so more efficient. Hyperion could do all that averaging for us and then we’d know what our cut-off was, based on species and therefore hair density.”
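The “diminishing returns” idea can be sketched with a simple model: if each scattering event in the fur retains roughly a fixed fraction of energy (an effective albedo, which a denser or whiter groom would push higher), the bounce count at which further scattering stops mattering falls out of a geometric series. A minimal sketch of the principle, not Hyperion’s implementation:

```python
def bounce_cutoff(albedo, tol=1e-3):
    """Smallest bounce count after which the remaining scattered energy
    is below `tol`, assuming each bounce retains `albedo` of the energy.

    The tail of the geometric series after n bounces is
    albedo**(n + 1) / (1 - albedo).
    """
    n = 0
    while albedo ** (n + 1) / (1.0 - albedo) >= tol:
        n += 1
    return n

sparse = bounce_cutoff(0.5)  # darker / sparser fur converges quickly
dense = bounce_cutoff(0.9)   # whiter / denser fur needs far more bounces
```

Precomputing such a cut-off per species is one way to read Kersavage’s comment that “we’d know what our cut-off was, based on species and therefore hair density.”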
Rendering the complex hair threw up another challenge: artists would not see the final results of their animation and hair sims until the final renders were delivered. What Disney needed was a way to get an idea of how the hair would behave and how it might impact the performance – and to get it fast. “In the past,” says Kersavage, “what we’ve done is have some proxies with basic shapes. That can get you half of the way there, but when you really get into the really dynamic grooms, that breaks down a lot.”
So Disney capitalized again on earlier tools it had developed in hardware shading and GPU rendering with its Nitro GPU solve. Nitro allowed animators, in particular, to generate near real-time playblasts of their hair and fur for immediate feedback. “You could say, ‘Give me 10 per cent of the groom or give me 100 per cent of the groom,’ based on how much you wanted to see in playback in the shot,” outlines Kersavage. “That might be based on how many characters you had in the scene – if you had just one character you might crank it all the way up, or if you have 20 characters you’re trying to manage then you dial it down accordingly.”
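The “percentage of the groom” dial can be pictured as deterministic subsampling of the hair set, so the same subset shows up on every playback frame. This is a sketch of the idea only; the function and its behavior are assumptions, not the Nitro API.

```python
def groom_subset(hair_ids, percent):
    """Deterministically keep roughly `percent` of the hairs (1-100)
    by striding through the list, so playback stays stable frame to
    frame instead of flickering between random subsets."""
    if percent >= 100:
        return list(hair_ids)
    stride = max(1, round(100 / percent))
    return [h for i, h in enumerate(hair_ids) if i % stride == 0]

full = list(range(2_500_000))      # e.g. Hopps' 2.5 million hairs
preview = groom_subset(full, 10)   # ~10% for a crowded 20-character shot
```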
“That really gave animation a lot of control over what they were going to see in the final render,” adds Kersavage. “Another part of that too is that the hair would mask the performance. If the character had a smile or a frown, the corners of the mouth specifically sometimes get lost underneath the fur. If you’re just looking at the solid surface view of it, it all looks fine, but as soon as you put the fur on it, the subtle performance is lost. So with the GPU tools we were able to provide animators with, none of it actually got lost, and they realized perhaps they’d need to push a facial expression a little bit further to be able to see the expression through the hair itself.”
It’s a jungle out there
At one point, Zootopia’s action moves into the rainforest, where Hopps and Wilde continue their detective work. The forest is a vast landscape of trees and vegetation, intertwined with living quarters and a constant supply of rain (some from giant sprinklers) and mist.
The environment was always envisaged as dense, but just how dense could it actually be? In many ways that was left to the artists at Disney Animation and the continual R&D and updates to tools they’d been creating. One in particular was Bonsai, a vegetation-generation tool. The team also drew on procedural tree growth used on Tangled. But what really changed Disney’s approach to how dense the rainforest could be was being able to take advantage of Hyperion.
“Early on,” notes Kersavage, “I wanted to push Hyperion to see how much we could do with vegetation. How much geometry can we actually stuff through it? We did this test and kept making it larger and larger, and the next thing was we had 7 million trees that we were able to fly through! That made me feel like the complexity was not necessarily going to be an issue with the rendering. The complexity really came from an artistic point of view in being able to place all of that in such a fashion that was consistent with the production design. We started small by building up individual pieces with branches and designs and as we were going the thing got larger and larger. They’d ask, ‘Can we add 100 more trees?’, and we did!”
Kersavage also pushed for the trees, leaves, branches, vines, grass and all the vegetation to, like the fur, always be moving. “We had some procedural tools that allowed artists to set up a slow, medium, fast, extra fast version of trees, and when we got to the point where we could finally take it from the base version and make it live in the shot, you could dial in the parameters and say, ‘Oh, I want this tree to move faster.’ That really helped a lot. And then we leveraged that not just in the rainforest but everywhere else in the film too. You’ll see a lot of vegetation, whether it’s ivy on the side of buildings or grass that’s moving – we used the same techniques.”
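The preset-plus-dial workflow for vegetation motion might be sketched like this, with the preset names taken from Kersavage’s description and everything else (values, function, parameters) invented for illustration:

```python
import math

# Preset base frequencies (Hz); the names come from the description
# above, the values are assumed for illustration.
WIND_PRESETS = {"slow": 0.25, "medium": 0.5, "fast": 1.0, "extra_fast": 2.0}

def tree_sway(t, preset="medium", scale=1.0, amplitude=0.1):
    """Sway angle (radians) for one tree at time t.

    `scale` is the per-shot dial: 'I want this tree to move faster.'
    """
    freq = WIND_PRESETS[preset] * scale
    return amplitude * math.sin(2.0 * math.pi * freq * t)
```

The same function would serve ivy or grass with different presets and amplitudes, which matches the point that the technique was reused well beyond the rainforest.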
For the rain of the rainforest, Disney relied on the idea of volumes. “We realized the thing that makes any environment look very specific to that area is the amount of water vapor or moisture in the air,” says Kersavage. “It’s about the way the light passes through that water vapor. So we generated lots of volumes. There’s one where it’s just an overall volume that we can place into a shot, all the way down to individual pieces, right down to the steam that’s coming out of the tree.”
Not only that, the rainforest scenes feature shots of individual rain drops falling – at different levels in the foreground, mid-ground and background – which also helped sell the stereo presentation of the film. “The water droplets actually hit the characters,” adds Kersavage, “and for that we had procedural pools of water you’re going to actually see when they’re running out onto the gondola platform – that applied to the whole rainforest. And then there’s the condensation and dripping and droplets hanging on the leaves when the rain actually stops.”
How Disney Animation dealt with fur and hair, and with vegetation, are really just two examples of the tech innovations in Zootopia. It could be said the studio has really hit its stride in pioneering creative, artistic and technical methods to help in telling great stories, something Kersavage also acknowledges. “With each film we try to get better and better at efficiency and what we can achieve artistically. With each one of these things we gain that confidence of saying ‘nothing is impossible’. We’re able to take whatever vision a director has going forward and know that we can make something cool out of that and come up with something that’s fantastic.”