Vicon House of Moves recently helped bring to life 30 animated superheroes for an NHL and Stan Lee venture known as The Guardian Project. We find out how they used mo-cap and the Epic Games Unreal Engine to make it possible.

The Guardians together - previs

With names like Oiler, Thrasher and Red Wing, the 30 Guardian characters each have special powers derived from the NHL teams and cities they represent. Guardian Media Entertainment, a joint venture between the National Hockey League and SLG Entertainment led by Stan Lee of POW! Entertainment, looked to Vicon House of Moves to produce Guardian animation for broadcast, to display at stadiums, for a virtual reality experience and to screen online.

The Guardians together - final shot

A centerpiece of the project was an animated short, designed to showcase all of the characters, that played on screens inside the RBC Center stadium in Raleigh, North Carolina during the NHL All-Star game and aired on the Versus and CBC networks.

Here we talk to House of Moves vice president of production Brian Rausch and House of Moves director Peter Krygowski about the mo-cap shoot for the animated short, how the Epic Games Unreal Engine was used for rendering, and the challenges of multiple deliverables.

fxg: Can you tell me how this project came to Vicon House of Moves?

Brian Rausch: Well, interestingly, it didn't come to us like a normal project, where we might have say a two hour movie and there'll be 35 minutes of animation that we need to shoot - that's a very defined deliverable. Guardian Media came to us and said, 'we have an idea, we have some character designs, and we need to create a brand, we want it to be animated, online, on TV and also be live in hockey stadiums.' It just meant there were so many different formats, resolutions and deliverables.


- Watch a behind-the-scenes reel for The Guardian Project.

fxg: How did you start to turn the concept into CG characters and animation for the All-Star game?

Peter Krygowski: Guardian Media had established a pretty strong backstory for the whole project. They created 30 characters from 30 different cities. They developed whole stories and even things like comic books around them. The All-Star game came about because Adam Baratta (Guardian's Chief Creative Officer) was trying to figure out a way to really showcase the characters and deliver on them and bring them to the world in a big fashion. He needed to bring 30 characters all at once to a huge broadcast audience.

fxg: Can you talk about the process of creating character assets for the animated short?

The Islander - previs

The Islander - final shot
Krygowski: It involved us taking their 2D drawings and artwork, bringing them to modelers at House of Moves and having them flesh out the characters. A lot of them were not purely humanoid, so we needed to handle specific things like flying with wings, backwards-facing legs and mechanical attachments. Generally we were using Maya and ZBrush for modeling and then MotionBuilder for the animation out of mo-cap.

Rausch: The front end of the pipeline was basically your standard CG pipeline, right up until we introduced the Unreal Engine into the overall pipe. We hadn't used Unreal to produce images as essentially a renderer before. As far as I know there aren't many places doing that for non-real-time animation.

fxg: So how did you implement Unreal into your pipeline?

Rausch: First up, we wanted to increase the quality of the characters, so we jammed more polys and rigs into them, stuff that you wouldn't try to do in real time if you were using Unreal. Maya supports advanced deformers so that when you rig your characters and they start moving around they start looking pretty nice and clean. Unfortunately, most of those advanced deformers don't import into Unreal because they don't run in real time. So we had to rig in such a way to get high quality deformations but without necessarily the highest quality rigging tools.
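The constraint Rausch describes, that real-time engines accept skin weights but not Maya's advanced deformers, comes down to how cheap skinning is to evaluate: each deformed vertex is just a weighted sum of that vertex as moved by each influencing joint. A toy 2D sketch of linear blend skinning (all names here are illustrative, not any actual Unreal or Maya API):

```python
# Toy 2D linear blend skinning: the deformation model a real-time engine
# can evaluate per vertex per frame, unlike offline-only advanced deformers.
# Function and variable names are illustrative, not Unreal or Maya API.

def skin(vertex, influences):
    """Weighted sum of the vertex as transformed by each joint.
    influences: list of (weight, transform) pairs, weights summing to 1."""
    x = sum(w * t(vertex)[0] for w, t in influences)
    y = sum(w * t(vertex)[1] for w, t in influences)
    return (x, y)

identity = lambda p: p              # joint still in its bind pose
rot90 = lambda p: (-p[1], p[0])     # joint rotated 90 degrees about the origin

# An elbow-like vertex influenced half by each joint lands halfway
# between the two transformed positions.
print(skin((1.0, 0.0), [(0.5, identity), (0.5, rot90)]))  # (0.5, 0.5)
```

Because the whole operation is a per-vertex weighted sum, it maps directly onto GPU hardware, which is why engines run it live while richer deformers have to be baked or approximated at export time.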

We were using Unreal like a standard renderer; a character pass would normally take say an hour per frame, but we ultimately ended up with 32 seconds per frame no matter how many characters were in the scene. We didn't want to render say a 16x9 image and butcher the crap out of it just to get it to the right resolution, so we used the Unreal engine to give us that flexibility. We wanted the ability to actually re-render format-specific versions when the time came, and it turned out to be a pretty big lifesaver.

fxg: Were there other technical hurdles you had to overcome using Unreal?

Rausch: Technically, there are I/O issues - importing cameras, meshes and rigs - stuff that everybody does when they're doing a cinematic, but we were actually creating our cameras outside the engine, bringing them in and writing conversions to make that happen. But the render times were amazing. Ultimately, we rendered the entire three-and-a-half minute piece on basically one machine.
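Those figures make the single-machine claim easy to sanity-check. A rough calculation, assuming a 24 fps delivery rate (the frame rate is not stated in the interview):

```python
# Sanity check of the render-time savings Rausch describes.
# The 24 fps frame rate is an assumption; the interview doesn't state it.
FPS = 24
DURATION_S = 3.5 * 60                 # the three-and-a-half minute short

frames = int(DURATION_S * FPS)        # 5040 frames

unreal_s = 32                         # Unreal, as quoted: 32 seconds/frame
offline_s = 60 * 60                   # traditional renderer: ~1 hour/frame

print(frames)                         # 5040
print(frames * unreal_s / 3600)       # 44.8 machine-hours: one box, ~2 days
print(frames * offline_s / 3600)      # 5040.0 machine-hours: a farm-sized job
```

At roughly 45 machine-hours versus over 5,000, the quoted numbers are consistent with rendering the whole piece on a single workstation.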

Mo-cap performer

fxg: Let's talk about the motion capture - what was the technical set-up you had for the Guardians project?

Rausch: We were using 80 Vicon T160s on our stage. We had Active Media Circle come in to do all the stunt co-ordination for us.

Krygowski: The thing to remember is that there were so many deliverables for the project - 30 characters times five superpowers each times 30 individual environments that they inhabit, plus a whole bunch of assets for broadcast. The way we laid out the original mo-cap was to try and get as much as we could out of single performances to use as collateral across each of the deliverables.

The second part of that was trying to mimic mo-cap for games. Games have a unique set of standards - so we would try and create say a run cycle into an idle that can be looped back and forth, or a power move, some sort of acrobatic move, that goes into an idle. We wanted as many different elements as we could break up into separate pieces.
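The game-style standard Krygowski is describing is that a cycle only loops cleanly if every animation channel ends near the value it started at; otherwise the clip pops on repeat. A minimal sketch of that check (channel names and tolerance are illustrative, not House of Moves' actual tooling):

```python
# Sketch of the "loopable clip" standard: a run or idle cycle loops
# cleanly only if each channel's last frame returns near its first.
# Channel names and the tolerance value are illustrative assumptions.

def is_loopable(clip, tol=0.5):
    """clip: dict of channel name -> list of per-frame values."""
    return all(abs(v[-1] - v[0]) <= tol for v in clip.values() if v)

run_cycle = {
    "hips_height": [0.0, 2.1, 4.0, 2.2, 0.1],   # returns to its start
    "knee_angle":  [10.0, 45.0, 80.0, 45.0, 10.2],
}
drifting = {
    "hips_height": [0.0, 2.1, 4.0, 6.5, 9.0],   # pops when looped
}

print(is_loopable(run_cycle))  # True
print(is_loopable(drifting))   # False
```

Shooting performances that satisfy this check is what let single takes be cut apart and reused across the stadium, broadcast and online deliverables.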

Mo-cap series

fxg: What kind of performances were you trying to get out of the mo-cap?

Krygowski: Before any shooting, we spent a fair amount of time working with Guardian to determine what the superpowers were in terms of using flames or electricity or water or ice or flight. We really defined how we would manifest those powers in a character. One of them was telepathy - and I can tell you that doing telepathy on a motion capture set is a pretty extraordinary thing in itself! We'd take those powers and incorporate them into 30 separate moves for the characters, where each inhabits its environment and shows off some of its superpowers.

Because of the time-line for the All-Star piece, we weren't able to do a whole separate shoot for it, so we pieced together bits we had shot - characters doing power moves, jumping off buildings - and then had the animators hand-tailor them to the particular scenarios. The Unreal Engine gave us that ability to experiment and do things quickly and pull a lot of disparate pieces that weren't necessarily intended to go together.

The gravity machine - previs
The gravity machine - final shot

fxg: How many performers were involved?

Krygowski: We used five or six guys, mostly chosen because of their bodies. The Maple Leaf and the Wild and the Ranger, for example, are huge hulking characters. One stunt guy we had, Thomas Johnson, is a fighter and can actually get in the ring. He could do a lot of things with his strength. He was picking up massive structures - like a tape ball on the set which was 80 pounds - and hoisting them across the scene. For the Islander, he had a grappling hook, which was about 20 or 30 pounds of sandbags on a rope, and he was whipping and flinging it in the air. You can really see that in the performance in the character. We had a rig that launched people across the stage, and we could mix and match the mo-cap for different characters.

fxg: Is there one particular scene or sequence in the short you were really happy with when it came together?

Krygowski: At one point, Guardian Media wanted to bring all the characters together at once and have them come to the RBC Center. The Center comes to life and turns into this gravity machine that sucks all the characters in. I wasn't contemplating building that whole environment in CG, so we decided to do a live action shoot. When it first opens up, there are 16 different parts that come to life. We wanted to make the building swirl around continuously. There's one scene in that sequence where you see about 12 or 13 characters getting sucked into the building. It's all rendered out of Unreal, with layers of effects, and I was really proud of that shot.

