The opening cinematic for Bungie's Destiny depicts astronauts discovering The Traveler, an object that gamers will encounter later as they play. Prologue worked with Bungie to deliver this cinematic, and we found out how they did it – including a surprising practical shot – from some of the key team: directors Simon Clowes and Ilya V. Abulkhanov, CG supervisor Lee John Nelson, producer Armando Plata and FX supervisor Allan McKay.
fxg: Can you talk about designing the cinematic and what concepts, storyboards and early roughing out was done – what were the important things Bungie wanted you to communicate?
Prologue: During the initial discussions with Bungie, they said they were looking for a :90-1:20 opening sequence. The sequence had to communicate a present-day landing on Mars, which happens to be 500+ years before the setting of the game itself. We were then provided a script from which we were asked to explore the story of the landing and the journey of the astronauts as they seek out a suspicious object hovering above the terrain – The Traveler.
We presented initial storyboards and concept art, along with a written treatment describing how we intended to tell the story as well as our stylistic approach to the execution. This was followed by a thorough storyboarding process with our frequent collaborator, artist Doug Stambaugh. We drew every key moment in the sequence, which was then divided into scenes and shots. Quickly sketching our ideas gave us a lot of freedom to experiment with camera angles and composition. Since some elements of the script were revised a few times, the animatic created from the drawings gave us a foundation we could always go back to.
fxg: How were the main CG assets crafted, such as the spaceship and astronauts – what tools did you use for modeling, rigging and animation here?
Prologue: We usually start out with concept drawings, but for this piece, we started with simple photo reference. Bungie provided the Lander and weapon reference. They wanted to make sure they were immediately recognizable, and identifiable as present day. One of our lead artists, Bora Jurisic, put a lot of love into the details of the Lander, and we think it came out pretty awesome. Those assets were both modeled and rigged in Maya.
The astronauts were a hybrid of the traditional space suit, other pressure suits, and a bit of our own design. They needed to be cool-looking, but still believably functional. We knew it was going to be a tough task, so we called upon senior modeler Dan Katcher. Dan is also the lead dragon modeler for Game of Thrones, so we knew he had the chops. Paul Fedor's texture work on top of Dan's models really gave the astronauts the realism and believability we were hoping for.
The astronauts, as well as some of the terrains, were sculpted in ZBrush. The Lander and weapons were modeled in Maya. Rigging and animation were also out of Maya.
fxg: What was your approach to lighting and rendering – what did you use for lighting and texture and shading references?
Prologue: Bungie came to us at Prologue because they wanted something different visually. They wanted it to be scientifically grounded, but not to the point where things would look boring. So we spent a lot of time researching and conceiving the look and lighting style we wanted. We found out that the hues on Mars are different than what we know them to be. The brownish red you see in NASA photos isn't necessarily accurate – the images are color corrected by NASA – but it's the color we recognize as Mars. We used a lot of matte paintings for our environments, which gave Daniel Hayes (lighting lead) and the lighting team solid background plates to light to. Deciding which render engine to use to accomplish the unique look we wanted was vital. We needed something robust yet friendly to a small house, and given the ambition of our vision on the schedule we had, render speed was crucial. In the end, V-Ray was the obvious choice. Other than the FX, all rendering was done in V-Ray.
fxg: The Mars environment has a textured and immersive quality to it – how were the view from space and on-surface shots realized?
Prologue: We looked at a ton of Mars landscapes. We realized that it is just a vast, huge landscape, and that we would have to modify and stylize a lot of the terrain to create a more immersive environment. Matt Gilson, our lead matte painter, helped us get the look and feel we wanted. We first did low-res previz with basic geo to help carve out compositions, which Matt was able to use as the initial base for his paintings. We then used projection methods in Nuke to get proper life into the environments. It was a long road to get there, but we are extremely proud of the end result.
fxg: Can you talk about some of the heavy effects work, including the re-entry fx and the dust and atmospheric work on the surface?
Prologue: For the Lander entry, we relied on traditional reference from old NASA footage. There were also some back-and-forth tests with Bungie to make sure we got the events and science correct, especially involving the heat shield. The first couple of entry shots were actually a combination of FX and a bit of comp trickery by Ilya. The environment FX were very tricky. The story has a mood to it that gets heavier as the journey of the astronauts progresses; the idea was to slowly build the presence of a coming storm, making conditions harsher as they travel further. Prologue had never really done vast, large-scale FX before, so it was very heavy on our pipeline. Landing Allan McKay as our FX supervisor was a huge boon for us. He was able to make preset tools for a lot of the elements that the FX team could drag and drop into shots as needed, which saved us a lot of time. However, many of them still had to be tweaked to keep the tone and mood relevant to the story.
We also asked Allan McKay about the effects work involved in the cinematic.
McKay: The challenge coming onto Destiny was establishing a fresh team of talent, knowing we had a large shot count that relied on environments and animation generally being developed at the same time the FX would need to be developed. I managed to score a very solid FX team on this – guys like Patrick Schuller, Benjamin Liu and Hnedel Maximore. Initially I rolled on to focus on look dev alongside Hnedel. The initial space sequence was primarily handled by Hnedel while I focused on the look development for the entire project. I knew we would need to put together a lot of heavy-duty atmospheric effects, simulating dirt, rocks and dust all blowing over large-scale environments and interacting with everything. I approached things a little differently on this project, writing a lot of tools to help roll out the effects.
They were broken down into two areas. One was FX asset libraries, where we could insert pre-cached grids or particle caches into the environment to reposition and render instantly, which was great for many of the secondary shots. For many of the hero shots or up-close effects, I had dynamic tools that would literally let you point and click, and they would generate and 'rig' the effects to the shot with a few basic controls to get us 99% of the way there. So for situations like hundreds of millions of grains of sand flowing over surfaces, or very specific types of atmospheric effects that needed to simulate inside the environment, we could set up and get close to final typically in a few hours rather than a few days.
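The two-tier approach McKay describes – reusable pre-simulated caches for secondary shots, and per-shot "rigged" setups driven by a few controls for hero shots – can be sketched roughly as follows. This is an illustrative Python sketch only; all names are hypothetical stand-ins, not Prologue's actual MaxScript tools.

```python
# Hypothetical sketch of a two-tier FX asset library (not Prologue's tools).
from dataclasses import dataclass

@dataclass
class CachedElement:
    """Tier 1: a pre-simulated particle/volume cache, droppable into any shot."""
    name: str
    frames: int

@dataclass
class ShotEffect:
    """A cache placed into a shot: repositioned and rendered, no new sim."""
    source: str
    position: tuple
    scale: float = 1.0

class FXLibrary:
    def __init__(self):
        self._caches = {}

    def register(self, element: CachedElement):
        self._caches[element.name] = element

    def place(self, name, position, scale=1.0):
        """Tier 1: reuse a pre-simulated cache instantly."""
        if name not in self._caches:
            raise KeyError(f"no cache named {name!r}")
        return ShotEffect(source=name, position=position, scale=scale)

    def rig(self, name, emitter_position, wind=1.0, density=0.5):
        """Tier 2: generate a fresh per-shot setup from a few basic controls,
        standing in for the point-and-click 'rigging' tools described above."""
        return {"effect": name,
                "emitter": emitter_position,
                "controls": {"wind": wind, "density": density}}

lib = FXLibrary()
lib.register(CachedElement("dust_drift_wide", frames=240))
bg = lib.place("dust_drift_wide", position=(120.0, 0.0, -40.0))
hero = lib.rig("sand_flow_hero", emitter_position=(0.0, 0.0, 0.0), wind=2.5)
```

The point of the split is that tier one costs no simulation time at all, while tier two still exposes only a handful of controls per shot, which is how a setup can get "99% of the way there" in hours rather than days.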
Although the final cut is much shorter, one of the coolest areas of the project was a massive amount of rain at the end of the cinematic. We had originally outsourced this, but the effects didn't quite fit what we were after, so I developed some pretty awesome tools to generate rain interacting with both the environment and the characters, and rebuilt all of the rain shots from scratch in just a matter of days. The challenge was creating misty rain elements, dripping and splashes, and an effective way to mesh out tens of millions of droplets of rain and render fast. I did all of this within 3ds Max; the entire rain system was built in MaxScript, relied primarily on many of Max's built-in tools, and worked very effectively.
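As a rough illustration of the per-droplet logic such a rain system has to run at enormous scale – gravity integration plus a ground test that converts a droplet into a splash element – here is a minimal Python sketch with simplified, assumed physics (the actual system was MaxScript inside 3ds Max):

```python
# Minimal, hypothetical rain step: integrate gravity, detect ground hits.
GRAVITY = -9.8  # m/s^2, simplified constant acceleration

def step_rain(droplets, dt, ground_y=0.0):
    """Advance droplets (x, y, vy) one timestep.
    Returns (still-airborne droplets, splash positions on the ground)."""
    airborne, splashes = [], []
    for x, y, vy in droplets:
        vy += GRAVITY * dt          # accelerate downward
        y += vy * dt                # move the droplet
        if y <= ground_y:
            splashes.append((x, ground_y))  # spawn a splash element here
        else:
            airborne.append((x, y, vy))
    return airborne, splashes

# One droplet high up, one about to hit the ground, stepped at 24 fps.
drops = [(0.0, 5.0, 0.0), (1.0, 0.05, -10.0)]
airborne, splashes = step_rain(drops, dt=1 / 24)
```

A production system layers mist, dripping and meshing on top of this, but the core loop – per-particle integration and collision against the environment – is what has to scale to tens of millions of droplets.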
This definitely was a passion project. We had such an enormous amount of talent tied to it, and the project itself was always evolving, with story concepts changing along the way, so we needed to adapt quickly. Building a solid workflow allowed us to roll with the punches and, in the end, come out on top.
fxg: The color shifting in the horizon and sky is really interesting – how was this achieved from a technical point of view?
Prologue: Figuring out the color-shifting / time-lapse sequence was one of our biggest, and most gratifying, creative challenges. We knew what we wanted, we knew it was ambitious, and we knew it would be a tough sell. When we first pitched the idea to Bungie in a previz state, they weren't quite sure it would work. But we were determined to figure it out. We tried a few techniques, and in the end we did the time-lapse in actual 3D. Working closely with our animation supervisor, Mitch Gonzalez, we meticulously stepped the animation, and even combined different mocap takes to create the effect for the astronauts. We also shifted and stepped the lighting so it would pop and change over the course of the shot. We didn't really know what we were going to get until we were in comp. In the end, we were able to achieve an interesting effect that conveys the distance and time of their journey.
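"Stepping" animation means resampling a smooth curve into held poses so the motion pops between holds, which is the basic mechanism behind the time-lapse feel described above. A minimal Python sketch of the idea (hypothetical names, not Prologue's actual Maya setup):

```python
# Hypothetical illustration of stepped animation for a time-lapse look.
def stepped(sample, frame, hold=6):
    """Evaluate an animation curve, but hold each value for `hold` frames,
    so the result pops from pose to pose instead of moving smoothly."""
    held_frame = (frame // hold) * hold
    return sample(held_frame)

# Stand-in for a smooth translation curve (e.g. an astronaut walking).
walk = lambda f: f * 0.1

values = [stepped(walk, f, hold=6) for f in range(12)]
# Frames 0-5 all hold walk(0); frames 6-11 all hold walk(6).
```

The same stepping can be applied to lighting values, which is how both the animation and the light could "pop and change" in sync over the course of the shot.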