Battleship: tactical water and fluid sims (updated)

ILM was the lead facility on the technically challenging Battleship. While ILM had huge rigid body and extensive water simulation pipelines from both the Transformers and Pirates franchises, combining massive destruction and water sims on the scale required for Battleship demanded a complete overhaul of the already impressive ILM simulation pipeline. With over 1,000 VFX shots to create, ILM had both Pablo Helman and Grady Cofer serving as overall visual effects supervisors on the film. We take a look at ILM and Scanline’s work for the film, and the previs by The Third Floor.

Layers of extremely complex water sims.

For a film to challenge the technical simulation pipeline of ILM, you know it is going to be big. But one of the factors that drove ILM to review and revamp its pipeline was the need to handle not only ‘big’ large-scale effects but also the smallest close-up water sims and almost everything in between. ILM’s Grady Cofer, the overall VFX supervisor on the film, told fxguide: “During the early pre-production process of breaking down scripts, we knew going in that a film like Battleship, with naval warfare and alien ships battling on the sea, was going to need a massive amount of water simulation, but it really went beyond that. A lot takes place during the day, and there were a lot of varying scales.”

“The way Pete (director Peter Berg) conceived of this story,” adds Cofer, “he wanted the alien ships to start beneath the surface of the water and then breach up through the surface, with these very dramatic breaches. So we referenced a lot of submarine breaches to inform those shots.” But once the ships were on the surface, the director wanted an additional water touch for all the alien ships, and that was a complex system of “recycling the water, and by that I mean the ships suck water out of the ocean and then we’d populate these (alien) ships with water portholes and the water would cascade over the surface of the ships.” This in turn required complex and very detailed digital collisions and splashes.

For an in-depth discussion of ILM’s water sim work for Battleship, listen to our fxpodcast with visual effects supervisor Grady Cofer.

Internal 2009 Battleship water project

At the outset of planning for the show some three years ago (2009), ILM thought they had a very robust and well-developed water sim pipeline, after the award-winning work the company had done on Poseidon and the Pirates films. “But we calculated there was really not enough time to accomplish the shots in the years to come (with the tools we had). With all of our bids we were about 500 man-weeks over, and we knew at that point we were going to have to re-invent and step up the way we did water sims,” details Cofer. Thus ILM started what would be called internally the Battleship Water Project.

EXCLUSIVE: this ILM clip shows the primary water layers and how they were comped.

ILM got its whole R&D team together, along with Willi Geiger, “our CG supervisor here, who has a great history of CG water under his belt, and together we fleshed out this new water pipeline and started building tools to advance our workflow.”

• The large-scale water simulations are still based on a level set process in which everything is broken down into grids, but everything that could be done to optimize those grids was done. For example, on a section of ocean ILM would run a ‘base’ simulation. There could be five million cells, with a piece of complex geometry such as a ship – perhaps one of the film’s ‘Stinger’ ships – which interacts and collides with the water geometry as it moves and floats. But, as Cofer points out, “even at a grid size that might resolve to a real world scale of two feet square, you are losing a lot of detail; there is so much fine detail in these complex water structures that you just don’t get from the base simulation.”

• ILM’s solution was to then add, on top, a FLIP/PIC-style particle simulation solver. This allows the traditional grid approach for the wider scales, but in the same shot it transitions to the fine detail solution, allowing, say, 20 million particles to be tossed up on the side of the ship. Each of these particle groups would then have a grid placed around it. Developed in-house, these added secondary grids were adaptive in size, based on how close the camera was to the particle simulation. With this secondary particle solution, the imagery could be resolved down to pixel resolution.

• The grid systems above were then meshed. Key to the ability to convert a particle or grid approach into something one can render is the process of meshing the simulation. This polyhedronization encapsulated both the grid and the particle approaches, and output something that can then be shaded and rendered. It is at this level that ILM would move to considering refraction, reflection, scatter and specular highlights. The meshing stage is pivotal in making sure that the water does not flicker: “you have to decide when the particles are clustered together, their proximity to each other – and at what point do you want the particles meshed together, and at what point do they pull apart enough to separate, become separate globules that are traveling beside each other.” This is a critical stage for evaluating surface tension. The program has to decide when the water is meshed, when it is water, and/or mist. (A minimal sketch of these three stages follows below.)
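To make those three stages a little more concrete, here is a heavily simplified, hypothetical sketch in Python of how such a layered pipeline could be organized. This is not ILM’s code – the class names, cell sizes and thresholds are illustrative assumptions only.

```python
import numpy as np

# --- Stage 1: coarse level set 'base' simulation on a grid (illustrative only) ---
class BaseOceanSim:
    def __init__(self, extent_m, cell_size_m=0.6):        # roughly two-foot cells, as in the article
        self.cell_size = cell_size_m
        shape = tuple(max(1, int(e / cell_size_m)) for e in extent_m)
        self.phi = np.zeros(shape)                         # signed distance to the water surface

    def step(self, colliders, dt):
        # advect the level set, apply gravity, collide with ship geometry, etc.
        pass

# --- Stage 2: FLIP/PIC-style particle 'up-res', with camera-adaptive secondary grids ---
def secondary_cell_size(camera_distance_m, coarse=0.6, finest=0.02):
    """The closer the camera is to the splash, the finer the secondary grid."""
    t = min(max(camera_distance_m / 100.0, 0.0), 1.0)      # 100 m falloff is an arbitrary choice
    return finest + t * (coarse - finest)

# --- Stage 3: meshing – decide whether nearby particles fuse into one surface ---
def should_merge(p_a, p_b, merge_radius):
    """Particles closer than the merge radius are meshed as one connected surface;
    otherwise they separate into globules (and eventually spray or mist)."""
    return np.linalg.norm(np.asarray(p_a) - np.asarray(p_b)) < merge_radius
```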

Water, when disturbed, produces white water: the transparent liquid gets aerated and becomes virtually opaque. The water also gets atomized into mist, which “would generate hundreds of millions of particles – we had shots that ranged from half a billion to a billion particles,” explains Cofer.
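One common way of thinking about this whitewater step – not necessarily how ILM implemented it – is to tag each splash particle as bubbles, foam, spray or mist depending on where it sits relative to the surface, how fast it is moving and how aerated it is. A toy classifier, with invented thresholds, might look like this:

```python
def classify_splash_particle(height_above_surface, speed, aeration):
    """Toy whitewater classifier; all thresholds are invented for illustration.

    aeration: 0.0 = clear water, 1.0 = fully aerated (opaque white water)
    """
    if height_above_surface < 0.0 and aeration > 0.5:
        return "bubbles"      # entrained air below the surface
    if height_above_surface >= 0.0 and speed < 5.0 and aeration > 0.5:
        return "foam"         # aerated water riding on the surface
    if speed >= 5.0 and aeration < 0.3:
        return "spray"        # fast, mostly unaerated droplets thrown into the air
    return "mist"             # atomized water, handed over to the air simulation
```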

“Once we found the key to the air fields around our splashes, it really injected a lot of realistic detail.”

  Grady Cofer
  ILM

Most companies or projects model water, model rigid bodies and run simulations, but the level of interaction between the rigid body sims and the water sims is not fully closed. Yes, objects affect the water, but they are not normally all fully interacting with, affecting and driving each other. ILM went even further than this – they modeled the way the air was affected by the water splashes, which in turn affected how the water was coming off the waves and splashes as mist, how that mist moved, and how the action aerated the water and created white water. This is an outstanding level of realism. “The real selling point or tipping point was when we started building air simulations around all these events,” comments Cofer. “We used our Plume tool, which is our tool for doing smoke and fire simulation. It gives you very detailed air simulations. If you have a ship that breaches up through the ocean, it churns up the water, and there are all these vortices of air rotating under the wings as it spreads its wings apart. As you go through the evolution of a water droplet, you start with a surface of water, which is meshed, and then splashes, and that atomizes to mist. And as it becomes less dense it is more influenced by the air fields. And it starts swirling, and the water structure influences those air flows. Once we found the key to the air fields around our splashes, it really injected a lot of realistic detail.”
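A minimal sketch of the idea Cofer describes – droplets that are carried more and more by the surrounding air field as they atomize – might look like the following. It assumes an air velocity field sampled from a smoke/fire-style solver; the names and the blending weight are assumptions, not ILM’s Plume internals.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.8, 0.0])

def advect_droplet(pos, vel, droplet_density, air_velocity_at, dt):
    """Move one droplet for a single timestep.

    Dense droplets fly ballistically under gravity; as they atomize into mist
    (density -> 0) they are increasingly carried by the surrounding air field.
    The linear 0..1 blend is an illustrative assumption.
    """
    air_influence = 1.0 - np.clip(droplet_density, 0.0, 1.0)   # mist -> 1.0, solid water -> 0.0
    v_ballistic = vel + GRAVITY * dt
    v_air = np.asarray(air_velocity_at(pos))                   # sample the air simulation here
    new_vel = (1.0 - air_influence) * v_ballistic + air_influence * v_air
    return pos + new_vel * dt, new_vel
```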

Watch an ILM featurette.

Another approach the team developed was ‘ballistic particles’, which allowed the animators to inject special particles into a simulation that would, over some range of time, mathematically explode and thus inject extra turbulence and roughness into the simulation. These ‘ballistic particles’ helped add realism by adding a somewhat random and unpredictable energy to the water splashes otherwise being simulated purely by the movement of the ships or alien craft.
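Read literally, that suggests something like a marker particle with a fuse that, when it expires, kicks the surrounding fluid particles outward. The following is a hypothetical illustration of that idea only – the class, its parameters and the falloff are assumptions, not ILM’s implementation.

```python
import numpy as np

class BallisticParticle:
    """A marker that, after a chosen delay, pushes nearby fluid particles outward
    to add turbulence and roughness. All parameters here are illustrative."""

    def __init__(self, position, fuse_seconds, strength, radius):
        self.position = np.asarray(position, dtype=float)
        self.fuse = fuse_seconds
        self.strength = strength
        self.radius = radius
        self.fired = False

    def update(self, fluid_positions, fluid_velocities, dt):
        """Count down the fuse; on expiry, apply a falloff-weighted radial impulse."""
        if self.fired:
            return
        self.fuse -= dt
        if self.fuse > 0.0:
            return
        offsets = fluid_positions - self.position
        dist = np.linalg.norm(offsets, axis=1) + 1e-6
        falloff = np.clip(1.0 - dist / self.radius, 0.0, 1.0)
        fluid_velocities += offsets / dist[:, None] * (self.strength * falloff)[:, None]
        self.fired = True   # fire once, then become inert
```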

The final rendering was done in RenderMan, influenced by a huge amount of HDRs captured on location and at sea. “We amassed this great library of skies and we’d use that as our base lighting on the ships – that would get you 80% or 90% there,” says Cofer.
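As a rough picture of that kind of sky-library workflow (not ILM’s actual tooling), one could imagine each HDR tagged with simple shoot metadata, with the closest match to the plate chosen as the base environment light. The fields, the example path and the weighting below are purely illustrative assumptions:

```python
def pick_sky_hdr(library, plate_sun_elevation_deg, plate_cloud_cover):
    """Pick the library HDR whose tagged metadata best matches the plate.

    library entries are hypothetical dicts like:
        {"path": "skies/noon_clear.hdr", "sun_elevation": 62.0, "cloud_cover": 0.2}
    The scoring weights are arbitrary illustrative choices.
    """
    def mismatch(entry):
        return (abs(entry["sun_elevation"] - plate_sun_elevation_deg) / 90.0
                + abs(entry["cloud_cover"] - plate_cloud_cover))
    return min(library, key=mismatch)
```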

One problem many water sims have is thin sheeting or water cascades that break during a fall as water surface tension breaks down. To accurately model this, ILM accessed extensive water, waterfall and other live action libraries. While the team looked at hours and hours of footage, and many hours of naval footage, the water around the ships was primarily simulated. Cofer and the team did get to film water from helicopters, as well as other things more naval in nature!

Interestingly, moving away from the water sims, Cofer and a team got to actually go on board a range of ships with the help of the US Navy. The filmmakers had huge support from the real US Navy. Digital versions were made of almost every Navy ship in the film. At the start of the film there is a sequence with three naval destroyers (two American, one Japanese), and the editing cuts back and forth between digital and real; often in a single shot one or two of the destroyers is digital and the other real. There were also digital versions of the aircraft carrier Ronald Reagan and of the battleship itself, the ‘Mighty Mo’. In real life, as in the film, the USS Missouri is decommissioned, and there are no battleships in the US Navy. That might have meant all the shots of the USS Missouri would be digital, but it was not as simple as that.

“I remember being at the pre-production meeting when they asked me how I felt about taking the real USS Missouri out to sea and ‘would that be useful?’,” recalls Cofer, “and I was YES YES we have to do it, what an amazing opportunity – and I promised Pete not only would we get the best reference in the world but we’d also get backplates that we could put right in the movie.” The ship had to be pulled out by tugs, as its engines are decommissioned, but Cofer and the team still photographed it extensively. The ship was also being dry docked as part of a maintenance program, so the ILM team got to extensively LIDAR it to aid with digital modeling. “We got this massive point cloud right down to little dents from kamikaze hits from the Second World War,” adds Cofer. In fact, ILM got naval co-operation and, under strict naval supervision, was able to scan and photograph a lot of ship assets. Cofer spent ten days at sea, filming and photographing everything from inside a 5-inch gun to naval-helicopter photography of the naval exercises and live firings of Tomahawk missiles.

EXCLUSIVE: this ILM clip shows how the primary destruction sims on land were comped.

 

Scanline adds to Battleship’s effects

Working in tandem with ILM, Scanline contributed to several large sequences on Battleship, a project that made heavy use of its proprietary Flowline fluid simulation software. Scanline’s work included the alien craft approaching Earth and impacting the ocean, and one of them – ‘Elvis’ – spinning out of control and crashing into Hong Kong. The studio also handled the alien barrier, as well as the Ringo tower shots seen as Hopper investigates the strange object. Finally, Scanline contributed to the final battle sequences as jet fighters attack the mothership.

A mysterious entity approaches Earth from space, soon revealed to actually be five alien ships. “As the ships roll through space, they have a fluid simulated plasma that protects them through their travels,” explains Scanline co-visual effects supervisor Danielle Plantec. “That same alien technology is what is later used to generate the alien barrier. The catastrophe in space that sends Elvis hurtling off course toward Hong Kong, instead of its intended Hawaii landing, is a small satellite collision that tears a hole in the protective plasma, which causes it to go slightly off course, collide with Ringo’s shield and veer even farther off course.”

For that part of the shot, Scanline defined a breakage that tore a large piece of the tail off as well as the communication parts of the ship (that are later pulled out of the debris in Hong Kong). “In the same way that we tore the buildings and satellite dishes apart using rigid body simulations,” says Plantec, “we triggered the tearing away of the tail and communications portion of the ship, with the explosion from the impact with Ringo’s shield. We then generated more detailed smaller particulate simulation and varying smoke trails to add complexity.”

Watch the Hong Kong sequence.

“We were careful to keep the trails very thin,” continues Plantec, “as Peter Berg didn’t want it to feel flaming and pyroclastic and really wanted to see everything rather than having too much obscured by debris and trails. We made sure to define the four healthy ships landing in Hawaii with their plasma shields clearly intact, while Elvis spins out of control trailing debris. We also added controlled shockwaves to show entry into the atmosphere.”

As the damaged Elvis then approaches Hong Kong, it emits a trail of fire and debris. “We started out with some very cool thick pyroclastic trails,” recalls Scanline co-visual effects supervisor Stephan Trojansky, “but it obscured too much, so we started varying it quite a bit, streaming thinner white smoke from smaller pieces of debris and only allowing the thicker smoke to emit intermittently. We wanted to be sure you could see the smaller fires around the ship and the charred skin, with small detailing similar to burning steel wool. When the ship hits the mountainside we procedurally broke the mountain itself, as well as having more pieces of the ship break off and trees sent into the air. The impact sent out a shockwave, which we simulated some of the trees reacting to.”

The Hong Kong destruction scenes were fleshed out with rough primary animation as previs to calculate ship, mountain and building impacts. “We then evolved it with each iteration, adding low resolution sims for breakage until we replaced each part with high resolution models, simulation and rendering,” says Trojansky. One shot in the sequence is a view of the crashing ship from inside a Hong Kong office tower, which presented several compositing challenges to Scanline. “We had to bring the plate into the same world as the destruction so it felt like it was being illuminated and shadowed by Elvis as he tore through frame,” says Trojansky. “We also did some additional simulations on top of the plate to help integrate the foreground and background.”

Ultimately, Elvis collides with a major Hong Kong bank building, causing its upper floors to topple over – an effect achieved without any practical elements. “We built out the bank building in detail, from beams to hanging fluorescent lights,” explains Trojansky. “We pre-cut the path of destruction and then did rigid body simulations in which the pieces would break at stress points as they are hit by either Elvis or other pieces of the building. We then did additional small scale rigid simulations of smaller debris as well as internal props and office furniture, and secondary dust simulations trailing off the debris.”
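The general idea behind that kind of pre-cut, stress-based breakage can be sketched as fragments held together by ‘glue’ constraints that snap when the force at a pre-cut seam exceeds a threshold. This is a generic illustration, not Scanline’s implementation; the class, names and thresholds are assumptions.

```python
class GlueConstraint:
    """Holds two pre-cut fragments together until the force at the seam exceeds a
    breaking threshold; the threshold and the force measure are illustrative."""

    def __init__(self, fragment_a, fragment_b, break_threshold):
        self.a, self.b = fragment_a, fragment_b
        self.break_threshold = break_threshold
        self.broken = False

    def update(self, seam_force):
        """Snap the seam when the impact force is high enough; otherwise the glued
        fragments continue to flex and move together."""
        if not self.broken and seam_force > self.break_threshold:
            self.broken = True
        return self.broken

# e.g. weaker glue at a seam the filmmakers want to see fail first (hypothetical names):
# seam = GlueConstraint("floor_12_slab", "floor_13_column", break_threshold=2.0e4)
```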

“We even added simulated trees from the atrium,” adds Trojansky, “and a window washer that breaks through the glass as the top of the building comes crashing down. We also ignited lots of small fires throughout the interior of the building which gave us a lot of opportunity for variation and interesting light effects as well as adding a blue electrical sparking throughout. The smoke had several types of simulation. There was a thinner white smoke that broke up the darker pyroclastic turbulent smoke. Wherever we had smoke there was always a very broad range of color to add complexity.”

Scanline’s Flowline software was used to produce the trails coming off of Elvis, as well as its final water impact (and the impacts of the other ships safely making splash-downs near Hawaii). “We started by generating a base digital water that matched perfectly to the plate,” says Trojansky. “We then simulated the impact of the ships on the water and did secondary and tertiary simulations for the spray and mist. While the primary water defined the shape of the white water, the spray gave it structure and detail while the mist gave it a strong sense of scale. We additionally had smoke and fire sims intermixed and rendered with the water so that you could see the fire from Elvis illuminating the water from the inside.”
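One way to picture how those layered sims come together over the filmed plate is as a simple back-to-front stack. The layer names and the straight ‘over’ compositing below are illustrative assumptions, not Scanline’s Flowline or compositing pipeline:

```python
import numpy as np

# Back-to-front layer order for a water impact, per the description above.
LAYER_ORDER = ["base_water", "whitewater_spray", "mist", "fire_and_smoke"]

def over(front, back):
    """Premultiplied-alpha 'over' for (H, W, 4) float images."""
    alpha = front[..., 3:4]
    return front + back * (1.0 - alpha)

def composite_impact(layers, plate):
    """Stack the rendered sim layers over the filmed ocean plate, back to front."""
    out = plate
    for name in LAYER_ORDER:
        out = over(layers[name], out)
    return out
```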

Watch how the aliens construct their barrier.

Scanline’s explosions, including for the satellite and satellite dishes, were created procedurally. “We broke the rigid surfaces and applied forces to trigger their destruction, while holding the pieces together with a strong friction, almost like a glue,” explains Plantec, “which helped it maintain a strong complexity rather than simply looking like shattered pieces. It feels like the fragments flex and hold together while other parts crack and tear away. We used several layers of Flowline simulations, from primary explosions to trailing smoke and dust streaming from the fragmented pieces.”

The aliens connect their communications to the satellite dishes through a series of batteries in hopes of using an orbiting satellite to communicate with their home planet. “As the connection is made it sends a destructive energy through its path,” says Plantec. “We modeled trees to match the plate identically, attached leaves as particles and procedurally broke the branches as the plasma paths hit, and then used fluid simulation to generate the beams that come from the satellite dishes. Later, when the dishes are destroyed, we simulated small paths of fire at varying styles and intensity and intermixed them in comp along with the satellite dish rigid bodies, explosions and lots of lens flares.”

Another Scanline contribution was the alien barrier that traps the three destroyers inside. For that, Scanline generated the beams and shockwaves with a fluid sim. “The dome itself was generated in Nuke with procedural texturing and refraction,” notes Plantec, “while the clouds that build as the dome grows are Flowline fluid simulations and the water interaction at the base of the dome was also generated by a Flowline simulation of foam and spray in combination with some practical comp elements.”

Finally, Scanline’s visual effects were intermixed with ILM work for the final destruction of the alien mothership. “ILM provided us with concept art for the end destruction sequence,” says Plantec, “and ILM also provided us with a partially destroyed ship which they used for their shots. As we had the close-up jet flyovers and the last bombs dropping on the mothership, we did a large amount of destruction detailing to hold up from our shot angles, and additional build to enable us to blow it up.”

“We simulated small waterfalls flowing over the mothership’s surface and interacting with the ocean as it hit,” adds Plantec. “We generated large amounts of foam and bubbles and aeration under the surface where it connected to the destroyed ship. We also added fires and electrical sparking throughout the ship and Flowline explosions as the bombs impact the ship and deliver the final blows. The explosions also triggered rigid body simulations that tore apart the guts of the ship and bent and flexed rather than just shattering.”

Previs takes Battleship to extremes
Previs by The Third Floor.

Among the companies providing previs services to Battleship was The Third Floor, which contributed to the shredder highway destruction sequence, the captured alien rescue and buoy grid sequences, as well as the Thug fight in the engine room and the climactic battle at the end of the film between the USS Missouri and the alien mothership.

“A good portion of our effort was focused on exploring ideas with the director and helping work out specific shots and action sequencing beyond the general beats in the script,” says Barry Howell, The Third Floor’s previsualization supervisor. “Our team interfaced closely with Production Designer Neil Spisak and Art Directors Aaron Haye and Bill Skinner to make sure that the Art Department designs were reflected in the previs scenes. We also worked very closely with Director of Photography Tobias Schliessler to ensure we were visualizing the types of shots that he wanted. VFX Producer Gayle Busby and ILM VFX Supervisor Grady Cofer would meet with us regularly to make sure what we were creating was within budget and in line with the expected visual effects.”

Previs by The Third Floor.

In the engine room alien encounter (inside Hopper’s ship), previs helped production establish the environment and then plan for the actual shoot. “We created multiple previs versions that tried out different scenarios and suggested different ways the story could be told,” explains Howell. “This sequence evolved over many iterations, as we changed things up based on requests or revisions from the filmmakers. What ended up on screen was a combination of two of the sequences Pete liked best.”

“Another sequence that followed pretty close to what was previsualized was the Land Commander Rescue,” adds Howell. Here, Hopper’s men find an alien floating in the water and bring him aboard. “This was a crucial scene in the movie as it was the first time the audience would get a close-up look at the aliens themselves,” says Howell. “The director wanted to extend the suspense and creepiness of this moment by slowly revealing the alien. With storyboard artist Richard Bennett’s help, we created a series of shots in previs that showed only parts of the alien at first, focusing more on the crew members’ reactions. The result met the director’s approval with minimal revisions.”

All images and clips copyright © 2012 Universal Studios.
