Superman Returns thanks to director Bryan Singer, Visual Effects Supervisor Mark Stetson, and a large number of visual effects artists around the globe who worked to make the Man of Steel fly, see through walls, repel bullets and save the world. We cover this summer blockbuster in our one-year anniversary fxpodcast as well as in detail with our online story.

This week we talk to Rising Sun Pictures (RSP) about their previz and impressive digital corn field work in our weekly podcast. It is a special podcast for us here at the fxguide tech bunker, as it celebrates exactly one year since we started our fxpodcast. We’d like to thank all the artists and companies that have so generously given their time to make fxguide’s fxpodcast successful. We have learned so much. A special thanks to everyone who has emailed us. Had we not received such great feedback from you, we may well have stopped by now rather than building the site to the heights it has now achieved. We have over three times the number of daily visitors to fxguide than we had just one year ago when we launched the podcast. So from all of us here at fxg – thank you.

A number of visual effects companies worked on Superman Returns. Key amongst them are Sony Pictures Imageworks (SPI), Photon, Framestore CFC, Rhythm & Hues (R&H) and Rising Sun Pictures (RSP).

While each company did a range of shots, it is evident that companies are starting to be known for various specialties. SPI, for example, created digital Supermen by building on their experience in digital doubles from Spider-Man. While the Superman double was shared with other companies, he was cape-less and bald, as SPI’s cloth and hair simulations are proprietary. R&H produced spectacular water for the yacht sequences and also did the digital resurrection of Marlon Brando, building on their history of digital mouth matching and tracking that started back with Babe.

The various effects companies clearly worked well together. Photon of Queensland is an extremely experienced and talented model shop with its own digital VFX unit. They passed on both model shots and scans of the 35-foot-long miniature super yacht “The Gertrude”. LIDAR scans and texture-mapped imagery were passed to R&H to integrate with their digital water. Photon worked closely with Frantic Films, who did the complex crystal ray tracing sequences, extending the physical sets and Photon models.

The final shot count, according to Cinefex: SPI contributed 286 shots, Framestore CFC 301, Rhythm & Hues 114, Rising Sun Pictures 106 shots plus previsualisation, The Orphanage 129, Photon VFX 183 plus miniatures and elements, Frantic Films 136, Lola Visual Effects 72, and Pixel Liberation Front 49 shots including previz, with Eden FX also contributing.

One aspect faced by all the visual effects companies was the camera used to capture the original photography. Most of the film was shot on the new Panavision Genesis camera. This new-generation digital camera has a CCD pickup the size of a 35mm film frame, so it uses standard 35mm lenses. Recording without the compression of previous cameras, such as the HDCAM cameras used to film Star Wars: Episode II, the Genesis is a significant technical feature of Superman Returns. For many individuals this represents the future of filmmaking, precipitating changes in the post production process. The Genesis profoundly impacted several of the effects companies, while hardly impacting others we spoke to. For example, R&H converted the footage from the Genesis Panalog format to a more Cineon-like format and then just treated it as normal. While the Genesis has no film grain, it does have digital noise or ‘digital grain’. This radically affected companies such as Photon who shot massive miniatures for the film.
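The Panalog transfer curve itself is Panavision proprietary, but the “Cineon-like” destination space R&H describes is well documented. As a rough illustration of what such a log space looks like (a generic sketch of the standard Cineon 10-bit printing-density encode, not R&H’s actual conversion), the mapping between linear light and code values is:

```python
import math

def cineon_encode(linear, white_point=685, black_point=95):
    """Standard Cineon 10-bit log encode: a linear scene value
    (1.0 = reference white) to a 10-bit printing-density code value.
    Each code step is 0.002 density and film gamma is taken as 0.6,
    giving 300 code values per decade of exposure. Values at or
    below zero are clamped to the black point (a simplification)."""
    if linear <= 0.0:
        return black_point
    code = white_point + 300.0 * math.log10(linear)
    return int(round(min(1023, max(0, code))))

def cineon_decode(code, white_point=685):
    """Inverse: 10-bit Cineon code value back to linear."""
    return 10.0 ** ((code - white_point) / 300.0)

print(cineon_encode(1.0))    # reference white -> 685
print(cineon_encode(0.18))   # 18% grey -> 462
print(cineon_decode(685))    # -> 1.0
```

Once footage sits in a familiar log space like this, existing film-oriented grading and compositing tools can treat it “as normal”, which is precisely the appeal of the conversion.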

Photon VFX

The majority of the miniatures for Superman Returns were shot by Photon. Photon actually started filming the huge miniatures for Superman Returns on film, but within days concerns were raised about matching the low-grain 35mm with the nearly grainless digital Genesis footage. It was therefore decided very early in the miniatures shoot to switch to filming with the Genesis on the Milo motion control rig. This was a worldwide first, and Photon had three days to solve — actually invent — slow-frame-rate digital miniatures photography with the Genesis. To sell miniatures one needs a huge amount of depth of field. To achieve this you need loads of light and/or very long exposures. As Photon’s visual effects supervisor Dale Duguid explains, “the longer you expose film the less grain – the longer you expose digitally – the more noise you get.” Photon built and shot extensive miniature landscapes – some the size of entire sound stages. The vast world of Krypton, which would ultimately be all but cut from the opening of the film, was lensed using the Genesis on a Milo motion control rig.

This posed some new challenges for Photon and for miniature unit DP Steve Newman, since the Genesis had not been used for slow frame-rate sequences or synchronized to motion control moves at these speeds. On a very tight deadline, Photon’s Duguid and Breslin tested compositing with the Genesis slow frame rate HD output while engineer Graeme Palmer built an electronic synchronizing system to take frame pulses from the Genesis into Milo’s Flair software. Newman adjusted the lighting design to account for new exposure calculations. An effective procedure for capturing multiple plates between 1 fps and 6 fps on HD resulted from this hard work and was used by operator Jerry Andrews throughout the miniature shoot.

All the sequences were pre-visualised as animations and approved by director Bryan Singer and VFX Supervisor Mark Stetson. Photon’s lead animator Sean Steinmuller created a separate animation in Maya for each shot in the sequence and included ‘head and tail’ motion to allow a real-world camera system to achieve the move. In collaboration with motion control operator Jerry Andrews, Steinmuller used ‘IKtrix Impulse’ to determine the ideal position and motion of the Milo crane to achieve the move around the studio set. This allowed planning ahead of each scene, laying track precisely where needed and ‘detailing’ the miniature sets only where required for each shot.

On set there was a feed from the Genesis to a G5 with a Blackmagic Design card and an Apple 30″ monitor. Duguid comments that he, like others, felt there was a certain stigma attached to digital over film. “But it only took minutes to see the future, to see the full fidelity of the images; it was as if I was looking through a hole in reality to a different world, right on set,” he recounts. “I went to say let’s go again for safety and then I realised – we had it – I could see for myself – so we moved on.”

ILM had used the HDCAM F900 for miniature work on Star Wars: Episode II, but found they had to push so much light into the sets they nearly melted. Photon likewise had to add a second 100K lamp, move to 360° shutters, and then film eight frames a second for some sequences. They would then average these to one frame – because the random sensor noise is uncorrelated from frame to frame, averaging eight frames cuts the noise by roughly the square root of eight. Finally the team used innovative electronic noise reduction.
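The frame-averaging trick relies on the miniature being static while the sensor noise changes every exposure. A minimal sketch of the idea, with simulated noisy exposures (the pixel values, noise level and frame counts here are invented for illustration, not Photon’s numbers):

```python
import random
import statistics

def average_frames(frames):
    """Average N exposures of the same static scene pixel-by-pixel.

    Random sensor noise is uncorrelated between exposures, so the
    noise standard deviation drops by roughly sqrt(N)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

# Simulate 8 exposures of a flat grey plate with additive sensor noise.
random.seed(42)
TRUE_LEVEL, NOISE_SIGMA, PIXELS = 0.5, 0.05, 10_000
frames = [
    [TRUE_LEVEL + random.gauss(0.0, NOISE_SIGMA) for _ in range(PIXELS)]
    for _ in range(8)
]
averaged = average_frames(frames)

single_noise = statistics.stdev(frames[0])
stacked_noise = statistics.stdev(averaged)
print(f"single frame noise:    {single_noise:.4f}")
print(f"8-frame average noise: {stacked_noise:.4f}")  # ~ single / sqrt(8)
```

The same stacking principle is used in astrophotography for exactly the same reason: the signal adds coherently while the noise does not.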

The issue of noise or grain reduction also came up in the digital work Photon did. In addition to the model work, Photon did around 230 shots, of which 183 ended up in the cinema version. Some older film from the 1978 Richard Donner version was needed for the new film. It was “some very un-Genesis footage, which required an immense amount of work to sit beside the grainless Genesis,” explains Duguid, “(as) it was very grainy in the blacks”.

Photon had around 35 people on the project on average at any one time, but the number peaked at over 150 at the height of the miniatures work.

One remarkable aspect of the miniatures shoot was that, as the shots were prevized, during the extensive setup time for each major shot Photon had their digital VFX team on set immediately compositing shots. For example, Photon did a large number of crystal models that were extended and worked with digitally. Rather than wait around on set, Mark Stetson would pre-comp five or six motion control passes in Shake and even develop look style frames with Photon’s senior compositors. These style frames would then be sent with the footage and the actual Shake setups to, say, Frantic Films, who might be comping the final shots.

Frantic Films

Frantic Films used the Spore particle ray tracing system for the complex crystal forms. Spore was written by Doc Baily, with assistance from programmers Josh Aller and J. Walt Adamczyk. Spore is a stand-alone particle system that runs on IRIX and Linux machines, and it differs from other particle system generators in its capacity to handle vast numbers of particles; the software was previously used on the film Solaris. A typical Spore shot could contain a particle effect with in excess of a billion particles.

Rising Sun Pictures

RSP has been growing in stature in the international effects community. The firm has consistently been producing breathtaking visual effects, building from what is jokingly called ‘911 emergency work’ in the industry, where you are brought in to fix overages or work on films that are behind schedule. Today RSP is selected as a primary effects facility from the moment a film is greenlit.

In the case of Superman Returns, not only did they contribute 106 shots including two major key sequences, but they were also responsible for producing the pre-visualisation for the entire film in partnership with Pixel Liberation Front. Two major sequences completed by RSP stand out — the life of young Clark on the Kent family farm and the development of the iconic Superman X-ray vision.

In a sequence of flashbacks to Superman’s youth, the film shows young Clark first discovering and then joyfully playing with his new powers. The sequence was shot in Tamworth and RSP’s visual effects supervisor Tim Crosbie was on set for the sequence, immediately after finishing on Harry Potter and the Goblet of Fire.

Cornfield composite from Rising Sun Pictures

In the film there are vast amounts of seemingly normal live action that are in reality achieved entirely by incredibly impressive photoreal 3D. It has long been common for wide establishing shots to have digital matte painting and sky replacements. But in this film many corn fields, actors and live action sweeping camera moves are all 100% digitally reproduced and almost – if not actually – impossible to pick out.

The digital fields were created not because of a lack of effort in principal photography. Initially the plan was to have a combination of some CGI fields for flying shots that would be impossible to film, combined with elaborate 100% live action. Real corn was grown on set for original photography, and a computerized 500-foot-high, 1000-foot-long complex flying/running rig was built. While the results of the live action were impressive, running wirework is very hard to make look real. This was shown in the train/running scene of the original 1978 Richard Donner Superman film, where a pure wire rig solution just fails to give the character weight, and the restricted movement of the actor or stunt man is an issue in terms of their performance. Once Bryan Singer and visual effects supervisor Mark Stetson saw that the CGI Clark and corn were working so well in wide shots, they decided to replace some of the beautifully shot closer young Clark work as well.

To RSP this was both a gift and a challenge. It was a gift to have such accurate camera and lighting references; it was a challenge as now the digital double had to work cut for cut in close up and in immense detail.

To achieve the level of realism needed, Crosbie and the team planned to shoot real corn against blue screen and populate the field with what were effectively ‘cards’ standing up in the virtual field. These 2D cards were originally planned to be used for the mid to far corn only, and nearly all the digital corn is actually made from only six corn stalks. The final corn stalks would be located in both shade and full sun, so Crosbie had the corn shot in low-contrast lighting, allowing the corn to be re-lit for high-contrast sun or darker shadow. The cards were normalised to the direction of the camera so that they always faced it. As corn is so irregular, it is impossible to spot the lack of perspective change as the camera moves marginally higher or lower above the corn.
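This card technique is what real-time graphics calls billboarding. RSP’s actual rig is not public, so as an illustrative sketch only, here is the core calculation for a cylindrical billboard: an upright corn card is yawed about its vertical axis so its face points at the camera, while staying planted in the field:

```python
import math

def billboard_yaw(card_pos, camera_pos):
    """Yaw angle (radians) that rotates an upright card about its
    vertical (Y) axis so its face points at the camera - a cylindrical
    billboard, as used for vegetation cards. Positions are (x, y, z)
    tuples; the card stays upright, so camera height is ignored."""
    dx = camera_pos[0] - card_pos[0]
    dz = camera_pos[2] - card_pos[2]
    return math.atan2(dx, dz)

# Card at the origin, camera straight down +Z: no rotation needed.
print(math.degrees(billboard_yaw((0, 0, 0), (0, 5, 10))))
# Camera off to the +X side: the card yaws a quarter turn to face it.
print(math.degrees(billboard_yaw((0, 0, 0), (10, 5, 0))))
```

Because each card only rotates about its vertical axis, it never tilts with the camera height, which is why the missing parallax only becomes visible on strongly regular geometry, not on something as chaotic as corn.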

The live action plates that were intended for the final film were shot over a real corn field, with a space left in the rows for the actor Stephan Bender to be pulled through by the suspended computerised wire rig. The only difference between these plates and the final shots was supposed to be a couple of rows of digital corn, as the director explicitly wanted to see a ‘wake’ effect as the young Superman sped through the field. In post it was found that the shots needed a major background replacement, which would have involved complex keying through water irrigation sprinklers. Singer also wanted the young Clark to look slightly different and more joyful in his performance. In the end it was just easier to do the whole thing digitally, so while the final shot in the film is an identical sweeping camera move, everything from the actor to the distant clouds is digital animation.

As the production required very close-up digital shots of the young Clark running, Stephan Bender was digitally scanned and a full digital body double created. Crosbie points to this as a great example of the depth of talent in each of the departments at RSP, as close-up digital actors are extremely challenging in CG. The digital double was then animated to run at high speed through the digital corn in close up, though oftentimes with a live action head shot performance of Bender tracked on. The young Clark sequence made extensive use of HDRI and image-based techniques. The results are so real, it is likely no one will know the shots are CGI.

For the X-ray effect, RSP’s Art Director David Scott produced some pivotal concept art that director Singer immediately took to. The concept was to see through objects much like a clipping plane in 3D, but tempered by the density profile of the material being ‘X-rayed’. For example, when Superman X-rays Lois’ house, the wooden beams are visually removed in such a way that the denser grain of the wood disappears a fraction after the softer wood pulp that naturally exists between the wood grain structure.
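RSP has not published the actual shader, but the idea of a clipping plane softened by material density can be sketched as a simple opacity function. Everything here — the function name, the falloff curve, the parameter values — is an illustrative assumption, not RSP’s implementation:

```python
def xray_alpha(depth, clip_depth, density, falloff=1.0):
    """Opacity of a surface sample under a 'soft clipping plane'
    X-ray look: geometry in front of the clip plane fades out, with
    denser materials (e.g. hard wood grain) hanging on slightly
    longer than lighter ones (the soft pulp between the grain).

    depth      - sample distance from the viewer
    clip_depth - current position of the virtual clipping plane
    density    - relative material density in [0, 1]
    """
    past = clip_depth - depth          # how far in front of the plane
    if past <= 0:
        return 1.0                     # behind the plane: fully opaque
    fade = past / (falloff * (0.25 + density))
    return max(0.0, 1.0 - fade)

# At the same depth, dense grain (0.9) stays more opaque than pulp (0.1).
print(xray_alpha(4.0, 5.0, density=0.9))
print(xray_alpha(4.0, 5.0, density=0.1))
```

Animating `clip_depth` toward the subject of interest then peels the scene away material by material, rather than in the hard slice a plain clipping plane would give.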

This effect is shown most elegantly in the Lois lift shot. Superman’s X-ray vision follows Lois as she enters the Daily Planet lift and ascends. Filmed live action with a camera dropping vertically past a Lois held at a static height, to mimic the inverse of her lifting up, the shot is both technically perfect and visually beautiful. The art deco lobby is bathed in golden light; the rich golden tones mix well with the brass older-style elevator. As there is a nice amount of camera movement, we see gently via the X-ray effect through chandeliers and through the workings of the clock-like ‘floor indicator’ above the lift. This is a shot for the show reel: effective, beautiful and technically cool.

It surely would have been much easier to have just done a more conventional ‘X-ray’ or ‘Fresnel pass’ style look, rather than the softer and infinitely more complex virtual clipping plane solution that RSP developed. But a traditional X-ray effect would have been very harsh and intrusive to the storytelling.

There is one actual X-ray style operation performed by Clark/Superman on a fallen Lois, where he scans her for injuries. An earlier version of the script called for an X-ray of Lois and a male barman, and the RSP team bought some real-world medical scans to assist in the shot. Although the barman was later dropped from the shot, the male skeleton purchased was still valuable after being digitally altered to “remove some obvious bits and add others,” jokes Crosbie.

The transition from 911 work to primary effects house did not just happen with Superman Returns; it is a process that has evolved over years. This is reflected in the depth of talent now at RSP, who have offices in both Adelaide and Sydney. Due to their dual offices, Rising Sun Pictures also had to develop some key technology behind the scenes to make their business work. To allow artists in both of their offices to see identical images on their monitors in terms of colour, and to then discuss those images, sister company Rising Sun Research developed cineSync and cineSpace. The two applications are fast becoming globally preferred options for monitor calibration and professional video conferencing collaboration, further enhancing Rising Sun’s reputation. Both were used on Superman Returns, but many other companies are also using them. John Bruno, Visual Effects Supervisor for X-Men 3, recently went so far as to say that X-Men 3 was only possible due to cineSync, given he had 10 companies contributing shots. cineSync was used by all of the effects companies on Superman Returns, many of whom praised its ease of use when we spoke to them about it.

Sony Pictures Imageworks (SPI)

Sony Pictures Imageworks was relied upon for much of the digital double work for the Superman character. On Spider-Man 2 there was a lot of IBR, or Image-Based Rendering, which was used again by SPI for Superman Returns. “The work on Doc Ock was definitely the foundation for the work we did on Superman,” explains John Monos, SPI’s supervisor. “We used the same Light Stage number 2 that Paul Debevec and his team developed.” On Spider-Man 2 the recording was done in single 10-second shots, capturing with four cameras at the front. While that gave really good coverage of the Doc Ock character’s face, the team then recorded a second pass for the back of his head (mainly the neck, as the hair is done fully digitally in all cases).

This double capture of each setup created problems “on the back end” of Spider-Man 2’s production, so on Superman Returns the rig was increased to six cameras to capture the back of the actor’s head and neck in one pass. The intent of the IBR is to record the reflectance data of the actor in all possible lighting. It is vital that the actor remain exactly still during the scanning, so another improvement for Superman was to reduce the time of each scan down to six to seven seconds.

Ironically, given the Genesis camera in principal photography, the SPI second unit shot the Superman IBR passes on film, using 5218 film stock. Eyes are extremely hard in IBR processes, and the eyes in the digital double ended up being done with traditional CGI approaches.

“Ultimately you’re cloning an actor, so how better to do it than with exact recordings of the actual actor,” says Monos. “With IBR you’re getting really good reflectance data on the actor in a particular pose, be it his neutral face or whatever,” he says, “(but) it gets more complex when we have to push that data around (as they ‘act’) so that’s an area we had to do work in for Superman – to improve that.”

SPI developed an extremely effective hair renderer that could cast shadows onto the IBR face area. This meant that while the various aspects of the final character were made with different approaches or shaders, the entire character worked and was lit as one.

A key aspect of the digital Superman was always going to be his cape. The team looked at real reference such as flags in hurricanes. In the end the team used Syflex for the cape, but with ‘filmic considerations’ to make the cape look good, given that the perfectly simulated behaviour of a cape at high speed may not be visually appealing. This was also the case in zero gravity, where the cape still moves – and in reality perhaps a real cape would not. For this the SPI team reasoned that perhaps there was secondary motion left over from him moving just prior to the shot.

In terms of using Genesis footage, SPI did research to get the optimum results, and Monos found a few aspects of interest in using the camera. First there was no film scanning, so data handling was different and easier. “Visually it looks a lot sharper than film and we were not sure what we were getting ourselves in for in terms of pulling green screen etc,” comments Monos. “It turns out that it pulls greenscreen quite cleanly. There are some things to deal with in the way it looks in motion blur, and also some differences in exposure, in the way that film will bloom out as opposed to video – and in the end the Genesis is a video camera, it is a different animal.” SPI added a type of film grain to all their shots, to mimic traditional film grain.
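SPI’s grain tool is proprietary, but synthetic grain generally follows the same principle: film grain is signal-dependent, strongest in the mid-tones and nearly absent in deep blacks and near clip. A toy sketch of that behaviour follows — the modulation curve and strength value are assumptions for illustration, not SPI’s recipe:

```python
import random

def add_grain(pixels, strength=0.03, seed=None):
    """Overlay synthetic film-style grain on a grainless digital plate.

    The noise amplitude is modulated by value * (1 - value), so it
    peaks at mid-grey and vanishes at pure black and pure white -
    a crude stand-in for real film's signal-dependent grain."""
    rng = random.Random(seed)
    out = []
    for v in pixels:
        amplitude = strength * v * (1.0 - v)   # peaks at mid-grey
        grained = v + rng.gauss(0.0, 1.0) * amplitude
        out.append(min(1.0, max(0.0, grained)))  # clamp to [0, 1]
    return out

# A short strip of plate values from black through grey to white.
plate = [0.0, 0.18, 0.5, 0.9, 1.0]
print(add_grain(plate, seed=1))
```

In production such grain would also be correlated spatially and per colour channel to match a specific stock; the per-pixel version here only shows the tonal modulation.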

One other major complex sequence SPI completed was the crash landing of the shuttle plane. Dodger Stadium was actually shot prior to the shoot in Sydney, and so in many cases the background plates were camera-mapped onto LIDAR scans of the stadium. This allowed maximum flexibility for camera moves that matched the greenscreen and CGI elements which were filmed later. The projection mapping was done all in 3D and rendered in RenderMan. Generally speaking, SPI tried to get the lighting right in 3D, and they did not do multi-pass rendering.

Bruno Vilela, SPI CG supervisor, also supervised a lot of the Metropolis footage, which was based on New York City. The team used a combination of solutions, from matte paintings, camera mapping and CGI buildings to the removal of NY landmarks. This allowed the team to produce a large number of skyline shots without having to produce entire CG cities.

Rhythm & Hues (R&H)

At the time of the 1978 Superman, much was made of the very high salary paid to Marlon Brando for a relatively short amount of screen time. It was therefore a good investment to reuse sections of that footage in the new Superman Returns, especially as the great actor passed away in 2004.

As mentioned earlier, R&H did the digital repurposing of Marlon Brando. All the lines that Brando delivered were actually recorded by the actor when he was alive, but some were only recorded as audio and almost none were recorded from the right camera angle. To solve this, Brando’s mouth was modeled and then tracked onto his actual face from another take. This requires the dual skills of carefully lip syncing the new lines and very accurately modeling and texturing his face. One of the secrets to really selling and polishing the shot is adjusting the textures of the mapped mouth, thus avoiding the skin texture stretching. To do this the R&H team placed patches of moving texture from the original footage to blend the digital mouths with his real face. Once this was achieved, the image could be projected over a 3D model of Brando’s head, allowing the camera to view the lines from the side of his head, when it was only ever filmed directly face-on in 1978.

As complex as the nine Brando shots were, they amounted to much less work for R&H than the extremely complex water sequence that VFX Supervisor Derek Spears and his team achieved. The ocean rescue renders took, depending on the camera angle, somewhere between 45 minutes and three hours per frame. According to Spears, “the really big splashes could go up to about 8 hours a frame. The crystal renders were really more intensive – some of those were really 100 hours per frame, if you add everything — tiling frames, with multiple segments running in parallel — those were the refracting of the CG elements and Brando inside of the crystal refracting with the crystals behind them. Very complex and very time consuming.”

The yacht rescue sequences were tracked in Voodoo, R&H’s proprietary software application. Not all CG shots needed match moving, so those were set up in Houdini. The actual water rendering took a different approach from the water simulation route of, say, Poseidon, but it produced just as stunning a final image. R&H’s approach is more filmic in that it has a series of tools to produce art-directed waves, as opposed to rendering simulations and hoping the waves go where you want.

The water pipeline used Wavetools to start the modeling — this is just a wave surface simulation tool. Then iWave is used as a water surface generator for objects in the water. To this Ahab – a wave displacement tool – is added. This is like a fluid sim tool, and on Superman Returns “this was not used very much, but a little,” explains Spears. Next, FELT, a vector field flow / particle manipulator, is used to create the splashes and add 3D mist and 3D smoke elements. Finally, R&H has CloudTank, which was used for rendering volumetrics: FELT generated the volumetric splashes and CloudTank rendered them.

Unlike some applications there are no obvious screen grabs for this story. When we asked for some screen shots, R&H’s digital effects supervisor Mike O’Neal, explained “these tools really aren’t like that. They are command line tools. They look like text files. We don’t really want to show the maps they generate, because that would give away some of the proprietary technique. The ocean is a flat grid until it is rendered…. the best thing (is to look at) the rendered images.”

R&H had 200 staff members working on the film, including their Indian office and not including stateside support like I/O and scan record.

Framestore CFC

Framestore CFC’s work on the film centers on Krypton island, the environment around it, and Superman’s showdown there with Lex Luthor. Superman flies to the island as it begins to rise from the ocean. A Kryptonian crystal has formed the island, and so its underlying structure and look is crystalline. The sequence included a nail-biting rescue by Lois Lane, arriving (and escaping) in a seaplane.

A team of up to 70 Framestore CFC artists and technicians worked eight months on 313 shots for Superman Returns. They were led by VFX Supervisor, Jon Thum. “Our work encompasses huge CG environments of oceans, crystal rocks, water interaction, a seaplane, a helicopter and Superman himself,” according to Thum. “This is all mixed with 2D elements of mist, waterfalls, layered skies and various green-screen elements.”

The CG team was led by CG Supervisor Justin Martin. One of the most demanding areas of CG research and development was the ocean. The team developed their own techniques for creating the roiling waters, pipelining their work through Houdini and Maya, feeding into RenderMan, and then compositing in Shake. In addition, the CG team had to break up the (CG) set, smash crystal columns, and break up rocks from the rising island. For this, they pushed Houdini’s dynamics to the limit, expanding its choreography abilities, and building on previous techniques developed for earlier shows such as Blade II, Harry Potter and the Chamber of Secrets, and Thunderbirds.

Of the sequence of the seaplane struggling through a ravine to escape the still-growing island, Martin says, “This was a tricky one – the CG elements included the crystal rocks, the plane, and some of the water elements. But if there’s one thing we’ve learned from previous projects, it’s to use live elements whenever possible – so the waterfalls were mostly real ones. The ocean material, on the other hand, was CG – because of the shots needed and the fact that shooting plates of water as stormy as required is almost impossible.” Martin adds that – somewhat to his surprise – the crystals posed much more of a challenge than the ocean work. “Managing the complexity of the crystalline geometries, and how much would be in each shot was a real headache,” he says, “Whereas with the oceans we had a look established quite early on, and it was just a case of refining it and going back and forth with the compositors.”

The seaplane itself was shot in a tank, and the surrounding water from the original shoot needed to be replaced with CG water and blended where necessary with the real tank water to maintain the realistic interactions with the plane. “As the seaplane tries to take off through the CG canyon,” says Thum, “we used some real seaplane elements mixed with CG seaplane where necessary. Although some elements were shot with a real seaplane, it did not match precisely the set seaplane, so for any close up shots we would replace the top half of the seaplane with CG, whilst keeping the real pontoons with their white-water interaction.”

As is often the case these days, the team was frequently required to deliver shots in which there were no (or very few) real elements – a blank canvas, as it were. The matte painters, led by Matte Painting Supervisor Martin Macrae, played a crucial part in such situations, with their eye helping to create compositions and lighting schemes that worked. Photoshop and Maya were the primary tools. “It was pretty much like Lego,” says Macrae, half jokingly, “We had to basically build a 3km wide island, on a scale that none of us had previously tackled, so we needed blocks that we could easily assemble and tweak.” For Macrae, the hardest part was to make the full CG shots look convincing. “Essentially you’re dealing with a pure fantasy environment,” he says, “One that doesn’t occur in real life – and our job was to make it look ‘natural’.”

The overhead shots of the island, seen from Superman’s point of view as he nears Luthor’s base, were created with a combination of techniques that were repeated throughout Framestore CFC’s shots. The first involved procedural textures for the crystal rock, 3D waterfalls and projected 2D elements where possible. The second, led by Martin Macrae, involved projecting matte painting onto the rendered geometry to create extra detail – a technique that produces better results, but only works for small camera moves.

The crystal island is created from a mixture of procedurally textured geometry, with additional matte painting in some areas to create more detail. For CFC one hurdle was getting the CG ocean to interact with the island, and they turned to Houdini and artist talent. “The final interaction was painstakingly created using 2D mist and splash elements, as well as 3D particle effects and projected elements that were targeted to specific areas of the composition,” relates Thum. “For the shots where rocks are breaking away from the island, there were many elements involved in addition to the CG break up effect, such as 3D water effects, smaller particulate debris and 2D water elements.”

Gavin Toomey led the compositing team, having also attended the shoot in Sydney following on from Thum’s initial stint there. “As with previous very heavy-render projects such as Troy, we didn’t use any beauty passes – everything was re-lit in 2D,” he says. “So we had a couple of guys building the template scripts to begin with, concentrating on the ocean and the rocks.”

Working in HD did not faze the team. “When you’re working in film, you’re always aware that when it’s shot out it will, to an extent, be a ‘forgiving’ medium. HD is much crisper and more demanding in that respect. At the same time, you have to make the material look like what the audiences expect their films to look like,” says Toomey. “The art lies in knowing when to stop piling on the detail, keeping it implicit rather than explicit – nailing the point at which it looks photographic.” Overall, though, Toomey feels that the challenges presented by the project were well worth the extra effort. “In the end I think that some of the 2D elements really help sell the shots,” he reflects, “It was great to have a chance to get our hands dirty with the TDs.”

With much labour behind them and the finish in sight, the production handed Framestore CFC one final surprise six weeks before delivery was due. “Our end sequence of the island rising out of the ocean was re-cut,” recalls Thum, “and the island put back to concept. Meeting this challenge required some…radical reorganisation, let’s say. The fact that we turned it all around in time is something we are justly proud of.” Thum is not alone in his admiration for the team’s work during this last burst of activity. Says Superman Returns VFX Supervisor, Mark Stetson, “The late-schedule design changes we threw at Framestore CFC put us all right out at the edge of risk. Jon Thum, (Framestore CFC Producer) Robin Saxen and the team responded with a sequence that was hugely improved, and still finished on the schedule we agreed. It made the movie better, and I am very grateful.”

We have several high resolution JPEG images from the movie. You can download a .zip archive of the images here.
