The Foundry held its second Master Class in London on Friday, playing to a packed house at the Vue theatre in Leicester Square. The event was standing-room only, with all of the seats claimed within 24 hours of the event being announced earlier this year.
The event focused on stereoscopic production and post, a timely program after the overwhelming success of Avatar at the box office. The Foundry actually went out and shot some stereoscopic scenes so that the entire process could be covered for the event. Throughout the day, a variety of speakers examined issues specific to stereoscopic projects. fxguidetv will have coverage from the event in the next several weeks, but in the meantime, click through for more about the event and a few details about the upcoming Nuke 6.1 release.
The day started off with a discussion of stereoscopic terms by The Foundry's Chief Scientist Simon Robinson. Next up was Andy Millns of Inition with an on-set centric discussion of the current options for stereoscopic cameras. The Foundry shoot utilized Silicon Imaging SI-3D 2K cameras integrated with the P+S Technik Mirror Rig. Millns feels the integrated 3D system, which outputs synchronized dual video streams via a single controller, offers benefits over having two separate cameras.
Theodor Groeneboom, Framestore lead compositor on Avatar, then examined several plates from the Foundry shoot and spoke about fixing various issues inherent in stereoscopic filmmaking. When principal photography is done using mirror rigs, there will be color differences between the two cameras, since one shoots through the mirror glass while the other films a reflection in it. The mirror can also polarize light differently for each camera, changing elements such as reflections in puddles of water. Lens flares can be problematic too, as the slight angle between cameras when filming convergent stereo can lead to asymmetrical flares. All of these issues must be corrected before compositing.
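As a rough illustration of the kind of eye-matching correction described above, here is a minimal NumPy sketch that matches the per-channel mean and standard deviation of one eye to the other. This is a simple statistical transfer, not Framestore's actual workflow or an Ocula algorithm; the function name and approach are purely illustrative.

```python
import numpy as np

def match_eye_colors(src, ref):
    """Match the per-channel mean and standard deviation of one eye (src)
    to the other (ref) -- a simple first pass at the color shift a mirror
    rig introduces between the two camera views. Illustrative only."""
    src = src.astype(np.float64)
    ref = ref.astype(np.float64)
    out = np.empty_like(src)
    for c in range(src.shape[-1]):
        s_mean, s_std = src[..., c].mean(), src[..., c].std()
        r_mean, r_std = ref[..., c].mean(), ref[..., c].std()
        # Shift and scale the channel so its statistics match the reference eye
        out[..., c] = (src[..., c] - s_mean) * (r_std / (s_std + 1e-8)) + r_mean
    return out
```

In practice a global transfer like this would only be a starting point; real shots need localized corrections, and polarization and flare differences cannot be fixed with a statistics match at all.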
Florian Gellinger of rise|fx led compositing efforts along with Russell Dodgson, senior Nuke compositor in the Framestore commercials department. One of the unforeseen issues ended up being the difficulty of tracking the 1000+ frame shot shown at the master class. The Foundry introduced its camera tracker in version 6, and the team tried it alongside other standalone tracking packages, but the slow-moving dolly shot proved problematic in every application. The overall track was decent, but the slow camera speed caused a bit of floating back and forth, which was overcome with some manual tweaking.
However, the shot provided a great test case for The Foundry's developers, who actually improved the 3D tracking code so that the shot became trackable in Nuke. This upgrade will be rolled out as part of the upcoming 6.1 release. Gellinger also showed UV texture and projection mapping techniques, demonstrating how Nuke's 3D capabilities were used for the shot's matte painting.
As an aside, they were demonstrating on a loaded HP Z800 workstation, and the responsiveness and interactivity were pretty amazing. Every time I sit in front of one of those workstations, it's clear that HP is really focused on building high-performance machines.
After lunch, Ben Minall showed some techniques for getting the most out of Ocula in problem situations. Artists seem to be very happy with the vertical alignment tools in the plugins, a correction that needs to be applied to almost every stereoscopic shot. Some of the other tools, such as automatic color correction, are much less automatic, and it can be difficult to obtain consistent results with them.
However, Minall showed a very clever Nuke Ocula tree addressing this very issue, extending the range of shots in which the Ocula plugins can be used. His technique utilizes several nodes that sample reference areas of various sizes and then averages the results. This averaged result is then used to create the final color correction, yielding much more reliable results. The Foundry is considering creating a tool based upon this procedure, which should make things easier for artists. The same averaging principle could also be applied to several other Ocula tools.
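The multi-sample averaging idea can be sketched in a few lines of NumPy: sample matching reference patches of several sizes in both eyes, compute a per-patch color gain, and average the gains so that no single patch dominates. This is loosely modeled on the description above; the function name, patch scheme, and gain-based correction are my own illustrative assumptions, not The Foundry's actual node tree.

```python
import numpy as np

def averaged_reference_gain(src, ref, centers, sizes):
    """Estimate a per-channel color gain mapping src onto ref by sampling
    reference patches of several sizes and averaging the per-patch gains.
    Illustrative sketch of the averaging idea, not a production tool."""
    gains = []
    for (y, x) in centers:
        for s in sizes:
            src_patch = src[y:y + s, x:x + s].reshape(-1, src.shape[-1])
            ref_patch = ref[y:y + s, x:x + s].reshape(-1, ref.shape[-1])
            # Per-channel gain that maps this source patch onto the reference
            gains.append(ref_patch.mean(axis=0) / (src_patch.mean(axis=0) + 1e-8))
    # Averaging over many patches makes the estimate far less sensitive
    # to noise or a single badly chosen sample region
    return np.mean(gains, axis=0)
```

The appeal of averaging here is robustness: any one sample region may sit on noise, grain, or a specular highlight, but the mean over many regions of different scales converges on a stable correction.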
Tom Williamson of CafeFX gave an interesting talk on their workflow for converting 2D imagery to 3D for Alice in Wonderland. The bottom line is that it is an incredibly tedious and human-intensive process to obtain good results, and that is considering that the company was generally dealing only with greenscreen shots rather than breaking down entire scenes. I think it's safe to say that after this experience, Williamson is a fan of shooting stereoscopic as opposed to creating it from a single camera. After the event, the Clash of the Titans stereo conversion project was a popular point of discussion, with artists wondering what the final result will bring. The fx-laden film was finished in traditional 2D, and the additional camera view required for 3D is currently being created by Prime Focus from those completed plates.
The Foundry's Jon Waddleton showed a sneak peek of the upcoming Nuke 6.1. Feature highlights include:
- Ultimatte Keyer now included as standard
- Mac OS X 64-bit support
- FBX Export
- Paint/Roto workflow & performance improvements
- 3D selection and snapping
- Python accessible selection and geometry
- Interactive camera and light positioning
- 'Look at' options on nodes with transforms
Based upon the chatter over drinks after the event, it was another great Master Class: exactly the type of material artists are interested in seeing. For those of you who couldn't be in London, we'll be bringing it to you on fxguidetv shortly.
We've been a free service since 1999 and now rely on the generous contributions of readers like you. If you'd like to help support our work, please join the hundreds of others and become an fxinsider member.