Katana in production: changing the structure of the workflow

Katana is on the move.

When The Foundry first launched the product, those unfamiliar with the Sony experience took time to find a place in the production schedule to evaluate it and consider adopting it – so great is its influence on a pipeline that it could hardly be fully adopted mid-show on any major film. Meanwhile, people outside the ‘know’ looked on slightly confused: why did The Foundry seem to be investing so much in a tool that, surely, so very few actual artists would ever want? OK, it looked cool, but only for a handful of CG TDs, really?

As time has rolled on, the product has become more fully understood and its adoption is currently exploding at the high end. Originally developed at Sony Pictures Imageworks, Katana has now been used on dozens of large productions since its debut in 2004.

Not many people know that Katana started life as a 2D compositing package. “We had an in-house compositor, Bonsai, that was showing its age and the team that was responsible for maintaining it was, in their own time, prototyping what a new system would look like,” comments SPI CTO Rob Bredow. “Eventually the project started to get momentum and the development team looked at Katana (then a 2D project) and said ‘this project has a lot of promise’.”

An SPI Katana screenshot from The Amazing Spider-Man.

But at that time, the team saw that Bonsai – or compositing in general – was not really the problem. SPI’s lighting tool at the time was called BIRPS (Sony Pictures RIBs backwards) and it was in worse shape than its sister compositing program Bonsai. BIRPS was a very powerful lighting tool and RIB editor (SPI was still a RenderMan house at this stage), but as Bredow likes to point out, “It was a lighting tool where you couldn’t move any lights!” BIRPS had no GUI and literally could not position lights. So Sony decided to take Katana, put its 2D functions on hold “as something we thought we’d get back to, and focus all efforts on Katana – 3D, it was a ground up initiative.”

Earlier in his career at Sony, Bredow was the VFX supervisor on Surf’s Up, one of the first films to use the new Katana program during its alpha testing at SPI. “When we jumped into those first shows with Katana, we did so with both feet, there was no plan B,” says Bredow. This is not the normal way films work, or at least how the textbooks say they should work. In theory, versions of software are frozen and builds are considered locked off unless there is some serious problem that would require a mid-show software update.

It was “not as scary as it sounds,” explains Bredow. “In part because at that time in the industry the tools were very dynamic anyway – there were things like Delayed Read Archive – which we started using really extensively for the first time on these shows.”

“Those types of systems and those types of tools that are totally fundamental to the way we work were just things that we were introducing on every new show,” he adds. “The pipeline was very dynamic and changing a lot, so the chance to use Katana on (Surf’s Up) was just a really great opportunity.”

At this time the R&D team started using the same release system and process that SPI still uses today, which allows for weekly software releases and weekly meetings with the key stakeholders to make sure the needs of the teams and show runners making their films were being met. “It was hand in hand collaboration between the software team and the shows trying to do these fairly compacted films in these brand new packages,” says Bredow.


Enter The Foundry

In November 2009, The Foundry and Sony Pictures Imageworks (SPI) entered into a technology sharing agreement to collaborate on the development of Katana.

Today it is spreading amongst the high end facility community to such experienced teams as Digital Domain, MPC, ILM, Reliance MediaWorks and newer (or upgrading) pipelines such as Laika and SPIN VFX. Katana was forged on productions like Spider-Man 3 at SPI (where it was meant to be used on just a few shots but was adopted wholeheartedly by the team). While its own developers tried to point out it was just an alpha build, the approach was so beneficial that the Spider-Man 3 artists just did not care and used it extensively.

Pointcloud, wireframe, final – Katana handles dense assets extremely well.

Once one starts to use Katana, its Nuke-like UI and workflow philosophy start to make real sense coming from a company like The Foundry. Given the evolution of the industry, the product is at the heart of a perfect storm where the 3D and 2D lines are blurring and CGI complexity is rocketing upwards, while renderers simultaneously embrace physically plausible shading and all its computational complexity. It is true Katana is a long term bet for The Foundry, but if it succeeds, the product has the potential to be a complete ‘category killer’. Katana is perhaps the most interesting long term piece of new technology the industry has seen in 3D for some time – it is certainly influencing many things around it.

Alembic is an example of a sister, or perhaps daughter, technology from the same SPI technology incubator. The animation data format is having its own rapid, wide scale adoption success, and it is just one of many products or standards whose use Katana naturally promotes as part of an overall new workflow philosophy. While much of the publicity around Katana focuses on fully animated shots, in fact the product was forged inside Sony on both live action and animation – the first three films being Surf’s Up, Beowulf and Spider-Man 3.

In concert with Nuke, Katana is very much at home on live action films. “The two things you really want with doing visual effects versus fully animated films,” says Katana product manager Andy Lomas – a veteran himself of Avatar, Over the Hedge and The Matrix films – “is to see your lighting in context with the plate, and to have the color right.” Another of the related technologies that migrated from SPI to the community is OpenColorIO. The Foundry now ships both Katana and Nuke with OpenColorIO, so lighting matched in Katana can be accurate and consistent throughout the pipeline. It is true Katana’s role in a VFX film is different, says Lomas. “In most animated films it is about the recipe or formula for doing the shot – that can then be applied to a bunch of other shots – while VFX tends to have more individual shot adjustments – matching to this or that special background plate.” But today there are more VFX films with complete CGI sequences than ever before, blurring the line between animated and VFX films like never before.
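On the OpenColorIO point above: the consistency comes from both packages resolving colour transforms through one shared config. The minimal sketch below assumes a facility config reachable via the standard $OCIO environment variable; the colour space names used are assumptions, as they depend entirely on that config, and the final apply call differs between OCIO 1.x and 2.x.

```python
# Minimal OpenColorIO sketch: Katana and Nuke both point at the same
# config (via the $OCIO environment variable), so a transform requested
# in either package resolves identically.
# Colour space names ("linear", "sRGB") are assumptions -- they depend
# on the facility's own OCIO config.
import PyOpenColorIO as OCIO

config = OCIO.GetCurrentConfig()                   # reads the config at $OCIO
processor = config.getProcessor("linear", "sRGB")  # linear render -> display

# Applying the transform differs by OCIO version:
#   1.x: processor.applyRGB([r, g, b])
#   2.x: processor.getDefaultCPUProcessor().applyRGB(pixel)
```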

Lomas goes on to point out that two-thirds of Katana’s customers today are interested in the product for VFX work, not fully animated features. One such customer is Laika, known for its Academy Award-nominated stop motion work on Coraline (2009) and the upcoming ParaNorman (stereo stop motion). As Jeff Stringer (director, production technology in Laika’s Entertainment Division) points out, “by the time a shot gets to Katana it doesn’t matter if it was shot live action or stop motion – it is just a clip. Every frame of our films is creatively lit and shot with real cameras, so lighting any CG element is all about plate integration, which Katana really helps us to do.”

fxguide covered Katana in fxguidetv #111 – click here to watch presentations from Sony Pictures Imageworks’ Jeremy Selan and The Foundry’s Andy Lomas.

Laika decided to purchase a Katana site license as the VFX team looked at better ways to manage the consistency of lighting and the increasing complexity of their shots. “In each successive film at Laika, we are more ambitious than the last,” says Stringer. Their decision was aided by the fact that Andrew Nawrot (CG supervisor in Laika’s Entertainment Division) is ex-Sony himself, and he is heading up the move to Katana. “Its procedural workflow makes perfect sense for feature film look development and lighting,” says Nawrot. “The node-based approach allows us to create and manage looks for large amounts of assets with minimal setup between tasks. It also streamlines shot lighting workflows and allows us to easily maintain standards and rigs across multiple assets, shots and sequences.”

Surf’s Up was one of the early films that SPI used Katana for.

Katana works by having a script, or as The Foundry calls it, a ‘recipe’ approach. For massively complex scenes – say a city in the new Amazing Spider-Man film – partial geometry can be loaded very quickly, lighting worked on and renders previewed to develop the look; for most lighting situations the whole scene is not needed to light it. But if a single pane of glass or leaf is catching the light annoyingly, one can drill down to it if needed. This central collection and management point for all the 3D assets, and Katana’s ability to spread easily to other shots or reuse elements in other scenes, places it at the heart of the workflow. Couple this with its shader relationship to the renderers and Katana ‘owns’ large scale CGI production management.

Where Katana is used in the pipeline:

Katana is a tool used at two key stages of production:

1. Look development

2. Shot production

(One could argue that there is even a third area: overall workflow design for senior pipeline architects.)

1. LookDev: spending less time on data management and more on lighting

The idea of developing LookDev – the recipes that go into producing a character – and then moving that on to production was central to Katana, and even to BIRPS before it. But LookDev, and then managing that process, used to be extremely time consuming. SPI even experimented with hiring lighting assistants just to help with data management. Pre-Katana, on a film like The Polar Express, Sony employed ‘Associate TDs’ to take that load off the TDs so they could pre-load material. Today there is no position at SPI called an Associate TD. “We just don’t need them,” says Rob Bredow.

It is perhaps a sad indictment of our modern workflows that over half the time of a lighting artist can be lost on simple data management and moving things around – checking versions are correct, and that the right files are in the right place and being used. Katana at its core aims to provide computer assistance to allow more time for lighting. In LookDev, Katana replaces the conventional CG pipeline with a flexible ‘recipe’ based asset workflow. The Foundry use the term ‘recipe’ as the UI and approach of Katana is not unlike Nuke – with a file-in style node and then a nodal flow diagram for setting up the lighting. This is key, as this ‘template’ can then be used on the next shot or scene. It is also key to how Katana works with the data: a scene may require massive amounts of data, but Katana does not need to load all of that data to start working with the files.
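To make the ‘recipe’ idea concrete, here is a minimal sketch using Katana’s Python NodegraphAPI as it is commonly scripted. The node, port and parameter names (such as Alembic_In’s abcAsset, and the example file paths) are assumptions and may vary between Katana versions; the point is simply that the graph references the asset, so swapping one path re-targets the whole recipe to another shot, and no heavy geometry is read until the scene graph is actually cooked.

```python
# A minimal Katana 'recipe' sketch built through the Python NodegraphAPI.
# Only runs inside Katana; parameter names, port indices and file paths
# below are assumptions for illustration.
from Katana import NodegraphAPI

root = NodegraphAPI.GetRootNode()

# Reference (not load) the geometry: the Alembic is only read when the
# scene graph is expanded or cooked at preview/render time.
geo = NodegraphAPI.CreateNode('Alembic_In', root)
geo.getParameter('name').setValue('/root/world/geo/city', 0)
geo.getParameter('abcAsset').setValue('/shows/spidey/assets/city/v012.abc', 0)  # assumed param name

# A light rig node and a render node complete the smallest possible recipe.
lights = NodegraphAPI.CreateNode('Gaffer', root)   # 'GafferThree' in later Katana releases
render = NodegraphAPI.CreateNode('Render', root)

# Wire geometry -> lights -> render (port indices assumed).
lights.getInputPortByIndex(0).connect(geo.getOutputPortByIndex(0))
render.getInputPortByIndex(0).connect(lights.getOutputPortByIndex(0))

# Re-using the template on the next shot is just a matter of swapping
# the single upstream asset reference:
geo.getParameter('abcAsset').setValue('/shows/spidey/assets/oscorp_tower/v003.abc', 0)
```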

A key to understanding Katana is its relationship to the assets. Katana allows teams to start lighting before assets are fully completed – it manages this form of versioning easily – but it also allows for other types of versioning, such as shot and scene specific variations. Imagine there is a character: lighting can begin while the creature is not locked off, and new versions can be easily accommodated since the Katana UI surrounds the asset. This happens in a way not dissimilar to Nuke around a greenscreen clip – if the clip is then replaced by a new version 2.0, say cleaned up or somehow altered, the script is still on the whole valid.
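The mechanics of that ‘surrounding’ are essentially late binding: the recipe stores a logical asset ID rather than a hard file path, and the path is resolved when the scene is cooked. The sketch below is a hypothetical, facility-agnostic resolver (none of these names come from Katana or SPI) just to show why publishing creature version 2.0 does not invalidate the existing lighting script.

```python
# Hypothetical asset resolver: the lighting recipe stores only a logical
# ID ("creature/lizard/model"); the concrete version is chosen when the
# scene is cooked, so new publishes slot in automatically.
from pathlib import Path

ASSET_ROOT = Path("/shows/spidey/publish")  # assumed publish location

def resolve(asset_id: str, version: str = "latest") -> Path:
    """Map a logical asset ID to a concrete published file."""
    asset_dir = ASSET_ROOT / asset_id
    versions = sorted(p for p in asset_dir.iterdir() if p.name.startswith("v"))
    if not versions:
        raise FileNotFoundError(f"No publishes for {asset_id}")
    chosen = versions[-1] if version == "latest" else asset_dir / version
    return chosen / "geo.abc"

# The recipe keeps pointing at the logical ID; when modelling publishes
# v002, the same recipe cooks against the new geometry with no edits.
print(resolve("creature/lizard/model"))          # latest published version
print(resolve("creature/lizard/model", "v001"))  # or pin a specific version
```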

Detailed actual Katana scene graph from SPI’s Amazing Spider-Man.

But more than that, imagine the character now needs to be dirty and muddy in some scenes but not in others: Katana handles these scene based variations easily as well. Finally, if other similar characters then need to be in the same scenes, the original LookDev could be a great starting point for them. And it goes deeper and more specific than this implies. An example would be LookDev on a set of destroyed buildings, where a shader has been developed for dirty glass. All the glass in all the models can be accessed through Katana’s asset aware scripting, allowing that dirty glass shader to affect all windows in this shot, this scene or this city wherever it appears – excluding, say, the flashback scenes where a ‘clean glass’ shader is used.
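In Katana terms this kind of rule is typically expressed as a CEL (Collection Expression Language) statement on a material-assignment node, so one edit reaches every matching location. The sketch below is indicative only: the CEL pattern, the parameter paths and the material location are all assumptions, and they rely on the kind of consistent scene-graph naming discussed later in this article.

```python
# Assign a 'dirty glass' look to every window in every building with one
# rule. Parameter paths and the CEL pattern are assumptions; the real
# expression depends on how the facility names its scene graph locations.
from Katana import NodegraphAPI

root = NodegraphAPI.GetRootNode()
assign = NodegraphAPI.CreateNode('MaterialAssign', root)

# CEL: recursively match any location under the city whose name contains 'glass'.
assign.getParameter('CEL').setValue('/root/world/geo/city//*glass*', 0)

# Point the assignment at the published dirty-glass material location
# (parameter paths assumed).
assign.getParameter('args.materialAssign.value').setValue(
    '/root/materials/env/dirty_glass', 0)
assign.getParameter('args.materialAssign.enable').setValue(1, 0)

# Flashback shots simply use a variant of the recipe where this node is
# disabled (or points at a clean_glass material) -- the rest of the graph
# is untouched.
```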

Katana thus services highly scalable, asset based workflows in a flexible nodal form that is also very fast. Its node based approach allows rapid turnaround of high complexity shots while keeping artists in control and reducing in-house development overheads. MPC told fxguide that they are replacing sections of their own internally developed workflow code, since at best they might be able to devote, say, “seven or eight people, where between Sony and The Foundry there might be 30 engineers. Multiply that by each year that goes by and you just could not keep up with the rate of improvement.”

Extensive APIs mean Katana can integrate with a variety of renderers and with facility specific, pre-existing shader libraries and workflow tools. There are some costs associated with committing to Katana. For example, to fully realize the workflow benefits, Katana requires a comprehensive naming convention for data sets and metadata. For newer facilities this may mean ‘lifting their game’ – something all the companies we spoke to welcomed. For older pipelines it means examining and dovetailing existing data and asset management with the requirements of Katana. It is not that Katana is limited – “rather, she requires a level of commitment,” as one technical lead pointed out.

2. Shot production: a world where nothing is really finaled

Katana today makes a production much freer to address changes that come along during production. In the past, if a UV layout change was required, the impact on the schedule and pipeline could be “catastrophic,” according to SPI’s Rob Bredow. But after the move to Katana, if the change “does not affect the geometry topology of the model, we can just swap out those UVs just-in-time,” says Bredow, “meaning that a team of animators could be working on dozens of shots and we get a UV change, which we just pass along from modeling straight to lighting and everything just lines right up at the end of the pipeline.”

Nodegraph in an Arnold pipeline.

Katana speeds up production, as Bredow saw first hand on productions inside Sony. By using Katana it was possible to do complex shot lighting in the morning, perhaps at lower resolution, render it out, see it and get notes, and then incorporate those notes into a full render that might be done overnight. That double iteration of a shot in a single day, for huge production work, is a major speed advantage. It is possible in SPI’s or other pipelines to output various normal passes and other passes, but that is not the same as fundamentally being able to move the lights – such tools are better kept for more subtle final adjustments.

The real sweet spot for Katana is a ‘same as’ shot – in these cases Katana allows the ‘recipe’ to be loaded, and while it may need some tweaking, the artist is already most of the way there without data management and without a lot of time. Some lighters would literally take one to two days on shot set-up before doing any new work. “That’s all changed,” says Bredow, “and that’s what you are looking for when you move to Katana, especially on a show with 600-800 shots, when most of the changes may be coming towards the end of the schedule, which is great to afford the director that kind of flexibility, but it is also a challenge on the whole pipeline.”

Renderers: Arnold and RenderMan

A big issue for any company is its render pipeline and the various rendering issues related to it. Even SPI, which is now a full Arnold house, may have reason from time to time to render something in a different package, especially if it is sharing assets. With Katana, for example, companies can write a hair generator as a Katana ‘scene graph generator’ and it should work with both, say, RenderMan and Arnold without having to write new custom procedurals for each of those renderers. Because of the way Katana works, there should be almost no efficiency lost or memory overhead compared with writing them natively for each renderer.

Scene graphs can also be written procedurally, so someone could take existing assets, such as geometry or particle caches, and generate a new scene graph procedurally. This is much like writing procedurals for renderers in RenderMan, and, as Katana was written from the earliest days at SPI to be render agnostic, someone can write a procedural scene graph once and then use it with many different renderers.
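The practical upshot of ‘write the procedural once’ can be sketched without any renderer API at all: the generator emits neutral scene-graph locations and attributes, and only a thin output layer per renderer consumes them. The code below is a conceptual illustration in plain Python, not Katana’s actual plug-in interface, and the adapter functions are placeholders for real RIB or Arnold output code.

```python
# Conceptual sketch of a render-agnostic procedural: one generator emits
# neutral scene-graph locations; per-renderer adapters translate them.
# This is NOT Katana's plug-in API -- just an illustration of the idea.
from dataclasses import dataclass, field

@dataclass
class Location:
    path: str                       # e.g. /root/world/geo/hair/strand_0001_0
    type: str                       # e.g. 'curves'
    attrs: dict = field(default_factory=dict)

def hair_generator(scalp_points, strands_per_point=3):
    """Expand a sparse scalp cache into curve locations, renderer-unaware."""
    for i, p in enumerate(scalp_points):
        for s in range(strands_per_point):
            yield Location(
                path=f"/root/world/geo/hair/strand_{i:04d}_{s}",
                type="curves",
                attrs={"rootPoint": p, "width": 0.01},
            )

def emit_for_renderman(loc):   # thin adapter: would build RIB / RiCurves here
    return f"RiCurves for {loc.path}"

def emit_for_arnold(loc):      # thin adapter: would build an Arnold curves node here
    return f"Arnold curves node for {loc.path}"

# The same generator feeds either back end; nothing upstream changes.
for loc in hair_generator([(0, 0, 0), (1, 0, 0)]):
    print(emit_for_renderman(loc))
    print(emit_for_arnold(loc))
```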

SPIN VFX says the biggest advantage is “how exposed RenderMan is, so you can get into the LookDev easily. It is not so much you are going through an imperfect translation process that we have struggled with in the past.” But it should be emphasized that Katana ships with no shaders as part of the program. There is a great SDK which, along with other hooks, does allow companies to produce their own ‘secret sauce’. Katana is not a closed environment – it can be, and is, extended. SPI has advanced rendering tools for physically plausible shaders (using OSL) and its own implementation of Arnold. That means SPI can produce superior results and custom solutions.

It would be wrong to think Katana and the other initiatives (many of which are open source) at SPI indicate any downsizing or outsourcing of R&D. Far from it: Rob Bredow points out that partnerships with companies like The Foundry free up SPI’s R&D team to focus on areas that can make a difference. “We did the deal with Katana as it had become a stable product internally,” says Bredow. “All the innovation was happening on top of Katana and once we realized that that core of Katana was pretty much stable, we started considering our options moving forward. Our vision is to build on this core that really we all share so we can focus on those things that end up on the screen as value added components.” These components are things like the major advances in sub-surface scattering seen in the new Amazing Spider-Man, suggests Bredow.

See our special fxguide story on the Lizard rendering in The Amazing Spider-Man.


Looking forward

SPI’s latest feature film VFX was for The Amazing Spider-Man.

Which artists are using Katana?

Every 3D artist at Digital Domain has access to Nuke and Katana. Like many facilities, DD’s pipeline was set up with 3D artists who were regularly doing slap comps, and thus the UI of a nodal scene flow diagram is very familiar to their 3D team.

The move to The Foundry selling Katana has one other huge benefit: the support offered by The Foundry team. This is a company used to dealing with high end productions and cutting edge technology. SPIN VFX joked that “pioneers are the ones with arrows in their backs,” but they could not be happier with the ‘outstanding’ support they have been getting from The Foundry.

Implementing a Katana pipeline does come with certain expectations and even requirements. While Katana is very flexible about what naming convention is used, it does need one. It can work with any well designed, rigidly enforced system, but to use Katana fully a facility needs such a system in place. In other words, Katana does not dictate the scheme a company must use, but to get the most from Katana one needs a solid naming and asset management system.
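What a ‘rigidly enforced naming system’ means in practice is often nothing more exotic than a published pattern and a check run at publish time. The convention below is entirely hypothetical – every facility will have its own – but it shows the kind of discipline that pattern-based rules (like the CEL example earlier) rely on.

```python
# Hypothetical publish-time naming check. The pattern is invented for
# illustration: <show>/<asset_type>/<asset>/<task>/v###/<file>
import re

NAMING_RULE = re.compile(
    r"^(?P<show>[a-z0-9_]+)/"
    r"(?P<asset_type>char|prop|env)/"
    r"(?P<asset>[a-z0-9_]+)/"
    r"(?P<task>model|lookdev|rig|anim)/"
    r"v(?P<version>\d{3})/"
    r"[a-z0-9_]+\.(abc|katana|tx)$"
)

def validate_publish(path: str) -> dict:
    """Reject anything that would break downstream pattern-based rules."""
    match = NAMING_RULE.match(path)
    if not match:
        raise ValueError(f"Publish path breaks naming convention: {path}")
    return match.groupdict()

print(validate_publish("spidey/env/city_block_07/lookdev/v012/dirty_glass.katana"))
```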

Full res screen shot of Katana in action relighting the hallway scene.

The Foundry have been developing Katana aggressively – not because Sony withheld any code, but because The Foundry’s Katana is render agnostic, whereas SPI is an Arnold facility. And not just any Arnold facility: the company supported Arnold’s development in-house for many years, and SPI has its own version of the Arnold software. In SPI’s case, this version uses Open Shading Language (OSL). While these shaders are great and work extremely well for Sony, the switch to Arnold from RenderMan happened before RenderMan version 13.5, so the Katana that The Foundry licensed did not include support for, say, co-shaders – which are key to a company like SPIN VFX and something The Foundry has worked very hard to make sure are included. (Co-shaders are RenderMan’s somewhat unique way to build ‘network’ shaders. Introduced in RPS 13.5, they are powerful, popular and avoid the need to generate monolithic shaders.)

There are no shaders provided as part of Katana, but there are some key hooks still in the code that The Foundry should be able to exploit moving forward. If you read fxguide’s account of the production of the new Spider-Man film, you will see that Sony took HDRI files from the Spheron camera, extracted and promoted key lights and, as they did with the Smurfs film, triangulated the real world area light positions to place these new lights back in the correct position – often as a Quadlight (a polygon, or ‘quad’, with a projected HDR light on it, so it is realistic and no longer a point source).

To do this SPI used something called ‘IBL_create’. There is nothing stopping any facility writing this ‘super tool’ themselves – away from Katana, companies like ILM do exactly this. The Foundry also has the capacity, and the core code from Sony, to do it, but so far it has not been able to roll this out. That is not due to anything other than engineering priorities and sensible scheduling.
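A crude version of that ‘extract and promote’ step can be sketched with nothing more than NumPy: find the hottest region of a latitude-longitude HDRI, convert it to a direction, and push an area light out along that direction. This is not SPI’s IBL_create – the true positions described above come from triangulating multiple captured HDRIs, whereas this sketch simply assumes a user-supplied distance and a single lat-long image.

```python
# Toy key-light extraction from a lat-long HDRI (not SPI's IBL_create).
# `hdr` is assumed to be a float32 array of shape (height, width, 3).
import numpy as np

def extract_key_light(hdr: np.ndarray, distance: float = 5.0):
    """Return an approximate position and colour for the brightest region."""
    lum = 0.2126 * hdr[..., 0] + 0.7152 * hdr[..., 1] + 0.0722 * hdr[..., 2]
    y, x = np.unravel_index(np.argmax(lum), lum.shape)
    h, w = lum.shape

    # Lat-long pixel -> spherical angles -> unit direction (y up).
    theta = np.pi * (y + 0.5) / h            # 0 at the zenith, pi at the nadir
    phi = 2.0 * np.pi * (x + 0.5) / w
    direction = np.array([
        np.sin(theta) * np.cos(phi),
        np.cos(theta),
        np.sin(theta) * np.sin(phi),
    ])

    # Without triangulating multiple HDRIs we can only push the quad light
    # out along the direction by an assumed distance.
    position = direction * distance
    colour = hdr[y, x]
    return position, colour

hdr = np.random.rand(512, 1024, 3).astype(np.float32)  # stand-in for a real HDRI
pos, col = extract_key_light(hdr)
print("place quad light at", pos, "with colour", col)
```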

SPI has even tested implementing this in The Foundry version – purely as a test exercise. While this is not quite finished, it appears that it is completely possible, given enough time and commitment.

As stated earlier, the third group of users – and third strength of Katana – is workflow and pipeline architects. It allows them to design, build and implement extremely flexible systems that are also fast. Ultimately, the number one question potential customers seem to ask is ‘Just how like the full SPI Katana is The Foundry version?’ The answer seems to be that The Foundry version is not only a full implementation, but a wider, more agnostic version, albeit with some extra bells and whistles still being rolled out.
