Forum Replies Created

Viewing 15 posts - 16 through 30 (of 40 total)
  • in reply to: finishing in Nuke (longish timelines) #215723
    Michael Dalton
    Participant

    @memo 23423 wrote:

    Yea I know flame/smoke would be ideal… but unfortunately they’re a bit above my budget!

    Fusion and Nuke seem sooo close (running on an 8core/raid) to what I need that its frustrating not to be able to go that one step further and finish the whole spot in the same app!

    All that I really need extra (or what I havent figured out how to do yet) is to be able select a bunch of nodes and group them (analogous to a nested Combustion composition), then give that group a start time – is something like that possible? Or is there any other way a 30 second video like http://www.msavisuals.com/ti_3gsm07 could be done in one Nuke project?

What you need is there for the most part. Grouping a series of nodes is there via the group command. Offsetting them in time is there via the offset node. Splicing a group after another group (think FFI desktop-style splice) with a dissolve in, out, or cross is done via the append node. So with a little tcl scripting you could easily parse an EDL and generate a series of node groups followed by append nodes to populate the timeline in a similar way to how you might do it in Combustion, but without all the bullshit of working in Combustion. You can also add retime nodes (optical flow) for your timewarp segments, which is pretty cool.
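The EDL-parsing step can be sketched outside tcl too. Below is a rough Python illustration assuming a minimal CMX3600-style event layout; the field pattern, frame rate, and the `parse_edl`/`tc_to_frames` helpers are my own assumptions, not Nuke's API:

```python
import re

# Sketch of the idea above: parse CMX3600-style EDL events and plan a
# chain of offset groups joined by append nodes. The original was tcl
# inside Nuke; this layout and these helper names are illustrative.

EVENT_RE = re.compile(
    r"^(\d+)\s+(\S+)\s+V\s+C\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})$"
)

def tc_to_frames(tc, fps=25):
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def parse_edl(text, fps=25):
    """Return one dict per video event: reel, source range, record offset."""
    events = []
    for line in text.splitlines():
        m = EVENT_RE.match(line.strip())
        if m:
            num, reel, sin, sout, rin, rout = m.groups()
            events.append({
                "event": int(num),
                "reel": reel,
                "src_in": tc_to_frames(sin, fps),
                "length": tc_to_frames(sout, fps) - tc_to_frames(sin, fps),
                "rec_in": tc_to_frames(rin, fps),  # becomes the group's time offset
            })
    return events

edl = """\
001  TAPE01 V C 01:00:00:00 01:00:02:00 00:00:00:00 00:00:02:00
002  TAPE02 V C 02:00:05:00 02:00:08:00 00:00:02:00 00:00:05:00
"""
for ev in parse_edl(edl):
    # In Nuke this is where you'd build a group per event, an offset node
    # set to rec_in, then chain the groups together with append nodes.
    print(ev["event"], ev["reel"], "offset", ev["rec_in"], "len", ev["length"])
```

The loop body is where the generated groups and append nodes would go; everything upstream is just turning record timecodes into the offsets each group needs.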

I wrote something similar for Shake a while back using dissolve nodes and switches for cuts, and the built-in retime for timewarps. Works fine. The only issue you're going to have – which I quickly ran up against – is that you're going to need to predefine your working format. I've been meaning to add it to Digres as well, which I'll shortly publish somewhere for everyone…

In fact I've got this Find for Conform script on FXguide that parses EDLs and will search for DPX files for corresponding events and add them to your Nuke job. I guess I just need to marry that code with a little bit of simple Nuke code for conforming events.

Well… until that happens you should be able to do it manually if you wanted to. The thing is, node-based editing can be a really strange place to live. It works, but it takes some time to get used to it.

    Best,
    Chris

    in reply to: …Flame set-up to Nuke #214945
    Michael Dalton
    Participant

    @aneks 23532 wrote:

the node still seems to be there but I am not sure what it would actually support or how well it works!

    t.

Well, in 4.7v2 there's no action node per se. What you do have is cards with a trans geo node, which work in screen space with 1 equal to the full width of the render size. Axis… I believe works in screen values as well. Cards support bilinear as well as bicubic transformations, so those will roll over with a little massaging. You then have trilinear (deform) support for projectors, cameras, obj/geo, lights, and simple shaders and textures. You don't really have particles (well, you do, but I wonder if they could possibly translate over).
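For what it's worth, the width-normalised screen space described above makes the coordinate conversion itself trivial when massaging values across. A tiny hedged sketch; the helper name and the assumption that both axes share the width scale are mine:

```python
# Hedged sketch: convert a Flame-style screen-space value (1.0 equals the
# full render width, per the convention above) to pixel coordinates.
# Whether Y really shares the X scale in a given setup is an assumption.

def screen_to_pixels(x, y, render_width, render_height):
    """Map width-normalised screen coords to pixel coords."""
    return x * render_width, y * render_width  # both axes scale by width

px, py = screen_to_pixels(0.5, 0.25, 2048, 1556)
print(px, py)  # 1024.0 512.0
```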

Obviously the keyer and colour corrector nodes aren't going to be the same, but the rest could in theory be parsed over to a Nuke scene.

    But who has the time…

    Best,
    Chris

    in reply to: Dbase integration questions…. #213454
    Michael Dalton
    Participant
    fbrandst wrote:
    You can write me a mail or pm, I have enough webspace to upload the file, if that helps.

Besides, I'm very interested in the demo so I'd be glad if you could send it to me…

    thanks

    franz

    Hi,

I've managed to get a slot on our FTP. Anyone who's interested in seeing it, please contact me at [email protected]

    Take care,
    Chris

    in reply to: Dbase integration questions…. #213453
    Michael Dalton
    Participant
    fbrandst wrote:
the link seems to be broken, or not working because of bandwidth problems. Maybe you can post it on another server, so everybody can watch your demo.

It is indeed broken… Apple made me take it down after 10 gigs of downloads in half a day. I'm trying to see if I can find another place to post it… something which is turning out to be more problematic than I thought.

If anyone has any ideas, let me know. Otherwise I'll post it to anyone who's interested.

    Best,
    Chris

    in reply to: flint conforming #215629
    Michael Dalton
    Participant
    kuban wrote:
    merv wrote:
hi… can I just confirm that if I want to conform tapeless I should select import instead of capture in the EDL page after I have imported the EDL? Hope it's a clear enough question… lol… cheers… merv

No, you import the image sequences from the “import image” menu, but you give the correct tape names during importing (the default name is “IMPORT”).
Then save all the imported clips into a library reel, go to the EDL menu, and point the EDL manager at the reel; it should find the source clips. Then you just assemble your edit. So you visit the EDL menu after you have imported all your material.

In 2007 SP3 ext1 this behavior has been upgraded slightly. What you can do now is park at the root level of your scan directory, set your filetype to dpx, set your tape name to dpx tape, dir tape, or manual tape, set your timecode and keycode in a similar fashion, and then hit the recursive scan button. This will find every single DPX sequence possible and automatically enter all of the info required for doing a file-based conform. When it's finished recursively searching for the files, you can either soft-import everything or hard-import everything to a single reel (even directly into the library, if that's where you entered import image from), then go to the EDL tool and away you go.
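As a rough mental model of what that recursive scan does, here's a hedged Python sketch that walks a directory tree and groups frame-numbered DPX files into sequences. The naming pattern and the `find_dpx_sequences` helper are illustrative assumptions, not Flame's actual implementation:

```python
import os
import re
import tempfile
from collections import defaultdict

# Assumed naming convention: <basename><frame number>.dpx
FRAME_RE = re.compile(r"^(.*?)(\d+)\.dpx$", re.IGNORECASE)

def find_dpx_sequences(root):
    """Return {(dir, basename): sorted frame numbers} for every sequence found."""
    seqs = defaultdict(list)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            m = FRAME_RE.match(name)
            if m:
                seqs[(dirpath, m.group(1))].append(int(m.group(2)))
    return {k: sorted(v) for k, v in seqs.items()}

# Usage: build a tiny fake scan tree and list what a conform would see.
with tempfile.TemporaryDirectory() as root:
    shot = os.path.join(root, "reel01", "shot010")
    os.makedirs(shot)
    for f in range(101, 106):
        open(os.path.join(shot, "shot010.%04d.dpx" % f), "w").close()
    for (d, base), frames in find_dpx_sequences(root).items():
        print(base, frames[0], "-", frames[-1])  # shot010. 101 - 105
```

The real tool of course also reads headers for timecode/keycode; this only shows the "find every sequence under the root" part.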

It really makes the file-based conforming workflow easy to use.

    Good luck,
    Chris

    in reply to: Dbase integration questions…. #213452
    Michael Dalton
    Participant

    Hi All,

    So here’s the first recordings that are available:

    http://homepage.mac.com/amzunino/digres/FileSharing8.html

It's a bit heavy, and I curse a little too often, but it's a decent 25-minute intro to what it's all about. There's a *significantly* larger amount of depth than what is shown; it just takes time to get through everything.

    All comments are welcome.

    Best,
    Chris

    in reply to: Dbase integration questions…. #213451
    Michael Dalton
    Participant
    loops wrote:
    I’d love to see how you have that working. Shouting across the room when you’ve finished a roto doesn’t always work so well 🙂

    I have a Toxik license via fxphd so I could compare and contrast if you like.

Thanks for that. Most of the GUI stuff in Nuke is still either work in progress or proof of concept. Regardless, everything is controllable via tcl, which means that if someone was so inclined they could throw together an interface in tk, which would have access to all of the same functions.

    Regardless, I’ll be putting my dog and pony show together sometime during the week, so I’ll post more details when I’ve got ’em. Really cool to hear that there’s some interest.

    Best,
    Chris

    in reply to: HOW to import motion control data in flame #215239
    Michael Dalton
    Participant
    jayfxjay wrote:
He exported data for Flame for us, but this is a raw format which my Flame does not recognize.
Any suggestions?

Chances are you're trying to load the data as an action setup? Instead, select the camera, then go to load, change from action to raw, and select the raw file to import.

    Not in front of the machine but I think that’s how it goes.

    Best,
    Chris

    in reply to: Dbase integration questions…. #213450
    Michael Dalton
    Participant
    cnoellert wrote:
    Hello everyone,

    I had a couple of questions that I wanted to throw out into the Nuke abyss.

How many of you guys have used Toxik and its little database features?
What was nice about it and what blew? If you had similar dbase capability
in Nuke, what would be the top features you'd like to see?

I'm investigating moving sections of our DI dbase over to Nuke – seems like
pgtcl is working under Linux and Mac OS X – and in the process adding several
more sfx-related features. I've got my own ideas but I'm just curious about
what other people really miss/would like to have. I'm thinking of
eventually making it public.

    On or off the list is cool. [email protected]

    Best,
    Chris

    Hello all,

After almost nine months of working mildly on and off on this, I've put together some baseline functionality. Today I can say that I've got Nuke running in a collaborative environment against a Postgres database.

There is user/group management with full role support (Admins, Supers, Ops and Juniors); project containers built of Scenes, Shots and Elements; supervisor-based assignment of objects to groups and/or individuals; Op/Junior ability to claim objects within their rights/role; a full revisioning system; cascading notifications; basic media management and revisioning support; and fully scriptable project creation templating via tcl…

    …and some other stuff. It’s not exactly a plug and play thing, but if anyone is interested in my little hobby project -digres- as I’ve been calling it, please let me know. I’m going to shortly make some screen recordings to illustrate some of the base line features, but will need somewhere to post them, so if anyone has any ideas please let me know.
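To make the container hierarchy concrete, here's a hedged sketch of that kind of schema. digres itself ran against Postgres via pgtcl; this stand-in uses sqlite3, and every table and column name below is my own illustrative assumption, not digres's actual design:

```python
import sqlite3

# Illustrative stand-in for a Projects > Scenes > Shots hierarchy with
# roles and revisions, in the spirit of what is described above.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE users    (id INTEGER PRIMARY KEY, name TEXT,
                       role TEXT CHECK (role IN ('admin','super','op','junior')));
CREATE TABLE projects (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE scenes   (id INTEGER PRIMARY KEY, project_id INTEGER REFERENCES projects);
CREATE TABLE shots    (id INTEGER PRIMARY KEY, scene_id INTEGER REFERENCES scenes,
                       name TEXT, assigned_to INTEGER REFERENCES users);
CREATE TABLE revisions(id INTEGER PRIMARY KEY, shot_id INTEGER REFERENCES shots,
                       version INTEGER, path TEXT);
""")
db.execute("INSERT INTO users VALUES (1, 'chris', 'super')")
db.execute("INSERT INTO projects VALUES (1, 'demo')")
db.execute("INSERT INTO scenes VALUES (1, 1)")
db.execute("INSERT INTO shots VALUES (1, 1, 'sh010', 1)")
db.execute("INSERT INTO revisions VALUES (1, 1, 1, '/jobs/demo/sh010_v001.nk')")

# A supervisor-style view: who owns which shot, at what latest version.
row = db.execute("""
  SELECT u.name, s.name, MAX(r.version)
  FROM shots s JOIN users u ON u.id = s.assigned_to
  JOIN revisions r ON r.shot_id = s.id GROUP BY s.id
""").fetchone()
print(row)  # ('chris', 'sh010', 1)
```

The point is only that the features listed (roles, containers, assignment, revisions) map naturally onto a handful of relational tables that a compositing app can query.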

    Best,
    Chris

    in reply to: importing film cineon files #215007
    Michael Dalton
    Participant
    loops wrote:
    With so many things being DI’d at 2k now, that problem must be dwindling away – although losing detail in an entire feature instead of a few shots is clearly not good 🙂

    How does 4k cut with OCN to your eyes? I’m a lowly compositor and don’t get to check these things out for myself 🙂

Well, that's the catch-22, isn't it? I read some speculative figure that 25% of features now released in the States are DIs, so in that regard the problem is still very much there – although admittedly it's mainly the US VFX-heavy blockbusters that go full DI (as well as their polar opposites, the indie films), so the problem decreases. But there's a middle ground of shows that are still going photochemical/analog with a real master and negcut because there isn't really a need to go DI. Hence the opticals and visions.

As far as 4k is concerned, it cuts significantly better against the “original” material compared to 2k. Colour (which is what actually defines the elusive term “sharpness”) is purer due to fewer scaling issues introduced while trying to describe such a significantly higher resolution with so few lines. Many argue that Super2k (4k scans Nyquist-subsampled to 2k) is a really nice alternative to a full-on 4k workflow, but I think that's just a band-aid solution for ill-designed workflows and poor purchase decisions – when it comes to full DIs and opticals, that is. VFX is a slightly different animal which can, due to the vast complexity of some shots, be forced to remain on the small canvas.
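As a toy illustration of why the Nyquist prefilter matters when subsampling 4k down to 2k, compare naive decimation with a crude box-filtered decimation. The box filter is only a stand-in; real scanners and resizers use far better kernels:

```python
# Toy model of downsampling: detail at the finest frequency the "4k"
# raster can hold either aliases into a false flat signal (naive drop)
# or folds into smooth grey (low-pass first, then drop).

def decimate(samples):
    """Naive 2x downsample: drop every other sample (aliases)."""
    return samples[::2]

def filtered_decimate(samples):
    """Average neighbouring pairs before dropping (crude low-pass)."""
    return [(samples[i] + samples[i + 1]) / 2
            for i in range(0, len(samples) - 1, 2)]

detail = [0, 1] * 8  # alternating black/white, the finest possible detail
print(decimate(detail))           # [0, 0, 0, 0, 0, 0, 0, 0]
print(filtered_decimate(detail))  # [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]
```

Naive decimation manufactures a signal that was never there, which is exactly the scaling artifact the passage above attributes to describing high resolutions with too few lines.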

Bear in mind that the vast majority of providers of DI workflows have little to no experience in traditional film workflows and film calibration. That's now changing, of course, but there's still more fluff than substance being thrown around, with many players grabbing for middle ground and only one (Filmlight) really landing up top.

    My opinions of course (although they are right 😉 )

    Best,
    Chris

    in reply to: importing film cineon files #215006
    Michael Dalton
    Participant
    paul_round wrote:
We work linearly, using in-house LUTs for importing/exporting, viewing on the monitor again with an in-house-written LUT simulating the look of the final, ungraded print. I can't stress enough the importance of building a transparent pipeline for film work; your goal should be to ensure your output cuts directly with original camera neg seamlessly.

Out of mild curiosity, Paul, is the vast majority of your work 2k? Almost all 2k material I've seen (OCN 35mm 3/4-perf) really doesn't match the “sharpness” of the material it's being cut against. We do a ton of opticals and effects cut against OCN and the only way forward has been to go for the big canvas.

For 50 ASA 16mm, 2k doesn't always feel like it's “enough.”

    Best,
    Chris

    in reply to: split screen of off-line/on-line #215416
    Michael Dalton
    Participant
    newone wrote:
    What is the quickest way to set up a split screen to compare an off-line and an on-line with audio? Is a simple 2 layer action set-up what everyone does?
    Sorry for the lame question just wondering if there are other ways!

The easiest way is absolutely going to be Batch with a two-up view, one side set to result and the other side set to view context1. Then load up two input nodes, one with the online conform and one with the offline.

Doing it this way you can also use the Batch timeline to make edit changes, as well as use the segment effects stuff.

    Best,
    Chris

    in reply to: truelight in shake 4/4.1 #215309
    Michael Dalton
    Participant
    adi wrote:
    does anybody use shake with truelight trying to display dpx log files as final film projection result?

We use Truelight and we calibrated our monitors with a Blue Eye 2 colour probe from LaCie.

The only issue I have is that Truelight in Shake has just some general presets, and I would like to have presets for each film type (negative and positive).

    how do you guys make sure that what you see is what you get in the theatre projection?

    thank you

    adrian cruceru
    digital FX

The Truelight version which ships with Shake is not the “full” version and represents only a very small fraction of what Truelight is actually capable of. A full Truelight calibration suite includes the cube maker, which enables you to create your own cubes. Contact Filmlight for more details.

This includes more stock presets for neg, print, monitor and projection, as well as a series of tools, both visual and CLI, for tweaking your results to perfection. But the true power of Truelight lies not in using the presets but in creating your own. There's a strip available from Filmlight containing all of the patches Truelight needs to read for a full calibration. Rolling those strips out on your Arri on, say, 44 gives you a neg to profile. Striking a print of that neg on, say, 93 gives you a print to profile. Displaying the print on your reference film projector gives you the projector profile. And lastly, using the Truelight probe (monitor/projector) to profile your display/DLP gives you your source monitor profile. Truelight can then track the transformation of colour from one medium through another to your final projection with fairly high accuracy.
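Mechanically, a colour cube is just a 3D lattice of output colours sampled with trilinear interpolation. This hedged sketch shows the lookup step only; Truelight's real cube format and maths are more involved, and the identity cube here is purely illustrative:

```python
# Sketch of a 3D LUT "cube" lookup. A real calibration cube would hold
# measured colour transforms at each lattice point; an identity cube is
# used here purely so the result is easy to check.

N = 17  # lattice points per axis, a common cube size

def identity_cube(n=N):
    s = n - 1
    return [[[(r / s, g / s, b / s) for b in range(n)]
             for g in range(n)] for r in range(n)]

def apply_cube(cube, rgb):
    """Trilinear lookup of rgb (each channel in 0..1) through the cube."""
    n = len(cube)
    pos = [min(c * (n - 1), n - 1 - 1e-9) for c in rgb]  # lattice coords
    i0 = [int(p) for p in pos]
    f = [p - i for p, i in zip(pos, i0)]                 # fractional parts
    out = [0.0, 0.0, 0.0]
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                sample = cube[i0[0] + dr][i0[1] + dg][i0[2] + db]
                for c in range(3):
                    out[c] += w * sample[c]
    return tuple(out)

cube = identity_cube()
print(apply_cube(cube, (0.25, 0.5, 0.75)))  # ≈ (0.25, 0.5, 0.75)
```

The chain of profiles described above (neg, print, projector, monitor) is what determines the colours stored at the lattice points; the lookup itself is this simple.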

A few caveats. Cubes are designed for one chain of colour transformations: one display, one type of neg developed at one lab, one print struck and developed on one kind of printer at one lab, projected on one projector. That's when they are at their most accurate. Change your neg type and bets are off. Change to 83, all bets are off. Change from a bell4 to a bell6 – nix. So cubes are not perfect. But they are a guide. You keep profiling, keep updating.

Second, if you're at a large installation, generating a “good enough” LUT which you use for the bulk of machines, coupled with one workstation and reference monitor/projector calibrated as well as humanly possible, is arguably a smarter deployment.

Regardless, good luck. And in answer to your question: we use Truelight, both in hardware via the Truelight SDI box and in software via Shake and Baselight.

    Chris


    Chris Noellert
    Senior Flame / Digital Post Technical Director / Prima donna

    Nordiskfilm Post Production Stockholm
    (Formerly Filmteknik/Frithiof Film to Video. AB)
    Tulvaktsvagen 2
    115 40 Stockholm

    Tel: +46 8 450 450 00
    Fax: +46 8 450 450 01
    Dir: +46 8 450 450 17
    Mob: +46 7 024 616 31
    AIM: cmnoellert

    Reel: http://se.nordiskfilm-postproduction.com/movieviewer.aspx?movie=Video_250018.mov
    Web: http://www.nordiskfilm-postproduction.com

    in reply to: cine-tel monitor in film cfg #215011
    Michael Dalton
    Participant
    newone wrote:
    Yes,
    We have discovered that! We are going to give RISING SUN RESEARCH a chance.
The desire for us is to perfectly match the lab that will do the film-out…

    I think I give this lecture once a year 😉

There are a few things to bear in mind when working with LUTs and film-out.

    So first things first grab a digital lad from Kodak:

    http://www.kodak.com/US/en/motion/support/dlad/dlad_2048X1556.cin

On a print from a correctly recorded neg, the big grey patch will measure somewhere around 1.09, 1.06, 1.03 status A. It's the universal checkpoint for the laboratory. ALL chemistry in modern labs is set up to ensure that they can hit this aim as closely as possible. That means we have a good starting point for success.

A good 24 frames of the digital LAD should go on the front and back ends of every Cineon sequence you send to the film-out house, with a note on it saying that the LAD patch on the print needs to hit a status A as close as possible to 1.09, 1.06, 1.03. For me the norm tends to hover around 30, 30, 18, because proper print-density Cineon files are sampled from the base up.

The thing is, most labs' printers only work in full points, which are .07 density, which means a LAD patch that hits 1.04, 1.06, 1.01 is still acceptable but may look different to what you're used to – though not by much. When you check the prints, ask the lab to check the status A of the print first thing and tell you where it is. Don't accept a print .07 above or below the aim. Make them redo it.

    This will become your mantra.

It's also worth mentioning that no two positives will ever be the “same”, simply due to the nature of how the photochemical process works – actually, how film works, period. In some labs the chemistry can and will drift during the same print – hence printing the LAD on the front and back of your sequences. The smartest procedure would be to establish your LUT not from a single neg/pos iteration, but from the mean of a few neg/print combos to specific lab aims, which are as close to repeatable as possible. Expect a deviation of at least .05 density up or down per channel, but demand nothing more.

I would also establish a reference theatre/screen that is the definitive visual aim for your material. It's important to bear in mind that a LUT is meant to be a visual representation of perceived colour from one specific viewing medium, through a series of different mediums, to a resulting specific display medium. That's to say from one monitor/projector, through one intermediate neg, through one print, to one film projector throwing to one screen. Changing monitors or projectors, neg or pos stock means all bets are off and you calibrate again.

LUTs are not perfect – not even the 3D ones. But if you know what to ask for, you can increase your chances of getting approved on your first iteration.

    Chris

    in reply to: DI Industry #214255
    Michael Dalton
    Participant

    Start here:

    http://tig.colorist.org/wiki3/index.php/FAQ

    Good luck,
    Chris
