Forum Replies Created
claudio antonelli (Participant)
Definitely contact Autodesk before you buy one. I’m not super informed, but I believe their policies on license transfer have changed in the past year.
claudio antonelli (Participant)
April 17, 2011 at 1:58 am, in reply to: "That doesn’t look the same as when I saw it in blah blah blah" #218383
Look at it from their perspective: they’ve spent weeks getting their corporate client to sign off on the color of the spot, only to have it now look too cyan because of the monitor. If they’re even remotely new, don’t have a really good relationship with their client, or aren’t a great salesman, then lord knows what the client will do if they see all their golden yellows looking a bit… green.
While teaching people about optical perception (not to mention the wild differences in various TV models) could solve the problem, the effectiveness of any lessons depends on the personal relationships. Like every other piece of information, how people respond has a lot more to do with how they view you than with any technical reality. If your guy trusts you and can communicate easily with their client, then a simple sentence or two will alleviate the issue across all fronts. If they’re science-y types, they’ll probably really enjoy the TED talk. However, if the relationships are new, or not great, the only thing you can do is plonk them down in front of one of those Sony CRTs and say “this is color truth, we calibrated it ten minutes ago and nothing will be a truer picture anywhere in the universe.”
The only real trick to any of it in the latter scenario is to make whoever you’re dealing with feel heard and understood. If you dismiss them because the science doesn’t agree, they’re going to think you’re a condescending prick. If you hear them out and present solutions that they can easily up-sell to someone with even less understanding of the post process, they’re going to be grateful.
claudio antonelli (Participant)
If you are resizing on import and have the resize algorithm set to something like “impulse,” it will be very fast and sloppy about the resize. Setting it to one of the nicer settings, as you noted below, is a better idea.
claudio antonelli (Participant)
At their default values, I go between Lanczos and Gaussian. Gaussian is softer, which is sometimes good and sometimes bad. Lanczos has a nice amount of sharpening, but can exacerbate razoring or noise issues in certain types of footage.
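The sharpening/softening tradeoff described above comes straight from the kernel shapes. A minimal sketch in plain Python (illustrative math, not Flame’s actual filter code): the Lanczos kernel is a windowed sinc with negative lobes, which is where the edge enhancement and the potential ringing both come from, while a Gaussian is strictly positive and can only smooth.

```python
import math

def lanczos(x, a=3):
    """Lanczos kernel: a sinc windowed by another sinc; the negative
    lobes are what over/undershoot edges (sharpening/'razoring')."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def gaussian(x, sigma=0.5):
    """Gaussian kernel: strictly positive, so it only ever softens."""
    return math.exp(-(x * x) / (2 * sigma * sigma))

print(lanczos(1.5))   # negative lobe -> edge enhancement, possible ringing
print(gaussian(1.5))  # positive everywhere -> pure smoothing
```

Sampling at x = 1.5 taps lands in Lanczos’s first negative lobe, which is exactly the weight that exaggerates noise and aliasing in busy footage.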
claudio antonelli (Participant)
smoke/flame media is uncompressed and takes up roughly the same space as an equivalent dpx file.
I forget how much a single HD dpx frame is, but I think it’s around 9-10 meg at 10bit color depth. Multiply that by your frames per second and go from there.
I don’t know why, but I’m always amazed when the math works out. We had to bring in some massive amount of media and my engineer figured out exactly the space needed. It was like magic to me. Haha.
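The back-of-the-envelope math above is easy to check. A hedged sketch in Python, assuming 10-bit RGB packed into one 32-bit word per pixel (exact DPX sizes vary a little with headers and packing, so treat these as ballpark figures):

```python
# Rough uncompressed-frame storage math. Assumption: 10-bit RGB is packed
# into a 32-bit word per pixel, which is the common DPX layout; headers
# add a small amount on top.
width, height = 1920, 1080
bytes_per_pixel = 4  # 3 x 10 bits padded to 32 bits

frame_bytes = width * height * bytes_per_pixel
frame_mb = frame_bytes / 1e6  # ~8.3 MB, in the same ballpark as the 9-10 meg rule of thumb

fps = 24
seconds = 60
total_gb = frame_bytes * fps * seconds / 1e9  # ~12 GB per minute
print(f"{frame_mb:.1f} MB/frame, {total_gb:.1f} GB per minute at {fps} fps")
```

Multiply by your actual frame rate and duration and the engineer’s “magic” falls right out.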
claudio antonelli (Participant)
Not currently in the way that plugin does it, no.
You can do a lot of hacks with projectors and bicubics, so while I see the usefulness in UV mapping, I’ve never found myself yelling “IF I ONLY HAD A UV MAP!!”
claudio antonelli (Participant)
If I recall correctly, the only version of the software that had paintable arrows was Inferno in the IRIX days. I certainly haven’t seen them in a while on either Inferno or Flame.
So to answer the question, I don’t think there’s a way to manually edit the vectors any more.
claudio antonelli (Participant)
Getting particles to emit out of an image like rain is pretty easy: make an image plane, bicubic, or whatever your emitter.
To have many images rain down doesn’t really work in flame since all the particles share the same texture from the media layer.
There are also some rain presets if you’re going for basic, good old rain.
claudio antonelli (Participant)
Tinder’s got a great one.
If Tinder’s not available, I generally end up using a combination of blurs and displacement, possibly some Sapphire distort, with a particle system as the generator for all of it. Give the particles some variance in speed and a “transparency = lifetime” function to govern the falloff.
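The falloff idea can be sketched in a few lines. This is a hedged illustration in plain Python, not Flame’s or Sapphire’s actual particle API; the `Particle` class and its fields are invented for the example. Each particle gets a random speed, and a lifetime-driven transparency stands in for the “transparency = lifetime” expression:

```python
import random

class Particle:
    """Illustrative particle: random speed plus lifetime-driven fade.
    Not a real Flame/Sapphire API -- just the math of the technique."""
    def __init__(self, lifetime_frames):
        self.speed = random.uniform(0.5, 2.0)  # variance in particle speeds
        self.lifetime = lifetime_frames
        self.age = 0
        self.y = 0.0

    def step(self):
        self.age += 1
        self.y += self.speed  # each particle falls at its own rate

    @property
    def transparency(self):
        # 0 = fully opaque at birth, 1 = fully transparent at death,
        # i.e. a "transparency = lifetime" style falloff
        return min(self.age / self.lifetime, 1.0)

p = Particle(lifetime_frames=50)
for _ in range(25):
    p.step()
print(p.transparency)  # 0.5 -> half faded at mid-life
```

Driving a displacement or distort amount from the same lifetime value gives the trailing, dissolving look.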
claudio antonelli (Participant)
If you load up multiple EDLs in flame’s EDL module (I haven’t done this in a while) and set them all to have the same destination reel, there’s a button that defines whether the captured clips are cut up based on the EDLs or left as large clips.
claudio antonelli (Participant)
+1
I kept coming back to John Montgomery’s noise functions for particles.
Also, with all the new features in flame (hello action multi out!) there’s a good amount of room for some new tips and tutorials.
claudio antonelli (Participant)
Turn on “Paint F+M” and feed a black clip into the matte. The matte output of the paint node will have all your strokes on it.
The one thing that could get tricky is that it even records things like “reveal” as strokes. Since I use “reveal” as an eraser as often as I use it to reveal various layers, I don’t tend to pipe out a paint-stroke matte very often, and for things like regraining I may just go the lazy way with an overcranked difference matte.
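The “overcranked difference matte” trick can be sketched in plain Python (a hedged illustration, not Flame’s node graph; the function name and gain value are invented for the example): take the absolute difference between the painted and original frames, then multiply by a gain well past 1.0 so even faint strokes clip to solid white.

```python
# Hedged sketch of an "overcranked" difference matte: per pixel, take the
# strongest per-channel difference between painted and original, then
# overcrank the gain so faint paint strokes clip to full white.
def difference_matte(original, painted, gain=20.0):
    matte = []
    for orig_px, paint_px in zip(original, painted):
        diff = max(abs(p - o) for o, p in zip(orig_px, paint_px))
        matte.append(min(diff * gain, 1.0))  # overcrank, then clip at white
    return matte

original = [(0.0, 0.0, 0.0)] * 4
painted = [(0.0, 0.0, 0.0), (0.1, 0.05, 0.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
matte = difference_matte(original, painted)
print(matte)  # [0.0, 1.0, 0.0, 0.0] -> the faint stroke clips to solid white
```

The lazy part is exactly that gain: you don’t care about matte softness for regraining, only whether a pixel was touched at all.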
claudio antonelli (Participant)
Flame’ll let you apply optical flow data to alternate clips, but I’m doubtful that the optical flow data will distort an image in a usable way. Worth trying… If I get a free moment I’ll take a swing at it.
Also, your YouTube link isn’t working (looks like the URL got truncated). I’m curious to see it.
edit: looks like the new server may have something to do with screwing up the link since it’s also adding backslashes before apostrophes.
claudio antonelli (Participant)
I’d love a translation, even something brief just outlining the effects and transfer modes.
claudio antonelli (Participant)
Alan Latteri has the tip on his website here:
http://instinctual.tv/instinctual/flame_tips/flame_tips.html
I can’t find it on the autodesk area anymore.
The basic trick is to camera track something, add your backplate as an image to the scene, make it a child of the tracked camera, and push it back in Z until, when looking through the tracked camera, it matches up with the actual backplate. Then dupe the camera, smooth out its motion, and use it to look at your image that’s attached to the tracked camera.
very handy.
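The “push it back in Z until it matches” step has a simple bit of geometry behind it. A hedged sketch (illustrative math only; the function name and parameters are invented, not Flame’s UI): for a camera with a given horizontal field of view, a plate of width w exactly fills the frame at distance (w/2) / tan(fov/2).

```python
import math

def fill_frame_distance(plate_w, horizontal_fov_deg):
    """Distance at which a flat plate of width plate_w exactly fills the
    frame of a camera with the given horizontal field of view.
    Illustrative helper, not part of Flame."""
    half_fov = math.radians(horizontal_fov_deg) / 2.0
    return (plate_w / 2.0) / math.tan(half_fov)

d = fill_frame_distance(plate_w=1920, horizontal_fov_deg=90)
print(d)  # ~960 -> at a 90 degree FOV, push back about half the plate width
```

In practice you just eyeball it through the tracked camera as the post describes, but the formula explains why a narrower lens needs the plate pushed much further back.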