Tracking Architecture on live footage
This topic has 6 replies, 5 voices, and was last updated 14 years, 11 months ago by Anonymous.
November 11, 2005 at 2:58 pm · #200419 · TomD (Participant)
There’s a very good chance that my team will be embarking on a very large animation project. I’d like to get some impressions from those who are experienced in architectural visualization and the workflow associated with these types of projects.
I’ll present a simple overview and ask a couple questions. Any responses are greatly appreciated.
Overview:
We’ll need to match-move to helicopter shots – HD footage (digital). The architectural models will have to be photoreal and blend seamlessly into the environment that’s shot.
Software:
Maya 7 Unlimited with the software and mental ray rendering engines will be used to get as realistic a look as possible. discreet flame (version 9.5) will also be available for all compositing work.
Questions:
1. I have had great luck with limited 3D tracking in flame, but there are probably better standalone solutions. What are your recommendations for tracking software to recreate the actual environment around the architecture that will be created? Is Maya Live up to the task, or is this something that Boujou or PFTrack will be needed for? flame has a great integrated tracker that can export to FBX format, which I think we might be able to use in Maya.
In short, what is the best choice for the tracking and recreating the environment to situate the models on? The match move has to be perfect.
2. Compositing? What types of passes would be best to render for compositing in flame? I’m assuming AO, specular, and diffuse passes for the most part. I’m also assuming that we’ll use HDRI lighting for everything.
Any resources (visually) to look at would be well received. I’m trying to overload myself with information so I can make the best decisions going forward.
thanks.
November 11, 2005 at 4:22 pm · #211095 · Anonymous (Inactive)
I’ve had great luck tracking things in SynthEyes… plus it’s waaaaay less expensive than the other solutions.
November 11, 2005 at 5:56 pm · #211091 · kalthans (Participant)
from our experience with Boujou (now at 3.1) i can say that it makes VERY short work of 3D camera matching, even with questionable material. a lot of people balk at the price tag but i think it is worth every penny (keep in mind that a lot of people balk at FFI’s price, too).
the new version has provision for non-fixed (zoom) lens shots but it is not a panacea for all types of zoomed imagery. there was an aerial shot (a grand sweeping aerial looking down, with zoom and gimbal lock) in our studio recently that Boujou could not track, and we ended up sending the shot to 2d3, and they spent several days on it making it work.
if you have any chance at all to influence the aerial photography parameters before they happen i’d recommend limiting the amount of zoom since that is probably the single largest thing to throw 3d tracking software off.
since you’re talking about tracking aerials of cities or buildings i think you might be in for some luck (unless there are a lot of reflective surfaces), since that type of imagery will no doubt present tons of available features to autotrack and extrapolate data from.
as for HDRI i am curious to hear what type of image sequences you end up using to generate your HDR images. my first thought concerning HDRI when you mentioned aerials is the inherent difficulty in gathering HDRI sequences (from a classic mirror-ball technique) when you are shooting from a helo. do you plan on having a team on the ground at the same time as the aerials are being shot? i wonder if the mirror-ball method will need to be modified to suit a grander scale of architectural proportions (maybe shooting an HDRI sequence from ground or low-rooftop level in the proximity of where the 3d building will be located in the scene, with a fisheye lens pointed up?…i dunno…i’m just theorizing here). please let us know how you end up implementing HDRI in this scheme…i’d love to hear!
good luck
k
November 12, 2005 at 2:23 am · #211093 · Anonymous (Inactive)
I’d also recommend SynthEyes. It’s very efficient, extremely fast, and not expensive at all.
I’m now using it full time, and the results are very impressive (it’s so fast, I still can’t believe it).
It has a lot of nice features, like geometry reconstruction that can be achieved in a few clicks and gives great results for shadow catchers or accurate reference objects for a 3D scene.
Don’t forget to note the camera and lens specs for each shot. They are REALLY important. (If you can also get any survey data, that could help a lot.)
Also, note that if you shoot from a helicopter, the camera will be subject to a lot of vibrations, so you’ll certainly need to stabilize (or smooth) the camera movement. The new smoothcam node in Shake is very fast, gives excellent results, and saves a lot of time in comparison to old stabilisation/smoothing methods.
It worked great even on proxies of HD footage. It’s really the tool to use, if you want to quickly smooth a lot of erratic shots in a short amount of time.
It could be a good companion to your flame workstation.
For the 3D part, I suggest you render only exactly what you need, to avoid heavy compositions.
I guess that in your case, the Z channel will be the most useful, as you’ll have to simulate a lot of environment effects.
For the other passes, you’ll have to evaluate your needs.
November 14, 2005 at 12:55 pm · #211096 · Anonymous (Inactive)
FLGB wrote: I’d also recommend SynthEyes. It’s very efficient, extremely fast, and not expensive at all.
I’m now using it full time, and the results are very impressive (it’s so fast, I still can’t believe it).
It has a lot of nice features, like geometry reconstruction that can be achieved in a few clicks and gives great results for shadow catchers or accurate reference objects for a 3D scene.
Interesting. As an addition to this thread, has anyone used REALVIZ MatchMover Pro? I understand that the flame/inferno tracking software is actually this (or is it only in inferno?). I’ve taken a look at it, and although it’s more expensive than SynthEyes, it seems highly capable.
The one thing I do not like about SynthEyes is that I can’t see exactly how it would fit into a workflow for use in Maya. I saw that MM pro has a plug-in for Maya and Max … anyone have experience with it? Knowing I can track footage right inside Maya and have a scene set up already is very appealing.
FLGB wrote: Also, note that if you shoot from a helicopter, the camera will be subject to a lot of vibrations, so you’ll certainly need to stabilize (or smooth) the camera movement. The new smoothcam node in Shake is very fast, gives excellent results, and saves a lot of time in comparison to old stabilisation/smoothing methods.
It worked great even on proxies of HD footage. It’s really the tool to use, if you want to quickly smooth a lot of erratic shots in a short amount of time.
It could be a good companion to your flame workstation.
That’s something I haven’t heard yet. Another thing I noticed is that The Pixel Farm has a tracking plugin for Shake, PFMatch, which can export a flame Action setup and data for Maya. This is actually pretty interesting, since it could bridge the workflow between the two stations pretty seamlessly. Anyone use Shake – or a plugin inside Shake – for tracking data instead of any of these third-party tools? Keeping down the number of software packages needed to perform such a job would be nice.
Also, what kind of benefits would one get from using Shake instead of flame for the stabilizing? Is it just a faster workflow or is the stabilizing actually more accurate?
FLGB wrote: For the 3D part, I suggest you render only exactly what you need, to avoid heavy compositions.
I guess that in your case, the Z channel will be the most useful, as you’ll have to simulate a lot of environment effects.
For the other passes, you’ll have to evaluate your needs.
Completely agree. Now, if only Maya would export an RLA/RPF-format file that flame supports, I’d be a happy camper. For now, I’m looking at mental ray being the renderer of choice, so we’d probably export a separate Z pass for this type of work (and for DOF, if needed).
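As a rough illustration of what a Z pass buys you in the comp, here is a per-pixel exponential haze, sketched in plain Python. This is NOT a flame or mental ray operator – the function and parameter names are invented, and a real comp would do this per-pixel over whole images:

```python
import math

def apply_haze(rgb, z, haze_color, density):
    """Blend one rendered pixel toward a haze colour as its depth
    increases - the kind of environment effect a Z channel enables
    in the comp.  Exponential-fog sketch; names are illustrative."""
    t = 1.0 - math.exp(-density * z)   # 0.0 at the camera, -> 1.0 far away
    return tuple(c * (1.0 - t) + h * t for c, h in zip(rgb, haze_color))
```

A pixel at the camera keeps its rendered colour; distant pixels drift toward the haze colour, which is why a float Z channel matters for simulating atmosphere or DOF downstream.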
good thread and great information so far.
November 14, 2005 at 3:08 pm · #211094 · Anonymous (Inactive)
Well, as far as incorporating SynthEyes into your pipeline, it’s pretty much been this for me:
1. Import the video and track it. If the auto-tracking fails, manual tracking is pretty painless and easy in SynthEyes.
2. Set one tracked point up as the origin, and set the distance between two tracked points to their real-world value. Update the tracking data (it just scales the points/camera and reorients them around the origin).
3. Export the camera file to MAX (in my case).
I would suggest at least giving SynthEyes a shot before buying one of the other products – even if it ends up not being what you need, it’s so inexpensive it shouldn’t really hurt in the end. I’m not sure how well the other packages handle manual tracking, and as far as I can tell, with some of them you have to set up a matte in another program.
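Step 2 of that pipeline is just a translation plus a uniform scale of the solved points (and camera). A minimal sketch of the math in plain Python – this is illustrative geometry only, not SynthEyes’ scripting interface, and the names are made up:

```python
import math

def calibrate_solve(points, origin_id, ref_a, ref_b, real_dist):
    """Put one tracked point at the origin and scale the whole solve
    so the distance between two reference trackers matches a measured
    real-world value.  `points` maps tracker id -> (x, y, z)."""
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

    scale = real_dist / dist(points[ref_a], points[ref_b])
    ox, oy, oz = points[origin_id]
    return {
        tid: ((x - ox) * scale, (y - oy) * scale, (z - oz) * scale)
        for tid, (x, y, z) in points.items()
    }
```

The same transform would be applied to the camera path, which is why the tracker can "just scale the points/camera and reorient them around the origin" without re-solving.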
November 15, 2005 at 1:29 am · #211092 · Anonymous (Inactive)
In general (maybe 95% of cases), the autotracking in SynthEyes gives excellent results. You just need to set your blips correctly. You can also use the semi-automatic mode, which also gives excellent results.
Even on green-screen footage with few trackers on screen (even ones blurred because they were outside the DOF), the auto-tracking succeeded.
As far as I know, the pipeline and capabilities of SynthEyes are exactly the same as the other tracking packages.
It’s really a great piece of software.
Now, back to the stabilisation issue:
Unless you’ll be shooting the helicopter shots with a specialised gyroscopic stabilisation system, you can be sure that your shots will be subject to erratic movements.
A 3D tracker can’t help with this. You have to do it in 2D.
To suppress the movements, you’ll need to smooth your camera motion.
The old way of doing this was :
– Track your plate with two trackers (really hard to do, because the trackers will inevitably and rapidly go off-screen, and you’ll have to find new targets to continue the stabilisation path)
– Hand-correct keyframes that went wrong
– Smooth the XY coordinates (animation curves) of the trackers
– Pray that your tracks are as accurate as possible
– and apply these new smoothed coordinates as a matchmove.
This way you keep the original camera motion intact while removing the jitter.
I did that on a project, and it took me about 30 minutes per shot (believe me, it’s really hard to find good targets on rapidly moving helicopter shots of 2000 frames).
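The core of the old way – smooth the tracker curves, then subtract to get the jitter to cancel – can be sketched like this. A minimal moving-average stand-in in plain Python, not the actual flame or Shake math:

```python
def smooth_track(track, window=9):
    """Moving-average smooth of a 2D tracker path.  Returns the
    smoothed path plus per-frame (dx, dy) offsets; applying the
    offsets as a matchmove removes the jitter while keeping the
    gross camera move.  `track` is a list of (x, y) per frame."""
    half = window // 2
    smoothed = []
    for i in range(len(track)):
        lo, hi = max(0, i - half), min(len(track), i + half + 1)
        n = hi - lo
        smoothed.append((sum(p[0] for p in track[lo:hi]) / n,
                         sum(p[1] for p in track[lo:hi]) / n))
    offsets = [(s[0] - t[0], s[1] - t[1]) for t, s in zip(track, smoothed)]
    return smoothed, offsets
```

A perfectly steady track yields zero offsets, while an alternating jitter is pulled toward its local average – the smoothed curve is what the plate gets repositioned to follow.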
The new way of doing this (in Shake) is to use the new smoothcam feature, which analyses the optical flow. Just hit a button and it will work in a few minutes (or seconds, depending on the duration and size of your images).
The result will be far more accurate and totally flawless compared to the old way.
So if your project consists of a lot of helicopter shots, this feature will save you a lot of time – even more if you run it from the terminal.
I don’t know the capabilities of the FFI toolbox, but a Shake seat with an intern can be a really good ally.