realtime 3d character (fx Art and Technique › 3D / CGI)
March 14, 2011 at 4:19 am #204074 | Anonymous (Guest)
Hi there,
I am doing research on creating a 3D character for a live broadcast, much like the David Tench show created by Animal Logic in Sydney. They did amazing work:
https://www.youtube.com/watch?v=jApTh6VwgHI
Apparently they used Vicon for mocap.
I am mostly curious about the realtime rendering of it. Does anybody out there have any idea how it was done? I wonder if it is just a key pulled off MotionBuilder’s realtime screen capture, or maybe they used a game engine?
Whatever they did, it looks really good to my eye, considering how tough things get when it is realtime.
Thanks,
Serkan
March 14, 2011 at 1:05 pm #219531 | Makayla Crespo (Participant)
Hi there, we have a 10-camera Vicon Blade setup for body mocap at school. You can download a plugin from Vicon to stream data straight into MotionBuilder. The character in the video shouldn’t be hard to render out in realtime in MotionBuilder’s viewport. I would think either it was keyed from MoBu, or the animation data was streamed into Maya with their own plugin, using tracker points on the face to drive blendshapes in Maya.
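The tracker-points-to-blendshapes idea boils down to remapping a marker distance to a 0–1 shape weight each frame. Here is a minimal plain-Python sketch of that mapping; the function name and the rest/extreme distances are hypothetical, and in a real rig the resulting weight would be set on a Maya blendShape attribute every frame rather than printed.

```python
def tracker_to_weight(distance, rest, extreme):
    """Linearly remap a facial marker distance to a clamped 0-1 blendshape weight.

    rest    -- marker distance when the shape is fully off (e.g. mouth closed)
    extreme -- marker distance when the shape is fully on (e.g. mouth wide open)
    """
    if extreme == rest:
        return 0.0  # degenerate calibration; avoid division by zero
    w = (distance - rest) / (extreme - rest)
    return max(0.0, min(1.0, w))  # clamp so mocap noise cannot overdrive the shape

# Hypothetical example: a jaw-open shape driven by a chin-to-nose marker distance,
# calibrated at 4.0 units closed and 6.5 units fully open.
print(tracker_to_weight(5.25, 4.0, 6.5))  # halfway between -> 0.5
```

Calibrating `rest` and `extreme` per performer (a quick range-of-motion take) is what makes this hold up on a live broadcast.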
I have not seen any publicly available plugin to stream optical data straight into Maya, but it’s definitely been done before.
There’s also a lot of hype at the moment around using Kinect to drive characters in MoBu, but you wouldn’t be able to drive facial capture with it yet.
You might also like to check out Crytek’s CryEngine 3 – it’s the next step in real-time rendering, and they are now releasing a client for Maya to stream live animation data.
https://www.youtube.com/watch?v=OgrrrzJS89s
Lovely stuff that.