The technical papers preview of this year's SIGGRAPH once again does not fail to impress. This year's selection is heavily weighted towards complex computational photography papers and scene reconstruction research.

There is also a strong showing in cloth simulation, with several such papers featured in the annual preview video.

One of the most impressive papers is Mode-Adaptive Neural Networks for Quadruped Motion Control, which once again shows how various forms of AI and deep learning will increasingly be a part of the vfx world, pushing more effects closer to real time. The approach, called Mode-Adaptive Neural Networks, synthesises quadruped motion in real time and is trained on unstructured motion capture data, without requiring labels for phase or locomotion gaits. The system can be used for creating natural animations in games and films, and is the first such systematic approach whose quality could be of practical use. It is implemented in the Unity 3D engine and TensorFlow.
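To make the idea concrete, here is a minimal NumPy sketch of a mode-adaptive forward pass: a small gating network looks at part of the input and outputs blending coefficients that mix several sets of expert weights, and the blended weights then form the motion prediction network for the current frame. All layer sizes, feature choices and names below are illustrative assumptions, not the authors' implementation (which, per the paper, is built in Unity and TensorFlow).

```python
import numpy as np

# Illustrative sketch of the mode-adaptive idea (not the authors' code):
# a gating network predicts blending coefficients over several "experts",
# and the expert weights are blended per frame to drive motion prediction.

rng = np.random.default_rng(0)

NUM_EXPERTS = 4     # number of expert weight sets (assumed)
GATING_IN = 19      # gating input size, e.g. end-effector velocities (assumed)
MOTION_IN = 480     # motion-network input size (assumed)
HIDDEN = 512        # hidden layer width (assumed)
MOTION_OUT = 363    # predicted pose/trajectory size (assumed)

def elu(x):
    return np.where(x > 0, x, np.exp(x) - 1.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Gating network parameters (a single hidden layer for brevity).
Wg1, bg1 = rng.standard_normal((32, GATING_IN)) * 0.1, np.zeros(32)
Wg2, bg2 = rng.standard_normal((NUM_EXPERTS, 32)) * 0.1, np.zeros(NUM_EXPERTS)

# Expert weights for the motion prediction network: one set per expert.
W1 = rng.standard_normal((NUM_EXPERTS, HIDDEN, MOTION_IN)) * 0.01
b1 = np.zeros((NUM_EXPERTS, HIDDEN))
W2 = rng.standard_normal((NUM_EXPERTS, MOTION_OUT, HIDDEN)) * 0.01
b2 = np.zeros((NUM_EXPERTS, MOTION_OUT))

def predict(motion_features, gating_features):
    # 1. Gating network -> blending coefficients over the experts.
    h = elu(Wg1 @ gating_features + bg1)
    alpha = softmax(Wg2 @ h + bg2)            # shape: (NUM_EXPERTS,)

    # 2. Blend the expert weights with those coefficients.
    W1b = np.tensordot(alpha, W1, axes=1)     # (HIDDEN, MOTION_IN)
    b1b = alpha @ b1
    W2b = np.tensordot(alpha, W2, axes=1)     # (MOTION_OUT, HIDDEN)
    b2b = alpha @ b2

    # 3. Run the blended motion network for the current frame.
    h1 = elu(W1b @ motion_features + b1b)
    return W2b @ h1 + b2b

pose = predict(rng.standard_normal(MOTION_IN), rng.standard_normal(GATING_IN))
print(pose.shape)   # (363,)
```

Because the blending coefficients change every frame, the effective network weights adapt to the character's current movement mode, which is what lets the system handle different gaits without explicit gait labels.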


The summary paper is available here. This work is from the University of Edinburgh and Adobe Research, once again prompting the question: is Adobe about to take on Autodesk in 3D?