Technical Papers

Motion Capture and Synthesis

Saturday, 01 December 09:00 - 10:45 | Peridot 206

Lightweight Binocular Facial Performance Capture under Uncontrolled Lighting

We propose a passive facial performance capture approach that reconstructs detailed dynamic facial geometry from a single stereo pair of cameras and succeeds under uncontrolled and time-varying lighting. Our method brings facial performance capture out of the studio and into the wild, putting it within everyone's reach.
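
For context (a standard two-view relation, not the authors' full reconstruction pipeline): with a rectified stereo pair, the depth Z of a facial surface point follows from its disparity d between the left and right images as

\[ Z = \frac{f\,B}{d}, \]

where f is the focal length and B is the baseline between the two cameras. The contribution described above lies in recovering detailed, dynamic geometry from such a lightweight setup under uncontrolled, time-varying lighting.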


Levi Valgaerts, Max Planck Institute for Informatics
Chenglei Wu, Max Planck Institute for Informatics
Andrés Bruhn, University of Stuttgart
Hans-Peter Seidel, Max Planck Institute for Informatics
Christian Theobalt, Max Planck Institute for Informatics


Accurate Realtime Full-body Performance Capture Using A Single Depth Camera

We present a new method for accurately capturing full-body performance using a single depth camera. Our system is robust, automatic, runs in real time, and allows for accurate reconstruction of human-body poses even under significant occlusions. We achieve state-of-the-art accuracy in comparisons against alternative approaches.


Xiaolin Wei, Texas A&M University
Peizhao Zhang, Texas A&M University
Jinxiang Chai, Texas A&M University


Data-driven Finger Motion Synthesis for Gesturing Characters

Creating compelling finger motions is a challenging and time-consuming process. Our method automatically adds detailed finger movements to the body motions of gesturing and conversing characters. We locate suitable finger motion segments from a database based on similarity of the arm motions and smoothness of the reconstructed finger motions.
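
As a rough illustration of this kind of lookup (a minimal sketch, assuming pre-segmented motions stored as numpy arrays; pick_finger_segments, its greedy selection, and the cost weights are hypothetical, not the authors' implementation):

# Illustrative sketch: choose finger-motion segments from a database by
# trading off arm-motion similarity against smoothness of the finger poses
# at segment boundaries, as named in the abstract above.
import numpy as np

def pick_finger_segments(arm_segments, db_arm, db_fingers, w_smooth=1.0):
    """arm_segments: list of (T, D) arrays of input arm features.
    db_arm: (N, T, D) arm features of the database segments.
    db_fingers: (N, T, F) finger poses of the database segments.
    Returns one chosen database index per input segment."""
    chosen = []
    prev_end = None  # last finger pose of the previous pick
    for seg in arm_segments:
        # Arm similarity: mean squared distance to every database segment.
        arm_cost = np.mean((db_arm - seg[None]) ** 2, axis=(1, 2))
        # Smoothness: jump from the previous finger pose to each candidate's start.
        if prev_end is None:
            smooth_cost = np.zeros(len(db_arm))
        else:
            smooth_cost = np.mean((db_fingers[:, 0] - prev_end[None]) ** 2, axis=1)
        best = int(np.argmin(arm_cost + w_smooth * smooth_cost))
        chosen.append(best)
        prev_end = db_fingers[best, -1]
    return chosen

A production system would likely replace the greedy pass with a global optimization over all segment choices, but the two cost terms mirror the criteria stated in the abstract.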


Sophie Jörg, Carnegie Mellon University and Clemson University
Jessica Hodgins, Carnegie Mellon University and Disney Research, Pittsburgh
Alla Safonova, Disney Research, Pittsburgh


A Statistical Similarity Measure for Aggregate Crowd Dynamics

We present an information-theoretic method for measuring the similarity between observed real-world data and visual simulations of aggregate motion in complex systems of many individual agents. The resulting metric is robust to data noise and motion uncertainty, and it correlates strongly with user perceptions of motion similarity.
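
One minimal sketch of how such a statistical comparison could be set up (illustrative only, assuming fixed-length trajectories as numpy arrays; the Gaussian displacement model and KL divergence are stand-ins, not the paper's actual metric):

# Illustrative sketch: fit a Gaussian to per-step agent displacements in the
# real and the simulated data, then measure the KL divergence between the fits.
import numpy as np

def displacement_gaussian(trajectories):
    """trajectories: (num_agents, num_frames, 2) positions.
    Returns (mean, covariance) of displacements pooled over all agents."""
    d = np.diff(trajectories, axis=1).reshape(-1, 2)
    return d.mean(axis=0), np.cov(d.T) + 1e-9 * np.eye(2)

def gaussian_kl(mu0, cov0, mu1, cov1):
    """KL(N0 || N1) for two 2-D Gaussians."""
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff
                  - 2.0 + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def crowd_similarity(real_traj, sim_traj):
    # Lower divergence = the simulation's aggregate motion statistics
    # are closer to the observed data.
    return gaussian_kl(*displacement_gaussian(real_traj),
                       *displacement_gaussian(sim_traj))

A metric used in practice must also cope with noisy, partially observed tracking data, which is where the robustness claims above come in.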


Stephen Guy, University of North Carolina at Chapel Hill
Jur van den Berg, University of Utah
Wenxi Liu, City University of Hong Kong
Rynson Lau, City University of Hong Kong
Ming Lin, University of North Carolina at Chapel Hill
Dinesh Manocha, University of North Carolina at Chapel Hill