Matching Trajectories of Anatomical Landmarks Under Viewpoint, Anthropometric and Temporal Transforms |
| |
Authors: | Alexei Gritai, Yaser Sheikh, Cen Rao, Mubarak Shah |
| |
Affiliation: | (1) Cernium Corporation, Reston, USA; (2) Carnegie Mellon University, Pittsburgh, USA; (3) PVI Virtual Media Services, New York, USA; (4) School of Electrical Engineering and Computer Science, University of Central Florida, Orlando, USA |
| |
Abstract: | An approach is presented to match imaged trajectories of anatomical landmarks (e.g. hands, shoulders and feet) using semantic
correspondences between human bodies. These correspondences are used to provide geometric constraints for matching actions
observed from different viewpoints and performed at different rates by actors of differing anthropometric proportions. The
fact that the human body has approximate anthropometric proportion allows innovative use of the machinery of epipolar geometry
to provide constraints for analyzing actions performed by people of different sizes, while ensuring that changes in viewpoint
do not affect matching. In addition, for linear time warps, a novel measure, constructed only from image measurements of the
locations of anatomical landmarks across time, is proposed to ensure that similar actions performed at different rates are
accurately matched as well. An additional feature of this new measure is that two actions from cameras moving at constant
(and possibly different) velocities can also be matched. Finally, we describe how dynamic time warping can be used in conjunction
with the proposed measure to match actions in the presence of nonlinear time warps. We demonstrate the versatility of our
algorithm in a number of challenging sequences and applications, and report quantitative evaluation of the matching approach
presented. |
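The abstract's final step, matching actions in the presence of nonlinear time warps via dynamic time warping, can be illustrated with a minimal sketch. The code below is not the paper's actual measure (which is built from epipolar-geometric constraints on landmark positions); it uses plain Euclidean distance between imaged landmark positions purely to show how DTW aligns two trajectories sampled at different rates. All function and variable names are illustrative.

```python
import math

def dtw_distance(traj_a, traj_b):
    """Dynamic time warping distance between two landmark trajectories,
    each a list of (x, y) image coordinates over time.

    Illustrative only: the pointwise cost here is Euclidean distance,
    whereas the paper's measure is derived from epipolar geometry so
    that viewpoint changes do not affect matching.
    """
    n, m = len(traj_a), len(traj_b)
    INF = float("inf")
    # cost[i][j] = accumulated cost of aligning traj_a[:i] with traj_b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(traj_a[i - 1], traj_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a sample of traj_a
                                 cost[i][j - 1],      # skip a sample of traj_b
                                 cost[i - 1][j - 1])  # match the two samples
    return cost[n][m]
```

Because DTW permits one-to-many alignments, a trajectory and a resampled (slower or faster) copy of it yield zero distance, which is exactly the invariance to temporal rate that the matching requires.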
| |
Keywords: | Applications; Trajectory matching; Human Information Processing; Motion |
Indexed in SpringerLink and other databases.