Visual recognition of pointing gestures for human–robot interaction
Authors: Kai Nickel, Rainer Stiefelhagen
Affiliation: Interactive Systems Labs, Universitaet Karlsruhe, 76131 Karlsruhe, Germany

Abstract: In this paper, we present an approach for recognizing pointing gestures in the context of human–robot interaction. In order to obtain input features for gesture recognition, we perform visual tracking of head, hands and head orientation. Given the images provided by a calibrated stereo camera, color and disparity information are integrated into a multi-hypothesis tracking framework in order to find the 3D positions of the respective body parts. Based on the hands' motion, an HMM-based classifier is trained to detect pointing gestures. We show experimentally that the gesture recognition performance can be improved significantly by using information about head orientation as an additional feature. Our system aims at applications in the field of human–robot interaction, where it is important to do run-on recognition in real-time, to allow for robot egomotion and not to rely on manual initialization.
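The HMM-based detection step described in the abstract can be illustrated with the standard forward algorithm: score an observed hand-motion sequence under a gesture model and compare likelihoods. The sketch below is not the authors' implementation; the three-phase left-right topology, the quantized motion symbols, and all probability values are illustrative assumptions.

```python
import math

def forward_log_likelihood(obs, pi, A, B):
    """log P(obs | model) via the forward algorithm with per-step scaling.
    obs: list of observation symbol indices
    pi:  initial state distribution, length N
    A:   N x N transition matrix, A[i][j] = P(state j | state i)
    B:   N x M emission matrix, B[i][k] = P(symbol k | state i)
    """
    n = len(pi)
    # Initialization with the first observation.
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    s = sum(alpha)
    alpha = [a / s for a in alpha]      # scale to avoid underflow
    log_lik = math.log(s)
    # Recursion over the rest of the sequence.
    for t in range(1, len(obs)):
        alpha = [
            sum(alpha[i] * A[i][j] for i in range(n)) * B[j][obs[t]]
            for j in range(n)
        ]
        s = sum(alpha)
        alpha = [a / s for a in alpha]
        log_lik += math.log(s)
    return log_lik

# Hypothetical 3-state left-right gesture model (begin -> hold -> end)
# over 3 quantized motion symbols:
# 0 = hand rising, 1 = hand still, 2 = hand retracting.
pi = [1.0, 0.0, 0.0]
A = [[0.7, 0.3, 0.0],
     [0.0, 0.7, 0.3],
     [0.0, 0.0, 1.0]]
B = [[0.8, 0.1, 0.1],   # "begin": mostly rising
     [0.1, 0.8, 0.1],   # "hold":  mostly still
     [0.1, 0.1, 0.8]]   # "end":   mostly retracting

pointing = [0, 0, 1, 1, 1, 2, 2]       # rise, hold, retract
random_motion = [2, 0, 2, 1, 0, 2, 0]  # unstructured movement
# A pointing-like sequence scores higher under the gesture model.
print(forward_log_likelihood(pointing, pi, A, B) >
      forward_log_likelihood(random_motion, pi, A, B))
```

In a detector of the kind the abstract describes, such a likelihood would be computed run-on over a sliding window of tracked hand positions and thresholded (or compared against a background model) to decide whether a pointing gesture occurred.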
Keywords: Person tracking; Gesture recognition; Head orientation; Human–robot interaction
This article is indexed in ScienceDirect and other databases.