

Robotic grasping and manipulation through human visuomotor learning
Authors: Brian Moore, Erhan Oztop
Abstract: A major goal of robotics research is to develop techniques that allow non-experts to teach robots dexterous skills. In this paper, we report progress on a framework that exploits the human sensorimotor learning capability to address this aim. The idea is to place a human operator in the robot control loop, where he/she can intuitively control the robot and, with practice, learn to perform the target task through it. Subsequently, by analyzing the robot control signals produced by the human, it is possible to design a controller that allows the robot to perform the task autonomously. First, we introduce this framework with the ball-swapping task, in which a robot hand must swap the positions of two balls without dropping them, and present new analyses of the intrinsic dimension of the ball-swapping skill obtained through this framework. Then, we present new experiments toward obtaining an autonomous grasp controller on an anthropomorphic robot. In these experiments, the operator directly controls the (simulated) robot using visual feedback to achieve robust grasping. The collected data are then analyzed to infer the grasping strategy discovered by the human operator. Finally, we present a method that generalizes grasping actions from the collected data, allowing the robot to autonomously generate grasps for different orientations of the target object.
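The abstract mentions analyzing the intrinsic dimension of the ball-swapping skill but does not state the method used. A common way to estimate intrinsic dimension from recorded joint-angle trajectories is principal component analysis; the sketch below counts how many principal components are needed to explain a chosen fraction of the variance. The function name, the 95% variance cutoff, and the data layout are illustrative assumptions, not details from the paper.

```python
import numpy as np

def estimate_intrinsic_dimension(trajectories, variance_threshold=0.95):
    """Estimate the intrinsic dimension of a motor skill via PCA.

    trajectories: (n_samples, n_joints) array of recorded joint angles.
    Returns the number of principal components needed to explain
    `variance_threshold` of the total variance.
    """
    # Center the data; singular values of the centered matrix give
    # the per-component variances (up to a constant factor).
    X = trajectories - trajectories.mean(axis=0)
    s = np.linalg.svd(X, compute_uv=False)
    explained = s**2 / np.sum(s**2)
    cumulative = np.cumsum(explained)
    # First index where the cumulative explained variance crosses
    # the threshold, converted to a 1-based component count.
    return int(np.searchsorted(cumulative, variance_threshold) + 1)
```

A low estimate relative to the number of joints would suggest that the human-discovered control strategy lives on a low-dimensional manifold, which is the kind of structure an autonomous controller can exploit.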
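The abstract also describes generalizing the collected grasp data to new object orientations, without specifying how. One simple baseline is per-joint linear interpolation between the two nearest demonstrated orientations; a minimal sketch follows, assuming a one-dimensional orientation parameter and that demonstrations are stored as (orientation, joint-vector) pairs. All names here are hypothetical.

```python
import numpy as np

def interpolate_grasp(demos, theta):
    """Generate a grasp posture for a new object orientation.

    demos: list of (orientation_rad, joint_vector) pairs collected
           from human-operated trials, sorted by orientation.
    theta: target object orientation in radians (within the
           demonstrated range).
    Returns a joint vector linearly interpolated, per joint,
    between the two nearest demonstrations.
    """
    angles = np.array([a for a, _ in demos])
    joints = np.array([q for _, q in demos])
    # np.interp handles one output dimension at a time, so
    # interpolate each joint independently.
    return np.array([np.interp(theta, angles, joints[:, j])
                     for j in range(joints.shape[1])])
```

Richer schemes (e.g. fitting a regression model over the full demonstration set) would be needed for extrapolation beyond the demonstrated orientations; interpolation only covers the convex range of the data.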
Indexed in ScienceDirect and other databases.