Mobile robot interception using human navigational principles: Comparison of active versus passive tracking algorithms
Authors: Thomas G. Sugar, Michael K. McBeath, Anthony Suluh, Keshav Mundhra
Affiliation: (1) Arizona State University, Tempe, AZ 85287-6106
Abstract: We examined human navigational principles for intercepting a projected object and tested their application in the design of
navigational algorithms for mobile robots. These perceptual principles utilize a viewer-based geometry that allows the robot
to approach the target without the need for time-consuming calculations to determine the world coordinates of either itself or
the target. Human research supports the use of an Optical Acceleration Cancellation (OAC) strategy to achieve interception.
Here, the fielder selects a running path that nulls out the acceleration of the retinal image of an approaching ball, and
maintains an image that rises at a constant rate throughout the task. We compare two robotic control algorithms for implementing
the OAC strategy in cases in which the target remains in the sagittal plane headed directly toward the robot (which only moves
forward or backward). In the “passive” algorithm, the robot keeps the orientation of the camera constant, and the image of
the ball rises at a constant rate. In the “active” algorithm, the robot maintains a camera fixation that is centered on the
image of the ball and keeps the tangent of the camera angle rising at a constant rate. Performance was superior with the active
algorithm in both computer simulations and trials with actual mobile robots. The performance advantage is principally due
to the higher gain and effectively wider viewing angle when the camera remains centered on the ball image. The findings confirm
the viability and robustness of human perceptual principles in the design of mobile robot algorithms for tasks like interception.
Thomas Sugar works in the areas of mobile robot navigation and wearable robotics that assist the gait of stroke survivors. In mobile robot navigation,
he is interested in combining human perceptual principles with mobile robotics. He majored in business and mechanical engineering
for his Bachelor's degrees and in mechanical engineering for his Doctoral degree, all from the University of Pennsylvania. In industry,
he worked as a project engineer for W. L. Gore and Associates. He has been a faculty member in the Department of Mechanical
and Aerospace Engineering and the Department of Engineering at Arizona State University. His research is currently funded
by three grants from the National Science Foundation and the National Institutes of Health, and focuses on perception and
action, and on wearable robots using tunable springs.
Michael McBeath works at the intersection of psychology and engineering. He majored in both fields for his Bachelor's degree from Brown University
and again for his Doctoral degree from Stanford University. Parallel to his academic career, he worked as a research scientist
at NASA Ames Research Center and at the Interval Corporation, a technology think tank funded by Microsoft co-founder Paul
Allen. He has been a faculty member in the Department of Psychology at Kent State University and at Arizona State University,
where he is Program Director for the Cognition and Behavior area and is on the Executive Committee for the interdisciplinary
Arts, Media, and Engineering program. His research is currently funded by three grants from the National Science Foundation,
and focuses on perception and action, particularly in sports. He is best known for his research on navigational strategies
used by baseball players, animals, and robots.
Keywords: Mobile robot navigation, Visual servoing, Perceptual principles