
Stereo Visual Odometry Based on Laplace Distribution
Cite this article: Fan Hanqi, Wu Jinhe. Stereo Visual Odometry Based on Laplace Distribution [J]. Acta Automatica Sinica (自动化学报), 2022, 48(3): 865-876.
Authors: Fan Hanqi, Wu Jinhe
Affiliation: 1. School of Information, North China University of Technology, Beijing 100144
Funding: Supported by the General Program of the Scientific Research Plan of Beijing Municipal Education Commission (KM201710009007)
Abstract: To address the problem of localizing a camera in an unknown environment and reconstructing a map of its surroundings, this paper proposes a fast and accurate stereo visual odometry algorithm based on the Laplace distribution. When building data associations with optical flow, three strategies are combined to reject erroneous associations and improve their accuracy: smooth motion constraints, circular matching, and a disparity consistency check; stable feature points are then selected on this basis. The camera rotation and translation are estimated separately. Assuming that the errors of the camera rotation, the 3D points, and the camera translation all obey Laplace distributions, the optimal camera pose and 3D point positions are obtained by optimization under this assumption. Experimental results on the KITTI and New Tsukuba datasets show that the proposed algorithm estimates the camera pose and the positions of 3D points quickly and accurately.
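As a rough illustration of the feature-selection stage described in the abstract (not the authors' implementation), the sketch below combines optical-flow tracking with circular matching, a smooth-motion check, and a disparity consistency check on a rectified stereo pair. The tracking order, function name, and all thresholds are assumptions for illustration only.

```python
# Minimal sketch, assuming rectified stereo images and OpenCV pyramidal LK optical flow.
import cv2
import numpy as np

def select_stable_features(left_prev, right_prev, left_cur, right_cur, pts_left_prev,
                           circle_tol=1.0, max_motion=80.0, min_disparity=1.0, max_dy=1.0):
    """Return indices of features surviving circular matching and disparity checks (illustrative)."""
    lk = dict(winSize=(21, 21), maxLevel=3,
              criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
    p0 = pts_left_prev.astype(np.float32).reshape(-1, 1, 2)

    # Circular matching: L(t) -> L(t+1) -> R(t+1) -> R(t) -> back to L(t)
    p1, s1, _ = cv2.calcOpticalFlowPyrLK(left_prev,  left_cur,   p0, None, **lk)
    p2, s2, _ = cv2.calcOpticalFlowPyrLK(left_cur,   right_cur,  p1, None, **lk)
    p3, s3, _ = cv2.calcOpticalFlowPyrLK(right_cur,  right_prev, p2, None, **lk)
    p4, s4, _ = cv2.calcOpticalFlowPyrLK(right_prev, left_prev,  p3, None, **lk)

    tracked    = (s1 & s2 & s3 & s4).ravel().astype(bool)
    circle_err = np.linalg.norm((p4 - p0).reshape(-1, 2), axis=1)   # loop-closure error
    motion     = np.linalg.norm((p1 - p0).reshape(-1, 2), axis=1)   # smooth-motion check

    # Disparity consistency: positive disparity and (nearly) equal rows in the rectified pair
    disp_prev = p0.reshape(-1, 2)[:, 0] - p3.reshape(-1, 2)[:, 0]
    dy_prev   = np.abs(p0.reshape(-1, 2)[:, 1] - p3.reshape(-1, 2)[:, 1])

    keep = (tracked & (circle_err < circle_tol) & (motion < max_motion)
            & (disp_prev > min_disparity) & (dy_prev < max_dy))
    return np.flatnonzero(keep)
```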

Keywords: visual odometry, motion estimation, optical flow, Laplace distribution
Received: 2019-12-18

Stereo Visual Odometry Based on Laplace Distribution
Affiliation: 1. School of Information, North China University of Technology, Beijing 100144
Abstract: In this paper, we present a stereo visual odometry algorithm that estimates the camera trajectory and a map of the surrounding unknown environment quickly and accurately. We associate features across frames by optical flow, and then select stable features from these associations by applying three strategies: smooth motion constraints, circular matching, and disparity consistency. Our algorithm estimates the translation and orientation of the camera separately, using only the selected stable features. We optimize the camera poses and the 3D points of the environment map under the assumption that the uncertainties of these quantities obey Laplace distributions, which are resistant to outliers and large errors. Experimental results on the KITTI and New Tsukuba datasets show that the proposed algorithm quickly and accurately estimates the camera pose and the 3D environment points.
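To make the role of the Laplace assumption concrete, the toy sketch below (not the paper's full optimization) shows that, under an isotropic Laplace error model with a known rotation R and matched 3D points, the maximum-likelihood translation reduces to a component-wise median of the residuals, which is far less sensitive to outlier associations than the Gaussian (mean) estimate. The names R, src, dst and the toy data are assumptions for illustration.

```python
# Minimal sketch: Laplace (L1) vs. Gaussian (L2) maximum-likelihood translation.
import numpy as np

def laplace_ml_translation(R, src, dst):
    """ML translation t minimizing sum_i |R @ src[i] + t - dst[i]|_1 (Laplace noise)."""
    residuals = dst - src @ R.T          # what t has to explain, one row per point
    return np.median(residuals, axis=0)  # L1 minimizer = per-axis median

def gaussian_ml_translation(R, src, dst):
    """For comparison: Gaussian noise gives the mean, which is pulled by outliers."""
    return np.mean(dst - src @ R.T, axis=0)

# Toy comparison with one gross outlier in the associations
rng = np.random.default_rng(0)
R = np.eye(3)
src = rng.normal(size=(50, 3))
t_true = np.array([0.5, -0.2, 1.0])
dst = src @ R.T + t_true + rng.laplace(scale=0.01, size=(50, 3))
dst[0] += 10.0                               # corrupted association
print(laplace_ml_translation(R, src, dst))   # close to t_true
print(gaussian_ml_translation(R, src, dst))  # visibly biased by the outlier
```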
Keywords: visual odometry, motion estimation, optical flow, Laplace distribution