CASIA OpenIR / Graduates / Master's Theses
Title: 视觉SLAM中的多传感器标定及稠密地图创建的研究
Alternative Title: A Study on Multi-Sensor Calibration and Dense Mapping in Visual SLAM
Author: 李立杭
Date Issued: 2015-05-27
Degree Type: Master of Engineering (工学硕士)
Abstract: Simultaneous Localization and Mapping (SLAM) refers to the process by which a mobile agent localizes itself while simultaneously building a 3D model of an unknown environment; it is an important research topic in computer vision. In recent years, visual SLAM has been widely applied in robot navigation, augmented reality, 3D reconstruction, and related fields. However, several key problems remain open, such as multi-sensor system calibration, dense mapping, and loop-closure detection. Addressing the multi-sensor calibration and dense-mapping problems, this thesis proposes a coarse-to-fine calibration method for a visual/inertial system and develops a visual SLAM system for dense mapping. The main contributions are:
(1) A coarse-to-fine calibration method is proposed for a visual/inertial system consisting of two industrial cameras, an Inertial Measurement Unit (IMU), and an inclinometer. The method first calibrates the two cameras. With the obtained camera parameters, it then computes initial estimates of the relative pose between the IMU and one camera and between the inclinometer and one camera; from these, the relative pose between the IMU and the inclinometer follows directly. Because of hardware constraints of the motion platform, this initial IMU-inclinometer pose is often inaccurate, so a new optimization algorithm is proposed that effectively fuses higher-quality IMU and inclinometer measurements to refine the relative pose. Experimental results demonstrate the effectiveness of the proposed calibration method.
(2) A software package for calibrating the above visual/inertial system is developed. It integrates the coarse-to-fine calibration method with several functional modules and provides a user-friendly graphical user interface. The package adopts a bottom-up layered architecture that divides the functionality into independent modules: synchronized acquisition of multi-sensor data, motion control of the 6-DoF calibration platform's mechanical structure, intrinsic and extrinsic camera calibration, calibration of the spatial relationships among the visual/inertial sensors, and optimization of the calibration results. All operations can be performed conveniently through the graphical user interface.
(3) A monocular SLAM system is developed for dense mapping. It estimates dense depth using multi-view stereo and obtains a dense map by fusing the depth maps; for real-time performance, the system is implemented on a high-performance GPU. In addition, a preliminary dense-mapping system based on an RGB-D sensor is developed. Experimental results show that the monocular system is better suited to dense mapping of small scenes (e.g., a desktop environment), while the RGB-D system can handle medium-scale indoor scenes.
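The calibration chain in contribution (1) rests on a simple composition of rigid transforms: once the camera-to-IMU and camera-to-inclinometer poses are estimated, the IMU-to-inclinometer pose follows by chaining them. The sketch below (not from the thesis; an illustrative reconstruction using 4x4 homogeneous transforms, with made-up variable names) shows that composition step:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def invert_pose(T):
    """Invert a rigid transform using R^T, avoiding a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def imu_to_inclinometer(T_cam_imu, T_cam_incl):
    """Given the camera->IMU and camera->inclinometer poses from the initial
    calibration, the IMU->inclinometer pose follows directly by composition:
    T_imu_incl = inv(T_cam_imu) @ T_cam_incl."""
    return invert_pose(T_cam_imu) @ T_cam_incl
```

In the thesis this directly composed estimate is only the coarse result; the hardware-limited motion platform makes it inaccurate, which is why the subsequent optimization over higher-quality IMU and inclinometer measurements is needed.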
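Contribution (3) obtains a dense map by fusing per-frame depth maps. The thesis does not give its fusion formula here; a minimal sketch of one common approach, confidence-weighted per-pixel averaging of aligned depth maps (all names and the zero-means-invalid convention are assumptions), is:

```python
import numpy as np

def fuse_depth_maps(depths, weights):
    """Fuse aligned per-frame depth maps by confidence-weighted averaging.

    depths  : list of HxW depth arrays (0 marks invalid pixels)
    weights : list of HxW confidence arrays
    Returns the fused HxW depth map (0 where no observation is valid).
    """
    num = np.zeros_like(depths[0], dtype=float)
    den = np.zeros_like(depths[0], dtype=float)
    for d, w in zip(depths, weights):
        valid = d > 0
        num += np.where(valid, w * d, 0.0)   # accumulate weighted depths
        den += np.where(valid, w, 0.0)       # accumulate total confidence
    return np.where(den > 0, num / np.maximum(den, 1e-12), 0.0)
```

Each output pixel is the confidence-weighted mean of the valid observations at that pixel, which suppresses per-frame multi-view-stereo noise; GPU implementations of this kind of per-pixel reduction are what make the real-time processing mentioned in the abstract feasible.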
Keywords: SLAM, Computer Vision, Inertial Sensors, Hand-Eye Calibration, Dense Mapping
Language: Chinese
Document Type: Degree Thesis
Identifier: http://ir.ia.ac.cn/handle/173211/7760
Collection: Graduates / Master's Theses
Recommended Citation (GB/T 7714):
李立杭. 视觉SLAM中的多传感器标定及稠密地图创建的研究[D]. 中国科学院自动化研究所. 中国科学院大学, 2015.
Files in This Item:
File Name/Size: CASIA_20122801462804 (4529 KB) | Access: Restricted | License: CC BY-NC-SA
Unless otherwise stated, all content in this repository is protected by copyright, and all rights are reserved.