Institute of Computing Technology, Chinese Academy of Sciences IR
Environmental-structure-perception-based adaptive pose fusion method for LiDAR-visual-inertial odometry
Zhao, Zixu1,2,3,4; Liu, Chang1,3,4; Yu, Wenyao1,2,3,4; Shi, Jinglin1; Zhang, Dalin1
Date | 2024-05-01 |
Journal | INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS |
ISSN | 1729-8814 |
Volume | 21 |
Issue | 3 |
Pages | 20 |
Abstract | Light Detection and Ranging (LiDAR)-visual-inertial odometry can provide accurate poses for localizing unmanned vehicles in unknown environments where the Global Positioning System (GPS) is unavailable. Because the quality of the poses estimated by different sensors fluctuates greatly across environments with different structures, existing pose fusion models cannot guarantee stable pose estimation in such environments, which poses a great challenge for the pose fusion of LiDAR-visual-inertial odometry. This article proposes a novel environmental-structure-perception-based adaptive pose fusion method, which optimizes the parameters of the pose fusion model of LiDAR-visual-inertial odometry online by analyzing the complexity of the environmental structure. First, a novel quantitative perception method for environmental structure is proposed: a visual bag-of-words vector and a point cloud feature histogram are constructed to compute quantitative indicators of the structural complexity of the visual image and the LiDAR point cloud of the surroundings, which are used to predict and evaluate the pose quality of the LiDAR/visual measurement models. Then, based on the complexity of the environmental structure, two pose fusion strategies are proposed for the two mainstream pose fusion models (Kalman filter and factor graph optimization), which adaptively fuse the poses estimated by LiDAR and vision online. Two state-of-the-art LiDAR-visual-inertial odometry systems are selected to deploy the proposed method, and extensive experiments are carried out on both open-source and self-gathered data sets. The experimental results show that the proposed method can effectively perceive changes in environmental structure and perform adaptive pose fusion, improving the pose estimation accuracy of LiDAR-visual-inertial odometry in environments with changing structures. |
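The adaptive fusion idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: the names `structure_complexity` and `adaptive_fuse` are hypothetical, histogram entropy stands in for the paper's bag-of-words and point-cloud-feature-histogram complexity indicators, and a convex weighted average stands in for its Kalman-filter and factor-graph fusion strategies.

```python
import numpy as np

def structure_complexity(histogram):
    """Shannon entropy of a normalized feature histogram.

    Features spread over many bins (rich structure) give high entropy;
    a single dominant bin (degenerate structure) gives entropy near zero.
    """
    p = np.asarray(histogram, dtype=float)
    p = p / p.sum()
    p = p[p > 0]  # drop empty bins so 0*log(0) terms vanish
    return float(-(p * np.log(p)).sum())

def adaptive_fuse(pose_lidar, pose_visual, c_lidar, c_visual):
    """Fuse two pose estimates, weighting each sensor by its complexity score.

    The sensor observing richer structure is trusted more, mimicking the
    online reweighting of measurement models described in the abstract.
    """
    w_lidar = c_lidar / (c_lidar + c_visual)
    return (w_lidar * np.asarray(pose_lidar, dtype=float)
            + (1.0 - w_lidar) * np.asarray(pose_visual, dtype=float))
```

For example, a flat point cloud histogram (high complexity) pulls the fused pose toward the LiDAR estimate, while a peaked one (e.g., a featureless corridor) shifts trust toward the visual estimate.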
Keywords | Adaptive pose fusion; LiDAR-visual-inertial odometry; environmental structure perception; pose estimation of unmanned vehicle; sensor fusion |
DOI | 10.1177/17298806241248955 |
Indexed By | SCI |
Language | English |
Funding Project | National Key R&D Program of China [2022YFC3320800]; Zhejiang Provincial Key R&D Plan of China [2021C01040] |
WOS Research Area | Robotics |
WOS Subject | Robotics |
WOS ID | WOS:001216075600001 |
Publisher | SAGE PUBLICATIONS INC |
Document Type | Journal article |
Identifier | http://119.78.100.204/handle/2XEOYT63/38985 |
Collection | Journal Papers of the Institute of Computing Technology, CAS (English) |
Corresponding Author | Zhao, Zixu |
Affiliation | 1. Chinese Acad Sci, Inst Comp Technol, Wireless Commun Technol Res Ctr, Beijing 100190, Peoples R China; 2. Univ Chinese Acad Sci, Beijing, Peoples R China; 3. Chinese Acad Sci, Inst Comp Technol, State Key Lab Processors, Beijing, Peoples R China; 4. Beijing Key Lab Mobile Comp & Pervas Device, Beijing, Peoples R China |
Recommended Citation (GB/T 7714) | Zhao, Zixu, Liu, Chang, Yu, Wenyao, et al. Environmental-structure-perception-based adaptive pose fusion method for LiDAR-visual-inertial odometry[J]. INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2024, 21(3): 20. |
APA | Zhao, Zixu, Liu, Chang, Yu, Wenyao, Shi, Jinglin, & Zhang, Dalin. (2024). Environmental-structure-perception-based adaptive pose fusion method for LiDAR-visual-inertial odometry. INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 21(3), 20. |
MLA | Zhao, Zixu, et al. "Environmental-structure-perception-based adaptive pose fusion method for LiDAR-visual-inertial odometry". INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS 21.3 (2024): 20. |
Files in This Item | There are no files associated with this item. |