Visual Inertial Odometry Sensor Fusion Approach for Autonomous Localization
- Submitted by: Simone Godio
- Last updated: Fri, 04/30/2021 - 12:20
- DOI: 10.21227/10av-kj18
Abstract
This paper describes a sensor fusion technique to autonomously localize unmanned vehicles. In particular, we fuse two commercial sensors with an extended Kalman filter (EKF). The adopted sensors are the ZED2 and the Intel T265; both platforms already perform visual-inertial odometry on their integrated system-on-chip. Since these two devices are among the top commercial options for autonomous localization, this study analyzes and reports the results that can be obtained by fusing the estimates of the two cameras. Several tests on a specific trajectory and environment demonstrated that, by properly tuning the parameters and inputs of the EKF, the fused localization is more robust than that of either camera alone.
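As a rough illustration of the fusion scheme, the MATLAB sketch below runs a generic EKF predict/update cycle that sequentially fuses position readings from the two cameras. It is a minimal sketch, not the paper's implementation: the constant-velocity motion model, sample period, and all noise covariances are illustrative assumptions, not the tuned values used in the study.

% Minimal EKF sketch fusing position estimates from two VIO sensors.
% Assumptions (not from the paper): constant-velocity model, dt, Q, R values.
dt = 0.05;                      % assumed sample period [s]
F  = [eye(3) dt*eye(3);         % state transition for [position; velocity]
      zeros(3) eye(3)];
Q  = 1e-3 * eye(6);             % process noise covariance (assumption)
H  = [eye(3) zeros(3)];         % each camera is treated as a position sensor
R_zed  = 1e-2 * eye(3);         % ZED2 measurement noise (assumption)
R_t265 = 2e-2 * eye(3);         % T265 measurement noise (assumption)

x = zeros(6,1);                 % state: 3D position and velocity
P = eye(6);                     % state covariance

% One filter cycle, given a position reading from each camera:
z_zed  = [0.10; 0.02; 0.00];    % example ZED2 position measurement [m]
z_t265 = [0.12; 0.01; 0.01];    % example T265 position measurement [m]

% Prediction step
x = F * x;
P = F * P * F' + Q;

% Sequential measurement updates: first ZED2, then T265
for meas = {{z_zed, R_zed}, {z_t265, R_t265}}
    z = meas{1}{1};  R = meas{1}{2};
    y = z - H * x;               % innovation
    S = H * P * H' + R;          % innovation covariance
    K = P * H' / S;              % Kalman gain
    x = x + K * y;               % state update
    P = (eye(6) - K * H) * P;    % covariance update
end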
Files
rl -> EKF results (txt file)
t265 -> Intel T265 results (txt file)
ZED2 -> ZED2 results (txt file)
paper.fig -> 3D result graph shown in the paper (Fig. 10)
fig1.fig -> Z-error shown in the paper (Fig. 11)
fig2.fig -> XY results shown in the paper (Fig. 12)
test_1.m -> MATLAB script to manage the data
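A hypothetical way to load and compare the three results files in MATLAB (test_1.m is the script actually provided with the dataset): the whitespace-delimited layout and the assumption that the first two columns hold X and Y positions are guesses, so adjust them to the actual file format.

% Load the results files, assuming one sample per row of numeric columns.
ekf  = readmatrix('rl',   'FileType', 'text');
t265 = readmatrix('t265', 'FileType', 'text');
zed2 = readmatrix('ZED2', 'FileType', 'text');

% Compare the trajectories, assuming columns 1 and 2 are X and Y [m].
figure; hold on;
plot(ekf(:,1),  ekf(:,2));
plot(t265(:,1), t265(:,2));
plot(zed2(:,1), zed2(:,2));
legend('EKF', 'Intel T265', 'ZED2');
xlabel('X [m]'); ylabel('Y [m]');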