This paper describes a sensor fusion technique for localizing autonomous unmanned vehicles. In particular, we perform sensor fusion, based on the extended Kalman filter, between two commercial sensors, the ZED2 and the Intel T265; both platforms already perform visual-inertial odometry on their integrated system-on-chip. Since these two devices are among the top-of-the-range options on the market for autonomous localization, this study aims to analyze and report the results that can be obtained by fusing the outputs of the two cameras.
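As an illustration of the kind of fusion described above, the following is a minimal sketch of a Kalman filter that fuses position estimates from two independent odometry sources with different noise levels. This is not the paper's implementation: the state model, the noise covariances (`R_zed`, `R_t265`), and the use of a linear position measurement are all assumptions chosen for illustration; a full EKF for visual-inertial odometry would also carry orientation and use nonlinear measurement models.

```python
import numpy as np

def kf_predict(x, P, F, Q):
    # Propagate state and covariance through the motion model.
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    # Standard Kalman measurement update for one sensor reading.
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# State: 2D position and velocity [px, py, vx, vy] (illustrative choice).
dt = 0.1
F = np.eye(4)
F[0, 2] = F[1, 3] = dt
Q = 1e-3 * np.eye(4)
H = np.hstack([np.eye(2), np.zeros((2, 2))])  # both sensors observe position

R_zed = 0.04 * np.eye(2)    # assumed measurement noise of sensor A
R_t265 = 0.01 * np.eye(2)   # assumed measurement noise of sensor B

x = np.zeros(4)
P = np.eye(4)
rng = np.random.default_rng(0)
truth = np.array([1.0, 2.0])  # static ground-truth position

for _ in range(50):
    x, P = kf_predict(x, P, F, Q)
    # Sequential updates: each sensor's reading refines the same state.
    x, P = kf_update(x, P, truth + 0.2 * rng.standard_normal(2), H, R_zed)
    x, P = kf_update(x, P, truth + 0.1 * rng.standard_normal(2), H, R_t265)

print(np.round(x[:2], 2))  # fused position estimate, close to the truth
```

The sequential-update pattern shown here is a common way to fuse two asynchronous pose sources: each sensor's measurement is applied as its own update step, weighted by its covariance, so the noisier source contributes less to the fused estimate.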