Collaborative robot true trajectories and pose estimations


Abstract 

This is a PART of the dataset used in our paper titled "Detecting Anomalous Robot Motion in Collaborative Robotic Manufacturing Systems".

Abstract of the paper: Anomalous robot motions caused by cyber attacks and inherent defects can lead to task failures as well as harmful accidents in collaborative human-robot workplaces. External sensors are essential for reliably monitoring robot tool paths and detecting any deviation as early as possible, especially in anomalous situations where the robot system and its internal sensors may be out-of-control. However, affordable external sensors usually suffer from poor-quality measurements, necessitating enhancements. In addition to external sensors providing independent monitoring, to effectively detect trajectory deviations, we need anomaly detection methods that can capture complex dynamic patterns in the nonlinear trajectory time-series data. In this work, we propose a framework to accurately estimate robot trajectories during normal robot operations as well as to detect various types of anomalous trajectory changes using an external camera. The proposed framework efficiently incorporates marker-based pose estimation, Long Short-Term Memory (LSTM), and residual control charts. The framework was evaluated in a shared human-robot assembly task. The results show that we can accurately estimate robot trajectories by enhancing camera-based measurements. Moreover, it effectively detects anomalous trajectory changes in their early stages. The motion deviation upon detection assists in determining a safe working distance. The framework also exhibits generalizability to previously unseen trajectory deviations and possesses adaptability to other types of external sensors. 

Instructions: 

Each file contains various continuous measurements regarding the robot trajectory during a collaborative human-robot assembly task.

These measurements include time stamps, the ground-truth robot pose, camera-based robot pose estimations, etc.
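As a starting point, the snippet below sketches how one might load a measurement file and compare the camera-based pose estimates against the ground-truth pose. The inline CSV text, column names, and axis labels are hypothetical stand-ins; inspect the actual dataset files for the real schema.

```python
import csv
import io

# Hypothetical stand-in for one dataset file: each row is one
# continuous measurement (time stamp, ground-truth pose, camera-based
# pose estimate). Real files may use different column names.
csv_text = """timestamp,gt_x,gt_y,gt_z,est_x,est_y,est_z
0.00,0.30,0.10,0.50,0.31,0.09,0.51
0.05,0.31,0.10,0.50,0.30,0.11,0.50
"""

rows = list(csv.DictReader(io.StringIO(csv_text)))

# Per-axis estimation error: camera-based estimate minus ground truth.
errors = [
    {axis: float(row[f"est_{axis}"]) - float(row[f"gt_{axis}"])
     for axis in ("x", "y", "z")}
    for row in rows
]

for row, err in zip(rows, errors):
    print(row["timestamp"], err)
```

When working with the real files, replace the inline text with `open(path)` and adjust the column names to match the file headers.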