Computer vision can be used by robotic leg prostheses and exoskeletons to improve high-level transitions between different locomotion modes (e.g., level-ground walking to stair ascent) through the prediction of future environmental states. Here we developed the StairNet dataset to support research and development in vision-based automated stair recognition.
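As an illustration of the intended use case, here is a minimal classification sketch; it assumes 224x224 RGB frames, a generic ResNet-18 backbone from torchvision, and placeholder class labels and file names, none of which are specified by the StairNet dataset itself.

    # Minimal sketch: classify a single frame into a locomotion-environment class.
    # The class labels, input size and backbone are assumptions for illustration,
    # not the StairNet specification.
    import torch
    import torchvision.transforms as T
    from torchvision.models import resnet18
    from PIL import Image

    CLASSES = ["level_ground", "stair_ascent", "stair_descent"]  # hypothetical labels

    preprocess = T.Compose([
        T.Resize((224, 224)),
        T.ToTensor(),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    model = resnet18(num_classes=len(CLASSES))  # untrained backbone, for illustration only
    model.eval()

    image = Image.open("frame.jpg").convert("RGB")      # placeholder file name
    with torch.no_grad():
        logits = model(preprocess(image).unsqueeze(0))  # shape: (1, len(CLASSES))
    print(CLASSES[logits.argmax(dim=1).item()])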


Dataset associated with a paper in the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS):

"Talk the talk and walk the walk: Dialogue-driven navigation in unknown indoor environments"

If you use this code or data, please cite the above paper.

 

Instructions: 

See the docs directory.


This is the original data and the fitted data from the paper "Scalability in Computing and Robotics" by Heiko Hamann and Andreagiovanni Reina. For more information, please refer to the paper.


Given the difficulty of handling planetary data, we provide downloadable files in PNG format from the Chang'E-3 and Chang'E-4 missions, along with a set of scripts to perform the same conversion for a different PDS4 dataset.

Instructions: 

Please see the README inside the ZIP files for more information about the provided data and scripts.
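As a rough illustration of what such a conversion involves, here is a minimal sketch assuming the pds4_tools and Pillow packages; the label file name is a placeholder, and the actual scripts in the ZIP files may differ.

    # Minimal sketch: convert one PDS4 image product to an 8-bit PNG.
    # "product.xml" is a placeholder label name; pds4_tools and Pillow are
    # assumed to be installed. The provided scripts may work differently.
    import numpy as np
    import pds4_tools
    from PIL import Image

    structures = pds4_tools.read("product.xml")        # parse the PDS4 label and data
    array = np.asarray(structures[0].data, dtype=np.float64)

    # Scale raw values to 8-bit grey levels before writing the PNG.
    lo, hi = np.nanmin(array), np.nanmax(array)
    if hi > lo:
        scaled = ((array - lo) / (hi - lo) * 255.0).astype(np.uint8)
    else:
        scaled = np.zeros_like(array, dtype=np.uint8)

    Image.fromarray(scaled).save("product.png")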


Validation data for the article "A framework for assessing, comparing and predicting the performance of autonomous RFID-based inventory robots for retail".


Dataset of rosbags collected during autonomous drone flights inside a warehouse of stockpiles. PCD files were created using the reconstruction method proposed in the article.

Data are still being moved to IEEE DataPort.

Instructions: 

Bag files contain multiple topics. The proposed method mainly uses the Velodyne LiDAR point cloud and the DJI IMU data.
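For orientation, a minimal reading sketch using the ROS 1 rosbag Python API is given below; the topic and file names are assumptions and should be checked against the actual bags (e.g. with `rosbag info`).

    # Minimal sketch: iterate over LiDAR and IMU messages in one bag file.
    # Topic and file names are assumptions, not guaranteed by the dataset;
    # check them with `rosbag info <file>.bag` first.
    import rosbag

    LIDAR_TOPIC = "/velodyne_points"   # hypothetical Velodyne PointCloud2 topic
    IMU_TOPIC = "/dji_sdk/imu"         # hypothetical DJI IMU topic

    with rosbag.Bag("flight.bag") as bag:
        for topic, msg, t in bag.read_messages(topics=[LIDAR_TOPIC, IMU_TOPIC]):
            if topic == LIDAR_TOPIC:
                # sensor_msgs/PointCloud2: width * height points per scan
                print(t.to_sec(), "lidar scan:", msg.width * msg.height, "points")
            else:
                # sensor_msgs/Imu: orientation, angular velocity, linear acceleration
                print(t.to_sec(), "imu sample:", msg.linear_acceleration)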


Synergistic prostheses enable the coordinated movement of the human-prosthetic arm, as required by activities of daily living. This is achieved by coupling the motion of the prosthesis to the human command, such as residual limb movement in motion-based interfaces. Previous studies demonstrated that developing human-prosthetic synergies in joint-space must consider individual motor behaviour and the intended task to be performed, requiring personalisation and task calibration.

Instructions: 

Task-space synergy comparison dataset for the experiments performed in 2019-2020.

Directory:

  • Processed: Processed data from MATLAB in ".mat" format. Organised by session and subject.
  • Raw: Raw time-series data gathered from sensors in ".csv" format. Each file represents a trial where a subject performed a reaching task. Organised by subject, modality and session. Anonymised subject information is included in a ".json" file. A minimal loading sketch is given after this list.
    • Columns of the time-series files represent the different data gathered.
    • Rows of the time-series files represent the values at the given time "t".
  • Scripts: MATLAB scripts used to process and plot data. See ProcessAndUpdateSubjectData for data processing steps.
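Below is a minimal sketch of loading one Raw trial with pandas; the file names are placeholders, since the actual names follow the subject, modality and session organisation described above.

    # Minimal sketch: load one raw trial (.csv) and the subject info (.json).
    # File names are placeholders; see the directory layout above for the
    # real organisation by subject, modality and session.
    import json
    import pandas as pd

    trial = pd.read_csv("subject01_session01_trial01.csv")   # rows = samples at time t
    print(trial.shape)               # (number of time steps, number of recorded signals)
    print(trial.columns.tolist())    # the different data gathered

    with open("subject01_info.json") as f:                   # anonymised subject information
        subject_info = json.load(f)
    print(subject_info)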

We introduce a new robotic RGB-D dataset with difficult luminosity conditions: ONERA.ROOM. It comprises RGB-D data (as pairs of images) and corresponding annotations in PASCAL VOC format (XML files).

It aims at people detection, in (mostly) indoor and outdoor environments. People in the field of view can be standing, but also lying on the ground, as after a fall.

Instructions: 

To facilitate the use of some deep learning software, a folder tree with relative symbolic links (thus avoiding extra space) will gather all the sequences in three folders:

|
|— image
|       |— sequenceName0_imageNumber_timestamp0.jpg
|       |— sequenceName0_imageNumber_timestamp1.jpg
|       |— sequenceName0_imageNumber_timestamp2.jpg
|       |— sequenceName0_imageNumber_timestamp3.jpg
|       |— …
|
|— depth_8bits
|       |— sequenceName0_imageNumber_timestamp0.png
|       |— sequenceName0_imageNumber_timestamp1.png
|       |— sequenceName0_imageNumber_timestamp2.png
|       |— sequenceName0_imageNumber_timestamp3.png
|       |— …
|
|— annotations
|       |— sequenceName0_imageNumber_timestamp0.xml
|       |— sequenceName0_imageNumber_timestamp1.xml
|       |— sequenceName0_imageNumber_timestamp2.xml
|       |— sequenceName0_imageNumber_timestamp3.xml
|       |— …
|
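Because the annotations follow the standard PASCAL VOC layout, they can be read with any VOC-compatible tool; here is a minimal parsing sketch using only the Python standard library, with a placeholder file name taken from the tree above.

    # Minimal sketch: print the bounding boxes stored in one PASCAL VOC annotation.
    # The file name is a placeholder following the naming pattern shown above.
    import xml.etree.ElementTree as ET

    root = ET.parse("sequenceName0_imageNumber_timestamp0.xml").getroot()
    for obj in root.iter("object"):
        name = obj.findtext("name")                      # annotated class, e.g. a person
        box = obj.find("bndbox")
        xmin = int(float(box.findtext("xmin")))
        ymin = int(float(box.findtext("ymin")))
        xmax = int(float(box.findtext("xmax")))
        ymax = int(float(box.findtext("ymax")))
        print(name, (xmin, ymin, xmax, ymax))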


These datasets contain recordings from the proprioceptive sensors of the hydraulically actuated robot HyQ. They include absolute and relative encoders, force and torque sensors, and MEMS-based and fibre-optic-based inertial measurement units (IMUs). Additionally, a motion capture system recorded the ground-truth data with millimetre accuracy. In the datasets, HyQ was manually controlled to trot in place or move around the laboratory. The sequences include: forward and backwards motion, side-to-side motion, zig-zags, yaw motion, and a mix of linear and yaw motion.

Instructions: 

Please see instructions.pdf.


In recent years, researchers have explored human body posture and motion to control robots in more natural ways. These interfaces require the ability to track the body movements of the user in three dimensions. Deploying motion capture systems for tracking tends to be costly and intrusive and requires a clear line of sight, making them ill-suited for applications that need fast deployment. In this article, we use consumer-grade armbands, capturing orientation information and muscle activity, to interact with a robotic system through a state machine controlled by a body motion classifier.
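To make the control flow concrete, here is a minimal sketch of a state machine driven by a body-motion classifier; the states, gesture labels and transitions are illustrative assumptions, not the design used in the article.

    # Minimal sketch: a robot-interaction state machine driven by the output of a
    # body-motion classifier. States, gestures and transitions are illustrative
    # assumptions, not the article's actual design.
    IDLE, TELEOPERATION, STOPPED = "idle", "teleoperation", "stopped"

    TRANSITIONS = {
        (IDLE, "raise_arm"): TELEOPERATION,      # start controlling the robot
        (TELEOPERATION, "cross_arms"): STOPPED,  # emergency-stop gesture
        (STOPPED, "raise_arm"): IDLE,            # reset to idle
    }

    def step(state, gesture):
        """Return the next state for the classifier's gesture label."""
        return TRANSITIONS.get((state, gesture), state)  # unknown gestures keep the state

    # Example run over a stream of classifier outputs.
    state = IDLE
    for gesture in ["raise_arm", "wave", "cross_arms", "raise_arm"]:
        state = step(state, gesture)
        print(gesture, "->", state)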

