Robotics

This dataset contains both artificial and real images of bramble flowers. The real images were taken with an Intel RealSense D435 camera inside the West Virginia University greenhouse. All flowers are annotated in YOLO format with bounding boxes and class names. The trained weights are also provided and can be used with the included Python script to detect bramble flowers. In addition, the classifier can determine whether a flower's center is visible or hidden, which is useful for precision pollination projects.
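YOLO-format label files store one object per line as normalized values: class index, box center, and box size. As a minimal sketch of how such an annotation can be mapped back to pixel coordinates (the function name and the class-to-meaning mapping are illustrative, not taken from this dataset's script):

```python
def yolo_to_pixel_bbox(line, img_w, img_h):
    """Convert one YOLO label line ("class cx cy w h", all normalized to [0, 1])
    into (class_id, x_min, y_min, x_max, y_max) in pixel coordinates."""
    class_id, cx, cy, w, h = line.split()
    cx, cy, w, h = (float(v) for v in (cx, cy, w, h))
    x_min = (cx - w / 2) * img_w
    y_min = (cy - h / 2) * img_h
    x_max = (cx + w / 2) * img_w
    y_max = (cy + h / 2) * img_h
    return int(class_id), x_min, y_min, x_max, y_max

# Hypothetical example: class 0 on a 640x480 image
print(yolo_to_pixel_bbox("0 0.5 0.5 0.25 0.25", 640, 480))
# -> (0, 240.0, 180.0, 400.0, 300.0)
```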

This dataset serves as a valuable extension to the article "Algorithmic Framework for Analyzing and Simulating Multi-axial Robotic Transformations in Spatial Coordinates." It provides Python implementations of the simulation algorithm detailed in the paper. The scripts are designed to allow seamless adoption of and experimentation with the proposed algorithm, enhancing its usability for researchers and practitioners alike.
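To illustrate the kind of multi-axial spatial transformation such a simulation works with (this is a generic homogeneous-coordinates example, not the paper's algorithm), here is a rotation about the z-axis composed with a translation:

```python
import math

def rot_z(theta):
    """4x4 homogeneous rotation about the z-axis by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

def translate(tx, ty, tz):
    """4x4 homogeneous translation."""
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

def matmul(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(m, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    v = [p[0], p[1], p[2], 1]
    return tuple(sum(m[i][k] * v[k] for k in range(4)) for i in range(3))

# Rotate 90 degrees about z, then translate by (1, 0, 0):
T = matmul(translate(1, 0, 0), rot_z(math.pi / 2))
print(apply(T, (1, 0, 0)))  # approximately (1.0, 1.0, 0.0)
```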

The Lettuce Farm SLAM Dataset (LFSD) is a VSLAM dataset based on RGB and depth images captured by the VegeBot robot in a lettuce farm. The dataset consists of RGB and depth images, IMU data, and RTK-GPS sensor data. Detection and tracking of lettuce plants in the images are annotated in the standard Multiple Object Tracking (MOT) format. The dataset aims to accelerate the development of algorithms for localization and mapping in agricultural fields, as well as for crop detection and tracking.
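MOT-format ground-truth files are comma-separated, one object per line: frame, track id, bounding box (left, top, width, height), confidence, and 3D coordinates (often -1 when unused). A minimal parsing sketch, assuming that standard field layout:

```python
from collections import namedtuple

MOTEntry = namedtuple("MOTEntry", "frame track_id left top width height conf")

def parse_mot_line(line):
    """Parse one MOT ground-truth line:
    frame,id,bb_left,bb_top,bb_width,bb_height,conf,x,y,z"""
    fields = line.strip().split(",")
    frame, track_id = int(fields[0]), int(fields[1])
    left, top, width, height, conf = (float(v) for v in fields[2:7])
    return MOTEntry(frame, track_id, left, top, width, height, conf)

# Hypothetical annotation line for one tracked lettuce plant
print(parse_mot_line("1,3,120.0,84.0,56.0,62.0,1,-1,-1,-1"))
```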

In this investigation, the researchers used a commercially available millimeter-wave (MMW) radar to collect data and assess the performance of deep learning algorithms in distinguishing different objects. The research examines how ambient factors such as height, distance, and lighting affect object recognition in both the static and dynamic stages of the radar.

Visual perception can improve transitions between different locomotion mode controllers (e.g., level-ground walking to stair ascent) by sensing the walking environment prior to physical interactions. Here we developed the "StairNet" dataset to support the development of vision-based stair recognition systems. The dataset builds on ExoNet – the largest open-source dataset of egocentric images of real-world walking environments.

Dataset associated with a paper at the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS):

"Talk the talk and walk the walk: Dialogue-driven navigation in unknown indoor environments"

If you use this code or data, please cite the above paper.

This is the original data and the fitted data from the paper "Scalability in Computing and Robotics" by Heiko Hamann and Andreagiovanni Reina. For more information, please refer to the paper.

Because planetary data can be difficult to handle, we provide downloadable files in PNG format from the Chang'E-3 and Chang'E-4 missions, along with a set of scripts to perform the conversion on a different PDS4 dataset.

Validation data for the article "A framework for assessing, comparing and predicting the performance of autonomous RFID-based inventory robots for retail".

Dataset of rosbags collected during autonomous drone flights inside a warehouse containing stockpiles. PCD files were created using the reconstruction method proposed in the article.

Data is still being moved to IEEE DataPort.
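PCD files (the Point Cloud Library's format) begin with a plain-text header followed by the point data. A minimal sketch that parses an ASCII-format PCD string into point tuples, assuming the fields are `x y z` (real PCD files may carry extra fields and binary data, which this sketch does not handle):

```python
def parse_ascii_pcd(text):
    """Parse a minimal ASCII-format PCD string into a list of (x, y, z) tuples."""
    points, in_data = [], False
    for line in text.strip().splitlines():
        if in_data:
            x, y, z = (float(v) for v in line.split()[:3])
            points.append((x, y, z))
        elif line.startswith("DATA"):
            if line.split()[1] != "ascii":
                raise ValueError("only DATA ascii is supported in this sketch")
            in_data = True
    return points

# Hypothetical two-point cloud
sample = """# .PCD v0.7 - Point Cloud Data file format
VERSION 0.7
FIELDS x y z
SIZE 4 4 4
TYPE F F F
COUNT 1 1 1
WIDTH 2
HEIGHT 1
POINTS 2
DATA ascii
0.0 0.0 0.0
1.5 -2.0 0.5"""
print(parse_ascii_pcd(sample))  # [(0.0, 0.0, 0.0), (1.5, -2.0, 0.5)]
```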
