StairNet: A Computer Vision Dataset for Stair Recognition

Citation Author(s):
Andrew Garrett Kurbis, University of Toronto
Alex Mihailidis, University of Toronto
Brokoslaw Laschowski, University of Toronto
Submitted by:
Brokoslaw Laschowski
Last updated:
Wed, 03/13/2024 - 06:11
DOI:
10.21227/12jm-e336

Abstract 

Visual perception can improve transitions between different locomotion mode controllers (e.g., level-ground walking to stair ascent) by sensing the walking environment prior to physical interactions. Here we developed the "StairNet" dataset to support the development of vision-based stair recognition systems. The dataset builds on ExoNet, the largest open-source dataset of egocentric images of real-world walking environments. StairNet contains ~515,000 labelled images from six of the twelve original ExoNet classes. These images were reclassified using new class definitions designed to sharpen the boundaries between classes. The dataset was manually parsed several times during annotation to reduce misclassification errors and to remove images with large obstructions. The StairNet dataset opens new opportunities for environment-adaptive control of robotic leg prostheses and exoskeletons.

References:

1. Kurbis AG, Laschowski B, and Mihailidis A. (2022). Stair recognition for robotic exoskeleton control using computer vision and deep learning. IEEE International Conference on Rehabilitation Robotics (ICORR). DOI: 10.1109/ICORR55369.2022.9896501.

2. Kuzmenko D, Tsepa O, Kurbis AG, Mihailidis A, and Laschowski B. (2023). Efficient visual perception of human-robot walking environments using semi-supervised learning. IEEE International Conference on Intelligent Robots and Systems (IROS). DOI: 10.1109/IROS55552.2023.10341654. 

3. Kurbis AG, Mihailidis A, and Laschowski B. (2024). Development and mobile deployment of a stair recognition system for human-robot locomotion. IEEE Transactions on Medical Robotics and Bionics. DOI: 10.1109/TMRB.2024.3349602. 

4. Kurbis AG, Kuzmenko D, Ivanyuk-Skulskiy B, Mihailidis A, and Laschowski B. (2024). StairNet: Visual recognition of stairs for human-robot locomotion. BioMedical Engineering OnLine. DOI: 10.1186/s12938-024-01216-0. 

5. Ivanyuk-Skulskiy B, Kurbis AG, Mihailidis A, and Laschowski B. (2024). Sequential image classification of human-robot walking environments using temporal neural networks. IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob). Under review.  

Instructions: 

Details regarding the StairNet dataset are provided in the ReadMe file. The TFRecords format of the dataset is available upon request. Please email Dr. Brokoslaw Laschowski (brokoslaw.laschowski@utoronto.ca) with any additional questions and/or requests for technical assistance.
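
For reference, once the TFRecord files are obtained, they could be loaded with TensorFlow along the lines of the minimal sketch below. The feature keys ("image/encoded", "image/class/label"), the image size, and the file naming pattern are assumptions for illustration only and should be confirmed against the ReadMe file.

    # Minimal sketch for loading StairNet TFRecords with TensorFlow.
    # Feature keys, image size, and file names are assumptions; see the ReadMe.
    import tensorflow as tf

    def parse_example(serialized):
        # Hypothetical feature specification for one serialized example.
        features = {
            "image/encoded": tf.io.FixedLenFeature([], tf.string),
            "image/class/label": tf.io.FixedLenFeature([], tf.int64),
        }
        parsed = tf.io.parse_single_example(serialized, features)
        image = tf.io.decode_jpeg(parsed["image/encoded"], channels=3)
        image = tf.image.resize(image, [224, 224]) / 255.0  # scale to [0, 1]
        return image, parsed["image/class/label"]

    # Build an input pipeline over all matching TFRecord shards.
    dataset = (
        tf.data.TFRecordDataset(tf.io.gfile.glob("stairnet-*.tfrecord"))
        .map(parse_example, num_parallel_calls=tf.data.AUTOTUNE)
        .batch(32)
        .prefetch(tf.data.AUTOTUNE)
    )

The resulting tf.data.Dataset yields batches of (image, label) pairs that can be passed directly to a Keras model for training or evaluation.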