
Open Access

ExoNet: Egocentric Images of Walking Environments

Citation Author(s):
Brokoslaw Laschowski (University of Waterloo)
William McNally (University of Waterloo)
Alexander Wong (University of Waterloo)
John McPhee (University of Waterloo)
Submitted by:
Brokoslaw Laschowski
DOI:
10.21227/rz46-2n31

Abstract

Egocentric vision is important for environment-adaptive control of humans and robots. Here we developed ExoNet, the largest open-source dataset of wearable camera images of real-world walking environments. The dataset contains over 5.6 million RGB images of indoor and outdoor environments, which were collected during summer, fall, and winter seasons. Over 923,000 images were human-annotated using a 12-class hierarchical labelling architecture. ExoNet serves as a communal platform to develop, optimize, and compare new deep learning models for egocentric visual perception, with applications in robotics and neuroscience. 
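
As a rough illustration of how the dataset might be used, the sketch below fine-tunes an off-the-shelf convolutional network on the 12 high-level classes using PyTorch. The directory layout (exonet/<class_name>/*.jpg), the ResNet-18 backbone, and all hyperparameters are assumptions made for this sketch; they are not the pipeline from the reference paper, which should be consulted for the actual labelling hierarchy and training details.

```python
# Hypothetical fine-tuning sketch for ExoNet (PyTorch/torchvision).
# Assumes images are organized as exonet/<class_name>/*.jpg; the actual
# download layout may differ -- see the reference paper for details.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 12  # ExoNet's 12-class hierarchical labelling architecture

# Standard ImageNet preprocessing so pretrained weights can be reused.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# ImageFolder infers class labels from subdirectory names (assumed layout).
dataset = datasets.ImageFolder("exonet", transform=preprocess)
loader = DataLoader(dataset, batch_size=64, shuffle=True, num_workers=4)

# Replace the final layer of a pretrained backbone with a 12-way head.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # one epoch shown for brevity
    images, labels = images.to(device), labels.to(device)
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

Because the labels are hierarchical, the 12 leaf classes could also be collapsed into coarser parent categories before training, depending on the level of the hierarchy under study.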

Reference:

Laschowski B, McNally W, Wong A, and McPhee J. (2022). Environment classification for robotic leg prostheses and exoskeletons using deep convolutional neural networks. Frontiers in Neurorobotics. DOI: 10.3389/fnbot.2021.730965.

Instructions:

Details on the ExoNet database are provided in the reference above. Please email Dr. Laschowski (blaschow@uwaterloo.ca) with any additional questions or for technical assistance.

Can anyone actually train on this dataset? I feel that a lot of the labels are wrong.
ZB Qiao, Sat, 03/18/2023 - 06:03

Dataset Files

Open Access dataset files are accessible to all logged-in users. A free IEEE account is sufficient; IEEE membership is not required.