ExoNet Database: Wearable Camera Images of Human Locomotion Environments
Abstract: Recent advances in computer vision and deep learning are allowing researchers to develop automated environment recognition systems for robotic leg prostheses and exoskeletons. However, small-scale and private training datasets have impeded the widespread development and dissemination of image classification algorithms (e.g., convolutional neural networks) for recognizing human walking environments. To address these limitations, we developed "ExoNet," the first open-source, large-scale hierarchical database of high-resolution wearable camera images (i.e., egocentric perception) of legged locomotion environments. Unparalleled in both scale and diversity, ExoNet contains over 5.6 million RGB images of indoor and outdoor real-world walking environments, collected using a lightweight wearable camera throughout the summer, fall, and winter seasons. Approximately 923,000 images in ExoNet were human-annotated using a novel, 12-class hierarchical labelling architecture. Available publicly through IEEE DataPort, ExoNet offers an unprecedented communal platform to train, develop, and compare next-generation image classification algorithms for human locomotion environment recognition. In addition to robotic leg prostheses and exoskeletons, applications of ExoNet could extend to humanoids, autonomous legged robots, powered wheelchairs, and other mobility assistive technologies.
1) Laschowski B, McNally W, Wong A, and McPhee J. (2020). ExoNet Database: Wearable Camera Images of Human Locomotion Environments. Frontiers in Robotics and AI, 7, 562061. DOI: 10.3389/frobt.2020.562061.
2) Laschowski B, McNally W, Wong A, and McPhee J. (2021). Computer Vision and Deep Learning for Environment-Adaptive Control of Robotic Lower-Limb Exoskeletons. Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC). DOI: 10.1109/EMBC46164.2021.9630064.
3) Laschowski B. (2021). Energy Regeneration and Environment Sensing for Robotic Leg Prostheses and Exoskeletons. PhD Dissertation. University of Waterloo. http://hdl.handle.net/10012/17816.
4) Laschowski B, McNally W, Wong A, and McPhee J. (2022). Environment Classification for Robotic Leg Prostheses and Exoskeletons using Deep Convolutional Neural Networks. Frontiers in Neurorobotics, 15, 730965. DOI: 10.3389/fnbot.2021.730965.
*Details on the ExoNet database are provided in the references above. Please email Brokoslaw Laschowski (firstname.lastname@example.org) with any additional questions or for technical assistance.