Context-Aware Human Activity Recognition (CAHAR) Dataset

Citation Author(s):
John Mitchell, University of Leeds
Abbas Dehghani-Sanij, University of Leeds
Sheng Xie, University of Leeds
Rory O'Connor, University of Leeds
Submitted by:
John Mitchell
Last updated:
Fri, 06/21/2024 - 11:24
DOI:
10.21227/bwee-bv18
Data Format:
License:

Abstract 

This dataset consists of inertial, force, color, and LiDAR data collected from a novel sensor system. The system comprises three Inertial Measurement Units (IMUs) positioned on the waist and atop each foot, a color sensor on each outer foot, a LiDAR on the back of each shank, and a custom Force-Sensing Resistor (FSR) insole featuring 13 FSRs in each shoe. Twenty participants wore this sensor system whilst performing 38 combinations of 11 activities on 9 different terrains, totaling over 7.8 hours of data. Participants 1 and 3 performed additional activity-terrain combinations, which can be used as a test set or as additional training data. This dataset is intended to enable researchers to develop classification techniques capable of identifying both the performed activity and the terrain on which it was performed. Such classification techniques will be instrumental in the adoption of remote, real-environment gait analysis with increased accuracy, reproducibility, and scope for diagnosis.
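As a starting point for the activity-terrain classification task described above, the sketch below segments a multimodal recording into overlapping windows and extracts simple per-channel statistics, a common baseline in human activity recognition. The dataset's file format and sampling rate are not specified here, so the recording is synthesized; the channel count follows the abstract (3 IMUs assumed at 6 axes each, 2 × 13 FSR insoles, 2 color sensors, 2 LiDARs), and the window/hop lengths are illustrative choices, not values from the dataset.

```python
import numpy as np

# Assumed values -- not stated in the abstract:
FS = 100  # sampling rate in Hz (hypothetical)
N_CHANNELS = 3 * 6 + 2 * 13 + 2 + 2  # 48 channels, per the sensor list above

# Synthetic stand-in for one recording: (samples, channels)
rng = np.random.default_rng(0)
recording = rng.standard_normal((10 * FS, N_CHANNELS))  # 10 s of data

def sliding_windows(data, win_s=2.0, hop_s=1.0, fs=FS):
    """Segment a (samples, channels) array into overlapping windows."""
    win, hop = int(win_s * fs), int(hop_s * fs)
    starts = range(0, data.shape[0] - win + 1, hop)
    return np.stack([data[s:s + win] for s in starts])

def window_features(windows):
    """Per-channel mean and standard deviation for each window."""
    return np.concatenate([windows.mean(axis=1), windows.std(axis=1)], axis=1)

# One feature row per window; rows would feed an activity/terrain classifier.
X = window_features(sliding_windows(recording))
print(X.shape)  # → (9, 96)
```

Each feature row could then be paired with its activity and terrain labels for a joint or multi-output classifier; richer features (frequency-domain, gait-cycle-aligned) would likely be needed in practice.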

Funding Agency: 
Engineering and Physical Sciences Research Council
Grant Number: 
EP/T517860/1