Datasets
Open Access
MEx - Multi-modal Exercise Dataset
- Submitted by: Anjana Wijekoon
- Last updated: Tue, 10/01/2019 - 10:57
- DOI: 10.21227/h7g2-a333
Abstract
The Multi-modal Exercise (MEx) Dataset is a multi-sensor, multi-modal dataset created to benchmark Human Activity Recognition (HAR) and multi-modal fusion algorithms. Its collection was motivated by the need for recognising exercises and evaluating the quality of their performance, in order to support patients with Musculoskeletal Disorders (MSD). The MEx Dataset contains data from 25 people recorded with four sensors: two accelerometers, a pressure mat, and a depth camera. Seven exercises highly recommended by physiotherapists for patients with low-back pain were selected for the data collection. The two accelerometers were placed on the wrist and the thigh of each participant, who performed the exercises on the pressure mat while being recorded by the depth camera from above. Each person performed each exercise for a maximum of 60 seconds. The dataset spans three data modalities, numerical time-series data, video data, and pressure-sensor data, posing interesting research challenges for HAR and exercise quality assessment. Given recent advances in multi-modal fusion, we believe MEx is instrumental for benchmarking not only HAR algorithms but also fusion algorithms for heterogeneous data types across multiple application domains.
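A common first step when working with time-series recordings like the accelerometer streams described above is sliding-window segmentation. The sketch below is illustrative only and not the dataset's official preprocessing: the 100 Hz sampling rate, 5 s window, and 2 s step are assumed values, and the random array stands in for real sensor data.

```python
import numpy as np

def sliding_windows(signal, window_size, step):
    """Segment a (T, C) multi-channel time series into overlapping windows.

    Returns an array of shape (num_windows, window_size, C).
    """
    T = signal.shape[0]
    starts = range(0, T - window_size + 1, step)
    return np.stack([signal[s:s + window_size] for s in starts])

# Hypothetical example: 60 s of tri-axial accelerometer data at an assumed
# 100 Hz, cut into 5 s windows with a 2 s step (3 s overlap).
rate = 100
acc = np.random.randn(60 * rate, 3)  # placeholder for real sensor data
windows = sliding_windows(acc, window_size=5 * rate, step=2 * rate)
print(windows.shape)  # → (28, 500, 3)
```

The same windowing can be applied to the pressure-mat and depth-camera frames so that all three modalities are aligned on a shared time axis before fusion.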
Dataset Files
- MEx.zip (79.39 MB)
- knn.zip (9.24 kB): baseline code applying kNN to each sensor, including file reading, sliding-window segmentation, a DCT transform (accelerometer data only), and leave-one-person-out (LOPO) evaluation
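The baseline pipeline named above (DCT features, kNN, LOPO evaluation) can be sketched as follows. This is a minimal illustration, not the authors' code: the synthetic per-person data, the choice of 20 DCT coefficients, and the 1-nearest-neighbour classifier are all assumptions.

```python
import numpy as np
from scipy.fft import dct

def dct_features(windows, k=20):
    # windows: (N, W, C); keep the first k DCT coefficients per channel
    coeffs = dct(windows, axis=1, norm='ortho')[:, :k, :]
    return coeffs.reshape(len(windows), -1)

def knn_predict(train_X, train_y, test_X):
    # 1-nearest-neighbour classification with Euclidean distance
    d = np.linalg.norm(test_X[:, None, :] - train_X[None, :, :], axis=2)
    return train_y[np.argmin(d, axis=1)]

# Synthetic stand-in for windowed accelerometer data: 4 people,
# 2 exercise classes with clearly separable signal means.
rng = np.random.default_rng(0)
people, X, y, pid = range(4), [], [], []
for p in people:
    for cls in (0, 1):
        w = rng.normal(loc=3.0 * cls, scale=0.5, size=(10, 100, 3))
        X.append(dct_features(w))
        y.extend([cls] * 10)
        pid.extend([p] * 10)
X, y, pid = np.vstack(X), np.array(y), np.array(pid)

# Leave-one-person-out (LOPO) evaluation: hold out each person in turn,
# train on everyone else, and average the per-person accuracies.
accs = []
for p in people:
    test, train = pid == p, pid != p
    pred = knn_predict(X[train], y[train], X[test])
    accs.append((pred == y[test]).mean())
print(np.mean(accs))
```

LOPO is the natural protocol here because each of the 25 participants contributes many windows; splitting windows randomly would leak person-specific signal patterns between train and test sets.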
Open Access dataset files are accessible to all logged-in users. Don't have a login? Create a free IEEE account. IEEE Membership is not required.
Documentation
Attachment | Size
---|---
Exercises.pdf | 47.07 KB