MEx - Multi-modal Exercise Dataset

Citation Author(s):
Anjana Wijekoon, School of Computing and Digital Media, Robert Gordon University, Aberdeen, UK
Nirmalie Wiratunga, School of Computing and Digital Media, Robert Gordon University, Aberdeen, UK
Kay Cooper, School of Health Sciences, Robert Gordon University, Aberdeen, UK
Submitted by:
Anjana Wijekoon
Last updated:
Tue, 10/01/2019 - 10:57
DOI:
10.21227/h7g2-a333

Abstract 

MEx is a multi-sensor, multi-modal dataset collected to benchmark Human Activity Recognition (HAR) and multi-modal fusion algorithms. Its collection was motivated by the need to recognise and evaluate the quality of exercise performance in order to support patients with Musculoskeletal Disorders (MSD). The dataset contains recordings of 25 people captured with four sensors: two accelerometers, a pressure mat and a depth camera. Seven exercises commonly recommended by physiotherapists for patients with low-back pain were selected for the data collection. The two accelerometers were placed on the wrist and the thigh of each participant, who performed the exercises on the pressure mat while being recorded from above by the depth camera. Each person performed each exercise for a maximum of 60 seconds. The dataset spans three data modalities: numerical time-series data, video data and pressure-sensor data, posing interesting research challenges for HAR and exercise quality assessment. Given recent advances in multi-modal fusion, we believe MEx is instrumental for benchmarking not only HAR algorithms but also fusion algorithms for heterogeneous data types across multiple application domains.
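As a minimal sketch of how the accelerometer streams described above might be prepared for HAR, the snippet below segments a time series into fixed-length overlapping windows, a common preprocessing step. The sampling rate (100 Hz), the (x, y, z) tuple layout, and the window parameters are illustrative assumptions, not part of the dataset specification; consult the dataset documentation (Exercises.pdf) for the actual file format.

```python
# Hypothetical windowing sketch for MEx-style accelerometer data.
# Sampling rate and sample layout are assumptions for illustration only.

def sliding_windows(samples, window_size, step):
    """Split a sequence of (x, y, z) samples into overlapping windows."""
    windows = []
    for start in range(0, len(samples) - window_size + 1, step):
        windows.append(samples[start:start + window_size])
    return windows

# Example: one 60-second recording at an assumed 100 Hz -> 6000 samples.
stream = [(0.0, 0.0, 9.8)] * 6000

# 5-second windows (500 samples) with 50% overlap (step of 250 samples).
wins = sliding_windows(stream, window_size=500, step=250)
print(len(wins))  # -> 23 windows
```

Each window can then be fed to a HAR model as one training instance; the same segmentation would typically be applied in parallel to the thigh accelerometer, pressure-mat frames, and depth frames so that the modalities stay time-aligned for fusion.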


Dataset Files


Documentation

Exercises.pdf (47.07 KB)