EMG-EEG Dataset for Upper-Limb Gesture Classification

Citation Author(s):
Boreom Lee
Gwangju Institute of Science and Technology
Submitted by:
Boreom Lee
Last updated:
Thu, 06/22/2023 - 02:48
DOI:
10.21227/5ztn-4k41
Data Format:
.CSV, .MAT
License:

Abstract 

Electromyography (EMG) has limitations in human-machine interfaces due to disturbances such as electrode shift, muscle fatigue, and inter-subject variability. A potential solution for preventing model degradation is to combine multimodal data such as EMG and electroencephalography (EEG). This study presents an EMG-EEG dataset to support the development of upper-limb assistive rehabilitation devices. The dataset, acquired from thirty-three volunteers without neuromuscular dysfunction or disease using commercial biosensors, is easily replicable and deployable. It consists of seven distinct gestures chosen to maximize performance on the Toronto Rehabilitation Institute hand function test and the Jebsen-Taylor hand function test. The authors hope this dataset will benefit the research community in creating intelligent, neuro-inspired upper-limb assistive rehabilitation devices.
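
As a concrete illustration of the multimodal idea, the sketch below aligns the two streams to a shared timeline before fusing them. It is a minimal Python example, assuming 8-channel EMG at 200 Hz and 8-channel EEG at 250 Hz as described in the instructions below; the function name and array layout are hypothetical and not part of the official starter code.

    import numpy as np
    from scipy.signal import resample_poly

    def fuse_emg_eeg(emg, eeg):
        """Resample 250 Hz EEG to the 200 Hz EMG rate, then concatenate channels.

        emg: (n_emg_samples, 8) raw EMG at 200 Hz
        eeg: (n_eeg_samples, 8) raw EEG at 250 Hz
        """
        # 250 Hz * 4/5 = 200 Hz; polyphase resampling avoids aliasing.
        eeg_200 = resample_poly(eeg, up=4, down=5, axis=0)
        # Trim both streams to the shorter duration before stacking.
        n = min(len(emg), len(eeg_200))
        # emg: (n, 8) and eeg_200: (n, 8) -> fused: (n, 16)
        return np.hstack([emg[:n], eeg_200[:n]])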

Instructions: 

The electromyography (EMG) data were acquired with the now-discontinued Myo armband, which has eight channels and a 200 Hz sampling rate. The EMG data, recorded from the upper limb at maximum voluntary contraction, are provided raw and unfiltered. In addition, electroencephalography (EEG) data were collected with the OpenBCI Ultracortex Mark IV headset, which has eight channels and a 250 Hz sampling rate. The dataset is available in both .CSV and .MAT formats, with each subject's data in its own directory. The file-naming scheme supports a supervised machine learning workflow: each file name encodes the subject as S{}, the repetition as R{}, and the gesture as G{}, with six repetitions per gesture and seven gestures in total. The gesture numbering is as follows: G1 is a large-diameter grasp, G2 a medium-diameter grasp, G3 a three-finger sphere grasp, G4 a prismatic pinch grasp, G5 a power grasp, G6 a cut grasp, and G7 an open hand. A detailed description of the dataset, including starter code, can be found here: https://github.com/HumanMachineInterface/Gest-Infer
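
The naming scheme makes it straightforward to assemble a labeled training set. The Python sketch below is a minimal, hypothetical loader, assuming file names such as S1R2G3.csv inside per-subject directories and one channel per CSV column; consult the repository's starter code for the authoritative layout.

    import re
    from pathlib import Path

    import numpy as np
    import pandas as pd

    # Assumed pattern: S{subject}R{repetition}G{gesture}, e.g. "S1R2G3".
    NAME = re.compile(r"S(\d+)R(\d+)G(\d+)", re.IGNORECASE)

    def load_gestures(root):
        """Collect (trial, label) pairs from every CSV under root.

        Each trial is a (samples, channels) NumPy array; the label is
        the gesture index 1-7 parsed from the file name.
        """
        trials, labels = [], []
        for csv_path in sorted(Path(root).rglob("*.csv")):
            match = NAME.search(csv_path.stem)
            if match is None:
                continue  # skip files outside the S{}R{}G{} scheme
            _subject, _repetition, gesture = map(int, match.groups())
            trials.append(pd.read_csv(csv_path).to_numpy())
            labels.append(gesture)
        return trials, np.array(labels)

    # Example usage: trials, labels = load_gestures("dataset/")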

Funding Agency: 
National Research Foundation of Korea
Grant Number: 
2020R1A2B5B01002297
