Synergistic prostheses enable the coordinated movement of the human-prosthetic arm, as required by activities of daily living. This is achieved by coupling the motion of the prosthesis to the human command, such as residual limb movement in motion-based interfaces. Previous studies have demonstrated that developing human-prosthetic synergies in joint space must take into account individual motor behaviour and the intended task, requiring personalisation and task calibration.

Instructions: 

Task-space synergy comparison dataset for the experiments performed in 2019-2020.

Directory:

  • Processed: Processed data from MATLAB in ".mat" format. Organised by session and subject.
  • Raw: Raw time-series data gathered from the sensors in ".csv" format. Each file represents a trial in which a subject performed a reaching task. Organised by subject, modality, and session. Anonymised subject information is included in a ".json" file.
    • Columns of the time-series files represent the different data gathered.
    • Rows of the time-series files represent the values at the given time "t". (A minimal loading sketch follows this list.)
  • Scripts: MATLAB scripts used to process and plot the data. See ProcessAndUpdateSubjectData for the data-processing steps.
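
For quick inspection outside MATLAB, a raw trial can be loaded with a few lines of Python. This is a minimal sketch; the path and file name below are illustrative placeholders, not actual names from the dataset.

    # Minimal sketch: load one raw trial (rows = samples at time t,
    # columns = recorded signals). Path and file name are hypothetical.
    import pandas as pd

    trial = pd.read_csv("Raw/Subject01/Motion/Session01/trial01.csv")
    print(trial.shape)             # (num_samples, num_signals)
    print(trial.columns.tolist())  # names of the gathered data columns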

Ear-EEG recording collects brain signals from electrodes placed in the ear canal. Compared with existing scalp-EEG, ear-EEG is more wearable and more comfortable for the user.

Instructions: 

** Please note that this section is under construction, and the instructions are still being updated **


Participants

6 adults (2 males, 4 females; ages 22-28) participated in this experiment. The subjects were first given information about the study and then signed an informed consent form. The study was approved by the ethics committee at the City University of Hong Kong (reference number: 2-25-201602_01).


Hardware and Software

We recorded the scalp-EEG using a Neuroscan Quick Cap (Model C190). Ear-EEG was recorded simultaneously with scalp-EEG. The 8 ear electrodes were placed at the front and back of the ear canal (labelled xF and xB) and at the upper and lower positions in the concha (labelled xOU and xOD). All ear and scalp electrodes were referenced to a scalp REF electrode, and the scalp GRD electrode was used as the ground. The signals were sampled at 1000 Hz and then filtered with a bandpass filter between 0.5 Hz and 100 Hz, together with a notch filter to suppress line noise. The recording amplifier was a SynAmps2, and Curry 7 was used for real-time data monitoring and collection.
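
If you need to reproduce this preprocessing offline (e.g., on unfiltered exports), a minimal MNE-Python sketch follows. The file name and the 50 Hz mains frequency are assumptions, not taken from the dataset description.

    # Minimal sketch: re-apply the 0.5-100 Hz band-pass and a notch
    # filter with MNE-Python. File name is hypothetical; 50 Hz mains
    # frequency is assumed.
    import mne

    raw = mne.io.read_raw_curry("subject1/motor1.dat", preload=True)
    raw.filter(l_freq=0.5, h_freq=100.0)  # band-pass 0.5-100 Hz
    raw.notch_filter(freqs=50.0)          # suppress line noise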


Experimental design

Subjects were seated in front of a computer monitor. A fixation cross was presented in the center of the monitor for 3 s, followed by an arrow pseudo-randomly pointing to the right or left for 4 s. During the 4 s arrow presentation, subjects had to imagine grasping with the left or right hand according to the arrow direction. A short warning beep was played 2 s after the cross onset to alert the subjects.


Data Records

The data and metadata from the 6 subjects are stored on IEEE Dataport. Note that subjects 1-4 completed 10 blocks of trials, while subject 6 finished only 5 blocks. Each block contained 16 trials. In our dataset, each folder contains the individual dataset from one subject. Each individual dataset comprises four types of files (.dat, .rs3, .ceo, .dap); all four files are needed for processing with EEGLAB or the MNE package. Each individual dataset contains the raw EEG data from 122 channels (scalp-EEG recording), 8 channels (ear-EEG recording), and 1 channel (REF electrode).

The individual datasets of subjects 1, 5, and 6 are split into several sub-datasets; the index indicates the time order of each sub-dataset (motor1, followed by motor2, motor3, motor4, etc.). The individual datasets of subjects 2, 3, and 4 each consist of one main dataset.

Each dataset has timestamps for epoch extraction. Two event labels mark the onset of the arrow, which cued the subject's hand grasping (event number 1: left hand; event number 2: right hand).
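
As a starting point, the sketch below loads one sub-dataset with MNE-Python and extracts the 4 s motor-imagery epochs. The file name is a placeholder, and it assumes the event markers are exposed as annotations; adjust if they arrive on a stim channel instead.

    # Minimal sketch: read a Curry 7 recording (.dat with its .dap,
    # .rs3 and .ceo companions in the same folder) and epoch on the
    # arrow-onset events (1: left hand, 2: right hand).
    import mne

    raw = mne.io.read_raw_curry("subject1/motor1.dat", preload=True)
    events, event_id = mne.events_from_annotations(raw)

    # 4 s of motor imagery following arrow onset, no baseline correction.
    epochs = mne.Epochs(raw, events, event_id=event_id,
                        tmin=0.0, tmax=4.0, baseline=None, preload=True)
    print(epochs)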


One subject, five different movements, four levels of motor imagery data. The sampling rate is 25 Hz, for a total of 33,000 lines.


The CLAS (Cognitive Load, Affect and Stress) dataset was conceived as a freely accessible repository, purposely developed to support research on the automated assessment of certain states of mind and the emotional condition of a person.

Instructions: 

1. Database structure (CLAS.zip)

The database is organized in 4 folders (a minimal traversal sketch follows this list):

· Answers – answers to the questions in the interactive tasks (Math problems, Logic problems, and the Stroop test) for each person.

· Block_details – metadata for each block (1 block per task) for every participant.

· Data – raw signal recordings for the individual participants.

· Documentation – accompanying documents.
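
A minimal sketch for traversing the unzipped archive, assuming it extracts to a CLAS/ directory containing the four folders above:

    # Minimal sketch: list the contents of each CLAS folder.
    # The root path is an assumption about how the archive unzips.
    from pathlib import Path

    root = Path("CLAS")
    for folder in ["Answers", "Block_details", "Data", "Documentation"]:
        entries = sorted((root / folder).iterdir())
        print(folder, "-", len(entries), "entries")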


When using the CLAS dataset, please cite:

     Markova, V., Ganchev, T., Kalinkov, K. (2019). CLAS: A Database for Cognitive Load, Affect and Stress Recognition. In Proceedings of the International Conference on Biomedical Innovations and Applications (BIA-2019), art. no. 8967457. DOI: 10.1109/BIA48344.2019.8967457. Available online: https://ieeexplore.ieee.org/document/8967457


This dataset provides ECG signals recorded under ambulatory (moving) conditions. The ambulatory ECG (A-ECG) data were acquired with two different recorders, viz. a Biopac MP36 acquisition system and a self-developed wearable ECG recorder. ECG signals from a total of 10 subjects (average age 27 years; 1 female, 9 males) performing four body movements (left and right arm up/down, sitting down and standing up, and waist twisting) are uploaded.

An EEG signal dataset is also provided here.

Instructions: 

Please contact me at rahul2777@gmail.com for instructions on how to use the dataset and for further discussion.


Accurate proportional myoelectric control of the hand is important for replicating dexterous manipulation in robotic prostheses. Many studies in this field have focused on recording discrete hand gestures, while few have focused on proportional, multiple-DOF control of the human hand using EMG signals. To aid researchers working on advanced myoelectric hand control and estimation, we present this data from our work "Extraction of nonlinear muscle synergies for proportional and simultaneous estimation of finger kinematics".

Instructions: 

Data: 10 subjects <downsampled_filtered_emg (dsfilt_emg), 23-joint marker position data (finger_kinematics)>

Data File Extension: .mat 

Cell Variables:

   * dsfilt_emg            <5x7 cell>, each cell <40000x8>

   * finger_kinematics     <5x7 cell>, each cell <4000x69>

Cell format:

row - corresponds to the trial number (5 trials in total)

column - corresponds to the task number (7 tasks in total)

 7 sets of movement tasks:

(1-5) individual flexion and extension of each finger: thumb, index, middle, ring, little, in this order

(6)   simultaneous flexion and extension of all fingers

(7)   random free finger movement, in no particular order, and only in the flexion and extension plane

 

Comments (a minimal Python loading sketch follows these comments):

a.  The dsfilt_emg data and muscle activation data consist of 8 column vectors representing the 8 forearm muscles, in the following order:

<APL, FCR, FDS, FDP, ED, EI, ECU, ECR>

b.  The finger_kinematics data consist of 69 column vectors: the <x, y, z> joint positions of each of the 23 markers. The marker assignment is shown in <Marker_Position.png>. Kinematics data can be visualized through the script provided with the dataset, "Visulize_Kinematics_data.m".
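
The sketch below shows one way to index the 5x7 cell arrays from Python; the .mat file name is hypothetical, and scipy represents MATLAB cell arrays as object arrays.

    # Minimal sketch: load one subject's .mat file and pick out a
    # single trial/task pair. File name is hypothetical.
    import scipy.io as sio

    mat = sio.loadmat("subject01.mat")
    dsfilt_emg = mat["dsfilt_emg"]                # 5x7 cell -> object array
    finger_kinematics = mat["finger_kinematics"]  # 5x7 cell -> object array

    trial, task = 0, 0                       # first trial, first task
    emg = dsfilt_emg[trial, task]            # ~40000x8: muscles APL..ECR
    kin = finger_kinematics[trial, task]     # ~4000x69: 23 markers x (x, y, z)
    print(emg.shape, kin.shape)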

Please refer to our paper for additional details regarding the data. Please cite [1] if you wish to use this dataset.

[1] S. K. Dwivedi, J. Ngeo, and T. Shibata, "Extraction of Nonlinear Synergies for Proportional and Simultaneous Estimation of Finger Kinematics," IEEE Transactions on Biomedical Engineering, in press.


Affiliation / Email --- Graduate School of Life Sciences and System Engineering, Kyushu Institute of Technology, Kitakyushu, Japan (correspondence: tom@brain.kyutech.ac.jp)



The databases include arrays of alphabets and numbers: ["Ka Gyi" "Ka Kway" "Ga Nge" "Ga Gyi" "Nga" "Sa Lone" "Sa Lane" "0" '0' "Nya" "Ta Talin Jade" "Hta Won Bell" "Dain Yin Gouk" "Dain Yin Hmote" "Na Gyi" "Ta Won Bu" "Hta Sin Htoo" "Da Dway" "Da Out Chike" "Na Nge" "Pa Sout" "Pha Oo Htote" "Ba Htet Chike" "Ba Gone" "Ma" "Ya Pa Lat" "Ya Gout" "La" "Wa" "Tha" "Ha" "La Gyi" "Ah" '0' '1' '2' '3' '4' '5' '6' '7' '8' '9' "10"]. The symbols were recorded as palm gestures by the MIIT research team, and an audio file was also recorded for each number and alphabet.


We provide a large benchmark dataset consisting of approximately 3.5 million keystroke events, 57.1 million data points each for the accelerometer and gyroscope, and 1.7 million data points for swipes. Data were collected between April 2017 and June 2017 after the required IRB approval. We share data from 117 participants, each in a session lasting 2 to 2.5 hours, performing multiple activities such as typing (free and fixed text), gait (walking, upstairs, and downstairs), and swiping, while using a desktop, a phone, and a tablet.

Instructions: 

A detailed description of all data files is provided in the *BBMAS_README.pdf* file that accompanies the dataset.


Please cite:

[1] Amith K. Belman and Vir V. Phoha. 2020. Discriminative Power of Typing Features on Desktops, Tablets, and Phones for User Identification. ACM Trans. Priv. Secur. Volume 23,Issue 1, Article 4 (February 2020), 36 pages. DOI:https://doi.org/10.1145/3377404

[2] Amith K. Belman, Li Wang, S. S. Iyengar, Pawel Sniatala, Robert Wright, Robert Dora, Jacob Baldwin, Zhanpeng Jin, and Vir V. Phoha, "Insights from BB-MAS -- A Large Dataset for Typing, Gait and Swipes of the Same Person on Desktop, Tablet and Phone," arXiv:1912.02736, 2019.

[3] Amith K. Belman, Li Wang, Sundaraja S. Iyengar, Pawel Sniatala, Robert Wright, Robert Dora, Jacob Baldwin, Zhanpeng Jin, Vir V. Phoha, "SU-AIS BB-MAS (Syracuse University and Assured Information Security - Behavioral Biometrics Multi-device and multi-Activity data from Same users) Dataset ", IEEE Dataport, 2019. [Online]. Available: http://dx.doi.org/10.21227/rpaz-0h66



The MyoUP (Myo University of Patras) database contains recordings from 8 intact subjects (3 females, 5 males; 1 left-handed, 7 right-handed; age 22.38 ± 1.06 years). The acquisition process was divided into three parts: 5 basic finger movements (E1), 12 isotonic and isometric hand configurations (E2), and 5 grasping hand gestures (E3). The recording device was the Myo Armband by Thalmic Labs (8 dry sEMG channels, 200 Hz sampling frequency). The dataset was created for use in gesture recognition tasks.

Instructions: 

1. Unzip the file

2. Run 'python3 create_dataset_myoup.py'


This database contains the results of an experiment in which healthy subjects played 5 trials of a rehabilitation-based VR game to experience either difficulty variations or presence variations.

Collected results are demographic information, emotional responses after each trial, and electrophysiological signals recorded during all 5 trials.

