Accurate proportional myoelectric control of the hand is important for replicating dexterous manipulation in robot prostheses. Many studies in this field have focused on recording discrete hand gestures, while few have addressed proportional, multi-DOF control of the human hand using EMG signals. To aid researchers working on advanced myoelectric hand control and estimation, we present the data from our work "Extraction of nonlinear muscle synergies for proportional and simultaneous estimation of finger kinematics".

Instructions: 

Data: 10 subjects <downsampled, filtered EMG (dsfilt_emg); 23-marker joint position data (finger_kinematics)>

Data File Extension: .mat 

Cell Variables:

   * dsfilt_emg            <5x7 cell>    <40000x8>

   * finger_kinematics     <5x7 cell>    <4000x69>

Cell format:

rows - correspond to the trials (5 trials in total)

columns - correspond to the tasks (7 tasks in total)

 7 sets of movement tasks:

(1-5) individual flexion and extension of each finger: thumb, index, middle, ring, little, in this order

(6)   simultaneous flexion and extension of all fingers

(7)   random free finger movement, in no particular order, and only in the flexion and extension plane

 

Comments:

a.  The dsfilt_emg data and the muscle activation data consist of 8 column vectors representing the 8 forearm muscles, in the following order:

<APL,FCR,FDS,FDP,ED,EI,ECU,ECR>

 b. The finger_kinematics data consist of 69 column vectors. Each column vector contains one of the <x,y,z> joint position coordinates of a marker; a total of 23 markers were used. The marker assignment is shown in <Marker_Position.png>.

   The kinematics data can be visualized with the script provided with the dataset, "Visulize_Kinematics_data.m".
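
For readers who work in Python rather than MATLAB, a minimal loading sketch is given below. The per-subject file name "subject1.mat" and the marker-by-marker <x,y,z> grouping of the 69 kinematic columns are assumptions and should be checked against the actual files in the download; the variable names follow the cell variables listed above.

```python
# Minimal sketch, not the authors' code. Assumptions: a per-subject file named
# "subject1.mat" and a marker-by-marker <x,y,z> grouping of the 69 kinematic columns.
import scipy.io

MUSCLES = ["APL", "FCR", "FDS", "FDP", "ED", "EI", "ECU", "ECR"]  # column order of dsfilt_emg

mat = scipy.io.loadmat("subject1.mat")

# MATLAB cell arrays load as object ndarrays; index them as [trial, task] (0-based here).
dsfilt_emg = mat["dsfilt_emg"]                  # 5x7 cell, each element <40000x8>
finger_kinematics = mat["finger_kinematics"]    # 5x7 cell, each element <4000x69>

trial, task = 0, 0                              # trial 1, task 1 (thumb flexion/extension)
emg = dsfilt_emg[trial, task]                   # (40000, 8): one column per muscle in MUSCLES
kin = finger_kinematics[trial, task]            # (4000, 69): 23 markers x <x,y,z>

# If the columns are grouped per marker as <x,y,z>, this yields (samples, marker, coordinate).
markers = kin.reshape(kin.shape[0], 23, 3)

print(emg.shape, markers.shape)
```

In MATLAB the same access pattern is simply dsfilt_emg{trial, task} and finger_kinematics{trial, task} with 1-based indices.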

Please refer to our paper for additional details regarding the data, and please cite [1] if you use this dataset.

[1] S. K. Dwivedi, J. Ngeo, and T. Shibata, "Extraction of Nonlinear Synergies for Proportional and Simultaneous Estimation of Finger Kinematics," IEEE Transactions on Biomedical Engineering, in press.

 

Affiliation / Email --- Graduate School of Life Sciences and System Engineering,

                        Kyushu Institute of Technology, Kitakyushu, Japan

                        (correspondence: tom@brain.kyutech.ac.jp)

 


The database includes arrays of alphabets and numbers: ["Ka Gyi" "Ka Kway" "Ga Nge" "Ga Gyi" "Nga" "Sa Lone" "Sa Lane" "0" '0' "Nya" "Ta Talin Jade" "Hta Won Bell" "Dain Yin Gouk" "Dain Yin Hmote" "Na Gyi" "Ta Won Bu" "Hta Sin Htoo" "Da Dway" "Da Out Chike" "Na Nge" "Pa Sout" "Pha Oo Htote" "Ba Htet Chike" "Ba Gone" "Ma" "Ya Pa Lat" "Ya Gout" "La" "Wa" "Tha" "Ha" "La Gyi" "Ah" '0' '1' '2' '3' '4' '5' '6' '7' '8' '9' "10"]. The symbols were recorded as palm gestures by the MIIT research team, and an audio file was also recorded for each number and letter.


We provide a large benchmark dataset consisting of about 3.5 million keystroke events, 57.1 million data points each for the accelerometer and gyroscope, and 1.7 million data points for swipes. Data were collected between April 2017 and June 2017 after the required IRB approval. The dataset contains data from 117 participants, each in a session lasting 2 to 2.5 hours, performing multiple activities such as typing (free and fixed text), gait (walking, upstairs, and downstairs), and swiping while using a desktop, phone, and tablet.

Instructions: 

A detailed description of all data files is provided in the *BBMAS_README.pdf* file distributed with the dataset.

 

 

Please cite:

[1] Amith K. Belman and Vir V. Phoha. 2020. Discriminative Power of Typing Features on Desktops, Tablets, and Phones for User Identification. ACM Trans. Priv. Secur. Volume 23, Issue 1, Article 4 (February 2020), 36 pages. DOI: https://doi.org/10.1145/3377404

[2] Amith K. Belman, Li Wang, S. S. Iyengar, Pawel Sniatala, Robert Wright, Robert Dora, Jacob Baldwin, Zhanpeng Jin and Vir V. Phoha, "Insights from BB-MAS -- A Large Dataset for Typing, Gait and Swipes of the Same Person on Desktop, Tablet and Phone", arXiv:1912.02736, 2019.

[3] Amith K. Belman, Li Wang, Sundaraja S. Iyengar, Pawel Sniatala, Robert Wright, Robert Dora, Jacob Baldwin, Zhanpeng Jin, Vir V. Phoha, "SU-AIS BB-MAS (Syracuse University and Assured Information Security - Behavioral Biometrics Multi-device and multi-Activity data from Same users) Dataset", IEEE Dataport, 2019. [Online]. Available: http://dx.doi.org/10.21227/rpaz-0h66

 

 

 

 


The MyoUP (Myo University of Patras) database contains recordings from 8 intact subjects (3 females, 5 males; 1 left-handed, 7 right-handed; age 22.38 ± 1.06 years). The acquisition process was divided into three parts: 5 basic finger movements (E1), 12 isotonic and isometric hand configurations (E2), and 5 grasping hand gestures (E3). The recording device was the Myo Armband by Thalmic Labs (8 dry sEMG channels, sampling frequency 200 Hz). The dataset was created for use in gesture recognition tasks.

Instructions: 

1. Unzip file

2. Run 'python3 create_dataset_myoup.py'


This database contains the results of an experiment where healthy subjects played 5 trials of a rehabilitation-based VR game to experience either difficulty variations or presence variations.

Collected results include demographic information, the emotions reported after each trial, and the electrophysiological signals recorded during all 5 trials.


This database contains the 166 Galvanic Skin Response (GSR) signal registers collected from the subjects participating in the first experiment (EXP 1) presented in:

R. Martinez, A. Salazar-Ramirez, A. Arruti, E. Irigoyen, J. I. Martin and J. Muguerza, "A Self-Paced Relaxation Response Detection System Based on Galvanic Skin Response Analysis," in IEEE Access, vol. 7, pp. 43730-43741, 2019. doi: 10.1109/ACCESS.2019.2908445

Instructions: 

* GSR signals of each participant: the files whose names begin with the letter A correspond to the GSR registers extracted from the participants. These files have a single column containing the values of the GSR signal sampled at Fs = 1 Hz.

* Labels of each signal: the files whose names begin with LABEL correspond to the RResp labels of each subject. These files have two columns: the first column is the label of the register and the second column is the timestamp for that label. The registers have been labeled using 20 s windows (sliding every 5 s), with each label positioned at the center of its window. For example, "-1 12.5" means that in the time window going from 2.5 s to 22.5 s the label is RResp = -1, with the center of the window at 12.5 s. There are four RResp intensity levels: 0 stands for the absence of any RResp, -1 for a low-intensity RResp, -2 for a medium-intensity RResp, and -3 for a high-intensity RResp.
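
As an illustration only, a small Python sketch for reading one register and its labels is given below; the file names "A01.txt" and "LABEL01.txt" are placeholders, and the whitespace-separated plain-text format is an assumption that should be checked against the files in the dataset.

```python
# Illustrative sketch only: the file names are placeholders and the plain-text,
# whitespace-separated format is an assumption, not confirmed by the dataset description.
import numpy as np

gsr = np.loadtxt("A01.txt")            # single column: GSR signal sampled at Fs = 1 Hz
labels = np.loadtxt("LABEL01.txt")     # two columns: [RResp label, window-center time (s)]

FS = 1.0                               # GSR sampling frequency (Hz)
LEVELS = {0: "no RResp", -1: "low", -2: "medium", -3: "high"}

for label, t_center in labels:
    # Each label covers a 20 s window centered at t_center (windows slide every 5 s),
    # e.g. "-1 12.5" labels the 2.5 s - 22.5 s window as a low-intensity RResp.
    start = max(int(round((t_center - 10.0) * FS)), 0)
    stop = int(round((t_center + 10.0) * FS))
    window = gsr[start:stop]
    print(f"t = {t_center:6.1f} s  label = {int(label):2d} ({LEVELS[int(label)]}), "
          f"{window.size} samples")
```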


Motor point identification is pivotal for eliciting comfortable and sustained muscle contraction through functional electrical stimulation. For this purpose, anatomical charts and manual search techniques are used to extract subject-specific stimulation profiles. Because such information is heterogeneous, it lacks standardization and reproducibility. To address these limitations, we aim to identify, localize, and characterize the motor points of forearm muscles across nine healthy subjects.


FRAP curve modeling using transient-sensitive analog computer unit with oscilloscopic CRT (Practicum, 2014)


The data were obtained from electrocardiography of a female subject, aged 22 years, using a flexible electrode, an Ag/AgCl electrode, and a metal clamp electrode.


These FreeCAD files accompany the manuscript "A temperature-controlled patch-clamp platform demonstrated on Jurkat T lymphocytes and human stem cell derived neurons". The files allow a housing box for the electronics to be easily 3D-printed.

Instructions: 

The uploaded .zip file contains FreeCAD files that can easily be converted to any other file format needed for 3D printing, such as STL, OBJ, or 3MF.

