Neurophysiology Data (EEG, ECG, ET) of Cognitive Training in Immersive Environment
- Submitted by:
- Vasilii Borisov
- Last updated:
- Wed, 09/04/2024 - 05:49
- DOI:
- 10.21227/2y1m-kr17
Abstract
This dataset consists of 16 half-hour EEG recordings obtained from 10 volunteers, as described below.
Experimental Protocol
The final study group consisted of 16 participants: 13 (80%) men and 3 (20%) women, with an average age of 24 ± 2 years. All participants gave written informed consent and had normal or corrected-to-normal vision. Cognitive status was assessed from electrophysiology and eye-tracking (ET) signals recorded during cognitive training in an immersive environment (virtual reality). The study protocol included synchronized biosignal recording:
- 19-channel electroencephalogram (EEG) together with electrocardiogram (ECG) on three standard leads on a Neuron-Spectrum-3 device (LLC Neurosoft, Russia);
- eye tracking (ET) during cognitive exercises in an immersive environment. The HTC Vive Pro Eye virtual reality system (HTC Corporation, Taiwan) with built-in ET was used as the immersive environment.
Tasks from the Upgrade-VR application (LLC FZCO, UAE) were used as the cognitive load. Upgrade-VR is an application for the STEAM platform (Valve Corporation, USA) for training cognitive abilities with simple game exercises designed by neuropsychologists [https://vr-upgrade.com].
Subjects performed different motor/cognitive tasks while neurophysiology data (EEG, ECG, ET) were recorded using the LSL (Lab Streaming Layer) system. Each subject performed one experimental run: a three-minute run of each of the five following tasks. Cognitive skills training exercises were used as the study tasks:
- ‘Reaction’: reaction speed;
- ‘Coordination’: spatial orientation and coordination;
- ‘Concentration’: concentration of attention;
- ‘Synchronization’: coordination of both hands;
- ‘Memory’: working memory.
Please cite our paper:
A. A. Kuznetsov, T. S. Petrenko, A. V. Gorbunov and V. I. Borisov, "Neurophysiology Data of Cognitive Training in Immersive Environment," 2024 IEEE 25th International Conference of Young Professionals in Electron Devices and Materials (EDM), Altai, Russian Federation, 2024, pp. 2220-2225, doi: 10.1109/EDM61683.2024.10615177.
The data are provided here in XDF format (each file containing 26 eye-tracker signals sampled at 175 Hz, plus 19 EEG signals and 1 ECG signal, both sampled at 500 Hz).
ET_channels = [['validL'],
['validR'],
['gazeoriginL_X'],
['gazeoriginL_Y'],
['gazeoriginL_Z'],
['gazeoriginR_X'],
['gazeoriginR_Y'],
['gazeoriginR_Z'],
['gazeL_X'],
['gazeL_Y'],
['gazeL_Z'],
['gazeR_X'],
['gazeR_Y'],
['gazeR_Z'],
['pupilL'],
['pupilR'],
['eye_opennessL'],
['eye_opennessR'],
['pupilLSensorPosL_X'],
['pupilLSensorPosL_Y'],
['pupilLSensorPosL_Z'],
['pupilLSensorPosR_X'],
['pupilLSensorPosR_Y'],
['pupilLSensorPosR_Z'],
['convergence_distance_mm'],
['convergence_distance_validity']]
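Assuming the XDF files are read with a standard reader such as pyxdf, a minimal sketch for separating the recorded streams by their declared type might look as follows. The type labels and file name in the usage note are assumptions, not taken from the dataset; check the stream metadata in your own files.

```python
# Sketch: split pyxdf-style streams by their declared type.
# pyxdf.load_xdf returns a list of stream dicts; each dict carries its
# metadata under 'info' and its samples under 'time_series'.

def split_streams(streams):
    """Map declared stream type -> first stream dict with that type."""
    by_type = {}
    for stream in streams:
        stream_type = stream['info']['type'][0]
        by_type.setdefault(stream_type, stream)
    return by_type

# Usage (hypothetical path and type labels):
# import pyxdf
# streams, header = pyxdf.load_xdf('sub01.xdf')
# eeg = split_streams(streams)['EEG']['time_series']  # (n, 20) at 500 Hz
```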
The .edf files and the EEG/ECG channels in the corresponding .xdf files contain identical data.
EEG = [["Fp1"], ["Fp2"],
["F3"], ["F4"], ["C3"], ["C4"], ["P3"],
["P4"], ["O1"], ["O2"], ["F7"], ["F8"],
["T3"], ["T4"], ["T5"], ["T6"], ["FZ"],
["CZ"], ["PZ"], ["ECG"]]
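Since the channel order above is fixed (19 EEG channels in the standard 10-20 montage followed by one ECG lead), a column of the EEG/ECG time series can be looked up by name. A small sketch, with helper naming of my own:

```python
# Channel order of the EEG/ECG stream as listed above: 19 EEG channels
# in the standard 10-20 montage followed by one ECG channel.
EEG_CHANNELS = ["Fp1", "Fp2", "F3", "F4", "C3", "C4", "P3", "P4",
                "O1", "O2", "F7", "F8", "T3", "T4", "T5", "T6",
                "FZ", "CZ", "PZ", "ECG"]

def channel_index(name):
    """Column index of a named channel in the 20-column time series."""
    return EEG_CHANNELS.index(name)

# The .edf files carry the same channels and can be read, for example,
# with MNE-Python (hypothetical path):
# import mne
# raw = mne.io.read_raw_edf('sub01.edf', preload=True)
```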
The .csv files contain one of six codes (0 to 5) for each EEG sample, corresponding to one of the following periods:
0 corresponds to Coordination task
1 corresponds to Concentration task
2 corresponds to Reaction task
3 corresponds to Memory task
4 corresponds to Synchronisation task
5 corresponds to Rest period
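The per-sample codes can be turned into contiguous task segments for epoching. A sketch of this mapping (the helper name is mine; the label column is taken as a plain Python sequence rather than via any specific CSV library):

```python
# Code -> period mapping as documented above.
TASK_CODES = {
    0: 'Coordination',
    1: 'Concentration',
    2: 'Reaction',
    3: 'Memory',
    4: 'Synchronisation',
    5: 'Rest',
}

def task_segments(labels):
    """Yield (task_name, start, stop) for each run of identical codes.

    `labels` is the per-sample code column of a .csv file; `stop` is
    exclusive, so labels[start:stop] all carry the same code.
    """
    start = 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            yield TASK_CODES[labels[start]], start, i
            start = i
```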