Neurophysiology Data (EEG, ECG, ET) of Cognitive Training in Immersive Environment

Citation Author(s):
Ural Federal University
Ural State Medical University
Ural State Medical University
Ural Federal University
Submitted by:
Vasilii Borisov
Last updated:
Mon, 03/18/2024 - 15:14
Data Format: XDF (with accompanying .edf and .csv files)


This data set consists of 10 half-hour recordings (EEG, ECG, and ET), obtained from 10 volunteers, as described below.

Experimental Protocol
The final study group consisted of 10 participants: 8 (80%) men and 2 (20%) women, with an average age of 24 ± 2 years. All participants gave written informed consent and had normal or corrected-to-normal vision. Cognitive status was assessed from electrophysiology and eye-tracking (ET) signals recorded during cognitive training in an immersive environment (virtual reality). The study protocol included synchronized biosignal recording:
- a 19-channel electroencephalogram (EEG) together with an electrocardiogram (ECG) in three standard leads, recorded on a Neuron-Spectrum-3 device (LLC Neurosoft, Russia);
- eye tracking (ET) during cognitive exercises in the immersive environment. An HTC Vive Pro Eye virtual reality system (HTC Corporation, Taiwan) with built-in ET was used as the immersive environment.
Tasks from the Upgrade-VR application (LLC FZCO, UAE) were used as the cognitive load. This is an application for the Steam platform (Valve Corporation, USA) for training cognitive abilities using simple game exercises created by neuropsychologists.

Subjects performed different motor/cognitive tasks while neurophysiology data (EEG, ECG, ET) were recorded using the Lab Streaming Layer (LSL) system. Each subject performed one experimental run: a three-minute run of each of the five following tasks. The study tasks were cognitive-skills training exercises:
‘Reaction’ - reaction speed;
‘Coordination’ - spatial orientation and coordination;
‘Concentration’ - concentration of attention;
‘Synchronization’ - coordination of both hands;
‘Memory’ - working memory.


The data are provided here in XDF format; each file contains 26 eye-tracker signals sampled at 175 Hz, plus 19 EEG signals and 1 ECG signal, both sampled at 500 Hz.

ET_channels = [['validL'],

The .edf files and the EEG/ECG channels in the corresponding .xdf files contain identical data.
EEG_ECG_channels = ["Fp1", "Fp2", "F3", "F4", "C3", "C4", "P3", "P4",
                    "O1", "O2", "F7", "F8", "T3", "T4", "T5", "T6",
                    "FZ", "CZ", "PZ", "ECG"]
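As a minimal sketch, the streams can be read with the third-party pyxdf library. The file name below is a placeholder, and the LSL stream names inside the .xdf files are not documented here, so treat the keys returned by `load_streams` as something to inspect rather than assume:

```python
# Channel order of the EEG/ECG stream, as documented above:
# 19 EEG channels followed by the single ECG channel.
CHANNELS = ["Fp1", "Fp2", "F3", "F4", "C3", "C4", "P3", "P4",
            "O1", "O2", "F7", "F8", "T3", "T4", "T5", "T6",
            "FZ", "CZ", "PZ", "ECG"]

def eeg_indices(channels):
    """Column indices of the 19 EEG channels (everything except ECG)."""
    return [i for i, ch in enumerate(channels) if ch != "ECG"]

def load_streams(path):
    """Load every LSL stream from an .xdf file, keyed by stream name."""
    import pyxdf  # third-party: pip install pyxdf
    streams, _header = pyxdf.load_xdf(path)
    return {s["info"]["name"][0]: s for s in streams}

# Expected per-run sample counts at the stated rates (three-minute runs):
EEG_SAMPLES_PER_RUN = 500 * 180   # 90000 samples at 500 Hz
ET_SAMPLES_PER_RUN = 175 * 180    # 31500 samples at 175 Hz
```

Selecting the EEG columns by index this way keeps the ECG channel available separately for heart-rate analysis.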

The .csv files contain one of six codes (0 to 5) for each EEG sample, corresponding to one of the following periods:
0 corresponds to the Coordination task
1 corresponds to the Concentration task
2 corresponds to the Reaction task
3 corresponds to the Memory task
4 corresponds to the Synchronization task
5 corresponds to the Rest period
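The code-to-period mapping above can be applied to a label file as follows; this is a sketch that assumes each CSV row holds a single integer code aligned sample-by-sample with the EEG, which should be verified against the actual files:

```python
import csv

# Mapping of CSV codes to periods, as documented in the dataset description.
CODE_TO_PERIOD = {
    0: "Coordination",
    1: "Concentration",
    2: "Reaction",
    3: "Memory",
    4: "Synchronization",
    5: "Rest",
}

def read_labels(path):
    """Return the period name for each EEG sample in one label .csv file.

    Assumes one integer code (0-5) in the first column of every row.
    """
    with open(path, newline="") as f:
        return [CODE_TO_PERIOD[int(row[0])] for row in csv.reader(f) if row]
```

With labels decoded this way, EEG samples can be grouped by task for per-exercise analysis.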