Recent advances in scalp electroencephalography (EEG) as a neuroimaging tool have allowed researchers to overcome the technical challenges and movement restrictions typical of traditional neuroimaging studies. Mobile EEG devices have enabled studies of cognition and motor control in natural environments that require mobility, such as art perception and production in a museum setting, and locomotion tasks.


Ten volunteers were trained through a series of twelve daily lessons to type on a computer using the Colemak keyboard layout. During the fourth, eighth, and eleventh sessions, electroencephalography (EEG) measurements were acquired for the five trials each subject performed in the corresponding lesson. Electrocardiography (ECG) data were acquired for each of those trials as well. The purpose of this experiment is to aid in the development of methods to assess the process of learning a new task.

Instructions: 

*Experimental setup

Ten volunteers were trained through a series of twelve daily lessons to type on a computer using the Colemak keyboard layout, an alternative to the QWERTY and Dvorak layouts designed for efficient and ergonomic touch typing in English. Six of our volunteers were female and four male; all were right-handed, and their mean age was 29.3 years with a standard deviation of 5.7 years. The lessons used during our experiment are available online at colemak.com/Typing_lessons. In our case, we asked the volunteers to repeat each of them five times (with resting intervals of 2 min in between). We chose Colemak touch typing as the skill to be learned because most people are unaware of its existence, which makes it a good candidate for a truly new skill. The training always took place in a sound-proof cubicle in which the volunteers were isolated from distractions; they sat in front of the computer and were engaged entirely in the typing lesson. All experiments were carried out at the same hour of the day, and all volunteers were asked to refrain from doing any additional training anywhere else. For more details, see [1].

*Data arrangement

A Matlab-compatible file is provided for each subject. Each .mat file contains a cell array (named Cn) of size 15x10, corresponding to the 15 trials and 10 channels, respectively. Trials are organized as follows: rows 1-5 correspond to the measurements during the fourth Colemak lesson, rows 6-10 to the eighth, and rows 11-15 to the eleventh. Channels are organized by columns in the following order: (1) ECG, (2) F3, (3) Fz, (4) F4, (5) C3, (6) Cz, (7) C4, (8) P3, (9) POz, and (10) P4. Each element of Cn is a vector containing the output (time samples acquired at a 256 Hz sampling frequency) of the corresponding channel. The length of these vectors differs between subjects, as well as between trials, depending on the time it took the corresponding subject to complete the Colemak lesson. The units of all output signals are microvolts.
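
As a minimal sketch of how to access these data in Matlab (the per-subject file name below is hypothetical), one trial of one channel can be extracted as follows:

% Minimal sketch: load one subject's file and extract a single trial/channel.
% The file name 'S01.mat' is an assumption; use the actual per-subject file.
data = load('S01.mat');
Cn = data.Cn;               % 15x10 cell array: 15 trials x 10 channels
fs = 256;                   % sampling frequency in Hz, as stated above
cz = Cn{1, 6};              % trial 1, channel 6 (Cz), in microvolts
t = (0:numel(cz)-1) / fs;   % time axis in seconds
plot(t, cz); xlabel('Time (s)'); ylabel('Cz (\muV)');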

*Preprocessing

All data have been preprocessed with the automatic decontamination algorithms provided by the B-Alert Live Software (BLS): raw signals are processed to eliminate known artifacts. In particular, the following actions are taken for different types of artifacts:

• Excursions and amplifier saturation – contaminated periods are replaced with zero values, starting and ending at the zero crossings before and after each event.
• Spikes – spikes caused by artifacts are identified and the signal value is interpolated.
• Eye blinks (EOG) – wavelet transforms deconstruct the signal, and a regression equation is used to identify the EEG regions contaminated with eye blinks. Representative EEG preceding the eye blink is inserted in the contaminated region.

Additionally, all data were detrended using Matlab's detrend command.
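
For reference, the detrending step amounts to something like the following sketch (the published files already contain the detrended signals):

% Sketch of the detrending step applied to every trial/channel vector.
% Assumes Cn has been loaded as in the sketch above.
for i = 1:size(Cn, 1)
    for j = 1:size(Cn, 2)
        Cn{i, j} = detrend(Cn{i, j});  % removes the best straight-line fit
    end
end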

*How to acknowledge

We encourage researchers to use the published dataset freely and we ask that they cite the respective data sources as well as this paper:

[1] D. Gutiérrez and M. A. Ramírez-Moreno, "Assessing a Learning Process with Functional ANOVA Estimators of EEG Power Spectral Densities," Cognitive Neurodynamics, vol. 10, no. 2, pp. 175-183, 2016. DOI: 10.1007/s11571-015-9368-7

*Credits

All data were acquired in the Laboratory of Biomedical Signal Processing, Cinvestav Monterrey, in the context of M. A. Ramírez-Moreno's MSc thesis work under the advice of D. Gutiérrez.


This dataset contains EEG signals from 73 subjects (42 healthy; 31 disabled) using an ERP-based speller to control different brain-computer interface (BCI) applications. The demographics of the dataset can be found in info.txt. Additionally, you will find the results of the original study broken down by subject, the code to build the deep-learning models used in [1] (i.e., EEG-Inception, EEGNet, DeepConvNet, CNN-BLSTM) and a script to load the dataset.


Ear-EEG recording collects brain signals from electrodes placed in the ear canal. Compared with existing scalp-EEG, ear-EEG is more wearable and comfortable for the user.

Instructions: 

** Please note that this dataset description is under construction, and the instructions are still being updated **

Participants

6 adults (2 males / 4 females, age: 22-28) participated in this experiment. The subjects were first given information about the study and then signed an informed consent form. The study was approved by the ethics committee at the City University of Hong Kong (reference number: 2-25-201602_01).

 

Hardware and Software

We recorded the scalp-EEG using a Neuroscan Quick Cap (Model C190). Ear-EEG was recorded simultaneously with scalp-EEG. The 8 ear electrodes were placed at the front and back of the ear canal (labeled xF and xB) and at the upper and bottom positions in the concha (labeled xOU and xOD). All ear and scalp electrodes were referenced to a scalp REF electrode. The scalp GRD electrode was used as the ground reference. The signals were sampled at 1000 Hz and then filtered with a bandpass filter between 0.5 Hz and 100 Hz, together with a notch filter to suppress line noise. The recording amplifier was a SynAmps2, and Curry 7 was used for real-time data monitoring and collection.

 

Experimental design

Subjects were seated in front of a computer monitor. A fixation cross was presented in the center of the monitor for 3 s, followed by an arrow pseudo-randomly pointing to the right or left for 4 s. During the 4 s arrow presentation, subjects had to imagine and perform grasping with the left or right hand according to the arrow direction. A short warning beep was played 2 s after the cross onset to alert the subjects.

 

Data Records

The data and the metadata from the 6 subjects are stored in IEEE DataPort. Note that subjects 1-4 completed 10 blocks of trials, while subject 6 finished only 5 blocks. Each block contained 16 trials. In our dataset, each folder contains the individual dataset from one subject. For each individual dataset, there are four types of files (.dat, .rs3, .ceo, .dap); all four files are needed for processing with the EEGLAB and MNE packages. Each individual dataset contains the raw EEG data from 122 channels (from the scalp-EEG recording), 8 channels (from the ear-EEG recording), and 1 channel (the REF electrode).

The individual datasets of subjects 1, 5, and 6 are divided into several sub-datasets; the index indicates the time order of each sub-dataset (motor1, followed by motor2, motor3, motor4, etc.). The individual datasets of subjects 2, 3, and 4 each consist of one main dataset.

Each dataset has timestamps for epoch extraction. Two event labels mark the start of the arrow, which indicated the start of the subject's hand grasping (event number 1: left hand; event number 2: right hand).
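
As a minimal sketch of epoch extraction in EEGLAB (assuming the third-party loadcurry plugin for Curry files is installed; the file path and event type strings are assumptions based on the description above):

% Load one Curry recording and epoch around the arrow-onset events.
EEG = pop_loadcurry('subject1/motor1.dat');  % loadcurry plugin (assumption)
EEG = pop_epoch(EEG, {'1', '2'}, [0 4]);     % 4 s arrow window; 1 = left, 2 = right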


Our state of arousal can significantly affect our ability to make optimal decisions, judgments, and actions in real-world dynamic environments. The Yerkes-Dodson law, which posits an inverse-U relationship between arousal and task performance, suggests that there is a state of arousal that is optimal for behavioral performance in a given task. Here we show that we can use on-line neurofeedback to shift an individual's arousal from the right side of the Yerkes-Dodson curve to the left toward a state of improved performance.

Instructions: 

We hope you find this dataset useful. For help, please see the provided readme file; the article by Faller et al. (2019) in PNAS; the preprint by Faller et al. (2018) on bioRxiv; the conference paper by Faller et al. (2016) at IEEE SMC; and/or the tutorials for the Matlab toolboxes EEGLAB and BCILAB. Thank you very much.


There are two types of data: raw EEG data recorded from the Brain-Vision system, and .mat files converted with the BBCI toolbox.


Previous neuroimaging research has traditionally been confined to strict laboratory environments due to the limits of technology. Only recently have more studies emerged exploring the use of mobile brain imaging outside the laboratory. This study uses electroencephalography (EEG) and signal processing techniques to provide new opportunities for studying mobile subjects moving outside of the laboratory in real-world settings. The purpose of this study was to document the current viability of using high-density EEG for mobile brain imaging both indoors and outdoors.

Instructions: 

Study Summary:

The purpose of this study is to test the reliability of brain and body dynamics recordings in a real-world environment, compare electrocortical dynamics and behaviors in outdoor vs. indoor settings, determine behavioral, biomechanical, and EEG correlates of visual search task performance, and search for EEG and behavioral parameters related to increased mental stress. Each subject walked outdoors on a heavily wooded trail and indoors on a treadmill while performing a visual search object recognition task. After a baseline condition without targets, the subject was tasked with identifying light green target flags vs. dark green non-target flags. During two conditions, non-stress and stress, the subject received $0.25 for each correct flag identification. During the stress condition, the subject also received a punishment (loss of $1.00) for each incorrect flag identification, plus an automatic punishment (loss of $1.00) approximately every 2 minutes. Each of the 3 conditions lasted approximately 20 minutes. Saliva samples were collected at the start and end of each condition. The order of the non-stress and stress conditions was randomized for each subject. Please note that some events, where the subject was assumed to have perceived a stimulus, lack the Participant/Effect HED tag, which allows for automated processing of events. These particular events (e.g., occasional experimenter instructions to walk down a certain part of the outdoor trail) are of low importance for the purposes of data analysis.

 

Data Summary

In accordance with the Terms of Service, this dataset is made available under the terms of the "Creative Commons" Attribution (CC-BY) license. (https://ieee-dataport.org/faq/who-owns-datasets-ieee-dataport).

 

Number of Sessions: 98

Number of Subjects: 49

Subject Groups: normal

Primary source of event information: Tags

Number of EEG Channels: 264 (105 recordings)

Recorded Modalities: EEG (105 recordings), Eye_tracker (88 recordings), Force_plate (47 recordings), IMU (99 recordings), Pulse_from_EEG (52 recordings), Pulse_sensor (86 recordings)

EEG Channel Location Type(s): Custom (105 recordings)

 

Data organization

This study is an EEG Study Schema (ESS) Standard Data Level 1 container. This means that it contains raw, unprocessed EEG data arranged in a standard manner. Data are in a container folder and ready to be used with MATLAB to automate access and processing. All data measures other than EEG are in .mat (MATLAB) format. For more information please visit eegstudy.org.

 

There is one folder for every subject that includes the following files when available:

(1) Indoor EEG session (<ID number_Indoor.set>)

EEG files have been imported into EEGLAB and are stored as unprocessed raw .set format in standard EEGLAB Data Structures.

(https://sccn.ucsd.edu/wiki/A05:_Data_Structures)
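
A minimal sketch for loading one session into EEGLAB (the ID number in the file name is hypothetical):

% Load one indoor EEG session (pop_loadset is a standard EEGLAB function).
EEG = pop_loadset('filename', '1001_Indoor.set', 'filepath', './1001/');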

 

(2) Outdoor EEG session (<ID number_Outdoor.set>)

Same as Indoor EEG session (above)

 

(3) Indoor IMU session (<ID number_Indoor_imu.mat>)

The IMU .mat file contains a structure with 6 fields (variable name: IMU)

 

IMU.dataLabel: string including ID number, environment, and sensor type

IMU.dataArray: 10xNx6 matrix. The third dimension refers to each of the 6 IMU sensors (left foot, right foot, left ankle, right ankle, chest, and waist). Columns are frame numbers. Rows are:

• x, y, and z direction of accelerations, in m/s^2 

• x, y, and z direction of gyroscopes, in rad/s 

• x, y, and z direction of magnetometers, in microteslas

• Temperature, in degrees Celsius

IMU.axisLabel: String headings for ‘dataType’ and ‘frame’ and ‘sensorNumber’

IMU.axisValue: 1x10 cell array of string headings for each row of data type, and 1x6 cell array of string headings for each IMU sensor

IMU.samplingRate: Sampling rate

IMU.dateTime: String of date and time information of recording
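
As a minimal sketch (the ID number in the file name is hypothetical, and the chest sensor index follows the order listed above), the chest accelerations can be plotted as follows:

% Plot chest accelerations from an indoor IMU file.
S = load('1001_Indoor_imu.mat');
imu = S.IMU;
chest = 5;                           % 5th sensor assumed to be the chest (per the order above)
acc = imu.dataArray(1:3, :, chest);  % rows 1-3: x, y, z accelerations in m/s^2
t = (0:size(acc, 2) - 1) / imu.samplingRate;
plot(t, acc); xlabel('Time (s)'); ylabel('Acceleration (m/s^2)');
legend('x', 'y', 'z');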

 

(4) Outdoor IMU session (<ID number_Outdoor_imu.mat>)

Same as Indoor IMU session (above).

 

(5) Indoor eye tracking session (<ID number_Indoor_eye_tracker.mat>)

The eye tracker .mat file contains a structure with 6 fields (variable name: Eye_tracker)

 

Eye_tracker.dataLabel: string including ID number, environment, and sensor type

Eye_tracker.dataArray: 7xN matrix. Columns are frame numbers. Rows are: 

• x and y coordinates of the master spot, in eye image pixels

• x and y coordinates of the pupil center, in eye image pixels

• Pupil radius, in eye image pixels

• x and y eye direction with respect to the scene image, in scene image pixels

The eye and scene images are displayed and recorded with resolution of 640 x 480 pixels. The origin is the top left of the image with the X-axis positive to the right and the Y-axis positive downwards. Unavailable data is shown by the number –2000.

 

Eye_tracker.axisLabel: String headings for ‘dataType’ and ‘frame’

Eye_tracker.axisValue: 1x7 cell array of string headings for each row of data type

Eye_tracker.samplingRate: Sampling rate

Eye_tracker.dateTime: String of date and time information of recording
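
As a sketch of how the -2000 missing-data code might be handled (the file name is hypothetical); the pulse sensor, pulse-from-EEG, and force plate files described below follow the same structure pattern and can be accessed analogously:

% Mark unavailable eye-tracker samples (-2000) as NaN before analysis.
S = load('1001_Indoor_eye_tracker.mat');
et = S.Eye_tracker.dataArray;
et(et == -2000) = NaN;                   % -2000 encodes unavailable data (see above)
validFraction = mean(~isnan(et(3, :)));  % e.g., frames with a valid pupil-center x (row 3)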

 

(6) Outdoor eye tracking session (<ID number_Outdoor_eye_tracker.mat>)

Same as Indoor eye tracking session (above).

 

(7) Indoor heart rate from pulse sensor session (<ID number_Indoor_pulse_sensor.mat>)

The pulse sensor .mat file contains a structure with 6 fields (variable name: Pulse_sensor)

 

Pulse_sensor.dataLabel: string including ID number, environment, and sensor type

Pulse_sensor.dataArray: 3xN matrix. Columns are frame numbers. Rows are: 

• pulse (normalized wave), in volts  

• Inter-beat Interval (IBI), in milliseconds

• heart rate, in beats per minute (BPM)

Pulse_sensor.axisLabel: String headings for ‘dataType’ and ‘frame’

Pulse_sensor.axisValue: 1x3 cell array of string headings for each row of data type

Pulse_sensor.samplingRate: Sampling rate

Pulse_sensor.dateTime: String of date and time information of recording

 

(8) Outdoor heart rate from pulse sensor session 

(<ID number_Outdoor_pulse_sensor.mat>)

Same as Indoor pulse sensor session (above).

 

(9) Indoor heart rate from EEG session (<ID number_Indoor_pulse_from_eeg.mat>)

If the pulse rate could be recovered from ECG activity present in the EEG, a corresponding file is available. The pulse from EEG .mat file contains a structure with 6 fields (variable name: Pulse_from_EEG)

 

Pulse_from_EEG.dataLabel: string including ID number, environment, and sensor type

Pulse_from_EEG.dataArray: 3xN matrix. Columns are frame numbers. Rows are: 

• pulse (normalized wave), in volts  

• Inter-beat Interval (IBI), in milliseconds

• heart rate, in beats per minute (BPM)

Pulse_from_EEG.axisLabel: String headings for ‘dataType’ and ‘frame’

Pulse_from_EEG.axisValue: 1x3 cell array of string headings for each row of data type

Pulse_from_EEG.samplingRate: Sampling rate

Pulse_from_EEG.dateTime: String of date and time information of recording

 

(10) Outdoor heart rate from EEG session (<ID number_Outdoor_pulse_from_eeg.mat>)

Same as Indoor pulse from EEG session (above).

 

(11) Indoor treadmill force plate session (<ID number_Indoor_force_plate.mat>)

The force plate .mat file contains a structure with 6 fields (variable name: Force_plate)

 

Force_plate.dataLabel: string including ID number, environment, and sensor type

Force_plate.dataArray: 3xNx2 matrix. Third dimension is for left and right force plates, respectively. Columns are frame numbers. Rows are: 

• x, y, and z direction of force, in newtons

Force_plate.axisLabel: String headings for ‘dataType’ and ‘frame’ and ‘sensorNumber’

Force_plate.axisValue: 1x3 cell array of string headings for each row of data type

Force_plate.samplingRate: Sampling rate

Force_plate.dateTime: String of date and time information of recording

 

(12) EEG digitized head map (<ID number.sfp>)

Besa coordinates of all electrode positions.

 

(13) Indoor eye tracking video (<ID number_Indoor_eye_tracker.avi>)

The eye tracker .avi file is a video from the subject’s perspective (640x480 resolution, 30 frames/sec)

 

(14) Outdoor eye tracking video (<ID number_Outdoor_eye_tracker.avi>)

The eye tracker .avi file is a video from the subject’s perspective (640x480 resolution, 30 frames/sec)

 

(15) Indoor video camera (<ID number_Indoor_video_camera(#).avi>)

The camcorder .avi file is a video from the experimenter’s perspective (704x384 resolution, 30 frames/sec). If there are multiple parts, the appended (#) indicates their order.

 

(16) Outdoor video camera (<ID number_Outdoor_video_camera(#).avi>)

The camcorder .avi file is a video from the experimenter’s perspective (704x384 resolution, 30 frames/sec). If there are multiple parts, the appended (#) indicates their order.

 

Cortisol (Cortisol_all_subjects.xlsx)

Salivary cortisol data is provided as a single spreadsheet ‘Cortisol_all_subjects.xlsx’. It contains the following variables:

 

  • subid: ID number

  • sex: 1 = male, 2 = female

  • age: in years

  • height: in inches

  • weight: in pounds

  • environment: 1 = outdoors, 2 = indoors

  • ordererenvironment: 1 = outdoor first, 2 = indoor first

  • orderstress: 1 = stress first, 2 = non-stress first

  • condition: 1 = Initial sample taken before walking started, 2 = Baseline sample after baseline walking, 3 = Non-stress sample taken after non-stress condition, 4 = Stress sample taken after stress condition

  • concentration: cortisol levels in µg/L

  • cond_ordered: order of conditions by environment
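
As a minimal sketch, the spreadsheet can be summarized in Matlab (variable names follow the list above):

% Summarize cortisol concentration by condition.
T = readtable('Cortisol_all_subjects.xlsx');
stats = groupsummary(T, 'condition', 'mean', 'concentration');
disp(stats)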


Electroencephalography (EEG) signal data was collected from twelve healthy subjects with no known musculoskeletal or neurological deficits (mean age 25.5 ± 3.7 years; 11 male, 1 female; 1 left-handed, 11 right-handed) using an EGI Geodesics© Hydrocel EEG 64-channel spongeless sensor net. All subjects gave their informed consent for inclusion before they participated in the study. The study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved by the Ethics Committee of the University of Wisconsin-Milwaukee (17.352).
