Multi-modal mobile brain-body imaging (MoBI) dataset for assaying neural and head movement responses associated with creative video game play in children

Citation Author(s):
Akshay Sujatha Ravindran, Jesus G. Cruz-Garza, Anastasiya Kopteva, Andrew Paek, Aryan Mobiny, Zachary Hernandez, Jose Luis Contreras-Vidal
Submitted by:
Akshay Ravindran
Last updated:
Thu, 11/08/2018 - 10:34
DOI:
10.21227/H23W88

Abstract 

One of the grand challenges in neuroscience is to understand the developing brain ‘in action and in context’ in complex natural settings. To address this challenge, it is imperative to acquire brain data from freely behaving children to assay the variability and individuality of neural patterns across gender and age. Here, we share a de-identified multi-modal mobile brain-body imaging (MoBI) dataset (4-channel EEG; head motion; demographics; skill level) acquired at the Children’s Museum of Houston from 232 (166 males / 66 females) children while engaged in a video game play task (Minecraft) and in a resting state control condition. This dataset provides an opportunity to further examine the effects of age, gender, and video game skill level on the temporal and spectral patterns of brain responses that emerge during video gaming and resting state in children in the complex natural setting of a museum.

Instructions: 

Participants

233 children (167 males / 66 females) participated in this experiment at the Children’s Museum of Houston during a special one-day event advertised to the public. The parents and children were first briefed and given information about the study. Parents or guardians voluntarily signed an informed consent form, and children were asked to assent to the experiment. The experiment was approved by the Institutional Review Board of the University of Houston and was conducted in a single day. Information about age, gender, race, and level of expertise in playing Minecraft was collected using questionnaires, with the results summarized in Table 1 of the Instructions.docx file. The age of participants ranged from 6 to 16 years, with an average age of 8.72 years (SD: 2.23).

Environment

A designated area at the Children’s Museum of Houston was set up with chairs facing a blank white wall to acquire data for the baseline (resting state) control condition. Twenty desktop computers were arranged in an adjacent, larger area for playing Minecraft after the resting state recording was completed.

Hardware and Software

We recorded brain activity and head movement data using Muse EEG headsets (Interaxon, Toronto, Ontario, Canada). Two electrodes of the headset were positioned over the anterior-frontal region (AF7 and AF8) and two over the temporal-parietal region (TP9 and TP10). The headset produces bipolar readings using Fpz (center of forehead) as the reference (REF). Two other electrodes were placed centrally to suppress noise using a Driven Right Leg (DRL-REF) feedback configuration. Head acceleration was captured using an on-board accelerometer. EEG was recorded at a sampling rate of 220 Hz, while the acceleration data were recorded at 50 Hz. An online notch filter was applied to the EEG data to eliminate 60 Hz line noise. The headset also provided metadata (quality bits for each electrode) sampled at 10 Hz.
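As a quick sanity check on the recordings, the online 60 Hz notch can be verified by inspecting the power spectrum of a channel. The Python sketch below is a minimal example under the assumption that the EEG for one subject has been loaded into a NumPy array named eeg of shape (samples x channels), in the channel order given above; the variable name and array orientation are illustrative, not part of the dataset release.

    # Minimal sketch: check the power spectral density of one channel to confirm that
    # power near 60 Hz is attenuated by the on-board notch filter.
    # Assumes `eeg` has shape (n_samples, 4) with channels ordered TP9, AF7, AF8, TP10.
    import numpy as np
    from scipy.signal import welch

    FS_EEG = 220.0                            # EEG sampling rate (Hz)
    CHANNELS = ["TP9", "AF7", "AF8", "TP10"]

    def channel_psd(eeg, channel):
        """Return (frequencies, PSD) for one channel using Welch's method."""
        idx = CHANNELS.index(channel)
        return welch(eeg[:, idx], fs=FS_EEG, nperseg=int(4 * FS_EEG))

    # freqs, psd = channel_psd(eeg, "AF7")
    # print(psd[np.argmin(np.abs(freqs - 60.0))])   # should be close to the noise floor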

Experimental design

The baseline condition consisted of the children looking at a blank wall for 1 minute with their eyes open. Children were asked to avoid body movements. Afterwards, they were asked to sit in front of a desktop computer, where they could play Minecraft for up to 20 minutes while EEG and acceleration data were recorded.

Data Records

The data and metadata from 232 subjects are stored on IEEE Dataport. Note that EEG and motion data are only available for 171 subjects, as data from 61 subjects could not be saved due to connectivity issues or low battery charge. In the dataset, each subject number is mapped to a subject ID, and that information is available in the Information file (Information.xlsx) stored in the data folder. Subject IDs are assigned such that the subjects with missing data are placed at the end. The Excel file also contains the demographic information for each subject, such as age, gender, race, ethnicity, grade, and class, as well as the frequency of playing Minecraft and the percentage of time for which the different channels had good contact (based on metadata from the Muse headset). The data associated with each individual subject (named with respect to the subject number) are saved as a single variable in a .mat file.

Each individual dataset is further organized in the following way (a minimal Python loading sketch is given after this list):

1) EEG: MATLAB structure containing the raw data and metadata associated with the 4-channel EEG

  • EEG.data: a MATLAB variable containing the raw EEG data from the 4 channels (in the order TP9, AF7, AF8, TP10).
  • EEG.timestamp: a MATLAB variable containing the timestamp associated with the EEG data. EEG was sampled at 220 Hz.

Intervals of baseline and task were identified, based on the accelerometer data, for the first 88 subjects for which it was possible to extract the segments, and this segmentation information is also included in the dataset. The segmentation process is explained in the section below. These 88 subjects have 4 additional entries corresponding to the baseline and task start and end sample points.

  • EEG.baseline_start: a MATLAB variable containing the sample number associated with the start of the baseline condition.
  • EEG.baseline_end: a MATLAB variable containing the sample number associated with the end of the baseline condition.
  • EEG.task_start: a MATLAB variable containing the sample number associated with the start of the task condition.
  • EEG.task_end: a MATLAB variable containing the sample number associated with the end of the task condition.

2) ACC: MATLAB structure containing the raw data and metadata associated with the 3-axis head acceleration.

  • ACC.data: a MATLAB variable containing the 3-axis accelerometer data.
  • ACC.timestamp: a MATLAB variable containing the timestamp associated with the accelerometer data. Accelerometer data were sampled at 50 Hz.

3) Metadata: MATLAB structure containing the signal quality and demographic information associated with each subject.

  • Metadata.is_good: a MATLAB variable containing the Boolean metadata provided by Muse, a strict data quality indicator for each individual channel; 0 = bad, 1 = good. The first column corresponds to the timestamp, while the next four correspond to each channel (TP9, AF7, AF8, and TP10). It was sampled at 10 Hz.
  • Metadata.horseshoe: a MATLAB variable containing the metadata provided by Muse, a data quality indicator for each individual channel; 1 = good, 2 = ok, >= 3 = bad. The first column corresponds to the timestamp, while the next four correspond to each channel (TP9, AF7, AF8, and TP10). It was sampled at 10 Hz.
  • Metadata.Info: a MATLAB cell array containing the demographic and electrode contact information associated with that subject. The columns correspond to subject ID, age, gender, race, ethnicity, grade, class, frequency of playing Minecraft, and percentage of time of good contact, respectively.
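
For readers working outside MATLAB, the per-subject .mat files can also be read in Python. The sketch below is a minimal example, assuming a single subject file named sub1.mat that follows the EEG/ACC/Metadata layout listed above; the file name and the orientation of the data arrays are illustrative and should be checked against Information.xlsx and the actual files.

    # Minimal loading sketch (Python + SciPy) for one subject's .mat file. The file name
    # 'sub1.mat' is illustrative; actual files are named by subject number.
    from scipy.io import loadmat

    mat = loadmat("sub1.mat", squeeze_me=True, struct_as_record=False)
    subj = [v for k, v in mat.items() if not k.startswith("__")][0]  # the single saved variable

    eeg = subj.EEG.data              # 4-channel EEG (TP9, AF7, AF8, TP10), 220 Hz
    eeg_t = subj.EEG.timestamp       # timestamps of the EEG samples
    acc = subj.ACC.data              # 3-axis head acceleration, 50 Hz
    is_good = subj.Metadata.is_good  # quality flags, 10 Hz; column 1 = timestamp, columns 2-5 = channels

    # Percentage of time each channel had good contact (as summarized in Information.xlsx):
    # good_pct = 100.0 * is_good[:, 1:].mean(axis=0)

    # For the first 88 subjects, baseline/task boundaries are stored as sample indices:
    if hasattr(subj.EEG, "baseline_start"):
        baseline_eeg = eeg[int(subj.EEG.baseline_start):int(subj.EEG.baseline_end)]
        task_eeg = eeg[int(subj.EEG.task_start):int(subj.EEG.task_end)]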

 

Identification of Baseline and Task Condition Segments

The segments corresponding to the baseline and task conditions were identified for the first 88 subjects by visually inspecting the accelerometer data alone. There is a period of walking from the area used to collect baseline data to the area used for playing Minecraft. Walking produces a repetitive pattern caused by the headset shaking with each footstep. A typical pattern of cyclic sharp spikes associated with walking is shown in Fig. 1A of the Instructions.docx file; it can be used to distinguish instances of walking from the other head accelerations produced during game play (Fig. 1A in Instructions.docx). These walking intervals were identified manually during offline analysis and were used to separate the recordings into baseline and task conditions. The intervals were therefore computed for the 88 participants with superior quality EEG and head motion data, and are saved under the EEG structure corresponding to each individual subject.
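
Although the released segment boundaries were identified manually, the visual inspection can be reproduced by plotting the head-acceleration magnitude. The Python sketch below is only an illustrative aid, assuming that ACC.data for one subject has been loaded into a NumPy array acc of shape (samples x 3); the variable name is illustrative.

    # Illustrative sketch: plot head-acceleration magnitude so that the cyclic sharp
    # spikes of the walking period (cf. Fig. 1A in Instructions.docx) can be located
    # by eye. Assumes `acc` holds the 3-axis accelerometer data sampled at 50 Hz.
    import numpy as np
    import matplotlib.pyplot as plt

    FS_ACC = 50.0  # accelerometer sampling rate (Hz)

    def acceleration_magnitude(acc):
        """Euclidean norm of the 3-axis acceleration with the mean (gravity offset) removed."""
        mag = np.linalg.norm(acc, axis=1)
        return mag - mag.mean()

    # mag = acceleration_magnitude(acc)
    # t = np.arange(mag.size) / FS_ACC
    # plt.plot(t, mag)
    # plt.xlabel("Time (s)"); plt.ylabel("Head acceleration (a.u.)")
    # plt.show()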

Usage Notes

  • Readers should apply denoising algorithms to clean the EEG data, for example the Artifact Subspace Reconstruction method [1] and wavelet thresholding [2]. The reader is referred to our proposed denoising flowchart in (Mobiny et al., 2017, submitted).
  • Perform thresholding on the EEG [3] and accelerometer data to remove noisy windows (a minimal sketch is given below).
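
The following Python sketch shows one simple way to flag noisy windows by peak-to-peak amplitude thresholding, in the spirit of the step above; the window length and threshold value are illustrative assumptions and are not taken from this dataset description or from [3].

    # Minimal sketch: sliding-window peak-to-peak amplitude thresholding of the EEG.
    # Assumes `eeg` has shape (n_samples, 4) and is sampled at 220 Hz; the 1 s window
    # and the threshold of 100 (arbitrary units) are illustrative choices only.
    import numpy as np

    FS_EEG = 220

    def flag_noisy_windows(eeg, win_sec=1.0, threshold=100.0):
        """Return one Boolean per window; True marks a window whose peak-to-peak
        amplitude exceeds the threshold on any channel (to be discarded)."""
        win = int(win_sec * FS_EEG)
        n_win = eeg.shape[0] // win
        flags = np.zeros(n_win, dtype=bool)
        for w in range(n_win):
            seg = eeg[w * win:(w + 1) * win, :]
            flags[w] = np.any(seg.max(axis=0) - seg.min(axis=0) > threshold)
        return flags

    # keep = ~flag_noisy_windows(eeg)   # windows retained for analysis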

Funding

This work was partially supported by a cross-cutting seed grant from the Cullen College of Engineering at the University of Houston and the National Science Foundation Award NCS-FO 1533691.

Acknowledgements

We acknowledge the logistical support of all members of the Noninvasive Brain-Machine Interface Systems Laboratory who assisted with the data collection at the Children’s Museum of Houston. This work could not have been done without the support of Julia Banda, Gretchen Schmaltz, and Neelam Damani, Director of Gallery Education at the Children’s Museum of Houston. We also acknowledge the support of the University of Houston students Ranga Prasad Maddi, Zach Hernandez, Sho Nakagome, Juliana Arenas, Teresa Tse, Kevin Nathan, Justin Brantley, Nivas Kumar, Murad Megjhani, Jeff Gorges, Yongtian He, David Eguren, Nikunj Bhagat, Fangshi Zhu, and Phat Luu.

References

[1] Mullen, T., Kothe, C., Chi, Y. M., Ojeda, A., Kerth, T., Makeig, S. et al. Real-time modeling and 3D visualization of source dynamics and connectivity using wearable EEG. Engineering in Medicine and Biology Society (EMBC), 2013 35th Annual International Conference of the IEEE, 2184–2187 (2013).

[2] Krishnaveni, V., Jayaraman, S., Aravind, S., Hariharasudhan, V. & Ramadoss, K. Automatic identification and removal of ocular artifacts from EEG using wavelet transform. Measurement Science Review 6, 45–57 (2006).

[3] Mobiny, A., Sujatha Ravindran, A., Cruz-Garza, J. G., Paek, A., Kopteva, A., Eshaghi, S. & Contreras-Vidal, J. L. Assaying neural activity using scalp electroencephalography (EEG) of children during videogame playing (in review).

 

 

Documentation

Instructions.docx (210.01 KB)