One of the grand challenges in neuroscience is to understand the developing brain ‘in action and in context’ in complex natural settings. To address this challenge, it is imperative to acquire brain data from freely-behaving children to assay the variability and individuality of neural patterns across gender and age.


Recent advances in scalp electroencephalography (EEG) as a neuroimaging tool have allowed researchers to overcome the technical challenges and movement restrictions typical of traditional neuroimaging studies. Mobile EEG devices now enable studies of cognition and motor control in natural environments that require mobility, such as art perception and production in a museum setting, or locomotion tasks.


This dataset is associated with the paper Jackson & Hall (2016), which is Open Access and can be found here:

The DataPort Repository contains the data used primarily for generating Figure 1.


** Please note that this dataset is under construction; data and code are still being uploaded while this notice is present. Thank you. Tom **

All code is hosted in a Git repository (below), together with instructions, which can be found by opening the file named in that repository.

You are free to clone or pull this repository and use it under the MIT license, on the understanding that any use of this code will be acknowledged by citing the original paper (DOI: 10.1109/TNSRE.2016.2612001), which is Open Access and can be found here:


The use of modern mobile brain-body imaging techniques, combined with hyperscanning (simultaneous, synchronized recording of brain activity from multiple participants), has allowed researchers to explore a broad range of social interactions from a neuroengineering perspective. Specifically, this approach makes it possible to study such interactions in an ecologically valid setting.


The electrooculography (EOG) signal is widely used to analyze eye movements, with an emphasis on its use in human-computer interaction. Various techniques based on artificial intelligence have been applied to process the signal; these methods require a dedicated dataset to train algorithms capable of detecting eye movements. We designed an experiment in which horizontal and vertical eye movements were recorded at different movement angles.
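Before training a learned detector on such a dataset, a simple velocity-threshold rule is a useful baseline for flagging rapid eye movements. The sketch below is illustrative only: the sampling rate and threshold are assumptions, not values from this dataset, and real detectors would replace the rule with a trained classifier.

```python
import numpy as np

def detect_saccades(eog_uv, fs, threshold_uv_per_s=1500.0):
    """Flag samples where the EOG slope exceeds a velocity threshold.

    A deliberately simple baseline; the threshold (in uV/s) is an
    assumed value and would need tuning for real recordings.
    """
    velocity = np.gradient(eog_uv) * fs       # approximate slope in uV/s
    return np.abs(velocity) > threshold_uv_per_s

# Synthetic horizontal EOG: steady gaze, a rapid 200 uV shift (a
# saccade-like event), then steady gaze again. fs is assumed.
fs = 250
signal = np.concatenate([np.zeros(100),
                         np.linspace(0.0, 200.0, 10),
                         np.full(100, 200.0)])
mask = detect_saccades(signal, fs)
print(mask.any(), mask[:100].any())  # True False
```

Only the rapid-shift region exceeds the velocity threshold; the steady-gaze segments do not, which is the behavior a learned detector would be trained to reproduce.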


This dataset mainly includes the original Bonn epilepsy EEG data set, as well as the EEG signals obtained after decomposing and reconstructing that data set with the revised tunable Q-factor wavelet transform (TQWT).


EEG records brain activity in the form of electrical voltages measured at the scalp. Epileptic seizure prediction and detection is a widely pursued research topic. This dataset contains recordings from 11 patients, two of whom show seizures in the EEG.


The total duration of the seizures is 170 seconds. The recordings have 16 channels, sampled at 256 Hz.


The final dataset files, in .csv format, contain 87,040 rows × 17 columns.
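The documented shape gives a quick sanity check when loading a file: at 256 Hz, 87,040 samples correspond to 340 seconds of signal. A minimal loader sketch, using only the standard library (the file path is a placeholder, and the column layout is assumed from the description):

```python
import csv

FS = 256                      # sampling rate in Hz, from the description
N_ROWS, N_COLS = 87040, 17    # documented shape of each .csv file

def load_recording(path):
    """Read one dataset .csv and verify the documented shape.

    `path` is a placeholder; the actual file names may differ, and a
    header row (if present) would need to be skipped first.
    """
    with open(path, newline="") as f:
        rows = [list(map(float, row)) for row in csv.reader(f)]
    assert len(rows) == N_ROWS and len(rows[0]) == N_COLS
    return rows

# 87,040 samples / 256 Hz = 340.0 seconds of signal per file
duration_s = N_ROWS / FS
print(duration_s)  # 340.0
```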



The design and implementation of an anthropomorphic robotic hand control system were carried out for the Bioengineering and Neuroimaging Laboratory (LNB) of ESPOL. The myoelectric signals were acquired with a bioelectric data acquisition board (Cyton board), using six of the eight available channels; the signals had an amplitude of about 200 µV and were sampled at 250 Hz.
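Raw samples from such a board arrive as signed ADC counts and must be scaled to microvolts before the 200 µV amplitudes above are meaningful. The constants below are assumptions based on the Cyton's ADS1299 front end (4.5 V reference, 24-bit ADC, default gain of 24); verify them against your own board configuration.

```python
# Assumed ADS1299 front-end parameters; check against your board settings.
V_REF = 4.5                 # reference voltage in volts
GAIN = 24                   # default programmable gain amplifier setting
ADC_FULL_SCALE = 2**23 - 1  # 24-bit signed ADC

# Microvolts represented by one raw ADC count (~0.02235 uV/count at gain 24)
UV_PER_COUNT = V_REF / GAIN / ADC_FULL_SCALE * 1e6

def counts_to_uv(raw_counts):
    """Convert a sequence of raw ADC counts to microvolts."""
    return [c * UV_PER_COUNT for c in raw_counts]

# A ~200 uV EMG burst, as described above, spans roughly 8950 raw counts.
print(round(200 / UV_PER_COUNT))
```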



Recently, the surface electromyogram (EMG) has been proposed as a novel biometric trait that addresses some key limitations of current biometrics, such as spoofing and liveness detection. EMG signals possess a unique characteristic: they are inherently different across individuals (a biometric property), and they can be customized to realize multi-length codes or passwords (for example, by performing different gestures).
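The idea of a gesture-based EMG password can be sketched as follows. This is a toy illustration under loud assumptions: the gesture names, two-channel RMS features, and nearest-template classifier are all hypothetical, and real systems use far richer EMG features and trained models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical enrolled templates: mean RMS per channel for each gesture.
templates = {
    "fist":   np.array([0.8, 0.2]),
    "spread": np.array([0.3, 0.9]),
    "pinch":  np.array([0.5, 0.5]),
}

def classify(feature):
    """Nearest-template gesture classification (Euclidean distance)."""
    return min(templates, key=lambda g: np.linalg.norm(templates[g] - feature))

def verify(code, attempt_features):
    """Accept only if the decoded gesture sequence matches the code."""
    decoded = [classify(f) for f in attempt_features]
    return decoded == list(code)

# Simulated attempt: noisy versions of the correct gesture sequence.
code = ["fist", "spread", "fist"]
attempt = [templates[g] + rng.normal(0.0, 0.05, 2) for g in code]
print(verify(code, attempt))  # True: noise is small vs. template spacing
```

Because the code is a *sequence* of gestures, its length can be varied per user, which is what makes EMG passwords customizable in a way fixed traits like fingerprints are not.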