One of the grand challenges in neuroscience is to understand the developing brain ‘in action and in context’ in complex natural settings. To address this challenge, it is imperative to acquire brain data from freely behaving children to assay the variability and individuality of neural patterns across gender and age.


Recent advances in scalp electroencephalography (EEG) as a neuroimaging tool have allowed researchers to overcome the technical challenges and movement restrictions typical of traditional neuroimaging studies. Mobile EEG devices now enable studies of cognition and motor control in natural environments that require mobility, such as art perception and production in a museum setting, and locomotion tasks.


This dataset is associated with the paper Jackson & Hall (2016), which is open access and can be found here:

The DataPort repository contains the data primarily used to generate Figure 1.


** Please note that this is under construction, and all data and code are still being uploaded whilst this notice is present. Thank you. Tom **

All code is hosted in a Git repository (below), along with instructions, which can be found in the named file in that repository.

You are free to clone/pull this repository and use it under the MIT license, on the understanding that any use of this code will be acknowledged by citing the original paper (DOI: 10.1109/TNSRE.2016.2612001), which is open access and can be found here:


Recently, the surface electromyogram (EMG) has been proposed as a novel biometric trait to address some key limitations of current biometrics, such as spoofing and liveness detection. EMG signals possess a unique characteristic: they are inherently different across individuals (a biometric property), and they can be customized to realize codes or passwords of varying length (for example, by performing different gestures).




The data contain recordings from 13 healthy controls, 14 Parkinson's disease (PD) patients without freezing of gait (FOG), and 14 with FOG.


Here we present recordings from a new high-throughput instrument to optogenetically manipulate neural activity in moving C. elegans.


Raw Data for Liu et al., 2021

This is the raw data corresponding to: Liu, Kumar, Sharma and Leifer, "A high-throughput method to deliver targeted optogenetic stimulation to moving C. elegans population" available at and forthcoming in PLOS Biology.

The code used to analyze this data is available on GitHub at


This dataset is publicly hosted on IEEE DataPort. It is >300 GB of data containing many individual image frames. We have bundled the data into one large .tar bundle. Download the .tar bundle and extract it before use. Consider using an AWS client to download the bundle instead of your web browser, as downloading such large files through a browser can reportedly be problematic.
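As a minimal sketch of the extraction step (the file names below are placeholders; the real DataPort bundle and its contents will differ), the .tar can be unpacked with Python's standard library:

```python
import pathlib
import tarfile

# Build a tiny stand-in bundle so the sketch is self-contained; the real
# DataPort download is one large .tar holding many image frames.
pathlib.Path("frame_0001.png").write_bytes(b"demo")
with tarfile.open("bundle.tar", "w") as tar:
    tar.add("frame_0001.png")

# Extract everything before use, as recommended above.
with tarfile.open("bundle.tar") as tar:
    tar.extractall("extracted")

print(pathlib.Path("extracted/frame_0001.png").read_bytes())  # b'demo'
```

For the download itself, an AWS CLI `aws s3 cp`-style transfer (or any resumable download tool) is more robust for files of this size than a browser.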


This dataset, as-is, includes only the raw camera frames and other output of the real-time instrument used to optogenetically activate the animals and record their motion. To extract final tracks, centerlines, velocities, etc., these raw outputs must be processed.

Post-processing can be done by running the /ProcessDateDirectory.m MATLAB script from the code repository. Note that post-processing was optimized to run in parallel on a high-performance computing cluster; it is computationally intensive and also requires a very large amount of RAM.

Repository Directory Structure

Recordings from the instrument are organized into directories by date, which we call "Date directories."

Each experiment is its own timestamped folder within a date directory, and it contains the following files:

  • camera_distortion.png contains camera spatial calibration information in the image metadata
  • CameraFrames.mkv contains the raw camera images, compressed with H.265
  • labview_parameters.csv lists the settings used by the instrument during the real-time experiment
  • labview_tracks.mat contains the real-time tracking data in a MATLAB-readable HDF5 format
  • projector_to_camera_distortion.png contains the spatial calibration information that maps projector pixel space into camera pixel space
  • tags.txt contains tagged information for the experiment and is used to organize and select experiments for analysis
  • timestamps.mat contains timing information saved during the real-time experiments, including closed-loop lag
  • the ConvertedProjectorFrames folder contains PNG-compressed stimulus images converted to the camera's frame of reference
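Because MATLAB v7.3 .mat files are HDF5 containers, labview_tracks.mat can also be read outside MATLAB, for example with h5py in Python. The dataset path below (`tracks/centroids`) is hypothetical; list the real layout with `f.keys()`. The sketch writes a tiny stand-in file so it is self-contained:

```python
import h5py
import numpy as np

# Stand-in file mimicking an HDF5-based .mat file; the real
# labview_tracks.mat has its own dataset names -- inspect with f.keys().
with h5py.File("labview_tracks_demo.mat", "w") as f:
    f.create_dataset("tracks/centroids", data=np.zeros((100, 2)))

with h5py.File("labview_tracks_demo.mat", "r") as f:
    centroids = f["tracks/centroids"][:]  # read dataset into a NumPy array

print(centroids.shape)  # (100, 2)
```

The same file opens in MATLAB with `load` or `h5read`, so either toolchain can be used for exploratory analysis.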

Naming convention for individual recordings

A typical folder is 210624_RunRailsTriggeredByTurning_Sandeep_AML67_10ulRet_red

  • 210624 - Date the dataset was collected, in YYMMDD format.
  • RunRailsTriggeredByTurning - Experiment type. For example, this experiment was performed in closed loop, triggered on turning. Open-loop experiments are called "RunFullWormRails" experiments for historical reasons.
  • Sandeep - Name of the experimenter
  • AML67 - C. elegans strain name. Note strain AML470 corresponds to internal strain name "AKS_483.7.e".
  • 10ulRet - Concentration of all-trans-retinal used
  • red - LED color used to stimulate. Always red for this manuscript.
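Since folder names follow this fixed underscore-separated pattern, the metadata fields can be recovered with a simple split. This is a sketch that assumes every folder has exactly these six fields; folders with extra or missing fields would need special handling:

```python
folder = "210624_RunRailsTriggeredByTurning_Sandeep_AML67_10ulRet_red"

# Fields: date, experiment type, experimenter, strain, retinal conc., LED color
date, exp_type, experimenter, strain, retinal, led = folder.split("_")

print(date, strain, led)  # 210624 AML67 red
```

This kind of parse is handy for grouping recordings by strain or experiment type across many date directories.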

Regenerating figures

Once post-processing has been run, figures from the manuscript can be generated using scripts in

Please refer to instructions_to_generate_figures.csv for instructions on which MATLAB script to run to generate each specific figure.
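The figure-to-script mapping can be read with any CSV reader. The sketch below uses Python's csv module on a made-up two-column file; the column names (`figure`, `script`) and rows are hypothetical stand-ins, and the real instructions_to_generate_figures.csv defines its own columns:

```python
import csv
import pathlib

# Hypothetical stand-in for the real CSV; actual columns may differ.
pathlib.Path("figures_demo.csv").write_text(
    "figure,script\nFig1,plot_fig1.m\nFig2,plot_fig2.m\n"
)

with open("figures_demo.csv", newline="") as f:
    mapping = {row["figure"]: row["script"] for row in csv.DictReader(f)}

print(mapping["Fig1"])  # plot_fig1.m
```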


The dataset is a record of different groups of people, either healthy subjects or subjects with subclinical cardiovascular disease (CVD) and a history of coronary heart disease or hypertension, covering superficial body features, original imaging photoplethysmography (iPPG) signals, and derived characteristics.

The main purpose of the dataset is to understand the relationship between CVD and high-dimensional iPPG characteristics.