One of the grand challenges in neuroscience is to understand the developing brain ‘in action and in context’ in complex natural settings. To address this challenge, it is imperative to acquire brain data from freely-behaving children to assay the variability and individuality of neural patterns across gender and age.
Recent advances in scalp electroencephalography (EEG) as a neuroimaging tool now allow researchers to overcome the technical challenges and movement restrictions typical of traditional neuroimaging studies. Mobile EEG devices have enabled studies of cognition and motor control in natural environments that require mobility, such as art perception and production in a museum setting, and locomotion tasks.
This dataset is associated with the paper Jackson & Hall (2016), which is open access and can be found here: http://ieeexplore.ieee.org/document/7742994/
The DataPort Repository contains the data used primarily for generating Figure 1.
All code is hosted in a Git repository (below), along with instructions, which can be found in the README.md file in that repository.
You are free to clone or pull this repository and use it under the MIT license, on the understanding that any use of this code will be acknowledged by citing the original paper (DOI: 10.1109/TNSRE.2016.2612001), which is open access and can be found here: http://ieeexplore.ieee.org/document/7742994/
This dataset has been employed in the following articles:
Recently, the surface electromyogram (EMG) has been proposed as a novel biometric trait that addresses some key limitations of current biometrics, such as vulnerability to spoofing and the difficulty of liveness detection. EMG signals possess a unique characteristic: they are inherently individual (a biometric), and they can be customized to realize codes or passwords of varying length (for example, by performing different gestures).
The data contain recordings from 13 healthy controls, 14 Parkinson's disease (PD) patients without freezing of gait (FOG), and 14 PD patients with FOG.
Here we present recordings from a new high-throughput instrument used to optogenetically manipulate neural activity in moving C. elegans.
Raw Data for Liu et al., 2021
This is the raw data corresponding to: Liu, Kumar, Sharma and Leifer, "A high-throughput method to deliver targeted optogenetic stimulation to moving C. elegans population" available at https://arxiv.org/abs/2109.05303 and forthcoming in PLOS Biology.
The code used to analyze this data is available on GitHub at https://github.com/leiferlab/liu-closed-loop-code.git
This dataset is publicly hosted on IEEE DataPort. It comprises more than 300 GB of data containing many individual image frames. We have bundled the data into one large .tar archive; download the .tar archive and extract it before use. Consider using an AWS client to download the bundle instead of your web browser, as downloading such large files through a browser can be problematic.
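Once the bundle has been downloaded, it can be unpacked programmatically. A minimal Python sketch using only the standard library (the archive filename and destination directory here are placeholders, not part of the dataset):

```python
import tarfile

def extract_bundle(tar_path, dest_dir):
    """Extract the downloaded .tar bundle and return its member names."""
    with tarfile.open(tar_path) as bundle:
        names = bundle.getnames()
        # Large extraction: make sure >300 GB of free disk space is available
        bundle.extractall(dest_dir)
    return names
```

For the download itself, a command-line client such as the AWS CLI (e.g. `aws s3 cp`) tends to be more robust than a browser for files of this size, since it can resume interrupted transfers.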
This dataset, as-is, includes only the raw camera frames and other output of the real-time instrument used to optogenetically activate the animals and record their motion. To extract final tracks, centerlines, velocities, etc., these raw outputs must be processed.
Post-processing can be done by running the ProcessDateDirectory.m MATLAB script from https://github.com/leiferlab/liu-closed-loop-code.git. Note that post-processing was optimized to run in parallel on a high-performance computing cluster; it is computationally intensive and also requires a very large amount of RAM.
Repository Directory Structure
Recordings from the instrument are organized into directories by date, which we call "Date directories."
Each experiment is its own timestamped folder within a date directory, and it contains the following files:
camera_distortion.png: contains camera spatial calibration information in the image metadata
CameraFrames.mkv: the raw camera images, compressed with H.265
labview_parameters.csv: the settings used by the instrument in the real-time experiment
labview_tracks.mat: contains the real-time tracking data in a MATLAB-readable HDF5 format
projector_to_camera_distortion.png: contains the spatial calibration information that maps projector pixel space into camera pixel space
tags.txt: contains tagged information for the experiment and is used to organize and select experiments for analysis
timestamps.mat: contains timing information saved during the real-time experiments, including closed-loop lag
ConvertedProjectorFrames: a folder containing PNG-compressed stimulus images converted to the camera's frame of reference
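As a quick sanity check after extraction, the per-experiment layout described above can be verified programmatically. A minimal sketch (the helper name is ours, not part of the released code):

```python
import os

# Files expected in every timestamped experiment folder
EXPECTED_FILES = [
    "camera_distortion.png",
    "CameraFrames.mkv",
    "labview_parameters.csv",
    "labview_tracks.mat",
    "projector_to_camera_distortion.png",
    "tags.txt",
    "timestamps.mat",
]

def missing_files(experiment_dir):
    """Return the expected files absent from an experiment folder."""
    return [name for name in EXPECTED_FILES
            if not os.path.exists(os.path.join(experiment_dir, name))]
```

Running this over every timestamped folder in a date directory is a cheap way to catch incomplete extractions before starting the (expensive) MATLAB post-processing.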
Naming convention for individual recordings
A typical folder name combines the following fields:
20210624: date the dataset was collected, in YYYYMMDD format
RunRailsTriggeredByTurning: experiment type. For example, this experiment was performed in closed loop, triggered on turning. Open-loop experiments are called "RunFullWormRails" experiments for historical reasons.
Sandeep: name of the experimenter
AML67: C. elegans strain name. Note that strain AML470 corresponds to internal strain name "AKS_483.7.e".
10ulRet: concentration of all-trans-retinal used
red: LED color used for stimulation. Always red for this manuscript.
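When organizing many recordings, these fields can be parsed out of the folder names. A minimal sketch, assuming the fields are underscore-separated (an assumption; check the actual folder names in your download before relying on it):

```python
# Field order as described above; names are ours, for illustration only
FIELDS = ["date", "experiment_type", "experimenter",
          "strain", "retinal", "led_color"]

def parse_experiment_name(folder_name):
    """Split a folder name such as
    '20210624_RunRailsTriggeredByTurning_Sandeep_AML67_10ulRet_red'
    into a dict of its six fields (assumes '_' separators)."""
    parts = folder_name.split("_")
    if len(parts) != len(FIELDS):
        raise ValueError(f"unexpected folder name: {folder_name!r}")
    return dict(zip(FIELDS, parts))
```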
Once post-processing has been run, figures from the manuscript can be generated using scripts in https://github.com/leiferlab/liu-closed-loop-code.git
Please refer to instructions_to_generate_figures.csv for instructions on which MATLAB script to run to generate each specific figure.
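That CSV can be loaded with a few lines of standard-library Python, for example to look up scripts by figure. A minimal sketch (the column names in the real file may differ; adjust to its actual header):

```python
import csv

def load_figure_instructions(csv_path):
    """Read a figure-instructions CSV into a list of row dicts,
    keyed by whatever column names appear in the header row."""
    with open(csv_path, newline="") as f:
        return list(csv.DictReader(f))
```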
The given dataset is a record of different groups of people, either healthy subjects or subjects with subclinical cardiovascular disease (CVD) and a history of coronary heart disease or hypertension, comprising superficial body features, original imaging photoplethysmography (iPPG) signals, and derived characteristics.
The main purpose of the dataset is to understand the relationship between CVD and high-dimensional iPPG characteristics.