The dataset comprises up to two weeks of activity data collected from the ankle and foot of 14 people without amputation and 17 people with lower limb amputation.  Walking speed, cadence, and the lengths of strides taken at and away from home were considered in this study.  Data were collected from two wearable sensors: one inertial measurement unit (IMU) placed on top of the prosthetic or non-dominant foot, and one accelerometer placed on the same ankle.  Location information was derived from GPS and labeled as ‘home’, ‘away’, or ‘unknown’.  The dataset contains raw accelerometer data.


This dataset comprises 31 MATLAB .mat files. Each .mat file contains all sensor data for one participant. Files for participants with lower limb amputation (n = 17) are named ‘S##.mat’ and files for control participants (n = 14) are named ‘C##.mat’.
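Given that naming scheme, each file's group can be inferred from its name alone. A minimal Python sketch (the regular expression and group labels below are assumptions based on the scheme described above, not part of the dataset):

```python
import re

def classify_participant(filename):
    """Map a .mat filename to its participant group.

    'S##.mat' -> lower limb amputation, 'C##.mat' -> control,
    per the naming scheme described above.
    """
    m = re.fullmatch(r"([SC])(\d+)\.mat", filename)
    if m is None:
        raise ValueError(f"unrecognised filename: {filename}")
    group = "amputation" if m.group(1) == "S" else "control"
    return group, int(m.group(2))

print(classify_participant("S03.mat"))  # ('amputation', 3)
print(classify_participant("C12.mat"))  # ('control', 12)
```

Each file itself can then be loaded with `scipy.io.loadmat`, provided the files were not saved in the MATLAB v7.3 (HDF5) format.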


The dataset contains psychological and physiological measures acquired from a sample of 220 adult subjects (76 female, 144 male) undergoing dental treatment.


EEG signals from various subjects are provided as text files. They can be useful for testing various EEG signal processing algorithms, such as filtering, linear prediction, abnormality detection, PCA, and ICA.
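As an illustration of the kind of processing mentioned above, a channel could be band-pass filtered as below. This is a hedged sketch: the sampling rate, file layout, and band edges are assumptions, not documented properties of this dataset.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 256.0  # assumed sampling rate in Hz; replace with the dataset's actual value

def bandpass(signal, low=1.0, high=40.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter."""
    sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

# Synthetic stand-in for one channel; a real file would be read with
# e.g. eeg = np.loadtxt("subject01.txt")
t = np.arange(0, 2.0, 1.0 / FS)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)

filtered = bandpass(eeg)  # the 60 Hz component falls outside the pass band
```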


A custom-made multispectral camera was used to collect a novel dataset of images of untreated lettuce leaves and leaves treated with vinegar, oil, or a combination of the two. The camera captured image data at 10 wavelengths in the range 380-980 nm, spanning the visible and NIR (near-infrared) regions of the electromagnetic spectrum. Imaging was done in a lab environment in the presence of ambient light.




The dataset consists of two populations of fetuses: 160 healthy and 102 Late Intra Uterine Growth Restricted (IUGR). Late IUGR is an adverse pathological condition encompassing chronic hypoxia as a consequence of placental insufficiency, resulting in an abnormal rate of fetal growth. In standard clinical practice, Late IUGR diagnosis can only be suspected in the third trimester and ultimately confirmed at birth. This data collection comprises a set of 31 Fetal Heart Rate (FHR) indices computed at different time scales and in different domains, accompanied by the clinical diagnosis.


The data for healthy and Late IUGR populations are included in a single .xlsx file.

Participants are listed by rows and features by columns. In the following we report an exhaustive list of the features contained in the dataset, accompanied by their units, the time interval employed for the computation, and scientific literature references: 

Fetal and Maternal Domains

  • Clinical Diagnosis [HEALTHY/LATE IUGR]: binary variable to report the clinical diagnosis of the participant
  • Gestational Age [days]: gestational age at the time of CTG examination
  • Maternal Age [years]: maternal age at the time of CTG examination
  • Sex [Male (1)/Female (2)]: fetal sex


Morphological and Time Domains

  • Mean FHR [bpm] – 1-min epoch: the mean of FHR excluding accelerations and decelerations 
  • Std FHR [bpm] – 1-min epoch: the standard deviation of FHR excluding accelerations and decelerations 
  • DELTA [ms] – 1-min epoch: defined in accordance with [1], [2] excluding accelerations and decelerations 
  • II [] – 1-min epoch: defined in accordance with [1], [2] excluding accelerations and decelerations 
  • STV [ms] – 1-min epoch: defined in accordance with [1], [2] excluding accelerations and decelerations 
  • LTI [ms] – 3-min epoch: defined in accordance with [1], [2] excluding accelerations and decelerations 
  • ACC_L [#] – entire recording: the count of large accelerations defined in accordance with [3], [4] 
  • ACC_S [#] – entire recording: the count of small accelerations defined in accordance with [3], [4] 
  • CONTR [#] – entire recording: the count of contractions defined in accordance with [3], [4] 


Frequency Domain 

  • LF [ms²/Hz] – 3-min epoch: defined in accordance with [2], LF band is defined in the range [0.03 - 0.15] Hz 
  • MF [ms²/Hz] – 3-min epoch: defined in accordance with [2], MF band is defined in the range [0.15 - 0.5] Hz 
  • HF [ms²/Hz] – 3-min epoch: defined in accordance with [2], HF band is defined in the range [0.5 - 1] Hz 


Complexity Domain 

  • ApEn [bits] – 3-min epoch: defined in accordance with [5], m = 1, r = 0.1*standard deviation of the considered epoch 
  • SampEn [bits] – 3-min epoch: defined in accordance with [6], m = 1, r = 0.1*standard deviation of the considered epoch 
  • LCZ_BIN_0 [bits] – 3-min epoch: defined in accordance with [7], binary coding and p = 0 
  • LCZ_TER_0 [bits] – 3-min epoch: defined in accordance with [7], tertiary coding and p = 0 
  • AC/DC/DR [bpm] – entire recording: defined in accordance with [8]–[10], considering different combinations of the parameters T and s; L is constant and equal to 100 samples; e.g., AC_T1_s2 is defined as the acceleration capacity computed by setting the parameters T = 1 and s = 2
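Since participants are rows and features are columns, the two populations can be separated on the Clinical Diagnosis column. A minimal pandas sketch, illustrated on a toy frame rather than the real file (the exact column headers in the .xlsx are an assumption based on the list above):

```python
import pandas as pd

# A real analysis would start from the distributed file, e.g.:
#   df = pd.read_excel("dataset.xlsx")
# Toy frame mimicking the layout: participants by rows, features by columns.
df = pd.DataFrame({
    "Clinical Diagnosis": ["HEALTHY", "LATE IUGR", "HEALTHY"],
    "Gestational Age": [266, 251, 270],   # [days]
    "Mean FHR": [140.2, 131.7, 144.9],    # [bpm]
})

# Split the cohort on the binary diagnosis variable.
healthy = df[df["Clinical Diagnosis"] == "HEALTHY"]
late_iugr = df[df["Clinical Diagnosis"] == "LATE IUGR"]
print(len(healthy), len(late_iugr))  # 2 1
```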



[1]       D. Arduini, G. Rizzo, A. Piana, P. Bonalumi, P. Brambilla, and C. Romanini, “Computerized analysis of fetal heart rate—Part I: description of the system (2CTG),” J Matern Fetal Invest, vol. 3, pp. 159–164, 1993.

[2]       M. G. Signorini, G. Magenes, S. Cerutti, and D. Arduini, “Linear and nonlinear parameters for the analysis of fetal heart rate signal from cardiotocographic recordings,” IEEE Trans. Biomed. Eng., vol. 50, no. 3, pp. 365–374, 2003.

[3]       FIGO, “Guidelines for the Use of Fetal Monitoring,” Int. J. Gynecol. Obstet., vol. 25, pp. 159–167, 1986.

[4]       R. Rabinowitz, E. Persitz, and E. Sadovsky, “The relation between fetal heart rate accelerations and fetal movements.,” Obstet. Gynecol., vol. 61, no. 1, pp. 16–18, 1983.

[5]       S. M. Pincus and R. R. Viscarello, “Approximate entropy: a regularity measure for fetal heart rate analysis.,” Obstet. Gynecol., vol. 79, no. 2, pp. 249–55, 1992.

[6]       D. E. Lake, J. S. Richman, M. P. Griffin, and J. R. Moorman, “Sample entropy analysis of neonatal heart rate variability,” Am. J. Physiol. - Regul. Integr. Comp. Physiol., vol. 283, no. 3, pp. R789–R797, 2002.

[7]       A. Lempel and J. Ziv, “On the complexity of finite sequences,” IEEE Trans. Inf. Theory, vol. 22, no. 1, pp. 75–81, 1976.

[8]       A. Bauer et al., “Phase-rectified signal averaging detects quasi-periodicities in non-stationary data,” Phys. A Stat. Mech. its Appl., vol. 364, pp. 423–434, 2006.

[9]       A. Fanelli, G. Magenes, M. Campanile, and M. G. Signorini, “Quantitative assessment of fetal well-being through ctg recordings: A new parameter based on phase-rectified signal average,” IEEE J. Biomed. Heal. Informatics, vol. 17, no. 5, pp. 959–966, 2013.

[10]     M. W. Rivolta, T. Stampalija, M. G. Frasch, and R. Sassi, “Theoretical Value of Deceleration Capacity Points to Deceleration Reserve of Fetal Heart Rate,” IEEE Trans. Biomed. Eng., pp. 1–10, 2019.


Pressing workload demands, along with social media interaction, lead to diminished alertness during work hours. Researchers have attempted to measure alertness levels from various cues such as EEG, EOG, and video-based eye movement analysis. Among these, video-based eyelid and iris motion tracking has gained much attention in recent years. However, most of these implementations are tested on video data of subjects without spectacles. Such videos do not pose a challenge for eye detection and tracking.

The Disease-Specific Faces (DSF) database is used to research the phenotypes and genotypes of diseases.
Disease-Specific Face images were collected from:
♦ Professional medical publications
♦ Professional medical websites
♦ Medical Forums
♦ Hospitals
with definite diagnostic results.
The database is updated every three months.
If you would like to use the DSF database, please send an email to

BraTS has always focused on the evaluation of state-of-the-art methods for the segmentation of brain tumors in multimodal magnetic resonance imaging (MRI) scans. BraTS 2019 utilizes multi-institutional pre-operative MRI scans and focuses on the segmentation of intrinsically heterogeneous (in appearance, shape, and histology) brain tumors, namely gliomas. Furthermore, to pinpoint the clinical relevance of this segmentation task, BraTS’19 also focuses on the prediction of patient overall survival via integrative analyses of radiomic features and machine learning algorithms.

Last Updated On: Fri, 02/28/2020 - 06:31

Synergistic prostheses enable the coordinated movement of the human-prosthetic arm, as required by activities of daily living. This is achieved by coupling the motion of the prosthesis to the human command, such as residual limb movement in motion-based interfaces. Previous studies demonstrated that developing human-prosthetic synergies in joint-space must consider individual motor behaviour and the intended task to be performed, requiring personalisation and task calibration.


Task-space synergy comparison dataset for the experiments performed in 2019-2020.


  • Processed: Processed data from MATLAB in ".mat" format. Organised by session and subject.
  • Raw: Raw time-series data gathered from sensors in ".csv" format. Each file represents a trial where a subject performed a reaching task. Organised by subject, modality and session. Anonymised subject information is included in a ".json" file.
    • Columns of the time-series files represent the different data gathered.
    • Rows of the time-series files represent the values at the given time "t".
  • Scripts: MATLAB scripts used to process and plot data. See ProcessAndUpdateSubjectData for data processing steps.
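Given that layout, one raw trial file can be read into a columns-by-time structure directly. A hedged sketch (the column names are hypothetical, and the inline string stands in for a file that would normally be opened from disk):

```python
import csv
import io

# Inline stand-in for one trial file; on disk this would be something like
#   open("subject01/session1/trial01.csv")
# Columns represent the different data gathered; each row holds the
# values at the given time "t".
trial_csv = io.StringIO(
    "t,sensor_a,sensor_b\n"
    "0.00,0.12,1.05\n"
    "0.01,0.15,1.02\n"
    "0.02,0.11,1.08\n"
)

reader = csv.DictReader(trial_csv)
rows = list(reader)  # one dict per time sample
columns = {k: [float(r[k]) for r in rows] for k in rows[0]}
print(len(rows), list(columns))  # 3 samples, columns ['t', 'sensor_a', 'sensor_b']
```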

Ear-EEG recording collects brain signals from electrodes placed in the ear canal. Compared with existing scalp-EEG, ear-EEG is more wearable and more comfortable for the user.


** Please note that this is under construction, and the instructions are still being updated **




Six adults (2 males, 4 females, ages 22-28) participated in this experiment. The subjects were first given information about the study and then signed an informed consent form. The study was approved by the ethics committee at the City University of Hong Kong (reference number: 2-25-201602_01).


Hardware and Software

We recorded scalp-EEG using a Neuroscan Quick Cap (Model C190). Ear-EEG was recorded simultaneously with scalp-EEG. The 8 ear electrodes were placed at the front and back of the ear canal (labeled xF and xB) and at the upper and lower positions in the concha (labeled xOU and xOD). All ear and scalp electrodes were referenced to a scalp REF electrode, and the scalp GRD electrode was used as ground. The signals were sampled at 1000 Hz and then filtered with a band-pass filter between 0.5 Hz and 100 Hz, together with a notch filter to suppress line noise. The recording amplifier was a SynAmps2, and Curry 7 was used for real-time data monitoring and collection.
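The filtering chain described above can be reproduced offline with SciPy. This is a hedged reconstruction: the filter orders and the 50 Hz mains frequency (Hong Kong mains) are assumptions, not the recording software's documented settings.

```python
import numpy as np
from scipy.signal import butter, iirnotch, sosfiltfilt, filtfilt

FS = 1000.0  # sampling rate from the recording setup

def preprocess(x, fs=FS, mains=50.0):
    """Band-pass 0.5-100 Hz plus a notch at the mains frequency."""
    sos = butter(4, [0.5, 100.0], btype="bandpass", fs=fs, output="sos")
    x = sosfiltfilt(sos, x)               # zero-phase band-pass
    b, a = iirnotch(mains, Q=30.0, fs=fs)  # narrow notch at line frequency
    return filtfilt(b, a, x)

# Synthetic one-second channel: 10 Hz activity plus 50 Hz line noise.
t = np.arange(0, 1.0, 1.0 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 0.8 * np.sin(2 * np.pi * 50 * t)
clean = preprocess(raw)  # line-noise component is strongly attenuated
```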


Experimental design

Subjects were seated in front of a computer monitor. A fixation cross was presented in the center of the monitor for 3 s, followed by an arrow pseudo-randomly pointing to the right or left for 4 s. During the 4 s arrow presentation, subjects had to imagine and perform grasping with the left or right hand according to the arrow direction. A short warning beep was played 2 s after the cross onset to alert the subjects. 


Data Records

The data and metadata from the 6 subjects are stored on IEEE DataPort. Note that subjects 1-4 completed 10 blocks of trials, while subject 6 finished only 5 blocks. Each block contained 16 trials. In our dataset, each folder contains the individual dataset from one subject. Each individual dataset consists of four types of files (.dat, .rs3, .ceo, .dap); all four files are needed for processing with the EEGLAB and MNE packages. Each individual dataset contains the raw EEG data from 122 channels (from the scalp-EEG recording), 8 channels (from the ear-EEG recording), and 1 channel (REF electrode). 

The individual datasets of subjects 1, 5, and 6 are split into several sub-datasets; the index indicates the time order of each sub-dataset (motor1, followed by motor2, motor3, motor4, etc.). The individual datasets of subjects 2, 3, and 4 each consist of one main dataset.

Each dataset has timestamps for epoch extraction. Two event labels mark the start of the arrow, which indicated the start of the subject's hand grasping (event number 1: left hand; event number 2: right hand).
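A minimal epoching sketch based on those labels (the array shapes are hypothetical; the real recordings would first be loaded, e.g. with `mne.io.read_raw_curry`, which reads the Curry .dat/.dap/.rs3 file set):

```python
import numpy as np

FS = 1000       # sampling rate in Hz, from the recording setup
EPOCH_S = 4     # arrow presentation length in seconds

def extract_epochs(data, events, fs=FS, epoch_s=EPOCH_S):
    """Cut fixed-length epochs starting at each arrow-onset event.

    data:   (n_channels, n_samples) array
    events: list of (onset_sample, label), label 1 = left, 2 = right
    """
    n = fs * epoch_s
    epochs, labels = [], []
    for onset, label in events:
        if onset + n <= data.shape[1]:  # skip events too close to the end
            epochs.append(data[:, onset:onset + n])
            labels.append(label)
    return np.stack(epochs), np.array(labels)

# Synthetic stand-in: 8 ear-EEG channels, 20 s of data, two arrow events.
data = np.random.randn(8, 20 * FS)
events = [(3 * FS, 1), (10 * FS, 2)]
epochs, labels = extract_epochs(data, events)
print(epochs.shape)  # (2, 8, 4000)
```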