104 participants (54 female and 50 male) walked on a treadmill. Gait data comprising 25 joint trajectories were recorded using a single Kinect V2 depth sensor placed in frontal view. The speed of the motorized treadmill was gradually increased from 0 m/s to 1.2 m/s; each recording starts once the 1.2 m/s speed is reached. After approximately 30 seconds of continuous walking, the recording stops and the treadmill slows down.

Instructions: 

The .zip file contains:

1) A .xlsx file containing important information about each participant (age, sex, height, and weight)

2) A .jpg file containing an overview of the workspace.

3) 104 walking recordings (54 female and 50 male).

Each recording is identified by the code "K3" followed by the participant reference number. Each recording is a time series organized as follows:

0.Time 

1.Shoulder Right (x,y,z)

2.Elbow Right (x,y,z)

3.Wrist Right (x,y,z)

4.Hand Right (x,y,z)

5.Hand Tip Right (x,y,z)

6.Thumb Right (x,y,z)

7.Hip Right (x,y,z)

8.Knee Right (x,y,z)

9.Ankle Right (x,y,z)

10.Foot Right (x,y,z)

11.Shoulder Left (x,y,z)

12.Elbow Left (x,y,z)

13.Wrist Left (x,y,z)

14.Hand Left (x,y,z)

15.Hand Tip Left (x,y,z)

16.Thumb Left (x,y,z)

17.Hip Left (x,y,z)

18.Knee Left (x,y,z)

19.Ankle Left (x,y,z)

20.Foot Left (x,y,z)

21.Head (x,y,z)

22.Neck (x,y,z)

23.Spine Shoulder (x,y,z)

24.Spine Mid (x,y,z)

25.Spine Base (x,y,z)
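Assuming each recording is stored as a delimited numeric table with one row per frame — column 0 holding the timestamp and the remaining 75 columns holding the 25 joints × (x, y, z) in the order listed above — a single joint trajectory could be sliced out as in the sketch below. The file layout and function name are assumptions, not part of the dataset documentation:

```python
import numpy as np

# Assumed column layout: [time, joint_1_xyz, joint_2_xyz, ..., joint_25_xyz]
# in the order given in the listing above (76 columns per frame).
JOINTS = ["ShoulderRight", "ElbowRight", "WristRight", "HandRight",
          "HandTipRight", "ThumbRight", "HipRight", "KneeRight",
          "AnkleRight", "FootRight", "ShoulderLeft", "ElbowLeft",
          "WristLeft", "HandLeft", "HandTipLeft", "ThumbLeft",
          "HipLeft", "KneeLeft", "AnkleLeft", "FootLeft",
          "Head", "Neck", "SpineShoulder", "SpineMid", "SpineBase"]

def joint_trajectory(data, joint):
    """Return (time, xyz) for one joint from an (n_frames, 76) array."""
    k = JOINTS.index(joint)
    return data[:, 0], data[:, 1 + 3 * k : 4 + 3 * k]
```

A recording could then be loaded with e.g. `np.loadtxt` and passed to `joint_trajectory(data, "AnkleLeft")` to obtain one joint's 3-D path over time.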


This dataset contains the trained model that accompanies the publication of the same name:

Anup Tuladhar*, Serena Schimert*, Deepthi Rajashekar, Helge C. Kniep, Jens Fiehler, Nils D. Forkert, "Automatic Segmentation of Stroke Lesions in Non-Contrast Computed Tomography Datasets With Convolutional Neural Networks," IEEE Access, vol. 8, pp. 94871-94879, 2020, doi: 10.1109/ACCESS.2020.2995632. (*: co-first authors)

 

Instructions: 

The dataset contains 3 parts:

  • Pre-processing: a script to extract the brain volume from the surrounding skull in non-contrast computed tomography (NCCT) scans, plus instructions for further pre-processing.
  • A trained convolutional neural network (CNN) to perform automated segmentations.
  • A post-processing script to improve the CNN-based segmentations.

 

Independent instructions for each part are also included within each folder.


The nucleus and micronucleus images in this dataset were collected manually from Google Images. Most are RGB color images; a few are grayscale. The dataset includes 148 nucleus images and 158 micronucleus images. The images were manually curated, cropped, and labeled into these two classes by domain experts in biology. The images vary in size and resolution, and the sizes and shapes of the nuclei and micronuclei differ from one image to another. Each image may contain one or more nuclei or micronuclei.


This dataset was collected in the Patient Recovery Center (a 24-hour, 7-day nurse-staffed facility) with medical consultants from the Mobile Healthcare Service of Hamad Medical Corporation.


The dataset comprises up to two weeks of activity data taken from the ankle and foot of 14 people without amputation and 17 people with lower limb amputation. Walking speed, cadence, and lengths of strides taken at and away from home were considered in this study. Data were collected from two wearable sensors: one inertial measurement unit (IMU) placed on top of the prosthetic or non-dominant foot, and one accelerometer placed on the same ankle. Location information was derived from GPS and labeled as 'home', 'away', or 'unknown'. The dataset contains raw accelerometer data.

Instructions: 

This dataset comprises 31 MATLAB .mat files. Each .mat file contains all sensor data for one participant. Files for participants with lower limb amputation (n = 17) are named 'S##.mat', and files for control participants (n = 14) are named 'C##.mat'.
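Given that naming convention, the cohort and participant number can be recovered from a file name before loading it (e.g. with `scipy.io.loadmat`). A minimal sketch, assuming the 'S##'/'C##' pattern described above; the function name is an assumption:

```python
import re

def participant_group(fname):
    """Map a file name like 'S07.mat' or 'C12.mat' to (cohort, participant number)."""
    m = re.fullmatch(r"([SC])(\d{2})\.mat", fname)
    if not m:
        raise ValueError(f"unexpected file name: {fname}")
    cohort = "amputation" if m.group(1) == "S" else "control"
    return cohort, int(m.group(2))
```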


The dataset contains psychological and physiological measures acquired from a sample of 220 adult subjects (76 female, 144 male) undergoing dental treatment.


EEG signals from various subjects are provided as text files. They can be useful for testing various EEG signal-processing algorithms: filtering, linear prediction, abnormality detection, PCA, ICA, etc.
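As a minimal illustration of the kind of processing these text files support, the sketch below applies a simple moving-average FIR smoother to a signal loaded with numpy. The file name and function name are hypothetical; any of the listed algorithms could be substituted:

```python
import numpy as np

def moving_average(x, w):
    """w-point moving-average smoothing that preserves signal length via edge padding."""
    x = np.asarray(x, dtype=float)
    kernel = np.ones(w) / w
    padded = np.pad(x, (w // 2, w - 1 - w // 2), mode="edge")
    return np.convolve(padded, kernel, mode="valid")

# eeg = np.loadtxt("subject1.txt")        # hypothetical file name
# smoothed = moving_average(eeg, 5)
```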


A custom-made multispectral camera was used to collect a novel dataset of images of untreated lettuce leaves and leaves treated with vinegar, oil, or a combination of the two. The camera captured image data at 10 wavelengths in the [380 nm, 980 nm] range, covering the visible and NIR (near-infrared) regions of the electromagnetic spectrum. Imaging was done in a lab environment in the presence of ambient light.



The dataset consists of two populations of fetuses: 160 healthy and 102 with Late Intra Uterine Growth Restriction (IUGR). Late IUGR is an adverse pathological condition encompassing chronic hypoxia as a consequence of placental insufficiency, resulting in an abnormal rate of fetal growth. In standard clinical practice, Late IUGR can only be suspected in the third trimester and ultimately confirmed at birth. This data collection comprises a set of 31 Fetal Heart Rate (FHR) indices computed at different time scales and in different domains, accompanied by the clinical diagnosis.

Instructions: 

The data for healthy and Late IUGR populations are included in a single .xlsx file.

Participants are listed by rows and features by columns. Below is an exhaustive list of the features contained in the dataset, accompanied by their units, the time interval employed for their computation, and references to the scientific literature:

 Fetal and Maternal Domains

  • Clinical Diagnosis [HEALTHY/LATE IUGR]: binary variable to report the clinical diagnosis of the participant
  • Gestational Age [days]: gestational age at the time of CTG examination
  • Maternal Age [years]: maternal age at the time of CTG examination
  • Sex [Male (1)/Female (2)]: fetal sex

 

Morphological and Time Domains

  • Mean FHR [bpm] – 1-min epoch: the mean of FHR excluding accelerations and decelerations 
  • Std FHR [bpm] – 1-min epoch: the standard deviation of FHR excluding accelerations and decelerations 
  • DELTA [ms] – 1-min epoch: defined in accordance with [1], [2] excluding accelerations and decelerations 
  • II [] – 1-min epoch: defined in accordance with [1], [2] excluding accelerations and decelerations 
  • STV [ms] – 1-min epoch: defined in accordance with [1], [2] excluding accelerations and decelerations 
  • LTI [ms] – 3-min epoch: defined in accordance with [1], [2] excluding accelerations and decelerations 
  • ACC_L [#] – entire recording: the count of large accelerations defined in accordance with [3], [4] 
  • ACC_S [#] – entire recording: the count of small accelerations defined in accordance with [3], [4] 
  • CONTR [#] – entire recording: the count of contractions defined in accordance with [3], [4] 

 

Frequency Domain 

  • LF [ms²/Hz] – 3-min epoch: defined in accordance with [2], LF band is defined in the range [0.03 - 0.15] Hz 
  • MF [ms²/Hz] – 3-min epoch: defined in accordance with [2], MF band is defined in the range [0.15 - 0.5] Hz 
  • HF [ms²/Hz] – 3-min epoch: defined in accordance with [2], HF band is defined in the range [0.5 - 1] Hz 
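The band definitions above can be illustrated with a plain one-sided periodogram integrated over each band. The dataset's own spectral indices were computed per [2]; the sketch below is only an illustrative estimator, and the function name is an assumption:

```python
import numpy as np

def band_power(x, fs, band):
    """Integrate the one-sided periodogram of x (sampled at fs Hz) over [band[0], band[1]) Hz."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()  # remove DC before estimating the spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = (np.abs(np.fft.rfft(x)) ** 2) / (fs * len(x))  # periodogram estimate
    mask = (freqs >= band[0]) & (freqs < band[1])
    df = freqs[1] - freqs[0]
    return psd[mask].sum() * df  # rectangle-rule integration over the band
```

For example, a slow oscillation near 0.1 Hz would land almost entirely in the LF band [0.03, 0.15] Hz and contribute little to MF or HF.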

 

Complexity Domain 

  • ApEn [bits] – 3-min epoch: defined in accordance with [5], m = 1, r = 0.1*standard deviation of the considered epoch 
  • SampEn [bits] – 3-min epoch: defined in accordance with [6], m = 1, r = 0.1*standard deviation of the considered epoch 
  • LCZ_BIN_0 [bits] – 3-min epoch: defined in accordance with [7], binary coding and p = 0 
  • LCZ_TER_0 [bits] – 3-min epoch: defined in accordance with [7], tertiary coding and p = 0 
  • AC/DC/DR [bpm] – entire recording: defined in accordance with [8]–[10], considering different combinations of parameters T and s; L is constant and equal to 100 samples. E.g., AC_T1_s2 is the acceleration capacity computed with parameters T = 1 and s = 2
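The entropy parameters quoted above (m = 1, r = 0.1 × the standard deviation of the epoch) can be made concrete with a sample-entropy sketch after [6]. This is an illustrative O(n²) implementation under those assumptions, not the dataset's exact code:

```python
import numpy as np

def sample_entropy(x, m=1, r=None):
    """Sample entropy with Chebyshev distance; r defaults to 0.1 * std of the epoch."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.1 * np.std(x)
    n = len(x)

    def count_matches(k):
        # Count template pairs of length k within tolerance r (self-matches excluded)
        templates = np.array([x[i:i + k] for i in range(n - k)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    b = count_matches(m)       # matches of length m
    a = count_matches(m + 1)   # matches of length m + 1
    return -np.log(a / b)      # diverges when no length-(m+1) matches exist
```

A regular signal (e.g. a sinusoid) yields a lower value than white noise of the same length, which is the behaviour these complexity indices exploit.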

 

References

[1]       D. Arduini, G. Rizzo, A. Piana, P. Bonalumi, P. Brambilla, and C. Romanini, “Computerized analysis of fetal heart rate—Part I: description of the system (2CTG),” J Matern Fetal Invest, vol. 3, pp. 159–164, 1993.

[2]       M. G. Signorini, G. Magenes, S. Cerutti, and D. Arduini, “Linear and nonlinear parameters for the analysis of fetal heart rate signal from cardiotocographic recordings,” IEEE Trans. Biomed. Eng., vol. 50, no. 3, pp. 365–374, 2003.

[3]       FIGO, “Guidelines for the Use of Fetal Monitoring,” Int. J. Gynecol. Obstet., vol. 25, pp. 159–167, 1986.

[4]       R. Rabinowitz, E. Persitz, and E. Sadovsky, “The relation between fetal heart rate accelerations and fetal movements.,” Obstet. Gynecol., vol. 61, no. 1, pp. 16–18, 1983.

[5]       S. M. Pincus and R. R. Viscarello, “Approximate entropy: a regularity measure for fetal heart rate analysis.,” Obstet. Gynecol., vol. 79, no. 2, pp. 249–55, 1992.

[6]       D. E. Lake, J. S. Richman, M. P. Griffin, and J. R. Moorman, “Sample entropy analysis of neonatal heart rate variability,” Am. J. Physiol. - Regul. Integr. Comp. Physiol., vol. 283, no. 3, pp. R789–R797, 2002.

[7]       A. Lempel and J. Ziv, “On the complexity of finite sequences,” IEEE Trans. Inf. Theory, vol. 22, no. 1, pp. 75–81, 1976.

[8]       A. Bauer et al., “Phase-rectified signal averaging detects quasi-periodicities in non-stationary data,” Phys. A Stat. Mech. its Appl., vol. 364, pp. 423–434, 2006.

[9]       A. Fanelli, G. Magenes, M. Campanile, and M. G. Signorini, “Quantitative assessment of fetal well-being through ctg recordings: A new parameter based on phase-rectified signal average,” IEEE J. Biomed. Heal. Informatics, vol. 17, no. 5, pp. 959–966, 2013.

[10]     M. W. Rivolta, T. Stampalija, M. G. Frasch, and R. Sassi, “Theoretical Value of Deceleration Capacity Points to Deceleration Reserve of Fetal Heart Rate,” IEEE Trans. Biomed. Eng., pp. 1–10, 2019.


Pressing workload demands, along with social media interaction, lead to diminished alertness during work hours. Researchers have attempted to measure alertness levels from various cues such as EEG, EOG, and video-based eye movement analysis. Among these, video-based eyelid and iris motion tracking has gained much attention in recent years. However, most of these implementations are tested on video data of subjects without spectacles; such videos do not pose a challenge for eye detection and tracking.

