Wearable Sensing
PPE Usage Dataset
This repository provides the Personal Protective Equipment (PPE) Usage Dataset, designed for training deep neural networks (DNNs). The dataset was collected using the EFR32MG24 microcontroller and the ICM-20689 inertial measurement unit, which features a 3-axis gyroscope and a 3-axis accelerometer.
The dataset includes data for four types of PPE: helmet, shirt, pants, and boots, categorized into three activity classes: carrying, still, and wearing.
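As a hedged illustration of how such 6-axis IMU recordings are commonly prepared for DNN training, the sketch below slices a recording into fixed-length windows with integer activity labels. The CSV layout, column names, window length, and stride are assumptions for illustration, not part of the dataset specification.

```python
# Minimal sketch of windowing 6-axis IMU recordings for DNN training.
# File layout, column names, and window parameters are assumptions.
import numpy as np
import pandas as pd

ACTIVITIES = ["carrying", "still", "wearing"]      # the three activity classes
IMU_COLS = ["ax", "ay", "az", "gx", "gy", "gz"]    # hypothetical column names

def window_recording(csv_path, label, win_len=128, stride=64):
    """Slice one recording into fixed-length windows with integer labels."""
    data = pd.read_csv(csv_path)[IMU_COLS].to_numpy(dtype=np.float32)
    windows, labels = [], []
    for start in range(0, len(data) - win_len + 1, stride):
        windows.append(data[start:start + win_len])
        labels.append(ACTIVITIES.index(label))
    return np.stack(windows), np.array(labels)

# Example (hypothetical filename): X has shape (n_windows, 128, 6),
# ready for a 1-D CNN or LSTM.
# X, y = window_recording("helmet_carrying_01.csv", "carrying")
```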
Two publicly available datasets, the PASS and EmpaticaE4Stress databases, were utilised in this study. They were chosen because they both used the same Empatica E4 device, which allowed the acquisition of a variety of signals, including PPG and EDA. The dataset consists of 1587 30-second PPG segments. Each segment has been filtered and normalized using a 0.9–5 Hz band-pass filter and a min-max normalization scheme.
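The described preprocessing can be reproduced roughly as follows. This is a minimal sketch: the 64 Hz sampling rate (typical of the Empatica E4 BVP channel), the filter order, and the function name are assumptions.

```python
# Sketch of the described preprocessing: 0.9-5 Hz band-pass followed by
# min-max normalization of a 30-second PPG segment.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 64.0  # assumed Empatica E4 PPG sampling rate, Hz

def preprocess_segment(ppg, fs=FS, low=0.9, high=5.0, order=4):
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ppg)
    return (filtered - filtered.min()) / (filtered.max() - filtered.min())

# A 30-second segment at 64 Hz has 1920 samples:
# segment = preprocess_segment(raw_ppg)   # raw_ppg: shape (1920,)
```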
Prolonged sitting in a single position adversely affects the spine and leads to chronic issues that require extended therapy for recovery. The principal motivation of this study is to ensure good posture, i.e., a posture in which the body is positioned correctly and supported by the appropriate level of muscle tension. It draws attention to the need for good sitting posture and its role in health. Commercially available wearable sensors offer several advantages, as they can be embedded in clothing.
This dataset contains inertial sensor and optical motion capture data from a trial of 20 healthy adult participants performing various upper limb movements. Each subject had an IMU and a cluster of reflective markers attached to their sternum, right upper arm, and right forearm (as in the attached image), and IMU and marker data were recorded simultaneously. This trial was carried out with the intention of investigating alternative sensor-to-segment calibration methods, but may be useful for other areas of inertial sensor research. CAD files for the 3D-printed mounts are also included.
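For context, the sketch below shows one common sensor-to-segment calibration step, a static gravity alignment from an assumed upright still pose. It is only an illustrative baseline, not one of the specific methods investigated in this trial, and the pose and variable names are assumptions.

```python
# Illustrative static "gravity alignment" calibration step (not the trial's method).
import numpy as np
from scipy.spatial.transform import Rotation as R

def static_gravity_alignment(accel_static):
    """accel_static: (N, 3) accelerometer samples from a still, upright pose.
    Returns a rotation mapping the IMU frame onto a segment frame whose
    z-axis points vertically (against gravity)."""
    g_sensor = accel_static.mean(axis=0)
    g_sensor /= np.linalg.norm(g_sensor)
    # Rotation that maps the measured gravity direction onto segment +z.
    rot, _ = R.align_vectors([[0.0, 0.0, 1.0]], [g_sensor])
    return rot

# segment_rot = static_gravity_alignment(accel_samples)
# accel_in_segment_frame = segment_rot.apply(accel_samples)
```

A single static pose only fixes the inclination of the sensor frame; the methods compared in the trial would typically add a functional movement or a second pose to resolve the remaining heading ambiguity.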
This dataset contains electrocardiography, electromyography, accelerometer, gyroscope and magnetometer signals that were measured in different scenarios using wearable equipment on 13 subjects:
- Weight movement in a horizontal position at an angle of approximately 45°.
- Vertical movement of the weights from the table to the floor and back.
- Moving the weights vertically from the table to the head and back.
- Rotational movement of the wrist while holding the weights with the arm extended.
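A minimal sketch of summary features that could help separate these movement scenarios is shown below; the column names, units, and file format are assumptions about how the recordings are stored.

```python
# Per-recording summary features from the inertial channels (illustrative only).
import numpy as np
import pandas as pd

def inertial_summary(df):
    """df: one recording with accelerometer (ax, ay, az) and gyroscope (gx, gy, gz) columns."""
    acc_mag = np.linalg.norm(df[["ax", "ay", "az"]].to_numpy(), axis=1)
    gyro_mag = np.linalg.norm(df[["gx", "gy", "gz"]].to_numpy(), axis=1)
    return {
        "acc_mag_mean": acc_mag.mean(),    # overall movement intensity
        "acc_mag_std": acc_mag.std(),      # variability (vertical lifts vs. slow motion)
        "gyro_mag_mean": gyro_mag.mean(),  # wrist-rotation scenario shows higher values
    }

# features = inertial_summary(pd.read_csv("subject01_scenario4.csv"))  # hypothetical file
```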
The EmoReIQ (Emotion Recognition for Iraqi Autism Individuals) dataset is a specialized EEG dataset designed to capture emotional responses in individuals with Autism Spectrum Disorder (ASD) and in typically developed (TD) individuals. It focuses on five core emotions: calm, happy, anger, fear, and sad. The dataset is gathered through an experimental setup using video stimuli to elicit these emotions and records the corresponding EEG signals from participants.
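As one illustrative processing option (not necessarily the authors' pipeline), the sketch below extracts classic EEG band-power features per channel; the 256 Hz sampling rate, channel layout, and band edges are assumptions.

```python
# Hedged sketch: EEG band-power features for emotion classification.
import numpy as np
from scipy.signal import welch

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg, fs=256.0):
    """eeg: (n_channels, n_samples). Returns (n_channels, n_bands) band powers."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs), axis=-1)
    feats = []
    for low, high in BANDS.values():
        mask = (freqs >= low) & (freqs < high)
        feats.append(np.trapz(psd[:, mask], freqs[mask], axis=-1))
    return np.stack(feats, axis=-1)

# X = band_powers(trial_eeg)  # features for a calm/happy/anger/fear/sad classifier
```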
The Human Activity Recognition (HAR) dataset comprises comprehensive data collected from various human activities including walking, running, sitting, standing, and jumping. The dataset is designed to facilitate research in the field of activity recognition using machine learning and deep learning techniques. Each activity is captured through multiple sensors providing detailed temporal and spatial data points, enabling robust analysis and model training.
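A hedged sketch of turning such windowed sensor data into simple statistical features for baseline classifiers is given below; the window shape and feature choices are assumptions, not part of the dataset description.

```python
# Per-window statistical features for classical ML baselines (illustrative).
import numpy as np

def window_features(window):
    """window: (n_samples, n_channels) -> flat feature vector."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        window.min(axis=0),
        window.max(axis=0),
    ])

# X = np.stack([window_features(w) for w in windows])
# Pair X with labels (walking, running, sitting, standing, jumping) for any classifier.
```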
Wild-SHARD presents a novel Human Activity Recognition (HAR) dataset collected in an uncontrolled, real-world (wild) environment to address a limitation of existing datasets, which often lack non-simulated data. The dataset comprises time series of Activities of Daily Living (ADLs) captured using multiple smartphone models, including the Samsung Galaxy F62, Samsung Galaxy A30s, Poco X2, OnePlus 9 Pro, and others. The variety of sensor manufacturers across these devices increases data variability and robustness.
The dataset consists of 4-channel EOG data recorded in two settings. The first part was recorded from 21 people using a driving simulator (1976 samples); the second was recorded from 30 people in real-road conditions (390 samples).
All signals were acquired with JINS MEME ES_R smart glasses equipped with a 3-point EOG sensor; the sampling frequency is 200 Hz.
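As an illustrative use of such signals, the sketch below detects blink peaks on a vertical EOG channel, a common step in drowsiness-related driving studies. The channel selection, filter band, and amplitude threshold (in the signal's native units) are assumptions.

```python
# Hedged sketch of a simple blink detector on a vertical EOG channel at 200 Hz.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 200.0  # sampling frequency stated above

def detect_blinks(eog_v, fs=FS, threshold=100.0):
    """eog_v: 1-D vertical EOG signal. Returns sample indices of blink peaks."""
    b, a = butter(4, [0.5 / (fs / 2), 20.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eog_v)
    peaks, _ = find_peaks(filtered, height=threshold, distance=int(0.3 * fs))
    return peaks

# blink_idx = detect_blinks(signal)   # e.g. for drowsiness analysis in the driving data
```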