Wearable Sensing

This dataset provides measurements of cerebral blood flow using Radio Frequency (RF) sensors operating in the Ultra-Wideband (UWB) frequency range, enabling non-invasive monitoring of cerebral hemodynamics. It includes blood flow feature data from two arterial networks, Arterial Network A and Arterial Network B. Statistical features were manually extracted from the RF sensor data, while autonomous feature extraction was performed using a Stacked Autoencoder (SAE) with architectures such as 32-16-32, 64-32-16-32-64, and 128-64-32-16-32-64-128.
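The listed SAE architectures are symmetric encoder/decoder stacks. As a minimal sketch of the shallowest topology mentioned (32-16-32), the snippet below builds a dense stack in plain numpy; the layer widths come from the description above, while the activations, initialisation, and random inputs are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(n_in, n_out):
    # Xavier-style initialisation for one dense layer (illustrative choice)
    w = rng.normal(0.0, np.sqrt(2.0 / (n_in + n_out)), (n_in, n_out))
    b = np.zeros(n_out)
    return w, b

def forward(x, params):
    # tanh activation at every layer (an assumption, not from the dataset)
    for w, b in params:
        x = np.tanh(x @ w + b)
    return x

dims = [32, 16, 32]  # the 32-16-32 SAE: encoder 32->16, decoder 16->32
params = [layer(dims[i], dims[i + 1]) for i in range(len(dims) - 1)]

x = rng.normal(size=(5, 32))   # 5 synthetic feature vectors of width 32
x_hat = forward(x, params)
print(x_hat.shape)             # reconstruction has the same width as the input
```

The deeper variants (64-32-16-32-64, 128-64-32-16-32-64-128) follow by changing `dims`; in each case the middle width (16) is the bottleneck from which the autonomous features would be read.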

TOWalk: A Multi-Modal Dataset for Real-World Movement Analysis

The TOWalk Dataset has been developed to support research on gait analysis, with a focus on leveraging data from head-worn sensors combined with other wearable devices. This dataset provides an extensive collection of movement data captured in both controlled laboratory settings and natural, unsupervised real-world conditions in Turin (Italy).

Objective: This study evaluates the feasibility of a noninvasive system for monitoring diaphragmatic efficiency in people with cervical spinal cord injury (CSCI). Methods: Two versions of a portable hardware system were developed using impedance pneumography (IP) to measure tidal volume (TV) and surface electromyography (sEMG) to assess diaphragm electrical activity (EAdi). Version 1 was used to determine optimal electrode positions, while Version 2 integrated these sensor systems into a compact, portable design.

This dataset contains the raw data acquired by a wearable, graphene-PbS quantum dot photodetector platform for smart contact lenses under indoor light illumination by a Philips Hue GU10 bulb at colour temperature settings of 2200K, 4000K and 6500K and illuminance of 100 lux. In addition, it contains S11 magnitude data acquired on a PocketVNA from near-field communication coils designed for wireless power and data transmission for this contact lens platform.
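VNAs such as the PocketVNA report S11 as a complex reflection coefficient, while coil-matching results are usually discussed in dB. A minimal, hypothetical helper for that conversion (the sample value is illustrative, not from the dataset):

```python
import math

def s11_db(s11: complex) -> float:
    # magnitude of the complex reflection coefficient, expressed in dB
    return 20.0 * math.log10(abs(s11))

# A reflection coefficient of magnitude 0.1 corresponds to -20 dB,
# i.e. 1% of the incident power is reflected by the coil.
print(round(s11_db(0.1 + 0j), 1))
```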

This dataset accompanies the paper “Evaluating Cross-Device and Cross-Subject Consistency in Visual Fixation Prediction”. We collected eye gaze data using a 30 Hz eye tracker embedded in the Aria Glasses (Meta Platforms, Inc., Menlo Park, CA, USA) on 300 images from the MIT1003 dataset, with each image viewed for 3 seconds by 9 subjects (age range 23-39 years), resulting in a total of 243,000 eye fixations. In addition, we release the average saliency maps derived from the subjects' visual fixations.
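As a sketch of how an average saliency map is commonly derived from raw fixations (the standard approach, not necessarily the paper's exact pipeline): accumulate fixation counts per pixel across subjects and normalise. The image size and fixation coordinates below are made up, and real pipelines usually also apply a Gaussian blur, omitted here for brevity.

```python
import numpy as np

H, W = 48, 64                               # hypothetical map resolution
fixations = [(10, 20), (10, 20), (30, 40)]  # (row, col) pooled over subjects

sal = np.zeros((H, W))
for r, c in fixations:
    sal[r, c] += 1.0     # count fixations landing on each pixel
sal /= sal.max()         # normalise so the most-fixated pixel equals 1.0

print(sal[10, 20], sal[30, 40])  # twice-fixated vs once-fixated pixel
```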

The Clarkson University Affective Data Set (CUADS) is a multi-modal affective dataset designed to assist in machine learning model development for automated emotion recognition. CUADS provides electrocardiogram (ECG), photoplethysmogram (PPG), and galvanic skin response (GSR) data from 38 participants, captured under controlled conditions using Shimmer3 ECG and GSR sensors. ECG, GSR, and PPG signals were recorded while each participant viewed and rated 20 affective movie clips. CUADS also provides Big Five personality traits for each participant.

This dataset contains human motion data collected using inertial measurement units (IMUs), including accelerometer and gyroscope readings, from participants performing specific activities. The data was gathered under controlled conditions with verbal informed consent and includes diverse motion patterns that can be used for research in human activity recognition, wearable sensor applications, and machine learning algorithm development. Each sample is labeled and processed to ensure consistency, with raw and augmented data available for use. 
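A common first step in human activity recognition with IMU streams like these is to slice each axis into fixed-length windows and compute simple statistical features per window. The sketch below assumes a single accelerometer axis and an arbitrary window length; both the synthetic signal and the feature choice are illustrative, not taken from the dataset.

```python
import numpy as np

def window_features(signal: np.ndarray, win: int) -> np.ndarray:
    # drop the tail that does not fill a whole window, then reshape
    n = len(signal) // win
    windows = signal[: n * win].reshape(n, win)
    # mean and standard deviation per window: a minimal HAR feature set
    return np.column_stack([windows.mean(axis=1), windows.std(axis=1)])

acc_x = np.sin(np.linspace(0, 8 * np.pi, 400))  # stand-in accelerometer axis
feats = window_features(acc_x, win=100)
print(feats.shape)  # one (mean, std) pair per 100-sample window
```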

As technology advances, small, body-worn physiological sensors are steadily entering daily life and promise to greatly enhance its quality. To enrich the emotional physiological signals captured by portable wearable devices, we used the 14-channel portable EEG acquisition device Emotiv EPOC X, with emotional video clips as the stimulus source, to collect two sets of emotional EEG signals, named EmoX1 and EmoX2, from two groups of 10 participants each.

Public datasets based on peripheral physiological signals are currently limited, and there is a lack of emotion recognition (ER) datasets customized for smart classroom scenarios. We therefore collected and constructed the I+ Lab Emotion (ILEmo) dataset, designed specifically for monitoring the emotions of students in the classroom. The raw data of the ILEmo dataset was collected by the I+ Lab at Shandong University using custom multi-modal wristbands and computing suites.

This dataset corresponds to the measurements of two microstrip patch antennas, collected using the facility described in [1]. The measurements contained in the dataset allow a complete characterization of the field radiated by these antennas. These fields can be introduced into enhanced microwave imaging algorithms that account for the field radiated by the transmitting and receiving antennas of the microwave imaging system, such as the modified Delay and Sum algorithm [2] and the modified Phase Shift Migration imaging algorithm [3].
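As a minimal, hypothetical sketch of the Delay-and-Sum idea referenced above, worked in one dimension and in sample units to stay self-contained: each channel's signal is read at the round-trip delay of a candidate pixel, and the reads are summed so that echoes align coherently only at the true scatterer. The geometry and synthetic echoes are made up; the dataset's measured antenna fields would enter a *modified* DAS as per-antenna weights, which this sketch omits.

```python
import numpy as np

antennas = [0, 10, 20]   # antenna positions, in delay-sample units
target = 6               # true scatterer position (synthetic ground truth)

n = np.arange(64)
# one synthetic echo per channel, centred at the round-trip delay 2*|a - target|
signals = [np.exp(-((n - 2 * abs(a - target)) / 4.0) ** 2) for a in antennas]

def das(pixel):
    """Sum every channel at the sample matching the pixel's round-trip delay."""
    return sum(sig[2 * abs(a - pixel)] for sig, a in zip(signals, antennas))

img = [das(p) for p in range(21)]  # image over candidate positions 0..20
print(int(np.argmax(img)))         # the image peaks at the true target
```

A modified DAS of the kind cited in [2] would replace the uniform sum with weights derived from each antenna's measured radiated field, which is exactly what this dataset provides.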
