Wearable Sensing

This dataset consists of sensor data for handwritten digits, i.e., from 0 to 9. The dataset is collected from 20 volunteers using a marker pen equipped with a 9-axis Inertial Measurement Unit (IMU). The objective of this dataset is to support the design of classification algorithms for recognizing handwritten digits in real time.
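As a rough illustration of the real-time recognition task, the sketch below segments a 9-axis IMU stream into fixed windows and classifies each window with a nearest-centroid rule. The window length, feature choice, and centroids are illustrative assumptions, not details from the dataset.

```python
# Hypothetical sketch: real-time digit recognition from a 9-axis IMU stream.
# Window length, features, and centroids are assumptions, not from the dataset.
from statistics import mean
from math import dist

WINDOW = 50  # samples per classification window (assumed)

def features(window):
    """Per-axis mean over the window: one value for each of the 9 IMU axes."""
    return [mean(s[axis] for s in window) for axis in range(9)]

def classify(window, centroids):
    """Nearest-centroid label for one window of 9-axis samples."""
    f = features(window)
    return min(centroids, key=lambda label: dist(f, centroids[label]))

# Toy usage with two made-up digit centroids.
centroids = {"0": [0.0] * 9, "1": [1.0] * 9}
stream = [[0.1] * 9 for _ in range(WINDOW)]  # fake IMU samples near digit "0"
print(classify(stream, centroids))  # -> 0
```

A real pipeline would replace the per-axis means with richer features and a trained classifier, but the window-then-classify loop is the same.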

489 views
  • Machine Learning
  • Last Updated On: 
    Mon, 04/27/2020 - 14:38

The dataset comprises up to two weeks of activity data taken from the ankle and foot of 14 people without amputation and 17 people with lower limb amputation. Walking speed, cadence, and the lengths of strides taken at and away from home were considered in this study. Data were collected from two wearable sensors: an inertial measurement unit (IMU) placed on top of the prosthetic or non-dominant foot, and an accelerometer placed on the same ankle. Location information was derived from GPS and labeled as ‘home’, ‘away’, or ‘unknown’. The dataset contains raw accelerometer data.
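One of the gait metrics mentioned above, cadence, can be derived from detected step timestamps. The sketch below assumes steps have already been detected; the timestamps and the step-detection stage itself are not part of the published description.

```python
# Hedged sketch: cadence (steps per minute) from detected step timestamps.
# Step detection and the example timestamps are assumptions for illustration.
def cadence_spm(step_times_s):
    """Steps per minute over the span of the given step timestamps (seconds)."""
    if len(step_times_s) < 2:
        return 0.0
    duration = step_times_s[-1] - step_times_s[0]
    return (len(step_times_s) - 1) * 60.0 / duration

# Five steps evenly spaced 0.5 s apart -> 2 steps/s -> 120 steps/min.
print(cadence_spm([0.0, 0.5, 1.0, 1.5, 2.0]))  # -> 120.0
```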

    156 views
  • Health
  • Last Updated On: 
    Tue, 04/28/2020 - 16:50

FallAllD is a large open dataset of human falls and activities of daily living simulated by 15 participants. FallAllD consists of 26,420 files collected using three data-loggers worn on the waist, wrist, and neck of the subjects. Motion signals are captured using an accelerometer, gyroscope, magnetometer, and barometer with efficient configurations that suit the potential applications, e.g., fall detection, fall prevention, and human activity recognition.
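A common baseline for the fall-detection application named above is a simple threshold on accelerometer magnitude. The sketch below illustrates that baseline only; the 2.5 g threshold and the sample format are assumptions, not parameters taken from FallAllD.

```python
# Hedged baseline: threshold fall detector on accelerometer magnitude.
# The 2.5 g threshold and (ax, ay, az) sample format are illustrative
# assumptions, not taken from FallAllD.
from math import sqrt

FALL_THRESHOLD_G = 2.5  # assumed impact threshold in g

def magnitude(ax, ay, az):
    return sqrt(ax * ax + ay * ay + az * az)

def detect_fall(samples):
    """Flag a fall if any sample's acceleration magnitude exceeds the threshold."""
    return any(magnitude(*s) > FALL_THRESHOLD_G for s in samples)

walking = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.1)]  # near 1 g: normal activity
impact = [(0.0, 0.0, 1.0), (2.0, 1.5, 2.0)]   # spike above 2.5 g: fall-like
print(detect_fall(walking), detect_fall(impact))  # -> False True
```

Published fall detectors typically combine this kind of impact test with posture and free-fall cues from the other sensors; the threshold alone is only a starting point.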

    537 views
  • Health
  • Last Updated On: 
    Tue, 04/21/2020 - 16:38

Synergistic prostheses enable the coordinated movement of the human-prosthetic arm required by activities of daily living. This is achieved by coupling the motion of the prosthesis to a human command, such as residual limb movement in motion-based interfaces. Previous studies demonstrated that developing human-prosthetic synergies in joint space must account for individual motor behaviour and the intended task, requiring personalisation and task calibration.

    97 views
  • Biomedical and Health Sciences
  • Last Updated On: 
    Wed, 05/06/2020 - 11:13

Ear-EEG recording collects brain signals from electrodes placed in the ear canal. Compared with existing scalp-EEG, ear-EEG is more wearable and more comfortable for the user.

    114 views
  • Biomedical and Health Sciences
  • Last Updated On: 
    Wed, 02/19/2020 - 03:24

This dataset features cooking activities with recipes and gestures labeled. The data were collected using two smartphones (right arm and left hip), two smartwatches (both wrists), and one motion capture system with 29 markers. Four subjects prepared 3 recipes (sandwich, fruit salad, cereal) 5 times each. The subjects followed a script for each recipe but acted as naturally as possible.

    711 views
  • IoT
  • Last Updated On: 
    Wed, 05/06/2020 - 11:13

The CLAS (Cognitive Load, Affect and Stress) dataset was conceived as a freely accessible repository, purposely developed to support research on the automated assessment of certain states of mind and the emotional condition of a person.

    633 views
  • Artificial Intelligence
  • Last Updated On: 
    Sat, 05/30/2020 - 10:31

These CSV files contain wearable sensor data (RIP and IMU) collected from forty subjects during multiple cigarette smoking sessions.

    213 views
  • Wearable Sensing
  • Last Updated On: 
    Tue, 01/21/2020 - 14:54

BS-HMS-Dataset is a dataset of users' brainwave signals and the corresponding hand movement signals from a large number of volunteer participants. The dataset has two parts: (1) a NeuroSky-based dataset (collected over several months in 2016 from 32 volunteer participants), and (2) an Emotiv-based dataset (collected from 27 volunteer participants over several months in 2019).

    613 views
  • Machine Learning
  • Last Updated On: 
    Thu, 12/12/2019 - 13:17

We provide a large benchmark dataset consisting of about 3.5 million keystroke events; 57.1 million data points each for the accelerometer and gyroscope; and 1.7 million data points for swipes. Data were collected between April 2017 and June 2017 after the required IRB approval. Data from 117 participants are shared, each in a session lasting 2 to 2.5 hours, performing multiple activities such as typing (free and fixed text), gait (walking, upstairs, and downstairs), and swiping while using a desktop, phone, and tablet.
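Keystroke events like those described above are typically reduced to timing features such as hold time (press to release) and flight time (release to next press). The sketch below assumes a simple `(key, press_ms, release_ms)` event format, which is an illustration rather than the dataset's actual schema.

```python
# Hedged sketch: keystroke-dynamics timing features from timestamped events.
# The (key, press_ms, release_ms) event format is an assumption, not the
# dataset's actual schema.
def keystroke_features(events):
    """events: list of (key, press_ms, release_ms) tuples, in press order.
    Returns per-key hold times and flight times between consecutive keys."""
    holds = [release - press for _, press, release in events]
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return holds, flights

events = [("h", 0, 80), ("i", 200, 270)]  # milliseconds
holds, flights = keystroke_features(events)
print(holds, flights)  # -> [80, 70] [120]
```

Hold and flight times computed this way are the standard inputs to keystroke-based authentication models, alongside the motion features from the accelerometer and gyroscope streams.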


    5836 views
  • Artificial Intelligence
  • Last Updated On: 
    Thu, 03/05/2020 - 00:59
