inertial sensors

We present a comprehensive dataset developed as part of a study on computing real-time kinematics with a full-body wearable approach using up to 12 IMUs. The dataset includes optical and inertial measurements from 22 subjects performing a diverse set of 9 activities: walking, running, squatting, boxing, yoga, dance, badminton, stair climbing, and seated extremity exercises. It provides ground-truth kinematics, offline predicted kinematics, online predicted kinematics, and IMU-simulated offline predicted kinematics.
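Purely as an illustration, the following minimal Python sketch shows one way a recording from such a release might be loaded and compared against its ground truth. The file name and column names here are hypothetical assumptions, not the dataset's actual layout.

```python
# Minimal sketch (assumptions): load one hypothetical recording and compare
# ground-truth joint angles against the offline-predicted ones.
import numpy as np
import pandas as pd

# Hypothetical file and column names -- the real dataset's layout may differ.
df = pd.read_csv("subject01_walking.csv")

gt = df[["knee_flexion_gt", "hip_flexion_gt"]].to_numpy()              # ground-truth kinematics
pred = df[["knee_flexion_offline", "hip_flexion_offline"]].to_numpy()  # offline predictions

# Root-mean-square error per joint angle, a common way to compare kinematic estimates.
rmse = np.sqrt(np.mean((pred - gt) ** 2, axis=0))
for name, err in zip(["knee_flexion", "hip_flexion"], rmse):
    print(f"{name}: RMSE = {err:.2f} deg")
```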

This dataset was collected at KAIST, Daejeon by ISILAB to study seamless indoor-outdoor detection. The collection device is a Raspberry Pi 4B+ with a touchscreen UI, connected to a Pmod NAV module and a Pmod GPS module. Collection spans roughly three months, which mitigates time-specific bias, and the wiring was swapped during collection to simulate device bias. Dynamic calibration is not applied to the dataset; researchers may choose whether to apply it.
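As an illustrative sketch only (not the collection software described above): a Pmod GPS streams standard NMEA sentences over a UART, so a minimal reader on the Raspberry Pi might look like the following. The serial port path, baud rate, and the satellite-count heuristic are assumptions.

```python
# Minimal sketch (assumptions): read NMEA sentences from a Pmod GPS over UART
# and use the satellite count in GGA sentences as a crude indoor/outdoor cue.
import serial  # pyserial

PORT = "/dev/ttyS0"   # assumed UART device on the Raspberry Pi
BAUD = 9600           # typical NMEA baud rate

with serial.Serial(PORT, BAUD, timeout=1) as gps:
    for _ in range(100):
        line = gps.readline().decode("ascii", errors="replace").strip()
        if line.startswith("$GPGGA"):
            fields = line.split(",")
            sats = int(fields[7]) if fields[7] else 0  # satellites in use
            # Heuristic threshold only -- the dataset's labels come from the study itself.
            print("outdoor-like fix" if sats >= 4 else "indoor-like / no fix", f"({sats} sats)")
```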

This dataset contains sensor data related to dog activity.