ENHANCING IOT APPLICATIONS WITH MACHINE LEARNING: HUMAN UPPER BODY MOVEMENT RECOGNITION USING IMU SENSORS IN IOT-ENABLED SCHOOL BACKPACKS

Citation Author(s):
Sai Praneeth Jasti
Deakin University
Submitted by:
Sai Praneeth Jasti
Last updated:
Mon, 07/08/2024 - 15:58
DOI:
10.21227/qenj-fw73
Data Format:
License:

Abstract 

Human biomechanics remains an active topic of research that requires further technological advancement and data collection across a variety of human body movements. Methodologies are needed to identify daily activities in various scenarios, such as while carrying a school bag. Deakin University has developed an Internet of Things (IoT) enabled smart school bag consisting of motion-analysis sensors that recognize the activities performed while the bag is carried. This paper describes the procedures for gathering human body movement data using an array of 6-Degree-of-Freedom (DoF) Inertial Measurement Unit (IMU) sensors mounted on a student's backpack. Two issues must be addressed before the sensor data can be successfully classified into activities. To overcome the first issue, bias correction, we applied a combination of linear regression analysis and Kalman filtering to each individual sensor. To address the second issue, improving system accuracy, we added a sensor fusion layer that improves redundancy and reliability in detecting the activities. The system was then used to collect data for a defined set of body movements performed while carrying a backpack, and this data was used to train a machine learning model based on a Fine Gaussian Support Vector Machine (SVM), which achieved a training accuracy of 99.7%. The ability to recognize human movement from a backpack will enable future study in a wider range of research fields, such as informing children and parents about the discomfort of carrying heavy bags and potential long-term musculoskeletal problems.
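The abstract does not publish the filter equations used for bias correction, so as a rough illustration of the idea, the sketch below runs a minimal scalar Kalman filter over stationary gyroscope samples to estimate a slowly drifting bias. The function name and the noise variances `q` and `r` are assumptions for this example, not values from the dataset.

```python
import numpy as np

def estimate_gyro_bias(readings, q=1e-6, r=1e-2):
    """Scalar Kalman filter tracking a slowly varying gyroscope bias.

    readings: samples taken while the sensor is stationary, so the true
    angular rate is zero and each sample observes only bias + noise.
    q: process noise variance (how fast the bias may drift).
    r: measurement noise variance.
    """
    bias, p = 0.0, 1.0            # initial bias estimate and its variance
    for z in np.asarray(readings, dtype=float):
        p += q                    # predict: bias modeled as near-constant
        k = p / (p + r)           # Kalman gain
        bias += k * (z - bias)    # correct with the new measurement
        p *= (1.0 - k)
    return bias

# Subtract the estimated bias from a simulated raw gyro signal
rng = np.random.default_rng(0)
raw = 0.05 + 0.01 * rng.standard_normal(500)   # true bias: 0.05 rad/s
corrected = raw - estimate_gyro_bias(raw)
```

In the actual system this step would run per sensor, alongside the linear regression analysis mentioned in the abstract.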

Instructions: 

The data files can be opened using the MATLAB load command.
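Outside MATLAB, .mat files can also be read with SciPy. The snippet below writes a stand-in file and reads it back; the variable names ("Fs", "acc") are illustrative only, and the actual names should be checked after downloading the archive.

```python
import os
import tempfile
import numpy as np
from scipy.io import savemat, loadmat

# Create a stand-in .mat file with hypothetical variable names
path = os.path.join(tempfile.mkdtemp(), "example.mat")
savemat(path, {"Fs": 100.0, "acc": np.zeros((5, 3))})

# loadmat returns a dict of variable name -> NumPy array
data = loadmat(path)
fs = data["Fs"].item()      # MATLAB scalars come back as 1x1 arrays
acc = data["acc"]
```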

The data is organized as follows:

  • Fs is the sampling frequency at which the sensor data was collected.
  • The raw data collected from each sensor signal is saved in the respective sensor folder, in a subfolder named RAW.
  • The raw data processed by the bias correction stage is saved in the respective numbered sensor folder, under subfolders named accelerometer and gyroscope.
  • The sensor fusion folder contains the fused accelerometer and gyroscope data produced by the sensor fusion stage.
  • Features extracted in the feature extraction stage using the Continuous Wavelet Transform (CWT) are saved in the Features main folder, with one subfolder per feature name.
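The layout described above can be enumerated programmatically once the archive is extracted. The sketch below mirrors that layout in a temporary directory; the sensor and feature folder names other than RAW and Features are assumptions, so verify them against the downloaded files.

```python
from pathlib import Path
import tempfile

# Build a hypothetical mirror of the dataset folder layout
root = Path(tempfile.mkdtemp())
for sensor in ("sensor1", "sensor2"):
    (root / sensor / "RAW").mkdir(parents=True)        # unprocessed signals
    (root / sensor / "accelerometer").mkdir()          # bias-corrected data
    (root / sensor / "gyroscope").mkdir()
(root / "sensor fusion").mkdir()                       # fused acc + gyro data
(root / "Features" / "cwt_feature").mkdir(parents=True)  # one folder per feature

# Enumerate all RAW subfolders, one per sensor
raw_dirs = sorted(p.relative_to(root).as_posix() for p in root.rglob("RAW"))
```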