Expressive motion with dancers

Citation Author(s):
David St-Onge, École de technologie supérieure
Ulysse Côté-Allard, Université Laval
Submitted by: David St-Onge
Last updated: Tue, 05/17/2022 - 22:17
DOI: 10.21227/H29M1Q

Abstract 

In recent years, researchers have explored human body posture and motion to control robots in more natural ways. These interfaces require the ability to track the user's body movements in three dimensions. Motion capture systems for tracking tend to be costly and intrusive and require a clear line of sight, making them ill-suited for applications that need fast deployment. In this article, we use consumer-grade armbands, which capture orientation information and muscle activity, to interact with a robotic system through a state machine controlled by a body motion classifier. To compensate for the low quality of these sensors' data, and to allow a wider range of dynamic control, our approach relies on machine learning. We train our classifier directly on the user, within minutes, to recognize which physiological state his or her body motion expresses. We demonstrate that, on top of guaranteeing faster field deployment, our algorithm performs better than comparable algorithms, and we detail its configuration and the most significant extracted features. As the use of large groups of robots grows, we postulate that our approach can ease their interaction with humans. We identified the key factors that stimulate engagement using our system with 27 participants, each creating his or her own set of expressive motions to control a swarm of desk robots. The resulting unique dataset is available online together with the classifier and the robot control scripts.
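For illustration only: a common way to turn armband streams (orientation plus muscle activity) into classifier inputs is to compute simple statistics over sliding windows. The sketch below assumes an eight-channel EMG stream and a quaternion orientation stream, with illustrative window sizes and features; it is not the paper's exact feature set.

```python
import numpy as np

def window_features(emg, orientation, window=200, step=100):
    """Extract per-window features from an EMG stream (n_samples x 8
    channels) and an orientation stream (n_samples x 4, quaternion).
    Window/step sizes and the feature choice are illustrative only."""
    feats = []
    for start in range(0, len(emg) - window + 1, step):
        e = emg[start:start + window]
        q = orientation[start:start + window]
        feats.append(np.concatenate([
            np.mean(np.abs(e), axis=0),  # mean absolute value per EMG channel
            np.std(e, axis=0),           # EMG variability per channel
            np.mean(q, axis=0),          # average orientation over the window
            np.std(q, axis=0),           # orientation variability
        ]))
    return np.asarray(feats)
```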

Instructions: 

The list of true labels for each participant is contained in the "true_labels_informations" folder.

The "formated_dataset" folder contains the dataset built by the Python script classifyRF.py.

Datasets for participants 1 to 27 are each contained in their own folder, named accordingly (d1..d27); a minimal loading sketch follows.
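As an illustration, enumerating the per-participant folders might look like the sketch below. The dataset root location and the contents of each dX folder are assumptions here; the authoritative parsing logic lives in classifyRF.py from the repository linked below.

```python
import os

DATASET_ROOT = "."  # assumption: the d1..d27 folders sit at the dataset root

def participant_files(index):
    """Return the sorted list of recording files for participant `index`
    (folders d1..d27)."""
    folder = os.path.join(DATASET_ROOT, f"d{index}")
    return sorted(os.path.join(folder, name) for name in os.listdir(folder))

# Example: enumerate every participant's recordings (participants 1..27).
all_files = {i: participant_files(i) for i in range(1, 28)}
```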

 

A public GitHub repository provides the Python scripts required to parse the dataset and use it with scikit-learn:

https://github.com/MISTLab/MoodsRFC.git
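The script name (classifyRF.py) suggests a random forest classifier. A minimal, assumption-laden sketch of per-participant training with scikit-learn might look like the following; the feature and label arrays are random placeholders standing in for the dataset's actual contents, and the hyperparameters are illustrative, not those of the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder arrays: `features` holds one row of extracted features per
# recorded motion, `labels` the corresponding expressive-state label for a
# single participant (the paper trains the classifier per user).
rng = np.random.default_rng(0)
features = rng.normal(size=(120, 40))  # stand-in for real motion features
labels = rng.integers(0, 4, size=120)  # stand-in for the true labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, features, labels, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```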

 

This dataset and the machine learning algorithms developed and tested with it are detailed in:

St-Onge, David, Ulysse Côté-Allard, Kyrre Glette, Benoit Gosselin, and Giovanni Beltrame. 2019. "Engaging with robotic swarms: commands from expressive motion." ACM Transactions on Human-Robot Interaction (THRI), vol. 8, no. 2.