Expressive motion with dancers

Citation Author(s): David St-Onge, Ulysse Côté-Allard
Affiliations: École de technologie supérieure; Université Laval
Submitted by: David St-Onge
Last updated: Tue, 03/24/2020 - 18:27

In recent years, researchers have explored human body posture and motion to control robots in more natural ways. These interfaces require the ability to track the body movements of the user in three dimensions. Deploying motion capture systems for tracking tends to be costly and intrusive and requires a clear line of sight, making them ill-suited to applications that require fast deployment. In this article, we use consumer-grade armbands, capturing orientation information and muscle activity, to interact with a robotic system through a state machine controlled by a body motion classifier. To compensate for the low quality of the information from these sensors, and to allow a wider range of dynamic control, our approach relies on machine learning. We train our classifier directly on the user to recognize (within minutes) which physiological state his or her body motion expresses. We demonstrate that, in addition to enabling faster field deployment, our algorithm performs better than all comparable algorithms, and we detail its configuration and the most significant features extracted. As the use of large groups of robots is growing, we postulate that their interaction with humans can be eased by our approach. We identified the key factors to stimulate engagement using our system on 27 participants, each creating his or her own set of expressive motions to control a swarm of desk robots. The resulting unique dataset is available online together with the classifier and the robot control scripts.
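The per-user training described above can be sketched with scikit-learn. This is only an illustrative outline, not the published pipeline: the feature set here (mean absolute value, standard deviation, waveform length per channel) is a common choice for EMG/IMU classification, and the window sizes, channel count, class count, and random data are all stand-in assumptions.

```python
# Illustrative sketch: training a motion-state classifier on windowed
# armband signals. Features, window shape, and data are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def extract_features(window):
    """Time-domain features per channel: mean absolute value,
    standard deviation, and waveform length."""
    mav = np.mean(np.abs(window), axis=0)
    std = np.std(window, axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, std, wl])

# Stand-in data: 200 windows of 50 samples x 8 channels, labelled
# with one of 4 hypothetical expressive-motion classes.
windows = rng.normal(size=(200, 50, 8))
labels = rng.integers(0, 4, size=200)

X = np.array([extract_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, labels, cv=5)
print("mean CV accuracy: %.2f" % scores.mean())
```

Because the classifier is fit on a single user's recordings, a short calibration session suffices, which is what allows the minutes-long training mentioned above.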


The list of true labels for each participant is contained within the "true_labels_informations" folder.

The "formated_dataset" folder contains the dataset built by the Python script.

Datasets from participants 1 to 27 are each contained in their own folder, named accordingly (d1..d27).


A public GitHub repository provides the Python script required to parse and use the dataset with scikit-learn:


This dataset and the related Machine Learning algorithms developed and tested with it are detailed in:

St-Onge, David, Côté-Allard, Ulysse, Glette, Kyrre, Gosselin, Benoit, and Beltrame, Giovanni. 2019. "Engaging with robotic swarms: commands from expressive motion". ACM Transactions on Human-Robot Interaction (THRI), vol. 8, no. 2.


David St-Onge, Ulysse Côté-Allard, "Expressive motion with dancers," IEEE Dataport, 2018. doi: 10.21227/H29M1Q.