Expressive motion with dancers

Citation Author(s):
David St-Onge and Ulysse Côté-Allard and Giovanni Beltrame
Submitted by:
David St-Onge
Last updated:
Thu, 11/08/2018 - 10:34
DOI:
10.21227/H29M1Q
Abstract: 

  In recent years, researchers have explored gesture-based interfaces to control robots in non-traditional ways. These interfaces require the ability to track the body movements of the user in 3D. Deploying mo-cap systems for tracking tends to be costly, intrusive, and requires a clear line of sight, making them ill-adapted for applications that need fast deployment, such as artistic performance and emergency response. In this paper, we use consumer-grade armbands, capturing orientation information and muscle activity, to interact with a robotic system through a state machine controlled by a body motion classifier. To compensate for the low quality of the information from these sensors, and to allow a wider range of dynamic control, our approach relies on machine learning. We train our classifier directly on the user to recognize (within minutes) his or her physiological states, or moods. We demonstrate that on top of guaranteeing faster field deployment, our algorithm performs better than a memoryless neural network. We then demonstrate the ease of use and robustness of our system with a study on 27 participants, each creating a hybrid dance performance with a swarm of desk robots from their own body dynamics.
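The per-user training described above can be sketched with a random-forest classifier (suggested by the repository's classifyRF.py) in scikit-learn. The feature layout (orientation plus EMG channels) and the three mood classes below are illustrative assumptions, not the dataset's actual schema:

```python
# Hypothetical sketch: train a per-user mood classifier on armband data.
# The feature count (IMU + 8 EMG channels) and three mood classes are
# assumptions for illustration; see classifyRF.py for the real pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features = 200, 12          # assumed feature dimension
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 3, size=n_samples)   # assumed: three mood labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))             # held-out accuracy
```

Because a random forest trains in seconds on data of this size, it fits the paper's goal of calibrating to a new user within minutes.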

Instructions: 

The list of true labels for each participant is contained within the "true_labels_informations" folder.

The "formated_dataset" folder contains the dataset built by the Python script classifyRF.py.

Datasets from participants 1 to 27 are each contained in their own folder, named accordingly (d1..d27).

 

A public GitHub repository provides the Python script required to parse and use the dataset with scikit-learn:

https://github.com/MISTLab/MoodsRFC.git
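A minimal way to walk the per-participant folders, assuming only the d1..d27 naming stated above (the helper name and any file layout inside each folder are hypothetical; the repository's script documents the actual format):

```python
# Hypothetical helper: yield the participant folders d1..d27 in numeric
# order. Only the folder naming is taken from the dataset description.
from pathlib import Path


def participant_dirs(root):
    """Yield existing participant folders d1..d27 under `root`."""
    for i in range(1, 28):
        d = Path(root) / f"d{i}"
        if d.is_dir():
            yield d


# Example: list whichever participant folders exist under the cwd.
print([d.name for d in participant_dirs(".")])
```

Iterating by index rather than globbing avoids the lexicographic ordering trap (d10 sorting before d2).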

Dataset Files

No Data files have been uploaded.

[1] David St-Onge and Ulysse Côté-Allard and Giovanni Beltrame, "Expressive motion with dancers", IEEE Dataport, 2018. [Online]. Available: http://dx.doi.org/10.21227/H29M1Q. Accessed: Sep. 22, 2019.
@data{h29m1q-18,
doi = {10.21227/H29M1Q},
url = {http://dx.doi.org/10.21227/H29M1Q},
author = {David St-Onge and Ulysse Côté-Allard and Giovanni Beltrame },
publisher = {IEEE Dataport},
title = {Expressive motion with dancers},
year = {2018} }