Datasets

Standard Dataset

MMEAD-VRAG: A Multi-Modal Continuous Emotion Annotation Dataset for VR Action Games

Citation Author(s):
Enyao Chang
Submitted by:
Enyao Chang
DOI:
10.21227/cfkx-jh75

Abstract

We introduce the Multi-Modal Continuous Emotion Annotation Dataset for VR Action Games (MMEAD-VRAG), the first multi-modal time-series dataset to combine physiological and behavioral signals in VR action gaming scenarios. A comparative analysis against existing state-of-the-art datasets shows that MMEAD-VRAG has fewer limitations in data collection methodology, dataset scale, and participant diversity. The implementation of PhyBehavNet and the MMEAD-VRAG dataset are publicly available at https://github.com/EnyaoC/MMEAD-VRAG.

Instructions:

The implementation of PhyBehavNet and the MMEAD-VRAG dataset is publicly available at https://github.com/EnyaoC/MMEAD-VRAG.

Dataset Files

Files have not been uploaded for this dataset