Datasets
Standard Dataset
MMEAD-VRAG: A Multi-Modal Continuous Emotion Annotation Dataset for VR Action Games

- Submitted by:
- Enyao Chang
- Last updated:
- Fri, 04/11/2025 - 10:45
- DOI:
- 10.21227/cfkx-jh75
Abstract
We introduce the Multi-Modal Continuous Emotion Annotation Dataset for VR Action Games (MMEAD-VRAG), the first multi-modal time-series dataset to combine physiological and behavioral signals in VR action gaming scenarios. A comparative analysis against existing state-of-the-art datasets shows that MMEAD-VRAG addresses common limitations in data collection methodology, dataset scale, and participant diversity. The implementation of PhyBehavNet and the MMEAD-VRAG dataset are publicly available at https://github.com/EnyaoC/MMEAD-VRAG.
Instructions:
The implementation of PhyBehavNet and the MMEAD-VRAG dataset are publicly available at https://github.com/EnyaoC/MMEAD-VRAG.
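This page does not describe the repository layout, so the sketch below only illustrates one plausible way to load and time-align a multi-modal dataset like this one with pandas. All file names and column names (the `participant_01_*.csv` files and the `timestamp` column) are assumptions for illustration, not the actual MMEAD-VRAG schema; consult the GitHub repository for the real file structure.

```python
# Hypothetical loading sketch for a multi-modal continuous-annotation dataset.
# File names, column names, and alignment strategy are assumptions, not the
# documented MMEAD-VRAG format.
import pandas as pd

# Assumed per-participant CSVs: physiological signals, behavioral signals,
# and continuous emotion annotations, each carrying a shared timestamp column.
physio = pd.read_csv("participant_01_physiological.csv")   # hypothetical path
behavior = pd.read_csv("participant_01_behavioral.csv")    # hypothetical path
labels = pd.read_csv("participant_01_annotations.csv")     # hypothetical path

# merge_asof requires the join key to be sorted in every frame.
for df in (physio, behavior, labels):
    df.sort_values("timestamp", inplace=True)

# Align the streams: pair each physiological sample with the nearest
# behavioral sample and the nearest continuous emotion annotation.
merged = pd.merge_asof(physio, behavior, on="timestamp", direction="nearest")
merged = pd.merge_asof(merged, labels, on="timestamp", direction="nearest")

print(merged.head())
```

Nearest-timestamp alignment is only one possible design choice; if the streams are recorded at very different sampling rates, resampling each signal to a common rate before merging may be preferable.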