
Furthermore, we introduce the Multi-Modal Continuous Emotion Annotation Dataset for VR Action Games (MMEAD-VRAG), the first multi-modal time-series dataset to incorporate both physiological and behavioral signals in VR action gaming scenarios. A comparative analysis against existing state-of-the-art datasets shows that MMEAD-VRAG addresses their limitations in data collection methodology, dataset scale, and participant diversity. The implementation of PhyBehavNet and the MMEAD-VRAG dataset are publicly available at https://github.com/EnyaoC/MMEAD-VRAG.