Dataset of Multi-Modal Physiological Signals (fNIRS, EEG, ECG, EMG) Recorded Across Different Emotional States

Citation Author(s):
Charles Mcclary, Kennesaw State University
Mohammad Y. M. Naser, Kennesaw State University
Sumayyah Repole, Kennesaw State University
Ben McKinney, Kennesaw State University
Sylvia Bhattacharya, Kennesaw State University
Submitted by:
Mohammad Naser
Last updated:
Fri, 04/25/2025 - 13:54
DOI:
10.21227/tm30-9744
Data Format:
License:

Abstract 

This dataset comprises synchronized multi-modal physiological recordings—functional Near-Infrared Spectroscopy (fNIRS), Electroencephalography (EEG), Electrocardiography (ECG), and Electromyography (EMG)—collected from 16 participants exposed to emotion-eliciting video stimuli. It includes raw signals, event markers, and Python scripts for data import and preprocessing. Special emphasis is placed on fNIRS, which, though less common in affective computing, provides valuable hemodynamic insights that complement electrical signals from EEG, ECG, and EMG. The dataset is structured to facilitate reproducibility and ease of integration across platforms. It aims to support research in emotion recognition, multimodal data fusion, and machine learning applications in emotion-aware and human-centered systems.
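Because the recordings are stimulus-locked via event markers, a typical first preprocessing step is to slice each continuous signal into epochs around the marker onsets. The sketch below illustrates this with synthetic data; the function name, window lengths, and sampling rate are illustrative assumptions, not part of the dataset's bundled scripts.

```python
import numpy as np

def extract_epochs(signal, fs, marker_times, pre=1.0, post=4.0):
    """Slice a 1-D physiological signal into stimulus-locked epochs.

    signal       : 1-D array of samples (e.g. one EEG or fNIRS channel)
    fs           : sampling rate in Hz (illustrative; check the dataset docs)
    marker_times : event-marker onsets in seconds
    pre, post    : window before/after each marker, in seconds
    """
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = []
    for t in marker_times:
        idx = int(t * fs)
        # Keep only epochs that fit entirely within the recording
        if idx - n_pre >= 0 and idx + n_post <= len(signal):
            epochs.append(signal[idx - n_pre:idx + n_post])
    return np.array(epochs)

# Synthetic example: 60 s of noise standing in for one EEG channel at 256 Hz,
# with three hypothetical stimulus onsets
fs = 256
signal = np.random.randn(60 * fs)
markers = [10.0, 25.0, 40.0]
epochs = extract_epochs(signal, fs, markers)
print(epochs.shape)  # (3, 1280): 3 epochs, 5 s each at 256 Hz
```

The same windowing applies to ECG and EMG channels at their own sampling rates; fNIRS typically uses longer post-stimulus windows because hemodynamic responses evolve over several seconds.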

Instructions: 

Refer to the accompanying data description for file organization, channel layouts, and usage of the bundled import and preprocessing scripts.

Dataset Files
