In:
Proceedings of the AAAI Conference on Artificial Intelligence, Association for the Advancement of Artificial Intelligence (AAAI), Vol. 31, No. 1 (2017-02-12)
Abstract:
Multimodality has recently been exploited to overcome the challenges of emotion recognition. In this paper, we present a study of decision-level fusion of electroencephalogram (EEG) features and musical features extracted from musical stimuli for recognizing time-varying binary classes of arousal and valence. Our empirical results demonstrate that the EEG modality suffered from the instability of EEG signals, yet fusing it with the music modality alleviated this issue and enhanced emotion recognition performance.
Type of Medium:
Online Resource
ISSN:
2374-3468, 2159-5399
DOI:
10.1609/aaai.v31i1.11112
Language:
Unknown
Publisher:
Association for the Advancement of Artificial Intelligence (AAAI)
Publication Date:
2017