Multimodal Affective Analysis Using Hierarchical Attention Strategy with Word-Level Alignment

ACL 2018

Jan 28, 2021
Abstract: Multimodal affective computing, learning to recognize and interpret human affect and subjective information from multiple data sources, remains a challenge for two reasons: (i) it is hard to extract informative features representing human affect from heterogeneous inputs; (ii) current fusion strategies only fuse modalities at abstract levels, ignoring time-dependent interactions between modalities. To address these issues, we introduce a hierarchical multimodal architecture with attention and word-level fusion to classify utterance-level sentiment and emotion from text and audio data. Our model outperforms state-of-the-art approaches on published datasets, and we demonstrate that it can visualize and interpret synchronized attention over modalities.

Authors: Yue Gu, Kangning Yang, Shiyu Fu, Shuhong Chen, Xinyu Li, Ivan Marsic (Rutgers University)
