Abstract
Human walking patterns convey a wide range of non-verbal information, including identity and emotion. Recent work using the Spatial-Temporal Graph Convolutional Network (ST-GCN), which exploits the inherent spatial connections between skeletal joints, has shown promising performance for skeleton-based emotion perception from gaits. However, the significance of individual joints may change with the expressed emotion, and existing studies do not take this into account. Efficiently modeling the importance of each node for different emotions is therefore a key challenge in this task. To address this problem, this paper proposes a novel dual-attention module that helps the ST-GCN capture the correlations between nodes. Experimental results on the Emotion-Gait dataset demonstrate that our method outperforms the current state-of-the-art methods. We also visualize the attention-based node weights to better understand each joint's contribution to emotion perception. We observe that the entire gait is light when people are happy; when angry, the whole body moves violently with short strides; and sadness makes it difficult to move forward.
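The abstract does not spell out the internal design of the dual-attention module, so the following is only a minimal sketch of the general idea: attention weights are learned over the joint (spatial) and frame (temporal) axes of ST-GCN feature maps and used to reweight the features. The class name `DualAttention`, the pooling scheme, and the residual connection are illustrative assumptions, not the paper's confirmed architecture.

```python
# Minimal sketch of a dual-attention block over ST-GCN features.
# Input shape (N, C, T, V): batch, channels, frames, skeletal joints.
# The branch design below is an assumption for illustration only.
import torch
import torch.nn as nn

class DualAttention(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Spatial branch: scores each joint (node importance).
        self.spatial_fc = nn.Conv2d(channels, 1, kernel_size=1)
        # Temporal branch: scores each frame.
        self.temporal_fc = nn.Conv2d(channels, 1, kernel_size=1)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, T, V)
        s = x.mean(dim=2, keepdim=True)        # pool over time  -> (N, C, 1, V)
        s = self.sigmoid(self.spatial_fc(s))   # joint weights   -> (N, 1, 1, V)
        t = x.mean(dim=3, keepdim=True)        # pool over joints-> (N, C, T, 1)
        t = self.sigmoid(self.temporal_fc(t))  # frame weights   -> (N, 1, T, 1)
        # Reweight features; the residual keeps the original signal intact.
        return x + x * s * t

# Usage: apply to the output of an ST-GCN layer.
feats = torch.randn(8, 64, 30, 16)  # 8 gaits, 64 channels, 30 frames, 16 joints
out = DualAttention(64)(feats)      # same shape, joints/frames reweighted
print(out.shape)                    # torch.Size([8, 64, 30, 16])
```

The learned spatial weights in `s` correspond to the per-node importance that the paper visualizes to interpret which joints drive the perception of each emotion.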
| Field | Value |
|---|---|
| Translated title of the contribution | A Dual Attention Spatial-Temporal Graph Convolutional Network for Emotion Recognition from Gait |
| Original language | Japanese |
| Pages (from-to) | 309-317 |
| Number of pages | 9 |
| Journal | Journal of the Institute of Image Electronics Engineers of Japan |
| Volume | 51 |
| Issue number | 4 |
| Publication status | Published - 2022 |
All Science Journal Classification (ASJC) codes
- Computer Science (miscellaneous)
- Electrical and Electronic Engineering