Dual Attention 機構を取り入れた時空間グラフ畳み込みネットワークによる歩行からの感情認識

Translated title of the contribution: A Dual Attention Spatial-Temporal Graph Convolutional Network for Emotion Recognition from Gait

Jiaqing Liu, Shoji Kisita, Shurong Chai, Tomoko Tateyama, Yutaro Iwamoto, Yen Wei Chen

Research output: Contribution to journal › Article › peer-review

Abstract

Human walking patterns contain a wide range of non-verbal information, including identity and emotions. Recent work using the Spatial-Temporal Graph Convolutional Network (ST-GCN), which exploits the inherent spatial connections between skeletal joints, has shown promising performance for skeleton-based emotion perception from gait. However, the significance of individual nodes may change with the expressed emotion, which has not been considered in previous studies. Efficiently accounting for the significance of nodes under different emotions is therefore a central issue in this task. To address this problem, this paper proposes a novel dual-attention module that helps ST-GCN capture the correlations between nodes. Experimental results on the Emotion-Gait dataset demonstrate that our method outperforms current state-of-the-art methods. We also visualize the attention-based node weights to better understand the importance of each node in emotion perception. We observe that the entire gait is light when people are happy; when angry, the whole body moves violently with short strides; and sadness makes it difficult to move forward.
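To make the idea of node-level attention on top of ST-GCN concrete, the sketch below shows a minimal PyTorch module that rescales skeleton-joint features with learned per-joint attention weights. It is only an illustration of the general mechanism under assumed conventions: the tensor layout (batch, channels, frames, joints), the layer sizes, and the names NodeAttention and score are assumptions for this example, not taken from the paper, whose actual dual-attention module may differ.

```python
# Hypothetical sketch: per-joint attention over skeleton-graph node features.
# Not the authors' dual-attention module; an illustrative approximation only.
import torch
import torch.nn as nn


class NodeAttention(nn.Module):
    """Computes one attention weight per joint and rescales node features.

    Input x has the usual ST-GCN layout: (batch, channels, frames, joints).
    """

    def __init__(self, channels: int):
        super().__init__()
        # Small scoring MLP mapping a joint descriptor to a scalar score.
        self.score = nn.Sequential(
            nn.Linear(channels, channels // 2),
            nn.ReLU(inplace=True),
            nn.Linear(channels // 2, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Average over time so each joint gets one descriptor per sample.
        joint_desc = x.mean(dim=2).permute(0, 2, 1)          # (batch, joints, channels)
        weights = torch.softmax(self.score(joint_desc), dim=1)  # (batch, joints, 1)
        weights = weights.permute(0, 2, 1).unsqueeze(2)       # (batch, 1, 1, joints)
        return x * weights                                    # emphasize informative joints


if __name__ == "__main__":
    # Toy input: 16 channels, 30 frames, 16 joints (Emotion-Gait-style skeletons).
    x = torch.randn(2, 16, 30, 16)
    att = NodeAttention(channels=16)
    print(att(x).shape)  # torch.Size([2, 16, 30, 16])
```

In such a design the attention weights can be read out directly per joint, which is one simple way to produce the kind of node-importance visualization the abstract describes.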

Original language: Japanese
Pages (from-to): 309-317
Number of pages: 9
Journal: Journal of the Institute of Image Electronics Engineers of Japan
Volume: 51
Issue number: 4
DOIs
Publication status: Published - 2022

All Science Journal Classification (ASJC) codes

  • Computer Science (miscellaneous)
  • Electrical and Electronic Engineering
