Research Article
Two-Person Mutual Action Recognition Using Joint Dynamics and Coordinate Transformation
@INPROCEEDINGS{10.4108/eai.20-11-2021.2314154,
  author={Shian-Yu Chiu and Kun-Ru Wu and Yu-Chee Tseng},
  title={Two-Person Mutual Action Recognition Using Joint Dynamics and Coordinate Transformation},
  proceedings={Proceedings of the 1st International Conference on AI for People: Towards Sustainable AI, CAIP 2021, 20-24 November 2021, Bologna, Italy},
  publisher={EAI},
  proceedings_a={CAIP},
  year={2021},
  month={12},
  keywords={skeleton-based action recognition; mutual interaction recognition; bidirectional lstm; deep learning; human behavior analysis},
  doi={10.4108/eai.20-11-2021.2314154}
}
- Shian-Yu Chiu
- Kun-Ru Wu
- Yu-Chee Tseng
Year: 2021
Venue: CAIP 2021 (EAI)
DOI: 10.4108/eai.20-11-2021.2314154
Abstract
Skeleton-based action recognition has attracted considerable attention in computer vision. Recognizing human mutual interactions relies on extracting discriminative features that capture fine-grained interaction details. In this work, we propose two vectors that encode joint dynamics and spatial interaction information. The proposed model shows strong performance on sequential data. Experimental results demonstrate that our model outperforms state-of-the-art approaches with much lower overhead.
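To make the abstract's idea concrete, below is a minimal sketch of how the two feature vectors might be formed, assuming 3D skeleton sequences of shape (T, J, 3). Here the joint-dynamics vector is taken as frame-to-frame joint displacements, and the spatial-interaction vector as one person's joints expressed in a coordinate frame anchored at a joint of the other person; the exact vector definitions, anchor joint, and normalization used in the paper may differ.

```python
import numpy as np

def joint_dynamics(skeleton):
    """Frame-to-frame joint displacements, shape (T-1, J, 3).

    `skeleton` is assumed to be an array of shape (T, J, 3):
    T frames, J joints, 3D coordinates. This is one common way
    to encode joint dynamics; the paper's encoding may differ.
    """
    return skeleton[1:] - skeleton[:-1]

def relative_coordinates(skeleton_a, skeleton_b, anchor_joint=0):
    """Person B's joints in a coordinate frame anchored at person A.

    A simple coordinate transformation capturing spatial interaction:
    subtract person A's anchor joint (index assumed here) from all of
    person B's joint coordinates at every frame.
    """
    anchor = skeleton_a[:, anchor_joint:anchor_joint + 1, :]  # (T, 1, 3)
    return skeleton_b - anchor                                # (T, J, 3)

if __name__ == "__main__":
    T, J = 32, 25  # e.g. 32 frames, 25 joints (Kinect-style skeleton)
    person_a = np.random.rand(T, J, 3)
    person_b = np.random.rand(T, J, 3)

    dyn_a = joint_dynamics(person_a)                      # (31, 25, 3)
    inter_ab = relative_coordinates(person_a, person_b)   # (32, 25, 3)
    print(dyn_a.shape, inter_ab.shape)
```

Per-frame features of this kind would then be flattened and fed to a sequence model such as the bidirectional LSTM named in the paper's keywords.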