Edge Computing and IoT: Systems, Management and Security. Second EAI International Conference, ICECI 2021, Virtual Event, December 22–23, 2021, Proceedings

Research Article

Wearable-Based Human Emotion Inference System

Cite
@INPROCEEDINGS{10.1007/978-3-031-04231-7_12,
    author={Zirui Zhao and Canlin Zheng},
    title={Wearable-Based Human Emotion Inference System},
    proceedings={Edge Computing and IoT: Systems, Management and Security. Second EAI International Conference, ICECI 2021, Virtual Event, December 22--23, 2021, Proceedings},
    proceedings_a={ICECI},
    year={2022},
    month={5},
    keywords={Wearable devices; Emotion recognition; Multimodal data; Multiple perception},
    doi={10.1007/978-3-031-04231-7_12}
}
Zirui Zhao1,*, Canlin Zheng1
  • 1: College of Computer Science and Software Engineering
*Contact email: 2362534742@qq.com

Abstract

Many of the emotion recognition methods proposed to date have notable shortcomings. For example, some rely on expensive and cumbersome special-purpose hardware, such as EEG/ECG helmets, while others, based on cameras and speech, carry a risk of privacy leakage. With the rapid development and popularization of wearable devices, people increasingly carry multiple smart devices, which creates opportunities for lightweight emotion perception. Motivated by this, we develop a universal, portable system built on multi-source wearable sensing technology.

This paper designs an emotion recognition framework called MW-Emotion (Multi-source Wearable emotion recognition) that features low cost, universality, and portability, using commercial wearable devices to collect multi-source sensing data, and implements it as a working system. Taking four basic emotions as the research object, it implements an emotion recognition method that innovatively exploits deep context. The experimental results show that MW-Emotion achieves a recognition accuracy of 85.1% in person-dependent mode. The framework fuses different types of data through signal processing; this multimodal data fusion technique reduces the energy waste caused by data redundancy and effectively resists interference.
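The abstract does not detail MW-Emotion's fusion pipeline, but the general idea of feature-level multimodal fusion for wearable sensor streams can be sketched as follows. This is a minimal illustration, not the authors' implementation: the modalities (accelerometer and heart rate), sampling rates, window length, and statistical features are all assumptions chosen for the example.

```python
import numpy as np

def window_features(signal, fs, win_s=2.0):
    """Slice a 1-D sensor stream into fixed-length windows and compute
    simple statistical features (mean, std, range) for each window."""
    n = int(fs * win_s)                      # samples per window
    n_win = len(signal) // n
    wins = signal[: n_win * n].reshape(n_win, n)
    return np.stack(
        [wins.mean(axis=1), wins.std(axis=1), np.ptp(wins, axis=1)],
        axis=1,
    )

def fuse(acc, hr, fs_acc, fs_hr, win_s=2.0):
    """Feature-level fusion: extract per-window features from each
    modality at its own sampling rate, then concatenate the feature
    vectors that describe the same time window."""
    fa = window_features(acc, fs_acc, win_s)
    fh = window_features(hr, fs_hr, win_s)
    k = min(len(fa), len(fh))                # align window counts
    return np.hstack([fa[:k], fh[:k]])

# Toy streams: 10 s of 50 Hz accelerometer data and 4 Hz heart rate.
rng = np.random.default_rng(0)
acc = rng.normal(size=500)
hr = 70 + rng.normal(size=40)
X = fuse(acc, hr, fs_acc=50, fs_hr=4)
print(X.shape)  # one fused 6-feature vector per 2 s window -> (5, 6)
```

Fusing at the feature level (rather than transmitting every raw sample) is one common way such systems cut the data redundancy the abstract mentions: each window is reduced to a few statistics before classification.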

Keywords
Wearable devices; Emotion recognition; Multimodal data; Multiple perception
Published
2022-05-03
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-04231-7_12
Copyright © 2021–2025 ICST