Science and Technologies for Smart Cities. 7th EAI International Conference, SmartCity360°, Virtual Event, December 2-4, 2021, Proceedings

Research Article

Temporal Colour-Coded Facial-Expression Recognition Using Convolutional Neural Network

@INPROCEEDINGS{10.1007/978-3-031-06371-8_4,
    author={Minh Nguyen and Wei Qi Yan},
    title={Temporal Colour-Coded Facial-Expression Recognition Using Convolutional Neural Network},
    proceedings={Science and Technologies for Smart Cities. 7th EAI International Conference, SmartCity360°, Virtual Event, December 2-4, 2021, Proceedings},
    proceedings_a={SMARTCITY},
    year={2022},
    month={6},
    keywords={Computer vision; Object detection; Deep learning; Facial expression recognition},
    doi={10.1007/978-3-031-06371-8_4}
}
Minh Nguyen1,*, Wei Qi Yan1
  • 1: School of Engineering, Computer and Mathematical Sciences
*Contact email: minh.nguyen@aut.ac.nz

Abstract

This research primarily aims to address the high suicide rate in New Zealand (NZ). In this project, we plan to implement an AI-based recognition system for long-term mental health monitoring, in order to discover potentially suicidal individuals in NZ society. Visual data (CCTV video footage) carries rich and abundant information; however, its volume grows explosively, so we often fail to capture the relevant patterns and extract meaningful features for reliable analysis. Moreover, AI-detected human facial microexpressions are usually ambiguous and yield uncertain patterns. It is extremely difficult to identify and verify somebody's emotions over the last few minutes, hours, days, weeks, or months; in short, it is very challenging to assess depression so as to predict suicidal probability. To solve this problem, we design a novel temporal expression recognition system based on the accumulation of seven colour-coded human emotional expressions: anger, disgust, fear, happiness, sadness, surprise, and neutral. We propose to represent people's feelings as coloured dots (rain-drops). We assume that, analogously to colour, people have three primitive emotions: joy (green), sadness (blue), and anger (red). Mixtures of these lead to the other feelings: anger + joy = surprise (yellow), anger + sadness = fear (purple), joy + sadness = disgust (cyan); and when all three primitive feelings are added together, we obtain the neutral state (white). Long-term feelings are emotions accumulated over time, digitally represented by dropping colours onto a white canvas. Each canvas captures someone's feelings over a predefined period (the last five minutes, for instance). In this way, a target person's emotions over the previous month can be effectively packed into a movie of approximately two hours (at 60 Hz).
At any time, such a video can be assessed by AI algorithms for stress-level assessment (over the last month) so as to decide whether mental-health treatment is required.
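The colour-coding and packing scheme described in the abstract can be sketched as follows. This is a minimal illustration, assuming the three primitive emotions map to the additive RGB primaries; the names, values, and 30-day month are illustrative assumptions, not the paper's implementation.

```python
# Illustrative mapping of the seven expressions to additive RGB colours,
# following the mixing rules stated in the abstract (assumed values).
EXPRESSION_COLOURS = {
    "anger":     (255, 0, 0),      # red (primitive)
    "happiness": (0, 255, 0),      # green (primitive, "joy")
    "sadness":   (0, 0, 255),      # blue (primitive)
    "surprise":  (255, 255, 0),    # yellow = anger + joy
    "fear":      (255, 0, 255),    # purple = anger + sadness
    "disgust":   (0, 255, 255),    # cyan = joy + sadness
    "neutral":   (255, 255, 255),  # white = all three primitives
}

def additive_mix(*colours):
    """Additively mix RGB colours, clipping each channel at 255."""
    return tuple(min(255, sum(c[i] for c in colours)) for i in range(3))

# Packing one month of emotion canvases into a ~2 h video at 60 Hz
# implies sampling roughly one canvas every 6 seconds:
SECONDS_PER_MONTH = 30 * 24 * 3600            # 2,592,000 s (30-day month)
FRAMES_IN_VIDEO = 2 * 3600 * 60               # 432,000 frames at 60 Hz
seconds_per_canvas = SECONDS_PER_MONTH / FRAMES_IN_VIDEO  # 6.0 s per frame
```

Under these assumptions, each frame of the two-hour summary video condenses about six seconds of observed expression data.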

Keywords
Computer vision; Object detection; Deep learning; Facial expression recognition
Published
2022-06-17
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-06371-8_4