Research Article
Prediction of Two-dimensional Impressions of Images of Facial Emotions using features of EEGs
@ARTICLE{10.4108/eai.20-8-2019.162800,
  author={Satoru Waseda and Minoru Nakayama},
  title={Prediction of Two-dimensional Impressions of Images of Facial Emotions using features of EEGs},
  journal={EAI Endorsed Transactions on Context-aware Systems and Applications},
  volume={6},
  number={18},
  publisher={EAI},
  journal_a={CASA},
  year={2020},
  month={1},
  keywords={Facial Expressions, ERPs, Emotion, Prediction, Item Response Theory},
  doi={10.4108/eai.20-8-2019.162800}
}
- Satoru Waseda
- Minoru Nakayama
Year: 2020
Journal: CASA
Publisher: EAI
DOI: 10.4108/eai.20-8-2019.162800
Abstract
The category of facial emotion being viewed is predicted using features of viewers' scalp potentials, such as event-related potentials (ERPs) measured while pictures of facial emotions were viewed. All visual stimuli were rated on two-dimensional emotional scales, and each viewer's responses were converted into sensitivities using item response theory (IRT). This sensitivity to facial emotions can be predicted using discriminant analysis of features extracted from ERPs recorded during image viewing. The categories of facial emotions viewed were estimated to a certain level of significance using regression analysis, and the sensitivities on the emotional scales were predicted accurately, with performance depending on viewers' reactions to the emotional images. These results show that the category of facial emotion being viewed can be predicted to a significant level from features of scalp potentials.
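The sketch below is not the authors' code; it is a minimal illustration of the two prediction steps the abstract describes: classifying the viewed facial-emotion category from ERP features with discriminant analysis, and predicting the two-dimensional (IRT-derived) sensitivities with regression. All data, feature dimensions, and category counts are synthetic placeholders assumed for illustration only.

```python
# Sketch of the pipeline in the abstract, using synthetic data in place of
# real ERP features and IRT-derived sensitivities (all values are placeholders).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_features = 120, 32                  # assumed sizes for illustration
X = rng.normal(size=(n_trials, n_features))     # ERP feature matrix (synthetic)
y_category = rng.integers(0, 3, size=n_trials)  # viewed emotion category (synthetic)
y_sensitivity = rng.normal(size=(n_trials, 2))  # 2-D IRT sensitivities (synthetic)

# 1) Discriminant analysis: predict the category of facial emotion viewed.
lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y_category, cv=5, scoring="accuracy")
print(f"category classification accuracy: {acc.mean():.2f}")

# 2) Regression: predict the two emotional-scale sensitivities from the same features.
reg = LinearRegression().fit(X, y_sensitivity)
print("predicted sensitivities (first trial):", reg.predict(X[:1]))
```

With real data, X would hold ERP amplitudes at selected electrodes and latencies, and y_sensitivity would come from an IRT model fitted to the viewers' ratings on the two emotional scales.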
Copyright © 2019 Satoru Waseda et al., licensed to EAI. This is an open access article distributed under the terms of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/), which permits unlimited use, distribution and reproduction in any medium so long as the original work is properly cited.