Mobile Computing, Applications, and Services. 10th EAI International Conference, MobiCASE 2019, Hangzhou, China, June 14–15, 2019, Proceedings

Research Article

Transformer Based Memory Network for Sentiment Analysis of Chinese Weibo Texts

  • @INPROCEEDINGS{10.1007/978-3-030-28468-8_4,
        author={Junlei Wu and Jiang Ming and Min Zhang},
        title={Transformer Based Memory Network for Sentiment Analysis of Chinese Weibo Texts},
        proceedings={Mobile Computing, Applications, and Services. 10th EAI International Conference, MobiCASE 2019, Hangzhou, China, June 14--15, 2019, Proceedings},
        proceedings_a={MOBICASE},
        year={2019},
        month={9},
        keywords={ABSA, Transformer, Memory network, Weibo texts},
        doi={10.1007/978-3-030-28468-8_4}
    }
    
  • Junlei Wu
    Jiang Ming
    Min Zhang
    Year: 2019
    Transformer Based Memory Network for Sentiment Analysis of Chinese Weibo Texts
    MOBICASE
    Springer
    DOI: 10.1007/978-3-030-28468-8_4
Junlei Wu1,*, Jiang Ming1,*, Min Zhang1,*
  • 1: Hangzhou Dianzi University
*Contact email: 171050047@hdu.edu.cn, jmzju@163.com, hz_andy@163.com

Abstract

Weibo has become a major platform for mobile social networking and information exchange. Extracting sentiment features from Weibo texts is therefore of great significance, and aspect-based sentiment analysis (ABSA) is useful for retrieving such features. Currently, context-dependent sentiment features are widely obtained with long short-term memory (LSTM) or gated recurrent unit (GRU) networks, and the target vector is usually replaced by an averaged target vector. However, Weibo texts have become increasingly complex, so feature extraction with LSTM or GRU may lose key sentiment information; meanwhile, the averaged target vector may misrepresent the target feature. To correct these drawbacks, a Transformer-based memory network (TF-MN), built on the Transformer's self-attention mechanism, is introduced. In TF-MN, the task is recast as a question-answering process in which the context, question, and memory modules are optimally modified: the context module encodes the text with a Transformer, the question module converts the target into a sentiment question, and the memory module eliminates the effect of unrelated words through several extraction passes. Experimental results show that our model achieves better accuracy than state-of-the-art models.
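The memory module described above repeatedly attends over the encoded context to filter out words unrelated to the target. The sketch below is not the authors' implementation; it is a minimal, generic illustration of multi-hop attention over a word-level context matrix, with the hop count and residual update chosen as assumptions for clarity.

```python
# Hedged sketch (not the paper's code): multi-hop attention over
# Transformer-style context encodings. Each hop scores every word's
# relevance to the current question vector, pools the context by those
# attention weights, and refines the question with a residual update,
# progressively down-weighting unrelated words.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def memory_hops(context, question, hops=3):
    """context: (n_words, d) encoded text; question: (d,) target query.
    Returns the refined query after `hops` extraction passes."""
    q = question
    for _ in range(hops):
        scores = context @ q        # relevance of each word to q
        attn = softmax(scores)      # attention weights, sum to 1
        q = q + attn @ context      # residual update with extracted info
    return q

# Toy usage with random encodings (dimensions are illustrative only).
rng = np.random.default_rng(0)
ctx = rng.standard_normal((5, 8))   # 5 words, 8-dim encodings
q0 = rng.standard_normal(8)         # initial sentiment-question vector
out = memory_hops(ctx, q0)
```

In a full model, `ctx` would come from the Transformer-based context module and `q0` from the question module, and the final `out` would feed a sentiment classifier.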