
Research Article
Abstractive Text Summarization on Templatized Data
@inproceedings{10.1007/978-3-030-79276-3_17,
  author    = {C. Jyothi and M. Supriya},
  title     = {Abstractive Text Summarization on Templatized Data},
  booktitle = {Ubiquitous Communications and Network Computing. 4th EAI International Conference, UBICNET 2021, Virtual Event, March 2021, Proceedings},
  year      = {2021},
  month     = {7},
  keywords  = {RNN, LSTM, GRU, Text summarization},
  doi       = {10.1007/978-3-030-79276-3_17}
}
- C. Jyothi
- M. Supriya
Year: 2021
Abstractive Text Summarization on Templatized Data
UBICNET
Springer
DOI: 10.1007/978-3-030-79276-3_17
Abstract
Abstractive text summarization generates a brief form of an input text that preserves its meaning and important information without reusing sentences from the original source. The task can be modelled as sequence-to-sequence learning using Recurrent Neural Networks (RNN). Typical RNN models are difficult to train owing to the vanishing and exploding gradient problems; Long Short-Term Memory (LSTM) networks address the vanishing gradient problem. LSTM-based modelling with an attention mechanism is vital for improving text summarization. The key intuition behind the attention mechanism is to decide how much attention to pay to each word in the input sequence when generating a word at a particular step. The objective of this paper is to study various attention models and their applicability to text summarization. The intent is to implement abstractive text summarization with an appropriate attention model using LSTM on a templatized dataset, thereby avoiding noise and ambiguity and generating high-quality summaries.
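To make the attention intuition concrete, the following is a minimal, illustrative PyTorch sketch of an LSTM encoder-decoder with additive (Bahdanau-style) attention, the general family of models the abstract describes. It is not the authors' implementation: the class name Seq2SeqAttention, the vocabulary size, the hidden dimensions, and the random toy inputs are all assumptions made purely for illustration.

```python
# Minimal sketch (assumed, not the paper's model) of an LSTM encoder-decoder
# with additive attention for abstractive summarization.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Seq2SeqAttention(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        # Additive attention: score(h_dec, h_enc) = v^T tanh(W [h_dec; h_enc])
        self.attn = nn.Linear(2 * hid_dim, hid_dim)
        self.v = nn.Linear(hid_dim, 1, bias=False)
        self.out = nn.Linear(2 * hid_dim, vocab_size)

    def forward(self, src, tgt):
        enc_out, (h, c) = self.encoder(self.embed(src))      # (B, S, H)
        dec_out, _ = self.decoder(self.embed(tgt), (h, c))    # (B, T, H)

        logits = []
        for t in range(dec_out.size(1)):
            dec_t = dec_out[:, t:t + 1, :]                     # (B, 1, H)
            # Score every encoder state against the current decoder state.
            scores = self.v(torch.tanh(self.attn(
                torch.cat([dec_t.expand_as(enc_out), enc_out], dim=-1)
            ))).squeeze(-1)                                    # (B, S)
            # Attention weights: how much each source word matters at step t.
            weights = F.softmax(scores, dim=-1)                # (B, S)
            context = torch.bmm(weights.unsqueeze(1), enc_out) # (B, 1, H)
            logits.append(self.out(torch.cat([dec_t, context], dim=-1)))
        return torch.cat(logits, dim=1)                        # (B, T, V)

# Toy usage with random token ids: 2 source sequences of length 7,
# 2 target prefixes of length 5 (teacher forcing).
model = Seq2SeqAttention()
src = torch.randint(0, 10000, (2, 7))
tgt = torch.randint(0, 10000, (2, 5))
print(model(src, tgt).shape)  # torch.Size([2, 5, 10000])
```

The softmax over the encoder states is the step the abstract refers to: at each decoding step the model computes a weight for every input word and uses the weighted context vector, together with the decoder state, to predict the next summary word.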