Ubiquitous Communications and Network Computing. 4th EAI International Conference, UBICNET 2021, Virtual Event, March 2021, Proceedings

Research Article

Abstractive Text Summarization on Templatized Data

Cite
BibTeX
    @INPROCEEDINGS{10.1007/978-3-030-79276-3_17,
        author={C. Jyothi and M. Supriya},
        title={Abstractive Text Summarization on Templatized Data},
        proceedings={Ubiquitous Communications and Network Computing. 4th EAI International Conference, UBICNET 2021, Virtual Event, March 2021, Proceedings},
        proceedings_a={UBICNET},
        year={2021},
        month={7},
        keywords={RNN, LSTM, GRU, Text summarization},
        doi={10.1007/978-3-030-79276-3_17}
    }
C. Jyothi¹, M. Supriya¹
  • 1: Department of Computer Science and Engineering, Amrita School of Engineering, Bengaluru

Abstract

Abstractive text summarization generates a brief form of an input text without reusing sentences from the original source, while still preserving its meaning and important information. This can be modelled as a sequence-to-sequence learning problem using Recurrent Neural Networks (RNNs). Typical RNN models are difficult to train owing to the vanishing and exploding gradient problems. Long Short-Term Memory (LSTM) networks are an answer to the vanishing gradient problem. LSTM-based modelling with an attention mechanism is vital to improving the text summarization application. The key intuition behind the attention mechanism is to decide how much attention to pay to each word in the input sequence when generating a word at a particular step. The objective of this paper is to study various attention models and their applicability to text summarization. The intent is to implement abstractive text summarization with an appropriate attention model using LSTM on a templatized dataset, avoiding noise and ambiguity in generating a high-quality summary.
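The attention step described in the abstract can be made concrete with a short sketch. Below is a minimal PyTorch illustration, not the paper's implementation: an LSTM encoder-decoder in which an additive (Bahdanau-style) attention module scores every encoder state at each decoding step, deciding how much weight each input word receives when the next summary word is generated. All class names, layer sizes, and the choice of framework are illustrative assumptions.

# A minimal sketch (an assumption, not the authors' code): LSTM encoder-decoder
# with additive attention for abstractive summarization, written in PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BahdanauAttention(nn.Module):
    """Scores each encoder state against the current decoder state."""
    def __init__(self, hidden_size):
        super().__init__()
        self.w_enc = nn.Linear(hidden_size, hidden_size, bias=False)
        self.w_dec = nn.Linear(hidden_size, hidden_size, bias=False)
        self.v = nn.Linear(hidden_size, 1, bias=False)

    def forward(self, dec_state, enc_states):
        # dec_state: (batch, hidden); enc_states: (batch, src_len, hidden)
        scores = self.v(torch.tanh(
            self.w_enc(enc_states) + self.w_dec(dec_state).unsqueeze(1)
        )).squeeze(-1)                       # (batch, src_len)
        weights = F.softmax(scores, dim=-1)  # how much attention per input word
        context = torch.bmm(weights.unsqueeze(1), enc_states).squeeze(1)
        return context, weights

class Seq2SeqSummarizer(nn.Module):
    def __init__(self, vocab_size, embed_size=128, hidden_size=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_size)
        self.encoder = nn.LSTM(embed_size, hidden_size, batch_first=True)
        self.decoder_cell = nn.LSTMCell(embed_size + hidden_size, hidden_size)
        self.attention = BahdanauAttention(hidden_size)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Encode the full input text once.
        enc_states, (h, c) = self.encoder(self.embed(src_ids))
        h, c = h.squeeze(0), c.squeeze(0)
        logits = []
        for t in range(tgt_ids.size(1)):
            # At every step, attend over all encoder states before emitting a word.
            context, _ = self.attention(h, enc_states)
            step_in = torch.cat([self.embed(tgt_ids[:, t]), context], dim=-1)
            h, c = self.decoder_cell(step_in, (h, c))
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)    # (batch, tgt_len, vocab)

# Smoke test with random token ids.
model = Seq2SeqSummarizer(vocab_size=1000)
src = torch.randint(0, 1000, (2, 20))   # two source texts, 20 tokens each
tgt = torch.randint(0, 1000, (2, 8))    # two 8-token reference summaries
print(model(src, tgt).shape)            # torch.Size([2, 8, 1000])

During training this sketch uses teacher forcing (the gold summary tokens feed the decoder); at inference one would instead feed back the model's own previous prediction via greedy or beam search, a loop omitted here for brevity.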

Keywords
RNN, LSTM, GRU, Text summarization
Published
2021-07-06
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-030-79276-3_17