IoT as a Service. 4th EAI International Conference, IoTaaS 2018, Xi’an, China, November 17–18, 2018, Proceedings

Research Article

A Proposed Language Model Based on LSTM

  • @INPROCEEDINGS{10.1007/978-3-030-14657-3_35,
        author={Yumeng Zhang and Xuanmin Lu and Bei Quan and Yuanyuan Wei},
        title={A Proposed Language Model Based on LSTM},
        proceedings={IoT as a Service. 4th EAI International Conference, IoTaaS 2018, Xi’an, China, November 17--18, 2018, Proceedings},
        proceedings_a={IOTAAS},
        year={2019},
        month={3},
        keywords={Language model, N-gram, RNN, LSTM, Perplexity},
        doi={10.1007/978-3-030-14657-3_35}
    }
    
  • Yumeng Zhang, Xuanmin Lu, Bei Quan, Yuanyuan Wei (2019). A Proposed Language Model Based on LSTM. IOTAAS. Springer. DOI: 10.1007/978-3-030-14657-3_35
Yumeng Zhang1,*, Xuanmin Lu1,*, Bei Quan1, Yuanyuan Wei1
  • 1: Northwestern Polytechnical University
*Contact email: zhangyumeng@mail.nwpu.edu.cn, luxuanmin@nwpu.edu.cn

Abstract

In view of the shortcomings of the N-gram language model, this paper presents a language model based on Long Short-Term Memory (LSTM), an improved RNN architecture that can in theory exploit information from arbitrarily long sequences. Experimental results show that the perplexity of the LSTM language model on the PTB corpus is only about half that of the N-gram language model.
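Perplexity, the evaluation metric cited in the abstract, measures how well a model predicts held-out text; lower is better. A minimal sketch of the computation (the probabilities below are illustrative only, not taken from the paper):

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-probability
    that the model assigns to each token in the test sequence."""
    n = len(token_probs)
    avg_neg_log_prob = -sum(math.log(p) for p in token_probs) / n
    return math.exp(avg_neg_log_prob)

# Illustrative case: a model that assigns uniform probability 1/4
# to each of 4 tokens has perplexity 4 (it is "as confused as"
# a uniform choice among 4 alternatives).
print(round(perplexity([0.25, 0.25, 0.25, 0.25]), 6))  # → 4.0
```

Under this metric, the abstract's claim is that the LSTM model's average per-token uncertainty on PTB is roughly half that of the N-gram baseline.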

Keywords
Language model, N-gram, RNN, LSTM, Perplexity
Published
2019-03-07
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-030-14657-3_35