Research Article
A Proposed Language Model Based on LSTM
@INPROCEEDINGS{10.1007/978-3-030-14657-3_35,
  author    = {Yumeng Zhang and Xuanmin Lu and Bei Quan and Yuanyuan Wei},
  title     = {A Proposed Language Model Based on LSTM},
  booktitle = {IoT as a Service. 4th EAI International Conference, IoTaaS 2018, Xi’an, China, November 17--18, 2018, Proceedings},
  year      = {2019},
  month     = {3},
  keywords  = {Language model, N-gram, RNN, LSTM, Perplexity},
  doi       = {10.1007/978-3-030-14657-3_35}
}
Yumeng Zhang
Xuanmin Lu
Bei Quan
Yuanyuan Wei
Year: 2019
IOTAAS
Springer
DOI: 10.1007/978-3-030-14657-3_35
Abstract
To address the shortcomings of the N-gram language model, this paper presents a language model based on Long Short-Term Memory (LSTM), an improved variant of the recurrent neural network (RNN) that can, in theory, exploit arbitrarily long context. Experimental results show that the perplexity of the LSTM language model on the PTB (Penn Treebank) corpus is only about half that of the N-gram language model.
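The comparison in the abstract rests on perplexity, defined as the exponential of the average per-token negative log-likelihood under the model. The sketch below, written in PyTorch, illustrates this kind of LSTM language model and a perplexity evaluation; the architecture, hyperparameters, and dummy data are illustrative assumptions, not the authors' actual configuration.

# A minimal sketch of an LSTM language model and perplexity evaluation.
# Hyperparameters are assumptions; the paper's setup is not given here.
import math
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer ids
        out, _ = self.lstm(self.embed(tokens))
        return self.proj(out)  # (batch, seq_len, vocab_size) logits

def perplexity(model, tokens):
    # Perplexity = exp(mean negative log-likelihood per token).
    model.eval()
    with torch.no_grad():
        logits = model(tokens[:, :-1])  # predict each next token
        nll = nn.functional.cross_entropy(
            logits.reshape(-1, logits.size(-1)),
            tokens[:, 1:].reshape(-1))
    return math.exp(nll.item())

# Usage with random dummy ids (the paper evaluates on the PTB corpus):
vocab_size = 10000
model = LSTMLanguageModel(vocab_size)
batch = torch.randint(0, vocab_size, (4, 35))  # 4 sequences of 35 tokens
print(perplexity(model, batch))

An untrained model like the one above would score near the vocabulary size (here about 10,000); the paper's reported result is that the trained LSTM model reaches roughly half the perplexity of the N-gram baseline.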