Proceedings of the First International Conference on Combinatorial and Optimization, ICCAP 2021, December 7-8 2021, Chennai, India

Research Article

Analysis on the Effect of Dropout as a Regularization Technique in Deep Averaging Network

@INPROCEEDINGS{10.4108/eai.7-12-2021.2314492,
    author={Lovelyn Rose S and Rashmi M},
    title={Analysis on the Effect of Dropout as a Regularization Technique in Deep Averaging Network},
    proceedings={Proceedings of the First International Conference on Combinatorial and Optimization, ICCAP 2021, December 7-8 2021, Chennai, India},
    publisher={EAI},
    proceedings_a={ICCAP},
    year={2021},
    month={12},
    keywords={dan; dropout; word embeddings; sentiment analysis},
    doi={10.4108/eai.7-12-2021.2314492}
}
Lovelyn Rose S1,*, Rashmi M1
  • 1: PSG College of Technology
*Contact email: slr.cse@psgtech.ac.in

Abstract

Deep neural networks are powerful machine learning systems, and many deep learning models for natural language processing tasks focus on learning the compositionality of their inputs. The Deep Averaging Network (DAN) model relies on both simple vector operations and neural network layers to learn this compositionality. The depth of the model allows it to capture subtle variations in the input even though the composition is unordered. However, overfitting is a serious problem in any deep neural network. Dropout is a technique for addressing overfitting in large neural networks: the idea is to randomly drop neurons and their connections from the network during the training phase, which prevents neurons from co-adapting. DAN includes a variant of dropout in which individual words are dropped rather than individual neurons of the feed-forward network. Since this technique can drop critical words, it may have a significant impact on the performance of the model in text classification tasks. This paper deliberates on this drawback and examines the effect of dropping individual neurons rather than applying word-level dropout.
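The contrast discussed above can be made concrete with a short sketch. The following is a minimal, illustrative PyTorch implementation, not the authors' code; the layer sizes and dropout rates are assumed values chosen only to show where word-level dropout and neuron-level dropout act in a Deep Averaging Network.

# Minimal illustrative sketch (assumed architecture, not the paper's code):
# a Deep Averaging Network with both dropout variants discussed above.
import torch
import torch.nn as nn

class DeepAveragingNetwork(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=300,
                 num_classes=2, word_dropout=0.3, neuron_dropout=0.2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.word_dropout = word_dropout          # probability of dropping a whole word
        self.ff = nn.Sequential(
            nn.Linear(embed_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(neuron_dropout),           # standard neuron-level dropout
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, token_ids):                 # token_ids: (batch, seq_len)
        embeds = self.embedding(token_ids)        # (batch, seq_len, embed_dim)
        if self.training and self.word_dropout > 0:
            # Word-level dropout: zero out entire word vectors at random, so
            # some (possibly critical) words never reach the average.
            keep = (torch.rand(token_ids.shape, device=embeds.device)
                    >= self.word_dropout).float().unsqueeze(-1)
            avg = (embeds * keep).sum(dim=1) / keep.sum(dim=1).clamp(min=1.0)
        else:
            avg = embeds.mean(dim=1)              # unordered composition: plain averaging
        return self.ff(avg)

# Toy usage: 4 sentences of 12 tokens from a 10,000-word vocabulary.
model = DeepAveragingNetwork(vocab_size=10000)
logits = model(torch.randint(0, 10000, (4, 12)))  # shape: (4, num_classes)

In this sketch, word dropout is applied before the averaging step, so a dropped word is removed from the composition entirely, whereas neuron dropout only perturbs hidden units inside the feed-forward layers.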

Keywords
dan; dropout; word embeddings; sentiment analysis
Published
2021-12-22
Publisher
EAI
http://dx.doi.org/10.4108/eai.7-12-2021.2314492