Research Article
Comparative Study on Different Word Embedding Techniques
@INPROCEEDINGS{10.4108/eai.7-12-2021.2314494,
  author={Lovelyn Rose S},
  title={Comparative Study on Different Word Embedding Techniques},
  proceedings={Proceedings of the First International Conference on Combinatorial and Optimization, ICCAP 2021, December 7-8 2021, Chennai, India},
  publisher={EAI},
  proceedings_a={ICCAP},
  year={2021},
  month={12},
  keywords={word embedding; term-weighting; glove; word2vec},
  doi={10.4108/eai.7-12-2021.2314494}
}
Abstract
Recent advances in distributional semantics allow a word to be represented as a vector in a multi-dimensional semantic space. Words that are semantically and syntactically similar lie closer together in this space. Popular methods to generate such embeddings include neural networks, dimensionality reduction of word-word co-occurrence matrices, contextual representation, and so on. Most embedding techniques rest on the distributional hypothesis, which states that words occurring in the same contexts tend to be semantically similar; accordingly, they can be broadly classified into count-based methods and predictive methods. Word embeddings are widely used in Natural Language Processing as well as in related fields such as text mining, sentiment analysis, information retrieval, polarity detection, and so on. This paper presents a comprehensive study of the different word embedding techniques available.
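To make the count-based family mentioned in the abstract concrete, here is a minimal sketch (not from the paper itself) of one such technique: building a word-word co-occurrence matrix from a toy corpus and reducing it with truncated SVD to obtain dense word vectors. The corpus, window size, and embedding dimension are illustrative assumptions; real studies use large corpora and higher dimensions.

```python
import numpy as np

# Toy corpus (assumption: illustrative only; real embeddings need large corpora).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

# Build the vocabulary and a word -> index map.
tokens = [sentence.split() for sentence in corpus]
vocab = sorted({w for sent in tokens for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Count word-word co-occurrences within a symmetric window of size 2.
window = 2
M = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                M[index[w], index[sent[j]]] += 1

# Reduce the co-occurrence matrix via truncated SVD to get dense vectors.
U, S, Vt = np.linalg.svd(M)
k = 3  # embedding dimension (illustrative choice)
embeddings = U[:, :k] * S[:k]

# Words with similar contexts ("cat"/"dog") should get similar vectors,
# which we can check with cosine similarity.
def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(embeddings[index["cat"]], embeddings[index["dog"]]))
```

Predictive methods such as word2vec instead learn the vectors by training a shallow neural network to predict a word from its context (or vice versa), rather than factorizing explicit counts.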