A Mixture Model for Learning Multi-Sense Word Embeddings

06/15/2017
by Dai Quoc Nguyen, et al.

Word embeddings are now a standard technique for inducing meaning representations for words. To obtain good representations, it is important to take into account the different senses of a word. In this paper, we propose a mixture model for learning multi-sense word embeddings. Our model generalizes previous work in that it can induce different weights for the different senses of a word. Experimental results show that our model outperforms previous models on standard evaluation tasks.
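The abstract describes the core idea at a high level: each word is given several sense vectors, and the word's representation is a mixture of those vectors with weights that depend on the surrounding context. The following is a minimal illustrative sketch of that idea, not the authors' released code; the vocabulary size, number of senses, dimensionality, and the particular softmax weighting over sense-context similarity are all assumptions made for the example.

```python
# Illustrative sketch of a mixture-of-senses word representation.
# Assumptions (not from the paper's abstract): fixed number of senses per word,
# context-dependent sense weights computed by a softmax over dot products.
import numpy as np

rng = np.random.default_rng(0)
vocab_size, num_senses, dim = 1000, 3, 50

sense_vectors = rng.normal(scale=0.1, size=(vocab_size, num_senses, dim))  # one vector per sense
context_vectors = rng.normal(scale=0.1, size=(vocab_size, dim))            # context embeddings

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def mixture_embedding(word_id, context_ids):
    """Mixture-of-senses representation of `word_id` given its context words."""
    context_mean = context_vectors[context_ids].mean(axis=0)
    # Weight each sense by how well it matches the current context.
    scores = sense_vectors[word_id] @ context_mean   # shape: (num_senses,)
    weights = softmax(scores)                        # sense mixture weights
    return weights @ sense_vectors[word_id]          # weighted sum, shape: (dim,)

# Example: representation of word 5 in a context of words 10, 42, and 7.
vec = mixture_embedding(5, [10, 42, 7])
print(vec.shape)  # (50,)
```

In such a formulation, the mixture weights are what allow different senses of the same word to contribute more or less depending on context, which is the generalization over single-vector models that the abstract points to.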
