Exploring phrase-compositionality in skip-gram models

07/21/2016
by Xiaochang Peng, et al.

In this paper, we introduce a variation of the skip-gram model which jointly learns distributed word vector representations and the way they compose to form phrase embeddings. In particular, we propose a learning procedure that incorporates a phrase-compositionality function capturing how phrase vectors are composed from their component word vectors. Our experiments show improvements on word and phrase similarity tasks, as well as on syntactic tasks such as dependency parsing, using the proposed joint models.
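The abstract does not specify the form of the phrase-compositionality function; the paper learns it jointly with the embeddings. As a minimal sketch of the general idea, assuming a simple averaging composition (a common baseline that a learned function would replace), a phrase vector can be built from its component word vectors like this:

```python
def compose_phrase(words, vectors):
    """Compose a phrase vector by elementwise averaging of component
    word vectors. Averaging is a simple compositional baseline; the
    paper instead learns the composition jointly with the embeddings."""
    dim = len(next(iter(vectors.values())))
    summed = [0.0] * dim
    for w in words:
        for i, x in enumerate(vectors[w]):
            summed[i] += x
    return [s / len(words) for s in summed]

# Toy word vectors (dimension 3) standing in for learned embeddings.
vecs = {"new": [1.0, 0.0, 2.0], "york": [3.0, 2.0, 0.0]}
print(compose_phrase(["new", "york"], vecs))  # [2.0, 1.0, 1.0]
```

Under a joint training scheme, the parameters of such a composition function would be updated alongside the word vectors so that the resulting phrase embeddings fit the skip-gram objective.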
