Using Dynamic Embeddings to Improve Static Embeddings

11/07/2019
by Yile Wang, et al.

How to build high-quality word embeddings is a fundamental research question in natural language processing. Traditional methods such as Skip-Gram and Continuous Bag-of-Words learn static embeddings by training lookup tables that translate words into dense vectors. Static embeddings are directly useful for solving lexical semantics tasks and can serve as input representations for downstream problems. Recently, contextualized embeddings such as BERT have been shown to be more effective than static embeddings as input representations for NLP tasks. Such embeddings are dynamic: they are computed from a sentential context using a network structure. One limitation of dynamic embeddings, however, is that they cannot be used without a sentence-level context. We explore the advantages of dynamic embeddings for training static embeddings by using contextualized embeddings to facilitate the training of static embedding lookup tables. Results show that the resulting embeddings outperform existing static embedding methods on various lexical semantics tasks.
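The abstract does not spell out the implementation, but the core idea it describes is to distill a context-free vector for each word from a dynamic model's contextualized representations. Below is a minimal sketch of one simple instantiation, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint: it averages a word's BERT vectors across sample sentences to produce a single static vector. This averaging shortcut, along with the function names and example sentences, is illustrative only and stands in for the paper's lookup-table training rather than reproducing the authors' exact method.

```python
# Sketch: distill a static embedding for a word by averaging its
# contextualized (dynamic) BERT vectors over many sentence contexts.
# Model name, helper names, and sentences are illustrative assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def contextual_vectors(word: str, sentences: list) -> torch.Tensor:
    """Return one contextualized vector per sentence containing `word`."""
    word_pieces = tokenizer.tokenize(word)
    vectors = []
    for sent in sentences:
        pieces = tokenizer.tokenize(sent)
        # Locate the word's subword span inside the sentence tokens.
        for i in range(len(pieces) - len(word_pieces) + 1):
            if pieces[i:i + len(word_pieces)] == word_pieces:
                inputs = tokenizer(sent, return_tensors="pt")
                with torch.no_grad():
                    hidden = model(**inputs).last_hidden_state[0]
                # Offset by 1 to skip [CLS]; average the word's subwords.
                span = hidden[i + 1:i + 1 + len(word_pieces)]
                vectors.append(span.mean(dim=0))
                break
    return torch.stack(vectors)

def static_embedding(word: str, sentences: list) -> torch.Tensor:
    """Collapse the dynamic vectors into a single static embedding."""
    return contextual_vectors(word, sentences).mean(dim=0)

if __name__ == "__main__":
    contexts = [
        "The bank approved the loan yesterday.",
        "She deposited the check at the bank.",
        "The bank raised its interest rates.",
    ]
    vec = static_embedding("bank", contexts)
    print(vec.shape)  # torch.Size([768]) for bert-base-uncased
```

Once such vectors are computed for a vocabulary, they can populate an ordinary lookup table and be used anywhere a static embedding is expected, which is precisely the use case the abstract targets: lexical semantics tasks where no sentence-level context is available.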

