Denoising Word Embeddings by Averaging in a Shared Space

06/05/2021
by Avi Caciularu et al.
We introduce a new approach for smoothing and improving the quality of word embeddings. We consider a method of fusing word embeddings that were trained on the same corpus but with different initializations. We project all the models into a shared vector space using an efficient implementation of the Generalized Procrustes Analysis (GPA) procedure, previously used for multilingual word translation. Our word representations demonstrate consistent improvements over the raw models, as well as over their naive average, on a range of tasks. Because the new representations are more stable and reliable, they yield a noticeable improvement in rare-word evaluations.
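The abstract does not include implementation details, but the core idea of Generalized Procrustes Analysis can be sketched as follows: iteratively rotate each embedding matrix toward a shared consensus, then average the aligned matrices. This is a minimal illustration in NumPy, assuming rotation-only (orthogonal) alignment over embedding matrices with row-aligned vocabularies; the function names and iteration count are illustrative, not from the paper.

```python
import numpy as np

def procrustes_rotation(X, Y):
    """Orthogonal matrix Q minimizing ||X @ Q - Y||_F (solved via SVD)."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

def gpa_average(models, n_iters=10):
    """Fuse a list of (vocab_size, dim) embedding matrices via GPA.

    Each iteration rotates every model toward the current consensus,
    then updates the consensus to the mean of the aligned models.
    """
    consensus = models[0].copy()
    for _ in range(n_iters):
        aligned = [M @ procrustes_rotation(M, consensus) for M in models]
        consensus = np.mean(aligned, axis=0)
    return consensus
```

Because the alignment is restricted to orthogonal maps, the consensus preserves each model's pairwise geometry (dot products between word vectors) while averaging out initialization-specific noise, which is consistent with the stability gains the abstract reports.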
