Evaluating the Stability of Embedding-based Word Similarities

ACL 2018


Jun 29, 2018
Abstract: Word embeddings are increasingly used as a tool to study word associations in specific corpora. However, it is unclear whether such embeddings reflect enduring properties of language or whether they are sensitive to inconsequential variations in the source documents. We find that nearest-neighbor distances are highly sensitive to small changes in the training corpus for a variety of algorithms. For all methods, including specific documents in the training set can result in substantial variations. We show that these effects are more prominent for smaller training corpora. We recommend that users never rely on a single embedding model for distance calculations, but rather average over multiple bootstrap samples, especially for small corpora.

Authors: Maria Antoniak, David Mimno (Cornell University)
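The bootstrap-averaging recommendation can be sketched in a few lines. The snippet below is a toy illustration, not the authors' implementation: `toy_embedding` is a hypothetical stand-in (a context-word count vector) for a real embedding algorithm, and the resampling and averaging logic is what the abstract's recommendation would look like in code.

```python
import math
import random
from collections import Counter

def cosine(u, v):
    # Cosine similarity between two sparse vectors stored as dicts.
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in set(u) | set(v))
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def toy_embedding(docs, word, window=2):
    # Hypothetical stand-in for a trained embedding: counts of words
    # appearing within `window` tokens of `word` across the documents.
    vec = Counter()
    for doc in docs:
        toks = doc.split()
        for i, tok in enumerate(toks):
            if tok == word:
                lo, hi = max(0, i - window), min(len(toks), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        vec[toks[j]] += 1
    return vec

def bootstrap_similarity(docs, w1, w2, n_boot=50, seed=0):
    # Average word-pair similarity over models built on bootstrap
    # samples of the corpus (documents resampled with replacement),
    # rather than trusting a single model.
    rng = random.Random(seed)
    sims = []
    for _ in range(n_boot):
        sample = [rng.choice(docs) for _ in docs]
        sims.append(cosine(toy_embedding(sample, w1),
                           toy_embedding(sample, w2)))
    return sum(sims) / len(sims)
```

With a real embedding library, `toy_embedding` would be replaced by training a model on each bootstrap sample; the resample-and-average structure stays the same.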
