A La Carte Embedding: Cheap but Effective Induction of Semantic Feature Vectors

ACL 2018


Jan 27, 2021
Abstract: Motivations like domain adaptation, transfer learning, and feature learning have fueled interest in inducing embeddings for rare or unseen words, n-grams, synsets, and other textual features. This paper introduces a la carte embedding, a simple and general alternative to the usual word2vec-based approaches for building such representations that is based upon recent theoretical results for GloVe-like embeddings. Our method relies mainly on a linear transformation that is efficiently learnable using pretrained word vectors and linear regression. This transform is applicable on the fly in the future when a new text feature or rare word is encountered, even if only a single usage example is available. We introduce a new dataset showing how the a la carte method requires fewer examples of words in context to learn high-quality embeddings, and we obtain state-of-the-art results on a nonce task and some unsupervised document classification tasks.

Authors: Mikhail Khodak, Nikunj Saunshi, Yingyu Liang, Tengyu Ma, Brandon Stewart, Sanjeev Arora (Princeton University, University of Wisconsin-Madison, Facebook AI Research)
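The core recipe described in the abstract — average the pretrained vectors of a feature's context words, then apply a learned linear transform — can be sketched in a few lines. The sketch below is a toy illustration with synthetic data, not the authors' implementation: the matrices `V` (pretrained word vectors) and `U` (per-word context averages) are stand-ins, and the transform is fit by ordinary least squares.

```python
import numpy as np

# Toy stand-ins: in the real method, V holds pretrained word vectors and
# U holds, for each word, the average pretrained vector of its context words.
rng = np.random.default_rng(0)
d, num_words = 50, 1000
V = rng.normal(size=(num_words, d))            # "pretrained" embeddings
A_true = rng.normal(size=(d, d)) / np.sqrt(d)  # unknown context-to-word map
U = V @ np.linalg.inv(A_true).T                # synthetic context averages

# Learn the linear transform A by linear regression:
# minimize sum over words w of ||u_w @ A - v_w||^2.
A, *_ = np.linalg.lstsq(U, V, rcond=None)  # solves U @ A ≈ V

def a_la_carte(context_vectors):
    """Embed a new feature: average its context vectors, then transform."""
    u_f = np.asarray(context_vectors).mean(axis=0)
    return u_f @ A

# A single usage example suffices to induce a vector on the fly.
new_vec = a_la_carte(U[:1])
assert new_vec.shape == (d,)
```

Once `A` is learned, embedding a rare word, n-gram, or other feature costs only one vector average and one matrix multiply, which is what makes the method applicable on the fly.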
