Gaussian Mixture Latent Vector Grammars

ACL 2018

Abstract: We introduce Latent Vector Grammars (LVeGs), a new framework that extends latent variable grammars such that each nonterminal symbol is associated with a continuous vector space representing the set of (infinitely many) subtypes of the nonterminal. We show that previous models such as latent variable grammars and compositional vector grammars can be interpreted as special cases of LVeGs. We then present Gaussian Mixture LVeGs (GM-LVeGs), a new special case of LVeGs that uses Gaussian mixtures to formulate the weights of production rules over subtypes of nonterminals. A major advantage of using Gaussian mixtures is that the partition function and the expectations of subtype rules can be computed using an extension of the inside-outside algorithm, which enables efficient inference and learning. We apply GM-LVeGs to part-of-speech tagging and constituency parsing and show that GM-LVeGs can achieve competitive accuracies.

Authors: Yanpeng Zhao, Liwen Zhang, Kewei Tu (ShanghaiTech University)
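The central object in the abstract is the rule weight function: in a GM-LVeG, the weight of a binary production rule such as A → B C is a Gaussian mixture defined over the concatenated subtype vectors of A, B, and C. The sketch below is not the authors' code; class and variable names are hypothetical, and equal mixing weights, normalized components, and diagonal covariances are assumed purely for brevity.

```python
# Minimal sketch of a Gaussian-mixture rule weight for a binary rule A -> B C,
# evaluated at subtype vectors a, b, c (concatenated into one vector x).
import numpy as np

class GaussianMixtureRuleWeight:
    def __init__(self, num_components, dim):
        # dim = 3 * d for a binary rule, where d is the subtype vector dimension
        self.log_mix = np.log(np.full(num_components, 1.0 / num_components))
        self.means = 0.1 * np.random.randn(num_components, dim)
        self.log_vars = np.zeros((num_components, dim))  # diagonal covariances

    def log_weight(self, a, b, c):
        """Log of the rule weight at subtype vectors a, b, c of A, B, C."""
        x = np.concatenate([a, b, c])
        diff = x - self.means                              # shape (K, dim)
        log_comp = -0.5 * np.sum(
            diff ** 2 / np.exp(self.log_vars) + self.log_vars + np.log(2 * np.pi),
            axis=1,
        )                                                  # per-component log density
        return np.logaddexp.reduce(self.log_mix + log_comp)

# Usage: evaluate a rule weight at some (hypothetical) subtype vectors.
d = 4
rule = GaussianMixtureRuleWeight(num_components=2, dim=3 * d)
print(rule.log_weight(np.zeros(d), np.ones(d), np.zeros(d)))
```

Because products and marginals of Gaussians stay Gaussian, inside and outside scores under such weights remain Gaussian mixtures, which is what makes the inside-outside extension mentioned in the abstract tractable.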
