Finding syntax in human encephalography with beam search

ACL 2018

Abstract: Recurrent neural network grammars (RNNGs) are generative models of (tree, string) pairs that rely on neural networks to evaluate derivational choices. Parsing with them using beam search yields a variety of incremental complexity metrics such as word surprisal and parser action count. When used as regressors against human electrophysiological responses to naturalistic text, they derive two amplitude effects: an early peak and a P600-like later peak. By contrast, a non-syntactic neural language model yields no reliable effects. Model comparisons attribute the early peak to syntactic composition within the RNNG. This pattern of results recommends the RNNG+beam search combination as a mechanistic model of the syntactic processing that occurs during normal human language comprehension.

Authors: John Hale, Chris Dyer, Adhiguna Kuncoro, Jonathan R. Brennan (DeepMind, University of Oxford, University of Michigan, Cornell University)
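
The abstract does not spell out how the complexity metrics are computed, so the following is only an illustrative sketch, not the authors' implementation: word surprisal can be approximated from a beam of incremental parser hypotheses by marginalising the hypotheses' joint log-probabilities before and after each word is consumed. The function name `word_surprisals` and the `beams` input format are placeholders assumed for this example.

```python
import math

def word_surprisals(beams):
    """Approximate word surprisal from beam-search prefix probabilities.

    `beams` is a list with one entry per word position; each entry holds the
    log-probabilities of the (partial tree, word prefix) hypotheses kept on
    the beam after consuming that word.  Surprisal of word t is the drop in
    the marginalised prefix log-probability from position t-1 to t.
    """
    def logsumexp(xs):
        m = max(xs)
        return m + math.log(sum(math.exp(x - m) for x in xs))

    surprisals = []
    prev = 0.0  # log P(empty prefix) = 0
    for beam in beams:
        cur = logsumexp(beam)          # log of summed prefix probability
        surprisals.append(prev - cur)  # -log P(w_t | w_<t), in nats
        prev = cur
    return surprisals

# Toy example: three words, each beam holding a few hypothesis log-probs.
print(word_surprisals([[-2.0, -2.5], [-4.1, -4.4, -5.0], [-6.2, -6.9]]))
```

In the paper's setup, per-word values like these (alongside parser action counts) serve as regressors against the EEG signal recorded while participants listen to naturalistic text.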
