Neural Text Generation in Stories Using Entity Representations as Context

ACL 2018


Jun 29, 2018
Abstract: We introduce an approach to neural text generation that explicitly represents entities mentioned in the text. Entity representations are vectors that are updated as the text proceeds; they are designed specifically for narrative text like fiction or news stories. Our experiments demonstrate that modeling entities offers a benefit in two automatic evaluations: mention generation (in which a model chooses which entity to mention next and which words to use in the mention) and selection between a correct next sentence and a distractor from later in the same story. We also conduct a human evaluation on automatically generated text in story contexts; this study supports our emphasis on entities and suggests directions for further research.

Authors: Elizabeth Clark, Yangfeng Ji, Noah A. Smith (University of Washington)
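The abstract's core idea, entity representations as vectors that are updated whenever an entity is mentioned, can be illustrated with a minimal sketch. This is not the paper's trained architecture (which learns update parameters jointly with a neural language model); it only shows the mechanics of a gated vector update, where a gate decides how much of the current hidden state each entity vector absorbs. The class name, the gating matrix `W`, and the unit-norm constraint are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class EntityState:
    """Toy dynamic entity representations: each entity is a unit-norm
    vector, interpolated with the current hidden state whenever that
    entity is mentioned. Illustrative only; the paper's model learns
    these updates end-to-end inside a neural language model."""

    def __init__(self, dim, seed=0):
        self.dim = dim
        self.rng = np.random.default_rng(seed)
        self.entities = []  # one vector per entity introduced so far

    def new_entity(self):
        # initialize a fresh entity vector at random (unit norm)
        v = self.rng.normal(size=self.dim)
        self.entities.append(v / np.linalg.norm(v))
        return len(self.entities) - 1

    def update(self, idx, hidden, W):
        # gate in (0, 1): how much of the new context to absorb
        g = sigmoid(self.entities[idx] @ W @ hidden)
        v = (1 - g) * self.entities[idx] + g * hidden
        self.entities[idx] = v / np.linalg.norm(v)
        return self.entities[idx]
```

In the full model, these entity vectors serve as additional context when generating the next word, which is what the mention-generation evaluation in the abstract probes.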
