Transformer Hawkes Process

ICML 2020

Details
"Transformer Hawkes Process" is work completed by Simiao Zuo, Haoming Jiang, Zichong Li, Tuo Zhao and Hongyuan Zha at the Georgia Institute of Technology. This work was accepted to the 2020 International Conference on Machine Learning (ICML). Abstract: Modern data acquisition routinely produce massive amounts of event sequence data in various domains, such as social media, healthcare, and financial markets. These data often exhibit complicated short-term and long-term temporal dependencies. However, most of the existing recurrent neural network-based point process models fail to capture such dependencies, and yield unreliable prediction performance. To address this issue, we propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies and meanwhile enjoys computational efficiency. Numerical experiments on various datasets show that THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin. Moreover, THP is quite general and can incorporate additional structural knowledge. We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information. Full paper: https://arxiv.org/pdf/2002.09291.pdf
