Learning to Generate Personalized Query Auto-Completions via a Multi-View Multi-Task Attentive Approach
Aug 13, 2020
Jiwei Tan
In this paper, we study the task of Query Auto-Completion (QAC), which is a very significant feature of modern search engines. In real industrial applications, two major problems of QAC persist: weak personalization and unseen queries. To address these problems, we propose M²A, a multi-view multi-task attentive framework for learning personalized query auto-completion models. We propose a new Transformer-based hierarchical encoder to model different kinds of sequential behaviors, which can be seen as multiple distinct views of the user's search history; a prefix-to-history attention mechanism then selects the most relevant information to compose the final intention representation. To learn more informative representations, we incorporate multi-task learning into model training: two different kinds of supervisory information provided by query logs are utilized simultaneously by jointly training a CTR prediction model and a query generation model.

To bridge the gap between the research setting and the real scenario, we release a new large-scale query log dataset, TaobaoQAC, which contains rich real prefix-to-query click behaviors. We conduct experiments on TaobaoQAC to demonstrate the effectiveness of our approach, and the results show that M²A achieves superior performance compared with several strong baselines in both candidate ranking and query generation. We also conduct online A/B testing, and our approach has been deployed online.
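The abstract names two concrete mechanisms: a prefix-to-history attention that fuses multiple views of the user's search history into an intention representation, and a joint multi-task objective combining CTR prediction with query generation. The sketch below is a minimal illustration of those two ideas, not the authors' released code; all module names, tensor dimensions, and the task-mixing weight `alpha` are assumptions for illustration only.

```python
# Illustrative sketch (assumed names/shapes, not the paper's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PrefixToHistoryAttention(nn.Module):
    """The typed prefix attends over encoded views of the user's search
    history to compose a final intention representation."""
    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, prefix: torch.Tensor, views: torch.Tensor) -> torch.Tensor:
        # prefix: (batch, 1, d_model)       encoding of the typed prefix
        # views:  (batch, n_views, d_model) one vector per behavior view
        intention, _ = self.attn(query=prefix, key=views, value=views)
        return intention.squeeze(1)  # (batch, d_model)

# Toy forward/backward pass with random data (hypothetical sizes).
batch, n_views, d_model, vocab, seq_len = 8, 3, 64, 1000, 12
fuse = PrefixToHistoryAttention(d_model)
intention = fuse(torch.randn(batch, 1, d_model),
                 torch.randn(batch, n_views, d_model))

# Two task heads sharing the intention representation; a single linear
# layer stands in for the real query-generation decoder.
ctr_head = nn.Linear(d_model, 1)
gen_head = nn.Linear(d_model, seq_len * vocab)

ctr_logits = ctr_head(intention).squeeze(-1)
token_logits = gen_head(intention).view(batch, vocab, seq_len)

click_labels = torch.randint(0, 2, (batch,)).float()
target_tokens = torch.randint(0, vocab, (batch, seq_len))

alpha = 0.5  # assumed mixing weight between the two supervision signals
loss = (alpha * F.binary_cross_entropy_with_logits(ctr_logits, click_labels)
        + (1 - alpha) * F.cross_entropy(token_logits, target_tokens))
loss.backward()
```

The design point the abstract emphasizes is that both losses backpropagate through the shared intention representation, so click supervision and generation supervision each regularize the other.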
SIGKDD_2020