[Google AI] Performer: A Generalized Attention Framework Based on Transformer Architecture
Details
The new Performer model introduced by Google AI and partners is a generalized attention framework built on the Transformer architecture. While standard Transformers have memory and computational requirements that grow quadratically with sequence length, the Performer uses the novel FAVOR+ (Fast Attention Via positive Orthogonal Random features) algorithm to provide a linearly scalable, low-variance, and unbiased estimate of the attention mechanism. This work was conducted by the core Performer designers Krzysztof Choromanski (Google Brain Team, Tech and Research Lead), Valerii Likhosherstov (University of Cambridge), and Xingyou Song (Google Brain Team).
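To make the linear-scaling idea concrete, here is a minimal NumPy sketch of random-feature attention in the spirit of FAVOR+, not the authors' implementation. It approximates the softmax kernel exp(q·k) with positive random features, so attention can be computed as phi(Q)(phi(K)ᵀV) without ever materializing the L×L attention matrix. All function names and parameter choices (e.g. `num_features`) are illustrative assumptions; the paper additionally orthogonalizes the random projections, which this sketch omits.

```python
import numpy as np

def softmax_kernel_features(x, projection, eps=1e-6):
    """Positive random features approximating the softmax kernel exp(q.k).

    phi(x) = exp(w^T x - ||x||^2 / 2) / sqrt(m), with rows w ~ N(0, I).
    """
    m = projection.shape[0]
    proj = x @ projection.T                                # (L, m)
    norm = np.sum(x ** 2, axis=-1, keepdims=True) / 2.0    # (L, 1)
    return np.exp(proj - norm) / np.sqrt(m) + eps

def linear_attention(Q, K, V, num_features=256, seed=0):
    """Linear-time, unbiased approximation of softmax attention.

    Instead of forming the (L x L) matrix softmax(QK^T / sqrt(d)),
    compute phi(Q) @ (phi(K)^T V) and normalize: O(L * m * d).
    """
    d = Q.shape[-1]
    rng = np.random.default_rng(seed)
    # Gaussian projection matrix; FAVOR+ additionally orthogonalizes
    # these rows to reduce the variance of the estimator.
    W = rng.standard_normal((num_features, d))

    scale = d ** -0.25  # split the usual 1/sqrt(d) between Q and K
    q_prime = softmax_kernel_features(Q * scale, W)        # (L, m)
    k_prime = softmax_kernel_features(K * scale, W)        # (L, m)

    kv = k_prime.T @ V                                     # (m, d_v)
    normalizer = q_prime @ k_prime.sum(axis=0)             # (L,)
    return (q_prime @ kv) / normalizer[:, None]

# Usage: compare against exact softmax attention on a short sequence.
L, d = 128, 64
rng = np.random.default_rng(1)
Q, K, V = (rng.standard_normal((L, d)) * 0.1 for _ in range(3))
approx = linear_attention(Q, K, V, num_features=1024)
logits = (Q @ K.T) / np.sqrt(d)
exact = np.exp(logits - logits.max(axis=-1, keepdims=True))
exact = (exact / exact.sum(axis=-1, keepdims=True)) @ V
print("max abs error:", np.abs(approx - exact).max())
```

Because the feature map is positive, the kernel estimate stays non-negative and the normalizer cannot become negative, which is what keeps the approximation low-variance and stable compared with earlier random-feature schemes.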
