WACV 2021 - Efficient Attention: Attention with Linear Complexities

Dec 17, 2020
Abstract: We propose a novel attention mechanism with linear memory and computational complexity and the same representational power as the quadratically complex dot-product attention. We demonstrate that our method scales to high-resolution images, video inputs, and other types of visual data, with results on three datasets and four tasks. Authors: Zhuoran Shen, Mingyuan Zhang, Haiyu Zhao, Shuai Yi, Hongsheng Li (SenseTime International, City University of Hong Kong)
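The linear complexity comes from reordering the attention computation: instead of forming the n×n matrix softmax(QKᵀ) and multiplying by V, one normalizes Q and K separately and computes Kᵀ V first, so the intermediate is only d_k×d_v. A minimal NumPy sketch of this factorization (function and variable names are illustrative, not from the paper's released code):

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def efficient_attention(Q, K, V):
    """Linear-complexity attention sketch.

    Q, K: (n, d_k) query/key matrices; V: (n, d_v) value matrix.
    Queries are normalized over the feature dimension, keys over
    the n positions; the (d_k, d_v) context matrix K^T V is formed
    first, so no n x n attention map is ever materialized.
    """
    q = softmax(Q, axis=1)      # each query row sums to 1
    k = softmax(K, axis=0)      # each key feature sums to 1 over positions
    context = k.T @ V           # (d_k, d_v) global context, linear in n
    return q @ context          # (n, d_v) output

# Example: cost grows linearly with the number of positions n.
n, d_k, d_v = 1024, 64, 64
rng = np.random.default_rng(0)
out = efficient_attention(rng.normal(size=(n, d_k)),
                          rng.normal(size=(n, d_k)),
                          rng.normal(size=(n, d_v)))
print(out.shape)  # (1024, 64)
```

Because both normalizations produce convex weights, each output row is a convex combination of value rows, mirroring standard attention's behavior while using O(n) rather than O(n²) memory.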
