We help you wrap your head around relative positional embeddings as they were first introduced in the "Self-Attention with Relative Position Representations" paper.
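For viewers who prefer code to slides, here is a minimal NumPy sketch of single-head self-attention with relative position representations, following equations (3)-(5) of Shaw et al. (2018). The function and parameter names (relative_attention, rel_k, rel_v, max_dist) are our own illustrative choices, not from the paper:

```python
import numpy as np

def relative_attention(x, Wq, Wk, Wv, rel_k, rel_v, max_dist):
    """Single-head self-attention with relative position representations
    (a sketch of Shaw et al. 2018; names here are illustrative).

    x:        (n, d)  input token representations
    Wq/Wk/Wv: (d, d)  query/key/value projections
    rel_k:    (2*max_dist + 1, d) learned relative embeddings added to keys
    rel_v:    (2*max_dist + 1, d) learned relative embeddings added to values
    """
    n, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv

    # Relative distance j - i, clipped to [-max_dist, max_dist] and
    # shifted to [0, 2*max_dist] to index the embedding tables.
    idx = np.arange(n)
    rel = np.clip(idx[None, :] - idx[:, None], -max_dist, max_dist) + max_dist
    a_k = rel_k[rel]  # (n, n, d) relative key embeddings a^K_ij
    a_v = rel_v[rel]  # (n, n, d) relative value embeddings a^V_ij

    # e_ij = q_i . (k_j + a^K_ij) / sqrt(d)
    scores = (q @ k.T + np.einsum('id,ijd->ij', q, a_k)) / np.sqrt(d)
    alpha = np.exp(scores - scores.max(axis=-1, keepdims=True))
    alpha /= alpha.sum(axis=-1, keepdims=True)

    # z_i = sum_j alpha_ij * (v_j + a^V_ij)
    return alpha @ v + np.einsum('ij,ijd->id', alpha, a_v)


# Toy usage: 5 tokens, model dim 8, clipping distance k = 2
rng = np.random.default_rng(0)
n, d, kdist = 5, 8, 2
out = relative_attention(rng.normal(size=(n, d)),
                         rng.normal(size=(d, d)), rng.normal(size=(d, d)),
                         rng.normal(size=(d, d)),
                         rng.normal(size=(2 * kdist + 1, d)),
                         rng.normal(size=(2 * kdist + 1, d)), kdist)
print(out.shape)  # (5, 8)
```

Note how positions enter only through the clipped offset j - i: the same learned embedding is reused wherever two tokens are the same distance apart, which is exactly what makes these representations relative rather than absolute.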
Related videos:
📺 Positional embeddings explained: https://youtu.be/1biZfFLPRSY
📺 Concatenated, learned positional encodings: https://youtu.be/M2ToEXF6Olw
📺 Transformer explained: https://youtu.be/FWFA4DGuzSc
Papers 📄:
Shaw, Peter, Jakob Uszkoreit, and Ashish Vaswani. "Self-Attention with Relative Position Representations." In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pp. 464-468. 2018. https://arxiv.org/pdf/1803.02155.pdf
Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. "Attention is all you need." In Advances in Neural Information Processing Systems, pp. 5998-6008. 2017. https://proceedings.neurips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf
Outline:
00:00 Relative positional representations
02:15 How do they work?
07:59 Benefits of relative vs. absolute positional encodings
Music 🎵: Holi Day Riddim - Konrad OldMoney
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
🔥 Optionally, pay us a coffee to help with our Coffee Bean production! ☕
Patreon: https://www.patreon.com/AICoffeeBreak
Ko-fi: https://ko-fi.com/aicoffeebreak
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
🔗 Links:
AICoffeeBreakQuiz: https://www.youtube.com/c/AICoffeeBreak/community
Twitter: https://twitter.com/AICoffeeBreak
Reddit: https://www.reddit.com/r/AICoffeeBreak/
YouTube: https://www.youtube.com/AICoffeeBreak
#AICoffeeBreak #MsCoffeeBean #MachineLearning #AI #research