We help you wrap your head around relative positional embeddings as they were first introduced in the "Self-Attention with Relative Position Representations" paper.
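To make the mechanism concrete before the "How do they work?" chapter: in Shaw et al. (2018), both the attention logits and the attention outputs receive a learned embedding indexed by the clipped relative distance j - i between positions. Below is a minimal, single-head PyTorch sketch of that idea; the class name, the default clipping distance, and the single-head simplification are our own illustrative choices, not the paper's reference code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class RelativeSelfAttention(nn.Module):
    """Single-head self-attention with relative position
    representations (a sketch of Shaw et al., 2018)."""

    def __init__(self, d_model: int, k: int = 4):  # k = clipping distance (illustrative default)
        super().__init__()
        self.d_model = d_model
        self.k = k
        self.q_proj = nn.Linear(d_model, d_model, bias=False)
        self.k_proj = nn.Linear(d_model, d_model, bias=False)
        self.v_proj = nn.Linear(d_model, d_model, bias=False)
        # One learned vector per clipped relative distance in [-k, k]:
        # a^K is added to keys in the logits, a^V to values in the output.
        self.rel_k = nn.Embedding(2 * k + 1, d_model)
        self.rel_v = nn.Embedding(2 * k + 1, d_model)

    def forward(self, x):  # x: (batch, seq_len, d_model)
        n = x.size(1)
        q, kk, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Relative distances j - i, clipped to [-k, k], shifted to [0, 2k]
        # so they can index the embedding tables.
        pos = torch.arange(n, device=x.device)
        rel = (pos[None, :] - pos[:, None]).clamp(-self.k, self.k) + self.k
        a_k, a_v = self.rel_k(rel), self.rel_v(rel)  # each (n, n, d_model)
        # e_ij = q_i . (k_j + a^K_ij) / sqrt(d)
        scores = torch.matmul(q, kk.transpose(-2, -1))
        scores = scores + torch.einsum("bid,ijd->bij", q, a_k)
        attn = F.softmax(scores / self.d_model ** 0.5, dim=-1)
        # z_i = sum_j alpha_ij (v_j + a^V_ij)
        return torch.matmul(attn, v) + torch.einsum("bij,ijd->bid", attn, a_v)

# Quick check: y = RelativeSelfAttention(64)(torch.randn(2, 10, 64))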
Related videos:
📺 Positional embeddings explained: https://youtu.be/1biZfFLPRSY
📺 Concatenated, learned positional encodings: https://youtu.be/M2ToEXF6Olw
📺 Transformer explained: https://youtu.be/FWFA4DGuzSc
Papers 📄:
Shaw, Peter, Jakob Uszkoreit, and Ashish Vaswani. "Self-Attention with Relative Position Representations." In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pp. 464-468. 2018. https://arxiv.org/pdf/1803.02155.pdf
Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. "Attention is all you need." In Advances in Neural Information Processing Systems, pp. 5998-6008. 2017. https://proceedings.neurips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf
Outline:
00:00 Relative positional representations
02:15 How do they work?
07:59 Benefits of relative vs. absolute positional encodings
Music 🎵: Holi Day Riddim - Konrad OldMoney
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
🔥 Optionally, pay us a coffee to help with our Coffee Bean production! ☕
Patreon: https://www.patreon.com/AICoffeeBreak
Ko-fi: https://ko-fi.com/aicoffeebreak
➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖
🔗 Links:
AICoffeeBreakQuiz: https://www.youtube.com/c/AICoffeeBreak/community
Twitter: https://twitter.com/AICoffeeBreak
Reddit: https://www.reddit.com/r/AICoffeeBreak/
YouTube: https://www.youtube.com/AICoffeeBreak
#AICoffeeBreak #MsCoffeeBean #MachineLearning #AI #research