Must-read AI Papers
Feb 22, 2021
18 videos
46:07 · AlexNet: ImageNet Classification with Deep Convolutional Neural Networks (Paper Explained) · Yannic Kilcher
27:07 · Transformer: Attention Is All You Need (Paper Explained) · Yannic Kilcher
31:22 · Word2Vec: Distributed Representations of Words and Phrases and their Compositionality (Paper Explained) · Yannic Kilcher
7:16 · Language Models are Few-Shot Learners - OpenAI GPT-3 Explained [NeurIPS 2020 Best Paper] · Two Minute Papers
11:51 · ResNet: Deep Residual Learning for Image Recognition | Kaiming He @ CVPR 2016 · ComputerVisionFoundation Videos
17:48 · BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | Ming-Wei Chang @ ACL 2019 · TechTalksTV
39:12 · Playing Atari with Deep Reinforcement Learning (Paper Explained) · Yannic Kilcher
37:04 · Generative Adversarial Networks (Paper Explained) · Yannic Kilcher
17:18 · Reformer: The Efficient Transformer | Łukasz Kaiser (Google Brain) · NLP Zurich
3:01 · Dynamic Routing Between Capsules (CapsNet) | Sara Sabour @ NIPS 2017 · Sara Sabour
31:21 · Deep Residual Learning for Image Recognition (ResNet Paper Explained) · Yannic Kilcher
40:13 · BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Paper Explained) · Yannic Kilcher
31:25 · Generative Adversarial Networks | Ian Goodfellow @ NIPS 2016 Tutorial · Preserve Knowledge
1:43 · Playing Atari with Deep Reinforcement Learning | Two Minute Papers · Two Minute Papers
54:45 · CapsNets: Dynamic Routing Between Capsules | Geoffrey Hinton · Tsotsos Lab
1:04:30 · GPT-3: Language Models are Few-Shot Learners (Paper Explained) · Yannic Kilcher
4:48 · Word2Vec: Distributed Representations of Words and Phrases and their Compositionality (5-min Walkthrough) · Henry AI Labs
48:23 · Transformer: Attention is all you need; Attentional Neural Network Models | Łukasz Kaiser @ Google Brain · Pi School