BART: Denoising Sequence-to-Sequence Pre-training for NLG (Research Paper Walkthrough)
May 01, 2021
prakharmishra137
Subscribe and Support!
#bart #transformers #naturallanguageprocessing
The authors from Facebook AI propose a new pre-training objective for sequence-to-sequence models: a denoising autoencoder. Text is corrupted with an arbitrary noising function, and the model is trained to reconstruct (denoise) the original. Watch the video to know more :)
⏩ Abstract: We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text. It uses a standard Transformer-based neural machine translation architecture which, despite its simplicity, can be seen as generalizing BERT (due to the bidirectional encoder), GPT (with the left-to-right decoder), and many other more recent pretraining schemes. We evaluate a number of noising approaches, finding the best performance by both randomly shuffling the order of the original sentences and using a novel in-filling scheme, where spans of text are replaced with a single mask token. BART is particularly effective when fine-tuned for text generation but also works well for comprehension tasks. It matches the performance of RoBERTa with comparable training resources on GLUE and SQuAD, achieves new state-of-the-art results on a range of abstractive dialogue, question answering, and summarization tasks, with gains of up to 6 ROUGE. BART also provides a 1.1 BLEU increase over a back-translation system for machine translation, with only target language pretraining. We also report ablation experiments that replicate other pretraining schemes within the BART framework, to better measure which factors most influence end-task performance.
Please feel free to share out the content and subscribe to my channel :)
⏩ Subscribe -
https://youtube.com/channel/UCoz8NrwgL7U9535VNc0mRPA?sub_confirmation=1
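To make the pre-training idea concrete, below is a toy Python sketch (my own illustration, not code from the paper or the video) of the five corruption schemes listed in the outline. It works on whitespace-split tokens and "."-separated sentences for simplicity; BART itself applies these transformations over subword tokens, with text-infilling span lengths drawn from a Poisson(λ=3) distribution.

```python
import random

MASK = "<mask>"  # placeholder mask symbol, analogous to BART's mask token

def token_masking(tokens, p=0.15):
    # Replace random tokens with <mask>, one mask per token (BERT-style).
    return [MASK if random.random() < p else t for t in tokens]

def token_deletion(tokens, p=0.15):
    # Delete random tokens; the model must also infer *where* inputs are missing.
    return [t for t in tokens if random.random() >= p]

def text_infilling(tokens, p=0.2, max_span=3):
    # Replace whole spans with a single <mask>. The paper samples span lengths
    # from Poisson(lambda=3); a uniform draw is used here as a rough stand-in,
    # and a 0-length span just inserts a mask without consuming any tokens.
    out, i = [], 0
    while i < len(tokens):
        if random.random() < p:
            span = random.randint(0, max_span)
            out.append(MASK)
            i += span
        else:
            out.append(tokens[i])
            i += 1
    return out

def sentence_permutation(text):
    # Shuffle the order of sentences in the document.
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    random.shuffle(sentences)
    return ". ".join(sentences) + "."

def document_rotation(tokens):
    # Rotate the document so it begins at a uniformly chosen token;
    # the model must identify the true start of the document.
    k = random.randrange(len(tokens))
    return tokens[k:] + tokens[:k]

doc = "BART corrupts text with an arbitrary noising function. A Transformer decoder then reconstructs the original text."
print(text_infilling(doc.split()))
print(sentence_permutation(doc))
```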
⏩ OUTLINE:
0:00 - Abstract & Background (De-noising Autoencoder)
04:04 - BERT vs GPT vs BART
06:51 - BART Model
07:41 - Pre-training BART (Token Masking, Token Deletion, Text Infilling, Sentence Permutation, Document Rotation)
⏩ Paper Title: BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
⏩ Paper:
https://arxiv.org/abs/1910.13461
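If you want to try the released model yourself, here is a minimal usage sketch with the Hugging Face `transformers` library (my own addition, not covered in the video). It uses the public `facebook/bart-large-cnn` checkpoint, which is BART fine-tuned for abstractive summarization, one of the generation tasks highlighted in the abstract.

```python
# Minimal sketch: summarization with a pretrained BART checkpoint.
# Requires: pip install transformers torch
from transformers import BartTokenizer, BartForConditionalGeneration

model_name = "facebook/bart-large-cnn"  # BART fine-tuned on CNN/DailyMail summarization
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

article = (
    "BART is trained by corrupting text with an arbitrary noising function and "
    "learning a model to reconstruct the original text. It is particularly "
    "effective when fine-tuned for text generation tasks such as summarization."
)

# Tokenize, generate a summary with beam search, and decode it back to text.
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(
    inputs["input_ids"], num_beams=4, max_length=60, early_stopping=True
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```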
⏩ Authors: Mike Lewis, Yinhan Liu, Naman Goyal, Marjan Ghazvininejad, Abdelrahman Mohamed, Omer Levy, Ves Stoyanov, Luke Zettlemoyer
⏩ Organisation: Facebook AI
⏩ IMPORTANT LINKS
Full Playlist on BERT usecases in NLP:
https://www.youtube.com/watch?v=kC5kP1dPAzc&list=PLsAqq9lZFOtV8jYq3JlkqPQUN5QxcWq0f
Full Playlist on Text Data Augmentation Techniques:
https://www.youtube.com/watch?v=9O9scQb4sNo&list=PLsAqq9lZFOtUg63g_95OuV-R2GhV1UiIZ
Full Playlist on Text Summarization:
https://www.youtube.com/watch?v=kC5kP1dPAzc&list=PLsAqq9lZFOtV8jYq3JlkqPQUN5QxcWq0f
Full Playlist on Machine Learning with Graphs:
https://www.youtube.com/watch?v=-uJL_ANy1jc&list=PLsAqq9lZFOtU7tT6mDXX_fhv1R1-jGiYf
Full Playlist on Evaluating NLG Systems:
https://www.youtube.com/watch?v=-CIlz-5um7U&list=PLsAqq9lZFOtXlzg5RNyV00ueE89PwnCbu
Full Playlist on Query Expansion for Information Retrieval using NLP:
https://www.youtube.com/watch?v=QpTZ_-6uio8&list=PLsAqq9lZFOtXsJ_S_lB9pPz2cbz-i2DD0
Full Playlist on Text Generation Evaluation Techniques:
https://www.youtube.com/watch?v=-CIlz-5um7U&list=PLsAqq9lZFOtXlzg5RNyV00ueE89PwnCbu
*********************************************
⏩ Youtube -
https://www.youtube.com/c/TechVizTheDataScienceGuy
⏩ Blog -
https://prakhartechviz.blogspot.com
⏩ LinkedIn -
https://linkedin.com/in/prakhar21
⏩ Medium -
https://medium.com/@prakhar.mishra
⏩ GitHub -
https://github.com/prakhar21
⏩ Twitter -
https://twitter.com/rattller
*********************************************
Tools I use for making videos :)
⏩ iPad -
https://tinyurl.com/y39p6pwc
⏩ Apple Pencil -
https://tinyurl.com/y5rk8txn
⏩ GoodNotes -
https://tinyurl.com/y627cfsa
#techviz #datascienceguy #research #autoencoder #denoising
Category: Research Paper