#bert #lexicalsimplification #nlproc #researchpaperwalkthrough
In this video, we walk through a technique for simplifying a given piece of text. Lexical Simplification (LS) aims to replace complex words in a sentence with simpler alternatives.
⏩ Abstract: Lexical simplification (LS) aims to replace complex words in a given sentence with simpler alternatives of equivalent meaning, thereby simplifying the sentence. Recent unsupervised lexical simplification approaches rely only on the complex word itself, regardless of the given sentence, to generate candidate substitutions, which inevitably produces a large number of spurious candidates. In this paper, we propose LSBert, a lexical simplification framework based on the pretrained representation model BERT, that is capable of (1) making use of the wider context both when detecting the words in need of simplification and when generating substitute candidates, and (2) taking five high-quality features into account for ranking candidates, including the BERT prediction order, a BERT-based language model, and the paraphrase database PPDB, in addition to the word frequency and word similarity commonly used in other LS methods. We show that our system outputs lexical simplifications that are grammatically correct and semantically appropriate, and obtains a clear improvement over these baselines, outperforming the state of the art by 29.8 Accuracy points on three well-known benchmarks.
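To make the substitute-generation idea concrete, here is a minimal sketch (not the authors' code) using the Hugging Face transformers library: the original sentence and a copy with the complex word masked are fed as a sentence pair, and BERT's top masked-token predictions become the candidate substitutions. The model name, top_k value, and the simple filtering heuristic are my own illustrative choices.

```python
# Minimal sketch of LSBert-style substitute generation (illustrative, not the authors' code).
# The original sentence and a masked copy are paired so BERT sees both the context
# and the complex word when predicting candidates for the [MASK] position.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def generate_candidates(sentence, complex_word, top_k=10):
    masked = sentence.replace(complex_word, tokenizer.mask_token, 1)
    # Encode as a sentence pair: original sentence + masked copy
    inputs = tokenizer(sentence, masked, return_tensors="pt")
    mask_positions = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    with torch.no_grad():
        logits = model(**inputs).logits
    scores = logits[0, mask_positions[0]]                 # scores over the vocabulary
    top_ids = torch.topk(scores, top_k).indices.tolist()
    candidates = [tokenizer.convert_ids_to_tokens(i) for i in top_ids]
    # Drop subword pieces and the complex word itself
    return [c for c in candidates if not c.startswith("##") and c != complex_word.lower()]

print(generate_candidates("John composed these verses.", "composed"))
```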
⏩ OUTLINE:
0:00 - Introduction and Background
5:42 - Lexical Simplification Pipeline
7:26 - Complex Word Identification
10:20 - Substitute Generation
13:05 - Filtering and Substitute Ranking
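As a toy illustration of the filtering and substitute ranking step covered in the outline above, the snippet below averages each candidate's rank across several feature scores and returns the candidates ordered by best average rank. The feature names and score values are made-up placeholders, not the paper's full five-feature setup.

```python
# Toy illustration of substitute ranking: each feature orders the candidates,
# and the candidate with the best (lowest) average rank is preferred.
# Feature scores here are made-up placeholders for demonstration only.

def rank_candidates(feature_scores):
    """feature_scores: {feature_name: {candidate: score}}, where higher score = better."""
    candidates = next(iter(feature_scores.values())).keys()
    avg_rank = {}
    for cand in candidates:
        ranks = []
        for scores in feature_scores.values():
            ordered = sorted(scores, key=scores.get, reverse=True)  # best-scoring first
            ranks.append(ordered.index(cand) + 1)                   # 1-based rank under this feature
        avg_rank[cand] = sum(ranks) / len(ranks)
    return sorted(avg_rank, key=avg_rank.get)  # best candidate first

features = {
    "bert_prediction": {"wrote": 0.9, "made": 0.6, "penned": 0.4},
    "word_frequency":  {"wrote": 0.8, "made": 0.9, "penned": 0.1},
    "similarity":      {"wrote": 0.7, "made": 0.5, "penned": 0.6},
}
print(rank_candidates(features))  # -> ['wrote', 'made', 'penned']
```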
⏩ Paper Title: LSBert: A Simple Framework for Lexical Simplification
⏩ Paper Link: https://arxiv.org/abs/2006.14939
⏩ Paper Authors: Jipeng Qiang, Yun Li, Yi Zhu, Yunhao Yuan, Xindong Wu
⏩ Organization: Department of Computer Science, Yangzhou University, Yangzhou, Jiangsu, China | Key Laboratory of Knowledge Engineering with Big Data (Hefei University of Technology), Ministry of Education, Hefei, Anhui, China | Mininglamp Academy of Sciences, Mininglamp, Beijing, China
*********************************************
⏩ Youtube - https://youtube.com/channel/UCoz8NrwgL7U9535VNc0mRPA
⏩ Blog - https://prakhartechviz.blogspot.com
⏩ LinkedIn - https://linkedin.com/in/prakhar21
⏩ Medium - https://medium.com/@prakhar.mishra
⏩ GitHub - https://github.com/prakhar21
*********************************************
Please feel free to share the content and subscribe to my channel :)
⏩ Subscribe - https://youtube.com/channel/UCoz8NrwgL7U9535VNc0mRPA?sub_confirmation=1
Tools I use for making videos :)
⏩ iPad - https://tinyurl.com/y39p6pwc
⏩ Apple Pencil - https://tinyurl.com/y5rk8txn
⏩ GoodNotes - https://tinyurl.com/y627cfsa
#unsupervised #pretrained #languagemodels #techviz #datascienceguy #naturallanguageprocessing #nlp #transformers