Multi-Class Language Classification With BERT in TensorFlow

Mar 25, 2021
This video works through building a multi-class classification model with transformers, from start to finish. Transformers have been described as the fourth pillar of deep learning, alongside the three big neural-net architectures: CNNs, RNNs, and MLPs. From the perspective of natural language processing, though, transformers are much more than that. Since their introduction in 2017, they've come to dominate the majority of NLP benchmarks, and they continue to impress daily. What I'm saying is, transformers are damn cool. And with libraries like HuggingFace's transformers, it has become almost too easy to build incredible solutions with them. So, what's not to love? Incredible performance paired with the ultimate ease of use.

00:00 Intro
01:21 Pulling Data
01:47 Preprocessing
14:33 Data Input Pipeline
24:14 Defining Model
33:29 Model Training
35:36 Saving and Loading Models
37:37 Making Predictions
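The preprocessing, input-pipeline, and prediction steps above can be sketched in plain TensorFlow. This is a minimal, hedged sketch, not the video's exact code: the label count, sequence length, and random stand-in arrays are assumptions, and the random `input_ids`/`attention_mask` arrays simply mimic the shapes a HuggingFace BERT tokenizer would produce.

```python
import numpy as np
import tensorflow as tf

# Hypothetical setup -- the video's actual dataset, label set, and
# sequence length may differ.
NUM_CLASSES = 5
SEQ_LEN = 32
n = 8

# Stand-in for BERT tokenizer output: input_ids and attention_mask
# arrays of shape (n, SEQ_LEN), as a HuggingFace tokenizer would return.
rng = np.random.default_rng(0)
input_ids = rng.integers(0, 30522, size=(n, SEQ_LEN))
attention_mask = np.ones((n, SEQ_LEN), dtype=np.int64)
labels = rng.integers(0, NUM_CLASSES, size=(n,))

# One-hot encode the integer labels for multi-class classification.
one_hot = np.zeros((n, NUM_CLASSES), dtype=np.float32)
one_hot[np.arange(n), labels] = 1.0

# tf.data pipeline: a dict of model inputs paired with one-hot labels,
# shuffled and batched -- the shape a Keras model.fit() call expects.
ds = tf.data.Dataset.from_tensor_slices(
    ({"input_ids": input_ids, "attention_mask": attention_mask}, one_hot)
)
ds = ds.shuffle(buffer_size=n, seed=0).batch(4)

for features, y in ds.take(1):
    print(features["input_ids"].shape, y.shape)  # (4, 32) (4, 5)

# After training, raw model outputs (logits) are turned into a class
# prediction with softmax + argmax.
logits = tf.constant([[2.0, 0.5, -1.0, 0.1, 0.3]])
probs = tf.nn.softmax(logits, axis=-1)
pred = int(tf.argmax(probs, axis=-1)[0])
print(pred)  # 0
```

In the full walkthrough, the dict-of-inputs structure is what lets a transformer model consume `input_ids` and `attention_mask` by name during training.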