Name: BERT
Full Name: Bidirectional Encoder Representations from Transformers
Type: technique
Description: A language representation model designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers. As a result, the pre-trained model can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks, such as question answering and language inference, without substantial task-specific architecture modifications.
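For concreteness, below is a minimal sketch of that fine-tuning recipe, assuming the Hugging Face Transformers library and PyTorch (neither is named in the entry above); the checkpoint name, label set, example sentences, and learning rate are illustrative, not prescribed by the original description.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Load the pre-trained bidirectional encoder plus a single classification
# output layer on top (the added layer is randomly initialized and fine-tuned).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Tokenize an example sentence pair, as in a language-inference-style task.
inputs = tokenizer("A man inspects a uniform.", "The man is sleeping.",
                   return_tensors="pt")
labels = torch.tensor([0])  # hypothetical label index for this pair

# One fine-tuning step: forward pass, loss, backward pass, parameter update.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
optimizer.step()

The same pre-trained weights can be reused with a different small output head (e.g., span-prediction layers for question answering) without changing the underlying architecture, which is the point the description makes.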