BERT - Crossminds
BERT | Bidirectional Encoder Representations from Transformers
BERT is a language representation model that pre-trains deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers. As a result, the pre-trained model can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks, such as question answering and language inference, without substantial task-specific architecture modifications.
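To make the "one additional output layer" idea concrete, here is a minimal, hedged sketch: the pre-trained encoder is stubbed out with a random pooled [CLS] vector, and the only task-specific piece added for fine-tuning is a single linear layer followed by softmax. All names and sizes here are illustrative assumptions, not the authors' code.

```python
# Sketch of fine-tuning's single added output layer, assuming a
# BERT-Base-sized encoder (hidden size 768). The encoder itself is
# replaced by a random vector standing in for the pooled [CLS] output.
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 768     # BERT-Base hidden size
NUM_LABELS = 2   # e.g. a binary sentence-classification task

# Stand-in for the pooled output of the pre-trained encoder.
pooled_cls = rng.standard_normal(HIDDEN)

# The single task-specific output layer added during fine-tuning:
# logits = W @ h_cls + b, then softmax over the label set.
W = rng.standard_normal((NUM_LABELS, HIDDEN)) * 0.02
b = np.zeros(NUM_LABELS)

logits = W @ pooled_cls + b
probs = np.exp(logits - logits.max())
probs /= probs.sum()

print(probs.shape)  # one probability per label
```

During actual fine-tuning, `W` and `b` are trained jointly with all encoder parameters on the downstream task's labeled data; only this head is new.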