[Spotlight at NeurIPS 2020] Rethinking the Value of Labels for Improving Class-Imbalanced Learning


Oct 30, 2020
NeurIPS 2020 talk: Rethinking the Value of Labels for Improving Class-Imbalanced Learning
Code (data & pretrained models): https://github.com/YyzHarry/imbalanced-semi-self
Project page: https://www.mit.edu/~yuzhe/imbalanced-semi-self.html

We show, both theoretically and empirically, that semi-supervised learning (using unlabeled data) and self-supervised pre-training (pre-training the model with self-supervision before standard training) can substantially improve performance on imbalanced (long-tailed) datasets, regardless of the degree of imbalance in the labeled or unlabeled data and of the base training technique.
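The semi-supervised route described above can be sketched as a simple pseudo-labeling loop: train a base classifier on the (imbalanced) labeled set, pseudo-label an unlabeled pool, then retrain on the combined data. The toy data and nearest-centroid classifier below are illustrative stand-ins, not the paper's actual models, losses, or datasets.

```python
# Sketch of pseudo-labeling for imbalanced data (illustrative only).

def centroids(points, labels):
    """Per-class mean of 1-D feature values."""
    sums, counts = {}, {}
    for x, y in zip(points, labels):
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(model, x):
    """Assign x to the class with the nearest centroid."""
    return min(model, key=lambda y: abs(model[y] - x))

# Imbalanced labeled set: many class-0 examples, a single class-1 example.
labeled_x = [0.0, 0.1, 0.2, 0.3, 5.0]
labeled_y = [0, 0, 0, 0, 1]

# Unlabeled pool drawn from both classes.
unlabeled_x = [0.15, 4.8, 5.2, 5.1]

# Step 1: base model trained on labeled data only.
base = centroids(labeled_x, labeled_y)

# Step 2: pseudo-label the unlabeled pool with the base model.
pseudo_y = [predict(base, x) for x in unlabeled_x]

# Step 3: retrain on labeled + pseudo-labeled data; the tail class
# (class 1) now has more effective training examples.
final = centroids(labeled_x + unlabeled_x, labeled_y + pseudo_y)

print(predict(final, 4.9))  # classify a new tail-class example → 1
```

The pseudo-labels enlarge the effective training set for the rare class, which is one intuition behind the paper's finding that unlabeled data helps under imbalance.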
