Meta-DermDiagnosis: Few-Shot Skin Disease Identification Using Meta-Learning

CVPR 2020

Authors: Kushagra Mahajan, Monika Sharma, Lovekesh Vig

Description: Annotated images for the diagnosis of rare or novel diseases are likely to remain scarce due to small affected patient populations and limited clinical expertise available to annotate images. Deep networks employed for image-based diagnosis therefore need to be robust enough to adapt quickly to novel diseases from only a few annotated images. Further, given the long-tailed class distributions that frequently occur in skin lesion and other disease classification datasets, conventional training approaches generalize poorly to classes at the tail of the distribution because of biased class priors. This paper addresses disease identification and rapid model adaptation in such data-scarce, long-tailed scenarios by exploiting recent advances in meta-learning: a neural network is trained on few-shot image classification tasks drawn from an initial set of class labels (the head classes of the distribution), and is then adapted to classify a set of unseen tail classes. We name the proposed method Meta-DermDiagnosis because it applies meta-learning based few-shot techniques, namely the gradient-based Reptile algorithm and distance-metric-based Prototypical networks, to disease identification on skin lesion datasets. We evaluate the effectiveness of our approach on publicly available skin lesion datasets, namely ISIC 2018, Derm7pt and SD-198, and obtain significant performance improvements over pre-trained models with just a few annotated examples. Further, we incorporate Group Equivariant convolutions (G-convolutions) into the Meta-DermDiagnosis network: since these images generally have no prevailing global orientation or canonical structure, G-convolutions improve disease identification performance by making the network equivariant to discrete transformations such as rotation, reflection and translation.
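
The gradient-based Reptile algorithm mentioned in the abstract can be summarized in a few lines: repeatedly sample a task, run a few steps of ordinary SGD on it, then move the meta-parameters a fraction of the way toward the task-adapted weights. The following is a minimal NumPy sketch on toy linear-regression tasks, not the paper's implementation (which meta-trains a deep image classifier); the function names, the inner loss, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

def inner_sgd(theta, X, y, lr=0.02, steps=5):
    """A few SGD steps on one task (here: least-squares regression as a toy stand-in)."""
    w = theta.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def reptile(theta, tasks, meta_lr=0.1, meta_iters=200):
    """Reptile outer loop: nudge theta toward each task's adapted weights."""
    rng = np.random.default_rng(0)
    for _ in range(meta_iters):
        X, y = tasks[rng.integers(len(tasks))]   # sample a training task
        phi = inner_sgd(theta, X, y)             # adapt on that task
        theta = theta + meta_lr * (phi - theta)  # Reptile meta-update
    return theta
```

Because the update pulls toward every task's solution, the meta-initialization drifts to a point from which each task is reachable in a few gradient steps, which is exactly the property exploited when adapting to unseen tail classes.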
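
The distance-metric alternative, Prototypical networks, classifies a query by comparing its embedding to one "prototype" per class: the mean of that class's few support embeddings. The sketch below shows only that nearest-prototype rule on raw feature vectors; the actual method embeds images with a learned CNN and trains via a softmax over negative distances, and the helper names here are hypothetical.

```python
import numpy as np

def class_prototypes(support_X, support_y):
    """One prototype per class: the mean of that class's support embeddings."""
    classes = np.unique(support_y)
    protos = np.stack([support_X[support_y == c].mean(axis=0) for c in classes])
    return classes, protos

def nearest_prototype(query_X, classes, protos):
    """Label each query with the class of its closest prototype (squared Euclidean)."""
    d2 = ((query_X[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return classes[d2.argmin(axis=1)]
```

Because the prototypes are just per-class means, a new (tail) class can be added at test time from a handful of annotated examples without any gradient updates.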
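
The G-convolution idea can be illustrated in miniature for the rotation group p4: correlate the image with the filter at all four 90-degree rotations, producing one response map per rotation. Rotating the input then merely permutes and rotates these maps (equivariance), so pooling over rotations yields an invariant feature. This NumPy sketch assumes single-channel inputs and 'valid' correlation and is far simpler than the full group-equivariant CNN layers used in the paper; all names are illustrative.

```python
import numpy as np

def correlate2d_valid(image, filt):
    """Plain 'valid'-mode cross-correlation (no padding)."""
    k = filt.shape[0]
    H, W = image.shape
    out = np.empty((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (image[i:i + k, j:j + k] * filt).sum()
    return out

def p4_lifting(image, filt):
    """Lifting layer for p4: correlate with the filter at all four 90-degree rotations."""
    return np.stack([correlate2d_valid(image, np.rot90(filt, r)) for r in range(4)])

def rotation_pooled_feature(image, filt):
    """Max over rotations and positions: invariant to 90-degree rotations of the input."""
    return p4_lifting(image, filt).max()
```

This is why G-convolutions suit lesion images with no canonical orientation: the network's responses transform predictably under rotation instead of treating each orientation as unseen data.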