
Explaining Knowledge Distillation by Quantifying the Knowledge

Sep 29, 2020
Authors: Xu Cheng, Zhefan Rao, Yilan Chen, Quanshi Zhang

Description: This paper presents a method to interpret the success of knowledge distillation by quantifying and analyzing the task-relevant and task-irrelevant visual concepts encoded in the intermediate layers of a deep neural network (DNN). More specifically, three hypotheses are proposed:

1. Knowledge distillation makes the DNN learn more visual concepts than learning from raw data.
2. Knowledge distillation ensures that the DNN is prone to learning various visual concepts simultaneously, whereas a DNN learning from raw data learns visual concepts sequentially.
3. Knowledge distillation yields more stable optimization directions than learning from raw data.

Accordingly, we design three types of mathematical metrics to evaluate the feature representations of the DNN. In experiments, we diagnosed various DNNs, and the above hypotheses were verified.
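For context, the knowledge distillation setup analyzed in the paper is the standard one: a student network is trained on temperature-softened teacher outputs in addition to the ground-truth labels. The sketch below shows this classic distillation loss (the KL term from Hinton et al.), not the paper's own quantification metrics; the names `student_logits`, `teacher_logits`, and the values of `T` and `alpha` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard KD objective: softened KL term against the teacher
    plus the usual hard cross-entropy term.

    T (temperature) and alpha (mixing weight) are illustrative
    hyperparameters, not values taken from the paper.
    """
    # Soften both distributions with temperature T; kl_div expects
    # log-probabilities for the student and probabilities for the teacher.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients match the hard-label term
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

The paper's metrics compare the intermediate-layer representations of a student trained with this objective against one trained on raw data alone.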
