M2SGD: Learning to Learn Important Weights
Authors: Nicholas I-Hsien Kuo, Mehrtash Harandi, Nicolas Fourrier, Christian Walder, Gabriela Ferraro, Hanna Suominen

Description: Meta-learning concerns rapid knowledge acquisition. One popular approach casts optimisation itself as a learning problem, and learnt neural optimisers have been shown to update base learners more quickly than their handcrafted counterparts. In this paper, we learn an optimisation rule that sparsely updates the learner's parameters and removes redundant weights. We present Masked Meta-SGD (M2SGD), a neural optimiser that not only updates learners quickly, but also removes 83.71% of the weights of ResNet20s.
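To make the idea concrete, here is a minimal, hypothetical sketch of a masked, per-parameter update in the spirit of Meta-SGD with an added sparsity mask. The names (`theta`, `alpha`, `mask`) and the exact update form are illustrative assumptions, not the authors' implementation; in M2SGD the step sizes and masking would themselves be learnt, whereas here they are fixed for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = rng.normal(size=5)        # base-learner weights
alpha = np.full(5, 0.1)           # per-parameter step sizes (learnt in Meta-SGD)
mask = np.array([1, 0, 1, 1, 0])  # binary mask: zeros freeze/prune those weights
grad = np.ones(5)                 # stand-in gradient of the task loss

# Masked update: only unmasked weights move; masked weights receive no update
# and can ultimately be removed from the network.
theta_new = theta - mask * alpha * grad

# Masked entries are unchanged; unmasked entries took a step of size alpha.
print(np.allclose(theta_new[mask == 0], theta[mask == 0]))  # True
```

A learnt optimiser would produce `alpha` and the update direction from a small neural network conditioned on the gradients, and the mask would be driven toward sparsity during meta-training rather than hand-set as above.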
