Non-Convex SGD Learns Halfspaces with Adversarial Label Noise

NeurIPS 2020


Dec 06, 2020
We study the problem of agnostically learning homogeneous halfspaces in the distribution-specific PAC model. For a broad family of structured distributions, including log-concave distributions, we show that non-convex SGD efficiently converges to a solution with misclassification error $O(\mathrm{opt}) + \epsilon$, where $\mathrm{opt}$ is the misclassification error of the best-fitting halfspace. In sharp contrast, we show that optimizing any convex surrogate inherently leads to misclassification error of $\omega(\mathrm{opt})$, even under Gaussian marginals.

Speakers: Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
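To make the contrast concrete, here is a minimal illustrative sketch (not the authors' exact algorithm or guarantees): projected SGD on a bounded, non-convex sigmoidal surrogate for a homogeneous halfspace $\mathrm{sign}(\langle w, x\rangle)$ under Gaussian marginals with randomly flipped labels standing in for noise. The surrogate, step size, temperature parameter `s`, and data model are all assumptions chosen for illustration; the key qualitative point is that the bounded surrogate caps the influence of any single noisy label, unlike an unbounded convex loss such as hinge.

```python
import numpy as np

# Illustrative sketch only: learn a homogeneous halfspace sign(<w, x>) with
# projected SGD on the non-convex surrogate L(w) = E[ sigma(-y <w, x> / s) ],
# where sigma is the logistic sigmoid. All hyperparameters are assumptions.

rng = np.random.default_rng(0)

d, n = 5, 20000
w_star = np.zeros(d)
w_star[0] = 1.0                        # ground-truth halfspace direction
X = rng.standard_normal((n, d))        # Gaussian marginals
y = np.sign(X @ w_star)
flip = rng.random(n) < 0.05            # flip 5% of labels, so opt ~ 0.05
y[flip] *= -1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = rng.standard_normal(d)
w /= np.linalg.norm(w)                 # start on the unit sphere
s, lr = 0.5, 0.1                       # temperature and step size (assumed)

for t in range(n):                     # one pass of single-sample SGD
    m = y[t] * (w @ X[t])              # signed margin
    sig = sigmoid(-m / s)
    # gradient of sigma(-m/s) w.r.t. w; bounded, so outliers have limited pull
    g = -sig * (1.0 - sig) * y[t] * X[t] / s
    w -= lr * g
    w /= np.linalg.norm(w)             # project back to the unit sphere

err = np.mean(np.sign(X @ w) != y)
print(f"misclassification error: {err:.3f}")
```

On this synthetic instance the learned direction ends up well aligned with `w_star`, and the misclassification error lands near the 5% noise floor, which is the qualitative $O(\mathrm{opt})$ behavior the abstract describes.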
