Optimal Robust Learning of Discrete Distributions from Batches

ICML 2020


Jul 12, 2020
Let $d$ be the smallest $L_1$ distance to which a $k$-symbol distribution $p$ can be estimated from $m$ batches of $n$ samples each, when up to a $\beta$ fraction of the batches may be adversarial. For $\beta<1/2$, Qiao and Valiant (2017) showed that $d=\Omega(\beta/\sqrt{n})$, and that achieving this distance requires $m=\Omega(k/\beta^2)$ batches. For $\beta<1/900$, they provided an algorithm that is order-optimal in both $d$ and $m$ but runs in time exponential in $k$. For all $\beta<1/2$, we propose an algorithm with comparably optimal $d$ and $m$ whose run time is polynomial in $k$ and all other parameters. Speakers: Ayush Jain, Alon Orlitsky
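To make the setting concrete, here is a minimal sketch of the data model the abstract describes: $m$ batches of $n$ samples each, of which a $\beta$ fraction are adversarial, together with the naive pooled empirical estimator. The function names and the choice of adversary (drawing from a fixed alternative distribution `adversary_q`) are illustrative assumptions, not the paper's algorithm.

```python
import random

def make_batches(p, m, n, beta, adversary_q, seed=0):
    """Generate m batches of n samples from a k-symbol distribution p.
    A beta fraction of the batches are adversarial: here, as a simple
    illustrative adversary, they draw from adversary_q instead of p."""
    rng = random.Random(seed)
    k = len(p)
    num_bad = int(beta * m)
    batches = []
    for i in range(m):
        q = adversary_q if i < num_bad else p
        batches.append(rng.choices(range(k), weights=q, k=n))
    return batches

def pooled_l1_error(p, batches):
    """L1 distance between p and the naive estimate that pools all
    samples, ignoring batch structure. This baseline is NOT robust."""
    k = len(p)
    counts = [0] * k
    total = 0
    for batch in batches:
        for s in batch:
            counts[s] += 1
            total += 1
    return sum(abs(counts[j] / total - p[j]) for j in range(k))
```

With a $\beta$ fraction of batches drawn from a shifted distribution $q$, the pooled estimate incurs $L_1$ error on the order of $\beta\,\|p-q\|_1$, independent of $n$; the point of the robust algorithms above is to exploit the batch structure and drive the error down to the optimal $\Theta(\beta/\sqrt{n})$ rate.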
