The Generalized Lasso with Nonlinear Observations and Generative Priors

NeurIPS 2020


Dec 06, 2020

In this paper, we study the problem of signal estimation from noisy nonlinear measurements when the unknown $n$-dimensional signal is in the range of an $L$-Lipschitz continuous generative model with bounded $k$-dimensional inputs. We make the assumption of sub-Gaussian measurements, which is satisfied by a wide range of measurement models, such as linear, logistic, 1-bit, and other quantized models. In addition, we consider the impact of adversarial corruptions on these measurements. Our analysis is based on a generalized Lasso approach (Plan and Vershynin, 2016). We first provide a non-uniform recovery guarantee, which states that under i.i.d. Gaussian measurements, roughly $O\left(\frac{k}{\epsilon^2}\log L\right)$ samples suffice for recovery with an $\ell_2$-error of $\epsilon$, and that this scheme is robust to adversarial noise. We then apply this result to neural network generative models and discuss various extensions to other models and non-i.i.d. measurements. Moreover, we show that our result extends to a uniform recovery guarantee whenever a so-called local embedding property holds. For instance, under 1-bit measurements, this recovers an existing $O\left(\frac{k}{\epsilon^2}\log L\right)$ sample complexity bound with the advantage of using an algorithm that is more amenable to practical implementation.

Speakers: Zhaoqiang Liu, Jonathan Scarlett
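Concretely, the generalized Lasso estimator in this setting constrains a Lasso-style objective to the range of the generative model: given measurements $y$ and a sensing matrix $A$, it roughly takes the form $\hat{x} = \arg\min_{x \in G(B_2^k(r))} \|y - Ax\|_2$, where $G(B_2^k(r))$ is the image of the radius-$r$ latent ball under the generator $G$. The sketch below illustrates this on 1-bit measurements by optimizing over the latent variable $z$ with projected gradient descent, a common practical surrogate for the constrained problem rather than the paper's exact procedure; the small random ReLU network standing in for a trained generator, the problem dimensions, and the step size are all illustrative assumptions.

import torch

torch.manual_seed(0)

k, n, m = 20, 200, 500       # latent dim, signal dim, number of measurements (illustrative)
radius = 1.0                 # bound r on the latent ball B_2^k(r)

# Stand-in generative model G: a small random ReLU network with frozen weights
# (a trained generator would be used in practice).
G = torch.nn.Sequential(
    torch.nn.Linear(k, 100),
    torch.nn.ReLU(),
    torch.nn.Linear(100, n),
)
for p in G.parameters():
    p.requires_grad_(False)

# Ground-truth signal in the range of G, and i.i.d. Gaussian measurement matrix A.
z_star = torch.randn(k)
z_star *= radius / z_star.norm()
x_star = G(z_star)
A = torch.randn(m, n) / m ** 0.5

# 1-bit (sign) observations: one of the sub-Gaussian measurement models covered above.
y = torch.sign(A @ x_star)

# Generalized Lasso over the range of G: min ||y - A G(z)||_2 s.t. ||z||_2 <= r,
# solved approximately by projected gradient descent in latent space.
z = torch.zeros(k, requires_grad=True)
opt = torch.optim.Adam([z], lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = torch.linalg.norm(y - A @ G(z))
    loss.backward()
    opt.step()
    with torch.no_grad():    # project z back onto the bounded latent ball
        if z.norm() > radius:
            z *= radius / z.norm()

x_hat = G(z).detach()
# 1-bit measurements lose the signal's norm, so compare directions only.
err = torch.linalg.norm(x_hat / x_hat.norm() - x_star / x_star.norm())
print(f"normalized l2 error: {err:.3f}")

Since $G$ is nonconvex, gradient descent in $z$ can get stuck in poor local minima; random restarts over the initial $z$ are a standard remedy in this line of work.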
