Incremental Sampling Without Replacement for Sequence Models

ICML 2020

Jul 12, 2020
Sampling is a fundamental technique, and sampling without replacement is often desirable when duplicate samples are not beneficial. Within machine learning, sampling is useful for generating diverse outputs from a trained model. We present an elegant procedure for sampling without replacement from a broad class of randomized programs, including generative neural models that construct outputs sequentially. Our procedure is efficient even for exponentially large output spaces. Unlike prior work, our approach is incremental, i.e., samples can be drawn one at a time, allowing for increased flexibility. We also present a new estimator for computing expectations from samples drawn without replacement. We show that incremental sampling without replacement is applicable to many domains, e.g., program synthesis and combinatorial optimization.

Speakers: Kensen Shi, David Bieber, Charles Sutton
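
The abstract does not spell out the mechanism, but one way to realize incremental sampling without replacement over sequences is to track, in a trie, how much probability mass has already been drawn beneath each prefix, and to renormalize each step's conditional distribution to exclude exhausted continuations. The sketch below illustrates this idea for a toy autoregressive model with fixed-length outputs; it is not the authors' implementation, and all names (sample_without_replacement, step_probs, toy_probs) are hypothetical.

# A minimal sketch (not the authors' implementation) of incremental
# sampling without replacement from an autoregressive model, by tracking
# how much probability mass has already been sampled beneath each trie
# node and renormalizing at each step. All names here are hypothetical.
import random

def sample_without_replacement(step_probs, max_len, num_samples):
    """Draw up to `num_samples` distinct length-`max_len` sequences.

    `step_probs(prefix)` returns the model's conditional distribution
    over the next token (a list of probabilities) given a token tuple.
    """
    used = {}      # prefix tuple -> probability mass already sampled beneath it
    samples = []
    for _ in range(num_samples):
        prefix, prob = (), 1.0
        while len(prefix) < max_len:
            probs = step_probs(prefix)
            # Unsampled mass remaining beneath each child of this node.
            remaining = [prob * p - used.get(prefix + (t,), 0.0)
                         for t, p in enumerate(probs)]
            total = sum(remaining)
            if total <= 0:
                return samples  # every sequence under this prefix is exhausted
            # Pick the next token proportionally to its remaining mass.
            r = random.uniform(0, total)
            for t, mass in enumerate(remaining):
                r -= mass
                if r <= 0:
                    break
            prefix += (t,)
            prob *= probs[t]
        samples.append(prefix)
        # Mark the finished sequence as used along its whole trie path,
        # so no future draw can produce it again.
        for i in range(1, len(prefix) + 1):
            used[prefix[:i]] = used.get(prefix[:i], 0.0) + prob
    return samples

# Toy two-step model over a 3-token vocabulary, for illustration only.
def toy_probs(prefix):
    return [0.5, 0.3, 0.2] if not prefix else [0.2, 0.5, 0.3]

print(sample_without_replacement(toy_probs, max_len=2, num_samples=5))

In this sketch, each completed sequence's probability is subtracted along its trie path, so the first draw follows the model exactly and every later draw follows the model renormalized over the not-yet-sampled sequences. Because samples are produced one at a time, the number of draws need not be fixed in advance, which is the incremental flexibility the abstract describes.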
