Tuan Anh Le

variational inference with normalizing flows

10 January 2017

notes on (Rezende & Mohamed, 2015).

summary

couldn’t find code… the closest thing was https://github.com/casperkaae/parmesan/issues/22

setup: probabilistic model $p_\theta(z, x) = p_\theta(z) \, p_\theta(x \mid z)$ of latents $z$ and observations $x$, parametrized by model parameters $\theta$. we are interested in the posterior $p_\theta(z \mid x)$, which is typically intractable.

the main problem addressed by this paper is choosing the family of variational approximations $\mathcal{Q}$ so that the true posterior $p_\theta(z \mid x) \in \mathcal{Q}$. the usual way of doing this is to fix a parametrized family $\mathcal{Q} = \{q_\phi : \phi \in \Phi\}$, most commonly mean-field. the authors argue that such families are usually too restrictive to cover the true posterior.
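
for concreteness, a minimal example of a mean-field family (my illustration, not the paper’s notation) is the fully factorized gaussian:

$$q_\phi(z) = \prod_{d=1}^{D} \mathcal{N}(z_d \mid \mu_d, \sigma_d^2), \qquad \phi = (\mu_1, \sigma_1^2, \dots, \mu_D, \sigma_D^2),$$

which cannot represent correlations between the dimensions of $z$, let alone multimodality.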

instead, $\mathcal{Q}$ is induced by successive transformations $f_1, \dots, f_K$ applied to a sample $z_0$ from some initial distribution $q_0(z_0)$. the density of the resulting $q_K$ thus picks up jacobian-determinant terms (which must be cheap to evaluate). they derive the ELBO in this setting (eqn 15), sketched below.
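
concretely (my reconstruction from the change-of-variables formula; the paper’s eqn 15 writes out the planar-flow special case): if $z_K = f_K \circ \dots \circ f_1(z_0)$ with $z_0 \sim q_0$, then

$$\ln q_K(z_K) = \ln q_0(z_0) - \sum_{k=1}^{K} \ln \left| \det \frac{\partial f_k}{\partial z_{k-1}} \right|,$$

and substituting this into the standard ELBO gives the flow version of the free energy:

$$\mathcal{F}(x) = \mathbb{E}_{q_0(z_0)} \left[ \ln q_0(z_0) - \ln p_\theta(x, z_K) - \sum_{k=1}^{K} \ln \left| \det \frac{\partial f_k}{\partial z_{k-1}} \right| \right].$$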

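here is a minimal numpy sketch of the paper’s planar flow $f(z) = z + u \, h(w^\top z + b)$ with $h = \tanh$; the implementation and variable names are mine, and i omit the constraint $w^\top u \geq -1$ that the paper imposes to guarantee invertibility:

    import numpy as np

    def planar_flow(z, u, w, b):
        # f(z) = z + u * tanh(w^T z + b), applied row-wise to z of shape [n, d].
        # returns the transformed samples and ln|det df/dz| per sample:
        # ln|1 + u^T psi(z)| with psi(z) = (1 - tanh(w^T z + b)^2) * w.
        a = z @ w + b                             # [n]
        z_new = z + np.outer(np.tanh(a), u)       # [n, d]
        psi = np.outer(1.0 - np.tanh(a) ** 2, w)  # [n, d]
        log_det = np.log(np.abs(1.0 + psi @ u))   # [n]
        return z_new, log_det

    # compose K flow steps starting from q_0 = N(0, I), tracking ln q_K(z_K)
    rng = np.random.default_rng(0)
    n, d, K = 1000, 2, 4
    z = rng.standard_normal((n, d))
    log_q = -0.5 * (z ** 2).sum(axis=1) - 0.5 * d * np.log(2.0 * np.pi)
    for _ in range(K):
        u, w = rng.standard_normal(d), rng.standard_normal(d)
        b = rng.standard_normal()
        z, log_det = planar_flow(z, u, w, b)
        log_q -= log_det  # ln q_k(z_k) = ln q_{k-1}(z_{k-1}) - ln|det J_k|

averaging $\ln p_\theta(x, z_K) - \ln q_K(z_K)$ over these samples gives a monte carlo estimate of the free energy above.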

references

  1. Rezende, D., & Mohamed, S. (2015). Variational Inference with Normalizing Flows. Proceedings of the 32nd International Conference on Machine Learning (ICML-15), 1530–1538.
    @inproceedings{rezende2015variational,
      title = {Variational Inference with Normalizing Flows},
      author = {Rezende, Danilo and Mohamed, Shakir},
      booktitle = {Proceedings of the 32nd International Conference on Machine Learning (ICML-15)},
      pages = {1530--1538},
      year = {2015},
      link = {http://jmlr.org/proceedings/papers/v37/rezende15.pdf}
    }
    
