# variational inference with normalizing flows

*10 January 2017*

notes on (Rezende & Mohamed, 2015).

## summary

couldn’t find code… the closest thing was https://github.com/casperkaae/parmesan/issues/22

setup: a probabilistic model \(p_{\theta}(z, x) = p_{\theta}(z) p_{\theta}(x \given z)\) over latents \(z\) and observations \(x\), parametrized by model parameters \(\theta\). we are interested in the posterior \(p_{\theta}(z \given x)\).

the main problem addressed by this paper is choosing the family of variational approximations \(\mathcal Q\) so that it actually contains (or comes close to) the true posterior \(p_{\theta}(z \given x)\). the usual way of doing this is to fix a parametrized family \(\mathcal Q = \{q_{\phi}: \phi \in \Phi\}\), most commonly mean-field. the authors argue that such simple families rarely cover the true posterior.
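for concreteness (my notation, not taken from the paper): the mean-field choice factorizes the approximation across latent dimensions, typically as a diagonal Gaussian,

```latex
q_{\phi}(z \given x) = \prod_{i=1}^{D} \mathcal N\!\bigl(z_i \given \mu_i(x), \sigma_i^2(x)\bigr),
```

which by construction cannot represent correlations between latent dimensions or multimodality in the posterior.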

instead, \(\mathcal Q\) is induced by pushing a sample \(z_0\) from some simple initial distribution \(q_0(z_0)\) through successive invertible transformations \(f_1, \dotsc, f_K\). by the change-of-variables formula, the density of \(z_K = f_K \circ \dotsb \circ f_1(z_0)\) picks up log-det-Jacobian terms, \(\ln q_K(z_K) = \ln q_0(z_0) - \sum_{k=1}^{K} \ln \bigl| \det \tfrac{\partial f_k}{\partial z_{k-1}} \bigr|\), which must therefore be cheap to evaluate. they derive the ELBO in this setting (eqn 15).
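a minimal NumPy sketch of one of the paper’s flows, the planar flow \(f(z) = z + u \, h(w^{\top} z + b)\) with \(h = \tanh\), whose log-det-Jacobian is \(\ln\lvert 1 + u^{\top} \psi(z)\rvert\) with \(\psi(z) = h'(w^{\top} z + b)\, w\). the function and variable names here are mine, not from any released code:

```python
import numpy as np

def planar_flow(z, u, w, b):
    """Apply one planar flow step to a batch of samples z of shape (n, d).

    Returns (f(z), log|det Jacobian|) per sample, following
    f(z) = z + u * tanh(w^T z + b).
    """
    a = z @ w + b                            # (n,) pre-activations w^T z + b
    f_z = z + np.outer(np.tanh(a), u)        # (n, d) transformed samples
    psi = np.outer(1.0 - np.tanh(a) ** 2, w) # (n, d) psi(z) = h'(a) w
    log_det = np.log(np.abs(1.0 + psi @ u))  # (n,) log|det df/dz|
    return f_z, log_det

rng = np.random.default_rng(0)
d = 2
z0 = rng.standard_normal((5, d))             # samples from q_0 = N(0, I)
u, w, b = rng.standard_normal(d), rng.standard_normal(d), 0.1
zK, log_det = planar_flow(z0, u, w, b)

# density of the transformed samples under the flow:
# log q_1(z_1) = log q_0(z_0) - log|det df/dz_0|
log_q0 = -0.5 * (z0 ** 2).sum(axis=1) - 0.5 * d * np.log(2.0 * np.pi)
log_q1 = log_q0 - log_det
```

note that the paper constrains \(w^{\top} u \ge -1\) (via a reparametrization of \(u\)) so that the map stays invertible; the sketch above omits that detail.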

## references

- Rezende, D., & Mohamed, S. (2015). Variational Inference with Normalizing Flows. *Proceedings of the 32nd International Conference on Machine Learning (ICML-15)*, 1530–1538. http://jmlr.org/proceedings/papers/v37/rezende15.pdf

  @inproceedings{rezende2015variational,
    title     = {Variational Inference with Normalizing Flows},
    author    = {Rezende, Danilo and Mohamed, Shakir},
    booktitle = {Proceedings of the 32nd International Conference on Machine Learning (ICML-15)},
    pages     = {1530--1538},
    year      = {2015},
    link      = {http://jmlr.org/proceedings/papers/v37/rezende15.pdf}
  }
