by Blei and Jordan (Bayesian Analysis, 2006)
For a quick primer on Dirichlet processes and their use in mixture modeling, see my notes on DPs.
Assuming the observable data is drawn from an exponential family distribution and the base distribution is the conjugate prior, we have a nice probabilistic model:
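Spelled out, the model is the standard stick-breaking representation of a DP mixture (as in Blei and Jordan), with concentration parameter \(\alpha\) and base distribution \(G_0\):

\[\begin{aligned} V_k \mid \alpha &\sim \mathrm{Beta}(1, \alpha), \quad k = 1, 2, \ldots \\ \eta_k^* \mid G_0 &\sim G_0 \\ Z_n \mid \underline{V} &\sim \mathrm{Mult}(\pi(\underline{V})) \\ X_n \mid z_n &\sim p(x_n \mid \eta_{z_n}^*) \end{aligned}\]

where the mixing proportions are \(\pi_k(\underline{V}) = V_k \prod_{j=1}^{k-1}(1 - V_j)\).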
In constructing the variational family, we take the usual approach of breaking dependencies between latent variables that make computing the posterior difficult. Our variational family is
\[q(\underline{V}, \underline{\eta^*}, \underline{Z}) = \prod_{k=1}^{K-1} q_{\gamma_k}(V_k) \prod_{k=1}^K q_{\tau_k}(\eta_k^*)\prod_{n=1}^N q_{\phi_n}(z_n)\] where \(K\) is the variational truncation level (the number of mixture components retained in \(q\)) and \(\{\gamma_k\} \cup \{\tau_k\} \cup \{\phi_n\}\) are our variational parameters.
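As a concrete sketch (not from the paper; the variable names and randomly initialized Beta parameters are illustrative assumptions), here is how the expected log stick-breaking weights \(\mathbb{E}_q[\log \pi_k]\) fall out of the truncated variational family — these expectations are the quantities that feed the coordinate-ascent update for each \(\phi_n\):

```python
# Expected log stick-breaking weights under the truncated variational family.
# q(V_k) = Beta(gamma1[k], gamma2[k]) for k = 1..K-1; truncation fixes V_K = 1.
import numpy as np
from scipy.special import digamma

K = 5  # variational truncation level (illustrative choice)
rng = np.random.default_rng(0)
gamma1 = rng.uniform(0.5, 2.0, size=K - 1)  # placeholder Beta parameters
gamma2 = rng.uniform(0.5, 2.0, size=K - 1)

# Standard Beta expectations:
#   E_q[log V_k]     = digamma(gamma1_k) - digamma(gamma1_k + gamma2_k)
#   E_q[log(1-V_k)]  = digamma(gamma2_k) - digamma(gamma1_k + gamma2_k)
e_log_v = digamma(gamma1) - digamma(gamma1 + gamma2)
e_log_1mv = digamma(gamma2) - digamma(gamma1 + gamma2)

# E_q[log pi_k] = E[log V_k] + sum_{j<k} E[log(1 - V_j)], with log V_K = 0.
e_log_v_full = np.append(e_log_v, 0.0)
cum_1mv = np.concatenate(([0.0], np.cumsum(e_log_1mv)))
e_log_pi = e_log_v_full + cum_1mv

# By Jensen's inequality the exponentiated expectations sum to less than 1,
# even though the pi_k themselves sum to 1 under the truncation.
weights = np.exp(e_log_pi)
print(weights, weights.sum())
```

In the CAVI update for \(\phi_{n,k}\), these terms enter additively alongside the expected log-likelihood of \(x_n\) under component \(k\) before the softmax normalization.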
tags: dirichlet-process - variational-inference - mixture-models - bayesian-nonparametrics