
Rylan Schaeffer



“Humanity does not ask us to be happy. It merely asks us to be brilliant on its behalf.”

21 March 2021

Variational Inference for Dirichlet Process Mixtures

by Blei & Jordan (Bayesian Analysis 2006)

Research Questions

Background

For a quick primer on Dirichlet processes and their use in mixture modeling, see my notes on DPs

Approach

Assuming the observable data are drawn from an exponential-family distribution and the base distribution is the conjugate prior, we have a clean probabilistic model:

  1. Draw stick-breaking proportions $V_i \mid \alpha \sim \text{Beta}(1, \alpha)$. Let $\mathbf{V} = \{V_1, V_2, ...\}$.
  2. Draw parameters for the mixture components $\eta_i \mid G_0 \sim G_0$, where $G_0$ is the base measure of the DP. Let $\mathbf{\eta} = \{\eta_1, \eta_2, ...\}$.
  3. For the $n = 1, ..., N$-th data point:
    • Draw $Z_n \mid \mathbf{V} \sim \text{Mult}(\pi(\mathbf{V}))$
    • Draw $X_n \mid Z_n \sim p(x_n \mid \eta_{z_n})$
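The three generative steps above can be sketched in code. For concreteness this sketch assumes a Gaussian base measure $G_0$ and unit-variance Gaussian emissions, and truncates the infinite stick-breaking process at a finite level; none of these choices is fixed by the model itself.

```python
import numpy as np

rng = np.random.default_rng(0)

alpha = 1.0  # DP concentration parameter
N = 500      # number of data points
T = 50       # truncation level for sampling (illustrative choice)

# Step 1: stick-breaking proportions V_i | alpha ~ Beta(1, alpha).
V = rng.beta(1.0, alpha, size=T)
V[-1] = 1.0  # truncate so the mixing weights sum to one

# Mixing weights pi_i(V) = V_i * prod_{j<i} (1 - V_j).
pi = V * np.concatenate(([1.0], np.cumprod(1.0 - V[:-1])))

# Step 2: component parameters eta_i ~ G_0.
# Here G_0 = N(0, 5^2) is an assumed Gaussian base measure.
eta = rng.normal(0.0, 5.0, size=T)

# Step 3: for each data point, draw an assignment Z_n,
# then an observation X_n from the assigned component
# (unit-variance Gaussian likelihood, also an assumption).
Z = rng.choice(T, size=N, p=pi)
X = rng.normal(eta[Z], 1.0)
```

Because draws from $\text{Beta}(1, \alpha)$ shrink the remaining stick geometrically in expectation, most of the mass concentrates on the first few components, so the truncation is typically harmless for moderate $T$.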

Figure 1 (graphical model of the DP mixture)

In constructing the variational family, we take the usual approach of breaking dependencies between latent variables that make computing the posterior difficult. Our variational family is

$$q(\mathbf{V}, \mathbf{\eta}, \mathbf{Z}) = \prod_{k=1}^{K-1} q_{\gamma_k}(V_k) \prod_{k=1}^{K} q_{\tau_k}(\eta_k) \prod_{n=1}^{N} q_{\phi_n}(z_n)$$

where $K$ is the variational truncation level on the number of mixture components and $\{\gamma_k\}$, $\{\tau_k\}$, $\{\phi_n\}$ are our variational parameters.
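As a rough sketch of what these variational parameters look like in code: each $q(V_k)$ is a Beta with parameters $\gamma_{k,1}, \gamma_{k,2}$, each $q(\eta_k)$ lives in the conjugate family (taken here to be a Gaussian, an illustrative assumption), and each $q(z_n)$ is a multinomial over the $K$ truncated components. The expected log stick weights use the standard Beta identity $\mathbb{E}[\log V_k] = \psi(\gamma_{k,1}) - \psi(\gamma_{k,1} + \gamma_{k,2})$.

```python
import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(1)

K = 20   # variational truncation level
N = 500  # number of data points

# q(V_k) = Beta(gamma1[k], gamma2[k]) for k = 1, ..., K-1;
# the K-th stick is fixed to 1 under truncation.
gamma1 = np.ones(K - 1)
gamma2 = np.ones(K - 1)

# q(eta_k): one mean/variance per component, assuming a
# Gaussian conjugate family (a hypothetical concrete choice).
tau_mean = rng.normal(0.0, 1.0, size=K)
tau_var = np.ones(K)

# q(z_n) = Mult(phi_n): an N x K matrix of responsibilities.
phi = rng.dirichlet(np.ones(K), size=N)

# Expected log mixing weights under q:
# E[log pi_k] = E[log V_k] + sum_{j<k} E[log(1 - V_j)].
E_log_V = digamma(gamma1) - digamma(gamma1 + gamma2)
E_log_1mV = digamma(gamma2) - digamma(gamma1 + gamma2)
E_log_pi = (np.concatenate((E_log_V, [0.0]))
            + np.concatenate(([0.0], np.cumsum(E_log_1mV))))
```

In the paper's coordinate-ascent scheme, each block of parameters ($\gamma$, $\tau$, $\phi$) is updated in turn while the others are held fixed, and quantities like `E_log_pi` feed into the responsibility updates.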

tags: dirichlet-process - variational-inference - mixture-models - bayesian-nonparametrics