A novel view of the variational Bayesian clustering

Authors: Takashi Tomoki

Affiliation: (a) Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute (BSI), 2-1 Hirosawa, Wako, Saitama 351-0198, Japan; (b) Department of Complexity Science and Engineering, University of Tokyo, 5-1-5 Kashiwanoha, Kashiwa, Chiba 277-8561, Japan

Abstract: We prove that the evaluation function of variational Bayesian (VB) clustering algorithms can be described as the log likelihood of the given data minus the Kullback–Leibler (KL) divergence between the prior and the posterior over the model parameters. In this novel formalism of VB, the evaluation function can be explicitly interpreted as an information criterion for model selection, in which the KL divergence imposes a heavy penalty on posteriors far from the prior. We derive the update equations of variational Bayesian clustering with a finite mixture of Student's t-distributions, taking the penalty term for the degrees of freedom into account.

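The decomposition stated in the abstract can be sketched in standard VB notation (the symbols below are assumed, not taken from the paper): for data $X$, parameters $\theta$, prior $p(\theta)$, and approximate posterior $q(\theta)$, the VB evaluation function (free energy) lower-bounds the evidence as

$$
\ln p(X) \;\ge\; \mathcal{F}[q] \;=\; \mathbb{E}_{q(\theta)}\!\left[\ln p(X \mid \theta)\right] \;-\; \mathrm{KL}\!\left(q(\theta)\,\|\,p(\theta)\right),
$$

i.e. an expected log likelihood term minus a KL penalty that grows as the posterior moves away from the prior, which is what allows $\mathcal{F}$ to be read as an information criterion for model selection.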
Keywords: Unsupervised learning; Bayesian estimation; Variational approximation; Model selection; Robust variational Bayes
|