A novel view of the variational Bayesian clustering
Authors: Takashi, Tomoki
Affiliation: (a) Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute (BSI), 2-1 Hirosawa, Wako, Saitama 351-0198, Japan; (b) Department of Complexity Science and Engineering, University of Tokyo, 5-1-5 Kashiwanoha, Kashiwa, Chiba 277-8561, Japan
Abstract: We prove that the evaluation function of variational Bayesian (VB) clustering algorithms can be written as the log likelihood of the given data minus the Kullback–Leibler (KL) divergence between the prior and the posterior of the model parameters. In this novel formalism of VB, the evaluation function can be explicitly interpreted as an information criterion for model selection, and the KL divergence imposes a heavy penalty on posteriors far from the prior. We derive the update process of variational Bayesian clustering with a finite mixture of Student's t-distributions, taking the penalty term for the degrees of freedom into account.
Keywords: Unsupervised learning; Bayesian estimation; Variational approximation; Model selection; Robust variational Bayes
This article is indexed in ScienceDirect and other databases.
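The decomposition stated in the abstract (evaluation function = expected log likelihood minus the KL divergence between posterior and prior) can be illustrated numerically. The sketch below is not the paper's Student's t mixture model; it is a minimal toy example, assuming a conjugate univariate Gaussian model (unit-variance likelihood, standard-normal prior on the mean), where the exact posterior is available in closed form. In that case the bound is tight, so the free energy equals the log marginal likelihood, which we check against a direct evaluation.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, size=5)   # toy data (hypothetical, for illustration)
n = len(x)

# Conjugate update: prior mu ~ N(0, 1), likelihood x_i | mu ~ N(mu, 1).
s2 = 1.0 / (1.0 + n)               # posterior variance of mu
m = s2 * x.sum()                   # posterior mean of mu

# Expected log likelihood under q(mu) = N(m, s2):
# E_q[(x_i - mu)^2] = (x_i - m)^2 + s2
exp_loglik = np.sum(-0.5 * np.log(2 * np.pi) - 0.5 * ((x - m) ** 2 + s2))

# KL(q || prior) between univariate Gaussians N(m, s2) and N(0, 1).
kl = 0.5 * (s2 + m ** 2 - 1.0 - np.log(s2))

# The VB evaluation function: log likelihood term minus the KL penalty,
# which pushes the posterior toward the prior.
free_energy = exp_loglik - kl

# Because q is the exact posterior here, the bound is tight and equals
# log p(x); the marginal of x is N(0, I + 11^T).
log_evidence = multivariate_normal(np.zeros(n),
                                   np.eye(n) + np.ones((n, n))).logpdf(x)
print(free_energy, log_evidence)
```

With an approximate (non-exact) posterior, `free_energy` would fall strictly below `log_evidence`; the KL term is what penalizes posteriors far from the prior, matching the model-selection interpretation described above.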