PAC-Bayesian Compression Bounds on the Prediction Error of Learning Algorithms for Classification
Authors: Thore Graepel, Ralf Herbrich, John Shawe-Taylor
Affiliations: (1) Microsoft Research, Cambridge, UK; (2) School of Electronics and Computer Science, University of Southampton, UK
Abstract: We consider bounds on the prediction error of classification algorithms based on sample compression. We refine the notion of a compression scheme to distinguish between permutation-and-repetition-invariant and non-permutation-and-repetition-invariant compression schemes, which lead to different prediction error bounds. We also extend known compression results to the case of non-zero empirical risk.

We provide bounds on the prediction error of classifiers returned by mistake-driven online learning algorithms by interpreting mistake bounds as bounds on the size of the algorithm's compression scheme. This leads to a bound on the prediction error of perceptron solutions that depends on the margin a support vector machine would achieve on the same training sample.

Furthermore, using the compression property we derive bounds on the average prediction error of kernel classifiers in the PAC-Bayesian framework. These bounds assume a prior measure over the expansion coefficients in the data-dependent kernel expansion and bound the average prediction error uniformly over subsets of the space of expansion coefficients.
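As background for the refinement described in the abstract, the following is a minimal sketch of the classical zero-empirical-risk sample compression bound in the style of Littlestone and Warmuth; the constants and the union bound over the compression size are assumptions of this sketch, not the paper's refined statement.

```latex
% Hedged sketch of the classical sample compression bound:
% if the hypothesis h returned by a learning algorithm can be
% reconstructed from a subsequence ("compression set") of size d
% out of m training examples, and h makes no errors on the
% remaining m - d examples, then with probability at least
% 1 - \delta over an i.i.d. sample,
\[
  R(h) \;\le\; \frac{1}{m-d}\left(\ln\binom{m}{d} + \ln\frac{m}{\delta}\right),
\]
% where R(h) denotes the prediction (generalization) error.
% The extra \ln m term accounts for a union bound over the
% a-priori-unknown compression size d.
```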
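To illustrate the interpretation of mistake bounds as compression-scheme sizes, here is a hypothetical Python sketch (not the authors' code) of a kernel perceptron whose final classifier is a kernel expansion over the examples on which it made mistakes; the RBF kernel and all function names are assumptions of this sketch.

```python
# Hypothetical sketch: a kernel perceptron whose solution is fully
# determined by the examples on which it made mistakes. The set of
# mistake indices acts as the compression set: replaying the algorithm
# on just those examples reconstructs the same classifier.
import numpy as np

def rbf_kernel(x, z, gamma=1.0):
    # Assumed kernel choice for illustration only.
    return np.exp(-gamma * np.sum((x - z) ** 2))

def kernel_perceptron(X, y, epochs=10, kernel=rbf_kernel):
    """Train a kernel perceptron; return the expansion coefficients
    and the indices of the mistake examples (the compression set)."""
    m = len(y)
    alpha = np.zeros(m)   # data-dependent expansion coefficients
    mistakes = []         # indices of examples that drove updates
    for _ in range(epochs):
        for i in range(m):
            # The prediction is a kernel expansion over past mistakes only.
            f = sum(alpha[j] * y[j] * kernel(X[j], X[i]) for j in mistakes)
            if y[i] * f <= 0:           # mistake-driven update
                alpha[i] += 1.0
                if i not in mistakes:
                    mistakes.append(i)
    return alpha, mistakes

# Usage: the size of `mistakes` plays the role of the compression-scheme
# size d in the bound sketched above.
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
y = np.array([-1, -1, 1, 1])
alpha, mistakes = kernel_perceptron(X, y)
print(len(mistakes), "mistake examples form the compression set")
```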
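For the PAC-Bayesian part, one commonly cited generic form of the PAC-Bayes bound is sketched below; the paper specializes such bounds to priors over expansion coefficients in data-dependent kernel expansions, and its exact statement may differ from this form.

```latex
% Hedged sketch of a generic PAC-Bayesian bound (one common form,
% not necessarily the statement proved in the paper): for a prior P
% fixed before seeing the data and any posterior Q over classifiers,
% with probability at least 1 - \delta over an i.i.d. sample of size m,
\[
  \mathbb{E}_{h \sim Q}\!\left[R(h)\right]
  \;\le\;
  \mathbb{E}_{h \sim Q}\!\left[\hat{R}(h)\right]
  + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{m}}{\delta}}{2m}},
\]
% where \hat{R} is the empirical risk and KL is the Kullback--Leibler
% divergence. In the paper's setting, P would be a prior measure over
% the expansion coefficients of the kernel classifier.
```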
Editor: Shai Ben-David
Keywords: classification, error bounds, sample compression, PAC-Bayes, kernel classifiers
This article is indexed in SpringerLink and other databases.