Generalized rules for combination and joint training of classifiers |
| |
Authors: | J. A. Bilmes, K. Kirchhoff |
| |
Affiliation: | (1) Department of Electrical Engineering, University of Washington, EE/CS, Box 352500, Seattle, WA, USA; (2) Department of Electrical Engineering, SSLI Laboratory, University of Washington, Seattle, WA, USA |
| |
Abstract: | Classifier combination has repeatedly been shown to provide significant improvements in performance for a wide range of classification tasks. In this paper, we focus on the problem of combining probability distributions generated by different classifiers. Specifically, we present a set of new combination rules that generalize the most commonly used combination functions, such as the mean, product, min, and max operations. These new rules have continuous and differentiable forms, and can thus not only be used for combination of independently trained classifiers, but also as objective functions in a joint classifier training scheme. We evaluate both of these schemes by applying them to the combination of phone classifiers in a speech recognition system. We find a significant performance improvement over previously used combination schemes when jointly training and combining multiple systems using a generalization of the product rule. |
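As a minimal sketch of the kind of rule the abstract describes, the generalized (power) mean is one continuous, differentiable family that recovers the mean, product (via the geometric mean), min, and max rules as special or limiting cases of a single parameter p. This is an illustrative assumption of the functional form; the paper's exact combination rules may differ.

```python
import math

def generalized_mean(probs, p):
    """Power-mean combination of classifier posteriors for one class.

    p = 1     -> arithmetic mean (the 'sum' rule)
    p -> 0    -> geometric mean (normalized product rule)
    p -> +inf -> max rule;  p -> -inf -> min rule
    """
    if p == 0:
        # Geometric mean: exponentiate the mean log-probability.
        return math.exp(sum(math.log(x) for x in probs) / len(probs))
    return (sum(x ** p for x in probs) / len(probs)) ** (1.0 / p)

# Posteriors for one phone class from three hypothetical classifiers:
posteriors = [0.7, 0.5, 0.9]
combined_mean = generalized_mean(posteriors, 1)       # arithmetic mean
combined_product = generalized_mean(posteriors, 0)    # geometric mean
```

Because the rule is differentiable in both the inputs and p, it could also serve as an objective for jointly training the component classifiers, as the abstract proposes.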
| |
Keywords: | Classifier combination; Combination rules; Joint classifier training; Mixtures of experts; Neural network ensembles; Neural networks; Products of experts; Speech recognition |
This article is indexed in SpringerLink and other databases.
|