Combining feature spaces for classification

Damoulas, T. and Girolami, M.A. (2009) Combining feature spaces for classification. Pattern Recognition, 42(11), pp. 2671-2683. (doi: 10.1016/j.patcog.2009.04.002)

Full text not currently available from Enlighten.

Publisher's URL: http://dx.doi.org/10.1016/j.patcog.2009.04.002

Abstract

In this paper we offer a variational Bayes approximation to the multinomial probit model for basis expansion and kernel combination. Our model is well founded within a hierarchical Bayesian framework and is able to instructively combine available sources of information for multinomial classification. The proposed framework enables informative integration of possibly heterogeneous sources in a multitude of ways, from the simple summation of feature expansions to the weighted product of kernels, and it is shown to match and in certain cases outperform the well-known ensemble learning approaches of combining individual classifiers. At the same time, the approximation considerably reduces the CPU time and resources required with respect to both the ensemble learning methods and the full Markov chain Monte Carlo (Metropolis-Hastings within Gibbs) solution of our model. We present the proposed framework together with extensive experimental studies on synthetic and benchmark datasets, and for the first time report a comparison between summation and product of individual kernels as alternative methods for constructing the composite kernel matrix.
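The two composite-kernel constructions compared in the abstract, weighted summation and weighted product of base kernels, can be sketched as follows. This is a minimal illustration in numpy, not the paper's implementation; the RBF base kernel, the function names, and the example weights are assumptions chosen for clarity.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF base kernel between rows of X and rows of Y."""
    # Pairwise squared Euclidean distances via the expansion ||x-y||^2.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def combine_kernels(kernels, weights, mode="sum"):
    """Build a composite kernel matrix from base kernels K_1..K_S.

    mode="sum":     K = sum_s w_s * K_s   (weighted summation)
    mode="product": K = prod_s K_s ** w_s (weighted, element-wise product)
    Both preserve the Gram-matrix shape of the inputs.
    """
    if mode == "sum":
        return sum(w * K for w, K in zip(weights, kernels))
    if mode == "product":
        out = np.ones_like(kernels[0])
        for w, K in zip(weights, kernels):
            out *= K ** w
        return out
    raise ValueError(f"unknown mode: {mode}")

# Example: two heterogeneous feature views of the same five objects.
rng = np.random.default_rng(0)
view1 = rng.standard_normal((5, 3))   # e.g. one feature space
view2 = rng.standard_normal((5, 7))   # e.g. a second, different feature space
K1 = rbf_kernel(view1, view1)
K2 = rbf_kernel(view2, view2)
K_sum = combine_kernels([K1, K2], [0.5, 0.5], mode="sum")
K_prod = combine_kernels([K1, K2], [0.5, 0.5], mode="product")
```

The composite matrix `K_sum` or `K_prod` can then be supplied to any kernel classifier; the paper's contribution is inferring the combination weights within a hierarchical Bayesian model rather than fixing them by hand as done here.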

Item Type: Articles
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: Girolami, Prof Mark
Authors: Damoulas, T., and Girolami, M.A.
College/School: College of Science and Engineering > School of Computing Science
Journal Name: Pattern Recognition
Publisher: Elsevier
ISSN: 0031-3203
