BrownBoost is an adaptive, continuous-time boosting algorithm based on the Boost-by-Majority (BBM) algorithm. Though it has been little studied at the time of writing, it is believed to be especially robust with respect to noisy data sets, which would make it a very useful boosting algorithm for real-world applications. More familiar algorithms such as AdaBoost, or its successor LogitBoost, are known to be especially susceptible to overfitting the training examples. This can lead to poor generalization error in the presence of class noise, since weak hypotheses induced at later iterations to fit the noisy examples will tend to be given undue influence in the final combined hypothesis. BrownBoost allows us to specify an expected baseline error rate in advance, corresponding to our prior beliefs about the proportion of noise in the training data, and thus to avoid overfitting. The original derivation of BrownBoost is restricted to binary classification problems. In this paper we propose a natural multi-class extension to the basic algorithm, incorporating error-correcting output codes and a multi-class gain measure. We test two-class and multi-class versions of the algorithm on a number of real and simulated data sets with artificial class noise, and show that BrownBoost consistently outperforms AdaBoost in these situations.
KEYWORDS: Boost-by-Majority, Error-Correcting Output Codes, BrownBoost, AdaBoost, Brownian Motion, Multi-Class Problem.
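As a concrete illustration of the noise-tolerance mechanism mentioned in the abstract, the following Python sketch shows how a user-specified target error rate eps fixes BrownBoost's "time budget" c via the relation eps = 1 - erf(sqrt(c)) from Freund's derivation, and how the resulting Gaussian weighting w_i = exp(-(r_i + s)^2 / c) assigns low weight to examples whose margins have drifted far negative, so persistently misclassified (likely noisy) examples are eventually given up on rather than dominating later rounds. This is a minimal sketch of the weighting scheme only, not the full algorithm (which additionally solves a differential equation at each round for the hypothesis weight and time advance); the helper names time_budget and brownboost_weights are our own.

```python
import numpy as np
from scipy.special import erfinv

def time_budget(eps):
    """Map a target error rate eps to BrownBoost's time budget c.

    Uses eps = 1 - erf(sqrt(c)), inverted with erfinv.
    """
    return erfinv(1.0 - eps) ** 2

def brownboost_weights(margins, s, c):
    """Gaussian weighting w_i = exp(-(r_i + s)^2 / c).

    margins : current margins r_i of the training examples
    s       : remaining time (s = c at the start of boosting)
    c       : total time budget from time_budget()

    Examples with large positive margins (already well classified) and
    large negative margins (likely noise, given up on) both receive
    small weight, which is the source of BrownBoost's noise tolerance.
    """
    return np.exp(-((margins + s) ** 2) / c)

c = time_budget(0.10)  # expect roughly 10% class noise in the training data
margins = np.array([-2.0, -0.5, 0.0, 0.8, 2.0])
print(brownboost_weights(margins, s=c, c=c))
```

Note how, in contrast to AdaBoost's exponential weights, which grow without bound as a margin becomes more negative, the weights above peak at r_i = -s and then decay again, so the influence of any single hard example is capped.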