LPBoost with Strong Classifiers
- DOI
- 10.2991/ijcis.2010.3.s1.7
- Keywords
- boosting, strong classifier, soft margin, minimax theory, linear programming
- Abstract
The goal of a boosting algorithm is to maximize the minimum margin on the sample set. By minimax theory, this goal can be converted into minimizing the maximum edge. This idea motivates LPBoost and its variants (including TotalBoost, SoftBoost, and ERLPBoost), which solve the optimization problem by linear programming. These algorithms ignore the strong classifier and merely minimize the maximum edge of the weak classifiers, so that every weak classifier's edge is at most γ. This paper shows that the edge of the strong classifier may exceed the maximum edge of the weak classifiers, and proposes a novel boosting algorithm that introduces the strong classifier into the optimization problem and constrains the edges of both weak and strong classifiers to be at most γ. Furthermore, we justify the introduction of the strong classifier using minimax theory. We compare our algorithm with other approaches, including AdaBoost, LPBoost, TotalBoost, SoftBoost, and ERLPBoost, on UCI benchmark datasets. In simulation studies we show that our algorithm converges faster than SoftBoost and ERLPBoost, and in a benchmark comparison we illustrate the competitiveness of our approach in terms of running time and generalization error.
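To make the optimization concrete, below is a minimal sketch of the soft-margin LP master problem the abstract describes: minimize γ over a capped distribution w on the examples, subject to every weak hypothesis's edge being at most γ, with an optional extra column constraining the strong (combined) classifier's edge as well, which is the modification this paper proposes. This is an illustrative reconstruction of the standard LPBoost formulation using `scipy.optimize.linprog`, not the authors' code; the function name and the `nu` soft-margin parameter are assumptions for the sketch.

```python
import numpy as np
from scipy.optimize import linprog

def min_max_edge(U, nu=0.1, strong_margins=None):
    """Solve a soft-margin LPBoost-style master problem over sample weights.

    U : (N, T) array with U[i, t] = y_i * h_t(x_i), the margin of example i
        on weak hypothesis t; the weighted column mean is that hypothesis's
        edge under the distribution w.
    nu : soft-margin parameter; caps each weight at 1/(nu * N).
    strong_margins : optional (N,) array of y_i * F(x_i) for the current
        strong classifier F; if given, its edge is also constrained to be
        at most gamma (plain LPBoost omits this constraint).
    Returns (w, gamma), the distribution that minimizes the maximum edge.
    """
    N, T = U.shape
    cols = [U]
    if strong_margins is not None:
        cols.append(strong_margins.reshape(N, 1))
    M = np.hstack(cols)                      # (N, K) margin matrix
    K = M.shape[1]

    # Variables x = [w_1, ..., w_N, gamma]; objective: minimize gamma.
    c = np.zeros(N + 1)
    c[-1] = 1.0
    # Edge constraints: sum_i w_i * M[i, k] - gamma <= 0 for every column k.
    A_ub = np.hstack([M.T, -np.ones((K, 1))])
    b_ub = np.zeros(K)
    # Distribution constraint: sum_i w_i = 1.
    A_eq = np.hstack([np.ones((1, N)), np.zeros((1, 1))])
    b_eq = np.array([1.0])
    # Box constraints: 0 <= w_i <= 1/(nu * N); gamma is unbounded.
    bounds = [(0.0, 1.0 / (nu * N))] * N + [(None, None)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    return res.x[:N], res.x[-1]
```

In a full boosting loop this LP would be re-solved after each new weak hypothesis is added as a column of U, with the strong-classifier column recomputed from the current weighted combination of weak hypotheses.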
- Copyright
- © 2010, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - JOUR
AU  - Jun L. Zhou
AU  - Yu K. Fang
AU  - Yan Fu
AU  - Chong J. Sun
PY  - 2010
DA  - 2010/12/01
TI  - LPBoost with Strong Classifiers
JO  - International Journal of Computational Intelligence Systems
SP  - 88
EP  - 100
VL  - 3
IS  - Supplement 1
SN  - 1875-6883
UR  - https://doi.org/10.2991/ijcis.2010.3.s1.7
DO  - 10.2991/ijcis.2010.3.s1.7
ID  - Zhou2010
ER  -