-{ MultiBoost }-

This project is hosted at SourceForge.net.

Important Update (2012): Please update your bookmarks.


MultiBoost is a portable C++ implementation of AdaBoost.MH, the multi-class version of the AdaBoost algorithm. AdaBoost is a powerful meta-learning algorithm widely used in machine learning. It is ready to use and ready to crunch numbers!
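To give a taste of what the meta-learning loop does, here is a minimal sketch of one round of the binary (two-class) variant of AdaBoost: compute the weak learner's weighted error, derive its vote alpha, and re-weight the examples so the next round focuses on the mistakes. The function name and layout are made up for illustration and are not MultiBoost's actual API; AdaBoost.MH generalizes this scheme to the multi-class, multi-label setting.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// One binary AdaBoost round, sketched (illustrative, not MultiBoost's API):
// given each example's weight and whether the current weak learner got it
// wrong, compute the learner's vote alpha and re-weight the examples.
// Assumes the weighted error eps satisfies 0 < eps < 1.
double adaboostRound(std::vector<double>& w,             // weights, sum to 1
                     const std::vector<bool>& mistake) { // weak learner errors
    double eps = 0.0;                                    // weighted error
    for (size_t i = 0; i < w.size(); ++i)
        if (mistake[i]) eps += w[i];
    double alpha = 0.5 * std::log((1.0 - eps) / eps);    // learner's vote
    double z = 0.0;
    for (size_t i = 0; i < w.size(); ++i) {              // up-weight mistakes,
        w[i] *= std::exp(mistake[i] ? alpha : -alpha);   // down-weight the rest
        z += w[i];
    }
    for (double& wi : w) wi /= z;                        // renormalize to sum 1
    return alpha;
}
```

After the update, the misclassified examples carry a larger share of the total weight, which is what forces the next weak learner to concentrate on them.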

Key features:

  • Fast and easy to use
  • Fully documented with Doxygen
  • Easy to extend with new weak learners
  • Supports abstention via two different techniques
  • Supports regularization
  • Supports AdaBoost.MH with real-valued predictions (see "BoosTexter: A Boosting-based System for Text Categorization")
  • Supports Haar-like features similar to those used in Viola and Jones' paper "Robust Real-time Object Detection"
  • XML serialization
  • Written entirely in pure C++; no additional libraries required
  • Tested on Linux and Windows (OS X to come: I don't have a machine around! :P)

Currently available weak learners are:

  • Single-threshold decision stump (default)
  • Multi-threshold decision stump
  • Single-threshold Haar-like feature decision stump
  • Multi-threshold Haar-like feature decision stump
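For the curious, a single-threshold decision stump of the kind listed above can be sketched in a few lines: scan candidate thresholds on one feature and keep the one with the smallest weighted error, exactly what a boosting round asks of a weak learner. This is a self-contained illustrative sketch (the names and layout are made up), not the MultiBoost implementation.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// A single-threshold decision stump on one feature (illustrative sketch):
// predicts +1 if x >= theta and -1 otherwise, or the flipped polarity.
struct Stump {
    double theta;  // decision threshold
    int polarity;  // +1: predict sign(x - theta); -1: flipped
};

// Exhaustively pick the threshold minimizing the weighted 0/1 error,
// as a boosting round would on weighted examples.
Stump trainStump(const std::vector<double>& x,
                 const std::vector<int>& y,       // labels in {-1, +1}
                 const std::vector<double>& w) {  // example weights, sum to 1
    Stump best{0.0, 1};
    double bestErr = 2.0;  // any real error is below this
    // Candidate thresholds: midpoints between sorted feature values,
    // plus one threshold above the largest value.
    std::vector<double> cand(x);
    std::sort(cand.begin(), cand.end());
    for (size_t i = 0; i + 1 < cand.size(); ++i)
        cand[i] = 0.5 * (cand[i] + cand[i + 1]);
    cand.push_back(cand.empty() ? 0.0 : cand.back() + 1.0);
    for (double theta : cand) {
        for (int pol : {+1, -1}) {
            double err = 0.0;
            for (size_t i = 0; i < x.size(); ++i) {
                int pred = (x[i] >= theta ? +1 : -1) * pol;
                if (pred != y[i]) err += w[i];
            }
            if (err < bestErr) { bestErr = err; best = {theta, pol}; }
        }
    }
    return best;
}
```

The multi-threshold variants generalize this idea by allowing more than one cut point per feature, and the Haar-like feature stumps apply the same thresholding to Haar-like feature responses instead of raw inputs.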