
Boosting Your Model Accuracy with AdaBoost

In this post, I am going to talk about AdaBoost, an exceptional ensemble method for improving classification accuracy (boosting). The AdaBoost algorithm efficiently converts a weak classifier, defined as a classifier that achieves only slightly better accuracy than random guessing, into a strong classifier that performs significantly better. AdaBoost is fast, requires very little parameter tuning, and can be combined with any weak learner, for example a decision tree.
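As a quick illustration of this idea, here is a minimal sketch using scikit-learn, with a one-level decision tree (a decision stump) as the weak learner. The toy dataset and variable names are placeholders, not part of the original post, and the keyword for passing the weak learner is `estimator` in recent scikit-learn versions (`base_estimator` in older ones).

```python
# Minimal AdaBoost sketch: boosting a decision stump with scikit-learn.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy two-class dataset standing in for real customer data (placeholder).
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Weak learner: a one-level decision tree ("decision stump").
stump = DecisionTreeClassifier(max_depth=1)

# AdaBoost repeatedly re-weights the training samples so that later stumps
# focus on the examples earlier stumps got wrong, then combines them.
model = AdaBoostClassifier(estimator=stump, n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))
```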

Imagine you are dealing with a classification task that is critical to your business. For example, you may want to distinguish two groups of customers in order to make a targeted offer that suits only one of them. In this case, the more accurately you separate these two groups, the more profit you gain from your targeted offers.
