AdaBoost

An ensemble model based on Boosting: many weak classifiers are trained in sequence and combined by weighted voting.

Training

We have a dataset D with d samples (X_1, y_1), ..., (X_d, y_d), where y_j is the class label of tuple X_j.

  1. Initialize lists
    1. List to hold weights: W
    2. List to hold classifiers: M
    3. List to hold weight-updates: E
  2. Initialize weights for the first classifier so that each tuple has the same probability: w_j = 1/d
  3. Generate k classifiers in k iterations
  4. At iteration i do:
    1. Calculate normalized weights: w_j ← w_j / Σ_l w_l
    2. Use the bootstrap method to sample the dataset D with replacement according to the previously assigned weights to form the training set D_i for classifier M_i
    3. Derive model M_i from D_i
    4. Test model M_i by calculating the error as the sum of the weights of all misclassified tuples: error(M_i) = Σ_j w_j · err(X_j), where err(X_j) is 1 if X_j is misclassified and 0 otherwise
    5. If this error is bigger than 0.5, go back to step 4.1 and abandon this classifier
    6. Calculate the weight update as error(M_i) / (1 - error(M_i))
    7. Update weights for the next iteration by multiplying them with error(M_i) / (1 - error(M_i)) if they have been correctly classified: w_j ← w_j · error(M_i) / (1 - error(M_i)). Since error(M_i) < 0.5, this factor is smaller than 1, thus reducing the weight if the tuple was classified correctly and leaving the weight as it is if it was misclassified
    8. Add M_i and error(M_i) to their respective lists
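The training loop above can be sketched in Python. This is a minimal sketch, not a library API: `adaboost_train` is a hypothetical name, and the `weak_learner` callback interface (takes features and labels, returns a predict function) is an assumption made for illustration.

```python
import numpy as np

def adaboost_train(X, y, weak_learner, k, seed=None):
    """Sketch of the AdaBoost training loop described above.

    weak_learner(Xs, ys) is assumed to return a model: a callable
    mapping a feature array to predicted labels.
    """
    rng = np.random.default_rng(seed)
    d = len(X)
    models, updates = [], []            # step 1: lists for classifiers and weight-updates
    w = np.full(d, 1.0 / d)             # step 2: each tuple gets the same probability
    i = 0
    while i < k:                        # step 3: generate k classifiers
        w = w / w.sum()                 # step 4.1: normalize weights
        # step 4.2: bootstrap-sample D with replacement according to the weights
        idx = rng.choice(d, size=d, replace=True, p=w)
        model = weak_learner(X[idx], y[idx])   # step 4.3: derive M_i from D_i
        miss = model(X) != y
        error = w[miss].sum()           # step 4.4: sum of misclassified weights
        if error > 0.5:
            continue                    # step 4.5: abandon this classifier, resample
        beta = error / (1.0 - error)    # step 4.6: weight update
        w[~miss] *= beta                # step 4.7: shrink correctly classified weights
        models.append(model)            # step 4.8: record M_i and its update
        updates.append(beta)
        i += 1
    return models, updates
```

Note that step 4.5 can in principle retry forever if no weak classifier reaches error below 0.5; a real implementation would cap the number of retries.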

Prediction

  1. Initialize the weight of each class to zero
  2. For each classifier M_i:
    1. Calculate the weight of its vote: w_i = log((1 - error(M_i)) / error(M_i))
    2. Get prediction c = M_i(X) from that weak classifier
    3. Add w_i to the weight for class c
  3. Return the class with the largest weight

Or in short: class(X) = argmax_c Σ_{i : M_i(X) = c} log((1 - error(M_i)) / error(M_i))
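The prediction steps can be sketched the same way. Again a sketch under assumptions: `adaboost_predict` is a hypothetical name, and `models`/`updates` are the two lists produced during training (each model a callable, each update the factor error(M_i) / (1 - error(M_i))).

```python
import numpy as np

def adaboost_predict(x, models, updates):
    """Sketch of AdaBoost prediction by weighted majority vote."""
    votes = {}                                  # step 1: class weights start at zero
    for model, beta in zip(models, updates):
        # step 2.1: vote weight log((1 - error)/error) equals log(1/beta),
        # since beta = error / (1 - error); a perfect classifier (beta = 0)
        # gets an infinite vote in this naive sketch
        vote = np.log(1.0 / beta) if beta > 0 else float("inf")
        c = model(np.asarray([x]))[0]           # step 2.2: weak classifier's prediction
        votes[c] = votes.get(c, 0.0) + vote     # step 2.3: add vote to class c
    return max(votes, key=votes.get)            # step 3: class with the largest weight
```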