AdaBoost


AdaBoost (Adaptive Boosting) is the basic boosting algorithm: a method that combines a large number of weak classifiers into a single, stronger one. The authors of the algorithm are Yoav Freund and Robert Schapire.

Principle of operation

In short, AdaBoost trains over successive iterations: in each iteration it measures the weighted error of all available weak classifiers and picks the best one. The weights of misclassified observations are then increased, so that classifiers in subsequent iterations pay more attention to them.
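The loop described above can be sketched in a few lines of code. This is a minimal illustration, not a production implementation: the weak learners are assumed to be one-dimensional decision stumps, the data is a made-up toy set, and the classifier weight formula alpha = 0.5 * ln((1 - err) / err) is the standard one from the original Freund and Schapire algorithm.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=5):
    """AdaBoost sketch: X is a 1-D feature array, y has labels in {-1, +1}."""
    n = len(X)
    w = np.full(n, 1.0 / n)           # start with uniform observation weights
    ensemble = []                      # list of (alpha, threshold, polarity)
    for _ in range(n_rounds):
        best = None
        # Weak learner: exhaustively try decision stumps "sign(x < thr)"
        # with both polarities, keeping the one with lowest weighted error.
        for thr in X:
            for pol in (1, -1):
                pred = pol * np.where(X < thr, 1, -1)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, thr, pol, pred)
        err, thr, pol, pred = best
        err = max(err, 1e-10)          # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, thr, pol))
        # Increase the weights of misclassified points, shrink the rest,
        # then renormalize so the weights again sum to 1.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
    return ensemble

def predict(ensemble, X):
    """Weighted vote of all stumps; the sign of the sum is the class."""
    score = sum(a * p * np.where(X < t, 1, -1) for a, t, p in ensemble)
    return np.sign(score)

# Toy data that no single stump can classify correctly:
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1, 1, -1, -1, 1, 1])
model = train_adaboost(X, y, n_rounds=5)
print(predict(model, X))  # matches y: the ensemble fits what one stump cannot
```

Note how the weight update does the work: after each round the misclassified points carry more weight, so the next stump is forced to focus on exactly the region the ensemble still gets wrong.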

Robert Schapire's page dedicated to AdaBoost

Wikipedia article on AdaBoost
