Description:
The idea of the AdaBoost algorithm is to combine the outputs of multiple "weak" classifiers to produce an effective classification. The main steps are as follows: first, a weak learning algorithm and a sample space (X, y) are given, and m training samples are drawn from the sample space, each with an initial weight of 1/m. The weak learning algorithm is then run for T iterations; after each iteration, the weight distribution over the training data is updated according to the classification results, assigning larger weights to misclassified samples so that the next iteration pays more attention to them. Repeated iteration yields a sequence of classification functions f1, f2, ..., fT, each assigned a weight: the better a function's classification result, the larger its corresponding weight.
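The steps above can be sketched as a minimal AdaBoost trainer using decision stumps as the weak learners. This is an illustrative sketch, not the repository's implementation: the helper names (`_best_stump`, `_stump_predict`), the exhaustive stump search, and the numerical clipping are assumptions made for the example. Labels are assumed to be in {-1, +1}.

```python
import numpy as np

def adaboost_train(X, y, T=10):
    """Train an AdaBoost ensemble of decision stumps.

    X: (n, d) feature array; y: (n,) labels in {-1, +1}.
    Returns a list of (stump, alpha) pairs."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # each sample starts with weight 1/m
    ensemble = []
    for _ in range(T):                           # T iterations of the weak learner
        stump = _best_stump(X, y, w)
        pred = _stump_predict(stump, X)
        err = np.sum(w[pred != y])               # weighted training error
        err = min(max(err, 1e-10), 1 - 1e-10)    # clip to avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)    # better classifiers get larger weight
        w *= np.exp(-alpha * y * pred)           # misclassified samples gain weight
        w /= w.sum()                             # renormalize the distribution
        ensemble.append((stump, alpha))
    return ensemble

def _best_stump(X, y, w):
    """Exhaustive search over (feature, threshold, polarity) for the
    stump with the lowest weighted error (illustrative weak learner)."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best, best_err = (j, thr, pol), err
    return best

def _stump_predict(stump, X):
    j, thr, pol = stump
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def adaboost_predict(ensemble, X):
    """Weighted vote of the classification functions f1, ..., fT."""
    score = sum(alpha * _stump_predict(s, X) for s, alpha in ensemble)
    return np.sign(score)
```

The final prediction is the sign of the alpha-weighted sum of the weak classifiers' votes, which is exactly the "better result, larger weight" combination described above.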