
Bagging Base Models

Data Science: asked by Aman Oswal on January 16, 2021

If bagging reduces overfitting, then is the general statement that base learners of ensemble models should have high bias and low variance (that is, should be underfitting) wrong?

One Answer

Bagging, also called bootstrap aggregation, reduces overfitting by combining an ensemble of weak learners. This does not mean that the resulting model underfits. The weak classifiers or regressors (usually decision or regression trees) are models with low variance and an accuracy only slightly better than random guessing (say 52%). Having low variance does imply that the weak classifiers have high bias. But when you aggregate the outputs of many such weak classifiers, for example by majority vote or averaging, the combined prediction is more accurate than any individual one.
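To make this concrete, here is a minimal sketch using scikit-learn's BaggingClassifier with depth-1 decision trees (stumps) as the weak learners. The toy dataset, seed, and hyperparameters are arbitrary choices for illustration, not part of the original answer, and the scores will vary with the data:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # Toy dataset, assumed purely for demonstration.
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # A single shallow tree: a weak learner with limited capacity.
    stump = DecisionTreeClassifier(max_depth=1, random_state=0)
    print("single stump:", stump.fit(X_train, y_train).score(X_test, y_test))

    # Bagging: train 100 stumps, each on its own bootstrap sample,
    # and combine their predictions by voting.
    bag = BaggingClassifier(DecisionTreeClassifier(max_depth=1),
                            n_estimators=100, random_state=0)
    print("bagged stumps:", bag.fit(X_train, y_train).score(X_test, y_test))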

In bootstrapping, each weak classifier is trained on a different, randomly drawn (with replacement) collection of data points from the larger dataset.
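As a hedged sketch of that resampling step, the following numpy snippet draws one bootstrap sample by picking row indices with replacement; the array names and sizes are placeholders:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(10, 3))        # placeholder dataset: 10 rows, 3 features
    y = rng.integers(0, 2, size=10)

    # One bootstrap sample: draw n row indices with replacement.
    idx = rng.choice(len(X), size=len(X), replace=True)
    X_boot, y_boot = X[idx], y[idx]

    # Some rows appear multiple times; the rows never drawn are the
    # "out-of-bag" points for this particular weak learner.
    print("sampled indices:   ", idx)
    print("out-of-bag indices:", sorted(set(range(len(X))) - set(idx)))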

There is also a related technique called boosting, where the erroneous classifications of one weak classifier are taken into account while training the subsequent weak classifiers. This typically yields an even larger accuracy improvement.
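For contrast, here is a minimal boosting sketch using scikit-learn's AdaBoostClassifier, which upweights the examples earlier stumps misclassified so that later stumps focus on them; the dataset and settings are again assumed for illustration only:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # AdaBoost uses depth-1 decision stumps by default; each round
    # reweights the training points the previous stumps got wrong.
    boost = AdaBoostClassifier(n_estimators=100, random_state=0)
    print("boosted stumps:", boost.fit(X_train, y_train).score(X_test, y_test))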

Answered by Anoop A Nair on January 16, 2021

