Bagging#

Bagging (short for bootstrap aggregating) is an ensemble machine learning technique used to improve the stability and accuracy of models. In bagging, multiple models are trained independently, each on a bootstrap sample of the training data (a random sample drawn with replacement), and the final prediction is made by aggregating the individual predictions, typically by majority vote for classification or by averaging for regression. Because the final prediction combines the outputs of many models, each with different strengths and weaknesses, bagging reduces the variance of the prediction and improves the robustness of the model.
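
To make the resample-train-aggregate loop concrete, below is a minimal from-scratch sketch of bagging for binary classification. It uses scikit-learn's `DecisionTreeClassifier` as the base learner on a synthetic dataset; the number of estimators and the other parameter values here are illustrative choices, not prescriptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic binary-classification data, split into train and test sets.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

n_estimators = 25
models = []
for _ in range(n_estimators):
    # Bootstrap sample: draw n points from the training set *with replacement*.
    idx = rng.integers(0, len(X_train), size=len(X_train))
    models.append(DecisionTreeClassifier().fit(X_train[idx], y_train[idx]))

# Aggregate by majority vote: average the 0/1 predictions and threshold at 0.5.
votes = np.stack([m.predict(X_test) for m in models])  # (n_estimators, n_test)
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)

print("Bagged ensemble test accuracy:", (y_pred == y_test).mean())
print("Single tree test accuracy:    ",
      (DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
       .predict(X_test) == y_test).mean())
```

In practice one would reach for `sklearn.ensemble.BaggingClassifier`, which implements the same resample-train-aggregate procedure (plus out-of-bag evaluation) for an arbitrary base estimator.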

Further Readings#

  • Vincent Tan notes

  • Murphy, Kevin P. “Chapter .” In Probabilistic Machine Learning: An Introduction. MIT Press, 2022.

  • James, Gareth, Daniela Witten, Trevor Hastie, and Robert Tibshirani. “Chapter .” In An Introduction to Statistical Learning: With Applications in R. Boston: Springer, 2022.

  • Jung, Alexander. “Chapter .” In Machine Learning: The Basics. Singapore: Springer Nature Singapore, 2023.

  • Bishop, Christopher M. “Chapter .” In Pattern Recognition and Machine Learning. New York: Springer-Verlag, 2006.

  • Daumé III, Hal. “Chapter .” In A Course in Machine Learning, January 2017.

  • Machine Learning from Scratch

  • GOOD: https://github.com/NathanielDake/intuitiveml

  • https://github.com/goodboychan/goodboychan.github.io/tree/main/_notebooks