XGBoost Part 3: Mathematical Details
May 19, 2020
StatQuest with Josh Starmer
In this video we dive into the nitty-gritty details of the math behind XGBoost trees. We derive the equations for the Output Values from the leaves as well as the Similarity Score. Then we show how these general equations are customized for Regression or Classification by their respective Loss Functions. If you make it to the end, you will be approximately 22% smarter than you are now! :)

NOTE: This StatQuest assumes that you are already familiar with...
XGBoost Part 1: XGBoost Trees for Regression:
XGBoost Part 2: XGBoost Trees for Classification:
Gradient Boost Part 1: Regression Main Ideas:
Gradient Boost Part 2: Regression Details:
Gradient Boost Part 3: Classification Main Ideas:
Gradient Boost Part 4: Classification Details:
...and Ridge Regression:
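As a companion to the derivations described above, here is a minimal sketch (not StatQuest's own code) of the two quantities the video derives, using the standard XGBoost formulas for squared-error regression: the Output Value of a leaf is the sum of the residuals divided by the number of residuals plus the regularization parameter lambda, and the Similarity Score is the squared sum of the residuals over that same denominator. The residual values below are made up for illustration.

```python
def output_value(residuals, lam=0.0):
    """Optimal leaf output for squared-error loss with L2 regularization lam:
    sum(residuals) / (n + lam)."""
    return sum(residuals) / (len(residuals) + lam)

def similarity_score(residuals, lam=0.0):
    """Similarity (quality) score of a leaf for squared-error loss:
    sum(residuals)^2 / (n + lam)."""
    return sum(residuals) ** 2 / (len(residuals) + lam)

# Toy residuals for one leaf (hypothetical values).
residuals = [-10.5, 6.5, 7.5]

print(output_value(residuals, lam=1))      # 3.5 / 4 = 0.875
print(similarity_score(residuals, lam=1))  # 12.25 / 4 = 3.0625
```

For classification, the video shows that only the denominator changes: the count of residuals is replaced by the sum of p(1 - p) over the previous predicted probabilities, which comes from the second derivative of the log-loss.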
Also note, this StatQuest is based on the following sources: The original XGBoost manuscript:
The original XGBoost presentation:
... And the XGBoost Documentation:
... Last but not least, I want to extend a special thanks to Giuseppe Fasanella and Samuel Judge for thoughtful discussions and helping me understand the math. For a complete index of all the StatQuest videos, check out:
If you'd like to support StatQuest, please consider... Patreon:
...or... YouTube Membership:
... ...a cool StatQuest t-shirt or sweatshirt (USA/Europe):
... ...buying one or two of my songs (or go large and get a whole album!)
...or just donating to StatQuest!
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter: