Course: Advanced Predictive Modeling: Mastering Ensembles and Metamodeling

AdaBoost, XGBoost, LightGBM, CatBoost

- [Instructor] Okay, now let's talk about boosting algorithms. It's important to realize that boosting isn't one thing. It's a whole research area, and the number of boosting algorithms has exploded. There are dozens of them. I'm going to mention just a few. Adaptive boosting, frequently just called AdaBoost, is really how this all got started. You're still going to find it implemented in R and in Python, but the really dominant one now is XGBoost. So, to briefly compare and contrast them, remember that in adaptive boosting we're finding the incorrectly classified rows and giving them added weight, and there's a formula for doing that within the algorithm. In XGBoost, what we're doing is defining a loss function. In our demonstration it was the simplest one you can imagine: a residual. But then we're iteratively building more and more trees, trying to minimize that loss function. And, again, this really is the dominant one at the moment. So, keep your eyes open for this…
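To make that contrast concrete, here is a minimal sketch in Python; it assumes scikit-learn and the xgboost package are installed, and the synthetic dataset and parameter values are illustrative choices, not taken from the course.

```python
# Minimal sketch contrasting the two boosting families described above.
# Assumes scikit-learn and xgboost are installed; the data is synthetic
# and purely illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost: each round reweights the misclassified rows so the next
# weak learner concentrates on them.
ada = AdaBoostClassifier(n_estimators=100, random_state=0)
ada.fit(X_train, y_train)

# XGBoost: each round adds a tree that reduces a differentiable loss
# function (here the default logistic loss), rather than reweighting rows.
xgb = XGBClassifier(n_estimators=100, random_state=0)
xgb.fit(X_train, y_train)

print("AdaBoost test accuracy:", ada.score(X_test, y_test))
print("XGBoost test accuracy:", xgb.score(X_test, y_test))
```

The residual the instructor mentions corresponds to the regression setting: with a squared-error loss, the gradient each new tree fits is simply the residual of the current ensemble.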
