XGBoost Tutorial in R (from Scratch)

Introduction

Lately, I've noticed that many R newcomers are keen to make the most of the xgboost package. And why shouldn't they? Kagglers have embraced it wholeheartedly. The current trend is clear: learn xgboost properly and your chances of performing better on Kaggle shoot up (of course, holding other variables constant).

But most of us (newbies) don't fully understand how to use xgboost properly.

Therefore, I've written this guide to help R newcomers understand the science behind xgboost and how to tune its parameters. In this article, you'll learn the core concepts of the XGBoost algorithm. We'll also look into its practical side, i.e. improving an xgboost model through parameter tuning in R.
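To give you a flavour of that practical side before we dive in, here is a minimal sketch of fitting an xgboost model in R. It uses the agaricus toy dataset that ships with the package; the parameter values (eta, max_depth, nrounds) are illustrative assumptions, not tuned recommendations — tuning them properly is exactly what the rest of the article is about.

```r
# Minimal sketch: fit a basic xgboost classifier in R.
# Parameter values are illustrative, not tuned.
library(xgboost)

data(agaricus.train, package = "xgboost")
data(agaricus.test,  package = "xgboost")

# Wrap the sparse feature matrices and labels in xgboost's DMatrix format
dtrain <- xgb.DMatrix(data = agaricus.train$data, label = agaricus.train$label)
dtest  <- xgb.DMatrix(data = agaricus.test$data,  label = agaricus.test$label)

params <- list(
  objective = "binary:logistic",  # binary classification
  eta       = 0.1,                # learning rate (illustrative value)
  max_depth = 6                   # maximum tree depth (illustrative value)
)

model <- xgb.train(
  params                = params,
  data                  = dtrain,
  nrounds               = 100,
  watchlist             = list(train = dtrain, test = dtest),
  early_stopping_rounds = 10,     # stop if test error doesn't improve
  verbose               = 0
)

# Predicted probabilities on the test set
pred <- predict(model, dtest)
```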

Last week, we learned about the Random Forest algorithm. As we saw, it reduces a model's variance by building many trees on resampled data, thereby improving its ability to generalize. Good! Let's proceed.

Table of Contents

  1. What is XGBoost? Why is it so good?
  2. How does XGBoost work?
  3. Understanding XGBoost Tuning Parameters
  4. Practical - Tuning XGBoost using R


Do leave your suggestions and questions in the comments below.

