Decision Tree: Points to Keep in Mind, with Implementations

This post walks through the important points of Decision Trees.

You can then go through my notebook to see how to apply a Decision Tree end to end.

Let's dive in:

How to Find the Purity of Nodes?

  • Entropy - use when the number of features is small || higher entropy means more uncertainty || ranges from 0 to 1 for binary classification (formula below)

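The formula image is missing here; for reference, the entropy of a node whose classes have proportions p_i is:

H(S) = -\sum_{i=1}^{k} p_i \log_2 p_i

It is 0 for a pure node and maximal when the classes are evenly mixed.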

  • Gini Impurity - use when the number of features is large (it skips the log computation, so it is faster to evaluate) || the default criterion in scikit-learn's DecisionTreeClassifier || ranges from 0 to 0.5 for binary classification (formula below)

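The formula image is missing here; for reference, Gini impurity is:

Gini(S) = 1 - \sum_{i=1}^{k} p_i^2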

Gini Impurity for Binary Classification

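The formula image is missing here; for two classes with positive-class probability p, Gini impurity reduces to:

Gini = 1 - p^2 - (1 - p)^2 = 2p(1 - p)

which is 0 for a pure node and peaks at 0.5 when p = 0.5, matching the 0-to-0.5 range above.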

How to Select the Feature to Split On?

  • Information Gain helps select the feature to split on
  • the more information we gain, the better the feature
  • the feature with the highest gain is used as the root node (see the sketch after this list)
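
Here is a minimal sketch of entropy-based information gain (the toy data and function names are mine for illustration, not from the notebook):

import numpy as np

def entropy(labels):
    # Entropy H = -sum(p_i * log2(p_i)) over the class proportions p_i.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels):
    # IG = H(parent) - weighted average of H(children) after the split.
    child_entropy = 0.0
    for value in np.unique(feature):
        mask = feature == value
        child_entropy += mask.mean() * entropy(labels[mask])
    return entropy(labels) - child_entropy

# A feature that separates the classes perfectly has the highest
# gain (1.0 here) and would be chosen as the root node.
y = np.array([0, 0, 1, 1])
perfect = np.array(["a", "a", "b", "b"])
useless = np.array(["a", "b", "a", "b"])
print(information_gain(perfect, y))  # 1.0
print(information_gain(useless, y))  # 0.0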

Decision Trees usually tend to overfit.

How to Prevent Overfitting in a Decision Tree?

  1. Post-pruning - grow the full tree, then cut weak branches back (e.g. cost-complexity pruning)
  2. Pre-pruning - stop growth early with hyperparameter limits such as max_depth (see the sketch below)
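
Here is a minimal scikit-learn sketch of both approaches on the Iris dataset (the specific ccp_alpha picked below is arbitrary for illustration; in practice you would cross-validate over the pruning path):

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Pre-pruning: stop the tree early with hyperparameter limits.
pre_pruned = DecisionTreeClassifier(max_depth=3, min_samples_leaf=5, random_state=42)
pre_pruned.fit(X_train, y_train)

# Post-pruning: grow the full tree, then prune it back with
# cost-complexity pruning by choosing an alpha from the pruning path.
path = DecisionTreeClassifier(random_state=42).cost_complexity_pruning_path(X_train, y_train)
post_pruned = DecisionTreeClassifier(ccp_alpha=path.ccp_alphas[-2], random_state=42)
post_pruned.fit(X_train, y_train)

print("pre-pruned accuracy :", pre_pruned.score(X_test, y_test))
print("post-pruned accuracy:", post_pruned.score(X_test, y_test))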

Hyperparameters to Tune (a tuning sketch follows the list):

  • max_depth
  • max_features
  • n_estimators (for ensembles such as Random Forest, not a single tree)
  • min_samples_split
  • min_samples_leaf
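
Here is a minimal sketch of tuning these with GridSearchCV (the dataset and grid values are placeholders, not a recommendation):

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# n_estimators is left out: it belongs to ensembles such as
# RandomForestClassifier, not to a single decision tree.
param_grid = {
    "max_depth": [3, 5, 10, None],
    "max_features": ["sqrt", "log2", None],
    "min_samples_split": [2, 5, 10],
    "min_samples_leaf": [1, 2, 5],
}

search = GridSearchCV(DecisionTreeClassifier(random_state=42), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)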

The points above are a quick recap of how a tree works. If you can't relate to them yet, or you want to see how to perform all of the above practically with hyperparameter tuning, visit this notebook:

Next, I will share some interview questions on Decision Trees.


Happy Learning!

