Underfit, Overfit & Unfit Models
We know this because the model perfectly fit our training data but didn't generalize to the test data at all. Models that are too complex will overfit, so we need to make the model less complex. You can make a gradient boosting model less complex by using fewer decision trees, making each decision tree smaller, or preferring simple decision trees over complex ones. It's also possible that the model is overfitting because we don't have enough training data: if reducing the model's complexity doesn't help, you might not have enough training data to solve the problem.
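As a rough sketch of those complexity knobs, here is how they might look with scikit-learn's `GradientBoostingClassifier` (the specific dataset and parameter values are illustrative assumptions, not a recipe): `n_estimators` controls the number of trees, `max_depth` controls how big each tree can grow, and `min_samples_leaf` nudges the model toward simpler trees.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Illustrative synthetic dataset; any train/test split would do.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A complex model: many deep trees -- prone to memorizing the training data.
complex_model = GradientBoostingClassifier(
    n_estimators=300, max_depth=8, random_state=0)

# A simpler model: fewer, shallower trees, plus a minimum leaf size
# that discourages overly specific splits.
simple_model = GradientBoostingClassifier(
    n_estimators=50, max_depth=2, min_samples_leaf=10, random_state=0)

for name, model in [("complex", complex_model), ("simple", simple_model)]:
    model.fit(X_train, y_train)
    print(name,
          "train:", round(model.score(X_train, y_train), 3),
          "test:", round(model.score(X_test, y_test), 3))
```

On data like this, the complex model typically scores near-perfectly on the training set while the gap between its training and test scores is wider than the simple model's, which is the overfitting signature described above.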
If the error rates for both our training and test data sets are high, that means our model is underfit: it didn't capture the patterns in the data very well. Models that are too simple will underfit, so you need to make the model more complex. You can make a gradient boosting model more complex by using more decision trees or making each decision tree deeper. If the error rates for both the training and test sets are low, that means our model is working well: it is accurate on the training data and the test data, which means it has learned the real patterns behind the data.
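The diagnosis logic above can be sketched as a small helper. This is a simplification under an assumed cutoff: the `threshold` value and the function name are invented for illustration, and what counts as a "high" error rate really depends on the problem.

```python
def diagnose(train_error, test_error, threshold=0.1):
    """Rough fit diagnosis from error rates (fractions between 0 and 1).

    `threshold` is an illustrative cutoff, not a universal rule.
    """
    if train_error > threshold and test_error > threshold:
        # High error on both sets: the model is too simple.
        return "underfit"
    if train_error <= threshold and test_error > threshold:
        # Fits the training data but not the test data: too complex.
        return "overfit"
    # Low error on both sets: the model learned the real patterns.
    return "good fit"

print(diagnose(0.30, 0.32))  # high on both -> underfit
print(diagnose(0.01, 0.40))  # low train, high test -> overfit
print(diagnose(0.03, 0.05))  # low on both -> good fit
```

The middle case is exactly the overfitting scenario from the previous paragraph: near-zero training error paired with high test error.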