Types of Hyperparameter Tuning
What is hyperparameter tuning?
Hyperparameter tuning is an extra step to make sure your model uses good values for parameters that are set by you before training rather than learned from the data. Every algorithm has such variables whose values are not fixed: you could use any reasonable value, as long as it benefits the accuracy of the model. There are four main approaches to tuning your model.
- Manual Search: Here you try values based on your intuition, or use standard values that practitioners have found to work well over time.
- Grid Search (GridSearchCV): Grid Search, implemented as GridSearchCV in scikit-learn, works on a simple principle: it evaluates every combination of the hyperparameter values you provide. This becomes computationally expensive when there are many hyperparameters or many choices for each, but in exchange it guarantees finding the best combination within the range of values you supplied (see the GridSearchCV sketch after this list).
- Random Search (RandomizedSearchCV, KerasTuner): Unlike Grid Search, Random Search samples random combinations from the values you provide and tests only those. It is not guaranteed to find the best combination, but it is much quicker to run (a random-search sketch follows this list).
- Bayesian Optimization: Unlike Grid Search and Random Search, Bayesian Optimization takes the results of past combinations into account and selects the next set of values accordingly. It typically converges faster than the methods above, because each new selection is made with prior information (a sketch follows this list).
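Here is a minimal Grid Search sketch using scikit-learn's GridSearchCV. The SVC model, the parameter grid, and the synthetic dataset are illustrative choices, not requirements:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic data just for demonstration.
X, y = make_classification(n_samples=200, random_state=42)

# Every combination of these values is tried: 3 x 2 = 6 candidates,
# each evaluated with 5-fold cross-validation (30 fits in total).
param_grid = {
    "C": [0.1, 1, 10],
    "kernel": ["linear", "rbf"],
}

search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

Note how the cost grows multiplicatively: adding one more value for `C` adds a whole new row of combinations to evaluate.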
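For Random Search, KerasTuner's RandomSearch tuner works the same way for Keras models; the self-contained sketch below uses scikit-learn's RandomizedSearchCV instead, with illustrative distributions and an arbitrary trial budget:

```python
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, random_state=42)

# Instead of exhausting all combinations, n_iter random draws are tested,
# so the budget is fixed no matter how large the search space is.
param_distributions = {
    "C": loguniform(1e-2, 1e2),
    "kernel": ["linear", "rbf"],
}

search = RandomizedSearchCV(
    SVC(), param_distributions, n_iter=10, cv=5, random_state=42
)
search.fit(X, y)
print(search.best_params_, search.best_score_)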
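For Bayesian Optimization, one option is Optuna, whose default sampler uses the scores of past trials to propose promising new values. The objective function, ranges, and trial count below are all illustrative assumptions:

```python
import optuna
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, random_state=42)

def objective(trial):
    # Each trial's suggestions are informed by the results of earlier trials.
    c = trial.suggest_float("C", 1e-2, 1e2, log=True)
    kernel = trial.suggest_categorical("kernel", ["linear", "rbf"])
    return cross_val_score(SVC(C=c, kernel=kernel), X, y, cv=5).mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
print(study.best_params, study.best_value)
```

Because each trial is guided by prior results, a small budget like 20 trials often gets close to the best score that an exhaustive grid would find.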