Ridge Regression


On Day 23 of the ML: Teach by Doing Project, I learnt all about Ridge Regression.

The reason it’s called “Ridge” Regression is really fascinating. Once you understand the nomenclature, you understand the heart of ridge regression.

Look at these two figures: both show the least squares criterion in parameter space, without the ridge penalty on the left and with it on the right.

When the least squares criterion has a ridge in parameter space, the ridge regression penalty gets rid of it by pushing the criterion up as the parameters move away from the origin.

In the left plot, a large change in the parameter values (moving along the ridge) produces a negligible change in the loss. This makes the problem numerically unstable: the solution is very sensitive to small changes in the data, and the parameter estimates can come out very large in magnitude.

By contrast, in the right plot, the penalty has lifted the ridge up. Small changes in the data can no longer produce gigantic changes in the resulting estimates: the solution becomes stable when we use ridge regression.
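
To connect the picture to a formula (this is the standard textbook criterion, written in my own notation since the post doesn’t spell it out; λ ≥ 0 is the penalty strength): ridge regression picks the coefficients β that minimize

$$\lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_2^2,$$

which has the closed-form solution

$$\hat{\beta}_{\text{ridge}} = (X^\top X + \lambda I)^{-1} X^\top y.$$

The penalty term grows as the parameters move away from the origin, which is exactly the “lifting” you see in the right plot. Algebraically, adding λI to XᵀX makes the matrix well conditioned even when the predictors are nearly collinear, so the ridge in the loss surface disappears.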

Ridge Regression is amazing, if you understand it the right way.
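
If you want to see this stabilisation in numbers, here is a minimal NumPy sketch. It’s my own toy example (two nearly collinear predictors and a hand-picked penalty λ = 1), not one of the lecture’s code files:

```python
import numpy as np

# Toy example (assumed for illustration): two nearly collinear predictors
# create a "ridge" in the least squares criterion, so many coefficient
# pairs fit almost equally well and the OLS solution is unstable.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 1e-3 * rng.normal(size=n)      # almost an exact copy of x1
X = np.column_stack([x1, x2])
y = x1 + x2 + 0.1 * rng.normal(size=n)   # true coefficients are (1, 1)

# Ordinary least squares: solve the normal equations X'X b = X'y.
b_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge regression: adding lambda * I to X'X lifts the ridge and makes
# the system well conditioned.
lam = 1.0
b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)

print("OLS coefficients:  ", b_ols)
print("Ridge coefficients:", b_ridge)

# Perturb the responses slightly and refit: the OLS estimates move a lot,
# the ridge estimates barely move.
y2 = y + 0.01 * rng.normal(size=n)
b_ols2 = np.linalg.solve(X.T @ X, X.T @ y2)
b_ridge2 = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y2)
print("OLS shift after a tiny perturbation:  ", np.abs(b_ols2 - b_ols))
print("Ridge shift after a tiny perturbation:", np.abs(b_ridge2 - b_ridge))
```

On a typical run, the OLS coefficients come out large and opposite in sign and jump around when the data are nudged, while the ridge coefficients stay close to the true (1, 1) and hardly change.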

I made a video to explain all my learnings here.

It will premiere on YouTube at 9 am IST on 26 April, and you can watch it here:


Here is a link to access my Lecture Notes and the 3 code files used in the lecture: Link

Stay tuned for Day 24!

