Linear Regression

In my previous post (linked), we went through the math behind linear regression.

Let's revise:

The equation of linear regression (LR) is:

Y = mX + b

This is also called the equation of a line, where m is the slope and b is the intercept.
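
As a quick illustration, here is the line equation as code. The values of m, b, and X below are made up just to show the arithmetic:

```python
# Predict Y from X using the line equation Y = m*X + b (illustrative values).
m = 2.0   # slope
b = 1.0   # intercept
X = 4.0   # an input value

Y = m * X + b
print(Y)  # 9.0
```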

Q. What are m (slope) and b (intercept)?

Ans: Slope (m) = change in Y / change in X = (Y2 - Y1) / (X2 - X1)

This tells us how steep the line is with respect to X.

Intercept (b) = the value of Y when X is 0.
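
As a small sketch (the two points below are made up for illustration), this is how the slope and intercept fall out of the definitions above:

```python
# Slope and intercept from two points on a line (illustrative values).
x1, y1 = 1.0, 3.0
x2, y2 = 3.0, 7.0

m = (y2 - y1) / (x2 - x1)   # change in Y / change in X = (7 - 3) / (3 - 1) = 2.0
b = y1 - m * x1             # rearranged from y1 = m*x1 + b, gives 1.0

print(f"slope m = {m}, intercept b = {b}")  # slope m = 2.0, intercept b = 1.0
```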


We can get the values of X (and Y) from the dataset, but what about m and b? How do we get those?

Before that, let's understand the cost function. The cost function is simply the average of the squared differences between the predicted and actual values of Y.

MSE = (1/n) * Σ (Y_predicted - Y_actual)², summed over all n data points.

This cost function is also called the Mean Squared Error (MSE). Our goal is to minimize this error.


In short, we must choose m and b so that the MSE is as low as possible. A low MSE means the predicted values of Y are very close to the actual values, which gives us the best-fit line for our model's predictions.
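
Here is a minimal sketch of that idea in Python. The data points and the candidate values of m and b are made up for illustration:

```python
# Mean Squared Error for a candidate line Y = m*X + b (illustrative data).
X = [1.0, 2.0, 3.0, 4.0]    # feature values from the dataset
Y = [3.1, 4.9, 7.2, 8.8]    # actual target values

m, b = 2.0, 1.0             # candidate slope and intercept

# Predict Y for each X, then average the squared differences.
Y_pred = [m * x + b for x in X]
mse = sum((yp - y) ** 2 for yp, y in zip(Y_pred, Y)) / len(Y)

print(f"MSE = {mse:.4f}")   # MSE = 0.0250
```

Trying different values of m and b changes the MSE; the best line is the one where this number is smallest.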

Now the next question arises: how do we find the values of m and b that give the minimum error?

For this, we will use the Gradient Descent approach.

In my upcoming post, we will understand Gradient Descent in a simple way.
