Cost Function in Linear Regression

As discussed in the previous article on Linear Regression, we need to reduce the error between the actual and predicted values.


error = y(actual) - y(predicted)

error = y - (B0 + B1x)
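To make this concrete, here is a minimal sketch in Python (using NumPy, with made-up sample data and candidate coefficient values, purely for illustration) that computes the raw errors for a simple line:

```python
import numpy as np

# Hypothetical sample data and candidate coefficients (not from the article)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.3])   # actual values
B0, B1 = 0.5, 1.8                          # candidate intercept and slope

y_pred = B0 + B1 * x    # predicted values from the line
error = y - y_pred      # raw errors: some come out positive, some negative
print(error)
```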

Observing the figure, some of the error terms are positive and some are negative. The error should always contribute a positive amount, so to guarantee positive terms we can use the Mean Absolute Error, which applies the modulus (absolute value) operation to each error.

Mean Absolute Error (MAE) = (1/n) Sum |y - (B0 + B1x)|, where n is the total number of samples.
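As a rough sketch (reusing the same hypothetical data and coefficients as above), the MAE can be computed like this:

```python
import numpy as np

# Hypothetical data and candidate coefficients (illustration only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.3])
B0, B1 = 0.5, 1.8

# Absolute value removes the sign of each error before averaging
mae = np.mean(np.abs(y - (B0 + B1 * x)))
print(mae)
```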

An alternative to the absolute error is the Mean Squared Error (MSE). MSE tells you how close a regression line is to a set of points: it takes the distances from the points to the regression line (these distances are the "errors") and squares them. Squaring removes any negative signs, and it also gives more weight to larger differences.

Mean Squared Error (MSE) = (1/n) Sum (y - (B0 + B1x))^2

We can treat this error as a cost function. If we observe the equation, the input x and the actual output y are fixed; the MSE can only be reduced by changing the constant and coefficient terms B0 and B1. The cost function measures how well a linear regression model is performing and is what we optimize to find the regression coefficients (weights). In other words, it quantifies the accuracy of the mapping function that maps the input variable to the output variable.

Cost function = (1/n) Sum (y - (B0 + B1x))^2
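A minimal sketch of this idea, again with made-up data and candidate values: since x and y stay fixed, the cost depends only on the coefficients, so we can evaluate the MSE cost for a few candidate (B0, B1) pairs and keep the pair with the lowest cost.

```python
import numpy as np

# Hypothetical fixed data; only B0 and B1 change (illustration only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.3])

def cost(B0, B1):
    """Mean squared error cost for a given intercept and slope."""
    return np.mean((y - (B0 + B1 * x)) ** 2)

# Compare a few candidate coefficient pairs
candidates = [(0.0, 1.0), (0.5, 1.8), (0.0, 2.0), (0.2, 2.0)]
for B0, B1 in candidates:
    print(f"B0={B0}, B1={B1} -> cost={cost(B0, B1):.3f}")

best = min(candidates, key=lambda c: cost(*c))
print("Lowest-cost candidate:", best)
```

In practice the coefficients are not picked from a small list like this; they are found by minimizing the cost function, for example with gradient descent or the closed-form least-squares solution.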

This model also relies on a few assumptions, such as linearity and homoscedasticity, which I will cover in my next article. Happy reading!

Previous post link: https://www.dhirubhai.net/pulse/introduction-linear-regression-machine-learning-gummadidala/?trackingId=YMKwuYBDR%2F6PfzL9hGJnHw%3D%3D

