Introduction to Linear Regression in Machine Learning
Before we start with Linear Regression, let's have a brief introduction to Machine Learning (ML). For the past decade, Machine Learning has been one of the most used buzzwords in the IT field.
Machine Learning is the field of study that gives computers the capability to learn without being explicitly programmed. Here the computer learns from and analyzes data and improves its predictions using algorithms. For this to happen, a machine learning algorithm builds a model using sample data called training data. After the model has been built, testing data is fed to the model to get predictions based on its previous training experience.
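To make this workflow concrete, here is a minimal sketch using scikit-learn; the built-in iris dataset and the KNeighborsClassifier are purely illustrative choices, not part of the discussion above.

```python
# Minimal train/test workflow: build a model from training data,
# then feed it testing data to get predictions (illustrative example).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)                  # sample data (features and labels)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)           # hold out 20% as testing data

model = KNeighborsClassifier()                     # any supervised algorithm works here
model.fit(X_train, y_train)                        # the algorithm builds a model from training data
predictions = model.predict(X_test)                # predictions on unseen testing data
print(model.score(X_test, y_test))                 # how well the training experience generalizes
```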
Types of Machine Learning Algorithms:
Supervised Learning is an ML approach that builds a model from labelled data, i.e. every training example comes with a label.
Unsupervised Learning is an ML approach that builds a model from unlabelled data, i.e. the training examples have no labels.
Regression: Regression is a statistical method that attempts to determine the strength and character of the relationship between one dependent variable (usually denoted by Y) and a series of other variables (known as independent variables).
Linear Regression: Linear regression is a simple statistical model for predicting a linear relationship between the dependent variable and the independent variables.
Here the independent variable is on the x-axis and the dependent variable is on the y-axis, and the relationship between them is linear. We can express this relationship in mathematical terms as
y = B1x + B0
Here y is the dependent variable, x is the independent variable, B1 is the regression coefficient (slope), and B0 is the intercept.
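As a quick illustration, the sketch below fits this line to some synthetic data with NumPy; the numbers used to generate the data (slope 2.5, intercept 1.0) are made up for the example.

```python
# Fit y = B1*x + B0 to synthetic data and recover B1 and B0 (illustrative only).
import numpy as np

rng = np.random.default_rng(42)
x = rng.uniform(0, 10, 50)                    # independent variable (x-axis)
y = 2.5 * x + 1.0 + rng.normal(0, 1, 50)      # dependent variable (y-axis) plus noise

B1, B0 = np.polyfit(x, y, 1)                  # degree-1 fit returns [slope, intercept]
print(f"regression coefficient B1 = {B1:.2f}, intercept B0 = {B0:.2f}")
```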
Types of Linear Regression:
The type of Linear Regression is determined by the number of independent variables (X) that the dependent variable (Y) depends on.
Simple Linear Regression: the dependent variable depends on only one input independent variable.
Multiple Linear Regression: the dependent variable depends on multiple input independent variables. A small sketch contrasting the two follows below.
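Here is a minimal sketch of both types using scikit-learn's LinearRegression; the tiny arrays are made-up values used only to show the difference in the shape of X.

```python
# Simple vs. multiple linear regression with scikit-learn (made-up data).
import numpy as np
from sklearn.linear_model import LinearRegression

y = np.array([3.1, 5.0, 6.9, 9.2])                       # dependent variable

# Simple linear regression: one independent variable (one column in X).
X_simple = np.array([[1.0], [2.0], [3.0], [4.0]])
simple_model = LinearRegression().fit(X_simple, y)
print(simple_model.coef_, simple_model.intercept_)        # one coefficient + intercept

# Multiple linear regression: several independent variables (several columns).
X_multiple = np.array([[1.0, 0.5], [2.0, 1.5], [3.0, 2.0], [4.0, 3.5]])
multiple_model = LinearRegression().fit(X_multiple, y)
print(multiple_model.coef_, multiple_model.intercept_)    # one coefficient per feature + intercept
```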
Here in Linear Regression, the training data is fed to the model and we get predicted values, which are continuous in nature. We compare the predicted values with the actual values and try to minimize the error between them; if the error is small, we can say our linear regression model is a good fit for predicting outputs.
A Linear Regression model's main aim is to find the best-fit linear line and the optimal values of the intercept and coefficients such that the error between the actual and predicted values is minimized.
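One common way to measure that error is the mean squared error between the actual and predicted values; the sketch below, again on synthetic data, shows this check with scikit-learn.

```python
# Measure fit quality: compare predicted values with actual values
# using mean squared error (smaller error means a better fit).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (100, 1))                          # independent variable
y_actual = 4.0 * X[:, 0] + 2.0 + rng.normal(0, 1, 100)    # actual values (synthetic)

model = LinearRegression().fit(X, y_actual)               # least squares minimizes this error
y_predicted = model.predict(X)                            # continuous predicted values

print("mean squared error:", mean_squared_error(y_actual, y_predicted))
```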
For this, the model has to satisfy a few assumptions like linearity, homoscedasticity, etc., which I will cover in my next article. Happy reading!