The Model-Exposed: Getting to know Linear Regression


This article is part of the X-Exposed series, where I share some basic knowledge about technical topics in an entertaining way. To learn more about this series, please head to this post.

In today’s interview, we are going to have a talk with one of the most commonly used machine learning algorithms, Linear Regression. I am particularly excited about this interview because my guest will teach us a lot of things about machine learning and statistics.

Note: In this article, I use the terms machine learning algorithm and machine learning model interchangeably, even though there is a difference between them.

Without further ado, let’s start with today’s interview: The Model-Exposed.


  • Host: Welcome, Linear Regression. Glad you’re willing to talk with us.
  • Linear Regression: I’m really happy to be here too.
  • Host: So, in one word tell us what sets you apart from other machine learning models.
  • Linear Regression: Interpretability.
  • Host: Care to elaborate?
  • Linear Regression: By Interpretability, I mean that a person could easily grasp the underlying algorithm that I use to make predictions. In fact, my popularity is due in large part to my interpretability. That is, even non-technical people could understand the parameters with just a little bit of explanation.
  • Host: So, I hear you're good at showing relationships in the data.
  • Linear Regression: Yeah, that’s part of my job. I always try to find a linear relationship between features and outcomes. When you have only one feature, this relationship is just a line fitted through the data.
  • Host: The idea behind you is quite straightforward. Am I missing something?
  • Linear Regression: Not at all. Just keep in mind that simple doesn’t necessarily mean weak or limited.
  • Host: I see. But is this all that sets you apart?
  • Linear Regression: As opposed to many ML models, I’m less taxing on resources and considerably faster.
  • Host: How so?
  • Linear Regression: Contrary to other, more complex algorithms, making a prediction for me is as simple as evaluating my equation for a specific set of inputs.
  • Host: Tell us about your buddy Gradient Descent. I’ve heard you can’t get through the day without him.
  • Linear Regression: That is true. Without Gradient Descent, I’d be in big trouble, because I wouldn’t have an efficient way to fit a line to the data when the number of features is enormous. There is a closed-form solution (the normal equation), but it gets expensive as the feature count grows, and you certainly can’t find the optimal line manually. You need a scalable way to do it, and my buddy Gradient Descent always helps me with that.
  • Host: Got it. I guess the thing I still don’t understand is what is the difference between you and correlation? Aren’t you two sides of the same coin?
  • Linear Regression: No, we aren’t. Correlation just measures the strength of the linear relationship between two variables and doesn’t fit a line through the data. One way to quantify this relationship is with a correlation coefficient such as Pearson’s r. I, on the other hand, as a regression algorithm, express the linear relationship as a mathematical equation that lets you predict the value of the dependent variable from the values of one or more independent variables. But be careful: neither of us establishes a cause-and-effect relationship on our own.
  • Host: Hmm, that makes things much clearer. One thing that still confuses me, however, is how you relate to Logistic Regression. Are you two really the same person?
  • Linear Regression: No, I have nothing to do with Logistic Regression. Even though we are both supervised machine learning models, he is a classification model, not a regression model, despite his name. We are miles apart!
  • Host: With those new technologies like deep learning, many people think that your glorious life has come to an end. Is that true?
  • Linear Regression: What do you mean by that?
  • Host: Let me just put it out there: Neural Networks are becoming quite a celebrity nowadays—what’s your take on this?
  • Linear Regression: Well, I don’t want to sound cocky, but I think that a lot of the credit going to Neural Networks should be going to me. In order to master Neural Networks, it is imperative to have a good understanding of linear regression.
  • Host: That’s intriguing! How is it possible that the simplest machine learning model is the backbone of the world’s most sophisticated deep learning models?
  • Linear Regression: If you understand linear regression and gradient descent, understanding neural networks shouldn’t be a problem at all. For example, when you look at a feed-forward neural network architecture, you will immediately notice that each neuron is built on linear regression.
  • Host: I see. Well, I think we have come to the end of our interview. Any piece of advice before you go?
  • Linear Regression: Sure. Remember that simplicity is key. Nobody likes unnecessary complexity or wasted computation. Just because a model sounds sophisticated doesn’t mean it is the way to go.
  • Host: Well, thank you Linear Regression for granting us some of your time.
  • Linear Regression: No problem.
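To make Linear Regression’s “line fitted through the data” concrete, here is a minimal sketch in plain Python of the closed-form least-squares fit for a single feature. The data values are made up for illustration and are not from the interview:

```python
def fit_line(xs, ys):
    """Return slope w and intercept b minimizing the squared error
    of the line y = w * x + b over the given points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope is covariance(x, y) divided by variance(x).
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    w = cov / var
    b = mean_y - w * mean_x
    return w, b

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.0, 8.1, 9.9]  # roughly y = 2x
w, b = fit_line(xs, ys)
```

Once the line is fitted, making a prediction is just evaluating `w * x + b` for a new `x`, which is exactly why Linear Regression can claim that predictions are cheap.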
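Gradient Descent’s role in the fitting process can be sketched as follows. The learning rate, step count, and toy data below are arbitrary illustrative choices, not values from the article:

```python
def gradient_descent(xs, ys, lr=0.01, steps=5000):
    """Fit y = w * x + b by repeatedly stepping down the gradient
    of the mean squared error with respect to w and b."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Partial derivatives of MSE = mean((w*x + b - y)^2).
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Data generated from y = 2x, so the fit should approach w = 2, b = 0.
w, b = gradient_descent([1, 2, 3, 4, 5], [2, 4, 6, 8, 10])
```

For one feature the closed-form solution is simpler, but this iterative update is what scales when the number of features is enormous.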
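The correlation-versus-regression distinction can be illustrated by computing Pearson’s r directly. `pearson_r` is a hypothetical helper written for this sketch; note that it returns only a single number measuring the strength of the linear relationship, with no fitted line to predict from:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 10]
r = pearson_r(xs, ys)  # perfectly linear data, so r is (almost exactly) 1.0
```

The two are still related: for one feature, the regression slope equals r scaled by the ratio of the standard deviations of y and x.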
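The relationship to Logistic Regression can also be sketched: logistic regression computes the same kind of linear score, but passes it through a sigmoid so the output becomes a class probability rather than a raw number. The weight and bias values here are arbitrary, chosen only for illustration:

```python
import math

def linear_score(w, b, x):
    # The shared core: a plain linear regression score.
    return w * x + b

def logistic_predict(w, b, x):
    # Logistic regression squashes the linear score into (0, 1)
    # with a sigmoid, turning it into a probability for classification.
    return 1 / (1 + math.exp(-linear_score(w, b, x)))

p = logistic_predict(1.5, -3.0, 4.0)  # a probability strictly between 0 and 1
```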
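Finally, the claim that feed-forward networks build on linear regression can be made concrete with a single neuron: a weighted sum plus a bias (exactly a linear regression prediction) followed by a nonlinear activation. The weights, bias, and inputs below are made-up illustrative values:

```python
import math

def neuron(weights, bias, inputs, activation=math.tanh):
    """One neuron of a feed-forward network: a linear regression
    over its inputs, followed by a nonlinear activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(z)

out = neuron([0.5, -0.25], 0.1, [2.0, 4.0])  # tanh(0.5*2 - 0.25*4 + 0.1)
```

A whole network is just many of these stacked in layers, which is why mastering linear regression (and gradient descent) takes you most of the way to understanding neural networks.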

That's it for today's interview. Thank you for making it to the end of this article.

It is worth bearing in mind that the information presented in this article comes from my own learning experience. So, if you find any inconsistency, feel free to write a comment. I’ll be happy to read it!

