Machine learning
ML is an application of AI


It's been two weeks already since I started my Machine Learning journey at Codegnan. I remember being so excited to learn on the first day.

In the last two weeks I have learnt a lot of Python topics, some of which I already knew and some which I didn't.

My mentor Saketh Kallepu really did a great job of keeping things interesting, which has helped me stay consistent so far.

So here's what I've learnt so far:



[Image: ML, DL, NLP and NMT are all applications of AI]

On the very first day I learnt the fundamentals of Machine Learning, covering questions like "What is Machine Learning?", "How is ML related to AI and Deep Learning?", and what exactly we would be using to build Machine Learning models.




On the second day we got started with the basics of Python, and we continued learning its fundamental concepts until the sixth day, mainly focusing on the modules and libraries, such as NumPy and Pandas, that play a crucial role in Machine Learning.
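To give a flavour of those fundamentals, here is a minimal sketch of the kind of NumPy and Pandas basics we practised; the array values and column names are just made up for illustration.

```python
import numpy as np
import pandas as pd

# NumPy: fast numerical operations on whole arrays at once
ages = np.array([22, 38, 26, 35, 29])
print(ages.mean(), ages.max())          # vectorised statistics

# Pandas: tabular data with labelled columns
df = pd.DataFrame({"name": ["A", "B", "C"],
                   "age": [22, 38, 26],
                   "fare": [7.25, 71.28, 7.92]})
print(df.describe())                    # quick summary of the numeric columns
print(df[df["age"] > 25])               # filtering rows by a condition
```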


Machine Learning models require lots of data in order to learn. Most datasets come in the form of tables with many columns and rows, some of which are not needed, and some rows contain inconsistent or missing values.

So, before we can use a dataset to train our Machine Learning model, we first need to prepare it. That's exactly what I learnt to do on the seventh day.

I understood how a Data Scientist would study the data and prepare it before being able to tell stories about it. This is achieved using the following simple steps:


  • Understanding the Data (what and why)
  • Data preparation
  • Data cleansing or Preprocessing
  • Data Normalization
  • Data Visualization


We've implemented all these steps on the Titanic dataset, which is known as the "Hello World" dataset of Machine Learning.
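Here is a rough sketch of what those steps can look like in Pandas, assuming the Titanic data sits in a local titanic.csv with its usual column names (Age, Fare, Cabin, Embarked, Survived); adjust the names to your copy of the data.

```python
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("titanic.csv")

# 1. Understanding the data: what columns exist and what they contain
print(df.info())
print(df.describe())

# 2-3. Preparation / cleansing: drop unusable columns, fill missing values
df = df.drop(columns=["Cabin"], errors="ignore")              # mostly empty column
df["Age"] = df["Age"].fillna(df["Age"].median())
df["Embarked"] = df["Embarked"].fillna(df["Embarked"].mode()[0])

# 4. Normalization: rescale a numeric column into the 0-1 range
df["Fare_norm"] = (df["Fare"] - df["Fare"].min()) / (df["Fare"].max() - df["Fare"].min())

# 5. Visualization: a quick look at survival counts
df["Survived"].value_counts().plot(kind="bar")
plt.show()
```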



Plotly

On the 8th day I was introduced to the interactive visualization tool Plotly by my mentor Saketh Kallepu, which, to be honest, I loved working with. Plotly has intrigued me the most so far; I found it reliable, consistent, interactive and easy to use. I learnt how to import Plotly and other required libraries such as cufflinks, and built basic graphs such as bar graphs, line graphs and scatter plots using the iplot function in both offline and online modes.
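As a small illustration, here is a minimal sketch of offline plotting with cufflinks and iplot; it assumes both plotly and cufflinks are installed, and the DataFrame is just dummy data.

```python
import pandas as pd
import cufflinks as cf

cf.go_offline()                          # render charts offline, without a Plotly account

df = pd.DataFrame({"x": list(range(10)),
                   "y": [v ** 2 for v in range(10)]})

df.iplot(kind="line", x="x", y="y")      # interactive line graph
df.iplot(kind="bar", x="x", y="y")       # interactive bar graph
df.iplot(kind="scatter", x="x", y="y",
         mode="markers")                 # interactive scatter plot
```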

[Image: Plotly as a library for visualization in Python]

I've also worked on a case study using Plotly Express, creating various graphs (a short sketch follows the list below), including:

  • Interactive bar charts
  • Interactive scatter plots
  • Bubble plots
  • Animated Bubble plots
  • Facet plots (A single graph showing multiple plots at the same time)
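Here is a short sketch of the kinds of Plotly Express charts from the case study, built on the gapminder sample dataset that ships with Plotly rather than the dataset I actually used.

```python
import plotly.express as px

df = px.data.gapminder()

# Interactive bar chart
px.bar(df[df["country"] == "India"], x="year", y="pop").show()

# Bubble plot: a scatter plot where marker size encodes a third variable
px.scatter(df[df["year"] == 2007], x="gdpPercap", y="lifeExp",
           size="pop", color="continent", log_x=True).show()

# Animated bubble plot over time
px.scatter(df, x="gdpPercap", y="lifeExp", size="pop", color="continent",
           animation_frame="year", log_x=True, range_y=[20, 90]).show()

# Facet plot: one figure split into several sub-plots, one per continent
px.scatter(df[df["year"] == 2007], x="gdpPercap", y="lifeExp",
           facet_col="continent", log_x=True).show()
```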

I've posted about this case study along with the graphs. Here is the link.



Machine Learning

It was on the 9th day at Codegnan that we started learning about Machine Learning and its basic concepts.

[Image: Traditional Programming vs ML]

This included:

  • The main design principles of Machine Learning.
  • Feature Scaling and its types (min-max normalization and standardization with StandardScaler).
  • Scikit-Learn library.
  • Fitting and Transforming

I've implemented the feature scaling methods mentioned above with the help of the Scikit-Learn library, which helped me understand how things work in practice.
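As a quick illustration, here is a minimal sketch of both scaling methods with Scikit-Learn on a tiny made-up array.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

X = np.array([[1.0], [5.0], [10.0], [20.0]])

# Min-max normalization: rescales every value into the [0, 1] range
minmax = MinMaxScaler()
print(minmax.fit_transform(X))       # fit() learns min/max, transform() rescales

# Standardization: rescales to zero mean and unit variance
standard = StandardScaler()
print(standard.fit_transform(X))
```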



From the 10th day I learnt a lot about Machine Learning models, starting with Linear Regression. First our mentor Saketh Kallepu explained the theory behind it and gave us the formulas used in Linear Regression, and then we implemented it practically in Python in two ways (a sketch of both approaches follows the list below).

[Image: Linear Regression graph]

  1. The mathematical approach - where I performed all the calculations using the regression formulas and plotted the regression line along with the data.
  2. The Scikit-learn approach - here we did the same thing using the sklearn library, whose predefined functions made building the Linear Regression model a lot easier.
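Here is a rough sketch of both approaches on made-up one-dimensional data; the formulas in the first part are the standard least-squares expressions for the slope and intercept.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2, 4, 5, 4, 6], dtype=float)

# 1. Mathematical approach: least-squares formulas for slope and intercept
slope = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
intercept = y.mean() - slope * x.mean()
print("formula:", slope, intercept)

# 2. Scikit-learn approach: the same line via LinearRegression
model = LinearRegression().fit(x.reshape(-1, 1), y)
print("sklearn:", model.coef_[0], model.intercept_)
```

Both routes produce the same line; sklearn simply hides the arithmetic behind fit().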

Data can come in different forms, so different datasets require different types of Machine Learning models to make the predictions more accurate.

[Image: Polynomial vs Linear Regression]

One such alternative, for when a Linear Regression model fails to achieve accurate predictions, is Polynomial Regression. It can be used easily via the PolynomialFeatures class from the preprocessing module of the sklearn library.
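As an illustration, here is a minimal sketch of Polynomial Regression with PolynomialFeatures on made-up data that follows a curve rather than a straight line.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

x = np.linspace(0, 5, 20).reshape(-1, 1)
y = 1 + 2 * x.ravel() + 3 * x.ravel() ** 2       # quadratic relationship

# Expand x into [1, x, x^2] and fit an ordinary linear model on top
poly = PolynomialFeatures(degree=2)
x_poly = poly.fit_transform(x)
model = LinearRegression().fit(x_poly, y)
print(model.coef_, model.intercept_)             # recovers the quadratic coefficients
```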

In order to check how well the model predicts on data it hasn't seen, a dataset is usually split into two parts:

  • The training data (used to train the model)
  • The testing data (used to test the model)

This is achieved by using the train_test_split function.
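Here is a minimal sketch of such a split; the test_size and random_state values are just commonly used example settings, not anything prescribed.

```python
import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(20).reshape(-1, 1)
y = np.arange(20)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)        # 80% train, 20% test

print(X_train.shape, X_test.shape)               # (16, 1) (4, 1)
```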

Here's the link to the post where I shared the graphs that I plotted for Linear Regression and Polynomial Regression.



The last topic I learnt about was the performance of a Machine Learning model: how it uses the data we provide to generalize and predict accurate results. This depends on the amount of data we give the model and on how complex the model is relative to that data, and it can lead to one of three fitting scenarios (a small sketch follows the list below):

  1. Underfitting (the model is too simple, or the training data too limited, to capture the underlying pattern)
  2. Overfitting (the model fits the training data too closely, memorizing its noise, so it performs poorly on new data)
  3. Ideal fitting (the model captures the pattern and generalizes well to unseen data)
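Here is a rough sketch that mimics the three scenarios by varying the polynomial degree (a stand-in for model complexity) on a small noisy dataset; the exact scores will vary, but the gap between the train and test scores illustrates the idea.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
x = np.linspace(0, 3, 30).reshape(-1, 1)
y = np.sin(2 * x).ravel() + rng.normal(scale=0.1, size=30)   # noisy curve

x_train, x_test, y_train, y_test = train_test_split(x, y, random_state=0)

for degree, label in [(1, "underfitting"), (4, "reasonable fit"), (15, "overfitting")]:
    features = PolynomialFeatures(degree)
    model = LinearRegression().fit(features.fit_transform(x_train), y_train)
    train_score = model.score(features.transform(x_train), y_train)
    test_score = model.score(features.transform(x_test), y_test)
    print(f"degree {degree:2d} ({label}): train R^2={train_score:.2f}, test R^2={test_score:.2f}")
```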

Click the following link to check out the post with the graph where I plotted all the scenarios mentioned above.





Saketh Kallepu

Chief Management Officer and Data Science Mentor | Computational Intelligence


"Keep up your learning, Codegnan will always be there to guide you."
