Artificial Intelligence #34 - Foundations of Coding for artificial intelligence - part two

Hello all

Welcome to artificial intelligence #34

Previously, I wrote about a chance to get a free copy of my book, Foundations of Coding for artificial intelligence.

We have the winners now (listed below in this edition)

In this edition, I will explain the rationale and approach of the online course, and also outline the syllabus, which you could adapt for your own learning.

There are many books on AI and on learning to code for AI.

Yet, it's a hard subject from a learner's standpoint.

As a teacher of AI, I see this often.

Like sales, teaching requires you to put yourself in the mind of the participant, i.e. to understand the subject as they would see it.

Specifically, that means understanding issues like:

  • Cognitive overload (too many things at once)
  • Cognitive dependencies (too many things you need to know before you can learn something)
  • An unclear sequence of learning
  • Lack of context (the big picture)
  • Failure to limit scope (trying to learn too many things)
  • Missing the common elements between problems
  • Lack of a motivating case study / example that explains a problem first and then addresses it in code or maths

So, one of my strategies was to focus the book on the Fashion MNIST dataset.

I first heard about this dataset from Amy Boyd (a long-term member of our team at the #universityofoxford).

It's simple to understand but, unlike MNIST, Fashion MNIST can be used for a very wide range of problems. Hence, the approach is tailored mainly to this one dataset, as I explain below.

A couple more maths things this week: the concept of Objective Bayesianism by Graeme Keith, and "An Evening with Leonhard Euler", took me off Netflix this week :)

Finally, this week I contributed to the good work on women's education in Afghanistan through Afghan Women on the Move, run by my friend Maryam Popal Zahid.

The course outline (for the book) and the winners as per the survey are below.

Note: the course is now closed, but I hope this approach will still help you in your learning.

Detailed schedule

Week 1: Feb 6 – Concepts

The aim of this week is to introduce you to the core concepts in developing and building a machine learning model, including:

  • an overview of machine learning and the types of machine learning
  • an introduction to the two main problem types in machine learning: regression and classification
  • an introduction to the key machine learning libraries in Python
  • a definition of the machine learning workflow
  • methods and techniques to build and train a machine learning model (such as data exploration, feature selection, feature engineering, preprocessing (outliers, normalization, missing values), and model selection and evaluation); a minimal sketch of this workflow follows this list
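
To make these steps concrete, here is a minimal sketch of this workflow using scikit-learn. The Iris dataset and the injected missing values are my own illustrative choices, not part of the course material.

# A minimal sketch of the machine learning workflow outlined above, assuming
# scikit-learn is installed; each step maps to one of the bullets
import numpy as np
from sklearn.datasets import load_iris
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Data exploration: shapes, summary statistics, class balance
X, y = load_iris(return_X_y=True, as_frame=True)
print(X.describe())
print(y.value_counts())

# Inject a few missing values so the preprocessing step has work to do
X.iloc[::25, 0] = np.nan

# Preprocessing and model selection: impute, normalize, then pick a simple model
model = make_pipeline(SimpleImputer(strategy="median"),
                      StandardScaler(),
                      LogisticRegression(max_iter=200))

# Train/test split, training and evaluation
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0, stratify=y)
model.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))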

Week 2: Feb 13 – Hands-on exercises in classification and regression

Exercise: Boston House Prices prediction

In this exercise, we aim to predict Boston house prices based on several environmental, economic, demographic, and societal features using the Boston Housing dataset. First, we explore the data. We then build a regression model (using support vector regression) and evaluate its performance.
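
As a rough guide, here is a minimal sketch of this exercise with scikit-learn. The sketch pulls the dataset from OpenML (under the name "boston"), since load_boston has been removed from recent scikit-learn releases; the kernel and C value are illustrative, not the course's prescribed settings.

# A minimal sketch of the Boston house price regression, assuming scikit-learn;
# the dataset is fetched from OpenML because load_boston was removed from
# recent scikit-learn versions
from sklearn.datasets import fetch_openml
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Explore the data, then cast everything to float (some columns arrive as categories)
boston = fetch_openml(name="boston", version=1, as_frame=True)
X = boston.data.astype(float)
y = boston.target.astype(float)
print(X.describe())

# Hold out a test set for evaluation
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=42)

# Support vector regression works best on scaled features, so use a pipeline
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X_train, y_train)

# Evaluate the performance of the model on the held-out data
pred = model.predict(X_test)
print("MSE:", mean_squared_error(y_test, pred))
print("R^2:", r2_score(y_test, pred))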

Exercise: Breast Cancer classification

In this exercise, we aim to build a classification model (mainly support vector classification) that predicts whether a cancer diagnosis is benign or malignant based on several features.
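
A minimal sketch of this exercise with scikit-learn's built-in breast cancer dataset might look as follows; the kernel and C value are illustrative defaults.

# A minimal sketch of the breast cancer classification exercise, assuming
# scikit-learn: support vector classification of benign vs malignant diagnoses
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Scale the features, then fit a support vector classifier
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)

# In this dataset, class 0 is malignant and class 1 is benign
print(classification_report(y_test, clf.predict(X_test),
                            target_names=["malignant", "benign"]))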

Week 3: Feb 20 – Model selection and evaluation

The aim of this week is to explain model selection and evaluation, including:

  • An overview of model selection and model evaluation
  • The regression and classification evaluation metrics used to measure the performance of a trained model, and when to choose each metric
  • An overview of cross validation
  • An overview of ensemble methods
  • An overview of hyperparameter tuning (such as GridSearchCV and RandomizedSearchCV); see the sketch after this list
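
As a rough illustration of cross validation and hyperparameter tuning, the sketch below applies GridSearchCV to a support vector classifier on the breast cancer dataset; the parameter grid is an arbitrary example, not a recommended setting.

# A minimal sketch of cross validation and GridSearchCV, assuming scikit-learn
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
pipe = make_pipeline(StandardScaler(), SVC())

# 5-fold cross validation gives a more robust estimate than a single split
print("CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())

# Grid search over C and kernel; parameter names follow the pipeline step name
params = {"svc__C": [0.1, 1, 10], "svc__kernel": ["linear", "rbf"]}
search = GridSearchCV(pipe, params, cv=5)
search.fit(X, y)
print("Best params:", search.best_params_)
print("Best CV score:", search.best_score_)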

Week 4: Feb 27 – Feature engineering

An exploration of feature engineering techniques

This week draws on the techniques covered in the previous weeks, as well as some that appear in the following weeks. Feature engineering is a very important step in building powerful machine learning or deep learning models, and the techniques used for it are very diverse. We will cover the following (a minimal sketch of the transformation and extraction steps follows the list):

  • Feature selection, including correlation analysis and missing value imputation
  • Feature transformation, by means of normalization, standardization and other tools
  • Feature extraction, by finding a smaller set of new variables, each a combination of the input variables, containing essentially the same information as the input variables:
      a. PCA (Principal Component Analysis)
      b. t-SNE
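
To illustrate the transformation and extraction steps, here is a minimal sketch using scikit-learn; the breast cancer dataset stands in for whichever data the reader is working with.

# A minimal sketch of feature transformation (standardization) followed by
# feature extraction (PCA and t-SNE), assuming scikit-learn is installed
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

X, _ = load_breast_cancer(return_X_y=True)

# Standardize so each feature has zero mean and unit variance
X_std = StandardScaler().fit_transform(X)

# PCA: keep enough components to explain 95% of the variance
pca = PCA(n_components=0.95)
X_pca = pca.fit_transform(X_std)
print("PCA components kept:", pca.n_components_)
print("Explained variance:", np.round(pca.explained_variance_ratio_.sum(), 3))

# t-SNE: a non-linear 2-D embedding, typically used for visualisation
X_tsne = TSNE(n_components=2, random_state=42).fit_transform(X_std)
print("t-SNE embedding shape:", X_tsne.shape)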

Week 5: March 6 – Deep learning models

The aim of this week is to build a deep learning model in Keras, including:

  • An overview of deep learning: MLP, CNN, SLP, VGG16 and ResNet models
  • An overview of the Keras Model API: the Sequential and the Functional API
  • Steps to define and train a model in Keras
  • How to improve a baseline model: adding hidden layers, adding a dropout layer, changing the optimizer, the number of epochs and the batch size, and using SGD
  • Types of layer: convolutional, pooling (including max pooling) and fully connected layers; see the sketch after this list
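
Below is a minimal sketch of defining and training a small baseline CNN on Fashion MNIST with the Keras Sequential API, assuming TensorFlow 2.x; the architecture and hyperparameters are illustrative rather than the course's reference model.

# A minimal sketch: a small CNN on Fashion MNIST with the Keras Sequential API
from tensorflow import keras
from tensorflow.keras import layers

# Load Fashion MNIST, add a channel axis and scale pixels to [0, 1]
(x_train, y_train), (x_test, y_test) = keras.datasets.fashion_mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

# Baseline model: convolution + max pooling, then a dense head with dropout
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, 3, activation="relu"),   # convolutional layer
    layers.MaxPooling2D(),                     # max pooling layer
    layers.Flatten(),
    layers.Dense(128, activation="relu"),      # fully connected hidden layer
    layers.Dropout(0.3),                       # dropout layer to reduce overfitting
    layers.Dense(10, activation="softmax"),    # one output per Fashion MNIST class
])

# Changing the optimizer (e.g. "sgd" instead of "adam"), the number of epochs
# or the batch size are the kinds of baseline improvements listed above
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=64, validation_split=0.1)

print("Test accuracy:", model.evaluate(x_test, y_test, verbose=0)[1])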

Week 6: March 13 – Classification without dimensionality reduction

Exercise: Classification of Fashion MNIST without dimensionality reduction

The project consists of one main task: to successfully classify items in the Fashion MNIST dataset. We will divide the implementation into two main parts. The first part is written only once, so we will not need to rewrite its code every time we run the project. The second part contains the different models we will use, mainly: Convolutional Neural Networks (CNN), Multi-Layer Perceptron (MLP), Single Layer Perceptron (SLP), VGG16, ResNet, Gaussian Mixture Models and clustering (more specifically, K-means clustering). To have more fun, we will apply these models to the processed data both before and after applying Principal Component Analysis (PCA).
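
As a rough sketch of the unsupervised part of this project, the snippet below fits K-means and a Gaussian Mixture Model to a subset of Fashion MNIST; the subset size and diagonal covariance are my own shortcuts to keep the sketch fast, and the deep learning models follow the pattern shown in Week 5.

# A minimal sketch of the clustering part of the Week 6 project, assuming
# scikit-learn and the Keras copy of Fashion MNIST
from tensorflow import keras
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score
from sklearn.mixture import GaussianMixture

# Part one: load and preprocess the data once (a subset keeps the sketch fast)
(x_train, y_train), _ = keras.datasets.fashion_mnist.load_data()
X = x_train[:5000].reshape(5000, -1) / 255.0
y = y_train[:5000]

# Part two: fit the unsupervised models on the raw pixel features (no PCA yet)
kmeans = KMeans(n_clusters=10, n_init=10, random_state=42).fit(X)
gmm = GaussianMixture(n_components=10, covariance_type="diag",
                      random_state=42).fit(X)

# Compare the cluster assignments against the true class labels
print("K-means ARI:", adjusted_rand_score(y, kmeans.labels_))
print("GMM ARI    :", adjusted_rand_score(y, gmm.predict(X)))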

Week 7: March 20 – Principal component analysis

The aim of this week is to decide where and when to use PCA, including:

  • An overview of Principal Component Analysis
  • How to apply PCA: data standardization, creating the covariance matrix, eigendecomposition and feature transformation (a minimal sketch follows this list)
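
The steps above can be sketched in plain NumPy as follows; the toy data is random and purely for illustration.

# A minimal sketch of PCA following the steps listed above, assuming only NumPy
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))             # toy data: 200 samples, 5 features

# 1. Data standardization
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# 2. Covariance matrix of the standardized features
cov = np.cov(X_std, rowvar=False)

# 3. Eigendecomposition (eigh, since the covariance matrix is symmetric)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # order components by explained variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# 4. Feature transformation: project onto the top k principal components
k = 2
X_pca = X_std @ eigvecs[:, :k]
print("Explained variance ratio of the top components:", eigvals[:k] / eigvals.sum())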

Week 8: March 27 – Classification with dimensionality reduction

Exercise: Classification of Fashion MNIST with dimensionality reduction

This week repeats the Week 6 classification task, but this time on data that has been reduced with Principal Component Analysis (PCA). The same models (CNN, MLP, SLP, VGG16, ResNet, Gaussian Mixture Model and K-means clustering) are applied to the PCA-transformed data, so that their performance can be compared with the results obtained without dimensionality reduction.
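
For a rough idea of the effect of the dimensionality reduction step, the sketch below classifies a Fashion MNIST subset after PCA; logistic regression stands in for the heavier models listed above, and the subset size and 95% variance threshold are my own illustrative choices.

# A minimal sketch of classification after PCA on Fashion MNIST, assuming
# scikit-learn and the Keras copy of the dataset
from tensorflow import keras
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

(x_train, y_train), (x_test, y_test) = keras.datasets.fashion_mnist.load_data()
X_train = x_train[:10000].reshape(10000, -1) / 255.0
X_test = x_test.reshape(len(x_test), -1) / 255.0

# Scale, reduce to the components explaining 95% of the variance, then classify
clf = make_pipeline(StandardScaler(),
                    PCA(n_components=0.95),
                    LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train[:10000])
print("Test accuracy after PCA:", clf.score(X_test, y_test))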

Contact

Ajit.jaokar at feynlabs.ai

Finally, the list of participants selected based on the questionnaire responses:

  • Jay Padmanabhan
  • Dr. R. A. Gore
  • Jon Miller
  • sergio gonzalez
  • Ridwanullahi Abdulrauf
  • Franz Dill
  • Shayan Shah
  • Veloshan Pillay
  • Farid Kazi
  • Sandor Szabo
  • Chirag Mandal
  • Fernando Guarin
  • Magno Carneiro
  • S P Sreenivas
  • raj kosaraju
  • Nitin Malik
  • Sohom Majumder
  • Giuseppe Saladino
  • Tim Hurst
  • Srivatsa Nori
  • Maria Fernandez
  • Assadour Derderian
  • Neavil Porus
  • Sonia Fabiola Ortiz
  • Avinash Mishra
  • Varun Divadkar
  • Philippine Waisvisz


