Artificial Intelligence #34 - Foundations of Coding for Artificial Intelligence - part two
Hello all
Welcome to artificial intelligence #34
Previously, I wrote about a chance to get a free copy of my book, Foundations of Coding for Artificial Intelligence.
We have the winners now (listed below in this edition)
In this edition, I will explain the rationale and approach of the online course, and also outline the syllabus, which you could adapt for yourself.
There are many books on AI and learning coding for AI.
Yet, it's a hard subject from a learner's standpoint.
As a teacher of AI, I see this often.
Like sales, teaching requires you to put yourself in the mind of the participant, i.e. to understand the subject as they would see it.
Specifically, you need to understand the issues learners face.
So, one of my strategies was to focus the book on the Fashion MNIST dataset.
I first heard about this dataset from Amy Boyd (a long-term member of our team at the #universityofoxford).
It's simple to understand, but unlike MNIST, Fashion MNIST can be used for a very wide range of problems. Hence, the approach is tailored mainly to this one dataset, as I explain below.
A couple more maths things this week: the concept of Objective Bayesian probability by Graeme Keith, and "An Evening with Leonhard Euler" - which took me off Netflix this week :)
Finally, this week I contributed to the good work on women's education in Afghanistan through Afghan Women on the Move, run by my friend Maryam Popal Zahid.
The course outline (for the book) and the winners (as per the survey) are below.
Note the course is now closed, but I hope this approach will also help you in your learning.
Detailed schedule
Week 1 Feb 6 Concepts
The aim of this week is to introduce you to the core concepts in developing and building a machine learning model.
Week 2: Feb 13 – Hands-on exercises in Classification and Regression
Exercise: Boston House Prices prediction
In this exercise, we aim to predict Boston house prices based on several environmental, economic, demographic, and societal features, using the Boston House dataset. First, we explore the data. We then build a regression model (using support vector regression) and evaluate its performance.
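The regression workflow described above can be sketched as follows. Note that the Boston dataset has been removed from recent scikit-learn releases, so a synthetic regression dataset stands in for the house-price data here; the split/fit/evaluate steps are the same.

```python
# Minimal sketch of the Week 2 regression exercise.
# A synthetic dataset stands in for Boston house prices,
# which is no longer bundled with recent scikit-learn versions.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

# 1. Load the data (13 features, mirroring the Boston dataset's shape)
X, y = make_regression(n_samples=500, n_features=13, noise=10.0, random_state=0)

# 2. Split into train and test sets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# 3. Build a support vector regression model (scaling matters for SVR)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0))
model.fit(X_train, y_train)

# 4. Evaluate the performance of the model
preds = model.predict(X_test)
print("Test MSE:", mean_squared_error(y_test, preds))
```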
Exercise: Breast Cancer classification
In this exercise, we aim to create a classification model (mainly Support Vector Classification) that predicts whether a cancer diagnosis is benign or malignant based on several features.
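A minimal sketch of this exercise, using the Wisconsin breast cancer dataset bundled with scikit-learn (the exact preprocessing in the book may differ):

```python
# Minimal sketch of the Week 2 classification exercise.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# In this dataset, 0 = malignant and 1 = benign
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0)

# Support Vector Classification, with feature scaling
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)

acc = accuracy_score(y_test, clf.predict(X_test))
print(f"Test accuracy: {acc:.3f}")
```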
Week 3: Feb 20 – Model selection and evaluation
The aim of this week is to explain model selection and evaluation.
Week 4: Feb 27 – Feature engineering
An exploration of feature engineering techniques
This week builds on the techniques covered in the previous weeks and anticipates the following ones. Feature engineering is a very important step on the way to building powerful machine learning or deep learning models, and the techniques used for it are very diverse. We will cover:
· Feature selection, including correlation analysis and missing-value imputation.
· Feature transformation, by means of normalization, standardization, and other tools.
· Feature extraction, by finding a smaller set of new variables, each a combination of the input variables, containing essentially the same information as the input variables:
a. PCA (Principal Component Analysis)
b. t-SNE
Week 5: March 6 – Deep learning models
The aim of this week is to build a deep learning model in Keras.
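As a taster of what this week covers, here is a minimal Keras sketch (assuming TensorFlow's Keras API). The layer sizes and the flattened 28x28 input shape are illustrative assumptions, not taken from the book:

```python
# A minimal Keras deep learning model (illustrative architecture).
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(784,)),              # e.g. a flattened 28x28 image
    layers.Dense(128, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),  # 10 output classes
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```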
Week 6: March 13 – Classification without dimensionality reduction
Exercise: Classifying Fashion MNIST without dimensionality reduction
The project consists of one main task: to classify items in the Fashion MNIST dataset successfully. We will divide the implementation into two main parts: one which we write just once and won't need to rewrite every time we run the whole code, and a second which contains the different models we will be using in this project, mainly Convolutional Neural Networks (CNN), Multi-Layer Perceptron (MLP), Single Layer Perceptron (SLP), VGG16, ResNet, Gaussian Mixture Models, and clustering (more specifically, K-means). To have more fun, we will apply these models to the data both before and after applying Principal Component Analysis (PCA).
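One of the models listed above, a small CNN, can be sketched as follows. It assumes Fashion MNIST's 28x28 greyscale images and 10 classes; a random stand-in batch replaces the real dataset here so the structure is easy to see without a download:

```python
# Sketch of a small CNN for Fashion-MNIST-shaped data (stand-in batch).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

X = np.random.rand(32, 28, 28, 1).astype("float32")  # stand-in images
y = np.random.randint(0, 10, size=(32,))             # stand-in labels

cnn = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
cnn.compile(optimizer="adam",
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])
cnn.fit(X, y, epochs=1, verbose=0)  # one pass over the stand-in batch
print(cnn.predict(X, verbose=0).shape)  # (32, 10)
```

With the real dataset, the stand-in arrays would be replaced by `keras.datasets.fashion_mnist.load_data()` and many more epochs.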
Week 7: March 20 – Principal component analysis
The aim of this week is to decide where and when to use PCA.
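One practical way to decide how much PCA to apply is to keep enough components to explain a chosen fraction of the variance (95% is a common rule of thumb, used here as an assumption). A sketch using scikit-learn's bundled digits dataset:

```python
# Choosing the number of PCA components via cumulative explained variance.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, _ = load_digits(return_X_y=True)       # 64 pixel features per image
X_std = StandardScaler().fit_transform(X)

pca = PCA().fit(X_std)
cumulative = np.cumsum(pca.explained_variance_ratio_)

# Smallest number of components reaching 95% of the variance
n_95 = int(np.searchsorted(cumulative, 0.95) + 1)
print(f"{n_95} of {X.shape[1]} components explain 95% of the variance")
```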
Week 8: March 27 – Classification with dimensionality reduction
Exercise: Classifying Fashion MNIST with dimensionality reduction
As in Week 6, the task is to classify items in the Fashion MNIST dataset, using the same set of models (CNN, MLP, SLP, VGG16, ResNet, Gaussian Mixture Models, and K-means clustering). This time the emphasis is on comparing results on the data before and after applying Principal Component Analysis (PCA).
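The before/after comparison can be sketched with one classifier trained with and without PCA; scikit-learn's bundled digits dataset serves as a small stand-in for Fashion MNIST here:

```python
# Comparing a classifier with and without dimensionality reduction.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0)

# Without dimensionality reduction
plain = make_pipeline(StandardScaler(), SVC()).fit(X_train, y_train)

# With PCA reducing the 64 pixel features to 20 components
reduced = make_pipeline(StandardScaler(), PCA(n_components=20),
                        SVC()).fit(X_train, y_train)

print("without PCA:", plain.score(X_test, y_test))
print("with PCA:   ", reduced.score(X_test, y_test))
```

On a small dataset like this, accuracy often stays close while PCA shrinks the feature space, which is exactly the trade-off this week examines.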
Contact
Ajit.jaokar at feynlabs.ai
Finally, the list of participants selected based on the questionnaire responses