Image Classification with TensorFlow and an AWS SageMaker-based Jupyter Notebook
NARAYANAN PALANI
Platform Engineering Lead | AWS & Google Cloud Certified Architect | Cloud Solutions Expert | Driving Innovation in Retail, Commercial & Investment Banking | CI/CD | DevOps | Cloud Transformation
I have done a number of image classification exercises with TensorFlow on Google Cloud in the past, and it has always been a time-consuming exercise to label the images and strike a fine balance between underfitting and overfitting. This time I explored a similar workflow with AWS SageMaker and am sharing the insights in this week's newsletter.
Introduction
I integrated a SageMaker notebook with Keras, importing the libraries needed to gather the training images and feed them to TensorFlow, which then returns insights back to the notebook:
As a first step, I reused training image data from a Pluralsight/A Cloud Guru lab to understand how this set of Lego images is classified in a notebook:
Next, I reviewed the pre-built .ipynb and .npy file sets in the notebook:
Then I switched the kernel to the right version so that execution runs with the correct Python libraries:
Now, executing the import commands brings in keras from tensorflow and plt from matplotlib:
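The imports in that cell look roughly like this (a minimal sketch; your notebook may pull in additional helpers):

```python
# Core imports used throughout the notebook: NumPy for arrays,
# Keras via TensorFlow for the model, and pyplot for visualisation.
import numpy as np
import tensorflow as tf
from tensorflow import keras
import matplotlib.pyplot as plt

print(tf.__version__)
```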
Then I created two sets of data (one for training and one for testing) to load into the kernel:
Now the Lego images are loaded and printed in the kernel, with the different pixel variations listed as part of Out[3]:
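The loading step boils down to `np.load` on the lab's pre-built .npy files. The sketch below fabricates a tiny stand-in dataset and round-trips it through save/load so it is runnable anywhere; the demo file names and array shapes are assumptions, not the lab's real file names:

```python
import numpy as np

# For illustration, fabricate a tiny grayscale set in the shape a .npy
# image file typically holds, then round-trip it through save/load.
rng = np.random.default_rng(0)
train_images = rng.random((20, 48, 48)).astype("float32")  # 20 images, 48x48 pixels
train_labels = rng.integers(0, 4, size=20)                 # one integer label per image

np.save("demo-train-images.npy", train_images)
np.save("demo-train-labels.npy", train_labels)

# In the actual notebook the arrays come from the lab's pre-built files,
# whose names differ from these demo files.
loaded_images = np.load("demo-train-images.npy")
loaded_labels = np.load("demo-train-labels.npy")
print(loaded_images.shape, loaded_labels.shape)  # (20, 48, 48) (20,)
```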
Then the Lego figure is displayed, which is essential for understanding what a training image looks like; this is done via a series of plt commands in Python:
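Those plt commands are typically along these lines (a sketch using a random stand-in array in place of the real `train_images[0]`):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless-safe backend for a notebook/server
import matplotlib.pyplot as plt

# Stand-in for train_images[0]; in the notebook this is a real Lego image.
image = np.random.default_rng(0).random((48, 48))

plt.figure()
plt.imshow(image, cmap=plt.cm.binary)  # render the grayscale pixel array
plt.colorbar()                         # show the pixel-value scale
plt.grid(False)
plt.savefig("first-training-image.png")
```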
Now I need a label set for the Lego figures used across these data sets in order to apply the right data preprocessing:
The next step is to write human-readable names for each Lego class code, which makes it easy to identify the right Lego figure in every round of analysis:
Now, printing the name of the Lego figure displayed earlier:
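The mapping is just a list indexed by the integer label. The class names below are illustrative placeholders, not the lab's real class list:

```python
# Placeholder names -- the lab's real classes differ; the point is the
# mapping from integer labels to human-readable strings.
class_names = ['brick-2x2', 'brick-2x4', 'plate-1x2', 'minifig-head']

train_labels = [3, 0, 1]  # stand-in integer labels for the first few images

print(class_names[train_labels[0]])  # prints the name of the first image's class
```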
Then a set of commands lists the 20 different Lego figures used in the training set, each with its label:
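A thumbnail grid like that is usually built with `plt.subplot` in a loop; here is a self-contained sketch with stand-in images and placeholder class names:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
train_images = rng.random((20, 48, 48))      # stand-ins for the real Lego images
train_labels = rng.integers(0, 4, size=20)
class_names = ['brick-2x2', 'brick-2x4', 'plate-1x2', 'minifig-head']  # placeholders

plt.figure(figsize=(10, 8))
for i in range(20):
    plt.subplot(4, 5, i + 1)                  # 4 rows x 5 columns of thumbnails
    plt.xticks([])
    plt.yticks([])
    plt.imshow(train_images[i], cmap=plt.cm.binary)
    plt.xlabel(class_names[train_labels[i]])  # caption each image with its label
plt.savefig("training-grid.png")
```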
Tensorflow Model Training
Now we are at the critical phase of machine learning model training, feeding the classified, labelled data into the Keras services integrated with the notebook:
Training then produces a history of the metrics recorded while fitting the machine learning model:
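The training cell follows the standard Keras define/compile/fit pattern. The sketch below uses a small dense classifier and random stand-in data; the lab's actual architecture and hyperparameters may differ:

```python
import numpy as np
from tensorflow import keras

# Stand-in data; in the notebook these are the loaded Lego arrays.
rng = np.random.default_rng(2)
train_images = rng.random((64, 48, 48)).astype("float32")
train_labels = rng.integers(0, 4, size=64)

# A small dense classifier in the spirit of the notebook.
model = keras.Sequential([
    keras.layers.Input(shape=(48, 48)),
    keras.layers.Flatten(),                       # 48x48 pixels -> 2304 features
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(4, activation="softmax"),  # one probability per class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# fit() returns a History object recording loss/accuracy per epoch.
history = model.fit(train_images, train_labels, epochs=3, verbose=0)
print(sorted(history.history))  # ['accuracy', 'loss']
```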
Model Accuracy and Model Loss
Now a set of Python commands is used to plot Model Accuracy and Model Loss, so we can see how effective the training was:
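Those plots come straight from the history object. A minimal sketch, using stand-in values in place of the real `model.fit(...).history`:

```python
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

# Stand-in values; in the notebook these come from model.fit(...).history.
history = {"accuracy": [0.55, 0.72, 0.81, 0.88],
           "loss":     [1.20, 0.80, 0.55, 0.40]}

plt.figure()
plt.plot(history["accuracy"], label="accuracy")
plt.plot(history["loss"], label="loss")
plt.xlabel("epoch")
plt.legend()
plt.title("Model Accuracy and Model Loss")
plt.savefig("accuracy-loss.png")
```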
Then let us look at the overall test accuracy using the evaluate command:
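`model.evaluate` returns the loss plus each compiled metric. A self-contained sketch with a tiny stand-in model and data (the real notebook evaluates on its held-out test set):

```python
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(3)
images = rng.random((32, 48, 48)).astype("float32")  # stand-in test set
labels = rng.integers(0, 4, size=32)

model = keras.Sequential([
    keras.layers.Input(shape=(48, 48)),
    keras.layers.Flatten(),
    keras.layers.Dense(4, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(images, labels, epochs=1, verbose=0)

# evaluate() returns the loss followed by each compiled metric, here accuracy.
test_loss, test_acc = model.evaluate(images, labels, verbose=0)
print("Test accuracy:", test_acc)
```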
Single Prediction Exercise
Under Single Prediction, the first code cell picks a random image from the test set. The next code cell transforms that image into a collection of one image, and the third code cell passes it to the predict method.
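The "collection of one image" step is an `np.expand_dims` call, because Keras models predict on batches. A sketch with stand-in data:

```python
import numpy as np

test_images = np.random.default_rng(4).random((10, 48, 48))  # stand-in test set

img = test_images[0]                     # one image, shape (48, 48)
img_batch = np.expand_dims(img, axis=0)  # a "collection" of one: shape (1, 48, 48)
print(img_batch.shape)

# The batch is then passed to the trained model, e.g.:
# predictions_single = model.predict(img_batch)
```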
Highest-probability Prediction Result
Using argmax to find the highest-probability class in predictions_single:
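A minimal sketch of that step, with a stand-in probability row in place of the model's real output:

```python
import numpy as np

# Stand-in for model.predict on one image: a row of class probabilities.
predictions_single = np.array([[0.05, 0.10, 0.70, 0.15]])

predicted_label = np.argmax(predictions_single[0])  # index of the largest value
print(predicted_label)  # -> 2
```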
Batch Predictions
Similar to the previous code, the labels of many images can be predicted at once with this batch-prediction code:
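Batch prediction is the same argmax applied row-by-row with `axis=1`. A sketch with a stand-in prediction matrix:

```python
import numpy as np

# Stand-in for model.predict(test_images): one probability row per image.
predictions = np.array([[0.1, 0.8, 0.1],
                        [0.6, 0.3, 0.1],
                        [0.2, 0.2, 0.6]])

predicted_labels = np.argmax(predictions, axis=1)  # per-image argmax
print(predicted_labels)  # [1 0 2]
```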
Now these results are summarised in a bar chart with the help of the commands below:
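A typical bar chart for one prediction row looks like this (placeholder class names and a stand-in probability row; the notebook's real chart styles may differ):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

class_names = ['brick-2x2', 'brick-2x4', 'plate-1x2', 'minifig-head']  # placeholders
probs = np.array([0.05, 0.10, 0.70, 0.15])  # stand-in prediction for one image

plt.figure()
bars = plt.bar(range(len(probs)), probs)
bars[int(np.argmax(probs))].set_color("green")  # highlight the predicted class
plt.xticks(range(len(probs)), class_names, rotation=45)
plt.ylabel("probability")
plt.tight_layout()
plt.savefig("prediction-bars.png")
```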
To summarise this newsletter, here are the standard steps of image classification:
Source: ChatGPT for this part of the newsletter from here onwards:)
1. Set Up the Environment
2. Prepare the Dataset
3. Define the Model
4. Train the Model
5. Evaluate the Model
6. Save and Deploy the Model
7. Perform Inference
Why Use AWS SageMaker for TensorFlow Image Classification
Watch the steps in this YouTube video: Link
Please feel free to share your views in the comments section on any simplified, better, or alternative approaches to creating image classification with TensorFlow.
Follow me on LinkedIn: Link
Like this article? Subscribe to the Engineering Leadership, Digital Accessibility, Digital Payments Hub and Motivation newsletters to enjoy more useful articles. Press the SHARE and REPOST buttons to help share the content with your network.