SVM
Kiruthika Subramani
Innovating AI for a Better Tomorrow | AI Engineer | Google Developer Expert | Author | IBM Dual Champion | 200+ Global AI Talks | Master's Student at MILA
Hello all!
Welcome to the eighth week of our "Cup of Coffee with an Algorithm in ML" series! And this week, we're excited to dive into the SVM algorithm!
Thanks for all your DMs asking, "Where's the cup of coffee?" :)
Honestly, there was no chance to enjoy the sip this time, between semester exams and immigration paperwork for Canada.
Come on, let's resume the series with SVM today!
Support Vector Machine (SVM) is a supervised machine learning algorithm used for classification tasks.
In simple terms, SVM helps you draw a line between two groups of things. For example, let's say you have two types of animals: dogs and cats.
SVM helps you find the best line to separate the dogs from the cats.
To draw this line, SVM looks at the animals closest to the line. These animals are like the "special representatives" of their groups; in fact, they are the support vectors that give the algorithm its name. SVM draws the line so that it is the same distance from these special animals on both sides.
Why is this helpful? Well, when you meet a new animal and don't know whether it's a dog or a cat, you can place it on the chart and see which side of the line it lands on. If it lands on the dog side, it's probably a dog. If it lands on the cat side, it's probably a cat.
So, SVM finds the best line to separate two groups (like dogs and cats) by looking at the special animals close to the line, and that line lets you classify new animals based on which side they land on.
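To make this concrete, here is a minimal sketch (a made-up toy example, not from the article) with two small clusters of 2-D points standing in for cats and dogs:

import numpy as np
from sklearn import svm

# Two made-up groups of 2-D points: "cats" near (1, 1), "dogs" near (5, 5)
X = np.array([[1, 1], [2, 1], [1, 2],
              [5, 5], [6, 5], [5, 6]])
y = ['cat', 'cat', 'cat', 'dog', 'dog', 'dog']

model = svm.SVC(kernel='linear')
model.fit(X, y)

# The sign of decision_function says which side of the line a new animal
# falls on; predict turns that side into a label
new_animal = [[2, 2]]
print(model.predict(new_animal))
print(model.decision_function(new_animal))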
I know you have come across certain terminology in SVM, and every time it can feel confusing.
Let us walk through the key terms.
So, what are the positive hyperplane and the negative hyperplane?
The terms "positive" and "negative" are just labels used to tell the two sides of the decision boundary apart. They carry no positive or negative meaning about the classes themselves; they merely describe where data points sit relative to the decision boundary.
So, in SVM, the positive hyperplane is the margin boundary that passes through the support vectors of one class, the negative hyperplane passes through the support vectors of the other, and the decision boundary sits exactly midway between them.
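In standard SVM notation (not spelled out in the original post): if the decision boundary is w·x + b = 0, then the positive hyperplane is w·x + b = +1 and the negative hyperplane is w·x + b = -1, and the support vectors lie on those two margin lines. For a linear kernel, scikit-learn exposes w and b, so a small sketch on made-up data can verify this:

import numpy as np
from sklearn import svm

# Four made-up points, two per class, easily separable
X = np.array([[1, 1], [2, 1], [5, 5], [6, 5]])
y = [0, 0, 1, 1]

model = svm.SVC(kernel='linear').fit(X, y)
w, b = model.coef_[0], model.intercept_[0]

# Support vectors land at w.x + b close to -1 (class 0) or +1 (class 1);
# the other points land farther out, beyond the margin
for point in X:
    print(point, np.dot(w, point) + b)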
What exactly is N-dimensional space?
N-dimensional space is simply a space where each data point is described by N features, and SVM finds the optimal way to separate the points based on those features.
SVM finds the best way to separate the cats and dogs in this N-dimensional space: it draws a hyperplane (a line in 2-D, a plane in 3-D, and so on) that maximizes the margin, the gap between the two classes, based on their feature values.
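As a quick made-up illustration, the very same estimator works unchanged when each point has five features instead of two; make_classification below simply fabricates such a toy dataset:

from sklearn import svm
from sklearn.datasets import make_classification

# Fabricate 100 toy points living in a 5-dimensional feature space
X, y = make_classification(n_samples=100, n_features=5,
                           n_informative=3, random_state=42)

model = svm.SVC(kernel='linear').fit(X, y)
print(model.score(X, y))  # accuracy on the same 5-D toy data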
Come on, let's do it!
import pandas as pd
from sklearn import svm
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
# Step 1: Create a data frame with features and labels
data = {
    'color': ['brown', 'gray', 'black', 'white', 'brown', 'gray'],
    'size': ['small', 'large', 'large', 'small', 'large', 'small'],
    'label': ['cat', 'dog', 'dog', 'cat', 'cat', 'dog']
}
df = pd.DataFrame(data)
# Step 2: Convert categorical features to numerical (one-hot encoding via get_dummies)
df_encoded = pd.get_dummies(df, columns=['color', 'size'])
# Step 3: Split the data into features (X) and labels (y)
X = df_encoded.drop('label', axis=1)
y = df_encoded['label']
# Step 4: Split the dataset into training and testing sets (with only 6 rows, the test set here is just 2 samples)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Step 5: Train the SVM model
model = svm.SVC(kernel="linear")
model.fit(X_train, y_train)
# Step 6: Predict using the trained model
y_pred = model.predict(X_test)
# Step 7: Evaluate the model
accuracy = accuracy_score(y_test, y_pred)
print("Accuracy:", accuracy)
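To tie this back to our story: the "special representatives" are the support vectors, and you can inspect them on the model we just trained (a quick follow-up reusing the variables from the code above):

print(model.support_vectors_)  # the encoded feature rows sitting on the margin
print(model.n_support_)        # how many support vectors each class contributed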
Finally, we did it! The eighth week of our "Cup of Coffee with an Algorithm in ML" series has come to a close. But don't worry, I'll be back next week with another exciting algorithm to explore. So grab a cup of coffee and join us for another week :)
Stay tuned for updates on our next topic. See you soon!
Cheers,
Kiruthika