All About Support Vector Machines (SVM)
Basic Assumptions: SVM makes no assumptions about the underlying distribution of the data.
Advantages:
- Effective in high-dimensional spaces.
- Relatively memory efficient, since the decision function uses only a subset of the training points (the support vectors).
- A strong default choice when we know little about the structure of the data.
- Works well even with unstructured and semi-structured data such as text, images, and trees.
- The kernel trick is the real strength of SVM: with an appropriate kernel function, complex non-linear problems can be tackled (see the sketch after this list).
- SVM models generalize well in practice; the margin-maximization objective keeps the risk of overfitting comparatively low.
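A minimal sketch of the kernel-trick point above, assuming scikit-learn and its make_circles toy dataset: a linear kernel cannot separate concentric circles, while an RBF kernel handles them easily.

```python
# Kernel-trick sketch (illustrative only): the same SVC, with a linear
# vs. an RBF kernel, on data that is not linearly separable.
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Concentric circles: no straight line separates the two classes.
X, y = make_circles(n_samples=500, factor=0.3, noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for kernel in ("linear", "rbf"):
    clf = SVC(kernel=kernel).fit(X_train, y_train)
    print(kernel, clf.score(X_test, y_test))  # the RBF kernel scores far higher here
```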
Disadvantages:
- Training time grows quickly with dataset size, so larger datasets are expensive to fit.
- Choosing a good kernel function is difficult.
- The main hyperparameters are the cost C and (for the RBF kernel) gamma. They are not easy to fine-tune, and their impact is hard to visualize (a tuning sketch follows this list).
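Since C and gamma are hard to reason about by hand, a common approach is a cross-validated grid search. The sketch below is illustrative only; the breast-cancer dataset and the grid values are assumptions, not recommendations.

```python
# Hedged sketch of tuning C and gamma via grid search; the dataset and
# grid values are illustrative choices, not recommendations.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Scale inside the pipeline so each CV fold is scaled on its own training part.
pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
param_grid = {"svc__C": [0.1, 1, 10, 100],
              "svc__gamma": ["scale", 0.01, 0.1, 1]}

search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```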
Whether Feature Scaling is required: Yes; the margin computation is distance-based, so features on very different scales distort it.
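A small sketch of why scaling matters, assuming scikit-learn's bundled wine dataset (chosen only because its features span very different ranges): the same SVC with and without standardization.

```python
# Impact of feature scaling on an RBF-kernel SVM; the wine dataset is an
# arbitrary example whose features live on very different scales.
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)

unscaled = cross_val_score(SVC(), X, y, cv=5).mean()
scaled = cross_val_score(make_pipeline(StandardScaler(), SVC()), X, y, cv=5).mean()
print(f"unscaled accuracy: {unscaled:.3f}, scaled accuracy: {scaled:.3f}")
```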
Impact of Missing Values:
- Although SVMs are an attractive option when constructing a classifier, they do not easily accommodate missing covariate information. As with other prediction and classification methods, inattention to missing data when constructing an SVM can reduce the accuracy and utility of the resulting classifier.
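Because SVMs have no built-in handling of missing values, the usual workaround is to impute before fitting. Below is a toy sketch using scikit-learn's SimpleImputer; the tiny dataset is fabricated purely for illustration.

```python
# Toy sketch: impute missing values before the SVM (SVC itself would raise
# an error on NaNs). The dataset here is made up for illustration only.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X = np.array([[1.0, 2.0], [np.nan, 3.0], [7.0, np.nan], [8.0, 9.0]])
y = np.array([0, 0, 1, 1])

model = make_pipeline(SimpleImputer(strategy="mean"), StandardScaler(), SVC())
model.fit(X, y)
print(model.predict([[2.0, np.nan]]))  # the pipeline imputes the NaN at predict time too
```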
Impact of Outliers: SVM is usually sensitive to outliers.
Types of Problems it can solve (Supervised):
- Regression
- Classification
Overfitting and Underfitting:
- In SVM, to avoid overfitting we choose a soft margin instead of a hard one, i.e. we intentionally let some data points enter the margin (but still penalize them) so that the classifier does not overfit the training sample (illustrated in the sketch below).
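To make the soft-margin idea concrete, here is a sketch (on an arbitrary overlapping-blobs dataset) showing that a small C tolerates many margin violations, while a large C approaches a hard margin.

```python
# Soft- vs. near-hard-margin sketch: a smaller C allows more violations,
# so more training points end up as support vectors. Dataset is illustrative.
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=200, centers=2, cluster_std=2.5, random_state=0)

for C in (0.01, 1, 100):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    print(f"C={C}: number of support vectors = {clf.n_support_.sum()}")
```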
Different problem statements you can solve using SVM:
- SVM can be applied to many of the same use cases as ANNs.
- Intrusion detection
- Handwriting recognition
Practical Implementation:
- sklearn.svm.SVC — scikit-learn 0.24.1 documentation
- sklearn.svm.SVR — scikit-learn 0.24.1 documentation
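Minimal usage sketches for the two estimators linked above; the bundled digits and diabetes datasets and the hyperparameter values are placeholder choices made only to keep the snippet self-contained.

```python
# Basic SVC (classification) and SVR (regression) usage; datasets and
# hyperparameter values are placeholders, not tuned recommendations.
from sklearn.datasets import load_digits, load_diabetes
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC, SVR

# Classification: handwritten digit recognition with SVC.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10))
clf.fit(X_train, y_train)
print("SVC accuracy:", clf.score(X_test, y_test))

# Regression: SVR on a small tabular dataset.
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
reg = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10, epsilon=0.1))
reg.fit(X_train, y_train)
print("SVR R^2:", reg.score(X_test, y_test))
```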
Performance Metrics:
- Classification
  - Confusion Matrix
  - Precision, Recall, F1 score
- Regression
  - R2, Adjusted R2
  - MSE, RMSE, MAE
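For completeness, a sketch of computing the metrics above with scikit-learn; the y_true / y_pred arrays are made-up stand-ins for whatever an SVC or SVR model would predict.

```python
# Metric computation sketch; the labels and targets below are fabricated examples.
from sklearn.metrics import (confusion_matrix, classification_report,
                             r2_score, mean_squared_error, mean_absolute_error)

# Classification: confusion matrix, precision, recall, F1.
y_true = [0, 1, 1, 0, 1, 0]
y_pred = [0, 1, 0, 0, 1, 1]
print(confusion_matrix(y_true, y_pred))
print(classification_report(y_true, y_pred))

# Regression: R2, MSE, RMSE, MAE. Adjusted R2 has no sklearn helper; it is
# 1 - (1 - R2) * (n - 1) / (n - p - 1) for n samples and p features.
y_true_r = [3.0, 2.5, 4.0, 5.1]
y_pred_r = [2.8, 2.7, 4.2, 4.9]
print("R2:", r2_score(y_true_r, y_pred_r))
print("MSE:", mean_squared_error(y_true_r, y_pred_r))
print("RMSE:", mean_squared_error(y_true_r, y_pred_r) ** 0.5)
print("MAE:", mean_absolute_error(y_true_r, y_pred_r))
```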