Confusion Matrix & AUC-ROC Curve
Confusion matrix: A confusion matrix is a performance measurement for machine learning classification problems where the output can be two or more classes. Performance measurement is an essential task in machine learning, so when it comes to a classification problem, we can count on the confusion matrix and the AUC-ROC curve.
For a binary classifier, the confusion matrix is a table with four different combinations of predicted and actual values: True Positives, False Positives, True Negatives, and False Negatives. It is highly useful for calculating Recall, Precision, Specificity, Accuracy, and, most importantly, the AUC-ROC curve.
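As a minimal sketch of how those four cells turn into the metrics above (assuming scikit-learn is available, and using made-up `y_true` / `y_pred` labels purely for illustration):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Hypothetical ground-truth and predicted labels for a binary problem
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 0])

# For labels (0, 1), confusion_matrix returns [[TN, FP], [FN, TP]]
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

recall      = tp / (tp + fn)        # sensitivity / true positive rate
precision   = tp / (tp + fp)
specificity = tn / (tn + fp)        # true negative rate
accuracy    = (tp + tn) / (tp + tn + fp + fn)

print(f"TP={tp} FP={fp} TN={tn} FN={fn}")
print(f"Recall={recall:.2f} Precision={precision:.2f} "
      f"Specificity={specificity:.2f} Accuracy={accuracy:.2f}")
```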
The AUC-ROC curve is one of the most important evaluation metrics for checking a model's performance. AUC stands for 'Area Under the Curve' and ROC stands for 'Receiver Operating Characteristic' curve.
The higher the AUC, the better the model is: an AUC value near 1 means the model separates the two classes well. The ROC curve plots TPR (True Positive Rate) against FPR (False Positive Rate) at different classification thresholds. Lowering the classification threshold classifies more items as positive, thus increasing both False Positives and True Positives. A typical ROC curve rises from (0, 0) to (1, 1), bowing toward the top-left corner for a good classifier.
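To make the threshold-sweeping idea concrete, here is a hedged sketch that trains a classifier on a synthetic dataset and plots its ROC curve with scikit-learn; the dataset, `LogisticRegression` model, and random seeds are my own assumptions, not part of the original write-up:

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical binary dataset, purely for illustration
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]   # probability of the positive class

# roc_curve sweeps the classification threshold and returns FPR/TPR pairs
fpr, tpr, thresholds = roc_curve(y_test, scores)
auc = roc_auc_score(y_test, scores)

plt.plot(fpr, tpr, label=f"model (AUC = {auc:.2f})")
plt.plot([0, 1], [0, 1], "k--", label="random classifier (AUC = 0.5)")
plt.xlabel("False Positive Rate")
plt.ylabel("True Positive Rate")
plt.legend()
plt.show()
```

The dashed diagonal is the chance-level baseline: the farther the curve bows above it, the larger the area underneath and the better the model ranks positives above negatives.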
A ROC AUC of 0.5 means the classifier performs no better than random guessing. An AUC value above 0.5 means the classifier ranks a randomly chosen positive example higher than a randomly chosen negative one more often than not, so it separates True Positives and True Negatives from False Negatives and False Positives better than chance.
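As a quick sanity check of that 0.5 baseline (again a hypothetical example assuming scikit-learn), scoring the labels with pure noise yields an AUC close to 0.5:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=10_000)      # hypothetical binary labels
random_scores = rng.random(10_000)            # scores carrying no signal

# Uninformative scores hover around AUC = 0.5 (chance-level ranking)
print(roc_auc_score(y_true, random_scores))
```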