Confusion Matrix & AUC-ROC Curve

#dataanalytics #machinelearning #datascience #dailylearning day 7

Confusion matrix: A confusion matrix is a performance measurement for machine learning classification problems, where the output can be two or more classes. Performance measurement is an essential task in machine learning, and for classification problems we can count on the confusion matrix and the AUC-ROC curve.

For a binary classifier, the confusion matrix is a 2x2 table covering the four combinations of predicted and actual values. It is highly useful for calculating Recall, Precision, Specificity, Accuracy, and, most importantly, the AUC-ROC curve. The four cells are listed below, followed by a short code sketch.

  • True Positive: Actual Positive and predicted as Positive
  • True Negative: Actual Negative and predicted as Negative
  • False Positive (Type I Error): Actual Negative but predicted as Positive
  • False Negative (Type II Error): Actual Positive but predicted as Negative
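
To make the four counts concrete, here is a minimal sketch using scikit-learn's confusion_matrix; the y_true and y_pred arrays are made-up labels for illustration, not real data:

```python
# Minimal sketch: extract TP/TN/FP/FN and derive the common metrics.
# The label arrays below are illustrative, not from a real model.
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # actual labels
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]  # model predictions

# For binary labels, confusion_matrix returns [[TN, FP], [FN, TP]];
# ravel() flattens it into (TN, FP, FN, TP).
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

accuracy    = (tp + tn) / (tp + tn + fp + fn)
precision   = tp / (tp + fp)
recall      = tp / (tp + fn)   # a.k.a. sensitivity / True Positive Rate
specificity = tn / (tn + fp)   # a.k.a. True Negative Rate

print(f"TP={tp} TN={tn} FP={fp} FN={fn}")
print(f"accuracy={accuracy:.2f} precision={precision:.2f} "
      f"recall={recall:.2f} specificity={specificity:.2f}")
```

Working from the raw counts like this makes it easy to see that Accuracy, Precision, Recall, and Specificity are all just different ratios of the same four cells.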

The AUC-ROC curve is one of the most important evaluation metrics for checking a classification model's performance. AUC stands for 'Area Under the Curve' and ROC stands for 'Receiver Operating Characteristic' curve.

The higher the AUC, the better the model is: an AUC value near 1 indicates a highly accurate model. The ROC curve plots the True Positive Rate (TPR) against the False Positive Rate (FPR) at different classification thresholds. Lowering the classification threshold classifies more items as positive, thus increasing both False Positives and True Positives. The sketch below plots a typical ROC curve.
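
This minimal sketch uses scikit-learn's roc_curve and roc_auc_score to sweep the threshold and plot the curve; the labels and scores are made-up values, not from a real model:

```python
# Minimal sketch: sweep classification thresholds and plot the ROC curve.
# The labels and scores below are illustrative, not from a real model.
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, roc_auc_score

y_true  = [0, 0, 1, 1, 0, 1, 0, 1]                      # actual labels
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.55, 0.9]    # predicted probabilities

# roc_curve sweeps the threshold and returns matched FPR/TPR pairs
fpr, tpr, thresholds = roc_curve(y_true, y_score)
auc = roc_auc_score(y_true, y_score)

plt.plot(fpr, tpr, label=f"model (AUC = {auc:.2f})")
plt.plot([0, 1], [0, 1], linestyle="--", label="random (AUC = 0.5)")
plt.xlabel("False Positive Rate")
plt.ylabel("True Positive Rate")
plt.legend()
plt.show()
```

Note that roc_curve takes the model's scores (probabilities), not hard 0/1 predictions, because the whole point is to vary the threshold that turns scores into predictions.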

An ROC AUC of 0.5 means the classifier performs no better than random guessing. An AUC value above 0.5 means the classifier ranks positives above negatives more often than not, so it produces more True Positives and True Negatives than False Negatives and False Positives.
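
As an illustrative sanity check on synthetic data (the arrays here are randomly generated, not from any real model), scores drawn independently of the labels land near 0.5 AUC, while scores shifted upward for the positive class score much higher:

```python
# Sanity check on synthetic data: uninformative scores give AUC ~0.5,
# scores correlated with the labels give a clearly higher AUC.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=10_000)

random_scores   = rng.random(10_000)                   # independent of labels
informed_scores = y_true * 0.5 + rng.random(10_000)    # positives shifted up

print(roc_auc_score(y_true, random_scores))    # ~0.5, i.e. random guessing
print(roc_auc_score(y_true, informed_scores))  # ~0.87, clearly better than chance
```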
