How to read the Confusion Matrix


The Confusion Matrix is a square table that summarizes the predictions of a classification model. As its name suggests, the table shows where the model gets confused when predicting. By examining the Confusion Matrix, we can derive many evaluation measures (such as accuracy, recall, and precision).

While a classification problem can have many classes, the binary-label case is the most common and the simplest. To illustrate, suppose we have 140 binary-labeled samples in our dataset, and the predictions from the model are as follows:

[Table: predicted labels versus actual labels of the 140 samples]

To summarize the table: there are 50 positive samples, among which 30 are predicted correctly while the other 20 are mistakenly predicted to be negative. There are also 90 negative samples: 50 of them are correctly predicted as negative, while the remaining 40 are erroneously classified as positive.

To make it more concrete, each cell of the table is given a name: True Positive, True Negative, False Positive, and False Negative.

[Figure: the confusion matrix, with cells labeled True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN)]

To avoid getting confused by these names, remember that the Confusion Matrix exists to evaluate how a model works, so each cell’s name refers to the model’s prediction. For example, False Positive (FP) is the number of predictions that say the label is positive but turn out to be false. Many people are tripped up by these names, e.g. by mistakenly thinking that the Positive term refers to the actual label of the data, so be careful.

The confusion matrix gives us insights into how the model works. For example, looking at the matrix and doing a little computation, we see that more samples are predicted correctly (30 + 50 = 80) than incorrectly (40 + 20 = 60).
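
If it helps to see these counts computed in code, here is a minimal Python sketch. The label arrays y_true and y_pred are made up so that they reproduce the 30/20/40/50 counts from the example above; 1 stands for positive and 0 for negative.

```python
# Hypothetical labels reproducing the example above: 1 = positive, 0 = negative.
y_true = [1] * 50 + [0] * 90                          # 50 actual positives, 90 actual negatives
y_pred = [1] * 30 + [0] * 20 + [1] * 40 + [0] * 50    # the model's predictions, aligned with y_true

# Count each cell of the confusion matrix by comparing prediction to truth.
TP = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # predicted positive, actually positive
TN = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # predicted negative, actually negative
FP = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # predicted positive, actually negative
FN = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # predicted negative, actually positive

print(TP, TN, FP, FN)  # 30 50 40 20
```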

To be more formal, let’s take a look at some of the most frequently used measurements as shown in the table below:

[Table: the most frequently used measurements derived from the confusion matrix]

The first and most popular one is the Accuracy of the model, which is the percentage of correct predictions over all predictions.

Accuracy = \frac{\text{TP+TN}}{\text{TP+TN+FP+FN}} = \frac{\text{True predictions}}{\text{All predictions}}

In the above example,

Accuracy = \frac{30+50}{30+50+40+20} \approx 0.57
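
Continuing the sketch above (TP, TN, FP, and FN are the hypothetical counts computed there), the same arithmetic in code:

```python
accuracy = (TP + TN) / (TP + TN + FP + FN)
print(round(accuracy, 2))  # 0.57 -- 80 correct predictions out of 140
```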

Precision expresses how precise the model is when it predicts a positive case, i.e. how often its positive predictions are correct.

Precision = \frac{\text{TP}}{\text{TP+FP}} = \frac{\text{Successful Detections}}{\text{All Detections}}

Plugging in the numbers from the example,

Precision = \frac{30}{30+40} \approx 0.43
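
Again reusing the hypothetical counts from the earlier sketch:

```python
precision = TP / (TP + FP)
print(round(precision, 2))  # 0.43 -- 30 correct out of 70 positive predictions
```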

The True Positive Rate (TPR), or Recall, is the proportion of the True Positive count over the maximum number of True Positives any model could possibly have, which is simply the number of actual positive samples.

TPR = \frac{\text{TP}}{\text{TP+FN}} = \frac{\text{Successful Detections}}{\text{All Possible Successful Detections}}

For the example,

TPR = \frac{30}{30+20} = 0.6
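
With the same hypothetical counts as before:

```python
recall = TP / (TP + FN)  # also called the True Positive Rate (TPR)
print(recall)  # 0.6 -- 30 detected out of 50 actual positives
```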

The False Positive Rate (FPR), similarly, is the proportion of the False Positive count over the maximum number of False Positives any model could possibly have, which is the number of actual negative samples.

FPR = \frac{\text{FP}}{\text{FP+TN}} = \frac{\text{Bad Detections}}{\text{All Possible Bad Detections}}

This gives,

FPR = \frac{40}{40+50} \approx 0.44
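
And once more in code, using the same hypothetical counts:

```python
fpr = FP / (FP + TN)
print(round(fpr, 2))  # 0.44 -- 40 false alarms out of 90 actual negatives
```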

Even though we only examined the matrix in a binary-classification setting, in practice confusion matrices can also represent the predictions of multi-class classification; such a matrix has more than 2 rows and 2 columns. Keep in mind that in those cases, each measurement described above is calculated for each class separately. For example, in a 3-class problem, we compute True-Positive-class-1, True-Positive-class-2, and True-Positive-class-3 instead of a single True Positive. A small sketch of this per-class computation follows.
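
As an illustration, here is one way to read the per-class counts off a multi-class matrix, using scikit-learn's confusion_matrix (the 3-class labels below are made up for the example):

```python
from sklearn.metrics import confusion_matrix

# Hypothetical 3-class labels, purely for illustration.
y_true = [0, 0, 1, 1, 1, 2, 2, 2, 2]
y_pred = [0, 1, 1, 1, 2, 2, 2, 0, 2]

cm = confusion_matrix(y_true, y_pred)  # rows = actual class, columns = predicted class

for k in range(cm.shape[0]):
    tp = cm[k, k]                 # correctly predicted as class k
    fp = cm[:, k].sum() - tp      # predicted as class k but actually another class
    fn = cm[k, :].sum() - tp      # actually class k but predicted as another class
    tn = cm.sum() - tp - fp - fn  # everything else
    print(f"class {k}: TP={tp} FP={fp} FN={fn} TN={tn}")
```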

Conclusion

In this blog post, we learned how to read the confusion matrix and got familiar with some typical measurements computed from it.

Apart from the measurements mentioned above, there are many other metrics that can be derived from confusion matrices, for instance the F1-score, the Area under the ROC Curve, and the Area under the Precision-Recall curve. Which one to use depends on the problem at hand.

