Machine Learning – Confusion Matrix
January 7, 2010 Leave a comment
So what happens if our dataset only consists of 5 Cats but 95 Dogs? Well chances are you’ll get 95% accuracy just by predicting everything as a Dog! But that means you’ve predicted all 5 Cats as Dogs.
So what happens when we predict a Cat as a Dog and vice versa? Well chances are we won’t know what’s been predicted as what! This is where a Confusion Matrix comes in! 🙂
Positives and Negatives
In a Confusion Matrix we have things called False Positives and False Negatives. These are defined as:
- False Positive: Falsely predicting an event (or saying a Cat is a Dog)
- False Negative: Missing an incoming event (or saying a Dog is a Cat)
As well as these, we also have True Positives and True Negatives. Remembering that a Cat=0 and Dog=1, a Confusion Matrix would look like this:

|            | Predicted Cat (0) | Predicted Dog (1) |
|------------|-------------------|-------------------|
| Actual Cat | True Negative     | False Positive    |
| Actual Dog | False Negative    | True Positive     |
So using this, let’s use the example that we have 50 Cats and 50 Dogs. We see that we have predicted only 30% of Cats as Cats and 20% of Dogs as Dogs. In a Confusion Matrix this would look like:

|            | Predicted Cat | Predicted Dog |
|------------|---------------|---------------|
| Actual Cat | 15            | 35            |
| Actual Dog | 40            | 10            |
So here we can see that we predicted 15 Cats as Cats, yet we predicted 35 Cats as Dogs.
We can also see that we predicted 40 Dogs as Cats and only 10 Dogs as Dogs.
(in reality these numbers are low… I think…)
- 15 Cats as Cats (TN) – the N because Cat=0
- 35 Cats as Dogs (FP)
- 40 Dogs as Cats (FN)
- 10 Dogs as Dogs (TP)
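If you wanted to count these up in code, a quick Python sketch might look like this (the label lists here are made up just to match the example above, with Cat=0 and Dog=1):

```python
from collections import Counter

# Hypothetical labels matching the worked example:
# 50 actual Cats (0) followed by 50 actual Dogs (1).
actual = [0] * 50 + [1] * 50
# Predictions: 15 Cats as Cats, 35 Cats as Dogs,
# then 40 Dogs as Cats, 10 Dogs as Dogs.
predicted = [0] * 15 + [1] * 35 + [0] * 40 + [1] * 10

# Count each (actual, predicted) pair to fill the matrix.
counts = Counter(zip(actual, predicted))
tn = counts[(0, 0)]  # Cats predicted as Cats
fp = counts[(0, 1)]  # Cats predicted as Dogs
fn = counts[(1, 0)]  # Dogs predicted as Cats
tp = counts[(1, 1)]  # Dogs predicted as Dogs

print(tn, fp, fn, tp)  # 15 35 40 10
```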
Sensitivity and Specificity
Now we have 2 equations to note 🙂
The first one is Sensitivity which is:
- TP / (TP+FN)
and then Specificity which is:
- TN / (TN+FP)
So they really aren’t that hard to remember, haha 😉
In general here, Sensitivity means the accuracy on the class Dog, and Specificity means the accuracy on the class Cat.
So using this, what is the accuracy on Dogs and Cats? Well:
- Sensitivity = TP / (TP+FN) = 10/50 = 0.2 = 20% (which is correct 😛 )
- Specificity = TN / (TN+FP) = 15/50 = 0.3 = 30% (which is also correct! 😛 )
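Those two calculations are just as simple in code. A minimal sketch, plugging in the TP/FN/TN/FP counts from the example above:

```python
# Counts from the Cats-and-Dogs example.
tp, fn, tn, fp = 10, 40, 15, 35

sensitivity = tp / (tp + fn)  # accuracy on Dogs (the positive class)
specificity = tn / (tn + fp)  # accuracy on Cats (the negative class)

print(sensitivity)  # 0.2
print(specificity)  # 0.3
```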
Also note that if you add up the numbers going horizontally, you end up with the total number of examples for that class in the dataset. Adding up those row totals then gives you the total number of examples in the overall dataset.
To prove this: in the first row of the Confusion Matrix we have 15 and 35. Add these up and we get 50 Cats. If we do the same for the next row we get 40+10=50 Dogs. Adding all these up gives us 100 total examples – which is correct 😉
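That row-sum check is easy to sketch in Python too (the matrix here is just the example counts from above, stored as rows of [predicted Cat, predicted Dog]):

```python
# Confusion matrix rows: [predicted Cat, predicted Dog].
matrix = {
    "Cat": [15, 35],  # 15 Cats as Cats, 35 Cats as Dogs
    "Dog": [40, 10],  # 40 Dogs as Cats, 10 Dogs as Dogs
}

# Each row sums to the number of examples of that class...
row_totals = {cls: sum(row) for cls, row in matrix.items()}
print(row_totals)  # {'Cat': 50, 'Dog': 50}

# ...and the row totals sum to the size of the whole dataset.
print(sum(row_totals.values()))  # 100
```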