Machine Learning – Entropy

Entropy is a probability-based measure of uncertainty.

Now, let's have a quick example:

2B || ¬2B

Yes, okay, bad joke… I know. I was going to call this section “Heads or Tails”, but my mind wandered, I forgot that I was going to, and BAM, my bored self put that instead.

For people who are like “What the frick!?”, it means “to be or not to be”.

So, what is Entropy? Well, first we need to quantify something called uncertainty.

Let's say I flip a coin – is it heads or tails? How uncertain are you? Well, going through the basics of probability:

  • P(heads) = 0.5
  • P(tails) = 0.5

So there is a 50:50 chance of the coin landing on either side. So you are 50% uncertain as to what it is.
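To see that 50:50 in action, here's a quick sketch (plain Python; the function name and seed are my own choices, not from anywhere official) that simulates a bunch of fair coin flips and checks the empirical probabilities:

```python
import random

def flip_coins(n, seed=42):
    """Simulate n fair coin flips; return empirical (P(heads), P(tails))."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n))
    return heads / n, (n - heads) / n

p_heads, p_tails = flip_coins(100_000)
# Both values should come out very close to 0.5 for a fair coin
print(p_heads, p_tails)
```

The more flips you do, the closer the two empirical probabilities get to 0.5 each.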



So how do we calculate Entropy? Well, Entropy is denoted with the variable “H”, and uses the following formula:

  H(S) = −Σᵢ pᵢ · log₂(pᵢ)

So, let's say we have the following set of Heads (H) and Tails (T):

  • S = {H, H, H, H, H, T, T, T, T, T, T, T, T}

Here we have 5 H’s and 8 T’s. This gives us a total of 13. If we substitute these values in we get:

  H(S) = −(5/13)·log₂(5/13) − (8/13)·log₂(8/13) ≈ 0.96124

So, as you can see, “pᵢ” is the probability of that event happening. So for Heads, say, the probability of a Heads appearing is P(heads) = 5/13.

And we have an uncertainty of 0.96124 as to whether the side will be a Heads or a Tails.
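The whole calculation above can be sketched in a few lines of Python (the `entropy` helper is my own illustration, not a standard library function):

```python
from collections import Counter
from math import log2

def entropy(outcomes):
    """Shannon entropy: H(S) = -sum(p_i * log2(p_i)) over each outcome i."""
    counts = Counter(outcomes)
    total = len(outcomes)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# The set from the example: 5 Heads and 8 Tails
S = ["H"] * 5 + ["T"] * 8
print(round(entropy(S), 5))  # → 0.96124
```

Note that a perfectly fair 50:50 set would give an entropy of exactly 1, and a set of all Heads (no uncertainty at all) would give 0 – our slightly lopsided 5:8 split lands just below 1.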


About Badgerati
Computer Scientist, Games Developer, and DevOps Engineer. Fantasy and Sci-fi book lover, also founder of Cadaeic Studios.
