Machine Learning – Cluster Analysis


Cluster Analysis

I’ve left my Machine Learning revision very late and I’m struggling with the second part of the course. If anyone wants to get in touch about any part-two material I’d be more than happy to chat any time before the exam, as it really helps. I shall leave my laptop on overnight, so emails (bedforj8@cs.man.ac.uk) and Skype messages/calls (Starscent) will wake me up. Here’s what I’ve managed to grasp of cluster analysis.

Read more of this post


Machine Learning – Support Vector Machines


Hopefully all of this should now be complete – if not, tell me, hehe

Say we have the following graph with a set of plotted points:

Read more of this post

Machine Learning – Naive Bayes Classifier


Background

There are three methods to establish a classifier; these are:

Read more of this post

Machine Learning – Decision Trees and Entropy


Decision Trees

If anyone requires further explanation, please get in touch with me (or anyone else for that matter)!

In machine learning, it can be desirable to come up with meaningful if-then rules in order to predict what will occur in the future. For example, “if this and if that then this will probably happen”. Decision trees can be built automatically, which can then be used to come up with these if-then rules.

A decision tree is used to investigate large amounts of data and identify the most probable outcomes. Each condition is an internal node of the tree; each outcome is a leaf (external) node.
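
As a rough illustration (my own sketch, not from the post), here’s a minimal Python example of that idea using scikit-learn, with completely made-up cat/dog measurements; export_text prints the learned tree as nested if-then conditions:

from sklearn.tree import DecisionTreeClassifier, export_text

# Toy, made-up measurements: [height_cm, weight_kg]
X = [[25, 4], [23, 3], [30, 5], [55, 20], [60, 25], [50, 18]]
y = ["Cat", "Cat", "Cat", "Dog", "Dog", "Dog"]

tree = DecisionTreeClassifier().fit(X, y)

# Print the learned tree as nested if-then conditions
print(export_text(tree, feature_names=["height_cm", "weight_kg"]))
print(tree.predict([[28, 4]]))  # -> ['Cat']
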
Read more of this post

Machine Learning – Entropy


Entropy is a probability-based measure of uncertainty.
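
Just to make that concrete (my own sketch, not part of the post), the standard Shannon entropy formula is H = −Σ p·log2(p), which in Python looks something like:

import math

def entropy(probabilities):
    # Shannon entropy in bits: H = -sum(p * log2(p))
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit - a fair coin flip is maximally uncertain
print(entropy([0.9, 0.1]))  # ~0.47  - a biased coin is more predictable
print(entropy([1.0]))       # 0.0    - a certain outcome carries no surprise
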

Now, let’s have a quick example:

Read more of this post

Machine Learning – Decision Trees


I bet you thought that was going to say Decision Boundaries again 😀 – well… that is… if you’ve read the first 4 Machine Learning posts 😉

Nope, this time it’s Decision Trees, which are very similar to trees in programming – aka binary trees.

Read more of this post

Machine Learning – Confusion Matrix


So what happens if our dataset consists of only 5 Cats but 95 Dogs? Well, chances are you’ll get 95% accuracy just by predicting everything as a Dog! That means you predicted all 5 Cats as Dogs.
So what happens when we predict a Cat as a Dog and vice versa? Well, chances are we’ll want to know what’s been predicted as what! This is where a Confusion Matrix comes in! 🙂
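
To make that concrete, here’s a minimal Python sketch (my own toy example, not from the post) counting the 2×2 confusion matrix for that lopsided 5-Cats/95-Dogs scenario, assuming the lazy classifier predicts Dog for everything:

from collections import Counter

# Made-up labels matching the scenario above: 5 Cats, 95 Dogs,
# and a classifier that predicts "Dog" for everything.
actual    = ["Cat"] * 5 + ["Dog"] * 95
predicted = ["Dog"] * 100

counts = Counter(zip(actual, predicted))
for a in ("Cat", "Dog"):
    print(f"actual {a}: predicted Cat = {counts[(a, 'Cat')]:3d}, "
          f"predicted Dog = {counts[(a, 'Dog')]:3d}")

# actual Cat: predicted Cat =   0, predicted Dog =   5   <- the 5 Cats mistaken for Dogs
# actual Dog: predicted Cat =   0, predicted Dog =  95
# Accuracy is still 95/100 = 95%, even though every single Cat was misclassified.
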

Read more of this post

Machine Learning – Training and Testing


Splitting the Dataset

Hmm… Ah… Looking back at the graphs we used for the Cats and Dogs, I’ve just realised something: they have no units ^_^

Ah well, that’s about to change, haha!

Read more of this post

Machine Learning – Artificial Neurons / Perceptrons (Part 2)


So then, time for part 2 of Perceptrons 😀

Here I’m going to give a few examples of the algorithm working, then it’s onto the next part 🙂

Read more of this post

Machine Learning – Artificial Neurons / Perceptrons (Part 1)


Quick Perceptron Example

Here we go… a Decision Boundary… again. Yes you may now kill me 🙂

Actually, this time it really isn’t that bad, because a Perceptron literally IS a decision boundary (there’s a tiny sketch of this after the excerpt below). Let’s take a look at a rounded decision boundary from KNN:

Read more of this post
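
As promised above, here’s a minimal Python sketch of that claim (the weights and bias are made up purely for illustration): a perceptron classifies a point simply by which side of the line w·x + b = 0 it falls on, so the perceptron really is just a linear decision boundary.

# Made-up weights and bias, purely for illustration.
def perceptron(x, w=(1.0, -2.0), b=0.5):
    # The decision boundary is the line w.x + b = 0;
    # the perceptron just reports which side of it the point lies on.
    activation = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if activation >= 0 else 0

print(perceptron((3.0, 1.0)))  # 3 - 2 + 0.5 =  1.5 -> class 1
print(perceptron((1.0, 2.0)))  # 1 - 4 + 0.5 = -2.5 -> class 0
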