# Sorting – Merge Sort

A post all about Merge Sort.

## Merge Sort

Merge Sort uses an algorithm design technique known as Divide-and-Conquer:

### Divide-and-Conquer

• Divide β If the sequence contains one or two element then simply sort the elements and return the elements as the sequence is sorted. Otherwise we need to split the sequence up into two or more sequences.
• Recursively β solve the subproblems that we have just created.
• Conquer β merge the subproblems back together to form the original problem

So, how does Merge Sort use this Divide-and-Conquer strategy? Well, like this:

• DIVIDE β If the sequence S has zero or one element then return it. Otherwise, take a sequence S and split into two new sequences S1 and S2.
• RECURSIVELY β sort the sequences S1 and S2.
• CONQUER β merge the two sorted sequences S1 and S2 back into their original sequence S.

#### Example

Okay, so letβs have an example of Merge Sort in action:

Here I have jumped straight in and done the first step, which is to split the original sequence into two smaller sequences. Since neither of them has zero or one elements, we need to split these two sequences up as well. Jumping ahead a few steps, this is what we get:

Now, after a few steps of splitting, we have a set of sequences with just one element each. What we do now is merge the elements back together, sorting them as we place them back into larger sequences:
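The merging step is where the sorting actually happens, so it is worth seeing on its own. Here is a small sketch of merging two already-sorted sequences into one (the name `merge` is my own, not from the original post):

```python
def merge(s1, s2):
    """Merge two already-sorted lists into one sorted list."""
    merged = []
    i = j = 0
    # Repeatedly take the smaller of the two front elements.
    while i < len(s1) and j < len(s2):
        if s1[i] <= s2[j]:
            merged.append(s1[i])
            i += 1
        else:
            merged.append(s2[j])
            j += 1
    # One list is exhausted; append whatever remains of the other.
    merged.extend(s1[i:])
    merged.extend(s2[j:])
    return merged
```

So `merge([1, 4, 5], [2, 3, 6])` gives `[1, 2, 3, 4, 5, 6]` – each call only ever compares the current front elements of the two inputs, which is why a single merge is linear in the total number of elements.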

And there we have it: one sequence of numbers, sorted via Merge Sort.

## Complexity

Now for the big question: what is the time complexity of Merge Sort? Just think about it. Splitting the sequence in half at each step forms a tree, and a (near) perfect binary tree at that! So the height of the tree is O(log n), where n is the number of elements. Next we have to consider how long we spend at each level of the tree, which is O(n), because merging the sequences at a given level touches every element exactly once.
Altogether, that gives a complexity of O(n log n), even in the worst case.
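We can check the "O(n) work per level, O(log n) levels" argument empirically. The sketch below (an instrumented variant I wrote for illustration; none of these names are from the original) counts how many times an element is placed into a merged sequence. For n a power of two, every one of the log2(n) merging levels moves all n elements, so the count comes out to exactly n · log2(n):

```python
import random


def merge_sort_ops(seq):
    """Return (sorted list, number of element moves performed by merging)."""
    if len(seq) <= 1:
        return seq, 0  # nothing to merge at a leaf of the recursion tree
    mid = len(seq) // 2
    left, ops_left = merge_sort_ops(seq[:mid])
    right, ops_right = merge_sort_ops(seq[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    # This merge moved len(merged) elements, once each.
    return merged, ops_left + ops_right + len(merged)


n = 1024  # a power of two, so the recursion tree is a perfect binary tree
data = random.sample(range(n), n)
result, moves = merge_sort_ops(data)
# log2(1024) = 10 levels of merging, each moving all 1024 elements:
# moves == 1024 * 10 == 10240, regardless of the input order
```

Note that the count depends only on n, not on how shuffled the input is, which is exactly why Merge Sort's O(n log n) holds in the worst case and not just on average.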