Algorithm Review: Insertion Sort

I have to admit it: I forget things quite easily, and probably the most annoying things to forget are one's own skills, aren't they?

Fortunately, a couple of weeks ago I came across HackerRank, and I'm loving how working through its challenges is refreshing and sharpening some of my skills!

A couple of days ago I committed myself to going through every single challenge there, in order, and today I've decided to write a small post with the concepts and ideas behind every algorithm, data structure, etc. that I review. First of all, to settle those concepts a bit deeper in my mind, but also to have a place I can easily come back to for a quick read whenever I feel I need it. And last but not least, to share it with whoever may read this! 🙂

So, as HackerRank starts off with the Insertion Sort algorithm, and as I've already finished all the related exercises with the maximum score (it was not that hard, to be honest), here is my Insertion Sort algorithm review!

Insertion Sort is a very simple in-place sorting algorithm. This means that no significant extra memory is required for it to execute: beyond the input array itself, it only needs a constant amount of auxiliary storage, so its extra space complexity is O(1).

The execution itself could be explained as a set of steps:

Consider the subarray formed by the first element as the sorted part, and the remaining elements as the unsorted part.

For each element in the unsorted part:

  1. Store it in an auxiliary variable.
  2. Shift one position to the right every element in the sorted part that is greater than the value stored in step 1.
  3. Insert the auxiliary value at the position where step 2 stopped (a single pass is sketched in code right after this list).
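
For example, here is a minimal sketch of a single pass, with hypothetical values: inserting 5 into the already sorted prefix [2, 3, 4, 6].

data = [2, 3, 4, 6, 5]    # sorted prefix [2, 3, 4, 6], next unsorted element: 5
i = 4                     # index of the element being inserted
value = data[i]           # step 1: store it in an auxiliary variable
j = i - 1
while j >= 0 && value < data[j]
  data[j + 1] = data[j]   # step 2: shift greater sorted elements one position right
  j -= 1
end
data[j + 1] = value       # step 3: insert the stored value where the shifting stopped
p data                    # => [2, 3, 4, 5, 6]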

We could graphically draw it like this:

[2, 3, 4, 6, 5, 1]   ← initial array; sorted part: [2]
[2, 3, 4, 6, 5, 1]   ← 3 is already in place; sorted part: [2, 3]
[2, 3, 4, 6, 5, 1]   ← 4 is already in place; sorted part: [2, 3, 4]
[2, 3, 4, 6, 5, 1]   ← 6 is already in place; sorted part: [2, 3, 4, 6]
[2, 3, 4, 6, 6, 1]   ← inserting 5: 6 is shifted right
[2, 3, 4, 5, 6, 1]   ← 5 is inserted; sorted part: [2, 3, 4, 5, 6]
[2, 3, 4, 5, 6, 6]   ← inserting 1: 6 is shifted right
[2, 3, 4, 5, 5, 6]   ← 5 is shifted right
[2, 3, 4, 4, 5, 6]   ← 4 is shifted right
[2, 3, 3, 4, 5, 6]   ← 3 is shifted right
[2, 2, 3, 4, 5, 6]   ← 2 is shifted right
[1, 2, 3, 4, 5, 6]   ← 1 is inserted; the array is sorted

As we can see, the sorted part (annotated on each line above) keeps growing: on every pass the next element is picked and the greater elements of the sorted part are shifted to the right, temporarily duplicating values, until the proper place for the current value is found.

A possible Ruby implementation:

data = [2, 3, 4, 6, 5, 1]  # e.g. the array from the trace above
for i in 1...data.count
  value = data[i]          # step 1: store the current element
  j = i - 1
  while j >= 0 && value < data[j]
    data[j + 1] = data[j]  # step 2: shift greater sorted elements right
    j -= 1
  end
  data[j + 1] = value      # step 3: insert the stored value in its place
end
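
Running the snippet above on that example array should leave data sorted in place; a quick check:

p data               # => [1, 2, 3, 4, 5, 6]
p(data == data.sort) # => true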

Analysing the worst-case performance of the algorithm, we can easily see that the worst case occurs when the input array is in reverse order: every element has to be shifted all the way to the beginning of the sorted part, so the algorithm needs 1 + 2 + 3 + … + (N-1) shifts in total.
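
To make this concrete, here is a small sketch (the same loop as above, plus a hypothetical shifts counter) that counts the shifts performed on a reverse-ordered array of N = 6 elements:

data = [6, 5, 4, 3, 2, 1]  # worst case: reverse order
shifts = 0
for i in 1...data.count
  value = data[i]
  j = i - 1
  while j >= 0 && value < data[j]
    data[j + 1] = data[j]
    shifts += 1            # count every shift performed
    j -= 1
  end
  data[j + 1] = value
end
p shifts                   # => 15, i.e. 1 + 2 + 3 + 4 + 5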

1 + 2 + 3 + … + (N-1) is a finite arithmetic progression, whose sum is, by definition, $\frac{n(a_{1}+a_{n})}{2}$ (with $n$ the number of terms). In our particular case that is $\frac{(N-1)(1 + (N-1))}{2} = \frac{N(N-1)}{2}$, which grows like $\frac{N^{2}}{2}$ and is therefore proportional to $N^{2}$.
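
As a quick sanity check with the $N = 6$ example array from above: $1 + 2 + 3 + 4 + 5 = \frac{6 \cdot 5}{2} = 15$, which matches the 15 shifts counted by the sketch in the previous paragraph.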

Therefore we can say that Insertion Sort time complexity is $O(n^{2})$.
