# Sorting algorithms

In this post we are going to compare and analyse the time complexity of some of the basic sorting algorithms:

- Bubble sort
- Selection sort
- Insertion sort
- Merge sort
- Quick sort
- Heap sort

You can visualize some of these sorting algorithms before delving into how they work.

The complete source code can be found at the location below, in case you want to run the program first and then work through it step by step.

Before diving into the sorting algorithms, let's get a broader picture of which complexity bucket each of these algorithms falls into. Bubble, insertion and selection sort have O(n²) complexity, whereas merge, quick and heap sort have O(n log n) complexity in the average case. Of these, only quick sort degrades to O(n²) in the worst case; merge sort and heap sort stay O(n log n).

**Bubble Sort**

```java
private void bubbleSort(int[] array, int n) {
    boolean swapped;
    for (int i = 0; i < n - 1; i++) {
        swapped = false;
        for (int j = 0; j < n - i - 1; j++) {
            if (array[j] > array[j + 1]) {
                swap(array, j, j + 1);
                swapped = true;
            }
        }
        if (!swapped) {
            break;
        }
    }
}

private void swap(int[] array, int i, int j) {
    int temp = array[i];
    array[i] = array[j];
    array[j] = temp;
}
```

As we can see in the above code, the inner loop runs (n − 1) times on the first pass of the outer loop, (n − 2) times on the second, and so on:

T(n) = (n − 1) + (n − 2) + (n − 3) + … + 1

which is the same as writing

T(n) = 1 + 2 + 3 + … + (n − 2) + (n − 1)

T(n) = n × (n − 1) / 2

Bounding this from both sides: n × (n − 1) / 2 < n × n / 2, and for n > 2, n × (n − 1) / 2 > (n / 2) × (n / 2). Hence

n² / 4 ≤ T(n) ≤ n² / 2 for all n > 2
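As a sanity check on the derivation, we can instrument the inner loop and confirm that the total number of comparisons is exactly n(n − 1)/2. This is a minimal sketch; the class and method names are illustrative, and the early-exit `swapped` optimization is left out so the count is exact for any input:

```java
// Counts how many comparisons bubble sort makes on an array of length n.
// Without the early-exit "swapped" optimization this is always n * (n - 1) / 2.
public class BubbleCompareCount {
    static long countComparisons(int[] array) {
        long comparisons = 0;
        int n = array.length;
        for (int i = 0; i < n - 1; i++) {
            for (int j = 0; j < n - i - 1; j++) {
                comparisons++;                    // one comparison per inner step
                if (array[j] > array[j + 1]) {
                    int temp = array[j];          // swap adjacent elements
                    array[j] = array[j + 1];
                    array[j + 1] = temp;
                }
            }
        }
        return comparisons;
    }

    public static void main(String[] args) {
        int n = 10;
        int[] array = new int[n];
        for (int i = 0; i < n; i++) array[i] = n - i;   // reverse-sorted input
        System.out.println(countComparisons(array));    // prints 45 = 10 * 9 / 2
    }
}
```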

So in the worst case the number of swaps is also O(n²), which can be reduced to O(n) with selection sort.

**Selection sort**

```java
private void selectionSort(int[] array, int n) {
    int j = 0;
    for (int i = 0; i < n - 1; i++) {
        int m = 0;
        for (j = 1; j < n - i; j++) {
            if (array[j] > array[m]) {
                m = j;
            }
        }
        if (j - 1 != m) {
            swap(array, j - 1, m);
        }
    }
}
```

Here the outer loop iterates N times, the inner loop finds the largest element of the unsorted part in each iteration, and we swap at most once per outer iteration. Hence we still sort the array with O(n²) comparisons, but swapping is done only O(n) times.
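The O(n) swap bound can be checked directly by counting swaps: selection sort performs at most n − 1 of them regardless of input order. A minimal sketch (class and method names are illustrative, the algorithm is the same max-selection variant as above):

```java
// Selection sort performs at most n - 1 swaps, regardless of input order.
public class SelectionSwapCount {
    static long sortAndCountSwaps(int[] array) {
        long swaps = 0;
        int n = array.length;
        for (int i = 0; i < n - 1; i++) {
            int m = 0;                        // index of the largest element so far
            int j;
            for (j = 1; j < n - i; j++) {
                if (array[j] > array[m]) m = j;
            }
            if (j - 1 != m) {                 // move the largest to the end of the unsorted part
                int temp = array[j - 1];
                array[j - 1] = array[m];
                array[m] = temp;
                swaps++;
            }
        }
        return swaps;
    }

    public static void main(String[] args) {
        int[] array = {5, 1, 4, 2, 3};
        long swaps = sortAndCountSwaps(array);
        System.out.println(swaps <= array.length - 1);   // prints true: O(n) swaps
    }
}
```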

**Insertion Sort**

```java
void insertionSort(int[] arr, int n) {
    int i, key, j;
    for (i = 1; i < n; i++) {
        key = arr[i];
        j = i - 1;
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];
            j = j - 1;
        }
        arr[j + 1] = key;
    }
}
```

The time complexity of insertion sort is also O(n²) in the worst case, but it takes only O(n) if the array is already sorted.
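We can see both extremes by counting the element shifts the inner while loop performs: zero on a sorted array, n(n − 1)/2 on a reverse-sorted one. A minimal sketch with an illustrative class name:

```java
// On an already-sorted array, insertion sort's while loop never runs,
// so the total work is just n - 1 comparisons: O(n).
public class InsertionShiftCount {
    static long sortAndCountShifts(int[] arr) {
        long shifts = 0;
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];   // shift the larger element one slot right
                j--;
                shifts++;
            }
            arr[j + 1] = key;
        }
        return shifts;
    }

    public static void main(String[] args) {
        System.out.println(sortAndCountShifts(new int[]{1, 2, 3, 4, 5}));   // 0: best case
        System.out.println(sortAndCountShifts(new int[]{5, 4, 3, 2, 1}));   // 10 = 5*4/2: worst case
    }
}
```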

As we can see, bubble sort, insertion sort and selection sort have O(n²) time complexity, but we can sort faster using merge sort and quick sort in O(n log n) time.

**Merge sort**

```java
private void mergeSort(int[] array, int l, int r) {
    if (l < r) {
        int mid = l + (r - l) / 2;
        mergeSort(array, l, mid);
        mergeSort(array, mid + 1, r);
        merge(array, l, mid, r);
    }
}

private void merge(int[] array, int l, int mid, int r) {
    int n1 = mid - l + 1, n2 = r - mid;
    int[] lArray = new int[n1];
    int[] rArray = new int[n2];
    for (int i = 0; i < n1; i++) {
        lArray[i] = array[l + i];
    }
    for (int j = 0; j < n2; j++) {
        rArray[j] = array[mid + 1 + j];
    }
    int i = 0, j = 0, k = l;
    while (i < n1 && j < n2) {
        // "<=" keeps the merge stable; comparing "<" and ">" separately
        // would skip over equal elements and leave stale values behind
        if (lArray[i] <= rArray[j]) {
            array[k] = lArray[i++];
        } else {
            array[k] = rArray[j++];
        }
        k++;
    }
    while (i < n1) {
        array[k++] = lArray[i++];
    }
    while (j < n2) {
        array[k++] = rArray[j++];
    }
}
```

As we can see in the above program, the mergeSort function divides the given array into two halves and then, while merging, places the elements in ascending order. You can visualize the recursion as below when mergeSort is called.

Each level of merging takes θ(n). Hence

T(n) = T(n/2) + T(n/2) + θ(n)

T(n) = 2T(n/2) + n

T(n) = n + 2[n/2 + 2T(n/4)]

T(n) = n + n + 4T(n/4)

T(n) = 2n + 4[n/4 + 2T(n/8)]

T(n) = 2n + n + 8T(n/8)

T(n) = 3n + 8T(n/8)

After k such expansions,

T(n) = kn + 2^k T(n/2^k)

The recursion stops when n/2^k = 1, i.e. n = 2^k,

k = log₂(n)

so, T(n) = n log₂(n) + n T(1)

T(n) = n log(n) + n

T(n) = θ(n log n)

Hence the time complexity of merge sort is θ(n log n) in all cases.
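The closed form above can be checked numerically by evaluating the recurrence directly for a power of two. This sketch (class name illustrative) assumes T(1) = 1:

```java
// Evaluates T(n) = 2T(n/2) + n with T(1) = 1 for powers of two, and compares
// it against the closed form n*log2(n) + n derived above.
public class MergeRecurrence {
    static long T(long n) {
        if (n <= 1) return 1;          // base case: T(1) = 1
        return 2 * T(n / 2) + n;       // divide into two halves, then theta(n) merge
    }

    public static void main(String[] args) {
        long n = 1024;                          // 2^10, so log2(n) = 10
        long closedForm = n * 10 + n;           // n*log2(n) + n*T(1)
        System.out.println(T(n) == closedForm); // prints true
    }
}
```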

**Quick sort**

```java
private void quickSort(int[] array, int l, int r) {
    if (l < r) {
        int pivot = partition(array, l, r);
        quickSort(array, l, pivot - 1);
        quickSort(array, pivot + 1, r);
    }
}

private int partition(int[] array, int l, int r) {
    int pivot = array[r];
    int i = l;
    for (int j = l; j < r; j++) {
        if (array[j] < pivot) {
            swap(array, i++, j);
        }
    }
    swap(array, i, r);
    return i;
}
```

Quick sort is also a divide-and-conquer algorithm, but here we divide the array around a partition key (the pivot), unlike merge sort where we divide the array exactly in half. If the pivot ends up with k elements on its left,

T(n) = T(k) + T(n-k-1) + θ(n)

Worst case: it occurs when the partition process always picks the greatest or smallest element as the pivot.

T(n) = T(0) + T(n-1) + θ(n)

T(n) = T(n-1) + θ(n)

T(n) = n + [(n − 1) + T(n − 2)]

T(n) = n + (n − 1) + T(n − 2)

T(n) = n + (n − 1) + … + 1

T(n) = n(n + 1)/2

T(n) = O(n²)
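With the last element as pivot, an already-sorted array triggers exactly this worst case: every partition leaves one side empty. We can verify the n(n − 1)/2 comparison count empirically (a sketch with illustrative names; the algorithm is the same Lomuto partition as above):

```java
// On sorted input with a last-element pivot, quick sort's comparisons
// total (n-1) + (n-2) + ... + 1 = n(n-1)/2.
public class QuickWorstCase {
    static long comparisons = 0;

    static void quickSort(int[] array, int l, int r) {
        if (l < r) {
            int pivot = partition(array, l, r);
            quickSort(array, l, pivot - 1);
            quickSort(array, pivot + 1, r);
        }
    }

    static int partition(int[] array, int l, int r) {
        int pivot = array[r];
        int i = l;
        for (int j = l; j < r; j++) {
            comparisons++;                    // one comparison per element scanned
            if (array[j] < pivot) {
                int temp = array[i]; array[i] = array[j]; array[j] = temp;
                i++;
            }
        }
        int temp = array[i]; array[i] = array[r]; array[r] = temp;
        return i;
    }

    public static void main(String[] args) {
        int n = 100;
        int[] sorted = new int[n];
        for (int i = 0; i < n; i++) sorted[i] = i;   // already sorted: worst case
        quickSort(sorted, 0, n - 1);
        System.out.println(comparisons);             // prints 4950 = 100 * 99 / 2
    }
}
```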

Best and average case: the best case occurs when the partition process always picks the middle element as the pivot, which is the same split as merge sort, so the complexity will be

T(n) = 2T(n/2) + n

T(n) = θ(nlogn)

**Heap Sort**

```java
private void heapSort(int[] array, int n) {
    for (int i = (n / 2) - 1; i >= 0; i--) {
        heapify(array, n, i);
    }
    for (int i = n - 1; i >= 0; i--) {
        int temp = array[0];
        array[0] = array[i];
        array[i] = temp;
        heapify(array, i, 0);
    }
}

private void heapify(int[] array, int n, int i) {
    int largest = i;
    int left = (2 * i) + 1;
    int right = (2 * i) + 2;
    if (left < n && array[left] > array[largest]) {
        largest = left;
    }
    if (right < n && array[right] > array[largest]) {
        largest = right;
    }
    if (i != largest) {
        int temp = array[i];
        array[i] = array[largest];
        array[largest] = temp;
        heapify(array, n, largest);
    }
}
```

We build a max heap in the first for loop of the heapSort method, which gives us the largest element at the root of the array. In the second for loop we repeatedly take that top element and place it at the end of the shrinking unsorted region, which fills the array in ascending order from the last index to the first.

The heapify method makes sure the heap property is maintained in O(log n) time, so the total time complexity of the above program is as below (in fact the bottom-up heap construction in the first loop is only O(n), but the overall bound is the same):

T(n) = O((n/2) × log n) + O(n × log n)

T(n) = O(n log n)
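To tie it together, here is a small self-contained driver for the heap sort shown above (the method bodies are reproduced so the example compiles on its own; the class name is illustrative):

```java
import java.util.Arrays;

// Minimal driver for the heapSort / heapify pair shown above.
public class HeapSortDemo {
    static void heapSort(int[] array, int n) {
        for (int i = (n / 2) - 1; i >= 0; i--) heapify(array, n, i);    // build max heap
        for (int i = n - 1; i >= 0; i--) {
            int temp = array[0]; array[0] = array[i]; array[i] = temp;  // move max to end
            heapify(array, i, 0);                                       // restore heap on the rest
        }
    }

    static void heapify(int[] array, int n, int i) {
        int largest = i, left = 2 * i + 1, right = 2 * i + 2;
        if (left < n && array[left] > array[largest]) largest = left;
        if (right < n && array[right] > array[largest]) largest = right;
        if (largest != i) {
            int temp = array[i]; array[i] = array[largest]; array[largest] = temp;
            heapify(array, n, largest);     // sift the displaced element further down
        }
    }

    public static void main(String[] args) {
        int[] array = {9, 4, 7, 1, 8, 2};
        heapSort(array, array.length);
        System.out.println(Arrays.toString(array));   // [1, 2, 4, 7, 8, 9]
    }
}
```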

The complete code for the above-mentioned algorithms can be found at

Kindly comment if there is any mistake or if more information should be added.
