 ### Dmitry Halai

Software engineer

# Sorting Algorithms

Hi everyone! As I mentioned earlier, I decided to recall some basic knowledge.
In this article I would like to share with you the differences between sorting algorithms. I know it can seem very simple, but how often do you actually think about it? We all use sorting very often (for example, `.sort` in Ruby), but not everyone takes a look under the hood.
So, here I want to describe the most popular sorting algorithms.
Let’s start!

## Selection Sort

Selection sort is an algorithm based on scanning the array from left to right.
It means that:

a) No entry to the right of the sorted part of the data is smaller than any entry to its left.
b) Entries to the left of the sorted part of the data are in ascending order.

So, we start from the first element of the data with index i and go through the rest of the data from left to right. If we find a smaller element, we swap it with the element at index i. On the next iteration we move the pointer (index i) one position to the right and repeat the search for a smaller element in the right part of the data. http://www.sorting-algorithms.com/selection-sort

Selection sort uses $(N-1) + (N-2) + \dots + 1 + 0 = \frac{N(N-1)}{2} \approx \frac{N^2}{2}$ compares and $N$ exchanges.

So, we can see that the time complexity of the algorithm is $O(n^2)$
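As a minimal sketch, the procedure above can be written in Ruby like this (the method name `selection_sort` is my own illustration, not a library call):

```ruby
# Selection sort: on each pass, find the smallest element of the
# unsorted part and swap it to the front of that part.
def selection_sort(arr)
  a = arr.dup                      # don't mutate the caller's array
  (0...a.length - 1).each do |i|
    min = i
    (i + 1...a.length).each do |j|
      min = j if a[j] < a[min]     # remember the smallest seen so far
    end
    a[i], a[min] = a[min], a[i] if min != i
  end
  a
end

selection_sort([5, 3, 8, 1]) # => [1, 3, 5, 8]
```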

## Insertion Sort

Insertion sort is another simple sorting algorithm.
Main steps can be described as:

a) Split the data into 2 parts: sorted and unsorted.
b) Start from the first element of the data with index i and put it into the sorted part.
c) Increment the index i by 1 and select the element at that index.
d) Find a new index for the selected element: go through the sorted part of the data from right to left until a smaller element is found.
e) Insert the selected element into the sorted part of the data using the new index.
f) Go to c) until the data is sorted. http://www.sorting-algorithms.com/insertion-sort

The time complexity of the algorithm is the same as in the selection sort: $O(n^2)$
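The steps above can be sketched in Ruby as follows (`insertion_sort` is my own naming for illustration):

```ruby
# Insertion sort: grow a sorted prefix; for each new element, shift
# larger entries of the prefix to the right, then drop it into place.
def insertion_sort(arr)
  a = arr.dup
  (1...a.length).each do |i|
    key = a[i]
    j = i - 1
    while j >= 0 && a[j] > key   # scan the sorted part right to left
      a[j + 1] = a[j]            # shift larger elements one step right
      j -= 1
    end
    a[j + 1] = key
  end
  a
end

insertion_sort([4, 2, 7, 1]) # => [1, 2, 4, 7]
```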

## Shellsort

Shellsort is an improved version of the insertion sort.
This method is based on sorting pairs of elements far apart from each other.
Shellsort splits the data into several parts using an interval. This interval is calculated by a formula (introduced by Shell):

$d = \lfloor gap/2 \rfloor$, where gap is the interval with initial value N (the number of elements).

Main steps can be described as:

a) Initialize the value of d.
b) Divide the list into smaller sublists using the interval d.
c) Sort these sublists using insertion sort.
d) Repeat until the complete list is sorted. http://www.sorting-algorithms.com/shell-sort

The time complexity of the algorithm depends on interval d.
In the worst case: $O(n^2)$.
In the best case: $O(n*log(n))$.
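Here is a Ruby sketch using Shell's original gap sequence (N/2, N/4, ..., 1); the method name is my own:

```ruby
# Shellsort: each pass is an insertion sort over elements `gap` apart;
# the gap is halved until it reaches 1 (a plain insertion sort pass).
def shellsort(arr)
  a = arr.dup
  gap = a.length / 2
  while gap > 0
    (gap...a.length).each do |i|
      temp = a[i]
      j = i
      while j >= gap && a[j - gap] > temp
        a[j] = a[j - gap]        # shift within the gap-separated sublist
        j -= gap
      end
      a[j] = temp
    end
    gap /= 2                     # halve the interval
  end
  a
end

shellsort([9, 1, 5, 3]) # => [1, 3, 5, 9]
```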

## Mergesort

Mergesort is one of the divide-and-conquer algorithms which is built on recursion.
Conceptually, a merge sort works as follows:

a) Divide the unsorted list into n sublists, each containing 1 element (a list of 1 element is considered sorted).
b) Repeatedly merge sublists to produce new sorted sublists until there is only 1 sublist remaining. This will be the sorted list. http://www.sorting-algorithms.com/merge-sort

The time complexity of the algorithm in the worst case is $O(n * log(n))$.
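A minimal recursive Ruby sketch of the idea (the name `merge_sort` and the merge loop are my own illustration):

```ruby
# Mergesort: split the list in half, sort both halves recursively,
# then merge the two sorted halves into one sorted list.
def merge_sort(arr)
  return arr if arr.length <= 1  # a list of 1 element is sorted
  mid = arr.length / 2
  left = merge_sort(arr[0...mid])
  right = merge_sort(arr[mid..-1])
  merged = []
  until left.empty? || right.empty?
    merged << (left.first <= right.first ? left.shift : right.shift)
  end
  merged + left + right          # one side may still hold elements
end

merge_sort([3, 1, 2]) # => [1, 2, 3]
```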

## Quicksort

Quicksort is a highly efficient algorithm which is based on recursion (as well as mergesort).

The steps of the algorithm are:

a) Pick an element, called a pivot, from the array.
b) Partitioning: reorder the array so that all elements with values less than the pivot come before the pivot, while all elements with values greater than the pivot come after it (equal values can go either way).
After this partitioning, the pivot is in its final position. This is called the partition operation.
c) Recursively apply the above steps to the sub-array of elements with smaller values and separately to the sub-array of elements with greater values. http://www.sorting-algorithms.com/quick-sort

Quicksort’s divide-and-conquer formulation makes it amenable to parallelization using task parallelism.

The time complexity of the algorithm depends on pivot element and can be:
— in the worst case: $O(n^2)$.
— in the best case: $O(n*log(n))$.
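The steps above can be sketched in Ruby like this. Note that this copying version is chosen for readability; production implementations usually partition in place, and the middle-element pivot here is just one possible choice:

```ruby
# Quicksort: pick a pivot, partition into smaller / equal / greater,
# and recurse on the two outer parts.
def quicksort(arr)
  return arr if arr.length <= 1
  pivot = arr[arr.length / 2]
  quicksort(arr.select { |x| x < pivot }) +
    arr.select { |x| x == pivot } +           # pivot (and duplicates) in final place
    quicksort(arr.select { |x| x > pivot })
end

quicksort([3, 6, 2, 6, 1]) # => [1, 2, 3, 6, 6]
```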

## Summary

| Algorithm      | Complexity (worst case) | Complexity (best case) |
|----------------|-------------------------|------------------------|
| Selection Sort | $O(n^2)$                | $O(n^2)$               |
| Insertion Sort | $O(n^2)$                | $O(n^2)$               |
| Shellsort      | $O(n^2)$                | $O(n*log(n))$          |
| Mergesort      | $O(n*log(n))$           | $O(n*log(n))$          |
| Quicksort      | $O(n^2)$                | $O(n*log(n))$          |

For small data sets we can use selection sort or insertion sort.
In other cases we should prefer $O(n*log(n))$ algorithms.

By the way, here you can find a nice tutorial about data structures and algorithms.