Quicksort is one of the most efficient algorithms that we can use to accomplish our sorting goals. In this article, we discuss how to implement Quicksort in C# as well as analyze its time and space complexity.

Let’s start.

## What is Quicksort Algorithm?

Just like merge sort, quicksort uses the **“divide and conquer”** strategy to sort elements in arrays or lists. It implements this strategy by choosing an element as a pivot and using it to partition the array.

The left subarray contains all elements that are less than the pivot. The right subarray contains all the elements that are greater than the pivot. We recursively repeat this process until we sort the array. We can select the pivot the algorithm uses during this process in different ways:

- The first element of the array
- The last element of the array
- A random element of the array
- The median element of the array

So, what is the best pivot to select when implementing the quicksort algorithm? The answer to this question is not that simple.

Selecting the middle element of the unsorted array seems to make sense, as it divides the array into equal halves. However, finding that middle element adds work: with this strategy, we have to calculate the array’s length in every iteration and halve it to determine the index of the element in the middle of the array.

On the other hand, when using the median element of the array as the pivot, we apply the **median-of-three** technique, where we select the pivot as the median of three values: the first, middle, and last elements of the array.

Therefore, in practice, we select the first, last, or a random element of the array as the pivot, or use the median-of-three technique.
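As an illustration, here is a minimal median-of-three sketch (the `PivotSelection` class and `MedianOfThree` method are our own names, not part of the implementation we build later): it returns the index of whichever of the first, middle, and last elements holds the median value:

```csharp
using System;

public static class PivotSelection
{
    // Returns the index (left, mid, or right) whose element is the median
    // of the three candidate pivot values.
    public static int MedianOfThree(int[] array, int left, int right)
    {
        int mid = left + (right - left) / 2;
        int a = array[left], b = array[mid], c = array[right];

        if ((a <= b && b <= c) || (c <= b && b <= a)) return mid;
        if ((b <= a && a <= c) || (c <= a && a <= b)) return left;
        return right;
    }

    public static void Main()
    {
        var data = new int[] { 52, 96, 67, 71, 42, 38, 39, 40, 14 };
        // Candidates: 52 (first), 42 (middle), 14 (last); the median is 42 at index 4.
        Console.WriteLine(MedianOfThree(data, 0, data.Length - 1)); // prints: 4
    }
}
```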

Let’s take a deep dive and learn how quicksort works.

## How Does Quicksort Algorithm Work?

To illustrate how the quicksort algorithm works, let’s assume we intend to sort this array:

`int[] array = { 52, 96, 67, 71, 42, 38, 39, 40, 14 };`

In this article, let’s take the first element (52) as the pivot as we learn how to implement quicksort.

### First Partition Level

We start traversing the array from the left and right indexes while comparing their elements against the pivot. 96 is greater than the pivot while 14 is less than the pivot so we swap their positions and the array becomes:

`52, 14, 67, 71, 42, 38, 39, 40, 96`

Next, we can see that 67 is greater than the pivot element while 40 is less than the pivot so we swap their positions:

`52, 14, 40, 71, 42, 38, 39, 67, 96`

As we traverse the array, 71 is greater than the pivot while 39 is less than the pivot so we swap their positions:

`52, 14, 40, 39, 42, 38, 71, 67, 96`

38 and 42 are both less than the pivot, which stops the iteration. The next step is to determine the array’s split point: since 38 is less than the pivot, we swap it with the pivot, placing 52 between the two partitions, while 71 becomes the start of the right partition.

`38, 14, 40, 39, 42, 52, 71, 67, 96`

Given that quicksort is recursive, in the next iteration we are going to have two subarrays based on the identified split point.

### Second Partition Level

The left subarray is `38, 14, 40, 39, 42`

and the right subarray is `71, 67, 96`

Let’s start with the right subarray and take 71 as the pivot. 67 is less than the pivot, hence we swap their positions. On the other hand, 96 is greater than the pivot, so we don’t swap them, and the subarray becomes sorted:

`67, 71, 96`

We are going to repeat the same process for the left subarray and select 38 as the pivot for that subarray. 14 is less than the pivot while the rest of the elements are greater than the pivot so the array becomes:

`14, 38, 40, 39, 42`

### Third Partition Level

In the last iteration, quicksort takes 40 as the pivot. 39 is less than the pivot so we swap their positions but 42 is greater than the pivot, which completes the sorting process.

`39, 40, 42`

We can see that by using the divide and conquer strategy we complete the sorting process efficiently with the result being:

`14, 38, 39, 40, 42, 67, 71, 96`

Let’s learn how to implement the quicksort algorithm in C#.

## How to Implement Quicksort in C#?

We are going to define a method `SortArray()` as our entry point into the sorting algorithm. The method takes three parameters: `int[] array`, `int leftIndex`, and `int rightIndex`:

```csharp
public int[] SortArray(int[] array, int leftIndex, int rightIndex)
{
    var i = leftIndex;
    var j = rightIndex;
    var pivot = array[leftIndex];

    while (i <= j)
    {
        while (array[i] < pivot)
        {
            i++;
        }
        while (array[j] > pivot)
        {
            j--;
        }
        if (i <= j)
        {
            int temp = array[i];
            array[i] = array[j];
            array[j] = temp;
            i++;
            j--;
        }
    }

    if (leftIndex < j)
        SortArray(array, leftIndex, j);
    if (i < rightIndex)
        SortArray(array, i, rightIndex);

    return array;
}
```

`SortArray()` starts by assigning the values of `leftIndex` and `rightIndex` to new variables `i` and `j`, which we are going to use when iterating through the array.

Next, we set the pivot as the leftmost element in the array:

`var pivot = array[leftIndex];`

The algorithm starts placing the pivot element at its correct position in the sorted array by dividing the array into two lists in the outermost while loop. Our goal is to place all elements smaller than the pivot to its left and all greater elements to its right.

If the elements to the left of the pivot are less than the pivot element, we skip their positions:

```csharp
while (array[i] < pivot)
{
    i++;
}
```

Subsequently, if the elements to the right of the pivot are greater than the pivot element, we skip their positions as we loop through the right subarray:

```csharp
while (array[j] > pivot)
{
    j--;
}
```

As we loop through the array, if we find an element in the left subarray that is greater than the pivot and an element in the right subarray that is less than the pivot, we swap their positions. `SortArray()` swaps the positions of `array[i]` and `array[j]` and updates the subarrays’ counters accordingly:

```csharp
if (i <= j)
{
    int temp = array[i];
    array[i] = array[j];
    array[j] = temp;
    i++;
    j--;
}
```

Since the quicksort algorithm is recursive, the method calls itself to sort the left and right subarrays and returns a sorted array when the process is complete:

```csharp
if (leftIndex < j)
    SortArray(array, leftIndex, j);
if (i < rightIndex)
    SortArray(array, i, rightIndex);

return array;
```

Finally, we can verify that the `SortArray()` method sorts a given unsorted array accurately:

```csharp
var array = new int[] { 73, 57, 49, 99, 133, 20, 1 };
var expected = new int[] { 1, 20, 49, 57, 73, 99, 133 };

var sortFunction = new QuickSortMethods();
var sortedArray = sortFunction.SortArray(array, 0, array.Length - 1);

Assert.IsNotNull(sortedArray);
CollectionAssert.AreEqual(sortedArray, expected);
```

Let’s explore quicksort’s time and space complexity and understand why it is considered one of the most efficient sorting algorithms.

## Space Complexity of Quicksort Algorithm

As we’ve seen in the implementation section, the quicksort algorithm only needs extra space for handling the recursive calls and the temporary variable used when swapping array elements. With reasonably balanced partitions, the recursion goes on the order of log N levels deep, while in worst-case scenarios the recursion depth grows to O(N).

At each partition level, or recursive call, the algorithm allocates a new stack frame of constant size to handle the sorting process, so the extra space is proportional to the recursion depth. Therefore, the space complexity of the quicksort algorithm is **O(log N)**.
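We can observe those two recursion depths directly with a small probe (`DepthProbe` is our own hypothetical helper; it restates the partition scheme from the implementation section with a depth counter added). A random array stays within a small multiple of log₂ N levels, while a presorted array with a first-element pivot recurses N-1 levels deep:

```csharp
using System;

public static class DepthProbe
{
    static int depth;
    static int maxDepth;

    // Same partition scheme as SortArray(), instrumented with a
    // recursion-depth counter for illustration.
    static void Sort(int[] a, int left, int right)
    {
        depth++;
        maxDepth = Math.Max(maxDepth, depth);

        int i = left, j = right;
        int pivot = a[left];
        while (i <= j)
        {
            while (a[i] < pivot) i++;
            while (a[j] > pivot) j--;
            if (i <= j)
            {
                int temp = a[i]; a[i] = a[j]; a[j] = temp;
                i++; j--;
            }
        }
        if (left < j) Sort(a, left, j);
        if (i < right) Sort(a, i, right);

        depth--;
    }

    public static int MaxDepth(int[] a)
    {
        depth = 0; maxDepth = 0;
        Sort(a, 0, a.Length - 1);
        return maxDepth;
    }

    public static void Main()
    {
        var rand = new Random(42);
        var random = new int[1024];
        for (int i = 0; i < random.Length; i++) random[i] = rand.Next();

        var sorted = new int[1024];
        for (int i = 0; i < sorted.Length; i++) sorted[i] = i;

        // Random input typically stays within a small multiple of
        // log2(1024) = 10 levels; presorted input degrades to 1023 levels.
        Console.WriteLine(MaxDepth(random));
        Console.WriteLine(MaxDepth(sorted)); // prints: 1023
    }
}
```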

## Time Complexity of Quicksort Algorithm

Quicksort is a “divide and conquer” algorithm as it subdivides a large unsorted array into smaller partitions that are easily sorted by comparing array elements with their pivots.Â

### Best-Case Time Complexity

Quicksort achieves optimal performance if we always divide the arrays and subarrays into two partitions of equal size by selecting the middle element as the pivot. For example, when we want to sort an array that has four elements, we partition it twice since we use recursion to implement the algorithm.

This becomes quicksort’s **best-case** scenario, which we can explain using this recurrence relation:

`T(N) = 2T(N/2) + O(N)`

From this relation, we can see that the size of the array is halved each time we partition it. This means the number of partitioning levels is **log₂ N**. Therefore, since we have N array elements and log₂ N partitioning levels, the best-case complexity of quicksort is **O(N log N)**.
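Expanding the recurrence a few levels makes the log factor visible (writing the O(N) partitioning cost as cN):

```latex
\begin{aligned}
T(N) &= 2\,T(N/2) + cN \\
     &= 4\,T(N/4) + 2cN \\
     &= 8\,T(N/8) + 3cN \\
     &\;\;\vdots \\
     &= N\,T(1) + cN\log_2 N = O(N \log N)
\end{aligned}
```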

### Average-Case Time Complexity

To come up with an accurate depiction of how quicksort handles average-case complexity, we need to consider all the possible array permutations and calculate the time the algorithm takes to sort each array permutation, which is not easy.Â

However, since the algorithm still has about log₂ N partitioning levels and sorts N elements at each level, the average-case complexity of the quicksort algorithm is **O(N log N)**.

### Worst-Case Time Complexity

This scenario occurs if the selected pivot element is always the smallest or largest element of the array or its partitions. Such a scenario arises, for example, when we keep choosing the first or last element of an already sorted array as the pivot.

The partitioning process splits the array into two unequal partitions. One partition is empty, as all the other elements fall on one side of the pivot. The other partition has a length of N-1 (every element except the pivot).

Therefore, when we calculate the partitioning levels, the algorithm would need N partitioning levels with the partitions having size N, N-1, N-2, etc.Â

Quicksort’s worst-case complexity becomes **O(N²)** as its partitioning effort decreases linearly from N to 0.
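The quadratic bound follows from adding up the comparisons made across those N partitioning levels:

```latex
N + (N-1) + (N-2) + \dots + 1 = \frac{N(N+1)}{2} = O(N^2)
```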

However, in practice, sorting a very large presorted array using this pivot strategy fails as a `StackOverflowException` occurs, which is caused by the deep recursive calls on that array.

One of the strategies we can use to address this problem is shuffling. It introduces randomness into the array, making it possible for the algorithm to work efficiently.
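One common way to implement that shuffle is the Fisher–Yates algorithm. The sketch below (with `ArrayShuffler` as a hypothetical helper name) randomizes the array in place before we hand it to `SortArray()`:

```csharp
using System;

public static class ArrayShuffler
{
    // Fisher–Yates shuffle: randomizes the element order in place, so a
    // presorted input no longer triggers quicksort's worst case.
    public static void Shuffle(int[] array)
    {
        var rand = new Random();
        for (int i = array.Length - 1; i > 0; i--)
        {
            int k = rand.Next(i + 1); // 0 <= k <= i
            int temp = array[i];
            array[i] = array[k];
            array[k] = temp;
        }
    }
}
```

The shuffle only reorders elements, so the multiset of values (and therefore the sorted result) is unchanged.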

## Advantages of Quicksort Algorithm

To start with, the quicksort algorithm is **fast and efficient**, especially in best and average-case time complexity scenarios. This makes it better than other algorithms such as selection sort and bubble sort, which have **O(N²)** time complexity.

Besides being fast and efficient, quicksort is **memory-efficient**. The algorithm sorts all the array elements in place, which makes it ideal for use in applications that have limited memory. On the other hand, other algorithms such as merge sort need auxiliary arrays to store values during the sorting process.

## Disadvantages of Quicksort Algorithm

First, quicksort is an **unstable algorithm**, which means it may not preserve the original order of equal keys. For example, when quicksort encounters two equal elements, their relative order may be reversed as the algorithm sorts them. Therefore, in situations where stability is an important factor, quicksort may not be the best algorithm to use.
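We can see the instability with a small experiment (the `StabilityDemo` class is our own illustration, applying the same partition scheme to (key, label) pairs and comparing keys only):

```csharp
using System;

public static class StabilityDemo
{
    // Quicksort over (key, label) pairs using the same partition scheme
    // as the article's SortArray(), comparing keys only.
    static void Sort((int Key, string Label)[] a, int left, int right)
    {
        int i = left, j = right;
        int pivot = a[left].Key;
        while (i <= j)
        {
            while (a[i].Key < pivot) i++;
            while (a[j].Key > pivot) j--;
            if (i <= j)
            {
                var temp = a[i]; a[i] = a[j]; a[j] = temp;
                i++; j--;
            }
        }
        if (left < j) Sort(a, left, j);
        if (i < right) Sort(a, i, right);
    }

    public static void Main()
    {
        var items = new (int Key, string Label)[] { (2, "a"), (2, "b"), (1, "c") };
        Sort(items, 0, items.Length - 1);

        // The keys end up sorted, but the two equal keys come out as
        // "b" before "a": the original "a" before "b" order is lost.
        Console.WriteLine(string.Join(" ",
            Array.ConvertAll(items, p => $"{p.Key}{p.Label}"))); // prints: 1c 2b 2a
    }
}
```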

Besides being unstable, we can see that quicksort has a worst-case complexity of **O(N²)**. Therefore, although the algorithm is fast and efficient in best and average-case scenarios, it is not efficient when it encounters large arrays that are already sorted.

## Performance Tests

Let’s verify the quicksort algorithm has a time complexity of **O(N log N)**. We are going to accomplish this by measuring the time it takes for the algorithm to sort an array.

For these tests, we are going to use the first array element as the pivot element.

First, let’s write a method to generate a set of random array elements:

```csharp
public static int[] CreateRandomArray(int size)
{
    var array = new int[size];
    var rand = new Random();
    for (int i = 0; i < size; i++)
        array[i] = rand.Next();
    return array;
}
```

The `CreateRandomArray()` method takes an integer as its sole input. Using the built-in `Random` class, we generate integer values that we’re going to put into the array.

Next, we are going to define a method that generates a sequence of elements. This method simulates a scenario where we have a sorted array:

```csharp
public static int[] CreateSortedArray(int size)
{
    var array = new int[size];
    for (int i = 0; i < size; i++)
        array[i] = i;
    return array;
}
```

Next, we are going to create an object that holds different arrays that have random and sorted values:

```csharp
public IEnumerable<object[]> SampleArrays()
{
    yield return new object[] { CreateRandomArray(200), 0, 199, "Small Unsorted" };
    yield return new object[] { CreateRandomArray(2000), 0, 1999, "Medium Unsorted" };
    yield return new object[] { CreateRandomArray(10000), 0, 9999, "Large Unsorted" };
    yield return new object[] { CreateSortedArray(200), 0, 199, "Small Sorted" };
    yield return new object[] { CreateSortedArray(2000), 0, 1999, "Medium Sorted" };
    yield return new object[] { CreateSortedArray(10000), 0, 9999, "Large Sorted" };
}
```

Each object entry has four values: an integer array such as `CreateRandomArray(200)`, the index of the first element in the array (0), the index of the last element in the array (199), and a string storing the name of that array (“Small Unsorted”).

The arrays have different sizes (to simulate the time complexity scenarios) and hold either random numbers added by the `CreateRandomArray()` method or presorted values created by the `CreateSortedArray()` method.

Let’s assess the sample best, average, and worst-case complexity performance results of the algorithm:

| Method | array | leftIndex | rightIndex | arrayName | Mean | Error | StdDev |
|---------- |------------- |---------- |----------- |---------------- |-------------:|-------------:|-------------:|
| SortArray | Int32[200] | 0 | 199 | Small Sorted | 20.80 μs | 0.333 μs | 0.260 μs |
| SortArray | Int32[200] | 0 | 199 | Small Unsorted | 20.19 μs | 0.387 μs | 1.093 μs |
| SortArray | Int32[2000] | 0 | 1999 | Medium Sorted | 1,469.33 μs | 29.042 μs | 69.020 μs |
| SortArray | Int32[2000] | 0 | 1999 | Medium Unsorted | 1,458.40 μs | 29.068 μs | 75.034 μs |
| SortArray | Int32[10000] | 0 | 9999 | Large Sorted | 48,691.38 μs | 2,374.318 μs | 7,000.731 μs |
| SortArray | Int32[10000] | 0 | 9999 | Large Unsorted | 41,336.54 μs | 1,670.869 μs | 4,739.976 μs |

We can see that quicksort performs well when sorting small and medium arrays. However, the sorting time grows faster than the array size: going from the medium to the large unsorted array, a 5-fold increase in size takes roughly 30 times longer to sort.

From the benchmark, we can also see that it takes longer to sort presorted arrays than randomly generated ones. At some point, quicksort’s worst-case time complexity may come into play, resulting in the algorithm throwing a `StackOverflowException`.

## Conclusion

In this article, we have learned how quicksort works in C#. It uses recursion and the “divide and conquer” strategy to sort efficiently. Just like other algorithms, it has its strengths and weaknesses.
