Recursive Quicksort in C:
What is a sorting algorithm?⌗
Sorting algorithms are among the most basic concepts of computer science. A sorting algorithm is an algorithm that puts the elements of a list (or array) in a certain order. There is a plethora of sorting algorithms available.
Examples are⌗
Bubble sort, Radix sort, Insertion sort, Selection sort, Quick sort, Merge sort etc.
Why do we need sorting algorithm?⌗
So, the first and foremost question is: why is sorting important? Sorting algorithms are necessary because sorted data is the prerequisite input to many other algorithms. For instance, binary search needs sorted data to perform its search, and many database algorithms require sorted data. (Knowledge of recursion is a prerequisite for Quicksort.) Let's have a brief overview of recursion.
What is recursion?⌗
The process of a function calling itself until a base condition is satisfied is called recursion; each such call of the function is called a recursive call.
Quicksort:⌗
The “Quick” in Quicksort does not necessarily mean that it is the fastest; to be specific, Quicksort is usually fast in practice although it has worst-case O(n^2) behaviour. Quicksort follows the divide-and-conquer strategy: it works by recursively breaking the problem down into two or more subproblems of the same or a related type, until the subproblems become simple enough to be solved directly. There are many different ways to implement Quicksort:
 We can pick the first element as pivot.
 We can pick the last element as pivot.
 We can pick median as pivot.
 We can pick pivot randomly.
Mostly we pick the first or last element as pivot; in this article we will implement Quicksort by picking the last element as pivot. Now let's have a glance at the Quicksort program implemented in C, and then we will walk through it.
Code:⌗
#include <stdio.h>

// A function to swap two elements using pointers
void swap(int* a, int* b)
{
    int t = *a;
    *a = *b;
    *b = t;
}

int partition(int arr[], int low, int high)
{
    int pivot = arr[high]; // pivot
    int i = (low - 1);     // Index of smaller element
    for (int j = low; j <= high - 1; j++)
    {
        // If current element is smaller than the pivot
        if (arr[j] < pivot)
        {
            // increment index of smaller element
            i++;
            swap(&arr[i], &arr[j]);
        }
    }
    swap(&arr[i + 1], &arr[high]);
    return (i + 1);
}

void quickSort(int arr[], int low, int high)
{
    if (low < high)
    {
        // pi is the partitioning index; arr[pi] is now at the right place
        int pi = partition(arr, low, high);
        // Separately sort elements before and after the partition
        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}

void printArray(int arr[], int size)
{
    for (int i = 0; i < size; i++)
        printf("%d ", arr[i]);
    printf("\n");
}

int main()
{
    int arr[] = {10, 7, 8, 9, 1, 5};
    int n = sizeof(arr) / sizeof(arr[0]);
    quickSort(arr, 0, n - 1);
    printf("Sorted array: \n");
    printArray(arr, n);
    return 0;
}
Output of the above program:⌗
Sorted array:
1 5 7 8 9 10
Explanation of the Quicksort algorithm:⌗
We select the last element as pivot, and the pivot has to find its place in the array. That means all the elements smaller than the pivot should end up on its left side, and all the elements greater than the pivot should end up on its right side.
 Array: 10 80 30 90 40 50 70
 Indexes: 0 1 2 3 4 5 6
In this instance the pivot is 70. We then find its correct position. After the pivot reaches its correct place, the array becomes:
 Array: 10, 30, 40, 50, 70, 90, 80
Here 70 is sorted, as all the elements smaller than 70 are on its left-hand side and all the elements greater than 70 are on its right-hand side. The whole array gets partitioned at the pivot's index and we get two subarrays.
 Left subarray: 10 30 40 50
 Right subarray: 90 80
Now we have to sort both subarrays. To do so, we perform the same operation on each of them: Quicksort is applied recursively to the left-hand side and the right-hand side (even though an iterative approach to Quicksort is also possible). This entire procedure is called the partitioning procedure; hence, Quicksort uses the partitioning procedure to sort the elements.
Time complexity of quicksort:⌗
 Average case: O(nlogn)
 Worst case: O(n^2)
 Best case: O(nlogn)
Is quick sort an inplace sorting algorithm?⌗
In-place means the input and output occupy the same memory, so there is no copying of input to output. Quicksort is an in-place sorting algorithm, as it does not require partial or complete copies of the data being sorted. But that does not mean Quicksort requires zero additional memory. We have implemented Quicksort using recursion, so it consumes O(logn) stack space on average. Note: it does not require any heap space.
Can we implement Quicksort using loops?⌗
We can also implement Quicksort by using loops instead of recursion. Partitioning must be performed in both the iterative and the recursive versions. Here we will limit ourselves to recursive Quicksort.
Why Quicksort is better than Merge sort?⌗
This is a common question asked in many software engineering interviews. Quicksort is often considered better than Merge sort even though Merge sort has a better worst-case performance. There are certain reasons why Quicksort is preferred:

The worst case of Quicksort, O(n^2), can be avoided by using randomized pivot selection. If we use randomized pivot selection then there is a very high probability that we will obtain the average case O(nlogn), which makes randomized Quicksort as efficient as Merge sort.

Merge sort requires extra space, whereas Quicksort requires only a little. Quicksort needs no additional space beyond the recursion stack, but Merge sort requires a temporary array to merge the sorted subarrays. So Quicksort has a space advantage over Merge sort.