Analysis of Sorting Algorithms Runtime (2024)

In this article, we will analyze the runtime of several sorting algorithms through tests with arrays of different sizes. The results will be presented graphically, and at the end we will check whether they match the computational complexity of each tested algorithm, expressed in Big O notation. The sorting algorithms analyzed and compared are:

  • Bubble Sort;
  • Selection Sort;
  • Insertion Sort;
  • Shell Sort;
  • Quick Sort;
  • Merge Sort.

All of these algorithms sort arrays of different sizes and value types, but they differ in performance and typical usage scenarios; at the end of the article we will see how efficient they are relative to each other.

Contents

  • What is computational complexity?
  • What is Big O notation?
  • Bubble Sort
  • Selection Sort
  • Insertion Sort
  • Shell Sort
  • Quick Sort
  • Merge Sort
  • How were the tests conducted and what are the results?
  • Analyzing the Results
  • Conclusion

But before we explain how the analyzed algorithms work, we need to explain what computational complexity is. Computational complexity analyzes the efficiency of an algorithm based on its execution time and the space it occupies in memory. However, an algorithm does not show the same measured efficiency on every device, since the measurement depends, for example, on the machine's processor. To get around this, algorithmic complexity can be expressed using special notations, and the notation we will use in this article is Big O.

For a better understanding of the rest of the article, it is also necessary to know what Big O notation is. Big O notation describes the complexity of an algorithm through the asymptotic growth of its execution time. With it we can classify the best and worst cases of an algorithm using the following Big O complexities, listed here from best to worst:

  • O(1) -> Constant;
  • O(log n) -> Logarithmic;
  • O(n) -> Linear;
  • O(n log n) -> Log-linear;
  • O(n²) -> Quadratic;
  • O(2^n) -> Exponential;
  • O(n!) -> Factorial.

The algorithms analyzed fall between O(n) and O(n²) complexities. It is worth highlighting that there is no sorting algorithm with O(1) complexity: a constant-time algorithm by definition does not depend on the size of its input, while sorting an array of variable size necessarily does.

The Bubble Sort algorithm is structurally the simplest and easiest to understand, which is why it is usually the first sorting algorithm presented to beginner programmers. It works by comparing pairs of adjacent numbers from left to right: if the number on the right is smaller than the one on the left, they swap positions, and this process is repeated until the entire array is sorted. This structure gives Bubble Sort an algorithmic complexity of O(n²) in both the best and worst cases.
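A minimal C++ sketch of this idea (not the exact code used in the tests, which is in the repository linked at the end) could look like this:

```cpp
#include <utility>
#include <vector>

// Bubble sort: repeatedly compare adjacent pairs and swap when out of
// order. Without an early-exit flag every pass is always performed,
// so best and worst case are both O(n^2).
void bubbleSort(std::vector<int>& a) {
    for (std::size_t i = 0; i + 1 < a.size(); ++i)          // passes
        for (std::size_t j = 0; j + 1 < a.size() - i; ++j)  // adjacent pairs
            if (a[j + 1] < a[j])
                std::swap(a[j], a[j + 1]);
}
```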

The Selection Sort also has a simple structure; its goal is to repeatedly find the smallest value in the unsorted portion of the array and swap it into place. It initially treats the first unsorted number as the smallest, then traverses the rest of the array looking for a smaller value, stores its index, and swaps the two positions. This process repeats until the array is sorted. However, because it traverses the remaining array regardless of whether it is already sorted, its algorithmic complexity is O(n²) in both the best and worst cases.
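A sketch of that selection step, under the same caveat that the authors' actual implementation may differ:

```cpp
#include <utility>
#include <vector>

// Selection sort: find the smallest remaining value and swap it into
// position i. The scan over the unsorted part happens even when the
// array is already sorted, hence O(n^2) in best and worst case alike.
void selectionSort(std::vector<int>& a) {
    for (std::size_t i = 0; i + 1 < a.size(); ++i) {
        std::size_t min = i;                     // assume a[i] is the smallest
        for (std::size_t j = i + 1; j < a.size(); ++j)
            if (a[j] < a[min])
                min = j;                         // remember a smaller value's index
        std::swap(a[i], a[min]);
    }
}
```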

The Insertion Sort is often compared to sorting cards in one hand, where the cards are divided into a sorted and an unsorted group, and you keep moving unsorted cards into place until all of them are in order. The main characteristic of Insertion Sort is therefore comparing a value with the values before it until that value reaches its correct position. In the best case, when the array is already sorted, the algorithm has a complexity of O(n); in the worst case, when the array is in descending order, it has a complexity of O(n²).
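One common way to write this in C++ (again, a sketch rather than the tested code):

```cpp
#include <vector>

// Insertion sort: each value is compared with its predecessors and
// larger values are shifted right until the value fits. A sorted array
// needs one comparison per element (O(n)); a descending array forces
// the maximum number of shifts (O(n^2)).
void insertionSort(std::vector<int>& a) {
    for (std::size_t i = 1; i < a.size(); ++i) {
        int key = a[i];
        std::size_t j = i;
        while (j > 0 && a[j - 1] > key) {
            a[j] = a[j - 1];   // shift larger value one slot right
            --j;
        }
        a[j] = key;            // insert value in its correct position
    }
}
```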

The Shell Sort is quite similar to the Insertion Sort, to the point that its final pass over the array is an Insertion Sort itself. The difference is that Shell Sort has a controller, usually called the Gap: the interval between the values that will be compared. Initially the Gap is the size of the array divided by 2, and with each iteration the previous Gap is divided by 2 again, until the Gap equals 1. With this gap sequence, the complexity of Shell Sort is O(n log n) in the best case and O(n²) in the worst case.
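A sketch using exactly that halving gap sequence:

```cpp
#include <vector>

// Shell sort with the gap sequence described above: start at n/2 and
// halve until the gap reaches 1, at which point the final pass is a
// plain insertion sort.
void shellSort(std::vector<int>& a) {
    for (std::size_t gap = a.size() / 2; gap > 0; gap /= 2) {
        // gapped insertion sort: compare elements `gap` positions apart
        for (std::size_t i = gap; i < a.size(); ++i) {
            int key = a[i];
            std::size_t j = i;
            while (j >= gap && a[j - gap] > key) {
                a[j] = a[j - gap];   // shift the distant larger value right
                j -= gap;
            }
            a[j] = key;
        }
    }
}
```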

The Quick Sort is one of the fastest sorting algorithms, along with the Merge Sort, which is presented next. Quick Sort uses recursive functions to sort an array, meaning it calls itself when necessary. Its main technique, however, is partitioning: choosing some value, called the pivot, and placing it in its correct position while ensuring that the values to its left are smaller than it and the values to its right are larger. This process repeats recursively on each partition until the array is sorted. The complexity of Quick Sort is O(n log n) in the best case and O(n²) in the worst case.
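A sketch of partitioning and the recursion; the article does not say which pivot the authors chose, so this example assumes the common choice of the last element:

```cpp
#include <utility>
#include <vector>

// Partition around a pivot (here the last element): smaller values end
// up to its left, larger values to its right, and the pivot lands in
// its final sorted position.
int partition(std::vector<int>& a, int lo, int hi) {
    int pivot = a[hi];
    int i = lo - 1;
    for (int j = lo; j < hi; ++j)
        if (a[j] < pivot)
            std::swap(a[++i], a[j]);
    std::swap(a[i + 1], a[hi]);
    return i + 1;                    // pivot's final index
}

// Recursively sort the partitions on each side of the pivot.
// Call as quickSort(a, 0, (int)a.size() - 1).
void quickSort(std::vector<int>& a, int lo, int hi) {
    if (lo < hi) {
        int p = partition(a, lo, hi);
        quickSort(a, lo, p - 1);
        quickSort(a, p + 1, hi);
    }
}
```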

The Merge Sort also uses recursive functions in its structure, focusing on dividing the array into sublists, a method commonly known as "divide and conquer." It divides the array using controllers called Left, Right, and Mid, where Mid equals the sum of Left and Right divided by 2. The division continues until the sublists are too small to split further; the algorithm then merges them back together in sorted order, level by level, until the original array is sorted. The complexity of Merge Sort is O(n log n) in both the best and worst cases.
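A sketch with the Left/Right/Mid controllers named as in the description above:

```cpp
#include <vector>

// Merge the two already-sorted halves a[left..mid] and a[mid+1..right].
void merge(std::vector<int>& a, int left, int mid, int right) {
    std::vector<int> tmp;
    tmp.reserve(right - left + 1);
    int i = left, j = mid + 1;
    while (i <= mid && j <= right)               // take the smaller head
        tmp.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
    while (i <= mid)   tmp.push_back(a[i++]);    // drain leftovers
    while (j <= right) tmp.push_back(a[j++]);
    for (std::size_t k = 0; k < tmp.size(); ++k)
        a[left + k] = tmp[k];                    // copy back in order
}

// Divide with Mid = (Left + Right) / 2, recurse, then merge.
// Call as mergeSort(a, 0, (int)a.size() - 1).
void mergeSort(std::vector<int>& a, int left, int right) {
    if (left >= right) return;                   // single-element sublist
    int mid = (left + right) / 2;
    mergeSort(a, left, mid);
    mergeSort(a, mid + 1, right);
    merge(a, left, mid, right);
}
```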

The tests were conducted using the code created by the authors of this article, which was developed in C++. The code can be found at the link provided at the end of this article. The tests to obtain the execution time of the algorithms were conducted on a single machine. The elements of the arrays were randomly generated, and for each array input size, the code was executed again. In each execution, the sorting algorithms were called 5 times — a value that can be modified in the code — to increase result accuracy by providing an average of the execution time for each algorithm.
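The measurement code itself is in the linked repository; a hypothetical harness in the same spirit (random input, 5 runs, averaged milliseconds) might look like this, with std::sort standing in for any of the six algorithms:

```cpp
#include <algorithm>
#include <chrono>
#include <iostream>
#include <random>
#include <vector>

int main() {
    const std::size_t n = 100000;  // one of the tested input sizes
    const int RUNS = 5;            // repeat count used in the tests

    // Randomly generated input, as described above.
    std::mt19937 rng(std::random_device{}());
    std::uniform_int_distribution<int> dist(0, 1000000);
    std::vector<int> input(n);
    for (auto& v : input) v = dist(rng);

    double totalMs = 0.0;
    for (int r = 0; r < RUNS; ++r) {
        std::vector<int> copy = input;           // sort a fresh copy each run
        auto start = std::chrono::steady_clock::now();
        std::sort(copy.begin(), copy.end());     // stand-in for one algorithm
        auto end = std::chrono::steady_clock::now();
        totalMs += std::chrono::duration<double, std::milli>(end - start).count();
    }
    std::cout << "average: " << totalMs / RUNS << " ms\n";
}
```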

The following table shows the input sizes tested and the average execution time of each algorithm for each size; note that time was measured in milliseconds (ms):

[Table: average execution time (ms) of each algorithm per input size]

For a better visualization of the data above, we can graphically represent the same results. The y-axis represents the time in milliseconds, while the x-axis represents the size of the input array:

[Chart: execution time (ms) on the y-axis versus input array size on the x-axis, one line per algorithm]

The runtime results of the algorithms align with the complexities presented at the start. Bubble Sort is the worst-performing algorithm, followed by Selection Sort and then Insertion Sort, the most effective of the three, consistent with the O(n²) complexity of Bubble Sort and Selection Sort and with Insertion Sort's O(n) best case. There is, however, a notable performance gap between these three and Quick Sort, Merge Sort, and Shell Sort, consistent with the latter group's O(n log n) behavior. Those faster algorithms are barely visible on the graph, represented only by a small green line at the bottom.

As a point of comparison, while Bubble Sort averaged 3 minutes of execution when the input array held 100,000 elements, Quick Sort averaged 0.6 seconds with the same input size. Merge Sort averaged 0.1 seconds, and the fastest was Shell Sort at 81 ms under the same conditions.

The poor performance of Bubble Sort and Selection Sort is explained by their structure, which necessarily traverses the entire array even when it is already sorted. Among the fastest algorithms, Quick Sort and Merge Sort owe their performance to the "divide and conquer" tactic of splitting the array into sublists, while Shell Sort gains its speed by first comparing and moving elements that are far apart, leaving little work for its final insertion-sort pass.

The tests conducted confirm the complexities of the analyzed sorting algorithms. However, possibly due to an implementation issue, Quick Sort was not the fastest of its group, trailing both Merge Sort and Shell Sort. More tests and adjustments are needed for a more accurate measurement of Quick Sort's execution time. The code used for the tests can be found here: https://github.com/ManoelIvisson/execution_time_of_sorting_algorithms.

Authors: Manoel Ívisson de Araújo Ferreira and Yago Cortez Rodrigues Fernandes
