Time and Space Complexity Table for Sorting Algorithms

Understanding the time and space complexities of sorting algorithms is crucial when deciding which algorithm to use for a given problem. The time complexity measures how an algorithm’s running time increases as the size of the input grows, while the space complexity tracks the amount of memory the algorithm requires during execution. Different algorithms are better suited for different kinds of data, whether it’s small, large, already sorted, or randomly arranged.
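To make these two measures concrete, here is a minimal Python sketch (an illustration, not a reference implementation) of insertion sort: it sorts in place, so its extra memory is O(1), and its running time ranges from O(n) on already-sorted input, where the inner loop never shifts anything, to O(n²) on reverse-sorted input.

```python
def insertion_sort(values):
    """Sort a list in place.

    Time:  O(n) best case (already sorted), O(n^2) average and worst case.
    Space: O(1) extra, because elements are shifted within the input list.
    """
    for i in range(1, len(values)):
        key = values[i]
        j = i - 1
        # Shift each larger element one slot to the right,
        # then drop the key into the gap that opens up.
        while j >= 0 and values[j] > key:
            values[j + 1] = values[j]
            j -= 1
        values[j + 1] = key
    return values


print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```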

Below is a table comparing the time complexity and space complexity of popular sorting algorithms. This table provides a clear view of how these algorithms perform in the best, average, and worst-case scenarios, as well as how much additional memory they require.

By referencing this table, you can make informed decisions about which sorting algorithm to use based on the specific constraints of your problem, such as time efficiency or memory usage.

Algorithm       | Best Case  | Average Case | Worst Case | Space Complexity
----------------|------------|--------------|------------|-----------------
Insertion Sort  | O(n)       | O(n²)        | O(n²)      | O(1)
Selection Sort  | O(n²)      | O(n²)        | O(n²)      | O(1)
Bubble Sort     | O(n)       | O(n²)        | O(n²)      | O(1)
Cocktail Sort   | O(n)       | O(n²)        | O(n²)      | O(1)
Merge Sort      | O(n log n) | O(n log n)   | O(n log n) | O(n)
Quick Sort      | O(n log n) | O(n log n)   | O(n²)      | O(log n)
Heap Sort       | O(n log n) | O(n log n)   | O(n log n) | O(1)
TimSort         | O(n)       | O(n log n)   | O(n log n) | O(n)
Counting Sort   | O(n + k)   | O(n + k)     | O(n + k)   | O(k)
Radix Sort      | O(nk)      | O(nk)        | O(nk)      | O(n + k)
Bucket Sort     | O(n + k)   | O(n + k)     | O(n²)      | O(n)
Pigeonhole Sort | O(n + k)   | O(n + k)     | O(n + k)   | O(n + k)

Here n is the number of elements being sorted; k denotes the range of key values (Counting and Pigeonhole Sort), the number of buckets (Bucket Sort), or the number of digits in the largest key (Radix Sort).
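As a rough sketch of how the space column plays out in code (illustrative only; the counting sort assumes non-negative integer keys no larger than a bound k), merge sort below allocates new lists while merging, which is where its O(n) auxiliary space comes from, whereas counting sort spends O(k) memory on a count array and never compares elements at all.

```python
def merge_sort(values):
    # O(n log n) time in every case; the merge step copies into new lists,
    # which is the source of the O(n) auxiliary space in the table above.
    if len(values) <= 1:
        return values
    mid = len(values) // 2
    left = merge_sort(values[:mid])
    right = merge_sort(values[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged


def counting_sort(values, k):
    # O(n + k) time and an O(k) count array; assumes integer keys in [0, k].
    counts = [0] * (k + 1)
    for v in values:
        counts[v] += 1
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)
    return result


print(merge_sort([5, 2, 9, 1]))        # [1, 2, 5, 9]
print(counting_sort([3, 0, 3, 2], 3))  # [0, 2, 3, 3]
```

The trade-off is visible in the signatures: merge sort works for any comparable elements but pays memory for merging, while counting sort is faster when the key range k is small relative to n but becomes impractical when keys span a huge range.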