Understanding the time and space complexities of sorting algorithms is crucial when deciding which algorithm to use for a given problem. The **time complexity** measures how an algorithm’s running time increases as the size of the input grows, while the **space complexity** tracks the amount of memory the algorithm requires during execution. Different algorithms are better suited for different kinds of data, whether it’s small, large, already sorted, or randomly arranged.

Below is a table comparing the **time complexity** and **space complexity** of popular sorting algorithms. This table provides a clear view of how these algorithms perform in the best, average, and worst-case scenarios, as well as how much additional memory they require.

By referencing this table, you can make informed decisions about which sorting algorithm to use based on the specific constraints of your problem, such as time efficiency or memory usage.

| Algorithm | Best Case | Average Case | Worst Case | Space Complexity |
|---|---|---|---|---|
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) |
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) |
| Cocktail Sort | O(n) | O(n²) | O(n²) | O(1) |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) |
| TimSort | O(n) | O(n log n) | O(n log n) | O(n) |
| Counting Sort | O(n + k) | O(n + k) | O(n + k) | O(k) |
| Radix Sort | O(nk) | O(nk) | O(nk) | O(n + k) |
| Bucket Sort | O(n + k) | O(n + k) | O(n²) | O(n) |
| Pigeonhole Sort | O(n + k) | O(n + k) | O(n + k) | O(n + k) |
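
To make two contrasting rows of the table concrete, here is a minimal Python sketch of Insertion Sort and Counting Sort. Insertion Sort is comparison-based: it runs in O(n) on already-sorted input (the inner loop never shifts anything) but degrades to O(n²) in the worst case, using O(1) extra space. Counting Sort is not comparison-based: it achieves O(n + k) time by spending O(k) space on a count array, where k is the size of the key range. The function names and the `k` parameter are illustrative choices, not a standard API.

```python
def insertion_sort(items):
    """Comparison-based sort: O(n) best case (already sorted),
    O(n²) average/worst case, O(1) auxiliary space."""
    a = list(items)  # copy so the caller's sequence is untouched
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one slot right to make room for key.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a


def counting_sort(items, k):
    """Non-comparison sort for integers in range [0, k):
    O(n + k) time, O(k) auxiliary space for the count array."""
    counts = [0] * k
    for x in items:
        counts[x] += 1
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)
    return result


print(insertion_sort([5, 2, 4, 1]))       # [1, 2, 4, 5]
print(counting_sort([3, 0, 2, 3, 1], 4))  # [0, 1, 2, 3, 3]
```

Note the trade-off the table summarizes: Counting Sort is faster than any comparison sort when k is small relative to n, but its O(k) memory cost makes it impractical when keys span a huge range.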