
Quick Sort: How It Works Step by Step

Quick sort is one of the fastest sorting algorithms in practice. Learn how pivot selection and partitioning work, understand its O(n log n) average performance, and see it in action.

CS Visualizations · March 22, 2026 · 7 min


Quick sort is the algorithm many programming languages reach for under the hood when you call .sort(). It's fast, memory-efficient, and elegant — a masterclass in divide-and-conquer problem solving.

Let's break down exactly how it works.

The Core Idea: Divide and Conquer

Quick sort follows a three-step strategy:

  1. Choose a pivot — pick one element from the array
  2. Partition — rearrange elements so everything smaller than the pivot is on the left, everything larger is on the right
  3. Recurse — apply quick sort to the left and right sub-arrays

After partitioning, the pivot is in its final sorted position. We then recursively sort the elements to its left and right. Eventually, every element has been a pivot, and the array is sorted.
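The three steps translate almost directly into code. Here's a minimal, non-in-place sketch for clarity (the in-place version used in practice appears later in the article):

```python
def quick_sort_simple(arr):
    """Direct translation of the three steps (allocates new lists; for clarity only)."""
    if len(arr) <= 1:
        return arr                               # base case: nothing to sort
    pivot = arr[-1]                              # 1. choose a pivot
    left = [x for x in arr[:-1] if x <= pivot]   # 2. partition: smaller-or-equal left,
    right = [x for x in arr[:-1] if x > pivot]   #    larger right
    return quick_sort_simple(left) + [pivot] + quick_sort_simple(right)  # 3. recurse
```

This version copies sub-lists on every call, so it isn't what libraries ship, but it makes the recursive structure obvious.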

A Concrete Example

Let's sort [8, 3, 1, 7, 5, 6, 2, 4] using the last element as the pivot:

[Figure: partitioning [8, 3, 1, 7, 5, 6, 2, 4] around pivot 4]
Quick sort picks a pivot (purple) and partitions: elements ≤ pivot go left (green), elements > pivot go right.

Step 1: Pivot = 4

Partition the array around 4:

  • Elements ≤ 4: [3, 1, 2]
  • Pivot: [4]
  • Elements > 4: [8, 7, 5, 6]

Result: [3, 1, 2, **4**, 8, 7, 5, 6] — 4 is in its final position (index 3). (For readability, this walkthrough keeps each side's elements in their original relative order; a real in-place partition may reorder them within each side.)

[Figure: recursive partitioning of the left and right sub-arrays]
Quick sort recursively partitions sub-arrays. Each pivot (purple) lands in its final position.

Step 2: Sort left sub-array [3, 1, 2]

Pivot = 2. Partition:

  • Elements ≤ 2: [1]
  • Pivot: [2]
  • Elements > 2: [3]

Result: [1, **2**, 3]

Step 3: Sort right sub-array [8, 7, 5, 6]

Pivot = 6. Partition:

  • Elements ≤ 6: [5]
  • Pivot: [6]
  • Elements > 6: [8, 7]

Result: [5, **6**, 8, 7]

Continue recursively on [8, 7]: Pivot = 7. Result: [**7**, 8]

Final result: [1, 2, 3, 4, 5, 6, 7, 8]

The Partitioning Algorithm

The partition step is where the real work happens. Here's the Lomuto partition scheme:

def partition(arr, low, high):
    pivot = arr[high]  # Choose last element as pivot
    i = low - 1        # Index of smaller element boundary
 
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]  # Swap
 
    arr[i + 1], arr[high] = arr[high], arr[i + 1]  # Place pivot
    return i + 1  # Return pivot's final position

The variable i tracks the boundary between "elements ≤ pivot" and "elements > pivot." As j scans through the array, whenever we find an element ≤ pivot, we swap it into the left partition.
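Running this scheme on our example array shows the boundary in action. Note that the elements greater than the pivot end up in a different internal order ([5, 6, 8, 7]) than in the conceptual walkthrough above; that's fine, since only the pivot's final position is guaranteed:

```python
def partition(arr, low, high):
    pivot = arr[high]  # choose last element as pivot
    i = low - 1        # boundary of the "<= pivot" region
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]  # place pivot
    return i + 1

arr = [8, 3, 1, 7, 5, 6, 2, 4]
p = partition(arr, 0, len(arr) - 1)
print(p, arr)  # 3 [3, 1, 2, 4, 5, 6, 8, 7]
```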

The Full Algorithm

def quick_sort(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        pivot_idx = partition(arr, low, high)
        quick_sort(arr, low, pivot_idx - 1)   # Sort left
        quick_sort(arr, pivot_idx + 1, high)   # Sort right
    return arr

The elegance is in its simplicity — just partition and recurse. No merging step needed (unlike merge sort) because the partition puts elements in the right relative position.

Time Complexity

| Case | Complexity | When |
| --- | --- | --- |
| Best | O(n log n) | Pivot always splits evenly |
| Average | O(n log n) | Random data |
| Worst | O(n²) | Already sorted input + poor pivot choice |

Why O(n log n) on average?

Each partition step does O(n) work — it scans every element once. If the pivot splits the array roughly in half each time, we get log n levels of recursion. Total: n × log n.
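The even-split case can be stated as the standard divide-and-conquer recurrence:

```latex
T(n) = 2\,T(n/2) + \Theta(n) \quad\Longrightarrow\quad T(n) = \Theta(n \log n)
```

The Θ(n) term is the partition scan at each level; the conclusion follows from case 2 of the Master theorem.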

When does O(n²) happen?

If the pivot is always the smallest or largest element, one partition is empty and the other has n-1 elements. Instead of log n levels, we get n levels. This happens when:

  • The array is already sorted and you always pick the first/last element as pivot
  • All elements are identical (with the ≤ comparison in Lomuto's scheme, every element lands on one side of the pivot)
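You can watch the blow-up by counting comparisons. A sketch, assuming last-element pivots (the `count_comparisons` helper is ours, for illustration only):

```python
import random

def count_comparisons(arr):
    """Quick sort (Lomuto, last-element pivot) that counts comparisons."""
    comparisons = 0

    def sort(low, high):
        nonlocal comparisons
        if low >= high:
            return
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            comparisons += 1
            if arr[j] <= pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        p = i + 1
        sort(low, p - 1)
        sort(p + 1, high)

    sort(0, len(arr) - 1)
    return comparisons

print(count_comparisons(list(range(100))))  # 4950 == 100*99/2: fully quadratic
random.seed(0)
print(count_comparisons(random.sample(range(100), 100)))  # far fewer on shuffled data
```

On the sorted input every partition is maximally lopsided, so the counts sum to n(n-1)/2.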

Preventing worst-case

  • Randomized pivot: Pick a random element instead of always using the last
  • Median-of-three: Use the median of the first, middle, and last elements
  • Introsort: Switch to heap sort if recursion gets too deep (what most standard libraries use)
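Randomized pivoting is a small change to the in-place version: swap a random element into the pivot slot before partitioning as usual. A sketch:

```python
import random

def randomized_quick_sort(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        # Swap a random element into the pivot slot, then do Lomuto as usual.
        r = random.randint(low, high)
        arr[r], arr[high] = arr[high], arr[r]
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            if arr[j] <= pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        p = i + 1
        randomized_quick_sort(arr, low, p - 1)
        randomized_quick_sort(arr, p + 1, high)
    return arr
```

With random pivots, already-sorted input no longer triggers the quadratic case (in expectation); no fixed input can reliably provoke it.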

Space Complexity

Quick sort is in-place — it doesn't create copies of the array. But it uses the call stack for recursion:

  • Average: O(log n) stack space
  • Worst: O(n) stack space

This is better than merge sort's O(n) auxiliary space for the merge buffer.
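If worst-case stack depth matters, a standard trick bounds it at O(log n): recurse only into the smaller partition and loop over the larger one. A sketch of that variant:

```python
def quick_sort_bounded(arr, low=0, high=None):
    """Recurse into the smaller side, loop over the larger side.

    Each recursive call handles at most half the elements, so the stack
    depth is O(log n) even when partitions are maximally lopsided.
    """
    if high is None:
        high = len(arr) - 1
    while low < high:
        # Lomuto partition, last element as pivot
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            if arr[j] <= pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        p = i + 1
        if p - low < high - p:
            quick_sort_bounded(arr, low, p - 1)  # smaller side: recurse
            low = p + 1                          # larger side: loop
        else:
            quick_sort_bounded(arr, p + 1, high)
            high = p - 1
    return arr

print(quick_sort_bounded([5, 2, 9, 1]))  # [1, 2, 5, 9]
```

The running time is unchanged; only the stack usage is capped.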

Quick Sort vs Merge Sort

These two O(n log n) algorithms are often compared:

| Aspect | Quick Sort | Merge Sort |
| --- | --- | --- |
| Average time | O(n log n) | O(n log n) |
| Worst time | O(n²) | O(n log n) |
| Space | O(log n) | O(n) |
| Stable? | No | Yes |
| Cache-friendly? | Yes | Less so |
| In practice | Usually faster | Guaranteed consistent |

Quick sort wins in practice because:

  1. Its inner loop is simple (just comparisons and swaps)
  2. It's cache-friendly — elements are accessed sequentially
  3. The O(n²) worst case is easily avoided with good pivot selection
  4. No extra memory allocation needed

Merge sort wins when:

  1. You need a guaranteed O(n log n) worst case
  2. Stability matters (preserving order of equal elements)
  3. You're sorting linked lists (merge sort is naturally suited)

Pivot Selection Strategies

The pivot choice dramatically affects performance:

Last element (Lomuto): Simple but worst-case on sorted data.

Random element: Excellent average case. Makes worst-case astronomically unlikely.

Median-of-three: Take the median of the first, middle, and last elements. Good balance of simplicity and performance.
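Median-of-three can reuse the last-element partition unchanged: sort the three candidates in place, then stash the median in the pivot slot. A sketch (`median_of_three` is our illustrative name, not a standard API):

```python
def median_of_three(arr, low, high):
    """Pick the median of arr[low], arr[mid], arr[high] as the pivot and
    move it to arr[high], so a standard last-element partition can follow."""
    mid = (low + high) // 2
    # Sort the three sentinel positions in place.
    if arr[mid] < arr[low]:
        arr[low], arr[mid] = arr[mid], arr[low]
    if arr[high] < arr[low]:
        arr[low], arr[high] = arr[high], arr[low]
    if arr[high] < arr[mid]:
        arr[mid], arr[high] = arr[high], arr[mid]
    # The median now sits at mid; stash it in the pivot slot.
    arr[mid], arr[high] = arr[high], arr[mid]
    return arr[high]

arr = [9, 1, 5, 7, 3]
print(median_of_three(arr, 0, len(arr) - 1))  # 5: the median of 9, 5, 3
```

On an already-sorted array the median of first/middle/last is the true median, so this scheme turns the classic worst case into the best case.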

Ninther: Median of three medians-of-three. Used in some high-performance implementations.

Why Quick Sort is King in Practice

Despite merge sort's better worst-case guarantee, quick sort variants power many standard library sorts:

  • C's qsort() — implementation-defined; often a quicksort variant or introsort
  • C++ std::sort() — introsort (quick sort with a heap sort fallback)
  • Java Arrays.sort() for primitives — dual-pivot quick sort
  • Python sorted() — the notable exception: Timsort, a merge-sort/insertion-sort hybrid

The constant factors in quick sort are small, cache performance is excellent, and the worst case is easily avoided.


See It In Action

Quick sort's recursive partitioning is much easier to understand when you can see it happening. Our interactive visualization shows:

  • The pivot being selected and highlighted
  • Elements being compared and swapped during partitioning
  • The recursive breakdown into smaller and smaller sub-arrays
  • Each pivot landing in its final sorted position

Step through it at your own pace, or watch it run at full speed to appreciate its efficiency.
