[Controls: algorithm selector, array size (50), speed (medium). The stats panel tracks comparisons, swaps, and array accesses, and lists the selected algorithm's complexity: best O(n), average O(n²), worst O(n²), space O(1), stable.]

What you are seeing

Each vertical bar represents a number. The array is shuffled randomly, then the chosen sorting algorithm rearranges the bars from shortest to tallest. Gold highlights mark elements currently being compared. Orange flashes indicate swaps. Dimmer bars have already reached their final sorted position.

The algorithms

Bubble sort repeatedly walks through the array, swapping adjacent elements that are out of order. Simple but slow: O(n²) comparisons in the average case. Insertion sort builds a sorted prefix one element at a time, sliding each new element left into its correct position. Excellent on nearly-sorted data. Selection sort finds the minimum of the unsorted region on each pass and moves it to the front of that region.
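The three quadratic sorts can be sketched in a few lines of Python. This is a minimal illustration, not the visualizer's own code; the function names are chosen here for clarity:

```python
def bubble_sort(a):
    """Repeatedly swap adjacent out-of-order pairs until a pass makes no swaps."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):        # the last i elements are already in place
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:                   # early exit gives the O(n) best case
            break
    return a

def insertion_sort(a):
    """Slide each element left into its place within the sorted prefix."""
    a = list(a)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]               # shift larger elements one slot right
            j -= 1
        a[j + 1] = key
    return a

def selection_sort(a):
    """Find the minimum of the unsorted region and swap it to the front."""
    a = list(a)
    for i in range(len(a)):
        m = min(range(i, len(a)), key=a.__getitem__)
        a[i], a[m] = a[m], a[i]
    return a
```

Note the `swapped` flag in bubble sort: it is what turns a pass over already-sorted data into the O(n) best case shown in the stats panel.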

Merge sort divides the array in half recursively, then merges the sorted halves back together: guaranteed O(n log n), but it needs O(n) extra memory for the merge. Quick sort picks a pivot and partitions elements around it; O(n log n) on average, but it degrades to O(n²) on adversarial inputs (for example, already-sorted data with a naive first-element pivot). Heap sort builds a max-heap and repeatedly extracts the maximum, achieving O(n log n) in all cases with O(1) extra space.
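Of these, merge sort is the easiest to sketch. A minimal Python version of the split-then-merge idea, with names chosen for illustration:

```python
def merge_sort(a):
    """Recursively split the array, then merge two sorted halves in linear time."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:           # <= keeps equal elements in order (stable)
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]  # append whichever tail remains
```

The auxiliary `merged` list is the O(n) extra memory mentioned above; the in-place O(1)-space claim belongs to heap sort, not merge sort.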

Radix sort is the odd one out: it never compares elements at all. Instead, it sorts digit by digit from least significant to most significant, using counting sort as a stable subroutine. Its running time is O(d·(n + b)) for n keys of d digits in base b, so it depends on key length rather than on comparisons.
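A minimal least-significant-digit radix sort sketch for non-negative integers. It uses per-digit bucketing as the stable counting step; the function name and `base` parameter are illustrative assumptions, not the visualizer's code:

```python
def radix_sort(a, base=10):
    """LSD radix sort for non-negative integers: bucket by each digit in turn."""
    a = list(a)
    if not a:
        return a
    exp = 1
    while max(a) // exp > 0:              # one pass per digit of the largest key
        buckets = [[] for _ in range(base)]
        for x in a:
            buckets[(x // exp) % base].append(x)   # stable: insertion order kept
        a = [x for bucket in buckets for x in bucket]
        exp *= base
    return a
```

Because each pass preserves the order of equal digits, earlier (less significant) passes are never undone, which is exactly why the subroutine must be stable.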

Try this

Run bubble sort on a reversed array, then on a nearly-sorted array, and watch how the comparison count changes. Then try the same with insertion sort: you will see why input order matters as much as Big-O notation.
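The same experiment can be reproduced without the visualizer by instrumenting the two sorts to count comparisons. A sketch, with hypothetical function names; the counters mirror the on-screen stats in spirit, though the visualizer may tally slightly differently:

```python
def bubble_comparisons(a):
    """Bubble sort with early exit; returns the number of comparisons made."""
    a = list(a)
    n = len(a)
    comps = 0
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            comps += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:
            break
    return comps

def insertion_comparisons(a):
    """Insertion sort; returns the number of element comparisons made."""
    a = list(a)
    comps = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            comps += 1
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return comps
```

On a reversed 50-element array, this bubble sort performs 1225 comparisons, the full n(n-1)/2; on an already-sorted array the early exit stops it after 49, and insertion sort on sorted input likewise does just one comparison per element.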