[Interactive demo: click to add or remove cities. Controls: preset, population size, mutation rate, speed, elitism, crossover operator, and selection method, with live readouts of generation, best distance, improvement, and diversity.]

The Traveling Salesman Problem

Given a list of cities and the distances between them, find the shortest route that visits every city exactly once and returns to the starting point. It sounds simple. It is not. The Traveling Salesman Problem (TSP) is NP-hard — no known algorithm solves it in polynomial time, and most computer scientists believe none ever will. The difficulty is combinatorial: for 20 cities there are 20!/2 ≈ 1.2×10¹⁸ possible routes. For 50 cities the count exceeds 10⁶⁰ — far more routes than could ever be enumerated. You cannot search them all. You must be clever.

Despite its theoretical intractability, the TSP appears everywhere in practice. Circuit board manufacturing uses it to plan drill paths that minimize tool travel. Genome sequencing uses TSP-like formulations to assemble DNA fragments in the correct order. Logistics companies solve variants of it every day to route delivery trucks. The Hubble Space Telescope uses TSP algorithms to plan the order in which it observes targets, minimizing the time spent slewing between them. Any problem that requires visiting a set of locations in an efficient order is, at its heart, a traveling salesman.

Natural Selection as Algorithm

In 1975, John Holland published Adaptation in Natural and Artificial Systems, laying the formal groundwork for genetic algorithms. His insight was that evolution itself is a search algorithm. A population of organisms explores a vast space of possible genomes. Selection amplifies the fit; recombination mixes their traits; mutation introduces novelty. Over generations, the population converges on solutions to the problem of survival — without any individual organism understanding the problem at all.

A genetic algorithm follows the same loop: encode solutions as chromosomes, evaluate each one’s fitness, select the best to reproduce, crossover pairs of parents to create offspring, and mutate the offspring for diversity. Repeat. For the TSP, each chromosome is a permutation of city indices — a route. Fitness is the inverse of total distance. The algorithm knows nothing about geometry, angles, or maps. It only knows which routes are shorter, and it evolves toward them.

// The evolutionary loop
while (!converged) {
  fitness = population.map(route => 1 / totalDistance(route));
  parents = select(population, fitness);
  offspring = crossover(parents);
  offspring = mutate(offspring);
  population = nextGeneration(population, offspring);
}
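The loop above leans on a `totalDistance` helper. A minimal sketch, assuming cities are stored as `{x, y}` points and a route is a permutation of indices into that array:

```javascript
// Tour length for a route (a permutation of indices into `cities`),
// including the return leg that closes the tour.
function totalDistance(route, cities) {
  let d = 0;
  for (let i = 0; i < route.length; i++) {
    const a = cities[route[i]];
    const b = cities[route[(i + 1) % route.length]]; // wrap back to the start
    d += Math.hypot(a.x - b.x, a.y - b.y);           // Euclidean edge length
  }
  return d;
}
```

For a unit square visited in order, the tour length is exactly 4 — a quick sanity check that the wrap-around leg is counted.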

Order Crossover

The central challenge of applying genetic algorithms to the TSP is that routes are permutations. You cannot simply swap segments between two parent routes the way you would with binary strings — the result would contain duplicate cities and miss others entirely. Order Crossover (OX), introduced by Davis in 1985, solves this elegantly. It copies a random segment from Parent 1 into the child, then fills the remaining positions with cities from Parent 2, in the order they appear, skipping any that are already present.

// Order Crossover (OX)
Parent 1: [3 4 8 2 7 1 6 5 9 0]   // copy segment 4-8-2-7 (positions 1-4)
Parent 2: [0 1 2 5 4 6 9 3 7 8]   // fill from P2 in order, skip 4,8,2,7
Child:    [0 4 8 2 7 1 5 6 9 3]   // valid permutation

What makes OX powerful is that it preserves relative order from Parent 2 while keeping an absolute segment from Parent 1. A good sub-route from one parent survives intact; the overall sequencing from the other parent fills in the gaps. This respects the structure of the TSP — both the local connections between adjacent cities and the global ordering of the tour carry information about route quality. Partially Mapped Crossover (PMX) takes a different approach, preserving position-based relationships instead of order-based ones, which can be better when specific cities need to appear at specific positions in the tour.
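The OX procedure described above can be sketched in a few lines of JavaScript. This follows the simple fill rule from the text — remaining slots take Parent 2's cities in order from the start; other OX variants fill starting after the copied segment, so exact children differ by variant:

```javascript
// Order Crossover: copy p1[start..end] into the child at the same
// positions, then fill the empty slots with p2's cities in order,
// skipping any city the segment already contains.
function orderCrossover(p1, p2, start, end) {
  const child = new Array(p1.length).fill(null);
  for (let i = start; i <= end; i++) child[i] = p1[i]; // absolute segment from P1
  const used = new Set(child.slice(start, end + 1));
  let j = 0;                                           // next unused city from P2
  for (let i = 0; i < child.length; i++) {
    if (child[i] !== null) continue;
    while (used.has(p2[j])) j++;                       // skip already-placed cities
    child[i] = p2[j++];
  }
  return child;
}
```

The child is always a valid permutation: every slot is filled exactly once, and the `used` set guarantees no city appears twice.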

Selection Pressure

How you choose parents determines everything about the algorithm’s behavior. Tournament selection picks k random individuals and lets the best one reproduce. A large tournament size means only the very best survive — high pressure, fast convergence, but risk of premature convergence to a local optimum. A small tournament is gentler: mediocre solutions survive longer, maintaining diversity, but convergence is slower. The parameter k is a dial between exploitation (refining what you have) and exploration (searching for something better).
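Tournament selection is only a few lines. A minimal sketch — the function name and the draw-with-replacement convention are implementation choices, not prescribed by the text:

```javascript
// Tournament selection: draw k individuals at random (with replacement)
// and return the fittest of them. Larger k = stronger selection pressure.
function tournamentSelect(population, fitness, k) {
  let best = Math.floor(Math.random() * population.length);
  for (let i = 1; i < k; i++) {
    const challenger = Math.floor(Math.random() * population.length);
    if (fitness[challenger] > fitness[best]) best = challenger;
  }
  return population[best];
}
```

With k equal to the population size, the winner is almost always the global best; with k = 2, even the weakest individual wins occasionally.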

Roulette wheel selection assigns each individual a slice of a wheel proportional to its fitness. A spinner selects parents. The fittest individuals get the biggest slices, but even weak solutions have a chance. This stochastic element is crucial: in a rugged fitness landscape with many local optima, the best current solution is often not on the path to the global optimum. You need some bad solutions to survive, because their offspring — combined with the right mate — might leapfrog to a distant, better peak. Evolution does not hill-climb. It explores in parallel.
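The roulette wheel amounts to a cumulative-sum lookup. A minimal sketch under the same fitness-proportional rule described above:

```javascript
// Roulette wheel selection: each individual owns a slice of [0, total)
// proportional to its fitness; a uniform spin picks the slice it lands in.
function rouletteSelect(population, fitness) {
  const total = fitness.reduce((a, b) => a + b, 0);
  let spin = Math.random() * total;
  for (let i = 0; i < population.length; i++) {
    spin -= fitness[i];
    if (spin <= 0) return population[i];
  }
  return population[population.length - 1]; // guard against float rounding
}
```

An individual with zero fitness gets a zero-width slice and is never chosen, while everything else keeps a chance proportional to its share of the total.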

The Fitness Landscape

Imagine the space of all possible routes as a vast terrain, where elevation represents route quality (lower is better). The TSP fitness landscape is rugged — riddled with local optima, plateaus, and deceptive valleys. A route that looks good locally (every nearby swap makes it worse) may be far from the global optimum. Pure hill-climbing — the strategy of always accepting improvements and never going uphill — will get trapped in the first valley it falls into. For 50 cities, there are astronomically many such traps.

A genetic algorithm navigates this landscape differently. Instead of a single point climbing a single hill, it maintains an entire population scattered across the terrain. Selection pulls the population toward peaks. Crossover lets individuals on different peaks exchange information, potentially discovering that the valley between them leads to a higher peak beyond. Mutation randomly teleports individuals to new locations, occasionally discovering entirely unexplored regions. The population is a parallel search party, and its diversity is its most valuable resource. When diversity collapses — when all individuals converge to the same peak — the search effectively ends, whether or not the global optimum has been found.
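The text does not pin down a specific mutation operator; swap mutation is one common choice for permutation chromosomes, since exchanging two cities always yields another valid route. A minimal sketch:

```javascript
// Swap mutation: for each position, with probability `rate`, exchange
// its city with a randomly chosen position. The result is always a
// valid permutation, so no repair step is needed.
function swapMutate(route, rate) {
  const child = route.slice(); // never mutate the parent in place
  for (let i = 0; i < child.length; i++) {
    if (Math.random() < rate) {
      const j = Math.floor(Math.random() * child.length);
      [child[i], child[j]] = [child[j], child[i]];
    }
  }
  return child;
}
```

A rate of 0.05, as in the demo's default, perturbs roughly one position in twenty per generation — enough to keep injecting novelty without shredding good routes.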

From Biology to Engineering

Genetic algorithms have escaped the textbook. NASA used a GA to design the antenna for the ST5 spacecraft in 2006 — the evolved design outperformed every human-engineered alternative and looked like nothing a human engineer would ever draw, a twisted wire sculpture that happened to have ideal radiation patterns. Google used evolutionary strategies (a close relative of GAs) in their AutoML system to evolve neural network architectures, discovering designs that beat hand-crafted networks on image classification benchmarks.

The power of genetic algorithms lies in problems where the fitness function is available but the solution structure is opaque. If you can evaluate a solution but cannot derive one — if you can say “this antenna works well” but not “here is the equation for the perfect antenna” — then evolution is a natural fit. Protein folding, drug discovery, game AI, scheduling, chip layout, art generation: the list of domains where GAs have found purchase reads like a survey of hard optimization problems. What you see in the canvas above is the same fundamental process that produced every living thing on Earth, compressed into a few hundred generations and applied to the simple, beautiful, intractable problem of finding the shortest path through a set of points.