

Preset circuits

- Select a gate type from the toolbar and click the canvas to place it
- Click an output port (right side), then click an input port (left side) to connect them with a wire
- Click an input switch to toggle it HIGH/LOW
- Drag a component to move it
- Select tool + click a component, then press Delete or use the delete button to remove it

How Logic Gates Work

Boolean Algebra Basics

All of digital computing reduces to three fundamental operations: AND, OR, and NOT. George Boole formalized these in 1854 as an algebra of logic, where variables take only two values — true or false, 1 or 0. AND outputs 1 only when both inputs are 1. OR outputs 1 when at least one input is 1. NOT inverts its single input. From these three, every possible logical function can be constructed.

AND:  A · B     →  1 only if A=1 AND B=1
OR:   A + B     →  1 if A=1 OR B=1 (or both)
NOT:  ¬A (or A') →  flips 0→1, 1→0
XOR:  A ⊕ B     →  1 if A ≠ B (exclusive or)
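
The four operations above can be sketched directly in Python using its bitwise operators on 0/1 integers (the function names here are ours, for illustration only):

```python
# Basic Boolean operations over 0/1 integers.
def AND(a, b): return a & b          # 1 only if both inputs are 1
def OR(a, b):  return a | b          # 1 if at least one input is 1
def NOT(a):    return 1 - a          # flips 0 -> 1, 1 -> 0
def XOR(a, b): return a ^ b          # 1 if the inputs differ

# Print the full truth table for the two-input operations.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "|", AND(a, b), OR(a, b), XOR(a, b))
```

Enumerating all input combinations like this is exactly what the simulator's truth-table panel does for a placed circuit.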

Universal Gates

NAND and NOR are called universal gates because either one, alone, can implement any Boolean function. This is a remarkable result: you need only a single gate type to build an entire computer. To prove it, build NOT, AND, and OR from NAND gates alone:

NOT from NAND:  ¬A = NAND(A, A)
AND from NAND:  A · B = NAND(NAND(A,B), NAND(A,B))
OR  from NAND:  A + B = NAND(NAND(A,A), NAND(B,B))

Since NOT, AND, and OR are sufficient for any Boolean function, and we can build all three from NAND, NAND alone is functionally complete. The same argument works for NOR. This is why real integrated circuits are often built entirely from NAND or NOR gates — manufacturing one gate type is simpler and cheaper than manufacturing many.
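
The three identities above can be checked exhaustively. This sketch builds NOT, AND, and OR from a single NAND function and verifies each against Python's native operators over all input combinations:

```python
# NAND is the only primitive; everything else is derived from it.
def NAND(a, b): return 1 - (a & b)

def NOT(a):    return NAND(a, a)
def AND(a, b): return NAND(NAND(a, b), NAND(a, b))
def OR(a, b):  return NAND(NAND(a, a), NAND(b, b))

# Exhaustive check: for two inputs there are only four cases,
# so trying them all is a complete proof of each identity.
for a in (0, 1):
    assert NOT(a) == 1 - a
    for b in (0, 1):
        assert AND(a, b) == (a & b)
        assert OR(a, b) == (a | b)
```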

Combinational vs. Sequential

Circuits divide into two fundamental categories. Combinational circuits have outputs determined solely by their current inputs — no memory, no feedback. An adder is combinational: give it two numbers and it returns their sum, the same way every time. Sequential circuits include feedback loops that let them store state. The SR latch in the presets above is the simplest example: it remembers which input was last activated. Flip-flops, registers, and RAM are all sequential circuits. The distinction matters because a combinational circuit is fully described by its truth table, while a sequential circuit requires a state diagram.
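
The feedback that gives an SR latch its memory can be simulated by iterating a pair of cross-coupled NOR gates until the loop settles (a sketch, not the simulator's internal model — the NOR-based latch is one common form):

```python
def NOR(a, b): return 1 - (a | b)

def sr_latch(s, r, q, qb):
    # Cross-coupled NOR pair: each gate's output feeds the other's input.
    # A few update rounds are enough for the feedback loop to settle.
    for _ in range(4):
        q = NOR(r, qb)
        qb = NOR(s, q)
    return q, qb

q, qb = sr_latch(1, 0, 0, 1)   # Set: Q goes HIGH
q, qb = sr_latch(0, 0, q, qb)  # Both inputs LOW: latch holds Q = 1
q, qb = sr_latch(0, 1, q, qb)  # Reset: Q goes LOW
```

Note that the hold step has the same inputs (0, 0) regardless of history, yet produces different outputs depending on prior state — exactly the property no combinational circuit can have.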

From Gates to CPUs

A modern CPU contains billions of transistors, each acting as a tiny switch. Groups of transistors form logic gates. Gates combine into functional units: adders, multiplexers, decoders, registers. Registers feed into arithmetic logic units (ALUs). ALUs, control logic, and memory together form a processor. The entire hierarchy — from transistor to computer — is built from the same Boolean logic you see in this simulator. What changes is scale, not principle. The half adder here uses two gates; a 64-bit ALU uses thousands. Both follow the same rules.
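
That scaling step can be shown in a few lines: a half adder is one XOR and one AND; two half adders make a full adder; chaining full adders gives a multi-bit ripple-carry adder (a sketch of the standard construction, with bits stored least-significant first):

```python
def half_adder(a, b):
    # The two-gate circuit from the simulator: sum is XOR, carry is AND.
    return a ^ b, a & b

def full_adder(a, b, cin):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, c1 | c2           # carry out if either stage carried

def ripple_add(xs, ys):
    # Add two equal-length little-endian bit lists, carry rippling upward.
    carry, out = 0, []
    for a, b in zip(xs, ys):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

ripple_add([1, 0, 1], [1, 1, 0])   # 5 + 3 = 8 -> [0, 0, 0, 1]
```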

Clock signals synchronize everything. Each tick of the clock lets signals propagate through one layer of gates. The “propagation delay” shown in the stats panel estimates how many nanoseconds the circuit needs for its outputs to stabilize after inputs change — the fundamental speed limit of digital computation.
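
That delay estimate reduces to a depth computation: find the longest input-to-output gate chain and multiply by a per-gate delay. A minimal sketch, assuming a circuit described as a dict mapping each gate to its input signals and an illustrative 1 ns per gate (real delays vary by gate type and technology):

```python
# Hypothetical half-adder netlist: each gate lists the signals feeding it.
# "A" and "B" do not appear as keys, so they are primary inputs.
circuit = {
    "xor1": ["A", "B"],
    "and1": ["A", "B"],
}

def depth(node, circuit):
    # Primary inputs have depth 0; a gate sits one level above its deepest input.
    if node not in circuit:
        return 0
    return 1 + max(depth(i, circuit) for i in circuit[node])

max_depth = max(depth(g, circuit) for g in circuit)      # 1 for this circuit
delay_ns = max_depth * 1.0   # assumed 1 ns per gate, purely illustrative
```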

History

George Boole published An Investigation of the Laws of Thought in 1854, creating the algebraic framework for logic. For eighty years, this remained pure mathematics with no practical application. Then in 1937, Claude Shannon — a 21-year-old master’s student at MIT — wrote what has been called the most important master’s thesis of the twentieth century. Shannon showed that Boole’s algebra mapped perfectly onto electrical relay circuits: a closed switch was 1, an open switch was 0, series connections were AND, parallel connections were OR. This single insight — that abstract logic could be physically implemented with switches — is the foundation of every digital computer ever built.

The transition from relays to vacuum tubes to transistors to integrated circuits changed the speed and scale of computation by factors of billions, but the underlying logic has not changed since Shannon’s thesis. A NAND gate in a modern 3 nm chip implements the same Boolean function as a pair of relays in a 1940s telephone exchange.