A system is a river flowing through a channel over distance.
Six variables define everything about how it behaves.
The variables
ε: Granularity of flow. How finely change can be expressed. The precision of the pipe.
Δ: The amount of state moving. Δ defines both throughput and the transformation applied at each step.
φ: New possibilities entering the system. Tributaries feeding the river. The source of growth.
ψ: Stabilisation. Dissipation. The river mouth regulating the system. What leaves and what survives.
V: Total water in the system at once. How much can flow concurrently. High V → parallelism. Low V → serial execution.
L: Distance the river travels. Number of transformation stages. High L → deeper refinement. Low L → shallow, immediate output.
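The six variables can be collected into a single record. A minimal sketch, with every name chosen here purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class RiverSystem:
    # All field names are illustrative, not from any library.
    epsilon: float  # ε: granularity, how finely change can be expressed
    delta: float    # Δ: amount of state moving per step
    phi: float      # φ: new possibilities entering the system
    psi: float      # ψ: stabilisation, what dissipates at the mouth
    V: int          # V: how much can flow concurrently
    L: int          # L: number of transformation stages
```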
System behaviour
Δ moves the flow
φ feeds the system
ψ stabilises it
ε sets how finely each change is expressed
V determines how much runs in parallel
L determines how far transformation proceeds
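The behaviour above can be sketched as a discrete update loop. This is an illustrative toy, not a standard model: the update rule (add Δ, add φ, damp by ψ, quantise to ε, repeat for L stages, across V channels) is an assumption made for the sketch.

```python
def run_channel(delta, phi, psi, epsilon, L, state=0.0):
    """One pass down the channel: L stages, each moving state by Δ,
    fed by φ, damped by ψ, and quantised to the granularity ε."""
    for _ in range(L):
        state += delta                             # Δ moves the flow
        state += phi                               # φ feeds the system
        state *= 1.0 - psi                         # ψ stabilises / dissipates
        state = round(state / epsilon) * epsilon   # ε sets the resolution
    return state

def run_system(delta, phi, psi, epsilon, V, L):
    # V channels in flight at once; simulated serially here.
    return [run_channel(delta, phi, psi, epsilon, L) for _ in range(V)]
```

With delta=1.0, phi=0.0, psi=0.5, epsilon=0.25, and L=2, a channel settles at 0.75: each stage adds one unit, halves it, and snaps to the nearest quarter.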
Failure modes
AI mapping
ψ → model constraints (weights)
Δ → token transitions (inference steps)
ε → sampling resolution (temperature / top-k)
V → context window + parallel tokens
L → layers / depth of reasoning / chain length
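The mapping can be made concrete with a toy decode loop. Nothing below is a real model or API; the hash standing in for fixed weights is pure invention. It only marks where each variable would sit during generation:

```python
import random

def toy_decode(prompt, steps, *, top_k, context_window, depth, vocab_size=256):
    """Toy decode loop, no real model: shows where each river
    variable lands. All names here are illustrative."""
    rng = random.Random(0)                       # fixed seed for determinism
    tokens = list(prompt)
    for _ in range(steps):                       # Δ: one token transition per step
        ctx = tokens[-context_window:]           # V: bounded state in flight
        h = sum(ctx)
        for _ in range(depth):                   # L: depth of transformation
            h = (h * 31 + 7) % vocab_size        # ψ: fixed "weights" constrain the map
        candidates = [(h + i) % vocab_size for i in range(top_k)]  # ε: top-k resolution
        tokens.append(rng.choice(candidates))
    return tokens
```

Widening top_k coarsens ε, a longer context_window raises V, and more depth deepens L, matching the mapping above.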
Every system you've ever built is a river.
The question was never "what does it do?"
The question is: how wide is the channel, how fast is the flow, how deep does it go, and what keeps it from flooding?
Computation is flow through constraint, at scale and depth.