
Computation as Flow — φ, ψ, Δ, ε, V, L

A system is a river flowing through a channel over distance.

Six variables define everything about how it behaves.

The variables

ε epsilon
Channel width · Resolution

Granularity of flow. How finely change can be expressed. The precision of the pipe.

Δ delta
Flow rate · Transformation

The amount of state moving. Δ defines both throughput and the transformation applied at each step.

φ phi
Inflow · Expansion

New possibilities entering the system. Tributaries feeding the river. The source of growth.

ψ psi
Outflow · Constraint

Stabilisation. Dissipation. The river mouth regulating the system. What leaves and what survives.

V volume
Concurrent capacity

Total water in the system at once. How much can flow concurrently. High V → parallelism. Low V → serial execution.

L length
Pipeline depth

Distance the river travels. Number of transformation stages. High L → deeper refinement. Low L → shallow, immediate output.

System behaviour

ε defines the channel
Δ moves the flow
φ feeds the system
ψ stabilises it
V determines how much runs in parallel
L determines how far transformation proceeds
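The six variables and the behaviour above can be sketched as a toy update rule. Everything here is an illustrative assumption: the linear expand/constrain step, the quantisation to ε, and the field names are mine, not a fixed formalism.

```python
from dataclasses import dataclass

@dataclass
class Flow:
    """The six variables as a state record (illustrative, not canonical)."""
    phi: float      # inflow: new possibilities entering per step
    psi: float      # outflow: what is dissipated / constrained per step
    delta: float    # flow rate: amount of state moved per step
    epsilon: float  # resolution: smallest expressible change
    volume: float   # concurrent capacity
    length: int     # pipeline depth: number of transformation stages

def step(state: float, f: Flow) -> float:
    """One pass through the channel: expand by phi, constrain by psi,
    then snap to the channel's resolution epsilon."""
    moved = min(f.delta, f.volume)               # cannot move more than fits
    state = state + (f.phi - f.psi) * moved      # net inflow minus outflow
    return round(state / f.epsilon) * f.epsilon  # quantise to channel width

def run(state: float, f: Flow) -> float:
    """Transformation proceeds for L stages: high L, deeper refinement."""
    for _ in range(f.length):
        state = step(state, f)
    return state
```

With a coarse channel (large ε) the quantisation dominates; with φ > ψ the state grows each stage, which is exactly the flooding regime below when ψ goes to zero.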

Failure modes

high φ + low ψ
Flooding — runaway expansion
high Δ + low ε
Turbulence — loss of fidelity
low V
Starvation — underutilised system
high V + low ε
Congestion — contention
high L + low ψ
Drift — error accumulation
low L
Shallow reasoning — output before refinement completes
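A toy diagnostic can map a variable regime onto these failure modes. Assumptions: all six variables are normalised to [0, 1], and the hi/lo cutoffs are arbitrary illustrative thresholds.

```python
def diagnose(phi, psi, delta, epsilon, V, L, hi=0.8, lo=0.2):
    """Map a regime of the six variables (each in [0, 1]) to failure modes.
    Thresholds hi/lo are arbitrary cutoffs for illustration."""
    modes = []
    if phi > hi and psi < lo:
        modes.append("flooding")     # runaway expansion
    if delta > hi and epsilon < lo:
        modes.append("turbulence")   # loss of fidelity
    if V < lo:
        modes.append("starvation")   # underutilised system
    if V > hi and epsilon < lo:
        modes.append("congestion")   # contention
    if L > hi and psi < lo:
        modes.append("drift")        # error accumulation
    if L < lo:
        modes.append("shallow")      # output before refinement completes
    return modes
```

Note that a single regime can trigger several modes at once: high Δ with low ε and low V yields both turbulence and starvation.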

AI mapping

φ → token expansion (candidate space)
ψ → model constraints (weights)
Δ → token transitions (inference steps)
ε → sampling resolution (temperature / top-k)
V → context window + parallel tokens
L → layers / depth of reasoning / chain length
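The ε mapping above can be made concrete with a minimal sampler sketch. The logits, temperature, and top-k values here are stand-ins for a real model's candidate space; a production stack would take logits from an actual forward pass.

```python
import math
import random

def sample(logits, temperature=1.0, top_k=None):
    """Sketch of epsilon as sampling resolution: temperature scales the
    distribution and top-k prunes phi's candidate space (the logits)."""
    if top_k is not None:
        # Narrow the channel: keep only the k strongest candidates.
        cutoff = sorted(logits, reverse=True)[top_k - 1]
        logits = [x if x >= cutoff else float("-inf") for x in logits]
    # Low temperature -> sharper distribution -> finer effective channel.
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    weights = [math.exp(x - m) for x in scaled]  # stable softmax numerator
    total = sum(weights)
    probs = [w / total for w in weights]
    return random.choices(range(len(logits)), weights=probs, k=1)[0]
```

At top_k=1 the channel collapses to greedy decoding: ψ fully constrains the outflow and the sampler always returns the argmax token.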

Every system you've ever built is a river.

The question was never "what does it do?"

The question is: how wide is the channel, how fast is the flow, how deep does it go, and what keeps it from flooding?

Computation is flow through constraint, at scale and depth.