Divergence (computer science)

In computer science, a computation is said to diverge if it does not terminate or terminates in an exceptional state; otherwise it is said to converge. In domains where computations are expected to be infinite, such as process calculi, a computation is said to diverge if it fails to be productive, i.e. if it does not continue producing an action within a finite amount of time.
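To make productivity concrete, here is a minimal sketch in lazy Haskell (my own illustration, not from the source): the infinite stream `ones` is productive, because each element arrives after finitely many steps, while `stuck` never produces anything and so diverges.

```haskell
-- A minimal sketch of productivity using lazy Haskell streams.
-- `ones` is productive: every element becomes available after
-- finitely many reduction steps, although the stream is infinite.
ones :: [Int]
ones = 1 : ones

-- `stuck` is NOT productive: it never produces a first element,
-- so any attempt to observe it (e.g. `head stuck`) diverges.
stuck :: [Int]
stuck = stuck

main :: IO ()
main = print (take 5 ones)  -- prints [1,1,1,1,1]
                            -- `print (head stuck)` would loop forever
```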

Various subfields of computer science use varying, but mathematically precise, definitions of what it means for a computation to converge or diverge.

In abstract rewriting, an abstract rewriting system is called convergent if it is both confluent and terminating.[2] The notation t ↓ n means that t reduces to the normal form n in zero or more reductions, t↓ means that t reduces to some normal form in zero or more reductions, and t↑ means that t does not reduce to a normal form; the latter is impossible in a terminating rewriting system.
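The notation can be illustrated with a toy abstract rewriting system; the sketch below is hypothetical (the terms A–D and the `step` rules are invented for illustration), and it uses a fuel bound so that the search for a normal form itself terminates even on divergent terms.

```haskell
-- A toy abstract rewriting system with rules A -> B, B -> C, D -> D.
-- No rule applies to C, so C is a normal form; D rewrites only to
-- itself, so it never reaches a normal form (written D↑).
data Term = A | B | C | D deriving (Eq, Show)

-- Perform one rewrite step, if any rule applies.
step :: Term -> Maybe Term
step A = Just B
step B = Just C
step C = Nothing   -- normal form: no rule applies
step D = Just D    -- loops: D has no normal form

-- Try to reach a normal form within a bounded number of steps;
-- the fuel bound keeps this search terminating.
normalize :: Int -> Term -> Maybe Term
normalize fuel t = case step t of
  Nothing -> Just t                           -- t is a normal form
  Just t' | fuel > 0  -> normalize (fuel - 1) t'
          | otherwise -> Nothing              -- gave up: possibly divergent

main :: IO ()
main = do
  print (normalize 10 A)  -- Just C, i.e. A ↓ C
  print (normalize 10 D)  -- Nothing: no normal form found
```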

In the lambda calculus an expression is divergent if it has no normal form.
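The classic divergent term is Ω = (λx.x x)(λx.x x), which β-reduces only to itself. As a hedged sketch (the recursive newtype is one standard encoding trick, and the names here are my own, not from the source), Ω can be embedded in Haskell; forcing `omega` loops forever.

```haskell
-- Encoding the untyped term Ω = (λx. x x)(λx. x x) in Haskell.
-- Direct self-application does not typecheck, so we wrap the
-- function space in a recursive newtype.
newtype Rec a = Rec (Rec a -> a)

-- selfApply corresponds to λx. x x
selfApply :: Rec a -> a
selfApply (Rec f) = f (Rec f)

-- omega corresponds to Ω; it reduces only to itself, so it has
-- no normal form and evaluating it never terminates.
omega :: a
omega = selfApply (Rec selfApply)

main :: IO ()
main = putStrLn "omega is definable, but forcing it loops forever"
```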

In denotational semantics an object function f : A → B can be modelled as a mathematically total function f⊥ : A∪{⊥} → B∪{⊥}, where ⊥ (bottom) indicates that the object function or its argument diverges.
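A minimal sketch of this lifting in Haskell, assuming `Maybe` as the encoding of A∪{⊥} with `Nothing` playing the role of ⊥ (the names `recip'` and `lift` are mine, introduced for illustration):

```haskell
-- Modelling divergence denotationally: represent A∪{⊥} as Maybe a,
-- with Nothing playing the role of ⊥ (bottom).
-- Example partial object function: a reciprocal-like map, undefined at 0.
recip' :: Int -> Maybe Int
recip' 0 = Nothing            -- the object function diverges at 0
recip' n = Just (100 `div` n)

-- Lift f to a total function f⊥ : A∪{⊥} → B∪{⊥}. The lifted
-- function is strict: a diverging argument yields a diverging result.
lift :: (a -> Maybe b) -> (Maybe a -> Maybe b)
lift _ Nothing  = Nothing   -- the argument diverges, so the result does
lift f (Just x) = f x

main :: IO ()
main = do
  print (lift recip' (Just 4))  -- Just 25
  print (lift recip' (Just 0))  -- Nothing: f diverges at 0
  print (lift recip' Nothing)   -- Nothing: the argument diverges
```

Making `lift` propagate `Nothing` corresponds to modelling a strict function; a non-strict function could instead return a result without inspecting its argument.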

In the calculus of communicating sequential processes (CSP), divergence occurs when a process performs an endless series of hidden actions.

For example, if a process P is defined by the recursion P = tick → P, then the process P \ {tick} cannot do anything other than perform hidden actions forever, so it is equivalent to the process that does nothing but diverge, denoted div.
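As a loose illustration rather than CSP semantics proper (the stream encoding and the names below are my own simplification): if a process's behaviour is an endless stream of tick events and tick is hidden, an observer waiting for the next visible event waits forever.

```haskell
-- A crude stream model of CSP hiding (a simplification, not CSP
-- semantics proper): a process's behaviour is its stream of events,
-- and hiding an event removes it from what an observer can see.
data Event = Tick | Tock deriving (Eq, Show)

-- P = tick → P : an endless stream of tick events.
p :: [Event]
p = Tick : p

-- Hiding: drop the hidden event from the observable behaviour.
hide :: Event -> [Event] -> [Event]
hide e = filter (/= e)

main :: IO ()
main = do
  -- Observing P itself is fine: visible ticks keep arriving.
  print (take 3 p)                 -- [Tick,Tick,Tick]
  -- Observing P \ {tick} diverges: the observer waits forever for a
  -- visible event that never comes. Uncommenting the next line loops:
  -- print (head (hide Tick p))
  putStrLn "P \\ {tick} never offers a visible event"
```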