The size-change termination principle (SCT) guarantees termination of a computer program by proving that infinite computations always trigger an infinite descent in data values drawn from a well-founded order.[1] Size-change termination analysis uses this principle to solve the universal halting problem for a certain class of programs.[5]
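For illustration, consider the Ackermann function, often used to illustrate the principle: every recursive call either strictly decreases m, or keeps m and strictly decreases n, so an infinite chain of calls would force an infinite descent in the natural numbers. A short Python rendering, added here purely as an illustration:

```python
def ack(m, n):
    """Ackermann function: terminates because every recursive call either
    decreases m, or keeps m unchanged and decreases n -- in both cases a
    descent in the well-founded natural numbers."""
    if m == 0:
        return n + 1
    if n == 0:
        return ack(m - 1, 1)
    return ack(m - 1, ack(m, n - 1))
```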
The size-change termination principle was introduced in its general form by Chin Soon Lee, Neil D. Jones, and Amir M. Ben-Amram in 2001; the latter application preceded that general formulation by four years.
In practice, the analysis constructs a size-change graph for each function call in the program, recording how the sizes of the callee's arguments relate to the caller's parameters. The method then analyzes these graphs to make conclusions about the existence of monotonically decreasing sequences of parameters throughout the execution of a program.
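The analysis can be carried out by the closure-based procedure of Lee, Jones, and Ben-Amram: compose size-change graphs along call paths until no new graphs arise, then check that every idempotent graph from a function to itself contains a strictly decreasing self-arc. The following is a minimal Python sketch of that procedure; the concrete representation (function-name pairs with arcs labelled ">" and ">=") and the helper names compose, closure, and sct_terminates are choices of this illustration, not notation from the article.

```python
from itertools import product

# A size-change graph is modelled as a tuple (source_fn, target_fn, arcs),
# where arcs is a frozenset of triples (caller_param, label, callee_param):
#   ">"  means the callee's value is strictly smaller than the caller's,
#   ">=" means it is no larger.

def compose(g1, g2):
    """Compose a graph f->g with a graph g->h into a graph f->h."""
    src1, mid1, arcs1 = g1
    mid2, tgt2, arcs2 = g2
    assert mid1 == mid2, "graphs are not composable"
    best = {}  # (x, z) -> strongest label derivable so far
    for (x, r1, y1), (y2, r2, z) in product(arcs1, arcs2):
        if y1 != y2:
            continue
        label = ">" if ">" in (r1, r2) else ">="
        if best.get((x, z)) != ">":        # ">" dominates ">="
            best[(x, z)] = label
    return (src1, tgt2, frozenset((x, r, z) for (x, z), r in best.items()))

def closure(call_graphs):
    """All graphs obtainable by repeatedly composing the call graphs."""
    result = set(call_graphs)
    while True:
        new = {compose(g1, g2)
               for g1, g2 in product(result, result) if g1[1] == g2[0]}
        if new <= result:
            return result
        result |= new

def sct_terminates(call_graphs):
    """Size-change termination criterion: every idempotent graph in the
    closure from a function to itself must have a strict self-arc (x, ">", x)."""
    for g in closure(call_graphs):
        src, tgt, arcs = g
        if src == tgt and compose(g, g) == g:
            if not any(x == z and r == ">" for (x, r, z) in arcs):
                return False
    return True

# Size-change graphs for ack's two recursive call sites: the outer calls
# strictly decrease m; the inner call keeps m and strictly decreases n.
g_outer = ("ack", "ack", frozenset({("m", ">", "m")}))
g_inner = ("ack", "ack", frozenset({("m", ">=", "m"), ("n", ">", "n")}))

print(sct_terminates({g_outer, g_inner}))   # True
```

On the Ackermann graphs above, the closure contains only the two graphs themselves; both are idempotent and each carries a strict self-arc (m in the outer graph, n in the inner one), so the criterion reports termination.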