In statistical mechanics, the Gibbs algorithm, introduced by J. Willard Gibbs in 1902, is a criterion for choosing a probability distribution for the statistical ensemble of microstates of a thermodynamic system: it minimizes the average log probability, subject to the probability distribution p_i satisfying a set of constraints (usually expectation values) corresponding to the known macroscopic quantities.[1]
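As a sketch in conventional notation (p_i for microstate probabilities, E_i for their energies, and β, Z for the Lagrange multiplier and normalizing sum; none of these symbols are defined in this article), the simplest case with a single energy constraint reads:

```latex
% Minimize the average log probability subject to normalization
% and a fixed mean energy:
\begin{equation}
  \min_{\{p_i\}} \sum_i p_i \ln p_i
  \quad\text{subject to}\quad
  \sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle .
\end{equation}
% Introducing Lagrange multipliers and setting the variation to zero
% yields the canonical (Gibbs) distribution:
\begin{equation}
  p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}.
\end{equation}
```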
In 1948, Claude Shannon interpreted the negative of this quantity, which he called information entropy, as a measure of the uncertainty in a probability distribution.
E. T. Jaynes realized that this quantity could be interpreted as missing information about any system, not just a thermodynamic one, and generalized the Gibbs algorithm to non-equilibrium systems with the principle of maximum entropy and maximum entropy thermodynamics.
The distribution produced by the Gibbs algorithm is then a maximum entropy probability distribution.
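For general constraints fixing the expectation values of functions f_k, the same variational argument yields the standard form below; the symbols f_k, λ_k, and Z here are conventional notation, not taken from this article:

```latex
% General maximum entropy solution under expectation constraints
% \sum_i p_i f_k(x_i) = F_k, with Lagrange multipliers \lambda_k:
\begin{equation}
  p_i = \frac{1}{Z}
        \exp\!\Big(-\sum_{k=1}^{m} \lambda_k f_k(x_i)\Big),
  \qquad
  Z = \sum_i \exp\!\Big(-\sum_{k=1}^{m} \lambda_k f_k(x_i)\Big).
\end{equation}
```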
Statisticians identify such distributions as belonging to exponential families.
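As an illustrative numerical check (a minimal sketch, not part of the original text; the energy levels and target mean below are arbitrary choices), one can maximize the entropy under a mean-energy constraint with scipy.optimize and observe that the solution takes the exponential form above:

```python
# Minimal sketch: maximize Shannon entropy of a discrete distribution
# subject to a fixed mean energy, then check that the optimizer recovers
# the exponential (Gibbs) form. Energy levels and target mean are
# hypothetical illustrative values, not taken from the article.
import numpy as np
from scipy.optimize import minimize

E = np.array([0.0, 1.0, 2.0, 3.0])    # hypothetical energy levels
target_mean = 1.2                      # hypothetical constraint on <E>

def avg_log_prob(p):
    # sum_i p_i ln p_i: the quantity the Gibbs algorithm minimizes
    # (its negative is the Shannon entropy).
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},            # normalization
    {"type": "eq", "fun": lambda p: np.dot(p, E) - target_mean}, # fixed <E>
]
p0 = np.full(len(E), 1.0 / len(E))     # start from the uniform distribution
res = minimize(avg_log_prob, p0, bounds=[(1e-9, 1.0)] * len(E),
               constraints=constraints)

# For a maximum entropy solution, ln p_i is affine in E_i, i.e.
# p_i is proportional to exp(-beta * E_i) for some constant beta.
print(res.x)
print(np.diff(np.log(res.x)) / np.diff(E))  # approximately constant = -beta
```

The log probabilities of the optimized distribution decrease linearly in E_i, which is the signature of the exponential family form.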