In physics, the Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy.
It is proportional to the expectation of the q-logarithm of a distribution.
The concept was introduced in 1988 by Constantino Tsallis[1] as a basis for generalizing the standard statistical mechanics and is identical in form to Havrda–Charvát structural α-entropy,[2] introduced in 1967 within information theory.
Given a discrete set of probabilities $\{p_i\}$ satisfying $\sum_i p_i = 1$, and $q$ any real number, the Tsallis entropy is defined as
$$S_q(\{p_i\}) = \frac{k}{q-1}\left(1 - \sum_i p_i^q\right),$$
where $q \in \mathbb{R}$ is a real parameter sometimes called the entropic index and $k$ is a positive constant. In the limit as $q \to 1$, the usual Boltzmann–Gibbs entropy is recovered, namely
$$S_{BG} = \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i,$$
where one identifies $k$ with the Boltzmann constant $k_B$.
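As a minimal numerical illustration (the function name and sample values here are illustrative, not from the literature), the following Python sketch evaluates the discrete definition and shows the values approaching the Boltzmann–Gibbs entropy as $q \to 1$:

```python
import numpy as np

def tsallis_entropy(p, q, k=1.0):
    """Discrete Tsallis entropy S_q = k/(q-1) * (1 - sum_i p_i^q).

    At q = 1 the q -> 1 limit, i.e. the Boltzmann-Gibbs/Shannon
    entropy -k * sum_i p_i ln p_i, is returned instead.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # terms with p_i = 0 contribute nothing
    if np.isclose(q, 1.0):
        return -k * np.sum(p * np.log(p))
    return k / (q - 1.0) * (1.0 - np.sum(p ** q))

p = [0.5, 0.3, 0.2]
for q in (0.5, 0.999, 1.0, 1.001, 2.0):
    print(q, tsallis_entropy(p, q))
# The q = 0.999 and q = 1.001 values bracket the Shannon entropy
# -sum_i p_i ln p_i ≈ 1.0297.
```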
For continuous probability distributions, we define the entropy as
$$S_q[p] = \frac{k}{q-1}\left(1 - \int \bigl(p(x)\bigr)^q \, dx\right),$$
where $p(x)$ is a probability density function.
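The continuous definition can be sanity-checked numerically. The sketch below (assuming SciPy for the quadrature; names are illustrative) compares direct integration against the closed form for a uniform density on $[0, a]$, for which $\int p^q\,dx = a^{1-q}$:

```python
import numpy as np
from scipy.integrate import quad

def tsallis_entropy_continuous(pdf, lo, hi, q, k=1.0):
    """Numerical S_q[p] = k/(q-1) * (1 - integral of p(x)^q dx)."""
    integral, _ = quad(lambda x: pdf(x) ** q, lo, hi)
    return k / (q - 1.0) * (1.0 - integral)

a, q = 3.0, 1.5
numeric = tsallis_entropy_continuous(lambda x: 1.0 / a, 0.0, a, q)
closed_form = (1.0 - a ** (1.0 - q)) / (q - 1.0)   # uniform density on [0, a]
print(numeric, closed_form)   # both ≈ 0.8453
```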
The cross-entropy pendant is the expectation of the negative q-logarithm with respect to a second distribution $\hat{p}$,
$$H_q^{\times}(p, \hat{p}) = -\sum_i p_i \ln_q \hat{p}_i,$$
where $\ln_q x = \frac{x^{1-q} - 1}{1-q}$ is the q-logarithm. Since $-\ln_q$ reduces to the negative logarithm as $q \to 1$, minimizing this expectation generalizes log-likelihood maximization, which is recovered at $q = 1$.
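A short sketch of the q-logarithm and of this cross-entropy (identifiers are illustrative); for $q$ near 1 it reproduces the ordinary cross-entropy $-\sum_i p_i \ln \hat{p}_i$:

```python
import numpy as np

def q_log(x, q):
    """q-logarithm ln_q(x) = (x^(1-q) - 1)/(1-q); ordinary ln at q = 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def tsallis_cross_entropy(p, p_hat, q):
    """Expectation under p of the negative q-logarithm of p_hat."""
    p, p_hat = np.asarray(p, float), np.asarray(p_hat, float)
    return -np.sum(p * q_log(p_hat, q))

p     = np.array([0.5, 0.3, 0.2])   # data distribution
p_hat = np.array([0.4, 0.4, 0.2])   # model distribution
print(tsallis_cross_entropy(p, p_hat, 0.999))
print(-np.sum(p * np.log(p_hat)))   # ordinary cross-entropy, nearly equal
```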
A logarithm can be expressed in terms of a slope through
$$p \ln p = \left.\frac{d}{dx}\, p^x \right|_{x=1},$$
resulting in the following formula for the standard entropy:
$$S = -k \left.\frac{d}{dx} \sum_i p_i^x \right|_{x=1}.$$
Likewise, the discrete Tsallis entropy satisfies
$$S_q = -k \left. D_q \sum_i p_i^x \right|_{x=1},$$
where $D_q f(x) = \frac{f(qx) - f(x)}{qx - x}$ is the q-derivative with respect to $x$.
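This relation is easy to verify numerically; the sketch below (helper names are illustrative) implements the Jackson q-derivative and confirms that $-k\,D_q \sum_i p_i^x$ at $x = 1$ reproduces the defining formula:

```python
import numpy as np

def q_derivative(f, x, q):
    """Jackson q-derivative D_q f(x) = (f(q x) - f(x)) / (q x - x)."""
    return (f(q * x) - f(x)) / (q * x - x)

p = np.array([0.5, 0.3, 0.2])
q, k = 1.7, 1.0

f = lambda x: np.sum(p ** x)        # the map x -> sum_i p_i^x
via_q_derivative = -k * q_derivative(f, 1.0, q)
direct = k / (q - 1.0) * (1.0 - np.sum(p ** q))
print(via_q_derivative, direct)     # identical up to rounding
```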
Given two independent systems A and B, for which the joint probability density satisfies
$$p(A, B) = p(A)\,p(B),$$
the Tsallis entropy of this system satisfies
$$S_q(A, B) = S_q(A) + S_q(B) + \frac{1-q}{k}\, S_q(A)\, S_q(B).$$
From this result, it is evident that the parameter $|1 - q|$ is a measure of the departure from additivity. In the limit when $q = 1$,
$$S(A, B) = S(A) + S(B),$$
which is what is expected for an additive system. This property is sometimes referred to as "pseudo-additivity".
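Pseudo-additivity can likewise be checked numerically for two independent discrete systems; in the sketch below the outer product realizes $p(A, B) = p(A)\,p(B)$ (the entropy helper is redefined so the snippet is self-contained):

```python
import numpy as np

def tsallis(p, q, k=1.0):
    p = np.asarray(p, dtype=float)
    return k / (q - 1.0) * (1.0 - np.sum(p ** q))

pA = np.array([0.6, 0.4])
pB = np.array([0.7, 0.2, 0.1])
q, k = 0.8, 1.0

joint = np.outer(pA, pB).ravel()    # joint distribution of independent A, B
lhs = tsallis(joint, q, k)
rhs = (tsallis(pA, q, k) + tsallis(pB, q, k)
       + (1.0 - q) / k * tsallis(pA, q, k) * tsallis(pB, q, k))
print(lhs, rhs)                     # equal: pseudo-additivity holds
```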
Many common distributions like the normal distribution belong to the statistical exponential families. For an exponential family with density $p(x; \theta) = \exp\bigl(\langle \theta, t(x) \rangle - F(\theta) + k(x)\bigr)$, the Tsallis entropy can be written[3] as
$$S_q\bigl(p(x;\theta)\bigr) = \frac{1}{1-q}\left(e^{F(q\theta) - qF(\theta)}\, E_{p(x;\,q\theta)}\!\left[e^{(q-1)k(x)}\right] - 1\right),$$
where $F$ is the log-normalizer and $k$ the term indicating the carrier measure. For the multivariate normal, the term $k$ is zero, and therefore the Tsallis entropy is in closed form.
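As an illustration of the closed form, the univariate normal (whose carrier term vanishes) gives $\int p^q\,dx = (2\pi\sigma^2)^{(1-q)/2}/\sqrt{q}$; the sketch below, assuming SciPy, checks this against direct quadrature:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def tsallis_normal_closed_form(sigma, q, k=1.0):
    """Closed-form Tsallis entropy of N(mu, sigma^2), using
    integral of p^q = (2 pi sigma^2)^((1-q)/2) / sqrt(q)."""
    integral_pq = (2.0 * np.pi * sigma ** 2) ** ((1.0 - q) / 2.0) / np.sqrt(q)
    return k / (q - 1.0) * (1.0 - integral_pq)

mu, sigma, q = 0.0, 1.3, 1.6
numeric, _ = quad(lambda x: norm.pdf(x, mu, sigma) ** q, -50.0, 50.0)
print(tsallis_normal_closed_form(sigma, q))
print((1.0 - numeric) / (q - 1.0))   # matches the closed form
```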
The Tsallis entropy has been used together with the principle of maximum entropy to derive the Tsallis distribution.
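For intuition, the maximizer of $S_q$ under a (suitably generalized) second-moment constraint is a q-Gaussian built from the q-exponential $e_q(x) = [1 + (1-q)x]_+^{1/(1-q)}$; the sketch below, with a purely hypothetical value for the multiplier $\beta$, shows how the profile deforms with $q$:

```python
import numpy as np

def q_exponential(x, q):
    """q-exponential e_q(x) = [1 + (1-q) x]_+^(1/(1-q)); exp(x) at q = 1."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
    return base ** (1.0 / (1.0 - q))

beta = 0.5                           # hypothetical Lagrange-multiplier value
x = np.linspace(-4.0, 4.0, 9)
for q in (0.5, 1.0, 1.5):
    print(q, np.round(q_exponential(-beta * x ** 2, q), 4))
# q = 1 reproduces the Gaussian profile exp(-beta x^2); q > 1 gives
# heavier tails, while q < 1 yields a compactly supported profile.
```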
In the scientific literature, the physical relevance of the Tsallis entropy has been debated.[4][5][6] However, from the year 2000 on, an increasingly wide spectrum of natural, artificial and social complex systems has been identified which confirm the predictions and consequences derived from this nonadditive entropy, such as nonextensive statistical mechanics,[7] which generalizes the Boltzmann–Gibbs theory.
Numerous experimental verifications and applications are presently available in the literature, as are theoretical results clarifying the physical conditions under which the Tsallis entropy and the associated statistics apply; for further details, a bibliography is available at http://tsallis.cat.cbpf.br/biblio.htm. Several interesting physical systems[28] abide by entropic functionals that are more general than the standard Tsallis entropy, and several physically meaningful generalizations have therefore been introduced.
The two most general of these are Superstatistics, introduced by C. Beck and E. G. D. Cohen in 2003,[29] and Spectral Statistics, introduced by G. A. Tsekouras and Constantino Tsallis in 2005.[30] Both of these entropic forms have Tsallis and Boltzmann–Gibbs statistics as special cases; Spectral Statistics has been proven to contain at least Superstatistics, and it has been conjectured to also cover some additional cases.