Tsallis entropy

In physics, the Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy. It is proportional to the expectation of the q-logarithm of a distribution.

History
The concept was introduced in 1988 by Constantino Tsallis as a basis for generalizing standard statistical mechanics; it is identical in form to the Havrda–Charvát structural α-entropy, introduced in 1967 within information theory.

Definition
Given a discrete set of probabilities $$\{p_i\}$$ with the condition $$\sum_i p_i=1$$, and $$q$$ any real number, the Tsallis entropy is defined as


 * $$S_q(\{p_i\}) = k \cdot \frac{1}{q-1} \left( 1 - \sum_i p_i^q \right),$$

where $$q$$ is a real parameter sometimes called the entropic index, and $$k$$ is a positive constant.

In the limit as $$q \to 1$$, the usual Boltzmann–Gibbs entropy is recovered, namely


 * $$S_\text{BG} = S_1(p) = -k \sum_i p_i \ln p_i ,$$

where one identifies $$k$$ with the Boltzmann constant $$k_B$$.
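As a concrete illustration, the discrete formula and its $$q \to 1$$ limit can be checked numerically. The following is a minimal Python sketch (with $$k = 1$$ by default); because $$q = 1$$ is a removable singularity, the implementation branches explicitly rather than dividing by zero:

```python
import math

def tsallis_entropy(p, q, k=1.0):
    """S_q = k/(q-1) * (1 - sum_i p_i^q); at q = 1 the
    Boltzmann-Gibbs limit -k * sum_i p_i ln p_i is returned."""
    if abs(q - 1.0) < 1e-12:
        return -k * sum(pi * math.log(pi) for pi in p if pi > 0)
    return k / (q - 1.0) * (1.0 - sum(pi ** q for pi in p))

p = [0.5, 0.3, 0.2]
s_bg = tsallis_entropy(p, 1.0)               # Boltzmann-Gibbs entropy
s_near_one = tsallis_entropy(p, 1.0 + 1e-6)  # approaches s_bg as q -> 1
```

Evaluating at a value of $$q$$ slightly above 1 reproduces the Boltzmann–Gibbs entropy to within the expected first-order error in $$q - 1$$.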

For continuous probability distributions, we define the entropy as


 * $$S_q[p] = {1 \over q - 1} \left( 1 - \int (p(x))^q\, dx \right),$$

where $$p(x)$$ is a probability density function.
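The continuous definition can be approximated by ordinary quadrature. In the sketch below (the helper name and the trapezoidal scheme are illustrative choices, not part of the source), the uniform density on $$[0, L]$$ serves as a check, since there $$\int p(x)^q\,dx = L^{1-q}$$ in closed form:

```python
def tsallis_entropy_continuous(pdf, a, b, q, n=1000):
    """S_q[p] = (1 - integral of p(x)^q dx) / (q - 1),
    approximated by the trapezoidal rule on [a, b]."""
    h = (b - a) / n
    total = 0.5 * (pdf(a) ** q + pdf(b) ** q)
    for i in range(1, n):
        total += pdf(a + i * h) ** q
    return (1.0 - total * h) / (q - 1.0)

L, q = 2.0, 1.5
approx = tsallis_entropy_continuous(lambda x: 1.0 / L, 0.0, L, q)
exact = (1.0 - L ** (1.0 - q)) / (q - 1.0)  # uniform density on [0, L]
```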

Cross-entropy
The cross-entropy counterpart is the expectation of the negative q-logarithm with respect to a second distribution $$r$$: $$\tfrac{1}{q-1}\left(1 - {\textstyle \sum_i} r_i\, p_i^{q-1}\right)$$.

Using $$t = q - 1$$, this may be written as $$(1 - E_r[p^t])/t$$. As $$t \to 0$$, the values $$p_i^t$$ all tend towards $$1$$.

The limit $$q\to 1$$ computes the negative of the slope of $$E_r[p^t]$$ at $$t=0$$, recovering the standard cross-entropy $$-{\textstyle \sum_i} r_i \ln p_i$$. For fixed small $$t$$, maximizing this expectation is therefore closely related to log-likelihood maximization.
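The recovery of the standard cross-entropy in the $$q \to 1$$ limit can be checked directly; a small Python sketch (the function name is illustrative):

```python
import math

def tsallis_cross_entropy(r, p, q):
    """(1 - sum_i r_i * p_i**(q-1)) / (q - 1); tends to the
    standard cross-entropy -sum_i r_i ln p_i as q -> 1."""
    return (1.0 - sum(ri * pi ** (q - 1.0) for ri, pi in zip(r, p))) / (q - 1.0)

r = [0.6, 0.4]
p = [0.5, 0.5]
standard = -sum(ri * math.log(pi) for ri, pi in zip(r, p))  # = ln 2 here
near_one = tsallis_cross_entropy(r, p, 1.0 + 1e-7)          # close to ln 2
```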

Identities
A logarithm can be expressed in terms of a slope through $$\tfrac{d}{dx} p^x = p^{x} \ln p$$ resulting in the following formula for the standard entropy:
 * $$S = -\lim_{x\rightarrow 1}\tfrac{d}{dx} \sum_i p_i^x = -{\textstyle \sum_i} p_i \ln p_i$$

Likewise, the discrete Tsallis entropy satisfies
 * $$S_q = -\lim_{x\rightarrow 1}D_q \sum_i p_i^x $$

where $$D_q$$ is the q-derivative with respect to $$x$$.
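With the Jackson q-derivative $$D_q f(x) = \tfrac{f(qx) - f(x)}{(q-1)x}$$, the identity holds exactly at $$x = 1$$ for $$f(x) = \sum_i p_i^x$$ (the limit is just the value there, since $$D_q f$$ is continuous in $$x$$). A minimal sketch, with $$k = 1$$:

```python
def q_derivative(f, x, q):
    """Jackson q-derivative: D_q f(x) = (f(q*x) - f(x)) / ((q - 1) * x)."""
    return (f(q * x) - f(x)) / ((q - 1.0) * x)

p = [0.5, 0.3, 0.2]
q = 2.5
f = lambda x: sum(pi ** x for pi in p)              # f(x) = sum_i p_i^x
s_q = (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)  # Tsallis entropy, k = 1
identity = -q_derivative(f, 1.0, q)                 # equals s_q exactly
```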

Non-additivity
Given two independent systems A and B, for which the joint probability density satisfies


 * $$p(A, B) = p(A) p(B),\,$$

the Tsallis entropy of this system satisfies


 * $$S_q(A,B) = S_q(A) + S_q(B) + (1-q)S_q(A) S_q(B).\,$$

From this result, it is evident that the parameter $$|1-q|$$ is a measure of the departure from additivity. In the limit when q = 1,


 * $$S(A,B) = S(A) + S(B),\,$$

which is what is expected for an additive system. This property is sometimes referred to as "pseudo-additivity".
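The pseudo-additivity relation can be verified numerically for a product distribution; a minimal sketch with $$k = 1$$:

```python
def tsallis(p, q):
    """Discrete Tsallis entropy with k = 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p_a = [0.7, 0.3]
p_b = [0.4, 0.4, 0.2]
q = 1.8
# joint distribution of two independent systems: p(A, B) = p(A) p(B)
joint = [a * b for a in p_a for b in p_b]
lhs = tsallis(joint, q)
rhs = tsallis(p_a, q) + tsallis(p_b, q) + (1 - q) * tsallis(p_a, q) * tsallis(p_b, q)
# lhs and rhs agree up to floating-point rounding
```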

Exponential families
Many common distributions, such as the normal distribution, belong to the statistical exponential families. The Tsallis entropy for an exponential family can be written as


 * $$H^T_q(p_F(x;\theta)) =  \frac{1}{1-q} \left((e^{F(q\theta)-q F(\theta)}) E_p[e^{(q-1)k(x)}]-1  \right)$$

where $$F$$ is the log-normalizer and $$k$$ the term indicating the carrier measure. For the multivariate normal distribution, the term $$k$$ is zero, and therefore the Tsallis entropy has a closed form.
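As a check on the normal case: for a univariate normal density with standard deviation $$\sigma$$, a short Gaussian integral gives $$\int p(x)^q\,dx = (2\pi\sigma^2)^{(1-q)/2}/\sqrt{q}$$, so $$S_q = \big(1 - (2\pi\sigma^2)^{(1-q)/2}/\sqrt{q}\big)/(q-1)$$. The sketch below (trapezoidal quadrature on a truncated interval; helper names are illustrative) compares this closed form with direct numerical integration:

```python
import math

def normal_pdf(x, sigma):
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def tsallis_continuous(pdf, a, b, q, n=100000):
    """(1 - integral of p(x)^q dx) / (q - 1) via the trapezoidal rule."""
    h = (b - a) / n
    total = 0.5 * (pdf(a) ** q + pdf(b) ** q)
    for i in range(1, n):
        total += pdf(a + i * h) ** q
    return (1.0 - total * h) / (q - 1.0)

sigma, q = 1.3, 2.0
closed = (1.0 - (2 * math.pi * sigma ** 2) ** ((1 - q) / 2) / math.sqrt(q)) / (q - 1.0)
numeric = tsallis_continuous(lambda x: normal_pdf(x, sigma), -15 * sigma, 15 * sigma, q)
```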

Applications
The Tsallis entropy has been used together with the principle of maximum entropy to derive the Tsallis distribution.

In scientific literature, the physical relevance of the Tsallis entropy has been debated. However, from the years 2000 on, an increasingly wide spectrum of natural, artificial and social complex systems have been identified which confirm the predictions and consequences that are derived from this nonadditive entropy, such as nonextensive statistical mechanics, which generalizes the Boltzmann–Gibbs theory.

Among the various experimental verifications and applications presently available in the literature, the following ones deserve a special mention:


 * 1) The distribution characterizing the motion of cold atoms in dissipative optical lattices predicted in 2003 and observed in 2006.
 * 2) The fluctuations of the magnetic field in the solar wind enabled the calculation of the q-triplet (or Tsallis triplet).
 * 3) The velocity distributions in a driven dissipative dusty plasma.
 * 4) Spin glass relaxation.
 * 5) Trapped ion interacting with a classical buffer gas.
 * 6) High energy collisional experiments at LHC/CERN (CMS, ATLAS and ALICE detectors)  and RHIC/Brookhaven (STAR and PHENIX detectors).

Among the various available theoretical results which clarify the physical conditions under which Tsallis entropy and associated statistics apply, the following ones can be selected:
 * 1) Anomalous diffusion.
 * 2) Uniqueness theorem.
 * 3) Sensitivity to initial conditions and entropy production at the edge of chaos.
 * 4) Probability sets that make the nonadditive Tsallis entropy extensive in the thermodynamical sense.
 * 5) Strongly quantum entangled systems and thermodynamics.
 * 6) Thermostatistics of overdamped motion of interacting particles.
 * 7) Nonlinear generalizations of the Schrödinger, Klein–Gordon and Dirac equations.
 * 8) Black-hole entropy calculation.

For further details a bibliography is available at http://tsallis.cat.cbpf.br/biblio.htm

Generalized entropies
Several physical systems of interest abide by entropic functionals that are more general than the standard Tsallis entropy, and several physically meaningful generalizations have accordingly been introduced. Two of the most general are Superstatistics, introduced by C. Beck and E. G. D. Cohen in 2003, and Spectral Statistics, introduced by G. A. Tsekouras and Constantino Tsallis in 2005. Both of these entropic forms contain Tsallis and Boltzmann–Gibbs statistics as special cases; Spectral Statistics has been proven to contain at least Superstatistics, and it has been conjectured to cover some additional cases.