Dual total correlation

In information theory, dual total correlation (Han 1978), or excess entropy (Olbrich et al. 2008), is one of the two known non-negative generalizations of mutual information. While total correlation is bounded above by the sum of the entropies of the n elements, the dual total correlation is bounded above by the joint entropy of the n elements. Although well behaved, dual total correlation has received much less attention than total correlation. A measure known as "TSE-complexity" defines a continuum between the total correlation and the dual total correlation (Ay et al. 2001).

Definition

For a set of $n$ random variables $\{X_1,\ldots,X_n\}$, the dual total correlation $D(X_1,\ldots,X_n)$ is given by

$$D(X_1,\ldots,X_n) = H(X_1,\ldots,X_n) - \sum_{i=1}^{n} H(X_i \mid X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n),$$

where $H(X_1,\ldots,X_n)$ is the joint entropy of the variable set $\{X_1,\ldots,X_n\}$ and $H(X_i \mid \cdot)$ is the conditional entropy of variable $X_i$, given the rest.
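As a concrete illustration, here is a minimal numerical sketch of this definition for a small discrete joint distribution. The function names and the XOR example are illustrative assumptions, not code from the cited papers; each conditional entropy is obtained via the chain rule as $H(X_i \mid \text{rest}) = H(X_1,\ldots,X_n) - H(\text{all but } X_i)$.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array; zero entries are skipped."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def dual_total_correlation(joint):
    """D(X_1,...,X_n) = H(X_1,...,X_n) - sum_i H(X_i | rest).

    `joint` is an n-dimensional array of probabilities summing to 1; axis i indexes X_i.
    H(X_i | rest) is computed via the chain rule as H(joint) - H(all variables but X_i).
    """
    joint = np.asarray(joint, dtype=float)
    n = joint.ndim
    h_joint = entropy(joint)
    h_cond = [h_joint - entropy(joint.sum(axis=i)) for i in range(n)]
    return h_joint - sum(h_cond)

# Illustrative example: X1, X2 fair independent bits and X3 = X1 XOR X2.
joint = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        joint[x1, x2, x1 ^ x2] = 0.25
print(dual_total_correlation(joint))  # 2.0 bits: each H(X_i | rest) is zero
```

In this XOR example every variable is fully determined by the other two, so each conditional entropy vanishes and $D$ equals the joint entropy of 2 bits, its maximum possible value.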

Normalized

The dual total correlation normalized between $[0,1]$ is simply the dual total correlation divided by its maximum value $H(X_1,\ldots,X_n)$,

$$ND(X_1,\ldots,X_n) = \frac{D(X_1,\ldots,X_n)}{H(X_1,\ldots,X_n)}.$$
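For instance, in the illustrative XOR example sketched above, $D = H(X_1,X_2,X_3) = 2$ bits, so $ND = 2/2 = 1$; conversely, for independent variables each conditional entropy equals the corresponding marginal entropy, so $D = 0$ and $ND = 0$.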

Bounds

Dual total correlation is non-negative and bounded above by the joint entropy $H(X_1,\ldots,X_n)$:

$$0 \leq D(X_1,\ldots,X_n) \leq H(X_1,\ldots,X_n).$$

Secondly, dual total correlation has a close relationship with total correlation, $C(X_1,\ldots,X_n)$. In particular,

$$\frac{C(X_1,\ldots,X_n)}{n-1} \leq D(X_1,\ldots,X_n) \leq (n-1)\,C(X_1,\ldots,X_n).$$
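These bounds can be spot-checked numerically. The sketch below (an illustrative assumption, not code from the references) draws a random joint distribution over three binary variables and verifies the sandwich inequality between $C$ and $D$:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
joint = rng.random((2, 2, 2))
joint /= joint.sum()              # a random joint distribution over three binary variables
n = joint.ndim

h_joint = entropy(joint)
h_marginals = [entropy(joint.sum(axis=tuple(j for j in range(n) if j != i)))
               for i in range(n)]                            # H(X_i)
h_rests = [entropy(joint.sum(axis=i)) for i in range(n)]     # H(all variables except X_i)

C = sum(h_marginals) - h_joint            # total correlation
D = sum(h_rests) - (n - 1) * h_joint      # dual total correlation
assert C / (n - 1) <= D + 1e-12 and D <= (n - 1) * C + 1e-12
print(C, D)
```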

History

Han (1978) originally defined the dual total correlation as,

$$D(X_1,\ldots,X_n) \equiv \left[\sum_{i=1}^{n} H(X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n)\right] - (n-1)\,H(X_1,\ldots,X_n).$$

However, Abdallah and Plumbley (2010) showed that this is equivalent to the easier-to-understand form of the joint entropy minus the sum of conditional entropies, via the following derivation:

$$\begin{aligned}
D(X_1,\ldots,X_n) &= \left[\sum_{i=1}^{n} H(X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n)\right] - (n-1)\,H(X_1,\ldots,X_n) \\
&= \left[\sum_{i=1}^{n} H(X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n)\right] + (1-n)\,H(X_1,\ldots,X_n) \\
&= H(X_1,\ldots,X_n) + \sum_{i=1}^{n}\left[ H(X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n) - H(X_1,\ldots,X_n)\right] \\
&= H(X_1,\ldots,X_n) - \sum_{i=1}^{n} H(X_i \mid X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n).
\end{aligned}$$
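The equivalence can also be checked numerically. The sketch below is an illustrative assumption (not code from Abdallah and Plumbley): it compares Han's original expression with the conditional-entropy form, where the conditional entropies are computed directly from conditional distributions rather than via the chain rule.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def conditional_entropy(joint, i):
    """H(X_i | all other variables), averaged over configurations of the rest."""
    moved = np.moveaxis(joint, i, -1).reshape(-1, joint.shape[i])
    h = 0.0
    for row in moved:
        p_rest = row.sum()
        if p_rest > 0:
            h += p_rest * entropy(row / p_rest)
    return h

rng = np.random.default_rng(1)
joint = rng.random((2, 3, 2))             # three variables with 2, 3 and 2 outcomes
joint /= joint.sum()
n = joint.ndim

h_joint = entropy(joint)
h_rests = [entropy(joint.sum(axis=i)) for i in range(n)]   # H(all variables except X_i)

han_form = sum(h_rests) - (n - 1) * h_joint
cond_form = h_joint - sum(conditional_entropy(joint, i) for i in range(n))
assert abs(han_form - cond_form) < 1e-9
print(han_form, cond_form)
```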


References

  • Han T. S. (1978). Nonnegative entropy measures of multivariate symmetric correlations, Information and Control 36, 133–156.
  • Fujishige S. (1978). Polymatroidal dependence structure of a set of random variables, Information and Control 39, 55–72.
  • Olbrich E., Bertschinger N., Ay N. and Jost J. (2008). How should complexity scale with system size?, The European Physical Journal B - Condensed Matter and Complex Systems.
  • Abdallah S. A. and Plumbley M. D. (2010). A measure of statistical complexity based on predictive information, ArXiv e-prints.
  • Ay N., Olbrich E. and Bertschinger N. (2001). A unifying framework for complexity measures of finite systems, European Conference on Complex Systems.