'''Generalized relative entropy''' (<math>\epsilon</math>-relative entropy) is a measure of dissimilarity between two [[quantum states]]. It is a "one-shot" analogue of [[quantum relative entropy]] and shares many properties with the latter quantity.
 
In the study of [[quantum information theory]], we typically assume that information processing tasks are repeated multiple times, independently. The corresponding information-theoretic notions are therefore defined in the asymptotic limit. The quintessential entropy measure, [[von Neumann entropy]], is one such notion. In contrast, the study of one-shot quantum information theory is concerned with information processing when a task is conducted only once. New entropic measures emerge in this scenario, as traditional notions cease to give a precise characterization of resource requirements. <math>\epsilon</math>-relative entropy is one such particularly interesting measure.
 
In the asymptotic scenario, relative entropy acts as a parent quantity for other measures besides being an important measure itself. Similarly, <math>\epsilon</math>-relative entropy functions as a parent quantity for other measures in the one-shot scenario.
 
== Definition ==
 
To motivate the definition of the <math>\epsilon</math>-relative entropy <math>D^{\epsilon}(\rho||\sigma)</math>, consider the information processing task of [[hypothesis testing]]. In hypothesis testing, we wish to devise a strategy to distinguish between two density operators <math>\rho</math> and <math>\sigma</math>. A strategy is a [[POVM]] with elements <math>Q</math> and <math>I - Q</math>. The probability that the strategy produces a correct guess on input <math>\rho</math> is <math>\operatorname{Tr}(\rho Q)</math>, and the probability that it produces a wrong guess (i.e., guesses <math>\rho</math> when the input is <math>\sigma</math>) is <math>\operatorname{Tr}(\sigma Q)</math>. The <math>\epsilon</math>-relative entropy captures, through a negative logarithm, the minimum probability of error when the state is <math>\sigma</math>, given that the success probability for <math>\rho</math> is at least <math>\epsilon</math>.
 
For <math>\epsilon \in (0,1)</math>, the <math>\epsilon</math>-relative entropy between two quantum states <math>\rho</math> and <math>\sigma</math> is defined (with logarithms taken to base 2) as
:::<math> D^{\epsilon}(\rho||\sigma) = - \log \frac{1}{\epsilon} \min \{ \langle Q, \sigma \rangle | 0 \leq Q \leq I \text{ and } \langle Q ,\rho\rangle \geq \epsilon\} ~.</math>
 
From the definition, it is clear that <math>D^{\epsilon}(\rho||\sigma)\geq 0</math>: the choice <math>Q = \epsilon I</math> is always feasible and yields <math>\langle Q, \sigma \rangle = \epsilon</math>, so the minimum is at most <math>\epsilon</math>. This inequality is saturated if and only if <math>\rho = \sigma</math>, as shown [[#Relationship to the trace distance|below]].
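In the commuting (classical) case, where <math>\rho</math> and <math>\sigma</math> are diagonal in a common basis with eigenvalue lists <math>p</math> and <math>q</math>, the minimization in the definition reduces to a linear program that a Neyman–Pearson-style greedy strategy solves exactly; the general non-commuting case is a semidefinite program. The sketch below (the function name and the restriction to diagonal states are illustrative assumptions, not part of the source) computes <math>D^{\epsilon}</math> in this special case:

```python
import math

def eps_relative_entropy(p, q, eps):
    """D^eps(rho||sigma) for commuting states whose eigenvalues are
    the lists p (for rho) and q (for sigma); logarithms are base 2."""
    # Minimize sum_i q[i]*t[i] subject to sum_i p[i]*t[i] >= eps and
    # 0 <= t[i] <= 1: a Neyman-Pearson-style greedy that admits outcomes
    # in increasing order of the likelihood ratio q[i]/p[i] is optimal.
    order = sorted((i for i in range(len(p)) if p[i] > 0),
                   key=lambda i: q[i] / p[i])
    need, cost = eps, 0.0
    for i in order:
        t = min(1.0, need / p[i])  # fractional test on the last outcome
        cost += t * q[i]
        need -= t * p[i]
        if need <= 1e-15:
            break
    return -math.log2(cost / eps)
```

For example, <code>eps_relative_entropy([0.9, 0.1], [0.5, 0.5], 0.9)</code> returns <math>\log_2 1.8 \approx 0.848</math>, and the value is 0 whenever <math>p = q</math>, consistent with the saturation condition stated above.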
 
== Relationship to the trace distance ==
Suppose the [[trace distance]] between two density operators <math>\rho</math> and <math>\sigma</math> is
::: <math>||\rho - \sigma||_1 = \delta ~.</math>
 
For  <math>0< \epsilon< 1</math>, it holds that
:::a) <math>\log \frac{\epsilon}{\epsilon - (1-\epsilon)\delta} \quad \leq \quad D^{\epsilon}(\rho||\sigma) \quad \leq \quad \log \frac{\epsilon}{\epsilon - \delta} ~.</math>
 
In particular, this implies the following analogue of the Pinsker inequality<ref>Watrous, J. Theory of Quantum Information, Fall 2013. Ch. 5, page 194 https://cs.uwaterloo.ca/~watrous/CS766/DraftChapters/5.QuantumEntropy.pdf</ref>
 
:::b) <math>\frac{1-\epsilon}{\epsilon}||\rho-\sigma||_1 \quad \leq \quad D^{\epsilon}(\rho||\sigma) ~.</math>
 
Furthermore, the proposition implies that for any <math>\epsilon \in (0,1)</math>, <math>D^{\epsilon}(\rho||\sigma) = 0</math> if and only if <math>\rho = \sigma</math>, inheriting this property from the trace distance. This result and its proof can be found in Dupuis et al.<ref>Dupuis, F., et al. "Generalized entropies."  {{arxiv|1211.3141}}.</ref>
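Both bounds in a) and the Pinsker-like consequence b) can be spot-checked numerically. The snippet below is a sketch restricted to commuting (diagonal) states (the helper name and the particular eigenvalue lists are illustrative assumptions), with the trace distance computed as <math>\delta = \max_{0\leq Q \leq I}\operatorname{Tr}(Q(\rho-\sigma))</math>, as in the proof below:

```python
import math

def d_eps(p, q, eps):
    # D^eps for commuting states via the Neyman-Pearson greedy (base-2 logs)
    order = sorted(range(len(p)), key=lambda i: q[i] / p[i])
    need, cost = eps, 0.0
    for i in order:
        t = min(1.0, need / p[i])
        cost += t * q[i]
        need -= t * p[i]
        if need <= 1e-15:
            break
    return -math.log2(cost / eps)

p, q, eps = [0.9, 0.1], [0.5, 0.5], 0.7
# Trace distance delta = max_{0<=Q<=I} Tr(Q(rho - sigma)); for diagonal
# states this is the total positive part of p - q.
delta = sum(max(pi - qi, 0.0) for pi, qi in zip(p, q))
d = d_eps(p, q, eps)
lower = math.log2(eps / (eps - (1 - eps) * delta))   # inequality a), left
upper = math.log2(eps / (eps - delta))               # inequality a), right
pinsker = (1 - eps) / eps * delta                    # inequality b)
assert lower <= d <= upper and pinsker <= d
```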
 
=== Proof of inequality a) ===
 
''Upper bound'': The trace distance can be written as
 
::: <math> ||\rho - \sigma||_1 = \max_{0\leq Q \leq I} \operatorname{Tr}(Q(\rho - \sigma)) ~.</math>
 
This maximum is achieved when <math>Q</math> is the orthogonal projector onto the positive eigenspace of <math>\rho - \sigma</math>. For any [[POVM]] element <math>Q</math> we have
:::<math>\operatorname{Tr}(Q(\rho - \sigma)) \leq \delta</math>
so that if <math>\operatorname{Tr}(Q\rho) \geq \epsilon</math>, we have
:::<math>\operatorname{Tr}(Q\sigma) ~\geq~ \operatorname{Tr}(Q\rho) - \delta ~\geq~ \epsilon - \delta~.</math>
 
From the definition of the <math>\epsilon</math>-relative entropy, we get
::: <math>2^{- D^{\epsilon}(\rho||\sigma)}\geq \frac{\epsilon - \delta}{\epsilon}  ~.</math>
 
''Lower bound'': Let <math>Q</math> be the orthogonal projection onto the positive eigenspace of <math>\rho - \sigma</math>, and let <math>\bar Q</math> be the following convex combination of <math>I</math> and <math>Q</math>:
::: <math> \bar Q = (\epsilon - \mu)I + (1 - \epsilon + \mu)Q</math>
where <math>\mu = \frac{(1-\epsilon)\operatorname{Tr}(Q\rho)}{1 - \operatorname{Tr}(Q\rho)} ~.</math>
 
This means
:::<math>\mu = (1-\epsilon + \mu)\operatorname{Tr}(Q\rho)</math>
and thus
:::<math> \operatorname{Tr}(\bar Q \rho) ~=~ (\epsilon - \mu) + (1-\epsilon + \mu)\operatorname{Tr}(Q\rho) ~=~ \epsilon ~.</math>
Moreover,
:::<math>\operatorname{Tr}(\bar Q \sigma) ~=~ \epsilon - \mu + (1-\epsilon + \mu)\operatorname{Tr}(Q\sigma) ~.</math>
Using <math>\mu = (1-\epsilon + \mu)\operatorname{Tr}(Q\rho)</math>, our choice of <math>Q</math>, and finally the definition of <math>\mu</math>,  we can re-write this as
:::<math>\operatorname{Tr}(\bar Q \sigma)  ~=~ \epsilon - (1 - \epsilon + \mu)\operatorname{Tr}(Q\rho) + (1 - \epsilon + \mu)\operatorname{Tr}(Q\sigma)</math>
::::::<math> ~=~ \epsilon - \frac{(1-\epsilon)\delta}{1-\operatorname{Tr} (Q\rho)} ~\leq~ \epsilon - (1-\epsilon)\delta ~.
</math>
 
Hence
:::<math>D^{\epsilon}(\rho||\sigma) \geq \log \frac{\epsilon}{\epsilon - (1-\epsilon)\delta} ~.</math>
 
=== Proof of inequality b) ===
 
To derive this ''Pinsker-like inequality'', observe that
:::<math>\log \frac{\epsilon}{\epsilon - (1-\epsilon)\delta}  ~=~ -\log\left( 1 - \frac{(1-\epsilon)\delta}{\epsilon} \right) ~\geq~ \delta \frac{1-\epsilon}{\epsilon} ~,</math>
where the final step uses <math>-\log_2(1-x) = -\ln(1-x)/\ln 2 \geq x/\ln 2 \geq x</math> for <math>x \in [0,1)</math>, and combine this with the lower bound in inequality a).
 
== Alternative proof of the data processing inequality ==
A fundamental property of von Neumann entropy is [[strong subadditivity of quantum entropy|strong subadditivity]].  Let <math>S(\sigma)</math> denote the von Neumann entropy of the quantum state <math>\sigma</math>, and let <math>\rho_{ABC}</math> be a quantum state on the tensor product [[Hilbert space]] <math>\mathcal{H}_A\otimes \mathcal{H}_B \otimes \mathcal{H}_C</math>. Strong subadditivity states that
:::<math>S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC})</math>
where <math>\rho_{AB},\rho_{BC},\rho_{B}</math> refer to the [[reduced density matrices]] on the spaces indicated by the subscripts.
When re-written in terms of [[mutual information]], this inequality has an intuitive interpretation; it states that the information content in a system cannot increase by the action of a local [[quantum operation]] on that system. In this form, it is better known as the [[data processing inequality]], and is equivalent to the monotonicity of relative entropy under quantum operations:<ref>Ruskai, Mary Beth. "Inequalities for quantum entropy: A review with conditions for equality." Journal of Mathematical Physics 43 (2002): 4358. {{arXiv|0205064}}</ref>
:::<math>D(\rho||\sigma) - D(\mathcal{E}(\rho)||\mathcal{E}(\sigma)) \geq 0</math>
for every [[quantum channel|CPTP map]] <math>\mathcal{E}</math>, where <math>D(\omega||\tau)</math> denotes the relative entropy of the quantum states <math>\omega, \tau</math>.
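Strong subadditivity can be checked numerically for a concrete state. The sketch below (the particular GHZ-plus-noise state and the helper names are illustrative choices, assuming numpy) computes the four entropies via partial traces and verifies the inequality:

```python
import numpy as np

def S(rho):
    """von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]              # 0 log 0 = 0 convention
    return float(-(ev * np.log2(ev)).sum())

# A mixed three-qubit state: a GHZ state blended with white noise.
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)
rho_ABC = 0.5 * np.outer(ghz, ghz) + 0.5 * np.eye(8) / 8

# Reduced states via partial traces (axes ordered a, b, c, a', b', c').
t = rho_ABC.reshape(2, 2, 2, 2, 2, 2)
rho_AB = np.einsum('abcdec->abde', t).reshape(4, 4)  # trace out C
rho_BC = np.einsum('abcade->bcde', t).reshape(4, 4)  # trace out A
rho_B = np.einsum('abcadc->bd', t)                   # trace out A and C

assert S(rho_ABC) + S(rho_B) <= S(rho_AB) + S(rho_BC) + 1e-9
```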
 
It is readily seen that <math>\epsilon</math>-relative entropy also obeys monotonicity under quantum operations:<ref>Wang, L. and Renner, R. "One-shot classical-quantum capacity and hypothesis testing." Physical Review Letters 108.20 (2012): 200501. {{arXiv|1007.5456v3}}</ref>
:::<math>D^{\epsilon}(\rho||\sigma) \geq D^{\epsilon}(\mathcal{E}(\rho)||\mathcal{E}(\sigma))</math>,
for any CPTP map <math>\mathcal{E}</math>.
To see this, suppose we have a POVM <math>(R, I-R)</math> to distinguish between <math>\mathcal{E}(\rho)</math> and <math>\mathcal{E}(\sigma)</math> such that <math>\langle R, \mathcal{E}(\rho)\rangle = \langle \mathcal{E}^{\dagger}(R), \rho \rangle \geq \epsilon</math>. We construct a new POVM <math>(\mathcal{E}^{\dagger}(R), I - \mathcal{E}^{\dagger}(R))</math> to distinguish between <math>\rho</math> and <math>\sigma</math>. Since the adjoint of any CPTP map is positive and unital, this is a valid POVM. Note that <math>\langle R, \mathcal{E}(\sigma)\rangle = \langle \mathcal{E}^{\dagger}(R), \sigma\rangle \geq \langle Q,\sigma\rangle</math>, where <math>(Q, I-Q)</math> is the POVM that achieves <math>D^{\epsilon}(\rho||\sigma)</math>. Minimizing over all valid <math>R</math> then shows that the minimum error probability for the pair <math>(\mathcal{E}(\rho), \mathcal{E}(\sigma))</math> is at least that for <math>(\rho, \sigma)</math>, which is the claimed inequality.
Not only is this interesting in itself, but it also gives us the following alternative method to prove the data processing inequality.<ref>Dupuis, F., et al. "Generalized entropies." {{arxiv|1211.3141}} (2012).</ref>
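Restricted to commuting states, a CPTP map acts as a column-stochastic matrix on the eigenvalue vectors, and the monotonicity of <math>D^{\epsilon}</math> can be checked directly. In the sketch below, the channel matrix, the input states, and the function names are illustrative assumptions:

```python
import math

def d_eps(p, q, eps):
    # D^eps for commuting states via the Neyman-Pearson greedy (base-2 logs)
    order = sorted(range(len(p)), key=lambda i: q[i] / p[i])
    need, cost = eps, 0.0
    for i in order:
        t = min(1.0, need / p[i])
        cost += t * q[i]
        need -= t * p[i]
        if need <= 1e-15:
            break
    return -math.log2(cost / eps)

def apply_channel(E, p):
    # E[j][i] = Pr(output j | input i); columns sum to 1 (stochastic map)
    return [sum(E[j][i] * p[i] for i in range(len(p))) for j in range(len(E))]

p, q, eps = [0.8, 0.15, 0.05], [0.3, 0.3, 0.4], 0.6
E = [[0.90, 0.20, 0.10],   # an arbitrary noisy channel
     [0.05, 0.70, 0.20],
     [0.05, 0.10, 0.70]]
assert d_eps(p, q, eps) >= d_eps(apply_channel(E, p),
                                 apply_channel(E, q), eps) - 1e-12
```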
 
By the quantum analogue of the Stein lemma,<ref>Petz, Dénes. Quantum information theory and quantum statistics. Springer, 2008. Chapter 8</ref>
:::<math>\lim_{n\rightarrow\infty}\frac{1}{n}D^{\epsilon}(\rho^{\otimes n}||\sigma^{\otimes n})  = \lim_{n\rightarrow\infty}\frac{-1}{n}\log \left( \frac{1}{\epsilon} \min \operatorname{Tr}(\sigma^{\otimes n} Q) \right) </math>
:::::::::::<math> = D(\rho||\sigma) - \lim_{n\rightarrow\infty}\frac{1}{n}\left( \log\frac{1}{\epsilon} \right) </math>
:::::::::::<math> = D(\rho||\sigma) ~, </math>
where the minimum is taken over <math>0\leq Q\leq I</math> such that <math>\operatorname{Tr}(Q\rho^{\otimes n})\geq \epsilon ~.</math>
 
Applying the monotonicity of <math>D^{\epsilon}</math> established above to the states <math>\rho^{\otimes n}</math> and <math>\sigma^{\otimes n}</math> with the CPTP map <math>\mathcal{E}^{\otimes n}</math>, we get
:::<math>D^{\epsilon}(\rho^{\otimes n}||\sigma^{\otimes n}) ~\geq~ D^{\epsilon}(\mathcal{E}(\rho)^{\otimes n}||\mathcal{E}(\sigma)^{\otimes n}) ~.</math>
Dividing both sides by <math>n</math> and taking the limit as <math>n \rightarrow\infty</math>, we obtain the data processing inequality for the relative entropy.
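The convergence in the quantum Stein lemma can be illustrated in the commuting case, where <math>\rho^{\otimes n}</math> and <math>\sigma^{\otimes n}</math> are simply product distributions. The sketch below (illustrative names and states; since convergence is slow, only a coarse comparison is asserted) computes <math>\frac{1}{n}D^{\epsilon}</math> for <math>n = 1</math> and <math>n = 10</math> and compares both to <math>D(\rho||\sigma)</math>:

```python
import math
from itertools import product

def d_eps(p, q, eps):
    # D^eps for commuting states via the Neyman-Pearson greedy (base-2 logs)
    order = sorted(range(len(p)), key=lambda i: q[i] / p[i])
    need, cost = eps, 0.0
    for i in order:
        t = min(1.0, need / p[i])
        cost += t * q[i]
        need -= t * p[i]
        if need <= 1e-15:
            break
    return -math.log2(cost / eps)

def tensor_power(p, n):
    # Eigenvalues of rho^{tensor n} for diagonal rho: all n-fold products.
    return [math.prod(xs) for xs in product(p, repeat=n)]

p, q, eps = [0.9, 0.1], [0.5, 0.5], 0.5
kl = sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q))  # D(rho||sigma)

gap1 = abs(d_eps(p, q, eps) - kl)
gap10 = abs(d_eps(tensor_power(p, 10), tensor_power(q, 10), eps) / 10 - kl)
assert gap10 < gap1  # the normalized rate approaches D(rho||sigma)
```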
 
==See also==
*[[Quantum relative entropy]]
*[[Strong subadditivity]]
*[[information theory|Classical information theory]]
*[[Min entropy]]
 
==References==
{{reflist}}
 
 
 
[[Category:Quantum mechanical entropy]]
